MERLOT Journal of Online Learning and Teaching
Vol. 5, No. 1, March 2009


Development and Examination of an Individualized Online Adjunct to In-Class Education


Shawn Davis
School of Professional Psychology
Pacific University
Hillsboro, OR 97123 USA
davissh@pacificu.edu

Bryant Kilbourn
School of Professional Psychology
Pacific University
Hillsboro, OR 97123 USA
Kilb8968@pacificu.edu


Abstract

The present study was an investigation of the effectiveness of an individualized class website serving as an adjunct to traditional in-class instruction in improving class performance, class attendance, and the overall class experience. Participating individuals were enrolled in either an undergraduate-level cognitive psychology course consisting of in-class instruction and a generic, non-individualized class website or in a similar course with the same in-class instruction and a class website that contained highly individualized information (i.e., automated individual feedback on test performance, personalized study suggestions based on test performance, and a personal record of class attendance). As hypothesized, individuals in the individualized section performed significantly better on exams as the class progressed. No significant differences were found, however, in class absences between the two course sections. A significant positive correlation between the number of times the class website was accessed and final class average was found only for individuals in the test section.

Keywords: Internet, tailored, website, personalized, study skills, cognition


Introduction

The continuing challenge for educators is to find the best way to facilitate the delivery and comprehension of information and to maximize learning. Unfortunately, because of factors such as the limited time available to devote to individual students and ever-increasing class sizes, traditional in-class instruction does not always achieve this goal. By default, the method most often utilized is a rather limited form of direct instruction.

Direct instruction is one method used by educators and is perhaps the most recognizable. Mcvittie (2008) explains that direct instruction involves an educator lecturing for a time on a specific topic, after which students are guided through a complex problem. The educator typically breaks the complex problem down into simpler steps and then gives the students these simpler steps to work with on their own. Finally, the educator gives the students sample complex problems to which they are expected to apply the information learned in the lesson. There may be variations on this description, such as the use of video presentation, but the principle remains the same: successful transmission of information to the student to meet established educational goals.

The direct instruction method has its roots in behaviorism and carries both advantages and disadvantages (Mcvittie, 2008). Direct instruction allows the instructor to control the timing and pace of the learning environment. The instructor also controls what will be learned and who will learn it. Finally, some material is best taught in this fashion, especially if the concept to be taught is simple or there is only one right answer to the problem.

There are, however, disadvantages to using this method (Mcvittie, 2008). Direct instruction is based on specific learning theories that may not be applicable in all situations or to all people. For example, students who are not verbal or auditory learners may do poorly when this method is used. Second, students are usually unaware of the overall purpose of the lesson when direct instruction is used, as the problem is broken down into simpler steps. Also, because transmission, not collaboration, is the goal of this teaching method, educators are unable to assess any prior knowledge, or lack thereof, that may inhibit a student’s learning. Retention of the information tends to be low if students are not allowed to practice the complex problems on their own.

Direct instruction can be useful in a classroom setting but, as the preceding information suggests, it may not be effective in every instance. Fortunately, there are other methods of teaching that can reach a greater variety of students. One need not abandon direct instruction but perhaps augment it so that every student receives the full benefit of an instructor’s educational efforts.

As stated above, simply transmitting information from the instructor to the student or learner may be ineffective in many instances. There are many factors that must be considered in order to design the most useful instructional method. Smith and Ragan (1999) enumerate such considerations. First, education should be oriented toward problem-solving both for the instructor and learner. Education is most effective when it begins with a needs assessment of the learners. This information can then guide the educator in designing an effective teaching program.

Thus, from the very outset, learning should be learner centered. Educators should consider the similarities and differences between the learners or students. Educators should be aware of the students’ knowledge base prior to embarking on any new lessons as well as the developmental level of the learners. Educators should take into account the learning environment as well.

Additionally, different styles of learning influenced by cognitive ability should be taken into account. Instructional strategies should be flexible when teaching material from different domains. Declarative knowledge, principle learning, concept learning, and attitude change are examples of different domains which may require different instructional designs, all being centered on the learner.

While instructional strategies addressing the individual needs of the learner are deemed most desirable in many situations, these methods are often judged to be too time consuming or too expensive. As technological use and access increase, however, so do the means of adequately addressing the individual needs of students. One method, in particular, that holds much promise for successful educational application is the adoption of a tailored approach to communication that is individually centered and responsive to the developing needs of the individual.

Tailored Communications

Educational interactions have traditionally aimed to provide as much information as possible within a one-way interaction. This is, unfortunately, often done without considering any specific characteristics of the prospective recipient. Tailored communications, however, are individualized communications intended to reach a specific person that are based on information pertaining to characteristics that are unique to that individual (Kreuter, Strecher, & Glassman, 1999; Rakowski, 1999).

The strength of tailored communications is that they are based upon an individual's needs, interests, and concerns, and that they utilize personally relevant information in the creation of messages and materials to fit that specific individual. This form of communication follows an assessment-based approach in which personal data (e.g., behavioral, attitudinal, or demographic) are obtained related to a desired outcome. Those data are, in turn, used to determine the most appropriate information to meet each person's unique needs. As the level of assessment increases, a higher degree of individualization is possible in the content of the tailored message. Tailored forms of communication also reduce the burden on the individual of having to filter through potentially non-relevant information.

The effectiveness of tailored communications can be explained through established theories of information processing. In particular, Petty and Cacioppo's (1981) Elaboration Likelihood Model holds that individuals are more likely to process information actively and elaborately if they perceive it to be personally relevant. When the arguments used in a message are deemed personally relevant by the recipient, the information contained in the message is processed more deeply, and the expected change in the individual (in terms of increased knowledge and/or behavioral change) will be greater than if the message held little or no relevance to the receiver.

The concept of enhanced or “deep” processing leading to positive cognitive effects is not new to psychology. According to Craik and Lockhart (1972; Craik, 1979), deep (i.e., meaningful) processing of information leads to retention that is more permanent than shallow (i.e., sensory) forms of processing. In an application of this levels-of-processing approach, Rogers, Kuiper, and Kirker (1977) found that information deemed personal in nature is more easily recalled, via the self-reference effect. As an explanation for the self-reference effect, Bellezza (1984; Bellezza & Hoyt, 1992) suggests that the “self” is a rich structure of internal cues to which new information can readily be associated. Tailored communications eliminate information that is not pertinent to the individual recipient and instead focus on information that the individual indicates as being important. As such, tailored messages address the specific interests, concerns, and needs of a single individual. The individual is therefore more likely to attend to the tailored information and thoughtfully consider it because it is viewed as personally relevant.

While the tailoring approach has been found successful in bringing about behavioral change within a number of areas such as weight loss (Kreuter, Bull, Clark, & Oswald, 1999), smoking cessation (Kreuter & Holt, 2001), the adoption of healthy eating behaviors (Davis, 2008), and advertising (Peppers & Rogers, 1993), a formalized application of such an approach is lacking in mainstream education. Given the nature of education as a structured exercise in communication and the enhanced technological tools at our disposal, the development and examination of an adjunct to traditional in-class instruction that is founded in the tailoring approach is called for.

The Present Study

The present study was an investigation of the effectiveness of an individualized class website serving as an adjunct to traditional in-class instruction in improving class performance, class attendance, and the overall class experience. Participating individuals were enrolled in either an undergraduate-level cognitive psychology course consisting of in-class instruction and a generic, non-individualized class website or in a similar course with the same in-class instruction and a class website that contained highly individualized information (i.e., individual feedback on test performance, personalized study suggestions based on test performance, and a personal record of class attendance). Students in both sections of the course were unaware of the differences in instruction between sections.

It was hypothesized that individuals receiving individualized feedback from the class website would 1) perform better overall in the course (i.e., have a higher class average than those individuals in the non-individualized section), 2) demonstrate greater improvement across individual tests during the term, 3) have significantly fewer class absences, and 4) have a qualitatively better class experience than those individuals in the non-individualized section.

Method

Participants

In total, 40 students (14 male, 26 female) enrolled in one of two sections of an undergraduate-level cognitive psychology course participated in the present study. The control section consisted of traditional in-class instruction with the addition of a generic, non-individualized class website. The test section consisted of the same in-class instruction with the addition of a class website containing highly individualized content pertinent to each student's own performance. In the control section there were 19 student participants (8 male and 11 female), and in the test section there were 21 student participants (6 male and 15 female). The average age of participants was 25.0 years (SD = 4.45 years) in the control section and 23.1 years (SD = 3.05 years) in the test section.

Procedure

At the onset of the term, students in both sections were informed verbally that a password-protected class website would serve as an adjunct to their traditional in-class instruction. All students were told that this website would contain within it the course syllabus, a running class schedule, and information concerning their test performance and class attendance. Students were told that the class website was for their benefit and that visiting the website was completely voluntary; no points would be added to or taken away from their class average based on accessing or failing to access the website. Also, all students were presented with and signed a document of informed consent detailing, in writing, the same information presented verbally.

During the term, student knowledge was assessed with five non-cumulative multiple-choice examinations containing questions that were definitional, conceptual, or applied in nature. Class attendance was taken at the start of each class session in both sections. To ensure standardization of course material across sections, the in-class lecture was highly structured and was presented by the same instructor in both sections. While the structural aspects of the class website (i.e., font, color scheme, and layout) were similar in both the general and individualized forms of the site, the content contained within each form differed in its level of personalization.

General Class Website

Student participants in the control section were given access to a class website that was very general in nature. In particular, upon entering their personal access code, students were presented with the same instructor contact information, course syllabus, and class schedule that all other students received. Automated information regarding their performance on a given exam was provided on the website within 24 hours of completing the exam. This information included a copy of the exam itself, their individual responses to the exam questions, and an indication of whether each response was correct or incorrect. Also within the website was a running average (presented as an online calendar) of the number of students who did not attend a given class session. This running average made no reference to the individual student's attendance, only to the attendance pattern of the class as a whole.

Individualized Class Website

Student participants in the test section were given access to an automated version of the class website that was much more personalized than the site in the control section. Upon entering their individual password access code, students were directed to their individualized website. This website contained some of the same information presented to students in the control condition. For example, students in the test section were presented with the same instructor contact information, course syllabus, and class schedule as were students in the control section. However, the feedback regarding exam performance and the record of class attendance were much more individualized.

Within 24 hours of completing a given exam, students in both sections were presented with a copy of the exam itself and with their individual responses to each question within the exam. Students in the test section, however, were also presented with the correct answer to each question and provided with a brief explanation of why this answer (as opposed to the other answer alternatives) was correct. Students in the test section were presented with positive feedback for each correct answer given as well.

Based on the pattern of student responses across questions within the exam, a brief report was presented to each student regarding their level of success within each type of question (i.e., definitional, conceptual, or applied). This report addressed the individual student by name and contained either feedback regarding their success within a given question type or suggestions for ways to improve their performance on a given question type. For example, if a student performed well (above 85%) on a certain question type, they were presented with a statement indicating their success and provided with encouragement to continue preparing for that particular type of question as they had previously done. Alternatively, if a student performed below the 85% cutoff for a particular question type, they were presented with suggestions for ways to best study that type of information. In the box below is an example of individualized student feedback with appropriate suggestions for improvement.

Individualized attendance information was also provided in the class website for students in the test section. Students in the control section received information regarding average class attendance patterns presented in a calendar format. Students in the test section, however, were presented with (in the same calendar presentation format) notice of which class sessions they personally missed.

John, let me commend you on your overall strong performance on the exam that you took yesterday. You improved from your previous exam by 12 points…good job! In looking over your answers, it seems that you are having the most trouble with the definitional questions. Fortunately, there are some ways that you can improve your performance on this type of question! As you prepare for the next exam, you might…

  • Read the material from the book before class for basic understanding and then again following the class to integrate the class discussion into your developing understanding of the material.
  • Make note cards for all bolded terms in your textbook chapters. This might sound elementary, but quizzing yourself on the terms in this manner is a wonderful way to be successful with definitional questions.
  • Take the time to work on the crossword puzzle exercises that I’ve been giving out in class. Because successful completion of the crossword requires you to know the important terms within a given chapter, this is a wonderful way to prepare for definitional questions on the exams.

These are just a few examples of ways that you can improve your study skills; we will be discussing these in more detail as the term progresses. As always, feel free to contact me if you have any questions or concerns.

(Note: The italicized words and phrases in the examples above represent content that varied by individual and situation.)
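For readers interested in how such automated tailoring might be implemented, the sketch below illustrates the rule-based logic described above (scoring each question type and applying the 85% cutoff). It is a minimal illustration in Python, not the system actually used in the study; the function names, data format, and suggestion text are assumptions for demonstration only.

# Minimal sketch of rule-based feedback tailoring using the 85% cutoff per
# question type described above. Names, data format, and suggestion text are
# illustrative assumptions, not the study's actual implementation.

QUESTION_TYPES = ("definitional", "conceptual", "applied")

# Hypothetical pool of study suggestions keyed by question type.
SUGGESTIONS = {
    "definitional": "Make note cards for all bolded terms and quiz yourself on them.",
    "conceptual": "Summarize each chapter's main ideas in your own words after reading.",
    "applied": "Work through the end-of-chapter application exercises before the exam.",
}

def score_by_type(responses):
    """Return the percentage correct for each question type.

    `responses` is a list of (question_type, is_correct) tuples for one exam.
    """
    totals = {qtype: [0, 0] for qtype in QUESTION_TYPES}  # [correct, attempted]
    for qtype, is_correct in responses:
        totals[qtype][1] += 1
        if is_correct:
            totals[qtype][0] += 1
    return {q: 100.0 * c / n for q, (c, n) in totals.items() if n > 0}

def build_feedback(name, responses, cutoff=85.0):
    """Assemble a personalized feedback message from one student's responses."""
    lines = [f"{name}, here is your feedback for this exam."]
    for qtype, pct in score_by_type(responses).items():
        if pct >= cutoff:
            lines.append(f"You scored {pct:.0f}% on {qtype} questions -- keep preparing for these as you have been.")
        else:
            lines.append(f"You scored {pct:.0f}% on {qtype} questions. {SUGGESTIONS[qtype]}")
    return "\n".join(lines)

# Example: one student's (hypothetical) responses on a short exam.
responses = [("definitional", True), ("definitional", False),
             ("conceptual", True), ("applied", True)]
print(build_feedback("John", responses))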

Open-Ended Summary Information

Upon completion of the final exam, students in both the control and test sections were asked to provide open-ended feedback regarding their thoughts and feelings about the class website and the course itself. No restrictions were placed on the responses that students could provide. Furthermore, it was made clear to all students that this feedback would remain in a sealed envelope until after final class grades had been submitted to the registrar and that their class standing would not be affected either positively or negatively by the responses given.

Results

Of the 40 students participating in the present study, all were enrolled from the beginning of the term and completed the course in its entirety. No significant differences were found between sections in the distribution of males and females (χ²(1) = .803, n.s.). Likewise, no significant difference was found between sections in regard to student age (t(38) = 1.59, n.s.).

Exam Performance

While the mean class average was higher for students in the test section (M = 82.02) than for students in the control section (M = 78.26), the difference in final class averages was not significant (t (38) = -1.52, n.s.). Closer examination of performance across the five individual exams, however, revealed significant differences between classes. An ANOVA procedure revealed a significant course section by exam interaction (F (4, 152) = 3.67, p < .01). Average exam grades for each section are presented in Table 1 and a graphical representation of the distribution of scores across exams by section is presented in Figure 1.

Table 1: Average exam grades

           Control Section        Test Section
           M        SD            M        SD
Exam 1     76.93    12.25         77.11    17.18
Exam 2     78.42     8.67         79.10     8.20
Exam 3     78.93     6.35         82.72     6.52
Exam 4     78.47     5.69         84.43     6.81
Exam 5     78.75     7.85         86.76     6.43


Figure 1. Average exam grades by course section

To determine the point at which the difference in average scores between course sections reached significance, a series of independent-samples t-test procedures was conducted. The Bonferroni error correction procedure was employed to minimize the likelihood that repeated testing would inflate the Type-I error rate. As indicated by these analyses, the difference in exam average between course sections was not significant for the first exam (t(38) = -.036, n.s.), the second exam (t(38) = -.253, n.s.), or the third exam (t(38) = -1.85, n.s.). A significant difference, however, was found in exam averages between sections for the fourth exam (t(38) = -2.98, p < .01) and the fifth exam (t(38) = -3.55, p < .01).
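In outline, the exam-by-exam comparisons above correspond to independent-samples t-tests evaluated against a Bonferroni-adjusted alpha level (following a test of the section-by-exam interaction). The sketch below shows how such follow-up comparisons might be run with SciPy; the score arrays are randomly generated placeholders rather than the study's data.

# Sketch of Bonferroni-corrected independent-samples t-tests across five exams,
# following the analysis described above. The score arrays are placeholders,
# not the study's raw data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Hypothetical per-exam scores: 19 control students and 21 test students per exam.
control_scores = [rng.normal(78, 8, 19) for _ in range(5)]
test_scores = [rng.normal(82, 8, 21) for _ in range(5)]

alpha = 0.05
adjusted_alpha = alpha / 5  # Bonferroni correction for five comparisons

for exam, (c, t) in enumerate(zip(control_scores, test_scores), start=1):
    t_stat, p_value = stats.ttest_ind(c, t)
    verdict = "significant" if p_value < adjusted_alpha else "n.s."
    df = len(c) + len(t) - 2
    print(f"Exam {exam}: t({df}) = {t_stat:.2f}, p = {p_value:.3f} ({verdict})")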

Class Absences

The average numbers of class absences for students in the control and test sections are presented in Table 2 below. Examination of the average number of class absences for students in the two course sections revealed no significant differences either overall across the term (t(38) = .454, n.s.) or at any of the five assessment points during the term (F(4, 152) = .146, n.s.).

Participant Gender and Age

No significant differences were found between male and female participants in either section on their test performance at any point during the term. Likewise, the correlation between participant age and overall grade average was non-significant both across class sections (r(40) = .135, n.s.) and within the control (r(19) = .176, n.s.) and test (r(21) = .257, n.s.) sections when examined independently.


Table 2: Average number of class absences

           Control Section        Test Section
           M        SD            M        SD
Exam 1     .68      .89           .67      .80
Exam 2     .53      .70           .52      .68
Exam 3     .53      .51           .48      .75
Exam 4     .53      .77           .48      .51
Exam 5     .63      .68           .43      .60

Website Access

A significant difference was found between sections in the number of times, on average, student participants accessed the class website. In particular, across the term students in the test section visited the site more often (M = 11.29, SD = 2.35) than did students in the control section (M = 8.21, SD = 1.55) (t (38) = -4.83, p < .001).

Across both sections, a significant correlation was found between the number of times the class website was visited and the final grade average in the course (r (40) = .45, p < .01). When examined by class section, however, a significant correlation between website access and final class average was found for individuals in the test section (r (21) = .56, p < .01), but not for students in the control section (r (19) = .09, n.s.).
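The website-access correlations reported above are simple Pearson coefficients computed overall and within each section. A minimal sketch of such an analysis in SciPy appears below; the per-student records are illustrative placeholders, not the study's data.

# Sketch of the overall and per-section correlations between website visits and
# final class average. The records below are illustrative placeholders.
from scipy.stats import pearsonr

# Hypothetical per-student records: (section, website_visits, final_average).
records = [
    ("control", 7, 74.5), ("control", 9, 80.1), ("control", 8, 77.3), ("control", 6, 75.8),
    ("test", 10, 79.0), ("test", 13, 88.2), ("test", 12, 85.6), ("test", 11, 83.9),
]

def correlate(rows):
    visits = [r[1] for r in rows]
    averages = [r[2] for r in rows]
    return pearsonr(visits, averages)

for label, subset in [("overall", records),
                      ("control", [r for r in records if r[0] == "control"]),
                      ("test", [r for r in records if r[0] == "test"])]:
    r, p = correlate(subset)
    print(f"{label}: r = {r:.2f}, p = {p:.3f}")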

Qualitative Analysis

Open-ended responses made by students in the two sections after taking their final exam revealed a number of noteworthy themes. A number of students in the test section made comments expressing feelings that they were “being treated fairly” (N = 3), “treated like an individual” (N = 8), and that there was “apparent concern for (their) success in the class” (N = 7). These comments, however, were absent from those provided by students in the control section. In fact, the comment most often made by students in the control section was “this class was too difficult” (N = 4).

Discussion

The present study was an examination of the effectiveness of a personalized online adjunct to traditional in-class education. The results indicate that use of a class website containing information tailored to the characteristics, performance, and needs of the individual student was not only successful in bringing about increases in exam performance, but that its use also created an atmosphere of concern for the student that is oftentimes lacking in traditional in-class instruction. By interacting with students as individuals, rather than as the passive recipients of instruction emphasized in much direct instruction, such a tool provides students with concrete means to become more active and effective participants in their education.

While the overall difference in final class average between the test and control sections was not significant, the average for the section receiving the individualized website was nearly 4 points higher. More importantly, when viewed as a resulting letter grade, the final class average for the test section was in the B range while the final average for the control section was a C. It is likely that similarities in average scores on exams given during the initial portion of the term are behind the similarity in overall class averages observed between sections. When examined on an exam-by-exam basis, there is an identifiable trend that differs between sections. Specifically, as the term progressed, individuals in the test section scored significantly higher (on average) on exams than did students in the control section. In fact, students receiving the individualized website were (on average) in the B range while students in the control section were in the C range for the final three (of five total) exams given during the term. The lack of a significant difference between sections on the first exam is not surprising, in that the individualization of website information presented to students in the test section was based on their performance on that exam. In other words, there was no difference in personalization between the two sections until after the first exam and, therefore, no true differences were expected until subsequent examinations. Up to that point, the two sections were identical in terms of information presentation.

Contrary to what was expected at the onset of the term, no difference was observed between students in the two course sections in the number of class absences, either overall or at any assessment point during the term. The average number of absences at each of the assessment points was less than one class period missed for both sections, a finding that, while not significant in terms of statistical evaluation, is one that is welcomed by most educators. Given that class personalization alone did not bring about an increase in class attendance, future research might examine the influence of such personalization combined with consequences (both positive and negative) for one's attendance. Also educationally desirable were the findings that male and female students did not differ in terms of exam scores and that there was no significant relationship between the age of the student and their exam performance.

The difference between sections in the number of times students accessed the class website is likely reflective of the level of personalization that they encountered when visiting the site. Students in the test section visited the class website more often than did students in the control section. Perhaps more importantly, within the test section, the students who visited the class website (which contained the individualized content) most often were those who had higher overall class averages. This relationship, however, was not seen in the control section; website access in the control section (wherein the website did not contain individualized content) was not related to overall class performance. It is important to note that failing to access the class website, or encountering a class website that was general rather than personalized in nature, was not detrimental to performance in the class. Increased access to the individualized content of the website utilized in the test section, however, was clearly associated with a benefit for the student.

Beyond consideration of quantitative analysis and statistical significance, the aspect of the current study that is perhaps most reflective of the true value of the individualization made possible by such a personalized website adjunct is the difference in the reflective responses made by students in the two course sections at the end of the term. Not only were there more responses overall from students in the test section, but these responses were also considerably more expressive and highly positive. For example, responses made by students receiving the individualized website content were consistent in their appreciation of being treated “like an individual,” and many expressed that there was an “apparent concern for (their) success in the class.” While the presentation of individualized content was entirely automated and there was actually no functional difference in the distribution of the website to the students, the fact that the messages presented in the test section reflected and responded to students' individual efforts created the appearance of individual concern. That is not to say that the instructor/researcher has anything less than true concern for the student; it is merely a testament to the power of our technological tools to create a much more “human” interaction than is often expected in the absence of true one-on-one contact.

This study demonstrates the different ways in which an automated website adjunct to traditional in-class instruction can be beneficial to the students we serve. While this study has been successful in highlighting positive outcomes of such an adjunct, both quantitatively in terms of exam grade improvement and qualitatively as reflected in comments made by students at the end of the course term, there is still much research to be conducted on the many ways to hone and improve its effectiveness. For example, increased attention to the student's individual personality, previous knowledge, and learning style would further increase the level of personalization possible. Also, expanded use of multimedia presentation within the website and increased integration of feedback with course content and student performance would undoubtedly enhance the effectiveness of such an online tool. An individual's educational experience is something that is quite personal to them. We have the tools at our disposal not only to highlight this, but to capitalize on it as well.


References

Bellezza, F. S. (1984). The self as a mnemonic device: The role of internal cues. Journal of Personality and Social Psychology, 47, 506-516.

Bellezza, F. S. & Hoyt, S. K. (1992). The self-reference effect and mental cueing. Social Cognition, 10, 51-78.

Craik, F. I. M. (1979). Levels of processing: Overview and closing arguments. In L. S. Cermak & F. I. M. Craik (Eds.), Levels of Processing in Human Memory. Hillsdale, NJ: Erlbaum.

Craik, F. I. M. & Lockhart, R. S. (1972). Levels of processing: A framework for memory research. Journal of Verbal Learning and Verbal Behavior, 11, 671-684.

Davis, S. E. (2008, February). Cultural value orientation and tailored health communications. Paper presented at the 2008 annual meeting of the Society of Cross-Cultural Research, New Orleans, LA.

Kreuter, M. W., Bull, F. C., Clark, E. M., & Oswald, D. L. (1999). Understanding how people process health information: A comparison of tailored and untailored weight loss materials. Health Psychology, 18, 487-494.

Kreuter, M. W. & Holt, C. L. (2001). How do people process health information? Applications in an age of individualized communication. Current Directions in Psychological Science, 10, 206-209.

Kreuter, M. W., Strecher, V. J., & Glassman, B. (1999). One size does not fit all: The case for tailoring print materials. Annals of Behavioral Medicine, 21(4), 276-283.

Mcvittie, J. (2008). Teaching Methods: Direct Instruction. Retrieved July 11, 2008, from http://www.usask.ca/education/coursework/mcvittiej/methods/direct.html.

Peppers, D., & Rogers, M. (1993). The one to one future: Building relationships one customer at a time. New York: Doubleday.

Petty, R. E., & Cacioppo, J. T. (1981). Attitudes and persuasion: Classic and contemporary approaches. Dubuque, IA: William C. Brown Company.

Rakowski, W. (1999). The potential variances of tailoring in health behavior interventions. Annals of Behavioral Medicine, 21(4), 284-289.

Rogers, T. B., Kuiper, N. A., & Kirker, W. S. (1977). Self-reference and the encoding of personal information. Journal of Personality and Social Psychology, 35, 677-688.

Smith, P. L., & Ragan, T. J. (1999). Instructional design (3rd ed.). New York: John Wiley & Sons.

 


Manuscript received 30 Nov 2008; revision received 3 Mar 2009.

This work is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 2.5 License.
