MERLOT Journal of Online Learning and Teaching
Vol. 5, No. 2, June 2009

 

Technology and Pedagogy: The Association between Students’ Perceptions
of the Quality of Online Courses and the Technologies Employed



Karl L. Wuensch, Shahnaz Aziz, Erol Ozan, Masao Kishore, and M. H. N. Tabrizi

East Carolina University
Greenville, NC 27858 USA
WuenschK@ECU.edu

Abstract

A nationwide sample of college and university students completed a survey that asked questions about the pedagogical and technological characteristics of their most recently completed online course. Complete data were obtained from 1,805 students at 46 different universities and colleges in the United States. Findings from multiple regression analyses showed that students’ perceptions of the quality and the difficulty of the course were significantly related to the frequency of use of the various technologies. Specifically, the best predictor of perceived quality of the course was frequency of contact with the instructor via e-mail. These results are interpreted with respect to immediacy and social presence. It is argued that instructors and students can learn to employ technologies such as e-mail and discussion boards in ways that enhance teacher immediacy and social presence beyond what might be expected based on the nature of the medium.

Keywords: E-Mail, Social Presence, Immediacy, Asynchronous, Synchronous, Student Satisfaction, Survey Research.

Introduction

Enrollment in online courses in colleges and universities across the United States has recently grown much more rapidly (9.7%) than enrollment overall (1.5%), with almost 3.5 million students taking at least one online class in the Fall semester of 2006 (Allen & Seaman, 2007). With such growth, there has also been an explosion in the variety of technologies available to support online teaching. The online teacher needs not only to consider how to employ teaching techniques that have been demonstrated to be pedagogically sound, such as the seven principles for good practice in undergraduate education outlined by Chickering and Gamson (1987), but also to determine which technologies to employ.

It is not difficult to find articles in the current literature about the effectiveness of online teaching using particular technologies. Typically these articles involve the investigation of a single technology and present results that suggest it has great potential in the online classroom. Online teachers who have kept abreast of the literature may conclude that they need to employ a dozen new technologies to make their online courses effective. The authors of the current study set out to evaluate the relative pedagogical effectiveness of multiple technologies, using students in online classes across the United States as the judges of the effectiveness of such techniques.

Although the focus of our research is on the effectiveness of different technologies, it should be noted that a technology will be effective only to the extent that it enables or enhances the application of good teaching practices. Batts (2008) recently researched student and instructor opinions about whether Chickering and Gamson's (1987) Seven Principles were evident in their online classes. Both students and teachers frequently reported only two of those principles as prevalent: frequent teacher-student contact and prompt feedback. The other five principles (time on task, active learning, cooperation among students, high expectations, and diverse talents and ways of learning) were less frequently identified as having been practiced. Batts suggested technologies that could be employed to increase the use and practice of these principles. For example, real-time online discussions could be used to improve active learning, and discussion board threads could be employed to foster cooperation among students.

Students' learning and satisfaction with online courses have been shown to be related to their perceptions of the degree of interaction with the instructor and with their fellow students (Arbaugh, 2000; Fredericksen, Pickett, Shea, Pelz, & Swan, 2000; Swan, 2001). While adequate interactivity may be achieved using technologies that support only asynchronous communication, the addition of technologies that support synchronous communication may further augment the degree of interaction among participants in an online class.

The online teacher must decide in what proportions to use asynchronous versus synchronous communications, an issue that is of little, if any, concern to the teacher in the traditional (face-to-face) classroom. Many students, especially those with busy schedules outside of their academic lives, will greatly appreciate the opportunity to be taught on their own schedule rather than the teacher’s schedule. However, most students would also like to be able to interact in real time with their teacher and other students. Online courses that are primarily asynchronous in nature may benefit from the addition of technologies that allow the students to interact in real time with the teacher and other students (Teng & Taveras, 2004-2005).

Technologies that support synchronous interactions between members of an online class come in many different configurations. For instance, they may provide for only one-way synchronous communication (instructor to students) or two-way synchronous communication (instructor to students and students to instructor). In addition, they may support only synchronous communication between instructor and students, or they may support synchronous communication between students as well.

Synchronous communication may be text only, audio only, video only, or some combination of text, audio, and video. A survey of graduate students in online classes, conducted earlier in this century, revealed that only 16% of the students had participated in an online class where there was both two-way audio and two-way video communication (Hijazi, Bernard, Plaisent, & Maguiraga, 2003). An additional 4% had participated in online classes with two-way audio but only one-way video. The authors reported that 64% of the students indicated that their online course provided adequate interactivity between student and instructor; however, it appears that they did not investigate whether there was a relationship between the type of technology employed and student opinions regarding adequacy of interactivity. Students' perceptions of interactivity were, however, associated with their satisfaction with the course.

Although two-way audio and video synchronous communication may be the ideal for maximizing interactivity among participants in online classes, it frequently is not feasible due to the need for great bandwidth to support two-way video conferencing (Disbrow, 2008; Teng & Taveras, 2004-2005). Accordingly, instructors may find it necessary to adapt to having two-way synchronous communication only in the audio modality. It may be possible to achieve adequate two-way video and audio synchronous communication with reduced bandwidth demands by representing the participants as avatars in virtual worlds (Dickey, 2003; Hodge, Tabrizi, Farwell, & Wuensch, 2007). Instant messaging provides another way to support synchronous communications among class participants with relatively lower bandwidth requirements (Kuyath & Winter, 2006; Maushak & Ou, 2007; Nicholson, 2002).

There are also technologies being developed that enable students in online classes to participate in virtual labs. A typical virtual lab facility employs software that simulates the actions that would take place in a physical lab setting. The effectiveness of such virtual labs has been studied for classes in Biology (Stuckey-Mickell & Stuckey-Danner, 2007), Electrical Engineering (Campbell et al., 2004), and Physics (Finkelstein, Adams, Keller, Perkins, & Wieman, 2006). Although students who have completed both virtual and face-to-face labs in the same class have reported that they found the face-to-face lab more effective (Stuckey-Mickell & Stuckey-Danner), others have reported that students who have completed only the virtual lab score as well (Campbell et al.) or even better (Finkelstein et al.) on lab-related examinations. There have also been virtual labs developed where the online students remotely control instruments that are located in a laboratory on campus (Ammari & Slama, 2006; Cancilla & Albon, 2005). There appears to have been no research comparing the relative effectiveness of virtual labs that involve simulations with that of virtual labs that involve remote control of instruments.

Although most research on the effectiveness of technologies used in online education has focused on medium to high tech applications, a few researchers have investigated the effectiveness of a low tech application, e-mail (Eom, Wen, & Ashill, 2006; Heiman, 2008; Lightfoot, 2006; Smith, Ferguson, & Caris, 2003). E-mail has frequently been disparaged as an educational technology. Faculty members often complain that keeping up with e-mail takes too much time and that, in e-mail exchanges, students become inappropriately aggressive and questioning of the instructor’s authority (Smith, Ferguson, & Caris). Much of the content of face-to-face communication (nonverbal and verbal intonation) is missing in e-mail, which may result in miscommunication and low social presence. However, this can be ameliorated if both students and instructors put more thought into composing the e-mail than they typically do when communicating face-to-face (Lightfoot). There is some evidence that students who receive course-related e-mail are more satisfied with their course (Heiman). Timely, meaningful feedback from instructors is associated with both student satisfaction and student learning, and e-mail is one of the technologies that can be effectively deployed to provide such feedback (Eom, Wen, & Ashill).

Online course delivery systems generally include the ability to set up discussion boards and to administer quizzes and examinations for both practice and grading purposes. Quizzes may contribute more to student learning than does participation in discussion boards (Glass & Sue, 2008). It is most convenient not to proctor these quizzes and examinations, but there is some evidence that students learn more and are more on task when proctored tests are employed than when the tests are not proctored (Wellman & Marcinkiewicz, 2004).

It has long been recognized that student learning and satisfaction with distance education courses depend, in part, on the availability and deployment of appropriate educational technologies (Biner, Barone, Welsh, & Dean, 1997; Hiltz, 2005). The purpose of the current research was to investigate the associations between the use of several educational technologies and students’ perceptions of the quality of the online course. Of course, the instructional methods employed with the deployed technologies are also critically important, but most often methods and technologies are confounded, making it difficult (if not impossible) to separate their effects (Joy & Garcia, 2000). The focus of the research presented here is on the characteristics of the technologies used rather than the methods.

Method

The Survey

A team of faculty from a Southeastern university developed a survey instrument designed to measure student attitudes toward various pedagogical characteristics of online courses. Drafts of the survey were shared with members of focus groups comprised of faculty with extensive online teaching experience and graduate students with experience taking online courses. Feedback from these focus group sessions was used to modify the survey prior to its deployment. The complete survey can be viewed online at http://core.ecu.edu/psyc/wuenschk/StudentSurvey.htm.

The sampling procedure is best described as one of convenience, but it also included some elements of cluster random sampling and snowball sampling. Paper surveys were taken to on-campus classes at the authors' university and three other universities within easy driving distance. The authors deliberately selected a diverse sample of classes that had moderate to large enrollments and whose instructors agreed to allow class time for the students to complete the survey. Prior to deploying the online survey, at least one university was randomly selected from each of the 50 states in the United States. When possible, the authors then e-mailed to a large proportion of the students at each university an invitation to participate in the online survey. The invitation explained the purpose of the survey and pointed the students to the URL where the survey was located. Potential respondents were advised that each student who completed the survey would be entered in a raffle, with two respondents randomly chosen to receive a prize (an iPod). Students were also asked to share the invitation with other students who might be interested in participating. The authors had no way of knowing how many students actually read the e-mails that were sent. Students who completed the online survey were asked to provide their university e-mail address. It was explained that these addresses would not be paired with their responses, so their responses would remain confidential. The survey software checked the validity of each respondent's e-mail address, rejecting any that were not valid university e-mail addresses or that had been used with a previous completion of the survey. Aside from being used to verify student status and to prevent individuals from completing the survey more than once, the only use of the e-mail addresses was to contact those students who won prizes.
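The paper does not describe how the survey software implemented this validation. Purely as an illustration, a minimal sketch of such a check in Python, assuming campus addresses end in .edu and that previously used addresses are tracked in memory, might look like this:

```python
import re

# Addresses already used to complete the survey (hypothetical in-memory store)
seen_addresses: set = set()

# Assumed rule: a "valid university e-mail address" ends in .edu
EDU_PATTERN = re.compile(r"^[\w.+-]+@[\w-]+(\.[\w-]+)*\.edu$", re.IGNORECASE)

def accept_address(addr: str) -> bool:
    """Reject malformed or non-.edu addresses, and addresses seen before."""
    addr = addr.strip().lower()
    if not EDU_PATTERN.match(addr):
        return False              # not a well-formed university address
    if addr in seen_addresses:
        return False              # this address already completed the survey
    seen_addresses.add(addr)
    return True

print(accept_address("student@mail.ecu.edu"))  # True
print(accept_address("student@mail.ecu.edu"))  # False: duplicate
print(accept_address("student@gmail.com"))     # False: not a university address
```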

The survey consisted of 86 items. Eleven of the items on the survey concerned the students' perceptions of the pedagogical characteristics of their most recently completed online course. They were asked to rate on a five-point scale (1 = very low, 2 = low, 3 = moderate, 4 = high, and 5 = very high) each of the following pedagogical characteristics: (a) quality of communication between instructor and students, (b) quality of communication with other students, (c) convenience, (d) pleasantness of the experience, (e) aid in learning of complex material, (f) organization of course materials, (g) allowing you to self-pace, (h) accurate evaluation of your learning, (i) amount of effort necessary to complete course, (j) overall understanding of course material, and (k) level of difficulty of course.

Additional items asked the students to indicate how frequently each of 21 educational technologies was employed in that course, on a five-point scale (1 = not at all, 2 = rarely, 3 = sometimes, 4 = often, and 5 = almost all the time). The technologies about which the students were asked included: (a) lecture with video and audio input from the teacher, (b) lecture with audio but no video input from the teacher, (c) scheduled online lecture/meeting [synchronous], (d) archived online lecture/meeting [asynchronous], (e) student participation with live video and audio, (f) student participation with live audio but no video, (g) slide presentations [e.g., PowerPoint], (h) electronic white board, (i) online synchronous chat room, (j) online asynchronous discussion board, (k) online digital drop box [for submission of projects, homework, and assignments], (l) e-mail communication with instructor, (m) telephone communication with instructor, (n) instant messaging, (o) remotely accessible lab/virtual lab, (p) proctored online testing, (q) student presentations in synchronous format, (r) student presentations in asynchronous format, (s) online testing that is not proctored, (t) course-specific Web page, and (u) 3-D virtual classroom resembling face-to-face classroom environment.

Respondents

A total of 4,789 students completed the survey. Respondents were students from 46 different universities and colleges in 26 different states of the United States. The modal student was a female (65%) senior (23%) undergraduate student. Seventy-two percent of the respondents were undergraduate students, 23% were graduate students, and the remaining 5% were non-degree students and others. After culling data from respondents who had not taken at least one online course, there were 1,805 students who provided complete data on their evaluations of the pedagogical characteristics of and technologies employed in their most recently completed online course. The modal student among these 1,805 participants was also a female (70%) senior (28%) undergraduate student.

Results

As shown in Table 1, the most frequently employed technology was e-mail. Many of the technologies were only rarely used.

Table 1. Frequency of Use of Various Technologies in Online Classes

Technology                            M      SD     % Used
E-mail with instructor               4.28   1.02     98
Online digital drop box              3.76   1.48     84
Asynchronous discussion              3.37   1.57     78
Slide presentations                  2.95   1.59     69
Course-specific Web page             2.92   1.62     67
Not-proctored testing                2.78   1.67     60
Asynchronous online lecture          2.59   1.62     58
Synchronous chat                     2.06   1.37     46
Telephone with instructor            2.04   1.36     46
Proctored online testing             2.01   1.47     38
Student asynchronous presentation    1.99   1.41     39
Synchronous online lecture           1.93   1.32     41
Lecture with video and audio         1.77   1.26     34
Electronic white board               1.71   1.24     31
Remote/virtual lab                   1.67   1.17     31
Lecture with audio only              1.57   1.06     28
Instant messaging                    1.51   1.03     25
Student synchronous presentation     1.50   1.02     23
Student has audio and video input    1.35   0.85     18
Student has only audio input         1.33   0.82     17
3-D virtual classroom                1.24   0.77     12

Data from the 11 pedagogical characteristics were subjected to a principal components analysis with varimax rotation. Two well-defined components were obtained (see Table 2). High scores on the first component are associated with the student perceiving the course as having been of high quality. High scores on the second component are associated with the student perceiving the course as having been difficult. Component scores were written to the data file for use in subsequent analyses.
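The paper does not report which statistical package was used for these analyses. As an illustrative sketch only, the same steps, extracting two principal components from the standardized ratings, rotating the loadings to the varimax criterion, and saving component scores, could be carried out in Python as follows (with random stand-in data in place of the actual survey responses):

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

def varimax(loadings, gamma=1.0, max_iter=100, tol=1e-6):
    """Orthogonally rotate a loading matrix to the varimax criterion."""
    p, k = loadings.shape
    R = np.eye(k)
    crit = 0.0
    for _ in range(max_iter):
        L = loadings @ R
        u, s, vt = np.linalg.svd(
            loadings.T @ (L ** 3 - (gamma / p) * L @ np.diag((L ** 2).sum(axis=0)))
        )
        R = u @ vt
        if crit != 0.0 and s.sum() < crit * (1 + tol):
            break
        crit = s.sum()
    return loadings @ R

# Stand-in for the real survey data: 1,805 students x 11 five-point ratings
rng = np.random.default_rng(0)
ratings = rng.integers(1, 6, size=(1805, 11)).astype(float)

Z = StandardScaler().fit_transform(ratings)
pca = PCA(n_components=2).fit(Z)
# Loadings = eigenvectors scaled by sqrt(eigenvalues): variable-component correlations
loadings = pca.components_.T * np.sqrt(pca.explained_variance_)
rotated = varimax(loadings)          # Table 2 reports loadings of this kind
# Least-squares component scores, saved for the later regression analyses
scores = Z @ rotated @ np.linalg.inv(rotated.T @ rotated)
```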

Table 2. Correlations of Components with the Original Pedagogical Variables

Pedagogical variable                   Component 1   Component 2
Pleasantness of the experience             .877          .124
Aid in learning of complex material        .798          .264
Accurate evaluation of learning            .782          .313
Organization of course materials           .756          .200
Overall understanding                      .752          .284
Communication with instructor              .742          .251
Convenience                                .698         -.017
Allowing self-pacing                       .646         -.116
Communication with other students          .509          .336
Difficulty of the course                   .041          .876
Effort necessary to complete course        .203          .833

Multiple regression analysis was employed to predict scores on the two course-characteristics components from the 21 technology-use variables. Students' evaluation of the quality of the course (the first component) was significantly related to the reported frequency of use of the various technologies, F(21, 1783) = 20.21, p < .001, R = .44, CI .95 = .39, .47. As shown in Table 3, students' evaluation of the quality of the course was most strongly related to frequency of e-mail communication with the instructor. Other course characteristics having nontrivial (r ≥ .1) associations with quality of the course included frequent asynchronous discussions, telephone communication with the instructor, use of an online digital drop box, student asynchronous presentations, a course-specific Web page, and asynchronous online lecture. Only two of these predictors had significant unique effects in the full model: e-mail communication with the instructor and asynchronous discussion.

Students' evaluation of the difficulty of the course (the second component) was also significantly related to the reported frequency of use of the various technologies, F(21, 1783) = 8.41, p < .001, R = .30, CI .95 = .24, .33. As shown in Table 4, course difficulty was related in a nontrivial fashion to frequent use of asynchronous discussion, telephone communication with the instructor, student asynchronous presentations, synchronous chat, synchronous online lectures, e-mail communication with the instructor, asynchronous online lectures, lecture with video and audio, student synchronous presentations, use of a digital drop box, and use of a course-specific Web page. In the full model, significant unique effects were found for frequency of asynchronous discussion, telephone communication with the instructor, student asynchronous presentations, synchronous online lectures, lecture with video and audio, and (due to suppression) use of a remote/virtual lab.
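Again purely as a sketch (using statsmodels and random stand-in data, not the study's data), the quantities reported here and in Tables 3 and 4, the omnibus F test, multiple R, standardized coefficients, and Fisher-z confidence intervals for the zero-order correlations, could be computed as follows:

```python
import numpy as np
import statsmodels.api as sm

# Stand-in data: 21 technology-use ratings and one component score per student
rng = np.random.default_rng(1)
tech = rng.integers(1, 6, size=(1805, 21)).astype(float)
quality = rng.standard_normal(1805)            # e.g., the "quality" component score

fit = sm.OLS(quality, sm.add_constant(tech)).fit()
print(fit.fvalue, fit.f_pvalue)                # omnibus F(21, 1783) test
print(np.sqrt(fit.rsquared))                   # multiple R

# Standardized coefficients (the b column of Tables 3 and 4): refit on z-scores
Xz = (tech - tech.mean(axis=0)) / tech.std(axis=0, ddof=1)
yz = (quality - quality.mean()) / quality.std(ddof=1)
betas = sm.OLS(yz, Xz).fit().params

# Zero-order r with a Fisher-z 95% CI, as reported for each predictor
r = np.corrcoef(tech[:, 0], quality)[0, 1]
z, se = np.arctanh(r), 1 / np.sqrt(len(quality) - 3)
ci = np.tanh([z - 1.96 * se, z + 1.96 * se])
```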

Table 3. Predicting Quality of the Online Course from Frequency of Use of Various Technologies

Technology                             b        r      CI .95 for r
E-mail with instructor               .37**    .41**     .37, .45
Asynchronous discussion              .08*     .19**     .15, .23
Telephone with instructor            .04      .16**     .11, .20
Online digital dropbox               .00      .14**     .09, .18
Student asynchronous presentation    .06*     .14**     .09, .18
Course-specific Web page             .04*     .14**     .09, .18
Asynchronous online lecture          .03      .13**     .08, .18
Slide presentations                 -.03      .09**     .04, .14
Not-proctored testing                .03      .09**     .04, .14
Lecture with video and audio         .00      .08*      .03, .13
Lecture with audio only              .05      .08*      .03, .13
Synchronous chat                     .00      .07*      .02, .12
Remote/virtual lab                   .01      .06*      .01, .11
Instant messaging                   -.01      .05*      .003, .10
Synchronous online lecture          -.01      .05*      .003, .10
3-D virtual classroom                .04      .03      -.02, .08
Student synchronous presentation    -.05      .03      -.02, .08
Student has only audio input        -.01      .02      -.03, .07
Electronic white board              -.04      .01      -.04, .06
Proctored online testing            -.05      .01      -.04, .06
Student has audio and video input    .01      .01      -.04, .06

*p < .05. **p < .001.


Table 4. Predicting Difficulty of the Online Course from Frequency of Use of Various Technologies

Technology                             b        r      CI .95 for r
Asynchronous discussion              .11**    .19**     .15, .23
Telephone with instructor            .12**    .18**     .13, .22
Student asynchronous presentation    .08*     .17**     .12, .21
Synchronous chat                     .04      .15**     .10, .19
Synchronous online lecture           .07*     .15**     .10, .19
E-mail with instructor               .04      .14**     .09, .18
Asynchronous online lecture          .02      .13**     .08, .18
Lecture with video and audio         .06*     .11**     .06, .16
Student synchronous presentation     .00      .11**     .06, .16
Online digital dropbox               .02      .10**     .05, .15
Course-specific Web page             .04      .10**     .05, .15
Electronic white board               .01      .09**     .04, .14
Instant messaging                    .00      .09**     .04, .14
Slide presentations                 -.01      .08**     .03, .13
Proctored online testing             .03      .08**     .03, .13
3-D virtual classroom                .00      .07*      .02, .12
Lecture with audio only             -.01      .07*      .02, .12
Student has only audio input        -.02      .06*      .01, .11
Student has audio and video input   -.02      .05*      .003, .10
Not-proctored testing                .01      .04      -.01, .09
Remote/virtual lab                  -.09*     .03      -.02, .08

*p < .05. **p < .001.

Discussion

It is not surprising that e-mail was the most frequently used technology: nearly everyone has it and knows how to use it. The top seven most frequently used technologies in Table 1 are all asynchronous technologies that are available in commonly used course delivery systems such as Blackboard. Although technologies supporting synchronous communication were less frequently employed, nearly half of the students reported having engaged in synchronous chat as part of their most recent online class. The least frequently experienced technologies were those that allow two-way audio and video interactions. This is not surprising given that these technologies can be expensive as well as inconvenient for students and/or instructors.

Frequency of use of each of the 21 technologies was positively correlated with students' perceptions of the quality of the course, although for six of these technologies the association was so small that it fell short of statistical significance. The message to instructors is clear: the more frequently instructors use these technologies to connect with their students, the higher the quality of the course as perceived by students.

It should be noted that the correlation between frequency of use of a technology and students’ perceptions of the quality of the course might be expected to be truncated when a technology has infrequently been deployed. For example, if very few online classes have employed 3-D virtual classrooms, then it is unlikely that frequency of use of 3-D virtual classrooms will be well associated with students’ perceptions of the quality of the course. Put another way, without variance (in the frequency of use of the technology), there cannot be covariance (between frequency of use of the technology and student perception of the quality of the course). Note that the standard deviation for frequency of use of the most frequently employed technology (e-mail, SD = 1.02) is not very much greater than that of the least frequently employed technology (3-D virtual classrooms, SD = 0.77).
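A small simulation illustrates this range-restriction argument; the parameters below are arbitrary and purely for demonstration:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 100_000
use = rng.standard_normal(n)                   # latent frequency of use
quality = 0.4 * use + rng.standard_normal(n)   # quality partly tied to use

r_full = np.corrcoef(use, quality)[0, 1]
# An infrequently deployed technology: most observed scores pile up near
# "not at all", leaving little variance in frequency of use
low = use < np.quantile(use, 0.15)
r_restricted = np.corrcoef(use[low], quality[low])[0, 1]
print(round(r_full, 2), round(r_restricted, 2))  # restricted r is markedly smaller
```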

To investigate the effect of low frequency of deployment on a technology's correlation with perceived quality of the course, the authors redid the analysis after restricting the sample to those students whose class involved the use of an infrequently deployed technology. These exploratory analyses produced results that were not greatly different from those obtained with the total sample. For example, when the analysis was restricted to students whose class did include a 3-D virtual classroom, the technology whose frequency of use was best associated with the perception that the course was of high quality was still e-mail with the instructor (r = .58). Also of note, in this subsample the frequency of use of the 3-D virtual classroom was not significantly associated with perceived quality of the course. Likewise, when the analysis was restricted to students whose classes included both video and audio input from students, frequency of e-mail contact with the instructor was again the variable best associated with perceived quality of the course (r = .41), while frequency of student video and audio input was much less strongly associated with perceived quality (r = .16).

Frequency of use of e-mail with the instructor is the “whopper” association found in this investigation of associations between technologies and perceived quality of the course. While this was not anticipated by the researchers, in hindsight it is not surprising. Previous research has established that students are more satisfied with online courses when instructor presence is salient and the student feels socially present as well. Originally, “social presence” was conceived as a property of the medium, and technologies such as e-mail were thought to be lacking in social presence due to their inability to capture the many nonverbal and vocal cues present in face-to-face communication (Gunawardena & Zittle, 1997; Swan & Shih, 2005). More recently, however, social presence has been conceived as a function of both the medium and the actions and perceptions of the communicators (Gunawardena & Zittle; Swan & Shih). A related construct is that of “immediacy.” Although originally conceived as the use by teachers of nonverbal behaviors that reduce the psychological distance between teacher and students, more recently, immediacy has been conceived as including both nonverbal and verbal components (Freitas, Myers, & Avtgis, 1998; Gorham, 1988). While students in online classes do perceive less nonverbal immediacy than do those in face-to-face classes, they perceive at least as much verbal immediacy as those in face-to-face classes (Freitas et al.).

Although e-mail may never be as good for establishing social presence and teacher immediacy as face-to-face communication, experienced users of e-mail may learn how to substitute for the missing nonverbal and vocal cues, for example, through the use of emoticons (Derks, Bos, & von Grumbkow, 2007). Instructors of online classes could also increase the frequency of verbal immediacy behaviors such as addressing students by name, asking them questions, giving them prompt feedback, and, when appropriate, offering praise and using humor. All of these verbal immediacy behaviors can be accomplished through the skillful use of e-mail. This is not to say that other technologies might not have the potential to be just as good or even better for establishing social presence and verbal immediacy, but at the moment, e-mail appears to be the most pervasive tool used in online learning settings.

The pattern of associations between perceived difficulty of the course and the technology variables was similar to that observed with perceived quality of the course. Generally, the more frequently a technology was employed, the more difficult the course was perceived to be. It would be a mistake to interpret this finding as indicating that these technologies make online courses more difficult. It is more likely that both students and instructors recognize the utility of these technologies and deploy them more frequently when the course material is more difficult (or the instructor more demanding).

Study Limitations and Future Research

The current study is not without its limitations. For instance, although thousands of students from different universities and colleges across the United States were surveyed, well over two-thirds (i.e., 70%) of the participants were women. No attempt was made to sample in a way that would guarantee that male students would respond in the same proportion as female students, and there is no reason to believe that the proportion of female students who received our e-mail differs from the proportion of students in the sampled population who are female. The high percentage of female respondents likely results from women being more responsive to survey inquiries than men, or from women being more likely than men to have taken an online class. Of the 4,789 students who did respond, the female students were much more likely to have taken at least one online class (53%) than were the male students (41%). The fact that the sample primarily consisted of women may have influenced the results, although the ratio of women to men may be representative of the population of students who take online classes. Female students did differ significantly from male students on about half of the variables used in this study. For example, compared to men, women rated their online course as greater in both quality and difficulty. It may be desirable in future research to attempt to recruit a higher proportion of men into the sample.

Similarly, the ethnic composition of the sample was extremely homogeneous, with Caucasians representing 83% of the total sample and 85% of the participants who had completed at least one online class. Although none of the reviewed research has indicated that ethnicity plays a role in the associations between the use of educational technologies and students' perceptions of the quality (and difficulty) of an online course, the sample used in the current study is limited in ethnic diversity and not truly representative of the student population. Hence, in future studies, an effort should be made to recruit more minority students.

Another potential limitation involves the use of a self-report survey. Individuals might have inaccurate perceptions, which can result in erroneous reports of the quality and difficulty of the online course. However, given that perceptions of the study constructs lie in the eye of the beholder, the survey was most appropriately conducted by asking students to indicate their own attitudes. Self-report data are irreplaceable as a means of collecting information on individuals’ perceptions. Given the self-report nature of the study, causal inferences were not made. Future studies could use longitudinal designs to enhance our understanding of how perceptions of quality and difficulty develop or change with experience in taking online courses.

Some researchers might consider the relatively large sample size a limitation of the study. That is, even effects that are negligible in size may be deemed "statistically significant" when the sample size is large. In the current study, however, standardized coefficients (i.e., betas) are reported for the multiple regression analyses, reducing the possibility of confusing "significant" findings with "large" effects. In fact, the relatively large sample size actually yielded more accurate estimates of the associations between the use of several educational technologies and students' perceptions of the quality (and difficulty) of the online course.
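A quick calculation makes the point concrete; the r = .05 value below is chosen to match the smallest correlations flagged as significant in Tables 3 and 4:

```python
import numpy as np
from scipy import stats

# With n = 1,805, even a trivial correlation passes p < .05
n, r = 1805, 0.05
t = r * np.sqrt(n - 2) / np.sqrt(1 - r ** 2)
p = 2 * stats.t.sf(abs(t), df=n - 2)
print(round(t, 2), round(p, 3))  # t ~ 2.13, p ~ .034, yet r^2 is only 0.0025
```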

Acknowledgment

The research described here was supported by NSF grant BCS-0525087.

References

Allen, I. E., & Seaman, J. (2007). Online nation: Five years of growth in online learning. Retrieved from the Sloan Consortium: http://www.sloan-c.org/publications/survey/pdf/online_nation.pdf.

Ammari, A. C., & Slama, J. B. H. (2006). The development of a remote laboratory for internet-based engineering education. Journal for Asynchronous Learning Networks, 10(4), 3-13. Retrieved from http://www.sloan-c.org/publications/jaln/v10n4/v10n4_ammari_member.asp.

Arbaugh, J. B. (2000). Virtual classroom characteristics and student satisfaction with internet-based MBA courses. Journal of Management Education, 24, 32-54.

Batts, D. (2008). Comparison of student and instructor perceptions of best practices in online technology courses. Journal of Online Learning and Teaching, 4, 477-489. Retrieved from https://jolt.merlot.org/vol4no4/batts_1208.htm.

Biner, P., Barone, N., Welsh, K., & Dean, R. (1997). Relative academic performance and its relation to facet and overall satisfaction with interactive telecourses. Distance Education, 18, 318-326.

Campbell, J. O., Bourne, J. R., Mosterman, P. J., Nahvi, M., Rassai, R., Brodersen, A. J., et al. (2004). Cost-effective distributed learning with electronics labs. Journal for Asynchronous Learning Networks, 8(3), 5-10. Retrieved from http://www.sloan-c.org/publications/jaln/v8n3/pdf/v8n3_campbell.pdf.

Cancilla, D. A., & Albon, S. P. (2005). Creating authentic learning activities in pharmaceutical instrumental analysis: Using the integrated laboratory network for remote access to scientific instrumentation. Journal for Asynchronous Learning Networks, 9(2), 4-10. Retrieved from http://www.aln.org/publications/jaln/v9n2/v9n2_cancilla_member.asp.

Chickering, A. W., & Gamson, Z. F. (1987). Seven principles for good practice in undergraduate education. AAHE Bulletin, 38(7), 3-7. Retrieved from http://learningcommons.evergreen.edu/pdf/fall1987.pdf.

Derks, D., Bos, A. E. R., & von Grumbkow, J. (2007). Emoticons and social interaction on the Internet: The importance of social context. Computers in Human Behavior, 23, 842-849.

Dickey, M. D. (2003). 3D virtual worlds: An emerging technology for traditional and distance learning. Proceedings of The Convergence of Learning and Technology, Windows on the Future. Retrieved from http://www.oln.org/conferences/OLN2003/papers/Dickey3DVirtualWorlds.pdf.

Disbrow, L. M. (2008). The overall effect of online audio conferencing in communication courses: What do students really think? Journal of Online Learning and Teaching, 4, 226-233. Retrieved from https://jolt.merlot.org/vol4no2/disbrow0608.pdf.

Eom, S. B., Wen, H. J., & Ashill, N. (2006). The determinants of students’ perceived learning outcomes and satisfaction in university online education: An empirical investigation. Decision Sciences Journal of Innovative Education, 4, 215-235.

Finkelstein, N., Adams, W., Keller, C., Perkins, K., & Wieman, C. (2006). High-tech tools for teaching physics: The physics education technology project. Journal of Online Learning and Teaching, 2, 110-121. Retrieved from https://jolt.merlot.org/vol2no3/finkelstein.pdf.

Fredericksen, E., Pickett, A., Shea, P., Pelz, W., & Swan, K. (2000). Student satisfaction and perceived learning with on-line courses: Principles and examples from the SUNY learning network. Journal for Asynchronous Learning Networks, 4(2), 7-41. Retrieved from http://www.sloan-c.org/publications/jaln/v4n2/v4n2_fredericksen.asp.

Freitas, F. A., Myers, S. A., & Avtgis, T. A. (1998). Student perceptions of instructor immediacy in conventional and distributed learning classrooms. Communication Education, 47, 366-372.

Glass, J., & Sue, V. (2008). Student preferences, satisfaction, and perceived learning in an online mathematics class. Journal of Online Learning and Teaching, 4, 325-338. Retrieved from https://jolt.merlot.org/vol4no3/glass_0908.pdf.

Gorham, J. (1988). The relationship between verbal teacher immediacy behaviors and student learning. Communication Education, 37, 40-53.

Gunawardena, C. N., & Zittle, F. J. (1997). Social presence as a predictor of satisfaction within a computer-mediated conferencing environment. American Journal of Distance Education, 11(3), 8-26.

Heiman, T. (2008). The effects of e-mail messages in a distance learning university on perceived academic and social support, academic satisfaction, and coping. The Quarterly Journal of Distance Education, 9, 237-248.

Hijazi, S., Bernard, P., Plaisent, M., & Maguiraga, L. (2003). Interactive technology impact on quality distance education. Electronic Journal of e-Learning, 1, 35-44. Retrieved from http://www.ejel.org/volume-1-issue-1/issue1-art5-hajazi.pdf.

Hiltz, S. R. (2005). Creating and sustaining effective ALNS. Journal for Asynchronous Learning Networks, 9(2), 11-15. Retrieved from http://www.sloan-c.org/publications/jaln/v9n2/pdf/v9n2_hiltz.pdf.

Hodge, E. M., Tabrizi, M. H. N., Farwell, M. A., & Wuensch, K. L. (2007). Virtual reality classrooms: Strategies for creating a social presence. International Journal of Social Sciences, 2, 105-109. Retrieved from http://www.waset.org/ijss/v2/v2-2-15.pdf.

Joy, E. J., II, & Garcia, F. E. (2000). Measuring learning effectiveness: A new look at no-significant-difference findings. Journal for Asynchronous Learning Networks, 4(1), 33-39. Retrieved from http://www.sloan-c.org/publications/jaln/v4n1/pdf/v4n1_joygarcia.pdf.

Kuyath, S. J., & Winter, S. J. (2006). Distance education communications: The social presence and media richness of instant messaging. Journal for Asynchronous Learning Networks, 10(4), 67-81. Retrieved from http://www.sloan-c.org/publications/jaln/v10n4/v10n4_kuyath_member.asp.

Lightfoot, J. M. (2006). A comparative analysis of e-mail and face-to-face communication in an educational environment. Internet and Higher Education, 9, 217-227.

Maushak, N. J., & Ou, C. (2007). Using synchronous communication to facilitate graduate students’ online collaboration. The Quarterly Review of Distance Education, 8, 161-169.

Nicholson, S. (2002). Socialization in the "virtual hallway": Instant messaging in the asynchronous web-based distance education classroom. Internet and Higher Education, 5, 363-372.

Smith, G. G., Ferguson, D., & Caris, M. (2003). The web versus the classroom: Instructor experiences in discussion-based and mathematics-based disciplines. Journal of Educational Computing Research, 29, 29-59.

Stuckey-Mickell, T. A., & Stuckey-Danner, B. D. (2007). Virtual labs in the online biology course: Student perceptions of effectiveness and usability. Journal of Online Learning and Teaching, 3, 105-111. Retrieved from https://jolt.merlot.org/vol3no2/stuckey.pdf.

Swan, K. (2001). Virtual interaction: Design factors affecting student satisfaction and perceived learning in asynchronous online courses. Distance Education, 22, 306-331.

Swan, K., & Shih, L. F. (2005). On the nature and development of social presence in online course discussions. Journal for Asynchronous Learning Networks, 9(3), 115-136. Retrieved from http://www.sloan-c.org/publications/jaln/v9n3/pdf/v9n3_swan.pdf.

Teng, T.-L., & Taveras, M. (2004-2005). Combining live video and audio broadcasting, synchronous chat, and asynchronous open forum discussions in distance education. Journal of Educational Technology Systems, 33, 121-129.

Wellman, G. S., & Marcinkiewicz, H. (2004). Online learning and time-on-task: Impact of proctored vs. un-proctored testing. Journal for Asynchronous Learning Networks, 8(4), 93-104. Retrieved from http://www.sloan-c.org/publications/jaln/v8n4/pdf/v8n4_wellman.pdf.

 


Manuscript received 25 Feb 2009; revision received 8 May 2009.

This work is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 2.5 License.

 

   