Introduction
Distance learning in higher education is not a
new phenomenon. It has been studied from many
different perspectives, such as instructor or
student willingness to participate in distance
education, evaluation of platforms for distance
education delivery, virtual interaction of teacher
and student, and student retention in web-based
learning.
Many instructor hours are devoted to creating an
"online presence" in distance education: a
psychological perception among students that the
instructor is ever-present and responsive in an
online class. Without this perception, students
quickly become insecure and tend to drop the class
(Smith, Ferguson, & Caris, 2001). Since most
web-based courses rely primarily on asynchronous
communication to deliver course information to
students, instructors and students do not interact
simultaneously. Instead, messages are posted on a
forum or web page, or sent as e-mail, and a reply is
provided at some unspecified later time. Any
follow-up questions are dealt with through
additional postings or messages, with the requisite
delays. Overall, this process limits the amount and
depth of interaction regarding course materials and
procedures (Wang & Newlin, 2001).
The purpose of this study was to compare a
graduate-level educational research course taught by
direct instruction (traditional in-person lecture)
and by online instruction (distance learning via the
Internet) to determine whether there was a
significant difference in student outcomes and
opinions between the two methods. Comparable results
for direct instruction and online instruction would
support the continued viability of offering
educational research (and other similar courses) by
means of online instruction at the institution
studied, DeSales University (and at other similar
institutions).
Instructors and Online Instruction
From an instructor’s point of view, an online course
requires more precise planning and more time to
prepare instructional materials than direct
instruction (Servonsky, Daniels, & Davis, 2005).
Online instructors must log on to the course web
site at least three or four times a week for several
hours each session, respond to threaded discussion
questions, evaluate assignments, answer questions to
clear up ambiguities, and often spend an inordinate
amount of time communicating by e-mail (Smith et
al., 2001).
Interactions with online instructors via e-mail or
virtual chat demand greater efficiency than open
oral discussion and are perhaps the greatest
limitation of the online delivery method (Gallagher,
Dobrosielski-Vergona, Wingard, & Williams, 2005).
Lee and Busch (2005), in their study of distance
education at The University of North Carolina at
Charlotte, reported that instructors found it easier
in direct instruction than in online instruction to
(a) interact with students, (b) have students
interact and participate in class discussions, and
(c) accommodate different learning styles.
Instructors’ willingness to participate in distance
education was a function of their perception of the
adequacy of training for distance education and of
the recognition received, not of the effort and time
needed to develop course materials for distance
education.
Student Opinions of Online Learning
A study by Ryan (2000) comparing student survey
responses from lecture and online sections of a
University of Oklahoma construction equipment and
methods course produced no evidence of differences
in perceived quality between direct instruction and
online instruction. Students surveyed in both
formats of a gerontology course in the University of
Pittsburgh Dental Hygiene program agreed that the
method of instruction they had chosen was effective
and beneficial (Gallagher et al., 2005). In a study
of Asian open universities by Zhang and Perris
(2004), students had positive perceptions of the
flexibility of the Internet in their courses, of the
ability to share resources, ideas, and answers with
others, and of the equal opportunity to contribute
afforded by the democratic nature of the medium.
In terms of the class as a learning experience, and
for overall ratings of the course and instructor,
Gagne and Shepherd (2001) found no differences
between direct and online instruction in an
introductory graduate accounting class; there was,
however, a significant difference in ratings of
instructor availability. A study by Johnson, Aragon,
Shaik, and Palma-Rivas (1999) of a graduate-level
instructional design course for human resource
professionals at The University of Illinois at
Urbana-Champaign found that students enrolled in the
traditional face-to-face course had a significantly
better experience in terms of communication with
other students in the class, sharing learning
experiences with other students, perceptions of a
sense of community, and being able to work in teams.
Ryan (2000) likewise found in his aforementioned
study that interaction was the greatest weakness of
the class and suggested that mandatory times for
interaction be included in the class format; the
availability of the class on the web at all times
was its greatest strength.
The opportunities to work independently at one’s own
pace and to communicate “anonymously” are two of the
primary advantages of working online. In the
aforementioned Zhang and Perris (2004) study,
students perceived the greatest disadvantages of
online learning to be their greater comfort with
more traditional media and their inexperience with
computers. Johnson et al. (1999) noted that
differences between groups in instructor support
stemmed from the characteristics of instructor
feedback, perceptions of interaction between the
instructor and the students (as assessed using items
covering teaching style), students being kept
informed about their progress in the course, and
student-instructor interactions during the course.
Student Performance in Online Courses
Several studies have revealed no differences between
direct instruction and online delivery methods.
Online students performed similarly to their direct
instruction counterparts in the Gagne and Shepherd
(2001) study of an introductory graduate accounting
class; there was no difference in class project
ratings in the Johnson et al. (1999) study of a
graduate-level instructional design course; final
grades for online and lecture participants were not
significantly different in either offering of the
Ryan (2000) construction equipment and methods
classes; and Fredda (2000) found higher grades for
Internet students in one graduate business course at
Nova Southeastern University, with no significant
difference in grades in two other business courses.
Regarding student characteristics, students
selecting a web-based course format demonstrated
greater motivation and learning success, as measured
by final course grades, completion of assignments,
and knowledge retention over time; trouble spots
included the correlation between student age and
previous experience with online courses (Gallagher
et al., 2005).
Methodology
Quantitative methods were implemented in the
research design. The instrument utilized in the
descriptive research component of the study was the
Instructor and Course Evaluation System (ICES),
developed by the Center for Teaching Excellence at
the University of Illinois at Urbana-Champaign.
According to Christopher Migotsky, Head of
Measurement and Evaluation at the University of
Illinois at Urbana-Champaign Center for Teaching
Excellence, the validity of the ICES rests on the
interpretation and use of its results, based on
concurrent and predictive validity with other
measures of teaching effectiveness such as peer
observations, alumni surveys, and, in this study,
student comments on the open-ended
instructor-designed questionnaire; ICES-rated items
correlate well with these measures. Reliability of
the ICES depends on context (particularly the
specific item and the number of students) and
generally exceeds 0.90 (C. Migotsky, personal
communication, June 14, 2006).
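The internal computations behind the ICES
reliability figures are not detailed here; purely as
an illustration, one common way to estimate
internal-consistency reliability for a set of
Likert-type ratings is Cronbach's alpha, sketched
below in Python (the data and function name are
hypothetical, not part of the ICES):

    import numpy as np

    def cronbach_alpha(ratings) -> float:
        """Cronbach's alpha for a students-by-items matrix of Likert ratings."""
        r = np.asarray(ratings, dtype=float)
        k = r.shape[1]                           # number of items
        item_vars = r.var(axis=0, ddof=1)        # variance of each item
        total_var = r.sum(axis=1).var(ddof=1)    # variance of per-student totals
        return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

    # Hypothetical ratings from five students on three 5-point items
    print(cronbach_alpha([[5, 4, 5], [4, 4, 4], [5, 5, 5], [3, 4, 3], [4, 5, 4]]))

Values above 0.90, like those reported for the ICES,
indicate highly consistent ratings.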
Numerical analysis of the methods of student
evaluation in the course (three quizzes, an oral
presentation, and a written proposal) constituted
the experimental research component of the study. Data
from an open-ended instructor-made questionnaire
provided additional supporting explanations in
free-form responses from students in both delivery
modes (direct instruction and online). The
open-ended questionnaire included the following
items: (a) What did you like about this course?, (b)
What did you dislike about this course?, (c) What
would you specifically change about the course
format?, (d) Was the instructor fair and how can you
substantiate this?, (e) Compare this course to
others you have taken at this college, (f) Compare
this instructor to others you have had in college,
and (g) Do you feel that the grade you will earn in
this course will be an accurate reflection of your
work?
Participants.
Students enrolled in the masters-level educational
research course at DeSales University were the
participants in the study. DeSales University is a
small, private, suburban, Catholic institution
located in Pennsylvania’s Lehigh Valley. Most
students enrolled in the master of education program
are employed full-time as elementary, middle school,
or high school teachers. The educational research
course is required of all master of education
students, meaning that participants could be
enrolled in any of the programs offered at the
university: Academic Standards and Reform, Biology,
Chemistry, English, Mathematics, Special Education,
Teaching English to Speakers of Other Languages
(TESOL), or Technology in Education. Students were
also at various points in their programs of study,
ranging from the initial course (zero graduate
credits earned) to the next-to-last course (27 or
more graduate credits earned). It should be noted
that some students may not have chosen the
educational research course based upon its delivery
method, but rather took it in the semester it was
offered in order to complete their program
requirements in a timely fashion. The
gender breakdown of participants in both course
delivery options was 8 males and 24 females (32
total) in the direct instruction course, and 10
males and 19 females (29 total) in the online
course.
Course details and delivery comparisons.
Students completed the course in either direct
instruction or online format. The course was offered
in the fall 2004 and 2005 semesters via direct
instruction, and online in the spring 2005 and
winter 2006 semesters. The courses were conducted as
similarly to each other as possible considering the
differences in delivery methods. Assessments for
both delivery methods
included three 10-question multiple-choice
quizzes (completed online by students enrolled in
both course formats); a 10-minute oral presentation
(completed in-person on campus on the final day of
the semester by students enrolled in both course
formats); and a written research proposal.
Students in the direct instruction course were
able to view and participate in the instructor’s
lectures in real-time, while students in the online
course viewed the same lecture on video recorded
from the previous semester, either via Internet
streaming video or on a CD-ROM, and participated by
sending the instructor e-mail messages with the
replies sent via e-mail to all students in the
class. Both groups of students participated in blind
reviews of their classmates’ written research
proposals, and both groups of students formally
evaluated their classmates’ oral presentations using
a standard rubric. Students in the direct
instruction course were able to interact with each
other freely, whereas the online course was
facilitated more like a one-on-one independent study
with the instructor. No
discussion board postings or virtual classroom
environments were utilized. Formal instructor office
hours were available to both groups of students, but
online students were encouraged to first e-mail the
instructor to keep the online course true to its
intended format.
Results
A t-test for independent samples was used to
determine whether there was a statistically
significant difference (at an alpha level of .05) in
the response means of the direct instruction
students and the online students for 13 selected
items on the ICES questionnaire. It should be noted
that, in some instances, the data below reflect
different degrees of freedom because students failed
to respond to one or more items on the ICES form.
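For reference, the independent-samples t statistic
used here takes the standard pooled-variance form (a
sketch assuming equal group variances, which is
consistent with the reported degrees of freedom,
e.g., 32 + 29 − 2 = 59 for items with complete
responses):

    t = \frac{M_1 - M_2}{s_p \sqrt{\frac{1}{n_1} + \frac{1}{n_2}}},
    \qquad
    s_p^2 = \frac{(n_1 - 1)s_1^2 + (n_2 - 1)s_2^2}{n_1 + n_2 - 2},
    \qquad
    df = n_1 + n_2 - 2,

where M, s, and n denote each group's mean, standard
deviation, and size, respectively.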
Table 1 shows a comparison of ICES student responses
by course delivery method. The online group (M =
4.35) gave a lower rating than the direct
instruction group (M = 4.72) for the item “The
instructor evaluated my work in a meaningful and
conscientious manner”; the difference was
statistically significant, t(59) = 2.08, p = .042.
The online group (M = 3.38) also gave a lower rating
than the direct instruction group (M = 4.38) for the
item “There was enough student participation for
this type of course”; the difference was
statistically significant, t(51) = 3.10, p = .003.
The remaining 11 of the 13 selected items showed no
statistically significant differences between
groups. Four items, though the differences were not
statistically significant, were rated higher by the
online group than the direct instruction group, and
one item was rated identically by both groups.
Table 1. Comparison of Instructor and Course
Evaluation System (ICES) Student Responses by
Course Delivery Method

                                                          Direct Instruction    Online
Item                                                         M       SD         M       SD        p
Rate the instructor’s overall teaching effectiveness        4.34    0.827      4.39    1.137     0.832
Rate the overall quality of this course                     4.25    0.842      4.26    1.021     0.966
The course was organized                                    4.50    0.749      4.69    0.888     0.253
This course improved your understanding of concepts
  and principles in this field                              4.22    0.870      4.28    1.017     0.796
Your ability to solve real problems in this field
  improved                                                  4.09    0.928      3.93    1.105     0.512
The amount of work required for this course was
  appropriate                                               4.00    1.078      4.00    1.129     1.000
The methods of evaluation reflected content and
  emphasis of the course                                    4.19    0.859      4.00    1.044     0.405
The instructor evaluated my work in a meaningful and
  conscientious manner                                      4.72    0.581      4.35    1.046     0.042*
The instructor was knowledgeable about the subject          4.66    0.827      4.61    0.885     0.793
The instructor motivated me to do my best work              4.53    0.621      4.33    1.091     0.316
There was enough student participation for this type
  of course                                                 4.38    0.976      3.38    1.398     0.003**
The instructor treated me with respect                      4.84    0.454      4.68    0.975     0.256
The instructional materials used in this course were
  excellent                                                 4.26    0.631      4.11    1.089     0.461

Note. Group means determined using a 5-point Likert
scale from Low (1) to High (5). Direct instruction
n = 32; online n = 29. *p < 0.05; **p < 0.01.
Table 2 shows a statistical analysis of
assessments/outcomes by course delivery method. The
online group (M = 75.3) scored lower on quiz 2 than
the direct instruction group (M = 85.9); the
difference was statistically significant, t(60) =
2.72, p = .009. Differences in group means were not
statistically significant for quiz 1, quiz 3, the
oral presentation, the written research proposal, or
the final average.
Table 2. Statistical Analysis of
Assessments/Outcomes by Course Delivery Method

                               Direct Instruction    Online
Assessment                        M       SD         M       SD        p
Quiz #1                          81.6    15.05      84.2    14.51     0.491
Quiz #2                          85.9    13.16      75.3    17.37     0.009**
Quiz #3                          85.3    14.59      83.8    12.22     0.668
Oral Presentation                95.3     3.40      96.0     2.67     0.411
Written Research Proposal        88.2     8.97      91.5     4.74     0.080

Note. Direct instruction n = 32; online n = 29.
**p < 0.01.
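As an illustrative check (not part of the original
analysis), the quiz 2 comparison in Table 2 can be
reproduced from the summary statistics alone,
assuming a pooled-variance two-sample t-test and the
tabled group sizes; a minimal sketch in Python using
SciPy:

    # Reproduce the quiz 2 t-test from Table 2's summary statistics.
    # Assumes equal variances (pooled) and group sizes n = 32 and n = 29.
    from scipy.stats import ttest_ind_from_stats

    t, p = ttest_ind_from_stats(
        mean1=85.9, std1=13.16, nobs1=32,  # direct instruction, quiz 2
        mean2=75.3, std2=17.37, nobs2=29,  # online, quiz 2
        equal_var=True,
    )
    print(f"t = {t:.2f}, p = {p:.3f}")  # approximately t = 2.72, p = 0.009

The result matches the reported t = 2.72, p = .009.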
An analysis of the online delivery group’s responses
on the instructor-designed open-ended survey form
indicated positive responses; specifically, students
(a) liked that the course was well organized and
well prepared from the outset, (b) liked the
inclusion of grading rubrics, (c) felt that the
step-by-step breakdown of assignments made the
course easier, (d) noted that it was beneficial to
be able to “go at my own pace and look over (video)
lectures if needed”, (e) appreciated being able to
create their own schedule or that the online course
fit their personal schedule, (f) felt that the
instructor was helpful and strict but fair, and (g)
felt that the instructor provided good, prompt, and
ample feedback on assignments. Negative responses
included complaints about timed quizzes, trouble
with self-motivation, difficulty understanding some
items/topics without in-person interaction, and
difficulty stemming from a lack of sufficient
computer expertise. A common suggestion was to have
an in-person meeting with the instructor midway
through the online course.
Discussion
The absence of group differences on 11 of the 13
ICES questionnaire items may be interpreted to mean
that a graduate educational research course (and
other similar courses) can be developed, conducted
similarly, and yield similar (or nearly identical)
student outcomes in both direct instruction and
online formats. That the analysis of student
outcomes yielded a significant difference for just
one assessment (quiz 2) also supports this
statement. A graduate-level research course seems to
work well in an online environment when it is set up
and conducted properly. Also of note is that both
groups gave identical ratings when evaluating the
amount of work required for the course. These
findings further support the viability of offering
graduate educational research and other similar
courses in an online environment.
Implications
The implications are apparent for DeSales
University (and similar universities) in particular
and for higher education in general. According to
Distance Education at Degree-Granting Postsecondary
Institutions: 2000–2001 (Waits & Lewis, 2003), 55%
of all 2-year and 4-year institutions offered
college-level, credit-granting distance education
courses at the undergraduate or
graduate/first-professional level, and 52% of
institutions with graduate/first-professional
programs offered such courses at the
graduate/first-professional level.
Limitations and Generalizability
The chief limitation of this study is that the
sample included only students enrolled at DeSales
University. Consumers of this research may or may
not be able to generalize its findings to persons,
settings, and times different from those involved in
the research; generalizations to populations that
are demographically different from the participants
are made at the consumer’s own risk.
Conclusion
As graduate students in higher education become more
technologically savvy and demand ever more
convenience in their post-baccalaureate course work,
online courses will continue to proliferate,
becoming the rule rather than the exception.
Continuing to seek and demand instructional quality
that mirrors the traditional classroom environment
is therefore of increasing importance for
universities as they offer more courses (and entire
degree programs) online. This study supports the
continuance of online course delivery in graduate
education.
References
Buros Institute of Mental Measurements. (n.d.). Test
reviews online. Retrieved June 22, 2006, from
http://buros.unl.edu/buros/jsp/reviews.jsp?item=04001194
Fredda, J. V. (2000). Comparison of selected
student outcomes for internet- and campus-based
instruction at the Wayne Huizenga Graduate School of
Business and Entrepreneurship (Report No.
NSU-RP-R-00-14). Nova Southeastern University
Research and Planning, Fort Lauderdale, FL. (ERIC
Document No. ED453743). Retrieved April 9, 2006,
from EDRS Online.
Gagne, M., & Shepherd, M. (2001). A comparison
between a distance and a traditional graduate
accounting class. T.H.E. Journal, 28(9), 58-65.
Gallagher, J. E., Dobrosielski-Vergona, K. A.,
Wingard, R. G., & Williams, T. M. (2005). Web-based
vs. traditional classroom instruction in
gerontology: A pilot study. Journal of Dental
Hygiene, 79(3), 1-11.
Johnson, S. D., Aragon, S. R., Shaik,
N., & Palma-Rivas, N. (1999). Comparative
analysis of online vs. face-to-face instruction.
Paper presented at the WebNet 99 World Conference on
the WWW and Internet Proceedings, Honolulu, HI,
October 24-30, 1999. (ERIC Document No. ED448722).
Retrieved April 9, 2006, from EDRS Online.
Lee, J., & Busch, P. E. (2005). Factors related to
instructors’ willingness to participate in distance
education. The Journal of Educational Research,
99(2), 109-115.
Ryan, R. C. (2000). Student
assessment comparison of lecture and online
construction equipment and methods classes. T.H.E.
Journal, 27(6), 78-83.
Servonsky, E. J., Daniels, W. L., & Davis, B. L.
(2005). Evaluation of Blackboard as a platform for
distance education delivery. The ABNF Journal, 16(6),
132-135.
Smith, G. G., Ferguson, D., & Caris, M. (2001).
Teaching college courses online versus face-to-face.
T.H.E. Journal, 28(9), 18-26.
Waits, T., & Lewis, L. (2003). Distance education
at degree-granting postsecondary institutions:
2000–2001 (NCES 2003017). U.S. Department of
Education. Washington, DC: National Center for
Education Statistics.
Wang, A. Y., & Newlin, M. H. (2001). Online
lectures: Benefits for the virtual classroom.
T.H.E. Journal, 29(1), 17-24.
Zhang, W., & Perris, K. (2004). Researching the
efficacy of online learning: A collaborative effort
amongst scholars in Asian open universities. Open
Learning, 19(3), 247-264.