According to Christopher Migotsky, Head of
Measurement and Evaluation at the University of
Illinois at Urbana-Champaign Center for Teaching
Excellence, the validity of the ICES lies in the
interpretation and use of its results, based on
concurrent and predictive validity with other
measures of teaching effectiveness such as peer
observations, alumni surveys, and, in this study,
the student comments on the open-ended
instructor-designed questionnaire. ICES-rated items
correlate well with these measures. The reliability of
the ICES depends on context (particularly the
specific item and the number of students) and
generally exceeds 0.90 (C. Migotsky, personal
communication, June 14, 2006).
Numerical analysis of the methods of student
evaluation in the course (three quizzes, an oral
presentation, and a written proposal) comprised the
experimental research component of the study. Data
from an open-ended instructor-designed questionnaire
provided additional supporting explanations in
free-form responses from students in both delivery
modes (direct instruction and online). The
open-ended questionnaire included the following
items: (a) What did you like about this course?, (b)
What did you dislike about this course?, (c) What
would you specifically change about the course
format?, (d) Was the instructor fair and how can you
substantiate this?, (e) Compare this course to
others you have taken at this college, (f) Compare
this instructor to others you have had in college,
and (g) Do you feel that the grade you will earn in
this course will be an accurate reflection of your
work?
Participants.
Students enrolled in the masters-level educational
research course at DeSales University were the
participants in the study. DeSales University is a
small, private, suburban, Catholic institution
located in Pennsylvania's Lehigh Valley. Most
students enrolled in the master of education program
are employed full-time as elementary, middle school,
or high school teachers. The educational research
course is required of all master of education
students, so participants came from any of the
programs offered at the university:
Academic Standards and Reform, Biology, Chemistry,
English, Mathematics, Special Education, Teaching
English to Speakers of Other Languages (TESOL), or
Technology in Education. Students were also
matriculating at various points in their programs of
study, ranging from the initial course (zero
graduate credits earned) to the next-to-last course
(27 or more graduate credits earned). It should be
noted that some students may not have chosen to
complete the educational research course based upon
delivery method, but instead chose it in the
particular semester offered in order to complete
their program requirements in a timely fashion. The
gender breakdown of participants in both course
delivery options was 8 males and 24 females (32
total) in the direct instruction course, and 10
males and 19 females (29 total) in the online
course.
Course details and delivery comparisons.
Students completed the course in either direct
instruction or online format. The course was offered
in the fall 2004 and 2005 semesters via direct
instruction, and online in the spring 2005 and
winter 2006 semesters. The courses were conducted as
similarly to each other as possible considering the
differences in delivery methods. Assessments for
both delivery methods
included three 10-question multiple-choice
quizzes (completed online by students enrolled in
both course formats); a 10-minute oral presentation
(completed in-person on campus on the final day of
the semester by students enrolled in both course
formats); and a written research proposal.
Students in the direct instruction course were
able to view and participate in the instructor's
lectures in real time, while students in the online
course viewed the same lectures on video recorded
during the previous semester, either via Internet
streaming video or on a CD-ROM. Online students
participated by sending the instructor e-mail
messages, with the replies sent via e-mail to all
students in the class. Both groups of students
participated in blind reviews of their classmates'
written research proposals, and both groups formally
evaluated their classmates' oral presentations using
a standard rubric. Students in the direct
instruction course were able to interact with each
other freely, but the online course was facilitated
more like a one-on-one independent study with the
instructor. No
discussion board postings or virtual classroom
environments were utilized. Formal instructor office
hours were available to both groups of students, but
online students were encouraged to first e-mail the
instructor to keep the online course true to its
intended format.
Results
A t-test for independent samples was used to
determine whether there was a statistically
significant difference (at an alpha level of .05) in
the response means of the direct instruction
students and the online students for 13 selected
items on the ICES questionnaire. It should be noted
that in some instances the data below indicate
different degrees of freedom because students failed
to respond to one or more items on the ICES form.
Table 1 shows a comparison of ICES student responses
by course delivery method. The online group (M =
4.35) gave a lower rating than the direct
instruction group (M = 4.72) for the item titled
"The instructor evaluated my work in a meaningful
and conscientious manner"; the difference was
statistically significant, t(59) = 2.08, p = .042.
The online group (M = 3.38) also gave a lower rating
than the direct instruction group (M = 4.38) for the
item titled "There was enough student participation
for this type of course"; the difference was
statistically significant, t(51) = 3.10, p = .003.
The remaining 11 of the 13 selected items did not
show statistically significant differences between
groups. Four items, though the differences were not
statistically significant, were rated higher by the
online group than by the direct instruction group,
and one item was rated identically by both groups.
Table 1. Comparison of Instructor and Course
Evaluation System (ICES) Student Responses by
Course Delivery Method

| Item | Direct Instructionᵃ M (SD) | Onlineᵇ M (SD) | p |
| --- | --- | --- | --- |
| Rate the instructor's overall teaching effectiveness | 4.34 (0.827) | 4.39 (1.137) | 0.832 |
| Rate the overall quality of this course | 4.25 (0.842) | 4.26 (1.021) | 0.966 |
| The course was organized | 4.50 (0.749) | 4.69 (0.888) | 0.253 |
| This course improved your understanding of concepts and principles in this field | 4.22 (0.870) | 4.28 (1.017) | 0.796 |
| Your ability to solve real problems in this field improved | 4.09 (0.928) | 3.93 (1.105) | 0.512 |
| The amount of work required for this course was appropriate | 4.00 (1.078) | 4.00 (1.129) | 1.000 |
| The methods of evaluation reflected content and emphasis of the course | 4.19 (0.859) | 4.00 (1.044) | 0.405 |
| The instructor evaluated my work in a meaningful and conscientious manner | 4.72 (0.581) | 4.35 (1.046) | 0.042* |
| The instructor was knowledgeable about the subject | 4.66 (0.827) | 4.61 (0.885) | 0.793 |
| The instructor motivated me to do my best work | 4.53 (0.621) | 4.33 (1.091) | 0.316 |
| There was enough student participation for this type of course | 4.38 (0.976) | 3.38 (1.398) | 0.003** |
| The instructor treated me with respect | 4.84 (0.454) | 4.68 (0.975) | 0.256 |
| The instructional materials used in this course were excellent | 4.26 (0.631) | 4.11 (1.089) | 0.461 |

Note. Group means determined using a 5-point Likert scale
from Low (1) to High (5).
ᵃn = 32. ᵇn = 29. *p < .05. **p < .01.
Table 2 shows a
statistical analysis of assessments/outcomes by
course delivery method. The online group (M = 75.3)
scored lower on quiz 2 than the direct instruction
group (M = 85.9); the difference was statistically
significant, t(60) = 2.72, p = .009. Differences in
group means were not statistically significant for
quiz 1, quiz 3, the oral presentation, the written
research proposal, and the final average.
Table 2. Statistical Analysis of
Assessments/Outcomes by Course Delivery Method

| Assessment | Direct Instructionᵃ M (SD) | Onlineᵇ M (SD) | p |
| --- | --- | --- | --- |
| Quiz #1 | 81.6 (15.05) | 84.2 (14.51) | 0.491 |
| Quiz #2 | 85.9 (13.16) | 75.3 (17.37) | 0.009** |
| Quiz #3 | 85.3 (14.59) | 83.8 (12.22) | 0.668 |
| Oral Presentation | 95.3 (3.40) | 96.0 (2.67) | 0.411 |
| Written Research Proposal | 88.2 (8.97) | 91.5 (4.74) | 0.080 |

ᵃn = 32. ᵇn = 29. **p < .01.
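As a rough check on the quiz 2 result, the independent-samples t statistic can be recomputed from the summary statistics in Table 2 (group means, standard deviations, and sizes) using the standard pooled-variance formula. This is a minimal, stdlib-only Python sketch, not the authors' actual analysis; because the published means and standard deviations are rounded, it reproduces the reported t = 2.72 only approximately, and the pooled formula with complete data gives n1 + n2 - 2 = 59 degrees of freedom.

```python
import math

def pooled_t(m1, s1, n1, m2, s2, n2):
    """Pooled-variance t statistic for two independent samples.

    Returns (t, df), where df = n1 + n2 - 2.
    """
    df = n1 + n2 - 2
    pooled_var = ((n1 - 1) * s1**2 + (n2 - 1) * s2**2) / df
    se = math.sqrt(pooled_var * (1 / n1 + 1 / n2))
    return (m1 - m2) / se, df

# Quiz #2 summary statistics from Table 2
t, df = pooled_t(85.9, 13.16, 32, 75.3, 17.37, 29)
print(round(t, 2), df)  # close to the reported t = 2.72 (rounded inputs)

# The two-tailed critical value for alpha = .01 at df = 59 is about 2.66,
# so the quiz 2 difference is significant at the .01 level, as reported.
print(t > 2.66)
```

The same formula applied to the ICES items in Table 1 yields comparable (approximate) agreement with the reported statistics, subject to the nonresponse-adjusted degrees of freedom noted above.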
An analysis of the
online delivery group's responses on the
instructor-designed open-ended survey form indicated
positive responses; specifically, students (a) liked
that the course was well organized and well prepared
at the onset, (b) liked the inclusion of grading
rubrics, (c) felt that the step-by-step breakdown of
assignments made the course easier, (d) noted that
it was beneficial to be able to "go at my own pace
and look over (video) lectures if needed", (e)
appreciated being able to create their own schedules
or felt that the online course fit their personal
schedules, (f) felt that the instructor was helpful
and strict but fair, and (g) felt that the
instructor provided good, prompt, and ample feedback
on the assignments. Negative responses included
complaints about timed quizzes, trouble with
self-motivation, difficulty understanding some
items/topics without in-person interaction, and
difficulty caused by a lack of sufficient computer
expertise. A common suggestion was to have an
in-person meeting with the instructor midway through
the online course.
Discussion
The absence of group differences in 11 of the 13
items on the ICES questionnaire may be interpreted
to mean that a graduate educational research course
(and other similar courses) can be developed,
conducted similarly, and yield similar (or nearly
identical) student outcomes in both direct
instruction and online formats. That the comparison
of student outcomes yielded a significant difference
for just one measure (quiz 2) also supports this
interpretation. A graduate-level research course
seems to work well in an online environment when it
is set up and conducted properly. Also of note is
that both groups gave identical ratings when
evaluating the amount of work required for the
course. These findings further support the viability
of offering graduate educational research and other
similar courses to students in an online
environment.
Implications
The implications are apparent for DeSales
University (and similar universities) in particular
and for higher education in general. According to
Distance Education at Degree-Granting Postsecondary
Institutions: 2000–2001 (Waits & Lewis, 2003), 55%
of all 2-year and 4-year institutions offered
college-level, credit-granting distance education
courses at the undergraduate or
graduate/first-professional level, and 52% of
institutions with graduate/first-professional
programs offered such courses at the
graduate/first-professional level.
Limitations and Generalizability
The chief limitation is that only students enrolled
at DeSales University were included in the sample.
Consumers of this research may or may not be able to
generalize its findings to persons, settings, and
times different from those involved in the research.
Generalizations from this study to populations that
are demographically different from the participants
are made at the consumer's own risk.
Conclusion
As graduate students in higher education become more
technologically savvy and demand even more
convenience in their post-baccalaureate course work,
online courses will continue to become the rule
rather than the exception. Continuing to seek and
demand instructional quality that mirrors the
traditional classroom environment is then of
similarly increasing importance for universities
offering more courses (and entire degree programs)
online. This study supports the continuance of
online course delivery in graduate education.
References
Buros Institute of Mental Measurements, Test Reviews
Online.
(n.d.). Retrieved June 22, 2006, from
http://buros.unl.edu/buros/jsp/reviews.jsp?item=04001194
Fredda, J. V. (2000). Comparison of selected
student outcomes for internet- and campus-based
instruction at the Wayne Huizenga Graduate School of
Business and Entrepreneurship (Report No.
NSU-RP-R-00-14). Nova Southeastern University
Research and Planning, Fort Lauderdale, FL. (ERIC
Document No. ED453743). Retrieved April 9, 2006,
from EDRS Online.
Gagne, M., & Shepherd, M. (2001). A
comparison between a distance and a traditional
graduate accounting class. T.H.E. Journal, 28(9),
58-65.
Gallagher, J. E., Dobrosielski-Vergona, K. A.,
Wingard, R. G., & Williams, T. M. (2005). Web-based
vs. traditional classroom instruction in
gerontology: A pilot study. Journal of Dental
Hygiene, 79(3), 1-11.
Johnson, S. D., Aragon, S. R., Shaik,
N., & Palma-Rivas, N. (1999). Comparative
analysis of online vs. face-to-face instruction.
Paper presented at the WebNet 99 World Conference on
the WWW and Internet Proceedings, Honolulu, HI,
October 24-30, 1999. (ERIC Document No. ED448722).
Retrieved April 9, 2006, from EDRS Online.
Lee, J., & Busch, P. E. (2005). Factors related to
instructors' willingness to participate in distance
education. The Journal of Educational Research,
99(2), 109-115.
Ryan, R. C. (2000). Student
assessment comparison of lecture and online
construction equipment and methods classes. T.H.E.
Journal, 27(6), 78-83.
Servonsky, E. J., Daniels, W. L., & Davis, B. L.
(2005). Evaluation of Blackboard as a platform for
distance education delivery. The ABNF Journal, 16(6),
132-135.
Smith, G. G., Ferguson, D., & Caris, M. (2001).
Teaching college courses online versus face-to-face.
T.H.E. Journal, 28(9), 18-26.
Waits, T., & Lewis, L. (2003). Distance Education
at Degree-Granting Postsecondary Institutions:
2000–2001 (NCES 2003017). Washington, DC: U.S.
Department of Education, National Center for
Education Statistics.
Wang, A. Y., & Newlin, M. H. (2001).
Online lectures: Benefits for the virtual classroom.
T.H.E. Journal, 29(1),
17-24.
Zhang, W., & Perris, K. (2004).
Researching the efficacy of online
learning: A collaborative effort amongst scholars in Asian open
universities. Open Learning, 19(3), 247-264.