MERLOT Journal of Online Learning and Teaching

Vol. 3, No. 1, March 2007


 

Student Satisfaction with a Distance Learning MPA Program: A Preliminary Comparison of On-Campus and Distance Learning Students’ Satisfaction with MPA Courses

  

David Clayton Powell
Graduate Center for Public Policy and Administration
California State University, Long Beach

Long Beach, CA 90840
USA
dpowell@csulb.edu

 

Abstract 

This research explores student perceptions of course quality and instructor effectiveness in a hybrid MPA distance learning program. The program under analysis delivers each course through 21 hours of synchronous computer-mediated instruction, 21 hours of asynchronous computer-mediated instruction, and six hours of on-campus, in-person instruction.

 

Survey data from students who have completed eight (8) courses in this distance learning program (repeated samples, n = 90) are compared to the evaluations of students who have taken the same courses from the same instructors in the on-campus program (n = 100). 

 

The purpose of the research is twofold. First, the research determines whether there is a significant difference in perceptions of course quality and instructor effectiveness between students in the distance learning program and students enrolled in the on-campus program. Second, the research explores student satisfaction with the use of the synchronous and asynchronous computer delivery methods. It is anticipated that students will express satisfaction levels with course quality and instructor effectiveness equal to, or exceeding, the satisfaction levels expressed by students in the on-campus program.

 

            Keywords: distance, education, internet, web, graduate 

 

Introduction

A common issue of contention in public affairs education is the tension between expanding educational access and maintaining academic quality and rigor. There are certainly important benefits and externalities associated with expanding access to undergraduate and graduate level public affairs education. These benefits accrue not only to the prospective students entering public affairs programs but to the greater societal good as well. If we are to heed the call of DiIulio and Kettl (1995) to “rediscover government”, civic education initiatives must be the cornerstone of such an endeavor.  

However, while few question the virtue of expanding access to public affairs education, some academics and practitioners fear the possible erosion of academic quality that may accompany such expansion. Brower and Klay (2000) caution public affairs programs not to put innovations to work without considering the implications for the future. Specifically, they warn that a rush to use new technologies can create substandard programs that may actually detract from academic quality.  

Distance learning public affairs programs have thus found themselves at a crossroads. As new technologies develop that greatly facilitate the creation and delivery of public affairs education, programs must proceed cautiously, responding to the educational needs of the profession rather than to mere market factors. Despite these warnings, the pace toward distance learning public affairs education has quickened, and many authors have devoted their attention to assessing student satisfaction with distance learning technologies (Biner, 1997; Richardson, 2005; Hiltz, 1990). The proportion of Master of Public Administration (MPA) programs that utilize distance learning as a mode of delivery grew steadily from 12% in the early 1990s to 43% in 1996. Today, a cursory review of MPA programs uncovers over 70 programs with a distance learning component to their curricula. In fact, the National Association of Schools of Public Affairs and Administration (NASPAA) website currently lists no fewer than 20 programs that deliver the entire MPA curriculum (or a significant portion thereof) via distance learning technologies.  

It appears that despite reservations from some faculty, distance learning has become a rather well-entrenched aspect of public affairs education at the graduate level. However, distance education should not be discussed in monolithic fashion. Instead, distance education can (and does) encompass a variety of media and modes of delivery, from synchronous to asynchronous (Kidney, 2004; Phelps, 1991; Solomon, 2005; Travis, 2005). For example, Goodsell and Armstrong (2001), in their review of a state public policy course, describe the use of multiple modes of delivery and learning. These modes include weekly televised class meetings, small group discussions, field experiences, asynchronous video delivery, and in-person sessions. The authors describe this as a “converged” approach to distance learning instruction. Scheer (2001), in one of the first truly quantitative reviews of distance MPA education, examined the three dominant methods of delivery: traditional on-campus delivery, video courses, and online instruction. These methods themselves may be multi-dimensional and may include a variety of different approaches. For example, online delivery may be synchronous or asynchronous. An asynchronous method of delivery may utilize a platform such as WebCT or Blackboard as a posting board for asynchronous communication with students. More synchronous delivery modes include the use of web conferencing, virtual chat, or web cams to provide for real-time interaction between instructor and students or between learning sites. Likewise, video delivery may consist of asynchronous, semi-synchronous, or synchronous methods. The delivery of self-paced videotapes and the use of fiber optic technologies are usually categorized as video delivery methods.  

While delivery methods differ, new technologies continue to emerge that prompt programs to migrate from asynchronous or semi-synchronous approaches to platforms that provide for more real-time interaction between participants. That has been the experience of the MPA program at California State University, Long Beach. This paper reports preliminary observations related to the new delivery mechanism utilized in the California State University, Long Beach MPA Distance Learning program (CSULB-MPADL). In Fall 2004, the CSULB-MPADL program replaced its fiber optic broadcasts with a new synchronous computer-assisted learning platform (Centra Symposium). This paper examines the satisfaction levels of distance learning students with the education that they are receiving with this new technology. Specifically, the study compares the satisfaction of distance learning students who receive their primary instruction using this new platform with that of on-campus students in a traditional classroom setting. The findings, while preliminary, do provide a basis for drawing initial conclusions regarding the use of this new platform.

Literature Survey 

The Impetus for Developing Distance Learning Programs 

A great deal of literature exists regarding the benefits and potential advantages of migrating toward distance education in undergraduate and graduate education. Much of this discussion has already been covered in other venues by more skillful hands. Essentially, one of the primary benefits of distance education is the expansion of access to education that it affords students. This expanded access may mean the erosion of existing geographical barriers. For instance, Schuhmann (2000) cites the absence of institutions of higher education in Wyoming as a major impetus for the development of the MPA distance learning program at the University of Wyoming.  

The expansion of distance education may also reduce non-geographic barriers to access. It carries the potential of increasing access for students with physical disabilities by eliminating or reducing the need for these students to travel to on-campus sessions. In the CSULB-MPADL program, two students have recently joined the program for this very reason. Distance learning programs also usually afford students more flexibility in scheduling, thus assisting full-time employees in obtaining the MPA degree. Depending on the modes of delivery utilized in a specific program, students may have the option of completing assignments early and attending asynchronous sessions at a more convenient time than in a traditional on-campus setting. Since many fully employed MPA students travel as part of their official duties, on-campus classes may not be practical options for them. Distance learning courses that utilize computer learning platforms may allow these students to log into the virtual classroom from any remote location and participate in the class.  

Many authors argue that distance learning may also enhance the amount of participation in class sessions (Du, 2005; Hung, 2005; Jewell, 2005; Reagan, 2005; Ritchie, 1989; Yang, 2005). The relative anonymity provided through virtual chats and email may benefit students who are reticent to participate in a traditional classroom setting. Few question the importance of active learning in public affairs education. 

From an administrative perspective, there is great monetary appeal in increasing the number of courses provided through distance learning. As many public universities struggle with access issues and enrollment levels increase, classroom space becomes a precious commodity. Obviously, virtual classrooms help alleviate these concerns, potentially increase enrollments, and thus increase revenue. Depending on the mode of delivery utilized in the distance learning program, the marginal costs of delivering the program vary greatly. While the initial fixed costs may be prohibitive (e.g., purchasing equipment, securing site licenses for software), marginal costs may decrease over time, leading to large net profits for programs and colleges. Therefore, during poor fiscal times, programs may feel pressure to migrate more courses into a distance learning delivery mode. 

Criticisms of Distance Learning Programs 

Many criticisms of distance learning public affairs programs emanate from concerns over academic quality. Brower and Klay (2000) lament the loss of personal contact that may occur in a distance learning environment and express concern about the impact that this lack of personal contact may have on the professional socialization of students. Of course, the amount and type of contact between instructors and students is contingent on the type of technology that is employed; some synchronous modes of delivery provide more opportunities for interaction than others. High dropout rates have also been raised as a concern, but such rates are most probably associated with the specific design of a given program. The CSULB-MPADL program is a cohort-based program in which students enter the program together, complete all coursework as a cohort, and usually graduate in the same semester. Dropout rates in the CSULB-MPADL program are relatively low. Five out of 54 students (9.2%) who enrolled in Fall 2004 and Fall 2005 have subsequently withdrawn from the program, and one of these individuals was dismissed for violating the terms of academic probation.  

As the sophistication of distance learning technology increases, there is a heightened concern that this increase in technology will actually reduce access. Students may need more sophisticated computers and learning resources and, therefore, some segments of society may be precluded from enrolling in these courses. Students in the CSULB-MPADL program are required to have a computer with Windows 2000 or newer, a sound card, broadband Internet access, and a headset and microphone. The only requirement that has presented an obstacle for some students in the program has been the broadband Internet connection. While a dial-up connection can be used, the audio and video streams are best supported by a broadband connection. 

Concerns regarding academic dishonesty have also been cited as problems associated with distance learning programs (Grijalva, 2006). While it may be difficult to monitor academic dishonesty for timed examinations conducted online, random variation of examination questions can partially mitigate concerns over cheating and plagiarism. 

While many of these criticisms are certainly valid points to consider in developing distance learning programs, they do not necessarily preclude the increasing use of distance learning instructional modes in public affairs education. Perhaps the best source of data for assessing the validity of these concerns for the CSULB-MPADL program is the student population enrolled in the distance learning and traditional MPA programs at California State University, Long Beach.  

The MPA Programs 

Traditional MPA Program 

The Graduate Center for Public Policy and Administration (GCPPA) currently offers a traditional on-campus MPA program as well as a distance learning MPA program. The student populations enrolled in the programs are distinct, and it is rare that the GCPPA will allow an on-campus student to enroll in distance learning classes. The traditional on-campus program was established in 1973 and is NASPAA accredited. The traditional program also offers an Option in Public Works and Urban Affairs as well as certificates in areas such as Public Finance. The past five years have marked continued growth in the student population; as of Fall 2005, there were 252 students enrolled in the traditional on-campus program and an additional 20 students in the Public Works and Urban Affairs Option programs. The majority of traditional MPA students are female (58.7%), and 93.1% of students are in good standing. The remaining 6.9% are on probation and must maintain a 3.0 GPA to return to good academic standing. 

The curriculum consists of 36 units: 21 required units and 15 elective units. Required courses include an introductory course and courses in public budgeting, human resource management, organization theory, policy analysis, research methods, and a final directed research course that serves as a capstone for the program. Students must then choose five elective courses to complete their degrees. In Fall 2001, the GCPPA initiated a new portfolio graduation requirement to replace the existing comprehensive examination. In order to graduate, students must complete a four-part portfolio, the cornerstone of which includes examples of their "best" work from all of their required courses. Since the inception of the portfolio requirement, graduation rates have increased from 50% to 67%, and the GCPPA graduates approximately 60-90 students per year from its traditional on-campus program. 

The Distance Learning Program 

In 1998, the GCPPA began a distance learning MPA program. The program is designed on a cohort model in which students begin the program together and progress through it as a group. The sixth distance learning cohort began classes in Fall 2005 and will complete its studies in August 2007. Cohort five is scheduled to complete coursework in August 2006, marking the first time that two cohorts are completing coursework concurrently. The program is designed to take 22 months to complete and consists of the same required courses as the traditional MPA program. Due to logistical necessity, distance learning students are not able to select their five elective courses. Rather, these electives are pre-selected by the Distance Learning Director, and all students in the cohort take the same elective courses.  

The distance learning courses are offered in an accelerated format. Each course is six weeks in duration, with a three-hour on-campus meeting during the first week and a three-hour on-campus meeting during the last week of classes. In addition to these two on-campus sessions, cohorts one through four also received instruction through one synchronous and one asynchronous session per week. The synchronous session consisted of a television broadcast utilizing fiber optic technology: instructors broadcast their lectures to various worksite locations throughout Los Angeles County, and students shared a microphone at each worksite that allowed limited communication with the instructor. This synchronous session was then followed later in the week by an asynchronous session utilizing Blackboard, in which students participated in a discussion board posting session. This afforded students the flexibility to complete assignments and postings during the week and did not require that they remain at their worksite for these asynchronous sessions.  

Beginning with cohort five in Fall 2004, the broadcast sessions were replaced with a synchronous session utilizing a computer-assisted instructional platform (Centra Symposium). The Symposium technology creates a virtual classroom in which students have real-time interaction with the instructor. Each student can communicate with the professor and his or her classmates through audio or text chat. The Symposium platform allows students to indicate their desire to speak and then allows the instructor to open student microphones to facilitate discussion. Symposium is currently being used in both cohorts five and six.  

Student enrollment has increased steadily in the program over the past three years. Cohorts five and six currently have 25 and 20 students respectively, which together represent a 350% increase over cohort four.  
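As a quick arithmetic check (the cohort-four enrollment is not stated above; the value of 10 is only implied by the percentage), the 350% figure works out as follows:

\[ \frac{25 + 20}{n_{4}} - 1 = 3.5 \;\Rightarrow\; n_{4} = \frac{45}{4.5} = 10 \text{ students in cohort four.} \]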

The demographic profile of the students in the distance learning program is relatively similar to the profile of on-campus students. The average distance learning student is 37.2 years old and holds an undergraduate degree in the social sciences. Specifically, 43.3% of students majored in a social science discipline. The second most frequent undergraduate major is business (23.3%), followed by liberal arts and engineering (16.7% each). As expected, they enter the program with above-average undergraduate grade point averages (mean = 3.27), and only two students hold another advanced degree. The student population is evenly distributed on the gender variable, and the average time between earning an undergraduate degree and entering the MPA program is 7 years, 10 months. Many of the students have used this time to rise to management-level positions in government agencies. All of the students are currently employed in full-time positions with either government agencies or non-profit organizations, and 46.7% hold management positions. A plurality (46.7%) work for county agencies and departments; 33.3% work for city governments or city government organizations; 6.7% are federal employees; and 13.3% are employed in non-governmental/non-profit organizations.  

Method 

As discussed earlier, the purpose of this paper is to compare the satisfaction of students enrolled in the CSULB-MPADL program with the satisfaction of their counterparts in the traditional on-campus MPA program. One of the most vexing problems in distance learning research is the lack of comparability between the courses offered in distance learning and traditional MPA programs. While many programs offer distance learning MPA degrees and compile satisfaction indicators for students enrolled in them, it is usually not possible to compare equivalent classes across the two student populations. The CSULB-MPADL program’s curriculum is nearly identical to the curriculum offered to traditional on-campus students. Specifically, the core courses are identical and are often taught by the same instructors. This study explores the student satisfaction scores for four (4) core courses that are offered in both the distance learning and on-campus programs. These courses include: an introductory/foundations course; a course in public budgeting and finance; a course in research methods; and a policy analysis course.  

To ensure comparability of content, the syllabus for each of these courses was examined to assess several factors that could threaten the comparability of the courses. First, the syllabi were examined to determine the number and type of assignments used. Each of these classes utilized essay-based examinations, practical exercises, and a portfolio assignment that required students to integrate salient aspects of the course into a practical assignment. The length of the assignments was equivalent across all four classes. Second, the syllabi were examined to determine the amount of reading required of students. Each course required two textbooks, and the weekly required reading was approximately 100 pages. Third, to determine the style of content delivery, the researcher observed and/or participated in the distance learning versions of three of the four courses. Each course was primarily lecture-based and afforded ample opportunity for student participation. Comparability was also checked through informal interviews with the instructors to determine their assessment of the equivalency of the distance learning and on-campus versions of the course. The primary differences between the distance learning and on-campus class offerings were the medium of delivery and the accelerated nature of the distance learning class. While the on-campus courses encompassed 15 weeks of instruction, distance learning courses involved only 6 weeks of direct instruction. Given the demographic equivalency of the distance learning and on-campus student populations and the equivalency of the course material and instructors, it was highly likely that any differences in student satisfaction between distance learning and on-campus students were attributable to the method of instruction native to distance learning or to the instructor’s use of distance learning technology. 

Once the equivalency of the courses was established, the distance learning section of each course was compared to two on-campus sections of the same course. Each of the courses was offered in the same semester by the same instructor. Data were collected from student evaluation of instruction surveys that were routinely distributed during the last two weeks of each class. The surveys used in the study were conducted in the Spring and Fall 2005 semesters. The survey questions are listed in Appendix A. The first eight questions were university-required questions and appeared on both the distance learning and on-campus surveys. These questions measured student satisfaction with various aspects of the course, including clarity of course objectives, consistency of grading, usefulness of assignments, reasonableness of instructor expectations, instructor preparation, instructor effectiveness in presenting content, instructor availability, and a measure of the overall effectiveness of the instructor. The optional questions (normally questions 9-13) differed between the distance learning and on-campus evaluations. Question 9 on the distance learning survey measured the effectiveness of the Symposium software used in the weekly synchronous, interactive sessions. Question 10 measured student satisfaction with the Blackboard software that was used in the weekly asynchronous sessions. Question 11 assessed the usefulness of the course in improving a student’s understanding of concepts in the field; this question appeared as Question 9 on the on-campus survey. Question 12 measured the students’ assessment of the instructor’s knowledge of the course subject; this question appeared as Question 10 on the on-campus survey. All of the questions, with the exception of the question regarding overall teaching effectiveness (Question 8), were scored on a Likert scale with a score of 5 indicating “Strongly Agree” and a score of 1 indicating “Strongly Disagree”. Student satisfaction with the overall teaching effectiveness of the instructor was also measured on a five-point scale, with a score of 5 indicating “Excellent” and a score of 1 indicating “Poor”.  
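As a brief illustration of how the figures reported in Tables 1-4 are produced, the following sketch (using hypothetical responses, not the study's raw survey records) codes Likert answers numerically and computes a per-question mean and standard deviation:

```python
# Minimal sketch (hypothetical data): compute the per-question mean and
# standard deviation reported in Tables 1-4 from Likert responses coded
# 5 = "Strongly Agree" ... 1 = "Strongly Disagree"; "N/A" answers are excluded.
from statistics import mean, stdev

# Hypothetical responses from one course section to a single survey question.
responses = [5, 5, 4, 5, 3, 4, 5, 5]

print(f"mean = {mean(responses):.2f}")   # 4.50 for this hypothetical sample
print(f"sd   = {stdev(responses):.2f}")  # sample standard deviation, 0.76 here
```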

The surveys were not required, and students were permitted to opt out of submitting them. The on-campus student response rate was 82% (124 out of 151 possible respondents completed the survey). The distance learning student response rate was 44% (42 out of 96 possible respondents completed the survey). This difference in response rates was primarily attributable to the method of survey administration. For on-campus courses, instructors distributed the hard copy of the survey during the last two weeks of classes. According to university policy, instructors may not be present while evaluations are completed, and a proctor seals and signs the envelope before returning it to the Public Policy Office. Beginning in Fall 2004, distance learning students completed their student evaluation surveys online. The Distance Learning Director created the evaluation form using the survey function in Blackboard. Students then logged into the Blackboard site to complete their survey during the two weeks between courses. The self-directed nature of completing the distance learning surveys likely contributed to the lower response rate.  
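The two response rates follow directly from these counts:

\[ \frac{124}{151} \approx 0.82 \qquad\text{and}\qquad \frac{42}{96} \approx 0.44. \]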

Caveat 

Obviously, given the relatively small sample sizes reflected in the tables that follow, statistical significance was difficult to achieve. The data were analyzed with descriptive statistics as well as t-tests for independent sample means with unequal variance. While the tables note the levels of statistical significance of the t-values, caution should be used in asserting statistical significance given these sample sizes. These results should be viewed as exploratory in nature and represent an initial glimpse into the satisfaction levels of distance learning and on-campus students enrolled in equivalent courses. No attempt was made in this study to measure learning outcomes. The focus was merely to identify any important differences in the satisfaction levels of students enrolled in these two programs.  
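A minimal sketch of the analysis just described, using a t-test for independent sample means with unequal variance (Welch's t-test), is shown below; the score vectors are hypothetical illustrations, not the study's actual responses:

```python
# Minimal sketch of the test described above: a t-test for independent sample
# means with unequal variance (Welch's t-test). The Likert score vectors are
# hypothetical illustrations, not the study's actual survey data.
from scipy import stats

distance_learning = [5, 5, 4, 5, 5, 4, 5, 5]      # hypothetical DL ratings for one question
on_campus = [4, 3, 5, 4, 4, 5, 3, 4, 5, 4]        # hypothetical on-campus ratings

t_stat, p_value = stats.ttest_ind(distance_learning, on_campus, equal_var=False)
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")
# Following the tables, a difference would be reported as significant
# only if p <= 0.05; otherwise it is marked "NS".
```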

Results 

Table 1 presents the mean scores and standard deviations for Courses A and B (two sections of an introductory-level on-campus class) and the introductory-level distance learning course. These courses were taught by the same instructor during the same semester. The mean satisfaction levels of distance learning students are higher than those of students in Course A for every question with the exception of Question 4, where on-campus student satisfaction with the reasonableness of the instructor’s expectations is marginally higher than distance learning student satisfaction. With this one exception, distance learning student satisfaction is marginally higher than on-campus satisfaction, although the differences are rather small. The only statistically significant difference in satisfaction levels concerns Question 3: distance learning students express significantly higher satisfaction with the usefulness of assignments than their on-campus counterparts.  

The same distance learning class offering is then compared in Table 1 to another section of the introductory-level on-campus class from the same semester (Course B). Again, the distance learning students express higher satisfaction levels than on-campus students for eight out of the ten questions; the exceptions are Question 2 (satisfaction with the consistency of instructor grading standards) and Question 5 (satisfaction with instructor preparation). However, as was the case with Course A, the differences are all small, and distance learning students are at least as satisfied as their on-campus counterparts on nearly every question.  

Table 2 shows the student satisfaction means for two on-campus research methods courses (Courses C and D) as compared to a distance learning research methods course. The instructor for this course is different from the instructor for Courses A and B. For this instructor, satisfaction scores are lower among distance learning students for all of the questions on each survey with the exception of Question 9, and the differences are minor except for Questions 3 and 10. For Course D, Table 2 shows that on-campus satisfaction levels are higher for every question with the exception of Question 9. Moreover, for Course D, the differences between the satisfaction levels of on-campus students and distance learning students are significant for Questions 3, 5, 7, 8, and 10. The only discernible pattern for the research methods courses is the higher satisfaction of on-campus students regarding the usefulness of assignments and instructor knowledge. On-campus student satisfaction in these two respects is significantly higher than distance learning student satisfaction.
 

Table 1: Mean Student Satisfaction Responses of two On-Campus Introductory MPA Courses and an Introductory Distance Learning MPA Course


Question #                           Distance Learning    On-Campus Course A (n = 18)    On-Campus Course B (n = 11)

#1  Instructor Information           4.87 (.71)           4.58 (.61)    NS*              4.63 (.50)    NS*
#2  Instructor Grading               4.87 (.35)           4.47 (.62)    NS*              4.90 (.30)    NS*
#3  Useful Assignments               4.875 (.35)          3.94 (1.19)   P < .05          4.30 (.82)    NS*
#4  Instructor Expectations          4.37 (.74)           4.47 (.80)    NS*              4.18 (1.07)   NS*
#5  Instructor Preparation           4.87 (.35)           4.70 (.58)    NS*              4.90 (.30)    NS*
#6  Effective Presentation           4.87 (.35)           4.35 (.25)    NS*              4.54 (.68)    NS*
#7  Instructor Availability          4.75 (.46)           4.64 (.63)    NS*              4.54 (.68)    NS*
#8  Overall Teaching Effectiveness   4.87 (.35)           4.35 (.78)    NS*              4.45 (.68)    NS*
#9  Understanding                    4.87 (.35)           4.38 (.85)    NS*              4.36 (.80)    NS*
#10 Instructor Knowledge             4.875 (.35)          4.77 (.42)    NS*              4.72 (.46)    NS*

*NS = Difference is Not Statistically Significant at the p ≤ 0.05 level.
Note: Standard deviations appear in parentheses.

Note: The full text of the survey questions may be found in Appendix A.

 

Table 2: Mean Student Satisfaction Responses of two On-Campus Research Methods MPA Courses and a Research Methods Distance Learning MPA Course


Question #                           Distance Learning (n = 11)    On-Campus Course C (n = 22)    On-Campus Course D (n = 22)

#1  Instructor Information           4.45 (.52)                    4.72 (.70)    NS*              4.85 (.47)    NS*
#2  Instructor Grading               4.36 (.50)                    4.72 (.70)    NS*              4.71 (.64)    NS*
#3  Useful Assignments               4.27 (.47)                    4.77 (.14)    P < .05          4.76 (.52)    P < .05
#4  Instructor Expectations          4.54 (.52)                    4.63 (.90)    NS*              4.71 (.64)    NS*
#5  Instructor Preparation           4.54 (.52)                    4.77 (.14)    NS*              4.90 (.30)    P < .05
#6  Effective Presentation           4.18 (.60)                    4.54 (.18)    NS*              4.71 (.90)    NS*
#7  Instructor Availability          4.45 (.52)                    4.80 (.11)    NS*              4.85 (.35)    P < .05
#8  Overall Teaching Effectiveness   4.27 (.47)                    4.63 (.16)    NS*              4.81 (.68)    P < .05
#9  Understanding                    4.54 (.52)                    4.50 (.80)    NS*              4.50 (.68)
#10 Instructor Knowledge             4.18 (.75)                    4.85 (.65)    P < .05          4.95 (.21)    P < .001

*NS = Difference is Not Statistically Significant at the p ≤ 0.05 level.
Note: Standard deviations appear in parentheses.
Note: The full text of the survey questions may be found in Appendix A. 
 

Table 3 presents the student satisfaction means for the core public budgeting and finance courses (Courses E and F). The instructor for these courses is the same instructor as for Courses C and D. Again, Table 3 shows a pattern of lower levels of satisfaction among distance learning students. The only exception to this involves Question 9, for which distance learning students were slightly more satisfied than their on-campus counterparts in Course F. However, the vast majority of the differences (15 out of 20 comparisons) were small.  

Finally, Table 4 shows the student satisfaction means for a core policy analysis course (Courses G and H). The instructor for these courses is the same instructor who offered the introductory courses discussed earlier. As was the case with the introductory course, distance learning students in the policy analysis course expressed higher levels of satisfaction than their peers in Course G, although for only three of the questions were the differences statistically significant. Surprisingly, this pattern reverses for Course H: satisfaction levels of students in Course H were somewhat higher than those of distance learning students for a majority of the questions. However, none of these differences were statistically significant.   

Table 3: Mean Student Satisfaction Responses of two On-Campus Public Budgeting Courses and a Public Budgeting Distance Learning MPA Course


Question #                           Distance Learning (n = 8)    On-Campus Course E (n = 17)    On-Campus Course F (n = 18)

#1  Instructor Information           4.50 (.75)                   4.94 (.24)    P < .05          4.94 (.23)    NS*
#2  Instructor Grading               4.75 (.46)                   4.94 (.24)    NS*              4.83 (.38)    NS*
#3  Useful Assignments               4.62 (.52)                   4.76 (.43)    NS*              4.83 (.38)    NS*
#4  Instructor Expectations          4.62 (.52)                   4.76 (.43)    NS*              4.88 (.32)    NS*
#5  Instructor Preparation           4.62 (.52)                   5.00 (.00)    P < .05          5.00 (.00)    P < .05
#6  Effective Presentation           4.5 (1.07)                   4.76 (.43)    NS*              4.94 (.23)    NS*
#7  Instructor Availability          4.75 (.71)                   4.93 (.25)    NS*              4.82 (.39)    NS*
#8  Overall Teaching Effectiveness   4.37 (1.19)                  5.00 (.00)    P < .05          4.83 (.38)    NS*
#9  Understanding                    4.5 (1.07)                   4.80 (.41)    NS*              4.47 (.87)    NS*
#10 Instructor Knowledge             4.87 (.35)                   5.00 (.00)    P < .05          4.77 (.54)    NS*

*NS = Difference is Not Statistically Significant at the p ≤ 0.05 level.
Note: Standard deviations appear in parentheses.

Note: The full text of the survey questions may be found in Appendix A.
 

Table 4: Mean Student Satisfaction Responses of two On-Campus Policy Analysis MPA Courses and a Policy Analysis Distance Learning MPA Course


Question #                           Distance Learning (n = 15)    On-Campus Course G (n = 9)    On-Campus Course H (n = 7)

#1  Instructor Information           4.60 (.63)                    4.33 (1.11)   NS*             4.71 (.48)    NS*
#2  Instructor Grading               4.80 (.56)                    4.55 (.88)    NS*             4.85 (.37)    NS*
#3  Useful Assignments               4.60 (.91)                    4.00 (1.11)   NS*             4.71 (.48)    NS*
#4  Instructor Expectations          4.07 (1.53)                   3.44 (1.5)    NS*             4.42 (.78)    NS*
#5  Instructor Preparation           4.80 (.56)                    4.55 (1.01)   NS*             4.85 (.37)    NS*
#6  Effective Presentation           4.40 (.91)                    4.22 (1.30)   NS*             4.57 (.53)
#7  Instructor Availability          5.00 (.00)                    4.55 (.88)    P < .05         4.85 (.37)    NS*
#8  Overall Teaching Effectiveness   4.67 (1.05)                   4.11 (1.05)   P < .01         4.57 (.53)    NS*
#9  Understanding                    4.73 (.46)                    4.22 (1.09)   P < .05         4.42 (.78)    NS*
#10 Instructor Knowledge             5.00 (.00)                    4.55 (1.01)   NS*             4.85 (.37)    NS*

*NS = Difference is Not Statistically Significant at the p ≤ 0.05 level.
Note: Standard deviations appear in parentheses.

Note: The full text of the survey questions may be found in Appendix A. 

After reviewing the tables, it becomes apparent that the satisfaction levels of distance learning and on-campus students are very similar. The differences are negligible for 66 out of the 80 comparisons. The next step is to analyze the questions where larger than anticipated differences are noted to uncover any discernible patterns in the data. Of the fourteen significant differences, three involve Question 3. However, the directionality differs between instructors: while distance learning students are more satisfied with the usefulness of assignments for the instructor in the introductory course, they are less satisfied with the assignments used in the research methods course. Two of the differences involve the instructor preparation question (Question 5) in the public budgeting course. Again, these differences are seen primarily for this instructor in this particular course and do not appear to be part of any larger pattern in the data. Another question that involves significant differences between distance learning and on-campus students is Question 8, which measures the overall teaching effectiveness of the instructor. On this question, the distance learning mean is significantly lower for only one of the two on-campus sections of the public budgeting course (Course E). The opposite is true for Course G (an on-campus policy analysis course), where distance learning students are actually more satisfied with the teaching effectiveness of the instructor than on-campus students. 
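The tallies behind these counts can be reproduced mechanically from the table cells. The sketch below uses a small hypothetical subset of the 80 comparisons (not a full transcription of Tables 1-4) to show the kind of bookkeeping involved:

```python
# Illustrative bookkeeping for the pattern analysis above: for each
# distance-learning vs. on-campus comparison, record the two means and whether
# the t-test was significant, then tally counts and directionality.
# Only a few cells are listed here as a hypothetical subset of the 80 comparisons.
comparisons = [
    # (table / course, question, dl_mean, oc_mean, significant)
    ("Table 1 / Course A", 3, 4.875, 3.94, True),
    ("Table 1 / Course A", 1, 4.87, 4.58, False),
    ("Table 2 / Course C", 3, 4.27, 4.77, True),
]

significant = [c for c in comparisons if c[4]]
dl_favored = [c for c in significant if c[2] > c[3]]

print(f"{len(significant)} of {len(comparisons)} comparisons significant; "
      f"{len(dl_favored)} of those favor distance learning")
```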

The most obvious pattern in the data does not involve the mode of instruction (distance learning or on-campus) but rather the instructors. The instructor for Courses A, B, G, and H received more favorable evaluations from distance learning students on 31 out of 40 possible questions (78%). The content of the question does not appear to affect student satisfaction levels; for this instructor, distance learning students are more satisfied than on-campus students with most aspects of the course measured by the survey, although, as noted previously, most of these differences are not statistically significant. Interestingly, the opposite is true for the instructor of Courses C, D, E, and F. In this instance, the instructor received higher satisfaction ratings from distance learning students on only four out of 40 possible questions (10%).

Discussion  

As mentioned earlier, the small sample sizes and unique setting preclude strong assertions of statistical significance. However, the data do suggest that the instructor may be a more important variable to consider in evaluating student satisfaction with distance learning MPA education than the mode of instructional delivery. The most obvious pattern in the data suggests that distance learning students respond to the instructor more than to the mode of delivery. The mode of delivery for all four distance learning courses is essentially identical: students receive the same number and duration of synchronous and asynchronous sessions, the delivery method for all of the courses in this analysis is primarily lecture-based, and the assignments require equivalent amounts of work and academic rigor. The primary variation between these courses is the instructor. Both instructors are tenured full-time faculty members with more than ten years of full-time experience teaching in MPA programs. Instructor 1 has taught in the MPA program since its inception in 1998. Instructor 2 has taught in the program since 2001 but has also taught Courses C, D, E, and F in online versions using multiple asynchronous platforms in addition to Blackboard. Instructors 1 and 2 both use Blackboard extensively in their on-campus classes; however, both instructors only began using the synchronous Symposium software in Fall 2004. Therefore, both instructors have limited experience using the synchronous software platform but extensive experience utilizing Blackboard. Given these different levels of experience, it is interesting to note whether distance learning students are more satisfied with the synchronous or the asynchronous aspects of the instructors’ delivery. As seen in Table 5, student satisfaction with the asynchronous element of the introductory and policy analysis courses is marginally higher than satisfaction with the synchronous element. However, the satisfaction levels for both elements increase for the policy analysis course, which is offered as the third course in the program. The same is true for the other instructor’s courses. This difference is likely attributable to students becoming more comfortable with the synchronous and asynchronous elements as they progress through the program. However, any advantage that emerges from instructor experience using the technology is probably temporary in nature. 

Table 5. Comparison of On-Campus and Distance Learning Student Satisfaction with Instructional Technology 

Question                                   On-Campus Mean (n = 37)    Distance Learning Mean (n = 42)

Satisfaction with Symposium/Technology     4.8 (.46)                  4.59 (.70)
Satisfaction with Blackboard/Technology    4.8 (.46)                  4.64 (.52)

Note: Standard deviations appear in parentheses.
Note: The full text of the survey questions may be found in Appendix A. 

Conclusion 

The results of this exploratory study indicate that there are few significant differences between the satisfaction levels of distance learning and on-campus students. Despite the different modes of delivery and the accelerated nature of the distance learning program, both distance learning and on-campus students are highly satisfied with the quality and delivery of the four courses analyzed in this study. The only discernible pattern in these preliminary data is the variation in the directionality of the satisfaction levels: satisfaction appears to be more a function of the instructor in the course than of the mode of delivery. As the database expands and more surveys are added to the study, it will be interesting to see if this pattern continues. If it does, the implications may be important for public affairs distance education. Rather than focusing on the merits of the delivery mode itself, perhaps an emphasis should be placed on assisting instructors in adapting to these new technologies. The majority of existing literature on distance learning has focused on the technology used in delivering course content. This research suggests that instructors are still relevant even in a distance education setting. Perhaps future research should explore the instructor component of effective distance education. Additionally, distance learning programs may want to place additional emphasis on recruiting faculty who already possess an interest in computer-assisted instruction. While experience in using the particular method of computer-assisted instruction is certainly useful, it is likely that the initial advantages afforded by this experience are temporary. The desire to utilize this technology to its fullest potential is perhaps more important than prior experience. As this research expands, the debate will hopefully move beyond a discussion of the feasibility or sagacity of distance learning public affairs education and focus more on how to enhance the distance learning experience for MPA students.

 

References 

Biner, P., et al. (1997). The impact of remote-site group size on student satisfaction and relative performance in interactive telecourses. The American Journal of Distance Education. 11:1, 23-33.  

Brower, R. and Klay, W. (2000). Distance learning: Some fundamental questions for public affairs education. Journal of Public Affairs Education. 6:4, 215-231.

DiIulio, J. and Kettl, D. (1995). Fine print: The contract with America, devolution, and the administrative realities of American federalism. Washington, D.C.: Brookings. 

Du, J. (2005). Dynamic online discussion: Task oriented interaction for deep learning. Educational Media International. 42:3, 207-218.  

Goodsell, C. and Armstrong, J. (2001). Teaching state public policy: Distance learning and converged instruction. Journal of Public Affairs Education. 7:2, 91-100. 

Grijalva, T. (2006). Academic honesty and online courses. College Student Journal. 40:1, 180-186. 

Hiltz, S. (1990). Evaluating the virtual classroom. In L. M. Harasim (Ed.), Online education: Perspectives on a new environment. New York: Praeger Publishing.  

Hung, D., et al. (2005). How the internet facilitates learning as dialog. International Journal of Instructional Media. 32:1, 37.  

Jewell, V. (2005). Continuing the classroom community: Suggestions for using online discussion boards. English Journal. 4:1, 83-87.  

Kidney, G. (2004). When the cows come home: A proven path of professional development for faculty pursuing e-learning. THE Journal. 31:11, 12-16.  

Phelps, R., Wells, R., Ashworth, R., and Hahn, H. (1991). Effectiveness and costs of distance education using computer mediated communication. The American Journal of Distance Education. 5:3, 7-19.  

Reagan, C. (2004). Analyzing students’ conversations in chat room discussion groups. College Teaching. 52:4, 143-149.  

Richardson, J. (2005). Students’ perceptions of academic quality and approaches to study in distance education. British Education Research Journal. 31:1, 7-27.  

Ritchie, H. and T. Newby. (1989). Classroom lecture/discussion vs. live televised instruction: A comparison of effects on student performance, attitude, and interaction. The American Journal of Distance Education. 3:3, 36-45.  

Scheer, T. (2001). Exploring the impact of distance learning on MPA students. Journal of Public Affairs Education. 7:2, 101-115.  

Schuhmann, R., R. Cowley, and R. Green. (2000). The MPA and distance education: A story as a tool of engagement. Public Administration and Management: An Interactive Journal. 5:4, 190-213.

Solomon, G. (2005). Shaping e-learning policy. Technology and Learning. 25:10, 26-31. 

Travis, J. and K. Price. (2005). Instructional culture in distance learning. Journal of Faculty Development. 20:2, 99-104.  

Yang, Y., T. Newby, and R. Bill. (2005). Using Socratic questioning to promote critical thinking skills through asynchronous discussion forums in distance learning environments. American Journal of Distance Education. 19:3, 163-181.


Appendix A

 

On-Campus Course Evaluation

 

Instructions: This form is provided for you to use in evaluating the instructor of this course. A summary of the evaluations from all students in this class and this evaluation will be read by your instructor only after the semester grades have been submitted. Please be candid in your responses. These evaluations are used to assess the quality of teaching by this instructor as perceived by students. Responses may be used in making personnel decisions regarding your instructor. IF ANY PERSON(S) HAS TRIED TO INFLUENCE YOUR RATINGS ON THIS EVALUATION THROUGH SUBSTANTIVE ADVICE OR INSTRUCTION AS TO WHAT RATINGS YOU SHOULD GIVE, YOU SHOULD REPORT THAT PERSON(S) TO THE DEPARTMENT CHAIR OR OTHER UNIVERSITY ADMINISTRATOR SO APPROPRIATE ADMINISTRATIVE ACTION MAY BE TAKEN.

 

  1. Instructor provided clear and accurate information regarding course objectives, requirements and grading procedures.

5 = Strongly Agree

            4

            3

            2

            1 = Strongly Disagree

            N/A 

  2. The instructor’s grading was consistent with stated criteria and procedures.

5 = Strongly Agree

            4

            3

            2

            1 = Strongly Disagree

            N/A

 

  3. The instructor provided assignments/activities that were useful for learning and understanding the subject.

5 = Strongly Agree

            4

            3

            2

            1 = Strongly Disagree

            N/A

 

  4. The instructor’s expectations concerning work to be done in this course were reasonable.

5 = Strongly Agree

            4

            3

            2

            1 = Strongly Disagree

            N/A

 

  5. The instructor was well prepared for class.

5 = Strongly Agree

            4

            3

            2

            1 = Strongly Disagree

            N/A

 

  6. The instructor was effective in presenting subject content and materials in the class.

5 = Strongly Agree

            4

            3

            2

            1 = Strongly Disagree

            N/A

 

  7. The instructor was available during posted office hours for conferences about the course.

5 = Strongly Agree

            4

            3

            2

            1 = Strongly Disagree

            N/A

 

  8. Rate the overall teaching effectiveness of the instructor in this course.

5 = Excellent

4

3

2

1 = Poor

N/A

 

  9. This course improved my understanding of concepts and principles in this field.

5 = Strongly Agree

            4

            3

            2

            1 = Strongly Disagree

            N/A

 

  10. The instructor’s knowledge of the subject was excellent.

5 = Strongly Agree

            4

            3

            2

            1 = Strongly Disagree

            N/A 

 

Appendix B 

Distance Learning Course Evaluation 

Instructions: This form is provided for you to use in evaluating the instructor of this course. A summary of the evaluations from all students in this class and this evaluation will be read by your instructor only after the semester grades have been submitted. Please be candid in your responses. These evaluations are used to assess the quality of teaching by this instructor as perceived by students. Responses may be used in making personnel decisions regarding your instructor. IF ANY PERSON(S) HAS TRIED TO INFLUENCE YOUR RATINGS ON THIS EVALUATION THROUGH SUBSTANTIVE ADVICE OR INSTRUCTION AS TO WHAT RATINGS YOU SHOULD GIVE, YOU SHOULD REPORT THAT PERSON(S) TO THE DEPARTMENT CHAIR OR OTHER UNIVERSITY ADMINISTRATOR SO APPROPRIATE ADMINISTRATIVE ACTION MAY BE TAKEN.

1. Instructor provided clear and accurate information regarding course objectives, requirements and grading procedures.  

5 = Strongly Agree

            4

            3

            2

            1 = Strongly Disagree

            N/A 

2. The instructor’s grading was consistent with stated criteria and procedures. 

5 = Strongly Agree

            4

            3

            2

            1 = Strongly Disagree

            N/A

 

  3. The instructor provided assignments/activities that were useful for learning and understanding the subject.

5 = Strongly Agree

            4

            3

            2

            1 = Strongly Disagree

            N/A

 

  4. The instructor’s expectations concerning work to be done in this course were reasonable.

5 = Strongly Agree

            4

            3

            2

            1 = Strongly Disagree

            N/A

 

  5. The instructor was well prepared for class.

5 = Strongly Agree

            4

            3

            2

            1 = Strongly Disagree

            N/A

 

  6. The instructor was effective in presenting subject content and materials in the class.

5 = Strongly Agree

            4

            3

            2

            1 = Strongly Disagree

            N/A

 

  7. The instructor was available during posted office hours for conferences about the course.

5 = Strongly Agree

            4

            3

            2

            1 = Strongly Disagree

            N/A

 

  8. Rate the overall teaching effectiveness of the instructor in this course.

5 = Excellent

4

3

2

1 = Poor

N/A

 

  9. The Symposium sessions enhanced student learning.

5 = Strongly Agree

4

3

2

1 = Strongly Disagree

N/A

 

  10. The Blackboard sessions enhanced student learning.

5 = Strongly Agree

4

3

2

1 = Strongly Disagree

N/A

 

  11. This course improved my understanding of concepts and principles in this field.

5 = Strongly Agree

            4

            3

            2

            1 = Strongly Disagree

            N/A

 

  12. The instructor’s knowledge of the subject was excellent.

 

5 = Strongly Agree

            4

            3

            2

            1 = Strongly Disagree

            N/A


Manuscript received 28 Nov 2006; revision received 19 Feb 2007.
 

This work is licensed under a

Creative Commons Attribution-NonCommercial-ShareAlike 2.5 License

 

 

 