Introduction
Cosumnes River College is a community college in Sacramento, California. It is part of the Los Rios Community College District and was founded in 1970. More than 12,000 students attend. Cosumnes River College has offered online classes since the fall of 2001, and online enrollment has grown rapidly. During that first semester, enrollment totaled 376; at the spring 2010 first census (enrollment at the fourth week of the semester), enrollment totaled 5,289, a 1,307% increase in nine years. Most of the online classes are taught using eLearning, a district-wide service for integrating the Internet with instruction. During the period of this study, the learning management system at the heart of eLearning was Blackboard.
The district provides server support, and application support for faculty is provided by a distance education coordinator. This is a faculty position created to train fellow faculty how to integrate technology with their instruction. The college lacks a rigid structure for determining online course offerings, though before a course can be taught in this modality it must undergo review by the local curriculum committee. The allotment of faculty time to online teaching has heretofore been largely a matter of negotiation between individual teachers and area managers.
Typically, faculty in traditional classrooms focus on the content area of their courses and do not teach study skills (how to take notes, test-taking techniques, etc.). The same can be said for online instructional faculty. The application support and training provided to instructors includes little guidance on helping students succeed in this learning modality, and what guidance there is consists mostly of orienting students to the learning management system. Instructors are encouraged to include an orientation in their initial contact with students at the beginning of a class, but there is no requirement to do so.
Initially the college did not provide organized support for online students. Comments from online faculty about the amount of their time spent helping students with technical concerns prompted the college in fall 2002 to enlist students to provide that support. These student helpers were enrolled in a computer information science course on becoming computer support technicians. This arrangement lasted for two semesters and was abandoned in the summer of 2003, when district management assigned the district's full-time help desk staff the role of providing student support. The help desk answers many questions from students and is particularly supportive in the area of access to the learning management system and basic technical issues. However, questions related to class content are directed to instructors. Only instructors can make changes to online content, allow students to resubmit online assignments, or reset online test attempts.
Server support and application support for faculty and students do not address academic preparation for distance education or the study skills necessary to succeed in this learning modality. One of the college's early online instructors created a new course called Online Student Success (OSS), which is designed to prepare students for this new environment. It is a fully online class, and a sample syllabus for the course is in appendix A.
OSS was first offered in fall 2002 and has two main objectives: it prepares students to succeed in the online learning environment in general, and it exposes them to the college's learning management system so that they become familiar with it. As with any software program, a user who understands Blackboard's "rules of engagement" will have a more positive experience. After taking this class, students should not have technical issues or basic access problems with the system in their other online courses because they are already accustomed to interacting with it. The course works like an in-depth orientation, and by its end students have used each of the tools available in Blackboard. They use the discussion board to create a community with their fellow learners, use each of the available assessment instruments, and explore the different options that instructors have for disseminating content. In other online courses that use this system, students who have met this objective will be able to concentrate on course content and not be distracted by the novelty, learning curve, and frustration involved in learning new software.
Blackboard is not the only software used to support online education, so the second objective of the course is to teach students skills that will help them in any online learning environment. Beyond the basics of where to click and how to log in, the course gives students study skills and engenders an appreciation for the challenges unique to this milieu. Topics include working in groups on a project when fellow learners are never together in the same location, and netiquette, the conventions and informal rules of electronic communication. Students also learn their place in the history of instructional and technological innovation, and to help them balance online learning responsibilities with their other duties, they study time management. By the end of the course, students will know the skills and habits necessary to manage an online learning path effectively, and they will be able to judge whether they want to continue along that path.
Because the motivation for this course was to help students with their other online courses, OSS is typically offered during the middle of the semester. It is a one-unit course that lasts six weeks. Cosumnes River College currently has an eighteen-week semester, and this class is taught during the second six-week term. Scheduling it then provides timely help for students who may be struggling in concurrent online classes.
In spring 2004 the college developed its Distance Education Master Plan to provide organization for its distance education efforts. One of the plan's goals is to create the infrastructure to support a fully distance education general education degree. Filling curriculum gaps and schedule planning is a part of this goal, but no less important is the provision of services to distance education students. Though training in study skills has long been a part of student services, technology-mediated instruction adds a new dimension. This is especially true for online learning, and part of student services is making sure that students are well prepared to succeed in all learning modalities used by the college's instructional programs.
This research was motivated by the questions, “Does OSS fulfill its promise? Does taking this class improve the success rate of students in their other online classes?” This investigation concentrates on what happens before those classes begin. The hypothesis of this project has two parts. First, the students who take this class will be more successful in online classes than students who have not taken the class. Second, these students will also be more successful in online classes after they have succeeded in OSS.
Literature Review
A review of the literature finds little focus on student preparation for online learning. Much of the literature on online education in general (not focused on a case study or particular course) centers on how teachers can create a learning environment that engages students and supports active learning. Some authors deal primarily with how technology can be used to enact good teaching. Horton (2001) discusses how Web design principles and interactive tools can be used to make online learning effective. Others start from the teaching side and then move to technological considerations (Chickering and Ehrmann, 1996; Collison et al., 2000; Elbaum, McIntyre, and Smith, 2002). These works focus on the teacher, and their consideration of students is limited to what online instructors can do to help students within a particular course.
One trend in the literature is to recognize the different role that teachers play in an online course. New names such as “moderator,” “guide,” and “facilitator” are applied to this role, though these works also concentrate on the teacher’s part of the online learning discourse. Palloff and Pratt (2001) argue that the most successful online courses are ones that are centered on learners, not faculty. They extend this model to the design of online programs and believe that highlighting the faculty interest, as opposed to the students’ interest, may be a reason why online students have lower retention rates than on-campus students.
Some authors provide specific tips on helping students in online classes. For example, Chute, Thompson, and Hancock (1999) encourage instructors to conduct hands-on orientation sessions and set up Web pages with frequently asked questions, noting that complete introductions taking place near the beginning of a term help establish familiarity with the online learning tools. They also recommend the establishment of a help desk so that students can get technical difficulties resolved quickly.
Other guides on online program development recognize the special needs of students in this modality, but their focus is also on technology. Moore, Winograd, and Langue (2001) include a list of ten benchmarks for evaluating online courses. One, under the heading “Student Support,” calls for students to be trained in how to use the required technology. Making the interface seamless for learners is recognized as good teaching practice so that students can focus their energies on meeting course learning objectives.
Case studies and studies that derive data from student surveys also tend to emphasize technology-related tasks or instructional design within courses. Vonderwell and Zachariah (2005) conducted a case study on participation and found that technology and interface characteristics are the most important factors in encouraging quality participation. In Chee and Warner’s study (2005), technology training was cited as significant in promoting student satisfaction.
Recent studies have added a focus on the attributes shared by students who are successful as online learners. For example, Berenson et al. (2008) found that students’ emotional intelligence, which the authors recognized as an intrinsic factor, was a significant direct predictor of grade point average in online classes. Yen and Liu (2009) discovered a similar relationship between course success and learner autonomy. In a case study of attendees of an online course in aviation physiology, Artino (2009) found that the educational goals students held prior to enrollment were linked with their value of and satisfaction with that particular course. These studies focus on what successful online students brought with them to the online learning environment.
The general thrust of the student-success literature is that instructional design by professors, technology training for students, and identification of the characteristics correlated with student success are key; little attention so far has been paid to the effectiveness of focused training that prepares students for online learning.
Methodology
Two methodologies were used: a review of historical data and a follow-up survey. Each subsection below provides information on both types of research.
Definitions
The following terms are used throughout this report:
Fully online: This is a course that has been approved by the Cosumnes River College Curriculum Committee for distance education delivery in the online modality. There is no college-wide requirement for on-campus orientations, testing, or other meetings.
Successful: This is a final grade for a course of A, B, C, or CR (credit).
Unsuccessful: This is a final grade for a course of D, F, NC (no credit), I (incomplete), or W (withdrawal, meaning the student dropped the class after the drop deadline during the semester).
Subjects
Historical Data
The subjects of this study are students who enrolled in fully online classes at Cosumnes River College from fall 2003 to fall 2005. This group was divided into four subgroups, each being defined by its relationship to Online Student Success (OSS). Table 1 shows the count of each group:
Table 1: Online Students by Relationship to OSS

| Category | Number |
| --- | --- |
| No OSS | 5,147 |
| OSS Successful | 78 |
| OSS Unsuccessful | 52 |
| OSS Drop | 45 |
Students who never enrolled in OSS make up the “No OSS” group. They are the comparison group for the first hypothesis of this project. Members of "OSS Successful" are the people who enrolled in and passed OSS. Those who did not pass OSS comprise "OSS Unsuccessful." This group does not distinguish between students who completed some of the work and those who did none of the class assignments. In other words, a student who enrolled in the class but did not contact the instructor, complete any assignments, or attend the orientation did not succeed in the class. Finally, the "OSS Drop" students enrolled in the class but dropped before the drop deadline and so have no transcript record of it. As with the OSS Unsuccessful group, no distinction is made between students who completed some work and those who did none.
Though some group sizes are small, the demographic comparison revealed in Table 2 is interesting. The percentage of women was higher among the students with a transcript record of OSS, though women were a majority of all four groups. Among all the groups except the OSS Unsuccessful, white students constituted the largest ethnic group. African American students were the largest ethnic group among the OSS Unsuccessful.
All four groups share a similar age distribution, with the largest segment in each being students in the traditional college age group (younger than 25 years old). The greatest demographic disparity between No OSS students and those who have some involvement with the class has to do with residence. For all three OSS-related groups, a majority of the students live near campus (within the campus ZIP code or an adjacent ZIP code). For the No OSS students the opposite was true: a minority of them lived nearby. This is shown in Table 3.
The stated educational goals for each group of students appear in Table 4. For all four groups, an academic plan that included a transfer to a four-year college was by far the most commonly identified educational goal. The OSS Successful group was more likely to be undecided than the other three groups.
The popularity of transfer is also reflected in the transfer status of the courses selected by students, as shown in Tables 5 and 6. Table 5 presents the number of transfer-level course sections offered during the study period. Table 6 shows that three of the four groups enrolled in transfer-level courses at a rate comparable to the rate at which such sections were offered. Only the OSS Unsuccessful students did not make transfer-level courses as high a priority.
Table 2: Gender and Ethnicity by Research Group

| Demographic | No OSS N | No OSS % | OSS Successful N | OSS Successful % | OSS Unsuccessful N | OSS Unsuccessful % | OSS Drop N | OSS Drop % |
| --- | --- | --- | --- | --- | --- | --- | --- | --- |
| Gender |  |  |  |  |  |  |  |  |
| Male | 1,976 | 38% | 17 | 22% | 12 | 22% | 17 | 38% |
| Female | 3,145 | 61% | 59 | 78% | 43 | 78% | 28 | 62% |
| Total | 5,121 | 100% | 76 | 100% | 55 | 100% | 45 | 100% |
| Ethnicity |  |  |  |  |  |  |  |  |
| African American | 604 | 12% | 7 | 9% | 17 | 31% | 13 | 29% |
| Asian/Pacific Islander | 1,136 | 22% | 18 | 24% | 9 | 16% | 9 | 20% |
| Latino/Hispanic | 509 | 10% | 10 | 13% | 6 | 11% | 6 | 13% |
| White | 2,472 | 48% | 37 | 49% | 15 | 27% | 14 | 31% |
| Other | 426 | 8% | 4 | 5% | 8 | 15% | 3 | 7% |
| Total | 5,147 | 100% | 76 | 100% | 55 | 100% | 45 | 100% |
Table 3: Age and Residence by Research Group

| Category | No OSS N | No OSS % | OSS Successful N | OSS Successful % | OSS Unsuccessful N | OSS Unsuccessful % | OSS Drop N | OSS Drop % |
| --- | --- | --- | --- | --- | --- | --- | --- | --- |
| Age at term* |  |  |  |  |  |  |  |  |
| <25 | 2,396 | 40% | 51 | 47% | 30 | 42% | 29 | 50% |
| 25-35 | 1,777 | 30% | 25 | 23% | 22 | 31% | 17 | 29% |
| 36+ | 1,754 | 30% | 33 | 30% | 20 | 28% | 12 | 21% |
| Total | 5,927 | 100% | 109 | 100% | 72 | 100% | 58 | 100% |
| Residence at term** |  |  |  |  |  |  |  |  |
| Near college*** | 2,240 | 43% | 49 | 61% | 33 | 54% | 33 | 67% |
| Far from college | 3,008 | 57% | 31 | 39% | 28 | 46% | 16 | 33% |
| Total | 5,248 | 100% | 80 | 100% | 61 | 100% | 49 | 100% |

* Because a student's age at term increases each year, the total number may be greater than the number of subjects in each category.
** Because some students move between or during terms, the total number may be greater than the number of subjects in each category.
*** A ZIP code that is adjacent to or the same as the college ZIP code.
Table 4: Educational Goal by Research Group (The educational goal must be updated each term, so as individual goals change the total number may be greater than the number of subjects in each category.)

| Goal | No OSS N | No OSS % | OSS Successful N | OSS Successful % | OSS Unsuccessful N | OSS Unsuccessful % | OSS Drop N | OSS Drop % |
| --- | --- | --- | --- | --- | --- | --- | --- | --- |
| Acquire new job skills, only | 460 | 8% | 3 | 3% | 4 | 6% | 2 | 4% |
| Discover career interests | 146 | 3% | 6 | 6% | 1 | 2% | 2 | 4% |
| Earn a vocational certificate | 339 | 6% | 1 | 1% | 6 | 9% | 3 | 5% |
| Earn a vocational degree w/o transfer | 293 | 5% | 7 | 7% | 3 | 5% | 3 | 5% |
| Earn AA/AS degree w/o transfer | 535 | 9% | 10 | 10% | 6 | 9% | 4 | 7% |
| Educational development | 181 | 3% | 5 | 5% |  |  | 4 | 7% |
| Improve basic skills | 49 | 1% | 1 | 1% |  |  |  |  |
| Maintain certificate/license | 104 | 2% | 1 | 1% | 2 | 3% | 3 | 5% |
| Transfer to 4-year after AA/AS | 2,056 | 36% | 35 | 36% | 35 | 54% | 19 | 35% |
| Transfer to 4-year w/o AA/AS | 679 | 12% | 13 | 13% | 3 | 5% | 9 | 16% |
| Undecided on goal | 529 | 9% | 15 | 15% | 4 | 6% | 5 | 9% |
| Update job skills, only | 272 | 5% | 1 | 1% | 1 | 2% |  |  |
Table 5: Transfer Level of Online Course Sections Offered (Courses numbered 300-499 are transfer level. Los Rios renumbered its courses in summer 2004, so the data in this table include the fall 2004, spring 2005, and fall 2005 semesters only.)
| Transfer Status of Offered Class | N | % |
| --- | --- | --- |
| Transfer level | 160 | 65% |
| Non-transfer level | 85 | 35% |
| Total | 245 | 100% |
Table 6: Transfer Level of Enrollments by Research Group

| Transfer Status of Enrolled Class | No OSS N | No OSS % | OSS Successful N | OSS Successful % | OSS Unsuccessful N | OSS Unsuccessful % | OSS Drop N | OSS Drop % |
| --- | --- | --- | --- | --- | --- | --- | --- | --- |
| Transfer level | 5,083 | 70% | 90 | 66% | 44 | 38% | 76 | 76% |
| Non-transfer level | 2,142 | 30% | 46 | 34% | 71 | 62% | 24 | 24% |
| Total | 7,225 | 100% | 136 | 100% | 115 | 100% | 100 | 100% |
A final illustrative comparison relates to the academic departments each group of students enrolled in during the study period. This is shown in Table 7. For all four groups, classes in computer information science were selected more often than any other, though the OSS Unsuccessful students were just as interested in management. This was also popular with the OSS Successful and OSS Drop groups. Table 8 shows the number of online sections offered by department.
Table 7: Department of Enrollment by Research Group

| Department | No OSS N | No OSS % | OSS Successful N | OSS Successful % | OSS Unsuccessful N | OSS Unsuccessful % | OSS Drop N | OSS Drop % |
| --- | --- | --- | --- | --- | --- | --- | --- | --- |
| Accounting | 68 | 1% |  |  |  |  |  |  |
| Agriculture Business | 76 | 1% |  |  |  |  | 2 | 2% |
| Allied Health | 647 | 6% | 12 | 8% | 7 | 10% | 2 | 2% |
| Biology | 79 | 1% |  |  |  |  | 1 | 1% |
| Building Inspection Technology | 60 | 1% | 1 | 1% | 1 | 1% |  |  |
| Business | 1,126 | 11% | 8 | 5% | 5 | 7% | 8 | 8% |
| Computer Information Science | 4,655 | 46% | 56 | 36% | 19 | 27% | 40 | 42% |
| Communications Media | 59 | 1% |  |  |  |  |  |  |
| Economics | 249 | 2% |  |  | 1 | 1% | 2 | 2% |
| English | 191 | 2% | 1 | 1% | 3 | 4% | 1 | 1% |
| Environmental Technology | 64 | 1% |  |  |  |  |  |  |
| Geography | 48 | <1% |  |  |  |  |  |  |
| Geology | 32 | <1% |  |  |  |  |  |  |
| Health Education | 24 | <1% |  |  |  |  |  |  |
| Health Information Technology | 301 | 3% |  |  |  |  | 2 | 2% |
| Journalism | 73 | 1% |  |  |  |  | 1 | 1% |
| Math | 591 | 6% | 7 | 5% | 7 | 10% | 2 | 2% |
| Management | 967 | 10% | 46 | 30% | 19 | 27% | 24 | 25% |
| Marketing | 165 | 2% | 2 | 1% | 1 | 1% | 3 | 3% |
| Nutrition | 643 | 6% | 22 | 14% | 7 | 10% | 7 | 7% |
| Total | 10,118 | 100% | 155 | 100% | 70 | 100% | 95 | 100% |
Follow-up Survey
The subjects who completed the follow-up survey represented a subset of the historical data subjects. All of them had some relationship to OSS, and each possible relationship was represented in this group. There were 21 respondents total. Three of this group were in the OSS Unsuccessful group, four were in the OSS Drop group, and 14 were in the OSS Successful group. Table 9 shows the semester each of the students enrolled in OSS.
Procedure
Historical Data
Historical data were extracted from the college's application and academic records databases for the fall 2003 through fall 2005 semesters. All academic records were retrieved for any course account identified as a fully online class; this retrieval was straightforward because online classes are consistently coded in the student records database. Enrollment data within those courses include the student identification number, which was used to retrieve demographic data from the application database. The college research officer produced a data file for each of the four groups. These files were generated in the middle of the fall 2005 semester, so for that term enrollment data were included based on the fourth-week census, but grade data were not.
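To make the extraction-and-join step concrete, the following is a minimal sketch in Python/pandas under assumed inputs. The file names, column names (student_id, delivery_code, and the demographic fields), and the "ONLINE" delivery code are hypothetical stand-ins; the actual work was done against the college's databases, not these files.

```python
import pandas as pd

# Hypothetical CSV exports standing in for the college's application and
# academic records databases (fall 2003 through fall 2005).
records = pd.read_csv("academic_records.csv")       # one row per enrollment/grade
applications = pd.read_csv("application_data.csv")  # one row per student

# Keep only enrollments in course sections coded as fully online.
# "ONLINE" is a placeholder for the actual code used in the records system.
online = records[records["delivery_code"] == "ONLINE"]

# Use the student identification number to attach demographic data.
merged = online.merge(
    applications[["student_id", "gender", "ethnicity", "age_at_term", "zip_code"]],
    on="student_id",
    how="left",
)

merged.to_csv("online_enrollments_with_demographics.csv", index=False)
```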
Table 8: Department of Online Sections Offered

| Department | N | % |
| --- | --- | --- |
| Accounting | 1 | <1% |
| Agriculture Business | 6 | 1% |
| Allied Health | 13 | 3% |
| Biology | 2 | <1% |
| Building Inspection Technology | 2 | <1% |
| Business | 44 | 9% |
| Computer Information Science | 173 | 37% |
| Communications Media | 2 | <1% |
| Economics | 6 | 1% |
| English | 8 | 2% |
| Environmental Technology | 4 | 1% |
| Geography | 2 | <1% |
| Geology | 1 | <1% |
| Health Education | 1 | <1% |
| Health Information Technology | 4 | 1% |
| Journalism | 4 | 1% |
| Math | 33 | 7% |
| Management | 128 | 27% |
| Marketing | 7 | 1% |
| Nutrition | 1 | 6% |
| Total | 467 | 100% |
Table 9: Survey Respondent Semester of Enrollment in OSS

| Semester | OSS Successful | OSS Unsuccessful | OSS Drop |
| --- | --- | --- | --- |
| Fall 2002 | 1 | 1 | 0 |
| Spring 2003 | 2 | 0 | 0 |
| Fall 2003 | 0 | 1 | 1 |
| Spring 2004 | 3 | 0 | 0 |
| Fall 2004 | 4 | 0 | 3 |
| Spring 2005 | 4 | 1 | 0 |
| Total | 14 | 3 | 4 |
Follow-up Survey
The follow-up survey instrument was administered online using the same learning management system (Blackboard) that was used to teach the course. A course account was created on the system for the sole purpose of conducting this research, and all of the students in the OSS Successful, OSS Unsuccessful, and OSS Drop groups were enrolled in the course account. Only one item was in the account, a test with 18 questions. (A "test" in Blackboard allows the instructor to see how each student answered each question, whereas a "survey" is anonymous within the course account. For this project, a test was used so that respondents could be placed within their appropriate OSS group for analysis.) Four of the questions were multiple-answer, 11 were multiple choice, and three were open-ended essay questions. The multiple-choice and multiple-answer questions prompted students to provide information about the number of online classes they had taken prior to enrolling in OSS, the number taken since, and where they had taken those classes. The questions also asked students to judge the effectiveness of OSS at preparing them for their online learning experiences. The essay questions gave students an opportunity to provide information not covered elsewhere and comments about their experiences in OSS. The survey questions are available in appendix B. Students were contacted via e-mail, and the class and test were available from 17 November 2005 to 9 December 2005.
Data Analysis
Historical Data
The college researcher provided the data files in Microsoft Excel format, so that software was used for the analysis. A pivot table was created to isolate particular characteristics (gender, age, grade, etc.). As each attribute was quantified, it was recorded on the spreadsheet and then Excel was used to create the data tables in this report. The success rate was calculated by summing the number of grades that were A, B, C, or CR (credit) and dividing by the total number of grades received.
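As a point of reference, the success-rate calculation described above can be expressed in a few lines of code. The sketch below is an illustration in Python/pandas rather than the Excel workbook actually used; the file name and the column names (grade, oss_group, gender) are assumptions carried over from the earlier sketch.

```python
import pandas as pd

# Hypothetical merged file of online enrollments with demographics.
data = pd.read_csv("online_enrollments_with_demographics.csv")

# A grade counts as successful if it is A, B, C, or CR (credit).
data["successful"] = data["grade"].isin({"A", "B", "C", "CR"})

# Success rate = successful grades / total grades, per research group.
summary = data.groupby("oss_group")["successful"].agg(["sum", "count"])
summary["success_rate"] = summary["sum"] / summary["count"]
print(summary)

# The pivot-table step: cross-tabulate the success rate by group and gender.
print(pd.pivot_table(data, index="oss_group", columns="gender",
                     values="successful", aggfunc="mean"))
```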
Follow-up Survey
As students completed the online survey, their answers were recorded in the gradebook inside the course account. The instrument was not anonymous, so the student identification number could be used to connect a set of answers to the student's performance in OSS and to the student's demographic data from the historical dataset. Excel was used to aggregate the answers to the multiple-choice and multiple-answer questions, and the respondents were divided into the three groups of OSS students so their answers could be analyzed separately. The answers to the essay questions were collated in a text file for review.
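The linking and tallying of survey answers can be sketched the same way. The snippet below is hypothetical: the export file names, the roster file mapping students to OSS groups, and the question column name are assumptions used only to illustrate the grouping logic, not the actual Blackboard export format.

```python
import pandas as pd

# Hypothetical exports: one row per survey respondent, and a roster
# mapping each student identification number to an OSS group.
responses = pd.read_csv("survey_responses.csv")
roster = pd.read_csv("oss_group_roster.csv")  # columns: student_id, oss_group

# Connect each set of answers to the respondent's OSS group.
joined = responses.merge(roster, on="student_id", how="left")

# Tally one multiple-choice item (e.g., "OSS helped me succeed online")
# separately for each of the three OSS groups.
tally = (joined.groupby("oss_group")["q_helped_succeed"]
               .value_counts()
               .unstack(fill_value=0))
print(tally)
```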
Results
Historical Data
The first part of this project's hypothesis concerned the online success rate of students who have taken Online Student Success (OSS) compared to the online success rate of other students. Table 10 shows the academic performance data for each group of students and their success rate in online classes from fall 2003 to spring 2005. For students associated with OSS, the data reflect online grades earned concurrently with and after their enrollment in OSS. Compared to students who did not take OSS, the OSS Successful students had a higher success rate in their online classes overall. OSS Unsuccessful and OSS Drop students had a lower success rate in their online classes overall.
Table 10: Online Academic Performance by Research Group

| Outcome | No OSS N | No OSS % | OSS Successful N | OSS Successful % | OSS Unsuccessful N | OSS Unsuccessful % | OSS Drop N | OSS Drop % |
| --- | --- | --- | --- | --- | --- | --- | --- | --- |
| Success | 3,814 | 54% | 89 | 61% | 7 | 13% | 22 | 39% |
| Not success | 1,956 | 46% | 52 | 39% | 46 | 87% | 34 | 61% |
| Total | 5,770 | 100% | 141 | 100% | 53 | 100% | 56 | 100% |
The second hypothesis for this project was that the students who succeeded in OSS would have a higher online success rate after taking OSS than they did before enrolling in it. Table 11 displays the success rates for online classes before and after taking OSS for the OSS Successful group. Though only 11 students took online classes before and after enrolling in OSS, the improvement in their success rate is dramatic.
Table 11: Online Academic Performance Before and After Passing OSS (n = 11)
(Only students who took online classes before and after enrolling in OSS are included in this table.)

|  | Successful Grades | Total Grades | Success Rate (Successful Grades / Total Grades) |
| --- | --- | --- | --- |
| Grades before OSS | 11 | 29 | 38% |
| Grades after OSS* | 18 | 23 | 78% |

* Includes online classes taken concurrently with OSS.
Follow-up Survey
Survey respondents from all three OSS groups (OSS Successful, OSS Unsuccessful, OSS Drop) expressed appreciation for the additional preparation the course gave them for learning online. One question asked if students agreed that the class helped them to succeed in future online classes. Among the 14 OSS Successful participants, 13 either agreed or strongly agreed. Interestingly, all three OSS Unsuccessful respondents either agreed or strongly agreed with this statement. Three of the four OSS Drop participants either agreed or strongly agreed.
Other questions were more specific and asked how helpful the class was in preparing students for typical tasks in online classes. All of the OSS Successful participants said the class was very helpful or helpful in preparing them to use discussion boards; three of the four OSS Drops felt the same way, and all of the OSS Unsuccessful students agreed. Every respondent said that the class was helpful or very helpful in preparing them to take tests online. Finally, the same reply (very helpful or helpful) was given by everyone when asked if the class prepared them to conduct research online.
The open-ended questions gave students the opportunity to express, in their own words, experiences that echo some of the key findings revealed by the data. One student said:
"I would take an online class again because I learned a lot from our OSS class. It gave me the confidence to take future online classes."
Another wrote:
"I know that if had not taken the OSS course I would have dropped out of all of my online (and other distance learning) courses."
The most succinct response was one to the question that asked how OSS affected their decisions to enroll in other online classes:
"I want all of my classes to be online."
Detailed survey results are in appendix C.
Summary and Conclusions
The strongest implication of this study is that students who intend to enroll in an online course should be encouraged to take an online student success course. This study suggests that, if they pass Online Student Success (OSS), they are likely to be more successful in their online course attempts. This is best shown by the dramatic improvement in success rate for those students who took online classes both before and after taking OSS. Students who did not take online classes before OSS also showed a higher online success rate than students who had no enrollment relationship with OSS.
A second implication for the classroom is that students benefit from preparation for the online learning modality. Survey data from all OSS groups reveal increased comfort with this environment after taking the class, and this testimony corroborates the increased success rates found in the historical data. If students feel that OSS prepares them for online learning, that confidence may carry over into better academic performance in their online classes.
However, some students are successful online without taking OSS. They do not need an intervention in their online learning skills, so a third implication for the classroom is that OSS should be targeted at the students who need it. The risk is acute for students who explore online learning because it is the only way they can access higher education. If they rush into this environment unprepared and do not experience success, they may turn away from this method of meeting their educational goals. One suggestion is to create an assessment tool that will encourage prospective online students to take OSS if needed.
Finally, if the OSS class is to be targeted at the students who need help learning online, it would benefit from concentrating on developing the skills that make a student successful online. This classroom implication points to the first research implication: to study what makes online students successful. Given the large number of online classes at the college, such a project would require the involvement of many students and instructors.
Other implications for future research will help refine the classroom suggestions mentioned above. The historical data would be more complete if academic records from online classes at other colleges were included. Several students reported they had taken online classes at other colleges, but their performance in those classes was not included in the historical data. It may be difficult to incorporate data from every college offering online classes, but in a multi-campus district that uses one database for its enrollment and academic data, it ought to be possible to expand the historical data to include records from the other colleges in the district.
More historical and academic data about students from all four groups will allow additional analysis. For example, the number of units attempted online and in non-online classes would suggest how much of the students’ academic effort was engaged in online learning. (This assumes that a unit of academic credit requires the same amount of work regardless of the course.) This could be combined with their performance in non-online classes to explore broader questions about study skills and academic preparation. It also would show how many online students take on-campus classes, which has implications for how to provide student services to online learners.
This additional enrollment data would help explain an unexpected result from the demographic data: OSS students in all three groups are more likely to live near campus. If OSS students are also more likely to take classes on campus, this would raise the question of how these students found out about OSS, with potential implications for marketing OSS to online students who do not live near campus.
Research implications for the survey include finding a way to get additional responses. Survey participants were more likely to be recent enrollees in OSS. Asking them to contribute responses soon after their OSS enrollment might increase participation, as their e-mail addresses are more likely to be current. In addition, they might be better able to associate their OSS experience with their exploration of the online learning modality.
A final set of research implications comes from this project’s successes, which should be repeated in similar efforts. What worked well was giving students a place for open-ended responses on the survey; it provided material to corroborate the results of the historical data analysis and humanized the presentation of results. Using Microsoft Excel to analyze the historical data also worked well. The pivot table allowed easy compilation of relevant data, and the statistical functions made quick work of counting and calculating project numbers. Finally, the process for conducting the survey should be repeated. Survey data were collected using the same tool (a Blackboard course account) that was used to teach OSS, so there was no technical learning curve for respondents.
Acknowledgments
The author would like to acknowledge the support offered by the @ONE Carnegie Scholar Program during the 2005-2006 academic year. The program encourages the effective use of technology in the classroom by helping faculty conduct research using the classroom action research methodology. Particularly helpful were the suggestions and comments made by the college researchers who took part in the project: Michelle Barton, director of institutional research and planning at Palomar College, and Darla Cooper, institutional research director at Oxnard College.