Learning Objects in Use: ‘Lite’ Assessment for Field Studies

 

Vivian Schoner
Strategic Research and Evaluation Consultant
vschoner@LT3.uwaterloo.ca

Dawn Buzza
Research Project Manager
dawn@LT3.uwaterloo.ca

Kevin Harrigan
Co-Director: Learning Object Programs
kevinh@uwaterloo.ca

Katrina Strampel
Research Associate
katrina@LT3.uwaterloo.ca


Centre for Learning and Teaching through Technology (LT3)
University of Waterloo
Waterloo, ON, N2L 3G1
Canada

 

 

Abstract

This study was conducted to evaluate the use and re-use of learning objects (LOs) developed under the Co-operative Learning Object Exchange (CLOE) in Ontario, Canada.  Student questionnaire data captured learners' perceptions of the learning value, value added, design usability and technology function of the LOs.  Other perspectives on the value of using and repurposing learning objects were provided through student and instructor interviews.  Results indicated that technical functionality did not present usability problems, but that the design of learning objects was their most critical feature for students. The questionnaire used for measuring student perceptions may prove useful for instructors who want to assess the usefulness of learning objects within a variety of instructional contexts and disciplines. 

 


INTRODUCTION

This study, undertaken in spring, 2003, was commissioned to evaluate the use and re-use of learning objects (LOs) that were developed under the Co-operative Learning Object Exchange (CLOE) in Ontario, Canada. The study was funded by the Office of Learning Technologies, Human Resources Development Canada, 2002/2003.

CLOE is a collaborative project of 16 Ontario universities [eight of which participated in this study] that provides an innovative infrastructure for joint development of interactive learning resources. It is a sub-community of the MERLOT consortium and, as such, shares the commitment to demonstrate learning value and the advantages of using learning objects beyond the immediate instructional context through continuous re-use. Providing assessment data from stakeholders using learning objects in their classrooms is a means to present field-based information to potential users searching for an already available learning object before deciding to develop their own.

Others have commented extensively on the importance of using evaluation for re-use decisions (Recker, Dorward & Nelson, 2004; Sander, Huk & Floto, 2002; Woo, Gosper, Gibbs, Hand, Kerr, & Rich, 2004). Further, creating effective e-learning resources requires a significant commitment of time and funds for the design, development and deployment of learning objects. The resulting costs to institutions engaged in developing LOs have, we believe, impeded the transition to richer online learning. (See Bratina, Hayes & Blumsack, 2002; Littlejohn, Jung & Broumley, 2003; Oliver, 2001 for further discussion of this issue). Providing field-based assessment results presents one viable approach to remedying this situation.

One purpose for this study was to design and implement basic assessment techniques that could be used to provide information on the use and re-use of learning objects in cross-discipline classroom settings. Our objective was to begin to outline the kinds of information about a given learning object that instructors might like to see in a learning object repository before deciding to adapt/use it in what might be similar or different circumstances. As the stakeholders are instructors and students, we first focused on capturing students’ perceptions of their learning experiences with learning objects on different dimensions. Second, we looked at instructors’ observations as comparable, or not, to their students’ observations as this would give an indication of face validity for the questionnaire used with students. Third, we wanted to see if instructors also expressed interest in studying LO impacts on student learning outcomes – a means to move the field of learning object study from reliance on self-reports toward more formal experimental research.

Given the cross-discipline variability, our challenge was to find a systematic way to acquire information that was generic enough to deal with diverse curricula, instructors and students, and specific enough to provide useful information for repositories promoting re-usability of learning objects. To accomplish this we searched for a workable framework for this study and then derived four key dimensions of the LO learning experience that should apply to most instructional settings.

Although we reviewed a number of approaches to the evaluation of educational software and multimedia (e.g., Nesbit, Belfer & Vargo, 2002; Stufflebeam, 1971; Worthen, Sanders & Fitzpatrick, 1997), we found that Williams' (2002) participant-oriented approach best suited our needs. Williams' model frames differences among various stakeholders' instructional settings, definitions of learning objects, and criteria that impact on assessing their value. Williams (p. 177) describes four types of evaluation: context, input, process and product:

  • Context evaluations assess institutional readiness such as the need for LOs, courses and support. Although no contextual assessments were done for this small study, awareness of these factors anticipates the need to plan for institutional and discipline variability. In this case, flexibility in adaptations of the student questionnaire would need to be considered.
  • Input evaluations focus on alternative means for meeting context needs inclusive of LOs. As reported below, it comes as no surprise that most instructors we interviewed made explicit comparative statements about the value of using LOs over other options they had used – a strong indication that the best LOs need to bring added value to the learning experience and that encouraging instructors to consider comparative research as a source of information is key to LO use and re-use.
  • Process evaluations, formative in nature, assess planning, design, development and implementation of LOs. Instructor interviews showed that planning, design and development were either done by, or monitored by, the instructor concerned. Implementation, the entry point for this study, was generally iterative, including locally developed and borrowed learning objects.
  • Product evaluations, summative in nature, generally refer to learning outcomes. Given the nature of this study, however, we adapted ‘product evaluations’ to refer to outcomes for the study, that is, student self-reports and instructor observations. Student questionnaire sub-scales were designed to capture information on their perceptions of the value, quality and utility of the LOs encountered in the learning process. Instructor interview questions sought information on similar variables and, as noted above, may provide confirmation of their students’ observations.

Adapted in this way, Williams' model proved to be a most useful framework for organizing both the data collection instruments and the field studies described herein.


PROCEDURE and METHOD

Institutional Contact Procedure
Instructors from all CLOE participating universities were contacted by the researchers. Although most were interested in participating, only eight could fit the study into their schedules: the University of Windsor, York University, Lakehead University, Ryerson University, the University of Waterloo (two instructors), the University of Guelph, McMaster University and the University of Western Ontario. Due to space limitations, four of the nine studies are included in this paper: Windsor, Waterloo (Kinesiology), Waterloo (Particle Physics), and Ryerson. They were selected to illustrate the range of variability, both in discipline settings and in the flexibility of generic questionnaire use through adaptations and extensions to suit particular needs. Note that information on all studies is available on the CLOE site. See Table 1 for the range of courses and learning objects represented in the full study.

Table 1. CLOE partner institutions, courses, and learning objects

Institution | Course | Learning Object
University of Windsor | Immunology | White Cell Identification
York University* | Introduction to Data Structures | Interactive Applet for Recursive Sorting
Lakehead University* | Introductory Statistics for Psychology | Stroop Effect
Ryerson University | General Introductory Chemistry | Chemistry Virtual Lab
University of Waterloo | Kinesiology: Nutrition | Lipid Absorption in the Small Intestine
University of Waterloo | Particle Physics | Feynman Diagrams
University of Guelph* | General Introductory Chemistry | E-Lecture Section online components
McMaster University* | Principles and Practices in University Teaching (graduate course) | Managing Conflict in the Classroom
University of Western Ontario* | Elementary Mathematics Education module and Teachers' In-Service session | Applets involving math games for Number Concepts, Patterning & Algebra, and Probability

*Not included in this paper. Please check the CLOE site for information on these studies.

After initial contact, instructors were contacted by telephone and email to arrange for study implementation schedules. Student volunteers for administering the questionnaire and, in two cases, individual interviews, were enlisted by course instructors. Researchers traveled to the various sites to review study procedures and conduct interviews.

Classroom Procedure
First, learners' experiences and perceptions of LOs were measured using a sample Student Questionnaire (Schoner, Bailey & Buzza, 2004) that instructors adapted to accommodate differences in discipline and in the design and purpose of the LO within sites (the generic questionnaire is presented in Appendix A). Thus, implementation of the Student Questionnaire varied somewhat across sites according to instructional context and instructors' needs. Sample sizes were typically small, owing to contextual factors such as variability in students' level of study, class sizes, and differences in students' willingness to participate in the study. Additionally, a small sub-set of student volunteers was interviewed at two sites; the instructors provided questions on specific activities not covered in the generic questionnaire.
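Because the generic questionnaire in Appendix A is essentially a template with bracketed placeholders, adapting it for a particular course amounts to substituting course-specific wording into those slots. The following is a minimal, hypothetical sketch of that substitution step; the placeholder names, the course wording, and the use of Python's string.Template are our own illustration, not part of the study procedure.

    from string import Template

    # Hypothetical $-style analogue of one bracketed item from the generic
    # questionnaire (Appendix A, Part 1).
    ITEM_TEMPLATE = Template(
        "The incorporation of this $tool for the $setting "
        "helped me to visualize $concept."
    )

    # Course-specific wording an instructor might substitute (invented example).
    adapted_item = ITEM_TEMPLATE.substitute(
        tool="Lipid Absorption animation",
        setting="lecture",
        concept="the sequence of events in lipid absorption",
    )
    print(adapted_item)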

After the questionnaires were administered, instructor interview data were collected, comprising self-reports of instructors' experiences using or re-using learning objects in their teaching. The faculty interview protocol is presented in Appendix B.

Method: Student Questionnaires
Student questionnaires were submitted to the researchers electronically by institutional representatives. Questionnaire items were combined into four sub-scales: ‘Learning Value’ of the learning object, ‘Value Added’ by the learning object, ‘Design Usability’ of the learning object, and Technology Function:

  • Learning Value includes items that reflect learners’ perceptions of how effectively the LO helped them learn or understand the relevant content.
  • Value Added by the learning object reflects perceptions of whether, and to what extent, the LO had advantages over other learning materials or methods.
  • Design Usability of the learning object focuses on learners’ perceptions of the ease and clarity with which they were able to follow the instructions and navigate through the object or activity.
  • Technology Function assesses students’ perceptions of how well the object functioned technically, and whether they had the technical knowledge required to use it.

Organizing the items into these sub-scales allowed us to examine learner perceptions of LOs more easily across varying academic disciplines, object designs and instructional uses. In cases where several students were interviewed, the results are presented for interest following the Student Questionnaire data; these include students in the Ryerson Chemistry course and the University of Waterloo Physics course. Note that, due to the small number of student participants, questionnaire responses are reported as percentages of the total number of response selections per sub-scale (n) rather than of the number of students (N) per course.
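To make this reporting convention concrete: because each sub-scale contains several items, one student contributes several response selections to a sub-scale, and percentages are taken over those selections (n) rather than over students (N). The sketch below illustrates the tallying with invented ratings for two hypothetical students; the item-to-sub-scale grouping follows the numbering in Appendix A, but this is not the scoring procedure actually used in the study.

    from collections import Counter

    # Item-to-sub-scale grouping, following the item numbering in Appendix A.
    SUBSCALES = {
        "Learning Value": [1, 2, 3, 4],
        "Value Added": [7, 8, 9],
        "Design Usability": [10, 11, 12, 13],
        "Technology Function": [14, 15, 16, 17, 18],
    }
    CATEGORIES = ["SA", "A", "N", "D", "SD", "N/A"]

    def subscale_percentages(responses):
        """responses: one dict per student mapping item number -> rating.
        Returns each category as a percentage of response selections (n)."""
        results = {}
        for name, items in SUBSCALES.items():
            counts = Counter(
                student[item]
                for student in responses
                for item in items
                if item in student
            )
            n = sum(counts.values())
            results[name] = {"n": n}
            for cat in CATEGORIES:
                pct = round(100.0 * counts.get(cat, 0) / n, 1) if n else 0.0
                results[name][cat] = pct
        return results

    # Two hypothetical students' ratings (item number -> rating category).
    students = [
        {1: "SA", 2: "A", 3: "A", 4: "SA", 7: "A", 8: "N", 9: "A",
         10: "SA", 11: "SA", 12: "A", 13: "A",
         14: "SA", 15: "SA", 16: "A", 17: "N/A", 18: "N/A"},
        {1: "A", 2: "N", 3: "D", 4: "A", 7: "SA", 8: "A", 9: "N",
         10: "A", 11: "A", 12: "N", 13: "A",
         14: "A", 15: "A", 16: "SA", 17: "N/A", 18: "N/A"},
    ]

    for name, row in subscale_percentages(students).items():
        print(name, row)

With the two hypothetical students above, Learning Value accumulates n = 8 response selections (four items times two students), so each category percentage is a share of those eight selections, not of the two students.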

Method: Instructor Interviews
Instructors participated in structured interviews conducted by the researchers using the Faculty Interview Protocol. In addition to background information about LO development and use in courses, the interviews also provided information on reuse of the LOs, student feedback on the LOs and reflections on teaching and research.



RESULTS

The presentation format for each study includes a brief description of the LO and the course. This is followed by the tabled student questionnaire responses, which are then briefly summarized, including student interview data where available. The instructor interview results are presented next. Table 2 shows, for each participating institution, which data were collected: the four student questionnaire sub-scales and the faculty interview. Not all course participants produced data on all questionnaire and interview components, owing to instructor availability and local adaptations of the study.

Table 2. Types of data collected [X] or not collected [--] at each participating institution

Institution | Learning Value | Value Added | Design Usability | Tech Function | Instructor Interviews
Windsor | X | X | X | X | X
York | X | X | X | X | --
Waterloo Kinesiology | X | X | X | X | X
Waterloo Physics | X | X | X | X | X
Lakehead | X | -- | X | X | X
Ryerson | X | X | X | -- | X
Guelph | X | X | X | X | X
McMaster | X | -- | X | -- | X
Western* | -- | -- | -- | -- | --

* The data collected at Western were part of a larger study being conducted by the participating faculty member. Our instruments were not used for data collection. Their results are presented in summary form in the report available at the CLOE site [CLOE].

Students from eight of the nine study sites responded to the Learning Value and Design Usability sub-scales; six responded to the Value Added sub-scale and six to the Technology Function sub-scale. Faculty interviews were completed at seven of the nine universities, and complete course data sets were acquired from the Universities of Windsor, Guelph, and Waterloo.


I. University of Windsor: White Blood Cell Identification

Learning Object and Instructional Context
The LO studied at the University of Windsor, White Blood Cell Identification, was developed by the instructor with technical assistance from a student. It has evolved over several years, with enhanced quizzes incorporated for each offering. Approximately 1000 photographs are included in the LO. These are presented to students in sets of four or five, showing various combinations of cell types. Students are given immediate feedback on the accuracy of their identification choices.
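The drill-and-feedback cycle described above is simple to picture. The sketch below is a purely hypothetical, stripped-down analogue of that cycle; the photograph identifiers, cell-type labels and set size are invented, and the actual LO is a web-based multimedia application rather than a console script.

    import random

    # Hypothetical pool of (photograph id, correct cell type) pairs; the real
    # LO draws on roughly 1000 photographs of white blood cells.
    PHOTO_POOL = [
        ("img_0001", "neutrophil"),
        ("img_0002", "lymphocyte"),
        ("img_0003", "monocyte"),
        ("img_0004", "eosinophil"),
        ("img_0005", "basophil"),
        ("img_0006", "neutrophil"),
    ]

    def run_quiz_set(pool, set_size=4):
        """Present one set of photographs and give immediate feedback per answer."""
        for photo_id, correct in random.sample(pool, set_size):
            answer = input(f"Identify the cell shown in {photo_id}: ").strip().lower()
            if answer == correct:
                print("Correct.")
            else:
                # Immediate feedback prevents an erroneous cell-type concept
                # from being reinforced through repeated misidentification.
                print(f"Incorrect. This cell is a {correct}.")

    if __name__ == "__main__":
        run_quiz_set(PHOTO_POOL)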

This LO is used instead of wet labs in an undergraduate Immunology course to practice identification of different kinds of white blood cells. Students need to practice identifying white blood cells in preparation for a lab-based exam. The LO is used as a study tool and is not compulsory; however, virtually all students do use it, because it saves time compared with wet lab practice sessions and is perceived to have a positive impact on exam results.

Student Questionnaire Results
Of the 50 students in the Immunology course, 12 completed the student questionnaires. Results are presented in Table 3.

Table 3. Windsor student questionnaire results in percentage of total # of responses per Subscale
(n = responses) (N=12)

Sub-scale | % Strongly Agree | % Agree | % Neutral | % Disagree | % Strongly Disagree | % N/A
Learning Value (n = 66) | 62.1 | 28.8 | 3.0 | 6.1 | 0.0 | 0.0
Value Added by LO (n = 44) | 52.3 | 29.5 | 18.2 | 0.0 | 0.0 | 0.0
Design Usability (n = 33) | 60.6 | 36.4 | 3.0 | 0.0 | 0.0 | 0.0
Technology Function (n = 22) | 50.0 | 50.0 | 0.0 | 0.0 | 0.0 | 0.0

The Learning Value and Value Added sub-scales were rated in the Agree and Strongly Agree categories 90.9% and 81.8% of the time, respectively. Students chose the Agree and Strongly Agree ratings for Design Usability and Technology Function 97.0% and 100.0% of the time, respectively. These positive results are in agreement with the comments made by the instructor, as shown below.

Instructor Interview Results
Interview responses indicate that “virtually all” students used the LO and that their response to it was very positive because it helped them do well on the subsequent lab test. In agreement with student data, the instructor noted that students do better on the test after using the LO than they do when practice occurs in the wet lab. He gave two reasons for this. First, receiving feedback after each response within the LO quiz ensured that students did not continue making repetitive errors in cell identification, which would reinforce an erroneous cell type concept. Second, the instructor noted that students were able to attempt many more cell identifications using the LO than they could in the wet lab, because the LO saves the set-up time required to view each individual slide. Table 4 presents a summary of instructor comments.


Table 4. Instructor interview results: Windsor

Institution and Course: University of Windsor; Immunology
Learning Object: White Blood Cell Identification
Development and Expert Review: Developed by instructor; reviewed by six hematologists and other professionals
Uses: Optional; used to practice cell type identification to prepare for lab tests
Re-Uses: Re-used with evolving changes in other courses over four years
Reflections on Teaching: Almost all students use the LO. They like it because they do well on the test; immediate feedback reinforces accuracy in identifications and takes far less time than lab work.
Research Ideas for Student Learning: Could compare the number of hours spent with exam scores, holding GPA constant

Ideas for research: Comments on research suggest that a time study may yield useful information on the relationship between amount of time taken and achievement in LO use vs. the traditional Lab setting.

II. University of Waterloo, Kinesiology: Lipid Absorption in the Small Intestine

Learning Object and Instructional Context
The first LO studied at the University of Waterloo, Lipid Absorption in the Small Intestine, is used in a course on Human Nutrition. In this case, the LO was developed by the faculty member and had been fine-tuned over several years. The LO is an animation of the lipid absorption process in human digestion, with an accompanying sound track. It is used as an adjunct to the text, lectures and assignments and is optional for students. All students receive a copy of the LO on a CD that accompanies the textbook. The LO is also available online on the course web site, and the instructor promotes its use through the course web board. Students primarily use it to prepare for class and for review.

Student Questionnaire Results
Of the 143 students enrolled in the Human Nutrition course, 33 completed the questionnaire. Table 5 shows the students’ results for the Lipid Absorption LO.


Table 5. Waterloo student questionnaire results in percentage of total # of responses for each sub-scale. (n = responses) (N=33)

Sub-scale | % Strongly Agree | % Agree | % Neutral | % Disagree | % Strongly Disagree | % N/A
Learning Value (n = 198) | 20.2 | 47.0 | 22.7 | 4.5 | 2.5 | 3.0
Value Added by LO (n = 99) | 20.2 | 45.5 | 23.2 | 6.1 | 4.0 | 1.0
Design Usability (n = 132) | 47.7 | 35.6 | 10.6 | 1.5 | 0.0 | 4.5
Technology Function (n = 99) | 49.5 | 38.4 | 4.0 | 2.0 | 5.1 | 1.0

These results show that 67.2% and 65.7% of the possible responses within the Learning Value and Value Added sub-scales, respectively, were in the Agree and Strongly Agree categories. The responses for Design Usability and Technology Function were in the Agree and Strongly Agree categories 83.3% and 87.9% of the time, respectively. Although the students responding to this LO generally rated it positively, it is clear that only a small proportion of the class used it.

Instructor Interview Results
According to the instructor, the primary difficulty students experience in learning about lipid absorption is visualizing both the sequence of events and the overall process. His comments (see Table 6) indicate that he attributes the low usage to students not finding the time to look at the LO, and to the lack of incentive to do so in terms of course value. The instructor considers the potential for reusing the LO to be high.

Table 6. Instructor interview results: Waterloo Kinesiology

Institution and Course: University of Waterloo, Kinesiology; Human Nutrition
Learning Object: Lipid Absorption in the Small Intestine
Development and Expert Review: Developed by instructor (no expert review reported)
Uses: Optional; used as overheads in class; used by students for class preparation and review
Re-Uses: Has been used for the past several years, with ongoing adaptations; could be used in medical or biology courses teaching anatomy of the gut and structure-function relationships in digesting and absorbing lipid
Reflections on Teaching: Helps students to better visualize the lipid absorption process, especially the connections and the sequence of events; there is high acceptance of the object by those who use and comment on it
Research Ideas for Student Learning: Compare student learning with and without the LO; collect usage and impact statistics; find variables that impede use

Ideas for research: Comments on research suggest that a study comparing users and non-users of the LO should yield information about its comparative impact on student learning outcomes.

III. University of Waterloo, Particle Physics: Feynman Diagrams Tool

Learning Object and Instructional Context
The LO titled Feynman Diagrams was studied in the context of an undergraduate course in particle physics at the University of Waterloo. This LO provides students with opportunities to solve problems using Feynman Diagrams in an interactive, online format. The LO was developed and used by the instructor with some programming assistance. It has been used several times and technically improved with each iteration.

To introduce the LO and to gauge student preference, the instructor uses a cross-over strategy in which students are divided into LO and freehand drawing groups for one assignment, with the groups reversed for the next assignment. For a third assignment, students choose either the LO or freehand drawing to produce the Feynman diagrams.
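The cross-over pattern is straightforward to operationalize. The sketch below shows one way such an allocation could be generated; the roster, random seed and group handling are our own illustration, since the paper does not describe the instructor's actual assignment mechanics.

    import random

    def crossover_groups(students, seed=0):
        """Split a class into two groups and alternate LO vs. freehand drawing
        across the first two assignments; the third is by student choice."""
        rng = random.Random(seed)
        shuffled = list(students)
        rng.shuffle(shuffled)
        half = len(shuffled) // 2
        group_a, group_b = shuffled[:half], shuffled[half:]
        return {
            "assignment 1": {"LO": group_a, "freehand": group_b},
            "assignment 2": {"LO": group_b, "freehand": group_a},  # conditions reversed
            "assignment 3": "student's choice of LO or freehand drawing",
        }

    roster = ["s01", "s02", "s03", "s04", "s05", "s06"]
    for assignment, allocation in crossover_groups(roster).items():
        print(assignment, allocation)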

Student Questionnaire Results
Thirteen of the seventeen students enrolled in the course completed the Student Questionnaire. In addition, two students completed think-aloud protocols while navigating through the Feynman Diagrams LO. The results of the protocols are summarized following the Student Questionnaire results.

Table 7. Waterloo Particle Physics student questionnaire results in percentage of total # of responses per subscale (n = responses) (N=13)


Sub-scale | % Strongly Agree | % Agree | % Neutral | % Disagree | % Strongly Disagree | % N/A
Learning Value (n = 78) | 5.1 | 30.8 | 30.8 | 20.5 | 3.8 | 9.0
Value Added by LO (n = 39) | 2.6 | 43.6 | 23.1 | 20.5 | 2.6 | 7.7
Design Usability (n = 52) | 21.2 | 36.5 | 19.2 | 9.6 | 5.8 | 7.7
Technology Function (n = 39) | 43.6 | 38.5 | 2.6 | 2.6 | 2.6 | 10.3


The Learning Value and Value Added sub-scales were rated in the Agree and Strongly Agree categories 35.9% and 46.2% of the time, respectively. Ratings on Design Usability and Technology Function were in the Agree and Strongly Agree categories 57.7% and 82.1% of the time, respectively. These results indicate that there may have been some difficulty for students in navigating through the LO, although technically it seems to work well. These apparent design usability issues may partly explain the moderate ratings on learning value.

Additional Student Think-Aloud Protocols. Two students volunteered to provide a running commentary while exploring the Feynman Diagrams LO. Commentary focused primarily on the Learning Value, Value Added and Design Usability of this tool. Their responses are illustrative of the LO learning experience and are summarized as follows.

Both students described the LO as helpful for checking the accuracy of their work; for this reason they preferred to draw the diagrams by hand first. One indicated that using the program was time-consuming, so that after an initial try it was easier to revert to freehand drawing. These students indicated that the instructor explained how to use the LO during class, and that it was not difficult to use for feedback, but that most students did not use it because it took too long. They identified a number of features that would make this LO more usable, including a copy or save button so that similar diagrams could be copied instead of redrawn, a print button, and the ability to show more diagrams on the screen at one time. There were a few comments about technical problems in running the program, but these did not appear to be serious impediments to using the LO.

Instructor Interview Results
The Faculty Interview responses from the University of Waterloo Particle Physics course are presented in Table 8. According to this instructor, the reuse potential for this LO lies in courses dealing with the same kinds of physics content as this course. The instructor's perceptions about students' use of the LO and its learning value were in line with the data from the students themselves. Specifically, the instructor reported that students used the LO initially and found it somewhat helpful, but that they reverted to hand-drawn diagrams once they understood the concepts. He attributed these limitations partly to the program's lack of speed, some missing capabilities, and limited visual appeal.


Table 8. Instructor interview results: Waterloo, Particle Physics

Institution and Course: University of Waterloo; Particle Physics
Learning Object: Feynman Diagrams Tool
Development and Expert Review: Developed by instructor (no expert/peer review reported)
Uses: Required; used for constructing Feynman Diagrams in assignments, alternating with freehand drawing
Re-Uses: Could be re-used in courses dealing with the same kinds of physics content
Reflections on Teaching: Once students understand the diagrams and can use the tool, it is faster to draw the diagrams by hand
Research Ideas for Student Learning: Continue to assess the impact of LO use on student learning outcomes; this is difficult because of the small number of students taking this class

Ideas for research: This instructor continues to assess the impact of using the Feynman Diagrams tool on learning outcomes using the crossover pattern described above. Due to small enrollments, the pattern is repeated for each course iteration. Additionally, questions designed to distinguish between LO users and non-users are embedded in the end-of-term exam.


IV. Ryerson University: Virtual Chemistry Lab

Learning Object and Instructional Context
The LO studied at Ryerson, the Virtual Chemistry Lab, was developed at Carnegie Mellon University. It is an interactive online chemistry laboratory simulation intended to enhance students' problem solving skills, introduce experimental design concepts, and present contextual problems not normally accessible through pen and paper or wet lab exercises. One key objective of the software is to help students develop problem solving strategies that are conceptual or heuristic, rather than algorithmic.

The Virtual Chemistry Lab was used in an introductory, general Chemistry course with an enrollment of 76 students. The LO was provided as a support for problem solving, for use with one of the questions on three of the four course assignments. On the first assignment, the problem can be solved equally well either manually or with the Virtual Chemistry Lab. On the second assignment, the problem can be solved more easily and with fewer errors using the Virtual Chemistry Lab, but can still be solved manually. On the third assignment, the problem must be solved using the Virtual Chemistry Lab.

Student Questionnaire Results
Student Questionnaires were completed by 32 of the 76 students enrolled in the Introductory Chemistry course. Results from this survey appear in Table 9.

Table 9. Ryerson student questionnaire results in percentage of total # of responses per subscale
(n = number of responses) (N=32)

Sub-scale | % Strongly Agree | % Agree | % Disagree | % Strongly Disagree | % N/A
Learning Value (n = 160) | 20.8 | 38.3 | 23.3 | 14.2 | 3.3
Value Added by LO (n = 96) | 25.0 | 31.7 | 35.0 | 6.7 | 1.7
Design Usability (n = 64) | 30.0 | 33.3 | 20.0 | 13.3 | 3.3

Because it was felt that the technology had been well tested and was very simple to use, the researchers at Ryerson chose to omit the questionnaire items in the fourth sub-scale, Technology Function. In this case, 59.1% of the possible responses within the Learning Value sub-scale were in the Agree and Strongly Agree categories. Similarly, 56.7% of possible responses within the Value Added sub-scale were in the Agree and Strongly Agree categories. The responses for Design Usability were in the Agree and Strongly Agree categories 63.3% of the time.

Questionnaire Expansion. The Student Questionnaire was expanded to include the following items:

  • Rating the relative importance of four new features that are under consideration to be added to the LO; and
  • A set of five questions related to each of the three assignments in which the Virtual Lab could have been used.

Of the potential new features, a graphing function to be used in titrations was rated as important or very important by most students (80%). Also, “adding the ability to undo the most recent activity” was rated as very important by more students than was any other feature (58.1%). Adding a worksheet for notes and calculations was rated as not important by more respondents (41.9%) than other features were.

The problem in Assignment 1 could be calculated using pen and paper as easily as with the virtual lab; 51.3% of the students reported having used the pen and paper method and 48.7% reported having used a combination of pen and paper and the Virtual Lab. On Assignment 2, the problem could be solved more easily and with fewer errors using the Virtual Lab, but could still be solved manually. For this assignment, 19% of the students reported using pen and paper only, while 81% reported using the Virtual Lab. On Assignment 3, where the use of the Virtual Lab was required in order to solve the problem, only 12.5% of the students reported using the Virtual Lab only, while 87.5% reported using a combination of the Virtual Lab and pen and paper. On all three assignments, most students reported having used the Virtual Lab to check their answers (79.5%, 81%, and 87.5% respectively). Especially on the third assignment, students reported that by using the Virtual Lab they found errors they might otherwise have missed (51.3%, 52.4%, and 62.5%, respectively).

Additional Student Interviews. Four students also completed individual interviews about the ways in which they study chemistry and their experiences using the Virtual Lab LO. Two of the four students had obtained an “A” in the course and two had obtained a “C” in the course. The interview responses were remarkably similar, given the differences in final course grades. There was general agreement among all four students in that they:

  • Understood stoichiometry before using the virtual lab (VL), but the problems helped to solidify their understanding;
  • Solved problems using logic, rather than just substituting numbers in a formula; for this reason, they saw the VL problems as challenging and fun;
  • Used pen and paper as well as the VL to complete problems for assignments, in order to confirm their results and also to show their work;
  • Found the VL generally saved them time, in comparison to pen and paper solutions only;
  • Would strongly recommend using the VL software to incoming students, even if they had to spend a couple of hours learning how to use it;
  • Saw the VL as an excellent adjunct to real lab work, but would not recommend it as a replacement;
  • Liked the way in which the VL allowed some “playing” with stoichiometry problems and concepts and would like to see more of this capability built into the software;
  • Found that some erroneous and/or confusing wording in the [Virtual Lab] problems resulted in time wasted trying to figure out what was required.

Instructor Interview Results
The instructor for this course decided to use an appropriate available learning object rather than design one himself. His goal was to shift the emphasis of learning from surface, algorithmic procedures to deeper understanding at the conceptual level. The Carnegie Mellon LO suited this objective well. The results of this interview are presented in Table 10.

Table 10. Instructor interview results: Ryerson, Virtual Chemistry Lab

Institution and Course: Ryerson University; Introductory Chemistry
Learning Object: The Virtual Chemistry Lab
Development and Expert Review: Developed at Carnegie Mellon University; well-tested and well-reviewed
Uses: Required; used for one of the questions on three out of four assignments, in graduated levels of difficulty; the final problem can be solved only by using the Virtual Chemistry Lab; it is used to supplement a wet lab, and the number of students using it for each assignment is monitored
Re-Uses: Could be re-used in first-term general chemistry courses for nutrition and engineering students, in addition to the biology and chemistry students using it now
Reflections on Teaching: A small survey handed in with each assignment indicates whether or not students used the LO; student comments were generally favourable
Research Ideas for Student Learning: Studies to show how well the LO helped students to learn and apply the concepts; assess whether or not they shifted from an algorithmic method of thinking to a heuristic problem-solving approach

Ideas for research: Instructor comments suggest designing a study to demonstrate the shift from surface (algorithmic) to deeper (heuristic) understanding through concept applications based on problem-solving activities.


SUMMARY AND CONCLUSIONS

Our goal was to present straightforward assessment techniques designed to provide practical information for users and potential users of learning objects available in the CLOE repository. To be practical, information about learning objects needs to be grounded in the teaching and learning experiences of students and instructors, in this case, from many different subject areas. Thus our challenge was to test the applicability and flexibility of the student questionnaire in a series of cross-disciplinary settings with highly variable implementation practices. We also sought information from instructors through an interview process we hoped would give practical information from the teaching experience and add a modicum of face validity to the questionnaire.

To this end, we defined ‘practical information’ in terms of how students experienced learning objects in four key areas: learning value, value added, design usability and technology function. Instructors were also asked about their experiences teaching with learning objects, with such questions as: How were they used or re-used? Did the LO meet learning objectives? Would they consider more formal research to be worthwhile? We then compared results from the two data sources.

The following table displays the subscale questionnaire results for the 'strongly agree' and 'agree' response choices. It provides a quick means to compare student responses with instructor commentary. It is not intended to be comparative across courses.

Table 11. Per cent of subscale responses in the 'strongly agree' and 'agree' categories

Institution | LO Use | Expert Review | Learning Value | Value Added | Design Usability | Technology Function
Windsor | Optional | Peer reviewed | 91% | 82% | 97% | 100%
Waterloo, Kinesiology | Optional | Not reviewed | 67% | 66% | 83% | 88%
Waterloo, Particle Physics | Required | Not reviewed | 36% | 46% | 58% | 82%
Ryerson | Required | Peer reviewed | 59% | 57% | 63% | N/A

Instructor Comments demonstrate a generally high level of agreement with student questionnaire responses in three of the four studies:

  • The Windsor instructor noted that almost all students used the LO because they perceived positive course value in terms of better performance outcomes and consequently, learning objectives were met. The LO performed very well.
  • The Waterloo Kinesiology instructor attributed low usage to a lack of perceived course value, but noted high acceptance among students who did use it. Learning objectives were apparently met within the user group. This LO also performed well.
  • The Waterloo Physics instructor saw student usage hampered by the program's lack of speed, its missing capabilities and its limited visual appeal. Technically the LO worked well, but as both the instructor and the students interviewed noted, it was easier to revert to freehand diagramming, and the two groups generally agreed on the features that were needed.
  • The Ryerson instructor monitored students’ use of the LO over three assignments with increasing expectation to use it. He noted generally positive comments from students through a usage survey. Usage tended to be mixed between hand and LO calculations, with the LO losing ground to hand calculations by the last assignment. Although the instructor did not comment on this, students indicated a preference for manual calculations and using the LO to check the accuracy of their work. Thus they did perceive the object to be useful as an adjunct and the students interviewed indicated taking a logical rather than an algorithmic problem solution path. Agreement with students is more implied than explicitly stated in this case.

In sum, in these four studies, we did not see differences between optional and required uses of the learning objects, nor between peer reviewed and not reviewed LOs. Where figures were available, the technology function presented no problems for students. Design issues, on the other hand, do affect student usage and this is generally agreed upon by both instructors and students. From this small study we can conclude that the most salient feature of a learning object is its perceived direct impact on student learning outcomes, something that could be more obviously instantiated into course assignments by instructors. Interestingly, students can and do find uses for learning objects other than to meet intended instructional objectives, a testimony to their resiliency.

On the whole, we conclude that we have acquired an initial indication of face validity for this questionnaire, suggesting that it will yield useful information for instructors using learning objects and for those planning to reuse them in the same or other contexts. It appears to be an easy, quickly scored method for acquiring student feedback. This is especially important because, combined with similar instructor observations, it points to issues where more formal research on the learning impacts of LO use would be desirable.

References

Bratina, T.A., Hayes, D. & Blumsack, S.L. (2002). Preparing teachers to use learning objects. Faculty and Staff Development, November/December.

Littlejohn, A., Jung, I. & Broumley, L. (2003). A comparison of issues in the reuse of resources in schools and colleges. In A. Littlejohn (Ed.), Reusing online resources: A sustainable approach to e-learning (pp 212-220), London: Kogan Page.

Nesbit, J., Belfer, K. & Vargo, J. (2002). A convergent participant model for evaluation of learning objects. Canadian Journal of Learning and Technology, 28 (3).

Oliver, R. (2001). Learning objects: Supporting flexible delivery of online learning. In G. Kennedy, M. Keppell, C. McNaught & T. Petrovic (Eds.), Meeting at the Crossroads. Proceedings of the 18th Annual Conference of the Australian Society for Computers in Learning in Tertiary Education. (pp. 453-460). Melbourne: Biomedical Multimedia Unit, The University of Melbourne. [http://www.ascilite.org.au/conferences/melbourne01/pubs/index.html]

Recker, M. M., Dorward, J., & Nelson, L.M. (2004). Discovery and Use of Online Learning Resources: Case Study Findings. Educational Technology & Society, 7 (2), 93-104.

Sander, U., Huk, T. & Floto, C. (2002). Evaluation of a website with learning objects for cell biology: Target groups and usability. Paper presented at the World Conference on E-Learning in Corporate, Government, Health Care and Higher Education, Montreal, October 15-19.

Schoner, V., Bailey, M. & Buzza, D. (2004). Evaluating student perceptions of connectedness, instruction and learning in online classroom settings. Paper presented at the Canadian Evaluation Society Annual Conference, Saskatoon, Sask., May 16-18.

Stufflebeam, D. L. (1971). The relevance of the CIPP evaluation model for educational accountability. Journal of Research and Development in Education, 5 (1), 19-25.

Williams, D. D. (2002). Evaluation of learning objects and instruction using learning objects. In D. A. Wiley (Ed.), The instructional use of learning objects. Bloomington, IN: Agency for Instructional Technology/Association for Educational Communications & Technology.

Woo, K., Gosper, M., Gibbs, D., Hand, T., Kerr, S., & Rich, D. (2004). User perspectives on learning object systems. Paper presented at the Tenth Australian World Wide Web Conference, Gold Coast, Australia, July 3-7.

Worthen, B. R., Sanders, J. R. & Fitzpatrick, J. L. (1997). Program evaluation: Alternative approaches and practical guidelines (2nd ed.). New York: Longman.


 

APPENDIX A

Generic Student Questionnaire

Recently in a [Course number] class, you were introduced to the [Title and format of the learning tool, and web site if applicable]. We would like to gather student feedback about the effectiveness of this [type of learning tool] at meeting the objective of “helping to teach [the concept or other learning outcome desired]”. Please [visit the web-site, try out the CD, etc.] and explore, then use the following questions to rate the [name of learning tool].

[Insert instructions for how students should complete and submit their questionnaire responses. You may choose to put it on the course web site, or you may have a person attend an on-campus class to administer and collect it, as is done with course evaluations.] Your responses on the questionnaire will not affect any assignment marks or your overall grade in [Course].

What is the format of your class?

___ Distance Education / print-based

___ Distance Education/online

___ Traditional, on campus

___ Other/ Combined

Rating Categories are as follows:

  1. strongly agree (SA)
  2. agree (A)
  3. neutral (N)
  4. disagree (D)
  5. strongly disagree (SD)
  6. N/A [not applicable]

Object 1

For each statement below, select one rating: 1 (strongly agree) through 5 (strongly disagree), or N/A.

Part 1.  Learning Value

The incorporation of this [name tool] for the [lab, session, seminar, etc.]…

  1. helped me learn this material in a new way.
  2. helped me learn these concepts at my own pace.
  3. helped in that I did not have to spend additional time studying text or notes in order to grasp this concept.
  4. helped me to visualize [concept, data, etc. to be learned].
  5.

Part 2.  Value Added by Learning Object

When using this software tool…

  7. I was able to develop a better understanding of how to [cognitive activity, type of problem solving, etc.].
  8. I was able to work through examples in a way that would not have been possible by attending a lecture or reading a textbook.
  9. I was able to [solve extra problems, experience situations, etc.] that otherwise I wouldn't have done.

Part 3.  Usability of the Design

I found that…

  10. the [web-site, CD, etc.] was easy to understand and use.
  11. the ideas and concepts incorporated within the [web-site, CD, etc.] were clearly presented and easy to follow.
  12. the sequencing of the [tool] sections flowed in a logical order.
  13. I was able to fully [use the learning object; e.g., view the movie, complete the interactive learning activity] by following the instructions provided.

Part 4.  Technology Function

When using this [tutorial, etc.]…

  14. I was not disadvantaged because I possess adequate computer skills.
  15. I did not miss important information because the technology worked correctly.
  16. the hardware/software requirements did not present a problem for me.
  17. I have not used this software tool due to technical difficulties.
  18. I did not use this software tool for other reasons.



APPENDIX B

Faculty Interview Protocol

Name of Institution ____________________________
Name of Faculty Member ________________________

  1. Is this a learning object that you developed or helped to develop?
  2. Have you used it before in previous classes or other courses?
  3. Were there changes made in using it a second time?
  4. Can you tell me how you use this learning object/tool in your teaching?
  5. If its use is optional for students, do you know how many of them used it? [If not, can you speculate on this?]
  6. Do you have a sense of how helpful it was for them?
  7. What kinds of comment do students make about the learning object, if any?
  8. If you were going to make changes to this learning object or to the way you have students use it, what would these changes be?
  9. Can you think of other courses in which this learning object could be used? If so, how would it have to be changed for re-purposing it, if at all?
  10. What kinds of feedback would you like to obtain from your students?
  11. Is there other information you would like to gather, such as comparisons of student performance with and without the use of the learning object?
  12. Do you have any other questions we might be able to help answer in this study?


This work is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 2.5 License.


   