MERLOT Journal of Online Learning and Teaching
Vol. 4, No. 2, June 2008


Investigating the Connection between Usability and Learning Outcomes
in Online Learning Environments


Gabriele Meiselwitz
Computer and Information Sciences
College of Science and Mathematics
Towson University
Towson, MD  USA
gmeiselwitz@towson.edu

 

William A. Sadera
Educational Technology and Literacy
College of Education
Towson University
Towson, MD  USA
bsadera@towson.edu
 

Abstract

Online learning is used in many institutions of higher education, with course offerings ranging from complete online degrees to hybrid virtual and physical courses. Online learning environments are complex environments using a variety of technologies and tools to overcome time and location restrictions. The research presented in this article focuses on a web-based asynchronous learning environment and the integration of usability factors into the evaluation of student learning outcomes. Usability tools are often employed in Human-Computer Interaction (HCI) to measure the quality of a user’s experience when interacting with a web site and could potentially impact learning in web-based online learning environments. This study investigates the relationships between usability factors and learning outcomes in an online learning environment, as well as differences in learning outcomes and system usability among student groups defined by computer competency scores, gender, age, and student standing. The results of this survey-based study highlight the importance of integrating usability factors into the evaluation of learning outcomes in online learning environments.

Keywords: Online learning, usability, learning outcomes, evaluation, assessment

 

Introduction

Online learning has become an integral component in many institutions of higher education. Institutions offer a variety of online learning options, including complete online degrees, complete online courses integrated into traditional programs, and a wide range of hybrid (or blended) learning courses. Online courses, or online elements used in hybrid instruction, allow academic institutions to overcome time and location restrictions and offer a number of other advantages for the institution and the student (Moore, Sener & Fetzner, 2006). Currently, 58% of institutions in higher education consider online education part of their long-term strategy and expect enrollment in online courses to continue to increase (Allen & Seaman, 2006).

All online learning environments require the integration of technology and make it necessary for students as well as instructors to be familiar with at least certain aspects of technology. In these settings, instructors must develop instructional methods which include technology and related computing tools. However, many educators in higher education have little training regarding the potential and limitations of online learning environments and their applications. Instructors often have to learn, through trial and error, how to use these technologies and tools and how to teach effectively using these systems (Moore & Kearsley, 2005).

In addition to serving as a virtual setting where technology can help support learning, online learning environments have to fulfill many other user expectations. They are expected to offer advanced interfaces and features to suit a myriad of learners, and at the same time they are also expected to be flexible and easy to use to suit various learning styles and educational requirements (Allen & Seaman, 2006; Pollanen, 2007). The diversity of course offerings and learning situations adds further complexity. Each learning environment is unique and even the same course taught by the same instructor is never exactly the same instruction. Instructor facilitation and student participation add further variables to a course. Every online course is a combination of several variables, including the course management system (CMS), instructional activities, the students, and the instructor.

The integration of technology, as well as the diversity of the learning environment and the learner, makes assessment of online learning environments a complex issue and requires re-evaluation of traditional assessment methods (Moore & Kearsley, 2005). Quality teaching and learning in virtual environments is often associated with the pedagogical principles of learner-centered education, active learning, higher order thinking skills, and team work (American Distance Education Consortium, 2003). Moreover, the technology and tools used to support online learning are often identified as significant factors influencing student learning outcomes in online learning environments (Fredericksen, Picket, Pelz, Swan & Shea, 1999; Oliver & Herrington, 2003).

Usability is an element of Human-Computer Interaction and a common research practice in computer science. It is used to assess how well technology and tools are working for users. Usability measures the quality of a user’s experience when interacting with a product or system and is an essential element of web design and development. It is standard practice for professional web developers to apply usability guidelines to improve the user experience with Internet and World Wide Web technologies and to evaluate users’ ease of use and satisfaction with web interfaces. Successful interaction with an online environment increases user satisfaction and productivity and strengthens acceptance of the product (Lazar, 2001; Shneiderman, 1998).

Evaluation of online learning technologies and tools can benefit from usability research, and many initiatives encourage web developers to increase the level of usability of web documents (Koohang, 2004; Shneiderman, 1998; U.S. Department of Health and Human Services, n.d.). Research has shown that implementation of usability principles can assist instructors in enhancing the learning experience for students in online learning environments, and that it can influence the student learning process and learner effectiveness (Koohang, 2004).

Current research suggests that evaluation of learning outcomes in online learning environments needs to examine the complexity and interconnection of education and technology by considering indicators from both fields (Association for Computing Machinery, 2001; Grice & Hart-Davidson, 2002). However, evaluation often does not include usability factors, making it problematic to analyze the usefulness of, and satisfaction with, the tools and interfaces of online learning environments (Association for Computing Machinery, 2001; Nelson & Wayne, 1999; Zaharias, 2006). Little research has been done to investigate these relationships between usability and learning outcomes in online learning environments (Feldstein, 2002; Quigley, 2002).

This paper presents a research study which examined the application of usability research in online learning environments and the effects of usability on student learning outcomes, including student achievement, communication and collaboration. A survey instrument, employing user-based assessment, was developed and tested for reliability as part of this study.

Methodology

It was the purpose of this research to investigate both relationships and differences between student learning outcomes and usability factors in online learning environments. Additionally, this research focused on differences in learning outcomes and system usability among several selected student groups, defined by gender, age, student standing, and student computer competency scores.

The participants in this study were 240 students attending a medium-sized comprehensive university in the Mid-Atlantic region. The course from which this sample was drawn was open to all students and spread across eight sections. This 15-week, three-credit course was offered in hybrid format, covered topics such as computer-based animation, sound editing, and web publishing, and included an extensive term project. Approximately 50% of class material was taught using the World Wide Web, and approximately 50% was taught using face-to-face instruction. A CMS was used to provide online lecture notes and assignment instructions, to submit and discuss student work, and to record student grades. All eight sections used the same online class site and were administered by the same instructor, along with two teaching assistants.

Quantitative data were collected through the use of a survey, administered during the last week of the course after students had finished the majority of class work and had used the online learning environment for the complete duration of the course. The research was guided by the following research questions:

· Is there a statistically significant correlation (p < 0.05) between online learning system usability and student learning outcomes?

· Is there a statistically significant difference (p < 0.05) in both student learning outcomes and online learning system usability among student groups defined by computer competency scores?

· Is there a statistically significant difference (p < 0.05) in both student learning outcomes and online learning system usability among student groups defined by gender, age, and student standing?

To conduct this research, a survey instrument integrating usability research and evaluation of student learning outcomes in online learning environments was developed (Meiselwitz & Lu, 2005). A relationship between system usability and learning outcomes would demonstrate that when overall system usability increases, overall student learning experience also increases, or vice versa. Disaggregated data provide indicators to assist in identifying possible causes of such relationships. The new questionnaire, the Web-based Learning and Usability Questionnaire (WLUQ), was constructed based on two existing questionnaires: the Post-Study System Usability Questionnaire (PSSUQ) and the Web-based Learning Environment Instrument (WLEI). The PSSUQ is intended primarily for assessment of user satisfaction with the usability of a system (Lewis, 1995). The WLEI specifically targets user satisfaction in web-based learning environments (Chang, 1999).

The PSSUQ was developed in 1995 for IBM, contains 19 questions and provides opportunity for open-ended user comments (Lewis, 1995). Items selected from this questionnaire assess three areas of usability: (a) system usefulness (4 questions), (b) information quality (3 questions), and (c) interface quality (2 questions). System usefulness inquires about the usefulness of the tool for the task. Information quality addresses system support information and error handling; interface quality targets the general quality and functionality of the system interface.

The WLEI consists of 32 questions, provides opportunity for open-ended comments, and targets the effectiveness of the web-based learning environment from a student’s perspective (Chang, 1999). Items selected from this questionnaire address four areas important for learning outcomes: (a) learner control and self-direction of the learning process (3 questions), (b) communication and collaboration (3 questions), (c) student achievement (3 questions), and (d) structure and organization of the learning environment (3 questions).

Both instruments have established reliability and validity: the PSSUQ has a Cronbach’s alpha of 0.97, and the WLEI a Cronbach’s alpha of 0.87. The authors of the two instruments were contacted to obtain copies and permission to use their instruments.

The newly created WLUQ consisted of selected items from both questionnaires and was tested for reliability in a pilot study and again in this study. Across its three scored sections (competency, usability, and learning outcomes; 28 items total), the instrument showed an overall reliability of 0.9177 (Cronbach’s alpha).
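For readers who wish to run a comparable reliability check on their own instrument data, the following minimal sketch computes Cronbach’s alpha from a respondents-by-items score matrix. The sample responses are hypothetical placeholders, not WLUQ data.

import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for a (respondents x items) matrix of scores."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]                          # number of items
    item_vars = items.var(axis=0, ddof=1)       # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)   # variance of the summed scale
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical example: five respondents answering four 5-point Likert items.
responses = np.array([
    [4, 5, 4, 4],
    [3, 3, 4, 3],
    [5, 5, 5, 4],
    [2, 3, 2, 3],
    [4, 4, 5, 4],
])
print(f"Cronbach's alpha = {cronbach_alpha(responses):.4f}")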

Results

The WLUQ is organized in four sections and consists of 40 questions, including three open-ended questions. Section I collects demographic data, section II data on computer competency, section III system usability data, and section IV data on student learning outcomes.

Description of Subjects

As noted earlier, the sample for this research study was taken from an introductory computer science class at a mid-sized, comprehensive university. Of the 240 students enrolled in the course, 221 completed the course and 181 completed the survey, yielding an 82% response rate. More than half the participants were female (54.7%), and the majority of students were sophomores and juniors (87.3%). Prior experience with a CMS was moderate for most students; 76% had taken between two and five courses using a CMS. Participants self-rated their computer competency level on a 5-point scale (1 = poor, 5 = excellent), reporting an overall computer competency score of 4.29. Usability scores were self-reported on a 5-point scale (1 = strongly disagree, 5 = strongly agree), and students rated the overall usability of the online learning system at 4.27. Scores for learning outcomes were also self-reported on the same 5-point scale, and students rated their overall learning outcomes at 4.36.

Correlation between Usability and Learning Outcomes

This section reports results pertaining to the following research question: Is there a significant relationship (p < 0.05) between online learning system usability and student learning outcomes? Analysis of the relationship between usability and learning outcomes showed a significant positive correlation (r = 0.83), supporting a linear relationship: as overall system usability increased, overall student learning outcomes also increased, or vice versa. Intercorrelations between subscales of the two variables confirmed the correlation between usability factors and learning outcomes and displayed significant positive correlations at the 0.01 level. To assess the relevance of the correlation between usability and learning outcomes, a stepwise multiple regression was performed, with learning outcomes as the dependent variable and system usefulness, interface quality, and information quality as the predictor variables.

Results confirmed the relevance of the correlation between system usability and student learning outcomes (adjusted R² = 0.68; F(3,156) = 113.46, p < 0.005, stepwise method). In this regression model, the measured usability factors (system usefulness, interface quality, and information quality) accounted for approximately 68% of the variance in student learning outcomes.
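As an illustration of the kind of analysis reported above, the sketch below computes the Pearson correlation between overall usability and learning outcomes and then fits a regression of learning outcomes on the three usability subscales. All values and column names are hypothetical, and statsmodels’ plain ordinary least squares stands in for the stepwise procedure used in the study.

import pandas as pd
import statsmodels.api as sm
from scipy.stats import pearsonr

# Hypothetical per-student subscale means (the study used WLUQ scores).
df = pd.DataFrame({
    "system_usefulness":   [4.1, 3.8, 4.5, 3.2, 4.7, 4.0, 3.5, 4.4],
    "interface_quality":   [4.0, 3.5, 4.6, 3.0, 4.8, 4.1, 3.6, 4.2],
    "information_quality": [3.8, 3.4, 4.2, 2.9, 4.5, 3.9, 3.3, 4.0],
    "learning_outcomes":   [4.2, 3.7, 4.6, 3.1, 4.9, 4.0, 3.4, 4.3],
})
predictors = ["system_usefulness", "interface_quality", "information_quality"]

# Correlation between overall usability and learning outcomes.
overall_usability = df[predictors].mean(axis=1)
r, p = pearsonr(overall_usability, df["learning_outcomes"])
print(f"r = {r:.2f}, p = {p:.4f}")

# Regression of learning outcomes on the three usability subscales.
X = sm.add_constant(df[predictors])
model = sm.OLS(df["learning_outcomes"], X).fit()
print(f"adjusted R-squared = {model.rsquared_adj:.2f}")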

Differences between Computer Competency and both Usability and Learning Outcomes

This section reports results pertaining to the following research question: Is there a statistically significant difference (p < 0.05) in both online learning system usability and student learning outcomes among levels of computer competency? One-way ANOVAs disclosed significant differences in overall usability among levels of competency for searching/browsing the WWW, email, and word processing; no significant differences were found for electronic discussion boards or web development. For overall learning outcomes, one-way ANOVAs disclosed significant differences among levels of competency for searching/browsing the WWW and email, but not for word processing, electronic discussions, or web development. Table 1 summarizes the results for this series of one-way ANOVAs.

Results showed that abilities and skills such as simple electronic communication and basic Internet knowledge seemed sufficient to be associated with higher perceived system usability and higher learning outcomes. In this study, advanced knowledge about online learning environments or web applications did not increase the ratings of system usability or learning outcomes.

Table 1. Summary of One-Way ANOVA Series among Levels of Competency

                     Search/browse WWW   Email        Word processing   Electronic discussions   Webpage development
Usability            F = 4.81 *          F = 5.91 *   F = 4.66 *        F = 1.88 NS              F = 1.20 NS
Learning outcomes    F = 2.83 *          F = 4.16 *   F = 1.99 NS       F = 0.92 NS              F = 1.49 NS

Note. * = significant at 0.05; NS = not significant.
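For illustration, one test in such a series could be run with scipy as sketched below; the competency levels and usability ratings are hypothetical placeholders, and the study ran a separate one-way ANOVA for each skill area.

from scipy.stats import f_oneway

# Hypothetical overall usability ratings grouped by self-rated email
# competency (1 = poor ... 5 = excellent); only three levels shown here.
usability_by_level = {
    3: [3.4, 3.8, 3.6, 3.9],
    4: [4.0, 4.2, 4.1, 4.4, 3.9],
    5: [4.5, 4.6, 4.3, 4.7, 4.4],
}

f_stat, p_value = f_oneway(*usability_by_level.values())
print(f"F = {f_stat:.2f}, p = {p_value:.4f}")
print("significant at 0.05" if p_value < 0.05 else "NS")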

Differences between Gender, Age, and Student Standing and both Usability and Learning Outcomes

This section reports results pertaining to the following research question: Is there a statistically significant difference (p < 0.05) in both student learning outcomes and online learning system usability among student groups defined by gender, age, and student standing? A series of one-way ANOVAs was performed to analyze these differences.

Table 2. Summary of One-Way ANOVA Series among Gender, Age, and Student Standing

                     Gender        Age           Student standing
Usability            F = 0.75 NS   F = 0.03 NS   F = 0.14 NS
Learning outcomes    F = 3.68 NS   F = 0.23 NS   F = 0.55 NS

Note. NS = not significant at 0.05.


Results showed no significant differences among gender, age, and student standing with regard to either usability or learning outcomes. Table 2 summarizes the results for this series of one-way ANOVAs.

However, when evaluating results for these demographic groups, it should be considered that the student group in this study was relatively homogeneous: although 181 students completed the survey, more than 75% were 20 years or younger and over 50% were sophomores.

Discussion

This study showed that usability factors are important elements in the assessment of student learning outcomes in online learning environments. Students reported high ratings for both system usability and learning outcomes. Learning outcomes received high ratings (M = 4.37), and further analysis revealed that students clearly emphasized learner control as well as structure and organization. Student comments supported conclusions drawn from the statistical results. Over 30% of all students remarked how much they liked the online grade book that was used to provide grade-based feedback on their work. Through this immediate grade-based feedback, students felt enabled to exercise better control over their learning process and empowered to take responsibility for their learning. Students also appreciated the convenience and availability of online course material. These findings suggest that students in this course were task-oriented: they clearly valued the increased independence of this online learning environment, especially its asynchronous nature, which allowed more flexibility in the time and place required to attend class and complete course assignments.

Overall system usability also received a high rating (M = 4.35). Further analysis of the system usability indicators revealed that students were focused on “getting the job done” and appreciated the usefulness of the tool for the task, easy navigation, and how quickly the system could be learned. However, students also readily expressed their dissatisfaction with error messages, error recovery, and the online help system, supporting the statistical results that identified information quality as the weakest area of online learning usability (M = 4.0). Students mentioned that the system was sometimes slow to load, especially when many students were logged in concurrently. They also reported that the login process was slow and cumbersome, contained too many screens, and often took too many steps to reach their destination.

Correlation

The significant positive correlation between usability factors and learning outcomes showed that when system usability increased, learning outcomes also increased, or vice versa. A regression analysis confirmed the importance of this result and showed that student learning outcomes were largely influenced by system usability. This finding supported existing research (Fredericksen et al., 1999; Oliver & Herrington, 2003; Valenta, Therriault, Dieter, & Mrtek, 2001) confirming that tool design and use of the tool indeed significantly influence student learning outcomes and attitudes. It further supported existing research calling for the integration of usability research into the evaluation of online learning environments and student learning outcomes (Feldstein, 2002; Quigley, 2002).

Competency

Literature investigating the role of computer competency in learning outcomes or usability in online learning environments is not entirely consistent. Fredericksen et al. (1999) reported no significant differences in learning outcomes among computer competency levels; however, this was contradicted by Dutton, Dutton, and Perry (2002) and Koohang (2004), whose studies found that prior computing experience improved learning outcomes.

This study discovered differences among certain areas of computer competency with respect to usability and learning outcomes. Evaluation of the disaggregated data confirmed results noted by Dutton et al. (2002) and Koohang (2004). Students with high competency in basic Internet/WWW tasks (such as browsing), basic communication tasks (such as email), and basic word processing also reported high ratings for usability. Students with high competency in basic Internet/WWW and communication tasks also reported high ratings for learning outcomes.

Interestingly, advanced experience, such as web design or use of advanced communication interfaces such as electronic discussion boards, did not show significant differences for system usability or learning outcomes. These findings demonstrated that competency in basic computer tasks was sufficient to increase perceived system usability and learning outcomes in this online learning environment; advanced knowledge was not necessary to raise usability ratings or learning outcomes. These results are important because they not only identified basic computer competency as the level associated with higher perceived usability and learning outcomes, but also pointed to training or preparation that could be offered to increase learning outcomes and system usability for students in online learning environments.

Gender, Age, and Student Standing

No significant differences in system usability or learning outcomes were identified by gender, age, or student standing. This study supported observations of Koohang (2004) and Dutton et al. (2002), who also reported no significant differences in system usability by gender or age. It should be noted that, as mentioned earlier, the evaluated student group was very homogeneous (75.7% were 20 years or younger, 87.3% were sophomores and juniors), and this may be partially responsible for the results regarding gender, age, and student standing. Further evaluation with less homogeneous student groups or in non-traditional learning environments would be beneficial to strengthen these findings.

Suggestions for Future Research

Results of this study showed a clear connection between usability and student learning outcomes and suggest that usability factors must be considered in online learning environments. Training in usability, along with the availability of suitable methods and tools for instructors, could significantly improve the integration of usability into online learning environments.

In addition to the correlation between system usability and learning outcomes, this study also revealed differences among various levels of student computer competency. This suggests that training or prior competency assessment could assist students in improving their online learning experience. Further research could be designed to evaluate the effects of a pre-assessment or pre-training seminar to confirm the findings of this study.

Due to the homogeneity of the group, further studies should be performed with a more heterogeneous student group or in an upper-level course to evaluate these correlations and to allow further conclusions about the correlation between system usability and learning outcomes in online learning environments.

The course also focused heavily on skills development and the sharing of individual, creative work. Further research using a course that is more discussion-based, collaborative, and cooperative may also provide additional insight into the importance of usability in online learning environments.

Finally, a longitudinal study considering instructor input may also be of interest. Data from this study regarding successful learning outcomes were self-reported by students and collected at the end of the course. A study considering instructor-measured learning outcomes (e.g., grades) over time could provide additional insight into the correlation between system usability and online learning outcomes and differences among certain student groups.

Conclusion     

This study, in conjunction with existing research, suggests that usability factors are vital elements in online learning environments, and that instructors should implement usability guidelines when creating content for these environments. The results of this research study help to provide an understanding of the importance of system usability and its relationship to student learning outcomes. Students clearly valued structure, organization, and the increased control and flexibility that these learning environments provide. Considering the growing presence of online and hybrid learning environments in traditional institutions of higher education, it is vital to increase awareness of the role of usability in online learning.

Many instructors have little experience with web design and development. It is crucial to provide information and training on how to implement usability guidelines in the creation of online educational content and in the design of online learning environments. The “Eight golden rules of interface design” (Shneiderman, 1998) or the “Top ten guidelines for homepage usability” (Nielsen, 2003) could provide easy-to-use, easy-to-implement guidelines to assist instructors in enhancing the learning process and learning outcomes by increasing system usability.

Unfortunately, instructors often have little influence on the choice of tool or the design of the actual shell of the CMS used in online instruction. However, in most cases instructors can control the content that is posted within their classes. As a result, it is all the more important to provide instructors with tools to analyze and improve the online or hybrid learning environment. The WLUQ employed in this study can provide an easy-to-use tool for instructors to assess relationships between usability and student learning outcomes in an individual course, and it considers the particular modes of presentation and operation of the course as well as the course goals. The instrument allows for further analysis of disaggregated data on several factors affecting system usability and learning outcomes and provides indicators for improvement of the learning environment.

This study indicates that integration of usability factors into online learning environments can assist in improving learning outcomes for students. Through the implementation of usability principles in virtual learning environments, instructors can support higher levels of usability and improve the online learning environment.


References

Allen, E., & Seaman, J. (2006). Making the grade: Online education in the United States, 2006. Retrieved November 3, 2007, from http://www.sloan-c.org/publications/survey/index.asp

American Distance Education Consortium (2003). Guiding principles for distance teaching and learning. Retrieved March 25, 2005, from http://www.adec.edu/admin/papers/distance-teaching_principles.html

Association for Computing Machinery (2001). Notes from online learning special interest group discussion at Computer Human Interaction Conference 2001 in Seattle, Washington. Retrieved October 28, 2004, from http://elearnmag.org/subpage/sub_page.cfm?section=6&list_item=1&page=1.  

Chang, V. (1999). Evaluating the effectiveness of online learning using a new web based learning instrument. Proceedings of the Western Australian Institute for Educational Research Forum. Retrieved February 15, 2004, from http://education.curtin.edu.au/waier/forums/1999/chang.html

Dutton, J., Dutton, M., & Perry, J. (2002). How do online students differ from lecture students? Journal of Asynchronous Learning Networks, 6(1).

Feldstein, M. (2002, April 8). Ignore usability at your peril. eLearn Magazine. Retrieved December 11, 2003, from http://elearnmag.org/subpage/sub_page.cfm?article_pk=5142&page_number_nb=1&title=COLUMN

Fredericksen, E., Picket, A., Pelz, W., Swan, K., & Shea, P. (1999). State University of New York student satisfaction and perceived learning with on-line courses: Principles and examples from the SUNY learning network. Retrieved April 24, 2004, from http://SLN.suny.edu/SLN

Grice, R., & Hart-Davidson, B. (2002). Mapping the expanding landscape of usability: the case of distributed education. ACM Journal of Computer Documentation, 26, 159-167.  

Koohang, A. (2004). A study of users’ perceptions toward online learning courseware usability. International Journal on Online Learning, 3(2), 10-17.

Lazar, J. (2001). User-centered web development. Sudbury, MA: Jones and Bartlett Publishers.  

Lewis, J. R. (1995). IBM computer usability satisfaction questionnaires: Psychometric evaluation and instructions for use. International Journal of Human-Computer Interaction, 7(1), 57-78.

Meiselwitz, G., & Lu, C. (2005). Questionnaire for evaluation of usability and learning outcomes in online instruction. Proceedings of ECEL, the 4th European Conference on e-Learning, August 2005, Amsterdam, The Netherlands.

Moore, J., Sener, J., & Fetzner, M. (2006). Getting better: ALN and student success. Journal of Asynchronous Learning Networks, 10(3), 55-84.

Moore, M., & Kearsley, G. (2005). Distance education. Belmont, CA: Thomson Wadsworth.

Nelson, B., & Wayne, A. (1999). If you build it, they will come. But how will they use it? Journal of Research on Computing in Education, 32(2), 270-287.

Nielsen, J. (2003). Usability 101. Retrieved March 28, 2004, from http://www.useit.com/alertbox/20030825.html

Oliver, R., & Herrington, J. (2003). Factors influencing quality online learning experiences. In G. Davies & E. Stacey (Eds.), Quality education @ a distance. London: Kluwer Academic Publishers.

Pollanen, M. (2007). Improving learner motivation with online assignments. Journal of Online Learning and Teaching, 3(2). Retrieved November 22, 2007, from https://jolt.merlot.org/vol3no2/pollanen.htm

Quigley, A. (2002, April 8). Usability-tested online learning? Not until the market requires it. eLearn Magazine. Retrieved December 11, 2003, from http://elearnmag.org/subpage/sub_page.cfm?article_pk=3301&page_number_nb=1&title=FEATURE%20STORY

Shneiderman, B. (1998). Designing the user interface (3rd ed.). Reading, MA: Addison Wesley Longman.

U.S. Department of Health and Human Services. (n.d.). Usability. Retrieved October 22, 2005, from http://www.usability.gov

Valenta, A., Therriault, D., Dieter, M., & Mrtek, R. (2001). Identifying student attitudes and learning styles in distance education. Journal of Asynchronous Learning Networks, 5(2), 111-127.

Zaharias, P. (2006). A usability evaluation method for online learning: Focus on motivation to learn. Proceedings of CHI 2006, Conference on Human Factors in Computing Systems, April 2006, Montréal, Québec, Canada.

 


Manuscript received 15 Sep 2007; revision received 3 May 2008.

This work is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 2.5 License.
