Blackboard vs. Moodle: Comparing User Experience of Learning Management Systems

Michael Machado, Eric Tao
School of Information Technology and Communication Design, California State University Monterey Bay, Seaside, CA 93955
michael_machado@csumb.edu, eric_tao@csumb.edu

Abstract - Learning management systems are becoming ubiquitous at institutions of higher learning. Before these systems can be considered effective, the user experience must be studied and analyzed to provide the optimum solution to meet the pedagogical needs of both faculty and students. This study compared the user experience of the leading proprietary solution, Blackboard, and the leading open source solution, Moodle. We established a control group that used only the proprietary solution and two study groups, a faculty group and a student group, that used the open source solution but had previous experience with the proprietary solution. We used online surveys to compare the user experience of the basic functionality of each system, such as communication tools, student-student interaction tools, and student-instructor interaction tools. The study was conducted during the Fall 2006 semester at California State University Monterey Bay and included five upper division courses in which the learning management systems were used as an adjunct to a traditional face-to-face delivery modality.

Index Terms - Distributed Learning, e-Learning, Human Computer Interface, Learning Management System

INTRODUCTION

The use of technology in the classroom to assist instructors in meeting their pedagogical goals has become ubiquitous since the 1990s. The days of mimeograph machines and chalkboards are long past. The flood of technological innovations can be overwhelming and necessitates careful consideration of which technologies are the most effective and provide the highest cost/benefit ratio to the organizations using them. One technology that has been adopted both for corporate training and for use in institutions of higher education is the learning management system. An organization now has the choice between many competing learning management systems, from both proprietary software manufacturers and open source projects.

A learning management system is a software application designed with the specific intent of assisting instructors in meeting their pedagogical goal of delivering learning content to students. The learning management systems that this project considered were all web-based and accessible through an internet connection and a web browser. This allows round-the-clock access to the learning content provided for students and facilitates an educational institution's ability to provide distance learning opportunities. All learning management systems under consideration meet the following criteria [1]: high availability, usability, scalability, interoperability, stability, and security.

In the current market space there are many commercially available learning management systems from which to choose. The primary contender in this market space is Blackboard, especially when its acquisition of its main competitor, WebCT, is taken into consideration. The combined market share of Blackboard and WebCT is estimated at 80-90% of the secondary schools and universities that use learning management systems. It is estimated that 20% of institutions of higher learning in the United States use Blackboard software.

The market for learning management software has great potential, as it is estimated that approximately 59% of institutions of higher learning in the United States do not currently use any learning management system [2-3]. The open source community has also been active in creating alternative learning management systems that are free of licensing costs. The main advantages of an open source learning management system are the ability to modify the product and to redistribute those changes back into the community. In the more popular open source projects, as new features become available they can be integrated into a user's existing system as needed at minimal cost. The disadvantages of open source software are the lack of the dedicated support that proprietary systems receive from their manufacturers, and the fact that if an organization modifies the common code base too dramatically, its ability to upgrade to future releases of the software is impaired. Open source software also requires personnel with the requisite knowledge base to implement it, which may require the addition of new personnel or additional training for current personnel.

Currently the most popular open source learning management system is Moodle. According to the Moodle website there are currently 19,234 registered sites, with 35 sites supporting more than 20,000 users. California State University, Humboldt and San Francisco State University have both initiated pilot projects based on the Moodle learning management system. In December 2006, UCLA announced that in November 2006 the UCLA Faculty Committee on Educational Technology had decided that UCLA should converge on Moodle as the single open source platform for its common collaboration and learning environment [4].

The primary goal of this project was to compare the usability of two competing learning management systems from the viewpoint of the most critical stakeholders: faculty and students.

PRIOR WORKS

Two prior works help to provide a foundation and additional insight into this project. The first is unpublished raw data collected in 2005 by Kathy Munoz and Joan Van Duzer of California State University Humboldt [5]. They conducted a study similar to this one, but it used a single course of 35 students and the instructional format was a fully online course. Their study reached conclusions similar to this study's: on any one feature compared between the two learning management systems under investigation there was not always a clear winner, but in the aggregate students were more satisfied with their experience with Moodle and would prefer its use over Blackboard.

The second work considered was a conference paper by David Bremer and Reuben Bryant [6] on a study they conducted in 2004. Their study was similar to this one in that the systems were used as an adjunct to traditional classroom instruction, and participants were chosen with the assumption that they had prior experience with Blackboard. The differences were that only one course was used, with a total of twenty students. This work also came to conclusions similar to this study's: for any given feature of the learning management systems under review, either there was no clear winner or Moodle was slightly favored. In the aggregate, however, 80% of the students preferred Moodle as a learning tool when compared to Blackboard.

PROJECT GOALS

The goal of this project was to compare the usability of two competing information systems. To accomplish this, a pilot test was established utilizing an alternative learning management system to the one already in use by the hosting organization, California State University, Monterey Bay. Users of the learning management system used by the university were identified who were willing to participate in the pilot test. Participants were surveyed after a reasonable period of time so that sufficient data could be analyzed to attempt to answer the following questions: Which information system is more efficacious? Which system has the superior user interface? Which system provides the most desired functionality? Which system has the shallowest learning curve? Which system do the users prefer?

PROJECT RESOURCES

The pilot test of the open source learning management system, Moodle, was conducted in collaboration with CSU Monterey Bay's Center for Academic Technologies. The installation was hosted on Center for Academic Technologies servers located in the campus IT building and was installed and administered by Center for Academic Technologies personnel. Moodle is an open source software project and has no licensing costs. The modified version used for the pilot test was provided by San Francisco State University's Academic Technologies group, and additional software modules were provided by developers at California State University, Humboldt. Surveys were conducted using the web-based Opinio survey tool [7] that is available for scholarly research on the California State University, Monterey Bay campus. The time required for this project was approximately two man-months, not factoring in the contribution of the Center for Academic Technologies personnel.

The initial creation of a sample course on the Moodle system for demonstration purposes and for the initial training of faculty users took approximately thirty hours. Additional time was used to create surveys and interview forms, obtain Human Subjects approval, conduct the participant surveys, and analyze the data. Since the pilot test system is an open source project, there is a community of users that served as an informational resource for maintaining and expanding the system. This project had no specific budget, since all software being used was open source and free of licensing costs and the system was hosted on existing hardware available to the Center for Academic Technologies.

METHODOLOGY

A pilot test of the Moodle learning management system was conducted using five course sections that were taught for credit in the School of Information Technology and Communication Design during the Fall 2006 semester. These course sections all used the test system as their learning management system for the duration of the Fall 2006 semester, as a replacement for the currently deployed learning management system, Blackboard. All course sections had previously been taught using Blackboard, and all instructors participating in the pilot test had previous experience using Blackboard as a learning management system. The course sections chosen were all upper division courses in a technology focused major, and the assumption was made that the majority of the students enrolled had previously used Blackboard as a learning management system at CSU Monterey Bay. At the start of the pilot test there were 101 students enrolled in these five courses.

Three groups were identified to receive the surveys for the pilot test. The first group was a control group, established to provide a baseline against which to compare the pilot test student participants' survey responses. The candidates selected for the control group survey had taken the courses included in the pilot test the last semester they had been offered, either Spring 2006 or Fall 2005. Their survey consisted of a subset of the questions asked of the pilot test student participants, with all mention of the alternative learning management system removed. Seventy-seven students were identified that fit the control group category, and they were sent links to the control group survey at their campus email addresses. Twenty of the emails were returned as undeliverable; the reasons for this could include students transferring, graduating, or otherwise leaving the university. This left a total pool of 57 participants for the control group survey.

The second group was the student participants of the pilot test. At the beginning of the pilot test there were a total of 101 candidates for this group, but due to attrition the final number of candidates surveyed was 94. The pilot test survey for student participants was designed to provide the comparative data for the two learning management systems under consideration. The assumption was made that all or almost all participants would have had previous exposure to the Blackboard learning management system. This survey consisted of a series of identically phrased questions about the user's experience with each of the two learning management systems, utilizing an identical Likert scale.

The third group was the instructor participants of the pilot test. The pilot test instructor survey was designed to provide comparative data for the two learning management systems from the perspective of an instructor. The assumption was made that all or almost all participants would have had previous teaching experience utilizing the Blackboard learning management system. The survey consisted of a series of questions directly comparing the user's experiences with the two learning management systems, utilizing a Likert scale. The limitation of this part of the project was that there were only four instructors involved in the pilot test, one of whom had no prior experience teaching with the Blackboard learning management system. Even though all instructor participants responded to the survey, the small sample size does not allow for any reasonable quantitative analysis.

SURVEY DESIGN

The surveys for the pilot test consisted of four sections. Section 1 focused on demographic information: age, gender, and educational status. Section 2 focused on the user's prior experience with learning management systems and comfort level with information technology in general. Section 3 focused on questions that compared the user's experience with the two learning management systems. Section 4 was the narrative response section; it allowed users to provide additional comments or suggestions on any issues that were not addressed in the previous three sections of the survey. All three surveys consisted of identical question sets that were modified for the target audience. For example, the control group survey was a subset of the pilot test student participant survey with all questions related to the Moodle learning management system removed. Questions that did not require a yes or no answer and were not part of the narrative response section used Likert scales.

DATA COLLECTION

Surveys were sent to the student participants of the pilot test at the beginning of November 2006. The assumption was made that nine weeks of exposure to the new learning management system would allow sufficient time for the students to have formed a concrete opinion about the usability of the system. No incentives were offered for responding to the survey. The survey for this group included a total of 31 questions.
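
As a minimal sketch of the paired-question design described under SURVEY DESIGN, the Python fragment below tabulates responses to one identically phrased Likert item asked once per system. The five-point scale labels and all response data here are invented for illustration; they are not the study's actual instrument or results, which in practice would come from the Opinio tool's exports rather than hard-coded lists.

    # Sketch of the paired Likert-item comparison described above.
    # All question text, scale points, and response counts are hypothetical.
    from statistics import mean

    SCALE = {1: "strongly disagree", 2: "disagree", 3: "neutral",
             4: "agree", 5: "strongly agree"}

    # One identically phrased item, asked once per system (invented data).
    responses = {
        "Blackboard": [3, 4, 2, 3, 4, 3, 2, 4],
        "Moodle":     [4, 5, 3, 4, 4, 5, 3, 4],
    }

    for system, scores in responses.items():
        # Count how many respondents chose each scale point.
        counts = {label: sum(1 for s in scores if s == point)
                  for point, label in SCALE.items()}
        print(f"{system}: mean={mean(scores):.2f}, distribution={counts}")

Comparing both the means and the full distributions, as sketched here, matches the paper's observation that a single summary number rarely yields a clear winner on any one feature.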
The survey website was open at all hours from November 15, 2006 through December 6, 2006. A survey of the instructor participants was conducted in mid November 2006 to assess their opinions of and comments on the new learning management system. The assumptions were the same as above: that sufficient time would have elapsed for them to have formed concrete opinions. There were only four instructor participants to survey, so while their feedback is valuable to the pilot test, the group size is too small to support definitive statements concerning the usability of the system from an instructor perspective. In addition, one of the instructors had no previous teaching experience using the Blackboard learning management system. The survey for this group included a total of 19 questions.

A survey of students who had taken the courses involved in the pilot test the last semester they were offered for credit prior to the pilot test was also conducted in mid November 2006. They were surveyed about their experiences with the learning management system, Blackboard, used during those semesters and how it impacted their learning experience. This allowed for a basis of comparison with the student experience with the new test system. There were 77 students in this pool. No incentives were offered for responding to the survey. The survey for this group included a total of 22 questions.

SURVEY RESULTS

Control Group

The 57 candidates for the control group survey were sent reminder emails four times during the timeframe the survey was available. Thirteen of the candidates responded to the survey, and of those responding there were ten complete responses. One hundred percent of the participants that responded indicated an educational status of junior or senior in a technology focused undergraduate degree program. This gave a total response rate of 22%, but only a 17.5% rate of survey completion. The answers of significance given by the respondents to the control group survey are shown in the following figures. 90% of the students had no prior experience with learning management systems prior to their enrollment at the University.
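
The control group percentages above follow directly from the counts stated in the text; the short sketch below reproduces the arithmetic. (Note that 13/57 is approximately 22.8%, which the paper reports as 22%.)

    # Reproducing the control-group response-rate arithmetic from the
    # counts stated in the text (77 identified, 20 undeliverable,
    # 13 responses, 10 of them complete).
    identified = 77
    undeliverable = 20
    pool = identified - undeliverable       # 57 surveyable candidates

    responded = 13
    complete = 10

    response_rate = responded / pool        # 13/57 = 0.228 (reported as 22%)
    completion_rate = complete / pool       # 10/57 = 0.175 (reported as 17.5%)

    print(f"pool={pool}, response rate={response_rate:.1%}, "
          f"completion rate={completion_rate:.1%}")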

Figure 1: Rate yourself on your comfort level with information technology (control group responses ranged from comfortable to highly comfortable)
Figure 2: You felt that Blackboard was an easy system to use
Figure 3: In future courses, I would prefer to use Blackboard

Pilot Test Student Participant Survey

The 94 candidates for the pilot test student participant survey were sent reminder emails four times during the timeframe the survey was available. Fifty-three of the participants responded to the survey, and of those responding there were forty-eight complete responses. Forty-seven of the participants that responded indicated an educational status of junior or senior in a technology focused undergraduate degree program. This gave a total response rate of 56%, but only a 51.1% rate of survey completion. The answers of significance given by the respondents to the survey were in the following areas: 89% had used a learning management system prior to the pilot project, and 95% of them had experience with the Blackboard system.

Figure 5: Rate yourself on your comfort level with information technology (responses ranged from comfortable to highly comfortable)
Figure 6: Rate yourself on your experience level with learning management systems (responses ranged from experienced to very experienced)

The divergences in responses were seen in the following areas: "If an alternative learning management system was available I would be more willing to use it than Blackboard." In the narrative response section of the survey, the recurring themes were instructors' lack of expertise and the lack of standardization in how instructors used the learning management system.
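
Both student samples here are modest, and the instructor group below is explicitly treated as too small for quantitative analysis. As an illustration of how sample size limits such percentages, the sketch below applies a standard normal-approximation confidence interval to a proportion of the magnitude reported in the next subsection (75% preference among roughly 48 complete student responses). The paper itself reports only point percentages, so this method and these exact inputs are assumptions of the sketch, not the authors' analysis.

    # Illustrative normal-approximation 95% confidence interval for a
    # survey proportion. The paper does not state that it computed
    # intervals; p and n below reuse values reported in the text.
    from math import sqrt

    def proportion_ci(p, n, z=1.96):
        """Return (low, high) bounds of the ~95% CI for a proportion."""
        margin = z * sqrt(p * (1 - p) / n)
        return p - margin, p + margin

    low, high = proportion_ci(0.75, 48)
    print(f"75% of n=48: 95% CI roughly {low:.1%} to {high:.1%}")
    # With n=4 (the instructor group) the same formula yields an interval
    # too wide to be informative, consistent with the paper's caution.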

Figure 7 questions: Did the Moodle learning management system enhance the instruction that you received in your current course? / Did the Blackboard learning management system enhance the instruction in your previous courses?
Figure 8 questions: You felt the Moodle learning management system enhanced the organization of learning materials. / You felt the Blackboard learning management system enhanced the organization of learning materials.
Figure 9 questions: You felt that the communication tools available in Moodle enhanced interaction with your instructor(s). / You felt that the communication tools available in Blackboard enhanced interaction with your instructor(s).
Figure 10 questions: You felt that the web-based resources available to you in Moodle were effective learning tools. / You felt that the web-based resources available to you in Blackboard were effective learning tools.

The most significant responses were seen in the following areas: 71% felt that the Moodle learning management system was easier to use than the Blackboard learning management system, and 75% of the respondents would prefer to use the Moodle learning management system in future courses as a replacement for Blackboard. These significant responses are mitigated by the survey participants' responses to the question, "If you have previously used a learning management system, you felt that this experience made learning the Moodle user interface easier." These responses are displayed in Figure 11.

Figure 11: If you have previously used a learning management system, you felt that this experience made learning the Moodle user interface easier (response options included "not applicable")

Pilot Test Instructor Participant Survey

The four candidates for the pilot test instructor participant survey were sent reminder emails four times during the timeframe the survey was available. The limitation of this part of the project was that there were only four instructors involved in the pilot project, one of whom had no prior experience teaching with the Blackboard learning management system. Even though all instructor participants responded to the survey, the small sample size does not allow for any reasonable quantitative analysis. Anecdotally, of the responses received, three of the four instructors felt that Moodle was easier to administer and would prefer to use it in future courses. All four felt that Moodle allowed for better organization of their course materials.

CONCLUSIONS

The goal of this project was to compare the usability and effectiveness of two competing learning management systems. From the data collected, the following conclusions can be drawn with high confidence in their validity.

There were mixed results on functionality. Participants in the pilot project rated Moodle's course material organization and communication functionality higher, but in other functional areas the data was not definitive enough to reach a solid conclusion. There was no clear winner when the systems were compared on functionality.

The students in the pilot project preferred the Moodle learning management system over the Blackboard learning management system. They rated its ease of use higher, and 75% of them would prefer to use it over Blackboard in future courses they enroll in at the university. Even though Moodle was rated as easier to use, this result is mitigated by the fact that 65% of students felt that their previous experience with learning management systems helped them to acclimate to the new system faster.

The results of the research show that in the aggregate, when the systems were compared in their entireties, the Moodle learning management system was the preferred choice of the users. These results are echoed by the two studies examined in the prior works section of this paper. We therefore conclude that Moodle is the more efficacious and effective learning management system when compared with Blackboard.

REFERENCES

[1] Hall, J. (2003), Assessing learning management systems, Oracle University. Retrieved December 1, 2006 from http://www.clomedia.com/content/templates/clo_article.asp?articleid=91&zoneid=29
[2] O'Hara, T., Blackboard's WebCT Deal Spurs Antitrust Questioning, Washingtonpost.com, November 26, 2005. Retrieved December 16, 2006 from http://www.washingtonpost.com/wp-dyn/content/article/2005/11/25/AR2005112501193.html
[3] Mazard, C., On-line learning market may be monopolized, Antitrust Law Blog, published by Sheppard Mullin, December 7, 2005. Retrieved December 16, 2006 from http://www.antitrustlawblog.com/article-on-line-learning-market-maybe-monopolized.html
[4] UCLA Office of Information Technology website. Retrieved December 16, 2006 from http://www.oit.ucla.edu/ccle/
[5] Munoz, K. D. & Van Duzer, J. (2005), Blackboard vs. Moodle: A comparison of satisfaction with online teaching and learning tools. Unpublished raw data. Retrieved November 15, 2006 from http://www.humboldt.edu/~jdv1/moodle/all.htm
[6] Bremer, D. & Bryant, R. (2005), A comparison of two learning management systems: Moodle vs Blackboard, Proceedings of the 18th Annual Conference of the National Advisory Committee on Computing Qualifications, pp. 135-139.
[7] Opinio Survey Tool website, http://www.objectplanet.com/opinio/

ACKNOWLEDGMENT

We would like to thank Marc Oehlman, interim director of the Center for Academic Technologies at California State University, Monterey Bay, and programmer analyst Andrew Coile. Without the contributions of these two individuals this project would not have been possible.