Journal of Information Technology Education, Volume 10, 2011

Evaluating and Comparing the Usability of Web-based Course Management Systems

Zafer Unal, University of South Florida, St. Petersburg, FL, USA
[email protected]

Asli Unal, Uşak University, Uşak, Turkey
[email protected]

Executive Summary

Course Management Systems (CMS) are an increasingly important part of academic systems in higher education. When choosing a Course Management System for an educational institution, the usability of the system is the key to the effectiveness and efficiency of the online courses that are to be implemented. The goal of this paper is to report the results of a comparative usability study conducted in 2008-2009 on two different course management systems: BlackBoard and Moodle. 135 students enrolled in the Fall 2008 and Spring 2009 sections of Introduction to Educational Technology participated in the study (72 and 63, respectively). At the beginning of each semester, participants were randomly divided into two groups to experience different CMSs at different times.

It can be concluded from this study that in almost every module or function comparison that was made, Moodle was favored by course participants over Blackboard, with the exception of the Discussion Board module, where scores were not significantly different. At the end of the study, the researchers concluded that use of Moodle in online courses can be a suitable alternative to the current CMS (BlackBoard). In fact, now that the pilot has shown that Moodle is as effective as BlackBoard, the researchers have already shared their experiences with other faculty members and expanded their investigations by involving numerous other online courses, instructors, and students, because the product showed significant potential for further examination.

This study adds to the growing body of studies carried out to determine whether an open source CMS (Moodle) warrants consideration as an alternative to an institution's current course management system. In addition to comparing the students' feedback quantitatively, this study also tried to explain in detail which specific components and functions of each CMS students found useful or better than those in the other. Rather than focusing only on student satisfaction scores, this study further investigated which aspects of each module of each CMS course participants particularly liked or disliked.

Keywords: course management systems, Blackboard, Moodle
Introduction

A course management system (CMS), such as Blackboard, Blackboard Vista (formerly WebCT), Desire2Learn, or Moodle, provides a place for learning and teaching activities to occur within a seamless environment (Burrell-Ihlow, 2009; Ullman & Rabinowitz, 2004). It enables instructors and learners to post content, participate in discussions, maintain a grade book, keep a roster, track participation, and generally engage in and manage learning activities in an online environment (Heo, 2009; Lansari, Tuaishat, & Al-Rawi, 2010; Rovai, Ponton, & Baker, 2008).

In the current market space there are many commercially available course management systems from which to choose. Currently, the primary contender in this market space is Blackboard, especially when its acquisition of its main competitors, the WebCT and Angel course management systems, is taken into consideration (eLearnity, 2005; InsideHigherEd, 2010). BlackBoard is a commercial product developed by the BlackBoard Corporation, which was founded in 1997 as a consultant to the IMS Global Learning Consortium (Tsang, Kwan, & Fox, 2007). At present, BlackBoard has thousands of deployments in over 60 countries and is available in 8 major languages (BlackBoard, 2010). It has two main product lines, namely, the academic suite and the commercial suite. The former supports universities and academic institutions for teaching purposes, while the latter supports commercial ventures for business usage (BlackBoard, 2010).

The open source community has also been active in creating alternative course management system choices that are free of licensing costs. The main advantages of an open source course management system are the ability to modify these products and to redistribute the changes back into the community. In the more popular open source projects, new features can be integrated into users' existing systems as needed, at minimal cost, as they become available. The disadvantages of open source software are the lack of dedicated support that proprietary systems from software manufacturers offer and, if an organization modifies the common code base too dramatically, an impaired ability to upgrade to future releases of the software. Open source software also requires personnel with the requisite knowledge base to implement the software, which may require additional training for current personnel. Currently the most popular open source course management system is Moodle (Chen, Wang, & Hung, 2009; Machado & Tao, 2007; Ramos, MacLean, Bates, Wylie, & Brempah, 2010). Moodle was founded by Martin Dougiamas in 2001 (Moodle, 2010). It adopts a flexible modular design in which users can choose from among thousands of available extensions for their version of Moodle. Currently, it has more than 45 thousand deployments in over 100 countries and has been translated into 45 languages.

An institution now has the choice between many competing course management systems, both from proprietary software manufacturers and from open source projects. It is not enough, however, to just pick a package based on its price or feature list. Institutions considering implementing a course management system must carefully evaluate it before putting it to use with a student population (Colace, Santo, & Vento, 2002; Iding, Auernheimer, Crosby, & Klemm, 2002).
The goal of this study was to carry out and report the results of a comparative usability (field) test conducted during the 2008-2009 academic year on two different course management systems, BlackBoard and Moodle. To accomplish this goal, a field test was established utilizing an alternative course management system (Moodle) alongside the one already in use (BlackBoard) by the hosting organization. Moodle was used for the first time as an alternative to the institution-wide enterprise course management system, Blackboard. 135 students enrolled in the Introduction to Educational Technology course participated in this study during two semesters. At the beginning of each semester, participants were randomly divided into two groups to experience different CMSs at different times. The same experiment was repeated with another group during the next semester. During these experiments, participants were asked to share their experiences, provide their ratings and feedback on the usability test, and report their comparisons and preferences between the two course management systems (response rate 100%). There were no course withdrawals in either semester.

The study attempted to answer the following questions:

• How do participants rate their experiences with the two course management systems?
• How do participants compare the two course management systems?
• Does the use of an open source course management system (Moodle) warrant consideration as an alternative to the institution's current course management system (BlackBoard)?

Usability Testing and Course Management Systems

The selection and adoption of a CMS by a teaching institution or a corporate training system follows the analysis of some basic parameters, usually including technical features (e.g., programming language used, required hardware infrastructure), available functions (e.g., discussion forums, integrated streaming services), supported formats (e.g., HTML, PDF, different video encodings), and learning technology standards compliance (e.g., SCORM). Such analyses are mostly system-oriented, i.e., they measure a definite set of features independently of the users, and only a very limited number of comparative studies on CMSs actually consider other parameters, including usability concerns (Inversini, Botturi, & Triacca, 2006; Nguyen, Chang, Chang, Jacob, & Turk, 2008). Providing web users with a usable environment can lead to significant savings and improved performance (Kibaru & Dickson-Deane, 2010; Nielsen, 2003; Rivard & Huff, 1988). In terms of teaching and learning, having a usable CMS means potentially reducing the teacher time invested in setting up and managing the course and improving the students' learning experience: teachers and learners do not need to struggle with difficult technologies but can focus on content (Inversini et al., 2006).

Previous research suggests that online courses developed using a CMS tended to suffer from a lack of attention to design (Gilbert & Moore, 1998; Oliver, 1999; Tsang et al., 2007). Such systems gave course developers and facilitators the ability and choice to integrate many appealing options, often resulting in course designs that haphazardly integrated a variety of features that confused learners, were not instructionally sound, or did not match course objectives (Kidney & Puckett, 2003).
Concept instruction typically includes presentation of a concept definition, presentation of sample instances, and practice in classifying instances as examples and nonexamples (Tennyson & Cocchiarella, 1986). Although available features make the inclusion of presentation and guidance components common in online courses, in practice these components are often weak or missing (Gilbert & Moore, 1998; Kidney & Puckett, 2003). Thus, given Merrill's (1997) warning that instructional strategies will teach only when they include presentation, learner guidance, and practice, failure to use the provided CMS features to present and guide, as well as to encourage concerted practice, is problematic. Confusion about the uses of available features and the mismatch of course features to learning expectations can, and often do, impede learning (Graham & Scarborough, 2001; Kearsley, 1997; Wang, 2010).

Measuring how users perform and how they perceive or think about an information system is important to the formulation of empirically justified interface guidelines. Research into human-computer interaction (HCI) tells us that a major design element is a technology's usability (de Lera, Fernandez, & Valverde, 2010; Preece, Rogers, & Sharp, 2002; Rozanski & Haake, 2003). Usability, a core concept of HCI, refers to interface characteristics that make a system easy to use, learn, and remember, pleasant to use, and likely to generate the fewest errors (Nielsen, 1993). The concept of usability has been defined by a number of researchers, but a complete definition is difficult to achieve outside the domain within which it is considered (Petersen, 2007; Simbulan, 2007). To properly define usability as it has evolved during the past few years, three original approaches have to be included. Jakob Nielsen defined the concept of web usability by stating that making web pages simple to navigate and intuitively organized helps users find the information they are looking for with ease (Nielsen, 1994). Nokelainen (2006) expanded Nielsen's definition to include pedagogical usability, suggesting that usability involves techniques for ensuring trouble-free interaction with the software, while pedagogical usability aims at supporting the learning process. A further refinement creates the term "learnability," which is highly recommended in evaluations of e-learning environments (Kakasevski, Mihajlov, Arsenovski, & Chungurski, 2008; Neal, n.d.). In the world of e-learning, the definition of learnability is expanded to include the ability of users to effectively learn and retain skills and knowledge, as well as to learn how to use the system (Neal, n.d.).

Information technology adoption and diffusion has been studied in great detail in recent years by researchers in the information systems area. Adoption is defined as the decision to accept or invest in a technology. Information technology adoption can be studied at two levels: the organizational level and the individual level. If the unit of analysis is an individual, the emphasis is on acceptance of the technology. The Technology Acceptance Model (TAM) proposed by Davis (1989) has been widely used to explain acceptance of information technology. TAM states that an individual's adoption of information technology depends on their perceived ease of use and perceived usefulness of the technology.
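In schematic form, TAM is often summarized as a small set of structural relationships among perceived ease of use, perceived usefulness, attitude, and intention. The formulation below is a simplified, illustrative sketch of the commonly cited Davis (1989) structure; the coefficient symbols and error terms are notational conventions rather than values estimated in this study.

\begin{align}
PU  &= \beta_{1}\,PEOU + \beta_{2}\,EXT + \varepsilon_{1} \\
ATT &= \beta_{3}\,PU + \beta_{4}\,PEOU + \varepsilon_{2} \\
BI  &= \beta_{5}\,ATT + \beta_{6}\,PU + \varepsilon_{3}
\end{align}

Here PU is perceived usefulness, PEOU is perceived ease of use, EXT stands for external variables (e.g., system features or training), ATT is attitude toward use, and BI is behavioral intention to use, which in turn predicts actual system use.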
This model has been used and tested, and at times modified, to study the adoption of a number of different technologies over the past decade (Lucas & Spitler, 1999; Venkatesh & Davis, 1996). As a theoretical framework, this study used the technology acceptance model (TAM), which has been successfully applied in examining the adoption behavior of various information systems (e.g., Jackson, Chow, & Leitch, 1997; Venkatesh & Davis, 2000) in various organizational contexts (e.g., Agarwal & Prasad, 1999; Hsu & Chiu, 2004; Igbaria, Zinatelli, Cragg, & Cavaye, 1997; Venkatesh & Davis, 2000). Previous studies have found TAM to have a relatively simple structure but explanatory power comparable to more sophisticated models, such as the theory of reasoned action and the theory of planned behavior (Davis, Bagozzi, & Warshaw, 1989; Mathieson, 1991; Taylor & Todd, 1995). The main purpose of TAM is to predict the intention to utilize information systems by measuring users' perceptions of the system's usefulness and ease of use. Moreover, TAM proposes that the effects of external variables on usage intention are mediated by these perceptions. Therefore, by utilizing TAM as a theoretical framework, the study was able to investigate the impact of external variables on user acceptance of the systems under study.

Usability and user acceptance tests are usually conducted with the help of a group of participants representing the intended user community, who are asked to perform certain tasks under the observation of controllers who analyze their actions. At the end of the observation, users are often invited to make general comments or give suggestions about how the system can be improved. Typical of usability tests is the think-aloud technique, in which users are asked to "think aloud" about what they are doing while performing their tasks, so that observers can better understand their actions (Dray & Siegel, 2004; Kantner, Sova, & Rosenbaum, 2003; Rowley, 1994). A limitation of this type of test is that it usually focuses on first-time and short-time usage and has limited coverage of the range of interface features or tasks (Brush, Ames, & Davis, 2004; Shneiderman, 1998). Therefore, it is difficult to verify how a system will perform over a certain period of regular use. Because of this, usability tests are often integrated with other evaluation techniques, including field tests (Kakasevski et al., 2008).

Field tests can be considered a special kind of usability test: usability tests are usually conducted in controlled settings, while field tests are conducted in natural settings. During field tests, new products are put to work in realistic environments for a fixed trial period. A special example of a field test is the beta version of software: a test version distributed to users in order to verify it. This type of usability test provides extensive usage, and data are collected over a period of time through focus group interviews or surveys (Kakasevski et al., 2008; Preece et al., 2002).

Methodology

Two course management systems were compared during this study: BlackBoard 7.0 (updated to 8.0 during the study) (BlackBoard, 2010) and Moodle 1.9 (Moodle, 2010). These CMSs were used to support an Educational Technology course offered during the Fall 2008 and Spring 2009 semesters at a Southeastern university. The university already had BlackBoard in place, and the course had been taught online only via BlackBoard.
For this study, Moodle was installed on the university's servers located in the campus IT building and administered by academic technologies personnel. The instructor of the courses had prior experience with both systems.

EME2040, Introduction to Educational Technology, is an educational technology course that introduces its students to classroom applications of educational technologies. It is one of the three required courses for students majoring in education in the State University System. The course topics and activities followed the weekly format shown in Table 1.

Table 1. Course Activities

#    Course Activities                                          Time Spent
1    Educational Software Evaluation                            1 week
2    Educational Website Evaluation                             1 week
3    Creating an Educational Game via PowerPoint                1 week
4    Creating an Excel GradeBook                                 1 week
5    Creating a WebQuest                                        3 weeks
6    Creating a Technology Integrated Lesson Plan               1 week
7    Copyright Quiz                                             1 week
8    Online Discussion on Technology & Parental Involvement     1 week
9    Creating a Teacher Website                                 4 weeks
10   Online Discussion / Chat                                   Ongoing

One hundred thirty-five students enrolled in the Fall 2008 and Spring 2009 sections of Introduction to Educational Technology participated in the study (72 and 63, respectively). At the beginning of each semester, participants were randomly divided into two groups to experience different CMSs at different times. The course instructor copied the BlackBoard version of the course to Moodle, transferring the same features (modules) (Table 2). In the meantime, the researchers created an online survey and interview forms and obtained approval from the Institutional Review Board prior to the course.
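As an illustration of the group assignment described above, the short Python sketch below randomly splits one semester's participants into two crossover groups. It is an illustrative sketch only, assuming a simple seeded shuffle and the group labels shown; it is not the authors' actual assignment procedure.

import random

def assign_crossover_groups(student_ids, seed=2008):
    """Randomly split participants into two groups for a crossover design.

    In this sketch, Group A starts the semester on one CMS and later switches
    to the other, while Group B follows the opposite order.
    """
    ids = list(student_ids)
    random.Random(seed).shuffle(ids)   # reproducible random ordering
    midpoint = len(ids) // 2
    return {
        "Group A (BlackBoard first, then Moodle)": ids[:midpoint],
        "Group B (Moodle first, then BlackBoard)": ids[midpoint:],
    }

# Example: the Fall 2008 section had 72 participants.
fall_2008_students = [f"S{n:03d}" for n in range(1, 73)]
groups = assign_crossover_groups(fall_2008_students)
for label, members in groups.items():
    print(label, "-", len(members), "students")

A seeded shuffle is used here only so the illustration is reproducible; any unbiased randomization scheme would serve the same purpose.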