
A Usability Evaluation of a Blended MOOC Environment: An Experimental Case Study


Ahmed Mohamed Fahmy Yousef1,2, Mohamed Amine Chatti1, Ulrik Schroeder1, and Marold Wosnitza1
1RWTH Aachen University, Germany, 2Fayoum University, Egypt

Abstract

In the past few years, there has been an increasing interest in Massive Open Online Courses (MOOCs) as a new form of Technology-Enhanced Learning (TEL) in higher education and beyond. Recognizing the limitations of standalone MOOCs, blended MOOCs (bMOOCs) that aim at bringing in-class (i.e. face-to-face) interactions and online learning components together have emerged as an alternative MOOC model of teaching and learning in a higher education context. In this paper, we present the design, implementation, and evaluation details of a bMOOC course on “Teaching Methodologies” at Fayoum University, Egypt in cooperation with RWTH Aachen University, Germany, provided using the bMOOC platform L2P-bMOOC. In order to gauge the usability and effectiveness of the course, we employed an evaluation approach based on Conole’s 12-dimension rubric, ISONORM 9241/110-S as a general usability evaluation, and a custom effectiveness questionnaire reflecting the different MOOC stakeholder perspectives.

Keywords: Massive Open Online Courses; MOOCs; Blended MOOCs; bMOOCs; MOOC design; Usability; Quality assurance; Effectiveness

Introduction

The emergence of Massive Open Online Courses (MOOCs) as a new Technology-Enhanced Learning (TEL) model has the potential to change the existing higher education landscape. MOOCs open up learning and offer a wide range of choices across different areas and disciplines, enabling a massive number of participants from anywhere in the world to attend free online courses without any admission requirements (Liyanagunawardena, Adams, & Williams, 2013). Furthermore, MOOCs support a movement toward a vision of lifelong and on-demand learning for those who are working full time or have taken a break from formal education (Kop, Fournier, & Mak, 2011). Nevertheless, MOOCs suffer from several limitations. Yousef, Chatti, Schroeder, Wosnitza, and Jakobs (2014a), for instance, provided an extensive review of the MOOC literature and stressed that the initial vision of MOOCs, which aims at breaking down obstacles to education for anyone, anywhere, and at any time, is still far from reality. In fact, most MOOC implementations so far follow a top-down, controlled, teacher-centered, and centralized learning model. Attempts to implement bottom-up, student-centered, truly open, and distributed forms of MOOCs are the exception rather than the rule. Other limitations of MOOCs include pedagogical problems concerning assessment and feedback (Hill, 2013), the lack of interactivity between learners and the video content (Grünewald, Meinel, Totschnig, & Willems, 2013), as well as high drop-out rates, on average 95% of course participants (Yousef, Chatti, Schroeder, & Wosnitza, 2014b). A possible reason for the latter problem is the complexity and diversity of MOOC participants. This diversity relates not only to cultural and demographic attributes but also to the diverse motives and perspectives with which participants enroll in MOOCs. Furthermore, a major problem with MOOCs is that they neglect the importance and benefits of face-to-face communication (Hollands & Tirthali, 2014; Schulmeister, 2014). Bill Gates, for instance, supports the idea of applying MOOCs in a blended-learning approach and emphasizes the important role of face-to-face interaction in didactical meta-communication (Young, 2012). These limitations raise serious concerns about what role MOOCs should play and how they should fit into the higher education landscape, whether as an alternative mode of teaching and learning or as a substantial supplement.

To address these limitations, the new design paradigm of blended MOOCs (bMOOCs), which aims at bringing in-class (i.e. face-to-face) interactions and online learning components together in a blended environment, can resolve some of the hurdles facing standalone MOOCs (Bruff, Fisher, McEwen, & Smith, 2013). In fact, the bMOOC model has the potential to foster student-centered learning, provide effective assessment and feedback, support the interactive design of video lectures, consider the different patterns of MOOC participants, and bring the benefits of face-to-face interaction into the MOOC environment. Driven by these opportunities, this paper presents the design, implementation, and evaluation details of a bMOOC course on “Teaching Methodologies” at Fayoum University, Egypt in cooperation with RWTH Aachen University, Germany. The main goals of this study are to:

  1. Develop design patterns for effective bMOOC environments.
  2. Evaluate the developed bMOOC environment in terms of usability and effectiveness.
  3. Propose recommendations for the enhancement of the bMOOC environment based on the participants’ feedback.

Blended MOOCs

MOOC providers have already piloted the bMOOC concept within a higher education context. In particular, San José State University (SJSU) partnered with edX, the non-profit MOOC platform founded by Harvard and MIT, in the fall of 2012 to run a bMOOC pilot experiment based on the “Circuits and Electronics” edX course. A group of 87 SJSU on-campus students watched the MOOC video lectures on their own and then practiced problems as homework. Afterwards, they met the faculty professor during class time to discuss the concepts presented in the video lectures. In parallel, they worked on projects in small groups and answered quizzes to check their learning progress. This bMOOC achieved a high success rate, with 90% of the students passing the final exam, compared with 55% in the traditional class of the previous year (Ghadiri, Qayoumi, Junn, Hsu, & Sujitparapitaya, 2013). Even though the overall feedback showed positive results, there were some open issues, such as the lack of interaction between students and the video content as well as the lack of integration between the MOOC platform and the campus Learning Management System (LMS). Furthermore, the course was scheduled and led by the faculty professor, and the students did not get the opportunity to engage in a self-organized learning experience. As a result, they were more involved in the in-class activities than in the online practice on the edX platform.

bMOOCs are still in the experimentation stage. Different approaches to design and embed bMOOC environments in the higher education landscape have been proposed in the MOOC literature (Bruff et al., 2013; Ghadiri et al., 2013; Ostashewski, & Reid, 2012). These approaches, however, still follow a teacher-centered model. Recognizing the potential of bMOOCs to support new pedagogies such as self-organized and network learning, we focus in this study on learner-centered bMOOCs by providing a bMOOC environment where learners can take an active role in the management of their learning activities.

Blended MOOC Design

As outlined in the introduction, MOOCs suffer from several limitations, namely following a teacher-centered and centralized learning model, the lack of effective assessment and feedback, the lack of interactivity between learners and the video content, the diversity of MOOC participants, and the absence of face-to-face interaction. In order to address the diversity issue in MOOCs, we analyzed and clustered the interest patterns of MOOC stakeholders to create a meaningful picture of the MOOC community. Our main finding was a set of eight clusters of MOOC stakeholder perspectives, namely blended learning, flexibility, high quality content, instructional design & learning methodologies, lifelong learning, network learning, openness, and student-centered learning (Yousef, Chatti, Wosnitza, & Schroeder, 2015). Table 1 illustrates the degree of support of the eight MOOC stakeholder perspectives in cMOOC, xMOOC, and face-to-face learning environments. None of these environments provides full support for all MOOC stakeholder perspectives.

Table 1

An effective bMOOC that has the potential to take into account the different MOOC stakeholder perspectives can be viewed as the convergence of cMOOC, xMOOC, and face-to-face learning models, as depicted in Figure 1. In fact, cMOOCs support flexibility and openness and provide space for self-organized and networked learning where learners can define their own objectives, present their own view, and collaboratively create and share knowledge. xMOOCs focus on high quality content and follow a clear instructional design approach, where learning objectives are well-defined by teachers through short video lectures, often followed by e-assessment tasks. Face-to-face learning provides a number of benefits including direct feedback, coaching, and scaffolding.

Figure 1

Blended MOOC Design Criteria

We conducted a thorough literature review to collect a set of design criteria related to each cluster of MOOC stakeholder perspectives (Yousef et al., 2014a; Yousef, Chatti, & Schroeder, 2014c). Furthermore, we conducted a study to collect feedback from different MOOC participants concerning the importance of the collected criteria for each cluster (Yousef et al., 2014b). The highly ranked criteria related to each cluster are summarized in Table 2.

Table 2

L2P-bMOOC Implementation

The design criteria collected in Table 2 formed the basis for the implementation of the L2P-bMOOC platform on top of the L2P learning management system of RWTH Aachen University, Germany. L2P-bMOOC represents a shift away from traditional MOOC environments, in which learners passively view video content, towards a more dynamic and collaborative environment in which learners are encouraged to share and create knowledge collaboratively. In L2P-bMOOC, video materials are represented, structured, and collaboratively annotated in a mind-map format.

The workspace of L2P-bMOOC consists of an unbounded canvas representing the video map structure of the lecture, a course selection section, and a sidebar for adding new video nodes and editing video properties, as shown in Figure 2. Possible actions on a video node include video annotations, video clipping, social bookmarking, and discussion threads.

Figure 2
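To make the workspace structure more concrete, the sketch below outlines one possible data model for a video node and the actions attached to it. It is written in TypeScript purely for illustration; all interface and field names are our own assumptions and are not taken from the actual L2P-bMOOC implementation.

    // Hypothetical data model for a node in the video map (illustration only;
    // not the actual L2P-bMOOC implementation).
    type AnnotationType = "Suggestion" | "Question" | "MarkedImportant";

    interface Annotation {
      id: string;
      author: string;
      type: AnnotationType;
      timeInVideo: number;              // seconds into the video the annotation points to
      text: string;
      links: string[];                  // attached links to relevant materials
      likes: string[];                  // names of users who liked the annotation
      replies: Annotation[];            // threaded replies
      createdAt: Date;
    }

    interface VideoNode {
      id: string;
      title: string;
      videoUrl: string;
      // Set only for clipped nodes: the segment of the parent video this node represents.
      clipOf?: { parentNodeId: string; startTime: number; endTime: number };
      annotations: Annotation[];
      bookmarks: { url: string; rating: number }[];             // social bookmarks, ordered by rating
      discussionThreads: { topic: string; posts: string[] }[];
      position: { x: number; y: number };                       // placement on the unbounded canvas
    }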

Video Annotations

The annotation section of a video node is displayed in a separate layer above the main page and can be opened by clicking the annotation icon (@) attached to map nodes. It consists of three main blocks: an interactive timeline, a list of existing annotations, and a form for creating new annotations (see Figure 3).

Figure 3

The interactive timeline visualizing all annotations is located directly under the video and is synchronized with the complete list of annotations. By selecting timeline items, users can watch the video starting from the part to which the annotation points. The timeline range corresponds to the video duration and can be freely panned and zoomed. Timeline items also include small icons that help to distinguish three annotation types: Suggestion, Question, and Marked Important. Learners can thus adjust their own learning process according to their points of interest and discuss annotations by adding text or attaching links to relevant materials and discussion threads. Learners can also insert new annotations at the current playback position while the video is playing. Furthermore, if learners find an annotation interesting or important, they can “Like” it and later filter items based on the number of likes. The “Trash” icon in the top right corner of an annotation removes it; however, each item can be deleted only by its author.
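The following sketch illustrates how these interactions could work against the hypothetical data model above: inserting an annotation at the current playback position, liking, filtering by number of likes, and author-only deletion. It is an illustrative sketch, not the platform’s actual code.

    // Uses the hypothetical Annotation and VideoNode interfaces from the earlier sketch.
    const newId = (): string => Math.random().toString(36).slice(2);

    // Insert a new annotation at the current playback position of the video.
    function addAnnotation(node: VideoNode, author: string, type: AnnotationType,
                           text: string, currentPlaybackTime: number): Annotation {
      const annotation: Annotation = {
        id: newId(),
        author, type, text,
        timeInVideo: currentPlaybackTime,
        links: [], likes: [], replies: [],
        createdAt: new Date(),
      };
      node.annotations.push(annotation);
      return annotation;
    }

    // "Like" an annotation at most once per user.
    function likeAnnotation(annotation: Annotation, user: string): void {
      if (!annotation.likes.includes(user)) annotation.likes.push(user);
    }

    // Keep only annotations with at least `minLikes` likes.
    function filterByLikes(node: VideoNode, minLikes: number): Annotation[] {
      return node.annotations.filter(a => a.likes.length >= minLikes);
    }

    // Each annotation can be deleted only by its author.
    function deleteAnnotation(node: VideoNode, annotationId: string, user: string): boolean {
      const target = node.annotations.find(a => a.id === annotationId);
      if (!target || target.author !== user) return false;
      node.annotations = node.annotations.filter(a => a.id !== annotationId);
      return true;
    }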

Search and Sort Functionalities

Because the list of annotations can grow long in a MOOC context, learners can search and sort it. By entering a keyword, user name, or annotation type, users can search for items in the list; the matching items are then displayed along with an updated interactive timeline. Sorting can be done by date, time on video, rating, or the number of replies each annotation has received.
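A minimal sketch of this search and sort behaviour, again using the hypothetical annotation model and assuming simple case-insensitive keyword matching:

    // Uses the hypothetical Annotation interface from the earlier sketch.
    type SortKey = "date" | "timeOnVideo" | "rating" | "replies";

    // Search by keyword, user name, or annotation type (case-insensitive).
    function searchAnnotations(annotations: Annotation[], query: string): Annotation[] {
      const q = query.toLowerCase();
      return annotations.filter(a =>
        a.text.toLowerCase().includes(q) ||      // keyword in the annotation text
        a.author.toLowerCase().includes(q) ||    // user name
        a.type.toLowerCase() === q               // annotation type
      );
    }

    // Sort by date, time on video, rating (likes), or number of replies.
    function sortAnnotations(annotations: Annotation[], key: SortKey): Annotation[] {
      const compare: Record<SortKey, (a: Annotation, b: Annotation) => number> = {
        date:        (a, b) => b.createdAt.getTime() - a.createdAt.getTime(),
        timeOnVideo: (a, b) => a.timeInVideo - b.timeInVideo,
        rating:      (a, b) => b.likes.length - a.likes.length,
        replies:     (a, b) => b.replies.length - a.replies.length,
      };
      return [...annotations].sort(compare[key]);
    }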

Video Clipping

In order to respond to learners’ interest in a specific section of a video lecture, L2P-bMOOC provides a clipping option that creates a new node representing a specific segment of the video. Clipping is supported both for complete videos and for videos that have already been clipped. In addition, these clipped videos can be accessed and annotated by all course participants.
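The sketch below shows how clipping could create a new, independently annotatable node that refers back to a segment of its parent video. It reuses the hypothetical VideoNode interface from the earlier sketch; names and the canvas placement are illustrative assumptions only.

    // Create a new node representing a segment of an existing
    // (complete or already clipped) node.
    function clipVideo(parent: VideoNode, startTime: number, endTime: number,
                       title: string): VideoNode {
      return {
        id: Math.random().toString(36).slice(2),
        title,
        videoUrl: parent.videoUrl,           // the clip refers to the same source video
        clipOf: { parentNodeId: parent.id, startTime, endTime },
        annotations: [],                     // the clip can be annotated like any other node
        bookmarks: [],
        discussionThreads: [],
        position: { x: parent.position.x + 200, y: parent.position.y },  // placed next to its parent on the canvas
      };
    }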

Bookmarks and Discussion Threads

The options of attaching links to relevant materials and opening discussion threads are available for the original video lecture as well as for the video nodes. Bookmarks represent online resources that can be added by all course participants and ordered based on their rating. They are displayed in a separate jQuery lightbox appearing on top of the application page. In contrast to annotations, discussion threads do not refer to any specific time in the video and may be used by course participants to discuss questions or suggestions relating to the general concept that the video node represents.

Evaluation and Discussion

In this study, we used the L2P-bMOOC platform to offer a bMOOC on “Teaching Methodologies” at Fayoum University, Egypt in cooperation with RWTH Aachen University, Germany. We conducted a thorough evaluation of this bMOOC to gauge its usability and effectiveness. To achieve this, a user study was performed with the aim of gathering quantitative and qualitative data on the participants’ experience in this course.

Evaluation Methodology

We employed an evaluation approach based on Conole’s 12-dimension rubric, ISONORM 9241/110-S as a general usability evaluation, and a custom effectiveness questionnaire reflecting the different MOOC stakeholder perspectives.

Conole’s 12 Dimensions Rubrics

Gráinne Conole developed a new classification for MOOCs as part of the EFQUEL MOOC Quality Project (Conole, 2013). Conole’s evaluation rubric consists of 12 dimensions, namely level of openness, degree of massiveness, the amount of multimedia used, the use of communication tools, the degree of collaborative learning, the type of learner pathway (i.e. learner-centered versus teacher-centered learning), quality assurance, amount of reflection, assessment strategies, learning model (i.e. formal and informal), autonomy, and diversity (Conole, 2013). We evaluated the bMOOC against these 12 dimensions using a three-level scale (low, medium, high), as shown in Figure 4.

Figure 4

The evaluation above shows the main characteristics of the “Teaching Methodologies” bMOOC. The course was offered through the L2P-bMOOC platform hosted at RWTH Aachen University. It took place during the summer semester 2014 and lasted eight weeks. It was offered both formally to students from Fayoum University and informally, with open enrollment, to anybody interested in teaching methodologies. The teaching staff was composed of one professor and one assistant researcher from Fayoum University as well as one assistant researcher from RWTH Aachen University. A total of 128 participants completed this course. Of these, 93 were formal participants who took the course to earn credits from Fayoum University; they were required to complete the course and obtain a positive grading of their assignments. The rest were informal participants who did not attend the face-to-face sessions and undertook the learning activities at their own pace without receiving any credits. The teaching staff provided six video lectures, and the course participants added 27 related videos. The course was taught in English, and participants were encouraged to self-organize their learning environments, present their own ideas, collaboratively create video maps of the lectures, and share knowledge through social bookmarking, annotations, forums, and discussion threads.

General Usability Evaluation (ISONORM 9241/110-S)

The ISONORM 9241/110-S questionnaire was designed based upon the International Standard ISO 9241, Part 110 (Prümper, 1997). We used this questionnaire as a general usability evaluation of the L2P-bMOOC platform. It consists of 21 questions classified into seven main categories. Participants were asked to respond to each question on a seven-point scale anchored by a negative statement (1) and its mirrored positive counterpart (7). The questionnaire comes with an evaluation framework that aggregates the individual ratings into a single usability score between 21 and 147. A total of 50 questionnaires were completed. The table below summarizes the results of the ISONORM 9241/110-S usability evaluation.

Table 3
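For clarity, the sketch below shows how the ISONORM 9241/110-S scores reported here can be computed: 21 items rated from 1 to 7 are summed to a total between 21 and 147, and category scores are the sums of the items belonging to each of the seven categories (three per category). This mirrors our reading of the scheme described above, not the official evaluation framework.

    // Sketch of the ISONORM 9241/110-S scoring as described in the text.
    type IsonormResponse = number[];  // exactly 21 ratings, each between 1 and 7

    function totalScore(response: IsonormResponse): number {
      if (response.length !== 21 || response.some(r => r < 1 || r > 7)) {
        throw new Error("Expected 21 ratings between 1 and 7");
      }
      return response.reduce((sum, r) => sum + r, 0);  // ranges from 21 to 147
    }

    // Per-category sums, assuming the items are ordered by category (three each).
    function categorySums(response: IsonormResponse): number[] {
      const sums: number[] = [];
      for (let c = 0; c < 7; c++) {
        sums.push(response.slice(c * 3, c * 3 + 3).reduce((s, r) => s + r, 0));
      }
      return sums;
    }

    // The reported overall score (93.3 in this study) is the mean total score
    // over all completed questionnaires.
    function meanTotalScore(responses: IsonormResponse[]): number {
      return responses.reduce((s, r) => s + totalScore(r), 0) / responses.length;
    }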

The majority of respondents were in the 18-24 age range, and female respondents formed the majority (90%). Participants had a high level of educational attainment: 70% were Bachelor students at Fayoum University and 30% held a Bachelor’s degree or higher. They also had experience with TEL courses; nearly 75% reported that they had attended more than two TEL courses.

The overall ISONORM 9241/110-S score from the questionnaires was 93.3, which translates to “Everything is all right! Currently there is no reason to make changes to the software with regard to usability” (Prümper, 1997). In particular, the suitability for individualization category was rated best, which indicates that the participants had no issues with adapting the bMOOC environment to fit their needs and preferences. One unanticipated finding was that the error tolerance category was rated worst, with a sum of 7.4, which indicates that participants had some issues in handling system errors.

In general, the ISONORM 9241/110-S evaluation results reflect user satisfaction with the usability of the L2P-bMOOC platform. There is, however, still room for further improvement, especially in the error tolerance category. A possible enhancement of L2P-bMOOC would be to add a help guide (e.g. FAQs and explanations of system entry errors) as well as a video tutorial explaining the different features of the platform to ensure a better learning experience.

Effectiveness Evaluation

As stated above, learners have different goals when participating in MOOCs. The result of our study on diversity in MOOCs was a set of eight clusters of MOOC stakeholder perspectives. These include blended learning, flexibility, high quality content, instructional design & learning methodology, lifelong learning, network learning, openness, and student-centered learning (Yousef et al., 2015). The effectiveness evaluation in this paper aims at assessing whether these goals have been met in the offered bMOOC.

There have been several attempts to evaluate the effectiveness of MOOCs. Most of these studies, however, focus on only a particular aspect of MOOCs. For instance, from a pedagogical perspective, Fini (2009) and Siemens (2013) focused on the effectiveness of cMOOCs for enhancing learning in the digital age, while McAuley, Stewart, Siemens, and Cormier (2010) as well as Ostashewski and Reid (2012) focused on the effectiveness of MOOC design from a technical perspective. Our study aims at a comprehensive evaluation of MOOCs from different perspectives. We applied a multi-level effectiveness evaluation of the bMOOC that considers the different patterns of MOOC stakeholder perspectives. We designed a questionnaire to gauge whether the different goals of the bMOOC participants have been achieved, as shown in Tables 4 to 11. The content of this questionnaire is based on relevant literature (Shee & Wang, 2008; Chang, 1999; Tobin, 1998). A 5-point Likert scale was used, ranging from (1) strongly disagree to (5) strongly agree.
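As a worked illustration of how the mean scores reported in the following tables are obtained from the 5-point Likert responses, consider the following sketch (a simple averaging example, not the authors’ actual analysis script):

    // Per-item and per-category mean scores from 5-point Likert responses
    // (1 = strongly disagree, 5 = strongly agree). Illustrative only.
    function itemMean(ratings: number[]): number {
      return ratings.reduce((sum, r) => sum + r, 0) / ratings.length;
    }

    // Mean score of a whole category (e.g. blended learning): the average of
    // the item means of all items belonging to that category.
    function categoryMean(itemRatings: number[][]): number {
      const means = itemRatings.map(itemMean);
      return means.reduce((sum, m) => sum + m, 0) / means.length;
    }

    // Example: a category with two items, each rated by three respondents.
    const exampleCategoryMean = categoryMean([[5, 4, 4], [5, 5, 4]]);  // ≈ 4.5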

We defined a set of questions for each cluster. In order to ensure the relevance of these questions, we sent the questionnaire to a small panel of five learners and five learning technology experts, who were asked for their opinions and suggestions for revising it. Their feedback led to the refinement of some questions and the shifting of others to different clusters. The revised questionnaire was then given to the bMOOC participants. The following sections present the results of the effectiveness evaluation of the bMOOC.

Internal Course Diversity

We started our questionnaire by asking participants about the purpose of their participation in the Teaching Methodologies bMOOC, based on the eight clusters of MOOC stakeholder perspectives outlined above. The participants could select more than one answer. Figure 5 summarizes their responses. The results reflect the diversity of the participants’ perspectives.

Figure 5

Blended Learning

The design of blended learning environments, bringing together face-to-face and online learning, can be a flexible and effective model to enhance classroom learning and to improve relationships with teachers and peers (Bruff et al., 2013). The course participants were asked to watch the lecture videos online and to use the L2P-bMOOC platform to collaboratively annotate and discuss the lecture content. The face-to-face sessions were then used to elaborate on the concepts presented in the video lectures, discuss practical aspects of the course, and provide direct feedback on the group projects.

Table 4 lists the five evaluation items of the blended learning category. The mean agreement of the respondents is quite high at 4.4. Item 2, “Bringing together face-to-face and online learning increases my motivation to share and discover new ideas”, obtained the highest mean score of 4.5, which indicates that the bMOOC increased the course participants’ motivation. The participants reported that the permanent coaching and scaffolding provided by the teachers, as well as the continuous direct feedback from other course participants, had a positive impact on their motivation in the course. Moreover, the face-to-face interactions with participants with diverse backgrounds and interests increased their engagement and trust. This reveals the importance of the human factor in bMOOCs and is consistent with the findings of Bruff et al. (2013), who pointed out that bMOOCs can improve learning outcomes because participants benefit from opportunities for independent learning, increased engagement and motivation, and the flexibility of bMOOCs.

Table 4

Flexibility

One of the success factors of MOOCs is flexibility (Mackness, Mak, & Williams, 2010). The six evaluation items in Table 5 aim at assessing the flexibility of the bMOOC. Most participants reported high satisfaction with the diversity of the provided learning materials as well as with the ability to access the learning resources at any time and from anywhere.

Table 5

High Quality Content

One of the most important factors in empowering and engaging learners around the world to participate in MOOCs is the quality of the course content (Yousef et al., 2015). Shee and Wang (2008) pointed out that learners place great value on online courses in which the content is well organized and interactive, the presentation of the subject is clear, and the material is of appropriate length. The six evaluation items in Table 6 aim at measuring the quality of the content in the provided bMOOC. The mean score in this category was 4.4.

Table 6

Most respondents agreed that the course materials and the user-generated content (e.g. mind maps, discussions, annotations, bookmarks) were very helpful to better understand the course concepts. In particular, browsing highly rated bookmarked articles on each video node and receiving comments and suggestions on the annotations helped to improve the quality of the course content.

Instructional Design and Learning Methodology

Effective instructional design and learning methodology can make bMOOCs more attractive and motivating (Yousef et al., 2015). Table 7 illustrates the evaluation of the effectiveness of the instructional design and learning methodology used in this bMOOC. Respondents were generally positive regarding the well-defined objectives, the clear structure, the effective tools, and the teaching assistance offered to support the learning activities in this course. One unanticipated finding was that the tutor feedback on the assignments obtained a relatively low mean score of 4. Possible reasons for this are the limited time of the teaching team and the use of only one type of assessment, namely teacher assessment. Indeed, evaluating a large number of learners in MOOCs is a highly challenging task. It is necessary to go beyond traditional teacher assessment and apply open assessment methods that better fit bMOOC environments characterized by openness, networking, and self-organization. These include peer-assessment, self-assessment, and e-assessment methods (Yousef et al., 2014b).

Table 7

Lifelong Learning

Learning is no longer restricted to the formal higher education context. For informal participants, MOOCs provide a loosely organized and unstructured learning model that tends to be experimental, spontaneous, and free from rigid curricula. There is wide agreement among MOOC providers and researchers that MOOCs open doors to new opportunities for lifelong learning outside the boundaries of formal educational institutions (Milligan & Littlejohn, 2014; Kizilcec, Piech, & Schneider, 2013; Kop et al., 2011). Several studies on the profile of MOOC participants found that the majority hold a Bachelor’s or Master’s degree and that in most cases the MOOC is used for job (re)training and lifelong learning purposes (Christensen, Steinmetz, Alcorn, Bennett, Woods, & Emanuel, 2013; Kizilcec et al., 2013; Kop et al., 2011). This is quite different in bMOOCs, where the majority of participants take the MOOC as part of a university credit-bearing course. In our study, only 30% of the course participants were lifelong learners who took part in this bMOOC out of personal or professional interest rather than to obtain an official academic degree. As shown in Table 8, most of the respondents agreed that the course helped them improve the skills required for their future jobs as school teachers and opened new opportunities to advance their knowledge and expertise. This confirms the potential of the bMOOC to support lifelong learning activities. The findings of the current study are consistent with those of Milligan and Littlejohn (2014), who emphasize the important role of MOOCs in opening up, supporting, and enabling professional learning, allowing opportunities to link formal and informal learning.

Table 8

Network Learning

Network learning is important in open and distributed learning environments like bMOOCs (Chatti, Schroeder, & Jarke, 2012). Table 9 shows the seven items used to evaluate the offered bMOOC in terms of collaborative and network learning.

Table 9

In this category, the high mean score of 4.4 indicates the effectiveness of the bMOOC in supporting network learning. In fact, the participants agreed that the collaboration and communication possibilities offered in L2P-bMOOC (i.e. group workspaces, discussion forums, live chat, social bookmarking, and collaborative annotations) allowed them to share, discuss, exchange, and collaboratively construct knowledge as well as to receive feedback and support from peers.

Openness

Openness is one of the defining characteristics of MOOCs. It refers to providing a learning experience to a vast number of participants around the globe, regardless of their location, age, income, ideology, and level of education, and to granting access to high quality education without any entry requirements or course fees. Openness also refers to providing open educational resources (OER) following the 4Rs, namely Reuse, Revise, Remix, and Redistribute (Peter & Deimann, 2013). Most MOOCs on the market are open to participants free of charge and without any admission requirements. They are, however, not open from a copyright perspective. For instance, Coursera does not permit users to reproduce, retransmit, distribute, or publish any material from its platform.

The offered bMOOC not only enables participants to register for the course for free and without any academic requirements, but also enables them to reuse, revise, remix, and redistribute all course materials as they see fit. Table 10 shows the high satisfaction of the respondents with the level of openness in the bMOOC.

Table 10

Self-Organized Learning

One important goal of participation in MOOCs is self-organized learning. bMOOCs can provide a space for learners to be active participants in the learning process and to get mutual support (Chatti, 2010). Table 11 shows the results of the ten evaluation items used to examine how well the bMOOC supports self-organized learning. The mean score was 4, which indicates that the majority agreed that the learning environment allowed them to self-organize their learning process. In particular, the participants reported that the representation of the lecture in a mind map view and the video clipping feature helped them to learn independently of their teachers. The results further confirm that the learning environment encourages participants to work at their own pace to achieve their learning goals and keeps them in control of their learning progress. Items 5 and 10 obtained the lowest mean scores of 2.8 and 2.7, respectively. This shows that the participants had some difficulties in tracking and monitoring their learning activities and those of their peers. Further improvement should address this important issue, for example in the form of a learning analytics tool that collects, visualizes, and analyzes data from learning activities (e.g. comments, likes, newly added nodes) to support monitoring, awareness, self-reflection, and feedback (Chatti, Lukarov, Thüs, Muslim, Yousef, Wahid, Greven, Chakrabarti, & Schroeder, 2014).

Table 11

Conclusion

Massive Open Online Courses (MOOCs) represent an emerging branch of online learning that is gaining interest in the Technology-Enhanced Learning (TEL) community. Despite their popularity, current MOOCs suffer from several limitations. These include following a teacher-centered and centralized learning model, the lack of effective assessment and feedback, the lack of interactivity between learners and the video content, the diversity of MOOC participants, and the absence of face-to-face interaction. In this paper, we argued that the blended MOOC (bMOOC) model has the potential to address these issues. The purpose of the current study was to design, implement, and evaluate a bMOOC course on “Teaching Methodologies” at Fayoum University, Egypt in cooperation with RWTH Aachen University, Germany, provided using the bMOOC platform L2P-bMOOC. In order to gauge the usability and effectiveness of the course, we employed an evaluation approach based on Conole’s 12-dimension rubric, ISONORM 9241/110-S as a general usability evaluation, and a custom effectiveness questionnaire reflecting the different MOOC stakeholder perspectives. The results of the study revealed general satisfaction with the bMOOC in terms of usability and effectiveness. There was wide agreement among the participants that the offered bMOOC can address the limitations of MOOCs outlined above. The study, however, shows that it is crucial to investigate learning analytics techniques to foster monitoring, awareness, self-reflection, and feedback in bMOOC environments, as well as to develop new assessment methods, such as peer-assessment, self-assessment, and e-assessment, that reflect the open and massive nature of MOOCs.

Acknowledgments

The authors wish to thank Dr. Abeer El-Sayed Mohamad Abo Zaid, Fayoum University, for providing the video lectures of this course. We also thank Dr. Amal Goma Abdul-Fatah and Dr. Ahmed Ramadan Saad Ahmed Khatiry for their technical assistance in this course.

References

Bruff, D. O., Fisher, D. H., McEwen, K. E., & Smith, B. E. (2013). Wrapping a MOOC: Student perceptions of an experiment in blended learning. MERLOT Journal of Online Learning and Teaching, 9(2), 187-199.

Chang, V. (1999). Evaluating the effectiveness of online learning using a new web based learning instrument. In Proceedings Western Australian Institute for Educational Research Forum.

Chatti, M. A. (2010). Personalization in Technology Enhanced Learning: A Social Software Perspective (Doctoral Dissertation), RWTH Aachen University, Shaker Verlag.

Chatti, M. A., Lukarov, V., Thüs, H., Muslim, A., Yousef, A. M. F., Wahid, U., Greven, C., Chakrabarti, A., Schroeder, U. (2014). Learning Analytics: Challenges and Future Research Directions. eleed, Iss. 10. (urn:nbn:de:0009-5-40350).

Chatti, M. A., Schroeder, U., & Jarke, M. (2012). LaaN: Convergence of knowledge management and technology-enhanced learning. IEEE Transactions on Learning Technologies, 5(2), 177-189.

Christensen, G., Steinmetz, A., Alcorn, B., Bennett, A., Woods, D., & Emanuel, E. J. (2013). The MOOC phenomenon: who takes massive open online courses and why? University of Pennsylvania, nd Web, 6.

Conole, G. (2013). MOOCs as disruptive technologies: Strategies for enhancing the learner experience and quality of MOOCs. Revista de Educación a Distancia, 39, 1-17.

Daniel, J. (2012). Making sense of MOOCs: Musings in a maze of myth, paradox and possibility. Journal of Interactive Media in Education, 3.

Fini, A. (2009). The technological dimension of a massive open online course: The case of the CCK08 course tools. The International Review of Research in Open and Distance Learning, 10(5).

Ghadiri, K., Qayoumi, M. H., Junn, E., Hsu, P., & Sujitparapitaya, S. (2013). The transformative potential of blended learning using MIT edX’s 6.002 x online MOOC content combined with student team-based learning in class. Environment, 8, 14.

Grünewald, F., Meinel, C., Totschnig, M., & Willems, C. (2013). Designing MOOCs for the Support of Multiple Learning Styles. In Scaling up Learning for Sustained Impact (pp. 371-382). Springer Berlin Heidelberg.

Hill, P. (2013). Some validation of MOOC student patterns graphic. Retrieved from http://mfeldstein.com/validation-mooc-student-patterns-graphic/

Hollands, F. M. & Tirthali, D. (May, 2014). MOOCs: Expectations and reality. Full report. Center for Benefit-Cost Studies of Education, Teachers College Columbia University. Retrieved from http://cbcse.org/wordpress/wp-content/uploads/2014/05/MOOCs_Expectations_and_Reality.pdf

Kizilcec, R. F., Piech, C., & Schneider, E. (2013). Deconstructing disengagement: Analyzing learner subpopulations in massive open online courses. In Proceedings of the third international conference on learning analytics and knowledge (pp. 170-179). ACM.

Kop, R., Fournier, H., & Mak, J. S. F. (2011). A pedagogy of abundance or a pedagogy to support human beings? Participant support on massive open online courses. The International Review of Research in Open and Distance Learning, 12(7), 74-93.

Liyanagunawardena, T. R., Adams, A. A., & Williams, S. A. (2013). MOOCs: A systematic study of the published literature 2008-2012. The International Review of Research in Open and Distance Learning, 14(3), 202-227.

Mackness, J., Mak, S., & Williams, R. (2010). The ideals and reality of participating in a MOOC. In Proc. 7th International Conference on Networked Learning, 2010, 266-274.

McAuley, A., Stewart, B., Siemens, G., & Cormier, D. (2010). The MOOC model for digital practice. Technical Report. Retrieved October 2014 from http://www.elearnspace.org/Articles/MOOC_Final.pdf.

Milligan, C., & Littlejohn, A. (2014). Supporting professional learning in a massive open online course. The International Review of Research in Open and Distance Learning, 15(5).

Ostashewski, N., & Reid, D. (2012). Delivering a MOOC using a social networking site: The SMOOC Design model. In Proc. IADIS International Conference on Internet Technologies & Society, (2012), 217-220.

Peter, S., & Deimann, M. (2013). On the role of openness in education: A historical reconstruction. Open Praxis, 5(1), 7-14.

Prümper, J. (1997). Der Benutzungsfragebogen ISONORM 9241/10: Ergebnisse zur Reliabilität und Validität. In Software-Ergonomie ’97 (pp. 253-262). Vieweg+Teubner Verlag.

Schulmeister, R. (2014). The position of xMOOCs in educational systems. eleed, Iss. 10. (urn:nbn:de:0009-5-40743)

Shee, D. Y., & Wang, Y. S. (2008). Multi-criteria evaluation of the web-based e-learning system: A methodology based on learner satisfaction and its applications. Computers & Education, 50(3), 894-905.

Siemens, G. (2013). Massive open online courses: Innovation in education? Open Educational Resources: Innovation, Research and Practice, 5.

Tobin, K. (1998). Qualitative perceptions of learning environments on the world wide web. Learning Environments Research, 1(2), 139-162.

Young, J. R. (2012). A conversation with Bill Gates about the future of higher education. The Chronicle of Higher Education, 25.

Yousef, A. M. F., Chatti, M. A., Schroeder, U., Wosnitza, M., & Jakobs, H. (2014a). MOOCs - A review of the state-of-the-art. In Proc. CSEDU 2014 Conference, Vol. 3 (pp. 9-20). INSTICC.

Yousef, A. M. F., Chatti, M. A., Schroeder, U., & Wosnitza, M. (2014b, July). What Drives a Successful MOOC? An Empirical Examination of Criteria to Assure Design Quality of MOOCs. In Advanced Learning Technologies (ICALT), 2014 IEEE 14th International Conference on (pp. 44-48). IEEE.

Yousef, A. M. F., Chatti, M. A., & Schroeder, U. (2014c, March). Video-Based Learning: A Critical Analysis of The Research Published in 2003-2013 and Future Visions. In eLmL 2014, The Sixth International Conference on Mobile, Hybrid, and On-line Learning (pp. 112-119).

Yousef, A. M. F., Chatti, M. A., Wosnitza, M., & Schroeder, U. (2015). A Cluster Analysis of MOOC Stakeholder Perspectives. RUSC. Universities and Knowledge Society Journal, 12(1), 74-90.

© Yousef, Chatti, Schroeder, and Wosnitza