International Review of Research in Open and Distributed Learning

Volume 19, Number 4

September - 2018

 

Pushing Toward a More Personalized MOOC: Exploring Instructor Selected Activities, Resources, and Technologies for MOOC Design and Implementation


Curtis J. Bonk1, Meina Zhu1, Minkyoung Kim2, Shuya Xu1, Najia Sabir1, and Annisa R. Sari1,3
1Indiana University, USA, 2University of West Florida, USA, 3Yogyakarta State University, Indonesia

Abstract

This study explores the activities, tools, and resources that instructors of massive open online courses (MOOCs) use to improve the personalization of their MOOCs. Following email interviews with 25 MOOC and open education leaders regarding MOOC personalization, a questionnaire was developed. This questionnaire was then completed by 152 MOOC instructors from around the world. While more than 8 in 10 respondents claimed heavy involvement in designing their MOOCs, only one-third placed extensive effort on meeting unique learner needs during course design, and even fewer respondents were concerned with personalization during course delivery. An array of instructional practices, technology tools, and content resources were leveraged by instructors to personalize MOOC-based learning environments. Aligning with previous research, the chief resources and tools employed in their MOOCs were discussion forums, video lectures, supplemental readings, and practice quizzes. In addition, self-monitoring and peer-based methods of learner feedback were more common than instructor monitoring and feedback. Some respondents mentioned the use of flexible deadlines, proposed alternatives to course assignments, and introduced multimedia elements, mobile applications, and guest speakers among the ways in which they attempted to personalize their massive courses. A majority of the respondents reported modest or high interest in learning new techniques to personalize their next MOOC offering.

Keywords: massive open online courses (MOOCs), personalization, instructional design, MOOC instructors

Introduction

Massive open online courses (MOOCs) and their many derivatives allow thousands of learners to simultaneously engage in a learning experience (Bonk, Lee, Reeves, & Reynolds, 2018; Pappano, 2012; Siemens, 2012b). While a relatively recent phenomenon, MOOCs have the potential for large-scale usage and impact by helping learners in developing parts of the world obtain access to education (Bowman, 2012; Jagannathan, 2015). While promising in terms of access, many studies point to retention issues in MOOCs (e.g., Hew & Cheung, 2014; MOOC @ Edinburgh 2013 - Report #1, 2013; Yuan, Powell, & Olivier, 2014). Despite MOOCs being promoted and leveraged by universities and international organizations for several years, there are scant empirical studies evaluating how MOOCs and similar types of open educational courses address diverse learner needs through the personalization of course content and experiences. After evaluating comprehensive reviews of the MOOC research literature (Deng & Benckendorff, 2017; Liyanagunawardena, Adams, & Williams, 2013; Saadatdoost, Sim, Jafarkarimi, & Hee, 2015; Veletsianos & Shepherdson, 2016; Zhu, Sari, & Lee, 2018), it is evident that few MOOC studies use instructor perspectives to better understand instructional design and delivery practices. It is our belief that collecting instructor perspectives may lead to enhanced instructor training, guidelines, and personalization practices.

The purpose of this study was to better understand how MOOC instructors adapt their courses to enhance or personalize MOOC design and delivery. Personalization, however, is a complex construct (Bethke, 2016) and hard to succinctly define or agree upon. In a meta-review of the literature on personalization, Fan and Poole (2006) caution that:

At the conceptual level, personalization means different things to different people in different fields. For architects, personalization means creating functional, pleasant personal spaces; for social scientists it is a way of enhancing social relationships and building social networks; for some computer scientists, personalization is a toolbox of technologies to enhance the Web experience through graphic user interface design. Different conceptualizations in turn dictate different research methodologies and implementations. Cognitive scientists resort to explicit mental modeling to differentiate users, whereas e-commerce marketers rely on user profiles and purchase records to segment customers. (p. 181)

Seeking to provide a common theoretical framework from which to study personalization and aid in the design of more personalized systems, Fan and Poole (2006) also provide definitions and examples of personalization for architecture and environmental science, information science, cognitive science, computer science, social science, and marketing/e-commerce. Given the complexities of personalization and the associated difficulties in defining it, their ideas and examples can be quite useful for those attempting to design MOOCs that offer individualized attention and personalization.

While personalization is a difficult concept to pin down, the goal of this study was to determine the types of activities, resources, and technology tools that can enhance the quality, and ultimately the retention rates, of MOOCs. Unlike most MOOC research (Zhu et al., 2018), the MOOC instructor perspective is the primary focus of this study.

According to Kop (2011), instructors are one of five key elements of a successful MOOC; the other four are learners, topic, material, and context. Of the five elements that Kop (2011) delineates, instructors are one of the least researched (Veletsianos & Shepherdson, 2016; Zhu et al., 2018). To address this gap, in the present study, MOOC personalization was explored from an instructor perspective. More specifically, this study focuses on the four research questions listed below.

  1. How much self-identified effort do instructors place on addressing unique learner needs in the design and development of their MOOCs?
  2. What are the personalization practices of MOOC instructors in terms of the pedagogical activities and task structures employed?
  3. What are the personalization practices of MOOC instructors in terms of content resources and associated technology tools employed?
  4. How would these instructors structure their next MOOC differently in terms of personalization?

To answer these questions, this study explores the practices of experienced MOOC instructors. By interviewing experts to develop a questionnaire, and then surveying MOOC instructors from a wide range of disciplines and locales, it was hoped that this research would help reveal instructional design and delivery practices toward personalization that could enhance the quality and long-term impact of MOOCs.

There is some early history to build upon in terms of MOOC personalization. In 2010, for instance, a MOOC titled "Personal Learning Environments Networks and Knowledge" (aka PLENK2010) was taught with personalized learning as an objective (Kop, Fournier, & Mak, 2011). Levy (2011) asserts that this particular MOOC used connectivist theory and ideas throughout. Such a course later became categorized as a "cMOOC" (Reeves & Hedberg, 2014; Siemens, 2012a). A cMOOC is more focused on knowledge generation and sharing than on knowledge consumption and passive forms of learning (Kop & Fournier, 2015). It is in the loosely organized learning networks or spaces of a cMOOC that the facilitator (or instructor) helps foster connections between the participants and the open sharing of knowledge and resources (Kop & Fournier, 2015). PLENK2010 required participants to use social media, including tools such as Second Life and Facebook, to share and co-create knowledge, thereby enhancing learner motivation through the creation of personal networks (Kop, 2011; Kop et al., 2011). In effect, there was enhanced learner choice in how participants would engage with and reflect upon the content and ideas related to the course (Kop et al., 2011).

In 2011, another type of MOOC emerged: the xMOOC (Sneddon, 2015). xMOOCs were based on interactive media such as videos, texts, and lectures that leveraged structured learning pathways on central platforms (Sneddon, 2015). Despite the sudden popularity of xMOOCs within Ivy League universities and abundant media attention (Pappano, 2012; Rodriguez, 2012), there was extensive concern related to how instructors could be responsive to learners in such large-scale courses. Unlike cMOOCs, xMOOCs focused more on content delivery and individual learning. As such, they were criticized for adopting instructional approaches more akin to behavioral theories and models, rather than learning through peers and social networks as with cMOOCs (Bates, 2012; Bonk et al., 2018; Daniel, 2012).

MOOC Personalization

While several researchers have evaluated MOOC elements for personalization, such as course design, assessments, and means of content delivery (de Oliveira Fassbinder, Fassbinder, & Barbosa, 2015), there is a dearth of empirical studies that specifically investigate MOOC personalization from instructor perspectives (Veletsianos & Shepherdson, 2016; Zhu et al., 2018). Instead, much of the MOOC literature examines learner completion trends and participant-based data (Balch, 2013; Heutte, Kaplan, Fenouillet, Caron, & Rosselle, 2014; Jordan, 2014). MOOC research also trends towards descriptive case studies based on an individual MOOC (Fini, 2009), rather than analyzing a spectrum of MOOCs through meta-analyses. However, in one meta-analysis of MOOC-related studies from 2008-2013, Nkuyubwatsi (2013) determined that MOOCs provided adult learners opportunities to engage with materials while personalizing their learning environment through content manipulation.

Recently, Hayworth (2016) suggested that a range of technologies can help personalize learning environments, such as social bookmarking, wikis, blogs, image sharing, and collaborative tools. He also notes that such personalized learning environments (PLEs) have significant implications for distance educators, instructional designers, lifelong learners, and administrators in terms of the mixing and sharing of content and resources, monitoring and managing the learning process, making learning-related suggestions and recommendations, content creation, and so on (Hayworth, 2016). Hayworth cautioned, however, against placing too much emphasis on technology-based solutions. As he and others (e.g., McLoughlin & Lee, 2010) point out, adult learners often exhibit a preference for learning which is social, participatory, and media supported rather than technocentric (Hayworth, 2016; McLoughlin & Lee, 2010). Too often, researchers exploring personalization in online environments focus on technology infrastructures rather than the pedagogical scaffolds provided by instructors and instructional designers to support learners (Aleven, Stahl, Schworm, Fischer, & Wallace, 2003). According to researchers like McLoughlin and Lee (2010), online instructor roles, instructional practices, and design decisions must be evaluated holistically to better understand how online personalized learning environments can be crafted.

Personalized forms of learning are grounded in learner-centered and constructivist learning perspectives (Reigeluth, Myers, & Lee, 2017; Watson & Watson, 2017). Such theoretical viewpoints attempt to address specific learner needs based on their learning interests and preferences, prior knowledge and experiences, and overall backgrounds (Levy, 2011; Xu, Huang, Wang, & Heales, 2014). In effect, personalization is the means used to tailor a particular learning environment's resources, tools, activities, and content to better address individual learner needs, skills, and issues (Kelly, 2016). From a learning theory standpoint, the personalization of instructional spaces lends itself to a more learner-centered paradigm that can address diverse learner requirements, competencies, and backgrounds (Green, Facer, Rudd, Dillon, & Humphreys, 2005). Also vital from this point of view are learner-learner interaction and dialogue (Brown, Collins, & Duguid, 1989; Reigeluth et al., 2015). Peers can often offer guidance that is more relevant to true learner needs and experiences (Rogoff, 1990; Vygotsky, 1978).

Siemens (2007) offered a simplified definition of personalized learning that includes two key elements: (1) the tools, and (2) the ideals that guide the design. His colleague, Downes (2016), argued that the phrase "personalized learning" has appeared so often in the educational literature during the past decade that it has begun to "lose its meaning" (para. 1). According to Downes, some refer to personalized learning as the pedagogical differentiation of instruction according to different participant variables such as learning styles and preferences, whereas others refer to decisions about the order or pathways in which the curriculum can be offered. Instead of externally provided personalized environments, Downes claims that personalized learning must empower learners by allowing them to customize and organize their own learning directives. From this viewpoint, greater emphasis is placed on the learner deciding what to learn, how to learn, and where to learn (Downes, 2016).

While empathizing with Downes' (2016) perspective, this study focuses on how MOOC instructors adapt their instruction and set of course resources and tasks to personalize the learning process in a MOOC. As a result, for the purpose of our study, we chose to define personalization as: the process by which MOOC instructors adapt their courses and instructional practices to meet diverse learner needs, skills, prior experiences, and situations.

In an example of MOOC personalization with extensive peer reliance, Kim and Chung (2015) mapped out how they attempted to create an ecology of learning in their MOOC "Designing a New Learning Environment," which was hosted on the Stanford Venture-Lab/NovoEd MOOC platform. The participants in this MOOC supported one another through social media like Twitter and discussion forum solicitations when there was missing or incomplete information (Kim & Chung, 2015). For instance, some participants responded to peer requests by creating low bandwidth versions of instructor videos for those who lived in developing regions of the world, and others translated these videos into other languages and added words and nuances that were specific to the local language to make them understandable to target groups (Kim & Chung, 2015). Instructors facilitated a space which allowed learners to personalize content for their peers so that issues of access and linguistic barriers would not hinder learning (Kim & Chung, 2015).

Similarly, Severance (2015), who has taught three different and highly successful MOOCs (i.e., Python Programming, Programming for Everybody, and Internet History, Security, and Society), has attempted to personalize his MOOC offerings by taking a learner point of view. For instance, he has designed unique "Office Hours" in cafes, hotel lobbies, and other locations wherein he meets his global participants locally in cities around the world to discuss the course with them and get their suggestions for improvement (Severance, 2015). He also has a YouTube channel specific to his MOOCs that features personal stories and contributions from participants, which appear as "Voices of the Students in MOOCs" (Severance, 2015). The creation of the YouTube channel and "Office Hours" allows participants to integrate their life experiences with their MOOC experiences, facilitating a unique type of blended personalized learning experience (Severance, 2015).

The primary intent of the present study is to explore the extent to which MOOC instructors use such forms and types of personalization practices in their MOOCs. Just how is personalization operationalized in the design and delivery of MOOCs?

Method

To understand how MOOC instructors personalize their courses to best meet individual learner needs, both quantitative and qualitative data were employed. The study comprises two distinct datasets: (1) email interviews with 25 international MOOC experts about how to personalize the MOOC experience; these experts were selected because they had all recently contributed to an edited book on MOOCs and open education, and (2) an online survey questionnaire, which was sent via SurveyMonkey to more than 1,026 MOOC instructors, of whom 152 qualified and completed the instrument.

Expert Email Interviews

It is important to mention that the email interviews provided the thematic and categorical foundations from which the survey instrument was created. The experts had useful and insightful advice and pedagogical ideas that helped in the design of the survey instrument.

Web-Based Survey

A 30-question survey was designed based, in part, on the responses of the 25 MOOC and open education experts. This questionnaire, which focused on personalization within MOOCs taught by the 152 survey respondents, consisted of 25 closed-ended items and five optional, open-ended questions. The primary selection criterion for MOOC instructor participation in the questionnaire was past or present experience teaching or designing a MOOC, which was the first survey item. Instructor participants were selected from an extensive researcher-created database.

To create the database of MOOC instructors to whom the questionnaire would be distributed, the names and affiliations of the MOOC instructors, course title, subject area, course URL, institution, course start and end date, and course duration for over 1,000 MOOCs were mined from Class Central and the MOOC List (which included courses from Open2study, Canvas, NovoEd, Blackboard, iversity, and Kadenze). Additionally, the researchers directly searched individual vendor and organizational sites (e.g., specific vendor lists from Coursera, edX, FutureLearn, and Open2study) to ensure the maximum scope of the MOOC listings database. The researchers further compiled a list of approximately 50 Korean MOOCs (i.e., K-MOOCs) (http://www.kmooc.kr/). Next, the researchers cross-checked the database for redundancy and errors. The final list included MOOC instructors from universities, organizations, and institutions in more than two dozen countries, including Australia, Belgium, Canada, China, Denmark, Germany, Grenada, India, Ireland, Italy, Japan, Korea, Macau, Mexico, the Netherlands, New Zealand, Norway, Russia, Singapore, South Africa, Spain, Sweden, Switzerland, the United Kingdom, and the United States. The largest percentage of participants were from institutions in the United States.
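To illustrate the cross-checking step described above, the short Python sketch below shows one way duplicate course entries mined from multiple listing sites could be flagged. This is only an illustrative example under assumed conditions: the file name (mooc_listings.csv), the column names (instructor, course_title), and the matching rule are hypothetical and do not reflect the researchers' actual procedure.

```python
import csv

def normalize(text):
    """Lowercase and collapse whitespace so near-identical entries can match."""
    return " ".join(text.lower().split())

def deduplicate(rows):
    """Keep the first occurrence of each (instructor, course title) pair."""
    seen = set()
    unique = []
    for row in rows:
        key = (normalize(row["instructor"]), normalize(row["course_title"]))
        if key not in seen:
            seen.add(key)
            unique.append(row)
    return unique

# Hypothetical input file combining entries mined from Class Central,
# the MOOC List, and individual vendor sites.
with open("mooc_listings.csv", newline="", encoding="utf-8") as f:
    listings = list(csv.DictReader(f))

cleaned = deduplicate(listings)
print(f"{len(listings)} mined entries -> {len(cleaned)} unique courses")
```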

Results

Expert Email Interviews

In terms of the email interviews, some of the experts in the field of MOOCs and open education argued for greater use of collaborative projects, whereas others mentioned the need for MOOC toolkits or platforms that are designed for access in low bandwidth conditions as a means to personalize the experience. Among these experts, a senior education specialist from the Open Learning Campus of the World Bank indicated that they attempted to incorporate badging and customized discussion forums as a means to personalize the experience. Another MOOC expert in the Philippines stated that "one feature that we have integrated into our MOOCs... to personalize learning is to allow the learner to choose whether to learn through the video lessons, text lessons, or podcasts." These expert interviews were thematically coded to develop the Web survey of MOOC instructors, mentioned earlier. Survey questions were drafted related to how MOOC instructors fostered feedback, interaction, and engagement in the learning process. Questions were also drafted related to the types of course resources that were embedded to help personalize the MOOC.

Online Survey Results

Some of the online survey findings are recapped below, starting with key demographic data related to instructors' experience with MOOCs. Of the 978 valid survey requests, 152 individuals completed the survey. This 15.5% response rate is considered more than acceptable for opt-in, online surveys (Cho & LaRose, 1999). These 152 instructors taught MOOCs in fields such as science, social sciences, the humanities, engineering, medicine, business, language, mathematics, art, and law. Nearly one-third of the MOOC respondents were from medical and health sciences or from the field of education. Another 9% came from the field of business, and 9% from computer science (see Figure 1).


Figure 1. MOOC instructor departmental or primary discipline affiliations (n=150).

The prior MOOC teaching experience among the survey participants was quite varied. Of these 152 respondents, 55.3% had taught just one MOOC, 19.7% had taught two MOOCs, and nearly 25% had taught three or more MOOCs in the past. In contrast, more than half of these instructors (n=84; 55.2%) had never completed a MOOC as a learner, while 25 (16.5%) had completed one MOOC in the past. It is also important to note that 43 (28.3%) of the respondents had completed two or more MOOCs.

In terms of MOOC enrollment, 71 of the 150 responding MOOC instructors (47.3%) taught courses with fewer than 10,000 participants, 36 of the respondents (24.0%) had courses with 10,000-25,000 enrolled, 19 (12.7%) had courses with 25,001-50,000, and 15 respondents (10.0%) had courses with 50,001-100,000 participants. Just nine respondents (6.0%) had MOOCs with more than 100,000 enrolled. While precise enrollment information was not requested, these self-reported enrollment figures were clearly lower than the 40,000-participant median reported by Jordan (2014).

The instructors were requested to reflect on their instructional practices for their most recent MOOC. Roughly six in 10 (n=91/150) of the instructors taught instructor-led courses: 64 instructors (42.7%) used additional aids such as teaching assistants, moderators, and/or tutors, while the other 27 instructors (18.0%) had no additional teaching support. Of the remaining 59 courses, 19 (12.7%) were participant driven, 21 (14.0%) were self-paced, nine (6.0%) were a hybrid or blended type of MOOC, and 10 (6.7%) used other methods.

Research Question #1

How much self-reported effort do instructors place on addressing the unique participant or learner needs in the design and development of their MOOCs?

As course personalization can depend on an instructor's involvement in course design, participants (n=152) were asked to rate their involvement in designing the course on a scale of 1 (low) to 10 (high). When collapsed into three categories (i.e., 1-3 Low; 4-7 Medium; and 8-10 High), only five instructors (3.3%) indicated low involvement in designing the course, and 17 (11.2%) indicated modest involvement (see Figure 2). The remaining 130 MOOC instructors (85.5%) indicated a high level of involvement, of whom 94 (72.3%) marked "10" out of 10 on the scale. The average rating was 8.92 (SD=1.88), indicating heavy instructor involvement in the design of their MOOCs.
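As a reading aid for the collapsing of the 10-point involvement scale into the three bands reported above, the following minimal Python sketch shows how such ratings might be binned and summarized. The ratings list is made up for illustration only; the actual survey responses are reported here solely in aggregate.

```python
import statistics

def band(rating):
    """Collapse a 1-10 rating into the Low/Medium/High bands used in this study."""
    if rating <= 3:
        return "Low"
    if rating <= 7:
        return "Medium"
    return "High"

# Hypothetical ratings; the real distribution is summarized as M=8.92, SD=1.88.
ratings = [10, 10, 9, 8, 7, 10, 6, 3, 10, 9]

counts = {"Low": 0, "Medium": 0, "High": 0}
for r in ratings:
    counts[band(r)] += 1

print(counts)  # e.g., {'Low': 1, 'Medium': 2, 'High': 7}
print(f"M = {statistics.mean(ratings):.2f}, SD = {statistics.stdev(ratings):.2f}")
```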


Figure 2. MOOC instructor involvement in designing course content for the MOOC. Note: on a scale of 1 (low) to 10 (high) (n=152).

Given that the vast majority of the respondents were extensively involved in the design of their most recent MOOC, they had some influence over the degree to which that course was adapted to learner needs and preferences. Figures 3 and 4 represent the self-identified efforts or energies expended by MOOC instructors to personalize their courses during the design phase and delivery phase of the MOOC, respectively.

The degree of effort placed on meeting unique learner needs when designing their most recent MOOC was also investigated. As shown in Figure 3, only 50 of the 144 respondents (34.7%) felt that they placed a high degree of effort on meeting unique participant or learner needs during the design of their most recent MOOC. An additional 46 respondents (31.9%) placed modest effort, whereas the remaining 48 (33.3%) admitted to not exerting much effort in this regard (M=5.63; SD=3.03).


Figure 3. Effort placed on meeting unique learner needs when designing most recent MOOC. Note: on a scale of 1 (low) to 10 (high) (n=144).

As noted in Figure 4, only 41 of these respondents (28.5%) felt that they placed high effort on meeting the MOOC participant or learner needs during the implementation and delivery of the MOOC. While 61 (42.4%) placed modest effort in this regard, nearly three in 10 MOOC instructors (n=42; 29.1%) did not commit much effort toward meeting participant needs during the implementation and delivery phase (M=5.53; SD=2.80).


Figure 4. Effort placed on meeting unique learner needs when delivering most recent MOOC. Note: on a scale of 1 (low) to 10 (high) (n=144).

Research Question #2

What are the personalization practices of MOOC instructors in terms of the pedagogical activities and task structures employed?

Participant interaction is another means to address learner needs as exemplified in the differences between cMOOCs and xMOOCs. In this study, MOOC instructors indicated that they attempted to foster learner-to-learner connections and interactions to some degree, with an average of 6.24 on a scale of 1 (low) to 10 (high) (n=137). However, when asked about the ways in which peer interaction was encouraged in their most recent MOOC, the methods selected were limited. When presented with a list of nine options (including "not applicable"), more than 80% of MOOC instructors indicated that they relied on system-built discussion forums for learner-learner forms of interaction. No other resource or activity was employed by more than half of the respondents. For instance, only one in four instructors checked that they used pair-based assignments or tasks (e.g., critical friend activities). Furthermore, synchronous forms of meetings or conferencing were used by less than one in 10 of the respondents. Break-out discussion forums or groups were employed by 31 (22.6%) of the respondents, whereas local meet ups were being used by 22 (16.1%) of the MOOC instructors.

As mentioned earlier, Downes (2016) argues that learner empowerment and choice is a key part of personalization. When asked about the structures that they provided in their most recent MOOC from a list of ten items, the survey participants (n=126) primarily relied on optional readings (74.6%) and learner selected incentives such as certificates, badges, or course credit (64.3%). The respondents also indicated that they employed course tasks and assignments (38.1%), learner discussion and negotiation of content (36.5%), multimedia elements to explain concepts (31.7%), learner-driven or contributed content (30.2%), and learner selected learning pathways (19.0%).

Using an open-ended item, the questionnaire provided space for respondents to elaborate on the MOOC personalization practices that they employed to address those who had enrolled. In this space, some respondents specifically referred to pedagogical adaptations. For instance, one respondent designed her course, "To give [sic] different case studies and examples, considering different backgrounds and interests. To have higher order and lower [sic] order assessments, considering the personal interest for deepening into content." Another reflected that, "it's all about expectations and communication. From the first day of 'launching' we have moderators & academics assigned to welcome and encourage learners to ask questions and post comments for peer-to-peer feedback." One instructor noted that, "in terms of pathways, there was no thought to giving learners precise pathways and choices - instead [of] using flexible deadlines and flexible drop/reenroll, students get a good hybrid of structured/self-paced. Some students move fast and others take material quite slowly. Students vary their own pace as the course progresses according to their needs, skills, and time available for the course."

Personalization also requires monitoring learner progress and awareness of learning accomplishments (Reigeluth et al., 2015). In terms of monitoring or tracking learner progress in a MOOC, 42.3% of MOOC instructors (n=137) relied on learner self-monitoring and evaluation. Approximately one in three (34.3%) employed modular or unit-based forms of assessment. About one in four (24.8%) used weekly or daily reports from learning analytics. A similar percentage (23.4%) used moderator, tutor, or teaching assistant feedback to monitor or track progress. While 13.9% used a hybrid system of tracking learner progress and participation, another 13.1% relied on peer-based reports. Just 7.3% employed personal tracking from the instructor; in contrast, 14.6% noted that learner progress was not tracked.

Human and system forms of feedback are another mechanism to address learner needs in a MOOC. Given the typically large number of MOOC participants, it was not too surprising that peer feedback was used by 87 (64.4%) of the 135 instructors who responded to this "check all that apply" question. In addition, 78 (57.8%) of the respondents relied on computer or system-based forms of feedback (see Figure 5). Also important were moderator, tutor, or teaching assistant feedback (n=58; 43.9%), instructor feedback (n=54; 40.0%), and feedback via task or assignment rubrics (n=50; 37.0%). Less frequent was the use of forms of self-feedback (n=36; 26.7%). Nearly nonexistent was feedback coming from outside experts (n=4; 3.0%).


Figure 5. Number of MOOCs that offer different types of learner feedback (n=135).

Research Question #3

What are the personalization practices of MOOC instructors in terms of content resources and associated technology tools employed?

There are many resources, activities, and technology tools with which to attempt to personalize MOOCs. Survey participants were asked to check the items most frequently used from a list of 22 types of learning resources. Consistent with the literature, MOOC instructors often provided discussion forums (91.5%), video lectures and tutorials (76.8%), and readings (76.1%). More than half of the respondents offered content in the form of practice quizzes (57.7%), interactive assessments (50.7%), and expert interviews (50.0%). Additionally, many relied on PowerPoint and other presentations (47.9%), instructor lecture notes (44.4%), animations and interactive content (43.0%), content visualizations (e.g., concept maps, diagrams, flowcharts, etc.) (42.3%), and video examples (e.g., TED talks) (39.4%). Blogs, wikis, podcasts, mobile applications, simulations, and social media were used infrequently. While the respondents selected from a pre-established list of options, the findings indicate that there is an array of resources and tools on which MOOC instructors and designers rely to craft their courses.

One open-ended question allowed for the discussion of how technology might be employed to personalize learning. One instructor stated, "[t]he most personal way was a brief video (less than 5 minutes) made at the end of each week where I responded to specific posts made in the discussions forums." Another stated, "I held virtual office hours during each of the three offerings of my course. In several, I had teaching associates join in. In the last offering, I used the first part of the meetup to share current nutrition related news and studies to help keep the course more up to date (we also posted news and studies)." Similarly, another respondent mentioned that he hosted, "periodic Google Hangouts to support learners and volunteer teaching assistants in my course. I also use a Twitter account for sharing less formal, more personal thoughts about the course and its content."

Enhancing the intelligence of the system has the potential to result in greater personalization of the course. As more than half of the MOOC instructors were utilizing computer-based forms of feedback to enhance their courses, the role of automation and artificial intelligence (AI) for personalization warranted further probing. As evident in Figure 6, the use of an automated grading system was the only feature leveraged by more than half of the MOOC instructors (n=67 of 127 respondents; 52.8%). Automated or system-generated feedback was employed by 28 (22.1%) of the 127 respondents. Similarly, automated alerts for missed assignments were used by 24 (18.9%) of the respondents, and automated alerts to participants who do not log in regularly were used in 21 (16.5%) of the MOOCs. Almost nonexistent were tools for automated group allocation (n=7; 5.5%), automated forms of plagiarism checking and detection (n=5; 3.9%), and embedded agents for learner advice (n=3; 2.4%). System adaptation to user performance was found in a single course (n=1; 0.8%).


Figure 6. Number of MOOCs that offer different types of learning system automation and adaptation (n=127).

Another line of inquiry on personalization and tools centered on how MOOC participants could communicate with instructors, especially when barriers existed. Over half of the 135 respondents indicated that learners could email the course or system (n=78; 57.8%) or send direct emails to instructors (n=75; 55.6%). Less common was emailing teaching assistants (n=42; 31.1%) or relying on social media for support (n=35; 25.9%). Even fewer used synchronous conferencing (n=18; 13.3%), synchronous chat tools (n=11; 8.2%), or face-to-face meet ups (n=4; 3.0%). Nearly nonexistent was the use of personal visits (n=1), virtual world types of environments (n=1), and mobile phones (including text messaging) (n=0).

Research Question #4

How would these instructors structure their next MOOC differently in terms of personalization?

One of the most significant findings was that the majority of MOOC instructors aspired to do a better job of addressing personalization in their next MOOC experience (n=134; M=6.63; SD=2.91). Of the 134 respondents who answered this question, 56 (41.8%) were highly interested in learning new ways to personalize their next MOOC, 48 (35.8%) were moderately interested, and 30 (22.4%) expressed limited interest (see Figure 7). Combining the modest and high interest groups shows that more than three-fourths of MOOC instructors were interested in MOOC personalization in the future. Advocating for MOOC instructor professional development and training, therefore, seems highly warranted.


Figure 7. MOOC instructor interest in learning new ways to personalize their next MOOC offering. Note: on a scale of 1 (low) to 10 (high) (n=134).

Several interesting comments were offered in response to the open-ended question regarding how respondents might redesign their courses to enhance course personalization and overall effectiveness. For instance, one MOOC instructor would "hire some of our students and alumni to get involved - the students really loved the additional points-of-view and the interaction." Another instructor stated that she would, "introduce Google Hangouts. Develop alternative pathways for content. Allow students more space to share own competencies and knowledge levels (perhaps wikis etc.)." Another example is an instructor planning to "offer more examples on different topics and offer different tracks (e.g., just video, video and quizzes, video, quizzes and peer review assignments, etc.)."

Additional Open-Ended Comments

Across the open-ended questions, other personalization practices of the respondents included greater instructor participation in discussion forums, increasing opportunities for learner reflection, designing online learning communities, creating shorter and less formal videos, fostering more peer interaction, subtitling content in different languages, and utilizing formative assessments in the form of participant surveys at the end of each week. The most frequent comment from these MOOC instructors was that they attempted to incorporate "flexible deadlines," including allowing students to post discussion comments and complete tasks at their own pace. In addition, many also leveraged social media, multimedia, mobile applications, and readings to supplement course materials. Among the other personalization methods employed, several instructors mentioned relying on guest speakers, whereas others employed case-based learning. A few instructors attempted to empower the participants by allowing them to choose their own assignments, make multiple attempts to complete assignments, or create their own student groups.

Discussion

The purpose of this study was to investigate how MOOC instructors adapt their courses to the individualized learning needs of students who enroll in a MOOC. In effect, the goal was to better understand the instructional design and personalization approaches of instructors related to MOOCs. The researchers realize that the personalization of MOOCs is a highly idealized and contested concept. We also acknowledge that the massive size of MOOCs makes personalization extremely difficult, if not impossible. However, the goal was to push toward a more personalized MOOC experience through the exploration of MOOC instructor activities, resources, and technologies involved in MOOC design and implementation.

As detailed in the findings, numerous resources, technology tools, and instructional practices are used by instructors when teaching a MOOC. Not surprisingly, most instructors rely on discussion forums, video lectures, supplemental readings, and quizzes. In the open-ended items, MOOC instructors mentioned additional ways in which they attempted to better address learner needs beyond the standard MOOC platform tools and features. For instance, some respondents mentioned the use of flexible deadlines, options for course tasks, virtual office hours, integrated media elements, interactive cases, and guest speakers as among the ways in which they personalized their massive courses. This study finds that personalization methods are so varied that it is difficult to accurately capture all forms of MOOC personalization used by an instructor or design team without additional measures such as in-depth interviews, focus groups, and course observations.

Among the key findings was a disconnect between MOOC instructors' perceived degree of involvement in the actual design of their courses and the effort they perceived themselves placing, during the design and delivery of their MOOCs, on addressing unique participant or learner needs. At the same time, these MOOC instructors desired further training in techniques for such personalization when designing or revamping their next MOOC.

As shown in this study, myriad options exist to attempt to personalize a MOOC. The 152 instructors who completed the questionnaire employed a gamut of feedback techniques, pedagogical activities, resources, interactions, and assessments to address learner needs. There is a range of instructional techniques, technology tools, and learning resources at the MOOC instructor's disposal for attempting to ameliorate gaps in knowledge and address particular learning needs. Such techniques and resources will only increase in the coming years, thereby adding to the already complex instructional task confronting MOOC instructors and designers. Given that most MOOC instructors surveyed in this study had only taught one MOOC, such limited experiences with MOOCs may constrain the degree to which many of these instructors feel comfortable addressing learner personal needs. Follow-up research could be directed at the more experienced MOOC instructors to investigate if practices and tools vary.

One issue noted in this study was the lack of learner monitoring and feedback. Learner progress was left to self-monitoring or was ignored altogether. Similarly, peer feedback and system feedback, while important to learner success, were more pervasive than feedback coming from the instructor or instructional assistants. Finding ways to build in expert feedback (including soliciting alumni of these courses for feedback), which was rare in this study, might be one way to foster greater personalized attention to learners and overall success. There were a variety of ways in which to have participant questions answered (i.e., contacting the instructor, teaching assistants, social media, synchronous chat, meet ups, synchronous conferencing, etc.). This issue is consistent with challenges faced by MOOC instructors and designers as they struggle to develop a feedback mechanism which "reinforces learning and identifies inconsistencies in the learner process" (Davis et al., 2014, p. 8). Perhaps social media interactions and local meet ups with peers and instructors within MOOCs will increase in the coming decade (see Severance, 2015 for ideas).

Among the more interesting findings from the 152 MOOC instructors polled was that automated alerts, adaptive forms of instruction, and AI do not seem to be playing much of a role in MOOCs. While only addressed in a single questionnaire item, the findings cast doubt on claims that such technologies and systems will soon take on a prominent role in MOOCs and other forms of open education. In fact, automated checking of participant progress and the flagging of potential issues were not widely implemented in these MOOCs, nor was the sending of reminders or feedback on accomplishments.

While several prominent technology pioneers have been promoting adaptive digital courseware and AI technology to help reform education, including Mark Zuckerberg (Singer, 2017) and Bill Gates (Schaffhauser, 2014; Straumsheim, 2016), these findings seem to indicate that the impact of AI thus far in the field of MOOCs and open education is quite limited. Even if AI technology were more prominent in MOOCs, automated alerts, reminders, and feedback do not offer MOOC participants "a sense of being treated as an individual, and, therefore," such forms of course automation fall "short in providing personalized learning" (Fournier & Kop, 2015, p. 298). As Bates (2012) laments, at present, technologies embedded in MOOCs do not yet offer the timely and pointed comments and questions that can nurture rich and interactive online discussions, a sense of caring and encouragement, and a robust understanding of individual student needs. Nevertheless, much investment is being made today in AI technology around the world that should eventually lead to inroads toward more customized and personalized MOOC experiences (Metz & Satariano, 2018).

Limitations

As with any educational research project, there are several important limitations to mention. First, as indicated in the Method section, we assembled a database of more than 1,000 MOOC instructor names, courses, and associated contact information from selected lists and vendor websites. However, the researchers did not collect information from all MOOC vendors, nor were MOOCs taught in languages other than English or Korean included, unless the course was cross-listed in a researcher-mined MOOC vendor list. Second, participants self-selected into this study on "how massive open online course (MOOC) instructors personalize learning." Therefore, survey respondents may have devoted more time to their instructional and pedagogical approaches than those who did not respond. Third, no actual teaching was directly observed, nor was any instructional content analyzed. Additionally, the researchers did not conduct follow-up interviews or focus groups with survey participants on their specific personalization and other instructional design practices. Another limitation is that while survey participants were provided with a loose definition of personalization, the 25 experts were not. By not operationally defining personalization for all participants, any in vivo thematic coding schemes created by the researchers have potential constraints and flaws. At the same time, however, it is important to recognize that the term "personalization" has many different connotations and interpretations; one definition may not work for all stakeholders. Another term not explicitly defined was "effort." Once again, each respondent may have a vastly different understanding of what incredibly high or low effort might entail.

Future Directions

Findings from these data sets are merely the first steps in the process. There is a clear and present need to perform in-depth, follow-up inquiries with MOOC instructors about their actual instructional design practices; specifically, the means by which personalized learning is attempted, and any instructional modifications and adaptations implemented over time. Interviews with instructors, via email or Web conferencing, would help uncover effective instructional practices undertaken for MOOC personalization as well as course redesign efforts pending or in progress. In addition to interviews, follow-ups can take place via focus groups, content analysis, active participation in MOOCs, reviews of historical records, additional surveys, or a combination of these methods. MOOC participants and instructional designers could be solicited to verify and extend the findings of the knowledge base related to MOOC personalization during design and development. Additional research on MOOC personalization is necessary to create effective instructional design and delivery guidelines, frameworks, and models. A better understanding of instructors and participants will help foster more engaging, personalized, and culturally sensitive MOOC-based learning environments.

Implications and Final Comments

Research undertaken in this vein has the possibility of enhancing the planning, development, and delivery of courses that impact millions of learners. Even if only minor or modest enhancements are made, the potential impact is immense. Recent data from Class Central indicate that in 2017, over 78 million students signed up for more than 9,400 MOOCs offered by more than 800 different universities (Lederman, 2018; Shah, 2018). These figures represent a huge increase over the prior year, in which over 700 universities worldwide offered nearly 7,000 MOOCs to more than 58 million participants. In comparison, just 35 million learners enrolled in MOOCs at 500+ universities in 2015 (Shah, 2015). Coursera accounted for 30 million of the MOOC enrollments in 2017, as compared to 23 million in 2016 (Shah, 2016, 2017). Another 14 million enrollments were in edX in 2017, 4 million more than in 2016 (Shah, 2016, 2017). Equally impressive, 23 million participants registered for a MOOC for the first time in 2016 (Shah, 2016), and another 20 million new MOOC participants enrolled in 2017 (Shah, 2017). What is clear from these data is that MOOCs are a phenomenon gaining accelerating attention, investment, and societal importance. They are no longer a cultural anomaly or learning novelty of limited value due to low completion rates. Instead, tens of millions of individuals are apparently finding some value in enrolling in MOOCs offered by hundreds of universities worldwide.

Even when considering the highly advertised course retention and completion problems, MOOCs are impacting scores of lives around the planet each day. Clearly, a better understanding of how MOOCs are designed and how participant progress is monitored should eventually result in higher quality course design and delivery, and improved completion and retention rates. Continued research in this area can assist countless MOOC instructors in enhancing their massive open online courses with techniques, activities, and resources that engage and inspire learners from around the world and draw them into their respective disciplines. The study also informs MOOC vendors about MOOC platform and associated tool design. In addition, it can apprise government funding agencies about the types of MOOC tools and resources that can foster improvements in the evolution of the field of open and distance learning for capacity building.

Given the number of participants that MOOCs attract, this study has the potential to provide marked insight into an emerging phenomenon that has immense global, local, and societal ramifications. With such wide impact potential, our research team continues to expand the database of MOOC instructors and courses that we have collected. The goal as we move forward is to determine more about the psychological, instructional, and technological issues, challenges, and opportunities of MOOCs and other emerging types of open online courses and educational experiences.

References

Aleven, V., Stahl, E., Schworm, S., Fischer, F., & Wallace, R. (2003). Help seeking and help design in interactive learning environments. Review of Educational Research, 73(3), 277-320. doi: 10.3102/00346543073003277

Balch, T. (2013). About MOOC completion rates: The importance of student investment. The Augmented Trader [Weblog post]. Retrieved from https://augmentedtrader.com/2013/01/06/about-mooc-completion-rates-the-importance-of-investment/

Bates, T. (2012, August 5). What's right and what's wrong about Coursera-style MOOCs. Learning and Distance Education Resources Blog [Weblog post]. Retrieved from http://www.tonybates.ca/2012/08/05/whats-right-and-whats-wrong-about-coursera-style-moocs/

Bethke, R. (2016, June 17). Trend: Online learning going personal. eCampus News. Retrieved from http://www.ecampusnews.com/technologies/personalized-learning-online

Bonk, C. J., Lee, M. M., Reeves, T. C., & Reynolds, T. H. (Eds.). (2015). MOOCs and open education around the world. New York, NY: Routledge.

Bonk, C. J., Lee, M. M., Reeves, T. C., & Reynolds, T. H. (2018). The emergence and design of massive open online courses. In R. A. Reiser & J. V. Dempsey (Eds.), Trends and issues in instructional design and technology (4th ed., pp. 250-258). New York, NY: Pearson Education.

Bowman, K. D. (2012, Summer). Winds of change: Is higher education experiencing a shift in delivery? Public Purpose Magazine (from the American Association of State Colleges and Universities). Retrieved from http://www.aascu.org/WorkArea/DownloadAsset.aspx?id=5570

Brown, J., Collins, A., & Duguid, P. (1989). Situated cognition and the culture of learning. Educational Researcher, 18(1), 32-42. doi: 10.3102/0013189X018001032

Cho, H., & LaRose, R. (1999). Privacy issues and Internet surveys. Social Science Computer Review, 17(4), 421-434. doi: 10.1177/089443939901700402

Daniel, J. (2012). Making sense of MOOCs: Musings in a maze of myth, paradox, and possibility. Journal of Interactive Media in Education, 3. Retrieved from http://jime.open.ac.uk/articles/10.5334/2012-18/

Davis, H., Dickens, K., León Urrutia, M., Sánchez Vera, M. del M., & White, S. (2014). MOOCs for universities and learners: An analysis of motivating factors. In Proceedings of the 6th International Conference on Computer Supported Education. Barcelona. Retrieved from http://eprints.soton.ac.uk/363714/1/DavisEtAl2014MOOCsCSEDUFinal.pdf

de Oliveira Fassbinder, A. G., Fassbinder, M., & Barbosa, E. F. (2015, October). From flipped classroom theory to the personalized design of learning experiences in MOOCs. Proceedings of Frontiers in Education Conference (FIE), 32614, 1-8.

Deng, R., & Benckendorff, P. (2017). A contemporary review of research methods adopted to understand students' and instructors' use of massive open online courses (MOOCs). International Journal of Information and Education Technology, 7(8). doi: 10.18178/ijiet.2017.7.8.939

Downes, S. (2016, February 17). Personal and personalized learning. European Multiple MOOCs Aggregator Newsletter. Retrieved from http://www.downes.ca/post/65065

Fan, H., & Poole, M. S. (2006). What is personalization? Perspectives on the design and implementation of personalization in information systems. Journal of Organizational Computing and Electronic Commerce, 16(3-4), 179-202. doi: 10.1080/10919392.2006.9681199

Fini, A. (2009). The technological dimension of a massive open online course: The case of the CCK08 course tools. The International Review of Research in Open and Distributed Learning, 10(5), 14-26. Retrieved from http://www.irrodl.org/index.php/irrodl/article/view/643/1402

Fournier, H., & Kop, R. (2015). MOOC learning experience design: Issues and challenges. International Journal on E-Learning, 14(3), 289-304. Retrieved from https://www.learntechlib.org/p/150661/

Green, H., Facer, K., Rudd, T., Dillon, P., & Humphreys, P. (2005). Personalisation and Digital Technologies. Bristol: Futurelab.

Hayworth, R. (2016). Personal learning environments: A solution for self-directed learners. TechTrends, 60, 359-364. doi: 10.1007/s11528-016-0074-z

Hew, K. F., & Cheung, W. S. (2014). Students' and instructors' use of massive open online courses (MOOCs): Motivations and challenges. Educational Research Review, 12, 45-58. doi: 10.1016/j.edurev.2014.05.001

Heutte, J., Kaplan, J., Fenouillet, F., Caron, P. A., & Rosselle, M. (2014) MOOC user persistence. In L. Uden, J. Sinclair, T.H. Tao, & D. Liberona (Eds.), Learning technology for education in cloud. MOOC and big data. LTEC 2014. Communications in computer and information science (Vol 446, pp. 13-24). New York, NY: Springer. doi: 10.1007/978-3-319-10671-7_2

Jagannathan, S. (2015). Harnessing the power of open learning to share global prosperity and eradicate poverty. In C. J. Bonk, M. M. Lee, T. C. Reeves, & T. H. Reynolds (Eds.), MOOCs and open education around the world (pp. 218-231). New York: Routledge.

Jordan, K. (2014). Initial trends in enrolment and completion of massive open online courses. The International Review of Research in Open and Distributed Learning, 15(1), 133-160. Retrieved from http://www.irrodl.org/index.php/irrodl/article/view/1651/2774

Kelly, R. (2016, July). 7 universities receive grants to implement adaptive learning at scale. Campus Technology. Retrieved from https://campustechnology.com/articles/2016/07/14/7-universities-receive-grants-to-implement-adaptive-learning-at-scale.aspx

Kim, P., & Chung, C. (2015). Creating a temporary spontaneous mini-ecosystem through a MOOC. In C. J. Bonk, M. M. Lee, T. C. Reeves, & T. H. Reynolds (Eds.), MOOCs and open education around the world (pp. 157-168). New York, NY: Routledge.

Kop, R. (2011). The challenges to connectivist learning on open online networks: Learning experiences during a massive open online course. International Review of Research in Open and Distance Learning, 12(3), 19-38. Retrieved from http://www.irrodl.org/index.php/irrodl/article/view/882/1689

Kop, R., Fournier, H., & Mak, J. S. F. (2011, November). A pedagogy of abundance or a pedagogy to support human beings? Participant support on massive open online courses. International Review of Research on Open and Distance Learning, 12(7). Retrieved from http://www.irrodl.org/index.php/irrodl/article/view/1041/2025

Kop, R., & Fournier, H. (2015). Peer2peer and open pedagogy of MOOCs to support the knowledge commons. In C. J. Bonk, M. M. Lee, T. C. Reeves, & T. H. Reynolds (Eds.), MOOCs and open education around the world (pp. 303-314). New York, NY: Routledge.

Lederman, D. (2018, February 14). MOOCs: Fewer new students, but more are paying. Inside Higher Ed. Retrieved from https://www.insidehighered.com/digital-learning/article/2018/02/14/moocs-are-enrolling-fewer-new-students-more-are-paying-courses

Levy, D. (2011). Lessons learned from participating in a connectivist massive online open course (MOOC). In Y. Eshet-Alkalai, A. Caspi, S. Eden, N. Geri, & Y. Yair (Eds.), Proceedings of the Chais Conference on Instructional Technologies Research 2011: Learning in the Technological Era (pp. 31-36). Raanana: The Open University of Israel.

Liyanagunawardena, T. R., Adams, A. A., & Williams, S. A. (2013). MOOCs: A systematic study of the published literature 2008-2012. The International Review of Research in Open and Distributed Learning, 14(3), 202-227. doi: 10.19173/irrodl.v14i3.1455

McLoughlin, C., & Lee, M. J. (2010). Personalised and self-regulated learning in the Web 2.0 era: International exemplars of innovative pedagogy using social software. Australasian Journal of Educational Technology, 26(1), 28-43. Retrieved from https://ajet.org.au/index.php/AJET/article/view/1100/355

Metz, C., & Satariano, A. (2018, July 3). Silicon Valley's giants take their talent hunt to Cambridge. The New York Times. Retrieved from https://www.nytimes.com/2018/07/03/technology/cambridge-artificial-intelligence.html

MOOC @ Edinburgh 2013 - Report #1. (2013). University of Edinburgh, Edinburgh, Scotland. Retrieved from https://www.era.lib.ed.ac.uk/bitstream/handle/1842/6683/Edinburgh_MOOCs_Report2013_no1.pdf?sequence=1&isAllowed=y

Nkuyubwatsi, B. (2013, October). Evaluation of massive open online courses (MOOCs) from the learner's perspective. Proceedings of the 12th European Conference on e-Learning, Sophia Antipolis, France, 12, 340-346. Retrieved from http://hdl.handle.net/2381/28553

Pappano, L. (2012, November 2). The year of the MOOC. The New York Times, 2(12). Retrieved from http://www.nytimes.com/2012/11/04/education/edlife/massive-open-online-courses-are-multiplying-at-a-rapid-pace.html

Prain, V., Cox, P., Deed, C., Dorman, J., Edwards, D., Farrelly, C., & Waldrip, B. (2013). Personalised learning: Lessons to be learnt. British Educational Research Journal, 39(4), 654-676. doi: 10.1080/01411926.2012.669747

Reeves, T. C., & Hedberg, J. G. (2014). MOOCs: Let's get REAL. Educational Technology, 54(1), 3-8. Retrieved from https://www.jstor.org/stable/pdf/44430228.pdf

Reigeluth, C. M., Aslan, S., Chen, Z., Dutta, P., Huh, Y., Lee, D., & Watson, W. R. (2015). Personalized integrated educational system: Technology functions for the learner-centered paradigm of education. Journal of Educational Computing Research, 53(3), 459-496. doi: 10.1177/0735633115603998

Reigeluth, C. M., Myers, R. D., & Lee, D. (2017). The learner-centered paradigm of education. In C. M. Reigeluth, B. J. Beatty, & R. D. Myers (Eds.), Instructional-Design Theories and Models (pp. 5-32). Hillsdale, NJ: Routledge.

Rodriguez, O. C. (2012). MOOCs and the AI-Stanford like courses: Two successful and distinct course formats for massive open online courses. European Journal of Open, Distance and E-Learning, 1(2), 1-13. Retrieved from https://files.eric.ed.gov/fulltext/EJ982976.pdf

Rogoff, B. (1990). Apprenticeships in thinking. New York: Oxford University Press.

Saadatdoost, R., Sim, A. T. H., Jafarkarimi, H., & Hee, J. M. (2015). Exploring MOOC from education and information systems perspectives: A short literature review. Educational Review, 67(4), 505-518. doi: 10.1080/00131911.2015.1058748

Schaffhauser, D. (2014, October 1). Gates Foundation picks seven to vie for $20 million digital courseware investments. Campus Technology. Retrieved from https://campustechnology.com/Articles/2014/10/01/Gates-Foundation-Picks-Seven-To-Vie-for-$20-million-Digital-Courseware-Investments.aspx?p=1

Severance, C. (2015). Learning about MOOCs by talking to students. In C. J. Bonk, M. M. Lee, T. C. Reeves, & T. H. Reynolds (Eds.), MOOCs and open education around the world (pp. 169-179). New York, NY: Routledge.

Shah, D. (2015). By the numbers: MOOCs in 2015. Class Central. Retrieved from https://www.class-central.com/report/moocs-2015-stats/

Shah, D. (2016). By the numbers: MOOCs in 2016. Class Central. Retrieved from https://www.class-central.com/report/mooc-stats-2016/

Shah, D. (2018, January 22). A product at every price: A review of MOOC stats and trends in 2017. Class Central. Retrieved from https://www.class-central.com/report/moocs-stats-and-trends-2017/

Siemens, G. (2007). PLEs - I acronym, therefore I exist. Elearnspace: Learning, Networks, Knowledge, Technology, Community [weblog]. Retrieved from http://www.elearnspace.org/blog/archives/002884.html

Siemens, G. (2012a, January 19). Connectivist learning theory. Retrieved from the P2P Foundation Wiki: http://p2pfoundation.net/Connectivist_Learning_Theory_-_Siemens

Siemens, G. (2012b, June 3). What is the theory that underpins our moocs? E-LearningSpace. Retrieved from http://www.elearnspace.org/blog/2012/06/03/what-is-the-theory-that-underpins-our-moocs/

Singer, N. (2017, June 6). The Silicon Valley billionaires remaking America's schools. The New York Times. Retrieved from https://www.nytimes.com/2017/06/06/technology/tech-billionaires-education-zuckerberg-facebook-hastings.html

Sneddon, S. (2015, March 31 - April 2). Could you make it a bit more MOOCy? Paper presented at the Socio Legal Studies Association Annual Conference. University of Warwick, England.

Straumsheim, C. (2016, June 23). Learning to adapt. Inside Higher Ed. Retrieved from https://www.insidehighered.com/news/2016/06/23/study-finds-inconclusive-results-about-efficacy-adaptive-learning

Veletsianos, G., & Shepherdson, P. (2016). A systematic analysis and synthesis of the empirical MOOC literature published in 2013-2015. International Review of Research in Open and Distributed Learning, 17(2), 198-221. Retrieved from http://www.irrodl.org/index.php/irrodl/article/view/2448/3655

Vygotsky, L. S. (1978). Mind in society: The development of higher psychological processes. Cambridge, MA: Harvard University Press.

Watson, W. R., & Watson, S. L. (2017). Principles for personalized instruction. In C. M. Reigeluth, B. J. Beatty, & R. D. Myers (Eds.), Instructional-design theories and models (pp. 93-120). Hillsdale, NJ: Routledge.

Xu, D., Huang, W. W., Wang, H., & Heales, J. (2014). Enhancing e-learning effectiveness using an intelligent agent-supported personalized virtual learning environment: An empirical investigation. Information & Management, 51(4), 430-440. doi: 10.1016/j.im.2014.02.009

Yuan, L., Powell, S., & Olivier, B. (2014). Beyond MOOCs: Sustainable online learning in institutions - A white paper. Cetis - Centre for Educational Technology, Interoperability, and Standards. University of Bolton, UK. Retrieved from http://publications.cetis.org.uk/wp-content/uploads/2014/01/Beyond-MOOCs-Sustainable-Online-Learning-in-Institutions.pdf

Zhu, M., Sari, A., & Lee, M. M. (2018). A systematic review of research methods and topics of the empirical MOOC literature (2014-2016). The Internet and Higher Education, 37, 31-39.

 


Pushing Toward a More Personalized MOOC: Exploring Instructor Selected Activities, Resources, and Technologies for MOOC Design and Implementation by Curtis J. Bonk, Meina Zhu, Minkyoung Kim, Shuya Xu, Najia Sabir, and Annisa R. Sari is licensed under a Creative Commons Attribution 4.0 International License.