International Review of Research in Open and Distributed Learning

Volume 20, Number 4

October - 2019

 

Maturity Levels of Student Support E-Services Within an Open Distance E-learning University

 

Asteria Nsamba
University of South Africa

 

Abstract

The University of South Africa (UNISA) is one of the distance education universities that are shifting from open distance learning (ODL) to open distance e-learning (ODeL). UNISA started as a correspondence institution in the 1950s and has since evolved into an ODeL university. The aim of this research was to assess and determine the maturity levels of UNISA lecturers' and tutors' explorations of various forms of e-learning technologies to support students in an ODeL environment. Semi-structured interviews were conducted with 12 academic staff members. A hybrid approach involving inductive and deductive reasoning was used to guide the research process. The online course design maturity model (OCDMM) was modified and adapted to guide data collection, data analysis, and the interpretation of results. The results of the study indicate that the maturity levels of UNISA's student support e-learning technologies are at the basic levels of the maturity assessment framework for open distance e-learning. It is hoped that the results of this research will serve as a starting point that the University can use to measure improvements made in advancing e-learning activities.

Keywords: e-learning, maturity assessment, open distance education

Introduction

The expansion of open universities worldwide has provided access to many students who need higher education qualifications. This expansion is attributed to the open university movement, which began with the establishment of the first distance education (DE) universities, namely the University of South Africa (UNISA) and the UK's Open University (Tait, 2008). The most common characteristics among open universities are their provision of education through distance learning, hence the name open distance learning (ODL), as well as their open, flexible, and accessible offerings. The development and impact of modern technologies have led to an increase in the adoption and utilisation of these technologies to support students' learning. According to Bernath, Szucs, Tait, and Vidal (2009), modern technologies “are becoming standard elements of institutional practice” (p. xi), hence the adoption of e-learning in ODL universities.

UNISA is one of the ODL universities shifting from ODL to open distance e-learning (ODeL). From its beginnings as a correspondence institution in the 1950s, it has evolved into an ODeL university. This evolution has occurred through five generations of DE (Taylor, 2001), namely (a) print technology (correspondence); (b) multimedia (e.g., videos, CDs); (c) computer-mediated communications (including videoconferencing); (d) Internet-based resources; and (e) online interactive multimedia. Although it is difficult to identify the end or beginning points of these generations because they overlap (de la Pena-Bandalaria, 2007), UNISA's student support services have evolved along with them, hence the term student support e-services. This term refers to the provision of a variety of information and communication technology resources and instructional methods to help students succeed in their studies. The evolution of student services is necessary because DE requires various technologies to minimise geographical and pedagogical gaps between students and lecturers (Moore, 1993).

According to Makhanya (2016), the shift to ODeL is intended to establish UNISA “as an African university leading the world in using all the technologies available in integrated ways so that technology is a means to an end, not the end itself” (p. 7). This aligns with Garrison and Anderson's (2003) advice that “when adopting new communication technologies with the potential to fundamentally alter the teaching and learning transaction, it is essential we think through our educational ideals” (p. 11).

UNISA's e-learning initiative started in 2013 with the launch of the University's integrated e-tutor model, a student support model that introduced an e-tutoring system for modules with a large number of students. E-tutoring is delivered mainly through an online learning management system (LMS) referred to as myUnisa, a learner-centred environment for synchronous and asynchronous learning interactions.

An e-learning LMS can be described as “a self-contained webpage with embedded instructional tools that permit faculty to organise academic content and engage students in their learning” (Gautreau, 2011, p. 2). LMSs, now ubiquitous in higher education (Anderson & Dron, 2017; Rhode, Richter, Gowen, Miller, & Wills, 2017), have various management tools, including course content, learning resources, announcements, examinations, and discussion forums. Mtebe (2015) notes that some institutions do not use all the tools available in their LMS and that many institutions in sub-Saharan Africa do not use the LMS at all, even after some training. Whilst some studies have found low levels of LMS use in higher education institutions (Olivier, 2016; Maboe, Nkosi, & Makoe, 2013; Mtebe, 2015), others have found high levels (Coleman & Mtshazi, 2017).

MyUnisa forms part of the University's student support system. Its management tools include (a) discussion forums to facilitate student-student and student-lecturer/tutor course interactions; (b) self-assessment; (c) additional resources; (d) module content; and (e) announcements. Lecturers and e-tutors are trained to be familiar with myUnisa tools. In the myUnisa platform:

There is no physical face-to-face component, although there could be a virtual face-to-face component. All interactions with staff and students, educational content, learning activities, assessment and support services are integrated and take place online. (CHE, 2014, p. 10)

This model shares characteristics with the UK Open University's supported open learning model. In addition to e-tutoring, UNISA has introduced fully online learning and teaching on myUnisa for certain modules. Lecturers and e-tutors (hereafter referred to as lecturers) are expected to (a) facilitate learning; (b) provide guidance on study material, timely feedback, and technical support; and (c) develop learning communities.

The myUnisa initiative has increased cognitive, social, and teaching presences (Garrison, Anderson, & Archer, 2000), thus maximising learning interactions. Could Nagel's (2009) hypothesis that most college students in the United States of America would be studying online by 2014 hold true for UNISA?

While this initiative is commendable, the extent to which UNISA lecturers explore various e-learning technologies, tools, and applications, and implement innovative approaches to support students' learning, is unclear and has not been fully researched. A study of online learning (Mbati & Minnaar, 2015) indicated that online facilitators were not using social media technologies to support and enhance students' learning. A more recent study (Ngubane-Mokiwa, 2017) indicated that some UNISA lecturers are reluctant to use modern technologies because “modern electronic technologies force traditionally-inclined lecturers out of the comfort zone of their customary familiar techniques and pedagogies” (p. 118). They “simply upload PDF versions of old learning materials onto myUnisa without providing any pedagogical support” (Ngubane-Mokiwa, 2017, p. 115). This is worrying because the foundation of e-learning involves adopting teaching and learning technologies as well as knowledge of appropriate pedagogical approaches (Mbati & Minnaar, 2015). Koehler, Mishra, Kereluik, Shin, and Graham (2014) observed that teachers lack the knowledge to incorporate technology into their teaching.

In addition, Haukijärvi (2014) noted that traditional face-to-face methods of university lecturing are applied to e-learning “in spite of the many advantages e-learning provides for distance education teaching purposes” (Guri-Rosenblit, 2005, p. 469). As well, it is unclear how e-learning delivery is sustained beyond myUnisa. A study examining LMS use at a US university (Rhode et al., 2017) observed that LMS use can reach a saturation point whereby there is less than 100% adoption and use of the system, despite its longevity. This is a concern because studies on myUnisa use (Olivier, 2016; Maboe et al., 2013) have recorded significantly low student participation. Olivier (2016) indicated that only 132 of 1,015 students registered in a compulsory one-year module participated in forum discussions; Maboe et al. (2013) reported that 53 students out of 1,379 participated.

This study sought to better understand UNISA lecturers' exploration and use of various e-learning technologies, tools, and applications to support their students in an ODeL environment. It focused on two objectives:

This study was part of a larger research project exploring the quality of student support services in ODL environments. Ethics approval was obtained from the Executive Director of UNISA's Research Department.

E-Learning

There is consensus in the literature that e-learning delivery relies solely on ICTs. According to Haukijärvi (2014) the “E” that signifies electronic has become an essential part of various domains within public and private sector institutions. Calli, Balcikanli, Calli, Cebeci, and Seymen (2013, p. 85) believe that e-learning “has gained traction in educational settings in recent years.”

E-learning requires the use of electronic media for a variety of learning purposes that range from “add-on functions in conventional classrooms to full substitution for face-to-face meetings by online encounters” (Guri-Rosenblit, 2005, p. 469). These interactions give students the opportunity to complete their courses successfully (Matoane & Mashile, 2013). E-learning is also described as a learning method and a technique for the presentation of academic curricula via the Internet or any other electronic media, including multimedia, compact discs, or other modern technologies (Du Plessis, 2017). Modern e-learning technologies and tools include WhatsApp, Facebook, Twitter, smartphones, e-mail, videos, and podcasts.

Studies on e-learning have explored and examined different forms of e-learning delivery, and most have found that social media improves students' engagement, collaboration, and interaction. da Cunha, van Kruistum, and van Oers (2016) used cultural historical activity theory (CHAT) to examine the use of Facebook in Brazilian face-to-face schools. They found that Facebook increases students' engagement and collaboration. Hamad (2017) explored students' experiences using WhatsApp as a supplementary method of enhancing English language skills. The respondents agreed that WhatsApp enriches vocabulary, develops speaking and writing skills, and enhances enthusiasm. This is consistent with earlier observations (Gunawardena, 1995, p. 164) that computer-mediated communication can promote interactive and collaborative learning if course moderators encourage the creation of online communities. Dickey (2010) conducted a qualitative study examining pre-service teacher education students' perceptions of using blogs and found that blogs helped prevent feelings of alienation and isolation for distance students.

Thomas, Briggs, Hart, and Kerrigan's (2017) study described the benefits of social media technology in community building efforts among first-year students. According to this study, social media was used to support different stages of transitioning into a new community. In contrast, a quantitative study by Owusu-Acheaw and Larson (2015) on the use and impact of social media on performance showed that the mobile phone with Internet capability can negatively affect students' academic work. However, the authors recommended that teachers should encourage students with such devices to use them for research purposes. In a similar quantitative study (Irwin, Ball, Desbrow, & Leveritt, 2012), 51% of students indicated that Facebook was an effective learning tool, 37% said it was ineffective, and 12% were not sure.

E-Learning Maturity Assessment

The concept of maturity assessment originates from the information technology (IT) and software industry. Neuhauser (2004) notes that researchers such as Watts Humphrey found that process improvement involves a sequence of steps instead of concurrent activities. This observation and line of thinking led to the development of the first maturity model within the software industry, followed by the subsequent development of various maturity models (Marshall, 2010; Marshall & Mitchell, 2002, 2003, 2006; Neuhauser, 2004; White, Longenecker, Leidig, Reynolds, & Yarbrough, 2003). Kohlegger, Maier, and Thalmann (2009, p. 51) defined maturity models as “popular instruments used to rate capabilities of maturing elements and select appropriate actions to take the elements to higher level of maturity.” Maturity, on the other hand, is “an evolutionary progress in the demonstration of a specific ability or in the accomplishment of a target from an initial to a desired or normally occurring end stage” (Mettler, Rohner, & Winter, 2010, p. 335).

Maturity models indicate levels of maturity, ranging from low to high. According to Neuhauser (2004) each level provides “a new foundation of practices on which subsequent levels are built” (p. 2). Therefore, for any learning programme to reach maturity, it should provide “learning opportunities not available at a lower level” (Neuhauser, 2004, p. 2).

For the purpose of this study Neuhauser's (2004) online course design maturity model (OCDMM) was chosen and modified because it supports effective applications of technologies appropriate for e-learning. This model is a tool for planning and evaluating online courses. It is based on a set of best practices and can be helpful in guiding institutions to better understand best practices, technologies, learning principles, and performance standards.

The OCDMM consists of five levels, moving from Level 1 (the initial level) to Level 5 (the integration of best practices). Each level has five key process areas (KPAs). A KPA can be described as a group of related activities organised by their common characteristics. Each KPA identifies a series of practices that, “when utilised as a group and built on the prior level, will potentially create an environment supporting increased student performance” (Neuhauser, 2004, p. 3).

Table 1

Online Course Design Maturity Model (OCDMM)

Key process areas: Components and appearance; Individualised and personalised; Use of technology; Socialisation and interactivity; Assessment

Level 5: Integrating best practices
Components and appearance: Develops learning objects; engaging; effortless navigation; intuitive; processes integrated and linked; multiple sensory input
Individualised and personalised: Resources supporting learning preferences; interactive learning material; electronic mentors; sensitive to cultural differences; self-regulated learning; learning objects matched to students' needs and interests; learning preference awareness
Use of technology: Extensive generation of Web links and resources; choices on path, practice, and community; provides integration of processes; blogs
Socialisation and interactivity: Community of learners; collaborative problem solving and critical thinking; social presences; alignment of learning preferences to practice
Assessment: Multiple assessments for student performance and course improvement; feedback for effective learning; multiple options for sharing knowledge; learning preference

Level 4: Strategising
Components and appearance: Learning objects to meet course goals; well-structured content; audio, video, animation; multimedia; attention-getting
Individualised and personalised: Learner-instructor partnership; learner-controlled links; private e-mail for faculty-student contact
Use of technology: Students filter, integrate, and disseminate knowledge from Web resources
Socialisation and interactivity: Student-generated discussion; students facilitate tasks and group maintenance; collaborative tools used; sensitive to students' needs
Assessment: Versatility of projects; peer review of work; student-instructor readiness for online work

Level 3: Awakening
Components and appearance: Lectures integrated with links and discussion; PowerPoint and HTML
Individualised and personalised: Primarily instructor-controlled; private e-mail with students
Use of technology: Discovery of Web resources; faculty and students comfortable with use of technology
Socialisation and interactivity: Instructor-controlled discussions; sensitive to students' participation; frequent contact
Assessment: Test pools; papers from students to instructor; student access to content management system (CMS)

Level 2: Exploring
Components and appearance: Notes online; blended course colours and fonts
Individualised and personalised: Instructor-controlled
Use of technology: Search engine, library databases
Socialisation and interactivity: If used, discussions are instructor-led
Assessment: Papers through e-mail

Level 1: Initial
Components and appearance: Syllabus, course information; all text
Individualised and personalised: Limited access, instructor-controlled
Use of technology: E-mail, minimal use of CMS
Socialisation and interactivity: E-mail
Assessment: None online

Using this model as a framework, a set of principles is proposed to assess the maturity levels of e-learning in ODL environments, considering UNISA as our context. This is consistent with Duarte and Martins (2013) who asserted that any approach aimed at assisting higher education institutions to improve their workflows should “take into account the special characteristics of such organisations” (p. 27). A systematic, three-stage process of constructing a maturity assessment framework for ODeL is described below.

Constructing a Maturity Assessment Framework for ODeL at UNISA

Stage 1. In line with Neuhauser (2004, p. 3), we used good practice (leading practice) principles as a foundation for constructing a framework with which to better understand e-learning maturity at UNISA. The following e-learning best practice principles drawn from the literature should be considered by universities and students:

Stage 2. Three KPAs were identified. The first relates to the LMS environment and includes all online and offline activities conducted on the LMS. The second KPA deals with the use of learning technological tools and applications; features under this KPA include all technological devices relevant to learning in an ODeL environment. The third KPA is online assessment on the LMS. We believe that these KPAs represent the key practices in an ODeL environment and, when performed collectively, can help the institution achieve its goals.

Stage 3. In all maturity models, KPAs must be assessed. For Stage 3, five maturity levels are proposed to assess the three KPAs. At Level 1, delivery is still at its lowest level of maturity and little technology is used. As the delivery matures in quality, additional leading practices are integrated until Level 5 (Neuhauser, 2004). This framework proposes that universities be at liberty to decide the level of maturity they are comfortable with. The five levels are explained more fully below.

Maturity Levels of the Maturity Assessment Framework for ODeL

Similar to the OCDMM, this model consists of five levels, moving from Level 1 to Level 5. At Level 1, online discussions are led by the lecturer/tutor, who also generates discussion topics. At this level, students can slowly be introduced to this role on the LMS. The use of e-mail and phone as the only means of communication is acceptable at this level. We propose that 50% of students' assessment take place online (on the LMS) and the other 50% by paper and pencil. This percentage distribution is in line with Pinto's (2012) view. Maturity levels are assigned according to the distribution between online and conventional approaches: at a basic level, online approaches range from 0% to 33%; at the intermediate level, from 34% to 66%; and from 67% to 100% is considered an advanced level.
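To make this banding rule concrete, the following short Python sketch (not part of the original study; the function name and example values are illustrative) applies the Pinto (2012) cut-offs described above to a given percentage of online activity.

def maturity_band(online_percentage: float) -> str:
    """Return the maturity band for a given share of online activity,
    using the 0-33% / 34-66% / 67-100% cut-offs described above."""
    if not 0 <= online_percentage <= 100:
        raise ValueError("percentage must be between 0 and 100")
    if online_percentage <= 33:
        return "basic"
    if online_percentage <= 66:
        return "intermediate"
    return "advanced"

print(maturity_band(25))   # -> basic
print(maturity_band(80))   # -> advanced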

At Level 2, online discussions are led by students, who also generate discussion topics and facilitate these discussions. The use of social media tools such as WhatsApp and Facebook is introduced to complement e-mail and phone for announcements, notices, and issuing reports. At this level, assessment is 80% online and 20% paper and pencil, and students also engage in self-assessment.

At Level 3, the use of social media tools (e.g., WhatsApp and Facebook) is phased in to enhance teaching and learning. Students review one another's work online. Students are encouraged to suggest resources relevant to their topics, activities, and modules. Communities of learning begin to form.

By Level 4, students are able to suggest Web links and other resources during their online discussions with the lecturers/tutors and among themselves. Different forms of assessment are introduced by the lecturers/tutors using all relevant technologies. Social media tools such as WhatsApp and Facebook are used to provide feedback and feedforward. Feedback is also shared online. Strong learning collaborations are evident at this level.

Finally, at Level 5, students' and lecturers' collaborations are well established. All learning tools, including the LMS, WhatsApp, Facebook, blogs, and podcasts, are regular features of teaching and learning. Multiple online assessments are established and implemented, and assessment is fully online. Table 2 shows the five levels of our maturity assessment framework for ODeL.

Table 2

Maturity Assessment Framework for Open Distance E-Learning (MAFODeL)

Key process areas: Use of LMS; Use of learning technological tools and applications; Online assessment via LMS

Level 1: Basic
Use of LMS: Syllabus, course information, study material; online discussions led by lecturer/tutor
Use of learning technological tools and applications: E-mail and phone only as means of communication
Online assessment via LMS: 50% online and 50% paper and pencil

Level 2: Novice
Use of LMS: Online discussions led by students, who also generate discussion topics and facilitate these discussions
Use of learning technological tools and applications: Social media tools (e.g., WhatsApp, Facebook, Twitter) complement e-mail and phone for communication only; not used for teaching and learning
Online assessment via LMS: 80% online and 20% paper and pencil; students' self-assessment

Level 3: Intermediate
Use of LMS: Lecturers' online facilitation is integrated with Web links and other references
Use of learning technological tools and applications: Social media tools (e.g., WhatsApp, Facebook, Twitter) used to enhance teaching and learning
Online assessment via LMS: Students review one another online

Level 4: Developing
Use of LMS: Students are able to include Web links and other resources during their online discussions
Use of learning technological tools and applications: Social media tools (e.g., WhatsApp, Facebook, Twitter) used to provide feedback
Online assessment via LMS: Feedback and feedforward shared online

Level 5: Advanced
Use of LMS: Students' and lecturers' learning collaborations are formed
Use of learning technological tools and applications: All of the above tools, including blogs and podcasts, used in teaching and learning
Online assessment via LMS: Feedback shared online on blogs; full online assessment

Methods

A qualitative methodology using semi-structured interviews and content analysis was employed. The population consisted of all the lecturers in UNISA's five colleges: (a) Education; (b) Human Sciences; (c) Economics and Management Sciences; (d) Science, Engineering, and Technology; and (e) Law. A combination of stratified and convenience sampling techniques was used to select participants who could be easily approached. The stratified sampling technique was used to divide the population into colleges to form strata. These strata represented common characteristics among the population of lecturers in the different colleges, including access to myUnisa and the library. According to Babbie (2016), stratified sampling addresses issues of representativeness in research whereby participants with similar characteristics are grouped together to help the researcher draw conclusions from different strata.
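As a rough illustration of how these two techniques combine, the following Python sketch (hypothetical data and field names, not the authors' actual procedure) first groups lecturers into strata by college and then retains the easily approachable members of each stratum as the convenience sample.

from collections import defaultdict

# Hypothetical population of lecturers, tagged by college and approachability.
lecturers = [
    {"name": "A", "college": "Education", "approachable": True},
    {"name": "B", "college": "Law", "approachable": False},
    {"name": "C", "college": "Law", "approachable": True},
    # ... one entry per lecturer in the population
]

# Stratified step: form strata by college.
strata = defaultdict(list)
for lecturer in lecturers:
    strata[lecturer["college"]].append(lecturer)

# Convenience step: within each stratum, keep those who can easily be approached.
sample = [l for stratum in strata.values() for l in stratum if l["approachable"]]
print([l["name"] for l in sample])  # -> ['A', 'C']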

Participants gave informed consent before taking part in the study. The survey tool was sent to participants via e-mail. The survey consisted of four sets of questions that were aligned with the MAFODeL. The first three sets of questions were objective, requiring participants to answer yes, no, or sometimes. The fourth set consisted of open-ended questions asking participants whether the tools they use to support learning meet their module objectives. They were also requested to explain how they use the identified tools. Of the 30 surveys sent via e-mail, 12 were returned correctly filled out.

Data Analysis

A hybrid approach of deductive and inductive reasoning was used to analyse the data. This approach ensured that emerging themes were grounded in the MAFODeL. The analysis was conducted in two phases. In the first phase, responses to the first three sets of questions were collated per question on an Excel spreadsheet to calculate percentage responses. In the second phase, responses to the fourth set of questions were transcribed and thematic analysis (Braun & Clarke, 2006) was employed. Themes were coded into categories corresponding to concepts from the MAFODeL. These categories were grouped under the following headings: (a) myUnisa LMS activities; (b) technological devices and tools; (c) learning strategies; and (d) online assessment. Tables 3 to 5 present the analysis of responses to the first three sets of questions.
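To illustrate the collation step, the following Python sketch (an assumption for illustration, not the authors' actual procedure; the respondent data are hypothetical) tallies yes/no/sometimes answers for a single survey item into rounded percentages of the kind reported in Tables 3 to 5.

from collections import Counter

def response_percentages(answers):
    """Return the percentage of each answer option, rounded to whole numbers."""
    counts = Counter(answers)
    total = len(answers)
    return {option: round(100 * count / total) for option, count in counts.items()}

# Hypothetical answers from 12 participants to one item; this pattern happens
# to reproduce the 83%/8%/8% split reported for myUnisa in Table 3.
answers = ["yes"] * 10 + ["no"] + ["sometimes"]
print(response_percentages(answers))  # {'yes': 83, 'no': 8, 'sometimes': 8}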

Table 3

Responses Regarding the Tools Used to Support Learning

Learning tools Yes No Sometimes
myUnisa 83% 8% 8%
Web resources/links 42% 58% 0%
CDs 0% 100% 0%
Blogs 0% 82% 18%
WhatsApp 42% 42% 17%
Facebook 0% 82% 18%
Mobile phone 50% 25% 25%
E-mail 100% 0% 0%
Videos 42% 58% 0%
Podcasts 25% 58% 17%

Table 4

Responses Regarding Use of Online Teaching/Tutoring and Learning Strategies

Strategies Yes No Sometimes No answer
Online discussions are integrated with links 42% 25% 25% 8%
Online discussions are led by lecturer/tutor 100% 0% 0% 0%
Online discussions are led by students 0% 50% 50% 0%
Students are comfortable with the use of technology 25% 8% 67% 0%
Students generate discussion topics 8% 33% 58% 0%
Students facilitate discussions 17% 33% 50% 0%
Students online collaborations are satisfactory 8% 50% 42% 0%
Students suggest Web resources and links 0% 50% 33% 17%
Choice of Web-based learning resources is informed by students' preferences. 25% 42% 17% 17%

Table 5

Responses Regarding Use of Online Assessment Strategies

Strategies Always No Sometimes No answer
Online assessment 42% 25% 33% -
Feedback is shared online 42% 8% 50% -
Student online self-assessment 25% 17% 58% -
Students review one another 8% 50% 33% 8%

Findings

The aim of this research was to assess and determine the maturity levels of lecturers' and tutors' explorations of various e-learning technologies within an ODeL environment. Although 83% of the survey respondents indicated that they use myUnisa, the responses to the open-ended questions showed that the announcements tool is the most frequently used tool on this LMS. Data also showed that most online discussions are initiated and led by tutors, with discussions rarely initiated by students. The survey indicated that fewer than half of the respondents use Web links, and students hardly ever provide links and other resources to broaden the sources of information shared amongst them. Among academics, 42% use Web links and videos, 25% use podcasts, and none use Facebook, blogs, or CDs. In addition, students' and lecturers' collaborations are yet to be developed or used within myUnisa. These findings corroborate Rhode et al.'s (2017) results, which indicated that the announcement tool was the most frequently used LMS tool at a US university (82.13%), and that the tools used least were discussions (21.22%) and Web links (29.88%). When asked to explain how the tools they used support learning and teaching, two participants in the study replied:

Yes, all of the myUnisa tools used in my modules support the objectives of the module. I use OERs, which give a more practical application for students compared to just the theory. Case studies are discussed in the discussion forums where students can answer and give their opinions on the case study so that the students can link this with the theory in the module. Students have Web links, for example the Website in their specific field and Unisa Library guide, where they can obtain extra resources on the module. The self-assessment tool is used by students to evaluate their knowledge on the content of the module but the self-assessments are not marked; this is only for self-study. Additional resources, for example a Powerpoint presentation and extra reading material are also available for students.
Yes, the ones (myUnisa and e-mail) that I use are effective, especially with e-tutor discussions. I also use announcements on myUnisa regularly to keep my students abreast of any new developments. As explained above, the myUnisa tool is very effective, and the announcements feature is handy to clarify teaching and learning matters.

Although myUnisa is the main pillar of e-learning at UNISA, a small but significant percentage of respondents (8%) indicated that they sometimes use it while another 8% indicated that they do not use it, as evidenced by the comment “No, we do exercises and assignments in class.” This response indicates that either some academics prefer physical tutorial support to myUnisa or they do not have access to it. There are also indications that the use of myUnisa and e-mail as learning tools may not be students' preference, as stated in these participants' comments:

Yes we do, the only challenge is students are still lagging behind in using the named tools.
Yes, some students benefit from such learning support tools. But there are those who for example, do not visit the announcement and/or discussion section on myUnisa, and thus they are not actively engaged in our discussions.
While students do not participate optimally, those that do, do benefit significantly.

When asked whether they use e-mail as a tool to support learning, 100% of the respondents indicated that they do. This is supported by the interview data:

I usually use these tools (myUnisa and e-mail) when I want to alert students on certain important information. After marking assignments, I usually give feedback to students using such tools.
With regard to the use of e-mail, students mainly use it to query their marks, and not much about content.

When asked whether online discussions are led by tutors or students, 100% of the respondents indicated that discussions are led by tutors and none indicated that they are led by students.

The issue of student-directed discussion scored low among respondents. For example, only 8% indicated that students generate discussion topics, and when asked whether students facilitate discussions using the mentioned electronic tools, only 17% replied affirmatively. As well, only 8% indicated that students' online collaboration is satisfactory. These scores indicate weak Level 2 characteristics and strong Level 1 characteristics. Elements of MAFODeL that indicate higher levels of maturity in the use of e-tools, such as student self-assessment using the available tools, online feedback and feedforward, and online assessment, together scored an average of 29%, indicating that the current methods and tools are not yet developed to higher levels.

The data clearly indicate that the use of e-learning tools remains largely confined to e-mail (100%) and, at times, the phone (50%). All respondents indicated that they do not use Facebook, CDs, or blogs to support learning. Although social media, Web links, videos, and podcasts are available to support students, they scored an average of only 36%, indicating below-average use. This performance exhibits strong characteristics of Level 1 on the MAFODeL.

Discussion and Recommendations

Based on the results, the use of e-learning technologies and tools at UNISA is still at Level 1 (Basic) of the MAFODeL. The results also depict weak characteristics of Level 2 (Novice) due to the low use of other communication technologies. According to Pinto (2012), maturity levels with percentages between 0% and 33% are considered basic. Even though myUnisa is the backbone of e-learning at the university, use of discussion forums is minimal. Students should be the focus of e-learning and support; therefore, online discussions should take place more amongst students than simply between students and the lecturer. The discussion tool should be used to deliver student-led activities and collaborations.

We recommend that other types of technologies be explored and adopted, as 100% use of myUnisa is unlikely. Cheaper options like CDs should be fully explored and social media should be adopted because these have great potential in enhancing learning.

Facilitating e-learning requires knowledge of appropriate pedagogical innovations. The literature indicates that many e-learning facilitators hardly use pedagogies related to the technologies they use; instead, they are comfortable with face-to-face methods. One participant in this study indicated that they do activities and assignments in class; this is surprising because UNISA is an ODeL institution and cannot accommodate all its students in classrooms. It is recommended that all UNISA lecturers and tutors be trained on the use of all myUnisa tools, modern e-learning technologies, and the application of relevant pedagogical approaches. myUnisa should be made available to all academics as a tool for learning and teaching, not only for announcements.

The results also indicate that the choice of e-learning resources is not informed by students' preferences. Preferences can be influenced by issues of affordability, so this should be considered. Data also indicate unsatisfactory levels of interaction and collaboration even though students are said to be comfortable with the use of technology. Makoe (2012) also found that UNISA students are comfortable with modern technology; generally, the literature has shown that mobile phones have become ubiquitous tools for students. E-learning platforms are characterised as flexible, cost-effective, and collaborative, and as allowing better access to tutors and learning resources (Garrison, 2011; Pantaziz, 2002; Zhang & Nunamaker, 2003), so the University should take advantage of these strengths.

Data show that assessment and feedback are rarely delivered online even though this facility is available. It is recommended that 50% of formative assessment be introduced online. The use of more of the available electronic tools should be encouraged to support both learning and teaching.

Conclusion

The impact of ICTs and global trends have compelled many ODeL universities to adopt modern technologies to support students' learning. To a large extent, student support has evolved through three or four generations of DE. Understanding the maturity levels of academics' explorations and applications of various e-learning technologies to support students will help universities determine the level of their own e-learning maturity. Founded on leading principles in the literature (Neuhauser, 2004), the maturity assessment framework for open distance e-learning (MAFODeL) developed in this study assesses the maturity levels of academics' explorations and applications of e-learning technologies and strategies. This tool helps determine the ability, consistency, quality, and sustainability of e-learning. Constant assessment of e-learning maturity is therefore recommended and MAFODeL has been found useful in this regard.

In conclusion, given that no maturity assessment on e-learning has been conducted before at UNISA, this assessment can serve as a starting point for the university to measure improvements made in advancing e-learning activities. This research focused only on tutors and lecturers whose role it is to implement e-learning activities. It is hoped that this scope will be expanded to other stakeholders such as students and UNISA management in order to get a holistic maturity measure that will be based on the views and experience of all stakeholders.

References

Anderson, T., & Dron, J. (2017). Integrating learning management and social networking systems. Italian Journal of Educational Technology, 25(3), 5-19. doi: 10.17471/2499-4324/950

Babbie, E. R. (2016). The basics of social research. Boston, MA: Cengage Learning.

Bernath, B., Szucs, A., Tait, A., & Vidal, M. (Eds.). (2009). Distance and e-learning in transition: Learning innovations, technology and social change. London: ISTE Ltd. doi: 10.1002/9781118557686.

Braun, V., & Clarke, V. (2006). Using thematic analysis in psychology. Qualitative Research in Psychology, 3, 77-101. doi: 10.1191/1478088706qp063oa

Calli, L., Balcikanli, C., Calli, F., Cebeci, H. I., & Seymen, O. F. (2013). Identifying factors that contribute to the satisfaction of students in e-learning. Turkish Online Journal of Distance Education, 14(1), 85-101.

CHE. (2014). Distance higher education programmes in a digital era: Good practice guide. Pretoria: CHE.

Coleman, E., & Mtshazi, S. (2017). Factors affecting the use and non-use of learning management systems (LMS) by academic staff. South African Computer Journal, 29(3), 31-63. doi: 10.18489/sacj.v29i3.459

da Cunha, F. R., van Kruistum, C., & van Oers, B. (2016). Teachers and Facebook: Using online groups to improve students' communication and engagement in education. Communication Teacher, 30(4), 228-241.

de la Pena-Bandalaria, M. (2007). Impact of ICTs on open and distance learning in a developing country setting: The Philippine experience. The International Review of Research in Open and Distributed Learning, 8(1), 1-7.

Dickey, M. (2010). The impact of web-logs (blogs) on student perceptions of isolation and alienation in a web-based distance-learning environment. Open Learning: The Journal of Open, Distance and e-Learning, 19(3), 279-291.

Duarte, D., & Martins, P. V. (2013). A maturity model of higher education institutions. Journal of Spatial and Organizational Dynamics, 1(1), 25-45.

Du Plessis, E. C. (2017). The voices of student teachers on the accessibility of e-learning initiatives in a distance education community of practice. Journal for New Generation Sciences, 15(1), 260-277.

Garrison, D. R. (2011). E-learning in the 21st century: A framework for research and practice (2nd ed.). London: Routledge. doi: 10.4324/9780203838761

Garrison, D. R., & Anderson, T. (2003). E-learning in the 21st century: A framework for research and practice. (2nd ed.) London: Routledge. doi: 10.4324/9780203166093

Garrison, D. R., Anderson, T., & Archer, W. (2000). Critical inquiry in a text-based environment: Computer conferencing in higher education. Internet and Higher Education, 2(2-3), 87-105.

Gautreau, C. (2011). Motivational factors affecting the integration of a learning management system by faculty. The Journal of Educators Online, 8(1). doi: 10.9743/JEO.2011.1.2

Gunawardena, C. (1995). Social presence theory and implications for interaction and collaborative learning in computer conferences. International Journal of Educational Telecommunications, 1(2/3), 147-166.

Guri-Rosenblit, S. (2005). 'Distance education' and 'e-learning': Not the same thing. Higher Education, 49(4), 467-493.

Hamad, M. M. (2017). Using WhatsApp to enhance students' learning of English language: Experience to share. Higher Education Studies, 7(4), 74-87.

Haukijärvi, I. (2014). E-learning maturity model—process-oriented assessment and improvement of e-learning in a Finnish University of Applied Sciences. In D. Passey & A. Tatnall (Eds.), Key competencies in ICT and informatics: Implications and issues for educational professionals and management: ITEM 2014, IFIP advances in information and communication technology (pp. 76-93). Potsdam: Springer. doi: 10.1007/978-3-662-45770-2

Irwin, C., Ball, L., Desbrow, B., & Leveritt, M. (2012). Students' perceptions of using Facebook as an interactive learning resource at university. Australasian Journal of Educational Technology, 28(7), 1221-1232.

Koehler, M. J., Mishra, P., Kereluik, K., Shin, T. S., & Graham, C. R. (2014). The technological pedagogical content knowledge framework. In J. M. Specter, M. D. Merrill, J. Elen, & M. J. Bishop (Eds.), Handbook of research on educational communications and technology (pp. 101-111). New York: Springer. doi: 10.1007/978-1-4614-3185-5_9

Kohlegger, M., Maier, R., & Thalmann, S. (2009). Understanding maturity models: Results of a structured content analysis. Proceedings of I-KNOW and I-SEMANTICS. Graz, Austria.

Maboe, K. A., Nkosi, Z. Z., & Makoe, M. E. (2013). Prospects and challenges of learners' on-line discussion forum interaction in an ODL (open distance learning) institution. Journal of Health Science, 1, 12-20.

Makhanya, M. S. (2016, June 29). University of Africa staff induction. Muckleneuk Campus, Pretoria. Retrieved from https://www.unisa.ac.za/static/corporate_web/Content/About/Executive%20management/Unisa%20Principal%20and%20Vice-Chancellor/documents/makhanya-staff-induction_8July2016.pdf

Makoe, M. (2012). Teaching digital natives: Identifying competencies for mobile learning facilitators in distance education. South African Journal of Higher Education, 26(1), 91-104.

Marshall, S., & Mitchell, G. (2002). An e-learning maturity model? Winds of Change in the Sea of Learning: Proceedings of ASCILITE Auckland, New Zealand. Retrieved from http://www.ascilite.org.au/conferences/auckland02/proceedings/papers/173.pdf

Marshall, S., & Mitchell, G. (2003). Potential indicators of e-learning process capability. Proceedings of EDUCAUSE in Australasia 2003 (pp. 1-8). Adelaide, Australia. Retrieved from http://www.utdc.vuw.ac.nz/research/emm/documents/1009anav.pdf

Marshall, S. J., & Mitchell, G. (2006). Assessing sector e-learning capability with an e-learning maturity model. In D. Whitelock & S. Wheeler (Eds.), Proceedings of the 13th International Conference of the Association for Learning Technology (ALT-C) (pp. 203-214). Edinburgh, Scotland. Retrieved from http://www.utdc.vuw.ac.nz/research/emm/

Marshall, S. (2010). A quality framework for continuous improvement of elearning: The learning maturity model. International Journal of eLearning and Distance Education, 24(1), 143-166.

Matoane, M. C., & Mashile, E. O. (2013). Key considerations for successful e-tutoring: Lessons learnt from an institution of higher learning in South Africa. Proceedings of E-Learn: 2013 World Conference on E-Learning in Corporate, Government, Healthcare and Higher Education (pp. 863-871). Las Vegas: Association for the Advancement of Computing in Education (AACE).

Mbati, L., & Minnaar, A. (2015). Guidelines towards the facilitation of interactive online learning programmes in higher education. International Review of Research in Open and Distributed Learning, 16(2), 272-287. doi: 10.19173/irrodl.v16i2.2019

Mettler, T., Rohner, P., & Winter, R. (2010). Towards a classification of maturity models in Information Systems. In A. D'Atri, M. De Marco, A. Braccini, & F. Cabiddu (Eds.), Management of the interconnected world. Berlin, Heidelberg: Physica-Verlag HD. doi: 10.1007/978-3-7908-2404-9_39

Moore, G. M. (1993). Theory of transactional distance. In D. Keegan (Ed.), Theoretical principles of distance education (pp. 22-38). New York: Routledge.

Mtebe, J. S. (2015). International learning management system success: Increasing learning management system usage in higher education in sub-Saharan Africa. Journal of Education and Development Using Information and Communication Technology, 11(2), 51-64.

Nagel, D. (2009, October 28). Most college students to take classes online by 2014. Campus Technology Online. Retrieved from https://campustechnology.com/articles/2009/10/28/most-college-students-to-take-classes-online-by-2014.aspx

Neuhauser, C. (2004). A maturity model: Does it provide a path for online course design? The Journal of Interactive Online Learning, 3(1), 1-17.

Ngubane-Mokiwa, S. A. (2017). Implications of the University of South Africa's (UNISA) shift to open distance e-learning on teacher education. Australian Journal of Teacher Education, 42(9), 111-124. doi: 10.14221/ajte.2017v42n9.7

Olivier, B. H. (2016). The impact of contact sessions and discussion forums on the academic performance of open distance learning students. International Review of Research in Open and Distributed Learning, 17(6), 75-88. doi: 10.19173/irrodl.v17i6.2493

Owusu-Acheaw, M., & Larson, A. (2015). Use of social media and its impact on academic performance of tertiary institution students. Journal of Education and Practice, 6(6), 94-101.

Pantaziz, C. (2002). Maximizing e-learning to train the 21st century workforce. Public Personnel Management, 31(1), 21-26.

Pinto, A. (2012). How to assess the maturity of a PMO. Paper presented at PMI® Global Congress 2012—North America, Vancouver, BC. Newtown Square, PA: Project Management Institute. Retrieved from https://www.pmi.org/learning/library/pmo-maturity-assessment-model-6079

Rhode, J., Richter, S., Gowen, P., Miller, T., & Wills, C. (2017). Understanding faculty use of the learning management system. Online Learning, 21(3), 68-86. doi: 10.24059/olj.v21i3.1217

Tait, A. (2008). What are open universities for? Open Learning: The Journal of Open, Distance and e-Learning, 23(2), 85-93. doi: 10.1080/02680510802051871

Taylor, J. (2001). Fifth generation distance education. e-Journal of Instructional Science & Technology, 4(1), 1-14. Retrieved from http://eprints.usq.edu.au/136/

Thomas, L., Briggs, P., Hart, A., & Kerrigan, V. (2017). Understanding social media and identity work in young people transitioning into university. Computers in Human Behaviour, 76, 541-553. doi: 10.1016/j.chb.2017.08.021

White, B., Longenecker, H., Leidig, P., Reynolds, J., & Yarbrough, D. (2003). Applicability of CMMI to the IS curriculum: A panel discussion. Paper presented at the Information Systems Education Conference (ISECON 2003), San Diego, CA.

Zhang, D., & Nunamaker, J. F., (2003). Powering e-learning in the new millennium: An overview of e-learning and enabling technology. Information Systems Frontiers, 5(2), 207-218. doi: 10.1023/A:1022609809036

 


Maturity Levels of Student Support E-Services Within an Open Distance E-learning University by Asteria Nsamba is licensed under a Creative Commons Attribution 4.0 International License.