International Review of Research in Open and Distributed Learning

Volume 16, Number 4

October 2015

Usability Evaluation of the Student Centered e-Learning Environment

 

Inas Sofiyah Junus, Harry Budi Santoso, R. Yugo K. Isal, and Andika Yudha Utomo

Faculty of Computer Science, Universitas Indonesia, INDONESIA

Abstract

Student Centered e-Learning Environment (SCeLE) plays a substantial role in supporting learning activities at the Faculty of Computer Science, Universitas Indonesia (Fasilkom UI). Although it has been utilized for about 10 years, the usability of SCeLE as an e-Learning system has not been evaluated; consequently, the usability of SCeLE Fasilkom UI as a learning support system is not yet well understood. Accordingly, the researchers needed to conduct a usability evaluation in order to propose a set of recommendations for improving SCeLE's usability, reflecting the experiences of both students and lecturers as users.

In the present research, usability testing was conducted for SCeLE, targeting the learning activities undertaken by undergraduate students at Fasilkom UI in the form of blended mode online learning. The data collection stage of the usability testing was performed by distributing questionnaires to students and interviewing several lecturers and students. The collected data were then analyzed and interpreted in order to identify usability problems and solution alternatives. The quantitative data were analyzed using the mean as reference, while the qualitative data were analyzed using theme-based content analysis. Data interpretation was performed by determining how to handle each kind of data based on its theme, and by classifying each identified usability problem according to its severity rating.

The recommendations were constructed from the solution alternatives derived from the analyzed data, supported by a literature study. The present research yields seven main recommendations and one extra recommendation. The main recommendations are solutions to tackle the identified usability problems, while the extra recommendation is not directly related to any of the identified usability problems but is considered to have potential to improve SCeLE's usability.

Keywords: Usability Testing, SCeLE, Student-Centered Learning, Learning Management System, Evaluation.

Introduction

Nowadays, Indonesian higher education institutions have been gradually improving their capabilities. One of the ways they are achieving this is by increasing the number of enrolled students annually. The increasing number of students has prompted universities to continually improve their learning facilities in order to keep learning outcomes in balance. The Faculty of Computer Science, Universitas Indonesia, has been pioneering the implementation of the Moodle-based online learning system Student Centered e-Learning Environment (SCeLE) since 2005 to enhance the quality of learning.

SCeLE has been used by the Faculty to support regular teaching and learning activities in the classroom. Students are encouraged to be active, both in the classroom and in the online learning system. They are motivated to be active in class through discussions, case studies, or group tasks. Some assignments are designed to be completed in groups, so students deepen their understanding by interacting with their peers. Outside of class time, SCeLE provides an environment for teachers and students to communicate regardless of the barriers of time and place. Discussion forums allow students to continue their discussions after class, at any place and time, while lecturers or facilitators can monitor the discussion and keep it on track when needed. Under these conditions, the learning process is centered on the students themselves, thus fostering a fuller understanding of the learning materials.

However, even though SCeLE has been in use for approximately ten years, the system has not been tested for usability; it is therefore unknown how well students and lecturers use the system and what kinds of improvements should be implemented.

This research focuses on solving the following problems:

1. To find out the learning experiences of students and lecturers in SCeLE;

2. To find out the aspects of SCeLE that should be preserved;

3. To find out the aspects of SCeLE that need to be improved, along with the steps needed.

Aside from answering those problems, this research was expected to achieve the following:

1. Providing chances for users to communicate their reflections on their experiences of using SCeLE;

2. Contributing to the improvement of e-Learning implementation in the Faculty;

3. Providing a reference for further development of SCeLE.

To accomplish these goals, the present research focuses on the main features in SCeLE that have a direct connection to the learning process: the course page and the front page. Moreover, the research focus was on the usage of the e-Learning system as learning support.

Relevant Literature Review

Student Centered e-Learning Environment

Online learning is an application of e-Learning where the learning process is carried out using internet access (Harasim, 2011). In its early days, e-Learning was focused only on technical aspects: researchers concentrated on discovering how to deliver learning materials effectively over the network. As the technology advanced, they realized that e-Learning was more than a technical matter; the pedagogical aspects had to be addressed as well. That realization shifted the e-Learning research paradigm from delivering materials to student-centered learning (Ehlers, 2009).

The main paradigm of student-centered learning is to shift the learning activity from passive to active learner participation; students are facilitated and encouraged by their instructor to construct knowledge. Instructors have to be aware that every single student is unique; that they have different styles of learning which need to be accommodated (McCombs & Whistler, 1997). This approach provides positive interactions among students and allows them to be appreciated and acknowledged. Students are given different ways of learning in accordance with their learning behavior (U.S. Department of Education, 2011).

SCeLE is a next-generation online repository of learning materials at the Faculty of Computer Science, Universitas Indonesia. SCeLE was developed using Moodle and therefore has features comparable to the original Moodle, such as course pages, online assignment submission forms, online quizzes, online discussion forums, and so on (Hasibuan & Santoso, 2005; Junus, Santoso, & Sadita, 2014). Moreover, SCeLE has some additional features developed by the Faculty, such as course graph integration, assessment, and a Learning Object Manager (Hasibuan, Santoso, & Hidayanto, 2007). The application of e-Learning in SCeLE adopted the concept of blended mode online learning, the use of e-Learning media to support in-class meetings (Harasim, 2011).

Usability Evaluation in e-Learning System

Usability is a representation of user interface quality and one of the main concerns in interaction system design. According to the International Standards Organization (ISO 9241), usability is the extent to which a system can be used by a user with effectiveness, efficiency, and satisfaction to accomplish a particular goal.

In the context of e-Learning, the usability issues that have become the focus nowadays are how to make students engaged with the system and how to make students interact with it (Ssemugabi & de Villiers, 2010). Usability evaluation in e-Learning covers the usability aspects themselves, enriched by interaction design concepts, pedagogical effectiveness, learning content, and the level of support available to learners. Therefore, usability evaluation of an e-Learning system should focus on the processes that the system supports. This differs from task-based conventional usability evaluation, which focuses only on the product of the interaction between users and systems (Ssemugabi & de Villiers, 2010).

The usability testing method utilized in this research was a user-based survey, which treats users as the evaluation participants. According to Dix, Finlay, Abowd, and Beale (2004), Ardito et al. (2004), and the Miami University of Ohio (2004), this is the best method to identify usability problems. Common techniques in this method are questionnaires, individual interviews, group interviews, and focus group discussions. The questionnaire technique is excellent in terms of wide data range and the possibility of attracting large numbers of participants (Ssemugabi & de Villiers, 2010), while its weakness is a lack of flexibility. To overcome that problem, a questionnaire is often followed by a number of interviews. Interviews can be productive since the researcher can dig up specific information from the respondents (Ssemugabi & de Villiers, 2010). In addition, if further information is needed, focus group discussions can be formed.

Usability Factors in e-Learning System

The usability factors observed in this research were based on the eight factors previously studied by Zaharias and Poylymenakou (2009). The following are explanations of the factors applied in this research.

1. Content: This factor consists of the languages and terms used, learning and supporting materials, and any other information in the system.

2. Learning and Support: This aspect relates to the features that have a direct connection to the delivery of learning materials and academic discussions, as well as learning assessment done in the system.

3. Visual Design: This factor consists of the convenience and ease of understanding the interface, including layout, color, font, and images.

4. Navigation: Browsing activities on the website and feature utilization are the aspects in this factor.

5. Accessibility: This aspect covers access to website pages and features.

6. Interactivity: This factor consists of all forms of communication in the learning context that are facilitated by the system.

7. Self-Assessment and Learnability: This factor consists of: (1) the independent assessment aspects facilitated by the system; and (2) the capability of the system to help users learn to utilize it effectively.

8. Motivation to Learn: The ability of the system to support and engage students' motivation to learn.

This research uses an approach in which students not only have to learn but also have to interact with the application (Smulders, 2003). Figure 1 below illustrates the framework for usability in an e-Learning system.

The usability factors used as a reference in the e-Learning system usability evaluation are designed to accommodate all parameters in the framework above. Zaharias and Poylymenakou (2009) started their research by collecting and reviewing many e-Learning and website guidelines. Their preliminary study yielded 90 questionnaire items divided into 13 factors, each factor representing one usability criterion. These were then processed to select the 48 questionnaire items that represent the criteria of the eight factors above. The selection was done through a consistency analysis within each factor, followed by three pilot trials in the form of questionnaire-based data collection. Factor and reliability analyses were performed in each trial. The factor analysis used the eigenvalue associated with each item as the reference to eliminate or preserve that item. Questionnaire items that passed the factor analysis were then measured using Cronbach's alpha reliability analysis to find out the degree of consistency of the existing variables (Zaharias & Poylymenakou, 2009).
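
As a concrete illustration of the reliability step, the sketch below computes Cronbach's alpha for a block of Likert items belonging to one factor. This is a minimal sketch rather than the authors' actual analysis code, and the response matrix is invented purely for demonstration.

```python
import numpy as np

def cronbach_alpha(item_scores) -> float:
    """Cronbach's alpha for a (respondents x items) matrix of Likert scores."""
    scores = np.asarray(item_scores, dtype=float)
    n_items = scores.shape[1]
    item_variances = scores.var(axis=0, ddof=1)       # variance of each item
    total_variance = scores.sum(axis=1).var(ddof=1)   # variance of the summed scores
    return (n_items / (n_items - 1)) * (1 - item_variances.sum() / total_variance)

# Illustrative data: 5 respondents rating 3 items of one factor on a 1-5 scale.
responses = np.array([
    [4, 4, 5],
    [3, 4, 4],
    [5, 5, 5],
    [2, 3, 3],
    [4, 3, 4],
])
print(round(cronbach_alpha(responses), 3))
```

Alpha values closer to 1 indicate greater internal consistency among the items of a factor.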

Theme-Based Content Analysis

According to Neale and Nichols (2001), Theme-Based Content Analysis (TBCA) is a method for processing information in the form of users' opinions and behavior in order to summarize the data collected from a user population by grouping the data into a number of meaningful categories. The method can be used with any form of qualitative data, such as interviews, open-ended questionnaires, and direct behavior observations. Its advantage is that the time and effort required are relatively small (Neale & Nichols, 2001). According to Neale and Nichols (2001), the steps of TBCA are:

1. Data collection: Collecting qualitative data. The method used in this step depends on the needs.

2. Data collation: Gathering the data and then grouping them by question or hypothesis in a simple matrix. Rows in the matrix hold the raw data in the form of participant responses, while columns represent summaries of the data themes. This matrix makes it easier for the researcher to re-observe and re-analyze the data.

3. Theme definition and classification: Each row is then grouped according to the raw data themes identified from participants' responses. The number of responses supporting each raw data theme is marked in the matrix.

4. Higher order theme selection: Higher order themes can be formed from the composed raw data themes. A higher order theme is a group of codes requiring a higher level of inference than a raw data theme. There can be more than one level of higher order themes as the level of inference increases. The number of responses supporting each higher order theme is also marked in the matrix.

5. Presentation of classification matrix: The raw data, raw data themes, and higher order themes are then compiled in a matrix whose columns are ordered by level of inference (a small illustrative sketch of such a matrix follows this list).
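
The sketch below shows one possible, purely illustrative shape of a tiny TBCA classification matrix in code; the responses and theme labels are invented examples, and the pandas representation is an assumption rather than the tooling used by Neale and Nichols (2001).

```python
import pandas as pd

# Illustrative TBCA matrix: each row is a raw participant response; the columns
# move from raw data to raw data themes to higher order themes (increasing inference).
responses = pd.DataFrame([
    {"raw_data": "The forum is hard to find on the course page",
     "raw_data_theme": "Navigation difficulty",
     "higher_order_theme": "Usability problem"},
    {"raw_data": "SCeLE is easy to access from my phone",
     "raw_data_theme": "Accessibility",
     "higher_order_theme": "Positive impression"},
    {"raw_data": "Please add a search feature for forum posts",
     "raw_data_theme": "Missing search feature",
     "higher_order_theme": "Suggestion"},
])

# Count the number of responses supporting each theme, as marked in the matrix.
theme_counts = (responses
                .groupby(["higher_order_theme", "raw_data_theme"])
                .size()
                .rename("n_responses"))
print(theme_counts)
```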

Wireframe

A wireframe is an interface illustration that focuses on space allocation and content priority; thus, in general, wireframes are simple, without many colors, images, or fonts. The wireframes created in this research are low-fidelity wireframes, simple wireframes that depict only the layout without detailed information or functionality. Wireframes can be created by hand or with software such as Balsamiq. Figure 2 shows a sample wireframe created using Balsamiq, the application used in this research.

Interface Design for e-Learning System

One of the aspects that affects a system's usability is the user interface. In an e-Learning system, the consideration of interface design principles is as important as the consideration of learning principles and concepts (Faghih, Azadehfar, & Katebi, 2013). According to Faghih et al. (2013), the factors that can affect the learning process are the working stimulus, the roles of working memory and long-term memory, multimedia resources, and availability. The stimulus and availability factors were used as references to build the recommendations in this research; both are strongly related to the system, while the other two are more connected to learning materials, which are outside the scope of this research. A learning stimulus can shape learning motivation and should therefore be implemented in a way that increases students' eagerness to learn, without resulting in fear of punishment or an excessive desire for appreciation (Faghih et al., 2013). Aspects of the system that could be developed to increase students' motivation include the curriculum, the layout and positioning of learning materials, learning features, the interface, and learning content delivery (Faghih et al., 2013). Examples of stimuli that could be applied are the use of informal communication, color variation in learning media, control given to the learner over the learning environment, and the implementation of sound and music. Availability is the system's ability to provide content every time it is needed. Availability improvements can be made by adding search features, content tagging, and cloud computing (Faghih et al., 2013).

Methodology

Research Phases

This research was performed in five phases: (1) literature study; (2) usability testing (i.e., determination of the targeted user profiles, determination of the data collection method, determination of the data collection instruments, and data collection); (3) data analysis and interpretation; (4) determination of recommendations; and (5) report development.

Usability Testing Execution Design

This phase consists of usability test planning and the execution of that plan. Usability test planning consists of three steps based on the usability testing guideline by Rubin and Chisnell (2008), excluding the task-related steps. This approach was chosen because the test was designed to focus on the learning process facilitated by the learning management system, unlike conventional usability testing, which focuses on the product in the form of tasks (Ssemugabi & de Villiers, 2010).

The heuristic factors of SCeLE examined in this usability testing were taken from the usability evaluation method for e-Learning created by Zaharias and Poylymenakou (2009): content, learning and support, visual design, navigation, accessibility, interactivity, self-assessment and learnability, and motivation to learn. These factors served as the reference for data collection, data analysis, and data interpretation.

User Target Profile

The target users in this research were lecturers and students who frequently use SCeLE. The lecturer respondents were chosen based on two criteria: having used SCeLE for at least one semester and not being a member of the SCeLE development team. The second criterion is important because lecturers involved in the development team would have more knowledge of the system than normal users, which would bias the testing (Miami University of Ohio, 2004). In addition, students from the Faculty of Computer Science at one of the reputable universities in Indonesia, who had at least one semester of experience using SCeLE, were invited to participate in the interview sessions.

Data Collection Procedure

The data were collected using questionnaires and interviews. The targets of the questionnaire were students, while the targets of the interview sessions were lecturers and students. Each method is explained below.

Questionnaire for students. The questionnaire consists of the eight factors explained in the previous section. Each factor consists of a set of questions, each representing one usability indicator. A 5-point Likert rating scale was used as the scoring criterion for each question (1 = Strongly Disagree, 2 = Disagree, 3 = Neutral, 4 = Agree, and 5 = Strongly Agree). Face validity of the questionnaire was checked by five eligible students to make sure that it contained no ambiguity (see Table 1). Cronbach's alpha coefficients for its eight factors ranged from 0.611 to 0.840 (Content = 0.611; Learning and Support = 0.795; Visual Design = 0.795; Navigation = 0.742; Accessibility = 0.686; Interactivity = 0.729; Self-Assessment and Learnability = 0.615; Motivation to Learn = 0.840). Although 30 participants were targeted to fill out the questionnaire as suggested by Nielsen (2012), 42 students agreed to fill it in.

Interview with students. A retrospective interview with students was needed after an overview of the problems had been captured. The interview was conducted to gather problem details, solution alternatives, and feedback regarding the proposed solutions. Five students were targeted for interviews, as Nielsen (2012) states that this number is enough to identify usability problems. Open-ended questions were used in the interviews, as they open up the chance to discuss other topics regarding the learning experience in SCeLE.

Interview with lecturers. The purpose of this interview was to gather information about the teaching experience with SCeLE and to find alternative solutions to the identified problems. The mechanism is similar to the student interview. Five lecturers were selected, and each of them was involved in a semi-structured interview. The difference is that some student-specific parameters were removed, such as the indicators on learning motivation and experience.

Data Analysis Method

Quantitative data. The quantitative data retrieved from the questionnaire were used to determine the problems related to each factor in SCeLE. The data consist of groups of scores on issues in the system, measured on a 1-5 Likert rating scale. The mean was calculated for each item (Boone & Boone, 2012). Criteria with an agreement level greater than or equal to 3.50 were considered acceptable aspects, while the others were considered unsatisfactory parts of the system, indicating important issues to be resolved (Marreez et al., 2013).
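
The sketch below illustrates this thresholding step on a handful of items taken from Table 1. It is only a demonstrative fragment under the stated 3.50 cut-off, not the authors' analysis script.

```python
import pandas as pd

# Illustrative subset of questionnaire items with their mean agreement scores (Table 1).
item_means = pd.Series({
    "Terms are used consistently": 3.81,
    "SCeLE is free from technical problems": 2.53,
    "Each topic contains overview and summary": 3.28,
    "Pages can be accessed within reasonable time": 3.85,
})

THRESHOLD = 3.50  # items below this agreement level are flagged for follow-up

flagged = item_means[item_means < THRESHOLD].sort_values()
print("Candidate usability problems:")
print(flagged)
```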

Qualitative data. The qualitative data were gathered from the open-ended questionnaire items and from the interviews with students and lecturers. The questions concerned their experience of using SCeLE as learning support. These data were analyzed using Theme-Based Content Analysis (TBCA), a method chosen for its strengths as recommended by the Miami University of Ohio (2004).

Questionnaire data. After the questionnaire data had been processed using both qualitative and quantitative methods, a solution identification analysis was performed by referring to the information given by respondents in the questionnaire, supported by a literature study. The interviews with students and lecturers were then conducted in light of the findings from the questionnaire data analysis.

Data Interpretation Method

Data interpretation in this research consists of determining the data themes that should be preserved or improved, the solution alternatives, and the aspects that were set aside in this research. This step was done by data grouping: the usability problems found in SCeLE were grouped by severity level: unusable, severe, moderate, and irritant (Rubin & Chisnell, 2008). Smaller numbers indicate problems with higher impact that need to be solved sooner than problems in higher-numbered groups (a small sketch of this grouping follows the list below). The categories are defined as follows:

  1. Unusable: Problems that make certain activities in SCeLE impossible, thus significantly disturbing users.
  2. Severe: Problems that occasionally prevent users from completing certain tasks, thus limiting users' activity and disturbing them.
  3. Moderate: Problems that cause a delay in some activities, or activities that can be completed but require extra effort from users.
  4. Irritant: Aesthetic problems, or problems that can be solved easily by most users, thus disturbing only a few or new users.
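
As a minimal sketch of this grouping and prioritisation, the code below assigns the four severity levels from Rubin and Chisnell (2008) to a few hypothetical problem descriptions (loosely echoing the recommendations later in the paper) and sorts them so the most impactful problems come first; the problem list itself is illustrative, not the study's actual findings.

```python
from enum import IntEnum

# Severity levels from Rubin and Chisnell (2008): lower numbers mean higher impact.
class Severity(IntEnum):
    UNUSABLE = 1
    SEVERE = 2
    MODERATE = 3
    IRRITANT = 4

# Hypothetical problem list used only to illustrate the prioritisation step.
problems = [
    ("Outdated interface is hard to use", Severity.SEVERE),
    ("Important elements buried low on the page", Severity.MODERATE),
    ("No tagging or search for forum posts", Severity.MODERATE),
    ("Colour theme cannot be personalised", Severity.IRRITANT),
]

# Address problems in order of severity (most impactful first).
for description, severity in sorted(problems, key=lambda p: p[1]):
    print(f"{severity.name:<9} {description}")
```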

Method Used to Develop Recommendations

Recommendations were constructed based on the results of data analysis and interpretation, supported by relevant literature. A usability problem can have multiple alternative solutions, and some solution alternatives may solve multiple problems. In such cases, solutions with a narrower scope or those that could also solve other problems were included in the recommendations.

The expected recommendations consist of recommendations for improving usability aspects with identified problems and recommendations for preserving the other aspects. The problem-solving recommendations were sorted by the severity level of the problem.

Results and Discussion

This section presents the results and discussion of the current study. Table 1 shows the mean and standard deviation of the questionnaire items. As mentioned above, the Likert rating scale of the questionnaire is 1 (Strongly Disagree) – 5 (Strongly Agree).

Table 1

Mean and Standard Deviation of the Questionnaire Items (n = 42)

Questionnaire item: Mean (SD)

Content

•      The terms applied throughout SCeLE are used consistently: 3.81 (0.68)
•      Concepts in abstract lectures are concretely illustrated by relevant examples: 3.40 (0.90)
•      The materials are accurate and up-to-date: 3.60 (0.85)
•      The lectures provide you the chance to reflect on your learning process: 3.72 (0.71)
•      The vocabulary and terminology used are appropriate for you: 3.62 (0.68)
•      Each topic contains an overview and summary: 3.28 (0.97)
•      The learning outcomes are easy to understand: 3.32 (0.78)
•      The system layout is easy to understand intuitively: 3.60 (0.88)

Learning and Support

•      The Web course provides learners with opportunities to access extended feedback from instructors, experts, peers, or others through e-mail or other Internet communications: 3.38 (0.74)
•      Feedback given at any specific time is tailored to the content being studied, problem being solved, or task being completed by the learner: 3.40 (0.85)
•      SCeLE provides opportunities for self-assessment that advance learner achievement: 3.68 (0.66)
•      SCeLE provides appropriate facilities to support your learning process: 3.83 (0.56)
•      Assessment features in SCeLE are easy to use and effective in helping to understand the materials: 3.53 (0.86)
•      SCeLE gives you support to apply your knowledge while learning new things: 3.51 (0.75)
•      SCeLE provides a good environment to discuss and collaborate to fulfill your academic needs: 3.38 (0.97)
•      SCeLE provides facilities to accommodate individual and group learning: 3.36 (0.87)

Visual Design

•      The most important information on the screen is placed in areas most likely to attract your attention: 3.43 (0.80)
•      Texts and graphics are easy to understand: 3.79 (0.62)
•      Fonts (style, color, saturation) are easy to read in both on-screen and printed versions: 3.74 (0.74)
•      Information in help and documentation features is clearly written: 3.51 (0.93)

Navigation

•      You can decide which parts of the course to access, the order, and the pace: 3.64 (0.70)
•      You can control your learning activities: 3.30 (0.99)
•      You always know where you are in SCeLE: 3.74 (0.74)
•      SCeLE allows you to leave whenever desired but easily return to the closest logical point in the course: 3.96 (0.78)
•      Learning units and modules are self-contained enough that you can take them out of sequence without becoming confused: 3.62 (0.87)
•      You always know what to do in SCeLE when you face learning difficulties: 3.43 (0.90)

Accessibility

•      The pages and other components of the SCeLE could be accessed easily with reasonable time required: 3.85 (0.83)
•      SCeLE is easy to access from any platform: 3.81 (1.01)
•      SCeLE is free from technical problems: 2.53 (0.97)

Interactivity

•      SCeLE provides facilities to make the learning process more engaging and motivating: 2.96 (0.66)
•      SCeLE provides access to a range of resources (Web links, case studies, simulations, problems, examples) appropriate to the learning context: 3.45 (0.77)
•      SCeLE engages learners in tasks that are closely aligned with the learning goals and objectives: 3.68 (0.66)
•      Graphics and multimedia assist in noticing and learning critical content rather than merely entertaining or possibly distracting learners: 3.62 (0.79)

Self-Assessment and Learnability

•      You can predict the general result of clicking on each button or link: 3.70 (0.59)
•      Learners can get started taking a course in SCeLE using only online assistance: 3.60 (0.88)
•      You can easily understand the purpose of using SCeLE in the learning process: 3.79 (0.69)
•      Lectures in SCeLE give you chances to gauge your understanding against the given learning outcomes: 3.45 (0.80)
•      The exercise and grading facilities in SCeLE prepare you well to apply your knowledge in daily life: 3.13 (0.95)

Motivation to Learn

•      You feel that lectures in SCeLE are unique and fun: 3.34 (0.84)
•      Lectures in SCeLE encourage you to deepen your knowledge: 3.19 (0.68)
•      Lectures in SCeLE are enjoyable and interesting: 3.43 (0.65)
•      Lectures in SCeLE provide learning instruction and help that match your experience: 3.47 (0.75)
•      Lectures in SCeLE help you fulfill your learning needs: 3.81 (0.71)
•      SCeLE helps you in making academic decisions regarding your study: 3.60 (0.80)
•      Lectures in SCeLE provide frequent and various learning activities that increase learning success: 3.43 (0.77)
•      Learning prerequisites and success criteria are well explained: 3.34 (0.81)
•      Lectures in SCeLE provide you chances to apply your newly acquired knowledge in real situations: 3.15 (0.72)
•      Lectures in SCeLE assist you to have positive feelings about your accomplishments: 3.19 (0.77)

Note: The version of the questionnaire used in this study is in Indonesian. The questionnaire for lecturer respondents is similar, with some adaptations.

The data gathered from the questionnaire and from both the lecturer and student interviews were combined, and the issues indicated by the quantitative data were also merged with the necessary adaptation. In this step, the higher-order themes of suggestions, usability problems, user problems, usability problems outside the research scope, and positive impressions were retrieved. Each higher-order theme is described below.

Positive Impression

This theme includes the positive responses to questions regarding aspects of SCeLE. It also consists of indicators with means greater than 3.50, for example, “the terms applied throughout SCeLE are used consistently” (mean = 3.81) and “the pages and other components of the SCeLE could be accessed easily with reasonable time required” (mean = 3.85). The excerpts below are examples of positive impressions from the interviews that were categorized in this theme:

“SCeLE Fasilkom UI makes interaction among students and between students and instructors easier. It also helps students to gather both formal and nonformal information.” (Student K12)

“It is (always) easy to access SCELE.” (Instructor D01)

Usability Problems

This theme consists of the identified usability problems in SCeLE that fall within the scope of this research. Questionnaire indicators with means below 3.50 that relate directly to SCeLE were also classified into this theme, for example, “the Web course provides learners with opportunities to access extended feedback from instructors, experts, peers, or others through e-mail or other Internet communications” (mean = 3.38). The excerpts below are examples of usability problems from the interviews that were categorized in this theme:

“Discussion activities are less effective than discussion conducted through face-to-face in classroom.” (Student K07)

“Although level of understanding could be assessed through quizzes by using SCELE, the efforts are not too often being conducted.” (Student K09)

“The SCELE user interface is too crowded, so it is quite hard to select important information.” (Instructor D04)

Usability Problems outside the Research Scope

This theme consists of identified usability problems in SCeLE that are not included in the scope of this research. Problems related to course content materials and curriculum are some of the issues classified into this theme.

User Problems

This theme consists of identified problems that relate directly to users’ behavior and preferences. Questionnaire indicators with means below 3.50 that relate directly to users’ behavior and preferences were included in this theme, for example, “SCeLE is free from technical problems” (mean = 2.53).

Suggestions

Feedback and improvement suggestions were included in this theme. Based on the summarized analysis data, the follow-up action for each item could be determined from its higher order theme: suggestions, usability problems, user problems, usability problems outside the research scope, or positive impressions. The aspects captured under usability problems are parts of the system that should be improved. Meanwhile, alternative solutions to SCeLE's usability problems were classified as suggestions that have a direct connection with those problems. The issues that arise in data with positive impressions were considered satisfactory aspects, which should therefore be preserved and maintained.

Below are the aspects that should be preserved and maintained:

•      Content: The terms applied throughout SCeLE are used consistently.

•      Learning and Support: Assessment features (e.g., grader, quiz, discussion forum) provided by SCeLE are easy to use and effective in helping learners understand the materials.

•      Visual Design: Fonts (style, color, saturation) are easy to read in both on-screen and in printed versions.

•      Navigation: Learners always know where they are in SCeLE.

•      Accessibility: The pages and other components of the SCeLE could be accessed easily with reasonable time required.

•      Interactivity: SCeLE engages learners in tasks that are closely aligned with the learning goals and objectives.

•      Self-assessment and learnability: Learners can predict the general result of clicking on each button or link.

•      Motivation to learn: SCeLE helps students in making academic decisions regarding their study.

Furthermore, the solution priority for the existing usability problems was determined by grouping the problems based on their severity. Based on this grouping, no usability problems in SCeLE were categorized as unusable, and most of the problems were classified as moderate. Data criticizing features not covered above were considered uncategorized.

Concerning the usability problems in the Motivation to Learn factor, the interviews indicate that students have a higher motivation to learn in SCeLE when there is positive encouragement from the lecturer and/or academic incentives. SCeLE itself was not considered to have a sufficient impact on students' learning motivation; therefore, further research on the curriculum and students' behavior is necessary. Below are the usability key points regarding Motivation to Learn that have not been achieved by SCeLE.

•      SCeLE is enjoyable and interesting.

•      SCeLE provides learners with frequent and various learning activities that increase learning success.

•      SCeLE incorporates novel characteristics.

•      SCeLE assists learners to have positive feelings about their accomplishment.

•      SCeLE stimulates further inquiry.

Recommendation for SCeLE Development

After the problems and their solution alternatives had been identified, development recommendations could be formed. The recommendations were based on the suggestions given by respondents as well as a literature study that helped identify the core of each problem and the solution alternatives that could be used to overcome it. The recommendations below were formed from the available alternative solutions.

Recommendation A – Severe

Regarding the severe problem, the recommendation is to upgrade Moodle to the latest version. This recommendation was chosen because it could overcome many usability problems related to views and ease of use; the latest version of Moodle also provides a better interface and performance.

Recommendation B – Moderate

The recommendations regarding the moderate problems include: positioning elements based on their importance; adding tagging and search features for documents and posts; adding recent-activity information to the course page; implementing user guide tooltips; and integrating SCeLE with the University Academic Information System. First, there were difficulties in accessing some important features, which can cause students to miss important information; this research suggests that elements that are more frequently needed and/or considered more important should be placed at the top of the page, while others should be placed below. Second, tagging and search features are important to keep information on track. Tag keywords could be topic information or course names, and forum posts and documents are the learning artifacts recommended to carry tag information, alongside a search feature. Third, based on the interviews, the recent-activity information can appear as notifications consisting of the latest unread forum posts, updates on assignment or quiz information, and updates on course materials. It is also recommended to add a subscription feature for particular discussion forums (see Figure 3 below).

Fourth, the user guide feature could be implemented as tooltips, simple explanations that accompany features. This will help beginner users get accustomed to SCeLE easily, so that they have more time to focus on their study instead. Fifth, integration with the University Academic Information System (UAIS) would enhance the course enrollment and scoring features and diminish the feature redundancy between SCeLE and UAIS, thus simplifying the scoring mechanism.
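
To make the second point above more concrete, the sketch below shows one possible, purely illustrative way to attach tag keywords to forum posts and filter them by keyword; the data structures and names are assumptions for demonstration only and do not describe SCeLE's or Moodle's actual implementation.

```python
from dataclasses import dataclass, field

@dataclass
class ForumPost:
    title: str
    course: str
    tags: set[str] = field(default_factory=set)  # e.g., topic or course-name keywords

# Hypothetical posts used only to demonstrate tag-based search.
posts = [
    ForumPost("Week 3 discussion: recursion", "Programming Foundations", {"recursion", "week-3"}),
    ForumPost("Quiz 2 clarification", "Linear Algebra", {"quiz", "eigenvalues"}),
    ForumPost("Group task grouping", "Programming Foundations", {"group-task"}),
]

def search_posts(keyword: str, items: list) -> list:
    """Return posts whose tags, title, or course name contain the keyword."""
    keyword = keyword.lower()
    return [p for p in items
            if keyword in p.title.lower()
            or keyword in p.course.lower()
            or any(keyword in t.lower() for t in p.tags)]

for post in search_posts("quiz", posts):
    print(post.title)
```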

Recommendation C – Irritant

The recommendation regarding the irritant problem is theme choice. Respondents' suggestions regarding the system's appearance vary considerably: while one student feels comfortable with a calm color theme, others may prefer a more energetic color. The recommendation is to let users configure the interface with a font and color theme according to their own preference. This is important because ease of use is one of the factors that contribute to the learning stimulus (Faghih et al., 2013).

Additional Recommendation

Another recommendation for SCeLE development is course materials archiving. All study materials should be available online so that students can access them easily. This recommendation is suitable for the courses with “stable” topics, such as mathematics, programming foundations, data structures, etc.

Further Research

Below are the recommendations for further research concerning SCeLE:

1. After the recommendations have been applied in the system, another round of usability testing should be performed. Iterative usability testing is important to detect as many problems as possible in order to enhance the system's usability (Miami University of Ohio, 2004).

2. The TBCA qualitative data analysis should be conducted by multiple researchers to yield higher reliability.

References

Ardito, C., Costabile, M., De Marsico, M., Lanzilotti, R., Levialdi, S., Plantamura, P., Roselli, T., Rossano, V., & Tersigni, M. (2004). Towards guidelines for usability of e-learning applications. In User-Centered Interaction Paradigms for Universal Access in the Information Society: 8th ERCIM Workshop on User Interfaces for All, pp. 185–202, Berlin: Springer-Verlag.

Balsamiq Studios, L. (2014). Balsamiq examples. Retrieved from: https://support.mybalsamiq.com/projects/examples/grid

Boone, H. N., & Boone, D. A. (2012). Analyzing Likert data. Journal of Extension, 50(2), 1-5. Retrieved from http://www.joe.org/joe/2012april/pdf/JOE_v50_2tt2.pdf

Dix, A., Finlay, J., Abowd, G., & Beale, R. (2004). Human-Computer Interaction (3rd ed.). Harlow, UK: Pearson Education Limited.

Ehlers, U. D. (2009). Web 2.0 - E-Learning 2.0 - Quality 2.0? Quality for New Learning Cultures. Quality Assurance in Education, 17(3), 296-314.

Faghih, B., Azadehfar, M. R., & Katebi, S. D. (2013). User interface design for e-learning software. The International Journal of Soft Computing and Software Engineering, 3(3), 786–794.

Harasim, L. (2011). Chapter two: Historical overview of learning and technology. Retrieved from: http://lindaharasim.com/sfu-courses/cmns-453/chapter-two-historical-overview-of-learning-and-technology/

Hasibuan, Z. A., & Santoso, H. B. (2005, July). The use of e-learning towards new learning paradigm: Case study student centered e-learning environment at faculty of computer science – University of Indonesia. Paper presented at the Fifth IEEE International Conference on Advanced Learning Technologies, Kaohsiung, Taiwan. DOI: 10.1109/ICALT.2005.279

Hasibuan, Z. A., Santoso, H. B., & Hidayanto, A. N. (2007). Penyelenggaraan e-learning sebagai layanan internal dan eksternal di level fakultas: Studi kasus Fakultas Ilmu Komputer Universitas Indonesia. Paper presented at the Teknologi Informasi dan Komunikasi untuk Indonesia. Retrieved from: http://dl2.cs.ui.ac.id/files/papers/eII2007/penyelenggaraan_elearning_eii_2007.pdf

Junus, K., Santoso, H. B., & Sadita, L. (2014). The use of collaborative and self-monitoring tools for linear algebra course in Student Centered e-Learning Environment. Paper presented at the 44th Frontiers in Education 2014, Madrid, Spain. DOI: 10.1109/FIE.2014.7044296. Retrieved from: http://ieeexplore.ieee.org/xpl/articleDetails.jsp?arnumber=7044296

Malamed, C. (2009). User interface design for e-learning. Retrieved from: http://theelearningcoach.com/elearning_design/user-interface/user-interface-design-for-elearning/

Marreez, Y. M. A., Wells, M., Eisen, A., Rosenberg, L., Park, D., Schaller, F., & Krishna, J. T. R. (2013). Towards integrating basic and clinical sciences: Our experience at Touro University Nevada. The Journal of the International Association of Medical Science Educators, 23(4), 595–606.

McCombs, B. & Whistler, J. (1997). The learner-centered classroom and school: Strategies for increasing student motivation and achievement. San Francisco: Jossey-Bass Publishers.

Miami University of Ohio (2004). Usability testing: Developing useful and usable products. Retrieved from http://www.units.miamioh.edu/mtsc/usabilitytestingrevisedFINAL.pdf

Moodle.org (2014a). About moodle. Retrieved from: http://docs.moodle.org/27/en/About_Moodle

Moodle.org (2014b). Moodle history. Retrieved from: http://docs.moodle.org/25/en/History

Neale, H., & Nichols, S. (2001). Theme-based content analysis: A flexible method for virtual environment evaluation. International Journal of Human-Computer Studies, 55(2), 167–189.

Nielsen, J. (2012). How many test users in a usability study? Retrieved from: http://www.nngroup.com/articles/how-many-test-users/

Rubin, J., & Chisnell, D. (2008). Handbook of usability testing: How to plan, design, and conduct effective tests (2nd ed.). Indianapolis, IN: John Wiley and Sons, Inc.

Ryan, G.W., & Bernard, H.R. (2003). Techniques to identify themes in qualitative data. Retrieved from: http://www.analytictech.com/mb870/Readings/ryan-bernard_techniques_to_identify_themes_in.htm

Smulders, D. (2003, February). Designing for learners, designing for users. ACM eLearn Magazine. DOI: 10.1145/640559.2134466. Retrieved from: http://dl.acm.org/citation.cfm?id=2134466

Ssemugabi, S., & de Villiers, R. (2010). Effectiveness of heuristic evaluation in usability evaluation of e-learning applications in higher education. South African Computer Journal, 45, 26-39.

U.S. Department of Education, Office of Vocational and Adult Education. (2011). Just Write! Guide. Washington, DC: American Institutes for Research. Retrieved from https://teal.ed.gov/documents/TEAL_JustWriteGuide.pdf  

Usability.gov (2014). Wireframing. Retrieved from: http://www.usability.gov/how-to-and-tools/methods/wireframing.html

Zaharias, P., & Poylymenakou, A. (2009). Developing a usability evaluation method for e-learning applications: Beyond functional usability. International Journal of Human–Computer Interaction, 25(1), 75–98.

© Junus, Santoso, Isal, and Utomo
