Volume 18, Number 7
Mik Fanguy1, Jamie Costley2, and Matthew Baldwin3
1,3KAIST, 2Kongju National University
Lecture videos have become an increasingly prevalent and important source of learning content. Lecturer-generated summaries may be used during a video lecture to improve student recall. Furthermore, the integration of a guest lecturer into the classroom may be a beneficial educational practice, drawing the learner's attention to specific content or providing a change of pace. The current study measures the effects of lecturer-generated summaries and the inclusion of a guest lecturer on students' ability to recall online video lecture contents. Seven sections of a flipped scientific writing course were divided into three groups. The control group videos featured a lecturer speaking with PowerPoint slides in the background. The Summaries Only group viewed the same videos as the control, with the addition of lecturer-generated summaries spliced into the middle and end of each video and delivered by the same lecturer as in the original video. The Summaries with a Guest Lecturer group viewed the same videos as the control, with summaries likewise spliced into the middle and end of each video, but delivered instead by a guest lecturer. Student recall was measured through two online multiple-choice quizzes. The results show that the Summaries Only group significantly outperformed the other two groups, while no significant difference was found between the performances of the control group and the Summaries with a Guest Lecturer group. The results suggest that lecturer-generated summaries help to improve student recall of online video lecture contents. However, the introduction of a guest lecturer shown in a different setting may cause learners to lose concentration, nullifying the benefit of the summaries.
Keywords: e-learning, flipped learning, guest lecturer, quizzes, summaries, video lectures
Flipped classrooms are becoming an increasingly prevalent part of the educational landscape (Breslow et al., 2013). Since flipped courses tend to rely heavily on online video lectures, it is necessary to examine how such videos can affect students' understanding and recall of course content. While online video lectures provide students with considerable advantages in terms of accessibility and the ability to tailor learning to their own pace, there are also valid concerns about the effectiveness of this form of instruction. Some major concerns are that students may not be fully engaged in listening to the lectures or may skip them altogether (Alksne, 2016), that students may experience a reduced sense of engagement compared to traditional classroom environments, and that video lectures may provide less pedagogical benefit than traditional lectures. In e-learning environments, videos and other media can help by keeping learners interested and engaged in the material (Zhang, Zhou, Briggs, & Nunamaker, 2006).
In flipped courses, lectures are typically provided online in video format rather than live in-class delivery. While classroom practices may vary, practitioners often recommend that class time traditionally devoted to in-class lectures should instead be devoted to more student-centered activities, particularly student-centered problem-solving activities (Ferreri & O'Connor, 2013; Kim et al., 2014; Mason, Shuman, & Cook, 2013; Prober & Khan, 2013; Wilson, 2013). These in-class group activities often require prior knowledge, which students are expected to gain from watching online lecture videos beforehand. In such cases, student engagement with course videos is important, not only from the perspective of gaining knowledge from the videos themselves, but also to maximize the learning opportunities presented through in-class collaborative problem-solving activities requiring prior knowledge from the videos.
A broad variety of methods are commonly used to improve student retention from lectures, in both video and traditional formats, including spoken organizational cues (Titsworth & Kiewra, 2004) and visual ancillary cues (Hirsch, 1987). Brecht (2012, p. 245) found that lecture videos with a "strong presentation of relief and change-of-pace elements," such as alterations in audio or video stimulus, background or setting modifications, changes in the camera angle, or dramatic transitions between various parts of the video, were more effective. Barker and Benest (1996) note that the addition of multimedia contents and animations is desirable since it prevents students from losing concentration. Furthermore, Costley and Lange (2017) showed that video lectures with a greater variety of media types led to students retaining more of the video contents. Other studies suggest periodically changing the pace of a lecture, for example, with a humorous remark, an activity, a question that must be answered, or audio/visual material (Padget & Yoder, 2008; Woodring & Woodring, 2011; Wolff, Wagner, Poznanski, Schiller, & Santen, 2015; Jones, Peters, & Shields, 2007). A drastic change-of-pace element that can be introduced into a video lecture is the appearance of a guest lecturer, which constitutes a type of team teaching. The present study attempts to assess the effect of integrating a guest lecturer into video lecture content within a flipped learning environment.
The level of learning in online environments is another important factor in understanding the effectiveness of video lectures. Quizzes are a commonly used component across all types of courses and are especially prevalent in flipped classroom environments, as quizzes that count towards final grades can be powerful motivators for students to actively watch video lecture contents in such learning environments (Frydenberg, 2013; Tune, Sturek, & Basile, 2013; Enfield, 2013). Quizzes have often been shown to be a good way to measure and motivate student learning (Johnson & Kiviniem, 2009; Herold, Lynch, Ramnath, & Ramanathan, 2012; Kamuche, 2011), and in predominantly or entirely online environments, such as MOOCs, online quizzes may be the only way to do so easily. In most courses, quizzes are connected to the contents of lectures and are seen as a measure of student learning of the course content and of lectures. Quizzes can, therefore, be seen as a representation of the amount that students learn and retain from the content. Video lectures have been shown to positively impact student performance on quizzes (Mason, Shuman, & Cook, 2013; Williams, Birch, & Hancock, 2012), although the effectiveness of these videos when compared to traditional lectures is an ongoing debate.
In order to increase student retention of lecture video contents, various methods are employed by both students and instructors. Smidt and Hegelheimer (2004) have shown that among the numerous strategies used by students when watching online videos, one of the most common and effective is to listen to the content again. A method that instructors use to improve student recall of lecture content is to provide cues when important information is being delivered. Instructors have many different means of delivering such cues to students. For example, the lecturer can write important ideas on the blackboard (Locke, 1977), present them visually (Baker & Lombardi, 1985), or simply emphasize them through the manner of speaking (Maddox & Hoole, 1975; Scerbo, Warm, Dember, & Grasha, 1997). Titsworth and Kiewra (2004) found that spoken cues increased student academic performance. Each of these methods has been shown to bring about an improvement in students' achievement. Repetition has also been shown to improve recall in learning situations. Mayer (1983) claimed that recall and problem-solving abilities increased the more that learning content was repeated. Bromage and Mayer (1986) found that repetition enabled students to recall more of what they heard and to remember a larger quantity of structurally and functionally important information, signaling a change in their learning strategies. Webb (2007) examined the effects of repetition on the language acquisition of Japanese students learning English and discovered that knowledge increased with increased repetition, as students who frequently encountered a new word in context were likely to have a deeper understanding of the word's meaning and function.
When providing students with repetition of important content in a course, instructors must also consider the pacing of such repetition. A large body of experiments conducted by cognitive psychologists has shown the benefits of spaced repetition of content compared to massed presentation (Cepeda, Pashler, Vul, Wixted, & Rohrer, 2006). Furthermore, distributed practice received one of the highest utility ratings in a comprehensive evidence-based review of a variety of learning strategies (Dunlosky, Rawson, Marsh, Nathan, & Willingham, 2013). It has also been found that providing students with a paraphrased version of the content produced an equivalent amount of recall as a verbatim repetition (Glover & Corkill, 1987). This paraphrasing can be seen, in some sense, as a summary of the previous content.
A common approach to the structuring and delivery of course material is to divide responsibilities among more than one teacher, and several terms are used in the literature to describe this concept, including cooperative teaching, team teaching, and co-teaching (Welch, Brownell, & Sheridan, 1999). The definition of co-teaching according to Cook and Friend (1996) is "two or more professionals delivering substantive instruction to a diverse, or blended group of students in a single space" (p. 156). Cook and Friend (1996) describe the following five styles of co-teaching. The first of these is called "one teaching/one assisting," which means that one instructor takes primary responsibility for the delivery of content, while the other plays a facilitatory role. One issue with this method is that the assisting teacher may be seen more as an assistant than an equal partner in instruction. "Station teaching" refers to teachers dividing presentation of the course content as well as the physical space of the learning environment, with each teacher managing a separate part of the classroom and the curriculum, with student groups alternating between different segments of the classroom. "Parallel teaching" is where teachers collaborate in planning course content but then divide the students into groups of equal sizes and are responsible for educating their own respective groups. "Alternative teaching" usually involves dividing a class into one large and one small group, with the smaller group receiving specialized instruction such as pre-teaching, directed practice, and review. In "team teaching," the instructors take turns in discussing course topics with students or in providing demonstrations through role play. As an alternative to co-teaching, an instructor may choose to bring in a guest lecturer, as the guest lecturer may have more expertise on a particular topic, or in some cases, provide students with a change of pace or a differing perspective. This can also be done in online videos, with cameos by guest lecturers.
It is often claimed that students' attention spans fall within the 10- to 20-minute mark (Davis, 1993). While the concreteness of this time-frame is debated (Wilson & Korn, 2007), the occurrence of such lapses is not (Bunce, Flens, & Neiles, 2010). This phenomenon is often called the "vigilance decrement" in the literature (Ariga & Lleras, 2011; Farley, Risko, & Kingstone, 2013; Risko, Anderson, Sarwal, Engelhardt, & Kingstone, 2012; Young, Robinson, & Alberts, 2009). To counter the negative effects that attention lapses may have on learning, switching things up or changing pace is recommended to ensure students remain focused (Center for Excellence in Teaching, 1999). One method of doing so is introducing a guest lecturer to maintain student interest in the lesson (Young, Robinson, & Alberts, 2009). While the literature on team teaching often refers to traditional rather than online or blended learning environments, studies have shown a variety of student benefits to team teaching, including exposure to a variety of viewpoints (Letterman & Dugan, 2004), improved achievement and retention of course contents (Johnson, Johnson, & Smith, 2000), improved communication skills (Helms, Alvis, & Willis, 2005), and better student-teacher relationships (Wilson & Martin, 1998). Interestingly, when Dugan and Letterman (2008) compared students' own perceptions of the value of team-taught courses to those of traditional courses taught by one instructor, they found no significant difference between them. Despite this finding, Dugan and Letterman did find that students preferred courses taught by two instructors rather than a panel of three or more instructors, and that the students preferred co-teaching, where each instructor attends and actively engages in teaching the class during each session, over an alternating style of instruction where each instructor would teach an entire class session on a turn-by-turn basis.
Quizzes have been shown to increase the long-term retention of academic lecture content as compared to restudying or not reviewing the contents (Roediger III & Karpicke, 2006; Butler & Roediger III, 2007). Marks (2015) points out that computer-based quizzes enable instructors to pinpoint if some or all students have been struggling with the lesson's content, allowing the lecturer to adjust instruction accordingly. For these reasons, quizzes can be a valuable component in flipped and blended classrooms where students are expected to perform computer-based tasks outside of the class and to engage in group work during class time.
Multiple-choice quizzes are pervasive in university-level courses (Roediger & Marsh, 2005). While they have sometimes been disparaged in the literature, multiple-choice quizzes are often viewed as a necessary evil since they provide instructors with an easy way to grade a large number of quizzes, particularly for large undergraduate courses, where short answer or essay exams may be too burdensome or unreliable (Little, Bjork, Bjork, & Angello, 2012). For the same reasons, multiple-choice quizzes are widely used in MOOCs and other e-learning environments in which large numbers of students are enrolled (Colvin et al., 2014). Commonly cited criticisms of multiple-choice quizzes include their limited ability to measure complex learning (Frederiksen, 1984), their inability to engage students in the type of retrieval processes that lead to long-term retention (Chan, McDermott, & Roediger, 2006; Foos & Fisher, 1988), and their tendency to cause misinformation, since studies have shown that simply exposing students to plausible incorrect answers causes them to judge such lures as more true than novel fact answers, though never quite reaching the rated truth levels of true statements (Roediger & Marsh, 2005; Toppino & Brochin, 1989; Toppino & Luipersbeck, 1993).
However, studies have also shown advantages to multiple-choice testing. For example, despite the aforementioned issue of the creation of false knowledge, Roediger and Marsh (2005) showed a substantial positive testing effect when students were given a multiple-choice test in preparation for a general knowledge exam. Little et al. (2012) found that properly constructed multiple-choice questions fostered the recall of previously tested information, where "proper construction" means that all answer choices are plausible, but not so plausible that the question becomes unfair. They further found that multiple-choice questions aided students in the recall of information related to incorrect answer choices, unlike cued-recall tests (Little et al., 2012). Multiple-choice questions also offer greater reliability in grading and provide students with a greater variety of questions in a shorter period of time than some other constructed response questions, such as essay exams (Walstad & Becker, 1994). A common criticism of multiple-choice questions is that testing factual knowledge does not guarantee competence, as high-level competence requires the integration of knowledge with attitudes and communication skills (McCoubrie, 2004). However, research has shown that knowledge of a subject area is the single best determinant of expertise (Glaser, 1984). Therefore, multiple-choice questions are a valid method of testing competence, as written test forms are the best assessment instrument of cognitive knowledge (Downing, 2002).
In a study by King (1992), university students were divided into three groups, one trained in self-questioning (and answering these questions), one trained in summarizing, and one untrained control group who simply took notes and reviewed them. Among the three groups, King found that the summary and self-questioning groups outperformed the control group in the recall of a lecture, with the summary group performing best in the short-term and the self-questioning group performing slightly better than the summary group in the long-term.
Adapting the concepts of co-teaching mentioned previously, Jang (2006) devised an experimental team-teaching approach in which two teachers specialized in teaching different parts of a science lesson, and were transferred to different classrooms at different times to provide instruction on that specialization. This approach was combined with a course website where students could access supplemental textbook information, discussion groups, and class notices as well as turn in course assignments. Jang found that students who were taught by this team-teaching web-based approach showed more improvement from their pretest to their posttest scores as compared to students who took traditional classes taught by one teacher with no web component (Jang, 2006).
Considering the need for developing lectures that are engaging and effective for learners, this study is guided by two research questions:

1. What effect do lecturer-generated summaries have on students' recall of online video lecture contents?
2. What effect does the delivery of such summaries by a guest lecturer have on students' recall of online video lecture contents?
In the present study, we sought to assess how change-of-pace and structural elements in online videos would affect student learning outcomes in a Scientific Writing course (CC500) at the Korea Advanced Institute of Science and Technology (KAIST) in Daejeon, South Korea. This Scientific Writing course is specialized compared to other academic writing courses offered at KAIST in that it is only offered for graduate students, and it focuses on producing and publishing research manuscripts in science and technology. This focus means that the video lectures cover numerous topics beyond traditional writing skills and include issues such as the function of particular sections of a manuscript and various conventions of scientific and engineering writing, as well as the process of publishing a paper. The course is offered in a flipped format, so that students and professors met once per week, and students also watched videos and took quizzes online for homework.
The present study examined 3 weeks (11 videos) among the total 16 weeks (56 videos) that constitute the entire course. The first batch of five videos were shown to the students in week 6 of the 16-week course and covered the following five topics: "Usage Rules for the Colon and Semicolon," "Usage Rules for the Hyphen," "How to Incorporate Numbers into Writing," "Avoiding Unclear Pronoun Usage," and "Paraphrasing." The second batch of six videos was shown in weeks 14 and 15 of the 16-week course and all applied to the theme of "Publishing Your Paper." This video series was designed to guide the students through the steps that follow the completion of a research manuscript, and the topics were as follows: "Selecting a Journal," "Open Access vs. Subscription Journals," "How to Get your Research Published," "The Peer Review Process," "Common Reasons for Rejection," and "Final Thoughts and Advice on Publishing Your Paper."
For the control group, the video series was prepared in a straightforward style with only one lecturer and no summaries, as shown in Figure 1. For the videos prepared for the "summaries only" group, all lecturing and summaries were delivered by the same instructor, as shown in Figure 2(a). The summaries were around 30 seconds in length and were shown at the middle and end of each video. In the mid- and end-point summaries, the lecturer would mention the main points from the first and second halves of each video, respectively. The video series provided to students in the "summaries with a guest lecturer" group contained more varied change-of-pace elements: each video contained mid and final summaries with identical content to those of the "summaries only" group, but the summaries were delivered by a guest lecturer who was also seated in a "coffee shop" setting projected in the background, as shown in Figure 2(b), rather than standing in front of PowerPoint slide contents, as in the videos of the "summaries only" group.
Figure 1. Main video lecture without summaries.
Figure 2. (From left to right) Summaries by the main lecturer (a) and guest lecturer (b).
Figure 3. Summary of the video timelines. The letters A-C correspond to the style of videos shown above.
A total of 135 students were divided into a control group and two treatment groups, with between 39 and 55 students in each group. The respective videos were posted on the school's learning management system in weeks 6 and 15 and were available to watch at the students' convenience. Once the videos were viewed for each week of the study, students took a multiple-choice quiz online, which they could access at any time during the seven-day video viewing period but which had to be completed by the respective brick-and-mortar classroom meeting day. The quiz was used to measure the students' comprehension and recall of the contents of the videos. During the respective brick-and-mortar classroom meeting, students were asked to fill out a survey that involved a 10-point Likert scale to assess the videos. Survey forms were assessed, and quiz data were taken from the online learning management system of the course.
This study was conducted at the Korea Advanced Institute of Science and Technology (KAIST), a large university located in Daejeon, South Korea. Most students at the university specialize in STEM fields. As of 2013, KAIST has a student population of 11,175, with about 60% of those studying at the graduate level (KAIST, 2014b). The university is predominantly male (80%), and international students make up 5% of the total population (KAIST, 2014b). KAIST offers an array of online and blended courses. These include Massive Open Online Courses (MOOCs) made available through Coursera, institution-level online courses through the CyberKAIST program as well as the Bridge-Program for prospective freshmen, and global and institutional-level flipped courses through iPodia and Education 3.0, respectively (KAIST, 2014a).
Participants in the present study were students in classes provided as part of KAIST's Education 3.0 initiative. The Education 3.0 initiative was started in 2012 with the goal of reducing the amount of traditional lecturing in KAIST courses and enabling students to engage in more interactive and communicative learning activities through a flipped classroom environment (Horn, 2014). KAIST performed a trial run involving four Education 3.0 courses in 2012. Students' performance on exams and overall satisfaction with the courses were higher than those in conventional lecture classes (Horn, 2014). In 2014, 5% of all classes at KAIST were offered in the Education 3.0 format, with a goal of increasing that number to 30% by 2018 (Horn, 2014). As a graduation requirement, master's and PhD students at KAIST must be able to compose articles for publication in scientific and engineering journals. To assist them in this process, the Scientific Writing (CC500) course teaches students how to communicate their research through writing in English. The course is conducted in English, and enrollment is generally 20 students per section. In the present study, seven sections of Scientific Writing were included in the experiment, for a total of 135 participants, 17 of whom did not complete the second quiz and so were removed from the analysis. This left 118 valid responses, of which 32 were from female students and 86 from male students. The oldest participant was 45, and the youngest was 22, with a mean age of 27. The seven selected sections of Scientific Writing were taught as part of KAIST's Education 3.0 program, so they were given in a flipped format.
Students took two quizzes, the first comprising 25 questions and covering information given in five video lectures, and the second comprising 30 questions covering six video lectures. Both quizzes were given online and could be taken at any point during the respective one-week video viewing period (weeks 6 and 15). The quizzes were made up of multiple-choice questions, with some questions allowing only one answer choice and others calling for one or more answer choices; examples of the questions can be seen in Table 1. For the latter question type, partial credit was awarded when a correct answer was selected, but no credit was given when an incorrect answer choice was selected. Quiz questions were created by the course instructors and were designed to make students demonstrate that they could apply the concepts taught in the videos. Each quiz was worth 5% of the class's total grade. While the effectiveness of multiple-choice quizzes is still being debated, they were an attractive choice for this course, as for most e-learning environments, since they present an opportunity to measure students' recall while allowing students to take quizzes anytime and anywhere (provided they have internet access and a device) with immediate feedback. The topics in the first batch of videos were related to rule-based concepts rather than communicative issues in writing, while the second batch provided descriptions and suggestions to guide students through the process of publishing their completed manuscripts in a journal after the conclusion of the course. We felt that such topics were appropriate for measurement via quizzes, while the quality of the students' writing and their ability to clearly communicate their research results were evaluated separately through instructor comments and grading.
Table 1
Sample Quiz Questions
Quiz and question numbers | Question | Choice of answers |
Quiz 3, Question 1 | Which of the following sentences is correct in terms of writing with numbers? Select one or more answers. | |
Quiz 3, Question 8 | Which of the following is true regarding paraphrasing? Select one or more answers. | |
Quiz 6, Question 20 | The aim and scope of a journal ______________. Select one or more answers. | |
Quiz 6, Question 28 | Which of the following is an advantage of a journal transfer? Select one or more answers. | |
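The partial-credit rule for "select one or more" questions can be made concrete. The sketch below is a hypothetical implementation, not the course's actual grading code, and assumes one plausible reading of the rule described above: each correct selection earns an equal share of the question's credit, and selecting any incorrect choice forfeits the question.

```python
def score_question(selected, correct, incorrect):
    """Partial-credit scoring for "select one or more" questions.

    Assumption (one plausible reading of the rule in the text):
    each correct selection earns an equal share of the point, and
    any incorrect selection forfeits credit for the question.
    """
    if selected & incorrect:  # an incorrect choice was selected
        return 0.0
    return len(selected & correct) / len(correct)

# Hypothetical example: two of three correct choices selected,
# no incorrect ones, so the question earns 2/3 credit.
print(score_question({"a", "c"}, {"a", "c", "d"}, {"b"}))  # 0.6666666666666666
```

An alternative reading, in which an incorrect selection only cancels one correct selection rather than the whole question, would change the first branch accordingly; the text does not specify which variant was used.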
The first point of analysis was to look at any relationships between the demographic variables of gender and age and the dependent variable. Using Spearman's point-biserial correlations for gender, and Spearman's standard bivariate correlations for age and quiz score, correlation coefficients were calculated and analyzed, as can be seen in Table 2 below. The results showed that the main dependent variable for the study (quiz score) had a small, non-statistically significant relationship with both age and gender. However, there was a statistically significant relationship between age and gender. The relationship between age and gender was -.199, the relationship between gender and quiz score was .011, and the relationship between age and quiz score was -.051. The likely reason for the relationship between age and gender is that Korean males are required to do two years of military service, which is usually done after their first or second year of university. These two years out of education mean that men in graduate school in Korea tend to be older than women.
Table 2
Correlations Between Gender, Age, and Grade
Gender | Age | Grade | |
Gender | 1 | ||
Age | -.199** | 1 | |
Quiz score | .011 | -.051 | 1 |
** p < .01
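The coefficients in Table 2 come from two standard correlation measures. As a minimal sketch (the statistical software used in the study is not named, so this is illustrative only), a point-biserial correlation is commonly computed as Pearson's r with the dichotomous variable coded 0/1, and Spearman's rho as Pearson's r applied to ranks:

```python
def _mean(v):
    return sum(v) / len(v)

def pearson_r(x, y):
    """Pearson product-moment correlation coefficient."""
    mx, my = _mean(x), _mean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

def point_biserial(binary, scores):
    """Point-biserial correlation: Pearson's r with one variable
    coded 0/1 (here, gender against quiz score)."""
    return pearson_r(binary, scores)

def spearman_rho(x, y):
    """Spearman's rank correlation: Pearson's r on the ranks,
    with tied values given their average rank."""
    def ranks(v):
        order = sorted(range(len(v)), key=lambda i: v[i])
        r = [0.0] * len(v)
        i = 0
        while i < len(order):
            j = i
            # extend j over a block of tied values
            while j + 1 < len(order) and v[order[j + 1]] == v[order[i]]:
                j += 1
            avg = (i + j) / 2 + 1  # average 1-based rank for the tied block
            for k in range(i, j + 1):
                r[order[k]] = avg
            i = j + 1
        return r
    return pearson_r(ranks(x), ranks(y))
```

For example, `point_biserial([0, 0, 1, 1], [1, 2, 3, 4])` (a hypothetical 0/1 gender coding against four scores) returns a positive coefficient because the group coded 1 has the higher mean.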
To answer the main research questions that were a part of this study, the mean quiz scores for each experimental group were calculated and compared (see Table 3). The condition with the highest mean quiz score was the condition with the summary only, followed by the summary with a different professor, with the basic lecture having the lowest mean quiz score. It is worth noting that both the basic lecture condition (5.789) and the summary with a different professor (5.827) had very similar means, while the condition that contained summaries only had markedly higher mean quiz scores (6.787).
Table 3
Mean Quiz Score for Each Experimental Condition
Mean | N | SD | Min | Max | |
Basic lecture | 5.789 | 46 | 1.35867 | 2.53 | 9.00 |
With summary | 6.787 | 37 | 1.68573 | 3.01 | 10.00 |
With summary and different professor | 5.827 | 35 | 1.44842 | 2.99 | 9.53 |
Total | 6.134 | 118 | 1.55056 | 2.53 | 10.00 |
After examining the quiz means for each of the three conditions, Analysis of Variance (ANOVA) was used to analyze the differences among and between the group means. The ANOVA tested whether there was a statistically significant difference between the basic lecture, lecture with summary only, and lecture with summary by a guest lecturer groups. Table 4 shows that there was a statistically significant difference between the three groups (p < .01).
Table 4
ANOVA Comparisons for Grades Within the Three Experimental Conditions
Sum of squares | df | Mean square | F | Sig. | |
Between groups | 15.812 | 2 | 7.906 | 3.633 | .005 |
Within groups | 287.250 | 132 | 2.176 | ||
Total | 303.063 | 134 |
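The quantities in Table 4 follow from the standard one-way ANOVA decomposition: the total sum of squares is split into a between-groups part (group means around the grand mean) and a within-groups part (scores around their own group mean), and F is the ratio of their mean squares. A minimal pure-Python sketch of that computation, not the software used in the study:

```python
from itertools import chain

def one_way_anova(groups):
    """One-way ANOVA over a list of groups of scores.

    Returns (SS_between, SS_within, F): the between- and
    within-groups sums of squares and the F ratio of their
    mean squares.
    """
    scores = list(chain.from_iterable(groups))
    n_total, k = len(scores), len(groups)
    grand_mean = sum(scores) / n_total

    # Between groups: spread of the group means around the grand mean
    ss_between = sum(len(g) * (sum(g) / len(g) - grand_mean) ** 2
                     for g in groups)
    # Within groups: spread of each score around its own group mean
    ss_within = sum((x - sum(g) / len(g)) ** 2
                    for g in groups for x in g)

    f_ratio = (ss_between / (k - 1)) / (ss_within / (n_total - k))
    return ss_between, ss_within, f_ratio
```

With three groups, the degrees of freedom are 2 between and N - 3 within, matching the structure of Table 4; the p value for F is then read from an F distribution.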
After the overall difference among the experimental groups was established, the Scheffe test was used to examine the specific differences between each pair of experimental group quiz means, and whether or not those differences were statistically significant. The Scheffe test is a single-step multiple comparison procedure designed to be applied to the set of estimates of all possible comparisons among means. Table 5 shows that the quiz mean of the lecture with summary group was .998 higher than that of the basic lecture group, while the lecture with a summary and different professor group had a quiz mean .960 lower than the lecture with summary group. Both of these differences were statistically significant. The quiz mean of the lecture with summary by a guest lecturer group was .038 higher than that of the basic lecture group, though this difference was not statistically significant.
Table 5
Scheffe Test for Comparisons of the Three Experimental Conditions
Basic lecture | With summary | With summary and different professor | |
Basic lecture | 1 | ||
With summary | .998* | 1 | |
With summary and different professor | .038 | -.960* | 1 |
* p < .05
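For a single pairwise contrast, the Scheffe procedure compares a statistic based on the squared mean difference against (k - 1) times the critical F value for (k - 1, N - k) degrees of freedom. The sketch below computes only the statistic; the critical value would come from an F table or a statistical library, and the group means and within-groups mean square are taken from Tables 3 and 4:

```python
def scheffe_statistic(mean_i, n_i, mean_j, n_j, ms_within):
    """Scheffe test statistic for the pairwise contrast mean_i - mean_j.

    The contrast is significant at level alpha when this statistic
    exceeds (k - 1) * F_crit(alpha; k - 1, N - k), where ms_within is
    the within-groups mean square from the ANOVA table.
    """
    return (mean_i - mean_j) ** 2 / (ms_within * (1 / n_i + 1 / n_j))

# "With summary" (mean 6.787, n = 37) vs. "basic lecture"
# (mean 5.789, n = 46), with MS_within = 2.176 from Table 4.
stat = scheffe_statistic(6.787, 37, 5.789, 46, 2.176)
```

The mean difference in this contrast, 6.787 - 5.789 = .998, matches the first entry of Table 5.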
A post-hoc survey was conducted to collect students' opinions on their experiences of the video lecture element of the course once the semester had ended; grades had already been allocated prior to the email messages being sent. These messages contained screenshots of the summaries each respective treatment group had encountered, to serve as a reminder, as well as a simple request for participants' feelings regarding the summaries; any general comments on the videos were also welcomed. Roughly equal numbers of critical and positive responses were logged.
Students from the two treatment groups generally perceived the summaries positively, noting they were helpful for reviewing and emphasizing the salient points of the lecture. Such summaries were seen to complement notetaking and were appreciated because they enabled a time to reflect and take stock of what had been explained so far. For weaker students, they were seen as a tool for clarifying understanding of the content. In addition, the appearance of a guest lecturer was said by some participants in the "summaries with a guest lecturer" group to have a "refreshing" function, as if it were a reminder to focus one's attention. One individual in particular mentioned how useful such appearances were for drawing one's attention back to the content when multitasking. Others claimed the summaries reduced the amount of pausing and (presumably) re-watching.
It was, however, felt (by members of both treatment groups) that while generally useful, summaries may be redundant for shorter videos or those that deal with straightforward content. There was a tendency to favor one final summary in such cases. Furthermore, a common complaint for the second treatment group was that the summaries were distracting as they broke up the structure and disturbed the flow of the lecture.
Of the 19 replies received, the following represent the spectrum of responses.
Redundancy.
"If the content was difficult, full of things to memorize, or a lot of additional explanation, then the mid summaries were very helpful. However, if the content is straightforward, I think mid summaries might be redundant." (T1)
"Mid summaries were such a time consuming point in the lecture in some moments in that it re-explained the content which were studied right before (even in few seconds)." (T1)
"It could make students distracted. I think a concise lecture with only end summary will be great to let student know what is the most important concept on that lecture and what should be remembered and studied." (T1)
Distraction.
"But I think there may not need the time countdown - It distracts my focus a little." (T2)
"I did think the coffee scene broke up the structure of the lecture." (T2)
Students as directors of learning.
It would be better to collect those summarizing slides and provide them as ppt/pdf. The reason is... if students already know some of the subsections of the lecture, they can skip those parts and dive directly into the subsections they don't know about. (T1)
If I were to think back and give you some recommendations, I think having an option of speeding up the videos will be great. I used to watch 3-4 videos in parallel because there was no speeding option. Perhaps because I was watching the videos intermingled together, I felt the need to focus more during the summary sections, but I think they are useful anyways because I doubt most students don't multi-task or think of what they are having for dinner when listening to the videos. (T2)
Help Clarify Content for Weaker Students.
Actually, because of some reasons (poor English skill, not concentrating on the subject, etc.), I couldn't understand the words quite many times. But because of the summaries you gave us, I could catch up on the lecture and understand what's going on at that time. (T1)
Focus Attention (during Attention Lapses).
"I used to get spaced out sometimes while watching videos, but the summarizing slides help me focus on the topics and keep track of the lectures." (T1)
"Appearance of the guest professors refreshed me to focus on the videos. Of coarse [sic], the summary itself helped me to memorize key points of the lectures." (T2)
"Sometime[s] it is easy to be distracted while watching the video so, summary gives a nice construct of the contents discussed." (T2)
Focus Attention on the Most Salient Points.
"When I was attained in Edu3.0 online lecture, I usually paused the video to understand unclear contents. However, these mid and end summaries reduced that, because these showed the key points of the talks." (T2)
"I enjoyed having some form of summary as this gave me an idea of the most important points of the lecture, giving me a focus on what to revise." (T2)
A Rest Point.
"Regarding the mid and end summaries, I think they were actually the best part of the video lectures. I appreciated having a stopping point in the lectures, and it was helpful having the summarized version in my head before taking the quizzes." (T1)
Assisted Note-Taking.
"Having summaries definitely coincided with my note taking formats and thus was very effective." (T1)
"Also, in my case, summary slides were useful for preparing the quiz and making a summary note." (T1)
Useful as a Recap.
"In particular, I like mid summaries because I can refresh in the middle of the lecture. Lectures are sometimes long, and sometimes I cannot remember all the contents of the lecture. The mid summaries help to understand the flow of content." (T1)
The students in the "summaries only" group scored an average of nearly one point higher than their counterparts in the control and the "summaries with a guest lecturer" group on the combined 10 points of the two 5-point quizzes given in weeks 6 and 15, respectively. A number of studies in the literature suggest that the use of lecturer-generated summaries, whether spoken or visual, increases learner retention of course content (Hirsch, 1987; Titsworth & Kiewra, 2004). The improved quiz scores of the "summaries only" group as compared to those of the control seem to support this notion. However, the usefulness of lecturer-generated summaries seems to be contradicted by the result that there was no significant difference between the scores of the "summaries with a guest lecturer" group and the control. A reason to expect that summaries with a guest lecturer would have outperformed the other two groups is that several studies have shown that the integration of a guest lecturer into a traditional classroom setting can be beneficial (Jang, 2006; Johnson et al., 2000; Letterman & Dugan, 2004). We expected this tendency to manifest in online course videos, but this did not occur.
One possible explanation for this result is that some aspect of instructor characteristics prevented an increase in retention among students who viewed the videos with a guest lecturer. Numerous studies have shown that unfamiliar accents may impede comprehension (Anderson-Hsieh & Koehler, 1988; Eisenstein & Berkowitz, 1981; Smith & Bisazza, 1982), and students often identify accents as an obstacle in understanding a lecture (Bilbow, 1989; Richards, 1983). In the present study, one of the instructors spoke standard American English, while the other spoke standard British English. Considering that the students also met their respective instructor once a week over a 16-week period during the semester for a workshop session in addition to watching the video lectures, it is plausible that the students developed familiarity with the accent and speaking mannerisms of their primary instructor. Perhaps the introduction of a guest speaker with an unfamiliar accent caused difficulties in listening comprehension.
A number of studies have recommended and also shown that change-of-pace elements help maintain students' attention and enable them to better recall course content (Brecht, 2012; Center for Excellence in Teaching, 1999). Because the videos shown to the "summaries with a guest lecturer" group contained a greater variety of presentation elements (e.g., the change of scene to a coffee shop setting, a 30-second stopwatch animation, and the presence of a guest lecturer), we expected the videos shown to the "summaries with a guest lecturer" group to be the most effective. The sudden change in the setting of the video from a familiar-looking PowerPoint slide background to an informal coffee shop scene may have seemed jarring to the students, causing an interruption to their concentration on the content of the video. One student from the "summaries with a guest lecturer" group described his feelings on the videos as follows: "The summaries cut the flow of the lecture. I was focused on the [content] of the video but the change of professor and background looked so complex. It was disturbing [sic]."
Kim et al. (2014) point out that increases in viewership and student lecture behaviors, such as pausing, seeking backwards and forwards, and re-watching videos, occur during particularly unclear or interesting segments of a video. As evidence that the visual design of a video can affect students' ability to understand the information covered, Kim et al. found that 61% of such periods of increased viewership and lecture behaviors occurred during an abrupt change in the background, such as shifting from a slide view to a classroom view. The researchers further suggest that video content creators refrain from including abrupt scene changes or excessive transitions, as these may cause students to feel confused or to lose the context. While we were unaware of this suggestion when designing the treatment videos and the present experiment, the aforementioned findings provide a plausible explanation for the reduced effectiveness of the videos in the "summaries with a guest lecturer" group compared to the "summaries only" group, and our findings seem to support those of Kim et al. (2014). While the videos of both of these groups contained mid- and end-point summaries of equal length (30 seconds each), only those of the "summaries with a guest lecturer" group contained scene changes (from a slide background to a coffee shop setting and back again); the "summaries only" group kept the same slide background throughout.
When considering these results, it is also clear that the present study has a number of limitations. Only two weeks of videos and quizzes were examined; examining more cases would have provided a more complete view of student outcomes. Furthermore, including a greater number of course sections of scientific writing, and thereby a greater number of participants, would have improved the reliability of the results.
Another limitation is that we were unable to get a clear sense of the guest lecturer effect from the data, since the "summaries by a guest lecturer" videos also contained abrupt scene changes. How would summaries given by a guest lecturer have compared with summaries given by the original lecturer if each had been presented with minimal scene change (e.g., keeping the same PowerPoint slide background)? A few students who responded to the email survey mentioned that they found the mid-point summaries jarring or that the summaries interrupted the flow of the lecture. One way to address these concerns would be to provide only end-point summaries, but the present study did not measure the effectiveness of such an approach. As students noted that summaries were useful for emphasizing the most salient points of the lecture (Table 8), and in keeping with Guo, Kim, and Rubin (2014), who suggest that video lecture designers should "facilitate skimming and re-watching," we propose that instructors mark the position of summaries on the video timeline bar (perhaps with a dot or vertical line). Such a feature would address the aforementioned complaint because students would quickly learn that the marker indicates an upcoming summary and could anticipate, review, or skip it accordingly.
Furthermore, in the present study, the guest lecturer spoke using a different type of English from the primary course lecturer (i.e., British vs. American English). Summaries delivered by the primary lecturer improved student recall, but those delivered by a guest lecturer who spoke a different type of English did not; however, it is unclear whether guest-lecturer summaries were ineffective due to the difference in the types of English being spoken between the two lecturers or if the mere presence of a guest lecturer is, in and of itself, distracting to student concentration on course content. This question could be answered by assessing the effects of lecturer-generated summaries delivered by a guest lecturer who speaks the same type of English as the primary course lecturer.
While we cannot say with certainty which of the aspects of the videos in the "summaries with a guest lecturer" group distracted or hindered students in recalling the contents, it is clear from the results of the present study that mid- and end-point summaries provided and delivered by the original instructor enabled students to perform better on related quizzes. With this in mind, we recommend that instructors incorporate short summaries into course videos in MOOCs and flipped and blended classes. The findings of the present study are relevant to instructors in e-learning environments because our data suggest that inserting short summaries into existing course videos is a simple way to improve student recall of course contents. Even though a number of studies in the literature suggest that a greater number of change-of-pace elements will increase student recall of video contents, instructors should also bear in mind that, in certain instances and to certain degrees, such elements may become distractions that eventually negate the benefits they are expected to provide. This was particularly evident in the case of the scene changes between the slide background and the coffee shop setting in the "guest lecturer summaries" group. As suggested by Kim et al. (2014), avoiding abrupt scene changes in video lectures appears to be sound advice for video content developers for flipped classrooms, MOOCs, and other e-learning environments, and the results of the present study support this suggestion.
Alksne, L. (2016). How to produce video lectures to engage students and deliver the maximum amount of information. In SOCIETY. INTEGRATION. EDUCATION. Proceedings of the International Scientific Conference, 2, 503-516.
Anderson-Hsieh, J., & Koehler, K. (1988). The effect of foreign accent and speaking rate on native speaker comprehension. Language Learning, 38(4), 561-613.
Ariga, A., & Lleras, A. (2011). Brief and rare mental "breaks" keep you focused: Deactivation and reactivation of task goals preempt vigilance decrements. Cognition, 118(3), 439-443.
Baker, L., & Lombardi, B. R. (1985). Students' lecture notes and their relation to test performance. Teaching of Psychology, 12(1), 28-32.
Barker, P. G., & Benest, I. D. (1996). The on-line lecture concept - a comparison of two approaches. In IEE Colloquium on Learning at a Distance: Developments in Media Technologies, doi: 10.1049/ic:19960882
Bilbow, G. T. (1989). Towards an understanding of overseas students' difficulties in lectures: A phenomenographic approach. Journal of Further and Higher Education, 13(3), 85-99.
Brecht, H. D. (2012). Learning from online video lectures. Journal of Information Technology Education, 11, 227-250.
Breslow, L., Pritchard, D. E., DeBoer, J., Stump, G. S., Ho, A. D., & Seaton, D. T. (2013). Studying learning in the worldwide classroom: Research into edX's first MOOC. Research & Practice in Assessment, 8.
Bromage, B. K., & Mayer, R. E. (1986). Quantitative and qualitative effects of repetition on learning from technical text. Journal of Educational Psychology, 78(4), 271.
Bunce, D. M., Flens, E. A., & Neiles, K. Y. (2010). How long can students pay attention in class? A study of student attention decline using clickers. Journal of Chemical Education, 87(12), 1438-1443.
Butler, A. C., & Roediger III, H. L. (2007). Testing improves long-term retention in a simulated classroom setting. European Journal of Cognitive Psychology, 19(4-5), 514-527.
Center for Excellence in Teaching. (1999). Teaching nuggets. Los Angeles: University of Southern California.
Cepeda, N. J., Pashler, H., Vul, E., Wixted, J. T., & Rohrer, D. (2006). Distributed practice in verbal recall tasks: A review and quantitative synthesis. Psychological Bulletin, 132(3), 354.
Chan, J. C., McDermott, K. B., & Roediger, H. L., III. (2006). Retrieval-induced facilitation: Initially nontested material can benefit from prior testing of related material. Journal of Experimental Psychology: General, 135, 553-571.
Colvin, K. F., Champaign, J., Liu, A., Zhou, Q., Fredericks, C., & Pritchard, D. E. (2014). Learning in an introductory physics MOOC: All cohorts learn equally, including an on-campus class. The International Review of Research in Open and Distributed Learning, 15(4).
Cook, L., & Friend, M. (1996). Co-teaching: Guidelines for creating effective practices. In E. L. Meyen, G. A. Vergason, & R. J. Whelan (Eds.), Strategies for teaching exceptional children in inclusive settings, 155-182. Denver: Love.
Costley, J., & Lange, C. (2017). The effects of lecture diversity on germane load. The International Review of Research in Open and Distributed Learning, 18(2). doi: 10.19173/irrodl.v18i2.2860
Davis, B. G. (1993). Tools for teaching. San Francisco: Jossey-Bass.
Downing, S. M. (2002). Construct-irrelevant variance and flawed test questions: Do multiple-choice item-writing principles make any difference? Academic Medicine, 77(10), S103-S104.
Dugan, K., & Letterman, M. (2008). Student appraisals of collaborative teaching. College Teaching, 56(1), 11-15.
Dunlosky, J., Rawson, K. A., Marsh, E. J., Nathan, M. J., & Willingham, D. T. (2013). Improving students' learning with effective learning techniques: Promising directions from cognitive and educational psychology. Psychological Science in the Public Interest, 14(1), 4-58.
Eisenstein, M., & Berkowitz, D. (1981). The effect of phonological variation on adult learner comprehension. Studies in Second Language Acquisition, 4(01), 75-80.
Enfield, J. (2013). Looking at the impact of the flipped classroom model of instruction on undergraduate multimedia students at CSUN. TechTrends, 57(6), 14-27.
Farley, J., Risko, E., & Kingstone, A. (2013). Everyday attention and lecture retention: the effects of time, fidgeting, and mind wandering. Frontiers in Psychology, 4, 619.
Ferreri, S. P., & O'Connor, S. K. (2013). Redesign of a large lecture course into a small-group learning course. American Journal of Pharmaceutical Education, 77(1), 13.
Foos, P. W., & Fisher, R. P. (1988). Using tests as learning opportunities. Journal of Educational Psychology, 80, 179-183.
Frederiksen, N. (1984). The real test bias: Influences of testing on teaching and learning. American Psychologist, 39, 193-202.
Frydenberg, M. (2013). Flipping excel. Information Systems Education Journal, 11(1), 63.
Glaser, R. (1984). Education and thinking: The role of knowledge. American Psychologist, 39(2), 93.
Glover, J. A., & Corkill, A. J. (1987). Influence of paraphrased repetitions on the spacing effect. Journal of Educational Psychology, 79(2), 198.
Guo, P. J., Kim, J., & Rubin, R. (2014). How video production affects student engagement: An empirical study of MOOC videos. In Proceedings of the first ACM conference on Learning@ scale conference (pp. 41-50). ACM.
Helms, M. M., Alvis, J. M., & Willis, M. (2005). Planning and implementing shared teaching: An MBA team-teaching case study. Journal of Education for Business, 81(1), 29-34.
Herold, M. J., Lynch, T. D., Ramnath, R., & Ramanathan, J. (2012, October). Student and instructor experiences in the inverted classroom. In 2012 Frontiers in Education Conference Proceedings, 1-6. IEEE.
Hirsch, R. (1987). Listening: The influence of neuro-linguistic cues on the retention of information. Journal of the International Listening Association, 1, 103-11.
Horn, M. (2014, March 17). KAIST doesn't wait for change In Korea, pioneers 'Education 3.0'. Forbes. Retrieved from http://www.forbes.com/sites/michaelhorn/2014/03/17/kaist-doesnt-wait-for-change-in-korea-pioneers-education-3-0/#5ae890b01a06
Jang, S. J. (2006). Research on the effects of team teaching upon two secondary school teachers. Educational Research, 48(2), 177-194.
Johnson, D. W., Johnson, R. T., & Smith, K. A. (2000). Constructive controversy: The educative power of intellectual conflict. Change: The Magazine of Higher Learning, 32(1), 28-37.
Johnson, B. C., & Kiviniemi, M. T. (2009). The effect of online chapter quizzes on exam performance in an undergraduate social psychology course. Teaching of Psychology, 36(1), 33-37.
Jones, R., Peters, K., & Shields, E. (2007). Transform your training: Practical approaches to interactive Information Literacy teaching. Journal of Information Literacy, 1(1), 35-42.
Kamuche, F. U. (2011). The effects of unannounced quizzes on student performance: Further evidence. College Teaching Methods & Styles Journal (CTMS), 3(2), 21-26.
Kim, J., Guo, P. J., Seaton, D. T., Mitros, P., Gajos, K. Z., & Miller, R. C. (2014). Understanding in-video dropouts and interaction peaks in online lecture videos. In Proceedings of the first ACM conference on learning at scale, 31-40. ACM.
King, A. (1992). Comparison of self-questioning, summarizing, and notetaking-review as strategies for learning from lectures. American Educational Research Journal, 29(2), 303-323.
Korea Advanced Institute of Science and Technology (KAIST). (2014a). Center for excellence in teaching and learning. Retrieved from http://www.kaist.edu/html/en/edu/edu_030405.html
Korea Advanced Institute of Science and Technology (KAIST). (2014b). Student handbook. Retrieved from https://itguide.kaist.ac.kr/itguide/eng/bbs/selectBbsView.do
Letterman, M. R., & Dugan, K. B. (2004). Team teaching a cross-disciplinary honors course: Preparation and development. College Teaching, 76-79.
Little, J. L., Bjork, E. L., Bjork, R. A., & Angello, G. (2012). Multiple-choice tests exonerated, at least of some charges fostering test-induced learning and avoiding test-induced forgetting. Psychological Science, 23(11), 1337-1344.
Locke, E. A. (1977). An empirical study of lecture note taking among college students. The Journal of Educational Research, 71(2), 93-99.
McCoubrie, P. (2004). Improving the fairness of multiple-choice questions: A literature review. Medical Teacher, 26(8), 709-712.
Maddox, H., & Hoole, E. (1975). Performance decrement in the lecture. Educational Review, 28(1), 17-30.
Marks, D. B. (2015). Flipping the classroom: Turning an instructional methods course upside down. Journal of College Teaching & Learning, 12(4), 241-248.
Mason, G. S., Shuman, T. R., & Cook, K. E. (2013). Comparing the effectiveness of an inverted classroom to a traditional classroom in an upper-division engineering course. IEEE Transactions on Education, 56(4), 430-435.
Mayer, R. E. (1983). Can you repeat that? Qualitative effects of repetition and advance organizers on learning from science prose. Journal of Educational Psychology, 75(1), 40.
Padgett, W. T., & Yoder, M. A. (2008). Effective communication: Excellence in a technical presentation. IEEE Signal Processing Magazine, 25(2), 124.
Prober, C. G., & Khan, S. (2013). Medical education reimagined: A call to action. Academic Medicine, 88(10), 1407-1410.
Richards, J. C. (1983). Listening comprehension: Approach, design, procedure. TESOL Quarterly, 17(2), 219-240.
Risko, E. F., Anderson, N., Sarwal, A., Engelhardt, M., & Kingstone, A. (2012). Everyday attention: Variation in mind wandering and memory in a lecture. Applied Cognitive Psychology, 26(2), 234-242.
Roediger III, H. L., & Karpicke, J. D. (2006). The power of testing memory: Basic research and implications for educational practice. Perspectives on Psychological Science, 1(3), 181-210.
Roediger III, H. L., & Marsh, E. J. (2005). The positive and negative consequences of multiple-choice testing. Journal of Experimental Psychology: Learning, Memory, and Cognition, 31(5), 1155.
Scerbo, M. W., Warm, J. S., Dember, W. N., & Grasha, A. F. (1992). The role of time and cuing in a college lecture. Contemporary Educational Psychology, 17(4), 312-328.
Smidt, E., & Hegelheimer, V. (2004). Effects of online academic lectures on ESL listening comprehension, incidental vocabulary acquisition, and strategy use. Computer Assisted Language Learning, 17(5), 517-556.
Smith, L. E., & Bisazza, J. A. (1982). The comprehensibility of three varieties of English for college students in seven countries. Language Learning, 32(2), 259-269.
Titsworth, B. S., & Kiewra, K. A. (2004). Spoken organizational lecture cues and student notetaking as facilitators of student learning. Contemporary Educational Psychology, 29(4), 447-461.
Toppino, T. C., & Brochin, H. A. (1989). Learning from tests: The case of true-false examinations. The Journal of Educational Research, 83, 119-124.
Toppino, T. C., & Luipersbeck, S. M. (1993). Generality of the negative suggestion effect in objective tests. The Journal of Educational Research, 86, 357-362.
Tune, J. D., Sturek, M., & Basile, D. P. (2013). Flipped classroom model improves graduate student performance in cardiovascular, respiratory, and renal physiology. Advances in Physiology Education, 37(4), 316-320.
Walstad, W. B., & Becker, W. E. (1994). Achievement differences on multiple-choice and essay tests in economics. The American Economic Review, 84(2), 193-196.
Webb, S. (2007). The effects of repetition on vocabulary knowledge. Applied Linguistics, 28(1), 46-65.
Welch, M., Brownell, K., & Sheridan, S. M. (1999). What's the score and game plan on teaming in schools? A review of the literature on team teaching and school-based problem-solving teams. Remedial and Special Education, 20(1), 36-49.
Williams, A., Birch, E., & Hancock, P. (2012). The impact of online lecture recordings on student performance. Australasian Journal of Educational Technology, 28(2), 199-213.
Wilson, K., & Korn, J. H. (2007). Attention during lectures: Beyond ten minutes. Teaching of Psychology, 34(2), 85-89. doi: 10.1080/00986280701291291
Wilson, S. G. (2013). The flipped class: A method to address the challenges of an undergraduate statistics course. Teaching of Psychology, 40(3), 193-199.
Wilson, V. A., & Martin, K. M. (1998, February). Practicing what we preach: Team teaching at the college level. Paper presented at the Annual Meeting of the Association of Teacher Educators, Dallas, TX, USA. Abstract retrieved from http://files.eric.ed.gov/fulltext/ED417172.pdf
Wolff, M., Wagner, M. J., Poznanski, S., Schiller, J., & Santen, S. (2015). Not another boring lecture: Engaging learners with active learning techniques. The Journal of Emergency Medicine, 48(1), 85-93.
Woodring, B. C., & Woodring, R. C. (2011). Lecture: Reclaiming a place in pedagogy. Innovative Teaching Strategies in Nursing and Related Health Professions, 113-135.
Young, M. S., Robinson, S., & Alberts, P. (2009). Students pay attention! Combating the vigilance decrement to improve learning during lectures. Active Learning in Higher Education, 10(1), 41-55.
Zhang, D., Zhou, L., Briggs, R. O., & Nunamaker, J. F. (2006). Instructional video in e-learning: Assessing the impact of interactive video on learning effectiveness. Information & Management, 43(1), 15-27.
Pinch Hitter: The Effectiveness of Content Summaries Delivered by a Guest Lecturer in Online Course Videos by Mik Fanguy, Jamie Costley, and Matthew Baldwin is licensed under a Creative Commons Attribution 4.0 International License.