Evidence-Informed Tips for Using the Interactive Video Builder Tool
Why Use Interactive Videos?
One hundred and twenty seconds.
That’s the average attention span for watching educational videos. Research on millions of learners enrolled in Massive Open Online Courses (MOOCs) suggests that after two minutes, most people’s attention wanes (Fishman 2016; Guo et al. 2014; Kim et al. 2014). They become distracted by the lure of social media on their phones, or their minds wander to thoughts like “what’s for dinner?”
However, we can help learners refocus by embedding questions in videos to create “interactive videos.” As an interactive video progresses, it is paused at various points to show questions that require learners to think and act before the video resumes.
Interactive videos gained popularity in flipped courses where learners prepared for class by watching video content, and as “clicker activities” in lectures where learners watched a video and answered multiple-choice questions using a mobile device. Today, interactive videos are used in just about every context, from collaborative in-person formative activities to online summative assessments.
Research supports the effectiveness of interactive videos for learning (Rice et al. 2019). In addition to refocusing attention, interactive videos cue learners to the lesson objectives (i.e., to be selective about the information they attend to), help learners realize what they do not understand about a given topic or skill, and allow space for reflection (Brame 2016; Lawson et al. 2006; Tanner 2012). These engagement opportunities also increase learner motivation, engagement, satisfaction, and self-efficacy (van der Meij and Böckmann 2021).
Questions in a video also tap into the well-established phenomenon of the “testing effect.” Quizzes encourage learners to construct their knowledge about a concept and, in doing so, are a more effective learning strategy than simply studying (Adesope et al. 2017; Butler 2010). Each time a learner makes their thinking visible (as they do when answering a quiz), they restructure or reinforce their understanding of the concept. In short, it helps them learn. One study found that learners who watched videos with embedded questions retained more information than learners who reviewed the equivalent content by reading a textbook (Pulukuri and Abrams 2021).
For this reason, BioInteractive has created a suite of ready-to-use interactive videos, which can be accessed both on the BioInteractive website and from the “Explore” tab of the Interactive Video Builder tool. Each interactive video includes automatic pause points with embedded questions. The questions were written by educators (including me) to help learners develop scientific thinking skills and achieve the learning outcomes stated in the video’s accompanying materials.
Creating Your Own Interactive Video
What happens when the existing interactive videos do not match your audience, learning objectives, or desired question format? The Interactive Video Builder tool also lets you create new interactive videos by embedding your own questions at the locations you want in a video. The tool works with any BioInteractive video. More instructions for using Interactive Video Builder are provided in the “Supporting Information” tab inside the tool and on its webpage.
As educators, we want to know how to use a tool not just technically but also instructionally. In this case, we want to know how to design an interactive video to maximize learning and engagement. To help you do that, I outline 10 research-based strategies below.
1. Focus on Your Learning Objectives
I can’t emphasize this one enough. Know why you are developing an interactive video by defining the learning objectives/goals at the start. Then be clear and purposeful in developing questions that help learners achieve those goals.
In an interactive video, questions serve to direct your learners’ attention. Use them to cue learners about what is important to learn. Start with your goals, then develop your questions with those goals in mind. When you are done, reexamine your questions to check that they will help a learner reach the goals you set.
2. Start with a Pretest
There is a phenomenon in education whereby learners who take a test about a topic before learning about it — even if they get everything wrong — end up performing better on a test of similar concepts after instruction. The questions don’t have to be the same in the pretest and the posttest; they just have to cover the same concepts. This robust phenomenon is called the “pretesting effect,” sometimes termed the forward testing effect or test-potentiated new learning (Chan et al. 2018; Yang et al. 2018).
Researchers have proposed a few explanations for this effect. One is that the pretest cues learners about the concepts that will be important in the upcoming instruction, which helps direct attention. Another is that the pretest helps learners organize the knowledge that’s about to be presented. A third is that the pretest stimulates learners’ existing and related knowledge about a topic, making it readily accessible to make connections with new knowledge as it is presented.
Whatever the mechanism, it helps to ask learners a question before the video even starts — at time 00:00. Ask them what they know about a topic. Or, ask them a question about the topic they will learn in the video (and reassure them that wrong answers will not be penalized). This will assist their learning (Zaki 2019).
3. How Many Questions and How Often?
What is the optimal instructional video length, according to educators? An educational video streaming company periodically surveys hundreds of instructors and instructional designers. In 2015, more than 50% responded that 5–10 minutes was the ideal length for an educational video, while 22% preferred 10–30 minutes (State of Video in Education 2015).
Recall that studies have consistently shown that learners stay focused on videos for about two minutes, with engagement falling off sharply after that (Fishman 2016; Guo et al. 2014; Kim et al. 2014). There is a tension between the length of time that educators feel is necessary to impart knowledge and the attention span that the evidence shows learners can actually sustain while watching videos.
Thus, it would make sense to interrupt longer videos with questions, on average, every two minutes or so. That number is not exact. In fact, one study looked at the impact of integrating a question in a video every 90 seconds, two minutes, and three minutes and found no difference in how these different time intervals affected retention (Wachtler et al. 2016).
As general guidance, consider including one question every two minutes.
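If you want to rough out timestamps before opening the Builder, a quick calculation can help. Below is a minimal Python sketch (purely my own illustration, not part of the Interactive Video Builder tool) that spaces questions at a chosen interval and reserves 00:00 for a pretest question (see Tip 2):

```python
def pause_points(duration_s: int, interval_s: int = 120) -> list[str]:
    """Suggest evenly spaced pause-point timestamps for a video.

    duration_s: total video length in seconds.
    interval_s: target spacing between questions; the two-minute default
                follows the attention-span research cited above.
    Returns MM:SS timestamps, starting at 00:00 for a pretest question.
    """
    return [f"{t // 60:02d}:{t % 60:02d}" for t in range(0, duration_s, interval_s)]

# Example: an 8.5-minute (510-second) video gets a pretest plus four questions.
print(pause_points(510))  # ['00:00', '02:00', '04:00', '06:00', '08:00']
```

Treat the output as a starting point only; as the next tip explains, teachable moments in the video should override even spacing.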
4. Find Teachable Moments
I’m going to put a big “but” on my previous tip: You don’t want to follow that guideline too rigidly.
What matters even more than the two-minute guideline are the educational opportunities afforded by the video. For example, each time a video is about to reveal data or other results, I like to pause it and ask learners to predict what the video is about to show.
In a video that follows a scientist’s experiment, I pause once the context and research questions have been presented and ask: “If you were the scientist, what hypothesis would you have about the phenomenon?” Then, I like to ask learners to propose an experimental design to test that hypothesis. When the video resumes, learners can compare their experiment to what the scientist did. In my experience, many learners propose similar methods, which bolsters their confidence in their ability to understand and do science.
Once the video has covered the experimental setup, you can pause and ask learners to predict what the results will look like if the hypothesis is confirmed. You can even ask questions about how learners expect scientists to present their data (for example, what would they place on the axes of a graph?). Finally, once the results of the experiment are revealed, you can pause the video and ask whether the results support the scientist’s hypothesis. Those opportunities help learners develop their abilities to think scientifically.
You can view examples of such “predictive” embedded questions in the Interactive Case Study for The Effects of Fungicides on Bumble Bee Colonies and the Interactive Case Study for Studying Elephant Communication.
With embedded questions, you can also test learners’ comprehension by probing their understanding of the concepts in the video. Do this for particularly hard-to-understand topics where scaffolding the learning can help. For example, if a video introduces gene therapy and then uses that information to explain how it can be used to treat a specific condition, it may be beneficial to make sure that learners understand the mechanisms of gene therapy before proceeding with the specific example.
In summary, include a question whenever it makes sense to pause and ask learners what they understood from the past video section, or to prepare them for the upcoming section. While research suggests that you should aim to include a question roughly every two minutes, don’t let that stop you from including more frequent questions if it makes sense to do so.
5. Closed- or Open-Ended Questions?
The Interactive Video Builder tool can integrate both closed-ended questions (e.g., multiple choice and true/false) and open-ended questions (e.g., short answer). Which one should you choose?
The answer depends on your learning objectives and the type of feedback that might best support your learners. When you develop questions to encourage learners to practice skills like scientific thinking, remember that:
- Closed-ended questions limit the range of options that learners will consider, but they make it easier to give learners immediate feedback.
- Open-ended questions allow learners to develop their thinking with fewer scaffolds and can be a powerful way to provoke deeper reasoning. However, giving immediate feedback on them is more challenging.
Before you pick a question type, consider the goal behind the question, what kind of response you’re hoping to elicit from learners, and what type of feedback would work best for them and you. Tips 6 and 7 below provide more guidance on feedback types and timing.
6. Include Personalized Feedback
A large body of evidence shows that providing feedback to learners on their performance helps them improve. That’s not surprising. After all, if a learner doesn’t know they got something wrong, how are they supposed to correct their thinking and improve?
On a multiple-choice question, educators can provide four levels of feedback:
- Correct Answer. This simply tells learners which answer was the correct one without reference to their selected answer.
- Correct/Incorrect. This tells learners if the answer they selected was correct or incorrect, but does not explain why, nor does it reveal the correct answer if the learner selected an incorrect one.
- Repeat Until Correct. This is almost the same as the “Correct/Incorrect” approach, but it lets learners reattempt the question until they get the correct answer.
- Explanation. This strategy provides customized feedback to learners based on their choice. It explains why each incorrect answer is wrong and why the correct answer is the best choice.
It probably won’t surprise you to learn that these forms of feedback are listed from least to most helpful to learning (Bangert-Drowns et al. 1991). The Interactive Video Builder tool allows you to provide “Explanation” feedback, the most helpful form on this list.
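To make the contrast concrete, here is a minimal sketch of what “Explanation” feedback involves: every option, right or wrong, carries its own rationale. The data structure and the rock pocket mouse example are my own illustration, not the Builder tool’s internal format.

```python
from dataclasses import dataclass

@dataclass
class Choice:
    text: str
    correct: bool
    explanation: str  # shown after the learner selects this option

# "Explanation" feedback: each option carries its own rationale.
choices = [
    Choice(
        text="Dark fur arose because the mice needed it on dark rock.",
        correct=False,
        explanation="Mutations are not directed by need; dark-fur alleles "
                    "arose by chance and were then favored by selection.",
    ),
    Choice(
        text="A chance mutation produced dark fur, and predation on "
             "mismatched mice increased its frequency.",
        correct=True,
        explanation="Random mutation supplies the variation, and natural "
                    "selection (predation) drives the allele to high frequency.",
    ),
]

def feedback_for(choice: Choice) -> str:
    # Unlike bare "Correct/Incorrect" feedback, this explains the why.
    prefix = "Correct. " if choice.correct else "Not quite. "
    return prefix + choice.explanation
```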
One drawback of using multiple-choice questions is that learners are exposed to distractors, which often reflect common learner misconceptions. If a learner selects the wrong response and isn’t given sufficient feedback, it is possible that the quiz will reinforce — rather than dispel — the misconception. Now the testing effect is working against learning! So, always provide answer-specific feedback on quizzes, especially when they are used for the purpose of helping learners readjust their mental schemas and address their misconceptions (Butler and Roediger 2008).
7. Provide Feedback Now or Later?
It might come as a surprise that there is some debate as to whether it is more beneficial to provide feedback immediately after a learner responds to a question or to delay the feedback. Intuitively, we feel that immediate course correction would be ideal. Indeed, research clearly shows that learners prefer immediate feedback (Mullet et al. 2014).
However, the evidence is equivocal on whether immediate feedback improves learning. Some studies show that delaying feedback helps with retention and memory (Butler et al. 2007; Corral et al. 2021; Fyfe et al. 2021; Mullet et al. 2014; Smith and Kimball 2010). This may be because when feedback is delayed, the learner has to think about the concept twice — once when they answer the question and again when they review the feedback later on — which reinforces their learning (a form of “retrieval practice,” the mechanism behind the testing effect). However, the evidence isn’t clear-cut, with some studies showing that better learning occurs when feedback is provided immediately after a response (Azevedo and Bernard 1995; Calimeris and Kosack 2020; Lu et al. 2021). One review article even suggests that the debate is largely an artifact of laboratory settings, since most studies that take place in a classroom find that immediate feedback is superior (Kulik and Kulik 1988).
Until this issue is settled, perhaps the best course of action is to provide feedback not immediately after each question but rather at the end of the quiz. This gives some spacing between the time that the learner answered the question and when they revisit it to read the feedback, but it is still fresh enough to matter to learners. In fact, some data supports this very strategy (Kulik and Kulik 1988).
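Here is a minimal sketch of that end-of-quiz timing, assuming a simple short-answer format (the Quiz and Question classes are hypothetical, not the Builder tool’s API). Responses are recorded silently as the learner answers, and all feedback is revealed together once the quiz ends:

```python
from dataclasses import dataclass, field

@dataclass
class Question:
    prompt: str
    answer: str
    explanation: str

@dataclass
class Quiz:
    questions: list[Question]
    responses: list[str] = field(default_factory=list)

    def record(self, response: str) -> None:
        # No feedback yet: just store the learner's answer.
        self.responses.append(response)

    def review(self) -> list[str]:
        # Feedback appears only after the last question: a short delay,
        # but while the material is still fresh.
        return [
            f"{q.prompt} You answered: {r}. "
            f"{'Correct.' if r == q.answer else 'Incorrect.'} {q.explanation}"
            for q, r in zip(self.questions, self.responses)
        ]
```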
8. Bloom it Up!
When educators embed questions into a video, their first impulse is often to check learner understanding. That is a valid purpose of integrating questions into videos. However, you should develop questions that target not just recall and comprehension but all levels of Bloom’s taxonomy (Bloom et al. 1956; Anderson and Krathwohl 2001).
Be sure to encourage learners to do higher-level Bloom’s tasks, such as compare and contrast, evaluate, predict, analyze, and create. If you aren’t sure how to integrate such questions, revisit Tip 4 (“Find Teachable Moments”) for suggestions about where to integrate pause points that encourage learners to practice their scientific thinking.
9. Promote Transfer
We usually don’t assign an interactive video just to teach the real-world example or specific context that’s featured in the video. Rather, we use the video to teach a general underlying concept. For example, the popular video The Making of the Fittest: Natural Selection and Adaptation uses the rock pocket mice population in the southwestern United States as an example of adaptation and natural selection.
The problem is that, as studies have shown, people don’t easily apply what they learned in one specific context to another. They aren’t very good at transfer. They must be guided (Carpenter 2012; Shemwell et al. 2015).
You can encourage transfer by juxtaposing a new example with the one provided in the video. In a question near the end of the video, provide details of another example where the surface features are different but the underlying concepts are similar, and ask learners to identify the similarities between this example and the one in the video. Research has shown that encouraging learners to find similarities in situations that are superficially different helps them transfer the foundational concepts more effectively. Interestingly, if you just point out how a general concept applies across examples, learners don’t recognize the general concepts in new examples (Son and Rivas 2016). They have to go through the exercise of finding similarities for themselves (Shemwell et al. 2015).
So, if transfer learning is your goal, consider including a question that gets learners to compare the concept they are learning about in the video with another example, and to extract what these two examples have in common.
10. Consider Guiding Questions
Sometimes, the purpose of pausing the video and asking a question is not to ask learners to respond in that moment but rather to focus attention as learners watch the next part of the video (Brame 2016; Cojean and Jamet 2022; Lawson et al. 2006; Lawson et al. 2007). This is similar to providing guiding questions with a reading, a strategy that has been shown to improve learning (Stiegler-Balfour and Benassi 2015; Tanner 2012).
Examples of such questions might be: “As you watch the next section, consider which variable is being manipulated and which is being measured. How many distinct methods do the scientists use to collect data on the dependent variable? Why do they need more than one?”
The learner is not asked to provide an immediate answer; rather, the learner is directed to seek answers in the upcoming segment of the video. It cues attention and helps learners pick out relevant information. It wouldn’t hurt to pause the video after the segment and reiterate the question, this time collecting learner responses. That way, learners are held accountable, increasing motivation and engagement. Collecting answers could also be a way to provide feedback and improve learning.
Other Considerations for Interactive Videos
The strategies outlined above focused on the instructional implementation of pause points with associated questions embedded in interactive videos. One aspect that was not addressed is how to write the questions themselves. Haladyna et al. (2002) identified what is known about writing good assessment questions, and the information is captured in these guidelines on the Brigham Young University website. These include tips such as avoiding negatives (e.g., questions that include the word “not”) and avoiding terms like “always” and “never.”
You can also use the Interactive Video Builder tool to do much more than embed questions into videos. The tool enables you to add labels, links, and other instructions at any pause points you choose. For example, you could provide the following (see the illustrative sketch after this list):
- Links to webpages with additional information, along with directions for learners to investigate further.
- Prompts to perform certain activities, like a short at-home experiment, connected with the video’s concepts.
- Instructions to make pen-and-paper or digital concept maps of what they are learning.
- Links to an online whiteboard or discussion board where they can share perspectives and personal experiences with their peers.
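As a rough mental model of those options, think of each pause point as a timestamp paired with one kind of content. The sketch below is purely illustrative; the class and field names are my own assumptions, not the Builder tool’s actual format.

```python
from dataclasses import dataclass
from typing import Literal

@dataclass
class PausePoint:
    timestamp: str  # "MM:SS" position where the video pauses
    kind: Literal["question", "label", "link", "activity"]
    content: str    # prompt text, label text, URL, or activity instructions

# An illustrative plan for one interactive video (not a real export format):
plan = [
    PausePoint("00:00", "question", "What do you already know about natural selection?"),
    PausePoint("02:10", "link", "https://www.biointeractive.org/ (background reading)"),
    PausePoint("04:30", "activity", "Sketch a concept map of the experiment so far."),
]
```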
What happens when the video pauses should be guided by your instructional style, audience, creativity, and above all, the learning objectives for your learners.
References
Adesope, O. O., D. A. Trevisan, and N. Sundararajan. “Rethinking the use of tests: A meta-analysis of practice testing.” Review of Educational Research 87, 3 (2017): 659–701. https://doi.org/10.3102/0034654316689306.
Anderson, L. W., and D. R. Krathwohl. A taxonomy for learning, teaching, and assessing: A revision of Bloom's taxonomy of educational objectives. Longman, 2001.
Azevedo, R., and R. M. Bernard. “A meta-analysis of the effects of feedback in computer-based instruction.” Journal of Educational Computing Research 13, 2 (1995): 111–127. https://doi.org/10.2190/9lmd-3u28-3a0g-ftqt.
Bangert-Drowns, R. L., C.-L. C. Kulik, J. A. Kulik, and M. Morgan. “The instructional effect of feedback in test-like events.” Review of Educational Research 61, 2 (1991): 213–238. https://doi.org/10.3102/00346543061002213.
Bloom, B. S., M. D. Engelhart, E. J. Furst, W. H. Hill, and D. R. Krathwohl. Taxonomy of educational objectives: the classification of educational goals: handbook I: cognitive domain. New York: David McKay, 1956.
Brame, C. J. “Effective educational videos: Principles and guidelines for maximizing student learning from video content.” CBE—Life Sciences Education 15, 4 (2016): es6. https://doi.org/10.1187/cbe.16-03-0125.
Butler, A. C. “Repeated testing produces superior transfer of learning relative to repeated studying.” Journal of Experimental Psychology: Learning, Memory, and Cognition 36, 5 (2010): 1118–1133. https://doi.org/10.1037/a0019902.
Butler, A. C., J. D. Karpicke, and H. L. Roediger. “The effect of type and timing of feedback on learning from multiple-choice tests.” Journal of Experimental Psychology: Applied 13, 4 (2007): 273–281. https://doi.org/10.1037/1076-898x.13.4.273.
Butler, A. C., and H. L. Roediger. “Feedback enhances the positive effects and reduces the negative effects of multiple-choice testing.” Memory & Cognition 36, 3 (2008): 604–616. https://doi.org/10.3758/mc.36.3.604.
Calimeris, L., and E. Kosack. “Immediate feedback assessment technique (IF-AT) quizzes and student performance in microeconomic principles courses.” The Journal of Economic Education 51, 3–4 (2020): 211–226. https://doi.org/10.1080/00220485.2020.1804501.
Carpenter, S. K. “Testing enhances the transfer of learning.” Current Directions in Psychological Science 21, 5 (2012): 279–283. https://doi.org/10.1177/0963721412452728.
Chan, J. C. K., C. A. Meissner, and S. D. Davis. “Retrieval potentiates new learning: A theoretical and meta-analytic review.” Psychological Bulletin 144, 11 (2018): 1111–1146. https://doi.org/10.1037/bul0000166.
Cojean, S., and E. Jamet. “Effects of outlines and information seeking on learning outcomes in video-based environments.” Interactive Learning Environments (2022): 1–13. https://doi.org/10.1080/10494820.2022.2028854.
Corral, D., S. K. Carpenter, and S. Clingan-Siverly. “The effects of immediate versus delayed feedback on complex concept learning.” Quarterly Journal of Experimental Psychology 74, 4 (2021): 786–799. https://doi.org/10.1177/1747021820977739.
Fishman, E. “How long should your next video be?” Wistia. Published July 5, 2016. https://wistia.com/learn/marketing/optimal-video-length.
Fyfe, E. R., J. R. de Leeuw, P. F. Carvalho, R. L. Goldstone, J. Sherman, D. Admiraal, L. K. Alford, et al. “ManyClasses 1: Assessing the generalizable effect of immediate feedback versus delayed feedback across many college classes.” Advances in Methods and Practices in Psychological Science 4, 3 (2021): 25152459211027575. https://doi.org/10.1177/25152459211027575.
Guo, P. J., J. Kim, and R. Rubin. “How video production affects student engagement: An empirical study of MOOC videos.” Proceedings of the First ACM Conference on Learning @ Scale (2014): 41–50. https://doi.org/10.1145/2556325.2566239.
Haladyna, T. M., S. M. Downing, and M. C. Rodriguez. “A review of multiple-choice item-writing guidelines for classroom assessment.” Applied Measurement in Education 15, 3 (2002): 309–333. https://doi.org/10.1207/s15324818ame1503_5.
Kim, J., P. J. Guo, D. T. Seaton, P. Mitros, K. Z. Gajos, and R. C. Miller. “Understanding in-video dropouts and interaction peaks in online lecture videos.” Proceedings of the First ACM Conference on Learning @ Scale (2014): 31–40. https://doi.org/10.1145/2556325.2566237.
Kulik, J. A., and C.-L. C. Kulik. “Timing of feedback and verbal learning.” Review of Educational Research 58, 1 (1988): 79–97. https://doi.org/10.3102/00346543058001079.
Lawson, T. J., J. H. Bodle, M. A. Houlette, and R. R. Haubner. “Guiding questions enhance student learning from educational videos.” Teaching of Psychology 33, 1 (2006): 31–33. https://doi.org/10.1207/s15328023top3301_7.
Lawson, T. J., J. H. Bodle, and T. A. McDonough. “Techniques for increasing student learning from educational videos: Notes versus guiding questions.” Teaching of Psychology 34, 2 (2007): 90–93. https://doi.org/10.1080/00986280701291309.
Lu, X., A. Sales, and N. T. Heffernan. “Immediate versus delayed feedback on learning: Do people's instincts really conflict with reality?” Journal of Higher Education Theory and Practice 21, 16 (2021): 188–198. https://doi.org/10.33423/jhetp.v21i16.4925.
Mullet, H. G., A. C. Butler, B. Verdin, R. von Borries, and E. J. Marsh. “Delaying feedback promotes transfer of knowledge despite student preferences to receive feedback immediately.” Journal of Applied Research in Memory and Cognition 3, 3 (2014): 222–229. https://doi.org/10.1016/j.jarmac.2014.05.001.
Pulukuri, S., and B. Abrams. “Improving learning outcomes and metacognitive monitoring: Replacing traditional textbook readings with question-embedded videos.” Journal of Chemical Education 98, 7 (2021): 2156–2166. https://doi.org/10.1021/acs.jchemed.1c00237.
Rice, P., P. Beeson, and J. Blackmore-Wright. “Evaluating the impact of a quiz question within an educational video.” TechTrends 63, 5 (2019): 522–532. https://doi.org/10.1007/s11528-019-00374-6.
Shemwell, J. T., C. C. Chase, and D. L. Schwartz. “Seeking the general explanation: A test of inductive activities for learning and transfer.” Journal of Research in Science Teaching 52, 1 (2015): 58–83. https://doi.org/10.1002/tea.21185.
Smith, T. A., and D. R. Kimball. “Learning from feedback: Spacing and the delay–retention effect.” Journal of Experimental Psychology: Learning, Memory, and Cognition 36, 1 (2010): 80–95. https://doi.org/10.1037/a0017407.
Son, J. Y., and M. J. Rivas. “Designing clicker questions to stimulate transfer.” Scholarship of Teaching and Learning in Psychology 2, 3 (2016): 193–207. https://doi.org/10.1037/stl0000065.
State of Video in Education 2015: A Kaltura Report. Kaltura. Accessed June 24, 2022. https://site.kaltura.com/rs/984-SDM-859/images/The_State_of_Video_in_Education_2015_a_Kaltura_Report.pdf.
Stiegler-Balfour, J. J., and V. A. Benassi. “Guiding questions promote learning of expository text for less-skilled readers.” Scholarship of Teaching and Learning in Psychology 1, 4 (2015): 312–325. https://doi.org/10.1037/stl0000044.
Tanner, K. D. “Promoting student metacognition.” CBE—Life Sciences Education 11, 2 (2012): 113–120. https://doi.org/10.1187/cbe.12-03-0033.
van der Meij, H., and L. Böckmann. “Effects of embedded questions in recorded lectures.” Journal of Computing in Higher Education 33, 1 (2021): 235–254. https://doi.org/10.1007/s12528-020-09263-x.
Wachtler, J., M. Hubmann, H. Zöhrer, and M. Ebner. “An analysis of the use and effect of questions in interactive learning-videos.” Smart Learning Environments 3, 1 (2016): 1–16. https://doi.org/10.1186/s40561-016-0033-3.
Yang, C., R. Potts, and D. R. Shanks. “Enhancing learning and retrieval of new information: a review of the forward testing effect.” npj Science of Learning 3, 1 (2018): 1–9. https://doi.org/10.1038/s41539-018-0024-y.
Zaki, M. “The relationship between segmentation and question location within mobile video platforms for enhancing the ability of recall.” International Journal of Interactive Mobile Technologies 13, 8 (2019): 74–94. https://doi.org/10.3991/ijim.v13i08.10614.
As one of the founders of Quest University Canada, Annie has had the opportunity to develop over a dozen undergraduate courses in topics as far-ranging as astrobiology, infectious diseases, and nutrition. After a decade at Quest in a range of roles, she took on the challenge of vice president of science at TELUS World of Science in Edmonton, one of the largest science centers in Canada. There, she worked to create opportunities for researchers to share their passion for their work with the public. After that, she focused on adult education, heading up the school of Continuing Studies at Capilano University in Vancouver, British Columbia, Canada. In her spare time, she reads voraciously about the future of higher education.