The Influence of Video Format on Engagement and Performance in Online Learning
Abstract
1. Introduction
2. Related Work
2.1. Cognitive Theory of Multimedia Learning
2.2. Media Richness Theory
2.3. Multimedia Learning and Engagement
3. Hypothesis Development
4. Methods
4.1. Experimental Design and Participants
4.2. Experimental Stimuli
4.3. Operationalization of the Research Variables
4.3.1. Emotional Engagement
4.3.2. Cognitive Engagement
4.3.3. Learning Performance
4.4. Experimental Setup and Protocol
4.5. Data Processing
5. Results and Discussion
5.1. Comparison of Emotional Engagement between the Conditions (H1A)
5.2. Comparison of Cognitive Engagement between the Conditions (H1B)
5.3. Comparison of Learning Outcomes between the Conditions (H2)
5.4. Relationship between Emotional Engagement and Learning (H3A)
5.5. Relationship between Cognitive Engagement and Learning (H3B)
5.6. Limitations
6. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
References
| Comparison Item | Lecture Capture | Infographic Video |
|---|---|---|
| Cost | Low | High |
| Conveyed learning context | A professor presenting the subject in a traditional classroom setting | A visual presentation of dynamic content, narrated in voice-over by the same professor |
| Multimedia elements | Camera focused on the professor; audio | Graphics, images, text, audio |
| Media richness | Medium | High |
| Social cues | Many | Some |
| Measure | Neurophysiological State (Response to Stimuli) | Operationalization |
|---|---|---|
| Emotional engagement | Affective response: valence | Implicit measure: facial expressions (Noldus Information Technology, Wageningen, The Netherlands). Explicit measure: SAM scale, pleasure dimension |
| Emotional engagement | Affective response: arousal | Implicit measures: facial expressions (Noldus Information Technology, Wageningen, The Netherlands) and EDA (Biopac, Goleta, CA, USA). Explicit measure: SAM scale, arousal dimension |
| Cognitive engagement | Cognitive response: attention | Implicit measure: EEG (Brain Vision, Morrisville, NC, USA) with Mensia NeuroRT software (Mensia Technologies, Paris, France) |
| Learning performance | — | Difference between pre- and post-test multiple-choice questionnaire scores (see the sketch after this table) |
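The table above pairs each construct with its operationalization. As a concrete illustration, the sketch below computes a commonly used EEG engagement (attention) ratio, beta/(alpha + theta) as proposed by Pope et al., together with a pre/post-test difference score. This is a minimal sketch under assumed band limits, sampling rate, and variable names; it is not the study's actual Mensia NeuroRT pipeline.

```python
# Minimal sketch (not the study's Mensia NeuroRT pipeline): an EEG engagement
# ratio and a pre/post-test learning gain. Band limits, sampling rate, and
# variable names are illustrative assumptions.
import numpy as np
from scipy.signal import welch

def band_power(signal, fs, low, high):
    """Power of `signal` in the [low, high) Hz band, integrated from Welch's PSD."""
    freqs, psd = welch(signal, fs=fs, nperseg=fs * 2)
    mask = (freqs >= low) & (freqs < high)
    return np.trapz(psd[mask], freqs[mask])

def engagement_index(signal, fs):
    """Pope et al. (1995) engagement ratio: beta / (alpha + theta)."""
    theta = band_power(signal, fs, 4, 8)
    alpha = band_power(signal, fs, 8, 12)
    beta = band_power(signal, fs, 12, 30)
    return beta / (alpha + theta)

def learning_gain(pre_score, post_score):
    """Learning performance as the post-test minus pre-test difference."""
    return post_score - pre_score

if __name__ == "__main__":
    fs = 256                               # assumed sampling rate in Hz
    rng = np.random.default_rng(0)
    eeg = rng.standard_normal(fs * 60)     # one minute of synthetic single-channel EEG
    print("Engagement index:", engagement_index(eeg, fs))
    print("Learning gain:", learning_gain(pre_score=9, post_score=14))
```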
| Instrument | Measure | β Estimate | t Value | Sig. |
|---|---|---|---|---|
| Facial expressions | Valence (mean) | −0.19 | −1.45 | 0.15 |
| Facial expressions | Valence (over time) | 7.30 × 10⁻⁶ | 4.97 | <0.0001 |
| Facial expressions | Arousal (mean) | −0.02 | −1.12 | 0.26 |
| Facial expressions | Arousal (over time) | −5 × 10⁻⁶ | −10.73 | <0.0001 |
| SAM | Valence | — | — | 0.24 |
| SAM | Arousal | — | — | 0.53 |
| EDA | Arousal (mean) | 0.24 | 8.22 | <0.0001 |
| EDA | Arousal (over time) | −60 × 10⁻⁶ | −9.48 | <0.0001 |
| EDA | Arousal (STD) | −0.004 | −0.55 | 0.58 |
| EDA | Arousal (STD, over time) | 2.4 × 10⁻⁶ | 2.43 | 0.02 |
| EEG | Attention (mean) | −0.06 | −0.23 | 0.81 |
| EEG | Attention (over time) | −44 × 10⁻⁶ | −2.11 | 0.03 |
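Each implicit measure in the table above reports both a mean effect and an effect over time, which points to some form of regression of the per-second signal on condition and elapsed time with repeated measures per participant. The sketch below shows one plausible form of such a model using statsmodels; the data frame, its column names, and the random-intercept specification are assumptions for illustration, not a reproduction of the authors' analysis.

```python
# Hedged sketch: relating a per-second physiological signal to video format and
# elapsed time with a linear mixed model (random intercept per participant).
# All column names and the synthetic data are illustrative placeholders.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
rows = []
for pid in range(20):                              # 20 synthetic participants
    condition = "infographic" if pid % 2 else "lecture_capture"
    for t in range(600):                           # one sample per second, 10 minutes
        arousal = (0.5 + 0.02 * (condition == "infographic")
                   - 1e-4 * t + rng.normal(scale=0.1))
        rows.append({"participant": pid, "condition": condition,
                     "time_s": t, "arousal": arousal})
df = pd.DataFrame(rows)

# Fixed effects: condition and time; random intercept for each participant.
model = smf.mixedlm("arousal ~ condition + time_s", df, groups=df["participant"])
result = model.fit()
print(result.summary())                            # fixed-effect estimates and p-values
```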
| Model (Instrument) | Term | β Estimate | t Value | Sig. |
|---|---|---|---|---|
| Facial expressions | Intercept | 77.17 | 17.97 | <0.0001 |
| Facial expressions | Valence (mean) | 20.27 | 2.25 | 0.03 |
| SAM | Intercept | 13.92 | 0.52 | 0.61 |
| SAM | Arousal (mean) | 123.62 | 2.01 | 0.06 |
| SAM | Arousal (STD) | 110.22 | 1.83 | 0.08 |
| SAM | Valence | — | — | 0.53 |
| SAM | Arousal (infographic video) | — | — | 0.03 |
| EDA | Intercept | 69.37 | 10.01 | <0.0001 |
| EDA | EDA (mean) | 9.30 | 2.14 | 0.04 |
| EDA | EDA (STD) | −16.19 | −2.38 | 0.02 |
| EEG | Intercept | −680.62 | −1.87 | 0.08 |
| EEG | Attention (mean) | 889.34 | 2.09 | 0.04 |
| EEG | Attention (squared) | −259.85 | −2.11 | 0.04 |
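The EEG model above includes both an attention term and an attention-squared term, i.e., the quadratic form tested in H3B: with a positive linear coefficient and a negative quadratic coefficient, predicted performance peaks at an intermediate level of attention (an inverted U). The OLS sketch below illustrates that model form only; the variable names and simulated data are assumptions, not the study's data.

```python
# Hedged sketch of the quadratic (H3B-style) model form: performance regressed
# on mean attention and its square. Simulated data; names are illustrative.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
n = 30
attention = rng.uniform(0.5, 2.5, size=n)               # mean engagement index per learner
performance = (40 + 30 * attention - 9 * attention**2
               + rng.normal(scale=5, size=n))            # built-in inverted-U relation

df = pd.DataFrame({"attention": attention, "performance": performance})
fit = smf.ols("performance ~ attention + I(attention ** 2)", data=df).fit()
print(fit.params)                                        # intercept, linear, quadratic terms
print(fit.pvalues)

# Attention level at which predicted performance peaks (vertex of the parabola).
b1, b2 = fit.params["attention"], fit.params["I(attention ** 2)"]
print("Predicted peak at attention =", -b1 / (2 * b2))
```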
| Research Questions | Hypotheses | Results |
|---|---|---|
| RQ1: Which video production style engages the students more emotionally and cognitively? | H1A: Emotional engagement will be higher for the infographic video | Not supported |
| | H1B: Cognitive engagement will be higher for the infographic video | Supported |
| RQ2: Is there a difference in learning between conditions? | H2: Learning performance will be higher for the infographic video | Supported |
| RQ3: What is the relationship between engagement and learning outcomes? | H3A: The higher the emotional engagement, the better the student performance | Supported |
| | H3B: There is a quadratic relationship between cognitive engagement and performance | Supported |
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
© 2021 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).
Citation: Lackmann, S.; Léger, P.-M.; Charland, P.; Aubé, C.; Talbot, J. The Influence of Video Format on Engagement and Performance in Online Learning. Brain Sci. 2021, 11, 128. https://doi.org/10.3390/brainsci11020128