Open Access Review

Multimodal Data Fusion in Learning Analytics: A Systematic Review

by Su Mu 1, Meng Cui 1,* and Xiaodi Huang 2

1 School of Information Technology in Education, South China Normal University, Guangzhou 510631, China
2 School of Computing and Mathematics, Charles Sturt University, Albury, NSW 2640, Australia
* Author to whom correspondence should be addressed.
Sensors 2020, 20(23), 6856; https://doi.org/10.3390/s20236856
Received: 10 November 2020 / Revised: 26 November 2020 / Accepted: 28 November 2020 / Published: 30 November 2020
Multimodal learning analytics (MMLA), which has become increasingly popular, can provide an accurate understanding of learning processes. However, it remains unclear how multimodal data are integrated in MMLA. Following the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) guidelines, this paper systematically surveys 346 articles on MMLA published during the past three years. We first present a conceptual model for reviewing these articles along three dimensions: data types, learning indicators, and data fusion. Based on this model, we then answer two questions: (1) What types of data and learning indicators are used in MMLA, and how are they related? (2) How can the data fusion methods in MMLA be classified? Finally, we point out the key stages in data fusion and future research directions in MMLA. Our main findings from this review are: (a) the data in MMLA can be classified into digital, physical, physiological, psychometric, and environmental data; (b) the learning indicators are behavior, cognition, emotion, collaboration, and engagement; (c) the relationships between multimodal data and learning indicators are one-to-one, one-to-any, and many-to-one, and these complex relationships are the key to data fusion; (d) the main data fusion methods in MMLA are many-to-one fusion, many-to-many fusion, and multiple validations among multimodal data; and (e) multimodal data fusion can be characterized by the multimodality of data, the multi-dimensionality of indicators, and the diversity of methods.
Keywords: multimodal learning analytics; data fusion; multimodal data; learning indicators; online learning
MDPI and ACS Style

Mu, S.; Cui, M.; Huang, X. Multimodal Data Fusion in Learning Analytics: A Systematic Review. Sensors 2020, 20, 6856. https://doi.org/10.3390/s20236856

