Measuring Student Engagement through Behavioral and Emotional Features Using Deep-Learning Models
Abstract
1. Introduction
- To generate behavioral- and emotional-feature-based student datasets in the offline classroom environment;
- To compare the performance of TL algorithms in terms of the computation of student engagement in the offline classroom environment;
- To propose an effective model for computing student engagement based on behavioral features and revealing the level of engagement based on emotional features in the offline classroom environment.
2. Related Work
3. Materials and Methods
3.1. Dataset Acquisition
3.2. Pre-Processing
3.2.1. Frame Extraction
3.2.2. Data Augmentation
3.3. Survey Analysis
3.4. Experimental Setup
3.4.1. Deep-Learning Models for Measuring Behavior-Based Engagement Level
3.4.2. Deep-Learning Models for Measuring Emotion-Based Engagement Level
4. Results
4.1. Evaluation of Behavior Detection Models
4.1.1. Intra-Model Evaluations’ Comparison of Behavior Detection Models
4.1.2. Inter-Model Evaluations’ Comparison of Behavior Detection Models
4.2. Evaluation of Emotion Detection Models
4.2.1. Intra-Model Evaluations’ Comparison of Emotion Detection Models
4.2.2. Inter-Model Evaluations’ Comparison of Emotion Detection Models
4.3. Behavior and Emotion Detection Using Optimal Models
4.4. Computation of Student Engagement Level
5. Discussion
6. Conclusions
Author Contributions
Funding
Data Availability Statement
Conflicts of Interest
References
- Fredricks, J.A.; Blumenfeld, P.C.; Paris, A.H. School engagement: Potential of the concept, state of the evidence. Rev. Educ. Res. 2004, 74, 59–109. [Google Scholar] [CrossRef]
- Pabba, C.; Kumar, P. An intelligent system for monitoring students’ engagement in large classroom teaching through facial expression recognition. Expert Syst. 2022, 39, e12839. [Google Scholar] [CrossRef]
- Bradbury, N.A. Attention span during lectures: 8 seconds, 10 minutes, or more? Adv. Physiol. Educ. 2016, 40, 509–513. [Google Scholar] [CrossRef] [PubMed]
- Exeter, D.J.; Ameratunga, S.; Ratima, M.; Morton, S.; Dickson, M.; Hsu, D.; Jackson, R. Student engagement in very large classes: The teachers’ perspective. Stud. High. Educ. 2010, 35, 761–775. [Google Scholar] [CrossRef]
- Sathik, M.; Jonathan, S.G. Effect of facial expressions on student’s comprehension recognition in virtual educational environments. SpringerPlus 2013, 2, 455. [Google Scholar] [CrossRef]
- Zaletelj, J.; Košir, A. Predicting students’ attention in the classroom from Kinect facial and body features. EURASIP J. Image Video Process. 2017, 2017, 80. [Google Scholar] [CrossRef]
- Klein, R.; Celik, T. The Wits Intelligent Teaching System: Detecting student engagement during lectures using convolutional neural networks. In Proceedings of the 2017 IEEE International Conference on Image Processing (ICIP), Beijing, China, 17–20 September 2017; IEEE: Piscataway, NJ, USA, 2017; pp. 2856–2860. [Google Scholar]
- Thomas, C.; Jayagopi, D.B. Predicting student engagement in classrooms using facial behavioral cues. In Proceedings of the 1st ACM SIGCHI International Workshop on Multimodal Interaction for Education, Glasgow, UK, 13 November 2017; pp. 33–40. [Google Scholar]
- Hu, M.; Wei, Y.; Li, M.; Yao, H.; Deng, W.; Tong, M.; Liu, Q. Bimodal Learning Engagement Recognition from Videos in the Classroom. Sensors 2022, 22, 5932. [Google Scholar] [CrossRef] [PubMed]
- Fredricks, J.A. The measurement of student engagement: Methodological advances and comparison of new self-report instruments. In Handbook of Research on Student Engagement; Springer International Publishing: Cham, Switzerland, 2022; pp. 597–616. [Google Scholar]
- Dirican, A.C.; Göktürk, M. Psychophysiological measures of human cognitive states applied in human computer interaction. Procedia Comput. Sci. 2011, 3, 1361–1367. [Google Scholar] [CrossRef]
- Murshed, M.; Dewan, M.A.A.; Lin, F.; Wen, D. Engagement detection in e-learning environments using convolutional neural networks. In Proceedings of the 2019 IEEE Intl Conf on Dependable, Autonomic and Secure Computing, Intl Conf on Pervasive Intelligence and Computing, Intl Conf on Cloud and Big Data Computing, Intl Conf on Cyber Science and Technology Congress (DASC/PiCom/CBDCom/CyberSciTech), Fukuoka, Japan, 5–8 August 2019; IEEE: Piscataway, NJ, USA, 2019; pp. 80–86. [Google Scholar]
- Ma, X.; Xu, M.; Dong, Y.; Sun, Z. Automatic student engagement in online learning environment based on neural turing machine. Int. J. Inf. Educ. Technol. 2021, 11, 107–111. [Google Scholar] [CrossRef]
- Bosch, N.; D’mello, S.K.; Ocumpaugh, J.; Baker, R.S.; Shute, V. Using video to automatically detect learner affect in computer-enabled classrooms. ACM Trans. Interact. Intell. Syst. (TiiS) 2016, 6, 1–26. [Google Scholar] [CrossRef]
- Zhang, H.; Xiao, X.; Huang, T.; Liu, S.; Xia, Y.; Li, J. A novel end-to-end network for automatic student engagement recognition. In Proceedings of the 2019 IEEE 9th International Conference on Electronics Information and Emergency Communication (ICEIEC), Beijing, China, 12–14 July 2019; IEEE: Piscataway, NJ, USA, 2019; pp. 342–345. [Google Scholar]
- Mukhopadhyay, M.; Pal, S.; Nayyar, A.; Pramanik, P.K.; Dasgupta, N.; Choudhury, P. Facial emotion detection to assess Learner’s State of mind in an online learning system. In Proceedings of the 2020 5th International Conference on Intelligent Information Technology, Hanoi, Vietnam, 19–22 February 2020; pp. 107–115. [Google Scholar]
- Bhardwaj, P.; Gupta, P.K.; Panwar, H.; Siddiqui, M.K.; Morales-Menendez, R.; Bhaik, A. Application of deep learning on student engagement in e-learning environments. Comput. Electr. Eng. 2021, 93, 107277. [Google Scholar] [CrossRef] [PubMed]
- Yulina, S.; Elviyenti, M. An Exploratory Data Analysis for Synchronous Online Learning Based on AFEA Digital Images. J. Nas. Tek. Elektro Dan Teknol. Inf. 2022, 11, 114–120. [Google Scholar]
- Altuwairqi, K.; Jarraya, S.K.; Allinjawi, A.; Hammami, M. Student behavior analysis to measure engagement levels in online learning environments. Signal Image Video Process. 2021, 15, 1387–1395. [Google Scholar] [CrossRef]
- Kim, H.; Küster, D.; Girard, J.M.; Krumhuber, E.G. Human and machine recognition of dynamic and static facial expressions: Prototypicality, ambiguity, and complexity. Front. Psychol. 2023, 14, 1221081. [Google Scholar] [CrossRef] [PubMed]
- Mastorogianni, M.E.; Konstanti, S.; Dratsiou, I.; Bamidis, P.D. Masked emotions: Does children’s affective state influence emotion recognition? Front. Psychol. 2024, 15, 1329070. [Google Scholar] [CrossRef]
- Peng, S.; Nagao, K. Recognition of students’ mental states in discussion based on multimodal data and its application to educational support. IEEE Access 2021, 9, 18235–18250. [Google Scholar] [CrossRef]
- Vanneste, P.; Oramas, J.; Verelst, T.; Tuytelaars, T.; Raes, A.; Depaepe, F.; Van den Noortgate, W. Computer vision and human behaviour, emotion and cognition detection: A use case on student engagement. Mathematics 2021, 9, 287. [Google Scholar] [CrossRef]
- Luo, Z.; Chen, J.; Wang, G.; Liao, M. A three-dimensional model of student interest during learning using multimodal fusion with natural sensing technology. Interact. Learn. Environ. 2022, 30, 1117–1130. [Google Scholar] [CrossRef]
- Zheng, R.; Jiang, F.; Shen, R. Intelligent student behavioral analysis system for real classrooms. In Proceedings of the ICASSP 2020–2020 IEEE International Conference on Acoustics, Speech, and Signal Processing (ICASSP), Barcelona, Spain, 4–8 May 2020; IEEE: Piscataway, NJ, USA, 2020; pp. 9244–9248. [Google Scholar]
- Ashwin, T.S.; Guddeti, R.M. Unobtrusive behavioral analysis of students in classroom environment using non-verbal cues. IEEE Access 2019, 7, 150693–150709. [Google Scholar] [CrossRef]
- Soloviev, V. Machine learning approach for student engagement automatic recognition from facial expressions. Sci. Publ. State Univ. Novi Pazar Ser. A Appl. Math. Inform. Mech. 2018, 10, 79–86. [Google Scholar] [CrossRef]
- Zhang, Z.; Fort, J.M.; Mateu, L.G. Facial expression recognition in virtual reality environments: Challenges and opportunities. Front. Psychol. 2023, 14, 1280136. [Google Scholar] [CrossRef]
- Muarraf, A.; Ahmad, H.; Ahmad, W.; Faisal, N.; Ahmad, M. Research Trend Analysis of Artificial Intelligence. In Proceedings of the 2020 30th International Conference on Computer Theory and Applications (ICCTA), Alexandria, Egypt, 12–14 December 2020; IEEE: Piscataway, NJ, USA, 2020; pp. 49–53. [Google Scholar]
- Maheen, F.; Asif, M.; Ahmad, H.; Ahmad, S.; Alturise, F.; Asiry, O.; Ghadi, Y.Y. Automatic computer science domain multiple-choice questions generation based on informative sentences. PeerJ Comput. Sci. 2022, 8, e1010. [Google Scholar] [CrossRef]
- Rashid, H.U.; Ibrikci, T.; Paydaş, S.; Binokay, F.; Çevik, U. Analysis of breast cancer classification robustness with radiomics feature extraction and deep learning techniques. Expert Syst. 2022, 39, e13018. [Google Scholar] [CrossRef]
- Thakur, A.; Aggarwal, P.; Dubey, A.K.; Abdelgawad, A.; Rocha, A. Design of decision model for sensitive crop irrigation system. Expert Syst. 2022, 40, e13119. [Google Scholar] [CrossRef]
- Nyangaresi, V.O.; Ahmad, M.; Alkhayyat, A.; Feng, W. Artificial Neural Network and Symmetric Key Cryptography Based Verification Protocol for 5G Enabled Internet of Things. Expert Syst. 2022, 39, e13126. [Google Scholar] [CrossRef]
- Asif, M.; Ishtiaq, A.; Ahmad, H.; Aljuaid, H.; Shah, J. Sentiment analysis of extremism in social media from textual information. Telemat. Inform. 2020, 48, 101345. [Google Scholar] [CrossRef]
- Ahmad, H.; Nasir, F.; Faisal, C.M.N.; Ahmad, S. Depression Detection in Online Social Media Users Using Natural Language Processing Techniques. In Handbook of Research on Opinion Mining and Text Analytics on Literary Works and Social Media; IGI Global: Hershey, PA, USA, 2022; pp. 323–347. [Google Scholar]
- Ahmad, H.; Ahmad, S.; Asif, M.; Rehman, M.; Alharbi, A.; Ullah, Z. Evolution-based performance prediction of star cricketers. Comput. Mater. Contin. 2021, 69, 1215–1232. [Google Scholar] [CrossRef]
- Teng, Y.; Zhang, J.; Sun, T. Data-driven decision-making model based on artificial intelligence in higher education system of colleges and universities. Expert Syst. 2022, 40, e12820. [Google Scholar] [CrossRef]
- Gamulin, J.; Gamulin, O.; Kermek, D. Using Fourier coefficients in time series analysis for student performance prediction in blended learning environments. Expert Syst. 2016, 33, 189–200. [Google Scholar] [CrossRef]
- Sunaryono, D.; Siswantoro, J.; Anggoro, R. An android based course attendance system using face recognition. J. King Saud Univ.-Comput. Inf. Sci. 2021, 33, 304–312. [Google Scholar] [CrossRef]
- Karimah, S.N.; Hasegawa, S. Automatic engagement estimation in smart education/learning settings: A systematic review of engagement definitions, datasets, and methods. Smart Learn. Environ. 2022, 9, 31. [Google Scholar] [CrossRef]
- Wang, S.; Liu, Z.; Lv, S.; Lv, Y.; Wu, G.; Peng, P.; Chen, F.; Wang, X. A natural visible and infrared facial expression database for expression recognition and emotion inference. IEEE Trans. Multimed. 2010, 12, 682–691. [Google Scholar] [CrossRef]
- Gupta, A.; D’Cunha, A.; Awasthi, K.; Balasubramanian, V. DAiSEE: Towards user engagement recognition in the wild. arXiv 2016, arXiv:1609.01885. [Google Scholar]
- Dhall, A.; Sharma, G.; Goecke, R.; Gedeon, T. Emotiw 2020: Driver gaze, group emotion, student engagement and physiological signal-based challenges. In Proceedings of the 2020 International Conference on Multimodal Interaction, Utrecht, The Netherlands, 25–29 October 2020; pp. 784–789. [Google Scholar]
- Dubovi, I. Cognitive and emotional engagement while learning with VR: The perspective of multimodal methodology. Comput. Educ. 2022, 183, 104495. [Google Scholar] [CrossRef]
- Ashwin, T.S.; Guddeti, R.M.R. Automatic detection of students’ affective states in classroom environment using hybrid convolutional neural networks. Educ. Inf. Technol. 2020, 25, 1387–1415. [Google Scholar]
- Apicella, A.; Arpaia, P.; Frosolone, M.; Improta, G.; Moccaldi, N.; Pollastro, A. EEG-based measurement system for monitoring student engagement in learning 4.0. Sci. Rep. 2022, 12, 5857. [Google Scholar] [CrossRef]
- Goldberg, P.; Sümer, Ö.; Stürmer, K.; Wagner, W.; Göllner, R.; Gerjets, P.; Kasneci, E.; Trautwein, U. Attentive or not? Toward a machine learning approach to assessing students’ visible engagement in classroom instruction. Educ. Psychol. Rev. 2021, 33, 27–49. [Google Scholar] [CrossRef]
- Baltrušaitis, T.; Robinson, P.; Morency, L.-P. OpenFace: An open source facial behavior analysis toolkit. In Proceedings of the 2016 IEEE Winter Conference on Applications of Computer Vision (WACV), Lake Placid, NY, USA, 7–10 March 2016; IEEE: Piscataway, NJ, USA, 2016; pp. 1–10. [Google Scholar]
- Abedi, A.; Khan, S. Affect-driven engagement measurement from videos. Computer 2021, 11, 12. [Google Scholar] [CrossRef]
- Mehta, N.K.; Prasad, S.S.; Saurav, S.; Saini, R.; Singh, S. Three-dimensional DenseNet self-attention neural network for automatic detection of student’s engagement. Appl. Intell. 2022, 52, 13803–13823. [Google Scholar] [CrossRef]
- Thomas, C.; Sarma, K.A.P.; Gajula, S.S.; Jayagopi, D.B. Automatic prediction of presentation style and student engagement from videos. Comput. Educ. Artif. Intell. 2022, 3, 100079. [Google Scholar] [CrossRef]
- Acharya, S.; Reza, M. Real-time emotion engagement tracking of students using human biometric emotion intensities. In Machine Learning for Biometrics; Academic Press: Cambridge, MA, USA, 2022; pp. 143–153. [Google Scholar]
- Li, Y.-T.; Yeh, S.-L.; Huang, T.-R. The cross-race effect in automatic facial expression recognition violates measurement invariance. Front. Psychol. 2023, 14, 1201145. [Google Scholar] [CrossRef]
- Ikram, S.; Ahmad, H.; Mahmood, N.; Faisal, C.M.N.; Abbas, Q.; Qureshi, I.; Hussain, A. Recognition of student engagement state in a classroom environment using deep and efficient transfer learning algorithm. Appl. Sci. 2023, 13, 8637. [Google Scholar] [CrossRef]
- Pan, M.; Wang, J.; Luo, Z. Modelling study on learning affects for classroom teaching/learning auto-evaluation. Sci. J. Educ. 2018, 6, 81–86. [Google Scholar] [CrossRef]
- Abedi, A.; Khan, S.S. Improving state-of-the-art in detecting student engagement with resnet and tcn hybrid network. In Proceedings of the 2021 18th Conference on Robots and Vision (CRV), Burnaby, BC, Canada, 26–28 May 2021; IEEE: Piscataway, NJ, USA, 2021; pp. 151–157. [Google Scholar]
References | Dataset | No. of Participants | Features | Input | Classification | Methodology | Student Engagement Level | Test Accuracy |
---|---|---|---|---|---|---|---|---|
[2] | Self-generated | 61 | Facial expression, eye-tracking, EDA data | Dedicated sensors | Emotional engagement, cognitive engagement | Linear mixed-effects model for facial, ANOVA for eye tracking | No | 51% |
[13] | Self-generated, BAUM-1, DAiSEE, and YawDD | 50 | Bored, confused, focused, frustrated, yawning, sleepy | Images | Low, medium, and high engagement | CNN | No | 76.9% |
[44] | DAiSEE | 112 | Eye-gaze, FAU, head pose, body pose | Images | Completely disengaged, barely engaged, engaged, and highly engaged | Neural Turing machine | No | 61% |
[45] | Self-generated | 50 | Facial expressions, body postures | Images | Engaged, non-engaged, and neutral | Inception V3 | No | 86% |
[46] | DAiSEE and EmotiW | 112 | Gaze direction and head pose | Images | Low- and high-level engagement | LSTM and TCN | No | 63% |
[49] | Self-generated | 21 | EEG signals and performance tests | EEG Signal | Emotion level, cognitive level | SVM | No | 76.7% |
Dataset | Features | No. of Frames
---|---|---
Behavioral | Closed eyes | 648
Behavioral | Focused | 723
Behavioral | Looking away | 650
Behavioral | Yawning | 600
Emotional | Happy | 710
Emotional | Sad | 708
Emotional | Angry | 500
Emotional | Neutral | 550
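Sections 3.2.1 and 3.2.2 describe how the frames above were obtained from classroom video and then augmented. The sketch below is a rough illustration of such a pipeline, not the authors’ released code: it assumes OpenCV for decoding, an illustrative sampling stride, and an assumed output naming scheme, and it resizes frames to the 155 × 155 input size listed in the parameter table further down.

```python
import os
import cv2  # OpenCV for video decoding


def extract_frames(video_path: str, out_dir: str, stride: int = 30) -> int:
    """Save every `stride`-th frame of a classroom video as a JPEG.

    The stride of 30 (roughly one frame per second at 30 fps) is an
    illustrative assumption, not the paper's reported setting.
    """
    os.makedirs(out_dir, exist_ok=True)
    cap = cv2.VideoCapture(video_path)
    saved, index = 0, 0
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        if index % stride == 0:
            # Resize to the 155x155 input size from the parameter table.
            frame = cv2.resize(frame, (155, 155))
            cv2.imwrite(os.path.join(out_dir, f"frame_{saved:05d}.jpg"), frame)
            saved += 1
        index += 1
    cap.release()
    return saved
```

Augmentation (Section 3.2.2) would then be applied to the saved frames; typical choices in this setting are flips, small rotations, and brightness shifts.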
Feature Category | Feature | Engaged | Non-Engaged | Scale (1–10) Average Score
---|---|---|---|---
Behavior-reflecting features | Looking away | -- | 69% | 5.5
Behavior-reflecting features | Yawning | -- | 71% | 5.5
Behavior-reflecting features | Focused | 92% | -- | 7.7
Behavior-reflecting features | Closed eyes | -- | 98% | 6.2
Emotion-reflecting features | Sad | 69% | -- | 6.6
Emotion-reflecting features | Happy | 88% | -- | 8.0
Emotion-reflecting features | Angry | 75% | -- | 6.2
Emotion-reflecting features | Neutral | 77% | -- | 6.8
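The survey-derived average scores above feed into the engagement computation of Section 4.4 as (SAS)_ES values. A minimal lookup sketch, assuming the scores are keyed by detected class name (the dictionary simply transcribes the table; the key normalization is an assumption):

```python
# Survey-derived average scores (1-10 scale), transcribed from the table above.
SURVEY_SCORES = {
    "looking_away": 5.5,
    "yawning": 5.5,
    "focused": 7.7,
    "closed_eyes": 6.2,
    "sad": 6.6,
    "happy": 8.0,
    "angry": 6.2,
    "neutral": 6.8,
}


def survey_score(detected_class: str) -> float:
    """Return the survey average score for a detected behavior or emotion."""
    return SURVEY_SCORES[detected_class.lower().replace(" ", "_")]
```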
Parameters | Values |
---|---|
Epochs | 200 |
Batch size | 16 |
Activation function | ReLU |
Learning rate | 0.0001 |
Image size | 155 × 155 × 3 |
Optimizer | Adam |
Binary-class loss function | Binary cross-entropy |
Multi-class loss function | Categorical cross-entropy |
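For concreteness, the parameter table above maps directly onto a standard Keras training configuration. The sketch below shows one plausible wiring of those settings; it is an assumption about typical usage, not the authors’ released code, and the convolutional body is a placeholder rather than the paper’s architecture.

```python
import tensorflow as tf
from tensorflow.keras import layers, models

# Placeholder CNN body; the paper's exact architecture is not reproduced here.
model = models.Sequential([
    layers.Input(shape=(155, 155, 3)),        # image size from the table
    layers.Conv2D(32, 3, activation="relu"),  # ReLU activation from the table
    layers.MaxPooling2D(),
    layers.Flatten(),
    layers.Dense(64, activation="relu"),
    layers.Dense(1, activation="sigmoid"),    # binary head: engaged vs. not
])

model.compile(
    optimizer=tf.keras.optimizers.Adam(learning_rate=1e-4),  # Adam, lr 0.0001
    loss="binary_crossentropy",  # categorical_crossentropy for the multi-class task
    metrics=["accuracy"],
)

# Training call with the tabulated epoch and batch settings (data arrays assumed):
# model.fit(train_images, train_labels, epochs=200, batch_size=16,
#           validation_data=(val_images, val_labels))
```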
Model | Training Accuracy (%) | Validation Accuracy (%) | Training Loss | Validation Loss | Testing Accuracy (%) | Optimal Solution |
---|---|---|---|---|---|---|
CNN | 97 | 91 | 0.12 | 0.15 | 83 | Yes |
VGG16 | 91 | 85 | 0.22 | 0.26 | 76 | No |
Inception V3 | 93 | 80 | 0.28 | 0.46 | 69 | No |
ResNet50 | 90 | 81 | 0.23 | 0.29 | 71 | No |
Model | Training Accuracy (%) | Validation Accuracy (%) | Training Loss | Validation Loss | Testing Accuracy (%) | Optimal Solution |
---|---|---|---|---|---|---|
CNN | 92 | 86 | 0.14 | 0.26 | 70 | No |
VGG16 | 91 | 80 | 0.21 | 0.26 | 62 | No |
Inception V3 | 85 | 79 | 0.24 | 0.46 | 58 | No |
ResNet50 | 95 | 90 | 0.15 | 0.19 | 82 | Yes |
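ResNet50 emerges as the optimal model for the four-class emotion task. A minimal transfer-learning sketch under assumed settings: an ImageNet-pretrained backbone frozen while a new four-class softmax head (happy, sad, angry, neutral) is trained. The freezing strategy and head dimensions are assumptions, not the paper’s stated procedure.

```python
import tensorflow as tf
from tensorflow.keras import layers, models
from tensorflow.keras.applications import ResNet50

# ImageNet-pretrained backbone with the top classifier removed.
base = ResNet50(weights="imagenet", include_top=False, input_shape=(155, 155, 3))
base.trainable = False  # assumed: freeze the backbone, train only the new head

model = models.Sequential([
    base,
    layers.GlobalAveragePooling2D(),
    layers.Dense(128, activation="relu"),
    layers.Dense(4, activation="softmax"),  # happy, sad, angry, neutral
])

model.compile(
    optimizer=tf.keras.optimizers.Adam(learning_rate=1e-4),
    loss="categorical_crossentropy",  # multi-class loss from the parameter table
    metrics=["accuracy"],
)
```

The same pattern applies to VGG16 and Inception V3 by swapping the backbone import.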
Detection Type | Testing Accuracy | Class | Precision | Recall | F-Measure
---|---|---|---|---|---
Behavior detection using CNN | 0.83 | Engaged | 0.84 | 0.82 | 0.83
Behavior detection using CNN | 0.83 | Non-Engaged | 0.82 | 0.84 | 0.83
Emotion detection using ResNet50 | 0.82 | Happy | 0.80 | 0.78 | 0.79
Emotion detection using ResNet50 | 0.82 | Sad | 0.85 | 0.82 | 0.83
Emotion detection using ResNet50 | 0.82 | Angry | 0.82 | 0.80 | 0.81
Emotion detection using ResNet50 | 0.82 | Neutral | 0.82 | 0.89 | 0.86
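Per-class precision, recall, and F-measure of this kind can be recomputed from raw predictions with scikit-learn. The label lists below are dummy stand-ins mirroring the emotion classes in the table, not the study’s test data:

```python
from sklearn.metrics import classification_report

# Dummy stand-ins: ground-truth and predicted emotion labels for a test set.
y_true = ["happy", "sad", "angry", "neutral", "happy", "neutral"]
y_pred = ["happy", "sad", "angry", "neutral", "sad", "neutral"]

# Per-class precision, recall, and F1, matching the structure of the table above.
print(classification_report(y_true, y_pred, digits=2))
```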
Sample Image Frame | (MCS_i)_B | Detected Emotional State | (MCS_i)_E | (SAS)_ES | SEL
---|---|---|---|---|---
Frame 1 (image omitted) | 7.5 | Happy | 8.1 | 7.1 | 7.5
Frame 2 (image omitted) | NA | NA | NA | NA | Not engaged
Frame 3 (image omitted) | 6.5 | Angry | 7.0 | 7.0 | 6.8
Frame 4 (image omitted) | 8.5 | Sad | 6.0 | 6.0 | 6.5
Frame 5 (image omitted) | 7.0 | Neutral | 8.0 | 6.0 | 6.9
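The table combines a behavioral mean class score (MCS_i)_B, an emotional mean class score (MCS_i)_E, and the survey average score (SAS)_ES into a single student engagement level (SEL). The paper’s exact aggregation formula (Section 4.4) is not reproduced here; the sketch below uses a simple weighted mean purely as an illustration, and the weights are invented for the sketch.

```python
def engagement_level(mcs_b, mcs_e, sas_es, weights=(0.4, 0.3, 0.3)):
    """Illustrative aggregation of behavior, emotion, and survey scores.

    The weights are assumptions made for this sketch; the paper's actual
    SEL formula may differ. Returns None when behavior detection reports
    the frame as not engaged (the NA row in the table).
    """
    if mcs_b is None:
        return None  # corresponds to the "Not engaged" row
    w_b, w_e, w_s = weights
    return round(w_b * mcs_b + w_e * mcs_e + w_s * sas_es, 1)


# Example with the first row of the table (scores from the table, weights assumed):
print(engagement_level(7.5, 8.1, 7.1))
```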