Emotion Detection Based on Pupil Variation
Abstract
1. Introduction
2. Methodology
2.1. Participants
2.2. Scenario and Emotion Evoking
2.3. Experimental Setting
2.4. Measurements and Data Processing
2.5. Feature Selection and Classification Model
3. Results
4. Discussions
5. Conclusions
6. Limitation and Future Research
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Acknowledgments
Conflicts of Interest
References
| Pupil | Feature | Happiness | Surprise | Sadness | Disgust | Anger | Fear |
|---|---|---|---|---|---|---|---|
| Left | Minimum | 0.08 | 0.20 | 0.16 | 0.20 | 0.20 | 0.20 |
| | Maximum | 0.20 | 0.14 | 0.20 | 0.20 | 0.20 | 0.20 |
| | q1 | 0.20 | 0.20 | 0.19 | 0.20 | 0.07 | 0.20 |
| | q2 | 0.20 | 0.20 | 0.20 | 0.20 | 0.20 | 0.20 |
| | q3 | 0.20 | 0.20 | 0.20 | 0.20 | 0.20 | 0.20 |
| | Mean | 0.20 | 0.20 | 0.19 | 0.20 | 0.20 | 0.20 |
| | SD | 0.00 * | 0.10 | 0.01 * | 0.00 * | 0.00 * | 0.00 * |
| | Variance | 0.20 | 0.20 | 0.09 | 0.00 * | 0.03 * | 0.20 |
| Right | Minimum | 0.20 | 0.20 | 0.20 | 0.20 | 0.20 | 0.20 |
| | Maximum | 0.20 | 0.20 | 0.20 | 0.16 | 0.19 | 0.20 |
| | q1 | 0.20 | 0.11 | 0.20 | 0.20 | 0.20 | 0.20 |
| | q2 | 0.20 | 0.09 | 0.20 | 0.20 | 0.07 | 0.20 |
| | q3 | 0.20 | 0.05 | 0.20 | 0.20 | 0.14 | 0.20 |
| | Mean | 0.20 | 0.13 | 0.20 | 0.20 | 0.20 | 0.20 |
| | SD | 0.00 * | 0.09 | 0.00 * | 0.00 * | 0.00 * | 0.02 * |
| | Variance | 0.11 | 0.04 * | 0.08 | 0.03 * | 0.03 * | 0.06 |
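The tables in this section are organized around eight summary statistics of the pupil-diameter trace for each eye and each of the six emotions: minimum, maximum, the quartiles q1–q3, mean, standard deviation, and variance. As a point of reference only, the sketch below shows how such features could be computed with NumPy from a single trace; the function name, sampling rate, and simulated signal are illustrative assumptions, not the authors' processing pipeline.

```python
import numpy as np

def pupil_features(diameter_mm: np.ndarray) -> dict:
    """Eight summary features of one pupil-diameter trace
    (one eye, one emotion clip), matching the rows of the tables."""
    return {
        "minimum": float(np.min(diameter_mm)),
        "maximum": float(np.max(diameter_mm)),
        "q1": float(np.percentile(diameter_mm, 25)),
        "q2": float(np.percentile(diameter_mm, 50)),   # median
        "q3": float(np.percentile(diameter_mm, 75)),
        "mean": float(np.mean(diameter_mm)),
        "sd": float(np.std(diameter_mm, ddof=1)),      # sample standard deviation
        "variance": float(np.var(diameter_mm, ddof=1)),
    }

# Example with a simulated 30 s trace sampled at 60 Hz (values in mm).
rng = np.random.default_rng(0)
trace = 3.7 + 0.25 * rng.standard_normal(30 * 60)
print(pupil_features(trace))
```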
| Pupil (mm) | Feature | Happiness | Surprise | Sadness | Disgust | Anger | Fear |
|---|---|---|---|---|---|---|---|
| Left | Minimum | 2.25 (0.49) | 2.18 (0.53) | 2.36 (0.48) | 2.38 (0.52) | 2.6 (0.52) | 2.67 (0.54) |
| | Maximum | 4.5 (0.69) | 4.45 (0.69) | 4.66 (0.73) | 4.61 (0.77) | 4.76 (0.77) | 5.32 (0.66) |
| | q1 | 3.59 (0.62) | 3.4 (0.61) | 3.62 (0.6) | 3.57 (0.59) | 3.82 (0.68) | 4.31 (0.62) |
| | q2 | 3.74 (0.63) | 3.56 (0.61) | 3.78 (0.61) | 3.75 (0.61) | 3.99 (0.74) | 4.51 (0.64) |
| | q3 | 3.88 (0.64) | 3.72 (0.63) | 3.95 (0.61) | 3.91 (0.63) | 4.14 (0.77) | 4.69 (0.65) |
| | Mean | 3.72 (0.62) | 3.55 (0.62) | 3.79 (0.6) | 3.74 (0.59) | 3.96 (0.69) | 4.48 (0.62) |
| | Standard deviation | 0.24 (0.07) | 0.27 (0.06) | 0.2 (0.11) | 0.28 (0.12) | 0.28 (0.17) | 0.3 (0.09) |
| | Variance | 2.25 (0.54) | 2.27 (0.55) | 2.3 (0.58) | 2.23 (0.61) | 2.17 (0.7) | 2.65 (0.58) |
| Right | Minimum | 2.28 (0.43) | 2.23 (0.51) | 2.35 (0.37) | 2.28 (0.44) | 2.5 (0.48) | 2.72 (0.52) |
| | Maximum | 4.65 (0.75) | 4.57 (0.73) | 4.79 (0.68) | 4.7 (0.69) | 4.86 (0.88) | 5.4 (0.72) |
| | q1 | 3.62 (0.57) | 3.42 (0.58) | 3.64 (0.57) | 3.58 (0.53) | 3.85 (0.68) | 4.32 (0.6) |
| | q2 | 3.77 (0.59) | 3.58 (0.6) | 3.83 (0.6) | 3.77 (0.55) | 4.03 (0.76) | 4.54 (0.64) |
| | q3 | 3.91 (0.61) | 3.74 (0.62) | 4 (0.61) | 3.94 (0.58) | 4.19 (0.78) | 4.74 (0.65) |
| | Mean | 3.76 (0.58) | 3.57 (0.58) | 3.82 (0.57) | 3.76 (0.53) | 4 (0.69) | 4.51 (0.61) |
| | Standard deviation | 0.25 (0.09) | 0.27 (0.08) | 0.3 (0.14) | 0.3 (0.17) | 0.29 (0.19) | 0.32 (0.13) |
| | Variance | 2.37 (0.71) | 2.34 (0.75) | 2.43 (0.62) | 2.42 (0.73) | 2.36 (0.73) | 2.68 (0.66) |
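The descriptive table above pairs each value with a second figure in parentheses, which by convention is the standard deviation of that feature across observations. (Incidentally, every "Variance" entry equals the corresponding maximum minus minimum, so that feature may in practice be the range of the trace rather than a statistical variance.) Assuming the per-trial features are stored in a long-format table with hypothetical `eye` and `emotion` columns plus one column per feature, a pandas sketch of the aggregation could look like this; file name and column names are assumptions, not the authors' data layout.

```python
import pandas as pd

FEATURES = ["minimum", "maximum", "q1", "q2", "q3", "mean", "sd", "variance"]

# Hypothetical long-format file: one row per participant x eye x emotion,
# e.g. as produced by the feature-extraction sketch above.
df = pd.read_csv("pupil_features.csv")

# Mean and standard deviation of each feature, grouped by eye and emotion:
# the same layout as the descriptive table (mean with SD in parentheses).
summary = df.groupby(["eye", "emotion"])[FEATURES].agg(["mean", "std"]).round(2)
print(summary)
```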
| Pupil | Feature | F | Sig. | Duncan's Multiple Range Test |
|---|---|---|---|---|
| Left | Minimum | 4.175 | 0.00 * | Fear, anger > anger, disgust, sadness > disgust, sadness, happiness, surprise |
| | Maximum | 5.753 | 0.00 * | Fear > anger, sadness, disgust, happiness, surprise |
| | q1 | 7.947 | 0.00 * | Fear > anger, sadness, happiness, disgust > sadness, happiness, disgust, surprise |
| | q2 | 8.038 | 0.00 * | Fear > anger, sadness, disgust, happiness > sadness, disgust, happiness, surprise |
| | q3 | 8.058 | 0.00 * | Fear > anger, sadness, disgust, happiness > sadness, disgust, happiness, surprise |
| | Mean | 8.189 | 0.00 * | Fear > anger, sadness, disgust, happiness > sadness, disgust, happiness, surprise |
| Right | Minimum | 4.843 | 0.00 * | Fear, anger > anger, sadness, disgust, happiness > sadness, disgust, happiness, surprise |
| | Maximum | 4.767 | 0.00 * | Fear > anger, sadness, disgust, happiness, surprise |
| | q1 | 8.575 | 0.00 * | Fear > anger, sadness, happiness, disgust > sadness, happiness, disgust, surprise |
| | q2 | 8.593 | 0.00 * | Fear > anger, sadness, happiness, disgust > sadness, happiness, disgust, surprise |
| | q3 | 8.857 | 0.00 * | Fear > anger, sadness, disgust, happiness > sadness, disgust, happiness, surprise |
| | Mean | 9.045 | 0.00 * | Fear > anger, sadness, disgust, happiness > sadness, disgust, happiness, surprise |
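The F and Sig. columns above correspond to one-way ANOVAs comparing the six emotions on each pupil feature, with Duncan's multiple range test providing the post-hoc groupings. A minimal SciPy sketch of the ANOVA step is given below on simulated per-participant values; group sizes, means, and spreads are placeholders, not the study's data. Duncan's test has no standard SciPy/statsmodels implementation, so reproducing the last column would require a dedicated routine or a substitute post-hoc procedure such as statsmodels' `pairwise_tukeyhsd`.

```python
import numpy as np
from scipy import stats

emotions = ["happiness", "surprise", "sadness", "disgust", "anger", "fear"]

# One array per emotion: e.g. each participant's mean left-pupil diameter (mm)
# for that emotion. Simulated here purely for illustration.
rng = np.random.default_rng(1)
groups = [3.5 + 0.15 * i + 0.6 * rng.standard_normal(30) for i in range(len(emotions))]

f_stat, p_value = stats.f_oneway(*groups)  # one-way ANOVA across the six emotions
print(f"F = {f_stat:.3f}, p = {p_value:.4f}")
```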
| Classifier | AUC | CA (Classification Accuracy) | F1 | Precision | Recall |
|---|---|---|---|---|---|
| K-nearest neighbor (KNN) | 0.675 | 0.500 | 0.507 | 0.574 | 0.500 |
| Decision Tree (DT) | 0.760 | 0.654 | 0.653 | 0.653 | 0.654 |
| Random Forest (RF) | 0.832 | 0.700 | 0.702 | 0.704 | 0.700 |
| Logistic Regression (LR) | 0.871 | 0.761 | 0.761 | 0.761 | 0.761 |
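The comparison above reports AUC, classification accuracy, F1, precision, and recall for the four classifiers. The sketch below shows how such a comparison could be reproduced with scikit-learn; the feature matrix, labels, fold count, scaling step, score averaging, and default hyperparameters are all illustrative assumptions and are not claimed to match the authors' setup.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_validate
from sklearn.neighbors import KNeighborsClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.tree import DecisionTreeClassifier

# Placeholder data: per-trial pupil features and emotion labels.
# Shapes and class count are illustrative assumptions only.
rng = np.random.default_rng(2)
X = rng.normal(size=(180, 16))        # e.g. 8 features x 2 eyes per trial
y = rng.integers(0, 3, size=180)      # e.g. anger / fear / surprise

models = {
    "K-nearest neighbor (KNN)": KNeighborsClassifier(),
    "Decision Tree (DT)": DecisionTreeClassifier(random_state=0),
    "Random Forest (RF)": RandomForestClassifier(random_state=0),
    "Logistic Regression (LR)": LogisticRegression(max_iter=1000),
}

# The same five metrics as the table; weighted averaging and one-vs-rest AUC
# are assumptions about how multi-class scores would be aggregated.
scoring = {"AUC": "roc_auc_ovr", "CA": "accuracy", "F1": "f1_weighted",
           "Precision": "precision_weighted", "Recall": "recall_weighted"}

for name, model in models.items():
    clf = make_pipeline(StandardScaler(), model)
    scores = cross_validate(clf, X, y, cv=10, scoring=scoring)
    report = ", ".join(f"{m}: {scores['test_' + m].mean():.3f}" for m in scoring)
    print(f"{name} -> {report}")
```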
| True \ Predicted | Anger | Fear | Surprise |
|---|---|---|---|
| Anger | 60.0% | 40.0% | 0.0% |
| Fear | 36.7% | 63.3% | 0.0% |
| Surprise | 0.0% | 0.0% | 100.0% |
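The confusion matrix above is row-normalized: each row of true labels sums to 100%. A small scikit-learn sketch of producing such a matrix follows; the label counts are chosen only so that the resulting percentages match the table and are not the study's actual test-set sizes.

```python
import numpy as np
from sklearn.metrics import confusion_matrix

# Placeholder labels whose counts merely reproduce the percentages above.
y_true = ["anger"] * 30 + ["fear"] * 30 + ["surprise"] * 30
y_pred = (["anger"] * 18 + ["fear"] * 12      # true anger:    60.0% / 40.0%
          + ["anger"] * 11 + ["fear"] * 19    # true fear:     36.7% / 63.3%
          + ["surprise"] * 30)                # true surprise: 100.0%

labels = ["anger", "fear", "surprise"]
cm = confusion_matrix(y_true, y_pred, labels=labels, normalize="true")
print(np.round(cm * 100, 1))  # row-normalized percentages, as in the table
```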
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
© 2023 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Lee, C.-L.; Pei, W.; Lin, Y.-C.; Granmo, A.; Liu, K.-H. Emotion Detection Based on Pupil Variation. Healthcare 2023, 11, 322. https://doi.org/10.3390/healthcare11030322