The Challenge of Emotions—An Experimental Approach to Assess the Emotional Competence of People with Intellectual Disabilities
Abstract
1. Introduction
2. Materials and Methods
2.1. Study Design and Procedure
2.2. Sample
2.3. Measures
2.3.1. IAPS and SAM Scale
2.3.2. SAMY Scale
2.3.3. SEED Scale
- Dealing with our own body;
- Interaction with caregivers;
- Dealing with environmental changes;
- Differentiation of emotions;
- Interaction with peers;
- Dealing with material activities;
- Communication;
- Emotion regulation.
- Emotions can be directed through the attention of caregivers (Phase II).
- The person is able to identify his or her own basic emotions (e.g., anger, sadness, happiness, fear) (Phase III).
- Phase 1 (reference age 0–6 months; first adaption);
- Phase 2 (reference age 7–18 months; first socialization);
- Phase 3 (reference age 19–36 months; first individualization);
- Phase 4 (reference age 37–84 months; first identification);
- Phase 5 (reference age 85–156 months; incipient awareness of reality).
2.3.4. Automated Facial Expression Analysis
2.4. Statistical Analysis and Data Processing
3. Results
3.1. SEED Scale
3.2. SAMY Scale and SAM Scale
3.3. Automated Facial Expression Analysis
4. Discussion
- First, to assess the emotional development of the participants with ID and to investigate how effectively the emotional expressions of other persons or emotional situations can be evaluated, how well one's own emotions can be expressed, and how accurately one's own emotions can be assessed via self-report.
- Third, to investigate in a pilot-like analysis whether these self-reported emotional reactions match the emotions expressed in the participants' faces, using the automated facial expression analysis of Affectiva (Affdex SDK).
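The agreement check described in the third aim can, in principle, be sketched as follows. This is a purely illustrative sketch with hypothetical picture IDs and classifier scores; it does not reproduce the study's data or the Affdex SDK's actual API:

```python
# Hypothetical sketch: compare self-reported emotions with the emotion
# most strongly expressed in the face according to an Affdex-style
# classifier. All IDs, labels, and scores below are illustrative.

def dominant_emotion(scores: dict) -> str:
    """Return the emotion channel with the highest classifier score."""
    return max(scores, key=scores.get)

def agreement_rate(self_reports: dict, facial_scores: dict) -> float:
    """Fraction of pictures for which the self-reported emotion matches
    the dominant facially expressed emotion."""
    matches = sum(
        1 for pic, reported in self_reports.items()
        if dominant_emotion(facial_scores[pic]) == reported
    )
    return matches / len(self_reports)

if __name__ == "__main__":
    reported = {"1120": "fear", "2040": "happiness", "9301": "disgust"}
    facial = {
        "1120": {"fear": 0.62, "anger": 0.10, "happiness": 0.05},
        "2040": {"fear": 0.03, "anger": 0.02, "happiness": 0.88},
        "9301": {"disgust": 0.15, "happiness": 0.40, "fear": 0.05},
    }
    print(agreement_rate(reported, facial))  # 2 of 3 pictures agree
```

In a real pipeline the per-picture scores would come from frame-level classifier output aggregated over the picture's presentation window, rather than from hand-written dictionaries.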
5. Limitations
6. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
Appendix A
| IAPS No. | Description | Valence (M) | Valence (SD) | Emotion |
|---|---|---|---|---|
| 1120 | Snake | 9.98 | 21.42 | Fear |
| 1300 ¹ | Pit Bull | 10.01 | 22.96 | Fear |
| 2100 ¹ | Angry Face | 13.85 | 22.85 | Anger |
| 2110 | Angry Face | 4.44 | 8.56 | Anger |
| 2130 | Woman | 6.48 | 13.19 | Anger |
| 2301 | Kid Cry | 0.23 | 0.71 | Sadness |
| 2455 | Sad Girls | 6.48 | 14.49 | Sadness |
| 2456 | Crying Family | 2.93 | 4.67 | Sadness |
| 2457 | Crying Boy | 6.90 | 14.00 | Sadness |
| 2691 | Riot | 5.00 | 11.35 | Anger |
| 6200 | Aimed Gun | 0.38 | 0.88 | Fear |
| 6250.1 | Aimed Gun | 6.27 | 17.21 | Fear |
| 9290 | Garbage | 7.33 | 12.71 | Disgust |
| 9301 ¹ | Toilet | 12.21 | 19.51 | Disgust |
| 9362 | Vomit | 8.32 | 19.28 | Disgust |
| 9830 ¹ | Cigarettes | 12.98 | 23.57 | Disgust |
| Emotion | IAPS Picture Numbers |
|---|---|
| Fear | 1120 *, 1300, 6200 *, 6250.1 * |
| Happiness | 2000 *, 2020 *, 2040 *, 2347 |
| Sadness | 2455 *, 2456 *, 2457 *, 2301 |
| Anger | 2100, 2110 *, 2130 *, 2691 |
| Disgust | 9290, 9301, 9326, 9830 |
| Neutral content | 7000 *, 7010 *, 7012, 7025 |
| In total | 24 pictures |
| SEED Domain | Phase I | Phase II | Phase III | Phase IV | Phase V |
|---|---|---|---|---|---|
| Dealing with our own body | 1 | 0 | 9 | 14 | 4 |
| Interaction with caregivers | 0 | 5 | 5 | 14 | 4 |
| Dealing with environmental changes | 0 | 1 | 5 | 5 | 17 |
| Differentiation of emotions | 2 | 4 | 6 | 11 | 5 |
| Interaction with peers | 3 | 3 | 2 | 8 | 12 |
| Dealing with material activities | 1 | 1 | 2 | 8 | 16 |
| Communication | 1 | 1 | 1 | 9 | 16 |
| Emotion regulation | 2 | 4 | 8 | 7 | 7 |
| General emotional development | 0 | 3 | 4 | 16 | 5 |
Ratings by the participants with ID using the SAMY scale (left block), alongside the corresponding norm values from the IAPS technical report using the SAM scale (right block):

| IAPS No. | Order in Study | Description | n | SAMY Valence M | SAMY Valence SD | SAMY Arousal M | SAMY Arousal SD | SAM Valence M | SAM Valence SD | SAM Arousal M | SAM Arousal SD |
|---|---|---|---|---|---|---|---|---|---|---|---|
| 1120 | 20. | Snake | 29 | 3.21 ** | 2.64 | 6.24 | 3.00 | 3.79 | 1.93 | 6.93 | 1.68 |
| 1300 | 17. | Pit Bull | 29 | 2.52 ** | 2.37 | 7.21 | 2.64 | 3.55 | 1.78 | 6.79 | 1.84 |
| 2000 | 4. | Adult | 30 | 7.47 | 1.63 | 4.00 ** | 2.27 | 6.51 | 1.83 | 3.32 | 2.07 |
| 2020 | 16. | Adult | 29 | 7.28 | 2.31 | 4.17 ** | 2.30 | 5.68 | 1.99 | 3.34 | 1.89 |
| 2040 | 2. | Baby | 30 | 8.13 ** | 1.36 | 3.83 ** | 2.17 | 8.17 | 1.60 | 4.64 | 2.54 |
| 2100 | 19. | Angry Face | 29 | 3.76 ** | 2.75 | 6.31 | 2.63 | 3.85 | 1.99 | 4.53 | 2.57 |
| 2110 | 15. | Angry Face | 29 | 4.17 ** | 2.80 | 6.38 | 2.68 | 3.71 | 1.82 | 4.53 | 2.25 |
| 2130 | 21. | Woman | 29 | 3.41 ** | 2.23 | 6.31 | 2.35 | 4.08 | 1.33 | 5.02 | 2.00 |
| 2301 | 6. | Kid Cry | 29 | 3.14 ** | 2.39 | 5.55 * | 2.67 | 2.78 | 1.38 | 4.57 | 1.96 |
| 2347 | 24. | Children | 30 | 8.53 ** | 1.01 | 4.07 ** | 2.45 | 7.83 | 1.36 | 5.56 | 2.34 |
| 2455 | 22. | Sad Girls | 29 | 5.21 ** | 3.04 | 5.34 ** | 2.51 | 2.96 | 1.79 | 4.46 | 2.12 |
| 2456 | 18. | Crying Family | 30 | 3.20 ** | 2.37 | 4.93 ** | 2.70 | 2.84 | 1.27 | 4.55 | 2.16 |
| 2457 | 3. | Crying Boy | 29 | 3.14 ** | 2.50 | 5.90 | 2.30 | 3.20 | 1.51 | 4.94 | 2.01 |
| 2691 | 23. | Riot | 29 | 2.93 ** | 2.42 | 6.93 | 2.59 | 3.04 | 1.73 | 5.85 | 2.03 |
| 6200 | 10. | Aimed Gun | 27 | 2.33 ** | 2.22 | 7.59 | 2.14 | 3.20 | 1.62 | 5.82 | 1.99 |
| 6250.1 | 13. | Aimed Gun | 28 | 2.29 ** | 2.39 | 6.86 | 2.82 | 2.63 | 1.74 | 6.92 | 1.92 |
| 7000 | 14. | Rolling Pin | 28 | 6.07 | 2.34 | 5.14 ** | 2.37 | 5.00 | 0.84 | 2.42 | 1.79 |
| 7010 | 1. | Basket | 29 | 6.17 | 2.85 | 4.10 ** | 2.37 | 4.94 | 1.07 | 1.76 | 1.48 |
| 7012 | 11. | Rubberbands | 29 | 5.76 * | 2.29 | 4.59 ** | 2.16 | 4.98 | 1.05 | 3.00 | 1.94 |
| 7025 | 7. | Stool | 30 | 6.53 | 2.15 | 5.21 ** | 2.53 | 4.63 | 1.17 | 2.71 | 2.20 |
| 9290 | 9. | Garbage | 29 | 3.07 ** | 2.36 | 6.79 | 2.74 | 2.88 | 1.52 | 4.40 | 2.11 |
| 9301 | 8. | Toilet | 28 | 1.93 ** | 1.68 | 6.86 | 2.37 | 2.26 | 1.56 | 5.28 | 2.46 |
| 9326 | 12. | Vomit | 27 | 1.74 ** | 1.13 | 7.00 | 2.66 | 2.21 | 1.30 | 5.89 | 2.35 |
| 9830 | 5. | Cigarettes | 29 | 3.34 ** | 2.78 | 6.00 | 2.69 | 2.54 | 1.75 | 4.86 | 2.63 |
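The asterisks in the table flag SAMY means that deviate from the published SAM norms. As a purely illustrative aside, a sample mean can be compared against a fixed norm value from summary statistics alone via a one-sample t statistic; the sketch below plugs in the "Snake" row and makes no claim to reproduce the authors' actual statistical test:

```python
import math

# Generic one-sample t statistic computed from summary statistics
# (mean, SD, n) against a fixed norm value; df = n - 1. Whether this
# is the exact test behind the table's asterisks is not stated here.

def one_sample_t(mean: float, sd: float, n: int, norm_mean: float) -> float:
    """t statistic for a sample mean vs. a fixed norm value."""
    standard_error = sd / math.sqrt(n)
    return (mean - norm_mean) / standard_error

if __name__ == "__main__":
    # SAMY valence for IAPS 1120 ("Snake") vs. the SAM norm mean
    t = one_sample_t(mean=3.21, sd=2.64, n=29, norm_mean=3.79)
    print(round(t, 2))  # ≈ -1.18, df = 28
```

In practice one would look up (or compute) the p-value for the resulting t with df = n − 1, e.g. via `scipy.stats`, rather than interpreting the raw statistic.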
© 2022 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Hammann, T.; Schwartze, M.M.; Zentel, P.; Schlomann, A.; Even, C.; Wahl, H.-W.; Rietz, C. The Challenge of Emotions—An Experimental Approach to Assess the Emotional Competence of People with Intellectual Disabilities. Disabilities 2022, 2, 611-625. https://doi.org/10.3390/disabilities2040044