User Affect Elicitation with a Socially Emotional Robot
Abstract
1. Introduction
2. Related Work on Affect Elicitation Using Robots
2.1. Coded Affect
2.2. Self-Reported Affect
2.3. Use of Both Self-Reported and Coded Affect
3. A User Affect Elicitation Methodology Using a Social Robot
3.1. Affect Elicitation
3.2. Affect Detection
3.2.1. Physiological Responses
EEG Feature Extraction
3.2.2. Self-Assessment
3.2.3. Affect Detection Model
4. Experiments
4.1. Affect Elicitation Results
4.2. Affect Detection Models
5. Conclusions
Author Contributions
Funding
Acknowledgments
Conflicts of Interest
References
Songs used for affect elicitation, grouped by target affect type:

| Affect Type | Song Title | Artist |
|---|---|---|
| Positive Valence, High Arousal | Tennessee Hayride | Jason Shaw |
| | Runtime Error | Peter Sharp |
| | Night Drive | Decktonic |
| | Songe D’Automne | Latché Swing |
| | Requiem for a Fish | The Freak Fandango Orchestra |
| Negative Valence, Low Arousal | Eight | Marcel Pequel |
| | One | Marcel Pequel |
| | Seven | Marcel Pequel |
| | Moonlight and Roses | Lee Rosevere |
| | LA | Julian Winter |
Self-assessed valence and arousal ratings (mean ± standard deviation) for the positive-valence, high-arousal (PH) and negative-valence, low-arousal (NL) sessions, reported for all participants, older adults (OA), and young adults (YA):

| Age Group | PH Valence | PH Arousal | NL Valence | NL Arousal |
|---|---|---|---|---|
| All | 1.37 ± 0.68 | 0.63 ± 1.12 | −0.74 ± 1.24 | −0.58 ± 0.96 |
| OA | 2.00 ± 0.00 | 1.17 ± 0.98 | −0.33 ± 1.96 | 0.00 ± 0.00 |
| YA | 1.08 ± 0.64 | 0.38 ± 1.12 | −0.92 ± 0.75 | −0.85 ± 1.07 |
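The mean ± SD entries above are descriptive statistics over per-participant ratings. A minimal sketch of that aggregation, assuming a fabricated `ratings` table (the study's raw self-assessment data are not reproduced here):

```python
# Aggregate per-participant self-assessment ratings into mean +/- SD
# per age group and session. All values below are fabricated placeholders.
import pandas as pd

ratings = pd.DataFrame({
    "participant": [1, 1, 2, 2, 3, 3, 4, 4],
    "age_group":   ["OA", "OA", "OA", "OA", "YA", "YA", "YA", "YA"],
    "session":     ["PH", "NL", "PH", "NL", "PH", "NL", "PH", "NL"],
    "valence":     [2.0, -0.5, 2.0, -1.0, 1.0, -1.0, 1.5, -0.5],
    "arousal":     [1.0,  0.0, 1.5,  0.0, 0.5, -1.0, 0.0, -1.5],
})

# Group by age group and session, then compute mean and SD per rating scale.
summary = (ratings
           .groupby(["age_group", "session"])[["valence", "arousal"]]
           .agg(["mean", "std"]))
print(summary)
```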
Affect detection accuracies of the neural network (NN) and support vector machine (SVM) classifiers under each validation method:

| Method | Data Set | Valence (NN) | Valence (SVM) | Arousal (NN) | Arousal (SVM) |
|---|---|---|---|---|---|
| Ten-fold Cross-Validation | Training set | 71.9% | 70.1% | 70.6% | 69.5% |
| Leave-One-Out (LOO) Cross-Validation | Training set | 63.7% | 61.8% | 63.3% | 61.6% |
| Subject-Independent Testing | Testing set | 63.3% | 62.4% | 62.6% | 61.2% |
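The three validation methods correspond to standard cross-validation workflows, all available in scikit-learn, which the paper cites. Below is a minimal sketch under stated assumptions: the features, labels, per-participant groups, and classifier settings are synthetic placeholders (only an SVM is shown; the same calls apply to an `MLPClassifier` for the NN), and "LOO" is read here as leave-one-participant-out.

```python
# A minimal scikit-learn sketch of the three evaluation protocols above.
# All data are synthetic placeholders (random features, random labels);
# the paper's actual EEG features, labels, and model settings are not shown.
import numpy as np
from sklearn.model_selection import (KFold, LeaveOneGroupOut,
                                     GroupShuffleSplit, cross_val_score)
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
n_subjects, n_windows, n_features = 15, 40, 16               # hypothetical sizes
X = rng.normal(size=(n_subjects * n_windows, n_features))
y = rng.integers(0, 2, size=len(X))                          # e.g., low/high valence
groups = np.repeat(np.arange(n_subjects), n_windows)         # window -> participant

model = make_pipeline(StandardScaler(), SVC(kernel="rbf"))

# 1) Ten-fold cross-validation: folds may mix windows from the same subject.
tenfold = cross_val_score(model, X, y, cv=KFold(10, shuffle=True, random_state=0))

# 2) LOO cross-validation, read here as leave-one-participant-out.
loo = cross_val_score(model, X, y, groups=groups, cv=LeaveOneGroupOut())

# 3) Subject-independent testing: hold out whole subjects as a test set.
train_idx, test_idx = next(GroupShuffleSplit(n_splits=1, test_size=0.2,
                                             random_state=0).split(X, y, groups))
model.fit(X[train_idx], y[train_idx])
subj_indep = model.score(X[test_idx], y[test_idx])

print(f"ten-fold: {tenfold.mean():.1%}  LOO: {loo.mean():.1%}  "
      f"subject-independent: {subj_indep:.1%}")
```

The gap between the ten-fold scores and the other two rows is the expected pattern when folds mix windows from the same subject: per-subject grouping removes that leakage, so the LOO and subject-independent figures better reflect generalization to unseen users.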
© 2020 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).