A Preliminary Study on Realizing Human–Robot Mental Comforting Dialogue via Sharing Experience Emotionally
Abstract
1. Introduction
- We build an emotional voice conversion model to obtain ERICA’s emotional voice;
- We let the android robot ERICA provide scenario-based comforting to users by expressing corresponding emotions in its verbal behavior;
- We construct person-centered messages in the robot’s utterances by sharing related experiences/situations of other people drawn from historical human–robot interactions;
- We adopt a questionnaire-based evaluation with a Likert scale to examine the effectiveness of emotional experience sharing in comforting dialogues;
- In addition, we evaluate ERICA’s personality based on the Big Five personality traits [23].
2. Related Works
2.1. Human–Robot Comforting Interaction
2.2. Robot’s Emotion in Human–Robot Interaction
2.3. Audio Modality Emotional Expression for Robots
3. Investigation of Human–Human Interaction
4. Method
4.1. Emotional Voice
Algorithm 1: Training strategy.
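The emotional voice conversion model is trained with a cycle-consistent adversarial setup, following the CycleGAN-based EVC line of work cited in the references. The sketch below illustrates only the loss bookkeeping of such a setup; the matrix "generators", the toy discriminator score, and the loss weights are invented stand-ins, not the authors' architecture.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-ins: in practice the generators/discriminators are neural
# networks over mel-cepstral sequences; here they are single matrices so
# the loss computation is runnable end to end.
D = 24                                            # feature dimension
G_xy = rng.normal(0, 0.1, (D, D)) + np.eye(D)     # neutral -> emotional
G_yx = rng.normal(0, 0.1, (D, D)) + np.eye(D)     # emotional -> neutral

def lsgan_d_score(feat):
    """Toy least-squares 'discriminator': one scalar score per batch."""
    return float(np.tanh(feat.mean()))

def evc_losses(x_neutral, y_emotional, lam_cyc=10.0, lam_id=5.0):
    """Sum the three loss terms of a CycleGAN-style EVC generator step."""
    fake_y = x_neutral @ G_xy            # converted "emotional" features
    cyc_x = fake_y @ G_yx                # reconstructed "neutral" features
    id_y = y_emotional @ G_xy            # identity-mapped emotional input

    adv = (lsgan_d_score(fake_y) - 1.0) ** 2       # fool the discriminator
    cyc = np.abs(cyc_x - x_neutral).mean()         # cycle consistency (L1)
    idl = np.abs(id_y - y_emotional).mean()        # identity preservation
    return adv + lam_cyc * cyc + lam_id * idl

x = rng.normal(size=(50, D))    # 50 frames of "neutral" features
y = rng.normal(size=(50, D))    # 50 frames of "emotional" features
loss = evc_losses(x, y)
```

A real implementation replaces the matrices with neural generators and discriminators trained alternately on non-parallel neutral and emotional speech, which is what makes the cycle-consistency term necessary.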
4.2. Measurements
- Emotional expression:
- Extroversion:
4.3. Hypotheses
- H1: The proposed method reinforces ERICA’s ability to express emotion with its voice and obtains higher scores on emotional expression;
- H2: The proposed method improves the perceived empathy and encouragement;
- H3: When feeling down, people prefer to talk with the version of ERICA equipped with the proposed method;
- H4: Compared to a neutral voice, emotional expression in the voice can, to some extent, better shape the robot’s extroversion and openness.
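Hypotheses such as H1–H4 are typically tested by comparing Likert ratings between the two conditions with a nonparametric rank test; the paper's exact statistical procedure is not reproduced here. A self-contained Mann–Whitney U sketch in plain Python, with invented ratings:

```python
def rankdata(values):
    """Average 1-based ranks with ties shared -- enough for a U test."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    ranks = [0.0] * len(values)
    i = 0
    while i < len(values):
        j = i
        while j + 1 < len(values) and values[order[j + 1]] == values[order[i]]:
            j += 1                      # extend the tie group
        avg = (i + j) / 2 + 1           # average of positions i..j, 1-based
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    return ranks

def mann_whitney_u(a, b):
    """U statistics for two independent samples of ordinal ratings."""
    ranks = rankdata(list(a) + list(b))
    r_a = sum(ranks[: len(a)])
    u_a = r_a - len(a) * (len(a) + 1) / 2
    return u_a, len(a) * len(b) - u_a   # U for sample a and for sample b

# Hypothetical 7-point Likert ratings for "emotional expression" (H1)
proposed = [6, 7, 5, 6, 6, 7, 5, 6]
baseline = [4, 5, 3, 4, 5, 4, 4, 3]
u1, u2 = mann_whitney_u(proposed, baseline)
```

The smaller of the two U values is compared against a critical value (or converted to a p-value via the normal approximation) to decide whether the rating distributions differ.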
5. Experiment
5.1. Scenarios and Conditions
5.2. Procedures and Subjects
5.3. Results and Discussion
6. Where Next
6.1. The Effect of Experience Sharing
6.2. The Design of Robots’ Emotional Responses
6.3. Multi-Modality Emotional Expression
6.4. Gender Effects
6.5. Experiment with Practical Comforting Interaction
7. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
References
- Jeong, S.; Logan, D.E.; Goodwin, M.S.; Graca, S.; O’Connell, B.; Goodenough, H.; Anderson, L.; Stenquist, N.; Fitzpatrick, K.; Zisook, M.; et al. A social robot to mitigate stress, anxiety, and pain in hospital pediatric care. In Proceedings of the Tenth Annual ACM/IEEE International Conference on Human-Robot Interaction Extended Abstracts, Portland, OR, USA, 2–5 March 2015; pp. 103–104.
- Wada, K.; Shibata, T. Robot therapy in a care house-results of case studies. In Proceedings of the ROMAN 2006—The 15th IEEE International Symposium on Robot and Human Interactive Communication, Hatfield, UK, 6–8 September 2006; pp. 581–586.
- Han, M.J.; Lin, C.H.; Song, K.T. Robotic emotional expression generation based on mood transition and personality model. IEEE Trans. Cybern. 2012, 43, 1290–1303.
- Boccanfuso, L.; Wang, Q.; Leite, I.; Li, B.; Torres, C.; Chen, L.; Salomons, N.; Foster, C.; Barney, E.; Ahn, Y.A.; et al. A thermal emotion classifier for improved human-robot interaction. In Proceedings of the 2016 25th IEEE International Symposium on Robot and Human Interactive Communication (RO-MAN), New York, NY, USA, 26–31 August 2016; pp. 718–723.
- Cañamero, L. Emotion understanding from the perspective of autonomous robots research. Neural Netw. 2005, 18, 445–455.
- Miller, D. The Comfort of People; John Wiley & Sons: Hoboken, NJ, USA, 2017.
- Almeida, D.M.; Wethington, E.; McDonald, D.A. Daily variation in paternal engagement and negative mood: Implications for emotionally supportive and conflictual interactions. J. Marriage Fam. 2001, 63, 417–429.
- Gasser, L.; Grütter, J.; Buholzer, A.; Wettstein, A. Emotionally supportive classroom interactions and students’ perceptions of their teachers as caring and just. Learn. Instr. 2018, 54, 82–92.
- High, A.C.; Solomon, D.H. Motivational systems and preferences for social support strategies. Motiv. Emot. 2014, 38, 463–474.
- Tian, X.; Solomon, D.H.; Brisini, K.S.C. How the comforting process fails: Psychological reactance to support messages. J. Commun. 2020, 70, 13–34.
- Lee, D.; Oh, K.J.; Choi, H.J. The chatbot feels you-a counseling service using emotional response generation. In Proceedings of the 2017 IEEE International Conference on Big Data and Smart Computing (BigComp), Jeju, Korea, 13–16 February 2017; pp. 437–440.
- Ho, A.; Hancock, J.; Miner, A.S. Psychological, relational, and emotional effects of self-disclosure after conversations with a chatbot. J. Commun. 2018, 68, 712–733.
- Burleson, B.R. Explaining recipient responses to supportive messages. New Dir. Interpers. Commun. Res. 2010, 159, 179.
- High, A.C.; Dillard, J.P. A review and meta-analysis of person-centered messages and social support outcomes. Commun. Stud. 2012, 63, 99–118.
- Sabelli, A.M.; Kanda, T.; Hagita, N. A conversational robot in an elderly care center: An ethnographic study. In Proceedings of the 2011 6th ACM/IEEE International Conference on Human-Robot Interaction (HRI), Lausanne, Switzerland, 8–11 March 2011; pp. 37–44.
- Fu, C.; Yoshikawa, Y.; Iio, T.; Ishiguro, H. Sharing experiences to help a robot present its mind and sociability. Int. J. Soc. Robot. 2021, 13, 341–352.
- Fu, C.; Liu, C.; Ishi, C.T.; Yoshikawa, Y.; Iio, T.; Ishiguro, H. Using an Android Robot to Improve Social Connectedness by Sharing Recent Experiences of Group Members in Human-Robot Conversations. IEEE Robot. Autom. Lett. 2021, 6, 6670–6677.
- Leite, I.; Pereira, A.; Mascarenhas, S.; Martinho, C.; Prada, R.; Paiva, A. The influence of empathy in human–robot relations. Int. J. Hum.-Comput. Stud. 2013, 71, 250–260.
- Breazeal, C.; Berlin, M.; Brooks, A.; Gray, J.; Thomaz, A.L. Using perspective taking to learn from ambiguous demonstrations. Robot. Auton. Syst. 2006, 54, 385–393.
- Torrey, C.; Fussell, S.R.; Kiesler, S. What robots could teach us about perspective-taking. In Expressing Oneself/Expressing One’s Self: Communication, Cognition, Language, and Identity; Psychology Press: Hove, UK, 2009; pp. 93–106.
- Busso, C.; Bulut, M.; Lee, C.C.; Kazemzadeh, A.; Mower, E.; Kim, S.; Chang, J.N.; Lee, S.; Narayanan, S.S. IEMOCAP: Interactive emotional dyadic motion capture database. Lang. Resour. Eval. 2008, 42, 335–359.
- Poria, S.; Hazarika, D.; Majumder, N.; Naik, G.; Cambria, E.; Mihalcea, R. Meld: A multimodal multi-party dataset for emotion recognition in conversations. arXiv 2018, arXiv:1810.02508.
- Koshio, S.; Abe, S. Ten Item Personality Inventory (TIPI-J). Jpn. J. Personal. 2012, 21, 40–52. (In Japanese)
- Wada, K.; Shibata, T.; Musha, T.; Kimura, S. Robot therapy for elders affected by dementia. IEEE Eng. Med. Biol. Mag. 2008, 27, 53–60.
- Pipitpukdee, J.; Phantachat, W. The study of the pet robot therapy in Thai autistic children. In Proceedings of the 5th International Conference on Rehabilitation Engineering & Assistive Technology, Bangkok, Thailand, 21–23 July 2011; pp. 1–4.
- Aminuddin, R.; Sharkey, A.; Levita, L. Interaction with the Paro robot may reduce psychophysiological stress responses. In Proceedings of the 2016 11th ACM/IEEE International Conference on Human-Robot Interaction (HRI), Christchurch, New Zealand, 7–10 March 2016; pp. 593–594.
- Rossi, S.; Larafa, M.; Ruocco, M. Emotional and behavioural distraction by a social robot for children anxiety reduction during vaccination. Int. J. Soc. Robot. 2020, 12, 765–777.
- Baecker, A.N.; Geiskkovitch, D.Y.; González, A.L.; Young, J.E. Emotional support domestic robots for healthy older adults: Conversational prototypes to help with loneliness. In Proceedings of the Companion of the 2020 ACM/IEEE International Conference on Human-Robot Interaction, Cambridge, UK, 23–26 March 2020; pp. 122–124.
- Rosenthal-von der Pütten, A.M.; Krämer, N.C.; Herrmann, J. The effects of humanlike and robot-specific affective nonverbal behavior on perception, emotion, and behavior. Int. J. Soc. Robot. 2018, 10, 569–582.
- Paiva, A.; Leite, I.; Ribeiro, T. Emotion modeling for social robots. Oxf. Handb. Affect. Comput. 2014, 296–308.
- Xie, B.; Park, C.H. A MultiModal Social Robot Toward Personalized Emotion Interaction. arXiv 2021, arXiv:2110.05186.
- Graterol, W.; Diaz-Amado, J.; Cardinale, Y.; Dongo, I.; Lopes-Silva, E.; Santos-Libarino, C. Emotion detection for social robots based on NLP transformers and an emotion ontology. Sensors 2021, 21, 1322.
- Fu, C.; Liu, C.; Ishi, C.T.; Ishiguro, H. Multi-modality emotion recognition model with GAT-based multi-head inter-modality attention. Sensors 2020, 20, 4894.
- Hegel, F.; Spexard, T.; Wrede, B.; Horstmann, G.; Vogt, T. Playing a different imitation game: Interaction with an Empathic Android Robot. In Proceedings of the 2006 6th IEEE-RAS International Conference on Humanoid Robots, Genova, Italy, 4–6 December 2006; pp. 56–61.
- Riek, L.D.; Paul, P.C.; Robinson, P. When my robot smiles at me: Enabling human-robot rapport via real-time head gesture mimicry. J. Multimodal User Interfaces 2010, 3, 99–108.
- Marvin, R.S.; Greenberg, M.T.; Mossler, D.G. The early development of conceptual perspective taking: Distinguishing among multiple perspectives. Child Dev. 1976, 47, 511–514.
- Crumpton, J.; Bethel, C. Conveying emotion in robotic speech: Lessons learned. In Proceedings of the 23rd IEEE International Symposium on Robot and Human Interactive Communication, Edinburgh, UK, 25–29 August 2014; pp. 274–279.
- Nass, C.; Foehr, U.; Brave, S.; Somoza, M. The effects of emotion of voice in synthesized and recorded speech. In Proceedings of the AAAI Symposium Emotional and Intelligent II: The Tangled Knot of Social Cognition, North Falmouth, MA, USA, 2–4 November 2001.
- Roehling, S.; MacDonald, B.; Watson, C. Towards expressive speech synthesis in english on a robotic platform. In Proceedings of the Australasian International Conference on Speech Science and Technology, Auckland, New Zealand, 6–8 December 2006; Citeseer: Princeton, NJ, USA, 2006; pp. 130–135.
- Lee, J. Generating Robotic Speech Prosody for Human Robot Interaction: A Preliminary Study. Appl. Sci. 2021, 11, 3468.
- Williams, A.M.; Irurita, V.F. Emotional comfort: The patient’s perspective of a therapeutic context. Int. J. Nurs. Stud. 2006, 43, 405–415.
- Polanco-Roman, L.; Moore, A.; Tsypes, A.; Jacobson, C.; Miranda, R. Emotion reactivity, comfort expressing emotions, and future suicidal ideation in emerging adults. J. Clin. Psychol. 2018, 74, 123–135.
- Asai, S.; Yoshino, K.; Shinagawa, S.; Sakti, S.; Nakamura, S. Emotional speech corpus for persuasive dialogue system. In Proceedings of the 12th Language Resources and Evaluation Conference, Marseille, France, 11–16 May 2020; pp. 491–497.
- Liu, S.; Cao, Y.; Meng, H. Emotional Voice Conversion With Cycle-consistent Adversarial Network. arXiv 2020, arXiv:2004.03781.
- Kaneko, T.; Kameoka, H. Cyclegan-vc: Non-parallel voice conversion using cycle-consistent adversarial networks. In Proceedings of the 2018 IEEE 26th European Signal Processing Conference (EUSIPCO), Rome, Italy, 3–7 September 2018; pp. 2100–2104.
- Fu, C.; Liu, C.; Ishi Toshinori, C.; Ishiguro, H. CycleTransGAN-EVC: A CycleGAN-based Emotional Voice Conversion Model with Transformer. arXiv 2021, arXiv:2111.15159.
- Zhou, K.; Sisman, B.; Li, H. Transforming spectrum and prosody for emotional voice conversion with non-parallel training data. arXiv 2020, arXiv:2002.00198.
- Morise, M. CheapTrick, a spectral envelope estimator for high-quality speech synthesis. Speech Commun. 2015, 67, 1–7.
- Andrist, S.; Mutlu, B.; Tapus, A. Look like me: Matching robot personality via gaze to increase motivation. In Proceedings of the 33rd Annual ACM Conference on Human Factors in Computing Systems, Seoul, Korea, 18–23 April 2015; pp. 3603–3612.
- Goetz, J.; Kiesler, S. Cooperation with a robotic assistant. In Proceedings of the CHI’02 Extended Abstracts on Human Factors in Computing Systems, Minneapolis, MN, USA, 20–25 April 2002; pp. 578–579.
- Robert, L. Personality in the human robot interaction literature: A review and brief critique. In Proceedings of the 24th Americas Conference on Information Systems, New Orleans, LA, USA, 16–18 August 2018; pp. 16–18. Available online: https://ssrn.com/abstract=3308191 (accessed on 30 December 2018).
- Uchida, T.; Takahashi, H.; Ban, M.; Shimaya, J.; Minato, T.; Ogawa, K.; Yoshikawa, Y.; Ishiguro, H. Japanese Young Women did not discriminate between robots and humans as listeners for their self-disclosure-pilot study. Multimodal Technol. Interact. 2020, 4, 35.
- de Graaf, M.M.A.; Allouch, S.B. The influence of prior expectations of a robot’s lifelikeness on users’ intentions to treat a zoomorphic robot as a companion. Int. J. Soc. Robot. 2017, 9, 17–32.
Distribution (%) of the interlocutor’s emotional responses given the speaker’s emotional status:

| Speaker’s status | Happy | Sad | Frustrated | Surprise | Anger | Excited | Neutral |
|---|---|---|---|---|---|---|---|
| Happy | 69.81 | 0.71 | 0.24 | 2.89 | 0 | 16.98 | 10.38 |
| Sad | 0.32 | 88.29 | 5.59 | 0.21 | 0.54 | 0.21 | 4.30 |
| Frustrated | 1.40 | 3.85 | 72.22 | 0.41 | 13.86 | 0 | 9.52 |
| Surprise | 10.26 | 1.28 | 10.26 | 39.74 | 2.56 | 24.36 | 8.97 |
| Anger | 0 | 0.97 | 38.37 | 0.19 | 57.95 | 0.58 | 1.94 |
| Excited | 9.42 | 0.11 | 0.11 | 1.71 | 0 | 81.69 | 6.00 |
| Neutral | 2.77 | 2.48 | 10.50 | 1.46 | 0.29 | 4.38 | 78.05 |
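Tables like the one above can be derived from a dyadic dialogue corpus by counting, for each speaker emotion, the distribution of the interlocutor's next-turn emotion and row-normalizing to percentages. A minimal sketch of that computation; the adjacent-turn annotations below are invented:

```python
from collections import Counter, defaultdict

def response_distribution(turn_pairs):
    """Row-normalized (%) distribution of the interlocutor's emotional
    response, conditioned on the speaker's emotional status."""
    counts = defaultdict(Counter)
    for speaker_emotion, response_emotion in turn_pairs:
        counts[speaker_emotion][response_emotion] += 1
    table = {}
    for spk, ctr in counts.items():
        total = sum(ctr.values())
        table[spk] = {resp: 100.0 * n / total for resp, n in ctr.items()}
    return table

# Hypothetical (speaker emotion, interlocutor response) annotations
pairs = [("sad", "sad"), ("sad", "sad"), ("sad", "neutral"),
         ("happy", "excited"), ("happy", "happy")]
table = response_distribution(pairs)
```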
Distribution (%) of the interlocutor’s emotional responses given the speaker’s emotional status:

| Speaker’s status | Disgust | Joy | Neutral | Angry | Fear | Sadness | Surprise |
|---|---|---|---|---|---|---|---|
| Disgust | 16.10 | 7.20 | 36.44 | 13.98 | 2.96 | 9.75 | 13.56 |
| Joy | 2.26 | 33.63 | 38.95 | 7.50 | 2.33 | 4.66 | 10.67 |
| Neutral | 2.29 | 14.89 | 55.53 | 8.30 | 2.19 | 5.00 | 11.80 |
| Angry | 2.40 | 10.72 | 35.87 | 31.46 | 3.01 | 7.14 | 11.16 |
| Fear | 1.65 | 13.22 | 41.32 | 13.64 | 11.57 | 7.43 | 11.16 |
| Sadness | 3.48 | 10.76 | 34.11 | 10.43 | 2.65 | 24.34 | 14.24 |
| Surprise | 2.58 | 15.38 | 45.30 | 10.41 | 2.76 | 7.09 | 16.48 |
Objective evaluation of the voice conversion models, MCD and RMSE (lower is better for both):

| Models | MCD N→P | MCD N→LS | MCD Avg. | RMSE N→P | RMSE N→LS | RMSE Avg. |
|---|---|---|---|---|---|---|
| CycleGAN-CL (ours) | 18.94 | 17.56 | 18.25 | 135.11 | 84.42 | 94.77 |
| CycleGAN [47] | 20.28 | 17.88 | 19.08 | 137.64 | 89.97 | 113.81 |
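MCD (mel-cepstral distortion) in the table above measures the spectral distance between converted and reference speech. A common per-frame definition is MCD = (10 / ln 10) · sqrt(2 · Σ_d (c_d − ĉ_d)²) over the mel-cepstral coefficients, usually excluding the 0th (energy) term. A minimal NumPy version, assuming the two sequences are already time-aligned (in practice alignment is typically done with dynamic time warping first):

```python
import numpy as np

def mel_cepstral_distortion(c_conv, c_target):
    """Mean per-frame MCD (dB) between two aligned mel-cepstral
    sequences of shape (frames, coeffs); the 0th (energy)
    coefficient is excluded, as is common practice."""
    diff = np.asarray(c_conv)[:, 1:] - np.asarray(c_target)[:, 1:]
    per_frame = (10.0 / np.log(10)) * np.sqrt(2.0 * np.sum(diff ** 2, axis=1))
    return float(per_frame.mean())

# Toy check: sequences differing by 1.0 in a single coefficient
c_tgt = np.zeros((4, 25))
c_cnv = np.zeros((4, 25))
c_cnv[:, 1] = 1.0
mcd = mel_cepstral_distortion(c_cnv, c_tgt)
```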
Example comforting dialogue; the emotion columns give the voice emotion of each turn in the experimental (EXP.) and control (CON.) conditions:

| Speaker | Utterance | Emotion (EXP.) | Emotion (CON.) |
|---|---|---|---|
| ERICA | Hi, Yuki, meet you again, how is it going? | Neutral | Neutral |
| User | Due to Covid-19, I haven’t been able to go out for about a week, I felt a little down. | Low spirit | Low spirit |
| ERICA (experience sharing) | That is tough. Delina once told me that she had to wear a mask every time she left the apartment, which was quite inconvenient. | Low spirit | Neutral |
| User | That’s right. I do not even want to go out. | Low spirit | Low spirit |
| ERICA (encouragement) | It is better to go out for a walk sometimes to refresh yourself! With a mask on. | Positive | Neutral |
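The experience-sharing turn above requires retrieving a related experience of another person from stored interaction history. The paper does not publish its retrieval mechanism; the following is a deliberately naive keyword-overlap sketch, and the names and stored entries are invented:

```python
def share_related_experience(user_utterance, history):
    """Pick the stored experience with the largest word overlap with the
    user's utterance and phrase it as a person-centered sharing turn."""
    words = set(user_utterance.lower().split())

    def overlap(entry):
        return len(words & set(entry["experience"].lower().split()))

    best = max(history, key=overlap)
    if overlap(best) == 0:
        return None                     # nothing related to share
    return f"{best['person']} once told me that {best['experience']}"

# Invented history entries from earlier human-robot conversations
history = [
    {"person": "Delina",
     "experience": "she had to wear a mask every time she left the apartment"},
    {"person": "Ken",
     "experience": "his favorite cafe finally reopened last month"},
]
reply = share_related_experience(
    "i have not been able to leave home for a week", history)
```

A practical system would use semantic similarity rather than raw word overlap, but the flow is the same: match the user's situation, then reframe another person's stored experience as a sharing utterance.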
© 2022 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Fu, C.; Deng, Q.; Shen, J.; Mahzoon, H.; Ishiguro, H. A Preliminary Study on Realizing Human–Robot Mental Comforting Dialogue via Sharing Experience Emotionally. Sensors 2022, 22, 991. https://doi.org/10.3390/s22030991