Positive Emotion Amplification by Representing Excitement Scene with TV Chat Agents
Abstract
1. Introduction
2. Related Work
2.1. Dialog Agents
2.2. Possibility of Emotion Amplification by Empathy between Humans and Robots
2.3. Analysis of Comments on Social Media
2.4. The Goal of the Present Study
3. Proposed Method Implementation
- The agent estimates the degree of excitement elicited by a TV program in real time.
- The agent determines and performs its behavior based on the estimated degree of excitement.
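The two steps above can be sketched in code. This is a minimal, hypothetical illustration only: it assumes the degree of excitement is approximated by the rate of social-media comments about the program in a sliding time window (the paper's actual estimation model is described in Section 3.1, not here), and the class name, window length, and level thresholds are all illustrative.

```python
# Hypothetical sketch: estimate a discrete excitement level (0-3) from
# the rate of social-media comments in a sliding window, then let the
# agent pick its behavior from that level. Window size and thresholds
# are illustrative assumptions, not values from the paper.
from collections import deque


class ExcitementEstimator:
    def __init__(self, window_s=60.0, thresholds=(5, 15, 30)):
        self.window_s = window_s      # sliding-window length in seconds
        self.thresholds = thresholds  # comment counts separating levels 0-3
        self.timestamps = deque()     # arrival times of recent comments

    def add_comment(self, t):
        """Record one comment about the program observed at time t (s)."""
        self.timestamps.append(t)

    def level(self, now):
        """Return the excitement level 0-3 for the current time."""
        # Drop comments that have fallen out of the window.
        while self.timestamps and now - self.timestamps[0] > self.window_s:
            self.timestamps.popleft()
        n = len(self.timestamps)
        # Each threshold crossed raises the level by one.
        lvl = 0
        for th in self.thresholds:
            if n >= th:
                lvl += 1
        return lvl
```

In use, comments would be streamed in as they arrive and `level(now)` polled periodically to drive the agent's behavior selection.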
3.1. Estimating the Degree of Excitement
3.2. Validation of the Estimation Model
3.3. Determining the Behavior of the Agents
4. Experimental Conditions
4.1. Participants
4.2. Agent Conditions
4.3. TV Content
- Japan versus Paraguay (broadcast on 5 September 2019)
- Japan versus Mongolia (broadcast on 10 October 2019)
- Japan versus Kyrgyzstan (broadcast on 14 November 2019)
- Japan versus China (broadcast on 10 December 2019)
4.4. Procedure
4.5. Analysis
5. Results
5.1. Emotional Ratings
5.2. Physiological Activity
5.3. Evaluation of Agents
5.4. Motivation to Interact with Agents
6. Discussion
6.1. Enhanced Emotional Arousal Watching TV Alongside Excited Agents
6.2. Enhanced Excitement Recognition in Excited Agents
6.3. Improving Motivational Use of the Excited Agents
6.4. Limitation
7. Conclusions
Author Contributions
Funding
Conflicts of Interest
References
| Excitement Level | Utterance Interval (s) | Pitch | Volume | Speech Rate |
|---|---|---|---|---|
| Level 3 (parameters when the participants were excited) | 4 | 123 | 132 | 100 |
| Level 2 | 7 | 117 | 118 | |
| Level 1 | 10 | 111 | 105 | |
| Level 0 (parameters when the participants were not excited) | 13 | 105 | 92 | |
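The level-to-parameter mapping in the table above can be expressed as a simple lookup. This sketch takes its values directly from the table; the function name is illustrative, and the blank speech-rate cells are carried as `None` because the table specifies a rate only for Level 3.

```python
# Speech parameters per excitement level, copied from the table above.
# The table gives a speech rate only for Level 3, so the other levels
# carry None rather than an invented value.
SPEECH_PARAMS = {
    # level: (utterance interval in s, pitch, volume, speech rate)
    3: (4, 123, 132, 100),   # participants excited
    2: (7, 117, 118, None),
    1: (10, 111, 105, None),
    0: (13, 105, 92, None),  # participants not excited
}


def params_for_level(level):
    """Clamp the estimated level to 0-3 and return its speech parameters."""
    return SPEECH_PARAMS[max(0, min(3, level))]
```

A table-driven mapping like this keeps the behavior policy in one place, so the agent's voice settings can be retuned without touching the estimation code.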
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
© 2020 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).
Share and Cite
Nishimura, S.; Kimata, D.; Sato, W.; Kanbara, M.; Fujimoto, Y.; Kato, H.; Hagita, N. Positive Emotion Amplification by Representing Excitement Scene with TV Chat Agents. Sensors 2020, 20, 7330. https://doi.org/10.3390/s20247330