A Survey on Recent Advances in Social Robotics
Abstract
1. Introduction
- Not all articles using the “social robot” label defined what the authors meant by it.
- Most highly pertinent articles acknowledge the lack of a generally accepted, precise definition and the difficulty of defining a social robot.
- Mentions of the label “social robot” have increased since the late 1990s.
- Definitions given in scientific articles associate social robots with the following properties: autonomy; the ability to act in a socially appropriate manner and to engage in meaningful social interactions; communication; intelligence; operation according to established social and cultural norms; the ability to sense the presence of humans and to engage in physical acknowledgment; and the ability to use gestures, to express and/or perceive emotions, and to engage in conversation [2,3,4,5,6].
- Alternative, future-oriented definitions have also been proposed, describing social robots as a merger of biological and technological elements, or as applications of technology intended to address pressing, non-technical social problems in contemporary society [7,8].
- Demonstrating the extent to which social robotics can be involved in humans’ lives.
- Presenting the different capacities that allow a robot to interact in a social manner.
- Emphasizing the impacts of technology and research on social robots.
- Addressing the human side of human-robot interaction.
- Providing a multi-viewpoint image of the future of social robots.
2. Domains of Application
2.1. Telepresence
2.2. Education
2.3. Care and Assistance
2.4. Medicine
2.5. Autism Spectrum Disorders
2.6. Other Applications of Children Companionship
2.7. Other Domains of Research and Application
3. Modalities of Human-Robot Interaction
3.1. Vision Systems in Robots
3.2. Conversational Systems in Robots
- Context of the interactions: Visitors approach the receptionist and engage in conversations in English. Both questions and answers will be included in the database.
- Audio recordings of the conversations: speech recognition modules are used to transcribe the conversations into text.
- Video recordings of the interactions, showing the face and upper body of the receptionist, at an image quality usable by body posture recognition systems.
- The collected data will be used to progressively train the system. Each conversation will be labeled with the corresponding date, time, and interaction parties (a minimal record sketch is given after this list).
- Participants will be free to ask any questions they may have about the center, in English, with no other constraint and no specific text to pronounce.
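For illustration, the record below is a minimal sketch of how each labeled conversation could be stored for progressive training of the system; the class and field names are hypothetical assumptions, not the schema of the framework described above.

```python
from dataclasses import dataclass, field
from datetime import datetime
from typing import List

@dataclass
class ConversationRecord:
    """One labeled visitor-receptionist conversation (hypothetical schema)."""
    started_at: datetime   # date and time of the interaction
    parties: List[str]     # interaction parties
    audio_path: str        # audio recording of the conversation
    video_path: str        # video showing the receptionist's face and upper body
    transcript: List[str] = field(default_factory=list)  # text produced by speech recognition

# Usage: one record that could later be added to the training database.
record = ConversationRecord(
    started_at=datetime(2022, 3, 1, 10, 30),
    parties=["visitor", "receptionist"],
    audio_path="recordings/2022-03-01_1030.wav",
    video_path="recordings/2022-03-01_1030.mp4",
)
record.transcript.append("Visitor: What are the opening hours of the center?")
record.transcript.append("Receptionist: The center is open from 9 a.m. to 5 p.m.")
print(record.started_at, len(record.transcript), "turns")
```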
3.3. Expressions and Gestures
4. Robotic Platforms Used in Social Robotics
5. Relationships with Recent Advances in Different Domains
5.1. Artificial Intelligence
5.2. Manufacturing Technologies
5.2.1. Additive Manufacturing
5.2.2. Semiconductor Devices
- Processors: to carry out and coordinate different tasks of the robotic system.
- Human-machine interfaces: screen displays, LEDs, microphones, speakers, and their drivers.
- Sensors: for position, speed, current, distance, and orientation, for example.
- Drivers: for the different types of actuators used in robots.
5.3. Processing Technologies
5.4. Operating Systems and Other Software
6. Metrics of Human Perception and Acceptability
- In [15], a robotic platform was equipped with the capacity to perform two tasks: group interaction, where it had to maintain an appropriate position and orientation within a group, and person following. The human evaluation began with a briefing of 15 subjects about the purpose of each task, followed by a calibration step in which the subjects were shown human-level performance in each task, and then by interaction with the robotic platform in each task. The subjects were then asked to rate the social performance of the platform with a number from 1 to 10, where 10 corresponded to human-level performance. The authors suggested that a larger number of subjects and a more detailed questionnaire would be necessary to reach definitive conclusions.
- The “Godspeed” series of questionnaires was proposed in [200] to help creators of robots during the development process. Five questionnaires using 5-point scales address the anthropomorphism, animacy, likeability, perceived intelligence, and perceived safety of robots. For example, in the anthropomorphism questionnaire (Godspeed I), participants are asked to rate their impressions of the robot with an integer from fake (1) to natural (5), from machine-like (1) to human-like (5), and from artificial (1) to lifelike (5). Similarly, in the animacy questionnaire (Godspeed II), participants can rate the robot, for example, from dead (1) to alive (5), from stagnant (1) to lively (5), and from inert (1) to interactive (5). The authors in [200] report cultural background, prior experience with robots, and personality to be among the factors affecting the measurements obtained with such questionnaires. Furthermore, human perceptions are unstable, since expectations and knowledge change as experience with robots grows. For the authors in [200], this means that repeating the same experiment after a long period of time would yield different results.
- In the context of elderly care and assistance, the Almere model was proposed in [201] as an adaptation and theoretical extension of the Unified Theory of Acceptance and Use of Technology (UTAUT) questionnaire [202]. Questionnaire items in the Almere model were adapted from the UTAUT questionnaire to fit the context of assistive robot technology and to address elderly users in a care home. Different constructs are adopted and defined, each with its related questionnaire items: the users’ attitude towards the technology, their intention to use it, their perceived enjoyment, perceived ease of use, perceived sociability and usefulness, social influence and presence, and trust. Experiments on the model used a data collection instrument whose questionnaire items were rated on a 5-point Likert-type scale ranging from 1 (“totally disagree”) to 5 (“totally agree”). A sketch of how scores from such 5-point instruments can be aggregated per construct is given after this list.
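To make the scoring of such instruments concrete, the following minimal sketch averages 5-point item ratings (Godspeed-style semantic-differential items or Almere-style Likert items) into per-construct scores; the construct names and item groupings are illustrative assumptions, not the official scoring procedures of [200] or [201].

```python
from statistics import mean
from typing import Dict, List

def construct_scores(responses: Dict[str, List[int]]) -> Dict[str, float]:
    """Average the 1-5 item ratings of each construct into a single score."""
    for construct, items in responses.items():
        if any(not 1 <= r <= 5 for r in items):
            raise ValueError(f"Out-of-range rating in construct '{construct}'")
    return {construct: round(mean(items), 2) for construct, items in responses.items()}

# Usage with one participant's illustrative ratings (not data from [200] or [201]):
one_participant = {
    "anthropomorphism": [4, 3, 4],    # fake-natural, machine-like-human-like, artificial-lifelike
    "animacy": [3, 4, 4],             # dead-alive, stagnant-lively, inert-interactive
    "perceived_ease_of_use": [5, 4],  # Likert items: 1 = totally disagree, 5 = totally agree
}
print(construct_scores(one_participant))  # e.g., {'anthropomorphism': 3.67, ...}
```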
7. Future of Social Robotics
- Education: in [28], the authors suggested motivating schoolteachers to introduce a collaborative robot into their lectures. Enabling robot-supported language learning for preschool children was proposed as a long-term goal in [50]. In [51], a review of robots in education, improving the speech understanding capabilities of robots and reproducing human-like behavior were proposed. In [27], the authors specified that second-language anxiety and negative attitudes toward robots need to be carefully considered before introducing robots to students as second-language tutors. Additionally, in [52], incorporating robots into teacher education or professional development programs was proposed; teachers, students, and social robots were all said to become key actors in future classrooms, and teachers’ attitudes and beliefs were said to possibly influence the future deployment of social robots.
- Care and assistance: in [61], the importance of experience in working with robots and of raising awareness about what care robots can do was shown, with the objective of moving away from preconceptions based on fiction and imaginaries of care robots. In [58], the authors expressed the aim of achieving a design that is easily deployed in multiple locations and contains all the information necessary for repeated deployments. In [205], an emotion recognition algorithm and an imitation algorithm were said to bring improvements to a robotic system for the physical training of older adults. In [55], a review of the usages of socially assistive robots in elderly care, the authors concluded that studies should be clearer about the precise role of any robot and should use validated measures to assess their effectiveness. In the same context, sophisticated speech analysis and accurate language understanding were said to be desirable for improving the interaction between a human being and a robot in [59]. On the topic of the acceptance of healthcare robots for older adults, in [60], matching individual needs and preferences to the robot was said to possibly improve acceptance; an alternative approach proposed there was to alter users’ expectations to match the capabilities of the robot.
- Children companionship: in [41], humanoid robots were said to be promising for robot-mediated education with primary-school-aged children because of their capacity to produce voices and gestures that motivate children in learning activities. In [79], which addressed robots playing games with people, it was said that a robot can have a sort of character that supports its perception as a rational agent, for instance by adapting its behavior and strategy to its real-time perception of the humans it interacts with.
- Autism and medicine: in [77], a list of considerations to be taken into account when developing robots for children with ASD was presented, showing that such robots should be user-focused, usable, reliable, safe, and affordable. In [74], the robotic scenario was said to be an excellent way to elicit behaviors in children with ASD through interaction, analysis of the child’s behavior, and adaptation to it; according to the authors, introducing robots into therapy would be of great clinical interest. In [75], works on social signal processing and socially assistive robotics were reported, and issues that should be addressed by researchers in these domains were listed, among them the machine understanding of typical and autistic behaviors and the availability of databases of interactions of children with ASD.
- Security of robotic systems: an important aspect to address in social robotics is security and cybersecurity. Intelligent systems can help protect the security of users, but hackers could attack social robot users through different vectors [206,207,208], and this should be taken into account when considering the use of social robots [209]. Work has been done in this domain to improve security, such as the guidelines on trustworthy artificial intelligence published by a High-Level Expert Group established by the European Commission. The Trusted-ROS system was proposed in [207] to improve security in humanoid robots, and recommendations such as multi-factor authentication and multi-factor cryptography were presented in [208].
8. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Conflicts of Interest
References
- Sarrica, M.; Brondi, S.; Fortunati, L. How many facets does a “social robot” have? A review of scientific and popular definitions online. Inf. Technol. People 2019, 33, 1–21. [Google Scholar] [CrossRef]
- Duffy, B.R. Anthropomorphism and the social robot. Robot. Auton. Syst. 2003, 42, 177–190. [Google Scholar] [CrossRef]
- Fong, T.; Nourbakhsh, I.; Dautenhahn, K. A survey of socially interactive robots. Robot. Auton. Syst. 2003, 42, 143–166. [Google Scholar] [CrossRef] [Green Version]
- Breazeal, C. Toward sociable robots. Robot. Auton. Syst. 2003, 42, 167–175. [Google Scholar] [CrossRef]
- Li, H.; Cabibihan, J.J.; Tan, Y. Towards an Effective Design of Social Robots. Int. J. Soc. Robot. 2011, 3, 333–335. [Google Scholar] [CrossRef] [Green Version]
- Li, J.; Chignell, M. Communication of Emotion in Social Robots through Simple Head and Arm Movements. Int. J. Soc. Robot. 2010, 3, 125–142. [Google Scholar] [CrossRef]
- Shaw-Garlock, G. Looking Forward to Sociable Robots. Int. J. Soc. Robot. 2009, 1, 249–260. [Google Scholar] [CrossRef]
- Sabanovic, S. Robots in Society, Society in Robots. Int. J. Soc. Robot. 2010, 2, 439–450. [Google Scholar] [CrossRef]
- Korn, O. (Ed.) Social Robots: Technological, Societal and Ethical Aspects of Human-Robot Interaction; Springer: Berlin/Heidelberg, Germany, 2019. [Google Scholar]
- Karar, A.; Said, S.; Beyrouthy, T. Pepper Humanoid Robot as a Service Robot: A Customer Approach. In Proceedings of the 2019 3rd International Conference on Bio-Engineering for Smart Technologies (BioSMART), Paris, France, 24–26 April 2019; pp. 1–4. [Google Scholar] [CrossRef]
- Bennett, C.C.; Sabanovic, S. Deriving Minimal Features for Human-Like Facial Expressions in Robotic Faces. Int. J. Soc. Robot. 2014, 6, 367–381. [Google Scholar] [CrossRef]
- Yoon, Y.; Ko, W.R.; Jang, M.; Lee, J.; Kim, J.; Lee, G. Robots Learn Social Skills: End-to-End Learning of Co-Speech Gesture Generation for Humanoid Robots. In Proceedings of the International Conference on Robotics and Automation (ICRA), Montreal, QC, Canada, 20–24 May 2019. [Google Scholar]
- Ko, W.R.; Kim, J.H. Behavior Selection of Social Robots Using Developmental Episodic Memory-based Mechanism of Thought. In Proceedings of the IEEE International Conference on Consumer Electronics—Asia (ICCE-Asia), Jeju, Korea, 24–26 June 2018. [Google Scholar]
- Qureshi, A.H.; Nakamura, Y.; Yoshikawa, Y.; Ishiguro, H. Robot gains Social Intelligence through Multimodal Deep Reinforcement Learning. In Proceedings of the IEEE-RAS 16th International Conference on Humanoid Robots (Humanoids), Cancun, Mexico, 15–17 November 2016. [Google Scholar]
- Shiarlis, K.; Messias, J.; Whiteson, S. Acquiring Social Interaction Behaviours for Telepresence Robots via Deep Learning from Demonstration. In Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Vancouver, BC, Canada, 24–28 September 2017. [Google Scholar]
- Castellano, G.; Cervelione, A.; Cianciotta, M.; De Carolis, B.; Vessio, G. Recognizing the Waving Gesture in the Interaction with a Social Robot. In Proceedings of the 29th IEEE International Conference on Robot and Human Interactive Communication (RO-MAN), Naples, Italy, 31 August–4 September 2020. [Google Scholar]
- Saad, E.; Broekens, J.; Neerincx, M.A.; Hindriks, K.V. Enthusiastic Robots Make Better Contact. In Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Macau, China, 3–8 November 2019. [Google Scholar]
- Barra, P.; Bisogni, C.; Distasi, R.; Rapuano, A. VMPepper: How to Use a Social Humanoid Robot for Interactive Voice Messaging. In Proceedings of the Fourth International Conference on Applications and Systems of Visual Paradigms, VISUAL, Rome, Italy, 30 June–4 July 2019. [Google Scholar]
- Castellano, G.; De Carolis, B.; D’Errico, F.; Macchiarulo, N.; Rossano, V. PeppeRecycle: Improving Children’s Attitude Toward Recycling by Playing with a Social Robot. Int. J. Soc. Robot. 2021, 13, 97–111. [Google Scholar] [CrossRef]
- Van der Putte, D.; Boumans, R.; Neerincx, M.; Rikkert, M.O.; De Mul, M. A Social Robot for Autonomous Health Data Acquisition among Hospitalized Patients: An Exploratory Field Study. In Proceedings of the 14th ACM/IEEE International Conference on Human-Robot Interaction (HRI), Daegu, Korea, 11–14 March 2019. [Google Scholar]
- Schicchi, D.; Pilato, G. A Social Humanoid Robot as a Playfellow for Vocabulary Enhancement. In Proceedings of the Second IEEE International Conference on Robotic Computing, Laguna Hills, CA, USA, 31 January–2 February 2018. [Google Scholar]
- Shiarlis, K.; Messias, J.; van Someren, M.; Whiteson, S.; Kim, J.; Vroon, J.; Englebienne, G.; Truong, K.; Evers, V.; Pérez-Higueras, N.; et al. TERESA: A Socially Intelligent SEmi-autonomous Telepresence System. In Proceedings of the International Conference on Robotics and Automation, Seattle, WA, USA, 26–30 May 2015. [Google Scholar]
- Niemela, M.; van Aerschot, L.; Tammela, A.; Aaltonen, L.; Lammi, H. Towards Ethical Guidelines of Using Telepresence Robots in Residential Care. Int. J. Soc. Robot. 2019, 13, 431–439. [Google Scholar] [CrossRef] [Green Version]
- Zhang, G.; Hansen, J.P.; Minakata, K.; Alapetite, A.; Wang, Z. Eye-Gaze-Controlled Telepresence Robots for People with Motor Disabilities. In Proceedings of the 14th ACM/IEEE International Conference on Human-Robot Interaction (HRI), Daegu, Korea, 11–14 March 2019. [Google Scholar]
- Hood, D.; Lemaignan, S.; Dillenbourg, P. When Children Teach a Robot to Write: An Autonomous Teachable Humanoid Which Uses Simulated Handwriting. In Proceedings of the 10th ACM/IEEE International Conference on Human-Robot Interaction, Portland, OR, USA, 2–5 March 2015. [Google Scholar]
- Engwall, O.; Lopes, J.; Ahlund, A. Robot Interaction Styles for Conversation Practice in Second Language Learning. Int. J. Soc. Robot. 2020, 13, 251–276. [Google Scholar] [CrossRef] [Green Version]
- Kanero, J.; Oranc, C.; Koskulu, S.; Kumkale, G.T.; Goksun, T.; Kuntay, A.C. Are Tutor Robots for Everyone? The Influence of Attitudes, Anxiety, and Personality on Robot-Led Language Learning. Int. J. Soc. Robot. 2022, 14, 297–312. [Google Scholar] [CrossRef]
- Shimaya, J.; Yoshikawa, Y.; Palinko, O.; Ogawa, K.; Jinnai, N.; Ishiguro, H. Active Participation in Lectures via a Collaboratively Controlled Robot. Int. J. Soc. Robot. 2021, 13, 587–598. [Google Scholar] [CrossRef]
- Reyes, G.E.B.; Lopez, E.; Ponce, P.; Mazon, N. Role Assignment Analysis of an Assistive Robotic Platform in a High School Mathematics Class, Through a Gamification and Usability Evaluation. Int. J. Soc. Robot. 2021, 13, 1063–1078. [Google Scholar] [CrossRef]
- Vogt, P.; van den Berghe, R.; de Haas, M.; Hoffman, L.; Kanero, J.; Mamus, E.; Montanier, J.M.; Oranc, C.; Oudgenoeg-Paz, O.; Hernandez Garcia, D.; et al. Second language tutoring using social robots: L2TOR—The movie. In Proceedings of the 14th ACM/IEEE International Conference on Human-Robot Interaction (HRI), Daegu, Korea, 11–14 March 2019. [Google Scholar]
- Obayashi, K.; Kodate, N.; Masuyama, S. Assessing the Impact of an Original Soft Communicative Robot in a Nursing Home in Japan: Will Softness or Conversations Bring more Smiles to Older People? Int. J. Soc. Robot. 2022, 14, 645–656. [Google Scholar] [CrossRef] [PubMed]
- Luperto, M.; Monroy, J.; Renoux, J.; Lunardini, F.; Basilico, N.; Bulgheroni, M.; Cangelosi, A.; Cesari, M.; Cid, M.; Ianes, A.; et al. Integrating Social Assistive Robots, IoT, Virtual Communities and Smart Objects to Assist at-Home Independently Living Elders: The MoveCare Project. Int. J. Soc. Robot. 2022, 14, 1–31. [Google Scholar] [CrossRef]
- Ismail, L.I.; Hanapiah, F.A.; Belpaeme, T.; Dambre, J.; Wyffels, F. Analysis of Attention in Child-Robot Interaction Among Children Diagnosed with Cognitive Impairment. Int. J. Soc. Robot. 2021, 13, 141–152. [Google Scholar] [CrossRef]
- Schrum, M.; Park, C.H.; Howard, A. Humanoid Therapy Robot for Encouraging Exercise in Dementia Patients. In Proceedings of the 14th ACM/IEEE International Conference on Human-Robot Interaction (HRI), Daegu, Korea, 11–14 March 2019. [Google Scholar]
- Moharana, S.; Panduro, A.E.; Lee, H.R.; Rick, L.D. Robots for Joy, Robots for Sorrow: Community Based Robot Design for Dementia Caregivers. In Proceedings of the 14th ACM/IEEE International Conference on Human-Robot Interaction (HRI), Daegu, Korea, 11–14 March 2019. [Google Scholar]
- Anzalone, S.M.; Tilmont, E.; Boucenna, S.; Xavier, J.; Jouen, A.L.; Bodeau, N.; Maharatna, K.; Chetouani, M.; Cohen, D.; the MICHELANGELO Study Group. How children with autism spectrum disorder behave and explore the 4-dimensional (spatial 3D + time) environment during a joint attention induction task with a robot. Res. Autism Spectr. Disord. 2014, 8, 814–826. [Google Scholar] [CrossRef]
- Huijnen, C.A.G.J.; Verreussel-Willen, H.A.M.D.; Lexis, M.A.S.; de Witte, L.P. Robot KASPAR as Mediator in Making Contact with Children with Autism: A Pilot Study. Int. J. Soc. Robot. 2021, 13, 237–249. [Google Scholar] [CrossRef]
- Taheri, A.; Meghdari, A.; Mahoor, M.H. A Close Look at the Imitation Performance of Children with Autism and Typically Developing Children Using a Robotic System. Int. J. Soc. Robot. 2021, 13, 1125–1147. [Google Scholar] [CrossRef]
- Chung, E.Y.H. Robot-Mediated Social Skill Intervention Programme for Children with Autism Spectrum Disorder: An ABA Time-Series Study. Int. J. Soc. Robot. 2021, 13, 1095–1107. [Google Scholar] [CrossRef]
- Striepe, H.; Donnermann, M.; Lein, M.; Lugrin, B. Modeling and Evaluating Emotion, Contextual Head Movement and Voices for a Social Robot Storyteller. Int. J. Soc. Robot. 2021, 13, 441–457. [Google Scholar] [CrossRef]
- Desideri, L.; Bonifacci, P.; Croati, G.; Dalena, A.; Gesualdo, M.; Molinario, G.; Gherardini, A.; Cesario, L.; Ottaviani, C. The Mind in the Machine: Mind Perception Modulates Gaze Aversion During Child-Robot Interaction. Int. J. Soc. Robot. 2021, 13, 599–614. [Google Scholar] [CrossRef]
- Filippini, C.; Spadolini, E.; Cardone, D.; Bianchi, D.; Preziuso, M.; Sciarretta, C.; del Cimmuto, V.; Lisciani, D.; Merla, A. Facilitating the Child-Robot Interaction by Endowing the Robot with the Capability of Understanding the Child Engagement: The Case of Mio Amico Robot. Int. J. Soc. Robot. 2021, 13, 677–689. [Google Scholar] [CrossRef]
- Uluer, P.; Kose, H.; Gumuslu, E.; Erol Barkana, D. Experience with an Affective Robot Assistant for Children with Hearing Disabilities. Int. J. Soc. Robot. 2021, 16, 1–8. [Google Scholar] [CrossRef]
- Iio, T.; Satake, S.; Kanda, T.; Hayashi, K.; Ferreri, F.; Hagita, N. Human-Like Guide Robot that Proactively Explains Exhibits. Int. J. Soc. Robot. 2020, 12, 549–566. [Google Scholar] [CrossRef] [Green Version]
- Shi, C.; Satake, S.; Kanda, T.; Ishiguro, H. A Robot that Distributes Flyers to Pedestrians in a Shopping Mall. Int. J. Soc. Robot. 2018, 10, 421–437. [Google Scholar] [CrossRef]
- Belay Tuli, T.; Olana Terefe, T.; Ur Rashid, M.M. Telepresence Mobile Robots Design and Control for Social Interaction. Int. J. Soc. Robot. 2021, 13, 877–886. [Google Scholar] [CrossRef]
- Double Robotics—Telepresence Robot for the Hybrid Office. Available online: https://www.doublerobotics.com/ (accessed on 1 March 2022).
- Mubin, O.; Alhashmi, M.; Baroud, R.; Alnajjar, F.S. Humanoid Robots as Teaching Assistants in an Arab School. In Proceedings of the 31st Australian Conference on Human-Computer Interaction, Fremantle, Australia, 2–5 December 2019. [Google Scholar]
- Mispa, T.A.; Sojib, N. Educational Robot Kiddo Learns to Draw to Enhance Interactive Handwriting Scenario for Primary School Children. In Proceedings of the 3rd International Conference of Intelligent Robotic and Control Engineering (IRCE), Oxford, UK, 10–12 August 2020. [Google Scholar]
- Schodde, T.; Bergmann, K.; Kopp, S. Adaptive Robot Language Tutoring Based on Bayesian Knowledge Tracing and Predictive Decision-Making. In Proceedings of the 12th ACM/IEEE International Conference on Human-Robot Interaction, Vienna, Austria, 6–9 March 2017. [Google Scholar]
- Mubin, O.; Stevens, C.J.; Shahid, S.; Al Mahmud, A.; Dong, J.J. A Review of the Applicability of Robots in Education. Technol. Educ. Learn. 2013, 1, 13. [Google Scholar] [CrossRef] [Green Version]
- Xia, Y.; LeTendre, G. Robots for Future Classrooms: A Cross-Cultural Validation Study of “Negative Attitudes Toward Robots Scale” in the U.S. Context. Int. J. Soc. Robot. 2021, 13, 703–714. [Google Scholar] [CrossRef]
- Nomura, T.; Kanda, T.; Suzuki, T. Experimental investigation into influence of negative attitudes toward robots on human-robot interaction. AI Soc. 2006, 20, 138–150. [Google Scholar] [CrossRef]
- Vogt, P.; van den Berghe, R.; de Haas, M.; Hoffman, L.; Kanero, J.; Mamus, E.; Montanier, J.M.; Oranc, C.; Oudgenoeg-Paz, O.; Hernandez Garcia, D.; et al. Second Language Tutoring using Social Robots: A Large-Scale Study. In Proceedings of the 14th ACM/IEEE International Conference on Human-Robot Interaction (HRI), Daegu, Korea, 11–14 March 2019. [Google Scholar]
- Abdi, J.; Al-Hindawi, A.; Ng, T.; Vizcaychipi, M.P. Scoping review on the use of socially assistive robot technology in elderly care. BMJ Open 2017, 8, e018815. [Google Scholar] [CrossRef] [PubMed] [Green Version]
- Lin, C.; Ogata, T.; Zhong, Z.; Kanai-Pak, M.; Maeda, J.; Kitajima, Y.; Nakamura, M.; Kuwahara, N.; Ota, J. Development and Validation of Robot Patient Equipped with an Inertial Measurement Unit and Angular Position Sensors to Evaluate Transfer Skills of Nurses. Int. J. Soc. Robot. 2021, 13, 899–917. [Google Scholar] [CrossRef]
- Meia, C.T.; Scheutz, M. Assistive Robots for the Social Management of Health: A Framework for Robot Design and Human–Robot Interaction Research. Int. J. Soc. Robot. 2021, 13, 197–217. [Google Scholar]
- Bardaro, G.; Antonini, A.; Motta, E. Robots for Elderly Care in the Home: A Landscape Analysis and Co-Design Toolkit. Int. J. Soc. Robot. 2022, 14, 657–681. [Google Scholar] [CrossRef]
- Obayashi, K.; Kodate, N.; Masuyama, S. Enhancing older people’s activity and participation with socially assistive robots: A multicentre quasi-experimental study using the ICF framework. Adv. Robot. 2018, 32, 1207–1216. [Google Scholar] [CrossRef]
- Broadbent, E.; Stafford, R.; MacDonald, B. Acceptance of Healthcare Robots for the Older Population: Review and Future Directions. Int. J. Soc. Robot. 2009, 1, 319–330. [Google Scholar] [CrossRef]
- Frennert, S.; Aminoff, H.; Ostlund, B. Technological Frames and Care Robots in Eldercare. Int. J. Soc. Robot. 2021, 13, 317–325. [Google Scholar] [CrossRef] [Green Version]
- McGinn, C.; Bourke, E.; Murtagh, A.; Donovan, C.; Cullinan, M.F. Meeting Stevie: Perceptions of a Socially Assistive Robot by Residents and Staff in a Long-term Care Facility. In Proceedings of the 14th ACM/IEEE International Conference on Human-Robot Interaction (HRI), Daegu, Korea, 11–14 March 2019. [Google Scholar]
- Williams, A.B.; Williams, R.M.; Moore, R.E.; McFarlane, M. AIDA: A Social Co-Robot to Uplift Workers with Intellectual and Developmental Disabilities. In Proceedings of the 14th ACM/IEEE International Conference on Human-Robot Interaction (HRI), Daegu, Korea, 11–14 March 2019. [Google Scholar]
- Monekosso, D.; Florez-Revuelta, F.; Remagnino, P. Ambient Assisted Living [Guest editors’ introduction]. IEEE Intell. Syst. 2015, 30, 2–6. [Google Scholar] [CrossRef]
- AAL Home 2020—AAL Programme. Available online: www.aal-europe.eu (accessed on 15 February 2022).
- Casiddu, N.; Cesta, A.; Cortellessa, G.; Orlandini, A.; Porfirione, C.; Divano, A.; Micheli, E.; Zallio, M. Robot Interface Design: The Giraff Telepresence Robot for Social Interaction. Biosyst. Biorobot. 2015, 11, 499–509. [Google Scholar] [CrossRef]
- Coradeschi, S.; Cesta, A.; Cortellessa, G.; Coraci, L.; Galindo, C.; González-Jiménez, J.; Karlsson, L.; Forsberg, A.; Frennert, S.; Furfari, F.; et al. GiraffPlus: A System for Monitoring Activities and Physiological Parameters and Promoting Social Interaction for Elderly. Adv. Intell. Syst. Comput. 2014, 300, 261–271. [Google Scholar] [CrossRef]
- Kabacinska, K.; Prescott, T.J.; Robillard, J.M. Socially Assistive Robots as Mental Health Interventions for Children: A Scoping Review. Int. J. Soc. Robot. 2021, 13, 919–935. [Google Scholar] [CrossRef]
- Rasouli, S.; Gupta, G.; Nilsen, E.; Dautenhahn, K. Potential Applications of Social Robots in Robot-Assisted Interventions for Social Anxiety. Int. J. Soc. Robot. 2022. ahead of printing. [Google Scholar] [CrossRef] [PubMed]
- Nielsen, C.; Mathiesen, M.; Nielsen, J.; Jensen, L.C. Changes in Heart Rate and Feeling of Safety when Led by a Rehabilitation Robot. In Proceedings of the 14th ACM/IEEE International Conference on Human-Robot Interaction (HRI), Daegu, Korea, 11–14 March 2019. [Google Scholar]
- Wilson, J.R.; Lee, N.Y.; Saechao, A.; Tickle-Degnen, L.; Scheutz, M. Supporting Human Autonomy in a Robot-Assisted Medication Sorting Task. Int. J. Soc. Robot. 2018, 10, 621–641. [Google Scholar] [CrossRef]
- Chatbots|GPT-3 Demo. Available online: https://gpt3demo.com/category/chatbots (accessed on 13 April 2022).
- Delaherche, E.; Chetouani, M.; Bigouret, F.; Xavier, J.; Plaza, M.; Cohen, D. Assessment of the communicative and coordination skills of children with Autism Spectrum Disorders and typically developing children using social signal processing. Res. Autism Spectr. Disord. 2013, 7, 741–756. [Google Scholar] [CrossRef]
- Boucenna, S.; Narzisi, A.; Tilmont, E.; Muratori, F.; Pioggia, G.; Cohen, D.; Chetouani, M. Interactive Technologies for Autistic Children: A Review. Cogn. Comput. 2014, 6, 722–740. [Google Scholar] [CrossRef] [Green Version]
- Chetouani, M.; Boucenna, S.; Chaby, L.; Plaza, M.; Cohen, D. Social Signal Processing and Socially Assistive Robotics in Developmental Disorders; Cambridge University Press: Cambridge, UK, 2017; pp. 389–403. [Google Scholar] [CrossRef]
- Emery, N. The eyes have it: The neuroethology, function and evolution of social gaze. Neurosci. Biobehav. Rev. 2000, 24, 581–604. [Google Scholar] [CrossRef]
- Wood, L.J.; Zaraki, A.; Robins, B.; Dautenhahn, K. Developing Kaspar: A Humanoid Robot for Children with Autism. Int. J. Soc. Robot. 2021, 13, 491–508. [Google Scholar] [CrossRef] [Green Version]
- Lee, J.; Lee, D.; Lee, J.G. Can Robots Help Working Parents with Childcare? Optimizing Childcare Functions for Different Parenting Characteristics. Int. J. Soc. Robot. 2022, 14, 193–201. [Google Scholar] [CrossRef]
- de Oliveira, E.; Donadoni, L.; Boriero, S.; Bonarini, A. Deceptive Actions to Improve the Attribution of Rationality to Playing Robotic Agents. Int. J. Soc. Robot. 2021, 13, 391–405. [Google Scholar] [CrossRef]
- Wu, C.H.; Huang, Y.M.; Hwang, J.P. Review of affective computing in education/learning: Trends and challenges. Br. J. Educ. Technol. 2016, 47, 1304–1323. [Google Scholar] [CrossRef]
- Zheng, M.; She, Y.; Chen, J.; Shu, Y.; XiaHou, J. BabeBay—A Companion Robot for Children Based on Multimodal Affective Computing. In Proceedings of the 14th ACM/IEEE International Conference on Human-Robot Interaction (HRI), Daegu, Korea, 11–14 March 2019. [Google Scholar]
- Bjorling, E.A.; Rose, E.; Davidson, A.; Ren, R.; Wong, D. Can We Keep Him Forever? Teens’ Engagement and Desire for Emotional Connection with a Social Robot. Int. J. Soc. Robot. 2020, 12, 65–77. [Google Scholar] [CrossRef]
- Gonzalez-Pacheco, V.; Ramey, A.; Alonso-Martin, F.; Castro-Gonzalez, A.; Salichs, M.A. Maggie: A Social Robot as a Gaming Platform. Int. J. Soc. Robot. 2011, 3, 371–381. [Google Scholar] [CrossRef] [Green Version]
- Mutlu, B.; Forlizzi, J.; Hodgins, J. A Storytelling Robot: Modeling and Evaluation of Human-like Gaze Behavior. In Proceedings of the IEEE-RAS International Conference on Humanoid Robots, Genova, Italy, 4–6 December 2006. [Google Scholar]
- Hsieh, W.F.; Sato-Shimokawara, E.; Yamaguchi, T. Enhancing the Familiarity for Humanoid Robot Pepper by Adopting Customizable Motion. In Proceedings of the IECON 2017—43rd Annual Conference of the IEEE Industrial Electronics Society, Beijing, China, 29 October–1 November 2017. [Google Scholar]
- Pasquali, D.; Gonzalez-Billandon, J.; Aroyo, A.M.; Sandini, G.; Sciutti, A.; Rea, F. Detecting Lies is a Child (Robot)’s Play: Gaze-Based Lie Detection in HRI. Int. J. Soc. Robot. 2021. [Google Scholar] [CrossRef]
- Youssef, K.; Said, S.; Beyrouthy, T.; Alkork, S. A Social Robot with Conversational Capabilities for Visitor Reception: Design and Framework. In Proceedings of the 2021 4th International Conference on Bio-Engineering for Smart Technologies (BioSMART), Paris/Créteil, France, 8–10 December 2021; pp. 1–4. [Google Scholar] [CrossRef]
- Novanda, O.; Salem, M.; Saunders, J.; Walters, M.L.; Dautenhahn, K. What Communication Modalities Do Users Prefer in Real Time HRI? arXiv 2016, arXiv:1606.03992. [Google Scholar]
- Tatarian, K.; Stower, R.; Rudaz, D.; Chamoux, M.; Kappas, A.; Chetouani, M. How does Modality Matter? Investigating the Synthesis and Effects of Multi-modal Robot Behavior on Social Intelligence. Int. J. Soc. Robot. 2021, 14, 893–911. [Google Scholar] [CrossRef]
- Tsiourti, C.; Weiss, A.; Wac, K.; Vincze, M. Multimodal Integration of Emotional Signals from Voice, Body, and Context: Effects of (In)Congruence on Emotion Recognition and Attitudes Towards Robots. Int. J. Soc. Robot. 2019, 11, 555–573. [Google Scholar] [CrossRef] [Green Version]
- Feng, Y.; Perugia, G.; Yu, S.; Barakova, E.I.; Hu, J.; Rauterberg, G.W.M. Context-Enhanced Human-Robot Interaction: Exploring the Role of System Interactivity and Multimodal Stimuli on the Engagement of People with Dementia. Int. J. Soc. Robot. 2021, 14, 807–826. [Google Scholar] [CrossRef]
- Friedman, N.; Goedicke, D.; Zhang, V.; Rivkin, D.; Jenkin, M.R.M.; Degutyte, Z.; Astell, A.J.; Liu, X.; Dudek, G. Out of My Way! Exploring Different Modalities for Robots to Ask People to Move Out of the Way. 2020. Available online: https://www.semanticscholar.org/paper/Out-of-my-way!-Exploring-Different-Modalities-for-Friedman-Goedicke/c7467ad74a5f72871019d6eb2e24c907b6de108e (accessed on 13 April 2022).
- Johnson, D.O.; Agah, A. Human Robot Interaction Through Semantic Integration of Multiple Modalities, Dialog Management, and Contexts. Int. J. Soc. Robot. 2009, 1, 283–305. [Google Scholar] [CrossRef] [Green Version]
- Kang, S.H.; Han, J.H. Video Captioning Based on Both Egocentric and Exocentric Views of Robot Vision for Human-Robot Interaction. Int. J. Soc. Robot. 2021. [Google Scholar] [CrossRef]
- Kragic, D.; Vincze, M. Vision for Robotics. Found. Trends Robot. 2010, 1, 1–78. [Google Scholar] [CrossRef]
- Ronchi, M.R. Vision for Social Robots: Human Perception and Pose Estimation. Ph.D. Thesis, California Institute of Technology, Pasadena, CA, USA, 2020. [Google Scholar]
- Garcia-Salguero, M.; Gonzalez-Jimenez, J.; Moreno, F.A. Human 3D Pose Estimation with a Tilting Camera for Social Mobile Robot Interaction. Sensors 2019, 19, 4943. [Google Scholar] [CrossRef] [PubMed] [Green Version]
- Pathi, S.K.; Kiselev, A.; Kristoffersson, A.; Repsilber, D.; Loutfi, A. A Novel Method for Estimating Distances from a Robot to Humans Using Egocentric RGB Camera. Sensors 2019, 19, 3142. [Google Scholar] [CrossRef] [Green Version]
- Kostavelis, I.; Vasileiadis, M.; Skartados, E.; Kargakos, A.; Giakoumis, D.; Bouganis, C.S.; Tzovaras, D. Understanding of Human Behavior with a Robotic Agent through Daily Activity Analysis. Int. J. Soc. Robot. 2019, 11, 437–462. [Google Scholar] [CrossRef] [Green Version]
- Gurpinar, C.; Uluer, P.; Akalin, N.; Kose, H. Sign Recognition System for an Assistive Robot Sign Tutor for Children. Int. J. Soc. Robot. 2020, 12, 355–369. [Google Scholar] [CrossRef]
- Cosar, S.; Fernandez-Carmona, M.; Agrigoroaie, R.; Pages, J.; Ferland, F.; Zhao, F.; Yue, S.; Bellotto, N.; Tapus, A. ENRICHME: Perception and Interaction of an Assistive Robot for the Elderly at Home. Int. J. Soc. Robot. 2020, 12, 779–805. [Google Scholar] [CrossRef] [Green Version]
- Al-Abdullah, A.; Al-Ajmi, A.; Al-Mutairi, A.; Al-Mousa, N.; Al-Daihani, S.; Karar, A.S.; Alkork, S. Artificial Neural Network for Arabic Speech Recognition in Humanoid Robotic Systems. In Proceedings of the 2019 3rd International Conference on Bio-engineering for Smart Technologies (BioSMART), Paris, France, 24–26 April 2019; pp. 1–4. [Google Scholar] [CrossRef]
- Gao, J.; Galley, M.; Li, L. Neural Approaches to Conversational AI: Question Answering, Task-Oriented Dialogues and Social Chatbots; Now Foundations and Trends: Hanover, MA, USA, 2019. [Google Scholar]
- Dzakwan, G.; Purwarianti, A. Comparative Study of Topology and Feature Variants for Non-Task-Oriented Chatbot using Sequence to Sequence Learning. In Proceedings of the 5th International Conference on Advanced Informatics: Concept Theory and Applications (ICAICTA), Krabi, Thailand, 14–17 August 2018. [Google Scholar]
- Pham, K.T.; Nabizadeh, A.; Selek, S. Artificial Intelligence and Chatbots in Psychiatry. Psychiatr. Q. 2022, 93, 249–253. [Google Scholar] [CrossRef]
- Grassi, L.; Recchiuto, C.T.; Sgorbissa, A. Knowledge-Grounded Dialogue Flow Management for Social Robots and Conversational Agents. Int. J. Soc. Robot. 2022. [Google Scholar] [CrossRef]
- Briggs, G.; Williams, T.; Jackson, R.B.; Scheutz, M. Why and How Robots Should Say ‘No’. Int. J. Soc. Robot. 2022, 14, 323–339. [Google Scholar] [CrossRef]
- Brown, T.B.; Mann, B.; Ryder, N.; Subbiah, M.; Kaplan, J.; Dhariwal, P.; Neelakantan, A.; Shyam, P.; Sastry, G.; Askell, A.; et al. Language Models are Few-Shot Learners. Adv. Neural Inf. Process. Syst. 2020, 33, 1877–1901. [Google Scholar]
- Psychiatry.org—DSM. Available online: www.dsm5.org (accessed on 5 January 2022).
- Shuster, K.; Poff, S.; Chen, M.; Kiela, D.; Weston, J. Retrieval Augmentation Reduces Hallucination in Conversation. arXiv 2021, arXiv:2104.07567. [Google Scholar]
- Maynez, J.; Narayan, S.; Bohnet, B.; McDonald, R. On Faithfulness and Factuality in Abstractive Summarization. In Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics; Association for Computational Linguistics, Online, 5–10 July 2020; pp. 1906–1919. [Google Scholar] [CrossRef]
- Hazourli, A.; Djeghri, A.; Salam, H.; Othmani Morgan, A. Multi-facial patches aggregation network for facial expression recognition and facial regions contributions to emotion display. Multimed. Tools Appl. 2021, 80, 13639–13662. [Google Scholar] [CrossRef]
- Aly, A.; Tapus, A. On Designing Expressive Robot Behavior: The Effect of Affective Cues on Interaction. SN Comput. Sci. 2020, 1, 314. [Google Scholar] [CrossRef]
- Boucenna, S.; Gaussier, P.; Andry, P.; Hafemeister, L. Imitation as a Communication Tool for Online Facial Expression Learning and Recognition. In Proceedings of the 2010 IEEE/RSJ International Conference on Intelligent Robots and Systems, Taipei, Taiwan, 18–22 October 2010; pp. 5323–5328. [Google Scholar] [CrossRef] [Green Version]
- Ashraf, A.B.; Lucey, S.; Cohn, J.F.; Chen, T.; Ambadar, Z.; Prkachin, K.M.; Solomon, P.E. The painful face–pain expression recognition using active appearance models. Image Vis. Comput. 2009, 27, 1788–1796. [Google Scholar] [CrossRef] [PubMed] [Green Version]
- Faraj, Z.; Selamet, M.; Morales, C.; Torres, P.; Hossain, M.; Chen, B.; Lipson, H. Facially expressive humanoid robotic face. HardwareX 2021, 9, e00117. [Google Scholar] [CrossRef]
- SoftBank Robotics|Humanoid and Programmable Robots. Available online: https://www.softbankrobotics.com/emea/en (accessed on 5 March 2022).
- NAO the Humanoid and Programmable Robot|SoftBank Robotics. Available online: https://www.softbankrobotics.com/emea/en/nao (accessed on 5 March 2022).
- Frid, E.; Bresin, R. Perceptual Evaluation of Blended Sonification of Mechanical Robot Sounds Produced by Emotionally Expressive Gestures: Augmenting Consequential Sounds to Improve Non-verbal Robot Communication. Int. J. Soc. Robot. 2021, 14, 357–372. [Google Scholar] [CrossRef]
- Johnson, D.O.; Cuijpers, R.H. Investigating the Effect of a Humanoid Robot’s Head Position on Imitating Human Emotions. Int. J. Soc. Robot. 2018, 11, 65–74. [Google Scholar] [CrossRef]
- Pepper the Humanoid and Programmable Robot|SoftBank Robotics. Available online: https://www.softbankrobotics.com/emea/en/pepper (accessed on 5 March 2022).
- ASIMO by Honda|The World’s Most Advanced Humanoid Robot. Available online: https://asimo.honda.com/ (accessed on 5 March 2022).
- Sakagami, Y.; Watanabe, R.; Aoyama, C.; Matsunaga, S.; Higaki, N.; Fujimura, K. The intelligent ASIMO: System overview and integration. In Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems, Lausanne, Switzerland, 30 September–4 October 2002; Volume 3, pp. 2478–2483. [Google Scholar] [CrossRef]
- Metta, G.; Natale, L.; Nori, F.; Sandini, G.; Vernon, D.; Fadiga, L.; von Hofsten, C.; Rosander, K.; Lopes, M.; Santos-Victor, J.; et al. The iCub humanoid robot: An open-systems platform for research in cognitive development. Neural Netw. 2010, 23, 1125–1134. [Google Scholar] [CrossRef]
- Al Moubayed, S.; Beskow, J.; Skantze, G.; Granstrom, B. Furhat: A back-projected human-like robot head for multiparty human-machine interaction. In Cognitive Behavioural Systems; Lecture Notes in Computer Science; Springer: Berlin/Heidelberg, Germany, 2012. [Google Scholar]
- McGinn, C.; Bourke, E.; Murtagh, A.; Cullinan, M.; Kelly, K. Exploring the application of design thinking to the development of service robot technology. In Proceedings of the ICRA2018 Workshop on Elderly Care Robotics-Technology and Ethics (WELCARO), Brisbane, Australia, 21–25 May 2018. [Google Scholar]
- OPSORO · GitHub. Available online: https://github.com/OPSORO (accessed on 5 March 2022).
- Vandevelde, C.; Wyffels, F.; Vanderborght, B.; Saldien, J. Do-It-Yourself Design for Social Robots: An Open-Source Hardware Platform to Encourage Innovation. IEEE Robot. Autom. Mag. 2017, 24, 86–94. [Google Scholar] [CrossRef] [Green Version]
- Reeti|Robot. Available online: https://robots.nu/en/robot/reeti (accessed on 12 June 2022).
- PARO Therapeutic Robot. Available online: http://www.parorobots.com/index.asp (accessed on 5 March 2022).
- Paro—ROBOTS: Your Guide to the World of Robotics. Available online: https://robots.ieee.org/robots/paro/ (accessed on 5 March 2022).
- Tega Robot. Available online: https://www.wevolver.com/wevolver.staff/tega.robot (accessed on 5 March 2022).
- Overview|Huggable: A Social Robot for Pediatric Care—MIT Media Lab. Available online: https://www.media.mit.edu/projects/huggable-a-social-robot-for-pediatric-care/overview/ (accessed on 5 March 2022).
- Social Robots can Benefit Hospitalized Children|MIT News|Massachusetts Institute of Technology. Available online: https://news.mit.edu/2019/social-robots-benefit-sick-children-0626 (accessed on 15 March 2022).
- Sophia—Hanson Robotics. Available online: https://www.hansonrobotics.com/sophia/ (accessed on 25 January 2022).
- Park, S.; Lee, H.; Hanson, D.; Oh, P.Y. Sophia-Hubo’s Arm Motion Generation for a Handshake and Gestures. In Proceedings of the 15th International Conference on Ubiquitous Robots (UR), Honolulu, HI, USA, 26–30 June 2018. [Google Scholar]
- The Furhat Robot|Furhat Robotics. Available online: https://furhatrobotics.com/furhat-robot/ (accessed on 25 January 2022).
- Aibo. Available online: https://us.aibo.com/ (accessed on 17 March 2022).
- Aibo—ROBOTS: Your Guide to the World of Robotics. Available online: https://robots.ieee.org/robots/aibo2018/ (accessed on 17 March 2022).
- Schellin, H.; Oberley, T.; Patterson, K.; Kim, B.; Haring, K.S.; Tossell, C.C.; Phillips, E.; Visser, E.J.d. Man’s New Best Friend? Strengthening Human-Robot Dog Bonding by Enhancing the Doglikeness of Sony’s Aibo. In Proceedings of the 2020 Systems and Information Engineering Design Symposium (SIEDS), Charlottesville, VA, USA, 24 April 2020; pp. 1–6. [Google Scholar] [CrossRef]
- Nao—ROBOTS: Your Guide to the World of Robotics. Available online: https://robots.ieee.org/robots/nao/ (accessed on 5 March 2022).
- Honda Global|ASIMO. Available online: https://global.honda/innovation/robotics/ASIMO.html (accessed on 5 March 2022).
- Asimo—ROBOTS: Your Guide to the World of Robotics. Available online: https://robots.ieee.org/robots/asimo/ (accessed on 5 March 2022).
- Giraff. Available online: https://robots.nu/en/robot/giraff-telepresence-robot (accessed on 15 March 2022).
- Westlund, J.K.; Lee, J.J.; Plummer, L.; Faridi, F.; Gray, J.; Berlin, M.; Quintus-Bosz, H.; Hartmann, R.; Hess, M.; Dyer, S.; et al. Tega: A social robot. In Proceedings of the 2016 11th ACM/IEEE International Conference on Human-Robot Interaction (HRI), Christchurch, New Zealand, 7–10 March 2016; p. 561. [Google Scholar] [CrossRef] [Green Version]
- Sooyeon, J.; Dos Santos, K.; Graca, S.; O’Connell, B.; Anderson, L.; Stenquist, N.; Fitzpatrick, K.; Goodenough, H.; Logan, D.; Weinstock, P.; et al. Designing a Socially Assistive Robot for Pediatric Care. In Proceedings of the 14th International Conference on Interaction Design and Children, Medford, MA, USA, 21–25 June 2015. [Google Scholar]
- Said, S.; AlKork, S.; Beyrouthy, T.; Abdrabbo, M.F. Wearable bio-sensors bracelet for driver’s health emergency detection. In Proceedings of the 2017 2nd International Conference on Bio-Engineering for Smart Technologies (BioSMART), Paris, France, 30 August–1 September 2017; pp. 1–4. [Google Scholar] [CrossRef]
- Said, S.; Boulkaibet, I.; Sheikh, M.; Karar, A.S.; Alkork, S.; Nait-ali, A. Machine-Learning-Based Muscle Control of a 3D-Printed Bionic Arm. Sensors 2020, 20, 3144. [Google Scholar] [CrossRef] [PubMed]
- Zhe, C.; Simon, T.; Wei, S.E.; Sheikh, Y. Realtime Multi-Person 2D Pose Estimation using Part Affinity Fields. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Honolulu, HI, USA, 21–26 July 2017. [Google Scholar]
- Pennington, J.; Socher, R.; Manning, C.D. Glove: Global Vectors for Word Representation. In Proceedings of the Empirical Methods in Natural Language Processing (EMNLP), Doha, Qatar, 25–29 October 2014. [Google Scholar]
- Dibia, V. HandTrack: A library for prototyping real-time hand tracking interfaces using convolutional neural networks. Github Repos. 2017, 3, 6. [Google Scholar]
- Roshdy, A.; Karar, A.S.; Al-Sabi, A.; Barakeh, Z.A.; El-Sayed, F.; Alkork, S.; Beyrouthy, T.; Nait-ali, A. Towards Human Brain Image Mapping for Emotion Digitization in Robotics. In Proceedings of the 2019 3rd International Conference on Bio-engineering for Smart Technologies (BioSMART), Paris, France, 24–26 April 2019; pp. 1–5. [Google Scholar] [CrossRef]
- Roshdy, A.; Al Kork, S.; Karar, A.; Al Sabi, A.; Al Barakeh, Z.; ElSayed, F.; Beyrouthy, T.; NAIT-ALI, A. Machine Empathy: Digitizing Human Emotions. In Proceedings of the 2021 International Symposium on Electrical, Electronics and Information Engineering, Seoul, Korea, 19–21 February 2021; Association for Computing Machinery: New York, NY, USA, 2021; pp. 307–311. [Google Scholar] [CrossRef]
- Sakhineti, M.; Jayabalan, S. Design and Fabrication of SHRALA: Social Humanoid Robot Based on Autonomous Learning Algorithm. In Proceedings of the Third International Conference on Computing and Network Communications (CoCoNet), Trivandrum, India, 18–21 December 2019. [Google Scholar]
- Tiansong, L.; Feng, G.; Yilong, Y. Design of Low-cost Desktop Robot Based on 3D Printing Technology and Open-source Control System. In Proceedings of the IEEE 3rd Information Technology, Networking, Electronic and Automation Control Conference (ITNEC), Chengdu, China, 15–17 March 2019. [Google Scholar]
- Potnuru, A.; Jafarzadeh, M.; Tadesse, Y. 3D printed dancing humanoid robot “Buddy” for homecare. In Proceedings of the IEEE International Conference on Automation Science and Engineering (CASE), Fort Worth, TX, USA, 21–25 August 2016. [Google Scholar]
- Al-Omary, A.; Akram, M.M.; Dhamodharan, V. Design and Implementation of Intelligent Socializing 3D Humanoid Robot. In Proceedings of the International Conference on Innovation and Intelligence for Informatics, Computing, and Technologies (3ICT), Virtual, 29–30 September 2021. [Google Scholar]
- Romeo, J. Why Additive Manufacturing and 3D Printing Benefits Robot Creators. Robot. Bus. Rev. 2019. Available online: https://www.roboticsbusinessreview.com/wp-content/uploads/2019/04/RBR-AdditiveManufacturing-RobotCreators-Final.pdf (accessed on 15 March 2022).
- Saini, J.; Chew, E. Recent Trends in Mechatronics Towards Industry; Springer: Berlin/Heidelberg, Germany, 2021; pp. 275–287. [Google Scholar]
- Cheng, H.; Ji, G. Design and implementation of a low cost 3D printed humanoid robotic platform. In Proceedings of the 2016 IEEE International Conference on Cyber Technology in Automation, Control, and Intelligent Systems (CYBER), Chengdu, China, 19–22 June 2016; pp. 86–91. [Google Scholar] [CrossRef]
- Sidher, A.; Shen, Y. Improving a 3D-printed artificial anthropomorphic hand using the human hand model. In Proceedings of the 2017 IEEE International Conference on Real-time Computing and Robotics (RCAR), Okinawa, Japan, 14–18 July 2017; pp. 739–744. [Google Scholar] [CrossRef]
- Berra, R.; Setti, F.; Cristani, M. Berrick: A low-cost robotic head platform for human-robot interaction. In Proceedings of the 2019 IEEE International Conference on Systems, Man and Cybernetics (SMC), Bari, Italy, 6–9 October 2019; pp. 559–566. [Google Scholar] [CrossRef]
- Delda, R.N.M.; Basuel, R.B.; Hacla, R.P.; Martinez, D.W.C.; Cabibihan, J.J.; Dizon, J.R.C. 3D Printing Polymeric Materials for Robots with Embedded Systems. Technologies 2021, 9, 82. [Google Scholar] [CrossRef]
- Netzev, M.; Houbre, Q.; Airaksinen, E.; Angleraud, A.; Pieters, R. Many Faced Robot - Design and Manufacturing of a Parametric, Modular and Open Source Robot Head. In Proceedings of the 16th International Conference on Ubiquitous Robots (UR), Jeju, Korea, 24–27 June 2019. [Google Scholar]
- Harrison, A.M.; Xu, W.M.; Trafton, J.G. User-Centered Robot Head Design: A Sensing Computing Interaction Platform for Robotics Research (SCIPRR). In Proceedings of the 13th ACM/IEEE International Conference on Human-Robot Interaction (HRI), Chicago, IL, USA, 5–8 March 2018. [Google Scholar]
- Chen, H.; Mooring, B.; Stern, H. Dynamic wafer handling process in semiconductor manufacturing. In Proceedings of the IEEE International Conference on Robotics and Biomimetics (ROBIO), Karon Beach, Thailand, 7–11 December 2011. [Google Scholar]
- How Robotics Have Revolutionized Semiconductor Manufacturing. Available online: https://www.modutek.com/how-robotics-have-revolutionized-semiconductor-manufacturing/ (accessed on 24 January 2022).
- Ruiz, P.Y. Semiconductor solutions for robotics, a special focus on drives technologies. Infineon Technol. AG 2018. [Google Scholar]
- GPUs and the Future of Robotics—Robotics Business Review. Available online: https://www.roboticsbusinessreview.com/rbr/gpus-and-the-future-of-robotics/ (accessed on 15 March 2022).
- Yan, F.; Tran, D.M.; He, H. Robotic Understanding of Object Semantics by Referring to a Dictionary. Int. J. Soc. Robot. 2020, 12, 1251–1263. [Google Scholar] [CrossRef]
- Kim, M.; Kwon, T.; Kim, K. Can Human–Robot Interaction Promote the Same Depth of Social Information Processing as Human–Human Interaction? Int. J. Soc. Robot. 2017, 10, 33–42. [Google Scholar] [CrossRef]
- Savery, R.; Rose, R.; Weinberg, G. Finding Shimi’S Voice: Fostering Human-Robot Communication with Music and a Nvidia Jetson TX2. In Proceedings of the 17th Linux Audio Conference, Stanford, CA, USA, 23–26 March 2019. [Google Scholar]
- NVIDIA Isaac SDK|NVIDIA Developer. Available online: https://developer.nvidia.com/isaac-sdk (accessed on 15 March 2022).
- SoftBank Robotics Documentation. Available online: http://doc.aldebaran.com/2-1/family/robots/motherboard_robot.html#robot-motherboard (accessed on 12 March 2022).
- SoftBank Robotics Documentation. Available online: http://doc.aldebaran.com/2-4/family/pepper_technical/motherboard_pep.html (accessed on 12 March 2022).
- Sophia—ROBOTS: Your Guide to the World of Robotics. Available online: https://robots.ieee.org/robots/sophia/ (accessed on 5 March 2022).
- ARI—PAL Robotics: Leading Service Robotics. Available online: https://pal-robotics.com/robots/ari/ (accessed on 13 March 2022).
- QTrobot—ROBOTS: Your Guide to the World of Robotics. Available online: https://robots.ieee.org/robots/qtrobot/ (accessed on 15 March 2022).
- Furhat Platform; Technical Report; Furhat Robotics: Stockholm, Sweden, 2021.
- ROS: Home. Available online: https://www.ros.org/ (accessed on 5 March 2022).
- #Tags. Available online: https://robots.ros.org/tags/#social (accessed on 10 March 2022).
- QTrobot. Available online: https://robots.ros.org/qtrobot/ (accessed on 15 March 2022).
- ARI. Available online: https://robots.ros.org/ari/ (accessed on 15 March 2022).
- Fu, G.; Zhang, X. ROSBOT: A low-cost autonomous social robot. In Proceedings of the 2015 IEEE International Conference on Advanced Intelligent Mechatronics (AIM), Busan, Korea, 7–11 July 2015; pp. 1789–1794. [Google Scholar] [CrossRef]
- Infantino, I.; Augello, A.; Maniscalco, U.; Pilato, G.; Storniolo, P.; Vella, F. A ROS Architecture for a Cognitive Social Robot. 2018. Available online: https://www.semanticscholar.org/paper/A-ROS-architecture-for-a-Cognitive-Social-Robot-Infantino-Augello/306f90831a6db3e9f425a3c9a5cbdc3ebbd7b7a6 (accessed on 15 March 2022).
- Martín, F.; Rodríguez Lera, F.J.; Ginés, J.; Matellán, V. Evolution of a Cognitive Architecture for Social Robots: Integrating Behaviors and Symbolic Knowledge. Appl. Sci. 2020, 10, 6067. [Google Scholar] [CrossRef]
- Adam, C.; Johal, W.; Pellier, D.; Fiorino, H.; Pesty, S. Social Human-Robot Interaction: A New Cognitive and Affective Interaction-Oriented Architecture. In Proceedings of the International Conference on Social Robotics, Kansas City, MO, USA, 1–3 November 2016; Volume 9979, pp. 253–263. [Google Scholar] [CrossRef] [Green Version]
- UXA-90 Humanoid Robot|Robobuilder Co., Ltd. Available online: https://www.robobuilder.net/uxa-90 (accessed on 2 June 2022).
- SoftBank Robotics Documentation. Available online: http://doc.aldebaran.com/2-5/index_dev_guide.html# (accessed on 12 March 2022).
- OpenNAO—NAO OS—NAO Software 1.14.5 Documentation. Available online: http://doc.aldebaran.com/1-14/dev/tools/opennao.html (accessed on 12 April 2022).
- Tsardoulias, E.; Mitkas, P. Robotic frameworks, architectures and middleware comparison. arXiv 2017, arXiv:1711.06842.
- Mohamed, Y.; Lemaignan, S. ROS for Human-Robot Interaction. In Proceedings of the 2021 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Prague, Czech Republic, 27 September–1 October 2021.
- People—ROS Wiki. Available online: https://wiki.ros.org/people (accessed on 10 March 2022).
- Cob_People_Detection—ROS Wiki. Available online: http://wiki.ros.org/cob_people_detection (accessed on 10 March 2022).
- Fong, T.; Kunz, C.; Hiatt, L.M.; Bugajska, M. The Human-Robot Interaction Operating System. In Proceedings of the 1st ACM SIGCHI/SIGART Conference on Human-Robot Interaction, Salt Lake City, UT, USA, 2–3 March 2006; Association for Computing Machinery: New York, NY, USA, 2006; pp. 41–48.
- Trafton, G.; Hiatt, L.; Harrison, A.; Tamborello, F.; Khemlani, S.; Schultz, A. ACT-R/E: An embodied cognitive architecture for human-robot interaction. J. Hum. Robot Interact. 2013, 2, 30–55.
- Lim, V.; Rooksby, M.; Cross, E.S. Social Robots on a Global Stage: Establishing a Role for Culture During Human–Robot Interaction. Int. J. Soc. Robot. 2021, 13, 1307–1333.
- Fortunati, L.; Manganelli, A.M.; Höflich, J.; Ferrin, G. Exploring the Perceptions of Cognitive and Affective Capabilities of Four, Real, Physical Robots with a Decreasing Degree of Morphological Human Likeness. Int. J. Soc. Robot. 2021.
- Dautenhahn, K.; Woods, S.; Kaouri, C.; Walters, M.; Koay, K.; Werry, I. What is a robot companion-Friend, assistant or butler? In Proceedings of the 2005 IEEE/RSJ International Conference on Intelligent Robots and Systems, Edmonton, AB, Canada, 2–6 August 2005; pp. 1192–1197.
- Bartneck, C.; Kulić, D.; Croft, E.; Zoghbi, S. Measurement Instruments for the Anthropomorphism, Animacy, Likeability, Perceived Intelligence, and Perceived Safety of Robots. Int. J. Soc. Robot. 2009, 1, 71–81.
- Heerink, M.; Kröse, B.; Evers, V.; Wielinga, B. Assessing Acceptance of Assistive Social Agent Technology by Older Adults: The Almere Model. Int. J. Soc. Robot. 2010, 2, 361–375.
- Venkatesh, V.; Morris, M.G.; Davis, G.B.; Davis, F.D. User Acceptance of Information Technology: Toward a Unified View. Manag. Inf. Syst. Q. 2003, 27, 425–478.
- Anzalone, S.; Boucenna, S.; Ivaldi, S.; Chetouani, M. Evaluating the Engagement with Social Robots. Int. J. Soc. Robot. 2015, 7, 465–478.
- Rueben, M.; Elprama, S.A.; Chrysostomou, D.; Jacobs, A. Introduction to (re)using questionnaires in human-robot interaction research. In Human-Robot Interaction: Evaluation Methods and Their Standardization; Jost, C., Le Pévédic, B., Belpaeme, T., Bethel, C., Chrysostomou, D., Crook, N., Grandgeorge, M., Mirnig, N., Eds.; Springer International Publishing: Cham, Switzerland, 2020; pp. 125–144.
- Avioz-Sarig, O.; Olatunji, S.; Sarne-Fleischmann, V.; Edan, Y. Robotic System for Physical Training of Older Adults. Int. J. Soc. Robot. 2020, 13, 1109–1124.
- White Paper on Artificial Intelligence: A European Approach to Excellence and Trust. Available online: https://ec.europa.eu/info/publications/white-paper-artificial-intelligence-european-approach-excellence-and-trust_en (accessed on 4 July 2022).
- Mazzeo, G.; Staffa, M. TROS: Protecting Humanoids ROS from Privileged Attackers. Int. J. Soc. Robot. 2020, 12, 827–841.
- Yaacoub, J.P.A.; Noura, H.N.; Salman, O.; Chehab, A. Robotics cyber security: Vulnerabilities, attacks, countermeasures, and recommendations. Int. J. Inf. Secur. 2022, 21, 115–158.
- Miller, J.; Williams, A.; Perouli, D. A Case Study on the Cybersecurity of Social Robots. In Proceedings of the Companion of the 2018 ACM/IEEE International Conference on Human-Robot Interaction, Chicago, IL, USA, 5–8 March 2018; pp. 195–196.
- Fortunati, L.; Sorrentino, A.; Fiorini, L.; Cavallo, F. The Rise of the Roboid. Int. J. Soc. Robot. 2021, 13, 1457–1471.
| Study | Robot | Research Goal/Application | Targets/End Users |
|---|---|---|---|
| Shiarlis et al. [15] | TERESA | Telepresence-behavior in interaction | Unspecified |
| Shiarlis et al. [22] | TERESA | Telepresence-participation in social events | Elderly |
| Niemelä et al. [23] | Double | Telepresence-communication with family members | Elderly |
| Zhang et al. [24] | Unspecified | Telepresence-control with eye gaze | Persons with motor disabilities |
| Hood et al. [25] | Nao | Handwriting learning | Children |
| Engwall et al. [26] | Furhat | Second language learning | Various |
| Kanero et al. [27] | Nao | Second language learning | Adults |
| Shimaya et al. [28] | CommU | Communication in lectures | Students and lecturers |
| Reyes et al. [29] | Nao | Assistance in class | Students and lecturers |
| Vogt et al. [30] | Nao | Second language learning | Children |
| Schicchi et al. [21] | Pepper | Vocabulary enhancement | Children |
| Obayashi et al. [31] | Mon-chan | Care in nursing homes | Elderly |
| McGinn et al. [31] | Stevie | Care in a care facility | Residents and staff |
| Luperto et al. [32] | Giraff-X | Assistance at home | Elderly |
| Ismail et al. [33] | LUCA | Analysis of attention | Children with cognitive impairment |
| van der Putte et al. [20] | Pepper | Data acquisition | Hospitalized patients |
| Schrum et al. [34] | Pepper | Encouraging physical exercise | Dementia patients |
| Moharana et al. [35] | Different robots designed | Designing robots for dementia caregiving | Dementia caregiver support groups |
| Anzalone et al. [36] | Nao | Environment perception | Children with ASD |
| Huijnen et al. [37] | Kaspar | Making contact and catching attention | Children with ASD |
| Taheri et al. [38] | Nao | Body gesture imitation | Children with ASD |
| Chung [39] | Nao | Enhancement of social skills | Children with ASD |
| Striepe et al. [40] | Reeti | Implementing behaviors of a robot storyteller | Persons aged 18 to 30 |
| Desideri et al. [41] | Nao | Studying gaze aversion in human-robot interaction | Children |
| Filippini et al. [42] | Mio Amico | Assessing the emotional state of the robot's interlocutor | Children |
| Uluer et al. [43] | Pepper | Assistance for hearing disabilities | Children |
| Castellano et al. [19] | Pepper | Improving attitudes toward recycling | Children |
| Iio et al. [44] | ASIMO | Guidance in a science museum | Various (museum visitors) |
| Shi et al. [45] | Robovie | Flyer distribution | Various (pedestrians in a shopping mall) |
| Robot | Appearance | Height (cm) | D.o.F. | Features |
|---|---|---|---|---|
| Nao [118,141] | Humanoid | 58 | 25 | Touch sensors, directional microphones and speakers, 2D cameras, embedded speech recognition and dialogue, programmable, etc. |
| Pepper [121] | Humanoid | 120 | 20 | Touch sensors, microphones and speakers, 2D and 3D cameras, embedded speech recognition and dialogue, programmable, etc. |
| ASIMO [142,143] | Humanoid | 130 | 57 | Different proprioceptive and exteroceptive sensors for motion tracking, obstacle detection, image and sound acquisition, etc. |
| Kaspar [77] | Humanoid | 55 | 22 | Color camera, Kinect and IMU sensors, semi-autonomous operation, Wi-Fi/Ethernet connection. |
| TERESA [15] | Unspecified | Unspecified | Unspecified | Semi-autonomous navigation, different proprioceptive and exteroceptive sensors for motion tracking, obstacle detection, image and sound acquisition, etc. |
| Furhat [125] | Human-like face | 41 | 3 | Onboard camera and microphones, speech recognition and synthesis, eye contact, etc. |
| Sophia [135] | Humanoid | 167 | 83 | Several cameras, audio localization array, complex and emotional expressions, natural language processing, visual tracking, etc. |
| Giraff [144] | Telepresence | Unspecified | Unspecified | LCD screen, remote control, provides audio and visual cues to the user, data collection for health professionals, etc. |
| Paro [130,131] | Animal (seal) | Unspecified | Unspecified | Different kinds of sensors, learns to behave as the user prefers, can move its head and legs, etc. |
| Tega [132,145] | Unspecified | 34.54 | 5 | Microphone, camera and accelerometer sensors, autonomous or remote operation, ability to generate behaviors and facial expressions, etc. |
| Huggable [133,146] | Teddy bear | Unspecified | 12 | Can perceive physical touch, different other sensors, controlled by an application on a smartphone, teleoperation interface, etc. |
| Aibo [138,139] | Dog-like | 29.3 | 22 | Different sensors, cameras, microphones, can recognize faces and voices, capable of simultaneous localization and mapping, etc. |
| Robot | Processor | Processor Features | RAM | GPU |
|---|---|---|---|---|
| Nao V5 & V4 [174] | ATOM Z530 | 1.6 GHz clock speed | 1 GB | None |
| Pepper V1.6 [175] | ATOM E3845 | 1.91 GHz clock speed, quad-core | 4 GB DDR3 | None |
| Sophia [176] | Intel i7 | 3 GHz | 32 GB | Integrated GPU |
| ARI [177] | Intel i5/i7 | Unspecified | 8 GB/16 GB | NVIDIA Jetson TX2 |
| QTrobot [178] | Intel NUC i7 | Unspecified | 16 GB | None |
| Furhat [179] | Intel i5 | Up to 3.4 GHz | 8 GB | None |
| Giraff [144] | Intel i7 | Unspecified | 8 GB | NVIDIA Jetson TX2 in [32] |
| ASIMO [122] | Unspecified | Unspecified | Unspecified | Unspecified |
| Huggable [146] | Computational power of an Android phone | Unspecified | Unspecified | Unspecified |
| Shimi [172] | ARM | Quad-core | 8 GB | NVIDIA Jetson TX2 |
| Aibo [139] | Qualcomm Snapdragon 820 | 64-bit quad-core | 4 GB | None |
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
© 2022 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).