Review

Robotic Psychology: A PRISMA Systematic Review on Social-Robot-Based Interventions in Psychological Domains

1 Department of Education, Literatures, Intercultural Studies, Languages and Psychology, University of Florence, 50135 Florence, Italy
2 Centre for the Study of Complex Dynamics, University of Florence, 50135 Florence, Italy
* Author to whom correspondence should be addressed.
J 2021, 4(4), 664-697; https://doi.org/10.3390/j4040048
Submission received: 26 July 2021 / Revised: 12 October 2021 / Accepted: 21 October 2021 / Published: 26 October 2021
(This article belongs to the Special Issue IT Support in the Healthcare Sector)

Abstract

Current technological advancements have allowed robots to be successfully employed in the healthcare sector. Moreover, the recently acquired ability of social robots to process social information and act upon it has made them potentially well suited to support or conduct psychological interventions. The present paper reports a systematic review of the available literature on social-robot-based interventions in psychological domains, conducted following the preferred reporting items for systematic reviews and meta-analyses (PRISMA) guidelines. The inclusion criteria were: (i) publication date until 2020; (ii) being an empirical study, master’s thesis, or project report; (iii) written in English or Italian (the two languages spoken by the authors); (iv) published in a scholarly peer-reviewed journal or conference proceedings, or being a Ph.D. or master’s thesis; and (v) assessing a “social robot”-based intervention in psychological domains. Overall, the review showed that three main areas may benefit from social-robot-based interventions: social skills, mood, and wellbeing (e.g., stress and anxiety levels). Interestingly, social robots seemed to perform comparably to, and sometimes even better than, human operators. The main, but not exclusive, target of robot-based interventions in the psychological field was children with autism spectrum disorder (ASD). As evidence is still limited and in an embryonic state, deeper investigations are needed to assess the full potential of social robots for the purposes of psychological intervention. This is relevant considering the role that social robots could have in overcoming barriers to accessing psychological assessment and therapies.

1. Introduction

Nowadays, artificial entities are becoming an integral part of our daily lives [1]. For instance, people are quite used to relying on vocal assistants such as Siri, Google Assistant, and Amazon Alexa to support their everyday activities.
The technological advancements of recent years, together with a new paradigm in robotic science emphasizing the “human-oriented” values of engineering design, have led researchers to equip some of these artificial entities with a physical embodiment [2] and, therefore, to let them interact in a more typically human fashion (e.g., proxemics, kinesics, tactile and multisensory stimulation) [3]. The growth of personal service robots has made them robust enough to be deployed in a plethora of settings, including healthcare, where they have been particularly beneficial [4,5,6]. Nonetheless, to function properly and serve the purposes of healthcare interventions, personal service robots had to learn to process social information and interact in a socially adequate and refined way. In other words, they evolved into social robots. Both scientific and non-scientific definitions see social robots as being capable of interacting with and working for humans [7]. More specifically, social robots are expected to meet various criteria, although to different degrees [8]. Social robots should: (i) avoid generating false expectations due to their shape, size, and material qualities; (ii) recognize, respond to, and employ, where possible, all modalities that humans naturally use to communicate; and (iii) be aware of human social rules and norms and behave accordingly.
Although the contribution that social robots have made to healthcare is recognized, much less attention has been given to the role that social robots can play in a specific healthcare application domain, namely psychology-based interventions [9,10]. Robotic psychology, or robopsychology, is a research field, not yet fully exploited, concerned with the study of compatibility between people and robotic creatures on multiple levels (i.e., sensory–motor, emotional, cognitive, and social) [11]. The proper use of social robots in the psychological field would exploit their potential even more, since relationality is a fundamental aspect of any intervention aimed at affecting people’s psychological dimensions [12,13,14]. Indeed, the effect that artificial entities have on human beings is not limited to mere instrumental support. Individuals may also satisfy relational needs through artificial entities [15,16] and, in general, such entities appear able to influence some aspects of people’s mental life with relative ease [17,18,19]. Human beings are naturally inclined to attribute mental states to inanimate objects and animals through processes known as anthropomorphization, mind perception, and emotional attachment [20,21,22,23]. Social robots, however, are not merely passive recipients of this human tendency: they have been developed to facilitate the attribution process through the implementation of human-like features that mimic human mental state representations and actions [24,25]. Thus, social robots appear particularly interesting for the purposes of psychological intervention due to their perceived similarity to human beings or other life-like creatures. In particular, researchers have used them for interventions with children with autism spectrum disorder (ASD) and to promote wellbeing (i.e., lowering stress and anxiety), topics that are of interest in this paper.
ASD is characterized by persistent deficits in social communication and interaction skills, but also repetitive and restricted behavior patterns, activities, and interests [26].
The evidence about the effectiveness of robot-based interventions in psychology has not yet been systematized to account for the multiple types of social robots employed. The aim of this article was to offer a systematic review of empirical findings on social-robot-based psychological interventions in order to (a) identify psychological pathologies/disorders/conditions that may benefit from robot-based interventions, (b) identify the specific psychological areas targeted by them, and (c) describe the types of robots most used in each domain of intervention.
The paper is organized as follows. In the Method and Procedures section, the systematic review methodology will be explained, including the search and selection strategy, together with the description of inclusion/exclusion criteria. In the Results section, the included studies will be presented in two subsections. The first will summarize the characteristics of the studies in terms of robots and activities proposed, target samples, and psychological dimensions addressed; the second will report the outcomes obtained in all the selected works. Moreover, the Risk of Bias paragraph will give an overview of the potential biases affecting the studies. Finally, the last section will present a critical discussion.

2. Methods and Procedures

Search and Selection Strategy

Our systematic review was carried out following the preferred reporting items for systematic reviews and meta-analyses (PRISMA) guidelines. As a first step, we searched for scientific studies about “social robot” psychotherapy and “robot-based” psychotherapy. The authors used the EBSCOhost platform and consulted the databases of Google Scholar, PsycInfo, PubMed, Science Direct, PsycArticles, Sociological Abstracts, and Academic Search Complete. The search terms were the following: (1) social robot psychotherapy and (2) robot-based psychotherapy. We chose these keywords because a broader and more inclusive definition of psychotherapy is often implicitly used by authors who have researched social-robot-based interventions. This definition can be summarized as the application of clinical methods and interpersonal stances aimed at modifying people’s behaviors, cognitions, emotions, or other personal characteristics [27,28].
The inclusion criteria were (i) publication date until 2020; (ii) being an empirical study, master’s thesis, or project report; (iii) written in English or Italian (the two languages spoken by the authors); (iv) published in a scholarly peer-reviewed journal or conference proceedings, or being a Ph.D. or master’s thesis; and (v) assessing a social-robot-based intervention in psychological domains. The search started in April 2020 and ended in November 2020. Records from the different repositories were merged into a single dataset after 102 duplicates were removed. Of the 639 results screened at the abstract level, only 119 appeared to investigate psychological dimensions using a social robot and were therefore deemed eligible and accessed in full text. Among these 119 works, 80 were excluded based on the following exclusion criteria: (i) the research did not really include a robot-based intervention, (ii) the intervention efficacy on relevant psychological variables was missing, (iii) the intervention was carried out through virtual agents lacking a physical embodiment, (iv) the data analysis was not suitable for the systematic review process (e.g., lack of descriptive statistics, no correlation coefficients provided for the variables of interest), or (v) the work was written in a language other than English or Italian. Finally, it was possible to identify 39 publications that were included in our systematic review (Figure 1).
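For transparency, the screening flow described above can be tallied in a few lines of code; the following is a minimal sketch, where the counts are those reported in this section and the stage labels are illustrative:

```python
# Illustrative tally of the PRISMA screening flow reported above.
# Counts come from the text; stage labels are for presentation only.
flow = [
    ("Records screened at abstract level (after removing 102 duplicates)", 639),
    ("Full-text articles assessed for eligibility", 119),
    ("Full-text articles excluded", 80),
    ("Studies included in the systematic review", 39),
]

def check_flow(flow):
    """Verify the arithmetic of the screening stages and return a summary."""
    screened, fulltext, excluded, included = (n for _, n in flow)
    assert fulltext - excluded == included, "included must equal full-text minus excluded"
    return {label: n for label, n in flow}

summary = check_flow(flow)
print(summary["Studies included in the systematic review"])  # → 39
```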

3. Results

3.1. Characteristics of the Studies

Table A1 shows the characteristics of the selected reports. The thirty-nine studies used various types of social robots to address multiple psychological dimensions. Overall, a total of nineteen different robots were considered (Table A3).
As we can gather from Figure 2, NAO is by far the most adopted social robot among the selected studies. NAO is the only robot that is applied consistently in multiple fields (4/5 domains), being the current standard (i.e., the most frequently adopted) in interventions regarding ASD, cancer, wellbeing, and mixed targeted disorders (in the latter case jointly with PARO).
NAO has a humanoid, child-like appearance; however, its look is simpler than that of a human operator. Thanks to this simplicity, it seems to engage patients in interaction without overburdening them with social stimuli. For this reason, it has been widely used with children with ASD. Moreover, since NAO can run new Python scripts (and collections of ready-made code and tutorials are freely available on the Internet), the robot is quite flexible and ready to be used in many different fields.
PROBO (second) and PARO (third) were the most used robots after NAO. PROBO uses human-like social cues but has an animal-like appearance, as does PARO. As in animal-assisted therapy, interacting with the robot and touching its fur are important therapeutic elements that elicit feelings of pleasantness and affection, possibly conveying a perception of social and emotional support. PROBO and PARO are soft and responsive robotic agents and have therefore been widely used to mediate therapeutic interventions.
However, nearly 79% of the robots presented in the selected studies were deployed only once and, thus, definitive conclusions about their effectiveness cannot be drawn. Finally, almost half of the studies involving social robots (47%) focused on ASD, with the other domains receiving little attention from social robot scholars.

3.2. Study Results Organized by Robot Employed

Table A2 shows the main findings of selected reports. As seen, NAO is the most widely used robot, particularly in interventions for children with ASD to improve some of the core symptoms of the disorder (e.g., social skills). Findings in this field are mixed.
On the one hand, results underlined a clear trend, indicating positive effects of the robot in fostering the children’s socio-emotional understanding [29], imitation [30,31], and eye contact [32,33,34] as relevant elements to support social interactions. In these domains, the robot was able not only to match the positive effects of human-based interventions but, in some cases, to exceed them. The appearance of NAO seemed to play an important role: its simple look, compared with human interlocutors, made it able to draw the children’s interest without overwhelming them, thus helping sustain the interaction. Indeed, children did not avoid the robot during conversations as they did with humans, specifically teachers [34]. This could be considered for future perspectives, since NAO could potentially be utilized in tailored learning paths for children with ASD.
In social interaction, the aspect of turn taking is important to regulate reciprocal exchanges, and NAO has also been used to enhance this social skill. Overall, turn-taking was improved [30], and findings suggested similar benefits from the robot-based intervention and the standard human one, thus indicating the robot’s social ability to match humans’ beneficial effects [31,33]. Moreover, children showed a tendency to be more interested in the robot than the human operator during the interaction.
To fully analyze the emerging potential of NAO in this field, some adjustments should be made to research design parameters. In particular, more interaction sessions, as well as larger samples, should be considered, since these studies appeared severely underpowered due to their sample sizes and, consequently, were likely unable to correctly reject the null hypothesis (i.e., to distinguish between the conditions). A larger sample would also allow comparisons between conditions in a more refined way (i.e., by capturing smaller differences in terms of effect size), thus allowing a true investigation into which NAO features are more important in promoting social skills in children with ASD (i.e., including more than the two or three independent-variable levels used in these studies). Moreover, the experimental setting should be designed to be less rigid, since rigidity can limit the children’s spontaneity.
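To illustrate the underpowering concern, the sample size needed to detect a given effect in a two-group comparison can be approximated with the standard normal formula n ≈ 2((z₁₋α/₂ + z₁₋β)/d)². The following sketch is illustrative only; the effect sizes and power targets shown are conventional benchmarks, not values taken from the reviewed studies:

```python
import math
from statistics import NormalDist  # Python 3.8+ standard library

def n_per_group(effect_size, alpha=0.05, power=0.80):
    """Approximate per-group sample size for a two-sample comparison of
    means, using the normal approximation n = 2 * ((z_a + z_b) / d)**2."""
    z = NormalDist().inv_cdf
    z_alpha = z(1 - alpha / 2)  # two-sided critical value
    z_beta = z(power)           # quantile matching the desired power
    return math.ceil(2 * ((z_alpha + z_beta) / effect_size) ** 2)

# Even a "medium" effect (Cohen's d = 0.5) needs roughly 63 participants
# per group at 80% power, far above the samples typical of early HRI studies.
print(n_per_group(0.5))  # → 63
```

A small effect (d = 0.2) would require several hundred participants per group, which helps explain why small-sample robot studies struggle to distinguish between conditions.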
On the other hand, the use of NAO to promote social skills in children with ASD has also led to contrasting findings. Inconclusive results arose from interventions to enhance joint attention, since some researchers reported positive effects of the robot [35] and others did not find improvements [30,31].
From these contrasting outcomes, some suggestions can be gathered to improve further explorations in this area. The NAO robot usually replies to children and the environment based on scripts defined by therapists, but sometimes these appear to be rigid and, consequently, the interaction does not end up as expected. Hence, more dynamic scripts are needed in order to flexibly adapt to the social context built with the child. This could be particularly useful to promote joint attention, as the capacity to adjust the robot’s response and rapidly reengage the child in the task is essential when the child loses involvement and attention. Furthermore, the design of social robots should consider the necessity of providing different social cues (i.e., pointing, gaze orientation), as these have been shown to be important for prompting joint attention and increasing the performance of the children [35].
Other directions for task design include gradually changing activities to maintain the children’s interest, together with a long-lasting involvement of the robot in the activity to foster the child’s collaboration.
The use of NAO has also had effects on autistic behaviors [30,33,34]. Studies highlighted the robot’s ability to function as a behavioral activator, eliciting children’s adaptive, but also maladaptive, behaviors. Nevertheless, some evidence suggested that adaptive ones have been more frequently performed in the robot condition than in the human one [33]. Therefore, it seems important to study how to appropriately regulate the robot’s ability to activate the child in order to maintain a balance, beyond which the patient exhibits maladaptive behaviors while interacting.
In addition to interventions for children with ASD, NAO has also been used to reduce symptoms of distress in children with cancer [36,37]. The related findings underlined a significant decrease in levels of anxiety, depression, and anger, indicating how NAO could be a suitable support for promoting children’s knowledge of complex clinical conditions, educating them about their health while fostering their ability to cope with it. This further supports the use of the robot as a psychologically assistive mediator in delicate situations. Indeed, other researchers have also reported its ability to convey a sense of empathy to pediatric patients and to draw interest that distracts from the situation [38].
NAO has also found application in interventions targeting a variety of domains, including wellbeing, self-disclosure, and eating behaviors.
Among the selected reports, the use of NAO to promote wellbeing did not lead to effects on physiological measures of arousal, nor on prosocial behaviors (i.e., the amount of money participants were willing to donate or the amount of time they were willing to offer for another research questionnaire) [39]. In general, technical features can shape the perception conveyed while interfacing with the robot, potentially influencing the effectiveness of an intervention. Here, rigid robotic movements were sometimes reported, which could have reduced the perceived naturalness and spontaneity of the interaction. Research therefore points to the need to improve robots’ capabilities toward more natural and fluid gestures. This progress could improve the user experience and open possibilities to study whether the effectiveness of wellbeing interventions could be enhanced as well.
A robot’s particular features can be further discussed in light of the results from studies on self-disclosure. In this area, findings highlight NAO’s ability to elicit the target dimension (i.e., self-disclosure), although less effectively than a human operator [40]. In particular, embodiment is an important factor in eliciting and influencing the quantity and quality of self-disclosure. In general, during spontaneous interactions, the social cues present in an interlocutor’s embodiment seem important for understanding the speaker’s involvement in the reciprocal social exchange. Therefore, research on human–robot interaction (HRI) should evaluate dynamic features of embodiment in robotic design to allow a natural social interaction that can engage different sensory modalities (e.g., lifelike facial expressions, gaze exchanges, fluid movements, the robot’s ability to touch humans in a way that conveys closeness).
Here, a comparison can be made with another robot included in the selected reports. Interventions with the Huggable robot seem to elicit positive emotions in both its embodied and avatar forms, suggesting that embodiment may not be necessary to influence the emotional target [41]. Hence, in the area of embodiment analysis, HRI research should also deepen comparisons of the effectiveness of interventions provided by agents presented in different forms (i.e., robots, avatars, non-interactive agents).
Finally, NAO seems promising in interventions to promote changes in eating behaviors. Positive effects have been found in decreasing snack episodes and the frequency of craving imagery, while increasing perceived confidence in controlling snack intake [42]. Research also underlines the possibility of creating a personal bond with the robot, building a working alliance that ultimately correlates positively with the reduction of snack episodes [42].
Among the selected studies, PROBO is the most used robot after NAO. All the studies considered used it in interventions for children with ASD.
Overall findings pointed out its positive effects in facilitating social behaviors [43] and promoting eye contact [44,45], collaborative play, and engagement [46]. From a behavioral point of view, the robot was able to decrease the level of prompts needed by the children to perform social behaviors [43] and to reduce the stereotyped ones [46].
In line with results regarding NAO, PROBO has also been shown to match, and in some cases exceed, the benefits observable in human-assisted interventions. In some cases, no significant differences were found in children’s ability to detect the human’s or the robot’s preferences, in the frequency of asocial behaviors displayed [44], or in the exhibition of positive affect and verbal utterances across the two experimental conditions [44,45].
Here, research findings suggested that robots could have the potential to enhance social skills in children with ASD, thanks to their ability to interact using social cues, regardless of their appearance being human-like, such as NAO, or even animal-like, such as PROBO.
After NAO and PROBO, the most used robot is PARO, which has been employed in interventions for different target dimensions.
In general, PARO seems promising for improving positive affect in children [47] and mood and alertness in adults with moderate to severe intellectual disabilities, fostering an increase in activity and directedness toward the environment [48]. Moreover, findings also reported the promotion of positive behavioral states (e.g., calmness, sleep behaviors, conversing) and the reduction of negative ones (e.g., sadness, yelling, and isolative behaviors) in veterans with and without dementia [49]. In this last case, it should be noted that the robot’s effect was not moderated by a dementia diagnosis, except for one case that approached significance.
On the contrary, among the selected reports, PARO-assisted interventions do not seem to influence negative affect, anxiety, or arousal [47]. The authors explained this result as being in line with outcomes from animal-assisted therapy, which usually tends to enhance positive moods rather than reduce negative moods or anxiety [47]. However, this finding contrasts with the results of other interventions using animal-like robots, such as the Haptic Creature [50]. Here, outcomes pointed to a significant decrease in state anxiety and in physiological indicators (i.e., heart rate, breathing rate) in children at risk of anxiety during active trials with the robot (i.e., when the robot was breathing), also strengthening the positivity of their emotional response.
These contrasting findings highlighted a need for research to deepen the matter with further studies, using the framework of animal-assisted therapy. For example, a future direction could be exploring if the differences can be explained also by the effects of age on the interaction with the robot.
Across the studies considered, KEEPON has been used with children with ASD.
Results described similar effects in the robot-assisted intervention and the human-mediated one. Both seemed to foster an increase in children’s rational beliefs and a decrease in emotional intensity. Moreover, no differences were found in social knowledge, irrational beliefs, or adaptive behaviors between the two groups [51].
Usually, the number of sessions planned was small, and this should be rethought in future frameworks. Planning more intervention sessions would offer more time not only for extensive practice but also to investigate whether behavioral changes can be observed after longer training. Furthermore, a higher number of errors was reported when starting to learn a task in the robot condition as compared with the human-assisted intervention. By contrast, no differences were highlighted in the number of errors made (i.e., perseverative, regressive, and lose-shift errors) during the task [52].
Sometimes performance at the beginning of an activity can be influenced by the novelty effect (in this case, the newness of interacting with a robot). Along this line, studies should investigate more deeply how participants become familiar with robots and humans, describing the differences noticeable in this acclimatization phase and how these variables can influence patients’ performance across the two conditions. Moreover, the effects of the feedback provided by the two agents should be further studied. Indeed, some evidence suggested that robotic feedback not only elicits more interest in participants but also allows easier learning of psychological contents [51]. In addition, from a cognitive and emotional perspective, KEEPON seemed to draw more attentional engagement and positive affect than a human operator [52].
KASPAR has also been used in interventions for children with ASD.
Overall, findings underline the possibility of improving communication, psychomotor functions, and social skills, with particular regard to unprompted imitation (i.e., imitation of poses, movements, and facial expressions), prompted speech, and gesture recognition. Furthermore, a positive increasing trend has also been found in prompted imitation and unprompted speech [53], and improvements in visual perspective taking have been reported in children with moderate to high-functioning ASD [54].
Again, these findings illustrated the potential of social robots in interventions for children with ASD, but also suggested that future HRI research should shed more light on the differences in robot interaction among various levels of disorder functioning. This could represent an important insight for clinical implications.
As illustrated in Figure 2, there is a set of stand-alone studies using different robots and mainly targeting children with ASD.
Overall, the results are in line with previously reported outcomes. For instance, interventions using Daisy to promote social skills reported improvements in children’s social performance (i.e., doing activities in groups, providing information, and turn taking). By contrast, the ability to follow basic behavioral instructions and rules seemed not to be significantly affected; however, the mean score for the skill of following instructions was already high at the first measurement (i.e., children had high scores on this variable even before the intervention started) [55].
In addition, Robonova has been found able to elicit testing behaviors (i.e., behaviors aimed at probing the partner’s intentions during the interaction), even when the child was in a non-contingent condition, that is, when the robot did not perfectly mirror the child’s actions. Moreover, Robonova seemed to draw more attention than a human operator [56].
Findings on the use of Cozmo shed light on the potential of combining standard therapy and robot-assisted training, since the combination has been shown to be more effective than standard therapy alone in improving children’s tendency to initiate social interactions and behavioral requests. Results also underlined a slight difference between the two conditions in the ability to maintain the interaction and to respond to behavioral requests. On the contrary, no differences were found across the groups in responses to social interactions, or in initiating, maintaining, and responding to joint attention behaviors [57], consistent with the findings reported above.
Finally, interventions with Flobi highlighted higher eye contact and face fixation as compared with human-mediated groups [58], while KiliRo seemed promising in reducing markers of stress (i.e., protein and alpha-amylase levels) [59].
Taken together, the results of these interventions convey the positive potential of social robots in helping children with ASD understand and perform essential aspects of social interaction, thanks to their repetitive and simple behaviors in approaching the children. However, the need for HRI research to further explore and confirm these preliminary findings should be noted, since only single studies describe the potential of each robot (i.e., Daisy, Robonova, Cozmo, Flobi, and KiliRo).
Other social robots have also shown positive effects across different psychological domains. Improvements in mood have been reported in interventions with the Pepper robot, together with increased sensory awareness and monitoring [60], as well as in programs with the Giraff robot [61] and Jibo [62].
Specifically, findings on the use of Giraff pointed to an increase in mood in a group of adults and elderly people when the robot played the role of a coach performing a positive psychology exercise (i.e., writing down three positive things of the day and their causes). In comparison, a decrease in mood was observed when the robot only interacted as a conversation partner [61]. This evidence shows how the different roles played by the robot can moderate the effectiveness of an intervention, which should be taken into consideration when designing HRI studies, since it has been shown to influence outcomes in the targeted domains. In this regard, it should be noted that social robots such as Bono-01 have also been found to be well accepted as conversation participants, evoking a higher frequency of laughter as compared with human speakers [63].
In addition, again in relation to mood, outcomes on Jibo indicated improvements in overall mood, psychological wellbeing, and readiness to change behavior only in a group of undergraduate students with low neuroticism and high conscientiousness. In the group with high neuroticism and low conscientiousness, the improvement was found only in psychological wellbeing, not in mood or readiness to change [62].
Over time, research has explored the association between personality traits and mental health treatment outcomes. For example, some studies have noted that high levels of agreeableness, extraversion, openness, and conscientiousness can be associated with positive outcomes, while high levels of neuroticism may have negative associations [64].
Thus, findings on robot-assisted interventions, such as those using Jibo [62], are in line with current research and suggest the need for further analysis. Deepening the association between personality traits and mental health outcomes in HRI research could not only enrich emerging models [65] but also open new ways to personalize interventions. Finally, further knowledge could contribute to improving the effectiveness of interventions, potentially enhancing patients’ mental health. This is relevant considering, for example, how high neuroticism is connected with greater psychological care concerns [66].
Finally, findings from selected reports highlighted improved scores in the Patient Health Questionnaire-9 and Geriatric Depression Scale using Ryan, along with enhancements in patients’ language (i.e., increased average sentence length) [67], and in the State-Trait Anxiety Inventory (Form Y) using Cakna [68].
By contrast, among the studies included, programs with RoboCOP did not report significant effects on participants’ state anxiety or speaker confidence [69]. The authors pointed out that the constant reminders of improvement provided by the robot could have adversely affected participants’ confidence. It therefore seems necessary to further study how to appropriately regulate the robot’s feedback to maximize benefits during interventions.
Overall, findings coming from selected studies suggest that social robots could potentially be suitable to support psychological interventions for different domains and disorders, but HRI research should further explore the question since studies are still limited.

3.3. Study Results Organized by Target Application and Frameworks

3.3.1. Autism Spectrum Disorder (ASD)

Social robots have usually been introduced in therapy settings on the basis that, for children with ASD, interacting with humans is often difficult, as they can be overburdened by social stimuli [70]. Computer-like interventions, by contrast, can offer predictable and enjoyable social training opportunities [71]. Within this framework, social robots’ embodiment strikes a balance between providing human-like cues and object-like plainness in social interactions, offering a controllable and safe environment in which children can train target behaviors of gradually increasing complexity [71]. Accordingly, in the studies included in this review, the robots were used to foster social and emotional competencies, since deficits in these areas are core symptoms of ASD [26]. For this purpose, NAO and PROBO are the most commonly used robots. Overall, studies showed mixed results, but most of the selected reports found improvements in social skills. At the behavioral level, thanks to the interaction with robots as compared to human agents, children particularly improved in turn taking [30,55], eye gaze [32,33,34,44,58], imitation [30,53], play skills and collaboration [46], and communication [53]. Moreover, a reduction in autistic behaviors [34,46] and an increase in adaptive ones were also found [33]. Considering emotional domains, studies underlined the opportunity to promote socio-emotional understanding [29] while interfacing with the robot, as well as to facilitate emotion recognition [44] and improve positive affect [52]. Interventions also addressed cognition, showing higher levels of attention [52,56] when the children interacted with robots rather than humans, and improvements in visual perspective taking [59].
By contrast, in a number of studies, no significant differences were identified when analyzing these domains across robot and human conditions [30,31,33,44,45,46,51], suggesting that robots can perform on par with human operators.

3.3.2. Cancer

Building on the results obtained with social robots in therapies for children with ASD, authors like Alemi and colleagues [36,37] employed them to relieve the typical distress symptoms displayed by pediatric patients with cancer. In their research paradigm, a humanoid robot like NAO, capable of communicating sympathetic emotions through different channels, was expected to relieve distress by involving the children and encouraging their self-expression. Findings in this field reported an overall significant decrease in children’s levels of anxiety, depression, and anger [36,37].

3.3.3. Anxiety

Selected reports targeting anxiety disorders mainly used animal-like robots, such as Paro or the Haptic Creature. Their use is grounded in the framework of animal-assisted therapy [47] and in social cognitive theory applied to human–animal interaction [50]. Specifically, the authors claimed that some animal-like robots were designed to mimic the effect of therapy animals, which have been shown to positively affect anxiety levels [72]. However, access to animal-assisted therapy is often limited by a range of difficulties, including allergies, availability, and costs [50]; the idea of overcoming these limitations with animal-like robots that are easy to find and use has therefore inspired studies in this field. Authors have also hypothesized that the calming effects provided by the robots could help children replace anxiety with pleasant stimuli, following the systematic desensitization process described in cognitive behavioural therapy [50]. Hence, the studies included in this review targeted overall anxiety scores as well as specific aspects, such as physiological parameters and affective experience. The selected reports provided mixed results. Half of them underlined a decrease in anxiety levels both in children at risk [50] and in young adults [68] after interacting with the robot, also improving physiological indicators (i.e., heart rate) and emotional experience [50]. By contrast, the other half did not identify significant effects on anxiety in children, students, and professionals [47], or on their levels of arousal, finding only an improvement in positive affect while performing stressful tasks [47].

3.3.4. Wellbeing

To promote wellbeing, most of the included reports investigated the possibility of building a relationship between a robot and a human agent, based on the fact that socio-emotional support and the creation of a strong rapport between patients and other agents have been shown to improve people’s ability to cope with difficulties [39,41]. Following this theoretical background, the authors considered robots suitable for building such a relationship, since their physical embodiment and ability to interact socially make them resemble human agents. In the studies considered, overall improvements were found in psychological wellbeing [62], mood [61,62], readiness to change [62], empathic responses [38], and socio-emotional wellbeing with the elicitation of positive affect [41] across different groups of children, students, adults, and elderly. In just one study, outcomes showed no effects on physiological measures of arousal or on prosocial behaviors [39].

3.3.5. Mixed Target Disorders

Several other disorders and dimensions were targeted by the selected studies, including intellectual disabilities, depression, and self-disclosure, along with robot-based mindfulness interventions and protocols for emotion and behavior change. In these fields, different frameworks explain the use of social robots with animal or human features. For example, some interventions are again based on the paradigms of animal-assisted therapy [48], while others are rooted in a body of research showing how social robots can facilitate interventions [61] and convey a sense of presence while eliciting positive social responses [73,74]. In the selected studies, social robots showed the potential to improve mood in people with intellectual disabilities [48] and depression [67], foster positive behaviors in elderly people with dementia [49], increase sensory awareness and mood during meditation [60], and promote healthy eating behaviors [42]. Moreover, social robots were also able to create an amusing setting [63] and elicit self-disclosure, although less so than a human interaction partner [40].

3.4. Risk of Bias

Table A2 shows the main risks of bias within the selected reports. Five studies did not report sampling criteria [38,54,57,61,63]; the possible absence of eligibility criteria may have biased the recruitment process. Almost all studies (i.e., 37 out of 39) used small sample sizes, so low representativeness and possible type II errors cannot be excluded [29,30,31,32,33,35,36,37,39,40,41,42,43,44,45,46,47,48,49,50,51,52,53,54,55,56,57,58,59,60,61,62,63,67,68,69,71]. Furthermore, six studies used a male-only sample [32,34,45,49,58,71]. One study reported a self-selection bias [62], and in one publication a selection bias could not be excluded [49]. One study did not include a sample with the target diagnosis (i.e., children at high risk of experiencing anxiety were involved instead of children with anxiety disorders) [50]. Six studies did not have a control group [49,53,56,59,62,68], and one further study did not include randomization [49]. In one study, participants were unequally distributed between the experimental conditions because randomization with a small sample does not guarantee balanced groups (law of large numbers) [31]. Moreover, two studies reported missing data [32,48], which may have biased their results, and in one study some participants did not complete the initial measures [42]. In one study, the authors acknowledged that the instrument used for mood assessment (i.e., the Positive and Negative Affect Scale) did not have adequate indicators to capture all affective dimensions, which may have induced a measurement bias [60]. Drop-out was reported in three studies [37,49,57]. In two studies, a possible examiner or therapist effect could not be excluded [29,53]. One further study reported an observer bias, since it was an unblinded observational study [49]. In two studies, a possible Rosenthal effect could not be excluded, since most of the participants knew the research team before the study began [36,37].
In one study, the different complexity of the tasks between the experimental and control groups may have induced biases [29]. Furthermore, in one study the influence of possible confounding factors could not be excluded [59]. In one study, the elderly participants reported communication problems with the robot because of their limited understanding of English and difficulty understanding the robot’s voice, which may have influenced the outcomes [61]. One study reported a potential effect of familiarity with the task that may have biased results [52], whereas another highlighted a possible habituation effect [32]. In one study, participants were already familiar with the task before the research began, and this prior knowledge may have altered the results [50]. Moreover, in four of the 39 publications, the authors reported that participants were compensated for their participation [39,40,47,69]; since an external reward can lower intrinsic motivation [75], compensation could have modified participants’ performance. One study reported technical constraints of the robot [46], so technical issues could have affected the results. Finally, seven studies used a single-case design [30,32,33,34,35,43,71].
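The bias categories enumerated above can be cross-checked by tabulating them against the bracketed reference numbers. The sketch below (in Python; the `bias_register` dictionary is a purely illustrative bookkeeping choice, not part of the review) reproduces a few of the counts reported in this section and shows which studies are flagged under more than one category:

```python
# Illustrative tally of some risk-of-bias categories from Section 3.4.
# The reference numbers are the bracketed citations in the text; the
# dictionary structure itself is a hypothetical bookkeeping choice.
bias_register = {
    "no sampling criteria": [38, 54, 57, 61, 63],
    "male-only sample": [32, 34, 45, 49, 58, 71],
    "no control group": [49, 53, 56, 59, 62, 68],
    "single-case design": [30, 32, 33, 34, 35, 43, 71],
}

# Number of studies flagged under each category.
counts = {bias: len(refs) for bias, refs in bias_register.items()}

# Studies flagged under more than one category (e.g., [49] lacks both
# a control group and a mixed-gender sample).
flag_totals = {}
for refs in bias_register.values():
    for ref in refs:
        flag_totals[ref] = flag_totals.get(ref, 0) + 1
multi_flagged = sorted(ref for ref, n in flag_totals.items() if n > 1)
```

Such a register makes the overlap between bias categories in Table A2 immediately visible when summarizing the evidence base.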

4. Discussion

Advances in new technologies, and in robotics in particular, offer an opportunity for a positive revolution in the human ability to cope with problems. Social robots can enhance human performance in education [76] and foster the health of older adults [77] and hospitalized children [78]. However, the contribution of social robots has not yet been fully exploited, considering that the relational skills inherent in this technology would be particularly useful in the psychological domain. This contribution therefore aimed to systematize the current and emerging literature on the use of social robots in psychology-based interventions.
From these findings, it is possible to gather some guidelines for the effective use of social robots, although such directions remain tentative, given that the existing body of research is limited and mixed. Specifically, for interventions with patients with ASD, the robot’s embodiment seems to be particularly relevant, while other features, like anthropomorphism, appear to have a weaker effect on target dimensions. This is in line with other studies that have already highlighted the importance of embodiment in conveying a sense of social presence [79]. Furthermore, in the treatment of anxiety, animal-like robots seem preferable: studies promoting their use follow in the footsteps of animal-assisted therapy, which has shown potential benefits in reducing anxiety across different patient groups [72]. Finally, in interventions addressing wellbeing, the robots’ features do not appear to be particularly relevant beyond a minimum degree of interactivity necessary to establish an immersive social interaction between human and robot.
Consequently, it is important to select the appropriate robot according to the characteristics the therapy requires; Table A3 can be useful for this purpose, since it describes the main features of each robot.
Thus, overall, social robots appear well suited to act as an aid in addressing psychological dimensions or disorders. Indeed, social robots seem to perform comparably to, and sometimes even better than, human operators [30,33,34,45,46,58,63]; they can therefore expand the availability of psychology-based interventions, both for the general population and for people who would otherwise be difficult to reach (e.g., hospitalized people, rural communities, disadvantaged people). Indeed, the possibility of conducting smart therapies or interventions at home or in healthcare settings has the potential to increase both the accessibility and availability of treatments [80], given the flexibility of times and places in which they can be carried out. On the one hand, the robot can act autonomously, with an expert system or artificial intelligence enabling it to effectively manage the intervention and any emerging problems [47,48,49,50]. On the other hand, through remote control [33,37,39,67], psychologists can gain an easily accessible embodiment for their activity while being physically distant. These considerations are particularly relevant in light of the restrictions faced worldwide during the COVID-19 pandemic, which rapidly posed important challenges in different fields, including the need for innovative and more dynamic approaches to supporting people’s mental health.
Apart from reducing accessibility barriers and costs, robots can also help overcome problems of cleanliness and sterilization in delicate environments, such as hospital wards and clinics. However, while some robots, like NAO, are easily cleanable, others, like Paro, have raised concerns because of their fur. Researchers have already tried to define cleaning protocols for infection prevention to make the use of these robots safe, and preliminary results are positive: it seems possible for such robots to meet infection prevention and control standards, such as those defined by the National Health Service [81]. Similarly, the allergy issues reported in animal-assisted therapies [82] can easily be avoided, as the use of animal-like social robots has shown its benefits across the selected reports.
Considering the potential underlined by most of the studies, it is worthwhile and necessary to deepen the research in the field of robot-based interventions in psychological domains. Both research needs and future directions can be pointed out.
Firstly, among the selected reports, only a few specified the use of randomized controlled trials. It therefore seems necessary to expand and systematize the body of work in order to build strong evidence for the use of social robots in actual clinical practice. Future research should define clear protocols to combine standard therapy and robot-assisted interventions, as this combination has been shown to be potentially beneficial. Secondly, future studies should involve larger samples and include follow-up sessions to assess long-term benefits. At the same time, it will be important to increase the ecological validity of studies [29,31]. Thirdly, some technical limitations pose challenges for the future: in some studies, the story played by the robot could not be stopped by the therapist when necessary [43], some robotic movements were too quick or rigid [43], and sometimes the robot was not capable of reacting to social advances made through the children’s gestures [46]. Technical improvements are therefore desirable, because adjusting the robots’ abilities could improve not only the interaction but also the validity of the studies. Moreover, a possible future direction is the analysis of differences in human–robot interaction between younger and older users. Indeed, in one of the selected studies, older participants reported communication issues and a lower level of confidence compared with younger participants [61]. Such an investigation could contribute to improving their experience and, hopefully, the benefits they can obtain from robot-based interventions.
With regard to therapy, self-disclosure is known to be an essential element for the effectiveness of any therapeutic path, along with the creation of an alliance. Some of the studies included in this review tried to build a relationship between a human agent and a robot, but only one study targeted the dimension of self-disclosure [40]. The findings highlighted the possibility of eliciting self-disclosure through interaction with robots, although robots were less effective than human operators. Interestingly, embodiment was found to influence the quantity and quality of self-disclosure. For future therapeutic perspectives, this element could be further explored, including by comparing outcomes from robot- and virtual-reality-based interactions, since researchers have shown that users feel comfortable disclosing personal experiences in social virtual reality [83]. Thus, future research could explore the possibility of integrating robots and virtual reality to move towards updated and more effective therapeutic and teletherapeutic pathways. To address these future challenges, interdisciplinary research teams should be fostered worldwide, including professionals from engineering, psychology, neuropsychology, and medicine (e.g., pediatricians and geriatricians), to ensure constant and ever-improving development.
Ultimately, to seed a robotic revolution beneficial for people’s health and wellbeing, it is necessary to go beyond applications of (social) robots strictly limited to rehabilitation paths or specific medical conditions [84,85,86]. Social robots should not only avoid injuring or harming people (as in the First Law of Robotics envisaged by Isaac Asimov) and support them in their recovery, but also promote their all-round wellbeing, thus answering their psychological needs and demands.

5. Conclusions

This PRISMA systematic review reported the results of 39 studies of social-robot-based interventions in psychological domains, covering a total of 19 different robots. The outcomes underline the potential of social robots to contribute to mental health interventions, particularly by enhancing social skills learning, mood, and wellbeing across various clinical populations. Nevertheless, mixed findings were also highlighted; these inconsistencies represent challenges for future research aimed at providing evidence-based interventions that are ever-improving, patient-tailored, and in step with current needs.

Author Contributions

Conceptualization, M.D. and P.A.R.; methodology, M.D., G.C., and A.G.; investigation, G.C. and P.A.R.; data curation, M.D., G.C., and P.A.R.; supervision, M.D. and A.G.; writing—original draft preparation, M.D., G.C., P.A.R., and A.G.; writing—review and editing, M.D., G.C., P.A.R., and A.G. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Data Availability Statement

Data sharing not applicable.

Conflicts of Interest

The authors declare no conflict of interest.

Appendix A

Table A1. Main characteristics of the studies reviewed: authors, year, country, sample size, gender distribution, age range, mean age, sample characteristics, type of robot employed, and target dimension (n = 39).
Ref. | Year | Country | Study Type | Sample Size | Sample Description | Gender Distribution | Age Range and Mean Age (SD) | Type of Robot | Target Dimension
Autism Spectrum Disorder
[43]2012RomaniaSingle case study4Children with ASD (all with moderate autism)50% male
50% female
Range = 4–9;
Mage (SD) = NR
PROBOSocial skills and play abilities (i.e., level of prompting needed to share toys, greet, thank)
[71]2013RomaniaSingle case study3Children with ASD100% maleRange = 5–6;
Mage (SD) = NR
PROBOEmotion recognition
[51]2017RomaniaExperimental design27Children with ASD74% male
26% female
Range = 6–12;
Mage (SD) = 8.7 (1.8)
KeeponSocial knowledge;
Beliefs (rational and irrational);
Emotion intensity;
Adaptive behaviors
[44]2016BelgiumRepeated measure design30Children with ASD90% male
10% female
Range = 5–7;
Mage (SD) = 6.67 (0.92)
PROBOEmotion recognition;
Social behaviors (i.e., eye-contact, initiations of joint attention, verbal utterances, positive affect) and asocial behaviors (i.e., non-response and evading task behaviors)
[32]2012RomaniaSingle case study43 children with ASD (2 in moderate and 1 in severe range and epileptic modifications) and 1 child with elements of autism and delays in language development100% maleRange = 2–6;
Mage (SD) = 4.2 (1.67)
NAOSocial engagement (i.e., free initiations frequency, eye gaze contact duration and gaze shifting frequency, smile and laughter duration)
[46]2016BelgiumRepeated measure design30Children with ASD90% male
10% female
Range = 4–8;
Mage (SD) = 6.67 (0.92)
PROBOCollaborative behavior and play;
Engagement (i.e., eye contact, social behaviors, affect, verbal behaviors)
[52]2015RomaniaQuasi-experimental design with between-subjects comparison81Group 1:
40 typically developing children;
Group 2:
41 children with ASD
NRGroup 1:
Range = 4–7;
Mage (SD) = 5.4 (0.4);
Group 2:
Range = 4–13;
Mage (SD) = 8.4 (2.2)
KeeponCognitive flexibility;
Learning performance (i.e., perseverative errors, regressive errors, lose-shift errors);
Attentional engagement;
Positive affect
[58]2013GermanyQuasi-experimental design with between subjects comparison24Group 1:
9 children with ASD (7 children with Asperger’s autism, 1 with childhood autism, and 1 with autism with atypical symptoms);
Group 2:
15 healthy volunteers
100% maleGroup 1:
Range = 12–28;
Mage = 21
Group 2:
Range = 18–28;
Mage = 23.40
FlobiGaze behavior and eye contact
[45]2014RomaniaExperimental Pilot study11Children with ASD in high functioning range100% maleRange = 4–7;
Mage (SD) = NR
PROBOPlay skills (i.e., play performance and collaborative play);
Engagement in play (i.e., stereotypical behaviors and positive emotions);
Social skills (i.e., contingent utterances, verbal initiations, and eye contact)
[30]2019RomaniaSingle-case alternative treatment design21Children with ASDNRNRNAOSocial skills (i.e., imitation, turn-taking, joint attention);
Completing figures and categorizing items
[33]2020RomaniaSingle-subjects design5Children with ASD (3 with low functioning, 1 with moderate, and 1 with high functioning)60% male
40% female
Range = 5–7;
Mage = 4.6
NAOSocial interaction and communication (i.e., turn taking, engagement, eye contact, verbal utterances);
Behaviors (stereotypical, maladaptive, and adaptive);
Emotions (functional and dysfunctional negative emotions, and positive emotions)
[29]2019ItalyRandomized control trial14Children with ASD85.71% male
14.29% female
Range = 4–8;
Intervention group:
Mage (SD) = 73.3 months (16.1)
Control group:
Mage (SD) 82.1 months (12.4)
NAOSocio-emotional understanding (i.e., contextualized emotion recognition, comprehension, emotional perspective-taking)
[35]2018RomaniaSingle-subject design5Children with ASD (2 with low functioning, 1 with moderate and 2 with high functioning)80% male
20% female
Range = 3–5
Mage = 4.78
NAOJoint attention (i.e., social cues such as gaze orientation, pointing, and vocal instruction)
[59]2017New ZealandWithin-subject design without control group7Children with ASDNRRange = 6–16;
Mage (SD) = 10.0 (3.51)
KiliRoStress levels (through protein and alpha-amylase levels)
[31]2017RomaniaRandomized controlled trial40Children with ASDNRRange = 3–6;
Mage (SD) = NR
NAOSocial skills (i.e., imitation, joint attention, and turn taking)
[56]2015Belgium/RomaniaTwo-way mixed factorial design21Children with ASD85.71% male
14.29% female
Range = 4.5–8;
Mage = 6.19 (0.99)
RobonovaSocial interaction (i.e., eye gaze, imitation, testing behaviors);
Positive affect
[53]2019GreeceLongitudinal design with no control group7Children with ASD/ASC (autism spectrum condition) (from low to high functioning, 1 child with severe language difficulties, 1 non-verbal child)71.42% male
28.58% female
Range = 7–11;
Mage = NR
KASPARCommunication and social skills
(i.e., prompted and unprompted speech, unprompted imitation, gesture recognition, seeking response and affection, self-initiated play, focus, and attention)
[34]2012MalaysiaPilot case study1Child with ASD in high functioning range100% maleAge = 10NAOSocial interaction
[55]2019GreeceExperimental design126 children with ASD (highly functional) and 6 typically developing children83.33% male
16.67% female
Range = 6–9;
Mage (SD) = NR
DaisySocial skills (i.e., participation, turn taking, providing information, following rules and instructions)
[57]2021ItalyTwo-period crossover design (repeated-measure design)24Children with ASC79.17% male
20.83% female
Range = 4–7;
Mage (SD) = 5.79 (1.02)
CozmoSocial skills (i.e., initiating, responding and maintaining social interaction, initiating and responding to behavioural requests, joint attention)
[54]2019United KingdomPre–post design pilot study1211 children with ASD and 1 child with global developmental delay,
which gave the child ASD traits
58.33% male
41.67% female
Range = 11–14;
Mage (SD) = NR
KASPARVisual perspective taking and theory of mind
Cancer
[36]2014IranExperimental design10Children with cancerNRRange = 7–12
Experimental group:
Mage (SD) = 9.5 (1.26)
Control group:
Mage (SD) = 9.4 (1.36)
NAOAnxiety;
Depression;
Anger
[37]2016IranExperimental design10Children with cancer9.1% male
90.90% female
Range = 7–12
Experimental group:
Mage (SD) = 9.5 (1.63)
Control group:
Mage (SD) = 9.8 (1.36)
NAOAnxiety;
Depression;
Anger
Mixed Target Disorders *
[48]2017HollandSingle-subject design5Adults with moderate to severe intellectual disabilities40% male
60% female
Range = 59–70;
Mage = 64
ParoMood and alertness
[60]2020JapanExperimental design28Participants all novices to mindfulness practicesNRRange = NR
Mage (SD) = 20.2 (3.4)
Pepper robotEEG asymmetry and mood
[67]2019ColoradoSingle-subject design4Elderly with depression25% male
75% female
Range = 62–93;
Mage = 76
Ryan (a she-robot)Natural language;
Involvement;
Sentiments;
Symptoms of depression
[40]2021ScotlandWithin-subjects experimental design114University studentsExperiment 1:
38.5% male
61.5% female
Experiment 2:
40.7% male
59.30% female
Experiment 3:
32.8% male
67.20% female
Experiment 1: Range = 17–42;
Mage (SD) = 24.42 (6.40)
Experiment 2:
Range = 20–62
Mage (SD) = 28.60 (9.61)
Experiment 3:
Range = 18–43;
Mage (SD) = 23.02 (4.88)
NAO and Google MINISelf-disclosure (i.e., length, sentimentality, voice pitch, harmonicity, energy)
[63]2014JapanExperimental design16Healthy elderlyGroup 1:
83.33% male
16.67% female
Group 2: NR
Group 1:
Range = NR
Mage = 70
Group 2:
Range = NR
Mage = 72
Bono-01Amusement (i.e., laughter)
[42]2020AustraliaPilot randomized controlled trial with stepped-wedge design26Adults wanting to reduce snack intake81.31% male
18.69% female
Range = 19–69;
Mage (SD) = 37 (13.47)
NAONumber of snack episodes;
Motivational thoughts;
Craving cognitions;
Confidence to control snacking
[49]2016CaliforniaPre–post design pilot study with no control group23Veterans (82.1% with dementia diagnosis, 12.5% without dementia,
5.4% data missing)
100% maleRange = 58–97;
Mage (SD) = 80 (10.6)
PARONegative behavioral states (i.e., anxiety, sadness, yelling behavior, isolation, pain, wandering/pacing);
Positive behavioral states (i.e., calmness, bright affect, sleeping, conversing)
Anxiety
[68]2019MalaysiaPre–post design pilot study with no control group21Young adults with mild to moderate anxiety traits60% male;
40% female
Range = 19–23;
Mage (SD) = 21.5 (1.6)
CAKNAAnxiety
[50]2012CanadaWithin subjects design participants15Children at high risk of anxiety40% male;
60% female
Range = 10–17;
Mage (SD) = 13.7 (2.3)
Haptic CreatureAnxiety
[47]2018ConnecticutExperimental design87Children in early school years47.1% male
52.9% female
Range = 6–9;
Mage (SD) = 8.15 (1.14)
ParoMood (positive and negative);
Anxiety;
Arousal
[69]2017Massachusetts (Boston)Within-subject design, counterbalanced design across two sessions1212 students and professionals (7 categorized as high competence public
speakers; 5 with moderate competence)
75% male
25% female
Range = 22–28;
Mage = 24
RoboCOPAnxiety and speaker confidence
Wellbeing And Stress
[61]2014HollandWithin-subject design378 elderly (3 retired and 5 working at the university in education or related) and 29 non-elderly (11 students
and 10 researchers; 8 with university-related professions)
Elderly:
62.5% male
37.5% female
Non-elderly:
37.93% male
62.06% female
Elderly:
Range = 62–83;
Mage (SD) = 70.38 (7.84).
Non-elderly:
Range = 20–55;
Mage (SD) = 30.48 (7.49)
GiraffMood
[41]2017MassachusettsRandomized clinical trial study5420 children in Inpatient Surgical Unit; 1 in Medical Surgical Intensive Care Unit; 24 in Hematology/Oncology Unit; 9 in the Bone Marrow Transplant Unit61.11% male; 38.9% femaleRange = 3–10; Mage (SD) = 6.09 (2.33)Huggable robotSocio-emotional wellbeing (i.e., affect, joyful play, stress and anxiety)
[62]2020MassachusettsPre–post design with no control group35Undergraduate students20% male
77.14% female
2.86% other
NRJibo robot and a Samsung Galaxy tabletPsychological wellbeing;
Mood;
Readiness to change
[39]2017HollandBetween-subjects factorial design67Healthy people50.7% male
49.3% female
Range = 18–78;
Mage (SD) = 47.93 (19.96)
NAOArousal;
Robot-touch experience;
Robot perception;
Prosocial behaviors (i.e., willingness to donate money)
[38]2016GermanyPilot study: early concept evaluation with a focus group5Children waiting to see a doctor80% male
20% female
Range = 5–12;
Mage (SD) = NR
NAO (a.k.a. Murphy Miserable Robot)Feelings in the waiting situation; Experience (i.e., stimulation, empathy, positive coping)
Note: NR = not reported; ASD = autism spectrum disorder; ASC = autism spectrum condition; * = intellectual disabilities; mindfulness; depression; self-disclosure; emotional states in elderly; behavior change intervention and reduction of high calories; positive and negative behaviors in veterans.
Table A2. Main characteristics of the studies reviewed: main findings, study limitations, risk of biases (n = 39).
Ref. | Main Findings | Study Limitations | Risk of Biases
Autism Spectrum Disorder
[43]The robot-assisted intervention was associated with a statistically significant decrease of the level of prompt the children needed to perform the social behavior.Limited generalizability due to small sample size; the therapist could not stop the robot playing a story when necessary.Sampling bias due to small sample size (possible type II error) and low representativeness; bias due to use of single-case design.
[71]Situation-based emotion recognition performance was increased by PROBO’s active face with a moderate to large effect size; no qualitative differences were found in the valence (positive or negative) of the emotion.Limited generalizability due to small sample size and limited representativeness of the population; possible experimenter effect.Sampling bias due to small sample size (possible type II error) and low representativeness; sampling bias due to recruitment of a male-only sample; bias due to use of single case design.
[51] | An increase in rational beliefs and a decrease in emotion intensity were found in both the robot-enhanced therapy and the treatment-as-usual group, with a large effect size. No significant differences were shown in social knowledge, irrational beliefs, or adaptive behaviors in either group. | Limited generalizability due to relatively small sample size. | Sampling bias due to small sample size (possible type II error).
[44] | Eye contact with the interaction partner was significantly higher in the robot condition than in the human agent condition, with a medium effect size. No significant differences were shown in detecting the preferences of the robot or the adult, nor in positive affect, frequency of asocial behaviors, or verbal utterances in the robot condition. | Limited generalizability due to relatively small sample size and single exposure to the conditions; the vast majority of the sample was male; technical constraints of the robot; the robot was not capable of reacting to social advances made by the children with gestures. | Sampling bias due to small sample size (possible type II error) and low representativeness.
[32] | A significant increase was found in the frequency of initiations, with or without prompting, in the robot interaction in half of the sample. A generally low level of free initiations was found in both the robot and human conditions. Eye gaze, smiling, and laughter increased in half of the sample in the robot condition. The frequency of shared eye gaze increased in just one child in the robot interaction. | Limited generalizability due to small sample size and lack of female participants; the robot could imitate only gross arm movements and was not fast enough for perfect contingency; possible habituation effect; rigidity of the experimental setup. | Sampling bias due to small sample size (possible type II error) and low representativeness; sampling bias due to recruitment of a male-only sample; bias due to missing data; bias due to use of single-case design; bias due to possible habituation effect.
[46] | No significant differences were observed between the robot and human conditions for any of the variables (i.e., children performed well in both conditions, and the robot triggered a similar frequency of independent and collaborative play behaviors as an adult interaction partner). | Limited generalizability due to relatively small sample size and single exposure to the conditions; the vast majority of the sample was male; technical constraints of the robot; the robot was not capable of reacting to social advances made by the children with gestures. | Sampling bias due to small sample size (possible type II error) and low representativeness; bias due to technical constraints of the robot.
[52] | A significantly higher number of errors was found in the robot condition when learning the task, as compared to the human condition, with a large effect size. No significant differences were found in the number of perseverative, regressive, and lose-shift errors between the two conditions. Significantly higher numbers of attentional engagement episodes and positive affects were found in the robot condition, with a large and a medium effect size, respectively. | Limited generalizability due to relatively small sample size and single exposure to the conditions; gender distribution not specified; possible effect of familiarity; wide age range within the experimental group and between the groups. | Sampling bias due to small sample size (possible type II error) and low representativeness; bias due to possible effect of familiarity.
[58] | The level of face fixation was higher in the robot condition, even though a significant decrease in eye contact was found at the end of the interaction. | Limited generalizability due to relatively small sample size and lack of female participants. | Sampling bias due to small sample size (possible type II error) and low representativeness; sampling bias due to recruitment of a male-only sample.
[45] | More collaborative play and engagement and fewer stereotyped behaviors were found in the robot condition. For play performance, eye contact, positive emotions, and contingent utterances, no significant differences were found between the two conditions. | Limited generalizability due to small sample size and single exposure to the conditions. | Sampling bias due to small sample size (possible type II error) and low representativeness; sampling bias due to recruitment of a male-only sample.
[30] | In the robot intervention, improvements were shown in imitation, turn-taking tasks (i.e., sharing information), completing a series of figures, and categorizing items, but not in joint attention. Both the standard human treatment and the robot-enhanced therapy had a positive effect on clinical ASD symptoms, reducing their severity. | Limited generalizability due to relatively small sample size; shortcomings of participants' demographic information. | Sampling bias due to small sample size (possible type II error) and low representativeness; bias due to use of single-case alternative treatment design.
[33] | Similar levels of performance were found for turn-taking skills across the standard human treatment and the robot-enhanced treatment. The effect of the robot condition was higher and mostly statistically significant for adaptive behaviors, even though this condition also led to some increases in stereotyped and maladaptive behaviors. More interest and eye gaze were found in the robot condition for some of the children, as compared to human interaction. | Limited generalizability due to small sample size; rigidity of the experimental setup. | Sampling bias due to small sample size (possible type II error) and low representativeness; bias due to use of single-subject design.
[29] | A large effect of the social robot was found in promoting the learning of socio-emotional understanding skills. An increased ability to understand the beliefs, emotions (i.e., anger, disgust, fear, happiness, sadness, and shame), and thoughts of others was found. | Limited generalizability due to small sample size; the vast majority of the sample was male; possible co-therapist effect; difference in the complexity of the tasks between the control and robot groups. | Sampling bias due to small sample size (possible type II error) and low representativeness; bias due to possible co-therapist effect; bias due to different complexity of the tasks between the control and robot groups.
[35] | The use of different social cues for prompting joint attention with the robot increased the children's performance (i.e., pointing with or without vocal instructions, in addition to gaze orientation, led to more joint attention engagement and maximum performance than gaze orientation alone). | Limited generalizability due to small sample size; the vast majority of the sample was male. | Sampling bias due to small sample size (possible type II error) and low representativeness; bias due to use of single-subject design.
[59] | A statistically significant reduction was found in the average protein level and in the alpha-amylase levels (i.e., indicators of stress) after interacting with the robot. | Limited generalizability due to small sample size; absence of a control group; gender distribution not specified; other factors may have influenced protein and alpha-amylase levels (e.g., medications, body temperature). | Sampling bias due to small sample size (possible type II error) and low representativeness; bias due to absence of a control group; bias due to possible influence of confounding factors.
[31] | A significant difference was found in joint attention, favoring the standard human treatment over the robot condition. Comparisons of imitation and turn taking were not significant. | Limited generalizability due to the small number of sessions; gender distribution not specified; unequal distribution of the participants across the two conditions (due to the blind randomization process) possibly reduced the statistical power to detect significant effects. | Sampling bias due to small sample size (possible type II error) and low representativeness; bias due to unequal distribution of the participants across the two conditions.
[56] | The robot was able to elicit testing behaviors when its behavior was not a perfect mirroring. More attention was paid to the robot than to the human. | Limited generalizability due to relatively small sample size; the vast majority of the sample was male; absence of a control group. | Sampling bias due to small sample size (possible type II error) and low representativeness; bias due to absence of a control group.
[53] | Improvements were found in the communication and psychomotor function domains, with moderate effect sizes. Increases were found in unprompted imitation (i.e., poses, sequences of movements, and facial expressions) and prompted speech, with large and significant effects, and in gesture recognition, with a moderate and significant effect. A positive increasing trend was found for prompted imitation and unprompted speech, with moderate to large effects. A decrease was found in self-initiated play behavior, with a small to moderate effect. An increase was found in focus, attention, and response-seeking behavior, with a moderate but not significant effect. | Limited generalizability due to small sample size; absence of a control group; possible examiner effect; the vast majority of the sample was male; behavioural coding was conducted only for the robot session and not for the human session; impact of children's mood variability on the interaction with the robot. | Sampling bias due to small sample size (possible type II error) and low representativeness; bias due to absence of a control group; bias due to possible examiner effect.
[34] | Autistic behaviors were suppressed during the child–robot interaction, and more eye contact was observed between child and robot than between child and teacher. The robot's simple appearance, compared to humans, elicited the child's interest in sustaining the interaction. | Limited generalizability due to small sample size. | Sampling bias due to recruitment of a male-only sample; bias due to use of single-case study.
[55] | A statistically significant improvement was found in children's social performance in the robot condition for group activities, providing information, and turn taking. The skill of following rules did not show a statistically significant improvement. | Limited generalizability due to small sample size; the vast majority of the sample was male. | Sampling bias due to small sample size (possible type II error) and low representativeness.
[57] | The combination of standard therapy and robot-assisted training was more effective than standard therapy alone in improving the children's tendency to initiate social interaction and behavioural requests. A marginal difference was found in the tendency to maintain social interaction and respond to behavioural requests under the combined therapies. No significant difference was found in the tendency to respond to social interaction or to initiate, maintain, and respond to joint attention behaviors. | Limited generalizability due to relatively small sample size; high drop-out rate (1/3 of the initial sample); the vast majority of the sample was male. | Sampling bias due to small sample size (possible type II error) and low representativeness; sampling bias due to lack of explicit sampling criteria; bias due to high drop-out rate.
[54] | Visual perspective taking improved over the sessions with the robot in children with moderate to high functioning. | Limited generalizability due to relatively small sample size; the progression criteria in the games could not be respected because of the wide heterogeneity in the children's abilities. | Sampling bias due to small sample size (possible type II error) and low representativeness; sampling bias due to lack of inclusion criteria.
Cancer
[36] | A significant reduction was found in the levels of anxiety, depression, and anger in patients after treatment in the robotic group. | Limited generalizability due to small sample size; gender distribution not specified; most of the participants knew the research team before the study began. | Sampling bias due to small sample size (possible type II error) and low representativeness; bias due to possible Rosenthal effect.
[37] | A significant decrease was found in anxiety, depression, and anger after receiving the therapy in the robot group. | Limited generalizability due to small sample size; the vast majority of the sample was female; drop-out due to lack of willingness to come to the medical centers (i.e., unpleasant memories); most of the participants knew the research team before the study began. | Sampling bias due to small sample size (possible type II error) and low representativeness; bias due to drop-out during sessions; bias due to possible Rosenthal effect.
Mixed Target Disorders *
[48] | The robot showed a positive influence on mood (as rated by daily supervisors and self-rated) and alertness, with an increase in activity and interaction with the environment and a decrease in agitation in just one participant. No significant beneficial effects were found in the rest of the sample. | Limited generalizability due to small sample size; low reliability of the AB design; the self-report mood scale might have been too complicated for some of the participants. | Sampling bias due to small sample size (possible type II error) and low representativeness; bias due to missing data.
[60] | Significantly greater right-sided activity was found in the occipital gamma band (attributed to increased sensory awareness and open monitoring) during robot interaction in the meditation group compared to the control group. No significant differences were found in the frontal asymmetry of the two groups. An improvement was found in mood after interaction with the robot. | Limited generalizability due to relatively small sample size; gender distribution not specified. | Sampling bias due to small sample size (possible type II error) and low representativeness; measurement bias due to using an instrument for mood assessment (i.e., the Positive and Negative Affect Scale) that did not have adequate indicators to capture all affective dimensions.
[67] | Participants improved their natural language, with an increase in average sentence length. High levels of comfort and enjoyment were reported with the robot. Improvements were found in Patient Health Questionnaire-9 scores in 75% of the sample and in Geriatric Depression Scale scores in 50% of the sample. | Limited generalizability due to small sample size; the vast majority of the sample was male; possible role of confounding variables (i.e., the excitement of participating in an experiment). | Sampling bias due to small sample size (possible type II error) and low representativeness.
[40] | The robot was able to elicit self-disclosure, but less so than the human. Embodiment was an important feature in eliciting self-disclosure and influencing its quantity and quality. | Limited representativeness due to recruitment of university students only; limited ecological validity of the laboratory experiment. | Sampling bias due to small sample size (possible type II error) and low representativeness; bias due to compensation for participating in the study.
[63] | The frequency of laughter evoked by the robot was mostly higher than that evoked by the original human speaker. The robot was accepted as a participant in the group conversation. | Limited generalizability due to small sample size; the vast majority of the sample was male. | Sampling bias due to small sample size (possible type II error) and low representativeness; sampling bias due to lack of explicit sampling criteria.
[42] | A significant reduction was found in snack episodes and in the frequency of craving imagery in the robot treatment. A significant increase was found in perceived confidence to control snack intake for time duration, specific scenarios, and emotional states, but not in confidence in the number of days of adherence. Significant positive correlations were found between snack episode reductions and working alliance, including a personal bond between the social robot and the participant. | Limited generalizability due to small sample size; the vast majority of the sample was male; short study duration. | Sampling bias due to small sample size (possible type II error) and low representativeness; bias due to lack of completion of the initial measurement (i.e., high-calorie diary recall segment).
[49] | A significant decrease was found in negative behavioral states (i.e., observed anxiety, sadness, yelling, isolative behaviors, reports of pain, wandering/pacing) after the robot intervention. A significant increase was found in positive behavioral states (i.e., calmness, bright affect, sleep behavior, and conversing) after the robot intervention. The presence of pre-test positive or negative behaviors was significantly associated with post-test positive and negative behaviors, respectively. The robot effect was not moderated by a dementia diagnosis, except for one case that approached significance. | Limited generalizability due to small sample size, unblinding, and inability to distinguish the effects of staff from those of the robot; absence of a control group; absence of females in the sample; drop-out due to death, transfer, or discharge; officially, the study was conducted as a non-research activity. | Sampling bias due to small sample size (possible type II error) and low representativeness; sampling bias due to recruitment of a male-only sample; possible observer bias (unblinded observational study) or selection bias; bias due to no randomization; bias due to absence of a control group; bias due to drop-out.
Anxiety
[68] | A significant improvement was found in State-Trait Anxiety Inventory (Form Y) scores after using the robot-based system. | Limited generalizability due to small sample size; shortcomings in the explication of the activity; only state anxiety was evaluated. | Sampling bias due to small sample size (possible type II error) and low representativeness; bias due to absence of a control group.
[50] | A significant decrease was found in state anxiety, heart rate, and breathing rate during active trials with the robot (when the robot was breathing). A significant increase was found in emotional valence (i.e., the positivity of the emotional response was significantly strengthened). | Limited generalizability due to small sample size and to a population with a highly positive attitude towards pets; familiarity with the task before the study began. | Sampling bias due to small sample size (possible type II error) and low representativeness; sampling bias due to lack of a sample with the target diagnosis; bias due to familiarity with the task.
[47] | A significant improvement was found in positive affect after interacting with the robot in a stressful task, with a large effect. No effect was found on negative affect, anxiety, or arousal. | Limited generalizability due to relatively small sample size and brief interactions with the socially assistive robot; only short-term changes in anxiety, mood, and arousal were analyzed. | Sampling bias due to small sample size (possible type II error) and low representativeness; bias due to compensation for participating in the study.
[69] | No significant effects were found on state anxiety or speaker confidence, even though the robot led to improvements in the presenters' rehearsal experience. | Limited generalizability due to small convenience sample; the vast majority of the sample was male; limited ecological validity. | Sampling bias due to small sample size (possible type II error) and low representativeness; bias due to compensation for participating in the study.
Wellbeing and Stress
[61] | An improvement was found in mood when the robot was presented as a coach intended to deliver a psychological exercise. A decrease in mood was shown when the robot was presented as a conversation partner. Elderly participants reported communication problems and less confidence when interacting with the robot, as compared with younger participants. | Limited generalizability due to relatively small sample size, artificiality of the scenario, and low ecological validity. | Sampling bias due to small sample size (possible type II error) and low representativeness; sampling bias due to lack of explicit sampling criteria; bias due to elderly participants' communication problems with the robot.
[41] | Positive emotions (i.e., joy) were elicited by the robot in both its embodied and avatar forms. The effect was comparable between the robot and an avatar, and higher than the effect elicited by a non-interactive toy. Embodiment did not seem necessary to elicit positive emotions. | Limited generalizability due to relatively small sample size. | Sampling bias due to small sample size (possible type II error).
[62] | A statistically significant improvement was found in psychological wellbeing, overall mood, and readiness to change behavior in the group with low neuroticism and high conscientiousness after the robot intervention. A significant improvement was also found in psychological wellbeing in the group with high neuroticism and low conscientiousness, but not in mood or readiness to change. | Limited generalizability due to relatively small sample size; no information about age; the vast majority of the sample was female; no randomization to a control group; most of the sample voluntarily signed up for the study. | Sampling bias due to small sample size (possible type II error) and low representativeness; bias due to absence of a control group; self-selection bias.
[39] | No significant effect of robot-initiated touch was found on physiological measures of arousal or on prosocial behaviors (i.e., the amount of money participants were willing to donate or the amount of time they were willing to offer for completing another questionnaire). | Limited generalizability due to relatively small sample size; limitations due to technical features (i.e., too quick, too slow, or rigid robot responses may have affected the spontaneity of the interaction). | Sampling bias due to small sample size (possible type II error) and low representativeness; bias due to compensation for participating in the study.
[38] | Empathy was the most frequently reported level of experience when interacting with the robot, which was seen as a friend. Stimulation (i.e., being interested in the robot in order to forget the situation) was also reported when the robot was seen as a means of distraction. | Limited generalizability due to small sample size; the vast majority of the sample was male. | Sampling bias due to lack of explicit sampling criteria.
Note: ASD = autism spectrum disorder; ASC = autism spectrum condition; * = intellectual disabilities; mindfulness; depression; self-disclosure; emotional states in elderly; behavior change intervention and reduction of high calories; positive and negative behaviors in veterans.
Table A3. Robot employed in the reviewed studies.
Robot | Brief Description | Classification 1
NAO (a.k.a. Murphy Miserable Robot)
Source for image: [39]
NAO is a humanoid robot produced by Aldebaran Robotics, designed to interact with people. It is a medium-sized (57–58 cm) entertainment robot with an onboard computer and networking capabilities at its core. Its platform is programmable, and it can handle multiple applications. It can walk, dance, speak, and recognize faces and objects. Now in its sixth generation, it is used in research, education, and healthcare all over the world.
PROBO
Source for image: [87]
Probo is an animal-like green robot with an attractive trunk, or proboscis, animated ears, eyes, eyebrows, eyelids, mouth, and neck, and an interactive belly screen. It provides a soft touch and a huggable appearance, and it is designed to act as a social interface. It is usually used as a platform to study human–robot interaction (HRI) while employing human-like social cues and communication modalities. The robot has a fully actuated head with 20 degrees of freedom and is capable of expressing attention and emotions via its gaze and facial expressions and of making eye contact.
KEEPON
Source for image: [51]
Keepon is a small robot with a silicone-rubber body, resembling a yellow snowman. The robot sits on a black cylinder containing four motors and two circuit boards, via which its movements can be manipulated. Its head and belly deform whenever it changes posture or someone touches it, and it can nod, turn, lean side to side, and bob. Keepon has been widely used in human–robot interaction studies involving social behaviors.
THE HAPTIC CREATURE
Source for image: [50]
The Haptic Creature is an animal-like robot that is able to breathe, purr, and change the stiffness of its ears. It was developed by Steve Yohanan to study the communication of emotion through touch. Under its soft artificial fur, the robot uses an array of touch sensors to perceive a person's touch; by changing the depth and rate of its breathing, the symmetry between inhale and exhale, the strength of its purring, and the stiffness of its ears, it is able to communicate different emotions.
PARO
Source for image: [49]
Paro is a baby-harp-seal-like robot with white fur. It is equipped with four primary "senses": sight (via a light sensor), hearing (including sound directionality and speech recognition), balance, and tactile sensation. It has been developed to provoke social reactions through its soft fur, by reacting to its environment with sounds (e.g., showing a diurnal sleep–wake cycle if reinforced in the form of stroking or petting), and by moving its head, tail, and eyelids.
KASPAR
Source for image: [88]
Kaspar is a humanoid child-sized robot developed by the Adaptive Systems Research Group at the University of Hertfordshire. It can move its arms, head, and torso; blink; make facial expressions (e.g., smile when "happy"); and vocalize pre-programmed sounds, words, and sentences using a neutral human voice. It can operate in a semi-autonomous manner, controlled via a wireless keypad to perform various pre-programmed actions; actions can also be triggered by activating the robot's tactile sensors placed on different parts of its body.
KILIRO
Source for image: [59]
KiliRo is a parrot-like robot capable of recognizing the English letters A through Z and the Arabic numerals 0 through 9 using a QR-code-scanning algorithm. When a number or a letter is shown to the robot, it reads the QR code pasted on the model, and the corresponding text is converted into speech using a text-to-speech module. It also has seven touch sensors used to recognize children's tactile interaction preferences; when touched, the LED light on the robot glows, encouraging the children to engage in more physical interaction with KiliRo.
FLOBI
Source for image: [89]
The humanoid robot head Flobi is able to communicate (using an external loudspeaker positioned next to the robot head) and to display a range of facial expressions.
ROBONOVA
Source for image: [90]
Robonova is a small humanoid robot developed by Hitec Robotics with a high speed of execution and 16 degrees of freedom, enabling a high level of temporal contingency between the user's actions and the robot's. It can be remotely controlled via IR, RC, or a Bluetooth joystick.
DAISY
Source for image: [91]
Daisy is a semi-autonomous robot in the shape of a light blue and purple flower that resembles a stuffed soft toy. The robot's face has two eyes with eyebrows that blink and look around, and a mouth that speaks words and about 400 preinstalled phrases, categorized according to their meaning, with lip-syncing. Daisy can be remotely controlled to perform sequences of movements and facial expressions.
COZMO
Source for image: [92]
Cozmo is a small toy robot inspired by Pixar's Wall-E and Eve, controlled through a smartphone app with different modes. It recognizes human faces, pets, and its toy cubes and uses different sensors to track its body orientation. It also displays emotions using sounds and animations. It does not have speech recognition and mainly interacts through beeps, movements, and animated eyes.
PEPPER
Source for image: [93]
Pepper is a humanoid robot developed by SoftBank Robotics, capable of exhibiting body language, perceiving and interacting with its surroundings, and moving around. It can also analyze people's expressions and voice tones, using the latest advances and proprietary algorithms in voice and emotion recognition to spark interactions.
RYAN
Source for image: [94]
Ryan is a social robot developed by DreamFace Technologies, LLC for face-to-face human–robot communication in different social, educational, and therapeutic contexts. It has an emotive and expressive animation-based face with accurate visual speech and can communicate through spoken dialogue. On its chest there is also a tablet that can receive the user's input or show them videos or images. It can also be equipped with a dialog manager called ProgramR, which the robot uses to process the user's input and reply appropriately.
CAKNA
Source for image: [95]
Cakna stands for "collaborative, knowledge-oriented, and autonomous agent" and means "care" in the Malay language. It is half a meter tall and designed to sit on a table or countertop. It has a limited ability to express its state to the user using physical or anthropomorphic displays (e.g., mouth, arms, hands).
ROBOCOP
Source for image: [69]
RoboCOP (robotic coach for oral presentations) is an automated anthropomorphic robot head for presentation rehearsal. It actively listens to a presenter's speech and offers detailed spoken feedback on five key aspects of presentations: content coverage, speaking rate, filler rate, pitch variety, and audience orientation. It also provides high-level advice on the presentation goal and audience benefits, as well as on talk organization, introduction, and close.
BONO-01
Source for image: [63]
Bono-01 is a conversation support robot dressed in baby clothes, capable of network communication control, transmission and reception of images, and responding so as to facilitate conversations.
GIRAFF
Source for image: [96]
The Giraff telepresence robot is generally used to achieve remote communication between two parties. It consists of a mobile robotic base equipped with a web camera, a microphone, and a screen. Users interact through the robotic device with a remote peer who connects through a client interface. The client interface runs on the remote user's site and allows the user to teleoperate the Giraff and navigate around (by pointing directly on the real-time video image and pressing the left mouse button) while speaking through a microphone and web camera, as well as receiving the video and audio stream from the Giraff.
JIBO
Source for image: [97]
JIBO is a commercial robot equipped with basic skills, such as weather forecasts, jokes, music, and interactive games. It is meant to be placed on a table, desk, or counter and contains no wheels or other mechanisms for travelling to a different location, but it is able to express itself with body poses and gestures using just a torso and head. The center of JIBO's face has a touch screen that usually displays JIBO's eye (a grey circle), which moves along with JIBO's body (e.g., looking to the right as JIBO turns its body right). However, the screen can also be used for other purposes, such as displaying text, images, or videos. Its touch capability can be used to identify objects or make selections on the screen.
THE HUGGABLE ROBOT
Source for image: [41]
The Huggable robot is a blue and green teddy-bear-like robot with a big head and large eyes that intensify the sense of infancy and make it more appealing to young children. It has twelve degrees of freedom (DOF) to perform animate and expressive motions, and it is capable of manual look-at and point-at behaviors. It is also able to talk while moving its muzzle up and down accordingly.
Note: 1 Classification was made according to the framework for classifying social robots proposed by Bartneck and Forlizzi [8].

References

  1. Pollak, A.; Paliga, M.; Pulopulos, M.M.; Kozusznik, B.; Kozusznik, M.W. Stress in Manual and Autonomous Modes of Collaboration with a Cobot. Comput. Hum. Behav. 2020, 112, 106469.
  2. Wainer, J.; Feil-Seifer, D.J.; Shell, D.A.; Mataric, M.J. The Role of Physical Embodiment in Human-Robot Interaction. In Proceedings of the ROMAN 2006—The 15th IEEE International Symposium on Robot and Human Interactive Communication, Hatfield, UK, 6–8 September 2006; pp. 117–122.
  3. Haring, K.S.; Satterfield, K.M.; Tossell, C.C.; de Visser, E.J.; Lyons, J.R.; Mancuso, V.F.; Finomore, V.S.; Funke, G.J. Robot Authority in Human-Robot Teaming: Effects of Human-Likeness and Physical Embodiment on Compliance. Front. Psychol. 2021, 12, 625713.
  4. Holland, J.; Kingston, L.; McCarthy, C.; Armstrong, E.; O’Dwyer, P.; Merz, F.; McConnell, M. Service Robots in the Healthcare Sector. Robotics 2021, 10, 47.
  5. Van Wynsberghe, A. Service Robots, Care Ethics, and Design. Ethics Inf. Technol. 2016, 18, 311–321.
  6. Wan, S.; Gu, Z.; Ni, Q. Cognitive Computing and Wireless Communications on the Edge for Healthcare Service Robots. Comput. Commun. 2020, 149, 99–106.
  7. Sarrica, M.; Brondi, S.; Fortunati, L. How Many Facets Does a “Social Robot” Have? A Review of Scientific and Popular Definitions Online. Inf. Technol. People 2019, 33, 1–21.
  8. Bartneck, C.; Forlizzi, J. A Design-Centred Framework for Social Human-Robot Interaction. In Proceedings of the RO-MAN 2004—13th IEEE International Workshop on Robot and Human Interactive Communication, Kurashiki, Japan, 22 September 2004; pp. 591–594.
  9. Chen, S.-C.; Moyle, W.; Jones, C.; Petsky, H. A Social Robot Intervention on Depression, Loneliness, and Quality of Life for Taiwanese Older Adults in Long-Term Care. Int. Psychogeriatr. 2020, 32, 981–991.
  10. Costescu, C.A.; Vanderborght, B.; David, D.O. The Effects of Robot-Enhanced Psychotherapy: A Meta-Analysis. Rev. Gen. Psychol. 2014, 18, 127–136.
  11. Libin, A.V.; Libin, E.V. Person-Robot Interactions from the Robopsychologists’ Point of View: The Robotic Psychology and Robotherapy Approach. Proc. IEEE 2004, 92, 1789–1803.
  12. Gelso, C.J.; Hayes, J.A. The Psychotherapy Relationship: Theory, Research, and Practice; John Wiley & Sons Inc.: Hoboken, NJ, USA, 1998; 304p, ISBN 978-0-471-12720-8.
  13. Høglend, P. Exploration of the Patient-Therapist Relationship in Psychotherapy. Am. J. Psychiatry 2014, 171, 1056–1066.
  14. Roos, J.; Werbart, A. Therapist and Relationship Factors Influencing Dropout from Individual Psychotherapy: A Literature Review. Psychother. Res. 2013, 23, 394–418.
  15. Cheok, A.D.; Levy, D.; Karunanayaka, K. Lovotics: Love and Sex with Robots. In Emotion in Games: Theory and Praxis; Socio-Affective Computing; Karpouzis, K., Yannakakis, G.N., Eds.; Springer International Publishing: Cham, Switzerland, 2016; pp. 303–328, ISBN 978-3-319-41316-7.
  16. Cheok, A.D.; Karunanayaka, K.; Zhang, E.Y. Lovotics: Human–Robot Love and Sex Relationships. In Robot Ethics 2.0: New Challenges in Philosophy, Law, and Society; Lin, P., Abney, K., Jenkins, R., Eds.; Oxford University Press: Oxford, UK, 2017; pp. 193–213, ISBN 978-0-19-065295-1.
  17. Lin, Y.; Tudor-Sfetea, C.; Siddiqui, S.; Sherwani, Y.; Ahmed, M.; Eisingerich, A.B. Effective Behavioral Changes through a Digital mHealth App: Exploring the Impact of Hedonic Well-Being, Psychological Empowerment and Inspiration. JMIR mHealth uHealth 2018, 6, e10024.
  18. van Straten, C.L.; Kühne, R.; Peter, J.; de Jong, C.; Barco, A. Closeness, Trust, and Perceived Social Support in Child-Robot Relationship Formation: Development and Validation of Three Self-Report Scales. Interact. Stud. 2020, 21, 57–84.
  19. Ta, V.; Griffith, C.; Boatfield, C.; Wang, X.; Civitello, M.; Bader, H.; DeCero, E.; Loggarakis, A. User Experiences of Social Support From Companion Chatbots in Everyday Contexts: Thematic Analysis. J. Med. Internet Res. 2020, 22, e16235.
  20. Angantyr, M.; Eklund, J.; Hansen, E.M. A Comparison of Empathy for Humans and Empathy for Animals. Anthrozoös 2011, 24, 369–377.
  21. Mattiassi, A.D.A.; Sarrica, M.; Cavallo, F.; Fortunati, L. What Do Humans Feel with Mistreated Humans, Animals, Robots, and Objects? Exploring the Role of Cognitive Empathy. Motiv. Emot. 2021, 45, 543–555.
  22. Misselhorn, C. Empathy with Inanimate Objects and the Uncanny Valley. Minds Mach. 2009, 19, 345–359.
  23. Young, A.; Khalil, K.A.; Wharton, J. Empathy for Animals: A Review of the Existing Literature. Curator Mus. J. 2018, 61, 327–343.
  24. Marchetti, A.; Manzi, F.; Itakura, S.; Massaro, D. Theory of Mind and Humanoid Robots From a Lifespan Perspective. Z. Für Psychol. 2018, 226, 98–109.
  25. Schmetkamp, S. Understanding AI—Can and Should We Empathize with Robots? Rev. Philos. Psychol. 2020, 11, 881–897.
  26. American Psychiatric Association. Diagnostic and Statistical Manual of Mental Disorders: DSM-5, 5th ed.; American Psychiatric Pub Inc.: Washington, DC, USA, 2013; ISBN 978-0-89042-555-8.
  27. Norcross, J.C.; Beutler, L.E. A Prescriptive Eclectic Approach to Psychotherapy Training. J. Psychother. Integr. 2000, 10, 247–261.
  28. Beutler, L.E. Making Science Matter in Clinical Practice: Redefining Psychotherapy. Clin. Psychol. Sci. Pract. 2009, 16, 301–317.
  29. Marino, F.; Chilà, P.; Sfrazzetto, S.T.; Carrozza, C.; Crimi, I.; Failla, C.; Busà, M.; Bernava, G.; Tartarisco, G.; Vagni, D.; et al. Outcomes of a Robot-Assisted Social-Emotional Understanding Intervention for Young Children with Autism Spectrum Disorders. J. Autism Dev. Disord. 2020, 50, 1973–1987.
  30. Cao, H.-L.; Gomez Esteban, P.; Bartlett, M.; Baxter, P.; Belpaeme, T.; Billing, E.; Cai, H.; Coeckelbergh, M.; Pop, C.; David, D. Robot-Enhanced Therapy: Development and Validation of Supervised Autonomous Robotic System for Autism Spectrum Disorders Therapy. IEEE Robot. Autom. Mag. 2019, 26, 49–58.
  31. David, D. Tasks for Social Robots (Supervised Autonomous Version) on Developing Social Skills; Dream Publishing: Jamtara, India, 2017; pp. 1–18.
  32. Tapus, A.; Peca, A.; Aly, A.; Pop, C.; Jisa, L.; Pintea, S.; Rusu, A.S.; David, D.O. Children with Autism Social Engagement in Interaction with Nao, an Imitative Robot: A Series of Single Case Experiments. Interact. Stud. 2012, 13, 315–347.
  33. David, D.O.; Costescu, C.A.; Matu, S.; Szentagotai, A.; Dobrean, A. Effects of a Robot-Enhanced Intervention for Children With ASD on Teaching Turn-Taking Skills. J. Educ. Comput. Res. 2020, 58, 29–62.
  34. Shamsuddin, S.; Yussof, H.; Ismail, L.I.; Mohamed, S.; Hanapiah, F.A.; Zahari, N.I. Initial Response in HRI—A Case Study on Evaluation of Child with Autism Spectrum Disorders Interacting with a Humanoid Robot NAO. Procedia Eng. 2012, 41, 1448–1455.
  35. David, D.O.; Costescu, C.A.; Matu, S.; Szentagotai, A.; Dobrean, A. Developing Joint Attention for Children with Autism in Robot-Enhanced Therapy. Int. J. Soc. Robot. 2018, 10, 595–605.
  36. Alemi, M.; Meghdari, A.; Ghanbarzadeh, A.; Moghadam, L.J.; Ghanbarzadeh, A. Impact of a Social Humanoid Robot as a Therapy Assistant in Children Cancer Treatment. In Proceedings of the Social Robotics, Sydney, NSW, Australia, 27–29 October 2014; Beetz, M., Johnston, B., Williams, M.-A., Eds.; Springer International Publishing: Cham, Switzerland, 2014; pp. 11–22.
  37. Alemi, M.; Ghanbarzadeh, A.; Meghdari, A.; Moghadam, L.J. Clinical Application of a Humanoid Robot in Pediatric Cancer Interventions. Int. J. Soc. Robot. 2016, 8, 743–759.
  38. Ullrich, D.; Diefenbach, S.; Butz, A. Murphy Miserable Robot: A Companion to Support Children’s Well-Being in Emotionally Difficult Situations. In Proceedings of the 2016 CHI Conference Extended Abstracts on Human Factors in Computing Systems, San Jose, CA, USA, 7–12 May 2016; Association for Computing Machinery: New York, NY, USA, 2016; pp. 3234–3240.
  39. van der Hout, V.M. The Touch of a Robotic Friend: Can a Touch of a Robot, When the Robot and the Person Have Bonded with Each Other, Calm a Person down during a Stressful Moment? Master’s Thesis, University of Twente, Twente, The Netherlands, 2017.
  40. Laban, G.; George, J.-N.; Morrison, V.; Cross, E.S. Tell Me More! Assessing Interactions with Social Robots from Speech. Paladyn J. Behav. Robot. 2021, 12, 136–159.
  41. Jeong, S. The Impact of Social Robots on Young Patients’ Socio-Emotional Wellbeing in a Pediatric Inpatient Care Context. Ph.D. Thesis, Massachusetts Institute of Technology, Cambridge, MA, USA, 2017.
  42. Robinson, N.L.; Connolly, J.; Hides, L.; Kavanagh, D.J. Social Robots as Treatment Agents: Pilot Randomized Controlled Trial to Deliver a Behavior Change Intervention. Internet Interv. 2020, 21, 100320.
  43. Vanderborght, B.; Simut, R.; Saldien, J.; Pop, C.; Rusu, A.S.; Pintea, S.; Lefeber, D.; David, D.O. Using the Social Robot Probo as a Social Story Telling Agent for Children with ASD. Interact. Stud. 2012, 13, 348–372.
  44. Simut, R.E.; Vanderfaeillie, J.; Peca, A.; Van de Perre, G.; Vanderborght, B. Children with Autism Spectrum Disorders Make a Fruit Salad with Probo, the Social Robot: An Interaction Study. J. Autism Dev. Disord. 2016, 46, 113–126.
  45. Pop, C.A.; Pintea, S.; Vanderborght, B.; David, D.O. Enhancing Play Skills, Engagement and Social Skills in a Play Task in ASD Children by Using Robot-Based Interventions. A Pilot Study. Interact. Stud. 2014, 15, 292–320.
  46. Simut, R.; Costescu, C.A.; Vanderfaeillie, J.; Van De Perre, G.; Vanderborght, B.; Lefeber, D. Can You Cure Me? Children with Autism Spectrum Disorders Playing a Doctor Game With a Social Robot. Int. J. Sch. Health 2016, 3, 1–9.
  47. Crossman, M.K.; Kazdin, A.E.; Kitt, E.R. The Influence of a Socially Assistive Robot on Mood, Anxiety, and Arousal in Children. Prof. Psychol. Res. Pract. 2018, 49, 48–56.
  48. Wagemaker, E.; Dekkers, T.J.; van Rentergem, J.A.A.; Volkers, K.M.; Huizenga, H.M. Advances in Mental Health Care: Five N = 1 Studies on the Effects of the Robot Seal Paro in Adults With Severe Intellectual Disabilities. J. Ment. Health Res. Intellect. Disabil. 2017, 10, 309–320.
  49. Lane, G.W.; Noronha, D.; Rivera, A.; Craig, K.; Yee, C.; Mills, B.; Villanueva, E. Effectiveness of a Social Robot, “Paro,” in a VA Long-Term Care Setting. Psychol. Serv. 2016, 13, 292–299.
  50. Sefidgar, Y.S. TAMER: Touch-Guided Anxiety Management via Engagement with a Robotic Pet Efficacy Evaluation and the First Steps of the Interaction Design. Ph.D. Thesis, University of British Columbia, Vancouver, BC, Canada, 2012.
  51. Costescu, C.; Vanderborght, B.; Daniel, D. Robot-Enhanced CBT for Dysfunctional Emotions in Social Situations for Children With ASD. J. Evid.-Based Psychother. 2017, 17, 119–132.
  52. Costescu, C.A.; Vanderborght, B.; David, D.O. Reversal Learning Task in Children with Autism Spectrum Disorder: A Robot-Based Approach. J. Autism Dev. Disord. 2015, 45, 3715–3725.
  53. Karakosta, E.; Dautenhahn, K.; Syrdal, D.S.; Wood, L.J.; Robins, B. Using the Humanoid Robot Kaspar in a Greek School Environment to Support Children with Autism Spectrum Condition. Paladyn J. Behav. Robot. 2019, 10, 298–317.
  54. Wood, L.J.; Robins, B.; Lakatos, G.; Syrdal, D.S.; Zaraki, A.; Dautenhahn, K. Developing a Protocol and Experimental Setup for Using a Humanoid Robot to Assist Children with Autism to Develop Visual Perspective Taking Skills. Paladyn J. Behav. Robot. 2019, 10, 167–179.
  55. Pliasa, S.; Fachantidis, N. Using Daisy Robot as a Motive for Children with ASD to Participate in Triadic Activities. Themes ELearning 2019, 12, 35–50.
  56. Peca, A.; Simut, R.; Pintea, S.; Vanderborght, B. Are Children with ASD More Prone to Test the Intentions of the Robonova Robot Compared to a Human? Int. J. Soc. Robot. 2015, 7, 629–639.
  57. Ghiglino, D.; Chevalier, P.; Floris, F.; Priolo, T.; Wykowska, A. Follow the White Robot: Efficacy of Robot-Assistive Training for Children with Autism Spectrum Disorder. Res. Autism Spectr. Disord. 2021, 86, 101822.
  58. Damm, O.; Malchus, K.; Jaecks, P.; Krach, S.; Paulus, F.; Naber, M.; Jansen, A.; Kamp-Becker, I.; Einhaeuser-Treyer, W.; Stenneken, P.; et al. Different Gaze Behavior in Human-Robot Interaction in Asperger’s Syndrome: An Eye-Tracking Study. In Proceedings of the 2013 IEEE RO-MAN, Gyeongju, Korea, 26–29 August 2013; pp. 368–369.
  59. Bharatharaj, J.; Huang, L.; Al-Jumaily, A.; Elara, M.R.; Krägeloh, C. Investigating the Effects of Robot-Assisted Therapy among Children with Autism Spectrum Disorder Using Bio-Markers. IOP Conf. Ser. Mater. Sci. Eng. 2017, 234, 012017.
  60. Alimardani, M.; Kemmeren, L.; Okumura, K.; Hiraki, K. Robot-Assisted Mindfulness Practice: Analysis of Neurophysiological Responses and Affective State Change. In Proceedings of the 2020 29th IEEE International Conference on Robot and Human Interactive Communication (RO-MAN), Naples, Italy, 8–12 August 2020; pp. 683–689.
  61. Gallego Pérez, J.; Lohse, M.; Evers, V. Robots for the Psychological Wellbeing of the Elderly. In Proceedings of the HRI 2014 Workshop on Socially Assistive Robots for the Aging Population, Bielefeld, Germany, 3–6 March 2014.
  62. Jeong, S.; Alghowinem, S.; Aymerich-Franch, L.; Arias, K.; Lapedriza, A.; Picard, R.; Park, H.W.; Breazeal, C. A Robotic Positive Psychology Coach to Improve College Students’ Wellbeing. In Proceedings of the 2020 29th IEEE International Conference on Robot and Human Interactive Communication (RO-MAN), Naples, Italy, 8–12 August 2020; pp. 187–194.
  63. Yamaguchi, K.; Nergui, M.; Otake, M. A Robot Presenting Reproduced Stories among Older Adults in Group Conversation. Appl. Mech. Mater. 2014, 541–542, 1120–1126.
  64. Bucher, M.A.; Suzuki, T.; Samuel, D.B. A Meta-Analytic Review of Personality Traits and Their Associations with Mental Health Treatment Outcomes. Clin. Psychol. Rev. 2019, 70, 51–63.
  65. Barlow, D.H.; Curreri, A.J.; Woodard, L.S. Neuroticism and Disorders of Emotion: A New Synthesis. Curr. Dir. Psychol. Sci. 2021, 30, 09637214211030253.
  66. Widiger, T.A.; Oltmanns, J.R. Neuroticism Is a Fundamental Domain of Personality with Enormous Public Health Implications. World Psychiatry 2017, 16, 144–145.
  67. Dino, F.; Zandie, R.; Abdollahi, H.; Schoeder, S.; Mahoor, M.H. Delivering Cognitive Behavioral Therapy Using A Conversational Social Robot. In Proceedings of the 2019 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Macau, China, 3–8 November 2019; pp. 2089–2095.
  68. Aziz, A.A.; Yusoff, N.; Yusoff, A.N.M.; Kabir, F. Towards Robot Therapist In-the-Loop for Persons with Anxiety Traits. In Proceedings of the International Conference on Universal Wellbeing, Kuala Lumpur, Malaysia, 4–6 December 2019.
  69. Trinh, H.; Asadi, R.; Edge, D.; Bickmore, T. RoboCOP: A Robotic Coach for Oral Presentations. Proc. ACM Interact. Mob. Wearable Ubiquitous Technol. 2017, 1, 1–24.
  70. Salter, T.; Davey, N.; Michaud, F. Designing Developing QueBall, a Robotic Device for Autism Therapy. In Proceedings of the 23rd IEEE International Symposium on Robot and Human Interactive Communication, Edinburgh, UK, 25–29 August 2014; pp. 574–579.
  71. Pop, C.A.; Simut, R.; Pintea, S.; Saldien, J.; Rusu, A.; David, D.; Vanderfaeillie, J.; Lefeber, D.; Vanderborght, B. Can the Social Robot Probo Help Children with Autism to Identify Situation-Based Emotions? A Series of Single Case Experiments. Int. J. Hum. Robot. 2013, 10, 1350025.
  72. Barker, S.B.; Dawson, K.S. The Effects of Animal-Assisted Therapy on Anxiety Ratings of Hospitalized Psychiatric Patients. Psychiatr. Serv. 1998, 49, 797–801.
  73. Li, J. The Benefit of Being Physically Present: A Survey of Experimental Works Comparing Copresent Robots, Telepresent Robots and Virtual Agents. Int. J. Hum.-Comput. Stud. 2015, 77, 23–37.
  74. Lee, K.M.; Jung, Y.; Kim, J.; Kim, S.R. Are Physically Embodied Social Agents Better than Disembodied Social Agents?: The Effects of Physical Embodiment, Tactile Interaction, and People’s Loneliness in Human–Robot Interaction. Int. J. Hum.-Comput. Stud. 2006, 64, 962–973.
  75. Kruglanski, A.W.; Riter, A.; Amitai, A.; Margolin, B.-S.; Shabtai, L.; Zaksh, D. Can Money Enhance Intrinsic Motivation? A Test of the Content-Consequence Hypothesis. J. Pers. Soc. Psychol. 1975, 31, 744–750.
  76. Belpaeme, T.; Kennedy, J.; Ramachandran, A.; Scassellati, B.; Tanaka, F. Social Robots for Education: A Review. Sci. Robot. 2018, 3.
  77. Pu, L.; Moyle, W.; Jones, C.; Todorovic, M. The Effectiveness of Social Robots for Older Adults: A Systematic Review and Meta-Analysis of Randomized Controlled Studies. Gerontologist 2019, 59, e37–e51.
  78. Logan, D.E.; Breazeal, C.; Goodwin, M.S.; Jeong, S.; O’Connell, B.; Smith-Freedman, D.; Heathers, J.; Weinstock, P. Social Robots for Hospitalized Children. Pediatrics 2019, 144, e20181511.
  79. Jung, Y.; Lee, K.M. Effects of Physical Embodiment on Social Presence of Social Robots; International Society for Presence Research: Valencia, Spain, 2004; pp. 80–87.
  80. Thielbar, K.O.; Triandafilou, K.M.; Barry, A.J.; Yuan, N.; Nishimoto, A.; Johnson, J.; Stoykov, M.E.; Tsoupikova, D.; Kamper, D.G. Home-Based Upper Extremity Stroke Therapy Using a Multiuser Virtual Reality Environment: A Randomized Trial. Arch. Phys. Med. Rehabil. 2020, 101, 196–203.
  81. Dodds, P.; Martyn, K.; Brown, M. Infection Prevention and Control Challenges of Using a Therapeutic Robot. Nurs. Older People 2018, 30, 34–40.
  82. Schultz, E. Furry Therapists: The Advantages and Disadvantages of Implementing Animal Therapy; University of Wisconsin-Stout: Menomonie, WI, USA, 2006.
  83. Maloney, D.; Zamanifard, S.; Freeman, G. Anonymity vs. Familiarity: Self-Disclosure and Privacy in Social Virtual Reality. In Proceedings of the 26th ACM Symposium on Virtual Reality Software and Technology, Osaka, Japan, 1–4 November 2020; Association for Computing Machinery: New York, NY, USA, 2020; pp. 1–9.
  84. Feingold Polak, R.; Tzedek, S.L. Social Robot for Rehabilitation: Expert Clinicians and Post-Stroke Patients’ Evaluation Following a Long-Term Intervention. In Proceedings of the 2020 ACM/IEEE International Conference on Human-Robot Interaction, Cambridge, UK, 23–26 March 2020; Association for Computing Machinery: New York, NY, USA, 2020; pp. 151–160, ISBN 978-1-4503-6746-2.
  85. Martí Carrillo, F.; Butchart, J.; Knight, S.; Scheinberg, A.; Wise, L.; Sterling, L.; McCarthy, C. Adapting a General-Purpose Social Robot for Paediatric Rehabilitation through In Situ Design. ACM Trans. Hum.-Robot Interact. 2018, 7, 1–30.
  86. Céspedes, N.; Múnera, M.; Gómez, C.; Cifuentes, C.A. Social Human-Robot Interaction for Gait Rehabilitation. IEEE Trans. Neural Syst. Rehabil. Eng. 2020, 28, 1299–1307.
  87. Goris, K.; Saldien, J.; Vanderborght, B.; Lefeber, D. How to Achieve the Huggable Behavior of the Social Robot Probo? A Reflection on the Actuators. Mechatronics 2011, 21, 490–500.
  88. Dautenhahn, K.; Nehaniv, C.L.; Walters, M.L.; Robins, B.; Kose-Bagci, H.; Assif, N.; Blow, M. KASPAR—A Minimally Expressive Humanoid Robot for Human–Robot Interaction Research. Appl. Bionics Biomech. 2009, 6, 369–397.
  89. Lütkebohle, I.; Hegel, F.; Schulz, S.; Hackel, M.; Wrede, B.; Wachsmuth, S.; Sagerer, G. The Bielefeld Anthropomorphic Robot Head “Flobi”. In Proceedings of the 2010 IEEE International Conference on Robotics and Automation, Paris, France, 3–7 May 2010; pp. 3384–3391.
  90. Grunberg, D.; Ellenberg, R.; Kim, Y.E.; Oh, P.Y. From RoboNova to HUBO: Platforms for Robot Dance. In Proceedings of Progress in Robotics, Incheon, Korea, 16–20 August 2009; Kim, J.-H., Ge, S.S., Vadakkepat, P., Jesse, N., Al Manum, A., Puthusserypady, K.S., Rückert, U., Sitte, J., Witkowski, U., Nakatsu, R., Eds.; Springer: Berlin/Heidelberg, Germany, 2009; pp. 19–24.
  91. Pliasa, S.; Fachantidis, N. Can a Robot Be an Efficient Mediator in Promoting Dyadic Activities among Children with Autism Spectrum Disorders and Children of Typical Development? In Proceedings of the 9th Balkan Conference on Informatics, Sofia, Bulgaria, 26–28 September 2019; Association for Computing Machinery: New York, NY, USA, 2019; pp. 1–6.
  92. Pelikan, H.R.M.; Broth, M.; Keevallik, L. “Are You Sad, Cozmo?” How Humans Make Sense of a Home Robot’s Emotion Displays. In Proceedings of the 2020 ACM/IEEE International Conference on Human-Robot Interaction, Cambridge, UK, 23–26 March 2020; Association for Computing Machinery: New York, NY, USA, 2020; pp. 461–470, ISBN 978-1-4503-6746-2.
  93. Pandey, A.K.; Gelin, R. A Mass-Produced Sociable Humanoid Robot: Pepper: The First Machine of Its Kind. IEEE Robot. Autom. Mag. 2018, 25, 40–48.
  94. Askari, F.; Feng, H.; Sweeny, T.D.; Mahoor, M.H. A Pilot Study on Facial Expression Recognition Ability of Autistic Children Using Ryan, A Rear-Projected Humanoid Robot. In Proceedings of the 2018 27th IEEE International Symposium on Robot and Human Interactive Communication (RO-MAN), Nanjing, China, 27–31 August 2018; pp. 790–795.
  95. Aziz, A.A.; Fahad, A.S.; Ahmad, F. CAKNA: A Personalized Robot-Based Platform for Anxiety States Therapy. Intell. Environ. 2017, 2017, 141–150.
  96. Tiberio, L.; Padua, L.; Pellegrino, A.R.; Aprile, I.; Cortellessa, G.; Cesta, A. Assessing the Tolerance of a Telepresence Robot in Users with Mild Cognitive Impairment—A Protocol for Studying Users’ Physiological Response. In Proceedings of the HRI 2011 Workshop on Social Robotic Telepresence, Lausanne, Switzerland, 6 March 2011; pp. 23–28.
  97. Yeung, G.; Bailey, A.; Afshan, A.; Tinkler, M.; Pérez, M.Q.; Martin, A.; Pogossian, A.A.; Spaulding, S.; Park, H.; Muco, M.; et al. A Robotic Interface for the Administration of Language, Literacy, and Speech Pathology Assessments for Children. In Proceedings of the SLaTE, Coimbra, Portugal, 27–28 June 2019.
Figure 1. Diagram showing the information flow through the review: the number of records identified, included, and excluded.
Figure 2. Bar chart of the social robot adoption in the selected studies.

Duradoni, M.; Colombini, G.; Russo, P.A.; Guazzini, A. Robotic Psychology: A PRISMA Systematic Review on Social-Robot-Based Interventions in Psychological Domains. J 2021, 4, 664-697. https://doi.org/10.3390/j4040048