Review

A Review of Socially Assistive Robotics in Supporting Children with Autism Spectrum Disorder

College of Engineering and Technology, American University of the Middle East, Egaila 54200, Kuwait
*
Authors to whom correspondence should be addressed.
Multimodal Technol. Interact. 2025, 9(9), 98; https://doi.org/10.3390/mti9090098
Submission received: 18 May 2025 / Revised: 18 August 2025 / Accepted: 15 September 2025 / Published: 18 September 2025

Abstract

This study investigated the use of social robots as an interactive learning approach for treating children diagnosed with autism spectrum disorder (ASD). A systematic review was conducted to compile pertinent research. An analysis was performed on the results of the online search process, which gathered information on pertinent research published until 31 January 2025, from three publication databases: IEEE Xplore, SCOPUS, and Google Scholar. One hundred and seven papers out of the 591 publications retrieved satisfied the previously established inclusion and exclusion criteria. Despite methodological differences and heterogeneity among the studies, the data were synthesized narratively. This review focuses on the various types of social robots used to treat ASD, as well as their communication mechanisms, development areas, target behaviors, challenges, and future directions. Both practitioners and seasoned researchers looking for a fresh approach to their next project will find this review a useful resource, as it offers broad summaries of state-of-the-art research in this field.

1. Introduction

Autism is a neuro-developmental disorder that affects a person’s ability to communicate, interact socially, and engage in repetitive behaviors or interests [1,2,3,4,5]. It is usually diagnosed in the early years but can also be identified later. Individuals with autism present a wide range of symptoms and abilities, from exceptional strengths in certain areas to difficulties in performing basic daily tasks. The usual indicators of autism include problems in social interaction and communication, delayed language development, repetitive behaviors, and an intense interest in specific topics or objects. As it is a spectrum disorder, the nature and severity of its symptoms may differ significantly between individuals. The exact cause of autism remains unknown. Globally, approximately one in every 100 children is thought to have autism [6,7]. According to the latest reports by the Autism Community in Action, the prevalence of autism has increased from 1 in 150 in 2000 to 1 in 31 in 2022, which can mainly be attributed to enhanced awareness, broader diagnostic criteria, and improved screening tools. ASD has been reported to occur in all racial, ethnic, and socioeconomic groups. Males are affected by ASD approximately three times more often than females [8,9]. The economic burden of autism, estimated over the period 1990 to 2019, amounted to approximately 7 trillion United States dollars. If the prevalence rate continues to rise, the projected cost to society could reach nearly $15 trillion by 2029 [10]. It is important to note that these estimates capture only the costs borne by society and do not account for the additional financial burden borne by families.

1.1. Types, Diagnosis, and Treatment of Autism

According to the DSM-5 (2013) and DSM-5-TR (2022), autism spectrum disorder (ASD) is now recognized as a single, unified diagnosis encompassing several previously distinct conditions [9]. These include autistic disorder (commonly referred to as classic autism), Asperger’s syndrome, childhood disintegrative disorder, and pervasive developmental disorder—not otherwise specified (PDD-NOS) [11,12]. The most severe type of autism is “classic autism”, which is characterized by notable deficits in communication, social relationships, and repetitive behaviors. Individuals with Asperger’s syndrome (AS) usually have average or above-average IQ and typical language development, even though they struggle in social situations and may have intense interests in certain subjects. Childhood disintegrative disorder is a rare condition in which a child develops typically for a minimum of two years and then experiences a substantial loss of language, social, and/or motor skills [13].
Autism is generally diagnosed in early childhood, with symptoms often becoming noticeable by the age of two to three years [14,15]. The disorder is lifelong and can be associated with other conditions such as epilepsy, fragile X syndrome, and tuberous sclerosis [16]. The diagnostic criteria have evolved, with certain subtypes being integrated into the broader classifications of ASD. This diagnosis includes previously distinct categories, such as autistic disorder, AS, and PDD-NOS. The DSM-5 employs a three-tier severity rating system for ASD based on the level of support an individual needs to function daily [17,18]. People with Level 1 ASD struggle with conversations and relationships, have difficulty adapting to new routines, and need assistance in social situations. Those with Level 2 ASD require considerable support in social contexts and experience more pronounced difficulties in communication and social interaction. They may exhibit repetitive behaviors that interfere with daily functioning. Individuals diagnosed with Level 3 ASD require extensive support in social situations, face significant challenges in communication and social interaction, and may display intense repetitive behaviors that hinder their daily activities [19]. These individuals often require constant supervision and assistance from caregivers. Not all individuals with ASD fit precisely into one of these levels because symptom severity can vary significantly among individuals.
There is no known cure for autism, but early diagnosis and intervention can considerably improve outcomes. Treatment modalities may include behavioral therapy, speech therapy, and pharmacological interventions to manage co-occurring conditions such as anxiety or attention deficit hyperactivity disorder (ADHD) [20]. Many individuals with autism can lead fulfilling lives with appropriate assistance and accommodation. The most efficacious treatment plan depends on the patient’s unique requirements and symptoms. Applied behavior analysis (ABA) and speech therapy facilitate the development of social and communication skills and help manage challenging behaviors in children with autism. Pharmacological interventions can address co-occurring conditions, such as anxiety, depression, or ADHD, commonly observed in individuals with autism. Specialized education and training, such as social skills training and occupational therapy, can assist in acquiring skills for daily living and improving overall functioning [19,20]. Furthermore, assistive technology, such as communication devices and sensory integration equipment, can help individuals communicate, manage sensory input, and engage with their environments. Finally, counseling and education of family members can enhance their understanding of autism and provide strategies to support affected family members.

1.2. Robot Assisted Therapy (RAT)

Socially assistive robotics (SARs) is a comparatively young discipline that uses robots to assist individuals through social interaction rather than through physical tasks, such as assembly line work. Children with autism often demonstrate proficiency in grasping concepts related to the physical realm but may struggle with interpreting social dynamics, as noted by Klin et al. [21]. They often show greater receptiveness to feedback when delivered through technology rather than from a human source, as observed by Ozonoff [22]. Robot-assisted therapy (RAT) is a promising translational neuroscience field for children with ASD. According to Robins et al., children with ASD often display greater interest and engagement in treatments that involve electronic or robotic components [23]. The work of Weir and Emanuel (1976) marked the beginning of the application of robotic technology to aid individuals with autism in their daily lives [24]. They employed LOGO, a mobile turtle-like robot that could communicate with patients in a carefully controlled setting. The environment served as a catalyst for developing relationships with the subject and improving communication by developing a shared worldview.
Recent research suggests that robots can enhance social skills, communication, and emotional regulation in individuals with autism. Nevertheless, it is crucial to recognize that robotic systems should not be considered a substitute for human contact and therapeutic interventions but rather as a supplement to existing treatments and interventions. They can mediate interactions and provide additional avenues for engagement and learning [25,26]. They can also interact with individuals with autism in a controlled and predictable manner, thereby providing opportunities to practice social skills in low-pressure environments. This predictability helps create a safe and controlled environment for therapy [27,28]. Additionally, these robots can help individuals recognize and respond to their emotions, manage sensory overload, communicate with others, and express their needs. They are equipped with AI and sensors that enable them to identify and respond to human emotions and social cues. This capability is crucial for engaging children in meaningful interactions [25,27,29]. The incorporation of features such as expressive faces, moving limbs, and verbal communication functionalities can help capture and maintain attention, improve facial expression skills, and engage children more intensively than human therapists [29,30]. Robot-based interventions mostly involve proven techniques such as social stories, chaining, modeling, and incorporating special interests, as cited in Diehl et al.’s 2012 study [31]. Encouraging physical interaction with robots can increase motivation and engagement. Although it does not significantly influence performance, it is an essential mode of communication and can enhance the therapeutic experience [32].

1.3. Existing Review

A systematic review by [33] investigated the utilization of SARs in therapies for children with ASD, specifically focusing on early childhood (0–6 years). Only nine studies were included in the review, and no studies from 2024–25 were included. The review in [26] provides an account of commercial social robots used with children with ASD without considering interventions. The scoping review by [34] focused only on the use of AI in robots for ASD, and [17] reviewed diagnostic and assessment applications in the context of South Korea only. Finally, the review in [31] spans only 15 research articles and focuses on their clinical applications. Our review spans more than a hundred articles and includes studies published up to 2025. The multidimensional analysis offered in this study has not been reported previously.
This literature review aims to advance our understanding of early childhood therapies for children with autism and draw attention to the promising potential of robotics in managing this complex and diverse condition. As our knowledge and use of these treatments grow, we intend to paint a more comprehensive depiction of the lives of children and how their development can be enhanced by the successful integration of robots into their care. We also investigated the diversity of therapeutic robots available for therapy. Additionally, from customized interventions and cutting-edge machine learning applications to ethical considerations in the design and implementation of social and emotional robots, these collaborative efforts show the commitment of the scientific community to address different facets of using robots to enhance the lives of children with ASD.

2. Materials and Methods

Research on RAT for children with ASD has been published in journals and conference proceedings in various fields, including rehabilitation, psychology, engineering, computer science, medical physics, and biomedical engineering. This section illustrates the filtering and article selection criteria and the process used to identify the relevant articles.

2.1. Research Questions

The following research topics, which center on both technological developments and real-world applications of RAT, were designed to direct the literature review and analysis:
  • RQ1: What social robots have been employed for RAT in children with ASD, and how do they differ in terms of functionality, performance, and cost?
  • RQ2: How are social robots integrated into therapeutic interventions for individuals with ASD, and which deployment strategies are the most effective?
  • RQ3: Which developmental areas, communication modalities, and target behaviors are primarily targeted by robot-assisted interventions for ASD?

2.2. Search Approach

Our literature search spanned three primary databases—Scopus, IEEE Xplore, and Google Scholar—covering studies published from 2005 to January 2025. To ensure comprehensive retrieval of relevant articles, we employed a structured set of Boolean search strings combining key terms and synonyms related to autism and robotic therapy. Variations in methodological terminology were accommodated using flexible search strings. By merging synonyms for the keywords “humanoid,” “social,” “robots,” “autism,” and “assistive,” the following comprehensive search strings were constructed:
  • S1—(“humanoid robots” OR “social robots”) AND “autism treatment”
  • S2—“Robot-assisted therapy” AND “autism spectrum disorder”
  • S3—“Assistive technology” AND “social skills” AND “autism”
  • S4—“Socially assistive robots” AND “autism treatment”
  • S5—“Robots” AND “autism treatment”
These terms were selected to capture studies at the intersection of robotics, therapeutic intervention, and autism treatment, regardless of the specific robot models or intervention settings. Terminology variations were considered to enhance the sensitivity and inclusivity.
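As an illustrative aside, the five search strings above can be assembled programmatically. The year-filter syntax below is a generic placeholder: Scopus, IEEE Xplore, and Google Scholar each use their own query fields, so a real pipeline would adapt the output per database.

```python
# Illustrative sketch: assembling the Boolean search strings S1-S5
# used in the database queries. The PUBYEAR filter syntax is a generic
# placeholder, not the exact field code of any one database.

SEARCH_STRINGS = {
    "S1": '("humanoid robots" OR "social robots") AND "autism treatment"',
    "S2": '"Robot-assisted therapy" AND "autism spectrum disorder"',
    "S3": '"Assistive technology" AND "social skills" AND "autism"',
    "S4": '"Socially assistive robots" AND "autism treatment"',
    "S5": '"Robots" AND "autism treatment"',
}

def build_query(string_id: str, start_year: int = 2005, end_year: int = 2025) -> str:
    """Combine one search string with the review's publication window."""
    terms = SEARCH_STRINGS[string_id]
    return f"{terms} AND PUBYEAR > {start_year - 1} AND PUBYEAR < {end_year + 1}"

for sid in SEARCH_STRINGS:
    print(sid, "->", build_query(sid))
```

Keeping the strings in one structure makes it straightforward to rerun the identical queries against each database and log which string retrieved which records.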

2.3. Inclusion and Exclusion Criteria

To ensure a rigorous and focused review, clear inclusion and exclusion criteria were established and consistently applied throughout the article-selection process. The inclusion criteria were designed to ensure the relevance, quality, and applicability of studies focusing specifically on RAT for children with ASD, while the exclusion criteria aimed to eliminate studies lacking empirical evidence or not meeting peer-reviewed standards. These criteria reflect our intent to include only studies that contribute meaningfully to understanding the technological and therapeutic dimensions of social robots in ASD intervention. We believe that this added clarity strengthens the transparency and rigor of our review.
Studies were included if they met the following criteria:
  • I1: Original research articles published in English.
  • I2: Explicitly focused on the use of robots in therapeutic interventions for children with ASD.
  • I3: Involved robot-assisted therapy targeting developmental or behavioral outcomes relevant to ASD.
To maintain the specificity and quality of the review, studies were excluded based on the following criteria:
  • E1: Did not involve either ASD or robotic applications.
  • E2: Non-peer-reviewed sources, such as preprints, theses, and technical reports.
  • E3: Review articles, editorials, commentaries, and other secondary sources.
Furthermore, when duplicate studies were identified from the same research group using the same methodology, preference was given to the article that provided the most comprehensive information. For example, conference articles were eliminated if the authors employed the same methodology for both journals and conference articles. The term “children with autism” in our review generally referred to individuals aged 3–14 years, as reflected in most of the included studies. These criteria ensured that the selected literature was relevant and robust, facilitating a high-quality synthesis.
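A minimal sketch of how the screening criteria above could be applied mechanically to candidate records is shown below. The record fields, example entries, and the reduction of criteria I2/I3 and E1 to boolean flags are all hypothetical simplifications for illustration; the actual screening was performed manually.

```python
# Hypothetical sketch of the screening step: applying the inclusion
# (I1-I3) and exclusion (E2-E3) criteria to candidate records.
# E1 (no ASD/robot involvement) is folded into the I2 flag here.

def passes_screening(record: dict) -> bool:
    included = (
        record["language"] == "English"    # I1: published in English
        and record["focus_asd_robots"]     # I2: robots in ASD interventions
        and record["targets_outcomes"]     # I3: developmental/behavioral outcomes
    )
    excluded = (
        not record["peer_reviewed"]        # E2: preprints, theses, reports
        or record["type"] in {"review", "editorial", "commentary"}  # E3
    )
    return included and not excluded

candidates = [
    {"language": "English", "focus_asd_robots": True, "targets_outcomes": True,
     "peer_reviewed": True, "type": "journal"},
    {"language": "English", "focus_asd_robots": True, "targets_outcomes": True,
     "peer_reviewed": False, "type": "preprint"},   # fails E2
]
print([passes_screening(r) for r in candidates])    # -> [True, False]
```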

2.4. PRISMA Flowchart

To enhance transparency and methodological rigor, the study selection process was documented using the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) framework. A total of 591 studies were retrieved from the databases, and after removing duplicates and screening titles and abstracts for relevance, 201 full-text articles were assessed for eligibility. After applying the inclusion and exclusion criteria, 107 studies were included in the final analysis of this review. A PRISMA flow diagram summarizing the selection process is presented in Figure 1.
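The screening funnel reported in the PRISMA diagram reduces to simple arithmetic over the counts stated above:

```python
# PRISMA funnel counts as reported in Section 2.4 / Figure 1.
retrieved = 591           # records retrieved from all three databases
full_text_assessed = 201  # after duplicate removal and title/abstract screening
included = 107            # after applying inclusion/exclusion criteria

removed_before_full_text = retrieved - full_text_assessed
excluded_at_eligibility = full_text_assessed - included
print(removed_before_full_text, excluded_at_eligibility)  # -> 390 94
```

That is, 390 records were removed as duplicates or at title/abstract screening, and a further 94 full-text articles were excluded against the eligibility criteria.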

2.5. Data Extraction and Synthesis

Data extraction was performed using a structured Excel spreadsheet developed to capture the key attributes of each eligible study. For each included article, the following details were recorded: title, publication year, journal, authors, keywords, robot name, robot shape, participant type, number of participants, targeted developmental area, communication methodology, intervention setup, target behaviors (e.g., eye contact, imitation, emotion recognition), and documented challenges or limitations. The extracted data were independently reviewed for accuracy and consistency. Any discrepancies were resolved through discussions among the authors. The information was then synthesized using a narrative approach, categorizing studies according to robot type (e.g., humanoid, toy-like, animal-like), developmental focus (e.g., cognitive, sensory), and behavioral outcomes targeted in the interventions. This synthesis allowed a cross-comparative analysis of robotic platforms, therapeutic integration strategies, and efficacy across different participant profiles and clinical settings. Patterns and trends in robot usage, communication modes, and targeted skills were highlighted to provide a multidimensional perspective of the field. Tables and figures were created to visualize the distribution of robot types, intervention domains, and target behaviors, thereby ensuring a transparent and replicable review process.
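The categorization step of this synthesis can be illustrated in a few lines of Python. The three records below are invented placeholders standing in for rows of the extraction spreadsheet, not data from the review.

```python
from collections import Counter

# Illustrative sketch of the narrative-synthesis step: tallying the
# extracted study records by robot type and developmental focus.
# The records are hypothetical examples, not actual extracted data.

records = [
    {"robot": "NAO", "shape": "humanoid", "focus": "social",
     "behavior": "joint attention"},
    {"robot": "KASPAR", "shape": "humanoid", "focus": "social",
     "behavior": "imitation"},
    {"robot": "AIBO", "shape": "animal-like", "focus": "sensory",
     "behavior": "engagement"},
]

by_shape = Counter(r["shape"] for r in records)
by_focus = Counter(r["focus"] for r in records)

print(by_shape)  # -> Counter({'humanoid': 2, 'animal-like': 1})
print(by_focus)  # -> Counter({'social': 2, 'sensory': 1})
```

Tallies of this kind feed directly into the distribution tables and figures mentioned above.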

2.6. Quality Assessment

To ensure the reliability and validity of the findings, a risk-of-bias assessment was performed for all included studies using the Joanna Briggs Institute (JBI) critical appraisal tools, with the appropriate checklist selected according to study type (e.g., randomized controlled trials, cohort studies, qualitative research). This approach ensured methodological rigor and allowed comparability across the diverse studies included in the review. Studies with significant limitations, such as poorly defined interventions or insufficient participant details, were noted and interpreted cautiously in the synthesis. This quality appraisal provides a framework for interpreting the strength and generalizability of the evidence in the reviewed literature.

2.7. Journal and Conference Publications

A thorough selection process identified 107 articles for in-depth examination, comprising 55 journal articles, 40 conference proceedings, and 2 book chapters. This curated collection encompasses a wide range of scholarly works that fulfill the specific requirements described above.

2.8. Publication Map

Figure 2 provides a general picture of the increasing use of robots in treating autism over the years, showing a clear trend toward more publications on this topic, particularly in recent years. Only a few publications appeared each year from 2005 to 2012, indicating the initial stages of research on the deployment of robots in autism therapy. During this period, the focus was on exploratory studies, initial trials, and establishing foundational concepts for robot use in therapeutic settings. There was a noticeable increase in the number of publications in 2013, although only one publication appeared in 2015. By 2017 and 2018, the number of publications had increased to seven per year, suggesting growing interest in validating robotic interventions as a feasible method for treating autism. During this phase, research expanded to cover distinct types of robots, target behaviors, and therapeutic outcomes, driven by initial positive results and technological advancements. This trend was maintained during the COVID period, and the most significant growth was observed from 2019 onwards, with seven or more publications annually, peaking at 12 publications in 2023. This trend reflects a strong interest in robot-assisted therapy for autism owing to advancements in robotics, artificial intelligence, and machine learning, which have enabled the development of modern and adaptable therapeutic robots for autism. The surge in recent publications indicates a shift from exploratory to applied research, including clinical trials and real-world applications in educational and therapeutic settings.
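The trend described above can be summarized numerically. The per-year counts below are partial and taken only from the figures quoted in the text (one in 2015, seven in 2017 and 2018, a peak of 12 in 2023), so they are an illustrative subset rather than a full reconstruction of Figure 2.

```python
# Partial publication counts quoted in the text (Section 2.8).
pubs_per_year = {2015: 1, 2017: 7, 2018: 7, 2023: 12}

peak_year = max(pubs_per_year, key=pubs_per_year.get)
print(f"Peak: {pubs_per_year[peak_year]} publications in {peak_year}")
# -> Peak: 12 publications in 2023
```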

2.9. Frequency Analysis of the Keywords

The word cloud in Figure 3 provides a visual representation of the major research trends in robot-assisted therapy for ASD, focusing on social engagement, autonomy, and assistive technologies. The prominence of terms related to emotion recognition, attention, and communication suggests a strong interdisciplinary approach that integrates AI, psychology, and human–robot interaction to improve ASD therapy outcomes.
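A keyword-frequency tally of the kind underlying the word cloud can be sketched as follows. The sample keywords are invented; the actual analysis drew on the keyword fields extracted from all 107 included studies.

```python
from collections import Counter

# Minimal sketch of the keyword-frequency analysis behind Figure 3.
# The keyword list is a small invented sample for illustration only.

keywords = [
    "social robot", "autism", "human-robot interaction", "autism",
    "emotion recognition", "joint attention", "social robot", "autism",
]

freq = Counter(keywords)
for term, count in freq.most_common(3):
    print(term, count)
```

Word-cloud generators simply map each term's count to its rendered font size, so the prominence of "autism" or "social robot" in Figure 3 reflects exactly this kind of frequency table.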

3. Results

3.1. Social Robots for Treating ASD

A robot designed to communicate with people and other robots is known as a social robot (SR). Socially assistive robots (SARs) are robotic devices designed to accompany, assist, and help people in a variety of settings, particularly those who might require extra company or exceptional care. They are designed to interact and communicate with humans socially, often mimicking human behavior and emotions [35]. SARs come in a range of sizes and levels of complexity, from tiny basic gadgets to more sophisticated robots with sensors and artificial intelligence that enable them to adjust to the requirements and preferences of the people they assist. They are increasingly being used in various settings, including healthcare, education, and therapy, to support and enhance human communication and interaction. They are also used in therapy to assist people with conditions such as autism, depression, and anxiety. These robots can be programmed to recognize and respond to emotions, thereby providing a safe and non-judgmental space for individuals to express their feelings and regulate their emotions. They are also used in educational settings to support learning and engage students. For example, a social robot can be programmed to provide one-on-one tutoring or assistance during classroom activities. They are also extensively used in healthcare settings to provide emotional support and companionship to patients. The morphology, degree of autonomy, and interaction modalities of robots employed in therapies for children diagnosed with ASD vary, reflecting different design philosophies. This review did not analyze every robot used to treat and educate individuals with ASD; we mostly included robots designed to assist people with ASD that have been validated in clinical settings, research, and/or practice.

3.2. Humanoid Robots

The term “social humanoid robots” (SHRs) refers to social robots that behave and appear like humans. They can serve as valuable complements to the work performed by therapists, teachers, caregivers, and family members. SHRs with anthropomorphic features can mimic human social behaviors, such as speech, gestures, and facial expressions. These robots have demonstrated promising results in educational and therapeutic settings after clinical validation in activities such as motor imitation, emotion identification, and social skills training.

3.2.1. Zeno

Zeno is a humanoid robot designed for use in educational and healthcare settings [36]. It can recognize and respond to emotions and has been used to help children with autism develop social and emotional skills. Developed by RoboKind (Dallas, TX, USA), it stands only 17 inches tall but is highly expressive, making it well suited to social and emotional interaction research. It is equipped with synthetic skin to provide natural facial expressions, and its motorized face produces realistic eye movements, mouth synchronization, and emotion displays. The ZENO robot (Hanson Robotics, Hong Kong, China) has been used in interventions to establish its efficacy as a mediator of behavioral interventions for children with ASD. It has been reported to enhance facial emotion recognition skills in children with ASD relative to typically developing children, and it accurately conveys six basic emotions [37]. An ABA-based autonomous therapy system using a Zeno R-50 robot successfully detected reinforcers, and interventions indicated improvements in the therapy protocol [38]. Another study described control algorithms for human–robot interaction by mimicking the behaviors of the Zeno robot and humans. The study demonstrated that simple actuators, sensors, and control schemes can generate smooth and responsive robot trajectories, potentially aiding in clinical treatment and diagnosis [39].

3.2.2. Furhat

Furhat is a social robot designed for use in customer service and educational environments. It has a human-like face and can interact with users through natural language processing, making it useful for engaging and entertaining customers and students. Integration with OpenAI’s GPT-3 language model enables open, chat-style interactions. For example, a pilot project in Sweden used Furhat to help children with autism practice social communication skills, such as maintaining eye contact and taking turns in conversation. The children interacted with Furhat in a game-like setting designed to be engaging and fun while also encouraging social interaction. Other reports have investigated the potential of Furhat for teaching social skills to children with autism and providing social support to adults. Although these studies are still in their initial stages, they suggest that social robots such as Furhat may play a role in supporting individuals with autism and helping them develop important social and communication skills. The primary drawbacks of the Furhat robot are its high price and complex operation; specialized knowledge is required to program and maintain such systems.

3.2.3. Milo

Milo is a humanoid social robot developed by RoboKind, a robotics company based in Texas, United States. Milo is designed to help individuals with ASD improve their social and emotional skills [40]. It uses a combination of facial recognition, natural language processing, and social behavior modeling to engage individuals on the autism spectrum in a manner that is engaging, interactive, and conducive to learning. Milo has a human-like appearance and a friendly, approachable demeanor. It can display a wide range of facial expressions, gestures, and body movements, which helps individuals with ASD learn and practice important social skills, such as making eye contact, understanding emotions, and following social cues. Milo’s interactions are structured and customizable, allowing for personalized lessons that cater to unique needs and abilities.
The Milo robot is designed to engage children in both educational and therapeutic ways, focusing on areas where children with ASD typically struggle, such as joint attention, imitation, and communication skills [41,42,43]. Current research has some methodological limitations, making it difficult to draw firm conclusions about the long-term efficacy of robots such as Milo in autism [31,44].

3.2.4. Pepper

Pepper is a humanoid robot that was initially launched in June 2014 to cater to business-to-business (B2B) applications [45]. Over time, its capabilities have been refined and adapted to meet business-to-consumer (B2C) needs, expanding its role in various customer-facing environments, such as retail, hospitality, and healthcare. One distinguishing feature of Pepper is its ability to exhibit natural body language, which enhances its human-like presence in interactions. The robot is equipped with advanced perception systems that allow it to recognize and respond to environmental stimuli, detect human presence, and navigate autonomously in settings such as hospitals. Its multimodal interaction capabilities enable it to engage with users through speech, gestures, and movements, thereby creating seamless and intuitive communication. Pepper is integrated with innovative voice and emotion recognition technology and employs proprietary algorithms to analyze human facial expressions and voice tones. This allows it to dynamically adapt its responses and foster more engaging and personalized interactions with users. These advanced recognition systems enable Pepper to understand context, identify emotions, and react accordingly, making it an asset in industries that focus on customer engagement and automation. In addition to its sensor-equipped design, Pepper is built with high-level interfaces for multimodal communication, ensuring effective human–robot collaboration. Its ability to interpret emotions, understand spoken language, and respond in a socially intelligent manner makes it one of the most widely used humanoid robots for interactive and service-oriented applications.
Research on the use of Pepper robots for treating ASD has shown promising results in various therapeutic settings and has been used in multiple studies to enhance social skills, emotional recognition, and engagement in children with ASD. It has been used in intervention protocols to address emotional deficits using additional devices, such as the Empatica E4 bracelet, to collect physiological data, which helps in understanding the emotional states of children during therapy sessions [46]. Experimental trials have utilized Pepper robots in therapeutic laboratories to promote functional acquisition in children diagnosed with high-functioning autism or Asperger’s syndrome [47]. Benedicto [48] highlighted the importance of nonverbal communication in enhancing human–robot interactions for effective therapy. They can be programmed to improve joint attention, imitation, and social communication [49].

3.2.5. KASPAR

KASPAR (Kinesics and Synchronization in Personal Assistive Robotics) is a humanoid robot developed by the Adaptive Systems Research Group at the University of Hertfordshire in the UK [50]. It is specifically designed to assist children with ASD in developing social and communication skills in controlled and engaging environments [51]. It features a child-sized humanoid form with simplified facial expressions to reduce sensory overload, making interactions more predictable and comfortable for children with autism. The robot uses gestures, facial expressions, and speech to encourage social engagement, teach emotional recognition, and support activities such as turn-taking and nonverbal communication. The customizable interactions of KASPAR allow therapists, teachers, and caregivers to use it as an educational and therapeutic tool for children with autism. Studies have shown that children with autism often find it easier to interact with robots than with humans because robots provide consistent and non-judgmental responses to their queries. KASPAR is used in schools, therapy centers, and research institutions worldwide to support children with special needs and to improve their social interaction abilities.
The KASPAR robot has been used in multiple studies to determine its efficacy in facilitating behavioral interventions in children with ASD. It was intentionally designed as a communication tool focused on repetitive forms of communication, aiming to simplify social interaction and make it more comfortable for children [50,52]. It has also been successfully deployed to engage children through various games intended to help them view their surroundings from the robot’s viewpoint and improve their visual perspective-taking (VPT) [53,54,55]. Interventions utilizing it help enhance communication, psychomotor function, social skills, and imitation [50]. Touch sensors that distinguish between gentle and harsh touches have been used to increase body awareness [56]. Although the children found their robotic partners more interesting and entertaining than their human partners, they appeared to collaborate and solve problems more effectively with their human partners. Other reviews of its effectiveness have also underscored the potential of this robot for intervention in children with ASD [23,55].

3.2.6. NAO

The NAO is a humanoid robot developed by SoftBank Robotics (formerly Aldebaran Robotics). It is equipped with an inertial measurement unit that helps maintain stability, and two ultrasonic sensors detect and avoid obstacles, enabling precise movement. Pressure sensors regulate the center of pressure on each foot, balancing NAO’s weight. It can speak, listen, and perform spatial sound localization owing to its speakers and voice-recognition system, and two high-definition CMOS cameras provide forward vision. NAO is widely used in education, research, healthcare, and social robotics because of its advanced AI capabilities, human-like interaction, and programmability. The robot has undergone multiple upgrades, with NAO V6 (the latest version) featuring improved AI, battery life, and vision-processing capabilities [57]. The NAO robot, together with an RGB-D sensor-based perception system, has been used to capture social engagement cues by comparing the behavioral and exploratory patterns of children with ASD to those of children with typical development (TD) in a four-dimensional space (3D spatial + time) during a joint attention (JA) task while interacting with either a human or a robot. NAO is designed with features helpful for therapy, promoting joint attention and supporting the initiation of interactions [58]. Its deployment with game-based approaches also enhances social interactions, with the robot acting as a teacher, toy, or peer [59]. It has proven effective in improving verbal communication and social interaction by facilitating two-way communication, which significantly affects children’s responses, engagement, and interaction [60,61,62]. It also reduces anxiety during social interactions by providing a structured and predictable engagement model [63].

3.2.7. QTrobot

The QTrobot is an interactive humanoid robot designed by LuxAI, specifically developed to support children with ASD by improving their social, emotional, and communication skills. It is widely used in therapy centers, schools, and research institutions to facilitate structured and engaging interactions with children with developmental challenges. It features a digital screen that displays various facial expressions, helping children recognize and interpret emotions in a controlled setting. It uses a combination of speech, gestures, and visual cues to engage children, making interactions more predictable and less overwhelming than human interactions. QTrobot is effective in providing emotional training for children with ASD. It also has a positive impact on children’s attention, imitation skills, and the presence of repetitive and stereotyped behaviors [64].

3.2.8. FACE

The Facial Automation for Conveying Emotions (FACE) robot is a humanoid robotic head developed for human–robot interaction (HRI) research, emotion recognition, and autism therapy [65]. It was designed to replicate human facial expressions using artificial muscles and advanced AI algorithms. The artificial skin, a thin silicone mask with an actuation and sensing system, covers the artificial skull that forms the head of FACE. It is created using life-casting processes and visually resembles a real subject’s head in structure and texture. Using servomotors and an artificial muscle architecture, FACE can repeatedly and flexibly express and modulate six primary emotions: happiness, sorrow, surprise, anger, disgust, and fear. It is also equipped with cameras for vision-based interaction (detecting facial expressions and eye contact), speech synthesis and recognition for verbal communication, and head movement and gaze tracking to enhance natural interaction. It can be integrated with other assistive technologies, such as EEG, AR, and VR. Because it is designed to express and convey emotions, the FACE robot has been explored as a therapeutic tool for individuals with ASD.
It has been used in a few clinical trials to demonstrate its effectiveness as a mediator of behavioral interventions [66]. The robot’s adaptive therapeutic platform integrates data from wearable and ambient sensors to harmonize its expressions and movements with the inferred emotional state of the subject [67]. It improved imitative skills and shared attention in a small group of children with ASD [68], a finding confirmed by other studies [69,70]. These initial results have led researchers to propose that FACE treatment may help children with ASD develop pragmatic and emotional responsiveness skills. Children with ASD tend to attend more to the face of a robot than to that of a human conversational partner, suggesting that the FACE robot can serve as an effective training partner for face-to-face communication, with these experiences potentially transferring to human interactions [71].

3.3. Animal Robots

Some studies support the employment of non-humanoid or ambiguously shaped robots, contending that children with ASD can benefit from reduced sensory overstimulation and sustained attention due to lower visual complexity.

3.3.1. Robo Parrot

Robo Parrot is a robotic bird designed to mimic the appearance, movements, and sometimes even the vocal abilities of real parrots [72]. Two motors control the head and whole-body movements, while additional motors move the neck, legs, and wings. A head motor controls Robo Parrot’s beak and eyes, allowing the robot to open and close its wings, beak, and eyelids; it can also shift its neck to the left or right and move its body in three different directions. Thanks to several sensors, including a microphone, an infrared sensor, and a Hall effect sensor, Robo Parrot can sense when a hand is near its head or beak. These robotic companions serve various purposes, ranging from entertainment and education to advanced applications in surveillance and artificial intelligence-driven interaction. Many Robo Parrots are interactive toys that respond to touch, sound, or voice commands, engaging children and pet lovers alike. The system reviewed here was designed to facilitate communication and social interaction in children with ASD.

3.3.2. Probo

Probo is a social and emotional robot developed at the Vrije Universiteit Brussel (VUB) for human–robot interaction (HRI), with a special focus on children and on therapeutic and educational purposes [73]. It was designed to simulate human-like emotions and social behaviors to support children undergoing hospital treatment or experiencing other stressful conditions. Probo has a plush, animal-like appearance that is intentionally non-threatening and friendly to children. Because Probo’s head is fully actuated and can display facial expressions, it can portray emotions during a conversation; its face has 20 degrees of freedom (DoF). The robot in the study by Pop et al. [73] was always operated by a human in a Wizard-of-Oz-style setup, enabling an immediate response to participants’ unexpected behaviors and reactions. In human–computer interaction, a Wizard-of-Oz experiment is one in which participants interact with a computer system that they perceive to be autonomous but that is entirely or partially operated by humans.
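The Wizard-of-Oz paradigm can be illustrated with a minimal sketch. All names and behaviors below are hypothetical and do not represent the actual Probo software: a hidden operator selects from pre-authored robot behaviors, while the participant experiences the robot as autonomous.

```python
# Minimal Wizard-of-Oz control loop (illustrative sketch only; the
# Robot class and behavior menu are hypothetical, not the Probo stack).

BEHAVIORS = {
    "1": ("greet", "Hello! Nice to see you."),
    "2": ("happy", "That makes me happy!"),
    "3": ("comfort", "It's okay, take your time."),
}

class Robot:
    """Stand-in for a robot API: returns what a real robot would express."""
    def express(self, emotion: str, utterance: str) -> str:
        return f"[robot shows '{emotion}' face] {utterance}"

def wizard_step(robot: Robot, operator_choice: str) -> str:
    """The hidden operator picks a key; unknown keys fall back to neutral."""
    emotion, utterance = BEHAVIORS.get(operator_choice, ("neutral", "..."))
    return robot.express(emotion, utterance)

# The operator reacts to the child's unexpected behavior by choosing "3".
print(wizard_step(Robot(), "3"))
```

The key design point is that the mapping from operator input to robot behavior is fixed and pre-authored, which keeps the robot's responses consistent even though a human is in the loop.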

3.3.3. JARI

JARI is a social robot explicitly designed to support emotional and social interaction in children aged 6–8 with ASD [74]. Its design intentionally avoids resembling specific animals or humanoid figures, sidestepping uncanny-valley discomfort and reducing sensory overload. It uses a mix of tactile fabrics and gentle LED color cues tailored to engage users without overstimulation. It consists of mechanical and electronic modules built around a Raspberry Pi–Arduino system and features movement, light, sound, and neck articulation, all controlled through a web-based Wizard-of-Oz interface.

3.3.4. Kiwi

Kiwi is a tabletop robot tutor and companion [75]. It has an owl-like “skin” and is intended to be straightforward, gender-neutral, and non-threatening. An animated face with two eyes, eyebrows, and a mouth is displayed on a smartphone mounted at the front of the head. Visemes move the mouth, and the Facial Action Coding System (FACS) serves as the basis for the affective facial expressions. Simple bending and squash-and-stretch body gestures accompany Kiwi’s speech and facial expressions to make communication seem natural. The Co-Robot Dialogue system (CoRDial), a software stack that controls the robot’s speech, movement controllers, and face animations, operates the robot as in other SPRITE deployments.

3.3.5. iCat

The iCat is a cat-like research robot designed to stimulate human–robot interaction studies, including focus group discussions. It can perform multiple interactive tasks simultaneously, including speaking to users, tracking their head movements, lip-syncing, blinking, and displaying facial expressions [76]. The robot renders facial expressions mechanically, making them relatable and engaging for users [37]. Philips Research has made the iCat platform available to universities and research laboratories to stimulate research in Human–Robot Interaction (HRI) [76,77,78].

3.3.6. Paro

Paro is a therapeutic robotic baby harp seal used in nursing homes and hospitals. The robot is designed to react to its name and to cry out for attention, and it has an off switch [79]. The PARO seal robot was used as a social mediator in groups of children with neurodevelopmental difficulties in a study by Veronasi et al. [80]. Statistical analysis indicated a positive association between the “interaction” dimension and the PARO seal among individuals with autism who did not have intellectual impairment. These findings demonstrate how the PARO robot can help children with autism without intellectual impairments develop their social and communication abilities [80].

3.4. Toy Robots

By presenting a friendly organic shape that does not evoke any particular animal, non-recognizable zoomorphic designs seek to balance approachability against the triggering of prior associations. Researchers have highlighted the significance of matching a robot’s realism to the therapeutic objectives: less realistic designs produce a more controlled and less frightening environment, whereas more realistic robots may help children extrapolate learned behaviors to human contexts [81]. Therefore, the intended instructional or therapeutic goals, usage context, and individual needs should be considered when choosing between humanoid and nonhumanoid forms. Their capabilities can be enhanced by equipping them with artificial intelligence and sensors that enable them to identify and respond to human emotions and social cues. This capability is crucial for engaging children with autism in meaningful interactions.

3.4.1. Robosapien

Robosapien is a biomorphic robot designed by Mark Tilden and manufactured by WowWee Toys. It is preprogrammed and controlled via an IR remote control with 21 distinct keys, through which approximately 67 distinct robot functions can be executed. These include walking on its feet, holding objects with both hands, producing preprogrammed sound effects through a small loudspeaker, and lighting the two light-emitting diodes (LEDs) that form the eyes [81].

3.4.2. DREAM

The DREAM project developed a supervised autonomous system called robot-enhanced therapy (RET) to implement interventions for children with autism [82]. The system follows a modular approach and can be easily used and adapted by other research groups in the future. Studies have shown that RET is a promising approach that may be as efficient as classical interventions for many outcomes in children with ASDs. A simplified version of the DREAM system was implemented as an Ask NAO Tablet application for wider use. The applications will be evaluated in children with ASDs, and therapist feedback will be used to update them.

3.4.3. Lego NXT

The LEGO NXT Platform is a programmable robotics system developed as part of the LEGO Mindstorms NXT. It does not have a single fixed shape but is a modular and customizable robotic system that allows users to build different robot configurations. This robot is used as an educational intervention to improve social interactions in children with ASD by incorporating socially assistive robotics (SAR) and direct instruction [83].

3.5. Sensory Integration Techniques Used in Social Robots for Treating Autism

3.5.1. Multisensory Processing

Social robots often use audio-visual cues to enhance their social interactions. These cues are simplified versions of human interactions, making it easier for children with autism to process and respond to them. This approach helps improve engagement and reduce social anxiety by providing a controlled and predictable environment [27,84]. They are also equipped with emotion-based gestures and facial expressions to facilitate emotional recognition and social-behavioral learning. This multisensory approach includes visual and auditory feedback to reinforce positive social interactions [85,86].

3.5.2. Personalized Interventions

Advanced AI and machine learning algorithms enable robots to adapt to the individual needs of each child, thereby providing personalized therapeutic interventions for children with autism. This customization helps address specific sensory needs and improves the efficacy of therapy [27]. Vision- and audio-based monitoring systems are used to quantitatively measure interactions, allowing for precise adjustments to therapy plans based on a child’s progress [85,86].
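The kind of adaptive logic described above can be sketched in a few lines. The rule below is a toy example, not an algorithm from any cited study: it raises or lowers task difficulty based on the child’s recent success rate.

```python
def adapt_difficulty(level: int, recent_successes: list[bool],
                     min_level: int = 1, max_level: int = 5) -> int:
    """Toy adaptive rule (illustrative only): raise difficulty when the
    child succeeds on most recent trials, lower it when they struggle."""
    if not recent_successes:
        return level  # no data yet, keep the current level
    rate = sum(recent_successes) / len(recent_successes)
    if rate >= 0.8:
        return min(level + 1, max_level)  # doing well: step up
    if rate <= 0.4:
        return max(level - 1, min_level)  # struggling: step down
    return level  # otherwise hold steady

# Five successes in a row at level 2 would step the child up to level 3.
print(adapt_difficulty(2, [True, True, True, True, True]))  # -> 3
```

Real systems would of course draw the success signal from vision- or audio-based monitoring rather than a hand-entered list, but the structure, measure, compare to thresholds, adjust, is the same.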

3.5.3. Interactive and Engaging Environments

Some interventions incorporate augmented reality (AR) technology to create multimodal and interactive environments. This approach aims to enhance sensory integration by providing a rich and engaging platform for children to practice their social skills [87]. Some robots are designed to provide consistent and repeatable stimuli, which are crucial for children with ASD, who often benefit from predictable and structured environments [88].

3.5.4. Data Collection and Analysis

Robots collect detailed data on children’s responses and progress, facilitating more accurate assessments and enabling therapists to make informed adjustments to therapy [27,89].

3.6. Comparison of Physical Features of Social Robots

Table 1 provides a comparative overview of various humanoid and socially interactive robots, highlighting their physical attributes, processing capabilities, sensors, and mobility. The robots listed vary in size, weight, and application area, with some designed for social interaction, education, and therapeutic purposes. Their weights range from 0.8 kg to 28 kg and their heights from 16 cm to 167 cm. The processor column indicates whether a robot is powered by an Intel, ARM, or other processor, which affects its computational performance. Intel processors implement the x86 CISC (Complex Instruction Set Computer) architecture developed by Intel Corporation and are known for high performance; they are widely used in desktop computers as well as in some popular robots, including Pepper and Furhat.
ARM (Advanced RISC Machine) is a processor architecture based on RISC (Reduced Instruction Set Computing) principles. It is known for its energy efficiency and is widely used in robots such as Jibo, Kuri, and Temi. The Degrees of Freedom (DoF) column refers to the number of independent movements that define a robot’s ability to move in 3D space. Each DoF represents one independent motion, such as moving up-down, left-right, or forward-backward, or rotating around an axis. DoF varies from 3 (Furhat) to 83 (Sophia), with many robots falling in a moderate range of around 20 (NAO). The sensor suite of each robot is also detailed, including cameras, microphones, touch sensors, and depth sensors, which enable the robots to perceive and interact with their environments.
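To make the DoF comparison concrete, the small sketch below identifies the extremes of the reported range. The Furhat and Sophia figures come from the text above; the NAO value is an approximation included for illustration only.

```python
# Illustrative DoF comparison. Furhat (3) and Sophia (83) are the
# extremes cited in the text; the NAO figure is approximate.
ROBOT_DOF = {"Furhat": 3, "NAO": 25, "Sophia": 83}

def dof_extremes(dof_table: dict[str, int]) -> tuple[str, str]:
    """Return the robots with the fewest and the most degrees of freedom."""
    lowest = min(dof_table, key=dof_table.get)
    highest = max(dof_table, key=dof_table.get)
    return lowest, highest

print(dof_extremes(ROBOT_DOF))  # -> ('Furhat', 'Sophia')
```

In practice each DoF corresponds to one controllable joint axis, so a higher count generally means richer gestures and facial expressions at the cost of mechanical and control complexity.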

3.7. Analysis by Participant Types

A total of 1819 participants were involved across all the studies included in this review. The participants spanned a range of demographics and conditions, highlighting the diverse applications of robotics in autism research. Most of the reviewed studies recruited participants specifically for clinical validation, whereas fewer than 5% utilized pre-existing datasets developed by other researchers (Figure 4). Approximately 80% of the studies involved fewer than 24 participants, with only 20% including sample sizes of 23–46 participants. The largest participant group was reported by [90], which included 67 individuals, followed by Lohan et al. [91], with a sample size of 64. Notably, 10% of the studies used a single-subject design involving only one participant. In one case [51], an existing dataset of 300 children was used to investigate the impact of robots.
For instance, refs. [51,53,92] studied children with ASD only, whereas Gkiolnta et al. [41] focused on a unique comparison between a girl with ASD and intellectual disability (ID) and a typically developing boy. Palestra et al. [49] explored interactions with three high-functioning children with ASD, while Cao et al. and Arshad et al. [82,93] worked with groups of 21 and 8 participants, respectively. Some studies, such as [94], included a broader scope of participants: adolescents with ASD, typically developing controls, and parents of both groups. Others, such as [95], focused on children aged 6–12 years with ASD, while [68] included preschoolers who spoke Chinese, highlighting the cultural and linguistic diversity in autism research.
One study stood out as a single-case analysis: [96] examined a 24-year-old man with schizophrenia and autism. Similarly, Anzalone et al. [58] and Gkiolnta et al. [41] investigated children with autism alongside typically developing controls, reflecting the comparative nature of these studies. Some studies have focused on specific subgroups, such as typically developing children and those with additional developmental challenges. Several studies included small participant groups. For example, Marinoiu et al. [53] worked with 7 children with autism, while Mishra et al. [97] and Wainer et al. [54] each studied 6 children with autism. Studies by Robins et al. [7] and Pioggia et al. [69] involved 4 children with autism, focusing on smaller, controlled interactions.
Many other studies involved moderate participant numbers. For instance, refs. [37,98] studied 12 children each, studies [99,100] engaged 15 children with autism, Arshad et al. [93] studied 8 children with autism, and Mengoni et al. [52] expanded the scope to a group of 40 children with autism. Larger studies included significantly more participants, allowing for broader analysis. Holeva et al. [95] studied 51 children aged 6–12 years with autism, and Kumazaki et al. included 46 [94] and 34 [101] participants in two separate studies involving adolescents with ASD, typically developing children, and their parents. Other large studies [90,102,103] involved 21, 35 (19 children with autism and 16 neurotypical), and 16 subjects, respectively. Lohan et al. [91] extended the participant base by examining 31 children with autism and 33 typically developing children.
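Sample-size statistics like those above can be reproduced with a simple tally. In the sketch below, the list of sizes is hypothetical and merely stands in for the actual per-study data.

```python
def sample_size_shares(sizes: list[int], cutoff: int = 24) -> dict[str, float]:
    """Percentage of studies below vs. at-or-above a sample-size cutoff."""
    n = len(sizes)
    small = sum(1 for s in sizes if s < cutoff)
    return {
        "small_pct": round(100 * small / n, 1),
        "large_pct": round(100 * (n - small) / n, 1),
    }

# Hypothetical per-study sample sizes standing in for the reviewed corpus.
sizes = [1, 4, 6, 7, 8, 12, 15, 21, 34, 40]
print(sample_size_shares(sizes))  # -> {'small_pct': 80.0, 'large_pct': 20.0}
```

With this illustrative list, 8 of the 10 studies fall under the cutoff, mirroring the roughly 80/20 split reported above.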

3.8. Classification Based on Robot Used

A total of 57 different robots were used in the studies included in this review. These robots vary widely in form and function, ranging from widely adopted humanoid robots, such as NAO, Zeno, KASPAR, Pepper, and FACE, to more specialized or experimental platforms, such as TeoG, Cogui, Tinku, Buddy, and various custom-built systems. This diversity in robot selection reflects the field’s exploration of multiple design paradigms (humanoid, animal-like, toy-like, android, and virtual avatars) to address the unique therapeutic, educational, and social needs of children with autism. As shown in Figure 5, NAO emerged as the most frequently utilized robot, appearing in 42 studies and accounting for 39.25% of the total number of studies reviewed. NAO’s popularity stems from its humanoid design, expressive capabilities, and programmability, making it highly suitable for social, emotional, and communication-based interventions with children on the autism spectrum. Following NAO, Zeno and KASPAR were the next most common robots, each used in seven studies (5.47% of the total). Zeno is particularly notable for its lifelike facial expressions, thanks to its Frubber skin, while KASPAR, a child-sized, minimally expressive humanoid, is valued for safe physical interaction and teaching social cues. Other moderately used robots included FACE and Pepper (each used in five studies, 3.91%) and Actroid-F (four studies, 3.13%). FACE and Actroid-F are advanced androids designed with highly realistic facial expressions that are often used in emotion recognition and gaze studies, whereas Pepper is a taller humanoid robot with a touchscreen that is extensively used in educational and therapeutic settings. A range of robots, including Robota, Paro, Probo, Ifbot, LEGO NXT, and Alice, appeared in two to three studies (1.56–2.34% of all studies). These vary in form and function, from animal-like designs such as Paro to toy-based or modular humanoid platforms such as LEGO NXT and Robota.
Beyond these, the list includes over 40 robots used in only one study each (0.78%). These include specialized or experimental platforms such as Touchball, Tegu, Teog, Cogui, and tablet avatars, as well as custom-built robots such as PvBOT and DreamRobot. This wide range reflects the innovative and exploratory nature of robot-assisted interventions for autism, with designs spanning humanoid and android robots (e.g., NAO, Zeno, and Actroid-F), animal-like robots (e.g., Paro), and toy-like or virtual systems (e.g., Keepon, Robosapien, Cozmo, and digital avatars).
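As a quick arithmetic check, the NAO share quoted above follows from dividing its study count by the 107 reviewed studies. The helper below is purely illustrative:

```python
# Illustrative check of a usage percentage reported in this review:
# NAO appeared in 42 of the 107 reviewed studies.
TOTAL_STUDIES = 107

def usage_share(count: int, total: int = TOTAL_STUDIES) -> float:
    """Percentage of reviewed studies in which a given robot appears."""
    return round(100 * count / total, 2)

print(usage_share(42))  # NAO -> 39.25
```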

3.9. Classification Based on Robot Shape

The distribution of robot shapes used in autism-related research reveals a strong preference for humanoid forms: humanoid robots account for 74 of the 107 total instances, approximately 69.16% of all robots employed across studies, as shown in Figure 6. Humanoid robots, such as NAO, Zeno, KASPAR, and Pepper, are widely favored because of their human-like physical features and behaviors, including facial expressions, gestures, gaze, and speech. These qualities make them ideal for interventions aimed at improving social interactions, emotional recognition, and communication. This human resemblance facilitates engagement and enables structured therapeutic interactions that mimic real-life human communication. The second most common category was toy-like robots, which were used in 21 studies (20.19% of the total). Robots such as Keepon, Cozmo, Robosapien, and LEGO NXT are often smaller, simpler, and less intimidating than humanoid robots. These are designed for play- or game-based interventions, focusing on attention enhancement, behavioral engagement, and motor coordination. Toy-like robots are especially useful for children with lower comfort levels in social situations and for those who respond better to playful stimuli than to social interaction. Animal-like robots appeared in five studies (4.81%), most notably Paro, a baby seal, and Pleo, a dinosaur. These robots provide emotional comfort and reduce stress and are frequently used in calming or companionship-based interventions, mimicking the effects of animal-assisted therapies.
Overall, the data highlights a clear preference for humanoid robots because they resemble human form and can simulate social behaviors. However, the inclusion of simpler toy-like robots and virtual avatars indicates a growing interest in the accessibility and diversity of intervention approaches. While the NAO robot dominates the field in terms of usage, the breadth of other robots used, even in single studies, reflects a dynamic and evolving research landscape in autism therapy.

3.10. Article Classification by Children’s Developmental Area

Research studies related to robotics for autism target different developmental areas, which are referred to as the specific domains of skills or abilities that the intervention or research aims to enhance, support, or address. These areas are typically aligned with the challenges faced by individuals with autism and can include cognitive, sensory, or a combination of both. The choice of the development area depends on the specific needs of the target population. For example, children who struggle with social communication may benefit from cognitively focused intervention. Children with sensory sensitivity may require sensory-focused activities to effectively manage and process stimuli. Many individuals with autism require support in both areas, making integrated interventions crucial for their comprehensive development.
Identifying developmental areas is important because it allows researchers and therapists to design robots and activities that address specific developmental needs and help track improvements in specific skill domains. Cognitive development focuses on enhancing mental processes. This category includes studies that focus on cognitive aspects such as learning, communication, emotion recognition, and social interactions [7,32,37,41,50,52,53,54,58,63,64,69,90,93,95,96,97,103,104,105,106,107,108,109,110,111]. These studies rely on robots to enhance cognitive skills and often involve analyzing complex behaviors and decision-making processes. They emphasize the understanding and interpretation of behaviors such as joint attention, emotional understanding, and social communication. Robots have been used to create structured interventions that foster improved cognitive outcomes in these children.
Sensory development involves sensory processing, body awareness, and motor coordination. Many individuals with autism experience sensory sensitivity and other challenges, making this area of research vital. This category includes studies focusing on sensory aspects, such as body awareness, physical interaction, and sensory-motor coordination. These studies involve the detection of sensory inputs, such as gaze, gestures, and facial expressions, to foster better sensory integration [97,100,112,113]. Robots can detect and respond to sensory stimuli, thereby creating an engaging environment that supports sensory integration in children with autism. These studies are particularly beneficial for developing motor skills and understanding physical interactions in children.
Some studies have integrated cognitive and sensory development areas [49,55,90,98,99,114]. Robots in this category assist in recognizing sensory inputs while enabling cognitive responses, making them comprehensive and versatile. These studies emphasize the dual role of robots in capturing sensory data (e.g., gaze and motion) and providing cognitive responses (e.g., decision-making and behavior shaping). They are ideal for fostering holistic development and simultaneously addressing children’s sensory and cognitive challenges.

3.11. Article Classification Based on Communication Methodologies

Robots can be valuable companions for children in developing communication and interaction skills, helping to improve their linguistic and verbal abilities. By posing queries and exhibiting social behaviors, robots adjust their interactions to the child’s level of responsiveness, offering encouragement and clearing up misunderstandings. They promote cooperative play through cooperative exercises and structured games, which help children develop critical social skills in an entertaining and encouraging manner. Diverse communication methodologies have been used across studies, including direct communication, nonverbal cues, interactive behavior, and social interaction. These strategies focus on enhancing the interaction between robots and individuals with autism by targeting key developmental areas such as social, emotional, and physical skills.
Direct communication is crucial for developing fundamental skills in the field. By providing predictable and consistent feedback, robots reduce the anxiety often associated with human interaction, making it easier for individuals with autism to practice and refine their skills. Direct communication involves robots engaging in explicit verbal or non-verbal exchanges with individuals. These methods often simulate natural human interactions and provide structured learning opportunities. Marinoiu et al. [53] used robots to enhance human–robot interaction through action and emotion recognition, thereby promoting social and emotional understanding. Mishra et al. [92] used robots as prompters in social interactions to guide individuals through conversational exchanges.
Nonverbal communication leverages the ability of robots to use gestures, facial expressions, gaze tracking, and other physical signals to convey meanings. This approach is particularly beneficial for individuals with limited verbal abilities. Robots such as FACE use facial expressions to convey emotions, stimulate social interactions, and foster nonverbal communication skills [69]. Robots engage children in joint attention tasks using gaze and pointing gestures, thereby enhancing their ability to share their focus with others [82]. Nonverbal cues are the foundation of social interactions. Robots that utilize such cues help individuals with autism recognize and respond to social signals, which they often find challenging to interpret. These interactions can improve emotion regulation and understanding among users.
Interactive behavior involves structured activities such as turn-taking, imitation, and role-playing. These activities are designed to enhance social and cognitive skills through active participation. By simulating real-world scenarios, robots can help individuals practice essential social behaviors in controlled and supportive environments. Gkiolnta et al. [41] used robots for assistance in structured educational setups, encouraging participation and learning through interaction. They can also be used to engage in imitation and turn-taking games, providing a playful and engaging environment for skill-building [7].
Social interaction focuses on building interpersonal skills, such as empathy, collaboration, and emotional expression. Robots in this category aim to create meaningful and dynamic social interaction. Social interaction strategies are essential for addressing core deficits in autism, such as difficulties in forming relationships and understanding social norms. Robots provide non-threatening partners to practice these skills and promote confidence and independence in social settings. Research on engagement and social communication has also become prominent, with [54,103,111] investigating how eye contact, cooperative play, and emotional interactions contribute to improved communications.
Robots perform role-play in scripted scenarios, helping individuals practice social and play skills in a structured way. Kasper uses gestures, facial expressions, and speech prompts to encourage interaction and coordination and to bridge social gaps [54].
Physical interaction focuses on tactile and motor activities, encouraging individuals to explore and interact with their environment through touch and movement. It helps develop body awareness, motor coordination, and sensory integration. In addition, physiological and physical interaction studies, such as those by [32,56], have explored how body part identification, eye contact improvement, and physiological responses indicate engagement in robot therapy. Ref. [56] used touch sensors in robots to distinguish between gentle and harsh touches and teach individuals appropriate physical interactions. Ref. [32] facilitated physical interaction through robots to foster engagement and sensory exploration. These activities are particularly beneficial for individuals with sensory sensitivity or difficulties in developing fine and gross motor skills. In an article by [53], a robot was used to improve interaction, where the therapist initiated the action through the robot and annotated the reactions.
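The gentle-versus-harsh touch discrimination described in [56] can be illustrated with a simple threshold heuristic. The sketch below is hypothetical; the sensor fields, thresholds, and feedback phrases are illustrative assumptions, not taken from the cited system. A touch event is classified by its peak pressure and contact duration, and the robot responds with a matching verbal prompt.

```python
from dataclasses import dataclass

@dataclass
class TouchEvent:
    peak_pressure: float  # normalized sensor reading, 0.0-1.0 (assumed scale)
    duration_s: float     # contact duration in seconds

def classify_touch(event: TouchEvent,
                   pressure_threshold: float = 0.6,
                   min_gentle_duration: float = 0.2) -> str:
    """Label a touch as 'harsh' or 'gentle' from simple thresholds.

    Hypothetical heuristic: hard, abrupt contact reads as harsh;
    light, sustained contact reads as gentle.
    """
    if event.peak_pressure >= pressure_threshold:
        return "harsh"
    if event.duration_s >= min_gentle_duration:
        return "gentle"
    return "neutral"  # brief, light taps trigger no feedback

# The robot can then reinforce appropriate physical interaction verbally.
FEEDBACK = {
    "harsh": "Ouch! Please touch me softly.",
    "gentle": "That feels nice, thank you!",
    "neutral": "",
}

label = classify_touch(TouchEvent(peak_pressure=0.9, duration_s=0.1))  # "harsh"
```

In practice, the thresholds would be tuned per child, since tactile sensitivities vary widely in ASD.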

3.12. Categorization Based on Target Behavior

The target behavior is a specific action, response, or pattern of behavior that is identified, observed, measured, and modified in behavioral studies, therapeutic settings, or intervention programs. In ASD, these are specific skills or responses that children are trained to develop. The most notable target behaviors found in ASD studies include joint attention, eye contact, imitation, turn-taking, role-playing, emotion recognition and expression, self-initiated interactions, triadic interactions, and dyadic interactions. A sizable portion of research has investigated joint attention and social engagement, which are critical developmental skills for children with autism.

3.12.1. Joint Attention

This refers to an individual’s ability to share a focus on an object, event, or activity with another individual. This elementary social communication skill allows people to engage in meaningful interactions by coordinating their attention with others. Children with autism often struggle with these skills, making it difficult for them to develop social, cognitive, and linguistic abilities. Studies have revealed that robot-assisted therapy (RAT) is superior to human-agent approaches in improving joint attention in children with autism [115]. Developing joint attention requires practicing shifts of attentional focus, object tracking, and sustained focus, all of which robots can easily recognize and reinforce. A total of 42 (39.62%) studies included in this review explicitly addressed joint attention as a target behavior. The NAO robot is by far the most dominant platform used for studies involving joint attention, appearing in 17 studies, making up 40.48% of the studies related to joint attention. Pepper, Virtual Avatar, and Robota were used in two studies each, and other robots, such as KASPAR, Cozmo, INSIDE, Hello Bot, and Buddy, were used in one study each.
Joint attention can be divided into two classes: (1) response to joint attention (RJA), the ability of a child to follow the eye gaze and/or pointing of another person, and (2) initiation of joint attention (IJA), the ability of a child to voluntarily seek out another’s attention or use eye gaze and/or pointing to share their experience of an object or event. Kumazaki et al. [96] used CommU, a social robot with a movable head and large eyes, to help children with autism and typically developing young adults with appropriate cognitive functioning. The results indicated that children with autism showed improvements in RJA tasks with humans after practicing RJA using CommU. So et al. [107], in contrast to [96], used an indirect method to teach IJA and RJA to young autistic preschoolers with low assistance needs. In three distinct plays, they programmed two NAO robots to talk to each other and perform fictional actions on stage. After viewing each drama twice, the children were asked to role-play with the robot as a character in the drama. Using the robot, the students switched roles and played different characters. According to the findings, children in the intervention condition generated noticeably more IJA in the post-test than in the pre-test, even though they had already attained a prominent level of RJA in the pre-test. In contrast, robotic intervention had negligible effects on JA according to some studies, such as Zheng et al. [105] and Srinivasan et al. [116,117]. In two different studies, Srinivasan et al. [116,117] examined how rhythm and robotic therapies affected the JA of 36 school-aged children with autism and developmental delays, as determined by the Standardized Test of Joint Attention (JTAT). However, no study has yet reached a consensus on the effectiveness of robots in enhancing JA in children with autism. Although So et al. [107] reported that robot dramas have positive learning outcomes, they did not compare the effectiveness of learning with human-based interventions. Furthermore, only a few studies have investigated whether robotic interventions promote JA more effectively than human interventions [94,117].

3.12.2. Eye Contact

The act of two people looking at each other’s faces, particularly their eyes, is known as eye contact. Beyond monitoring each other’s eyes and facial expressions, it helps coordinate interactions and create mutual recognition. It is a crucial aspect of social communication, and children with autism often struggle to maintain a consistent gaze and social engagement. People with autism show little reciprocal use of eye contact and rarely engage in interactive games. Several studies have explored how robot-assisted interventions can improve eye contact behaviors in ASD therapy by leveraging humanoid robots, AI-driven gaze tracking, and multi-agent interactions. Of all the studies included in this review, 34 (32.07%) addressed eye contact behavior in human–robot interaction research for children with autism. The NAO robot was the most common, appearing in six studies, followed by Pepper in four, and KASPAR in three studies. Other robots, such as Zeno RoboKind, Robota, FACE, Actroid-F, and Hello Bot, appeared less frequently, typically in one study each. Humanoid robots dominate the field of robot design and account for most studies. Humanoid robots are often child-sized to enhance the relatability and engagement of young research participants. A few studies employed vehicle-shaped robots (such as Code Rocky) or zoomorphic designs (such as DreamRobot, shaped like a rabbit), while others experimented with virtual avatars such as “Danny,” which provided cartoon-like interaction platforms. The strong preference for humanoid robots reflects their effectiveness in stimulating eye contact behaviors, as their familiar facial features and body gestures naturally elicit gaze-following and joint-attention responses. Some studies have explored softer and more abstract designs to accommodate children who are sensitive to rigid or overly anthropomorphic robots.
Cao et al. [90] conducted a comparative study on eye contact between children with autism and typically developing children using the NAO robot. The study found that children with autism responded well to robotic gaze cues, leading to increased eye contact duration, whereas typically developing children exhibited more spontaneous engagement with robots. The structured and predictable nature of the robot’s gaze shifts helped children with autism gradually improve their attentional focus, reinforcing the importance of robotic interventions in eye-contact training. Shamsuddin et al. [104] investigated how children with autism interact with the NAO humanoid robot during eye-contact tasks. The findings suggest that children maintained their gaze longer with NAO than with human therapists because of the non-threatening and predictable nature of robotic interactions. This study highlighted that robots using structured eye movements, repetitive gestures, and verbal prompts were more successful in eliciting prolonged eye contact than unstructured human-led interactions. Mehmood et al. [118] used AI-driven adaptive gaze tracking and reported that children maintained longer attention spans when the robot dynamically adjusted its gaze and head orientation in response to the child’s engagement. Unlike static or preprogrammed gaze behaviors, real-time gaze adaptation enables more natural and engaging interactions, reinforcing eye contact behaviors in ASD therapy. Wainer et al. [54] examined small-group technology-assisted instruction, where a virtual teacher and a robotic peer worked together to improve eye-contact behaviors in ASD children. The study found that children in multi-agent settings displayed higher levels of eye contact than those in one-on-one robotic interactions. The presence of a robotic peer encouraged social motivation, leading to more sustained and meaningful eye-contact exchanges between children and robots. Kim et al. [17] explored the effectiveness of multi-robot therapy and reported that children engaged more when two robots worked together, using synchronized gaze shifts and verbal prompts to encourage reciprocal eye-contact behaviors. The findings suggest that multi-robot therapy may be more effective than single-robot therapy in fostering sustained eye contact engagement. Kobayashi et al. [119] introduced eye-tracking technology and emphasized the role of real-time feedback mechanisms in optimizing eye-contact interventions, making therapy more adaptive and effective. Studies by [41,53,92] explored the role of eye contact in fostering meaningful interactions. Cao et al. [90] and Palestra et al. [49] both examined how gazing behavior and turn-taking mechanisms enhance engagement in social robotics. Similarly, imitation and gesture learning, which are essential for social development, have been explored in studies [7,68,102] focusing on gesture accuracy, shared attention, and role-switching behaviors.
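The adaptive gaze behavior reported by Mehmood et al. [118] can be sketched as a simple control rule. The function below is a hypothetical illustration: the parameter names, gain, and engagement scale are assumptions, not the cited system's design. The robot's head pan shifts toward the child's gaze direction, with larger, attention-seeking corrections when the measured engagement is low.

```python
def update_gaze(current_pan_deg: float, child_gaze_offset_deg: float,
                engagement: float, gain: float = 0.5) -> float:
    """Shift the robot's head pan toward the child's gaze direction.

    `engagement` in [0, 1] scales the correction: a disengaged child
    (low score) triggers a larger, attention-seeking head movement.
    All names and constants here are illustrative assumptions.
    """
    correction = gain * child_gaze_offset_deg * (1.5 - engagement)
    # Clamp to a plausible pan-actuator range of +/-90 degrees.
    return max(-90.0, min(90.0, current_pan_deg + correction))
```

Called once per control cycle, this yields small tracking adjustments for an engaged child and pronounced reorienting movements when attention drops.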

3.12.3. Imitation

Of all the studies included in this review, 37 (34.58%) addressed imitation behavior in human–robot interaction research for children with autism. The reviewed imitation studies employed a variety of robots, with a clear preference for humanoid designs. The NAO robot was the most widely utilized platform, appearing in eight studies, followed by KASPAR in four studies, and Pepper and FACE robots, each in three studies. Other robots, such as Zeno, Robota, DreamRobot, Hello Bot, and combinations such as Buddy, Navel, and NAO, have been used in individual studies. Humanoid robots dominated the shape category, accounting for approximately 80% of all imitation studies. Within this group, some robots were further specialized as child-sized humanoids or humanoids with highly expressive facial features, such as Zeno RoboKind and FACE, making them particularly suitable for imitation tasks involving gestures, emotions, and facial expressions. In contrast, only a few studies have used zoomorphic designs, such as rabbit-shaped robots (e.g., DreamRobot), or soft/inflatable designs, such as modified TeoG robots. Toy- and cartoon-like robots, although present, were significantly less common. The overall trend shows a strong bias toward using humanoid robots for imitation tasks because of their human-like form, which naturally elicits mimicry behaviors such as gaze following, gesture imitation, and social engagement among children with autism. Many studies have focused on mimicking robot gestures (e.g., arm/hand movements and musical rhythms), while others have emphasized socio-cognitive outcomes, such as empathy and social responsiveness.
Ali et al. [100] and Melo et al. [112] demonstrated that robots can facilitate spontaneous imitation, joint attention, and language acquisition in structured learning environments. Pioggia et al. [69] and Bhargavi et al. [108] analyzed how robots can imitate and respond to emotional cues to enhance human interactions. Similarly, imitation and gesture learning, which are essential for social development, have been explored in studies by [23,65,102], focusing on gesture accuracy, shared attention, and role-switching behaviors.

3.12.4. Turn-Taking

Turn-taking encompasses verbal or nonverbal exchanges between interactional partners and involves actions such as orienting to social cues, turning quickly, avoiding overlaps, observing the task that requires a response, and providing an adequate response [120]. Children with autism frequently struggle to keep their turn and consider the viewpoint of the listener, which is necessary for successful turn-taking [121]. According to Broz, Nehaniv, Kose-Bagci, and Dautenhahn [122], “Turn-taking behavior dynamics arise from the interaction between two partners; therefore, humans manage to understand when to start and stop their turns in social interactions based on the context and purpose of the interaction and on the feedback from the social interaction partners.” Robots can teach children to act and then wait for the device’s response, promoting turn-based engagement: before indicating the next action, the child must wait for the robot to complete its sequence.
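The wait-for-the-robot protocol described above can be modeled as a small state machine. This is a minimal, hypothetical sketch (the class and method names are illustrative, not from any cited system): a child action is accepted only when the robot has finished its sequence and handed over the turn, while out-of-turn actions are counted but not rewarded.

```python
from enum import Enum, auto

class Turn(Enum):
    ROBOT = auto()
    CHILD = auto()

class TurnTakingGame:
    """Minimal turn-taking controller: the child must wait until the
    robot finishes its sequence before their action is accepted."""

    def __init__(self):
        self.turn = Turn.ROBOT
        self.accepted = 0   # actions taken at the right time
        self.premature = 0  # actions taken out of turn

    def robot_done(self):
        """The robot has completed its sequence; it is now the child's turn."""
        self.turn = Turn.CHILD

    def child_acts(self) -> bool:
        """Return True if the child's action was taken in turn."""
        if self.turn is Turn.CHILD:
            self.accepted += 1
            self.turn = Turn.ROBOT  # hand the turn back to the robot
            return True
        self.premature += 1  # acted out of turn; gently ignored
        return False
```

The accepted/premature counts double as a simple session metric for how well the child is learning to wait for their turn.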
Eighteen studies focusing on turn-taking as a target behavior were identified in this review. Among these, the NAO robot was the most used platform, appearing in four studies, whereas other robots, such as KASPAR, Robota, Cozmo, and Cogui, were used individually in one study. Some studies also employed combinations of multiple robots, including humanoid, animal-like, and toy-like robots (e.g., NAO, Kaspar, Probo, Zeno, Paro, Keepon, Robovie, Huggable, Pepper, and Tega) to enrich the turn-taking scenarios. Categorized by robot shape, humanoid robots dominated the landscape and were used in approximately 67% of the studies. Other shapes include vehicle-shaped robots, such as Code Rocky; zoomorphic robots, such as Paro; and toy- or cartoon-like robots, such as Cozmo and Buddy. The preference for humanoid robots in turn-taking studies highlights the importance of human-like embodiment in facilitating interactive and socially contingent behaviors, particularly in therapeutic or educational interventions for children with autism. The use of various robot types in a few studies reflects growing efforts to diversify interaction styles and adapt to different child preferences and engagement levels.
Autonomous robots with real-time gaze tracking [105] and multi-robot therapy setups [17] have been particularly effective in sustaining engagement and encouraging turn-taking. To improve engagement and interaction, therapists in [53] employed robots to initiate actions and annotate their responses. Cao et al. [82] emphasized how RAT can improve social contact and communication between educators and students, as well as between them and social humanoid robots, in therapeutic activities. Ali et al. [114] and Melo et al. [112] demonstrated how robots can facilitate spontaneous imitation, joint attention, and language acquisition in structured learning environments.

3.12.5. Emotion Recognition and Expression

Many children struggle with social cognition, regardless of their cognitive or language abilities. Individuals with autism typically do not express their feelings in a manner that is easy for others to identify and comprehend. They either do not react emotionally or their reactions can occasionally seem excessive. Social awareness, perspective taking, and implicit theory of mind deficiencies are all components of social cognitive impairment [123]. In the reviewed studies that focused on emotion as a target behavior, 39 papers were identified. Among these, the NAO robot was the most frequently utilized platform, appearing in nine studies, followed by Pepper in four studies and various forms of FACE robots in another four studies. Robots such as Zeno RoboKind and its variants (Zeno R-50 and Zeno R25) have also appeared in multiple studies that emphasize emotional engagement with them. Several studies employed combinations of robots, including humanoid robots such as NAO, Kaspar, Probo, and Zeno, often combined with cartoon-like or animal-like robots like Keepon and Paro. When categorized by shape, humanoid robots dominated and were used in more than 80% of studies. Robots with highly expressive facial features, such as FACE and Zeno, are particularly favored in studies requiring emotion recognition, mimicry, or emotional engagement tasks. In contrast, only a few studies have explored animal-like designs (e.g., Paro) or cartoon-like robots (e.g., Buddy and Navel). Robots with integrated facial displays, such as Pepper, which has a chest-mounted screen, are notably effective in facilitating emotional communication.
Studies have demonstrated that children with autism can significantly improve their contextualized emotion recognition, comprehension, and emotional perspective-taking using robot-assisted therapy [17,124,125]. Social robots offer a unique and captivating platform for therapeutic interventions. Emotional recognition and response have also been widely studied, with Marino et al. [115] and Lecciso et al. [37] focusing on detecting and expressing fundamental emotions, respectively. Marino et al. [115] designed robot-based game activities and discovered that these activities improved children’s capacity to comprehend the thoughts, feelings, and beliefs of others. Arshad et al. [93] and Fuentes-Alvarez et al. [126] studied how robots can measure and adapt to user engagement levels to improve cognitive performance.
One example is a HomeLab setting used to develop emotion recognition skills in children with ASD: the intervention setting comprised a humanoid robot, NAO, which served as a co-therapist by providing emotional and communication prompts and reinforcements under partial autonomous control, together with cameras integrated into a processing module capable of providing real-time behavior inference data and a wide-screen LED TV.

3.12.6. Social Interaction

According to the currently available evidence, poor social interaction is the most basic and enduring core characteristic of ASD [127]. An increasing number of researchers are interested in creating interactive robots that can socially interact with children. In the reviewed studies focused on social interaction as a target behavior, 19 papers were identified. The NAO robot was the most used platform, appearing in six studies, either individually or in combination with other robots such as Buddy and Navel. Other robots featured in these studies included Pepper, Kaspar, Cozmo, Pleo, FACE, Dream Robot, Lego® NXT, iCat, and combinations such as Touchball and Ifbot, among others. When categorized based on robot shapes, humanoid robots were the predominant choice, accounting for approximately 70% of the studies. These humanoid robots often feature enhanced facial expressions, as seen in robots such as FACE and iCat, making them particularly suitable for studying social-engagement behaviors. A small number of studies have employed animal-like robots (e.g., Pleo, DreamRobot) and toy-like designs (e.g., Cozmo, Lego NXT), indicating a diversification of approaches to address different child interaction preferences. Additionally, hybrid systems that combine multiple robots and tablet-integrated humanoid platforms, such as Pepper, have been explored to enrich social interaction scenarios. Overall, the strong emphasis on humanoid robots highlights the importance of human-like embodiment and expressive capabilities in facilitating effective social communication interventions for children with autism.
The best way to evaluate social interactions in individuals with autism is to examine how often and to what extent they interact with others. The number of times a person interacts with someone or something else during a given period is known as the interaction frequency. Measuring the frequency of social interactions provides essential information about the level of engagement and social connection. Duration quantifies the time spent interacting with others. By recording interaction times and durations, it is possible to distinguish between short- and long-term encounters, offering valuable information about the quality and impact of these social exchanges. Another measure is intensity, which describes the emotional depth or relevance of a social connection. The emotional reactions of people with autism when they engage with social robots can be used to gauge the intensity of their interactions. To ascertain a person’s emotional participation and expression during an interaction, facial expressions, body language, and other nonverbal cues are examined.
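The three measures above (frequency, duration, intensity) can be computed directly from logged interaction episodes. The sketch below assumes a hypothetical log format of (start, end, intensity) tuples, with intensity as an annotator-rated emotional engagement score on a 0-1 scale; none of these field names come from a cited study.

```python
def interaction_metrics(events, session_minutes: float) -> dict:
    """Summarize logged interaction episodes.

    `events` is a list of (start_s, end_s, intensity) tuples, where
    intensity is an annotator's 0-1 rating of emotional engagement.
    The log format and 0-1 scale are illustrative assumptions.
    """
    n = len(events)
    total_duration = sum(end - start for start, end, _ in events)
    return {
        "frequency_per_min": n / session_minutes,
        "mean_duration_s": total_duration / n if n else 0.0,
        "mean_intensity": sum(i for *_, i in events) / n if n else 0.0,
    }

# Example: two episodes in a 10-minute session.
summary = interaction_metrics([(0, 10, 0.5), (20, 40, 1.0)], session_minutes=10.0)
```

Comparing these summaries across sessions gives a simple quantitative picture of whether engagement is becoming more frequent, longer, or emotionally richer over the course of an intervention.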
A study by [54] used Kaspar in a collaborative video game with children with autism who had difficulties in social interaction. The results showed that the children found the activity entertaining, were more engaged, and displayed better collaborative behaviors with the humanoid robot than with the human. Ref. [64] used robots to interact with children through games that required attention and social engagement, significantly increasing social interaction and attention in children. This suggests that robots may be a promising tool for helping children with autism overcome behavioral challenges.

3.13. Challenges in Human–Robot Interaction and Autism Therapy

The integration of human–robot interaction (HRI) in autism therapy and cognitive training presents several challenges across multiple dimensions, including engagement sustainability, social interaction complexity, response variability, and technological limitations.

3.13.1. Variability in Child Behavior

A primary difficulty, as highlighted by Marinoiu et al. [53], lies in addressing the variability in child behavior, where unpredictable actions, partial visibility, and nonstandard camera angles impact recognition accuracy. Mishra et al. [92] further emphasize the complexity of predicting real-time engagement, as fluctuating affective states and physiological signals make automated assessments challenging.

3.13.2. Managing Repetitive Behaviors

Managing repetitive and stereotypical behaviors is another key issue, as noted by [41], because it affects participation in structured activities. Palestra et al. [49] pointed out the difficulty of maintaining consistent engagement, particularly in eye contact detection and adaptive learning exercises, whereas [82] stressed the need for a system that can autonomously manage social cues while keeping users engaged. Refs. [93,115] discussed the challenges in motor imitation, especially in children with autism, where gesture accuracy and joint movement imitation remain difficult to replicate effectively. Similarly, Fears et al. highlighted issues with motor coordination, particularly when mimicking complex joint movements.

3.13.3. Meaningful Social Interaction

Another significant challenge is ensuring meaningful social interactions through nonverbal cues. Pioggia et al. [68] focused on the difficulty of encouraging sustained attention in nonverbal interactions, particularly when repetitive behaviors interfere with engagement. So et al. noted the difficulty of sustaining engagement in structured activities due to individual variability in responses. Amirova et al. [103] emphasized how socio-behavioral outcomes fluctuate based on external factors, such as parental presence, affecting robot-assisted interventions. Zheng et al. [105] discussed how children with autism often do not respond to prompts, making gaze tracking and engagement monitoring difficult without the use of physical sensors.

3.13.4. Technological Limitations

The technological limitations of HRI-based therapies are also significant. Holeva et al. highlighted treatment adherence variability and the lack of long-term follow-up data, with some studies hindered by external factors such as COVID-19 restrictions. Identifying unique response patterns, thereby potentially revealing ASD-related endophenotypes, is challenging [94]. The authors of [94] also noted replicability issues in clinician-led assessments and the difficulty of establishing standardized, automated interactions. Ensuring that the emotional skills learned through robot interactions can be generalized to real-world settings is also a challenging task [37]. There is a trade-off between energy efficiency and engagement, as longer therapy sessions require optimized power management [126].

3.13.5. Social Cues

Some researchers have found that joint attention performance is lower with robot interactions than with human partners, particularly when dealing with complex social cues [126]. Managing variations in tactile engagement is challenging, as children with autism display different sensitivities to touch [56]. Robins et al. stressed the difficulty of maintaining long-term engagement and of ensuring that social improvements persist over extended interactions. Wainer et al. [54] highlighted the need for simplified interaction mechanics while accommodating various user responses. Kozima et al. [128] emphasized the need to balance predictability and spontaneity, as overly structured interactions may reduce spontaneous social behaviors.

3.13.6. Design and Engineering Challenges

Ensuring effective robotic assistance requires overcoming several design and engineering challenges. Shamsuddin et al. [104] noted that initial engagement levels vary widely, making it difficult to design a universal robotic interaction model. Cao et al. [82] identified challenges in maintaining high joint attention performance because robotic engagement is often less effective than human-led interventions. Mehmood et al. [118] discussed variability in attentional latency, which requires personalized adjustments to stimuli and response mechanisms. Other technological challenges include limited expressive capabilities [50], the need for contextual adaptation [108], and the need for continuous, long-term interaction to ensure effectiveness [23]. Wang et al. [129] stressed the importance of adaptive interactions tailored to individual needs, particularly in children with various levels of social and cognitive ability. Developing a robot for autism therapy involves a structured design-based research process to ensure that the robot effectively meets the needs of children with autism [27]. It must also be clarified whether the participants’ age, sex, and IQ affect the therapy’s outcome and whether any positive benefits are limited to the robotic session or remain noticeable beyond the clinical or experimental setting [124].

3.13.7. Intensive Programming

The extensive programming requirements and scarcity of off-the-shelf programs available to doctors are two of the main obstacles preventing the broader use of robots in therapy. Efforts are underway to develop intuitive and adaptable programming platforms to address these challenges.

3.13.8. Coordination Among Participants

Coordinating multi-participant tasks in robot-assisted therapy poses significant complexity, as highlighted by Saadatzi et al. [109]. Similarly, Kim et al. [17] discussed the limited scope of interaction behaviors, suggesting that robots must be programmed to achieve greater social flexibility. Santos et al. [98] emphasized balancing motor and cognitive training demands, ensuring that robotic assistance does not overburden the user. Pioggia et al. [77] noted the difficulty of achieving the lifelike emotional expressions essential for creating naturalistic interactions. Mengoni et al. [52] emphasized adapting robotic interventions to individualized engagement needs, ensuring that robots are flexible enough to respond to different personalities and learning styles.

3.13.9. Replication

Investigators carry out most interventions in excessively controlled scenarios under substantially standardized conditions that are difficult to replicate in regular clinical procedures [130,131]. In addition, most studies have single-case designs, which makes it difficult to assess their effect sizes [107].

4. Discussion

The synthesis of studies in this review suggests that a broad array of social robots has been developed and employed to address key challenges in ASD therapy. The review identified a diverse range of social robots, including humanoid (e.g., NAO, Zeno, KASPAR), animal-like (e.g., Paro, Probo), and toy-like (e.g., LEGO NXT, Cozmo) robots. These robots differ significantly in their capabilities, including degrees of freedom, sensor integration, and expressiveness. More advanced robots, such as NAO and FACE, offer high interactivity and flexibility but come at a higher cost, whereas simpler robots provide cost-effective but more limited functionalities. The variety in design allows therapists to match the robot’s features to the individual needs of children with autism.
Integration strategies for social robots in therapeutic settings range from therapist-guided (Wizard-of-Oz) to partially autonomous systems. Robots were employed in both one-on-one and group sessions to promote skills such as eye contact, imitation, and emotion recognition. Effective deployment often depends on a robot’s ability to deliver predictable and repeatable stimuli and respond adaptively to user behavior. Studies have shown that structured integration into therapy, especially in conjunction with visual or auditory aids, enhances engagement and learning outcomes.
Robot-assisted interventions target a range of developmental areas, with a strong emphasis on cognitive (e.g., communication and emotion recognition) and sensory (e.g., motor coordination and body awareness) domains. Communication methodologies include verbal prompts, nonverbal cues, and interactive behaviors, such as turn-taking and imitation. The target behaviors most frequently addressed were joint attention, eye contact, and social interaction. The reviewed literature highlights that robots serve as effective mediators in facilitating these behaviors in structured and low-anxiety environments.

4.1. Critical Analysis and Limitations of the Study

The findings affirm the promising potential of RAT in children with autism, but several critical limitations and considerations emerge. One significant issue was methodological heterogeneity across the included studies. Variations in study design, participant characteristics, session duration, evaluation metrics, and intervention settings pose significant challenges in drawing generalized conclusions. This diversity reflects the experimental and exploratory nature of the field but complicates efforts to compare results, identify best practices, and synthesize quantitative outcomes. This affects the overall conclusions by introducing variations in outcome reliability and comparability. For example, studies using different outcome measures to assess social interactions may not yield directly comparable results. Additionally, variations in robot autonomy and therapist involvement limit our ability to determine the relative effectiveness of different deployment strategies. Furthermore, equity and accessibility pose significant challenges to their use. High-cost robots with sophisticated capabilities are often out of reach in low-resource settings, potentially exacerbating disparities in therapy access. Moreover, cultural and linguistic differences are rarely addressed in robot design, which could affect the relevance and effectiveness of interventions in diverse populations. Finally, the exclusion of non-English and grey literature may have resulted in the omission of relevant studies, potentially skewing the findings toward better-documented or higher-resource interventions. Therefore, while this review offers valuable insights into trends and common practices, its findings must be interpreted carefully.

4.2. Future Research

An analysis of over 100 research articles on robot-assisted therapy for children with autism revealed six key future research directions, as identified by various authors.

4.2.1. Expansion of Sample Size and Diversity

A recurring limitation across multiple studies is the small sample size and lack of diversity among the participant groups. Several authors have emphasized the need for large-scale studies to enhance the generalizability of robot-assisted interventions across different cultures, ASD severity levels, and age groups. Mengoni et al. [52] proposed conducting a full-scale randomized controlled trial to validate findings on social skill development using humanoid robots in therapy. Robins et al. [23] stressed the importance of longitudinal studies to understand how children respond to robotic interventions over extended periods. Costa et al. [63] advocated incorporating a more diverse population to explore how therapy outcomes vary across socioeconomic and linguistic backgrounds, a recommendation we endorse. Considering these suggestions in future research will strengthen the generalizability of robot-assisted interventions and account for variations in individual learning styles, cognitive abilities, and behavioral responses. Research suggests that demographic factors, language barriers, and socioeconomic conditions should be considered when evaluating the effectiveness of robotic therapies.

4.2.2. Enhancing Personalization and Adaptive AI for Therapy

Several studies have proposed integrating advanced AI and machine learning models to enable robots to adapt to the unique needs of each child. These include real-time emotion recognition, gaze tracking, speech analysis, and behavioral prediction to create personalized therapeutic experiences. Many researchers are working on reinforcement learning algorithms that allow robots to adjust their interaction styles based on children's responses, making interventions more effective and engaging. To ensure widespread accessibility, researchers should focus on developing affordable, scalable robotic-therapy models. Low-cost, AI-driven, cloud-connected robots that can be updated remotely and personalized for individual children's needs could significantly improve accessibility and adoption [39], and cloud-based systems can also allow clinicians to guide interventions remotely in real time [131]. Fuentes-Alvarez [126] emphasized the need for self-sustaining robots for prolonged autism therapy sessions: robots that operate autonomously over extended periods without constant human intervention, manual recharging, or frequent maintenance. In the context of autism therapy, self-sustaining robots are designed to function effectively in homes, schools, or clinical environments while adapting to a child's learning progress, emotional state, and engagement levels.
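Several of these proposals amount to a closed adaptation loop: sense the child's response, score engagement, and shift toward the interaction styles that work best. As an illustrative sketch only, not drawn from any of the cited systems, such a loop can be phrased as a simple epsilon-greedy bandit; the style names and the engagement score below are assumptions made for the example.

```python
import random

# Illustrative sketch of an adaptive interaction loop, phrased as an
# epsilon-greedy bandit. Style names and the engagement signal are
# assumptions for this example, not taken from any cited system.
STYLES = ["verbal_prompt", "gesture", "imitation_game", "turn_taking"]

class StyleSelector:
    def __init__(self, epsilon=0.1):
        self.epsilon = epsilon
        self.counts = {s: 0 for s in STYLES}
        # Optimistic initial values ensure every style is tried early on.
        self.values = {s: 1.0 for s in STYLES}

    def choose(self):
        # Explore occasionally; otherwise exploit the best-known style.
        if random.random() < self.epsilon:
            return random.choice(STYLES)
        return max(STYLES, key=lambda s: self.values[s])

    def update(self, style, engagement):
        # engagement: a score in [0, 1], e.g. estimated from gaze or
        # affect sensing; here it updates a running mean per style.
        self.counts[style] += 1
        self.values[style] += (engagement - self.values[style]) / self.counts[style]

random.seed(42)  # fixed seed so the demo run is reproducible
selector = StyleSelector()
for _ in range(200):
    style = selector.choose()
    # Simulated feedback: this child responds best to imitation games.
    engagement = 0.8 if style == "imitation_game" else 0.3
    selector.update(style, engagement)
```

After a few simulated sessions the running means settle on the style with the highest engagement, while occasional exploration keeps re-checking the alternatives, mirroring the kind of response-driven adjustment the cited work aims for; a deployed system would replace the simulated feedback with sensor-derived engagement estimates.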

4.2.3. Investigating Long-Term Efficacy and Generalization

Although many studies have reported short-term improvements in joint attention, social skills, and emotion recognition, there is a significant knowledge gap regarding the long-term retention of these skills. Future research should focus on follow-up studies to assess whether skills learned through robot interactions generalize to real-world human interaction. A critical challenge is understanding how these skills persist over months or years and whether robot-assisted therapy can create sustainable improvements in communication and social engagement in children with autism. Ref. [132] suggested that longitudinal studies are necessary to track how children with autism generalize robot-learned skills to interactions with humans. Sandoval et al. [131] recommended comparing robot-based therapy outcomes with traditional behavioral interventions to assess sustained benefits.

4.2.4. Exploring Multimodal and Multi-Robot Therapy Approaches

Another emerging trend is the combination of sensory modalities (visual, auditory, and tactile) to enhance engagement. Several studies have suggested integrating multisensory experiences, such as touch, gestures, and vocal intonations, to create a more realistic and immersive therapeutic environment. Additionally, multi-robot therapy, in which multiple robots interact with a child simultaneously or in a group setting, is gaining interest. Some researchers have hypothesized that multi-robot systems may enhance peer interaction skills, as children with autism tend to respond differently to social stimuli when engaging with multiple social agents rather than just one. Saadatzi et al. [109] found that children with autism exhibit more attention and fewer stereotyped behaviors when interacting with robots that incorporate visual, auditory, and tactile cues. Ivanov et al. [133] examined object-based joint attention and suggested using robot-guided objects to reinforce learning. Jung et al. [134] proposed a robotic coaching model that combines motor, social, and cognitive training for children with ASD. Cao et al. [90] compared single- and multi-robot interventions, noting that multi-robot interactions encourage greater peer engagement.

4.2.5. Investigating Ethical, Psychological, and Safety Concerns

As robot-assisted therapy has become more established, researchers have raised concerns regarding its ethical implications, potential overreliance on robots, and emotional attachment. The development of social robots has prompted ethical considerations, particularly in the context of the European DREAM project [135], whose ethical evolution highlights the importance of weighing these implications when integrating robotics into autism therapy. Future research should address whether prolonged exposure to humanoid robots affects children's perceptions of human relationships. Additionally, safety protocols must be refined to prevent unintended consequences, such as robot-induced stress, dependency, or confusion in distinguishing between human and robotic interactions. Some studies advocate structured withdrawal protocols to ensure the balanced integration of robots into therapy without replacing human caregivers. Ref. [39] explored the role of soft social robots and the effect of tactile interaction on child-robot emotional bonds. Ref. [110] warned against over-dependence on AI-based interventions. Mengoni et al. [57] recommended structured withdrawal protocols to prevent excessive reliance on robots in ASD therapy. Researchers are working to develop more adaptable, ethical, and socially integrative robotic interventions. As research progresses, collaboration among roboticists, psychologists, and educators will be key to ensuring that robots enhance rather than replace human-led therapies.

4.2.6. Real-World Applications and Integration into Home and School Environments

A major limitation of many existing studies is that they were conducted in controlled laboratory or clinical settings. Future research should evaluate the feasibility of robot-assisted interventions in real-world environments, such as homes, schools, and community centers. This includes studying how therapists, teachers, and parents can be trained to incorporate robots effectively into daily therapeutic routines. Some researchers have proposed developing low-cost, scalable robotic therapy solutions that are widely accessible to families and educational institutions, thereby increasing adoption rates. Instead of substituting human-led interventions, future research should examine how robots can be integrated into current therapeutic frameworks [136]. To ensure that robots can support meaningful interactions and maintain interest over time, educators, therapists, and caregivers must be trained to use robotic technologies [59]. By addressing these crucial aspects, future studies can close the gap between experimental robotic interventions and practical real-world applications, eventually yielding scalable, accessible, and successful robot-assisted therapy options for children with autism. Whereas robotic surgery focuses on enhancing precision and minimally invasive procedures in clinical settings, the use of robots for treating autism emphasizes therapeutic interaction and behavioral development, illustrating how robotics can support both the physical and cognitive aspects of healthcare.

4.3. Contribution to the Field

This review makes a significant contribution to the growing field of socially assistive robotics by synthesizing two decades of empirical research on RAT for children with autism. It provides a comprehensive mapping of the types of robots employed, the targeted developmental domains, and the deployment strategies used in therapeutic settings. By organizing and comparing findings across a wide range of platforms and approaches, this study advances current knowledge by identifying common trends, strengths, and limitations in the existing literature. This review also highlights the therapeutic potential of socially assistive robots in enhancing engagement, improving targeted social behaviors, and supporting structured learning environments for children with autism. Reinforcing these contributions helps clarify the role of RAT as a complementary tool in clinical practice and provides an evidence-based foundation for informing future intervention design, policies, and interdisciplinary collaborations.

5. Conclusions

The integration of social robotics into autism treatment has produced promising advances in improving social skills, communication, and engagement in children with autism. This review highlights the diverse range of robotic interventions designed to address core social challenges in ASD therapy. Key findings from the literature suggest that robot-assisted interventions improve joint attention, emotion recognition, and verbal and nonverbal communication while also fostering more sustained engagement than traditional therapy methods. The structured and predictable nature of robots makes them particularly effective in reducing anxiety and encouraging social interaction among children with autism. Despite these advancements, several challenges remain. There is still a lack of research on the long-term effects of robot-assisted therapy and on the generalization of learned social skills to authentic environments. Concerns regarding accessibility, possible emotional attachment, and excessive dependence on robots are among the ethical issues that must be considered. To guarantee broad accessibility, future research should concentrate on long-term studies, integrate robotics into domestic and educational settings, and create affordable, scalable solutions. In summary, social robots are an effective tool for autism treatment that enhances, rather than replaces, human-led interactions. To provide comprehensive, adaptable, and socially integrative robotic solutions, cooperation between roboticists, psychologists, educators, and clinicians is essential as research progresses. With further research, social robots have the potential to transform ASD interventions, making therapy more individualized, engaging, and successful in improving the long-term social development of individuals with autism.

Author Contributions

Conceptualization, M.N.; methodology, M.N., D.D. and A.P.; software, M.N., J.M.H.B., D.D. and A.P.; validation, M.N., J.M.H.B., D.D. and A.P.; formal analysis, M.N., J.M.H.B., D.D. and A.P.; investigation, M.N., J.M.H.B., D.D. and A.P.; resources, M.N., J.M.H.B., D.D. and A.P.; data curation, M.N., J.M.H.B., D.D. and A.P.; writing, M.N., J.M.H.B., D.D. and A.P.; visualization, M.N., J.M.H.B., D.D. and A.P.; supervision, M.N. and J.M.H.B.; project administration, M.N. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Conflicts of Interest

The authors declare no conflicts of interest.

References

  1. Bespalova, I.N.; Reichert, J.; Buxbaum, J.D. Candidate susceptibility genes for Autism. In The Neurobiology of Autism; Johns Hopkins University Press: Baltimore, MD, USA, 2005; pp. 217–232. Available online: https://www.scopus.com/inward/record.uri?eid=2-s2.0-67749117036&partnerID=40&md5=9320a9fc7e5482611d3419f2773dd2d1 (accessed on 14 August 2015).
  2. Dawson, G.; Sterling, L. Autism Spectrum Disorders. Encycl. Infant Early Child. Dev. 2008, 1–3, 137–143. [Google Scholar] [CrossRef]
  3. Ošlejšková, H.; Pejčochová, J. Autisms. Ceska A Slov. Neurol. A Neurochir. 2010, 73, 627–641. [Google Scholar]
  4. Frith, U. Autism—Are we any closer to explaining the enigma? Psychologist 2014, 27, 744–745. [Google Scholar]
  5. Matas, C.G.; Aburaya, F.C.L.M.; Kamita, M.K.; de Souza, R.Y.C.K. Principal Findings of Auditory Evoked Potentials in Autism Spectrum Disorder. In Neurobiology of Autism Spectrum Disorders; Springer: Cham, Switzerland, 2024; pp. 333–347. [Google Scholar] [CrossRef]
  6. Zeidan, J.; Fombonne, E.; Scorah, J.; Ibrahim, A.; Durkin, M.S.; Saxena, S.; Yusuf, A.; Shih, A.; Elsabbagh, M. Global prevalence of autism: A systematic review update. Autism Res. 2022, 15, 778–790. [Google Scholar] [CrossRef]
  7. Robins, B.; Dautenhahn, K.; Boekhorst, R.T.; Billard, A. Robotic assistants in therapy and education of children with autism: Can a small humanoid robot help encourage social interaction skills? Univers. Access Inf. Soc. 2005, 4, 105–120. [Google Scholar] [CrossRef]
  8. American Psychiatric Association. Diagnostic and Statistical Manual of Mental Disorders, 5th ed.; American Psychiatric Publishing: Washington, DC, USA, 2024; Available online: https://www.psychiatry.org/getmedia/2ed086b0-ec88-42ec-aa0e-f442e4af74e6/APA-DSM5TR-Update-September-2024.pdf (accessed on 13 March 2015).
  9. Centers for Disease Control and Prevention. Data and Statistics on Autism Spectrum Disorder. 2025. Available online: https://www.cdc.gov/autism/data-research/index.html (accessed on 13 March 2015).
  10. Cakir, J.; Frye, R.E.; Walker, S.J. The lifetime social cost of autism: 1990–2029. Res. Autism Spectr. Disord. 2020, 72, 101502. [Google Scholar] [CrossRef]
  11. Jackson, S.L.; Volkmar, F.R.; Volkmar, F. Diagnosis and definition of autism and other pervasive developmental disorders. In Autism and Pervasive Developmental Disorders; Cambridge University Press: Cambridge, UK, 2019; pp. 1–24. [Google Scholar]
  12. Mercadante, M.T.; Van der Gaag, R.J.; Schwartzman, J.S. Non-Autistic Pervasive Developmental Disorders: Rett syndrome, disintegrative disorder and pervasive developmental disorder not otherwise specified. Braz. J. Psychiatry 2006, 28, s12–s20. [Google Scholar] [CrossRef] [PubMed]
  13. Mughal, S.; Faizy, R.M.; Saadabadi, A. Autism spectrum disorder (regressive autism, child disintegrative disorder). In StatPearls [Internet]; StatPearls Publishing: Treasure Island, FL, USA, 2020. [Google Scholar]
  14. Hassan, A.Z.; Zahed, B.T.; Zohora, F.T.; Moosa, J.M.; Salam, T.; Rahman, M.M.; Ferdous, H.S.; Ahmed, S.I. Developing the concept of money by interactive computer games for autistic children. In Proceedings of the 2011 IEEE International Symposium on Multimedia, ISM 2011, Dana Point, CA, USA, 5–7 December 2011; pp. 559–564. [Google Scholar] [CrossRef]
  15. Robel, L. Diagnosis and evaluation of autism and pervasive developmental disorders. Med. Ther. Pediatr. 2012, 15, 219–223. [Google Scholar] [CrossRef]
  16. Artigas-Pallarés, J.; Gabau-Vila, E.; Guitart-Feliubadaló, M. Syndromic autism: I. General aspects. Rev. Neurol. 2005, 40 (Suppl. S1), S143–S149. [Google Scholar] [CrossRef]
  17. Kim, J.I.; Yoo, H.J. Diagnosis and Assessment of Autism Spectrum Disorder in South Korea. J. Korean Acad. Child Adolesc. Psychiatry 2024, 35, 15–21. [Google Scholar] [CrossRef]
  18. Williams, D.L. Working memory and autism. In Working Memory and Clinical Developmental Disorders: Theories, Debates and Interventions; Routledge: Abingdon, UK, 2018; pp. 38–52. [Google Scholar] [CrossRef]
  19. Hodges, H.; Fealko, C.; Soares, N. Autism spectrum disorder: Definition, epidemiology, causes, and clinical evaluation. Transl Pediatr 2020, 9 (Suppl. S1), S55. [Google Scholar] [CrossRef] [PubMed]
  20. Rossignol, D.A. Novel and emerging treatments for autism spectrum disorders: A systematic review. Ann. Clin. Psychiatry 2009, 21, 213–236. [Google Scholar] [CrossRef] [PubMed]
  21. Klin, A.; Lin, D.J.; Gorrindo, P.; Ramsay, G.; Jones, W. Two-year-olds with autism orient to non-social contingencies rather than biological motion. Nature 2009, 459, 257–261. [Google Scholar] [CrossRef]
  22. Ozonoff, S. Reliability and validity of the Wisconsin Card Sorting Test in studies of autism. Neuropsychology 1995, 9, 491. [Google Scholar] [CrossRef]
  23. Robins, B.; Dautenhahn, K.; Dubowski, J. Robots as isolators or mediators for children with autism? A cautionary tale. In AISB’05 Convention: Social Intelligence and Interaction in Animals, Robots and Agents, Proceedings of the Symposium on Robot Companions: Hard Problems and Open Challenges in Robot-Human Interaction, Hatfield, UK, 14–15 April 2005; The Society for the Study of Artificial Intelligence and the Simulation of Behaviour (AISB): London, UK, 2005; pp. 82–88. Available online: https://www.scopus.com/inward/record.uri?eid=2-s2.0-67650488745&partnerID=40&md5=dc453db9d34d94f03b04619f07bd8dea (accessed on 20 August 2025).
  24. Emanuel, R.; Weir, S. Catalysing communication in an autistic child in a LOGO-like learning environment. In Proceedings of the 2nd Summer Conference on Artificial Intelligence and Simulation of Behaviour, Edinburgh, UK, 12–14 July 1976; pp. 118–129. [Google Scholar]
  25. Perillo, F.; Romano, M.; Vitiello, G. Social Robots Design to improve Social Skills in Autism Spectrum Disorder. In Proceedings of the International Workshop on Digital Innovations for Learning and Neurodevelopmental Disorders (DILeND 2024), CEUR Workshop Proceedings, Rome, Italy, 24–25 May 2024; pp. 58–64. [Google Scholar]
  26. Puglisi, A.; Caprì, T.; Pignolo, L.; Gismondo, S.; Chilà, P.; Minutoli, R.; Marino, F.; Failla, C.; Arnao, A.A.; Tartarisco, G.; et al. Social humanoid robots for children with autism spectrum disorders: A review of modalities, indications, and pitfalls. Children 2022, 9, 953. [Google Scholar] [CrossRef]
  27. Wagino, W.; Abidin, Z.; Beny, A.O.N.; Anggara, O.F.; Anggraeny, D.; Sari, D.E.; Pradana, H.D. A Robot for Children on the Autistic Spectrum. J. Eng. Sci. Technol. 2024, 19, 51–58. [Google Scholar]
  28. Cersosimo, R.; Pennazio, V. Promoting social communication skills in autism spectrum disorder through robotics and virtual worlds. In Intelligent Educational Robots: Toward Personalized Learning Environments; Walter de Gruyter GmbH & Co KG: Berlin, Germany, 2024; p. 95. [Google Scholar]
  29. Lee, J.; Takehashi, H.; Nagai, C.; Obinata, G.; Stefanov, D. Which robot features can stimulate better responses from children with autism in robot-assisted therapy? Int. J. Adv. Robot. Syst. 2012, 9, 72. [Google Scholar] [CrossRef]
  30. Narejo, I.H.; Matloob, M.J.; Attaullah, H.M.; Khawar, R. Designing and Evaluating a Mobile Social Robot as an Intervention Tool for Autism Spectrum Disorder. In Proceedings of the 2024 IEEE 1st Karachi Section Humanitarian Technology Conference (KHI-HTC), Tandojam, Pakistan, 8–9 January 2024; IEEE: New York, NY, USA, 2024; pp. 1–5. [Google Scholar]
  31. Diehl, J.J.; Schmitt, L.M.; Villano, M.; Crowell, C.R. The clinical use of robots for individuals with autism spectrum disorders: A critical review. Res. Autism Spectr. Disord. 2012, 6, 249–262. [Google Scholar] [CrossRef]
  32. Pinto-Bernal, M.J.; Cespedes, N.; Castro, P.; Munera, M.; Cifuentes, C.A. Physical human-robot interaction influence in ASD therapy through an affordable soft social robot. J. Intell. Robot. Syst. 2022, 105, 67. [Google Scholar] [CrossRef]
  33. Gómez-Espinosa, A.; Moreno, J.C.; la Cruz, S.P.-D. Assisted robots in therapies for children with autism in early childhood. Sensors 2024, 24, 1503. [Google Scholar] [CrossRef] [PubMed]
  34. Pagliara, S.M.; Bonavolonta, G.; Pia, M.; Falchi, S.; Zurru, A.L.; Fenu, G.; Mura, A. The Integration of Artificial Intelligence in Inclusive Education: A Scoping Review. Information 2024, 15, 774. [Google Scholar] [CrossRef]
  35. Mahdi, H.; Akgun, S.A.; Saleh, S.; Dautenhahn, K. A survey on the design and evolution of social robots—Past, present and future. Rob. Auton. Syst. 2022, 156, 104193. [Google Scholar] [CrossRef]
  36. Hanson, D.; Baurmann, S.; Riccio, T.; Margolin, R.; Dockins, T.; Tavares, M.; Carpenter, K. Zeno: A cognitive character. In Proceedings of the AI Magazine, and Special Proceeding of the AAAI National Conference, Chicago, IL, USA, 13–17 July 2008. [Google Scholar]
  37. Lecciso, F.; Levante, A.; Fabio, R.A.; Caprì, T.; Leo, M.; Carcagnì, P.; Distante, C.; Mazzeo, P.L.; Spagnolo, P.; Petrocchi, S. Emotional expression in children with ASD: A pre-study on a two-group pre-post-test design comparing robot-based and computer-based training. Front. Psychol. 2021, 12, 2826. [Google Scholar] [CrossRef] [PubMed]
  38. Salvador, M.; Marsh, A.S.; Gutierrez, A.; Mahoor, M.H. Development of an ABA autism intervention delivered by a humanoid robot. In Lecture Notes in Computer Science (Including Subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics); Springer: Cham, Switzerland, 2016; pp. 551–560. [Google Scholar] [CrossRef]
  39. Torres, N.A.; Clark, N.; Ranatunga, I.; Popa, D. Implementation of interactive arm playback behaviors of social robot Zeno for autism spectrum disorder therapy. In Proceedings of the 5th International Conference on PErvasive Technologies Related to Assistive Environments, Heraklion, Greece, 6–8 June 2012. [Google Scholar] [CrossRef]
  40. Watson, S.W. Socially Assisted Robotics as an Intervention for Children With Autism Spectrum Disorder. In Using Assistive Technology for Inclusive Learning in K-12 Classrooms; IGI Global: Hershey, PA, USA, 2023; pp. 24–41. [Google Scholar]
  41. Gkiolnta, E.; Zygopoulou, M.; Syriopoulou-Delli, C.K. Robot programming for a child with autism spectrum disorder: A pilot study. Int. J. Dev. Disabil. 2023, 69, 424–431. [Google Scholar] [CrossRef]
  42. Taheri, A.R.; Alemi, M.; Meghdari, A.; Pouretemad, H.R.; Basiri, N.M. Social robots as assistants for autism therapy in Iran: Research in progress. In Proceedings of the 2014 2nd RSI/ISM International Conference on Robotics and Mechatronics, ICRoM 2014, Tehran, Iran, 15–17 October 2014; pp. 760–766. [Google Scholar] [CrossRef]
  43. Zabidi, S.A.M.; Yusof, H.M.; Sidek, S.N. Platform to Improve Joint Attention. In Proceedings of the RiTA 2020: Proceedings of the 8th International Conference on Robot Intelligence Technology and Applications, Online, 11–13 December 2020; Springer Nature: Berlin, Germany, 2021; p. 214. [Google Scholar]
  44. Senland, A. Robots and autism spectrum disorder: Clinical and educational applications. In Innovative Technologies to Benefit Children on the Autism Spectrum; IGI Global Scientific Publishing: Palmdale, PA, USA, 2014; pp. 178–196. [Google Scholar] [CrossRef]
  45. Pandey, A.K.; Gelin, R. A mass-produced sociable humanoid robot: Pepper: The first machine of its kind. IEEE Robot. Autom. Mag. 2018, 25, 40–48. [Google Scholar] [CrossRef]
  46. Benedicto, G.; Val, M.; Fernández, E.; Ferrer, F.S.; Ferrández, J.M. Autism Spectrum Disorder (ASD): Emotional Intervention Protocol. In Lecture Notes in Computer Science (Including Subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics); Springer: Cham, Switzerland, 2022; pp. 310–322. [Google Scholar] [CrossRef]
  47. Gena, C.; Mattutino, C.; Brighenti, S.; Meirone, A.; Petriglia, F.; Mazzotta, L.; Liscio, F.; Nazzario, M.; Ricci, V.; Quarato, C.; et al. Sugar, Salt & Pepper-Humanoid robotics for autism. In Proceedings of the Joint Proceedings of the ACM IUI 2021 Workshops co-located with 26th ACM Conference on Intelligent User Interfaces (ACM IUI 2021), College Station, TX, USA, 13–17 April 2021; CEUR Workshop Proceedings; Glowacka, D., Krishnamurthy, V.R., Eds.; CEUR-WS.org: College Station, TX, USA, 2021; Volume 2903. Available online: https://www.scopus.com/inward/record.uri?eid=2-s2.0-85110500897&partnerID=40&md5=b56ca0cb99a88b3e7bc00e66cf882ccf (accessed on 21 August 2025).
  48. Benedicto, G.; Juan, C.G.; Fernández-Caballero, A.; Fernandez, E.; Ferrández, J.M. Unravelling the Robot Gestures Interpretation by Children with Autism Spectrum Disorder During Human-Robot Interaction. In Lecture Notes in Computer Science (Including Subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics); Springer: Cham, Switzerland, 2024; pp. 342–355. [Google Scholar] [CrossRef]
  49. Palestra, G.; De Carolis, B.; Esposito, F. Artificial Intelligence for Robot-Assisted Treatment of Autism. In Proceedings of the Workshop on Artificial Intelligence with Application in Health, Bari, Italy, 14 November 2017; pp. 17–24. [Google Scholar]
  50. Dautenhahn, K.; Nehaniv, C.L.; Walters, M.L.; Robins, B.; Kose-Bagci, H.; Mirza, N.A.; Blow, M. KASPAR—A minimally expressive humanoid robot for human–robot interaction research. Appl. Bionics Biomech. 2009, 6, 369–397. [Google Scholar] [CrossRef]
  51. Wood, L.J.; Zaraki, A.; Robins, B.; Dautenhahn, K. Developing kaspar: A humanoid robot for children with autism. Int. J. Soc. Robot. 2021, 13, 491–508. [Google Scholar] [CrossRef]
  52. Mengoni, S.E.; Irvine, K.; Thakur, D.; Barton, G.; Dautenhahn, K.; Guldberg, K.; Robins, B.; Wellsted, D.; Sharma, S. Feasibility study of a randomised controlled trial to investigate the effectiveness of using a humanoid robot to improve the social skills of children with autism spectrum disorder (Kaspar RCT): A study protocol. BMJ Open 2017, 7, e017376. [Google Scholar] [CrossRef]
  53. Marinoiu, E.; Zanfir, M.; Olaru, V.; Sminchisescu, C. 3D human sensing, action and emotion recognition in robot assisted therapy of children with autism. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Salt Lake City, UT, USA, 18–23 June 2018; pp. 2158–2167. [Google Scholar]
  54. Wainer, J.; Dautenhahn, K.; Robins, B.; Amirabdollahian, F. A pilot study with a novel setup for collaborative play of the humanoid robot KASPAR with children with autism. Int. J. Soc. Robot. 2014, 6, 45–65. [Google Scholar] [CrossRef]
  55. Lakatos, G.; Wood, L.J.; Syrdal, D.S.; Robins, B.; Zaraki, A.; Dautenhahn, K. Robot-mediated intervention can assist children with autism to develop visual perspective taking skills. Paladyn 2020, 12, 87–101. [Google Scholar] [CrossRef]
  56. Costa, S.; Lehmann, H.; Dautenhahn, K.; Robins, B.; Soares, F. Using a humanoid robot to elicit body awareness and appropriate physical interaction in children with autism. Int. J. Soc. Robot. 2015, 7, 265–278. [Google Scholar] [CrossRef]
  57. Shamsuddin, S.; Ismail, L.I.; Yussof, H.; Zahari, N.I.; Bahari, S.; Hashim, H.; Jaffar, A. Humanoid robot NAO: Review of control and motion exploration. In Proceedings of the 2011 IEEE International Conference on Control System, Computing and Engineering, Penang, Malaysia, 25–27 November 2011; IEEE: New York, NY, USA, 2011; pp. 511–516. [Google Scholar]
  58. Anzalone, S.M.; Tilmont, E.; Boucenna, S.; Xavier, J.; Jouen, A.L.; Bodeau, N.; Maharatna, K.; Chetouani, M.; Cohen, D.; MICHELANGELO Study Group. How children with autism spectrum disorder behave and explore the 4-dimensional (spatial 3D+ time) environment during a joint attention induction task with a robot. Res. Autism Spectr. Disord. 2014, 8, 814–826. [Google Scholar] [CrossRef]
  59. Lytridis, C.; Vrochidou, E.; Chatzistamatis, S.; Kaburlasos, V. Social Engagement Interaction Games Between Children with Autism and Humanoid Robot NAO. In Advances in Intelligent Systems and Computing, Proceedings of the International Conference on Soft Computing Models in Industrial and Environmental Applications, San Sebastian, Spain, 6–8 June 2018; Springer: Cham, Switzerland, 2019; pp. 562–570. [Google Scholar] [CrossRef]
  60. Malik, N.A.; Shamsuddin, S.; Yussof, H.; Miskam, M.A.; Hamid, A.C. Feasibility of using a humanoid robot to elicit communicational response in children with mild autism. In IOP Conference Series: Materials Science and Engineering, Proceedings of the 5th International Conference on Mechatronics (ICOM’13), Kuala Lumpur, Malaysia, 2–4 July 2013; IOP Publishing Ltd.: Bristol, UK, 2013. [Google Scholar] [CrossRef]
  61. Miskam, M.A.; Hamid, M.A.C.; Yussof, H.; Shamsuddin, S.; Malik, N.A.; Basir, S.N. Study on social interaction between children with autism and humanoid robot NAO. Appl. Mech. Mater. 2013, 393, 573–578. [Google Scholar] [CrossRef]
  62. Hamid, A.C.; Miskam, M.A.; Yussof, H.; Shamsuddin, S.; Hashim, H.; Ismail, L. Human-robot interaction (HRI) for children with autism to augment communication skills. Appl. Mech. Mater. 2013, 393, 598–603. [Google Scholar] [CrossRef]
  63. Costa, A.P.; Kirsten, L.; Charpiot, L.; Steffgen, G. Mental health benefits of a robot-mediated emotional ability training for children with autism: An exploratory study. In Proceedings of the Annual Meeting of the International Society for Autism Research (INSAR 2019), Montreal, QC, Canada, 1–4 May 2019. [Google Scholar]
  64. Costa, A.P.; Charpiot, L.; Lera, F.R.; Ziafati, P.; Nazarikhorram, A.; Van Der Torre, L.; Steffgen, G. More attention and less repetitive and stereotyped behaviors using a robot with children with autism. In Proceedings of the 2018 27th IEEE International Symposium on Robot and Human Interactive Communication (RO-MAN), Nanjing, China, 27–31 August 2018; IEEE: New York, NY, USA, 2018; pp. 534–539. [Google Scholar]
  65. Pioggia, G.; Ahluwalia, A.; Carpi, F.; Marchetti, A.; Ferro, M.; Rocchia, W.; Rossi, D.D. FACE: Facial automaton for conveying emotions. Appl. Bionics Biomech. 2004, 1, 91–100. [Google Scholar] [CrossRef]
  66. Ghiglino, D.; Chevalier, P.; Floris, F.; Priolo, T.; Wykowska, A. Follow the white robot: Efficacy of robot-assistive training for children with autism spectrum disorder. Res. Autism Spectr. Disord. 2021, 86, 101822. [Google Scholar] [CrossRef]
  67. Mazzei, D.; Billeci, L.; Armato, A.; Lazzeri, N.; Cisternino, A.; Pioggia, G.; Igliozzi, R.; Muratori, F.; Ahluwalia, A.; De Rossi, D. The FACE of autism. In Proceedings of the IEEE International Workshop on Robot and Human Interactive Communication, Viareggio, Italy, 13–15 September 2010; pp. 791–796. [Google Scholar] [CrossRef]
  68. Pioggia, G.; Igliozzi, R.; Sica, M.L.; Ferro, M.; Muratori, F.; Ahluwalia, A.; De Rossi, D. Exploring emotional and imitational android-based interactions in autistic spectrum disorders. J. CyberTherapy Rehabil. 2008, 1, 49–61. [Google Scholar]
  69. Pioggia, G.; Sica, M.L.; Ferro, M.; Igliozzi, R.; Muratori, F.; Ahluwalia, A.; De Rossi, D. Human-robot interaction in autism: FACE, an android-based social therapy. In Proceedings of the RO-MAN 2007-the 16th IEEE International Symposium on Robot and Human Interactive Communication, Jeju, Republic of Korea, 26–29 August 2007; IEEE: New York, NY, USA, 2007; pp. 605–612. [Google Scholar]
  70. Mazzei, D.; Lazzeri, N.; Billeci, L.; Igliozzi, R.; Mancini, A.; Ahluwalia, A.; Muratori, F.; De Rossi, D. Development and evaluation of a social robot platform for therapy in autism. In Proceedings of the Annual International Conference of the IEEE Engineering in Medicine and Biology Society, EMBS, Boston, MA, USA, 30 August–3 September 2011; pp. 4515–4518. [Google Scholar] [CrossRef]
  71. Yoshikawa, Y.; Kumazaki, H.; Matsumoto, Y.; Miyao, M.; Kikuchi, M.; Ishiguro, H. Relaxing gaze aversion of adolescents with autism spectrum disorder in consecutive conversations with human and android robot-a preliminary study. Front. Psychiatry 2019, 10, 370. [Google Scholar] [CrossRef]
72. Soleiman, P.; Moradi, H.; Mahmoudi, M.; Teymouri, M.; Pouretemad, H.R. The use of RoboParrot in the therapy of children with autism children: In case of teaching the turn-taking skills. In Proceedings of the 16th International Conference on Intelligent Virtual Agents, Los Angeles, CA, USA, 20–23 September 2016. [Google Scholar]
  73. Pop, C.A.; Simut, R.E.; Pintea, S.; Saldien, J.; Rusu, A.S.; Vanderfaeillie, J.; David, D.O.; Lefeber, D.; Vanderborght, B. Social robots vs. computer display: Does the way social stories are delivered make a difference for their effectiveness on ASD children? J. Educ. Comput. Res. 2013, 49, 381–401. [Google Scholar] [CrossRef]
  74. Ruiz, E.P.M.; Fernández, H.H.O.; Cena, C.E.G.; León, R.C. Design of JARI: A Robot to Enhance Social Interaction in Children with Autism Spectrum Disorder. Machines 2025, 13, 436. [Google Scholar] [CrossRef]
75. Pakkar, R.; Clabaugh, C.; Lee, R.; Deng, E.; Matarić, M.J. Designing a socially assistive robot for long-term in-home use for children with autism spectrum disorders. In Proceedings of the 2019 28th IEEE International Conference on Robot and Human Interactive Communication (RO-MAN), New Delhi, India, 14–18 October 2019; IEEE: New York, NY, USA, 2019; pp. 1–7. [Google Scholar]
  76. Van Breemen, A.J.N. Animation engine for believable interactive user-interface robots. In Proceedings of the 2004 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Sendai, Japan, 28 September–2 October 2004; pp. 2873–2878. Available online: https://www.scopus.com/inward/record.uri?eid=2-s2.0-14044268833&partnerID=40&md5=fdcf5d91794f372159f1ee9e25a2cc8e (accessed on 1 June 2025).
  77. Van Breemen, A.J.N. ICat: Experimenting with animabotics. In AISB’05 Convention: Social Intelligence and Interaction in Animals, Robots and Agents, Proceedings of Symposium on Robotics, Mechatronics and Animatronics in the Creative and Entertainment Industries and Arts, University of Hertfordshire, Hatfield, UK, 12–15 April 2005; AISB: Bristol, UK, 2005; pp. 27–32. Available online: https://www.scopus.com/inward/record.uri?eid=2-s2.0-37349106715&partnerID=40&md5=44486dfb04ab0dc9587cd5a9243fc8fd (accessed on 1 June 2025).
  78. Van Breemen, A.; Yan, X.; Meerbeek, B. iCat: An animated user-interface robot with personality. In Proceedings of the International Conference on Autonomous Agents and Multiagent Systems, Utrecht, The Netherlands, 25–29 July 2005; pp. 17–18. Available online: https://www.scopus.com/inward/record.uri?eid=2-s2.0-33644802349&partnerID=40&md5=0acb82815e3fd9cb348c947f912515f9 (accessed on 1 June 2025).
  79. Shibata, T.; Mitsui, T.; Wada, K.; Touda, A.; Kumasaka, T.; Tagami, K.; Tanie, K. Mental commit robot and its application to therapy of children. In Proceedings of the 2001 IEEE/ASME International Conference on Advanced Intelligent Mechatronics. Proceedings (Cat. No. 01TH8556), Como, Italy, 8–12 July 2001; IEEE: New York, NY, USA, 2001; pp. 1053–1058. [Google Scholar]
  80. Veronesi, C.; Trimarco, B.; Botticelli, N.; Armani, G.; Bentenuto, A.; Fioriello, F.; Picchiotti, G.; Sogos, C. Use of the PARO robot as a social mediator in a sample of children with neurodevelopmental disorders and typical development. Clin. Ter. 2023, 174, 132–138. [Google Scholar]
  81. Qidwai, U.; Shakir, M.; Connor, O.B. Robotic toys for autistic children: Innovative tools for teaching and treatment. In Proceedings of the 2013 7th IEEE GCC Conference and Exhibition, GCC 2013, Doha, Qatar, 17–20 November 2013; pp. 188–192. [Google Scholar] [CrossRef]
  82. Cao, H.L.; Esteban, P.G.; Bartlett, M.; Baxter, P.; Belpaeme, T.; Billing, E.; Cai, H.; Coeckelbergh, M.; Costescu, C.; David, D.; et al. Robot-enhanced therapy: Development and validation of supervised autonomous robotic system for autism spectrum disorders therapy. IEEE Robot. Autom. Mag. 2019, 26, 49–58. [Google Scholar] [CrossRef]
  83. Nikolopoulos, C.; Kuester, D.; Sheehan, M.; Ramteke, S.; Karmarkar, A.; Thota, S.; Kearney, J.; Boirum, C.; Bojedla, S.; Lee, A. Robotic agents used to help teach social skills to children with Autism: The third generation. In Proceedings of the IEEE International Workshop on Robot and Human Interactive Communication, Atlanta, GA, USA, 31 July–3 August 2011; pp. 253–258. [Google Scholar] [CrossRef]
  84. Sartorato, F.; Przybylowski, L.; Sarko, D.K. Improving therapeutic outcomes in autism spectrum disorders: Enhancing social communication and sensory processing through the use of interactive robots. J. Psychiatr. Res. 2017, 90, 1–11. [Google Scholar] [CrossRef] [PubMed]
  85. Bevill, R.; Park, C.H.; Kim, H.J.; Lee, J.W.; Rennie, A.; Jeon, M.; Howard, A.M. Interactive robotic framework for multi-sensory therapy for children with autism spectrum disorder. In Proceedings of the ACM/IEEE International Conference on Human-Robot Interaction, Christchurch, New Zealand, 7–10 March 2016; pp. 421–422. [Google Scholar] [CrossRef]
  86. Bevill, R.; Azzi, P.; Spadafora, M.; Park, C.H.; Kim, H.J.; Lee, J.; Raihan, K.; Jeon, M.; Howard, A.M. Multisensory robotic therapy to promote natural emotional interaction for children with ASD. In Proceedings of the ACM/IEEE International Conference on Human-Robot Interaction, Christchurch, New Zealand, 7–10 March 2016; p. 571. [Google Scholar] [CrossRef]
  87. Yin, X.; Hou, S.; Hu, H. Research on interactive design of social interaction training APP for children with autistic spectrum disorder (ASD) based on multi-modal interaction. E3S Web Conf. 2020, 179, 02044. [Google Scholar] [CrossRef]
  88. Lee, J.; Obinata, G.; Aoki, H. A pilot study of using touch sensing and robotic feedback for children with autism. In Proceedings of the ACM/IEEE International Conference on Human-Robot Interaction, Bielefeld, Germany, 3–6 March 2014; pp. 222–223. [Google Scholar] [CrossRef]
  89. Mavadati, S.M.; Feng, H.; Salvador, M.; Silver, S.; Gutierrez, A.; Mahoor, M.H. Robot-based therapeutic protocol for training children with Autism. In Proceedings of the 25th IEEE International Symposium on Robot and Human Interactive Communication, RO-MAN 2016, New York, NY, USA, 26–31 August 2016; pp. 855–860. [Google Scholar] [CrossRef]
  90. Cao, H.L.; Simut, R.E.; Desmet, N.; De Beir, A.; Van De Perre, G.; Vanderborght, B.; Vanderfaeillie, J. Robot-assisted joint attention: A comparative study between children with autism spectrum disorder and typically developing children in interaction with NAO. IEEE Access 2020, 8, 223325–223334. [Google Scholar] [CrossRef]
  91. Lohan, K.S.; Sheppard, E.; Little, G.; Rajendran, G. Toward improved child–robot interaction by understanding eye movements. IEEE Trans. Cogn. Dev. Syst. 2018, 10, 983–992. [Google Scholar] [CrossRef]
  92. Mishra, R. Towards adaptive and personalized robotic therapy for children with autism spectrum disorder. In Proceedings of the 2022 10th International Conference on Affective Computing and Intelligent Interaction Workshops and Demos (ACIIW); IEEE: New York, NY, USA, 2022; pp. 1–5. [Google Scholar]
  93. Arshad, N.I.; Hashim, A.S.; Ariffin, M.M.; Aszemi, N.M.; Low, H.M.; Norman, A.A. Robots as assistive technology tools to enhance cognitive abilities and foster valuable learning experiences among young children with autism spectrum disorder. IEEE Access 2020, 8, 116279–116291. [Google Scholar] [CrossRef]
  94. Kumazaki, H.; Warren, Z.; Swanson, A.; Yoshikawa, Y.; Matsumoto, Y.; Ishiguro, H.; Sarkar, N.; Minabe, Y.; Kikuchi, M. Impressions of humanness for android robot may represent an endophenotype for autism spectrum disorders. J. Autism Dev. Disord. 2018, 48, 632–634. [Google Scholar] [CrossRef]
  95. Holeva, V.; Nikopoulou, V.A.; Lytridis, C.; Bazinas, C.; Kechayas, P.; Sidiropoulos, G.; Papadopoulou, M.; Kerasidou, M.D.; Karatsioras, C.; Geronikola, N.; et al. Effectiveness of a Robot-Assisted Psychological Intervention for Children with Autism Spectrum Disorder. J. Autism Dev. Disord. 2022, 54, 577–593. [Google Scholar] [CrossRef]
  96. Kumazaki, H.; Muramatsu, T.; Yoshikawa, Y.; Matsumoto, Y.; Ishiguro, H.; Mimura, M. Android robot was beneficial for communication rehabilitation in a patient with schizophrenia comorbid with autism spectrum disorders. Schizophr. Res. 2023, 254, 116–117. [Google Scholar] [CrossRef]
  97. Mishra, R.; Welch, K.C. Towards Forecasting Engagement in Children with Autism Spectrum Disorder using Social Robots and Deep Learning. In Proceedings of the SoutheastCon 2023, Orlando, FL, USA, 1–16 April 2023; IEEE: New York, NY, USA, 2023; pp. 838–843. [Google Scholar]
  98. Santos, L.; Geminiani, A.; Schydlo, P.; Olivieri, I.; Santos-Victor, J.; Pedrocchi, A. Design of a robotic coach for motor, social and cognitive skills training toward applications with ASD children. IEEE Trans. Neural Syst. Rehabil. Eng. 2021, 29, 1223–1232. [Google Scholar] [CrossRef]
  99. Jyoti, V.; Gupta, S.; Lahiri, U. Understanding the role of objects in joint attention task framework for children with autism. IEEE Trans. Cogn. Dev. Syst. 2020, 13, 524–534. [Google Scholar] [CrossRef]
  100. Ali, S.; Mehmood, F.; Khan, M.J.; Ayaz, Y.; Asgher, U.; Sadia, H.; Edifor, E.; Nawaz, R. A preliminary study on effectiveness of a standardized multi-robot therapy for improvement in collaborative multi-human interaction of children with ASD. IEEE Access 2020, 8, 109466–109474. [Google Scholar] [CrossRef]
  101. Kumazaki, H.; Muramatsu, T.; Yoshikawa, Y.; Yoshimura, Y.; Ikeda, T.; Hasegawa, C.; Saito, D.N.; Shimaya, J.; Ishiguro, H.; Mimura, M.; et al. Brief report: A novel system to evaluate autism spectrum disorders using two humanoid robots. J. Autism Dev. Disord. 2019, 49, 1709–1716. [Google Scholar] [CrossRef]
  102. Fears, N.E.; Sherrod, G.M.; Blankenship, D.; Patterson, R.M.; Hynan, L.S.; Wijayasinghe, I.; Popa, D.O.; Bugnariu, N.L.; Miller, H.L. Motor differences in autism during a human-robot imitative gesturing task. Clin. Biomech. 2023, 106, 105987. [Google Scholar] [CrossRef]
  103. Amirova, A.; Rakhymbayeva, N.; Zhanatkyzy, A.; Telisheva, Z.; Sandygulova, A. Effects of parental involvement in robot-assisted autism therapy. J. Autism Dev. Disord. 2023, 53, 438–455. [Google Scholar] [CrossRef] [PubMed]
  104. Shamsuddin, S.; Yussof, H.; Ismail, L.I.; Mohamed, S.; Hanapiah, F.A.; Zahari, N.I. Initial response in HRI-a case study on evaluation of child with autism spectrum disorders interacting with a humanoid robot Nao. Procedia Eng. 2012, 41, 1448–1455. [Google Scholar] [CrossRef]
  105. Zheng, Z.; Zhao, H.; Swanson, A.R.; Weitlauf, A.S.; Warren, Z.E.; Sarkar, N. Design, development, and evaluation of a noninvasive autonomous robot-mediated joint attention intervention system for young children with ASD. IEEE Trans. Hum. Mach. Syst. 2017, 48, 125–135. [Google Scholar] [CrossRef]
106. Anhuaman, A.; Granados, C.; Meza, W.; Raez, R. Cogui: Interactive Social Robot for Autism Spectrum Disorder Children: A Wonderful Partner for ASD Children. In Proceedings of the Companion of the 2023 ACM/IEEE International Conference on Human-Robot Interaction, Stockholm, Sweden, 13–16 March 2023; pp. 861–864. [Google Scholar]
107. So, W.C.; Cheng, C.H.; Lam, W.Y.; Huang, Y.; Ng, K.C.; Tung, H.C.; Wong, W. A robot-based play-drama intervention may improve the joint attention and functional play behaviors of Chinese-speaking preschoolers with autism spectrum disorder: A pilot study. J. Autism Dev. Disord. 2020, 50, 467–481. [Google Scholar] [CrossRef]
  108. Bhargavi, Y.; Bini, D.; Prince, S. AI-based Emotion Therapy Bot for Children with Autism Spectrum Disorder (ASD). In Proceedings of the 2023 9th International Conference on Advanced Computing and Communication Systems (ICACCS), Coimbatore, India, 17–18 March 2023; IEEE: New York, NY, USA, 2023; pp. 1895–1899. [Google Scholar]
  109. Saadatzi, M.N.; Pennington, R.C.; Welch, K.C.; Graham, J.H. Small-group technology-assisted instruction: Virtual teacher and robot peer for individuals with autism spectrum disorder. J. Autism Dev. Disord. 2018, 48, 3816–3830. [Google Scholar] [CrossRef]
  110. Kim, E.S.; Berkovits, L.D.; Bernier, E.P.; Leyzberg, D.; Shic, F.; Paul, R.; Scassellati, B. Social robots as embedded reinforcers of social behavior in children with autism. J. Autism Dev. Disord. 2013, 43, 1038–1049. [Google Scholar] [CrossRef]
  111. Telisheva, Z.; Amirova, A.; Rakhymbayeva, N.; Zhanatkyzy, A.; Sandygulova, A. The Quantitative Case-by-Case Analyses of the Socio-Emotional Outcomes of Children with ASD in Robot-Assisted Autism Therapy. Multimodal Technol. Interact. 2022, 6, 46. [Google Scholar] [CrossRef]
  112. Melo, F.S.; Sardinha, A.; Belo, D.; Couto, M.; Faria, M.; Farias, A.; Gamboa, H.; Jesus, C.; Kinarullathil, M.; Lima, P.; et al. Project INSIDE: Towards autonomous semi-unstructured human–robot social interaction in autism therapy. Artif. Intell. Med. 2019, 96, 198–216. [Google Scholar] [CrossRef] [PubMed]
  113. Kostrubiec, V.; Kruck, J. Collaborative research project: Developing and testing a robot-assisted intervention for children with autism. Front. Robot. AI 2020, 7, 37. [Google Scholar] [CrossRef]
  114. Ali, S.; Mehmood, F.; Dancey, D.; Ayaz, Y.; Khan, M.J.; Naseer, N.; Amadeu, R.D.C.; Sadia, H.; Nawaz, R. An adaptive multi-robot therapy for improving joint attention and imitation of ASD children. IEEE Access 2019, 7, 81808–81825. [Google Scholar] [CrossRef]
  115. Marino, F.; Chilà, P.; Sfrazzetto, S.T.; Carrozza, C.; Crimi, I.; Failla, C.; Busà, M.; Bernava, G.; Tartarisco, G.; Vagni, D.; et al. Outcomes of a robot-assisted social-emotional understanding intervention for young children with autism spectrum disorders. J. Autism Dev. Disord. 2020, 50, 1973–1987. [Google Scholar] [CrossRef] [PubMed]
  116. Srinivasan, S.M.; Eigsti, I.-M.; Neelly, L.; Bhat, A.N. The effects of embodied rhythm and robotic interventions on the spontaneous and responsive social attention patterns of children with autism spectrum disorder (ASD): A pilot randomized controlled trial. Res. Autism Spectr. Disord. 2016, 27, 54–72. [Google Scholar] [CrossRef]
  117. Srinivasan, S.M.; Eigsti, I.-M.; Gifford, T.; Bhat, A.N. The effects of embodied rhythm and robotic interventions on the spontaneous and responsive verbal communication skills of children with Autism Spectrum Disorder (ASD): A further outcome of a pilot randomized controlled trial. Res. Autism Spectr. Disord. 2016, 27, 73–87. [Google Scholar] [CrossRef]
  118. Mehmood, F.; Mahzoon, H.; Yoshikawa, Y.; Ishiguro, H.; Sadia, H.; Ali, S.; Ayaz, Y. Attentional behavior of children with ASD in response to robotic agents. IEEE Access 2021, 9, 31946–31955. [Google Scholar] [CrossRef]
119. Kobayashi, T.; Toyama, T.; Shafait, F.; Iwamura, M.; Kise, K.; Dengel, A. Recognizing words in scenes with a head-mounted eye-tracker. In Proceedings of the 2012 10th IAPR International Workshop on Document Analysis Systems, Gold Coast, Australia, 27–29 March 2012; IEEE: New York, NY, USA, 2012; pp. 333–338. [Google Scholar]
  120. Chadsey, J. ‘Promoting Social Communication: Children With Developmental Disabilities From Birth to Adolescence,’ by Howard Goldstein, Louise A. Kaczmarek, and Kristina M. English (Book Review). Ment. Retard. 2003, 41, 301. [Google Scholar] [CrossRef]
  121. Stanton-Chapman, T.L.; Snell, M.E. Promoting turn-taking skills in preschool children with disabilities: The effects of a peer-based social communication intervention. Early Child. Res. Q. 2011, 26, 303–319. [Google Scholar] [CrossRef]
  122. Broz, F.; Nehaniv, C.L.; Kose, H.; Dautenhahn, K. Interaction histories and short-term memory: Enactive development of turn-taking behaviours in a childlike humanoid robot. Philosophies 2019, 4, 26. [Google Scholar] [CrossRef]
  123. Leekam, S. Social cognitive impairment and autism: What are we trying to explain? Philos. Trans. R. Soc. B Biol. Sci. 2016, 371, 20150082. [Google Scholar] [CrossRef] [PubMed]
  124. Pennisi, P.; Tonacci, A.; Tartarisco, G.; Billeci, L.; Ruta, L.; Gangemi, S.; Pioggia, G. Autism and social robotics: A systematic review. Autism Res. 2016, 9, 165–183. [Google Scholar] [CrossRef]
  125. Bird, G.; Leighton, J.; Press, C.; Heyes, C. Intact automatic imitation of human and robot actions in autism spectrum disorders. Proc. R. Soc. B Biol. Sci. 2007, 274, 3027–3031. [Google Scholar] [CrossRef]
  126. Fuentes-Alvarez, R.; Morfin-Santana, A.; Ibañez, K.; Chairez, I.; Salazar, S. Energetic optimization of an autonomous mobile socially assistive robot for autism spectrum disorder. Front. Robot. AI 2023, 9, 372. [Google Scholar] [CrossRef] [PubMed]
  127. Mottron, L.; Bzdok, D. Autism spectrum heterogeneity: Fact or artifact? Mol. Psychiatry 2020, 25, 3178–3185. [Google Scholar] [CrossRef]
128. Kozima, H.; Nakagawa, C. Interactive robots as facilitators of children's social development. In Mobile Robots: Towards New Applications; I-Tech Education and Publishing: Vienna, Austria, 2006. [Google Scholar]
  129. Wang, C.-P. Training children with autism spectrum disorder, and children in general with AI robots related to the automatic organization of sentence menus and interaction design evaluation. Expert Syst. Appl. 2023, 229, 120527. [Google Scholar] [CrossRef]
  130. Hume, K.; Steinbrenner, J.R.; Odom, S.L.; Morin, K.L.; Nowell, S.W.; Tomaszewski, B.; Szendrey, S.; McIntyre, N.S.; Yücesoy-Özkan, S.; Savage, M.N. Evidence-based practices for children, youth, and young adults with autism: Third generation review. J. Autism Dev. Disord. 2021, 51, 4013–4032. [Google Scholar] [CrossRef]
  131. Sandoval, J.; Gutiérrez, B.M.A.; Jiménez-Pérez, L.; Robles, S.R. Integration of school technologies for language learning for students with autism spectrum disorder: A systematic review. Int. J. Technol. Knowl. Soc. 2024, 20, 23. [Google Scholar] [CrossRef]
132. Lorenzo, G.; Lledó, A.; Pérez-Vázquez, E.; Lorenzo-Lledó, A. Action protocol for the use of robotics in students with Autism Spectrum Disorders: A systematic review. Educ. Inf. Technol. 2021, 26, 4111–4126. [Google Scholar] [CrossRef]
  133. Ivanova, G.; Ivanov, A.; Kirilova, M. Design and Development of a 3D Social Mobile Robot with Artificial Intelligence for Teaching with Assistive Technology. In Proceedings of the 2025 24th International Symposium INFOTEH-JAHORINA (INFOTEH), East Sarajevo, Bosnia and Herzegovina, 19–21 March 2025; IEEE: New York, NY, USA, 2025; pp. 1–6. [Google Scholar]
  134. Jung, Y.; Jung, G.; Jeong, S.; Kim, C.; Woo, W.; Hong, H.; Lee, U. ‘Enjoy, but Moderately!’: Designing a Social Companion Robot for Social Engagement and Behavior Moderation in Solitary Drinking Context. Proc. ACM Hum Comput. Interact. 2023, 7, 1–24. [Google Scholar] [CrossRef]
  135. Richardson, K.; Coeckelbergh, M.; Wakunuma, K.; Billing, E.; Ziemke, T.; Gomez, P.; Vanderborght, B.; Belpaeme, T. Robot enhanced therapy for children with autism (DREAM): A social model of autism. IEEE Technol. Soc. Mag. 2018, 37, 30–39. [Google Scholar] [CrossRef]
  136. da Silva, N.I.; Lebrun, I. Autism spectrum disorder: History, concept and future perspective. J. Neurol. Res. Rev. Rep. 2023, 173, 5–8. [Google Scholar] [CrossRef]
Figure 1. PRISMA flowchart.
Figure 2. Year-wise distribution of publications on social robots for autism therapy.
Figure 3. Word cloud for keywords.
Figure 4. Distribution of articles based on the participants.
Figure 5. Distribution of robots used in the studies included in this review.
Figure 6. Distribution of articles based on robot shapes used in autism-related research.
Table 1. Comparison of physical attributes of social robots.
Robot | Height | Weight | Processor | DoF | Sensors
Pepper | 120 cm | 28 kg | Intel Atom E3845 | 19 | 2 HD cameras, 3D sensors, 4 microphones, 2 touch sensors, 2 sonars, 6 lasers, 3 bumper sensors, and gyroscope
NAO | 58 cm | 5.48 kg | Intel Atom 1.6 GHz | 25 | 2 cameras, 4 microphones, 4 sonars, gyroscope, touch sensors, IMU
Jibo | 28 cm | 3 kg | ARM Cortex-A8 | 3 | Touch sensors, cameras, microphones
Paro | 57 cm | 2.7 kg | 32-bit RISC | 7 | Light, sound, touch, temperature sensors, microphone arrays
Kuri | 50 cm | 6.3 kg | ARM-based processor | 4 | HD camera, microphones, speakers
Temi | 100 cm | 11 kg | ARM-based processor | 3 | Cameras, depth sensors, lidar, microphones, touch sensors
QTrobot | 60 cm | 5 kg | Intel NUC i5/i7 PC | 4 | Cameras, microphones
Zeno | 63.5 cm | 6.5 kg | Dual-core 1.5 GHz ARM Cortex-A | 36 | Cameras, 3 microphones, 2 IR, 2 bumper, gyroscope, accelerometer
Furhat | 41 cm | 3.5 kg | Intel Core i5-7260U | 3 | 1080p RGB camera, stereo microphones, RFID reader
Sophia | 167 cm | 20 kg | Intel i7 3 GHz with GPU | 83 | 3 cameras, force sensors, touch sensors, microphone, speaker, IMU
Milo | 60 cm | 5 kg | VIA Mini-ITX, 600 MHz | 13 | Facial recognition, microphones, touch sensors
RoboParrot | 50 cm | 1.5 kg | ATmega16 | – | Cameras, microphones, touch sensors
Probo | 66 cm | 5.7 kg | Intel Core i3/i5 | 20 | Cameras, microphones, touch sensors
Zoomer | 26 cm | 0.8 kg | Microcontroller | – | Microphone, speaker
JARI | 60 cm | 15 kg | Raspberry Pi 3 B+ and Arduino Nano | 3 | Camera, microphone, ultrasonic sensor
Charli | 141 cm | 12.1 kg | Intel Atom 1.6 GHz | 25 | Camera, gyroscope, accelerometer, 2 microphones
LEGO Mindstorms NXT | – | 2.1 kg | 32-bit microprocessor | 3 | 2 touch sensors, 1 ultrasonic sensor, 1 color/light sensor
Keepon | 27.5 cm | 1.5 kg | Microcontrollers | 4 | Cameras, 2 microphones, array of touch sensors
Pleo | 17.8 cm | 1.6 kg | 7 CPUs | 15 | Color camera, IR sensor, temperature sensor, RFID reader, 2 microphones, foot, orientation and touch sensors
Paro | 16 cm | 2.7 kg | 32-bit RISC processors | 7 | Light sensor, temperature sensor, tactile sensors, microphone array
Aibo | 29.3 cm | 2.2 kg | Qualcomm Snapdragon 820, 64-bit quad-core | 22 | 2 cameras, time-of-flight sensors, 2 IR sensors, 4 microphones, capacitive touch sensors, 2 motion and 4 paw contact sensors
KASPAR | 55 cm | 15 kg | Onboard mini-PC | 17 | Cameras in eyes, force-sensing resistor or capacitive touch sensors
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.

Share and Cite

MDPI and ACS Style

Nadeem, M.; Barakat, J.M.H.; Daas, D.; Potams, A. A Review of Socially Assistive Robotics in Supporting Children with Autism Spectrum Disorder. Multimodal Technol. Interact. 2025, 9, 98. https://doi.org/10.3390/mti9090098
