Sensors
  • Review
  • Open Access

4 November 2019

Sensor-Based Technology for Social Information Processing in Autism: A Review

1 Early Support and Counselling Center Jena, Herbert Feuchte Stiftungsverbund, 07743 Jena, Germany
2 Social Potential in Autism Research Unit, Friedrich Schiller University, 07743 Jena, Germany
3 Department of General Psychology and Cognitive Neuroscience, Friedrich Schiller University Jena, Am Steiger 3/Haus 1, 07743 Jena, Germany
4 Michael Stifel Center Jena for Data-Driven and Simulation Science, Friedrich Schiller University, 07743 Jena, Germany
This article belongs to the Special Issue Biometric Systems

Abstract

The prevalence of autism spectrum disorders (ASD) has increased strongly over the past decades, and so has the demand for adequate behavioral assessment and support for persons affected by ASD. Here we provide a review of original research that used sensor technology for an objective assessment of social behavior, either to assist the assessment of autism or to support intervention for people with autism. Considering rapid technological progress, we focus (1) on studies published within the last 10 years (2009–2019), (2) on contact- and irritation-free sensor technology that does not constrain natural movement and interaction, and (3) on sensory input from the face, the voice, or body movements. We conclude that sensor technology has already demonstrated great potential for improving both behavioral assessment and interventions in autism spectrum disorders. We also discuss selected examples of recent theoretical questions related to the understanding of psychological changes and potentials in autism, and argue that, beyond its applied potential, sensor technology implemented by appropriate interdisciplinary teams may even contribute to such theoretical issues in understanding autism.

1. Introduction

Throughout the last decades, the number of people diagnosed with an Autism Spectrum Disorder (ASD) has increased dramatically [1,2], and so has the need for high-quality diagnostic protocols and therapies. With the ongoing progress in computer science and hardware, many creative ideas have emerged on how to use sensor data to identify and observe autistic markers, support diagnostic procedures, and enhance specific therapies to improve individuals’ outcomes.
ASD is a behaviorally defined group of neurodevelopmental disorders specified by impaired reciprocal social communication and restricted, repetitive patterns of behavior or activities (DSM-5) [3]. The symptoms are usually apparent from early childhood and tend to persist throughout life [4]. Common social impairments include a lack of social attention, evident in abnormal eye gaze or eye contact [5], and reduced social reciprocity, such as diminished sharing of emotions in facial [6] or vocal behavior [7]. Furthermore, only a minority of affected people report having mutual friendships [8]. Among the restricted and repetitive behaviors, stereotyped motor movements and speech are stand-out features in many people with ASD [9]. Other symptoms include insistence on sameness and routines [10], special interests, and hyper- or hyporeactivity to sensory input from various modalities [11]. The exact profile and severity of symptoms in people with ASD, as well as their personal strengths and coping capabilities, vary to a great degree, and so does their need for support.
Reasons for the increased prevalence over the last decades include a more formalized diagnostic approach and heightened awareness. The current ‘gold standard’ for a diagnosis of ASD consists of an assessment of current behavior, a biographical anamnesis, and a parental report, all collected and evaluated by a trained multi-professional team [12]. Although screening and diagnostic methods for ASD have improved in recent years, many affected people, especially women [13] and high-functioning individuals, still receive a late diagnosis. Since early interventions have been shown to be most effective for improving adaptive behavior as well as IQ and language skills [14], there is continued demand for methods promoting early assessment in order to avoid follow-up problems. In this context, progress in automatic and sensor-assisted identification of ASD-specific behavioral patterns could make an important contribution to an earlier and less biased diagnosis.
Even beyond assessment, advances in digital technology are highly relevant for autism, and in more than one way. First, there is some evidence that many autistic people tend to interact with technology and may even prefer such interactions to interactions with humans. Autistic traits are thought to be related to systemizing, the drive to analyze how systems work, as well as to predict, control, and construct systems [15]. In this context, high information technology (IT) employment rates are often used as a proxy for higher rates of strong systemizers in a population. Intriguingly, recent research from the Netherlands reported that the prevalence of childhood autism diagnoses, but not of two control neurodevelopmental diagnoses (i.e., ADHD and dyspraxia), was substantially higher in Eindhoven, a classic IT region, when compared to two control regions (Utrecht and Haarlem) that had been selected for high demographic and socio-economic similarity in criteria other than the proportion of IT-related jobs [16]. Second, and qualifying any simplistic interpretation of this correlative (but not necessarily causal) relationship, there is evidence that technology can provide powerful social support for children with autism. For instance, children with ASD often perform better with a social robot than with a human partner (e.g., in terms of enhanced levels of social behavior towards robots), tend to perceive interactions with robots as positive [17], and subsequently show reduced levels of repetitive or stereotyped actions; for a recent review, see Pennisi et al. [18].
Scientific interest in the utilization of sensor technology to gain an understanding of people with ASD has increased considerably in the recent past. Some fields of research focus on different neurobiological assessments and try to identify autism-specific signals or ‘biomarkers’ to better understand the neurobiological underpinnings of the disorder. Good overviews covering methods including electroencephalography (EEG), magnetoencephalography (MEG), and functional magnetic resonance imaging (fMRI) are provided by Billeci et al. [19] and by Marco et al. [20]. Other research has focused on autonomic activity, such as heart rate variability (HRV) or skin conductance responses (SCR), which can be studied with wearable devices, typically in the context of emotional monitoring in ASD; see the review by Taj-Eldin et al. [21]. Applications in virtual reality (VR) environments [22,23,24] have also been reviewed as promising methods to train and practice social skills.
The aim of this review is to provide an overview of the current state of research using sensor-based technology in the context of ASD. We focus on sensor technology that is applicable without constraining natural movement, and on sensory input from the face, the voice, or body movements. Accordingly, this review does not consider evidence from wearable technology, VR, or psychophysiological and neurophysiological recordings. Note also that while we provide details of our procedure for identifying relevant original findings to enhance reproducibility, this paper represents a thematic review in which we pre-selected for contents as described below, and in which we occasionally discuss additional relevant findings that were not formally identified by this literature search. For instance, we may refer to some key findings regarding psychological theories of human social or emotional communication where relevant, even when the findings were not obtained with individuals with autism.

4. Supporting Interventions

While accurate early diagnosis and an assessment of specific impairments are crucial, they are also prerequisites that inform environmental adjustment, intervention, and training approaches, which can ultimately be valuable for the individual person with ASD following diagnosis. Technology-assisted training often has the benefit of being readily available, problem-specific, cost-effective, and widely accepted by affected children. Additionally, smart responses of training systems that give reliable, immediate feedback and appraisal can be highly beneficial for fast learning. Note that although we identified many publications on interventions aiming at autism as a target condition, many of these reported conceptual or technological contributions, and few presented original data from people with ASD that qualified them for inclusion in this review (cf. Section 2). The original articles on sensor-based supporting interventions for ASD presented in this review are listed in Table 2.
Table 2. Original Articles on Sensor-Based Supporting Interventions in ASD.

4.1. Emotion Expression and Recognition

Emotion expression, especially from the face, is considered highly relevant in autism research. The game FaceMaze [68] was specifically developed to improve facial expression production by combining gameplay with automatic online recognition of the user’s expressions. In this game, children played a maze game while posing ‘happy’ or ‘angry’ facial expressions to overcome obstacles. In a pre-post rating by naive human raters, quality ratings for both trained expressions (happy and angry) in ASD children (N = 17, aged 6–18) increased in the post-test, while ratings for an untrained emotion (surprise) did not change. Another, smaller study created a robot-child interaction and tested it with three children with ASD who were asked to imitate a robot’s facial expression [69]. The robot correctly recognized the children’s imitated expressions through an embedded camera in half of the cases and was then able to give immediate positive feedback. It also correctly withheld a response in about one-third of the trials in which participants did not imitate. Piana et al. [70] designed a serious game with online 3D data acquisition that trained children with ASD (N = 10) over several sessions to recognize and express emotional body movements. Both emotion expression (mean accuracy gain = 21%) and recognition (mean accuracy gain = 28%) improved throughout the sessions. Interestingly, performance in the T.E.C. (Test of Emotion Comprehension, assessing emotional understanding more generally) increased as well (mean gain = 14%).
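Systems like FaceMaze rely on classifying a user’s posed expression frame-by-frame from automatically extracted facial features. As a minimal illustration (not the original system’s method), the sketch below scores hypothetical facial action-unit (AU) intensities against AU sets commonly associated with happiness and anger; the AU choices, weights, and threshold are illustrative assumptions.

```python
# Minimal sketch of rule-based expression scoring from facial action units
# (AUs), as used conceptually by online expression-recognition games.
# AU names and the 0.5 threshold are illustrative assumptions.

HAPPY_AUS = ("AU6", "AU12")          # cheek raiser, lip corner puller
ANGRY_AUS = ("AU4", "AU5", "AU23")   # brow lowerer, upper lid raiser, lip tightener

def expression_score(frame_aus: dict, target_aus) -> float:
    """Mean intensity (0-1 scale) of the AUs linked to the target expression."""
    return sum(frame_aus.get(au, 0.0) for au in target_aus) / len(target_aus)

def classify_expression(frame_aus: dict, threshold: float = 0.5) -> str:
    """Label a frame 'happy', 'angry', or 'neutral' by the stronger AU score."""
    happy = expression_score(frame_aus, HAPPY_AUS)
    angry = expression_score(frame_aus, ANGRY_AUS)
    best, label = max((happy, "happy"), (angry, "angry"))
    return label if best >= threshold else "neutral"

frame = {"AU6": 0.8, "AU12": 0.9, "AU4": 0.1}
print(classify_expression(frame))  # a strongly smiling frame -> happy
```

In a game loop, such a per-frame label would be smoothed over a short window before triggering feedback, so that single misdetected frames do not unlock an obstacle.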

4.2. Social Skills

Robins et al. [71] created an interactive robot (KASPAR) with force-sensitive resistor sensors. They later planned to use KASPAR for robot-assisted play to teach touch, joint attention, and body awareness [72], although conclusive experimental data on interactions between individuals with autism and KASPAR may still be in the pipeline. Learning social skills also presupposes attention to potential social cues and social engagement. Costa et al. [73] reported preliminary research using LEGO Mindstorms robots with adolescents with ASD (N = 2) in an attempt to increase openness and induce communication, since the participants actively had to provide verbal commands or instructional acts. They reported that the two participants behaved differently, one being indifferent and one becoming increasingly interested in the interaction. Wong and Zhong [74] used a robotic platform (a polar bear) to teach social skills to children with ASD (N = 8). They found that within five sessions, increases in turn-taking, joint attention, and eye contact were observable, resulting in an overall 90% achievement of individually defined goals.
Greeting is a basic element of communication. In a greeting game with 3D body movement and voice acquisition [75], a participant would play an avatar with their own face, learning to greet (vocalization, eye contact, and waving) and receiving immediate appraisal upon success. A single case study suggested that this intervention can be effective at teaching greeting behavior. As a more complex pilot intervention, Mower et al. [76] created the embodied conversational agent ‘Rachel’, which acted as an emotional coach guiding children through emotional problem-solving tasks. Audio and video data from their two participants with ASD were acquired for post hoc analysis and tentatively suggested that the interface could elicit interactive behavior.
Overall, there is some evidence that sensor technology can improve social skills in people with autism, and the use of sophisticated robotic platforms appears particularly promising. As a limitation, it must be noted that all studies meeting the criteria for inclusion in this review tested only very few participants, and real-world follow-up tests were typically not reported. As a result, a systematic quantitative assessment of treatment effects and effect sizes, as well as a comparison with more conventional interventions (e.g., social competence training), will require substantial cross-disciplinary research. Moreover, most studies were driven by a combination of theoretically interesting and technically advanced approaches, and designed from the perspective of typical development. Designing more user-centered and irritation-free approaches could promote both usability and motivation for people with autism to engage in technology-driven interventions.

5. Monitoring

Monitoring a child’s emotional state or behavioral changes can be crucial for the outcomes of a learning environment. As discussed above, emotional expressions from people with ASD may differ in several respects from those of TD people. As a result, there is a higher risk that caregivers or interaction partners overlook or misinterpret the emotional state of people with autism.
Del Coco et al. [77] created a humanoid- and tablet-assisted therapy setup that was trained to monitor behavioral change in children with ASD via a video processing module. Besides creating a visual output of behavioral cues, they computed a score for affective engagement (happiness-related features) from visual cues such as facial action units (AUs), head pose, and gaze, providing the practitioner with a behavioral trend across the course of treatment. Dawood et al. [78] used facial expressions, eye gaze, and head movements to identify five discrete emotional states (e.g., anxiety, engagement, uncertainty) of young adults with ASD in learning situations. Their resulting model yielded high validity in identifying emotional states of participants with high-functioning ASD. At the same time, lower validity was found for TD participants, suggesting differential facial expressions of certain emotional states in ASD. For monitoring social interactions, Winoto et al. [79] created a machine-learning-based social interaction coding of 3D data around a target user. Kolakowska et al. [80] approached automatic progress recognition with different tablet games. Over a 6-month window, they were able to identify movement patterns in their study group of children with ASD (N = 40) that related not only to the development of fine motor skills but also to other domains such as communication and socio-emotional skills. Overall, these initial studies suggest that sensor-based monitoring of emotional and behavioral changes may support caregivers in optimizing learning outcomes.
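The engagement-monitoring idea described above, condensing happiness-related visual cues into one per-session score and tracking its trend across treatment, can be sketched as follows. The cue names, weights, and trend estimator are illustrative assumptions, not the published pipeline of Del Coco et al. [77].

```python
# Hedged sketch: aggregate per-session visual cues (all normalized to 0-1)
# into an engagement score, then fit a least-squares slope across sessions.
# Cue names and weights are hypothetical.

CUE_WEIGHTS = {"smile_au12": 0.5, "gaze_on_task": 0.3, "head_toward_partner": 0.2}

def session_engagement(cues: dict) -> float:
    """Weighted sum of normalized cue values for one session."""
    return sum(CUE_WEIGHTS[k] * cues.get(k, 0.0) for k in CUE_WEIGHTS)

def engagement_trend(sessions: list) -> float:
    """Slope of a least-squares line over session scores:
    a positive slope suggests increasing engagement across treatment."""
    ys = [session_engagement(s) for s in sessions]
    n = len(ys)
    mx = (n - 1) / 2                    # mean of session indices 0..n-1
    my = sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(range(n), ys))
    var = sum((x - mx) ** 2 for x in range(n))
    return cov / var
```

A practitioner-facing dashboard would plot both the raw per-session scores and this trend line, so that single noisy sessions do not mask the overall direction.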

6. Discussion

The studies discussed above demonstrate substantial research activities towards using sensor-based technology in the context of autism overall, with attention to multiple aspects including diagnosis/classification and intervention. At the same time, it appears that much current research is largely driven by fast technological progress in terms of innovative engineering and data analysis methods. It remains a significant challenge to reconcile these developments with the specific testing of psychological or neuroscientific theories regarding functional changes and potentials in autism. Similarly, systematic studies with theory-driven protocols and larger samples are required to evaluate in more detail both the diagnostic and interventional potential of sensor-based technology. For the ultimate goal of evaluating its practical relevance, quantitative assessments of diagnostic sensitivities and specificities, or of treatment effect sizes, will be as important as will be comparative studies with more traditional approaches to diagnosis and intervention.
One of many examples of how sensor technology can go beyond application and contribute to current neurocognitive theories of communication relates to the theorized tight link between perception and motor action in communication. This link has now been firmly established in speech communication [81], but there are reasons to believe that perception and action are also closely linked in nonverbal emotional and social communication. For instance, listening to laughter normally activates premotor and primary motor cortex [82], and may in parallel involuntarily elicit orofacial responses in a perceiver. In turn, there is also initial evidence that voluntary motor imitation can actually facilitate facial emotion recognition, particularly in people with high levels of autistic traits [83], who are thought to engage less in spontaneous imitation. A consistent theoretical account for such findings is that imitation, and covert sensorimotor simulation of others’ actions, may be based in part on the so-called mirror neuron system. This system consists of neurons that fire not only when a person performs an action, but also when he or she observes the same action in another individual. The human mirror neuron system is thought to be specifically impaired in autism [63], and a subset of promising intervention approaches for autism using neurofeedback [84] are based on this theory, although it should be noted that the underlying theory remains disputed [85].
Findings such as those by Lewis and Dunn [83] may be taken to suggest that interventions promoting facial imitation of emotions in autistic people should also support their abilities for emotion recognition and bidirectional communication. However, it is technically challenging to objectively quantify the degree of facial imitation; indeed, a limitation of the study by Lewis and Dunn was that the authors did not quantify imitation beyond asking participants to rate their own degree of imitation. Other studies measured facial imitation more objectively, typically by measuring the facial muscle response for selected target action units with electromyography (EMG, e.g., [86,87]). Although this provides an objective measure of facial imitation, attaching recording electrodes to facial muscles has several drawbacks; for instance, it could draw participants’ attention to their own facial behavior, which in turn could influence facial action. We believe that contact- and irritation-free assessment of imitation, as provided by modern sensor and real-time facial emotion recognition technologies, is the method of choice for better understanding not only the role of spontaneous facial imitation in emotion recognition in normal communication, but also the potential role of impaired links between perception and action in the communication difficulties of people with autism.
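As an illustration of how contact-free imitation could be quantified, one simple approach is to correlate a model’s and a participant’s AU intensity time series while allowing a short response lag, since imitation follows the observed expression with a delay. The sketch below assumes per-frame AU intensities from a video-based tracker; the lag window and correlation measure are illustrative choices, not an established protocol.

```python
# Hedged sketch: quantifying facial imitation without electrodes by
# correlating a model's and a participant's AU intensity time series,
# searching over small lags for the best alignment.

def pearson(xs, ys):
    """Pearson correlation of two equal-length sequences (0.0 if degenerate)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs) ** 0.5
    vy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (vx * vy) if vx and vy else 0.0

def imitation_index(model_au, participant_au, max_lag=5):
    """Best correlation over lags 0..max_lag: the participant's series is
    shifted back, since imitation trails the model by a short delay."""
    best = -1.0
    for lag in range(max_lag + 1):
        m = model_au[: len(model_au) - lag] if lag else model_au
        p = participant_au[lag:]
        if len(m) > 1 and len(p) == len(m):
            best = max(best, pearson(m, p))
    return best
```

A high index for a participant who merely mirrors the model one or two frames late would be missed by a lag-free correlation, which is why the small lag search matters for imitation specifically.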
While the research discussed in this review underlines a great potential for the use of sensor technology in the context of autism, it is equally clear that many current assessments and interventions would gain validity from a clear conceptual framework of autism spectrum disorders in a developmental perspective. At present, and honoring the large individual variability among both people with ASD and TD individuals, results obtained with only a few participants (not always well described, and sometimes in the absence of a TD group), or with experimental groups that are not comparable in basic characteristics (e.g., age, gender, IQ), need to be interpreted with caution in order to avoid biased or overgeneralized interpretation of individual study findings.
Other potential obstacles relate to the sophisticated development (and cost) of some of the systems used, which makes them unlikely to become available in greater quantities. Moreover, even readily available systems may be discontinued or run out of support, as in the case of Microsoft’s Kinect in 2017, which poses great challenges for large-sample research in autism that often takes years to complete. Research aiming at training and modeling behavior of people with ASD will also increasingly need to consider usability, to the extent that the relevant systems are to be used by individuals with ASD, their parents, caregivers, and therapists.
Finally, compared to the typical approach of developing sensor-based technology with neurotypical individuals before applying it to people with autism, a more promising strategy may be one in which technology design originates from a user-centered perspective, with autistic people as users actively involved in the process. Such an approach has been forcefully advocated by Rajendran [88], who argues that this may both enhance our understanding of autism and promote better inclusivity of people with autism in an increasingly digital world. At the same time, such technologies ultimately can be useful for people without autism as well. This is because autism is seen as a unique window into social communication and social learning more generally.

7. Conclusions

Technical advancements and the ongoing developments in sensor technology and data science promise to unlock huge potentials for the diagnosis and understanding of autism, and for supporting affected people with training or intervention programs that can be tailored to their specific needs. At the same time, living up to these potentials calls for a concerted and interdisciplinary effort in which computer scientists, engineers, psychologists, and neuroscientists jointly collaborate in large-scale research projects that can uncover, in a quantitative manner, the efficiency of these approaches. In our view, this will be the route not only for establishing routine contributions to evidence-based diagnosis and interventions in autism [89] but also to ensure that more people with autism can genuinely benefit from tailor-made technology.

Funding

Previous research by SRS on related topics has been funded by a grant from the Bundesministerium für Bildung und Forschung (BMBF), in a project on an irritation-free and emotion-sensitive training system (IRESTRA; Grant Reference: 16SV7210), and another BMBF project on the psychological measurement of anxiety in human-robot interaction (3DimIR, Grant Reference 03ZZ0459B).

Acknowledgments

AEK and SRS would like to thank the Herbert Feuchte Stiftungsverbund for supporting the research in the Social Potentials in Autism Research Unit (www.autismus.uni-jena.de). SRS would like to thank the Swiss Center for Affective Sciences at the University of Geneva, Switzerland, for hosting a sabbatical leave in summer 2019 during which this paper was written.

Conflicts of Interest

The authors declare no conflict of interest. In particular, funding bodies had no role in the planning, collection, or interpretation of evidence reviewed in this paper; in the writing of the manuscript; or in the decision to publish the results.

References

  1. Matson, J.L.; Kozlowski, A.M. The increasing prevalence of autism spectrum disorders. Res. Autism Spectr. Disord. 2011, 5, 418–425. [Google Scholar] [CrossRef]
  2. Weintraub, K. The prevalence puzzle: Autism counts. Nature 2011, 479, 22–24. [Google Scholar] [CrossRef] [PubMed]
  3. American Psychiatric Association. Diagnostic and Statistical Manual of Mental Disorders (DSM-5®); American Psychiatric Pub.: Washington, DC, USA, 2013. [Google Scholar]
  4. Seltzer, M.M.; Shattuck, P.; Abbeduto, L.; Greenberg, J.S. Trajectory of development in adolescents and adults with autism. Ment. Retard. Dev. Disabil. Res. Rev. 2004, 10, 234–247. [Google Scholar] [CrossRef] [PubMed]
  5. Dawson, G.; Toth, K.; Abbott, R.; Osterling, J.; Munson, J.; Estes, A.; Liaw, J. Early Social Attention Impairments in Autism: Social Orienting, Joint Attention, and Attention to Distress. Dev. Psychol. 2004, 40, 271–283. [Google Scholar] [CrossRef] [PubMed]
  6. Brewer, R.; Biotti, F.; Catmur, C.; Press, C.; Happé, F.; Cook, R.; Bird, G. Can neurotypical individuals read autistic facial expressions? Atypical production of emotional facial expressions in autism spectrum disorders. Autism Res. 2016, 9, 262–271. [Google Scholar] [CrossRef] [PubMed]
  7. Green, H.; Tobin, Y. Prosodic analysis is difficult… but worth it: A study in high functioning autism. Int. J. Speech Lang. Pathol. 2009, 11, 308–315. [Google Scholar] [CrossRef]
  8. Orsmond, G.I.; Krauss, M.W.; Seltzer, M.M. Peer relationships and social and recreational activities among adolescents and adults with autism. J. Autism Dev. Disord. 2004, 34, 245–256. [Google Scholar] [CrossRef]
  9. Leekam, S.R.; Prior, M.R.; Uljarevic, M. Restricted and repetitive behaviors in autism spectrum disorders: A review of research in the last decade. Psychol. Bull. 2011, 137, 562–593. [Google Scholar] [CrossRef]
  10. Szatmari, P.; Georgiades, S.; Bryson, S.; Zwaigenbaum, L.; Roberts, W.; Mahoney, W.; Goldberg, J.; Tuff, L. Investigating the structure of the restricted, repetitive behaviours and interests domain of autism. J. Child Psychol. Psychiatry 2006, 47, 582–590. [Google Scholar] [CrossRef]
  11. Leekam, S.R.; Nieto, C.; Libby, S.J.; Wing, L.; Gould, J. Describing the sensory abnormalities of children and adults with autism. J. Autism Dev. Disord. 2007, 37, 894–910. [Google Scholar] [CrossRef]
  12. Falkmer, T.; Anderson, K.; Falkmer, M.; Horlin, C. Diagnostic procedures in autism spectrum disorders: A systematic literature review. Eur. Child Adolesc. Psychiatry 2013, 22, 329–340. [Google Scholar] [CrossRef]
  13. Green, R.M.; Travers, A.M.; Howe, Y.; McDougle, C.J. Women and Autism Spectrum Disorder: Diagnosis and Implications for Treatment of Adolescents and Adults. Curr. Psychiatry Rep. 2019, 21, 22. [Google Scholar] [CrossRef] [PubMed]
  14. Oono, I.P.; Honey, E.J.; McConachie, H. Parent-mediated early intervention for young children with autism spectrum disorders (ASD). Evid.-Based Child Health A Cochrane Rev. J. 2013, 8, 2380–2479. [Google Scholar] [CrossRef]
  15. Baron-Cohen, S.; Richler, J.; Bisarya, D.; Gurunathan, N.; Wheelwright, S. The systemizing quotient: An investigation of adults with Asperger syndrome or high-functioning autism, and normal sex differences. Philos. Trans. R. Soc. B Biol. Sci. 2003, 358, 361–374. [Google Scholar] [CrossRef] [PubMed]
  16. Roelfsema, M.T.; Hoekstra, R.A.; Allison, C.; Wheelwright, S.; Brayne, C.; Matthews, F.E.; Baron-Cohen, S. Are autism spectrum conditions more prevalent in an information-technology region? A school-based study of three regions in the Netherlands. J. Autism Dev. Disord. 2012, 42, 734–739. [Google Scholar] [CrossRef]
  17. Dautenhahn, K. Socially intelligent robots: Dimensions of human–robot interaction. Philos. Trans. R. Soc. B Biol. Sci. 2007, 362, 679–704. [Google Scholar] [CrossRef]
  18. Pennisi, P.; Tonacci, A.; Tartarisco, G.; Billeci, L.; Ruta, L.; Gangemi, S.; Pioggia, G. Autism and social robotics: A systematic review. Autism Res. 2016, 9, 165–183. [Google Scholar] [CrossRef]
  19. Billeci, L.; Sicca, F.; Maharatna, K.; Apicella, F.; Narzisi, A.; Campatelli, G.; Calderoni, S.; Pioggia, G.; Muratori, F. On the Application of Quantitative EEG for Characterizing Autistic Brain: A Systematic Review. Front. Hum. Neurosci. 2013, 7, 442. [Google Scholar] [CrossRef]
  20. Marco, E.J.; Hinkley, L.B.N.; Hill, S.S.; Nagarajan, S.S. Sensory processing in autism: A review of neurophysiologic findings. Pediatr. Res. 2011, 69, 48R–54R. [Google Scholar] [CrossRef]
  21. Taj-Eldin, M.; Ryan, C.; O’Flynn, B.; Galvin, P. A Review of Wearable Solutions for Physiological and Emotional Monitoring for Use by People with Autism Spectrum Disorder and Their Caregivers. Sensors 2018, 18, 4271. [Google Scholar] [CrossRef]
  22. Parsons, S.; Mitchell, P. The potential of virtual reality in social skills training for people with autistic spectrum disorders. J. Intellect. Disabil. Res. 2002, 46, 430–443. [Google Scholar] [CrossRef] [PubMed]
  23. Bellani, M.; Fornasari, L.; Chittaro, L.; Brambilla, P. Virtual reality in autism: State of the art. Epidemiol. Psychiatr. Sci. 2011, 20, 235–238. [Google Scholar] [CrossRef]
  24. Pan, X.; Hamilton, A.F.D.C. Why and how to use virtual reality to study human social interaction: The challenges of exploring a new research landscape. Br. J. Psychol. 2018, 109, 395–417. [Google Scholar] [CrossRef] [PubMed]
  25. Bölte, S.; Hubl, D.; Feineis-Matthews, S.; Prvulovic, D.; Dierks, T.; Poustka, F. Facial affect recognition training in autism: Can we animate the fusiform gyrus? Behav. Neurosci. 2006, 120, 211–216. [Google Scholar] [CrossRef] [PubMed]
  26. Kasari, C.; Sigman, M.; Mundy, P.; Yirmiya, N. Affective sharing in the context of joint attention interactions of normal, autistic, and mentally retarded children. J. Autism Dev. Disord. 1990, 20, 87–100. [Google Scholar] [CrossRef] [PubMed]
  27. Weigelt, S.; Koldewyn, K.; Kanwisher, N. Face identity recognition in autism spectrum disorders: A review of behavioral studies. Neurosci. Biobehav. Rev. 2012, 36, 1060–1084. [Google Scholar] [CrossRef]
  28. Sheppard, E.; Pillai, D.; Wong, G.T.-L.; Ropar, D.; Mitchell, P. How Easy is it to Read the Minds of People with Autism Spectrum Disorder? J. Autism Dev. Disord. 2016, 46, 1247–1254. [Google Scholar] [CrossRef]
  29. Samad, M.D.; Bobzien, J.L.; Harrington, J.W.; Iftekharuddin, K.M. [INVITED] Non-intrusive optical imaging of face to probe physiological traits in Autism Spectrum Disorder. Opt. Laser Technol. 2016, 77, 221–228. [Google Scholar] [CrossRef]
  30. Del Coco, M.; Leo, M.; Carcagni, P.; Spagnolo, P.; Luigi Mazzeo, P.; Bernava, M.; Marino, F.; Pioggia, G.; Distante, C. A Computer Vision based Approach for Understanding Emotional Involvements in Children with Autism Spectrum Disorders. In Proceedings of the IEEE International Conference on Computer Vision, Venice, Italy, 22–29 October 2017; pp. 1401–1407. [Google Scholar]
  31. Leo, M.; Carcagnì, P.; Distante, C.; Spagnolo, P.; Mazzeo, P.; Rosato, A.; Petrocchi, S.; Pellegrino, C.; Levante, A.; De Lumè, F. Computational Assessment of Facial Expression Production in ASD Children. Sensors 2018, 18, 3993. [Google Scholar] [CrossRef]
  32. Phillips, M.L.; Young, A.W.; Senior, C.; Brammer, M.; Andrew, C.; Calder, A.J.; Bullmore, E.T.; Perrett, D.I.; Rowland, D.; Williams, S.C.R.; et al. A specific neural substrate for perceiving facial expressions of disgust. Nature 1997, 389, 495–498. [Google Scholar] [CrossRef]
  33. Sander, D.; Grandjean, D.; Scherer, K.R. An Appraisal-Driven Componential Approach to the Emotional Brain. Emot. Rev. 2018, 10, 219–231. [Google Scholar] [CrossRef]
  34. Samad, M.D.; Diawara, N.; Bobzien, J.L.; Taylor, C.M.; Harrington, J.W.; Iftekharuddin, K.M. A pilot study to identify autism related traits in spontaneous facial actions using computer vision. Res. Autism Spectr. Disord. 2019, 65, 14–24. [Google Scholar] [CrossRef]
  35. Egger, H.L.; Dawson, G.; Hashemi, J.; Carpenter, K.L.; Sapiro, G. 23.1 Autism and Beyond: Lessons from an Iphone Study of Young Children. J. Am. Acad. Child Adolesc. Psychiatry 2018, 57, S33–S34. [Google Scholar] [CrossRef]
  36. Jaswal, V.K.; Akhtar, N. Being versus appearing socially uninterested: Challenging assumptions about social motivation in autism. Behav. Brain Sci. 2019, 42, e82. [Google Scholar] [CrossRef]
  37. Tanaka, J.W.; Sung, A. The “Eye Avoidance” Hypothesis of Autism Face Processing. J. Autism Dev. Disord. 2016, 46, 1538–1552. [Google Scholar] [CrossRef]
  38. Chawarska, K.; Shic, F. Looking but not seeing: Atypical visual scanning and recognition of faces in 2 and 4-year-old children with autism spectrum disorder. J. Autism Dev. Disord. 2009, 39, 1663–1672. [Google Scholar] [CrossRef]
  39. Liu, W.; Li, M.; Yi, L. Identifying children with autism spectrum disorder based on their face processing abnormality: A machine learning framework. Autism Res. 2016, 9, 888–898. [Google Scholar] [CrossRef]
  40. Król, M.; Król, M.E. A Novel Eye Movement Data Transformation Technique that Preserves Temporal Information: A Demonstration in a Face Processing Task. Sensors 2019, 19, 2377. [Google Scholar] [CrossRef]
  41. Wang, S.; Jiang, M.; Duchesne, X.M.; Laugeson, E.A.; Kennedy, D.P.; Adolphs, R.; Zhao, Q. Atypical Visual Saliency in Autism Spectrum Disorder Quantified through Model-Based Eye Tracking. Neuron 2015, 88, 604–616. [Google Scholar] [CrossRef]
  42. Min, C.-H.; Tewfik, A.H. Novel pattern detection in children with autism spectrum disorder using iterative subspace identification. In Proceedings of the 2010 IEEE International Conference on Acoustics, Speech and Signal Processing, Dallas, TX, USA, 14–19 March 2010; pp. 2266–2269. [Google Scholar]
  43. Min, C.-H.; Fetzner, J. Vocal Stereotypy Detection: An Initial Step to Understanding Emotions of Children with Autism Spectrum Disorder. In Proceedings of the 2018 40th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC), Honolulu, HI, USA, 18–21 July 2018; pp. 3306–3309. [Google Scholar]
  44. Marchi, E.; Schuller, B.; Baron-Cohen, S.; Lassalle, A.; O’Reilly, H.; Pigat, D.; Golan, O.; Friedenson, S.; Tal, S.; Bolte, S. Voice Emotion Games: Language and Emotion in the Voice of Children with Autism Spectrum Condition. In Proceedings of the 3rd International Workshop on Intelligent Digital Games for Empowerment and Inclusion (IDGEI 2015) as part of the 20th ACM International Conference on Intelligent User Interfaces, IUI 2015, Atlanta, GA, USA, 29 March–1 April 2015; p. 9. [Google Scholar]
  45. Ringeval, F.; DeMouy, J.; Szaszak, G.; Chetouani, M.; Robel, L.; Xavier, J.; Cohen, D.; Plaza, M. Automatic Intonation Recognition for the Prosodic Assessment of Language-Impaired Children. IEEE Trans. Audio Speech Lang. Process. 2010, 19, 1328–1342. [Google Scholar] [CrossRef]
  46. Gonçalves, N.; Costa, S.; Rodrigues, J.; Soares, F. Detection of stereotyped hand flapping movements in Autistic children using the Kinect sensor: A case study. In Proceedings of the 2014 IEEE international conference on autonomous robot systems and competitions (ICARSC), Espinho, Portugal, 14–15 May 2014; pp. 212–216. [Google Scholar]
  47. Jazouli, M.; Majda, A.; Merad, D.; Aalouane, R.; Zarghili, A. Automatic detection of stereotyped movements in autistic children using the Kinect sensor. Int. J. Biomed. Eng. Technol. 2019, 29, 201–220. [Google Scholar] [CrossRef]
  48. Anzulewicz, A.; Sobota, K.; Delafield-Butt, J.T. Toward the Autism Motor Signature: Gesture patterns during smart tablet gameplay identify children with autism. Sci. Rep. 2016, 6, 31107. [Google Scholar] [CrossRef] [PubMed]
  49. Samad, M.D.; Diawara, N.; Bobzien, J.L.; Harrington, J.W.; Witherow, M.A.; Iftekharuddin, K.M. A Feasibility Study of Autism Behavioral Markers in Spontaneous Facial, Visual, and Hand Movement Response Data. IEEE Trans. Neural Syst. Rehabil. Eng. 2017, 26, 353–361. [Google Scholar] [CrossRef] [PubMed]
  50. Jaiswal, S.; Valstar, M.F.; Gillott, A.; Daley, D. Automatic detection of ADHD and ASD from expressive behaviour in RGBD data. In Proceedings of the 2017 12th IEEE International Conference on Automatic Face & Gesture Recognition (FG 2017), Washington, DC, USA, 30 May–3 June 2017; pp. 762–769. [Google Scholar]
  51. Westeyn, T.L.; Abowd, G.D.; Starner, T.E.; Johnson, J.M.; Presti, P.W.; Weaver, K.A. Monitoring children’s developmental progress using augmented toys and activity recognition. Pers. Ubiquitous Comput. 2012, 16, 169–191. [Google Scholar] [CrossRef]
  52. Anzalone, S.M.; Tilmont, E.; Boucenna, S.; Xavier, J.; Jouen, A.-L.; Bodeau, N.; Maharatna, K.; Chetouani, M.; Cohen, D.; Group, M.S. How children with autism spectrum disorder behave and explore the 4-dimensional (spatial 3D+ time) environment during a joint attention induction task with a robot. Res. Autism Spectr. Disord. 2014, 8, 814–826. [Google Scholar] [CrossRef]
  53. Campbell, K.; Carpenter, K.L.; Hashemi, J.; Espinosa, S.; Marsan, S.; Borg, J.S.; Chang, Z.; Qiu, Q.; Vermeer, S.; Adler, E. Computer vision analysis captures atypical attention in toddlers with autism. Autism 2019, 23, 619–628. [Google Scholar] [CrossRef]
  54. Petric, F.; Hrvatinić, K.; Babić, A.; Malovan, L.; Miklić, D.; Kovačić, Z.; Cepanec, M.; Stošić, J.; Šimleša, S. Four tasks of a robot-assisted autism spectrum disorder diagnostic protocol: First clinical tests. In Proceedings of the IEEE Global Humanitarian Technology Conference (GHTC 2014), San Jose, CA, USA, 10–13 October 2014; pp. 510–517. [Google Scholar]
  55. Belin, P.; Fecteau, S.; Bedard, C. Thinking the voice: neural correlates of voice perception. Trends Cogn. Sci. 2004, 8, 129–135. [Google Scholar] [CrossRef]
  56. Schweinberger, S.R.; Kawahara, H.; Simpson, A.P.; Skuk, V.G.; Zäske, R. Speaker perception. Wiley Interdiscip. Rev. Cogn. Sci. 2014, 5, 15–25. [Google Scholar] [CrossRef]
  57. Philip, R.C.M.; Whalley, H.C.; Stanfield, A.C.; Sprengelmeyer, R.; Santos, I.M.; Young, A.W.; Atkinson, A.P.; Calder, A.J.; Johnstone, E.C.; Lawrie, S.M.; et al. Deficits in facial, body movement and vocal emotional processing in autism spectrum disorders. Psychol. Med. 2010, 40, 1919–1929. [Google Scholar] [CrossRef]
  58. Schelinski, S.; Roswandowitz, C.; von Kriegstein, K. Voice identity processing in autism spectrum disorder. Autism Res. 2017, 10, 155–168. [Google Scholar] [CrossRef]
  59. Skuk, V.G.; Palermo, R.; Broemer, L.; Schweinberger, S.R. Autistic traits are linked to individual differences in familiar voice identification. J. Autism Dev. Disord. 2019, 49, 2747–2767. [Google Scholar] [CrossRef] [PubMed]
  60. Fruhholz, S.; Marchi, E.; Schuller, B. The Effect of Narrow-Band Transmission on Recognition of Paralinguistic Information from Human Vocalizations. IEEE Access 2016, 4, 6059–6072. [Google Scholar] [CrossRef]
  61. Gilchrist, K.H.; Hegarty-Craver, M.; Christian, R.B.; Grego, S.; Kies, A.C.; Wheeler, A.C. Automated detection of repetitive motor behaviors as an outcome measurement in intellectual and developmental disabilities. J. Autism Dev. Disord. 2018, 48, 1458–1466. [Google Scholar] [CrossRef] [PubMed]
  62. Happé, F.G.E. An advanced test of theory of mind: Understanding of story characters’ thoughts and feelings by able autistic, mentally handicapped, and normal children and adults. J. Autism Dev. Disord. 1994, 24, 129–154. [Google Scholar] [CrossRef]
  63. Oberman, L.M.; Hubbard, E.M.; Mccleery, J.P.; Altschuler, E.L.; Ramachandran, V.S.; Pineda, J.A. EEG evidence for mirror neuron dysfunction in autism spectrum disorders. Cogn. Brain Res. 2005, 24, 190–198. [Google Scholar] [CrossRef]
  64. Schneider, D.; Slaughter, V.P.; Bayliss, A.P.; Dux, P.E. A temporally sustained implicit theory of mind deficit in autism spectrum disorders. Cognition 2013, 129, 410–417. [Google Scholar] [CrossRef]
  65. Low, J.; Apperly, I.A.; Butterfill, S.A.; Rakoczy, H. Cognitive Architecture of Belief Reasoning in Children and Adults: A Primer on the Two-Systems Account. Child Dev. Perspect. 2016, 10, 184–189. [Google Scholar] [CrossRef]
  66. Kulke, L.; Von Duhn, B.; Schneider, D.; Rakoczy, H. Is Implicit Theory of Mind a Real and Robust Phenomenon? Results from a Systematic Replication Study. Psychol. Sci. 2018, 29, 888–900. [Google Scholar] [CrossRef]
  67. Premack, D.; Woodruff, G. Does the chimpanzee have a theory of mind? Behav. Brain Sci. 1978, 1, 515–526. [Google Scholar] [CrossRef]
  68. Gordon, I.; Pierce, M.D.; Bartlett, M.S.; Tanaka, J.W. Training Facial Expression Production in Children on the Autism Spectrum. J. Autism Dev. Disord. 2014, 44, 2486–2498. [Google Scholar] [CrossRef]
  69. Leo, M.; Del Coco, M.; Carcagnì, P.; Distante, C.; Bernava, M.; Pioggia, G.; Palestra, G. Automatic emotion recognition in robot-children interaction for ASD treatment. In Proceedings of the IEEE International Conference on Computer Vision Workshops, Santiago, Chile, 11–18 December 2015; pp. 145–153. [Google Scholar]
  70. Piana, S.; Malagoli, C.; Usai, M.C.; Camurri, A. Effects of Computerized Emotional Training on Children with High Functioning Autism. IEEE Trans. Affect. Comput. 2019, 1, 1. [Google Scholar] [CrossRef]
  71. Robins, B.; Amirabdollahian, F.; Ji, Z.; Dautenhahn, K. Tactile interaction with a humanoid robot for children with autism: A case study analysis involving user requirements and results of an initial implementation. In Proceedings of the 19th International Symposium in Robot and Human Interactive Communication, Viareggio, Italy, 13–15 September 2010; pp. 704–711. [Google Scholar]
  72. Mengoni, S.E.; Irvine, K.; Thakur, D.; Barton, G.; Dautenhahn, K.; Guldberg, K.; Robins, B.; Wellsted, D.; Sharma, S. Feasibility study of a randomised controlled trial to investigate the effectiveness of using a humanoid robot to improve the social skills of children with autism spectrum disorder (Kaspar RCT): A study protocol. BMJ Open 2017, 7, e017376. [Google Scholar] [CrossRef] [PubMed]
  73. Costa, S.; Resende, J.; Soares, F.O.; Ferreira, M.J.; Santos, C.P.; Moreira, F. Applications of simple robots to encourage social receptiveness of adolescents with autism. In Proceedings of the 2009 Annual International Conference of the IEEE Engineering in Medicine and Biology Society, Minneapolis, MN, USA, 3–6 September 2009; pp. 5072–5075. [Google Scholar]
  74. Wong, H.; Zhong, Z. Assessment of robot training for social cognitive learning. In Proceedings of the 2016 16th International Conference on Control, Automation and Systems (ICCAS), Gyeongju, South Korea, 16–19 October 2016; pp. 893–898. [Google Scholar]
  75. Uzuegbunam, N.; Wong, W.-H.; Cheung, S.-C.S.; Ruble, L. MEBook: Kinect-based self-modeling intervention for children with autism. In Proceedings of the 2015 IEEE International Conference on Multimedia and Expo (ICME), Turin, Italy, 29 June–3 July 2015; pp. 1–6. [Google Scholar]
  76. Mower, E.; Black, M.P.; Flores, E.; Williams, M.; Narayanan, S. Rachel: Design of an emotionally targeted interactive agent for children with autism. In Proceedings of the 2011 IEEE International Conference on Multimedia and Expo, Barcelona, Spain, 11–15 July 2011; pp. 1–6. [Google Scholar]
  77. Del Coco, M.; Leo, M.; Carcagnì, P.; Fama, F.; Spadaro, L.; Ruta, L.; Pioggia, G.; Distante, C. Study of mechanisms of social interaction stimulation in autism spectrum disorder by assisted humanoid robot. IEEE Trans. Cogn. Dev. Syst. 2017, 10, 993–1004. [Google Scholar] [CrossRef]
  78. Dawood, A.; Turner, S.; Perepa, P. Affective Computational Model to Extract Natural Affective States of Students with Asperger Syndrome (AS) in Computer-Based Learning Environment. IEEE Access 2018, 6, 67026–67034. [Google Scholar] [CrossRef]
  79. Winoto, P.; Chen, C.G.; Tang, T.Y. The development of a Kinect-based online socio-meter for users with social and communication skill impairments: A computational sensing approach. In Proceedings of the 2016 IEEE International Conference on Knowledge Engineering and Applications (ICKEA), Singapore, Singapore, 28–30 September 2016; pp. 139–143. [Google Scholar]
  80. Kołakowska, A.; Landowska, A.; Anzulewicz, A.; Sobota, K. Automatic recognition of therapy progress among children with autism. Sci. Rep. 2017, 7, 13863. [Google Scholar] [CrossRef]
  81. Pickering, M.J.; Garrod, S. An integrated theory of language production and comprehension. Behav. Brain Sci. 2013, 36, 329–347. [Google Scholar] [CrossRef]
  82. Warren, J.E.; Sauter, D.A.; Eisner, F.; Wiland, J.; Dresner, M.A.; Wise, R.J.S.; Rosen, S.; Scott, S.K. Positive Emotions Preferentially Engage an Auditory–Motor “Mirror” System. J. Neurosci. 2006, 26, 13067–13075. [Google Scholar] [CrossRef]
  83. Lewis, M.B.; Dunn, E. Instructions to mimic improve facial emotion recognition in people with sub-clinical autism traits. Q. J. Exp. Psychol. 2017, 70, 1–14. [Google Scholar] [CrossRef]
  84. Pineda, J.A.; Carrasco, K.; Datko, M.; Pillen, S.; Schalles, M. Neurofeedback training produces normalization in behavioural and electrophysiological measures of high-functioning autism. Philos. Trans. R. Soc. B Biol. Sci. 2014, 369, 20130183. [Google Scholar] [CrossRef]
  85. Caramazza, A.; Anzellotti, S.; Strnad, L.; Lingnau, A. Embodied Cognition and Mirror Neurons: A Critical Assessment. Annu. Rev. Neurosci. 2014, 37, 1–15. [Google Scholar] [CrossRef]
  86. Dimberg, U.; Thunberg, M.; Elmehed, K. Unconscious facial reactions to emotional facial expressions. Psychol. Sci. 2000, 11, 86–89. [Google Scholar] [CrossRef] [PubMed]
  87. Korb, S.; With, S.; Niedenthal, P.; Kaiser, S.; Grandjean, D. The Perception and Mimicry of Facial Movements Predict Judgments of Smile Authenticity. PLoS ONE 2014, 9, e99194. [Google Scholar] [CrossRef] [PubMed]
  88. Rajendran, G. Virtual environments and autism: A developmental psychopathological approach. J. Comput. Assist. Learn. 2013, 29, 334–347. [Google Scholar] [CrossRef]
  89. Wong, C.; Odom, S.L.; Hume, K.A.; Cox, A.W.; Fettig, A.; Kucharczyk, S.; Brock, M.E.; Plavnick, J.B.; Fleury, V.P.; Schultz, T.R. Evidence-Based Practices for Children, Youth, and Young Adults with Autism Spectrum Disorder: A Comprehensive Review. J. Autism Dev. Disord. 2015, 45, 1951–1966. [Google Scholar] [CrossRef]
