Abstract
The aim of this systematic review is to identify recent digital technologies used to detect early signs of autism spectrum disorder (ASD) in preschool children (i.e., up to six years of age). A systematic literature search was performed for English-language articles and conference papers indexed in PubMed, PsycInfo, ERIC, CINAHL, WoS, IEEE, and ACM digital libraries up until January 2020. Given the usefulness of and interest in this area of research during the Covid-19 emergency, a follow-up search was conducted to cover the literature published until December 2020. In total, 2427 articles were initially retrieved from the database search. An additional 481 articles were retrieved from the follow-up search. Finally, 28 articles met the inclusion criteria and were included in the review. The studies included involved four main interface modalities: natural user interface (e.g., eye trackers), PC or mobile, wearable, and robotics. Most of the papers included (n = 20) involved the use of Level 1 screening tools. Notwithstanding the variability of the solutions identified, psychometric information points to considering available technologies as promising supports in clinical practice to detect early signs of ASD in young children. Further research is needed to understand the acceptability of technology-based screenings in clinical settings and to increase their use rates.
1. Introduction
Autism spectrum disorder (ASD) is a category of neurodevelopmental disorder characterized by persistent deficits in social communication and social interaction across multiple contexts as well as restricted, repetitive patterns of behavior, interests, or activities [1]. The care and social needs of preschool children with ASD (typically up to six years of age), in particular, are significant [2,3], usually extend to parents and siblings [2,4,5], and require substantial community resources [2,6,7]. In response to these needs, early detection of ASD has become a priority for primary care and other community settings [8] to provide early intervention services and to improve outcomes [2,9].
Timely (i.e., early) identification of ASD may be achieved by implementing screening methods and instruments that allow health and other professionals (e.g., social care workers, educators) to conduct a rapid and relatively inexpensive evaluation of this condition in young children [10]. Screening measures that are suitable for identifying ASD are already available and can vary by format (e.g., parent-report versus direct observation), scope, and target population [11]. With regard to the scope of the screening instruments, “broadband” screens cover multiple developmental domains, while “narrow” screens cover only those signs and symptoms specific to the condition of interest [11,12]. With regard to the target population, screening instruments can be used to conduct universal population-wide testing (also referred to as “universal screening” or Level 1 screening), or to identify possible signs of ASD in high-risk populations, such as siblings of children with ASD or those referred for speech or other developmental concerns to community pediatric services (also referred to as Level 2 screening) [12,13].
A number of relevant systematic reviews have examined the use of screening instruments for the identification of ASD in pediatric populations (0–6 years; see [13,14] for an overview of recent systematic reviews). Current evidence suggests that the most used and reliable instruments available to clinicians (e.g., pediatricians, developmental/child psychologists, child psychiatrists) are in the form of questionnaires, checklists, or observation scales where parents or clinicians are required to report/observe overt behavioral signs of ASD (e.g., limited smiles, eye contact) [11]. Advantages of these approaches have been extensively recognized and include high predictive values, ease of use, speed of administration, and limited or no specific administration/scoring training [13,14].
Notwithstanding the advantages, and the widespread implementation of these instruments in primary and community care settings as well as specialized services [15], screening instruments are still underused in routine clinical practice because of a number of challenges, such as lack of time, disruption of workflow, lack of familiarity with screening tools, difficulty with scoring, as well as lack of office-based systems for making referrals and monitoring outcomes (for an overview see [9]). As a consequence of these challenges, despite the possibility of reliably diagnosing ASD in children during the first two years of life [2,12,16,17], current evidence reports that the diagnosis remains delayed in many children [18,19,20]. For instance, in a recent survey involving 1223 families and 760 professionals in 14 European countries [18], only 3.1% of the parents reported having noticed problems after responding to a specific ASD screening survey. In addition, the average age at diagnosis was 36.4 (SD = 17.7) months, with most diagnoses occurring between 32 and 46 months. In light of this evidence, it has been suggested that more effective screening strategies are needed to reduce the proportion of children who receive a late diagnosis or remain undetected [14,21,22]. Specifically, screening strategies are needed that (a) are able to reduce the workload of clinicians, (b) can be easily implemented within routine clinical practice, and (c) are psychometrically sound.
Over the past decade, advances in information and communication technologies (ICT) have opened innovative and promising scenarios for clinicians to improve identification, treatment, and support (e.g., [23,24]) of children with ASD. Such solutions may be further used to help clinicians (and other stakeholders) improve early screening of ASD in that they may allow them to monitor young children’s behaviors in clinical settings as well as in their natural environments [25].
This paper is aimed at providing a picture of the different technology-based solutions to screen for ASD reported in the literature since 2010. This starting date was chosen as it represents the time period when most of the current mobile devices (e.g., touch-screen devices) were first introduced to the market [26]. For the purposes of the present study, we use the term “technology” to refer to any ICT-based product, either mainstream (e.g., smartphone, tablet) or emergent (e.g., robots), that was tested for the purpose of screening for ASD.
Accordingly, our objectives are to review studies that implemented technological solutions specifically developed to screen for ASD in clinical practice, laboratory settings, at children’s homes, or in community settings, and to determine the level of development (maturity) reached by those solutions, as well as their expected contribution in supporting ASD screening practices. This review focuses on both Level 1 and Level 2 screeners. While Level 1 screening tools may be used to identify children at risk of ASD in the general population, Level 2 screeners are mainly used to distinguish between children with signs of ASD and those with other developmental concerns (e.g., language disorders, intellectual disability, other neurodevelopmental disorders). In this view, screening for ASD may be conceived as a multistep process, according to which children who fail a Level 1 screening would require a secondary (i.e., Level 2) screener before being referred to a more comprehensive diagnostic assessment process [12,13,27]. Providing such a comprehensive overview of the literature (including both levels of ASD screening) was thought to be useful to guide researchers and professionals in their choice of technology options in daily practice, as well as to stimulate their research initiatives aimed at adding essential evidence about technology-based ASD screeners.
2. Materials and Methods
2.1. Search Strategy
A systematic search was conducted following the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) reporting guideline recommendations [28] to identify studies reporting on commercially available ICT solutions or assistive technology products for screening children aged 0–6 years for ASD. The search was performed using the following academic databases: MEDLINE, consulted through the free electronic access PubMed; PsycINFO, ERIC, and CINAHL, consulted through EBSCOHost; and Web of Science. IEEE and ACM digital libraries were also included. Search terms related to children, ASD, information technology, and screening were used, and the search queries conducted with each database are listed in Appendix A.
The search was conducted by the first author in February 2020 and was restricted to English-language, peer-reviewed journals, and conference papers published as of January 2020. Figure 1 illustrates the search process and outcome. Initially, 2427 article titles were identified. The titles were reduced to 2283 once the duplicates and articles not in English were removed. The three authors assessed the eligibility of titles and abstracts. If the title of an article matched pre-specified inclusion and exclusion criteria (see Appendix B), then the abstract was further read by all raters. Full texts were downloaded to judge the article’s eligibility for the review if the abstract matched further specified inclusion and exclusion criteria (details in Appendix B).
Figure 1.
PRISMA flowchart of the articles’ selection process.
On this basis, 55 full-text articles were downloaded and fully read by the first author, who finally selected 20 of them according to specific inclusion/exclusion criteria (see below). Subsequently, an ancestral and forward search (i.e., Google Scholar’s “cited by” function) was conducted by the first author using the 55 articles originally reviewed. In addition, in order to keep up with the rapid publication rate in ASD research, as well as to identify research in this area during the Covid-19 emergency, a follow-up search was conducted on Google Scholar (using the search terms “autism” and “screening”) to identify papers published between March and December 2020. The Google Scholar search yielded in total 481 titles, of which four were included in the review. The additional forward and ancestral searches yielded four further papers, so that 28 articles were finally included in the review.
2.2. Full-Texts’ Inclusion and Exclusion Criteria
The following inclusion criteria were used in selecting the studies for the review:
- The paper had to report on the development and/or implementation of technology arrangements (whether commercially available or not, and whether specifically developed for screening or adapted from solutions designed for other purposes) aimed at detecting early signs of ASD across a range of clinical (e.g., primary care, specialized clinics/services) and other settings, such as laboratory, home, or school.
- The studies had to target children aged ≤6 years. Studies involving broader age ranges were included provided that they involved children within the aforementioned age group (i.e., age ≤ 6 years).
- The studies had to provide quantitative information on the capability of the technology (or the technology-based approach) of:
- Screening for ASD at the population level (Level 1 screening; L1), such as children evaluated by primary care physicians, or
- Screening for ASD in a subsample of the population identified as at risk for the disorder (Level 2 screening; L2), such as a referred clinical sample with a variety of developmental concerns, siblings of children with ASD, pre-term children, children with genetic syndromes that are usually associated with ASD, or children with a diagnosis of other neurodevelopmental disorders [29].
Excluded from the review were studies:
- Reporting on a retrospective analysis of existing databases of evaluation records which were not directly implemented in the aforementioned applied settings and/or did not involve the target users (i.e., health professionals; caregivers);
- Focusing on invasive or non-invasive techniques to investigate biological processes and structures (e.g., electroencephalography, brain imaging, electrodermal activity);
- Using technology to investigate physiological (e.g., heart rate; eye movements), behavioral (e.g., vocal or movement patterns; crying), or cognitive differences between children with/at risk of ASD and controls not for the purpose of developing a screening tool;
- Providing training to professionals on the use of a screening tool.
2.3. Data Coding and Extraction
The studies that met the aforementioned inclusion criteria were coded in terms of participant characteristics (i.e., number, age-range and sex), target users of the technology, indicators used to assess ASD condition, types of technology used, context(s) of use of the technology, screening level, and maturity of the technology. A brief description of each technology identified, the methodology for its evaluation, and its psychometric properties were also provided.
Country of origin of the study was reported based on (i) the information provided in the methodology, or (ii) the affiliation of the corresponding or the first author of the paper. To classify the types of technologies used in each paper, we adapted the classification proposed by Kientz et al. [30], which includes six different types of interface, namely (a) personal computers (PC) or mobile, (b) shared interactive interfaces, (c) virtual, augmented, and mixed reality, (d) sensor-based and wearable, (e) natural user interfaces, and (f) robotics. Likewise, to rate the maturity of the technology identified, we used the maturity levels proposed by Kientz et al. [30], that is, (a) functional prototype or (b) publicly available. Specifically, a functional prototype refers to technology that has been developed and used by the intended users for the target purposes but may require assistance with setup, use, or maintenance. Technologies classified as publicly available, in contrast, refer to commercial products, software that is open source, or applications available for download on websites or on mobile marketplaces (even if no longer available at the time of the present review).
When not specifically mentioned in the paper, we conceived L1 screening as applying to (a) tools administered to all children regardless of risk status (such as the M-CHAT), (b) tools implemented to assess children during routine pediatric visits, or (c) experimental or observational studies that compared children with a diagnosis of ASD with neurotypical children. In contrast, we conceived L2 screening tools as (a) targeted at children already identified as being at increased risk (e.g., due to a positive family history), and/or (b) used to distinguish between ASD and other neurodevelopmental disorders.
Finally, we extracted relevant information on psychometric properties typically used for screeners, when available. Metrics extracted included (1) sensitivity (the percent of cases with ASD classified by the instrument as ASD); (2) specificity (the percent of cases without ASD classified as not having ASD); (3) positive predictive value (the percent of cases accurately predicted as having ASD); and (4) negative predictive value (the percent of cases accurately predicted as not having ASD). Measures of accuracy in distinguishing between clinical and non-clinical groups were also considered relevant.
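These four metrics can all be derived from a 2×2 cross-tabulation of screener outcome against reference diagnosis. The following is a minimal illustrative sketch; the counts are hypothetical and not drawn from any study in the review:

```python
# Screening metrics from a 2x2 confusion matrix:
# tp/fn = children with ASD flagged / missed by the screener,
# tn/fp = children without ASD correctly passed / wrongly flagged.
def screening_metrics(tp: int, fp: int, fn: int, tn: int) -> dict:
    return {
        "sensitivity": tp / (tp + fn),  # % of ASD cases classified as ASD
        "specificity": tn / (tn + fp),  # % of non-ASD cases classified as non-ASD
        "ppv": tp / (tp + fp),          # % of positives that truly have ASD
        "npv": tn / (tn + fn),          # % of negatives that truly do not
    }

# Hypothetical counts for illustration only
m = screening_metrics(tp=30, fp=10, fn=5, tn=155)
```

Note that, unlike sensitivity and specificity, PPV and NPV depend on the prevalence of ASD in the sample screened, which is one reason Level 1 (general population) and Level 2 (high-risk) screeners cannot be directly compared on these two metrics.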
2.4. Inter-Rater Agreement
The first author calculated the inter-rater agreement between the three raters pairwise on all titles (n = 2283) and abstracts (n = 229). Based on rating criteria (see details in Appendix B), proportional agreement on the titles and abstracts was calculated by taking the number of agreements and dividing this by the number of agreements plus disagreements, multiplied by 100. Their agreement ranged between 65% and 84% for the titles, and 93% and 96% for the abstracts.
Consensus was reached on the titles and abstracts with disagreement after the three raters reviewed them again together. Inter-rater agreement was also checked on the summary points of the variables coded (see above). The first author extracted the information for the 28 papers included and a second rater extracted the information for eight randomly selected papers. The two authors agreed on 149 of the 152 summary points checked (i.e., 19 summary points per article multiplied by 8 articles). Following the same formula used above, the percentage of agreement was 98%. The two raters then discussed the discrepancies until a 100% agreement was reached.
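The agreement computation described above can be sketched as follows, as a minimal illustration of the same formula applied to the summary-point figures reported in the text:

```python
# Proportional agreement: agreements / (agreements + disagreements) * 100
def percent_agreement(agreements: int, disagreements: int) -> float:
    return agreements / (agreements + disagreements) * 100

# Summary-point check reported above: 149 of 152 points agreed
# (19 summary points per article x 8 articles)
pct = percent_agreement(149, 152 - 149)  # ~98.0
```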
3. Results
3.1. Overview of the Results
We identified 28 studies that used mainstream or adapted information technologies to screen children up to 6 years for ASD (see Table 1). Seven of the included studies [22,31,32,33,34,35,36] involved children recruited from primary care or pediatric services, while five studies involved children referred to tertiary care or specialized ASD centers [37,38,39,40,41]. A total of 7308 children participated in the studies. Of these, 3498 were males and 1851 females. In nine studies, gender information was missing.
Table 1.
Studies included in the review.
Ages of the children involved in the studies varied greatly. Two studies involved children from 6 to 18 months [42,43]. Six studies involved children within the 10- to 48-month range [36,40,41,44,45,46], seven studies involved participants aged between 16 and 30 months [22,25,31,32,34,35,47], and four studies involved children within the 18- to 72-month range [37,39,48,49]. Three studies involved a sample of children aged between 48 and 72 months [33,50,51]. The remaining six studies involved samples whose age ranges included children with ASD both up to six years of age and older [38,52,53,54,55,56].
The majority of the studies reported in the papers identified were conducted in the USA (n = 19). Seven studies were conducted in as many countries: China [51], Peru [49], the UK [48], Italy [52], France [54], Colombia [56], and Sri Lanka [46]. Two papers either did not provide information [55] or provided unclear information as to the country of origin of the participants recruited [45].
3.2. Types of Technologies Used
The studies included in the review involved four main interface modalities, namely (a) natural user interface (NUI), (b) PC or mobile, (c) wearable, and (d) robotics. Figure 2 illustrates the frequencies of the different interfaces used within each category.
Figure 2.
Frequency of technologies used in the papers included grouped according to interface category.
The first category (i.e., NUI) included 11 papers. Of these, five papers involved the use of eye trackers [38,40,41,49,51], two studies used voice-based recording systems [44,53], two studies employed face recognition to detect facial expressions [25,36], one paper involved motion recognition using touch-screen sensor technologies [48], and one paper tracked pupil diameter [54].
The second category (i.e., PC or mobile) included 16 papers. The studies reported by Abbas et al. [37] and Kanne et al. [39] were included in both categories (i.e., PC and Mobile) as they combined the two strategies within the same application. In a similar vein, the studies reported by Egger et al. [25] and Carpenter et al. [36] were included both in the NUI and PC/Mobile category. Accordingly, 11 papers reported on the use of computerized solutions (PC or mobile platforms) to administer parent-reported questionnaires [22,31,32,33,34,35,37,39,46,47,55], and seven papers employed screening tools in which videos were collected from [37,39,43,50] or showed via [25,36,42] parents’ mobile/PC devices.
The third category (i.e., wearable) included two papers [45,52] that used wearable sensors to track the kinematics of children’s movements while they were performing specific reaching and grasping movements.
The fourth category (i.e., robot) included one paper [56] that reported on the use of a humanoid robot to assess joint attention skills.
3.3. Screening Level
The majority of the papers included in the review (71%; n = 20) involved the use of L1 screening tools. A detailed analysis of the differences between the two screening approaches according to relevant study characteristics (e.g., target population, type of interface used) was not performed because of the relatively low number of L2 papers. However, it should be noted that all papers involving parent-reported questionnaires (n = 11) focused on the L1 screening approach. In contrast, papers involving L2 screening tools mostly focused on objective screening measures such as eye tracking (n = 3), audio recording (n = 1), or kinematics (n = 1). The identified papers were grouped according to the different age ranges of the populations involved. Detailed descriptions of each study are provided in Table 2.
Table 2.
Analysis of the studies included in the review.
3.3.1. L1 Screening Tools
Solutions Tested with Children up to 30 Months
Nine papers were identified that involved children in the 16–30-month age range [22,25,31,32,34,35,36,46,47]. Of these, two papers reported on studies aimed at adapting the M-CHAT for administration via tablet [3,31]. Benefits of the tablet over the traditional paper-and-pencil form have been clearly highlighted by Campbell et al. [31], who documented that after implementation of the digital M-CHAT (a) the proportion of children screening positive with accurate documentation in the Electronic Health Records (EHR) increased from a mean of 54% to 92%, and (b) the proportion of physicians referring a child for a developmental assessment after a positive score increased from 56% to 100% (see also Major et al. [58] for secondary analyses).
Three studies reported on the use of automated EHR [22,32,35] to facilitate screening procedures within pediatric clinics. Both Bauer et al. [22] and Downs et al. [32] (see also [59], not included in this review) implemented the Child Health Improvement Through Computer Automation system (CHICA). CHICA is a computer decision support system developed to facilitate surveillance and screening for ASD in primary pediatric care services by implementing automated administration and scoring of the M-CHAT. Although encouraging results were observed in terms of increased screening of children for ASD, in both studies concerns were raised about the physicians’ response to the alerts that a patient had a concerning M-CHAT. In a similar line of investigation, Schrader et al. [35] implemented the Smart Early Screening for Autism and Communication Disorders (Smart ESAC) in a pediatric service. Results indicated a statistically significant reduction in the average age of referral after the implementation of the Smart ESAC compared to the 16 years prior to system implementation.
Ben-Sasson et al. [47] created a survey through which parents recruited via online advertisement could describe in their own words their concerns regarding their child’s social-communication development. Parents were further asked to complete the M-CHAT-R/F and the Autism Spectrum Quotient (ASQ) questionnaire. The authors were able to reliably predict the risk status of a child being on the spectrum by supplementing their written descriptions with only one of 11 questions taken from the M-CHAT-R.
Wingfield et al. [46] developed a mobile-based questionnaire with automatic scoring to be administered by non-specialist health/social workers in low-income countries. The system is a set of 21 “yes-no” questions for the parents. Preliminary evidence shows high accuracy in distinguishing between already diagnosed children with ASD and their neurotypical peers.
Finally, two studies used mobile devices to track facial expressions [25,36]. Egger et al. [25] developed an iPhone/iPad-based application to screen for signs of ASD in the general population. The app includes a short set of questionnaires as well as four brief videos. While the child watches the videos, the camera embedded in the device records his or her face. The recorded videos are then uploaded by the caregivers to a server that automatically analyzes the child’s facial expressions and attention to estimate the risk of ASD. Preliminary results indicated that (a) the majority of parents were willing to upload the full videos of their children, and (b) significant associations were found between emotion and attention measures and age, sex, and autism risk status (based on the M-CHAT scores). Similar encouraging results were reported by Carpenter et al. [36], who seemingly used the same system as that tested by Egger et al. [25].
Solutions Tested with Children up to Six Years
Vargas-Cuentas et al. [49] presented a 1-min video displaying a social scene with playing children on one side of the screen and an abstract scene with moving shapes on the other. Children’s eye gaze while watching the video was automatically tracked to assess spatial preference. Results from the proof-of-concept study comparing the eye gaze of children with ASD with that of their neurotypical peers as controls showed that the former group spent 26.9% to 32.8% of the time gazing at the social scene, compared to 44.2% to 50.7% for the control group.
Anzulewicz et al. [48] used two commercially available games running on an iPad to record children’s movements while interacting with the device. Differences between children with a diagnosis of ASD and their neurotypical peers were estimated by means of a machine learning algorithm, which proved highly accurate in distinguishing the two groups based on kinematic information alone.
Wan et al. [51] used an eye tracker to distinguish children with ASD from their neurotypical peers. They developed a rapid screening session which involved the presentation of a video showing a speaking girl for a very brief time interval (i.e., about 10 s). Automatic analysis of children’s gaze produced reliable results in distinguishing between the two groups (i.e., ASD and neurotypical). Despite several differences in gazing behavior between the two groups while watching the speaking face, only the fixation times at the moving mouth and body could significantly discriminate the ASD group from the control group with acceptable classification accuracy.
Duda et al. [33] tested the Mobile Autism Risk Assessment (MARA) screening tool with children aged between 16 months and 17 years referred to a developmental-behavioral pediatric clinic. MARA is a 7-item parent questionnaire that can be administered via an electronic platform with automatic scoring. Before its implementation in a clinical setting, the questionnaire was validated in a series of preliminary studies [60]. Results from the implementation study showed that children who received a clinical ASD diagnosis were more likely than those without to receive a MARA score indicative of ASD. Importantly, respondents could complete the MARA questionnaire either at home or in the clinic. Based on this preliminary clinical validation, two further papers by Abbas et al. [37] and Kanne et al. [39] tested the Cognoa application with children aged between 18 and 72 months. Cognoa is a mobile-based application (i.e., tablet or smartphone) using the same algorithm as MARA. It follows a two-stage approach to ASD screening whereby a parent (a) answers a 15-item questionnaire and (b) uploads through the mobile phone at least one 1–2-min video of the child recorded in different everyday scenarios (e.g., mealtime, playtime, or conversations). Videos are then rated by specialized assessors to determine the need for further assessment. Results indicated that Cognoa (a) performed similarly to other screening measures (i.e., M-CHAT-R/F, SCQ, SRS, CBCL-ASP), and (b) was able to reliably screen all children in the 18–72-month age range, thus covering the screening age gap between 30 and 48 months.
In a similar vein, Tariq et al. [50] created a mobile web portal to test the ability of machine learning to reliably detect autism based on short home videos of children. The results suggest that machine learning may enable rapid ASD detection outside of clinics, thus reducing waiting periods for access to care and reaching underserved populations.
3.3.2. L2 Screening Tools
Solutions Tested with Children up to 18 Months
Two papers were included that involved children up to 18 months [42,43]. Young et al. [42] developed a web-based application named Video-referenced Infant Rating System for Autism (VIRSA). The application is intended to be used by parents and shows pairs of videos of parents and infants playing together. After the presentation of each pair of videos, the respondent is asked to judge which video is most similar to the child being rated. The application was tested with infants who had an older sibling with ASD, with preliminary results showing that VIRSA correctly identified all children diagnosed with ASD at 18 months.
Talbott et al. [43] reported on the feasibility of instructing parents to administer specific semi-structured behavioral probes using the Telehealth Evaluation of Development for Infants (TEDI). This approach proved reliable and acceptable to parents, although the sample involved was relatively small (i.e., 11 children).
Solutions Tested with Children up to 48 Months
Four papers were identified involving children aged between 10 and 48 months. Pierce et al. [41] developed the GeoPref test based on the assumption that a preference for geometric shapes over social content might be a reliable biomarker of ASD (see also [61]). The test involved the use of an eye tracker that monitored the gaze behavior of the child while he or she was watching a video presenting dynamic geometric images paired with a video presenting dynamic social images. Results showed that a subset of toddlers with ASD who fixated on the geometric images more than 69% of the time was accurately identified as being on the spectrum with high specificity. These promising results were further replicated by Moore et al. [40] using longer and more complex social scenes (see also [62] for the use of the GeoPref test as a symptom severity prognostic tool).
Wedyan and Al-Jumaily [45] conducted a proof-of-concept study to investigate the use of a wrist-worn light sensor to monitor object manipulation skills of children while they inserted a ball into a plastic tube. Automatic classification of the movement data was able to differentiate children at high risk of ASD from those at low risk with high accuracy.
Oller et al. [44] used the Language ENvironment Analysis (LENA) system to collect whole day audio recordings of infants in their homes. They further developed an automated approach to data analysis that was able to differentiate between vocalizations produced by neurotypical children from those produced by children with ASD or language delay.
Solutions Tested with Children up to Six Years and Older
Two papers were included in this group. Frazier et al. [38] estimated an Autism Risk Index by means of eye-tracking technology used to record fixations of children while they were presented with a variety of social and nonsocial visual stimuli. The results indicated that, for children with ASD aged 48 months and older, the index was able to classify their clinical condition with very good accuracy. Classification accuracy was also strong for children aged 30 months or younger.
Ramirez-Duque [56] tested the feasibility of using a social robot with a humanoid appearance to elicit and assess joint attention in children with a diagnosis of ASD. The robot was used in triadic interactions. The results showed that children with ASD produced fewer joint attention-related behaviors compared to a control group of children with other neurodevelopmental disorders.
3.4. Technology Maturity
About half (57%; n = 16) of the papers identified were classified as reporting on a functional prototype (see Figure 3). Of these prototypes, 10 (62%) were L1 screening tools. Similarly, of the papers reporting on technologies classified as publicly available (n = 12), the majority (92%; n = 11) reported on L1 screening tools. Almost all the screening tools classified as publicly available (n = 10) were PC/mobile interfaces used to administer parent-reported questionnaires for L1 screening. In contrast, functional prototypes were mostly represented by NUI interfaces (56%; n = 9), of which five involved the use of eye trackers.
Figure 3.
Papers included in the review grouped according to maturity and screening levels.
3.5. Psychometric Properties
Table 2 reports key information on the psychometric properties of the screening tools assessed in the papers identified. Five studies reported all four metrics considered relevant for a screening tool (i.e., Sp, Se, PPV, NPV), and 18 papers reported at least one of these metrics or provided information on accuracy in detecting risk of ASD. Of the papers reporting psychometric information (n = 23), eight reported sensitivity and specificity values equal to or over 75%. It should be noted, however, that sensitivity values below this threshold may not be indicative of poor psychometric properties, as the tool may be reliable in detecting specific ASD subgroups (e.g., [41]).
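The four metrics considered here follow directly from a two-by-two confusion matrix relating screening outcome to clinical status. As a minimal illustration (the counts below are hypothetical, not drawn from any of the reviewed studies):

```python
# Standard screening metrics from a 2x2 confusion matrix.
# tp: screen-positive children with ASD; fn: screen-negative with ASD;
# fp: screen-positive without ASD;     tn: screen-negative without ASD.
def screening_metrics(tp, fn, fp, tn):
    return {
        "sensitivity": tp / (tp + fn),  # Se: proportion of ASD cases flagged
        "specificity": tn / (tn + fp),  # Sp: proportion of non-cases cleared
        "ppv": tp / (tp + fp),          # PPV: P(ASD | positive screen)
        "npv": tn / (tn + fn),          # NPV: P(no ASD | negative screen)
    }

# Hypothetical counts for a sample of 200 screened children.
m = screening_metrics(tp=45, fn=5, fp=20, tn=130)
# Se = 0.90, Sp = 0.867, PPV = 0.692, NPV = 0.963 (rounded)
```

Note that, unlike Se and Sp, PPV and NPV depend on the prevalence of ASD in the screened sample, which is one reason the papers reviewed do not always report all four values.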
4. Discussion
Prospective identification of early signs of ASD is widely considered a priority to ensure that children at risk of this condition have timely access to specialized services and interventions [11]. The aim of this paper was to provide healthcare and other practitioners with an overview of the technologies available to support them in the identification of overt behavioral signs of ASD in children up to six years of age. Overall, the solutions identified varied greatly in terms of screening modalities (e.g., questionnaires, behavior observations), type of interface used (e.g., tablets, eye trackers), the granularity of the behavioral indicators used to estimate risk for ASD (e.g., from subtle eye movements to behaviorally defined clinical symptoms), intended technology users (e.g., parents, clinicians), and the age ranges covered by the screening tools developed. Notwithstanding such variability, psychometric information points to considering available technologies as promising supports in clinical practice to detect early signs of ASD in young children. In light of these findings, some considerations may be put forward.
First, one of the main barriers to ASD screening seems to be implementing such activity within routine clinical practice due to lack of administration or scoring time [9]. The literature identified in the current review suggests that the administration and scoring of either existing (e.g., M-CHAT) or newly developed parent-reported questionnaires can be automated through machine learning (ML). Such ML-based solutions can be implemented within the electronic health record (EHR) of specific primary care or specialized services (e.g., CHICA), and are effective in reducing the burden on care staff. Specifically, the evidence reviewed indicates a rapid increase in the number of children screened for ASD during visits. Despite such encouraging results, however, it remains unclear whether clinicians would take advantage of this automated approach to screening. For instance, in the study by Downs et al. [32], almost half of positive M-CHAT results were not followed up by clinicians. A possible strategy to cope with this issue may be automating the whole screening process to ensure that at-risk children are properly assessed [32].
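The core of automated questionnaire scoring is simple enough to sketch: each parent response is compared against the answer pattern that counts toward risk, and the total is checked against a referral cut-off that can trigger a follow-up flag in the EHR. The item names and threshold below are purely illustrative assumptions, not the validated M-CHAT scoring algorithm or the CHICA implementation:

```python
# Illustrative sketch of automated scoring for a parent-reported screener.
# Item identifiers and the cut-off are hypothetical examples only.
AT_RISK_THRESHOLD = 3  # hypothetical referral cut-off

def score_questionnaire(responses, at_risk_answers):
    """responses: {item_id: 'yes'/'no'} as entered by the parent.
    at_risk_answers: {item_id: answer that counts toward risk}.
    Returns (total_at_risk_items, refer_for_followup)."""
    total = sum(1 for item, answer in responses.items()
                if at_risk_answers.get(item) == answer)
    return total, total >= AT_RISK_THRESHOLD

responses = {"points_to_show": "no", "responds_to_name": "no",
             "pretend_play": "no", "eye_contact": "yes"}
at_risk = {"points_to_show": "no", "responds_to_name": "no",
           "pretend_play": "no", "eye_contact": "no"}

total, refer = score_questionnaire(responses, at_risk)
# total == 3, refer == True -> would raise a follow-up flag in the record
```

Even this rule-based form removes the clinician's scoring step; the ML-based solutions reviewed go further by learning which item patterns best predict risk rather than using a fixed cut-off.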
Second, several mobile solutions have been developed that allow data collection on children’s behaviors in non-clinical settings (e.g., home). The most affordable and effective solutions include the use of smartphones to record videos of children in their daily contexts, which are subsequently analyzed (i.e., scored) by expert clinicians [37,39]. In these studies, home-made videos could be further supplemented by short questionnaires to improve the accuracy of the screening process. Alternatively, Young et al. [42] replaced text-based questionnaires with video-based ones to enable detection of ASD in infancy and clearly showed that video can be used to improve parent reporting of early development. Together, mobile-based solutions may be considered a strategy to (a) reduce the burden on health services, (b) increase the number of screened children, and (c) accelerate the diagnostic process. Further research is needed, however, to explore whether these mobile-based screening strategies can also be effective when used in other settings (e.g., kindergartens) and by other users, such as pre-school teachers. Indeed, few screening tools have been developed for these stakeholders, despite their importance as informants on the social behaviors of children with ASD compared to their normative peer groups [63,64]. As mobile, interactive, and smart technologies (e.g., smartphones, tablets, robots) become increasingly available in educational settings to foster children’s learning and creativity (e.g., [65,66,67]), teachers can be trained to use them also to contribute to the screening of young children, thus providing valuable information on children’s behavior in socially rich environments (e.g., kindergartens, primary schools).
Third, encouraging evidence is available on the use of technology combined with ML to detect early signs of ASD through the monitoring and subsequent analysis of bio-behavioral markers, such as speech, movement, and gaze behavior. In particular, monitoring of eye gaze behavior by means of an eye tracker emerged as the most used screening strategy to (a) distinguish between children at risk and neurotypical children (e.g., [49,51]), (b) perform L2 screening procedures (e.g., [38]), or (c) identify ASD subgroups [41]. Overall, current evidence suggests that monitoring of eye gaze should not be considered a replacement for more traditional screening practices (e.g., parent-reported questionnaires), but an additional source of information about early signs of ASD. As already mentioned, screening is indeed widely considered a multistep process, whereby failing an L1 assessment would require a secondary screener (L2) before initiating a diagnostic process [27]. Based on the present findings, we argue that the increased availability of affordable and reliable eye trackers could facilitate the diffusion of this screening strategy as an L2 screener in a variety of contexts. However, more research is needed on (a) the integration of this technology in routine clinical practice, (b) whether the use of eye trackers is acceptable to clinicians, and (c) how the information gathered from the analysis of children’s eye movements can be integrated with the results obtained from more traditional screening tests.
Voice recordings, movement observation, and social robots were further strategies identified in the present review to screen for ASD in young children (e.g., [52,53]). Although promising, these emerging technologies may be considered at an earlier stage of development compared to eye tracking.
Fourth, the maturity of the screening solutions in terms of technological development was found to be well balanced across maturity levels (i.e., Publicly Available, Functional Prototype), but highly unbalanced with respect to screening level. Specifically, almost all the solutions included in the Publicly Available category are L1 (or universal) screening tools. This is not surprising given that the majority of the L1 screening solutions identified are parent-reported questionnaires, which include already validated (and available) tools (e.g., M-CHAT). Based on this finding, it can be argued that the transition from traditional to technology-based screening tools may be primarily based on the adaptation of currently available screening strategies (i.e., questionnaires).
Fifth, understanding the feasibility, acceptability, and effectiveness of implementing telehealth assessment is becoming of fundamental importance to cope with limitations to health service delivery due to either low available resources (e.g., lack of trained staff) or public health emergencies (e.g., coronavirus disease 2019) [68,69]. As shown in the study by Talbott et al. [43], this approach required the active involvement of parents, who had to elicit target behaviors and collect data to be shared with expert clinicians. Though telehealth assessment proved acceptable to parents, more research is needed to understand its applicability to parents who may experience language barriers or are less confident with technology.
Sixth, although we attempted to provide a comprehensive overview of the technology-based solutions available to screen for ASD, some limitations may have reduced the number of potentially relevant screening solutions identified. For instance, we excluded papers reporting on screening tools at a conceptual design phase that were not tested with the target population. Two further limitations include the decision (a) to focus on screening tools assessing overt child behaviors, thus excluding technologies to detect biological markers related to ASD, and (b) to exclude literature focusing exclusively on ML approaches to ASD screening that were not implemented in clinical settings.
In conclusion, the results of the present review of the literature suggest that technology may be a valuable support for ASD screening. Already validated parent-reported questionnaires may be easily adapted for administration through mobile platforms to speed up the administration and scoring processes. Commercially available mobile technologies may be used to extend the screening process to children’s life settings (e.g., home, kindergartens). In addition, more sophisticated technologies such as eye trackers may be considered a valid supplement to traditional screening measures.
Author Contributions
Conceptualization, L.D., P.P.-F., and G.H.; methodology, L.D., P.P.-F., and G.H.; formal analysis, L.D.; data curation, L.D.; writing—original draft preparation, L.D.; writing—review and editing, L.D., P.P.-F., and G.H. All authors have read and agreed to the published version of the manuscript.
Funding
This research was in part funded by the ERASMUS+ Programme of the European Union under the project New Monitoring guidelines to develop innovative ECEC teachers curricula (NEMO), grant number 2019-1-IT02-KA201-063340.
Institutional Review Board Statement
Not applicable.
Informed Consent Statement
Not applicable.
Data Availability Statement
The data presented in this study are openly available in Open Science Framework. doi: 10.17605/OSF.IO/8Y9RG. In detail, all the information concerning selection and scoring of the papers’ titles can be found here: https://mfr.osf.io/render?url=https://osf.io/udezy/?direct%26mode=render%26action=download%26mode=render. All the information concerning the selection and scoring of the papers’ abstracts can be found here: https://mfr.osf.io/render?url=https://osf.io/kx87p/?direct%26mode=render%26action=download%26mode=render.
Conflicts of Interest
The authors declare no conflict of interest. The funders had no role in the design of the study; in the collection, analyses, or interpretation of data; in the writing of the manuscript, or in the decision to publish the results.
Appendix A
Table A1.
PubMed search terms.
| Search ID | Search Terms |
|---|---|
| 1 | Child, Preschool (Mesh) |
| 2 | Infant * or baby or babies or toddler * or girl * or boy * or pre * school * |
| 3 | #1 OR #2 |
| 4 | Autism Spectrum Disorder (Mesh) |
| 5 | autis * or asperger * or pervasive or PDD or PDDNOS or pervasive develop * or autistic |
| 6 | #4 OR #5 |
| 7 | #3 AND #6 |
| 8 | Technology (Mesh) |
| 9 | Computer or mobile or digital or smart or wearable * or ICT or information technology or electronic or device or smartphone or mobile phone or virtual reality or robots or social robot * or augmented reality or speech generating device or SGD or iPad or tablet or eye tracker or gaze tracker or eye tracking or sensors or artificial intelligence or AI or voice-controlled or personal assistants or virtual assistants or smartwatch or iWatch or smartglasses or GPS or assistive technology or AT or internet of things or IOT |
| 10 | #8 OR #9 |
| 11 | Early Diagnosis (Mesh) |
| 12 | Early Medical Intervention (Mesh) |
| 13 | Early Intervention, Educational (Mesh) |
| 14 | #11 OR #12 OR #13 |
| 15 | #10 AND #14 |
| 16 | #7 AND #15 |
Table A2.
EBSCO and Web of Science (WoS) search terms.
| Search ID | Search Terms |
|---|---|
| 1 | Infant * or baby or babies or toddler * or girl * or boy * or pre * school * |
| 2 | autis * or asperger * or pervasive or PDD or PDDNOS or pervasive develop * or autistic |
| 3 | #1 AND #2 |
| 4 | Technology or Computer or mobile or digital or smart or wearable * or ICT or information technology or electronic or device or smartphone or mobile phone or virtual reality or robots or social robot * or augmented reality or speech generating device or SGD or iPad or tablet or eye tracker or gaze tracker or eye tracking or sensor * or artificial intelligence or AI or voice-controlled or personal assistants or virtual assistants or smartwatch or iWatch or smartglasses or GPS or assistive technology or AT or internet of things or IOT |
| 5 | Diagnosis or Screening or Early Intervention or Preschool Education |
| 6 | #4 AND #5 |
| 7 | #3 AND #6 |
* Limiters—Published Date: 19900101–20191231; Expanders—Apply equivalent subjects; Narrow by Language—English; Narrow by SubjectAge—preschool age (2–5 years); Narrow by SubjectAge—childhood (birth–12 years); Search modes—Boolean/Phrase.
Table A3.
Institute of Electrical and Electronics Engineers (IEEE) search terms.
| Search ID | Search Terms |
|---|---|
| 1 | preschool |
| 2 | Infant * |
| 3 | #1 OR #2 |
| 4 | Autism Spectrum Disorder (Mesh) |
| 5 | autis* or pervasive |
| 6 | #4 OR #5 |
| 7 | #3 AND #6 |
Table A4.
Association for Computing Machinery (ACM) search terms.
| Search ID | Search Terms |
|---|---|
| 1 | (All: autism) AND (All: infant) |
Appendix B
In this first step, the titles of the papers retrieved will be reviewed by three independent researchers (Lorenzo Desideri, Patricia Pérez-Fuster, and Gerardo Herrera) and scored as not relevant (0), probably relevant (1), or relevant (2). The scores will be added to make a sum score ranging from 0 to 6. All publications with a sum score of 6 will be selected for the next step. In general, in case of doubt, please keep the title in the list (i.e., if the age range is not specified, if the target population is not clear or may include ASD together with other populations, if it is not clear whether it is related to screening/monitoring/intervention, or if it is not clear whether it is a review paper or a primary study).
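The selection rule above can be stated compactly: each of the three reviewers assigns 0, 1, or 2 to a title, and only titles whose scores sum to the maximum of 6 progress to abstract screening. A minimal sketch of this rule (the example titles are shortened from Table A5):

```python
# Title-screening rule: three reviewers each score a title 0 (not
# relevant), 1 (probably relevant), or 2 (relevant); only titles with
# a sum score of 6 are retained for the abstract-screening step.
def select_titles(scores_by_title):
    """scores_by_title: {title: (r1, r2, r3)}, each score in {0, 1, 2}."""
    return [title for title, scores in scores_by_title.items()
            if sum(scores) == 6]

scores = {
    "iPad based early intervention for autism": (2, 2, 2),
    "Avatar software in counselling with a 14-year-old boy": (0, 1, 0),
}
# select_titles(scores) -> ["iPad based early intervention for autism"]
```

Requiring the maximum sum means a title advances only when all three reviewers independently rate it as relevant, which is why the instructions above push reviewers to be inclusive when in doubt.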
Table A5.
Instructions for titles scoring.
| Score | Instructions | Examples |
|---|---|---|
| 0 points | (a) Title refers to a different age range than 0–6 OR (b) Title refers to a different term than autism (i.e., elderly or cerebral palsy, but not autism related terms) OR (c) Title refers to a different application area than Screening/ Monitoring or Intervention OR (d) Title is related to a systematic review or meta-analysis (instead of being a primary study) OR (e) Title is related to genetic/biochemical research | Title 1: “Digital images as meaning bridges: Case study of assimilation using avatar software in counselling with a 14-year-old boy” Explanation: The study satisfies two inclusion criteria: (1) it involves autism, (2) it refers to a technology-based intervention. However, it is explicitly mentioned that it does not focus on pre-school children. Title 2: “Technology-mediated learning in students with ASD. A bibliographical review” Explanation: The study satisfies two inclusion criteria: (a) it involves autism, (b) it refers to a technology-based intervention. However, it is a systematic review. |
| 1 point | (a) Title includes any term related to autism spectrum disorder condition (autis* or Asperger* or pervasive or PDD or PDDNOS or pervasive develop* or autistic) OR (b) Title refers to (any kind of) technology-based intervention or screening (or monitoring) AND (c) Title does not qualify for any of the 5 options that apply for 0 points | Title 1: “Sustained Community Implementation of JASPER Intervention with Toddlers with Autism” Explanation: The article refers to autism (and toddlers) which is the focus of our study. Even if we don’t know whether JASPER is a technology-based intervention, it is worth including this article in the next step. Title 2: “Factor Analysis of the Childhood Autism Rating Scale in a Sample of Two Year Olds with an Autism Spectrum Disorder” Explanation: The study satisfies two inclusion criteria: (a) it involves autism, (b) it refers to a tool for diagnosis. I know that the Childhood Autism Rating Scale is an observational tool, but I would prefer to be highly inclusive in this very first step. |
| 2 points | (a) Title includes any term related to autism spectrum disorder condition (autis* or asperger* or pervasive or PDD or PDDNOS or pervasive develop* or autistic) AND (b) Title refers to (any kind of) technology-based intervention or screening (or monitoring) AND (c) Title does not qualify for any of the 5 options that apply for 0 points | Title 1: “Randomised controlled trial of an iPad based early intervention for autism: TOBY playpad study protocol” Explanation: The study satisfies inclusion criteria: (a) it involves autism AND (b) it refers to a technology-based intervention. Title 2: “Automatic newborn cry analysis: a non-invasive tool to help autism early diagnosis” Explanation: The article refers to autism (and newborns). We might suppose that the mentioned “tool” is a kind of digital technology. Hence, it would be better to include this title in the next step. |
References
- American Psychiatric Association. Diagnostic and Statistical Manual of Mental Disorders, 5th ed.; American Psychiatric Association: Washington, DC, USA, 2013. [Google Scholar]
- Hyman, S.L.; Levy, S.E.; Myers, S.M. Identification, evaluation, and management of children with autism spectrum disorder. Pediatrics 2020, 145. [Google Scholar] [CrossRef] [PubMed]
- World Health Organization. Meeting Report: Autism Spectrum Disorders & Other Developmental Disorders: From Raising Awareness to Building Capacity; World Health Organization: Geneva, Switzerland, 2013. [Google Scholar]
- Rojas-Torres, L.P.; Alonso-Esteban, Y.; Alcantud-Marín, F. Early Intervention with Parents of Children with Autism Spectrum Disorders: A Review of Programs. Children 2020, 7, 294. [Google Scholar] [CrossRef] [PubMed]
- Seymour, M.; Wood, C.; Giallo, R.; Jellett, R. Fatigue, stress and coping in mothers of children with an autism spectrum disorder. J. Autism Dev. Disord. 2013, 43, 1547–1554. [Google Scholar] [CrossRef]
- Cakir, J.; Frye, R.E.; Walker, S.J. The lifetime social cost of autism: 1990–2029. Res. Autism Spectr. Disord. 2020, 72, 101502. [Google Scholar] [CrossRef]
- Tachibana, Y.; Miyazaki, C.; Ota, E.; Mori, R.; Hwang, Y.; Kobayashi, E.; Kamio, Y. A systematic review and meta-analysis of comprehensive interventions for pre-school children with autism spectrum disorder (ASD). PLoS ONE 2017, 12, e0186502. [Google Scholar] [CrossRef]
- Daniels, A.M.; Halladay, A.K.; Shih, A.; Elder, L.M.; Dawson, G. Approaches to enhancing the early detection of autism spectrum disorders: A systematic review of the literature. J. Am. Acad. Child Adolesc. Psychiatry 2014, 53, 141–152. [Google Scholar] [CrossRef]
- Zwaigenbaum, L.; Bauman, M.L.; Choueiri, R.; Kasari, C.; Carter, A.; Granpeesheh, D.; Natowicz, M.R. Early intervention for children with autism spectrum disorder under 3 years of age: Recommendations for practice and research. Pediatrics 2015, 136 (Suppl. 1), S60–S81. [Google Scholar] [CrossRef]
- Pierce, K.; Courchesne, E.; Bacon, E. To screen or not to screen universally for autism is not the question: Why the task force got it wrong. J. Pediatrics 2016, 176, 182–194. [Google Scholar] [CrossRef]
- Zwaigenbaum, L.; Brian, J.A.; Ip, A. Early detection for autism spectrum disorder in young children. Paediatr. Child Health 2019, 24, 424–432. [Google Scholar] [CrossRef]
- Lord, C.; Brugha, T.S.; Charman, T.; Cusack, J.; Dumas, G.; Frazier, T.; Veenstra-VanderWeele, J. Autism spectrum disorder. Nat. Rev. Dis. Primers 2020, 6, 1–23. [Google Scholar] [CrossRef] [PubMed]
- Petrocchi, S.; Levante, A.; Lecciso, F. Systematic Review of Level 1 and Level 2 Screening Tools for Autism Spectrum Disorders in Toddlers. Brain Sci. 2020, 10, 180. [Google Scholar] [CrossRef]
- Levy, S.E.; Wolfe, A.; Coury, D.; Duby, J.; Farmer, J.; Schor, E.; Warren, Z. Screening tools for autism spectrum disorder in primary care: A systematic evidence review. Pediatrics 2020, 145 (Suppl. 1), S47–S59. [Google Scholar] [CrossRef]
- Austin, J.; Manning-Courtney, P.; Johnson, M.L.; Weber, R.; Johnson, H.; Murray, D.; Murray, M. Improving access to care at autism treatment centers: A System analysis approach. Pediatrics 2016, 137 (Suppl. 2), S149–S157. [Google Scholar] [CrossRef]
- Bryson, S.E.; Zwaigenbaum, L.; McDermott, C.; Rombough, V.; Brian, J. The Autism Observation Scale for Infants: Scale development and reliability data. J. Autism Dev. Disord. 2008, 38, 731–738. [Google Scholar] [CrossRef]
- Zwaigenbaum, L.; Bryson, S.; Rogers, T.; Roberts, W.; Brian, J.; Szatmari, P. Behavioral manifestations of autism in the first year of life. Int. J. Dev. Neurosci. 2005, 23, 143–152. [Google Scholar] [CrossRef]
- Bejarano-Martín, Á.; Canal-Bedia, R.; Magán-Maganto, M.; Fernández-Álvarez, C.; Cilleros-Martín, M.V.; Sánchez-Gómez, M.C.; de la Paz, M.P. Early detection, diagnosis and intervention services for young children with autism spectrum disorder in the European Union (ASDEU): Family and professional perspectives. J. Autism Dev. Disord. 2020, 50, 3380–3394. [Google Scholar] [CrossRef]
- Centres for Disease, Control and Prevention. Screening and Diagnosis of Autism Spectrum Disorder. Available online: https://www.cdc.gov/ncbddd/autism/screening.html#:~:text=Diagnosing%20autism%20spectrum%20disorder%20(ASD,at%2018%20months%20or%20younger (accessed on 19 December 2020).
- García-Primo, P.; Hellendoorn, A.; Charman, T.; Roeyers, H.; Dereu, M.; Roge, B.; Canal-Bedia, R. Screening for autism spectrum disorders: State of the art in Europe. Eur. Child Adolesc. Psychiatry 2014, 23, 1005–1021. [Google Scholar] [CrossRef]
- Arunyanart, W.; Fenick, A.; Ukritchon, S.; Imjaijitt, W.; Northrup, V.; Weitzman, C. Developmental and autism screening: A survey across six states. Infants Young Child. 2012, 25, 175–187. [Google Scholar] [CrossRef]
- Bauer, N.S.; Sturm, L.A.; Carroll, A.E.; Downs, S.M. Computer decision support to improve autism screening and care in community pediatric clinics. Infants Young Child. 2013, 26, 306–317. [Google Scholar] [CrossRef]
- Bölte, S.; Bartl-Pokorny, K.D.; Jonsson, U.; Berggren, S.; Zhang, D.; Kostrzewa, E.; Marschik, P.B. How can clinicians detect and treat autism early? Methodological trends of technology use in research. Acta Paediatr. 2016, 105, 137–144. [Google Scholar] [CrossRef]
- Desideri, L.; Di Santantonio, A.; Varrucciu, N.; Bonsi, I.; Di Sarro, R. Assistive Technology for Cognition to Support Executive Functions in Autism: A Scoping Review. Adv. Neurodev. Disord. 2020, 4, 330–343. [Google Scholar] [CrossRef]
- Egger, H.L.; Dawson, G.; Hashemi, J.; Carpenter, K.L.; Espinosa, S.; Campbell, K.; Sapiro, G. Automatic emotion and attention analysis of young children at home: A ResearchKit autism feasibility study. NPJ Digit. Med. 2018, 1, 1–10. [Google Scholar] [CrossRef]
- Stephenson, J.; Limbrick, L. A review of the use of touch-screen mobile devices by people with developmental disabilities. J. Autism Dev. Disord. 2015, 45, 3777–3791. [Google Scholar] [CrossRef]
- McCarty, P.; Frye, R.E. Early Detection and Diagnosis of Autism Spectrum Disorder: Why is it so difficult? Semin. Pediatric Neurol. 2020, 100831. [Google Scholar] [CrossRef]
- Liberati, A.; Altman, D.G.; Tetzlaff, J.; Mulrow, C.; Gøtzsche, P.C.; Ioannidis, J.P.; Moher, D. The PRISMA statement for reporting systematic reviews and meta-analyses of studies that evaluate health care interventions: Explanation and elaboration. J. Clin. Epidemiol. 2009, 62, e1–e34. [Google Scholar] [CrossRef]
- Robins, D.L.; Dumont-Mathieu, T.M. Early screening for autism spectrum disorders: Update on the modified checklist for autism in toddlers and other measures. J. Dev. Behav. Pediatrics 2006, 27, S111–S119. [Google Scholar] [CrossRef]
- Kientz, J.A.; Hayes, G.R.; Goodwin, M.S.; Gelsomini, M.; Abowd, G.D. Interactive Technologies and Autism, 3rd ed.; Morgan & Claypool Publishers: San Rafael, CA, USA, 2020. [Google Scholar]
- Campbell, K.; Carpenter, K.L.; Espinosa, S.; Hashemi, J.; Qiu, Q.; Tepper, M.; Dawson, G. Use of a digital modified checklist for autism in toddlers–revised with follow-up to improve quality of screening for autism. J. Pediatr. 2017, 183, 133–139. [Google Scholar] [CrossRef]
- Downs, S.M.; Bauer, N.S.; Saha, C.; Ofner, S.; Carroll, A.E. Effect of a Computer-Based Decision Support Intervention on Autism Spectrum Disorder Screening in Pediatric Primary Care Clinics: A Cluster Randomized Clinical Trial. JAMA Netw. Open 2019, 2, e1917676. [Google Scholar] [CrossRef]
- Duda, M.; Daniels, J.; Wall, D.P. Clinical evaluation of a novel and mobile autism risk assessment. J. Autism Dev. Disord. 2016, 46, 1953–1961. [Google Scholar] [CrossRef]
- Harrington, J.W.; Bai, R.; Perkins, A.M. Screening children for autism in an urban clinic using an electronic M-CHAT. Clin. Pediatrics 2013, 52, 35–41. [Google Scholar] [CrossRef]
- Schrader, E.; Delehanty, A.D.; Casler, A.; Petrie, E.; Rivera, A.; Harrison, K.; Wetherby, A.M. Integrating a New Online Autism Screening Tool in Primary Care to Lower the Age of Referral. Clin. Pediatrics 2020, 59, 305–309. [Google Scholar] [CrossRef]
- Carpenter, K.L.; Hahemi, J.; Campbell, K.; Lippmann, S.J.; Baker, J.P.; Egger, H.L.; Dawson, G. Digital behavioral phenotyping detects atypical pattern of facial expression in toddlers with autism. Autism Res. 2020. [Google Scholar] [CrossRef]
- Abbas, H.; Garberson, F.; Glover, E.; Wall, D.P. Machine learning approach for early detection of autism by combining questionnaire and home video screening. J. Am. Med Inform. Assoc. 2018, 25, 1000–1007. [Google Scholar] [CrossRef]
- Frazier, T.W.; Klingemier, E.W.; Parikh, S.; Speer, L.; Strauss, M.S.; Eng, C.; Youngstrom, E.A. Development and Validation of objective and quantitative eye tracking− based measures of autism risk and symptom levels. J. Am. Acad. Child Adolesc. Psychiatry 2018, 57, 858–866. [Google Scholar] [CrossRef]
- Kanne, S.M.; Carpenter, L.A.; Warren, Z. Screening in toddlers and preschoolers at risk for autism spectrum disorder: Evaluating a novel mobile-health screening tool. Autism Res. 2018, 11, 1038–1049. [Google Scholar] [CrossRef]
- Moore, A.; Wozniak, M.; Yousef, A.; Barnes, C.C.; Cha, D.; Courchesne, E.; Pierce, K. The geometric preference subtype in ASD: Identifying a consistent, early-emerging phenomenon through eye tracking. Mol. Autism 2018, 9, 19. [Google Scholar] [CrossRef]
- Pierce, K.; Marinero, S.; Hazin, R.; McKenna, B.; Barnes, C.C.; Malige, A. Eye tracking reveals abnormal visual preference for geometric images as an early biomarker of an autism spectrum disorder subtype associated with increased symptom severity. Biol. Psychiatry 2016, 79, 657–666. [Google Scholar] [CrossRef]
- Young, G.S.; Constantino, J.N.; Dvorak, S.; Belding, A.; Gangi, D.; Hill, A.; Ozonoff, S. A video-based measure to identify autism risk in infancy. J. Child Psychol. Psychiatry 2020, 61, 88–94. [Google Scholar] [CrossRef]
- Talbott, M.R.; Dufek, S.; Zwaigenbaum, L.; Bryson, S.; Brian, J.; Smith, I.M.; Rogers, S.J. Brief Report: Preliminary feasibility of the TEDI: A novel parent-administered telehealth assessment for autism spectrum disorder symptoms in the first year of life. J. Autism Dev. Disord. 2020, 5, 3432–3439. [Google Scholar] [CrossRef]
- Oller, D.K.; Niyogi, P.; Gray, S.; Richards, J.A.; Gilkerson, J.; Xu, D.; Warren, S.F. Automated vocal analysis of naturalistic recordings from children with autism, language delay, and typical development. Proc. Natl. Acad. Sci. USA 2010, 107, 13354–13359. [Google Scholar] [CrossRef]
- Wedyan, M.; Al-Jumaily, A. Early diagnosis autism based on upper limb motor coordination in high risk subjects for autism. In Proceedings of the 2016 IEEE International Symposium on Robotics and Intelligent Sensors (IRIS), Tokyo, Japan, 17–20 December 2016; pp. 13–18. [Google Scholar] [CrossRef]
- Wingfield, B.; Miller, S.; Yogarajah, P.; Kerr, D.; Gardiner, B.; Seneviratne, S.; Coleman, S. A predictive model for paediatric autism screening. Health Inform. J. 2020. [Google Scholar] [CrossRef]
- Ben-Sasson, A.; Robins, D.L.; Yom-Tov, E. Risk assessment for parents who suspect their child has autism spectrum disorder: Machine learning approach. J. Med. Internet Res. 2018, 20, e134. [Google Scholar] [CrossRef]
- Anzulewicz, A.; Sobota, K.; Delafield-Butt, J.T. Toward the autism motor signature: Gesture patterns during smart tablet gameplay identify children with autism. Sci. Rep. 2016, 6, 1–13. [Google Scholar] [CrossRef]
- Vargas-Cuentas, N.I.; Roman-Gonzalez, A.; Gilman, R.H.; Barrientos, F.; Ting, J.; Hidalgo, D.; Zimic, M. Developing an eye-tracking algorithm as a potential tool for early diagnosis of autism spectrum disorder in children. PLoS ONE 2017, 12, e0188826. [Google Scholar] [CrossRef]
- Tariq, Q.; Daniels, J.; Schwartz, J.N.; Washington, P.; Kalantarian, H.; Wall, D.P. Mobile detection of autism through machine learning on home video: A development and prospective validation study. PLoS Med. 2018, 15, e1002705. [Google Scholar] [CrossRef]
- Wan, G.; Kong, X.; Sun, B.; Yu, S.; Tu, Y.; Park, J. Applying eye tracking to identify autism spectrum disorder in children. J. Autism Dev. Disord. 2019, 49, 209–215.
- Crippa, A.; Salvatore, C.; Perego, P.; Forti, S.; Nobile, M.; Molteni, M.; Castiglioni, I. Use of machine learning to identify children with autism and their motor abnormalities. J. Autism Dev. Disord. 2015, 45, 2146–2156.
- Gong, Y.; Yatawatte, H.; Poellabauer, C.; Schneider, S.; Latham, S. Automatic Autism Spectrum Disorder Detection Using Everyday Vocalizations Captured by Smart Devices. In Proceedings of the 2018 ACM International Conference on Bioinformatics, Computational Biology, and Health Informatics, Washington, DC, USA, 29 August–1 September 2018; pp. 465–473.
- Martineau, J.; Hernandez, N.; Hiebel, L.; Roché, L.; Metzger, A.; Bonnet-Brilhault, F. Can pupil size and pupil responses during visual scanning contribute to the diagnosis of autism spectrum disorder in children? J. Psychiatr. Res. 2011, 45, 1077–1082.
- Thabtah, F. An accessible and efficient autism screening method for behavioural data and predictive analyses. Health Inform. J. 2019, 25, 1739–1755.
- Ramirez-Duque, A.A.; Bastos, T.; Munera, M.; Cifuentes, C.A.; Frizera-Neto, A. Robot-Assisted Intervention for children with special needs: A comparative assessment for autism screening. Robot. Auton. Syst. 2020, 103484.
- Frazier, T.W.; Klingemier, E.W.; Beukemann, M.; Speer, L.; Markowitz, L.; Parikh, S.; Wexberg, S.; Giuliano, K.; Schulte, E.; Delahunty, C.; et al. Development of an objective autism risk index using remote eye tracking. J. Am. Acad. Child Adolesc. Psychiatry 2016, 55, 301–309.
- Major, S.; Campbell, K.; Espinosa, S.; Baker, J.P.; Carpenter, K.L.; Sapiro, G.; Dawson, G. Impact of a digital Modified Checklist for Autism in Toddlers–Revised on likelihood and age of autism diagnosis and referral for developmental evaluation. Autism 2020, 24, 1629–1638.
- Bauer, N.S.; Carroll, A.E.; Saha, C.; Downs, S.M. Computer decision support changes physician practice but not knowledge regarding autism spectrum disorders. Appl. Clin. Inform. 2015, 6, 454.
- Wall, D.P.; Dally, R.; Luyster, R.; Jung, J.Y.; DeLuca, T.F. Use of artificial intelligence to shorten the behavioral diagnosis of autism. PLoS ONE 2012, 7, e43855.
- Pierce, K.; Conant, D.; Hazin, R.; Stoner, R.; Desmond, J. Preference for geometric patterns early in life as a risk factor for autism. Arch. Gen. Psychiatry 2011, 68, 101–109.
- Bacon, E.C.; Moore, A.; Lee, Q.; Carter Barnes, C.; Courchesne, E.; Pierce, K. Identifying prognostic markers in autism spectrum disorder using eye tracking. Autism 2020, 24, 658–669.
- Morales-Hidalgo, P.; Hernández-Martínez, C.; Voltas, N.; Canals, J. EDUTEA: A DSM-5 teacher screening questionnaire for autism spectrum disorder and social pragmatic communication disorder. Int. J. Clin. Health Psychol. 2017, 17, 269–281.
- Nah, Y.H. Preliminary data of a preschool teacher-screening checklist for autism spectrum disorder in Singapore. Adv. Autism 2020, 6, 303–313.
- Resnick, M. Lifelong Kindergarten: Cultivating Creativity through Projects, Passion, Peers, and Play; The MIT Press: London, UK, 2017.
- Hughes-Roberts, T.; Brown, D.; Standen, P.; Desideri, L.; Negrini, M.; Rouame, A.; Hasson, C. Examining engagement and achievement in learners with individual needs through robotic-based teaching sessions. Br. J. Educ. Technol. 2019, 50, 2736–2750.
- Desideri, L.; Bonifacci, P.; Croati, G.; Dalena, A.; Gesualdo, M.; Molinario, G.; Ottaviani, C. The Mind in the Machine: Mind Perception Modulates Gaze Aversion During Child–Robot Interaction. Int. J. Soc. Robot. 2020, 1–16.
- Conti, E.; Chericoni, N.; Costanzo, V.; Lasala, R.; Mancini, A.; Prosperi, M.; Apicella, F. Moving Toward Telehealth Surveillance Services for Toddlers at Risk for Autism during the COVID-19 Pandemic. Front. Psychiatry 2020, 11, 565999.
- Dahiya, A.V.; McDonnell, C.; DeLucia, E.; Scarpa, A. A systematic review of remote telehealth assessments for early signs of autism spectrum disorder: Video and mobile applications. Pract. Innov. 2020, 5, 150.
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
© 2021 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).