3.1.1. Facial Mimicry
Autistic children rarely exhibit typical facial mimicry [29]. Facial mimicry refers to physical reactions that replicate the facial expressions others make. The study by McIntosh [30] reports that autistic participants often failed to produce the facial mimicry needed to display their emotional responses externally; rather, they tended only to receive information from others' facial expressions. In other words, autistic children largely face the challenge of being unable to instantly interpret and react to emotions interpersonally. Oberman et al. [31] implemented an intervention to train the facial expressions of autistic children. They sampled a total of 13 children diagnosed with ASD, and the intervention comprised 192 static images of facial expressions. The findings point to a temporal gap between participants' voluntary initiation of facial mimicry and the presentation of the materials to be mimicked; that is, the autistic children showed delayed responses during facial mimicry training. This delay has been attributed to the human mirror neuron system, the part of the brain that supports the replication of observed external information. Dapretto et al. [32] also conducted experiments to test whether high-functioning autistic participants showed any mirror neuron activity in a facial mimicry task. The study found no mirror neuron activation in the inferior frontal gyrus, in contrast to non-autistic participants. Since subsequent studies have confirmed these findings, facial mimicry is considered one of the crucial aspects to address in training rudimentary social skills.
With regard to social skill training, NPCs should use multiple kinds of facial mimicry to stimulate emotional responses. Highly varied facial mimicry is an ability unique to humans, and it supports communication by providing emotional cues that reveal states of mind. However, owing to the remote nature of VR, autistic children are scarcely exposed to interactions initiated by facial expressions. In addition, prior social skill training interventions have mostly not used interactive methods to train facial mimicry; even when autistic participants had opportunities to experience facial mimicry, this was limited to passively viewing facial expression images. Learner-centered activation of facial mimicry while simultaneously communicating with others in VR could effectively address these problems. For example, collaborative VR platforms such as Second Life and OpenSimulator allow users to control facial expressions so as to share emotional states interpersonally. The heads-up display (HUD) technique in such platforms encourages learners to initiate different types of facial mimicry instantaneously.
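The HUD mechanism described above can be illustrated with a minimal sketch: a button press on the HUD is mapped to a set of blend-shape weights on the learner's avatar face. All names and weight values here are hypothetical, assumed for illustration rather than drawn from any particular VR platform's API.

```python
from dataclasses import dataclass, field

@dataclass
class Expression:
    """A named facial expression expressed as blend-shape weights in [0.0, 1.0]."""
    name: str
    blend_shapes: dict = field(default_factory=dict)

# Hypothetical HUD layout: each button triggers one expression.
HUD_EXPRESSIONS = {
    "smile":    Expression("smile",    {"mouthSmile": 0.8, "cheekRaise": 0.5}),
    "surprise": Expression("surprise", {"browUp": 0.9, "jawOpen": 0.6}),
    "frown":    Expression("frown",    {"browDown": 0.7, "mouthFrown": 0.6}),
}

def trigger_expression(button: str) -> dict:
    """Return the blend-shape weights to apply when a HUD button is pressed.

    An unrecognized button leaves the avatar's face neutral (empty mapping).
    """
    expr = HUD_EXPRESSIONS.get(button)
    return expr.blend_shapes if expr is not None else {}
```

The point of the sketch is the interaction model: the learner, not a pre-recorded stimulus, initiates each expression, which is what distinguishes this approach from image-viewing interventions.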
3.1.2. Simulation of Body Gestures
Spontaneous mirroring by neuron activation also influences the initiation of human gestures. Human gestures are typically defined as all kinetic behavior that delivers either cognitive or emotional information to another person. To generate gestures, humans operate various body parts, including groups of hand and arm muscles, to convey an external meaning. Mostly, gestures serve as a supplementary tool for interpersonal communication [33]. When a person wants to augment the meaning of what he or she is saying, gestures occur concurrently. Specifically, researchers have classified gesture types based on their functionality and emotional intensity [35]. Evidence from early studies [37] proposed that gestures deliver innate emotional signals [36], as well as socio-cultural cues for understanding circumstances in which interpersonal conversation must compensate for verbal impairment [38].
Use of gestures has been a determinant in gauging the social functioning disorder of autistic children. According to studies on embodiment [39], embodied mimicry usually occurs at the beginning of communication. De Marchena and Eigsti [40] noticed that individuals with ASD tended to use fewer gestures when initiating interpersonal communication. In this study, the researchers sampled 20 adolescents with ASD to observe the frequency of conversational gestures in social interactions in comparison to a control group. Behavioral analyses confirmed that gesture count was statistically correlated with ratings of communicative quality in interactions (r(15) = 0.69, p = 0.005). In addition, various studies have reported that autistic children experience difficulty imitating gestures in interpersonal communication [31]. The study by Smith and Bryson [39] also elucidated that autistic children struggled with simulating both types of gestures (i.e., socio-communicative and pantomimed actions) compared to non-autistic participants. In response to this need, it is evident that special training in which autistic children produce gestures themselves benefits the development of social communication skills [41]. Similarly, the study by Ellawadi and Weismer [42] reports evidence that gestures by autistic children are correlated with their behavior regulation and joint attention.
Online embodiment [22] and the underlying notion of situated cognition [22] are key concepts for describing how gestures can be applied in VR-based social skill training for autistic children. Online embodiment is the simulated body schema in a virtual environment [22]. It underscores the importance of sensory and motor responses that allow autistic children to manipulate their online avatars. In accordance with this assumption, NPCs in social skill training need to facilitate avatar body movements that can display autistic children's emotional reactions in different social settings. Technically, for instance, several interventions have used body-gesture recognition devices (e.g., Microsoft Kinect) [42] to understand autistic children. Schuller et al. [45] demonstrated a body-tracking-based social skill intervention that enabled autistic children to replicate certain gestures during training.
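One way such a gesture-replication exercise could be scored is sketched below: a learner's tracked pose (Kinect-style named joints with 3D coordinates) is compared against a target gesture by mean per-joint distance. The joint names, coordinates, and tolerance threshold are all illustrative assumptions, not taken from any cited intervention.

```python
import math

def pose_distance(target: dict, observed: dict) -> float:
    """Mean Euclidean distance over the joints present in both poses.

    Poses are mappings from joint name to an (x, y, z) tuple, as a
    skeletal tracker such as the Kinect might report them.
    """
    joints = target.keys() & observed.keys()
    if not joints:
        return float("inf")  # no shared joints: no basis for comparison
    return sum(math.dist(target[j], observed[j]) for j in joints) / len(joints)

def gesture_matched(target: dict, observed: dict, threshold: float = 0.15) -> bool:
    """True if the observed pose falls within an (assumed) tolerance, in meters."""
    return pose_distance(target, observed) <= threshold

# Hypothetical "wave" target and a learner's slightly-off attempt.
wave_target = {"hand_right": (0.4, 1.5, 0.2), "elbow_right": (0.3, 1.2, 0.1)}
learner_pose = {"hand_right": (0.42, 1.48, 0.21), "elbow_right": (0.31, 1.19, 0.12)}
```

A real intervention would compare joint trajectories over time rather than a single frame, but the single-frame comparison conveys the core idea of giving the learner immediate feedback on how closely a gesture was replicated.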