Search Results (15)

Search Parameters:
Keywords = facial mimicry

22 pages, 2072 KiB  
Article
Does Identifying with Another Face Alter Body Image Disturbance in Women with an Eating Disorder? An Enfacement Illusion Study
by Jade Portingale, David Butler and Isabel Krug
Nutrients 2025, 17(11), 1861; https://doi.org/10.3390/nu17111861 - 29 May 2025
Viewed by 655
Abstract
Background/Objectives: Individuals with eating disorders (EDs) experience stronger body illusions than control participants, suggesting that abnormalities in multisensory integration may underlie distorted body perception in these conditions. These illusions can also temporarily reduce body image disturbance. Given the centrality of the face to identity and social functioning—and emerging evidence of face image disturbance in EDs—this study examined, for the first time, whether individuals with EDs exhibit heightened susceptibility to a facial illusion (the enfacement illusion) and whether experiencing this illusion improves face and/or body image. Methods: White Australian female participants (19 with an ED and 24 controls) completed synchronous and asynchronous facial mimicry tasks to induce the enfacement illusion. Susceptibility was assessed via self-report and an objective self-face recognition task, alongside pre- and post-task measures of perceived facial attractiveness, facial adiposity estimation, and head/body dissatisfaction. Results: The illusion was successfully induced across both groups. Contrary to predictions, ED and control participants demonstrated comparable susceptibility, and neither group experienced improvements in face or body image. Notably, participants with EDs experienced increased head dissatisfaction following the illusion. Conclusions: These findings indicate that the multisensory integration processes underlying self-face perception, unlike those underlying body perception, may remain intact in EDs. Participant reflections suggested that the limited therapeutic benefit of the enfacement illusion for EDs may reflect the influence of maladaptive social-evaluative processing biases inadvertently triggered during the illusion. A novel dual-process model is proposed in which distorted self-face perception in EDs may arise from biased social-cognitive processing rather than sensory dysfunction alone.
(This article belongs to the Special Issue Cognitive and Dietary Behaviour Interventions in Eating Disorders)

17 pages, 3042 KiB  
Review
Nuancing ‘Emotional’ Social Play: Does Play Behaviour Always Underlie a Positive Emotional State?
by Giada Cordoni and Ivan Norscia
Animals 2024, 14(19), 2769; https://doi.org/10.3390/ani14192769 - 25 Sep 2024
Cited by 2 | Viewed by 2460
Abstract
This review focuses on social play, a complex behaviour that is often difficult to categorize. Although play has been typically associated with positive emotional states, a thorough examination of the literature indicates that it may relate to different emotional systems, from attachment to conflict. Play oscillates between competition and cooperation, and includes a spectrum in between; thus, quantitatively identifying and demonstrating the emotional nature of play remains challenging. We considered examples from human and non-human animal studies and explored the emotional and neuro-hormonal systems involved in play. We assessed ethological data possibly indicating the emotional states underlying play, and we focused on the cooperative and competitive elements of play. We investigated the relationship between play and affiliative/aggressive behaviours, the communicative meaning of play signals (especially primate play faces), and the motor and possibly emotional contagion function of rapid motor mimicry during play. From all the literature on play, this review selects and combines studies in an innovative way to present the methods (e.g., play indices and social network analysis), tools (e.g., sequential analysis and facial coding software), and evidence indicative of the emotional states underlying play, which is much more complex than previously thought.
(This article belongs to the Special Issue Emotional Contagion in Animals)
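As an illustration of the social network analysis methods the review surveys, the following sketch computes a simple play-sociality index from dyadic play-bout counts; the individuals, counts, and index are invented for illustration and are not taken from the reviewed studies.

```python
# Minimal sketch of a play social network; data are hypothetical.
import networkx as nx

# Hypothetical play-bout counts between individuals in a group
play_bouts = {("A", "B"): 12, ("A", "C"): 5, ("B", "C"): 9, ("C", "D"): 2}

G = nx.Graph()
for (i, j), n in play_bouts.items():
    G.add_edge(i, j, weight=n)

# Weighted degree ("strength") as a simple per-individual play-sociality index
strength = dict(G.degree(weight="weight"))
# Eigenvector centrality weights partners by how playful they themselves are
centrality = nx.eigenvector_centrality(G, weight="weight")

for ind in G.nodes:
    print(f"{ind}: strength={strength[ind]}, centrality={centrality[ind]:.2f}")
```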

15 pages, 1670 KiB  
Article
Recognition of Dynamic Emotional Expressions in Children and Adults and Its Associations with Empathy
by Yu-Chen Chiang, Sarina Hui-Lin Chien, Jia-Ling Lyu and Chien-Kai Chang
Sensors 2024, 24(14), 4674; https://doi.org/10.3390/s24144674 - 18 Jul 2024
Cited by 3 | Viewed by 2301
Abstract
The present study investigates emotion recognition in children and adults and its association with EQ and motor empathy. Overall, 58 children (33 5–6-year-olds, 25 7–9-year-olds) and 61 adults (24 young adults, 37 parents) participated in this study. Each participant received an EQ questionnaire and completed the dynamic emotion expression recognition task, in which participants were asked to identify four basic emotions (happy, sad, fearful, and angry) as they unfolded from neutral to fully expressed states, and the motor empathy task, in which participants’ facial muscle activity was recorded. The results showed that “happy” was the easiest expression for all ages; 5- to 6-year-old children performed equally well as adults. The accuracies for “fearful,” “angry,” and “sad” expressions were significantly lower in children than in adults. For motor empathy, 7- to 9-year-old children exhibited the highest level of facial muscle activity, while the young adults showed the lowest engagement. Importantly, individual EQ scores positively correlated with the motor empathy index in adults but not in children. In sum, our study echoes the previous literature, showing that the identification of negative emotions is still difficult for children aged 5–9 but improves in late childhood. Our results also suggest that stronger facial mimicry responses are positively related to a higher level of empathy in adults.
(This article belongs to the Special Issue Emotion Recognition and Cognitive Behavior Analysis Based on Sensors)
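For readers unfamiliar with the correlation analysis reported here, a minimal sketch of relating questionnaire EQ scores to an EMG-derived motor empathy index might look as follows; all values are simulated, and the study's actual index and preprocessing may differ.

```python
# Illustrative sketch (not the authors' code): EQ vs. motor empathy correlation.
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(0)
eq_scores = rng.normal(40, 10, size=61)        # hypothetical EQ scores (61 adults)
# Hypothetical motor empathy index (mean congruent facial-muscle response),
# simulated here to covary weakly with EQ
motor_empathy = 0.02 * eq_scores + rng.normal(0, 0.2, size=61)

r, p = pearsonr(eq_scores, motor_empathy)
print(f"Pearson r = {r:.2f}, p = {p:.3f}")
```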

12 pages, 933 KiB  
Article
Mimicking Facial Expressions Facilitates Working Memory for Stimuli in Emotion-Congruent Colours
by Thaatsha Sivananthan, Steven B. Most and Kim M. Curby
Vision 2024, 8(1), 4; https://doi.org/10.3390/vision8010004 - 30 Jan 2024
Viewed by 2497
Abstract
It is one thing for everyday phrases like “seeing red” to link some emotions with certain colours (e.g., anger with red), but can such links measurably bias information processing? We investigated whether emotional face information (angry/happy/neutral) held in visual working memory (VWM) enhances memory for shapes presented in a conceptually consistent colour (red or green) (Experiment 1). Although emotional information held in VWM appeared not to bias memory for coloured shapes in Experiment 1, exploratory analyses suggested that participants who physically mimicked the face stimuli were better at remembering congruently coloured shapes. Experiment 2 confirmed this finding by asking participants to hold the faces in mind while either mimicking or labelling the emotional expressions of face stimuli. Once again, those who mimicked the expressions were better at remembering shapes with emotion-congruent colours, whereas those who simply labelled them were not. Thus, emotion–colour associations appear powerful enough to guide attention, but—consistent with proposed impacts of “embodied emotion” on cognition—such effects emerged when emotion processing was facilitated through facial mimicry.

14 pages, 6948 KiB  
Article
The Neural Mechanisms of Group Membership Effect on Emotional Mimicry: A Multimodal Study Combining Electromyography and Electroencephalography
by Beibei Kuang, Shenli Peng, Yuhang Wu, Ying Chen and Ping Hu
Brain Sci. 2024, 14(1), 25; https://doi.org/10.3390/brainsci14010025 - 25 Dec 2023
Viewed by 1782
Abstract
Emotional mimicry plays a vital role in understanding others’ emotions and has been found to be modulated by social contexts, especially group membership. However, the neural mechanisms underlying this modulation remain unclear. We explored whether and how group membership modulates emotional mimicry using a multimodal method combining facial electromyography (fEMG) and electroencephalography (EEG). We instructed participants to passively view dynamic emotional faces (happy vs. angry) of others (in-group vs. out-group) and simultaneously recorded their fEMG and EEG responses. We then conducted combined fEMG-EEG analyses by splitting the EEG trials into two mimicry intensity categories (high-intensity vs. low-intensity mimicry) according to fEMG activity. The fEMG results confirmed the occurrence of emotional mimicry in the present study but showed no group membership effect. However, the EEG results showed that participants mimicked in-group happiness and anger more strongly than out-group expressions. Importantly, this in-group preference involved different neural mechanisms for happiness and anger mimicry: in-group preference for happiness mimicry was reflected in multiple ERP components, including N1 (at P7, Pz, and P8), P2 (at Pz and P8), N2 (at P8), and P3 (at P7, Pz, and P8), whereas in-group preference for anger mimicry was reflected in P1 (at P7) and P2 (at Pz). Our findings provide new neural evidence for the effect of group membership on emotional mimicry by uncovering its temporal dynamics.
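The trial-splitting logic described above can be sketched as follows, assuming a median split on per-trial fEMG amplitude (the authors' exact splitting criterion may differ); all arrays are simulated stand-ins.

```python
# Sketch: split epoched EEG trials by per-trial fEMG amplitude, then average.
import numpy as np

rng = np.random.default_rng(1)
n_trials, n_channels, n_samples = 120, 32, 500
eeg = rng.normal(size=(n_trials, n_channels, n_samples))   # epoched EEG trials
femg = rng.gamma(2.0, 1.0, size=n_trials)                  # per-trial fEMG amplitude

high = femg >= np.median(femg)          # high-intensity mimicry trials (median split)
erp_high = eeg[high].mean(axis=0)       # trial-averaged ERP, high-intensity category
erp_low = eeg[~high].mean(axis=0)       # trial-averaged ERP, low-intensity category
print(erp_high.shape, erp_low.shape)    # (32, 500) each: channels x samples
```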

19 pages, 1688 KiB  
Article
Electromyographic Validation of Spontaneous Facial Mimicry Detection Using Automated Facial Action Coding
by Chun-Ting Hsu and Wataru Sato
Sensors 2023, 23(22), 9076; https://doi.org/10.3390/s23229076 - 9 Nov 2023
Cited by 7 | Viewed by 2525
Abstract
Although electromyography (EMG) remains the standard, researchers have begun using automated facial action coding system (FACS) software to evaluate spontaneous facial mimicry despite the lack of evidence of its validity. Using the facial EMG of the zygomaticus major (ZM) as a standard, we confirmed the detection of spontaneous facial mimicry in action unit 12 (AU12, lip corner puller) via automated FACS. Participants were alternately presented with real-time model performances and prerecorded videos of dynamic facial expressions, while the ZM signal and frontal facial videos were acquired simultaneously. AU12 activity was estimated from the facial videos using FaceReader, Py-Feat, and OpenFace. Automated FACS is less sensitive and less accurate than facial EMG, but AU12 mimicking responses were significantly correlated with ZM responses. All three software programs detected enhanced facial mimicry in response to live performances. The AU12 time series showed a roughly 100 to 300 ms latency relative to the ZM. Our results suggest that while automated FACS cannot replace facial EMG in mimicry detection, it may suffice when effect sizes are large. Researchers should be cautious with automated FACS outputs, especially when studying clinical populations. In addition, developers should consider EMG validation of AU estimation as a benchmark.
(This article belongs to the Special Issue Advanced-Sensors-Based Emotion Sensing and Recognition)
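A latency like the reported 100–300 ms between the AU12 and ZM time series is commonly estimated via cross-correlation; the sketch below illustrates that approach on simulated signals, with the sampling rate and preprocessing chosen as assumptions rather than taken from the paper's pipeline.

```python
# Sketch: estimate the lag between an "AU12" series and a "ZM EMG" series.
import numpy as np

fs = 50                                   # assumed common sampling rate (Hz)
rng = np.random.default_rng(2)
# Smoothed noise standing in for a rectified, downsampled ZM envelope
zm = np.convolve(rng.normal(size=1000), np.ones(25) / 25, mode="same")
lag_true = 10                             # 10 samples at 50 Hz = 200 ms
au12 = np.roll(zm, lag_true) + 0.1 * rng.normal(size=zm.size)  # delayed "AU12"

def lag_samples(x, y):
    """Lag (in samples) at which y best matches x; positive = y lags x."""
    x = (x - x.mean()) / x.std()
    y = (y - y.mean()) / y.std()
    xc = np.correlate(y, x, mode="full")
    return np.argmax(xc) - (len(x) - 1)

print(f"estimated latency: {lag_samples(zm, au12) / fs * 1000:.0f} ms")
```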

17 pages, 2649 KiB  
Article
Exploring the Influence of Context on Emotional Mimicry and Intention: An Affirmation of the Correction Hypothesis
by Xiaohui Xu and Ping Hu
Behav. Sci. 2023, 13(8), 677; https://doi.org/10.3390/bs13080677 - 11 Aug 2023
Viewed by 1657
Abstract
Background: Emotional mimicry, a phenomenon frequently observed in our everyday interactions, is the act of replicating another individual’s facial expression. The Emotion Mimicry in Context View and the Correction Hypothesis underscore the critical role of context and intention within emotional mimicry. Methods: In two distinct studies, participants were presented with facial expressions of models (happiness and anger) within various contexts (affiliative, distancing, and neutral). Concurrently, we recorded electromyography (EMG) to index emotional mimicry, while participants explicitly rated the models’ intentions. Results: We observed that context swiftly influences emotional mimicry, within 500 ms, notably when the intention of the context opposes that of the facial expression, leading to weakened muscle responses and diminished perceived intention. Furthermore, a notable correlation was discovered in the mimicry of angry faces: the more distancing the context, the stronger the corrugator supercilii (CS) muscle activity after context processing. Conclusions: First, emotional mimicry should not be viewed simply as an output corresponding to the expresser’s facial expressions but as a dynamic process involving the active participation of the observer. Second, intention serves as a pivotal anchor, effectively integrating facial and contextual information. As such, we provide empirical support for the Correction Hypothesis.
(This article belongs to the Section Cognition)

18 pages, 2748 KiB  
Article
“When You’re Smiling”: How Posed Facial Expressions Affect Visual Recognition of Emotions
by Francesca Benuzzi, Daniela Ballotta, Claudia Casadio, Vanessa Zanelli, Carlo Adolfo Porro, Paolo Frigio Nichelli and Fausta Lui
Brain Sci. 2023, 13(4), 668; https://doi.org/10.3390/brainsci13040668 - 16 Apr 2023
Cited by 2 | Viewed by 3608
Abstract
Facial imitation occurs automatically during the perception of an emotional facial expression, and preventing it may interfere with the accuracy of emotion recognition. In the present fMRI study, we evaluated the effect of posing a facial expression on the recognition of ambiguous facial expressions. Since facial activity is affected by various factors, such as empathic aptitudes, the Interpersonal Reactivity Index (IRI) questionnaire was administered and scores were correlated with brain activity. Twenty-six healthy female subjects took part in the experiment. The volunteers were asked to pose a facial expression (happy, disgusted, neutral), then to watch an ambiguous emotional face, and finally to indicate whether the emotion perceived was happiness or disgust. Blends of happy and disgusted faces were used as stimuli. Behavioral results showed that posing an emotional face increased the percentage of congruence with the perceived emotion. When participants posed a facial expression and perceived a non-congruent emotion, a neural network comprising the bilateral anterior insula was activated. Brain activity was also correlated with empathic traits, particularly empathic concern, fantasy and personal distress. Our findings support the idea that facial mimicry plays a crucial role in identifying emotions, and that empathic emotional abilities can modulate the brain circuits involved in this process.
(This article belongs to the Section Cognitive, Social and Affective Neuroscience)

20 pages, 2259 KiB  
Article
Investigating the Relationship between Facial Mimicry and Empathy
by Yevgeniya Kovalchuk, Elizabeta Budini, Robert M. Cook and Andrew Walsh
Behav. Sci. 2022, 12(8), 250; https://doi.org/10.3390/bs12080250 - 24 Jul 2022
Cited by 8 | Viewed by 5405
Abstract
Facial expressions play a key role in interpersonal communication when it comes to negotiating our emotions and intentions, as well as interpreting those of others. Research has shown that we connect to other people better when we exhibit signs of empathy and facial mimicry. However, the relationship between empathy and facial mimicry is still debated. Among the factors contributing to the differences in results across existing studies are the use of different instruments for measuring both empathy and facial mimicry, and the frequent neglect of differences across demographic groups. This study first looks at differences in the empathetic abilities of people across demographic groups based on gender, ethnicity and age. Empathetic ability is measured with the Empathy Quotient, capturing a balanced representation of both emotional and cognitive empathy. Using statistical and machine learning methods, the study then investigates the correlation between empathetic ability and the facial mimicry of subjects in response to images portraying different emotions displayed on a computer screen. Unlike existing studies measuring facial mimicry using electromyography, this study employs a technology that detects facial expressions from video capture using deep learning, a choice made in the context of increased online communication during and after the COVID-19 pandemic. The results confirm the previously reported difference in empathetic ability between females and males. However, no significant difference in empathetic ability was found across age and ethnic groups. Furthermore, no strong correlation was found between empathy and facial reactions to faces portraying different emotions shown on a computer screen. Overall, the results of this study can be used to inform the design of online communication technologies and of tools for training empathy in team leaders, educators, and social and healthcare providers.
(This article belongs to the Section Social Psychology)
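To make the video-based mimicry measure concrete, here is a minimal sketch in which a stub detector stands in for the deep-learning expression tool the study used; the detector function, emotion set, and scoring rule are illustrative assumptions, not the authors' pipeline.

```python
# Sketch of a per-frame, video-based mimicry score; the detector is a stub.
import numpy as np

rng = np.random.default_rng(3)

def detect_emotion_probs(frame):
    """Hypothetical per-frame detector: probabilities over 6 basic emotions.
    A real system would run a trained model on the frame; this stub is random."""
    p = rng.random(6)
    return p / p.sum()

EMOTIONS = ["anger", "disgust", "fear", "happiness", "sadness", "surprise"]

def mimicry_score(frames, shown_emotion):
    """Mean detected probability of the displayed emotion across video frames."""
    idx = EMOTIONS.index(shown_emotion)
    return float(np.mean([detect_emotion_probs(f)[idx] for f in frames]))

frames = [np.zeros((48, 48)) for _ in range(30)]   # stand-in video frames
print(mimicry_score(frames, "happiness"))
```

Per-subject scores like these could then be correlated with EQ, as in the study.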

16 pages, 2717 KiB  
Article
Emotion Recognizing by a Robotic Solution Initiative (EMOTIVE Project)
by Grazia D’Onofrio, Laura Fiorini, Alessandra Sorrentino, Sergio Russo, Filomena Ciccone, Francesco Giuliani, Daniele Sancarlo and Filippo Cavallo
Sensors 2022, 22(8), 2861; https://doi.org/10.3390/s22082861 - 8 Apr 2022
Cited by 15 | Viewed by 4581
Abstract
Background: Emotion recognition skills are predicted to be fundamental features in social robots. Since facial detection and recognition algorithms are compute-intensive, methods are needed that can parallelize these operations for large-scale information exchange in real time. The study aims were to determine whether traditional machine learning algorithms could be used to assess each user’s emotions separately, to compare emotion recognition across two robotic modalities (static vs. moving robot), and to evaluate the acceptability and usability of an assistive robot from an end-user point of view. Methods: Twenty-seven hospital employees (M = 12; F = 15) were recruited for the experiment, in which 60 positive, negative, or neutral images selected from the International Affective Picture System (IAPS) database were shown. The experiment was performed with the Pepper robot. In the experimental phase with Pepper in active mode, concordant mimicry was programmed according to image type (positive, negative, or neutral). The images were displayed on a tablet on the robot’s chest and via a web interface, for 7 s per slide. For each image, the participants were asked to perform a subjective assessment of the perceived emotional experience using the Self-Assessment Manikin (SAM). After participants used the robotic solution, the Almere Model Questionnaire (AMQ) and the System Usability Scale (SUS) were administered to assess its acceptability, usability, and functionality. Analysis was performed on the video recordings. Three types of attitude (positive, negative, and neutral) were classified using two machine learning algorithms: k-nearest neighbors (KNN) and random forest (RF). Results: According to the analysis of emotions performed on the recorded videos, the RF algorithm performed better than KNN in terms of accuracy (mean ± sd = 0.98 ± 0.01) and execution time (mean ± sd = 5.73 ± 0.86 s). With RF, neutral, positive, and negative attitudes all had equally high precision (mean = 0.98) and F-measure (mean = 0.98). Most of the participants confirmed a high level of usability and acceptability of the robotic solution. Conclusions: The RF algorithm outperformed KNN in accuracy and execution time. The robot was not a disturbing factor in the arousal of emotions.
(This article belongs to the Special Issue Assistive Robots for Healthcare and Human-Robot Interaction)
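The reported KNN-versus-RF comparison follows a standard scikit-learn pattern; the sketch below reproduces that pattern on synthetic three-class data standing in for the positive/negative/neutral attitudes (the accuracies above come from the authors' data, not from this toy example).

```python
# Sketch: cross-validated comparison of RF and KNN on synthetic 3-class data.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

# Three classes standing in for positive / negative / neutral attitudes
X, y = make_classification(n_samples=300, n_features=20, n_informative=8,
                           n_classes=3, random_state=0)

for name, clf in [("RF", RandomForestClassifier(random_state=0)),
                  ("KNN", KNeighborsClassifier(n_neighbors=5))]:
    scores = cross_val_score(clf, X, y, cv=5)   # 5-fold accuracy
    print(f"{name}: accuracy = {scores.mean():.2f} +/- {scores.std():.2f}")
```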

14 pages, 880 KiB  
Article
Efficacy of Facial Exercises in Facial Expression Categorization in Schizophrenia
by Francesco Pancotti, Sonia Mele, Vincenzo Callegari, Raffaella Bivi, Francesca Saracino and Laila Craighero
Brain Sci. 2021, 11(7), 825; https://doi.org/10.3390/brainsci11070825 - 22 Jun 2021
Cited by 6 | Viewed by 4170
Abstract
Embodied cognition theories suggest that observation of a facial expression induces the same pattern of muscle activation in the observer, and that this contributes to emotion recognition. Consequently, the inability to form facial expressions would affect emotional understanding. Patients with schizophrenia show a reduced ability to express and perceive facial emotions. We assumed that physical training specifically developed to mobilize facial muscles could improve the ability to perform facial movements and, consequently, spontaneous mimicry and facial expression recognition. Twenty-four inpatient participants with schizophrenia were randomly assigned to an experimental and a control group. At the beginning and at the end of the study, both groups completed a facial expression categorization test, and their data were compared. The experimental group underwent a training period during which the lip muscles and the muscles around the eyes were mobilized through the execution of transitive actions. Participants were trained three times a week for five weeks. Results showed a positive impact of the physical training on the recognition of others’ facial emotions, specifically for “fear”, the emotion with the most severe recognition deficit in the test. This evidence suggests that a specific deficit of the sensorimotor system may result in a specific cognitive deficit.
(This article belongs to the Special Issue The Role of the Sensorimotor System in Cognitive Functions)

13 pages, 1590 KiB  
Article
Associations between Cognitive Concepts of Self and Emotional Facial Expressions with an Emphasis on Emotion Awareness
by Peter Walla and Aimee Mavratzakis
Psych 2021, 3(2), 48-60; https://doi.org/10.3390/psych3020006 - 27 Apr 2021
Viewed by 2819
Abstract
Recognising our own and others’ emotions is vital for healthy social development. The aim of the current study was to determine how emotions related to the self or to another influence behavioural expressions of emotion. Facial electromyography (EMG) was used to record spontaneous facial muscle activity in nineteen participants while they passively viewed negative, positive and neutral emotional pictures during three blocks of referential instructions. Each participant imagined themself, another person or no one experiencing the emotional scenario, with the priming words “You”, “Him” or “None” presented before each picture for the respective block of instructions. Emotion awareness (EA) was also recorded using the TAS-20 alexithymia questionnaire. Corrugator supercilii (cs) muscle activity increased significantly between 500 and 1000 ms post stimulus onset during negative and neutral picture presentations, regardless of ownership. Independent of emotion, cs activity was greatest during the “no one” task and lowest during the “self” task from less than 250 to 1000 ms. Interestingly, the degree of cs activation during referential tasks was further modulated by EA. Low EA corresponded to significantly stronger cs activity overall compared with high EA, and this effect was even more pronounced during the “no one” task. The findings suggest that cognitive processes related to the perception of emotion ownership can influence spontaneous facial muscle activity, but that a greater degree of integration between higher cognitive and lower affective levels of information may interrupt or suppress these behavioural expressions of emotion.
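A time-window analysis of the kind reported here (mean corrugator activity in the 500–1000 ms post-stimulus window per referential condition) can be sketched as follows; the sampling rate, epoching, and data are assumptions for illustration only.

```python
# Sketch: mean rectified corrugator (cs) activity per condition in a window.
import numpy as np

fs = 1000                                  # assumed sampling rate (Hz)
rng = np.random.default_rng(4)
# trials x samples: simulated rectified EMG epochs, 0-1000 ms post-onset
conditions = {c: np.abs(rng.normal(size=(40, fs)))
              for c in ("self", "other", "no one")}

win = slice(int(0.5 * fs), int(1.0 * fs))  # 500-1000 ms post-stimulus window
for name, epochs in conditions.items():
    mean_cs = epochs[:, win].mean()        # average over trials and samples
    print(f"{name}: mean cs activity {mean_cs:.3f}")
```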

26 pages, 8602 KiB  
Article
Building an Emotionally Responsive Avatar with Dynamic Facial Expressions in Human–Computer Interactions
by Heting Wang, Vidya Gaddy, James Ross Beveridge and Francisco R. Ortega
Multimodal Technol. Interact. 2021, 5(3), 13; https://doi.org/10.3390/mti5030013 - 20 Mar 2021
Cited by 14 | Viewed by 9266
Abstract
The role of affect has long been studied in human–computer interactions. Unlike previous studies that focused on seven basic emotions, an avatar named Diana was introduced who expresses a higher level of emotional intelligence. To adapt to users’ various affects during interaction, Diana simulates emotions with dynamic facial expressions. When two people collaborated to build blocks, their affects were recognized and labeled using the Affdex SDK, and a descriptive analysis was provided. When participants turned to collaborate with Diana, their subjective responses were collected and task completion time was recorded. Three modes of Diana were involved: a flat-faced Diana, a Diana that used mimicry facial expressions, and a Diana that used emotionally responsive facial expressions. Twenty-one responses were collected through a five-point Likert scale questionnaire and the NASA TLX. Questionnaire results did not differ statistically across modes. However, the emotionally responsive Diana received more positive responses, and people spent the longest time with the mimicry Diana. In post-study comments, most participants perceived the facial expressions on Diana’s face as natural, while four mentioned uncomfortable feelings caused by the Uncanny Valley effect.
(This article belongs to the Special Issue Social Interaction and Psychology in XR)

19 pages, 1581 KiB  
Article
Infant Emotional Mimicry of Strangers: Associations with Parent Emotional Mimicry, Parent-Infant Mutual Attention, and Parent Dispositional Affective Empathy
by Eliala A. Salvadori, Cristina Colonnesi, Heleen S. Vonk, Frans J. Oort and Evin Aktar
Int. J. Environ. Res. Public Health 2021, 18(2), 654; https://doi.org/10.3390/ijerph18020654 - 14 Jan 2021
Cited by 7 | Viewed by 7157
Abstract
Emotional mimicry, the tendency to automatically and spontaneously reproduce others’ facial expressions, characterizes human social interactions from infancy onwards. Yet, little is known about the factors modulating its development in the first year of life. This study investigated infant emotional mimicry and its association with parent emotional mimicry, parent-infant mutual attention, and parent dispositional affective empathy. One hundred and seventeen parent-infant dyads (51 six-month-olds, 66 twelve-month-olds) were observed during video presentation of strangers’ happy, sad, angry, and fearful faces. Infant and parent emotional mimicry (i.e., facial expressions valence-congruent to the video) and their mutual attention (i.e., simultaneous gaze at one another) were systematically coded second-by-second. Parent empathy was assessed via self-report. Path models indicated that infant mimicry of happy stimuli was positively and independently associated with parent mimicry and affective empathy, while infant mimicry of sad stimuli was related to longer parent-infant mutual attention. Findings provide new insights into infants’ and parents’ coordination of mimicry and attention during triadic contexts of interactions, endorsing the social-affiliative function of mimicry already present in infancy: emotional mimicry occurs as an automatic parent-infant shared behavior and early manifestation of empathy only when strangers’ emotional displays are positive, and thus perceived as affiliative.
(This article belongs to the Special Issue The Role of Parenting in Typical and Atypical Child Development)

14 pages, 683 KiB  
Article
Facial Imitation Improves Emotion Recognition in Adults with Different Levels of Sub-Clinical Autistic Traits
by Andrea E. Kowallik, Maike Pohl and Stefan R. Schweinberger
J. Intell. 2021, 9(1), 4; https://doi.org/10.3390/jintelligence9010004 - 13 Jan 2021
Cited by 9 | Viewed by 4517
Abstract
We used computer-based automatic expression analysis to investigate the impact of imitation on facial emotion recognition with a baseline-intervention-retest design. The participants, 55 young adults with varying degrees of autistic traits, completed an emotion recognition task with images of faces displaying one of six basic emotional expressions. This task was then repeated with instructions to imitate the expressions. During the experiment, a camera captured the participants’ faces for an automatic evaluation of their imitation performance. The instruction to imitate enhanced imitation performance as well as emotion recognition. Of relevance, emotion recognition improvements in the imitation block were larger in people with higher levels of autistic traits, whereas imitation enhancements were independent of autistic traits. The finding that an imitation instruction improves emotion recognition, and that imitation is a positive within-participant predictor of recognition accuracy in the imitation block, supports the idea of a link between motor expression and perception in the processing of emotions, which might be mediated by the mirror neuron system. However, because there was no evidence that people with higher autistic traits differ in their imitative behavior per se, their disproportional emotion recognition benefits could have arisen from indirect effects of the imitation instructions.
(This article belongs to the Special Issue Advances in Socio-Emotional Ability Research)