Special Issue "Perceptual and Affective Mechanisms in Facial Expression Recognition"

A special issue of Brain Sciences (ISSN 2076-3425).

Deadline for manuscript submissions: closed (15 November 2019).

Special Issue Editors

Dr. Lucia Ricciardi
Guest Editor
Neurosciences Research Centre, Molecular and Clinical Sciences Research Institute, St George's University of London, London, United Kingdom
Interests: movement disorders; Parkinson’s disease; impulse control disorders; affective neuroscience; psychophysiology; brain stimulation; DBS
Dr. Matteo Bologna
Guest Editor
Department of Human Neurosciences, Sapienza University of Rome, Rome, Italy
Interests: motor neurosciences; movement disorders

Special Issue Information

Dear Colleagues,

Facial emotion expressivity and facial expression recognition have been active research areas and have attracted increasing attention from researchers in neuroscience, psychology, computer science, linguistics, and related disciplines.

Encouraged by the writings of Charles Darwin, eminent researchers such as Carroll Izard and Paul Ekman developed a set of theories and methods on this topic.

Despite the increasing number of studies on facial emotion expressivity and facial expression recognition, assessing and measuring them remains challenging, and their physiological mechanisms are still not entirely elucidated. Moreover, facial emotion expressivity and facial expression recognition are often impaired in a number of psychiatric and neurodegenerative disorders, e.g., Parkinson’s disease and parkinsonism, Huntington’s disease, etc., and the pathophysiological basis underlying these abnormalities is not yet well understood.

In this Special Issue, we are interested in the processes and the neural structures involved in facial emotion expressivity and facial expression recognition, as well as in classic and innovative ways of assessing facial expressions in healthy people and in people with neuropsychiatric disorders. Studies using various methods, including electrophysiology and non-invasive brain stimulation techniques, motion analysis, and neuroimaging, are welcome.

Dr. Lucia Ricciardi
Dr. Matteo Bologna
Guest Editors

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, go to the submission form. Manuscripts can be submitted until the deadline. All papers will be peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for the submission of manuscripts is available on the Instructions for Authors page. Brain Sciences is an international peer-reviewed open access monthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 1800 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • Face expressivity
  • Face recognition
  • Facial emotions
  • Emotion processing
  • Emotions
  • Amimia
  • Hypomimia
  • Parkinson’s disease
  • Movement disorders
  • Neuropsychiatric conditions

Published Papers (5 papers)


Research

Article
Social Perception of Faces: Brain Imaging and Subjective Ratings
Brain Sci. 2020, 10(11), 861; https://doi.org/10.3390/brainsci10110861 - 16 Nov 2020
Viewed by 552
Abstract
The aim of this study was to investigate how a female face is perceived in terms of its attractiveness, dominance, health, femininity-masculinity, and maturity in direct relation to the body fat percentage (BFP) conveyed by the face. To compare how young adults (ages 18 to 35) respond to different levels of body fat percentage, both subjectively and objectively, we collected survey ratings and electroencephalography (EEG) data across five different levels of BFP from 40 participants. We adapted the experimental design from a prior behavioral study and used calibrated and morphed female face images of five different BFP levels. The survey results are in agreement with the previous study and constitute a successful replication. From the EEG data, event-related potentials (ERPs) were extracted from one electrode location (right occipitotemporal brain region) known to be particularly sensitive to face stimuli. We found statistically significant differences in the amplitudes of the P200 component (194 ms post stimulus onset) between the thickest face and all four other BFP conditions, and in the amplitudes of the N300 component (274 ms post stimulus onset) between the average face and three other BFP conditions. As expected, there were no significant differences among the N170 amplitudes of the five BFP conditions, since this ERP component simply reflects the processing of faces in general. From these results, we can infer that holistic face encoding, characterized by the N170 component in the right occipitotemporal area, is followed by serial evaluative processes, whose categorical and qualitative matrix and spatiotemporal dynamics should be further explored in future studies, especially in relation to the social constructs examined here. Full article
(This article belongs to the Special Issue Perceptual and Affective Mechanisms in Facial Expression Recognition)

Article
Seeing a Face in a Crowd of Emotional Voices: Changes in Perception and Cortisol in Response to Emotional Information across the Senses
Brain Sci. 2019, 9(8), 176; https://doi.org/10.3390/brainsci9080176 - 25 Jul 2019
Cited by 1 | Viewed by 1566
Abstract
One source of information we glean from everyday experience, which guides social interaction, is assessing the emotional state of others. Emotional state can be expressed through several modalities: body posture or movements, body odor, touch, facial expression, or the intonation in a voice. Much research has examined emotional processing within one sensory modality or the transfer of emotional processing from one modality to another. Yet, less is known regarding interactions across different modalities when perceiving emotions, despite our common experience of seeing emotion in a face while hearing the corresponding emotion in a voice. Our study examined if visual and auditory emotions of matched valence (congruent) conferred stronger perceptual and physiological effects compared to visual and auditory emotions of unmatched valence (incongruent). We quantified how exposure to emotional faces and/or voices altered perception using psychophysics and how it altered a physiological proxy for stress or arousal using salivary cortisol. While we found no significant advantage of congruent over incongruent emotions, we found that changes in cortisol were associated with perceptual changes. Following exposure to negative emotional content, larger decreases in cortisol, indicative of less stress, correlated with more positive perceptual after-effects, indicative of stronger biases to see neutral faces as happier. Full article
(This article belongs to the Special Issue Perceptual and Affective Mechanisms in Facial Expression Recognition)

Article
Electrophysiological Responses to Emotional Facial Expressions Following a Mild Traumatic Brain Injury
Brain Sci. 2019, 9(6), 142; https://doi.org/10.3390/brainsci9060142 - 18 Jun 2019
Cited by 2 | Viewed by 2125
Abstract
The present study aimed to measure the neural information processing underlying emotion recognition from facial expressions in adults who had sustained a mild traumatic brain injury (mTBI), as compared to healthy individuals. We measured early (N1, N170) and later (N2) event-related potential (ERP) components during the presentation of fearful, neutral, and happy facial expressions in 10 adults with mTBI and 11 control participants. Findings indicated significant differences between groups, irrespective of emotional expression, in the early attentional stage (N1), which was altered in mTBI. The two groups showed similar perceptual integration of facial features (N170), with greater amplitude for fearful facial expressions in the right hemisphere. At a higher-level emotional discrimination stage (N2), both groups demonstrated preferential processing for fear as compared to happiness and neutrality. These findings suggest reduced early selective attentional processing following mTBI, but no impact on the perceptual and higher-level cognitive processing stages. This study contributes to our understanding of attentional versus emotional recognition following a mild TBI. Full article
(This article belongs to the Special Issue Perceptual and Affective Mechanisms in Facial Expression Recognition)

Article
Joint Modulation of Facial Expression Processing by Contextual Congruency and Task Demands
Brain Sci. 2019, 9(5), 116; https://doi.org/10.3390/brainsci9050116 - 17 May 2019
Cited by 13 | Viewed by 2384
Abstract
Faces showing expressions of happiness or anger were presented together with sentences that described happiness-inducing or anger-inducing situations. Two main variables were manipulated: (i) congruency between contexts and expressions (congruent/incongruent) and (ii) the task assigned to the participant, either discriminating the emotion shown by the target face (emotion task) or judging whether the expression shown by the face was congruent with the context (congruency task). Behavioral and electrophysiological results (event-related potentials (ERPs)) showed that processing facial expressions was jointly influenced by congruency and task demands. ERP results revealed task effects at frontal sites, with larger positive amplitudes between 250 and 450 ms in the congruency task, reflecting the higher cognitive effort required by this task. Effects of congruency appeared at latencies and locations corresponding to the early posterior negativity (EPN) and late positive potential (LPP) components that have previously been found to be sensitive to emotion and affective congruency. The magnitude and spatial distribution of the congruency effects varied depending on the task and the target expression. These results are discussed in terms of the modulatory role of context on facial expression processing and the different mechanisms underlying the processing of expressions of positive and negative emotions. Full article
(This article belongs to the Special Issue Perceptual and Affective Mechanisms in Facial Expression Recognition)

Article
The Motivational Power of the Happy Face
Brain Sci. 2019, 9(1), 6; https://doi.org/10.3390/brainsci9010006 - 7 Jan 2019
Cited by 7 | Viewed by 2369
Abstract
People who are cheerful have better social relationships. This might be the case because happy faces communicate an invitation to interact. Thus, happy faces might have a strong motivational effect on others. We tested this hypothesis in a set of four studies. Study 1 (N = 94) showed that approach reactions to happy faces are faster than other reactions to happy or angry faces. Study 2 (N = 99) found the same effect when comparing reactions to happy faces with reactions to disgusted faces. Supporting the notion that this effect is related to motivation, habitual social approach motivation intensified the motivational effect of happy faces (Study 3, N = 82). Finally, Study 4 (N = 40) showed that the reaction-time asymmetry does not hold for categorization tasks without approach and avoidance movements. These studies demonstrate that happy faces have a strong motivational power. They seem to activate approach reactions more strongly than angry or disgusted faces activate avoidance reactions. Full article
(This article belongs to the Special Issue Perceptual and Affective Mechanisms in Facial Expression Recognition)
