Article

The Effect of Surgical Masks on the Featural and Configural Processing of Emotions

1 Aldo Ravelli Research Center for Neurotechnology and Experimental Brain Therapeutics, Department of Health Science, University of Milan, Via A. di Rudinì, 8, 20142 Milan, Italy
2 Department of Neurology and Laboratory of Neuroscience, IRCCS Istituto Auxologico Italiano, Piazzale Brescia, 20, 20149 Milan, Italy
3 ASST Santi Paolo e Carlo, San Paolo University Hospital, Via A. di Rudinì, 8, 20142 Milan, Italy
4 Department of Oncology and Hematology-Oncology, University of Milan, Via Festa del Perdono, 7, 20122 Milan, Italy
5 Psycho-Oncology Division, IRCCS-Istituto Europeo di Oncologia, Via G. Ripamonti, 435, 20141 Milan, Italy
6 Fondazione IRCCS Ca’ Granda Ospedale Maggiore Policlinico di Milano, Via F. Sforza, 35, 20122 Milan, Italy
* Author to whom correspondence should be addressed.
Int. J. Environ. Res. Public Health 2022, 19(4), 2420; https://doi.org/10.3390/ijerph19042420
Submission received: 28 January 2022 / Revised: 15 February 2022 / Accepted: 17 February 2022 / Published: 19 February 2022

Abstract

From the start of the COVID-19 pandemic, the use of surgical masks became widespread. However, they occlude an important part of the face and make it difficult to decode and interpret other people’s emotions. To clarify the effect of surgical masks on configural and featural processing, participants completed a facial emotion recognition task discriminating between happy, sad, angry, and neutral faces. Stimuli included fully visible faces, masked faces, and cropped photos of the eye or mouth region. Occlusion due to the surgical mask affected emotion recognition for sad, angry, and neutral faces, whereas no significant differences were found in happiness recognition. Our findings suggest that happiness is recognized predominantly via featural processing.

1. Introduction

As the COVID-19 pandemic spread, surgical masks became one of the main tools for limiting transmission. The use of surgical masks is of fundamental importance to protect individuals from infection [1]; however, masks hide a large part of the face, thereby impairing the ability to read the facial emotions of others [2]. Surgical masks cover the lower part of the face, an important area for the nonverbal communication of emotional states [3]. Understanding other people’s emotions is a fundamental human ability and lies at the basis of social interaction [4].
Ekman (1992) [5] identified six facial expressions reflecting basic emotions: sadness, happiness, anger, disgust, fear, and surprise. These emotions appear to be universal and not linked to the individual’s culture. On this basis, the Facial Action Coding System (FACS) was developed [6]. The FACS aims to create a taxonomy of facial expressions: each anatomically possible facial expression is associated with specific movements of one or more muscles, which form the action units, and with their possible temporal segments [6]. Each emotion can thus be categorized on the basis of specific movements. Different emotional expressions involve one or more action units linked to specific parts of the face [7]; therefore, different facial features carry specific information depending on the emotion being observed [8]. To clarify how humans extract information about emotions from a seen face, numerous studies have focused on the featural and configural processing of faces [9]. Featural processing concerns the information carried by specific face parts, for example the shape of the nose or the size of the mouth, whereas configural processing concerns the spatial relations between face parts, for example the distance between the eyes and the mouth [10,11]. The composite effect shows that emotion recognition is based mostly on configural rather than featural processing [9] and that configural processing is automatic and cannot be easily suppressed. Some studies argue that emotion processing is a two-step process [12], or that configural and featural processing contribute differentially to the recognition of different emotions [13]. When processing a fully visible face, it is possible to rely on both featural and configural information.
When a masked face is viewed, the perception of emotions relies only on featural processing of the upper face region. Experimental evidence suggests that happiness recognition relies more on featural processing, while configural processing is strongly involved in sadness and anger recognition [13]. Therefore, it is important to consider how disrupting configural processing and forcing observers to process only part of the featural information, as a face mask does, can impact the recognition of specific emotions. It is also possible that the emotional processing of a face does not rely on featural and configural processing at all but is instead based on different emotion-specific mechanisms and neural networks [14]. In a case study reported by Pegna and colleagues (2005) [15], a patient with bilateral destruction of the visual cortices and consequent cortical blindness could guess the emotional facial expression being displayed, while being unable to guess other emotional stimuli such as pleasant or unpleasant scenes of sports, sex, violence, or mutilation. Functional magnetic resonance imaging (fMRI) showed activation of the right amygdala during the processing of emotional faces. Evidence for the importance of face features comes from eye-tracking studies showing that, during free exploration of faces with different emotional expressions, participants tend to look longer at the eye region than at the mouth region [16,17,18]. Studies have also shown different fixation patterns depending on the emotion being observed, with fixations on the upper part of the face in recognizing fear and surprise and on the lower part in recognizing happiness [8]. Experimental evidence also supports the importance of contextual factors in emotion perception. Kret and de Gelder (2012) [19] compared the effect of Islamic headdresses versus scarves and caps on emotion recognition, finding that fear and anger recognition were more accurate for women wearing the niqab, whereas happiness recognition was more accurate for women wearing a cap and scarf. Graham and Ritchie (2019) [20] found that sunglasses reduced observers’ ratings of trustworthiness and impaired identity recognition. Further evidence of effects due to minimal occlusion of the eye region comes from a study by Kramer and Ritchie (2016) [21], in which participants had more difficulty matching identity when one of the two faces shown wore glasses. Although the eye region has played a predominant role in theories of emotion recognition, several studies have clarified the role of the mouth region. Applying the Bubbles technique to emotion recognition, Blais and colleagues (2012) [22] found that humans rely more on the mouth than on the eye area in both static and dynamic emotion recognition tasks; the authors proposed that the mouth region contains more discriminative movements between expressions than the eye region.
To date, it is not clear whether emotion perception from a seen face relies purely on configural or on featural processing of face elements. Many findings show that emotion perception is a complex phenomenon that also depends on contextual factors and on the specific emotion perceived by the observer. In the last two years, many studies have focused on the role played by surgical masks in the recognition of facial characteristics such as age, identity, or emotions, highlighting their impact on multiple domains of social cognition [2,23,24,25,26,27,28]. Surgical masks have a detrimental effect on face identity matching [28]. The mechanism underlying this effect appears to be a qualitative and quantitative change in the way masked faces are perceived by the observer, due to the loss of information carried by the configural processing of a seen face [24].
Regarding the emotional processing of a seen face, surgical masks superimposed on portraits mimicking the six basic emotions were found to exert differential effects depending on the emotion shown, with the largest effects on the recognition of angry, disgusted, and happy faces [23]. As reported by Marini and colleagues (2021) [29], surgical masks have different effects depending on the specific emotion: in their study, subjects made more errors in sadness recognition than in happiness recognition when observing a masked face. This result is interesting, especially considering that the mouth region seems to be more informative in the recognition of positive emotions.
However, to date little is known about the specific mechanism underlying these effects in terms of speed and accuracy. Some insight comes from the study by Fitousi and colleagues (2021) [30], in which subjects were asked to discriminate between angry and neutral masked faces in an inverted-faces task. The facial inversion effect was comparable in the masked and unmasked conditions, suggesting that emotion recognition with masked faces is based on featural rather than configural processing of the seen face. It should be noted that Fitousi and colleagues [30] asked participants to discriminate only between angry and neutral faces and did not investigate effects on other emotions. In addition to the featural and configural characteristics of the stimuli, emotion perception can be affected by personal characteristics of the observer. Many studies have found that alexithymia, a difficulty in identifying emotions [31], can predict the ability to perceive facial expressions [32]. Experimental findings show an association between high levels of alexithymia and atypical attentional processing of faces or atypical fixation patterns during face scanning [33,34]. In a recent fMRI study by Rosenberg and colleagues (2020), subjects with high alexithymia levels were less sensitive in detecting anger signals in a seen face during an emotional priming task. The authors concluded that difficulty in the automatic perception of emotion can contribute to the interpersonal relationship problems associated with alexithymia [35].
The aim of the present study was to test how occlusion due to a face mask affects the recognition of specific emotions and to clarify the contribution of the upper and lower face parts to emotion recognition and their role in the activation of featural or configural processing mechanisms. We tested subjects in a facial emotion recognition task (FERT) in which stimuli were presented in four conditions characterized by different manipulations of the presented face: in the masked-face condition (MF), faces wore a surgical mask; in the non-masked condition (NM), faces were fully visible; in the eyes-only condition (EO), faces were cropped so that only the eye region was visible; in the mouth-only condition (MO), faces were cropped so that only the mouth region was visible. The rationale behind these conditions is that configural processing seems to be automatic, so when a masked face is presented, subjects may attempt to process the face via configural mechanisms that are impaired by the visual occlusion due to the surgical mask. The EO and MO conditions were included to verify whether a purely featural presentation of the faces would have different effects in terms of speed and accuracy due to the activation of featural processing, and to evaluate the specific contribution of the different face features to emotion recognition.
Another aim was to assess whether individual differences in alexithymia relate to performance in the different experimental conditions, in order to clarify the relation between this personality trait and the processes underlying emotion recognition.

2. Materials and Methods

2.1. Participants

Thirty-one healthy Caucasian volunteers (16 males; age range 21–58 years) with normal or corrected-to-normal visual acuity took part in the experiment.
All participants had no history of neuropsychiatric disorders, as confirmed by clinical history and an anamnestic interview. Demographic information about our sample is listed in Table 1. Participants were treated in accordance with the principles of the Declaration of Helsinki and provided informed consent prior to the experiment.

2.2. Procedure

Participants were asked to perform a facial emotion recognition task (FERT) in which they had to press a key to indicate the emotion expressed by faces of male and female Caucasian adults. Stimulus faces consisted of portrait photos of 8 Caucasian adults (four men, four women) from the NimStim Face Stimulus Set [36] expressing anger, happiness, sadness, or a neutral expression. Each portrait was presented with the same facial expression 3 times, for a total of 96 trials (8 faces × 4 emotions × 3 repetitions) in each experimental block. Four alternative sets of pictures were presented, one for each face manipulation condition: Not Masked (NM), consisting of a fully visible face; Masked Face (MF), in which a surgical mask was superimposed on the stimulus face; Mouth Only (MO), consisting of a cropped version of the stimulus with only the lower part of the face visible; Eyes Only (EO), consisting of a cropped version of the stimulus with only the upper part of the face visible (see Figure 1).
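The trial structure described above (8 identities × 4 emotions × 3 repetitions = 96 trials per block, in randomized order) can be sketched as follows; the identity labels are illustrative placeholders, not the actual NimStim file names, and this is a minimal reconstruction rather than the study's E-Prime script.

```python
import itertools
import random

IDENTITIES = [f"face_{i:02d}" for i in range(1, 9)]   # 8 portraits (4 male, 4 female)
EMOTIONS = ["anger", "happiness", "sadness", "neutral"]
REPETITIONS = 3

def build_block(seed=None):
    """Return a randomized list of (identity, emotion) trials for one block."""
    trials = [pair
              for pair in itertools.product(IDENTITIES, EMOTIONS)
              for _ in range(REPETITIONS)]
    random.Random(seed).shuffle(trials)
    return trials

block = build_block(seed=1)
print(len(block))  # 96
```

Each of the four face manipulation conditions would use one such block, with block order counterbalanced across participants.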
The experimental procedure consisted of four blocks; only one set of stimuli was presented in each block. The order of the blocks was counterbalanced between participants. Stimulus presentation, timing, and data collection were controlled via the E-Prime (Psychology Software Tools, Pittsburgh, PA, USA) software, running on a laptop computer. Reaction times (RTs) and error rate (percentage of incorrect responses) were recorded.
Each block was preceded by a practice phase consisting of 10 trials, which were discarded from the analysis. Each trial started with a fixation cross at the center of the screen, which was replaced by the stimulus face after 1000 ms of fixation. Labels with emotions were placed in the lower part of the screen, in correspondence with the respective response keys, to facilitate the correct association of keys to the respective emotions; labels remained visible on the screen during the whole experimental procedure.
The stimulus remained on the screen until a response was given. No feedback was given in the case of wrong responses.
In order to assess possible effects of participants’ mood, we administered four visual analogue scales (VAS). Each VAS consisted of a horizontal line, 10 cm in length, anchored by word descriptors at each end.
Participants marked on the line the point that they felt represented their current emotional state. The VAS score was calculated by measuring, in millimeters, the distance from the left-hand end of the line to the marked point. We used a VAS for happiness (0 = unhappy, 10 = happy), for sadness (0 = no sadness, 10 = sadness), for anger (0 = no anger, 10 = anger), and for mood (0 = the worst mood ever, 10 = the best mood ever).
In order to assess participants’ perception of task difficulty, we also administered a VAS for each experimental condition (NM, MF, MO, EO); participants marked the point they felt represented the difficulty of that condition.
The perceived-difficulty VAS used the same scale and scoring method as the mood VAS.
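The VAS scoring described above, where the mark on a 10 cm line is measured in millimeters from the left anchor and reported on a 0–10 scale, reduces to a linear conversion. This helper is a hypothetical sketch of that arithmetic, not code from the study:

```python
def vas_score(mark_mm: float, line_length_mm: float = 100.0) -> float:
    """Convert the distance of a participant's mark from the left anchor
    (in mm, on a 100 mm line) into a 0-10 VAS score."""
    if not 0.0 <= mark_mm <= line_length_mm:
        raise ValueError("mark must lie on the line")
    return 10.0 * mark_mm / line_length_mm

print(vas_score(57.0))  # 5.7
```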
Then the participants completed the Toronto Alexithymia Scale—20 (TAS-20) [37] questionnaire. In order to exclude subjects with a deficit in emotion recognition we administered the Comprehensive Affect Test System (CATS) [38]. The whole procedure lasted about 45 min.

2.3. Statistical Analyses

All subjects involved in the experiment showed normal scores on the CATS, and no one was excluded for an impaired emotion recognition process. No outliers were present in the CATS scores (all scores were within ±2 SD of the group mean).
We used a 4 × 4 (4 face manipulation conditions × 4 face emotions) repeated-measures ANOVA with reaction times for correct responses only and error rates (calculated as the percentage of incorrect responses) as dependent variables. Post hoc comparisons were made using paired-sample t-tests with Bonferroni p-value correction for multiple comparisons (α = 0.012).
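The Bonferroni procedure used for the post hoc comparisons amounts to testing each paired comparison against a reduced threshold (the nominal α divided by the number of comparisons; with four comparisons and α = 0.05 this gives 0.0125, close to the α = 0.012 reported above). A minimal sketch with hypothetical p-values, not the study's actual data:

```python
def bonferroni_significant(p_values, alpha=0.05):
    """Flag each comparison as significant at the Bonferroni-corrected
    threshold alpha / number_of_comparisons."""
    threshold = alpha / len(p_values)
    flags = {name: p < threshold for name, p in p_values.items()}
    return flags, threshold

# Hypothetical p-values for four MF-vs-NM emotion comparisons:
pvals = {"happiness": 0.300, "sadness": 0.001, "anger": 0.004, "neutral": 0.010}
flags, thr = bonferroni_significant(pvals)
print(thr)    # 0.0125
print(flags)  # only "happiness" fails to reach the corrected threshold
```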
We correlated reaction times for correct responses and error rates with TAS-20 total score and TAS-20 subscales scores.
Perceived-difficulty VAS scores were entered into a repeated-measures ANOVA with face manipulation condition as a within-subjects factor. Post hoc comparisons were made using paired-sample t-tests with Bonferroni p-value correction for multiple comparisons (α = 0.012).

3. Results

Error rates analysis showed the main effects of face manipulation condition (F(3,90) = 54.346, p < 0.001, η2 = 0.644), face emotion (F(3,90) = 64.404, p < 0.001, η2 = 0.682), and interaction between face manipulation conditions and face emotion (F(9,270) = 7.936, p < 0.001, η2 = 0.209).
To clarify the interaction between face manipulation conditions and face emotion, we carried out post hoc paired sample t-tests comparing each face manipulation condition with the NM condition.
Post hoc t-tests with Bonferroni p-value correction (α = 0.012) showed that participants made more errors in the MF than in the NM condition in sadness, anger, and neutral recognition. Differences in error rates between the MF and NM conditions in happiness recognition were not significant (see Figure 2). Detailed results are displayed in Table 2.
Comparing the EO condition with the NM condition, there were significant differences in sadness recognition, anger recognition, and neutral recognition, and no significant differences in happiness recognition (Table 2). Comparisons between the MO and NM conditions did not reach the significance threshold for any of the emotions shown.
Reaction times analysis showed main effects of face manipulation condition (F(3,90) = 29.298, p < 0.001, η2 = 0.491) and face emotion (F(3,90) = 24.948, p < 0.001, η2 = 0.454), and an interaction between face manipulation condition and face emotion (F(9,270) = 2.606, p = 0.007, η2 = 0.080). Post hoc t-tests with Bonferroni p-value correction (α = 0.012) showed that reaction times for correct responses were slower for all emotions in the MF condition than in the NM condition (see Figure 3). Reaction times were also shorter in the NM condition than in the EO condition for all face emotions (Table 3).
No significant differences were found between MF and EO conditions or between NM and MO conditions (all p > 0.012).
We calculated mean reaction times in each face manipulation condition regardless of the emotion. Results are shown in Table 4. Mean reaction times correlated positively with the TAS-20 score in the MO condition (r = 0.484, p = 0.006), in the NM condition (r = 0.477, p = 0.007), and in the EO condition (r = 0.374, p = 0.038). No correlations were found in the MF condition (r = 0.083, p = 0.656).
The TAS-20 subscale “difficulty in describing feelings” was correlated with reaction times in the MO (r = 0.491, p = 0.005) and NM (r = 0.530, p = 0.002) conditions.
No significant correlations were found between the “externally oriented thinking” TAS-20 subscale and reaction times. TAS-20 total and subscale mean scores, standard deviations, and ranges are reported in Table 5.
Perceived-difficulty VAS scores were entered into a repeated-measures ANOVA with face manipulation condition as a within-subjects factor; participants perceived different degrees of difficulty across experimental conditions (F(3,90) = 47.116, p < 0.001, η2 = 0.611). Post hoc paired-sample t-tests showed that the NM condition was perceived as less difficult than the other experimental conditions (NM vs. MF: 2.29 ± 1.465 vs. 5.94 ± 1.750, p < 0.001; NM vs. EO: 2.29 ± 1.465 vs. 5.77 ± 1.820, p < 0.001; NM vs. MO: 2.29 ± 1.465 vs. 3.77 ± 1.802, p < 0.001).

4. Discussion

The present study investigated how humans recognize emotions in faces wearing surgical masks. Typically, observers use information from different areas of the face in order to recognize the emotion expressed by the face they are looking at. In our experiment, participants performed a FERT observing fully visible faces, masked faces, eye regions or mouth regions of angry, happy, sad, or neutral faces.
Error rates analysis showed that participants made more errors in the MF than in the NM condition for all emotions shown except happy faces. Our findings indicate that sadness and anger are the emotions most often misinterpreted when only the eye region remains visible.
This result could be due to the fact that happiness recognition is less dependent on configural information than anger and sadness recognition [13]. Furthermore, the orientation of the eyebrows differs markedly among happiness, anger, and sadness, so it is possible that this uniquely visible feature enabled participants to better discriminate between these emotions in the context of the proposed task.
Another explanation could be that, in our set of stimuli, happiness was the only positive emotion and therefore could have been easily identified due to the marked difference in emotional valence.
Reaction times analysis showed that emotion recognition was slower in the MF condition than in the NM condition for all emotions expressed by the faces. This result confirms that wearing a face mask can impair emotion recognition due to the resulting perceptual occlusion of a relevant part of the face.
Furthermore, we found that for all emotions displayed, reaction times were shorter in the NM condition than in the EO condition; conversely, differences between the NM and MO conditions were not significant. Previous studies have reported that, despite the importance of the eye region, the mouth region plays a fundamental role in emotion recognition [22]. The importance of the mouth region could be due to the fact that it is the part of the face with the most prominent facial movements and consequently provides more diagnostic features for emotion recognition.
Correlations between reaction times in different face presentation conditions and TAS-20 scores showed that participants with high TAS-20 scores were slower in emotion recognition in NM, EO, and MO conditions. The same correlation was not significant for the MF condition.
The TAS-20 subscales “difficulty in identifying feelings” and “difficulty in describing feelings” refer to poor emotional awareness. Scores on these subscales correlated with reaction times in the MO and NM conditions, while no correlation was observed between these subscales and the EO and MF conditions.
Alexithymia is a personality trait that consists of an impaired capacity to identify and describe emotions [31]; additionally, individuals with high alexithymia scores show impaired perception of facial expressions [32]. Participants with high TAS-20 scores show atypical neurophysiological responses in components related to the attentional processing of faces [39] and tend to show atypical fixation patterns during face scanning [33,34].
It should be noted that, in the MF condition, stimuli had the shape of a whole face, so participants may have scanned the stimuli in search of configural information, leading to an overall worsening of performance that could have hidden specific alexithymia effects on reaction times in the FERT. In the NM condition, the possibility of scanning the whole face allowed participants with lower alexithymic traits to extract configural information more efficiently and therefore to perform better than participants with high alexithymia scores.
Our study is not free from limitations. In our sample, TAS-20 mean scores were slightly below the mean found in the general population; thus, the variance in our sample could be restricted, which may reduce the magnitude of the correlations and limit the generalizability of our results. Another limitation is the absence of a group of alexithymic subjects; due to the small size of our sample, it was not possible to include such a group, which would have allowed us to compare alexithymic and non-alexithymic subjects and thus augment the clinical relevance of our findings.
Future studies with a larger sample size characterized by a high range in TAS-20 scores are needed to better estimate the strength of the correlation between alexithymic traits and emotion recognition.

5. Conclusions

The aim of the present study was to assess the effect of surgical masks on the configural and featural processing of faces and the subsequent effects on emotion perception. We found that surgical masks have a substantial impact on emotion recognition and that this effect depends on the emotion shown by the seen face. More precisely, surgical masks had a major impact on the recognition of angry and sad faces, while happiness recognition was less affected by the loss of visual information due to the perceptual occlusion produced by the mask. Furthermore, we found correlations between alexithymia and reaction times in the recognition of emotions in non-masked faces, while no correlation was found in the masked-face condition. This result suggests that in subjects with low levels of alexithymia, who should be more efficient in face scanning, the sight of a masked face can lead to the activation of configural face processing. Such activation is maladaptive for masked faces because configural information is inaccessible; featural processing of the visible part of the face would be a more efficient strategy in terms of speed and accuracy.
While surgical masks are of fundamental importance to fight the spread of the pandemic, it should be noted that the impairment in emotion recognition can lead to communication problems. As noted by other authors [40], healthcare professionals must consider communication issues due to surgical masks in their everyday practice and find communication strategies to minimize the impact of the loss of perceptual information when interacting with patients [41].
From a clinical point of view, it is interesting to note that the differences we found in reaction times are similar to those found when comparing clinical populations with neurological or psychiatric disorders characterized by a deficit in emotion recognition with healthy controls [42,43,44]. These differences may strongly affect emotion perception, with substantial consequences for the quality of interaction between individuals and their psychological wellbeing. Our results, obtained in a sample of healthy individuals, show that surgical masks markedly impair emotion perception. These findings raise the question of the consequences surgical masks may have in clinical populations affected by disorders involving emotion perception impairment, such as autism spectrum disorder, major depressive disorder, and alcohol use disorder, in which emotion perception deficits play a central role in symptom severity and the consequent social adaptation of patients [45].
Ideally, future investigations should also consider bodily aspects together with facial expressions [46]. Moreover, the use of morphed facial expressions could help in clarifying the differential effect of the eye and mouth regions in decoding emotions during real-life experience.
An interesting field of research could be the longitudinal monitoring of emotion recognition ability in participants living during the COVID-19 pandemic, in order to evaluate the possible adaptation of the human ability to identify emotions to a prolonged condition of reduced facial communication.

Author Contributions

Conceptualization, R.F., N.M., M.D., A.P.; methodology, R.F., N.M., M.D., A.P.; software, N.M.; writing—original draft preparation, R.F., N.M., B.P., S.T., G.P.; writing—review and editing, N.M., M.D., B.P., S.T., G.P., M.R.R., A.P., R.F.; supervision, R.F. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

Ethical review and approval were waived for this study on human participants in accordance with the local legislation and institutional requirements.

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study and the study protocol followed the Declaration of Helsinki.

Data Availability Statement

Data not provided in the article will be shared at the request of other investigators for purposes of replicating procedures and results.

Conflicts of Interest

The authors declare no conflict of interest.

Figure 1. Experimental manipulation of the stimuli in each condition.
Figure 2. Error rates in face manipulation conditions for each emotion shown. * p-value ≤ 0.012.
Figure 3. Mean reaction times for correct responses only in face manipulation conditions for each emotion shown. * p-value ≤ 0.012.
Table 1. Demographic data of the sample.

Demographic Variable          | Data
Sample size                   | 31 (16 M)
Age (years; Mean ± SD)        | 32 ± 11
Education (years; Mean ± SD)  | 17 ± 4
TAS-20 score (Mean ± SD)      | 39.5 ± 9.37
Table 2. Post hoc comparisons of error rates.

Emotion   | Face Manipulation Conditions (Mean ± SD) | Mean Difference | SD    | t      | df | Sig.
Happiness | NM 2.82 ± 3.94 vs. MO 2.15 ± 3.01        | 0.67            | 5.05  | 0.74   | 30 | 0.465
Happiness | NM 2.82 ± 3.94 vs. M 5.10 ± 4.65         | −2.28           | 5.14  | −2.47  | 30 | 0.019
Happiness | NM 2.82 ± 3.94 vs. EO 7.39 ± 9.36        | −4.56           | 10.50 | −2.42  | 30 | 0.022
Neutral   | NM 3.22 ± 4.39 vs. MO 4.97 ± 5.09        | −1.74           | 6.86  | −1.42  | 30 | 0.167
Neutral   | NM 3.22 ± 4.39 vs. M 7.93 ± 7.16         | −4.70           | 8.17  | −3.20  | 30 | 0.003 *
Neutral   | NM 3.22 ± 4.39 vs. EO 9.27 ± 9.36        | −6.04           | 9.18  | −3.67  | 30 | 0.001 *
Anger     | NM 3.36 ± 4.86 vs. MO 4.30 ± 5.74        | −0.94           | 6.15  | −0.85  | 30 | 0.401
Anger     | NM 3.36 ± 4.86 vs. M 18.81 ± 8.32        | −15.45          | 8.41  | −10.23 | 30 | <0.001 *
Anger     | NM 3.36 ± 4.86 vs. EO 18.41 ± 7.27       | −15.05          | 7.19  | −11.66 | 30 | <0.001 *
Sadness   | NM 11.15 ± 8.22 vs. MO 14.65 ± 10.08     | −3.49           | 9.19  | −2.12  | 30 | 0.043
Sadness   | NM 11.15 ± 8.22 vs. M 26.48 ± 12.57      | −15.32          | 15.86 | −5.38  | 30 | <0.001 *
Sadness   | NM 11.15 ± 8.22 vs. EO 23.11 ± 10.90     | −11.96          | 14.01 | −4.75  | 30 | <0.001 *
Conditions: NM = non-masked; MO = mouth only; EO = eyes only; M = masked. * p-value ≤ 0.012 (Bonferroni-corrected for multiple comparisons).
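The post hoc contrasts above are paired-samples t-tests evaluated against a Bonferroni-adjusted threshold (0.05/4 ≈ 0.012, matching the reported cutoff). As an illustrative sketch only (not the authors' analysis script; the function name and the per-participant data are hypothetical), the statistics reported in each row can be computed as:

```python
import math
import statistics

BONFERRONI_ALPHA = 0.05 / 4  # ~0.012, the adjusted threshold reported in Tables 2 and 3

def paired_t(cond_a, cond_b):
    """Paired-samples t-test: mean difference, SD of differences, t, and df.

    cond_a and cond_b hold one score per participant (e.g., error rates in the
    non-masked vs. masked conditions), so len(cond_a) == len(cond_b).
    """
    diffs = [a - b for a, b in zip(cond_a, cond_b)]
    n = len(diffs)
    mean_diff = statistics.mean(diffs)
    sd_diff = statistics.stdev(diffs)          # sample SD (n - 1 denominator)
    t = mean_diff / (sd_diff / math.sqrt(n))   # t statistic with df = n - 1
    return mean_diff, sd_diff, t, n - 1

# Hypothetical error rates for four participants in two conditions:
nm = [1.0, 2.0, 3.0, 4.0]
m = [0.0, 1.0, 2.0, 4.0]
md, sd, t, df = paired_t(nm, m)
print(md, sd, t, df)  # 0.75 0.5 3.0 3
```

The resulting t would then be converted to a p-value via the Student t distribution (e.g., scipy.stats.ttest_rel) and compared against BONFERRONI_ALPHA rather than 0.05.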
Table 3. Post hoc comparisons of reaction times for correct responses only.

Emotion   | Face Manipulation Conditions (ms; Mean ± SD)    | Mean Difference | SD     | t     | df | Sig.
Happiness | NM 957.33 ± 243.10 vs. MO 916.41 ± 167.18       | 40.92           | 188.19 | 1.21  | 30 | 0.235
Happiness | NM 957.33 ± 243.10 vs. M 1209.19 ± 281.68       | −251.86         | 243.84 | −5.75 | 30 | <0.001 *
Happiness | NM 957.33 ± 243.10 vs. EO 1233.80 ± 367.66      | −276.46         | 364.50 | −4.23 | 30 | <0.001 *
Neutral   | NM 1042.88 ± 385.10 vs. MO 1016.86 ± 419.07     | 26.02           | 196.80 | 0.74  | 30 | 0.467
Neutral   | NM 1042.88 ± 385.10 vs. M 1258.00 ± 351.75      | −215.12         | 298.40 | −4.01 | 30 | <0.001 *
Neutral   | NM 1042.88 ± 385.10 vs. EO 1313.94 ± 444.31     | −271.06         | 317.21 | −4.76 | 30 | <0.001 *
Anger     | NM 964.94 ± 223.95 vs. MO 1031.59 ± 390.74      | −66.64          | 256.85 | −1.45 | 30 | 0.159
Anger     | NM 964.94 ± 223.95 vs. M 1151.74 ± 219.49       | −186.80         | 178.61 | −5.82 | 30 | <0.001 *
Anger     | NM 964.94 ± 223.95 vs. EO 1118.97 ± 266.11      | −154.03         | 193.98 | −4.42 | 30 | <0.001 *
Sadness   | NM 1203.64 ± 304.12 vs. MO 1223.72 ± 419.99     | −20.07          | 249.88 | −0.45 | 30 | 0.658
Sadness   | NM 1203.64 ± 304.12 vs. M 1511.57 ± 400.47      | −307.92         | 385.49 | −4.45 | 30 | <0.001 *
Sadness   | NM 1203.64 ± 304.12 vs. EO 1527.12 ± 346.59     | −323.47         | 366.80 | −4.91 | 30 | <0.001 *
Conditions: NM = non-masked; MO = mouth only; EO = eyes only; M = masked. * p-value ≤ 0.012 (Bonferroni-corrected for multiple comparisons); asterisks denote statistically significant results.
Table 4. Correlation between TAS-20 score and mean reaction times for correct responses only in different face manipulation conditions.

Face Manipulation Condition |           | Difficulty Describing Feelings | Difficulty Identifying Feelings | Externally-Oriented Thinking | TAS-20 Total
MO                          | Pearson r | 0.49 **                        | 0.38 *                          | 0.16                         | 0.48 **
                            | p-value   | 0.005                          | 0.034                           | 0.383                        | 0.006
EO                          | Pearson r | 0.24                           | 0.35                            | 0.18                         | 0.37 *
                            | p-value   | 0.199                          | 0.056                           | 0.326                        | 0.038
NM                          | Pearson r | 0.53 **                        | 0.36 *                          | 0.09                         | 0.48 **
                            | p-value   | 0.002                          | 0.050                           | 0.608                        | 0.007
M                           | Pearson r | 0.07                           | 0.07                            | 0.08                         | 0.08
                            | p-value   | 0.705                          | 0.726                           | 0.661                        | 0.656
Conditions are listed as NM = non-masked; MO = mouth only; EO = eyes only; M = masked. * = p-value < 0.05; ** = p-value < 0.01.
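The values in Table 4 are Pearson product-moment correlations between TAS-20 (sub)scale scores and mean reaction times. As a minimal stdlib sketch (the function name and the toy data are ours, not from the study), each r value corresponds to:

```python
import math

def pearson_r(x, y):
    """Pearson product-moment correlation between two equal-length samples."""
    n = len(x)
    mean_x, mean_y = sum(x) / n, sum(y) / n
    dev_x = [v - mean_x for v in x]
    dev_y = [v - mean_y for v in y]
    num = sum(a * b for a, b in zip(dev_x, dev_y))                      # covariance term
    den = math.sqrt(sum(a * a for a in dev_x) * sum(b * b for b in dev_y))
    return num / den

# Hypothetical TAS-20 totals and mean RTs (ms) for five participants:
tas_total = [30, 42, 55, 61, 38]
mean_rt = [900, 1010, 1120, 1180, 980]
print(pearson_r(tas_total, mean_rt))  # positive r: higher alexithymia, slower responses
```

The significance of each r would then be tested against a t distribution with n − 2 degrees of freedom (e.g., via scipy.stats.pearsonr).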
Table 5. TAS-20 total and subscale scores (Difficulty Describing Feelings, Difficulty Identifying Feelings, Externally-Oriented Thinking): means, standard deviations, and ranges.

                                | Mean | S.D. | Range
TAS-20 total                    | 39.5 | 9.4  | 26–67
Difficulty Describing Feelings  | 11.1 | 3.8  | 5–23
Difficulty Identifying Feelings | 13.5 | 5.0  | 7–25
Externally-Oriented Thinking    | 16.4 | 3.0  | 13–27
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Share and Cite

Maiorana, N.; Dini, M.; Poletti, B.; Tagini, S.; Rita Reitano, M.; Pravettoni, G.; Priori, A.; Ferrucci, R. The Effect of Surgical Masks on the Featural and Configural Processing of Emotions. Int. J. Environ. Res. Public Health 2022, 19, 2420. https://doi.org/10.3390/ijerph19042420
