Article

Emojis Are Comprehended Better than Facial Expressions, by Male Participants

Department of Psychology, University of Milano-Bicocca, Piazza dell’Ateneo Nuovo 1, 20162 Milan, Italy
*
Author to whom correspondence should be addressed.
Behav. Sci. 2023, 13(3), 278; https://doi.org/10.3390/bs13030278
Submission received: 22 January 2023 / Revised: 19 March 2023 / Accepted: 20 March 2023 / Published: 22 March 2023
(This article belongs to the Special Issue Social Processing in People with or without Autism)

Abstract

Emojis are colorful ideograms resembling stylized faces, commonly used for expressing emotions in instant messaging, on social network sites, and in email communication. Notwithstanding their increasingly pervasive use in electronic communication, their psychological properties and communicative efficacy have been little investigated. Here, we presented 112 different human facial expressions and emojis (expressing neutrality, happiness, surprise, sadness, anger, fear, and disgust) to a group of 96 female and male university students engaged in recognizing their emotional meaning. Analyses of variance showed that male participants were significantly better than female participants at recognizing emojis (especially negative ones), while the latter were better than male participants at recognizing human facial expressions. Quite interestingly, male participants were better at recognizing emojis than human facial expressions per se. These findings are in line with recent evidence suggesting that male individuals may be more competent in and inclined toward using emojis to express their emotions in messaging (especially sarcasm, teasing, and love) than previously thought. Finally, the data indicate that emojis are less ambiguous than facial expressions (except for the neutral and surprise emotions), possibly because of their limited number of fine-grained details and their lack of the morphological features that convey facial identity.

1. Introduction

The main goal of the study was to compare, in the two sexes, the ability to comprehend emotional states conveyed by real faces with the ability to comprehend those conveyed by emojis, which are symbols used in long-distance communication. Emojis are colored ideographs, somewhat larger than alphabetical characters, which represent smileys or other entities such as hand signs, animals, food, etc., and are typically used in computer- or smartphone-mediated communication, particularly in messaging systems and social networks. They are often complementary to the written text and aimed at replacing non-verbal communication as much as possible [1]. Therefore, emojis seem to compensate quite effectively for the lack of non-verbal cues (such as body language, facial expressions, and voice prosody) in textual communication. In particular, face pictographs are used to enrich communication emotionally, promoting the expression of emotional states and feelings. Some studies, along with the extreme pervasiveness of this modern communication system in messaging, have demonstrated their effectiveness. For example, Aluja and coauthors [2] measured acoustic startle reflex modulation during the observation of emoji emotional faces and found higher acoustically evoked startle responses when viewing unpleasant emojis and lower responses for pleasant ones, similar to those obtained with real human faces as the emotional stimuli. Again, Cherbonnier and Michinov [3] compared emotion recognition mediated by different types of facial stimuli—pictures of real faces, emojis, and face drawings (emoticons, also known as smileys)—and found that, contrary to what one might suppose, participants were more accurate in detecting emotions from emojis than from real faces and emoticons. Liao et al. [4], comparing ERPs generated by viewing painful and neutral faces and emojis, found that the P300 was of greater amplitude in response to faces than to emojis, but both types of stimuli elicited late positive potentials (LPPs) of greater amplitude in response to negative than to positive emotions. The authors concluded that emojis and faces were processed similarly, especially at the later cognitive stages.
Emojis and emoticons have become a worldwide means of expressing emotions in computer-mediated communication, but neuroscientific knowledge of how the human brain recognizes them is still scarce. In particular, little is known about possible sex differences in the ability to comprehend emojis. Some studies have compared face and emoticon processing, and the results suggest a partial independence of the subserving neural circuits. Yuasa and coworkers [5] compared, through fMRI, the cerebral activations during the observation of Japanese faces and Japanese emoticons (e.g., (^_^), combinations of vertically organized letters and symbols) and found that the face fusiform area (FFA) was active only during the processing of human faces (as expected, e.g., [6]), and not of emoticons. However, the right inferior frontal gyrus, involved in assessing the emotional valence of a stimulus, was active during the processing of both types of stimuli. This was interpreted as an indication that, even if emoticons were not perceived as faces, they activated higher-order social areas devoted to the interpretation of facial emotional meaning. Similarly, using emoji stimuli, Chatzichristos and coworkers [7] failed to find FFA activation during the perception and matching of positive and negative emojis with words triggering autobiographical memories; however, they found the involvement of emotional and social areas such as the right temporal and inferior frontal areas and the amygdala nuclei. Other neuroimaging investigations reported that the FFA and the occipital face area (OFA) were strongly active during the perception of emoticons, which would explain how they are emotionally recognized and identified as facial stimuli [8,9,10].
The purpose of this study was manifold. First, the communicative effectiveness of the sets of emoji and face stimuli was assessed through preliminary validation. Then, it was investigated to what extent observers grasped the emotional meaning of real facial expressions and emojis, as a function of stimulus type, valence, and complexity. Another important aim of the study was to test the existence of sex differences in the ability to recognize the affective content of faces vs. emojis. Numerous studies have provided evidence that female participants do better than male participants in tasks involving deciphering emotions through facial expressions and non-verbal communication cues [11,12,13,14,15,16,17]. Female individuals would also be more likely than males to express their emotional experiences to others [18]. This sex difference would be more pronounced for subtle emotions (such as sadness or fear); it would also reflect a greater interest of females in social information [12] and a more empathic attitude toward the suffering of others [19]. In detail, some evidence suggests that females are more expressive and prone to externalizing emotions than male individuals, from adolescence through adulthood [20]. In particular, females would more frequently express states of sadness, fear, shame, or guilt than male individuals, while males would exhibit more aggressive behavior than females when feeling angry [21,22].
On the other hand, some authors have speculated that a gender difference might also exist in emoji use. Evidence has been provided that females use emojis more frequently than males [23]. Consistently, higher emoji usage and familiarity ratings for female participants than for male participants were reported in a large American survey [24] assessing the valence and the familiarity of 70 smileys (facial emojis). Given that emojis were introduced into text-based communication to express emotions and describe affective information more effectively, it might be hypothesized that females would make more intensive use of emojis and emoticons than males, analogously to what occurs in real-life social interactions [15,25]. However, this does not seem to be the case. Chen and coworkers [26] documented that 7.02% of male users used at least one emoji in their typical messages, while 7.96% of female users were likely to use one or more emojis. Herring and Dainas [27] reported that female and male social media users interpreted and used emojis similarly in Facebook messages. However, the authors reported that male users were significantly more inclined to use sentiment-related emojis (heartlets) than female users were (male: 19.4%; female: 17.6%), which contradicts the psychosocial literature according to which males would be less willing to express love in real social interactions than females [28]. According to the authors, such a finding would imply that, although males are reserved when expressing their love in real life, they are more willing to express love through emojis in textual communication. After all, it is possible that social behaviors change as technological contexts change.
In order to assess the existence of possible sex differences in the way emotions are processed and recognized in real faces and in emojis, two different samples of university students, matched for sex, age, cultural status, and residence area, were presented with a large set of real faces and emojis and asked to rate them for comprehensibility. The degree of recognizability of human facial expressions belonging to eight different human identities (half male, half female) was compared with that of emojis printed in eight different graphic styles. Seven different types of neutral, positive, and negative facial expressions were considered. Before experimentation, pictures of the stimuli were validated by different groups of students (of similar age and sex) recruited at the same university, by measuring the rates of correct recognition.
On the basis of the previous literature, it was expected that female participants would outperform male participants in the recognition of empathy-related, subtle, human facial expressions (e.g., fear) [29]. In addition, it was generally hypothesized that the emoji and face recognizability scores would be roughly comparable, with some occasional advantage for emojis, depending on the specific emotion [30,31]. The artificiality of this symbolic system would, in fact, be counterbalanced by the emojis’ lack of ambiguity, limited number of fine-grained features, and lack of internal noise related to the variety of human identities.
For example, previous studies reported that fearful emojis were recognized better than real fearful expressions [30]. Furthermore, fearful faces tended to be recognized more slowly than other emotions [32,33]. This disadvantage might be due to the configural subtleness of fear, or to the presence of details shared with the surprise expression (the wide-open mouth and eyes) in real faces.
On the contrary, emoji expressions would be more markedly distinctive across categories (e.g., fear emojis feature dark open mouths and evident white sclera in the eyes). Therefore, the emojis’ advantage over fearful human faces reported by some studies might be due to their more distinctive featural characteristics.

2. Materials and Methods

2.1. Participants

Ninety-six healthy students of a local university (48 males and 48 females) aged 18 to 35 years (mean age = 23.89) participated in the main study (preceded by a validation performed on a different sample). They were randomly assigned to the facial expression study (48 subjects: 24 males, mean age 23.37 years; 24 females, mean age 23.37 years) or the emoji recognition study (48 subjects: 24 males, mean age 23.62 years; 24 females, mean age 24.91 years). The sample size was verified through a power analysis, performed with the program G*Power 3.1, for the comparison of two independent groups with an alpha level of 0.05. All participants were right-handed, as assessed through the Edinburgh Handedness Inventory. All declared that they had never suffered from psychiatric or neurological disorders and had normal or corrected-to-normal vision. Before taking part in the experiment, participants signed written informed consent forms. The study was carried out in accordance with the relevant guidelines and regulations and was approved by the ethics committee of the University of Milano-Bicocca (CRIP, protocol number RM-2021-401). It was conducted online and programmed via Google Forms, https://www.google.com/forms (accessed on 20 May 2021).
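The paper reports only the tool (G*Power 3.1), the design (two independent groups), and the alpha level; effect size and target power are not stated. As an illustrative sketch of what such an analysis yields, not the authors' actual computation, the normal approximation to the two-independent-samples t-test gives the required group size for an assumed effect size d and power:

```python
from math import ceil
from scipy.stats import norm

def n_per_group(d, alpha=0.05, power=0.80):
    """Approximate per-group n for a two-sided comparison of two independent
    groups, via the normal approximation to the t-test."""
    z_alpha = norm.ppf(1 - alpha / 2)  # critical value for the two-sided test
    z_beta = norm.ppf(power)           # quantile matching the desired power
    return ceil(2 * ((z_alpha + z_beta) / d) ** 2)

print(n_per_group(0.8))  # large effect: about 25 participants per group
```

Under these assumed conventional settings (power = 0.80), the study's 48 participants per group would be sufficient to detect effects down to roughly d = 0.58.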

2.2. Stimuli

The stimuli were 112 different pictures: 56 human facial expressions and 56 emojis, each set depicting 8 variants of 7 universal facial expressions. Eight different identities (4 female, 4 male) were used for the human faces, and eight different graphic styles were used for the emojis. Stimuli were created and validated as described below.
Emoji. Emoji pictures were drawn from free web platforms (Apple, Google, Microsoft, WhatsApp, Twitter, Facebook, Joypixels, and www.icons8.it (accessed on 1 April 2021)) and represented Ekman's six basic emotions [34]: happiness, sadness, surprise, fear, anger, and disgust, plus neutrality, in eight different styles. They were matched for size (4 cm in diameter) and average luminance. The original stimuli were slightly modified to adjust their average luminance, color saturation, and size. Figure 1 illustrates some examples of stimuli. Stimuli were preliminarily validated in a behavioral study, by means of a test administered via Google Forms, to select the most easily recognizable emojis from a much larger set of 168 emojis comprising 24 different emoji styles for the 7 facial expressions.
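The luminance-equating step can be sketched as follows. This is a minimal illustration, not the authors' actual image-processing pipeline; it assumes 8-bit RGB arrays and the Rec. 601 luma weights for perceived luminance:

```python
import numpy as np

# Rec. 601 weights for the perceived luminance of an RGB image (assumption)
LUMA = np.array([0.299, 0.587, 0.114])

def mean_luminance(img):
    """Mean perceived luminance of an RGB image (float array, 0-255 range)."""
    return float((img @ LUMA).mean())

def match_luminance(img, target):
    """Rescale an image so its mean luminance approaches `target`,
    clipping the result back into the 0-255 range."""
    scale = target / mean_luminance(img)
    return np.clip(img * scale, 0, 255)

rng = np.random.default_rng(0)
img = rng.uniform(0, 255, size=(64, 64, 3))  # stand-in for a stimulus picture
equated = match_luminance(img, 128.0)
```

Scaling all three channels by the same factor shifts mean luminance toward a common target while preserving hue, which is why such adjustments leave the stimuli looking unchanged apart from brightness.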
Validation was performed beforehand on a group of 48 students (24 males, 24 females) with a mean age of 23 years. Participants in stimulus validation had normal or corrected-to-normal vision and had never suffered from neurological or psychiatric disorders. They were shown randomly mixed sequences of stimuli, one at a time, at the center of the screen. The task consisted of deciding, as accurately and quickly as possible, which of the emotion words provided was the most appropriate to describe the observed expression, by clicking a check mark. The eight sets (featuring 56 emojis) associated with the highest accuracy were selected as experimental stimuli. The average hit rate for the final set was 79.40%. In more detail, accuracy was 94.53% (SD = 0.51) for happiness, 76.04% (SD = 1.85) for surprise, 84.12% (SD = 1.76) for sadness, 96.09% (SD = 2.1) for anger, 57.29% (SD = 5.04) for fear, 91.15% (SD = 0.99) for disgust, and 56.51% (SD = 0.99) for neutrality.
Faces. Ten Caucasian students (5 males, 5 females) aged about 23 years were recruited for the photo shoot. They were required not to wear any paraphernalia (e.g., earrings, glasses, make-up, bows, clips, necklaces, tattoos, or piercings), and mustaches and beards were not permitted. All were required to wear a black shirt and to gather their hair behind the ears. The dark area above the forehead generated by the hairline was kept to preserve the humanness of the faces (the hair being, moreover, barely visible). For each of the seven emotional expressions, the actors had to portray a specific emotional state by focusing on a specific autobiographical episode, following the Stanislavsky method, and spontaneously express their mood. This procedure induced the actors to activate their emotional memory by recalling specific past experiences and to react to them by externalizing spontaneous emotions, instead of concentrating on reproducing facial mimicry dissociated from their momentary emotional state, which might look phony. They were advised to activate a positive scenario for 'surprise' (see Figure 1 for examples of stimuli). All participants provided written informed consent and signed the privacy release form.
The stimulus set was validated on a sample of 50 students (25 females, 24 males, and 1 genderfluid), with a mean age of about 23.7 years, different from the experimental sample. Participants in stimulus validation had normal or corrected-to-normal vision and had never suffered from neurological or psychiatric disorders. They were shown randomly mixed sequences of stimuli, one at a time, at the center of the screen. The task consisted of deciding, as accurately and quickly as possible, which of the emotion words provided was the most appropriate to describe the observed expression, by clicking a check mark. Stimuli were 56 pictures depicting the seven facial expressions exhibited by the 8 different actors. The results showed high accuracy in categorizing facial expressions (87.35%, in line with Carbon [35]); namely, hits were 98.5% for happiness, 86.7% for surprise, 80.1% for sadness, 89.3% for anger, 72.7% for fear, 85.97% for disgust, and 98.2% for neutrality. The stimulus set was also rated for facial attractiveness by a further group of 12 students (7 females and 5 males) aged 18–25 years. They were asked to rate the attractiveness of the neutral expression of each of the 8 identities on a 3-point Likert scale (1 = "not attractive", 2 = "average", and 3 = "attractive"). The results showed no difference in attractiveness between individuals of the two sexes, with an average score of 1.83 for females and 1.82 for males. Overall, the actors were rated as "average" in attractiveness, which supports the generalizability of the results to the normal-looking population. Face stimuli were 3.37 cm × 5 cm (3°22′ × 5°) in size.

2.3. Procedure

The emotion-recognition task consisted of 112 experimental trials, in which participants were first shown a portrait photograph of an adult face (or a facial emoji, according to the experiment) to be inspected for about 2 s. The stimuli were equiluminant, as shown by an ANOVA performed on the luminance values (F = 0.1, p = 0.99). Photos of faces and emojis were in color and were displayed at the center of the screen on a white background. Immediately below the face or the emoji was a list of words (neutrality, happiness, surprise, fear, anger, sadness, and disgust), from which participants had to select the emotion that they deemed the most appropriate to describe the meaning of the expression. Moreover, participants rated the degree of recognizability of the expression on a 3-point Likert scale (1 = 'not much', 2 = 'fairly', and 3 = 'very much'). A score of 0 was assigned if an incorrect expression was selected. The time allowed for perceiving and responding to the two queries was 5 s. Participants were instructed to observe one facial stimulus at a time and to respond within 5 s, without missing any answer. Only one choice per face/emoji was allowed. The task lasted about 15 min.
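The scoring rule described above (the 1–3 recognizability rating, replaced by 0 whenever the wrong emotion word is chosen) can be made explicit in a few lines; `score_trial` is a hypothetical helper written for illustration, not part of the authors' materials:

```python
def score_trial(true_emotion, chosen_emotion, likert_rating):
    """Score one trial: the 1-3 recognizability rating if the emotion label
    is correct, 0 otherwise (the scoring rule described in the text)."""
    if chosen_emotion != true_emotion:
        return 0
    if likert_rating not in (1, 2, 3):
        raise ValueError("rating must be 1, 2, or 3")
    return likert_rating

# Labelling an angry stimulus 'anger' with the highest rating scores 3;
# mislabelling it 'disgust' scores 0 regardless of the rating given.
print(score_trial('anger', 'anger', 3), score_trial('anger', 'disgust', 3))
```

Averaging these per-trial scores within each emotion and condition yields the 0–3 recognizability scores entered into the ANOVAs below.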

2.4. Data Analysis

The individual scores obtained from each participant, for each of the 7 facial expressions and each stimulation condition, underwent a three-way mixed-design ANOVA with two between-subjects factors, "sex" (2 levels: female and male) and "stimulus type" (2 levels: emoji and face), and one within-subjects (repeated-measures) factor, "emotion" (7 levels: happiness, neutrality, surprise, anger, sadness, fear, and disgust). Multiple post-hoc comparisons were performed using Tukey's test. The Greenhouse–Geisser correction was applied when epsilon < 1, and epsilon-corrected p values were computed. Finally, a two-way ANOVA was applied to all the raw data to assess the distribution of individual scores in relation to the sex of the viewer and the type of stimulus (regardless of facial expression).
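The sphericity check behind the Greenhouse–Geisser correction can be illustrated with Box's epsilon, computed from the covariance matrix of the seven repeated (emotion) measures. This is a textbook sketch, not the authors' software output; the correction then multiplies both degrees of freedom of the within-subjects F test by epsilon:

```python
import numpy as np

def gg_epsilon(cov):
    """Box/Greenhouse-Geisser epsilon from the k x k covariance matrix of
    k repeated measures: 1 under sphericity, 1/(k-1) at worst."""
    k = cov.shape[0]
    # double-center the covariance matrix
    col_means = cov.mean(axis=0, keepdims=True)
    centered = cov - col_means - col_means.T + cov.mean()
    return float(np.trace(centered) ** 2 / ((k - 1) * np.sum(centered ** 2)))

# A compound-symmetric covariance (equal variances, equal correlations)
# satisfies sphericity, so epsilon is exactly 1 and no correction is needed:
S = np.full((7, 7), 0.3)
np.fill_diagonal(S, 1.0)
```

With seven emotion levels, epsilon can fall as low as 1/6; an uncorrected F(6, 552) test would then be evaluated, after correction, against F(6·epsilon, 552·epsilon).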

3. Results

The ANOVA performed on the recognition rates of emojis vs. faces yielded a significant main effect of Emotion (F(6,552) = 31.575, p < 0.00001). Post-hoc comparisons showed that happiness was the most recognizable emotion (2.639, SE = 0.028) and fear the least recognizable (1.86, SE = 0.07), differing from all the others. The significant Emotion × Stimulus type interaction (F(6,552) = 13.87, p < 0.00001) and the relative post-hoc comparisons showed that recognizability rates were higher for emojis than for faces for all emotions except happiness and the neutral expression (see Figure 2).
The significant Emotion × Sex × Stimulus type interaction (F(6,552) = 2.3, p < 0.03) and post-hoc tests showed that female participants generally outperformed male participants in the recognition of facial expressions (especially anger, surprise, sadness, and disgust), while male participants outperformed female participants in the recognition of all emojis except fear (see Figure 3 for means and standard deviations).
Overall, while female participants were better at recognizing facial expressions than male participants, especially for surprise (p < 0.04), anger (p < 0.01), and sadness (p < 0.01), male participants were better at recognizing all emojis (especially neutral, surprise, and disgust; p < 0.01) than female participants, except for the emotion of fear, which was better recognized by females.
The two-way ANOVA performed on the recognizability scores as a function of the viewer's sex and stimulus type showed a significant main effect of stimulus type (F(1,167) = 15.38, p < 0.0002), with higher scores for emojis than for faces, and of sex (F(1,167) = 40, p < 0.0001), with better overall performance for female than for male participants. However, the significant Stimulus type × Sex interaction (F(1,167) = 65, p < 0.0001) and the relative post-hoc comparisons showed that, while there was no difference between emojis and faces for female participants, males performed much better than females in recognizing emojis (p < 0.0033) and much worse than females in recognizing facial expressions (p < 0.00008). Furthermore, males were much better at recognizing emojis than facial expressions (p < 0.00008), as clearly visible in Figure 4.
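The reported omnibus statistics can be sanity-checked by recomputing the p-values from each F ratio and its degrees of freedom. The sketch below uses SciPy's F-distribution survival function with the values reported above:

```python
from scipy.stats import f

def p_from_F(F_value, df_num, df_den):
    """Upper-tail p-value of an F statistic via the survival function."""
    return float(f.sf(F_value, df_num, df_den))

# Reported main effect of stimulus type, F(1,167) = 15.38:
p_stim = p_from_F(15.38, 1, 167)  # on the order of 1e-4, consistent with p < 0.0002
# Reported Stimulus type x Sex interaction, F(1,167) = 65:
p_int = p_from_F(65.0, 1, 167)    # far smaller, consistent with p < 0.0001
```

The same check applies to the three-way ANOVA effects, e.g., the Emotion main effect F(6,552) = 31.575 yields a p-value well below the reported 0.00001 threshold.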

4. Discussion

This study compared the recognizability of human facial expressions and emojis, balanced for the number of emotions and the number of styles/identities tested. Regardless of the sex of the viewers, some emojis were recognized more clearly than facial expressions, especially anger. This result is similar to that of Fischer and Herbert [30], who contrasted facial expressions, emojis, and emoticons and found that both emojis and faces were quite good at representing the associated emotions and therefore also at reducing ambiguity.
In addition, for some emotions (e.g., anger), emojis were even better than faces. It should be considered that angry and disgusted emojis, for example, have a slightly different coloring (reddish and greenish, respectively), which makes them distinctive. Quite similarly, Cherbonnier and Michinov [3], comparing the comprehension of real faces, emojis, and face drawings, found that subjects were more accurate in detecting emotions from emojis than from the other stimuli, including faces, especially for negative emotions such as disgust and fear.
Overall, in this study, happy emojis were recognized more accurately than the other emojis, while fearful expressions were recognized with the greatest uncertainty. This pattern of results fits with the findings reported by some authors [31,32] showing the fastest RTs to joyful expressions and the slowest to fearful ones. Similarly, Fischer and Herbert [30] found the fastest responses for happiness and anger, followed by surprise, sadness, neutral, and lastly, fear. The same pattern of results was reported for real human faces [36].
These data are also in accordance with previous research literature showing how children were more accurate in recognizing happy and sad emoticons, while doing worse in recognizing fear and disgust emoticons [23]. The primacy of the expression of happiness might be linked to its specific and unique physiognomic characteristics (e.g., the typical U-shaped mouth curvature) not visible on other emotional expressions [36], or to the fact that positive expressions are more commonly exhibited in social settings (e.g., social smiling), as hypothesized by Schindler and Bublatzky [37]. A similar advantage for recognizing happiness has been found with the Emoji stimuli, also characterized by the (even more prominent) U-shaped mouth feature for smiling. The emotion of fear, on the other hand, was often confused with other emotions, particularly with surprise, as both are characterized by the arching of eyebrows and the widening of the eyes and mouth.
In this study, female participants outperformed male participants in the recognition of fearful emojis. This could be partly related to the fact that fearful emojis were more ambiguous and difficult to distinguish from surprise, as the hits were only 57.29% in the validation assessment. Similarly, in a study by Hoffmann et al. [38], female participants were significantly better at recognizing anger, disgust, and fear (subtle emotions) than male participants. According to the authors, this finding was related to the greater ability of females to perceive emotions in a gestalt fashion, making quick and automatic judgements (see also [39] in this regard). Again, females were found to be faster than males at recognizing fear, sadness, and anger; they were also more accurate than males in recognizing fear [40]. In this study, statistical analyses showed that male participants were significantly better than female participants at recognizing emojis, while females were better than males at recognizing human facial expressions. Quite interestingly, males were better at recognizing emojis than human facial expressions. In general, the female ability to recognize human facial expressions more accurately (or faster) than males is a well-corroborated notion in psychology and cognitive neuroscience [15,41]. Further evidence has demonstrated that female individuals, compared to male individuals, react more strongly when viewing affective stimuli (such as International Affective Picture System images) involving human beings, thus showing a higher empathic response [42,43,44]. Neuroimaging evidence has also shown that face processing engages the FFA bilaterally in females and unilaterally (i.e., only over the right hemisphere, rFFA) in males [45], possibly supporting a deeper/more efficient analysis of subtle facial mimicry in females.
However, the presumed female superiority in the ability to comprehend emotional non-verbal cues does not seem to include the processing of emojis. Therefore, it is possible that emojis were processed differently, half-way between symbols (such as emoticons or icons) and real faces. Indeed, emojis are colored ideographs, not real body parts. While possessing some facial features able to stimulate social brain areas (e.g., the OFA, the middle temporal gyrus, and the orbitofrontal cortex), emojis seem to stimulate weakly, or not at all, the face fusiform area, sensitive to configural face information [7].
An interesting study by Wolf [46], observing the effect of gender on the use of emoticons in cyberspace, found that, when communicating with women in mixed-gender newsgroups, men adopt the female standard of expressing more emotion and are therefore not less expressive, as they are with real facial mimicry. Most emoticon use by women, however, fell into the meaning categories of humor, solidarity, support, assertion of positive feelings, and thanks, whereas the bulk of male emoticon use expressed teasing and sarcasm. This gender difference in the valence of the emojis used is compatible with the present pattern of results, showing that male participants outperformed female participants with negative emojis, while the latter were better than the former at deciphering positive emojis such as happy or even neutral ones.
Overall, it can be hypothesized that the unexpected male proficiency in emoji recognition might reflect a progressively greater inclination of male participants (namely, students) to express their emotions, particularly negative but also amorous ones (according to other studies), through messaging in cyberspace. In addition, it may be enlightening to compare the present pattern of results with the hyper-systemizing theory of Baron-Cohen [47], according to which males would be better at processing visuospatial information at the analytic level (e.g., emojis or emoticons) and females better at empathizing with others (e.g., with fearful or sad facial expressions). These changes in social behavior should be further assessed and investigated over time.
It should be considered, however, that no empirical data directly suggest a male superiority in the ability to visually process or recognize objects, symbols, icons, or alphanumeric characters. No sex difference has ever been shown in the ability to process object identity [48], visually recognize pictures of detailed objects [49], or visually recognize words [50].
On the other hand, several pieces of evidence have suggested an association between the male sex, as well as high testosterone levels, and better performance in visuospatial tasks, including the rotation of 3D shapes [51,52]. Although some authors have found a correlation between 'systemizing' male traits and performance in mental rotation tasks, e.g., [53], visuospatial or mental rotation ability does not seem to be correlated with face processing, or with the visual processing of colored face-like objects. It can ultimately be hypothesized that males in this study were less attracted to, or attentively oriented toward, social information [12,15], represented here by real human faces, and were therefore less able to capture subtle facial mimicry (such as that of the surprise, anger, and sadness expressions). Sexual dimorphism in attention to social vs. non-social (object) stimuli has been widely reported [54,55] and might ultimately contribute to explaining why male participants in this study (unlike females) were worse at reading emotions in human faces than in emojis.

5. Study Limits and Further Directions

It would be interesting to explore a wider range of emojis to further investigate this sex effect. It would also be important to gain neuroimaging data to explore whether these differences across the sexes are paralleled by sex differences in the activation of the visual areas devoted to face vs. emoji processing. One limitation of the present study is the lack of response time data, due to the methodological paradigm involving qualitative and quantitative assessment but not response speed measurement (because of COVID-19 restrictions). Another limitation is that facial expressions were shown here as static pictures rather than videos, and static images can be harder to recognize than dynamic facial expressions.

Author Contributions

A.M.P. conceived and planned the experiment and wrote the paper. L.D.N. and A.C. prepared the stimuli and carried out the data collection. A.M.P. performed statistical analyses and data illustration. A.M.P. and L.D.N. interpreted the data. All authors provided critical feedback and helped shape the research, analysis, and manuscript. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

The study was conducted in accordance with the Declaration of Helsinki, and approved by the Ethics Committee of the University of Milano-Bicocca (CRIP, protocol number RM-2021-401) for studies involving humans.

Informed Consent Statement

Informed written consent was obtained from all subjects involved in the study.

Data Availability Statement

Anonymized data and details about the preprocessing/analyses are available to colleagues upon request.

Conflicts of Interest

The authors declare no conflict of interest.

Abbreviations

The following abbreviations are used in this manuscript:
ANOVA	Analysis of variance
ERP	Event-related potential
FFA	Fusiform face area
fMRI	Functional magnetic resonance imaging
IAPS	International Affective Picture System
LPP	Late positive potential
OFA	Occipital face area
SD	Standard deviation

References

1. Bai, Q.; Dan, Q.; Mu, Z.; Yang, M. A Systematic Review of Emoji: Current Research and Future Perspectives. Front. Psychol. 2019, 15, 2221.
2. Aluja, A.; Balada, F.; Blanco, E.; Lucas, I.; Blanch, A. Startle reflex modulation by affective face “Emoji” pictographs. Psychol. Res. 2020, 84, 15–22.
3. Cherbonnier, A.; Michinov, N. The recognition of emotions beyond facial expressions: Comparing emoticons specifically designed to convey basic emotions with other modes of expression. Comput. Hum. Behav. 2021, 118, 106689.
4. Liao, W.; Zhang, Y.; Huang, X.; Xu, X.; Peng, X. “Emoji, I can feel your pain”—Neural responses to facial and emoji expressions of pain. Biol. Psychol. 2021, 163, 108134.
5. Yuasa, M.; Saito, K.; Mukawa, N. Emoticons convey emotions without cognition of faces: An fMRI study. In Proceedings of the CHI 2006 Conference on Human Factors in Computing Systems, Montreal, QC, Canada, 22–27 April 2006; pp. 1565–1570.
6. Kanwisher, N.; Stanley, D.; Harris, A. The fusiform face area is selective for faces not animals. Neuroreport 1999, 10, 83–87.
7. Chatzichristos, C.; Morante, M.; Andreadis, N.; Kofidis, E.; Kopsinis, Y.; Theodoridis, S. Emojis influence autobiographical memory retrieval from reading words: An fMRI-based study. PLoS ONE 2020, 15, e0234104.
8. Kim, K.W.; Lee, S.W.; Choi, J.; Kim, T.M.; Jeong, B. Neural correlates of text-based emoticons: A preliminary fMRI study. Brain Behav. 2016, 10, e00500.
9. Liu, J.; Harris, A.; Kanwisher, N. Perception of face parts and face configurations: An fMRI study. J. Cogn. Neurosci. 2010, 22, 203–211.
10. Huang, S.C.; Bias, R.G.; Schnyer, D. How are icons processed by the brain? Neuroimaging measures of four types of visual stimuli used in information systems. J. Assoc. Inf. Sci. Technol. 2015, 66, 702–720.
11. Proverbio, A.M.; Brignone, V.; Matarazzo, S.; Del Zotto, M.; Zani, A. Gender and parental status affect the visual cortical response to infant facial expression. Neuropsychologia 2006, 44, 2987–2999.
12. Proverbio, A.M.; Zani, A.; Adorni, R. Neural markers of a greater female responsiveness to social stimuli. BMC Neurosci. 2008, 30, 56.
13. Hall, J.K.; Hutton, S.B.; Morgan, M.J. Sex differences in scanning faces: Does attention to the eyes explain female superiority in facial expression recognition? Cogn. Emot. 2010, 24, 629–637.
14. Thompson, A.E.; Voyer, D. Sex differences in the ability to recognize non-verbal displays of emotion: A meta-analysis. Cogn. Emot. 2014, 28, 1164–1195.
15. Proverbio, A.M. Sex differences in the social brain and in social cognition. J. Neurosci. Res. 2021, 95, 222–231.
16. Montagne, B.; Kessels, R.P.C.; Frigerio, E.; de Haan, E.H.; Perrett, D.I. Sex differences in the perception of affective facial expressions: Do men really lack emotional sensitivity? Cogn. Process. 2005, 6, 136–141.
17. Olderbak, S.; Wilhelm, O.; Hildebrandt, A.; Quoidbach, J. Sex differences in facial emotion perception ability across the lifespan. Cogn. Emot. 2019, 33, 579–588.
18. Dimberg, U.; Lundquist, L.O. Gender differences in facial reactions to facial expressions. Biol. Psychol. 1990, 30, 151–159.
19. Han, S.; Fan, Y.; Mao, L. Gender difference in empathy for pain: An electrophysiological investigation. Brain Res. 2008, 1196, 85–93.
20. Chaplin, T.M.; Aldao, A. Gender differences in emotion expression in children: A meta-analytic review. Psychol. Bull. 2013, 139, 735–765.
21. Carré, J.M.; Murphy, K.R.; Hariri, A.R. What lies beneath the face of aggression? Soc. Cogn. Affect. Neurosci. 2013, 8, 224–229.
22. Wiggert, N.; Wilhelm, F.H.; Derntl, B.; Blechert, J. Gender differences in experiential and facial reactivity to approval and disapproval during emotional social interactions. Front. Psychol. 2015, 6, 1372.
23. Oleszkiewicz, A.; Karwowski, M.; Pisanski, K.; Sorokowski, P.; Sobrado, B.; Sorokowska, A. Who uses emoticons? Data from 86 702 Facebook users. Personal. Individ. Diff. 2017, 119, 289–295.
24. Jones, L.L.; Wurm, L.H.; Norville, G.A.; Mullins, K.L. Sex differences in emoji use, familiarity, and valence. Comput. Hum. Behav. 2020, 108, 106305.
25. Kring, A.M.; Gordon, A.H. Sex differences in emotion: Expression, experience, and physiology. J. Personal. Soc. Psychol. 1998, 74, 686–703.
26. Chen, Z.; Lu, X.; Ai, W.; Li, H.; Mei, Q.; Liu, X. Through a Gender Lens: Learning Usage Patterns of Emojis from Large-Scale Android Users. In Proceedings of the WWW 2018: The 2018 Web Conference, Lyon, France, 23–27 April 2018.
27. Herring, S.C.; Dainas, A.R. Receiver interpretations of emoji functions: A gender perspective. In Proceedings of the 1st International Workshop on Emoji Understanding and Applications in Social Media (Emoji2018), Stanford, CA, USA, 25 June 2018.
28. Briton, N.J.; Hall, J.A. Beliefs about female and male nonverbal communication. Sex Roles 1995, 32, 79–90.
29. Baptista, C.; Corrêa Hertzberg, J.; Ely das Neves, F.; Flores Prates, P.; Foletto Silveira, J.; Lemos Vasconcellos, S. Gender and the capacity to identify facial emotional expressions. Estudos Psicol. 2017, 22, 1–9.
30. Fischer, B.; Herbert, C. Emoji as Affective Symbols: Affective Judgments of Emoji, Emoticons, and Human Faces Varying in Emotional Content. Front. Psychol. 2021, 20, 645173.
31. Righart, R.; de Gelder, B. Rapid influence of emotional scenes on encoding of facial expressions: An ERP study. Soc. Cogn. Affect. Neurosci. 2008, 3, 270–278.
32. Calvo, M.G.; Lundqvist, D. Facial expressions of emotion (KDEF): Identification under different display-duration conditions. Behav. Res. Methods 2008, 40, 109–115.
33. Palermo, R.; Coltheart, M. Photographs of facial expression: Accuracy, response times, and ratings of intensity. Behav. Res. Methods Instrum. Comput. 2004, 36, 634–638.
34. Ekman, P. Emotions inside out. 130 years after Darwin’s “The Expression of the Emotions in Man and Animal”. Ann. N. Y. Acad. Sci. 2003, 1000, 1–6.
35. Carbon, C.C. Wearing Face Masks Strongly Confuses Counterparts in Reading Emotions. Front. Psychol. 2020, 25, 566886.
36. Sun, A.; Li, Y.; Huang, Y.-M.; Li, Q.; Lu, G. Facial expression recognition using optimized active regions. Hum.-Cent. Comput. Inform. Sci. 2018, 8, 33.
37. Schindler, S.; Bublatzky, F. Attention and emotion: An integrative review of emotional face processing as a function of attention. Cortex 2020, 30, 362–386.
38. Hoffmann, H.; Kessler, H.; Eppel, T.; Rukavina, S.; Traue, H.C. Expression intensity, gender and facial emotion recognition: Women recognize only subtle facial emotions better than men. Acta Psychol. 2010, 135, 278–283.
39. Hall, J.A.; Matsumoto, D. Gender differences in judgments of multiple emotions from facial expressions. Emotion 2004, 4, 201–206.
40. Kapitanović, A.; Tokić, A.; Šimić, N. Differences in the recognition of sadness, anger, and fear in facial expressions: The role of the observer and model gender. Arch. Ind. Hyg. Toxicol. 2022, 73, 308–313.
41. Wingenbach, T.S.H.; Ashwin, C.; Brosnan, M. Sex differences in facial emotion recognition across varying expression intensity levels from videos. PLoS ONE 2018, 13, e0190634.
42. Klein, S.; Smolka, M.N.; Wrase, J.; Grusser, S.M.; Mann, K.; Braus, D.F.; Heinz, A. The Influence of Gender and Emotional Valence of Visual Cues on fMRI Activation in Humans. Pharmacopsychiatry 2003, 36, 191–194.
43. Gard, M.G.; Kring, A.M. Sex differences in the time course of emotion. Emotion 2007, 7, 429–437.
44. Proverbio, A.M.; Adorni, R.; Zani, A.; Trestianu, L. Sex differences in the brain response to affective scenes with or without humans. Neuropsychologia 2009, 47, 2374–2388.
45. Proverbio, A.M. Sexual dimorphism in hemispheric processing of faces in humans: A meta-analysis of 817 cases. Soc. Cogn. Affect. Neurosci. 2021, 16, 1023–1035.
46. Wolf, A. Emotional Expression Online: Gender Differences in Emoticon Use. Cyberpsychol. Behav. 2000, 3, 827–833.
47. Baron-Cohen, S. The extreme male brain theory of autism. Trends Cogn. Sci. 2002, 6, 248–254.
48. Tlauka, M. The processing of object identity information by women and men. PLoS ONE 2015, 10, e0118984.
49. Proverbio, A.M.; Riva, F.; Martin, E.; Zani, A. Face coding is bilateral in the female brain. PLoS ONE 2010, 21, e11242.
50. Sato, M. The neurobiology of sex differences during language processing in healthy adults: A systematic review and a meta-analysis. Neuropsychologia 2020, 140, 107404.
51. Hugdahl, K.; Thomsen, T.; Ersland, L. Sex differences in visuo-spatial processing: An fMRI study of mental rotation. Neuropsychologia 2006, 44, 1575–1583.
52. Hooven, C.K.; Chabris, C.F.; Ellison, P.T.; Kosslyn, S.M. The relationship of male testosterone to components of mental rotation. Neuropsychologia 2004, 42, 782–790.
53. Brosnan, M.; Daggar, R.; Collomosse, J. The relationship between systemising and mental rotation and the implications for the extreme male brain theory of autism. J. Autism Dev. Disord. 2010, 40, 1–7.
54. Connellan, J.; Baron-Cohen, S.; Wheelwright, S.; Batki, A.; Ahluwalia, J. Sex differences in human neonatal social perception. Infant Behav. Dev. 2000, 23, 113–118.
55. Harrop, C.; Jones, D.R.; Sasson, N.J.; Zheng, S.; Nowell, S.W.; Parish-Morris, J. Social and Object Attention Is Influenced by Biological Sex and Toy Gender-Congruence in Children with and without Autism. Autism Res. 2020, 13, 763–776.
Figure 1. Examples of facial expressions and emojis used to illustrate the 7 affective states, in 4 different “identities”: 1 female, 1 male, and 2 different emoji styles.
Figure 2. Mean recognizability scores (with standard deviations) relative to the 7 expressions illustrated by human faces and emoji.
Figure 3. Mean recognizability scores (with standard deviations) measured in male and female participants in response to the seven emotions expressed by human faces and emojis.
Figure 4. Mean scores and distribution of individual performance as a function of stimulus type and sex of viewers. The y-axis shows recognizability scores; the x-axis shows participants’ sex and stimulus type.
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.

Share and Cite

MDPI and ACS Style

Dalle Nogare, L.; Cerri, A.; Proverbio, A.M. Emojis Are Comprehended Better than Facial Expressions, by Male Participants. Behav. Sci. 2023, 13, 278. https://doi.org/10.3390/bs13030278


