Article

Examining the Visual Screening Patterns of Emotional Facial Expressions with Gender, Age and Lateralization

by Banu Cangöz 1, Arif Altun 1, Petek Aşkar 2, Zeynel Baran 1 and Sacide Güzin Mazman 1
1 Hacettepe University, 06800 Ankara, Turkey
2 TED University, 06420 Ankara, Turkey
J. Eye Mov. Res. 2013, 6(4), 1-15; https://doi.org/10.16910/jemr.6.4.3
Published: 5 November 2013

Abstract

The main objective of this study was to investigate the effects of model age, observer gender, and lateralization on visual screening patterns while looking at emotional facial expressions. Data were collected with eye tracking methodology. The areas of interest were set to include the eyes, nose, and mouth. The selected eye metrics were first fixation duration, fixation duration, and fixation count. These metrics were recorded for different emotional expressions (sad, happy, neutral) and conditions (age of model, part of face, and lateralization). The results revealed that participants looked at older faces for a shorter time and fixated on them less than on younger faces. The study also showed that when participants were asked to passively look at facial expressions, the eyes were important areas in determining sadness and happiness, whereas the eyes and nose were important in determining a neutral expression. The longest-fixated face area was the eyes for both young and old models. Lastly, the hemispheric lateralization hypothesis regarding emotional face processing was supported.

Introduction

Cognitive and neuropsychological studies have shown that the human face is a very special sort of stimulus in terms of functions such as perception, recognition, adaptation, social interaction, and non-verbal communication. Despite their diversity due to social tags and cultural norms, facial expressions conveying happiness, sadness, anger, fear, disgust, and surprise are considered universal. When observing a face, the observer gains information about the mental and emotional state of the person behind it. In this respect, the cues obtained from faces are quite functional in regulating human relationships. The processing of face-related information is fast and automated and is associated with certain areas of the brain such as the fusiform face area (Nasr & Tootell, 2012).
Many studies with adults have shown that females are more accurate than males at identifying emotional facial expressions (Hall and Matsumoto, 2004; cf. Vassallo, Cooper and Douglas, 2009).
Although emotional facial expressions are recognized, localized, and fixated more quickly and accurately than neutral ones (Calvo, Avero and Lundqvist, 2006), some studies have indicated that males and females are sensitive to different kinds of emotional expressions. For example, Goos and Silverman (2002) showed that females are more sensitive than males to angry and sad facial expressions. The fact that females recognize negative emotions more rapidly is perhaps better explained from an evolutionary perspective than from general learning principles: because females are responsible for rearing, caring for, and protecting their children, they have developed the skills needed to protect their offspring from danger (Hampson, van Anders and Mullin, 2006). Other studies, however, have reported no gender differences in the speed and accuracy of recognizing emotional facial expressions (Grimshaw, Bulman-Fleming and Ngo, 2004).
Studies using real human faces have shown contradictory results. For example, Fox and Damjanovic (2006) found that the eyes provide a key signal of threat and that sad facial expressions mediate the search advantage for threat-related facial expressions. Juth, Lundqvist, Karlsson and Ohman (2005) found that happy faces were recognized more quickly than angry and fearful ones and speculated that this might be due to the ease of processing happy faces. Calvo, Avero and Lundqvist (2006), on the other hand, reported an advantage for angry faces. Thus, while some studies show an advantage for angry facial expressions (in accuracy, response time, fixation, etc.; e.g., Calvo, Avero and Lundqvist, 2006), others have found that angry and happy faces hold an advantage over sad or fearful expressions (Williams, Moss, Bradshaw and Mattingley, 2005).
In exploring observation behavior, researchers apply various techniques to solicit information. One frequently used technique is Bubbles, which examines categorization and recognition performance by presenting sparse stimuli in order to determine the diagnostic visual information (Gosselin and Schyns, 2001). The Bubbles technique has been used in face recognition and categorization studies (Smith, Cottrell, Gosselin and Schyns, 2005; Humphreys, Gosselin, Schyns and Johnson, 2006; Vinette, Gosselin and Schyns, 2004). Another approach is eye tracking with eye movement metrics.
Previous studies have investigated eye tracking patterns under a given task or instruction (emotional rating, identifying the emotional valence category, naming the emotion). These studies indicated a high categorization success rate for happy faces but not for fearful faces (Calvo and Nummenmaa, 2009). On the other hand, there are only a limited number of studies on gender differences in eye tracking of emotional facial expressions without a task or instruction. Yet it is critically important to identify which area(s) of the face are attended to while looking at a face in order to understand the underlying mechanisms of non-verbal communication.
It has been claimed that specific facial areas provide more important cues for determining and coding certain emotional facial expressions. Eisenbarth and Alpers (2011) determined that, regardless of the emotion indicated by the facial expression, the first eye fixation is oriented to the eyes and the mouth. When the facial expression was sad, fixation was on the eyes; when it was angry, fixation was on the mouth. In happy facial expressions, fixation was on the mouth, while in fearful and neutral facial expressions fixation was distributed equally between the eyes and the mouth. As these researchers showed, the eyes were important predictors of sadness and the mouth an important predictor of happiness. In another study, in which participants were asked not just to passively look at the expressions but to actively indicate the valence of the emotion, participants’ eye movements were directed towards the areas specific to the emotion, that is, “the smiling mouth” or “the sad eyes” (Calvo and Nummenmaa, 2009).
Previous studies have indicated no gender differences when participants are asked to identify the type of emotional expression, rate the emotional valence of the stimuli, or simply look at the stimuli, provided the stimulus is presented for less than 10 seconds (Kirouac and Doré, 1985).
Regarding emotional face processing and hemispheric lateralization, research findings point to two main lateralization hypotheses: (1) the hemispheric lateralization hypothesis (HLH) and (2) the valence-specific lateralization hypothesis (VSLH). According to the hemispheric lateralization hypothesis, the right hemisphere is more specialized in processing emotions than the left. Studies both supporting (Bourne and Maxwell, 2010) and contradicting (Fusar-Poli et al., 2009) the HLH exist in the literature. The VSLH, on the other hand, posits that the left and right hemispheres are each specialized in processing different kinds of emotions, with the left hemisphere specialized mainly in processing positive emotions and the right hemisphere in processing negative emotions (Jansari, Rodway and Goncalves, 2011).
Despite the numerous eye tracking studies on the recognition of facial expressions, conflicting results remain. The aim of this study is to shed light on this issue by taking into account a set of variables that may be responsible for the prior conflicting results, specifically the observer's gender and the age of the model, as these variables are related to different lateralization processes.

Method

Participants

The participants were forty volunteer undergraduate students (20 females, Mage = 20.25, SD = 0.64; 20 males, Mage = 21.60, SD = 1.50) with ages ranging from 19 to 27 years. All participants (92.5% right-handed) had normal or corrected-to-normal vision and were allowed to wear their glasses or contact lenses if required. All participants gave written informed consent.

Picture Battery

Forty-eight digital, colored, static photographs of real faces with happy (16 pictures), neutral (16 pictures), and sad (16 pictures) emotional expressions were selected from the Vital Longevity Face Database (VLFD; Minear and Park, 2004) and used in the study. The stimulus battery consisted of pictures of 24 female and 24 male models, 24 young (ages 19 to 27) and 24 elderly (ages 65 to 84), with the three different emotional expressions. To identify whether the emotional expressions would be interpreted similarly in Turkish culture, a pilot study was conducted with 49 randomly selected photos from the VLFD.
For this purpose, a 9-point Likert scale (sad: 1, neutral: 5, happy: 9) was administered to N = 80 volunteer students (50 females, Mage = 21.30, SD = 1.34; 30 males, Mage = 21.87, SD = 1.11), who were not participants in the main study, to rate the emotional valence of the faces. The photos were presented to the students in sequence within one session via a projector, and responses were recorded on a recording sheet.
The pilot study revealed that face pictures representing the three emotion conditions (happy, sad, neutral) in American culture were rated accordingly and classified into the same emotion categories by the 80 Turkish students (happy, 7.61 ± 0.52; sad, 2.74 ± 0.50; neutral, 4.48 ± 0.45) (Table 1). In addition, average picture ratings and percentages on the 9-point Likert scale were calculated according to the happy, sad, and neutral categories (happy: 9, 8, 7; sad: 3, 2, 1; neutral: 6, 5, 4). The selected photos were rated 83.29% as happy, 73.92% as sad, and 85.33% as neutral. These findings showed that the emotional values of the photos used in the study were valid for the Turkish culture as well.
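To make the category-agreement computation concrete, the following minimal Python sketch shows how a photo's mean rating, standard deviation, and percentage of raters falling into its intended category could be derived from 9-point Likert responses. The function name, the example responses, and the data layout are illustrative assumptions, not the procedure actually used in the pilot study; only the category cut-offs (sad: 1–3, neutral: 4–6, happy: 7–9) follow the text above.

import statistics

# Category cut-offs on the 9-point scale, as described in the text
CATEGORY_RANGE = {"sad": range(1, 4), "neutral": range(4, 7), "happy": range(7, 10)}

def summarize_photo(ratings, intended_category):
    """Return mean, SD, and percentage of raters whose response fell in the intended category."""
    mean = statistics.mean(ratings)
    sd = statistics.stdev(ratings)
    agreement = 100 * sum(r in CATEGORY_RANGE[intended_category] for r in ratings) / len(ratings)
    return mean, sd, agreement

# Made-up responses from 10 raters for a photo intended as "happy"
print(summarize_photo([8, 9, 7, 8, 9, 7, 6, 8, 9, 8], "happy"))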

Procedure

Forty-eight photos with emotional facial expressions were presented to the participants in random order on a 17-inch TFT monitor, as represented in Figure 1. Each photo was presented on the screen for 5 seconds, followed each time by a one-second display showing a black plus sign in the center of a 14.5 × 17 cm gray box.
The study started with an instruction and a dummy picture (to warm up to the experiment). Each emotional facial expression (happy, sad, and neutral) was shown 16 times in random order. Participants wearing glasses or contact lenses were permitted to wear them during testing, as this did not interfere with the eye tracking data collection. Participants were seated approximately 60–65 cm away from the computer screen, and calibration was performed for each participant separately. The minimum acceptable calibration level was set to 70%, and participants with lower calibration scores were omitted from the study. The visual angle of the display was 30° × 27° and the visual angle of the photos was approximately 14° × 16°. Eye movements were recorded using a 120 Hz remote infrared eye tracking system (Tobii T120 eye tracker) with 0.5° accuracy. Participants were instructed to passively look at the photos. Two sets of areas of interest (AOIs) were defined for the eye metrics: (a) parts of the face (eyes, nose, and mouth; Figure 2a) and (b) lateralization (hemifields of the face: left and right sides; Figure 2b).
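As a rough arithmetic check of the reported geometry (assuming the stimulus area corresponds to the 14.5 × 17 cm box given above and a viewing distance of about 60 cm), the photo visual angle can be reproduced with the standard formula; the helper function below is ours, not part of the original procedure.

import math

def visual_angle_deg(size_cm, distance_cm):
    """Visual angle (degrees) subtended by a stimulus of a given size at a given viewing distance."""
    return math.degrees(2 * math.atan(size_cm / (2 * distance_cm)))

print(visual_angle_deg(14.5, 60))  # ~13.8 degrees, close to the reported ~14 degrees (width)
print(visual_angle_deg(17.0, 60))  # ~16.1 degrees, close to the reported ~16 degrees (height)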
The dependent measures were fixation duration, first fixation location and duration, and fixation count, analyzed by participant gender, emotional expression, age of model, and part of face. These eye metrics were selected based on the relevant literature on face recognition studies using eye movements (Eisenbarth and Alpers, 2011; Van Belle, de Graef, Verfaillie, Rossion and Lefèvre, 2010; Wong, Cronin-Golomb and Neargarder, 2005). All selected AOIs (parts of face and/or hemifields) were made equal in size. At the end of the test, all participants were debriefed.
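The three dependent measures can be illustrated with a small Python sketch that assumes fixations have already been assigned to AOIs and are listed in temporal order as (AOI, duration) pairs. The AOI names and the data layout are hypothetical and do not reflect the Tobii export format or the actual analysis pipeline.

from collections import defaultdict

def aoi_metrics(fixations):
    """Compute first fixation duration, total fixation duration, and fixation count per AOI."""
    first_fix = {}                    # duration of the first fixation landing in each AOI
    total_dur = defaultdict(float)    # summed fixation duration per AOI
    count = defaultdict(int)          # fixation count per AOI
    for aoi, dur in fixations:
        if aoi not in first_fix:
            first_fix[aoi] = dur
        total_dur[aoi] += dur
        count[aoi] += 1
    return first_fix, dict(total_dur), dict(count)

# Made-up fixation sequence (AOI, duration in ms) for a single trial
trial = [("nose", 180), ("eyes", 240), ("eyes", 300), ("mouth", 150), ("eyes", 210)]
print(aoi_metrics(trial))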

Results

Prior to running the ANOVAs, the data were checked for outliers and for violations of parametric assumptions such as normality and homogeneity of variances (sphericity). The sphericity assumption was tested with Mauchly's test; when it was violated, the Greenhouse–Geisser correction (Greenhouse and Geisser, 1959) was used to adjust the degrees of freedom when reporting F values. In those cases, degrees of freedom are reported with decimals; otherwise, they are reported as whole numbers as usual. A 2 (observer gender: female vs. male) × 3 (emotional face: happy vs. sad vs. neutral) × 2 (age of model: young vs. old) × 3 (part of face: eyes vs. nose vs. mouth) mixed ANOVA and a 2 (observer gender: female vs. male) × 3 (emotional face: happy vs. sad vs. neutral) × 2 (age of model: young vs. old) × 2 (hemifield of face: right vs. left) mixed ANOVA were employed. Post-hoc comparisons were run with Bonferroni corrections for significant main and interaction effects. SPSS 18.0 was used for the statistical analyses.
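For readers who want to reproduce the sphericity-correction step outside SPSS, the following simplified sketch runs a one-within-factor repeated-measures ANOVA with a Greenhouse–Geisser correction using the pingouin package on made-up data. The full 2 × 3 × 2 × 3 mixed designs reported here were analysed in SPSS 18.0, so this is only an illustration of the correction for the three-level emotion factor, not a re-implementation of the study's analysis.

import numpy as np
import pandas as pd
import pingouin as pg

# Simulated long-format data: 40 subjects, 3 within-subject emotion conditions
rng = np.random.default_rng(0)
subjects = np.repeat(np.arange(40), 3)
emotion = np.tile(["happy", "sad", "neutral"], 40)
fix_dur = rng.normal(loc=1.5, scale=0.3, size=120)   # made-up fixation durations (s)
df = pd.DataFrame({"subject": subjects, "emotion": emotion, "fix_dur": fix_dur})

# correction=True reports Greenhouse-Geisser corrected p-values alongside the sphericity test
aov = pg.rm_anova(data=df, dv="fix_dur", within="emotion", subject="subject",
                  correction=True, detailed=True)
print(aov)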

Findings for First Fixation Location and Duration on Part of Face

The first fixation area was the nose, yet the duration of the first fixations differed across regions (see Figure 3).
The ANOVA revealed that the main effect of emotional face expression (F(2, 76) = 13.66, p < .001, η² = .26) and the interaction effects of age of model and observer gender (F(2, 76) = 6.87, p = .002, η² = .15), age of model and part of face (F(2, 76) = 6.20, p = .003, η² = .14), emotional face expression and part of face (F(3.02, 114.69) = 16.56, p < .001, η² = .30, GG ε = .76), and emotional face expression, part of face, and observer gender (F(3.02, 114.69) = 2.72, p = .047, η² = .07, GG ε = .75) were significant. Means and standard deviations are shown in Table 2.
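As an arithmetic illustration of the correction (assuming the reported ε = .76 is a rounded value of roughly .7545), the Greenhouse–Geisser adjustment multiplies both nominal degrees of freedom by ε. For the emotional expression × part of face interaction, the nominal df are (3 − 1)(3 − 1) = 4 and 4 × 38 = 152, which yields the fractional values reported above:

df_effect ≈ 0.7545 × 4 ≈ 3.02
df_error  ≈ 0.7545 × 152 ≈ 114.69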

Findings for Fixation Duration on the Part of Face

In the mixed ANOVA, the main effects of emotional face expression (F(2, 76) = 19.62, p < .001, η² = .34) and part of face (F(1.468, 55.779) = 9.63, p < .001, η² = .20, GG ε = .73) were significant for fixation duration, while none of the other main effects reached statistical significance. The interaction effects of age of model and part of face (F(2, 76) = 8.20, p < .001, η² = .18), emotional face expression and part of face (F(4, 152) = 22.85, p < .001, η² = .38), and the three-way interaction among age of model, emotional face expression, and part of face (F(4, 152) = 5.93, p < .001, η² = .14) were significant. None of the other interaction effects were statistically significant. Means and standard deviations are shown in Table 2, and the results of the post-hoc comparisons for fixation duration are summarized in Appendix A.

Findings for Fixation Count on the Part of Face

In the mixed ANOVA for fixation count, the main effects of age of model (F(1, 38) = 5.27, p < .027, ηp² = .122), emotional face expression (F(2, 76) = 7.57, p < .001, η² = .166), and part of face (F(1.25, 47.64) = 13.05, p < .001, η² = .256, GG ε = .63) were significant, while the main effect of observer gender was not. The interaction effect of emotional expression and part of face (F(4, 152) = 11.99, p < .001, η² = .24) and the three-way interaction among age of model, emotional face expression, and part of face (F(4, 152) = 6.61, p < .001, η² = .15) were significant. None of the other interaction effects were significant. Means and standard deviations are shown in Table 3, and the results of the post-hoc comparisons for fixation count are summarized in Appendix A.

Findings of Fixation Duration on the Laterality

In the mixed ANOVA for fixation duration, the main effects of age of model (F(1, 38) = 12.90, p < .001, η² = .25) and hemifield of face (F(1, 38) = 17.73, p < .001, η² = .32) were significant. None of the other main effects were significant. According to the results, participants fixated their gaze on the young faces for a longer time than on the old faces, and the left side of the face was fixated for a longer time than the right side. Means and standard deviations are shown in Table 4, and the results of the post-hoc comparisons for fixation duration are summarized in Appendix B.

Findings of Fixation Count on the Laterality

In the mixed ANOVA for fixation count, the main effects of age of model (F(1, 38) = 6.86, p < .013, η² = .153), emotional face expression (F(2, 78) = 3.22, p < .046, η² = .08), and hemifield of face (F(1, 38) = 26.12, p < .001, η² = .41) were significant, while the main effect of observer gender was not. The interaction effect of observer gender and hemifield of face (F(1, 38) = 4.89, p < .033, η² = .11) and the interaction effect of age of model and hemifield of face (F(1, 38) = 8.98, p < .005, η² = .19) were significant (see Figure 4 for the interaction graph). Lastly, the three-way interaction among age of model, emotional face expression, and hemifield of face (F(2, 76) = 4.98, p < .009, η² = .12) was significant; the interaction graph is presented in Figure 5.
The means and standard deviations for the laterality analyses are given in Table 4, and the results of the post-hoc comparisons for fixation count are summarized in Appendix B.

Discussion

In this study, emotional facial expressions affected visual screening patterns in all conditions. This finding is consistent with the literature (see Bradley et al., 2003; Carstensen and Mikels, 2005; Kensinger and Corkin, 2004). Young participants looked at the sad faces longer and more frequently, followed by the happy and the neutral faces. These findings parallel those of Fox and Damjanovic (2006) and Horstmann and Bauland (2006), as well as the emotional memory enhancement (EME) effect (Bradley et al., 2003; Carstensen and Mikels, 2005). As a well-known phenomenon, the EME effect posits that emotional stimuli are better recalled and recognized than neutral ones because emotional stimuli lead to arousal, which in turn enhances attention and memory (Kensinger and Corkin, 2004). According to this effect, older people generally tend to remember positive stimuli, whereas young people tend to remember negative stimuli. This encoding selectivity shapes emotion regulation such that young adults remember more negative aspects of events (Thapar and Rouder, 2009). The findings of the study indicated that observer gender did not have a significant main effect on fixation duration or fixation count. This finding is partly consistent with previous findings (Calvo and Lundqvist, 2008; Grimshaw et al., 2004; Rahman, Wilson and Abrahams, 2004). Perhaps other variables, such as the manner of task administration, are more important than the observer's gender in explaining the inconsistencies reported in the literature.
In addition, the part-of-face variable had an effect on fixation duration and fixation count. Participants looked at the eyes for a longer time and more frequently than at the other areas. As this study shows, when the participants were asked to passively look at the facial expressions, they focused on the eyes to determine sadness and happiness, whereas for neutral expressions they looked at the eyes and the nose. The longest-fixated part of the face was the eyes for both the young and the old models. This finding is consistent with Eisenbarth and Alpers' (2011) finding that the eyes were important areas in determining sadness and the mouth in determining happiness. Studies that used the Bubbles technique for face recognition, such as Vinette, Gosselin and Schyns (2004) and Humphreys, Gosselin, Schyns and Johnson (2006), also reported similar findings, namely that the eyes are the most diagnostic regions for face recognition. Vinette, Gosselin and Schyns (2004) went further to suggest that the eyes are a rich source of information about a person's identity, state of mind, and intentions. Another finding of this study was that in neutral faces the participants looked at the models' eyes and nose for longer periods than at the mouth. For all of the emotional expressions (sad, happy, and neutral), the eyes received the most fixations.
In both the young and the old models, the eye tracking metrics (first fixation duration and fixation duration) showed a varying distribution across the parts of the face. Participants fixated their gaze for a longer time on the eyes of the model than on its mouth and nose, in both the young and the old faces. These findings partly support Calvo and Nummenmaa's (2009) findings. Additionally, when the young participants were asked simply to look passively at young neutral faces, their eye movement patterns were directed towards the eyes (that is, “the neutral eyes”), whereas for the other emotional expressions their eye movement patterns showed a tendency towards the mouth and nose (that is, “the sad and smiling mouth”). When the young participants were asked to look passively at sad, happy, and neutral old faces, their eye movement patterns were directed towards the eyes (that is, “the sad, smiling, and neutral eyes”). Consequently, the eyes were the main predictors of passive visual screening of emotional facial expressions.
In this study, the position of the first fixation was found to be in the nose area. However, the duration of the first fixation on the facial parts varied according to the observer's gender and the age of the model. Arizpe, Kravitz, Yovel and Baker (2012) suggested that the starting position of eye movements strongly influences visual fixation patterns during face processing. Given that we presented a stimulus (a black plus sign) for one second at the center of the screen to fixate participants' gaze after each photo, it is possible that participants simply looked at the center of the upcoming face photo, which corresponds to the nose. Therefore, we cannot conclude that the nose is the starting position when looking at a face for emotional face recognition. Although we asked the observers to look at the photos with no guiding instructions, other factors, such as the plus sign presented before each photo or the contrast and brightness of the photos, might be responsible for this effect and should be explored in more depth in future studies.
According to the first fixation duration findings, for both young and old models with happy and sad emotional expressions, the first fixation was longer on the mouth, whereas it was longer on the nose for neutral expressions. The mouth appears to be important in the first and basic phase of emotional processing, whereas the eyes appear important for predicting higher-level emotional processes. The importance of the mouth, on the other hand, might be explained by an evolutionary approach.
The age of the model had an effect on fixation duration and fixation count. According to these results, participants looked at the young models more often and for a longer time than at the older ones. We attribute this finding to the physiological basis of “ageism”.
These findings suggest that some parts of the face, especially the eyes, provide more important cues for determining and analyzing emotional faces. The visual information in the left visual field (the left side of the face) is mainly processed by the right hemisphere, and the visual information in the right visual field (the right side of the face) is mainly processed by the left hemisphere. Considering the effects of laterality on fixation duration and fixation count, the left side of the face was fixated more often and for a longer time than the right side. Participants looked at the left side of all the emotional (happy, sad, neutral) young and old faces more often and for a longer time than at the right side. These findings are in line with the hemispheric lateralization hypothesis of emotional face processing. In parallel with our results, Vinette et al. (2004), who also used the Bubbles technique, found that the eye on the left half of the stimuli was used more effectively and more rapidly for face recognition than the right half. As they suggested, “the right hemisphere of the brain processes faces more efficiently than the left”. Yet it should be noted that this should not be taken as evidence of a hemispheric difference in emotion/facial-expression processing. More research is needed to explain this tendency and to explore whether it could be taken as such evidence.
Males looked at the right side of the face longer than females did, whereas the gender difference was not significant for the left side of the face. Furthermore, gender differences were found in the fixation counts on both the right and the left sides of the face: females fixated their gaze more frequently than males. Fixation counts were higher on the left side than on the right side, especially for faces expressing happy and sad emotions. The findings of this study are consistent with past research showing that females, as compared to males, are more sensitive to emotional faces than to neutral ones and are superior at recognizing emotional expressions (Calvo and Lundqvist, 2008; Calvo and Nummenmaa, 2009; Palermo and Coltheart, 2004). In addition, the finding that females looked longer and more frequently at the emotional faces indicates that females attend to informative cues more than males. Thus, it can be speculated that during passive looking females are more sensitive than males to emotional faces.
To sum up, in support of the hemispheric lateralization hypothesis, the present study confirms that the eyes and mouth are particularly important parts of the face when reading emotional expressions. It further extends our knowledge by showing that the scan paths of young observers differ across different emotional facial expressions.

Appendix A

Summary Table of Post Hoc Comparisons for Fixation Duration and Fixation Count (according to Part of Face)

Appendix B

Summary Table of Post Hoc Comparisons for Fixation Duration and Fixation Count (according to Laterality)

References

  1. Arizpe, J., D.J. Kravitz, G. Yovel, and C.I. Baker. 2012. Start Position Strongly Influences Fixation Patterns during Face Processing: Difficulties with Eye Movements as a Measure of Information Use. PLoS ONE 7, 2: e31106. [Google Scholar] [CrossRef]
  2. Bradley, M. M., D. Sabatinelli, P. J. Lang, J. R. Fitzsimmons, W. A. King, and P. Desai. 2003. Activation of the visual cortex in motivated attention. Behavioral Neuroscience 117, 2: 369–380. [Google Scholar] [CrossRef] [PubMed]
  3. Bourne, V.J., and A.M. Maxwell. 2010. Examining the sex difference in lateralization for processing facial emotion: Does biological sex or psychological gender identity matter? Neuropsychologia 48: 1289–1294. [Google Scholar] [CrossRef]
  4. Calvo, M. G., P. Avero, and D. Lundqvist. 2006. Facilitated detection of angry faces: Initial orienting and processing efficiency. Cognition and Emotion 20: 785–811. [Google Scholar] [CrossRef]
  5. Calvo, M. G., and D. Lundqvist. 2008. Facial expressions of emotion (KDEF): Identification under different display-duration conditions. Behavior Research Methods 40: 109–115. [Google Scholar] [CrossRef]
  6. Calvo, M. G., and L. Nummenmaa. 2009. Eye-movement assessment of the time course in facial expression recognition: Neurophysiological implications. Cognitive, Affective, & Behavioral Neuroscience 9, 4: 398–411. [Google Scholar] [CrossRef]
  7. Carstensen, L. L., and J. A. Mikels. 2005. At the intersection of emotion and cognition: Aging and the positivity effect. Current Directions in Psychological Science 14, 3: 117–121. [Google Scholar] [CrossRef]
  8. Eisenbarth, H., and G.W. Alpers. 2011. Happy Mouth and Sad Eyes: Scanning Emotional Facial Expressions. Emotion 11, 4: 860–865. [Google Scholar] [CrossRef] [PubMed]
  9. Fox, E., and L. Damjanovic. 2006. The eyes are sufficient to produce a threat superiority effect. Emotion 6: 534–539. [Google Scholar] [CrossRef]
  10. Fusar-Poli, P., A. Placentino, F. Carletti, P. Allen, P. Landi, M. Abbamonte, F. Barale, J. Perez, P. McGuire, and P.L. Politi. 2009. Laterality effect on emotional faces processing: ALE meta-analysis of evidence. Neuroscience Letters 452: 262–267. [Google Scholar] [CrossRef]
  11. Goos, L. M., and I. Silverman. 2002. Sex related factors in the perception of threatening facial expressions. Journal of Nonverbal Behavior 26: 27–41. [Google Scholar] [CrossRef]
  12. Gosselin, F., and P. G. Schyns. 2001. Bubbles: A technique to reveal the use of information in recognition tasks. Vision Research 41, 17: 2261–2271. [Google Scholar] [CrossRef]
  13. Greenhouse, S.W., and S. Geisser. 1959. On methods in the analysis of profile data. Psychometrika 24: 95–112. [Google Scholar] [CrossRef]
  14. Grimshaw, G. M., M. B. Bulman-Fleming, and C. Ngo. 2004. A signal-detection analysis of sex differences in the perception of emotional faces. Brain and Cognition 54: 248–250. [Google Scholar] [CrossRef]
  15. Hall, J. A., and D. Matsumoto. 2004. Gender differences in judgments of multiple emotions from facial expressions. Emotion 4: 201–206. [Google Scholar] [CrossRef]
  16. Hampson, E., S.M. van Anders, and L.I. Mullin. 2006. A female advantage in the recognition of emotional facial expressions: test of an evolutionary hypothesis. Evolution and Human Behavior 27: 401–416. [Google Scholar] [CrossRef]
  17. Horstmann, G., and A. Bauland. 2006. Search asymmetries with real faces: Testing the anger-superiority effect. Emotion 6, 2: 193–207. [Google Scholar] [CrossRef]
  18. Humphreys, K., F. Gosselin, P. G. Schyns, and M. H. Johnson. 2006. Using “Bubbles” with babies: A new technique for investigating the informational basis of infant perception. Infant Behavior and Development 29, 3: 471–475. [Google Scholar] [CrossRef]
  19. Jansari, A., P. Rodway, and S. Goncalves. 2011. Identifying facial emotions: Valence specific effects and an exploration of the effects of viewer gender. Brain and Cognition 76: 415–423. [Google Scholar] [CrossRef]
  20. Juth, P., D. Lundqvist, A. Karlsson, and A. Ohman. 2005. Looking for foes and friends: Perceptual and emotional factors when finding a face in the crowd. Emotion 5: 379–395. [Google Scholar] [CrossRef]
  21. Kensinger, E. A., and S. Corkin. 2004. Two routes to emotional memory: Distinct neural processes for valence and arousal. Proceedings of the National Academy of Sciences of the United States of America 101, 9: 3310–3315. [Google Scholar] [CrossRef]
  22. Kirouac, G., and F. Y. Doré. 1985. Accuracy of the judgement of facial expression of emotions as a function of sex and level of education. Journal of Nonverbal Behavior 91: 3–7. [Google Scholar] [CrossRef]
  23. Minear, M., and D.C. Park. 2004. A lifespan database of adult facial stimuli. Behavior Research Methods, Instruments, & Computers 36: 630–633. [Google Scholar]
  24. Nasr, S., and R.B. Tootell. 2012. Role of fusiform and anterior temporal cortical areas in facial recognition. Neuroimage 63, 3: 1743–53. [Google Scholar] [CrossRef]
  25. Palermo, R., and M. Coltheart. 2004. Photographs of facial expression: Accuracy, response times, and ratings of intensity. Behavior Research Methods, Instruments, & Computers 36: 634–638. [Google Scholar]
  26. Rahman, Q., G. D. Wilson, and S. Abrahams. 2004. Sex, sexual orientation, and identification of positive and negative facial affect. Brain and Cognition 54: 179–185. [Google Scholar] [CrossRef]
  27. Smith, M. L., G. W. Cottrell, F. Gosselin, and P. G. Schyns. 2005. Transmitting and decoding facial expressions. Psychological Science 16, 3: 184–189. [Google Scholar] [CrossRef]
  28. Thapar, A., and J. N. Rouder. 2009. Aging and recognition memory for emotional words: A bias account. Psychonomic Bulletin & Review 16, 4: 699–704. [Google Scholar]
  29. Thayer, J. F., and B. H. Johnsen. 2000. Sex differences in judgement of facial affect: A multivariate analysis of recognition errors. Scandinavian Journal of Psychology 41: 243–246. [Google Scholar] [CrossRef]
  30. Van Belle, G., P. de Graef, K. Verfaillie, B. Rossion, and P. Lefèvre. 2010. Face inversion impairs holistic perception: Evidence from gaze-contingent stimulation. Journal of Vision 10, 5: 10. [Google Scholar] [CrossRef]
  31. Vassallo, S., S.L. Cooper, and J.M. Douglas. 2009. Visual scanning in the recognition of facial affect: Is there an observer sex difference? Journal of Vision 9, 3: 11, 1–10. [Google Scholar] [CrossRef] [PubMed]
  32. Vinette, C., F. Gosselin, and P. G. Schyns. 2004. Spatiotemporal dynamics of face recognition in a flash: It’s in the eyes. Cognitive Science 28, 2: 289–301. [Google Scholar]
  33. Williams, M. A., S. A. Moss, J. L. Bradshaw, and J. B. Mattingley. 2005. Look at me, I’m smiling: Visual search for threatening and nonthreatening facial expressions. Visual Cognition 12: 29–50. [Google Scholar] [CrossRef]
  34. Wong, B., A. Cronin-Golomb, and S. Neargarder. 2005. Patterns of visual scanning as predictors of emotion identification in normal aging. Neuropsychology 19, 6: 739. [Google Scholar] [CrossRef]
Figure 1. Schematic representation of the test procedure.
Figure 2. Area of Interest (AOI).
Figure 3. The position of the first fixation.
Figure 4. Age of Model x Hemifield interaction effect for fixation count measurement (Bars represent Standard Errors).
Figure 5. Age of Model x Valence x Part of Face interaction effect for fixation count measurement (Bars represent Standard Errors).
Table 1. Means, Standard Deviations, and Percent Rates for Pictures.
Table 2. Means and Standard Deviations for First Fixation Duration.
Table 3. Means and Standard Deviations for Fixation Duration and Fixation Count.
Table 4. Means and Standard Deviations for Laterality.
