Journal of Eye Movement Research is published by MDPI from Volume 18 Issue 1 (2025). Previous articles were published by another publisher in Open Access under a CC-BY (or CC-BY-NC-ND) licence, and they are hosted by MDPI on mdpi.com as a courtesy and upon agreement with Bern Open Publishing (BOP).

J. Eye Mov. Res., Volume 17, Issue 3 (February 2024) – 6 articles

  • Issues are regarded as officially published after their release is announced to the table of contents alert mailing list.
  • You may sign up for e-mail alerts to receive table of contents of newly released issues.
  • PDF is the official format for papers published in both HTML and PDF forms. To view a paper in PDF format, click the "PDF Full-text" link and open it with the free Adobe Reader.
22 pages, 1992 KiB  
Article
Effect of Action Video Games in Eye Movement Behavior: A Systematic Review
by Anna Montolio-Vila, Marc Argilés, Bernat Sunyer-Grau, Lluïsa Quevedo and Graham Erickson
J. Eye Mov. Res. 2024, 17(3), 1-22; https://doi.org/10.16910/jemr.17.3.6 - 25 Sep 2024
Cited by 2
Abstract
Previous research shows that playing action video games seems to modify eye movement behavior, such as eye fixations and saccades. The aim of the current work was to determine the effect of playing action video games on eye movement behavior, including fixations, saccades and pursuits. A systematic review of the PubMed and Scopus databases was conducted to identify articles published between 2010 and 2022 that referred to action video games and eye movements, including fixations, saccades and pursuits. We included experimental and quasi-experimental studies comparing at least two groups of action vs. non-action video game players. All the included studies used an eye tracker to study eye movements. A total of 97 scientific articles were found in the databases. After applying the inclusion criteria, thirteen articles (N = 13) were analyzed for the present work, of which ten (n = 10) had a cross-sectional design and three (n = 3) were randomized intervention studies. Based on the literature analyzed, playing regularly or training with action video games is not likely to produce changes in eye movements. For future research, more interventional studies, with less gender bias, larger samples, and a general consensus on the distinction between action and non-action video games, are needed.

10 pages, 1134 KiB  
Article
The Level of Skills Involved in an Observation-Based Gait Analysis
by Shuzo Bonkohara
J. Eye Mov. Res. 2024, 17(3), 1-10; https://doi.org/10.16910/jemr.17.3.1 - 25 Sep 2024
Cited by 1
Abstract
This study aimed to determine the visual assessment skills used during an observation-based gait analysis. Participants (N = 40) included 20 physiotherapists (PTs) with >10 years of clinical experience and 20 physiotherapy students. Both groups watched a video of the gait of a subject with Guillain–Barré syndrome before and after being provided with information regarding other movements. Visual lines were measured using an EMR-8 eye mark recorder, and the results were compared between the two groups. The average gaze duration was longer for students than for PTs (F(1, 79) = 53.3; p < 0.01), whereas PTs gazed more often than the students (F(1, 79) = 87.6; p < 0.01). Furthermore, the PTs moved their eyes vertically more often than the students (F(1, 151) = 9.1; p < 0.01). We found that the ability to discriminate the relative positions of body locations through frequent, rapid vertical gazes could serve as an index of visual assessment skill in an observation-based gait analysis.
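Gaze metrics like those compared above (gaze count and average gaze duration) can be derived directly from fixation intervals. A minimal sketch, assuming a hypothetical data layout of (start_ms, end_ms) fixation records rather than the EMR-8's actual output format:

```python
from statistics import mean

def gaze_metrics(fixations):
    """Compute gaze count and average gaze duration (ms) from a list of
    (start_ms, end_ms) fixation intervals (hypothetical data layout)."""
    durations = [end - start for start, end in fixations]
    return {"count": len(durations), "mean_duration_ms": mean(durations)}

# Hypothetical fixation intervals for one observer
fixations = [(0, 250), (300, 520), (600, 1000)]
print(gaze_metrics(fixations))  # {'count': 3, 'mean_duration_ms': 290}
```

A skilled observer, per the abstract, would show a higher count with a lower mean duration than a novice.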

28 pages, 4349 KiB  
Article
The Observer’s Lens: The Impact of Personality Traits and Gaze on Facial Impression Inferences
by Kuangzhe Xu and Toshihiko Matsuka
J. Eye Mov. Res. 2024, 17(3), 1-28; https://doi.org/10.16910/jemr.17.3.5 - 19 Aug 2024
Cited by 3
Abstract
Previous studies on facial impression inference have focused on the physical features of faces, with only a few considering the effects of the observer. This study explored how participants’ personality traits directly and indirectly affect the impression inference of human faces. Specifically, we examined how observers’ personality traits impact their eye movements, which in turn influence impression inferences. Experiment 1 found relationships between participants’ personality traits and eye movements, but these did not significantly impact impression inferences. In Experiment 2, we manipulated observers’ observational behavior to control for the potential interactive effect between facial features and participants’ eye movements during impression inference. This manipulation suggested that focusing on different areas of faces leads to different impression inferences. It also suggests that the same person might form different impressions of the exact same face when their observational behavior changes. These results deepen our understanding of the impact of facial features and participants’ personality traits on impression inferences, indicating that observers’ personality traits and observational behavior play a significant role in impression formation.

21 pages, 7200 KiB  
Article
Classification Framework to Identify Similar Visual Scan Paths Using Multiple Similarity Metrics
by Ricardo Palma Fraga, Ziho Kang and Jerry M. Crutchfield
J. Eye Mov. Res. 2024, 17(3), 1-21; https://doi.org/10.16910/jemr.17.3.4 - 9 Aug 2024
Cited by 2
Abstract
Analyzing visual scan paths, the time-ordered sequence of eye fixations and saccades, can help us understand how operators visually search the environment before making a decision. To analyze and compare visual scan paths, prior studies have used metrics such as string edit similarity, which considers the order used to inspect areas of interest (AOIs), as well as metrics that consider the AOIs shared between visual scan paths. However, to identify similar visual scan paths, particularly in tasks and environments in which operators may apply variations of a common underlying visual scanning behavior, a single similarity metric might not be sufficient. In this study, we introduce a classification framework using a combination of the string edit algorithm and the Jaccard coefficient similarity. We applied our framework to the visual scan paths of nine tower controllers in a high-fidelity simulator when a “clear-to-take-off” clearance was issued. The classification framework provided richer and more meaningful classifications of the visual scan paths than either the string edit algorithm or the Jaccard coefficient similarity alone.
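The combination the abstract describes can be sketched as follows. The AOI labels, equal weighting, and decision threshold are illustrative assumptions, not the authors' actual framework parameters:

```python
def levenshtein(a, b):
    """Edit distance between two AOI-label sequences (string edit metric)."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1, cur[-1] + 1, prev[j - 1] + (ca != cb)))
        prev = cur
    return prev[-1]

def string_edit_similarity(a, b):
    """Order-sensitive similarity in [0, 1]: 1 - normalized edit distance."""
    longest = max(len(a), len(b)) or 1
    return 1 - levenshtein(a, b) / longest

def jaccard_similarity(a, b):
    """Order-insensitive overlap of the sets of AOIs visited."""
    sa, sb = set(a), set(b)
    return len(sa & sb) / len(sa | sb) if sa | sb else 1.0

def same_strategy(a, b, w=0.5, threshold=0.6):
    """Classify two scan paths as similar when a weighted combination of
    both metrics exceeds a threshold (weights/threshold are illustrative)."""
    score = w * string_edit_similarity(a, b) + (1 - w) * jaccard_similarity(a, b)
    return score >= threshold

# Scan paths as AOI-label sequences (labels are hypothetical)
p1 = ["runway", "strip", "runway", "radar"]
p2 = ["runway", "strip", "radar", "runway"]
print(same_strategy(p1, p2))  # True: same AOIs visited, similar order
```

Combining an order-sensitive and an order-insensitive metric lets the classifier tolerate small reorderings within a shared scanning strategy while still separating paths that visit different AOIs.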

22 pages, 44857 KiB  
Article
Quantifying Dwell Time With Location-based Augmented Reality: Dynamic AOI Analysis on Mobile Eye Tracking Data With Vision Transformer
by Julien Mercier, Olivier Ertz and Erwan Bocher
J. Eye Mov. Res. 2024, 17(3), 1-22; https://doi.org/10.16910/jemr.17.3.3 - 29 Apr 2024
Cited by 3
Abstract
Mobile eye tracking captures egocentric vision and is well-suited for naturalistic studies. However, its data is noisy, especially when acquired outdoors with multiple participants over several sessions. Area of interest analysis on moving targets is difficult because (A) the camera and objects move nonlinearly and may disappear from and reappear in the scene; and (B) off-the-shelf analysis tools are limited to linearly moving objects. As a result, researchers resort to time-consuming manual annotation, which limits the use of mobile eye tracking in naturalistic studies. We introduce a method based on a fine-tuned Vision Transformer (ViT) model for classifying frames with overlaid gaze markers. After fine-tuning a model for three epochs on a manually labelled training set comprising 1.98% (7845 frames) of our entire data, our model reached 99.34% accuracy as evaluated on hold-out data. We used the method to quantify participants’ dwell time on a tablet during the outdoor user test of a mobile augmented reality application for biodiversity education. We discuss the benefits and limitations of our approach and its potential to be applied to other contexts.
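Once each scene-camera frame is classified as gaze-on-target or gaze-off-target, dwell time follows from the frame rate. A minimal sketch of this aggregation step, with an assumed frame rate and made-up labels (the paper's actual pipeline and parameters may differ):

```python
def total_dwell_time(labels, fps=30):
    """Total dwell time in seconds from per-frame binary labels
    (1 = gaze on target, 0 = off target), given the scene-camera fps."""
    return sum(labels) / fps

def dwell_episodes(labels):
    """Group consecutive on-target frames into dwell episodes,
    returning each episode's length in frames."""
    episodes, run = [], 0
    for lab in labels:
        if lab:
            run += 1
        elif run:
            episodes.append(run)
            run = 0
    if run:
        episodes.append(run)
    return episodes

# Hypothetical per-frame classifier output for a short clip
labels = [0, 1, 1, 1, 0, 0, 1, 1, 0]
print(dwell_episodes(labels))        # [3, 2]
print(total_dwell_time(labels, 30))  # 5 on-target frames / 30 fps
```

This is why per-frame classification accuracy matters: every misclassified frame shifts the dwell-time estimate by 1/fps seconds.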

11 pages, 2396 KiB  
Article
Dynamics of Eye Dominance Behavior in Virtual Reality
by Franziska Prummer, Ludwig Sidenmark and Hans Gellersen
J. Eye Mov. Res. 2024, 17(3), 1-11; https://doi.org/10.16910/jemr.17.3.2 - 28 Feb 2024
Cited by 3
Abstract
Prior research has shown that sighting eye dominance is a dynamic behavior that depends on horizontal viewing angle. Virtual reality (VR) offers high flexibility and control for studying eye movement and human behavior, yet eye dominance has not been given significant attention within this domain. In this work, we replicate Khan and Crawford’s (2001) original study in VR to confirm their findings within this specific context. Additionally, this study extends its scope to examine alignment with objects presented at greater depth in the visual field. Our results align with previous findings and remain consistent when targets are presented at greater distances in the virtual scene. Using greater target distances presents opportunities to investigate alignment with objects at varying depths, providing greater flexibility for the design of methods that infer eye dominance from interaction in VR.
