Journal of Eye Movement Research is published by MDPI from Volume 18 Issue 1 (2025). Previous articles were published by another publisher in Open Access under a CC-BY (or CC-BY-NC-ND) licence, and they are hosted by MDPI on mdpi.com as a courtesy and upon agreement with Bern Open Publishing (BOP).

J. Eye Mov. Res., Volume 12, Issue 7 (November 2019) – 12 articles

  • Issues are regarded as officially published after their release is announced to the table of contents alert mailing list.
  • You may sign up for e-mail alerts to receive the table of contents of newly released issues.
  • Papers are published in both HTML and PDF form; the PDF is the official version. To view a paper in PDF format, click its "PDF Full-text" link and open the file with the free Adobe Reader.
1 page, 93 KiB  
Article
Eyes Wide Shut: Gaze Dynamics Without Vision, Symposium 6 at the 20th European Conference on Eye Movement Research (ECEM) in Alicante, 21.8.2019
by Susana Martinez-Conde, Bradley Buchsbaum, Fatema Ghasia, Freek van Ede and Stephen L. Macknik
J. Eye Mov. Res. 2019, 12(7), 1; https://doi.org/10.16910/jemr.12.7.12 - 25 Nov 2019
Viewed by 61
Abstract
The human ability for visualization extends far beyond the physical items that surround us. We are able to dismiss the constant influx of photons hitting our retinas, and instead picture the layout of our kindergarten classroom, envision the gently swaying palm trees of our dream vacation, or foresee the face of a yet-to-be-born child. As we inspect imaginary objects and people with our mind’s eye, our corporeal eyeballs latch onto the fantasy. Research has found that our eyes can move as if seeing, even when there is nothing to look at. Thus, gaze explorations in the absence of actual vision have been reported in many contexts, including in visualization and memory tasks, and perhaps even during REM sleep. This symposium will present the manifold aspects of gaze dynamics in conditions when the visual input is impoverished or altogether absent. Presentations will address the characteristics of large and small eye movements during imagined and remembered scenes, the impact of visual field deficits on oculomotor control, and the role of eye movements in the future development of neural prosthetics for the blind. Full article
2 pages, 95 KiB  
Article
Eye and Head Movements While Looking at Rotated Scenes in VR. Session “Beyond the Screen’s Edge” at the 20th European Conference on Eye Movement Research (ECEM) in Alicante, 19.8.2019
by Nicola C. Anderson and Walter F. Bischof
J. Eye Mov. Res. 2019, 12(7), 1-2; https://doi.org/10.16910/jemr.12.7.11 - 25 Nov 2019
Cited by 3 | Viewed by 43
Abstract
We examined the extent to which image shape (square vs. circle), image rotation, and image content (landscapes vs. fractal images) influenced eye and head movements. Both the eyes and head were tracked while observers looked at natural scenes in a virtual reality (VR) environment. In line with previous work, we found a horizontal bias in saccade directions, but this was affected by both the image shape and its content. Interestingly, when viewing landscapes (but not fractals), observers rotated their head in line with the image rotation, presumably to make saccades in cardinal, rather than oblique, directions. We discuss our findings in relation to current theories on eye movement control, and how insights from VR might inform traditional eye-tracking studies. Part 2: Observers looked at panoramic, 360-degree scenes using VR goggles while eye and head movements were tracked. Fixations were determined using IDT (Salvucci & Goldberg, 2000) adapted to a spherical coordinate system. We then analyzed (a) the spatial distribution of fixations and the distribution of saccade directions, (b) the spatial distribution of head positions and the distribution of head movements, and (c) the relation between gaze and head movements. We found that, for landscape scenes, gaze and head best fit the allocentric frame defined by the scene horizon, especially when taking head tilt (i.e., head rotation around the view axis) into account. For fractal scenes, which are isotropic on average, the bias toward a body-centric frame is weak for gaze and strong for the head. Furthermore, our data show that eye and head movements are closely linked in space and time in stereotypical ways, with volitional eye movements predominantly leading the head. We discuss our results in terms of models of visual exploratory behavior in panoramic scenes, both in virtual and real environments. Full article
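The fixation-detection step named in the abstract above (dispersion-threshold identification, IDT, adapted to spherical coordinates) can be sketched as follows. This is a minimal illustration only: the function name, parameter values, and the choice of maximum pairwise great-circle distance as the spherical dispersion measure are assumptions for the sketch, not the authors' implementation.

```python
import math

def idt_fixations(samples, max_dispersion_deg=1.0, min_duration=0.1):
    """Minimal I-DT sketch (after Salvucci & Goldberg, 2000) for spherical gaze data.

    samples: list of (time_s, azimuth_deg, elevation_deg) tuples, time-ordered.
    Dispersion within a window is taken as the maximum pairwise angular
    (great-circle) distance between gaze directions -- one possible
    adaptation of the planar dispersion measure to a sphere.
    Returns a list of (start_time, end_time) fixation intervals.
    """
    def angular_dist(a, b):
        # Great-circle distance between two gaze directions, in degrees.
        az1, el1 = map(math.radians, a)
        az2, el2 = map(math.radians, b)
        c = (math.sin(el1) * math.sin(el2)
             + math.cos(el1) * math.cos(el2) * math.cos(az1 - az2))
        return math.degrees(math.acos(max(-1.0, min(1.0, c))))

    def dispersion(window):
        # O(n^2) pairwise maximum; fine for a sketch, not for production.
        pts = [(az, el) for _, az, el in window]
        return max((angular_dist(p, q) for p in pts for q in pts), default=0.0)

    fixations = []
    i, n = 0, len(samples)
    while i < n:
        # Grow an initial window spanning at least the minimum duration.
        j = i
        while j < n and samples[j][0] - samples[i][0] < min_duration:
            j += 1
        if j >= n:
            break
        window = samples[i:j + 1]
        if dispersion(window) <= max_dispersion_deg:
            # Extend the window while dispersion stays under threshold.
            while j + 1 < n:
                window.append(samples[j + 1])
                if dispersion(window) > max_dispersion_deg:
                    window.pop()
                    break
                j += 1
            fixations.append((samples[i][0], samples[j][0]))
            i = j + 1
        else:
            # No fixation starting here; slide the window start forward.
            i += 1
    return fixations
```

With synthetic data (a steady gaze at one direction, then a 20-degree jump to another), the sketch returns two fixation intervals separated at the jump.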
2 pages, 94 KiB  
Article
Eye Movements in Developing Readers: From Basic Research to Classroom Application. Parts of Symposium 7 at the 20th European Conference on Eye Movements in Alicante, August 21, 2019
by Alexandra Spichtig, Christian Vorstius, Ronan Reilly and Jochen Laubrock
J. Eye Mov. Res. 2019, 12(7), 1-2; https://doi.org/10.16910/jemr.12.7.10 - 25 Nov 2019
Cited by 1 | Viewed by 52
Abstract
Eye-movement recording has made it possible to achieve a detailed understanding of oculomotor and cognitive behavior during reading and of changes in this behavior across the stages of reading development. Given that many students struggle to attain even basic reading skills, a logical extension of eye-movement research involves its applications in both the diagnostic and instructional areas of reading education. The focus of this symposium is on eye-movement research with potential implications for reading education. Christian Vorstius will review results from a large-scale longitudinal study that examined the development of spatial parameters in fixation patterns within three cohorts, ranging from elementary to early middle school, discussing an early developmental window and its potential influences on reading ability and orthography. Ronan Reilly and Xi Fan will present longitudinal data related to developmental changes in reading-related eye movements in Chinese. Their findings are indicative of increasing sensitivity to lexical predictability and sentence coherence. The authors suggest that delays in the emergence of these reading behaviors may serve as an early signal of increased risk of reading difficulty. Jochen Laubrock’s presentation will focus on perceptual span development and explore dimensions of this phenomenon with potential educational implications, such as the modulation of perceptual span in relation to cognitive load, as well as preview effects during oral and silent reading—and while reading comic books. Full article
2 pages, 96 KiB  
Article
Eye Movements During the Reading of Narrative and Poetic Texts. Symposium 6 at the 20th European Conference on Eye Movement Research (ECEM) in Alicante, 21.8.2019
by Arthur Jacobs, Jana Lüdtke, Johanna Kaakinen and Lynn S. Eekhof
J. Eye Mov. Res. 2019, 12(7), 1-2; https://doi.org/10.16910/jemr.12.7.9 - 25 Nov 2019
Viewed by 40
Abstract
Despite a wealth of studies using eye tracking to investigate mental processes during vision or reading, the investigation of oculomotor activity during natural reading of longer texts (be it newspaper articles, narratives or poetry) is still an exception in this field (as evidenced by the program of ECEM 2017 in Wuppertal). Following up on our symposium at ECEM 2017, here we bring together eye movement research on natural text reading to report recent progress in a coordinated way, sharing data, experiences and software skills in this highly complex subfield. More specifically, in this symposium we will address several challenges faced by an eye-tracking perspective on the reading of longer texts, which involve a surplus of intervening variables and novel methods to analyze the data. In particular, the following issues will be addressed: Which text-analytical and statistical methods are best suited to deal with the myriad of surface and affective-semantic features potentially influencing eye movements during the reading of ‘natural’ texts? What are the pros and cons of using machine-learning-assisted predictive modeling as an alternative to the standard GLM/LMM frameworks? Which kinds of theoretical models can deal with the level of complexity offered by reading longer natural texts? Full article
2 pages, 39 KiB  
Article
From Lab-Based Studies to Eye-Tracking in Virtual and Real Worlds: Conceptual and Methodological Problems and Solutions. Symposium 4 at the 20th European Conference on Eye Movement Research (Ecem) in Alicante, 20.8.2019
by Ignace T. C. Hooge, Roy S. Hessels, Diederick C. Niehorster, Gabriel J. Diaz, Andrew T. Duchowski and Jeff B. Pelz
J. Eye Mov. Res. 2019, 12(7), 1-2; https://doi.org/10.16910/jemr.12.7.8 - 25 Nov 2019
Cited by 5 | Viewed by 71
Abstract
Wearable mobile eye trackers have great potential as they allow the measurement of eye movements during daily activities such as driving, navigating the world and doing groceries. Although mobile eye trackers have been around for some time, developing and operating these eye trackers was generally a highly technical affair. As such, mobile eye-tracking research was not feasible for most labs. Nowadays, many mobile eye trackers are available from eye-tracking manufacturers (e.g., Tobii, Pupil Labs, SMI, Ergoneers) and various implementations in virtual/augmented reality have recently been released. The wide availability has caused the number of publications using a mobile eye tracker to increase quickly. Mobile eye tracking is now applied in vision science, educational science, developmental psychology, marketing research (using virtual and real supermarkets), clinical psychology, usability, architecture, medicine, and more. Yet, transitioning from lab-based studies where eye trackers are fixed to the world to studies where eye trackers are fixed to the head presents researchers with a number of problems. These problems range from the conceptual frameworks used in world-fixed and head-fixed eye tracking and how they relate to each other, to the lack of data-quality comparisons and field tests of the different mobile eye trackers, and how the gaze signal can be classified or mapped to the visual stimulus. Such problems need to be addressed in order to understand how world-fixed and head-fixed eye-tracking research can be compared and to understand the full potential and limits of what mobile eye tracking can deliver. In this symposium, we bring together presenting researchers from five different institutions (Lund University, Utrecht University, Clemson University, Birkbeck University of London and Rochester Institute of Technology) addressing problems and innovative solutions across the entire breadth of mobile eye-tracking research. Hooge, presenting Hessels et al.’s paper, focuses on the definitions of fixations and saccades held by researchers in the eye-movement field and argues how they need to be clarified in order to allow comparisons between world-fixed and head-fixed eye-tracking research. Diaz et al. introduce machine-learning techniques for classifying the gaze signal in mobile eye-tracking contexts where head and body are unrestrained. Niehorster et al. compare data quality of mobile eye trackers during natural behavior and discuss the application range of these eye trackers. Duchowski et al. introduce a method for automatically mapping gaze to faces using computer vision techniques. Pelz et al. employ state-of-the-art techniques to map fixations to objects of interest in the scene video and align grasp and eye-movement data in the same reference frame to investigate the guidance of eye movements during manual interaction. Full article
1 page, 104 KiB  
Article
The Effects of Fixational Eye Movements on Population Responses in V1. Keynote at the 20th European Conference on Eye Movements in Alicante, August 22, 2019
by Hamutal Slovin
J. Eye Mov. Res. 2019, 12(7), 1; https://doi.org/10.16910/jemr.12.7.7 - 25 Nov 2019
Viewed by 48
Abstract
During visual fixation, the eyes make small and fast movements known as microsaccades (MSs). The effects of MSs on neural activity in the visual cortex are not well understood. Utilizing voltage-sensitive dye imaging, we imaged the spatiotemporal patterns of neuronal responses induced by MSs in early visual cortices of behaving monkeys. Our results reveal a continuous “visual instability” during fixation: while the visual stimulus moves over the retina with each MS, the neuronal activity in V1 ‘hops’ within the retinotopic map, as dictated by the MS parameters. Neuronal modulations induced by MSs are characterized by neural suppression followed by neural enhancement and increased synchronization. The suppressed activity may underlie the suppressed perception during MSs whereas the late enhancement may facilitate the processing of new incoming image information. Moreover, the instability induced by MSs applies also to neural correlates of visual perception processes such as figure-ground (FG) segregation, which appear to develop faster after fixational saccades. Full article
1 page, 98 KiB  
Article
On the 'Where' and 'When' of Eye Guidance in Real-World Scenes
by Antje Nuthmann
J. Eye Mov. Res. 2019, 12(7), 1; https://doi.org/10.16910/jemr.12.7.6 - 25 Nov 2019
Viewed by 39
Abstract
Keynote at the 20th European Conference on Eye Movement Research (ECEM) in Alicante, 22.8.2019 [...] Full article
1 page, 46 KiB  
Article
Unraveling Concussion Impact by Impact
by Janet C. Rucker
J. Eye Mov. Res. 2019, 12(7), 1; https://doi.org/10.16910/jemr.12.7.5 - 25 Nov 2019
Viewed by 45
Abstract
Keynote at the 20th European Conference on Eye Movement Research (ECEM) in Alicante, 20.8.2019 [...] Full article
1 page, 46 KiB  
Article
The Fate and Function of Vision During Saccades
by Martin Rolfs
J. Eye Mov. Res. 2019, 12(7), 1; https://doi.org/10.16910/jemr.12.7.4 - 25 Nov 2019
Viewed by 59
Abstract
Keynote at the 20th European Conference on Eye Movement Research (ECEM) in Alicante, 20.8.2019 [...] Full article
1 page, 48 KiB  
Article
Hearing in a World of Light: Why, Where, and How Visual and Auditory Information Are Connected by the Brain
by Jennifer M. Groh
J. Eye Mov. Res. 2019, 12(7), 1; https://doi.org/10.16910/jemr.12.7.3 - 25 Nov 2019
Cited by 1 | Viewed by 63
Abstract
Information about eye movements with respect to the head is required for reconciling visual and auditory space. This keynote presentation describes recent findings concerning how eye movements affect early auditory processing via motor processes in the ear (eye movement-related eardrum oscillations, or EMREOs). Computational efforts to understand how eye movements are factored into auditory processing to produce a reference frame aligned with visual space uncovered a second critical issue: sound location is not mapped but is instead rate (meter) coded in the primate brain, unlike visual space. Meter coding would appear to limit the representation of multiple simultaneous sounds. The second part of this presentation concerns how such a meter code could use fluctuating activity patterns to circumvent this limitation. Full article
1 page, 47 KiB  
Article
Beyond the Lab? Eye Tracking in Dynamic Real-World Environments
by Enkelejda Kasneci
J. Eye Mov. Res. 2019, 12(7), 1; https://doi.org/10.16910/jemr.12.7.2 - 25 Nov 2019
Cited by 1 | Viewed by 53
Abstract
Keynote by Enkelejda Kasneci at the 20th European Conference on Eye Movement Research (ECEM) in Alicante, 18.8.2019 [...] Full article
1 page, 34 KiB  
Proceeding Paper
Abstracts of the 20th European Conference on Eye Movements, 18–22 August 2019, in Alicante (Spain)
by Susana Martinez-Conde, Luis Martinez-Otero, Albert Compte and Rudolf Groner
J. Eye Mov. Res. 2019, 12(7), 1; https://doi.org/10.16910/jemr.12.7.1 - 25 Nov 2019
Cited by 2 | Viewed by 64
Abstract
This document contains all abstracts of the 20th European Conference on Eye Movements, August 18–22, 2019, in Alicante, Spain. Full article