Journal of Eye Movement Research is published by MDPI from Volume 18 Issue 1 (2025). Previous articles were published by another publisher in Open Access under a CC-BY (or CC-BY-NC-ND) licence, and they are hosted by MDPI on mdpi.com as a courtesy and upon agreement with Bern Open Publishing (BOP).

J. Eye Mov. Res., Volume 10, Issue 3 (June 2017) – 6 articles

  • Issues are regarded as officially published after their release is announced to the table of contents alert mailing list.
  • You may sign up for e-mail alerts to receive table of contents of newly released issues.
  • PDF is the official format for papers, which are published in both HTML and PDF forms. To view a paper in PDF format, click the "PDF Full-text" link and open it with the free Adobe Reader.
10 pages, 737 KiB  
Article
Interference Between Smooth Pursuit and Color Working Memory
by Shulin Yue, Zhenlan Jin, Chenggui Fan, Qian Zhang and Ling Li
J. Eye Mov. Res. 2017, 10(3), 1-10; https://doi.org/10.16910/jemr.10.3.6 - 10 Jul 2017
Cited by 6 | Viewed by 49
Abstract
Spatial working memory (WM) and spatial attention are closely related, but the relationship between non-spatial WM and spatial attention remains unclear. The present study investigated the interaction between color WM and smooth pursuit eye movements. A modified delayed-match-to-sample (DMS) paradigm was applied, with 2 or 4 items presented in each visual field. Subjects memorized the colors of the items in the cued visual field and smoothly moved their eyes towards or away from the memorized items during the retention interval, even though the colored items were no longer visible. WM performance generally decreased with higher load. More importantly, WM performance was better when subjects pursued towards rather than away from the cued visual field. Meanwhile, pursuit gain decreased with higher load and was higher when pursuing away from the cued visual field. These results indicate that directing spatial attention to the memorized items benefits color WM. We therefore propose that color WM and smooth pursuit eye movements compete for attentional resources.
11 pages, 904 KiB  
Article
Sampling Rate Influences Saccade Detection in Mobile Eye Tracking of a Reading Task
by Alexander Leube, Katharina Rifai and Siegfried Wahl
J. Eye Mov. Res. 2017, 10(3), 1-11; https://doi.org/10.16910/jemr.10.3.3 - 7 Jun 2017
Cited by 52 | Viewed by 97
Abstract
The purpose of this study was to compare saccade detection characteristics between two mobile eye trackers with different sampling rates in a natural task. Gaze data of 11 participants performing a reading task were recorded with a 60 Hz and a 120 Hz mobile eye tracker and compared directly to the saccades detected by a 1000 Hz stationary tracker. Saccades and fixations were detected using a velocity-based algorithm, and their properties were analyzed. Results showed no significant difference in the number of detected fixations, but mean fixation durations differed between the 60 Hz mobile and the stationary eye tracker. The 120 Hz mobile eye tracker showed a significantly higher saccade detection rate and an improved estimate of mean saccade duration compared to the 60 Hz eye tracker. In conclusion, for the detection and analysis of fast eye movements such as saccades, a 120 Hz mobile eye tracker is preferable.
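The abstract names only a "velocity-based algorithm" for event detection; the sketch below shows one common member of that family, a velocity-threshold (I-VT) classifier. The function name, the 30 deg/s default threshold, and the synthetic data are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

def detect_saccades_ivt(x, y, fs, vel_threshold=30.0):
    """Velocity-threshold (I-VT) saccade detection.

    x, y: gaze position traces in degrees of visual angle
    fs: sampling rate in Hz
    vel_threshold: velocity threshold in deg/s (30 deg/s is a common default)
    Returns a list of (start, end) sample-index pairs, end exclusive.
    """
    # Sample-to-sample angular velocity in deg/s
    vel = np.hypot(np.diff(x), np.diff(y)) * fs
    is_saccade = vel > vel_threshold
    # Collect contiguous runs of above-threshold samples as saccades
    saccades, start = [], None
    for i, s in enumerate(is_saccade):
        if s and start is None:
            start = i
        elif not s and start is not None:
            saccades.append((start, i))
            start = None
    if start is not None:
        saccades.append((start, len(is_saccade)))
    return saccades
```

At a 60 Hz sampling rate each velocity sample spans ~16.7 ms, so short saccades can fall between samples entirely, which is consistent with the lower detection rate the study reports for the 60 Hz tracker.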

13 pages, 723 KiB  
Article
Developing Clinically Practical Transcranial Direct Current Stimulation Protocols to Improve Saccadic Eye Movement Control
by Po Ling Chen and Liana Machado
J. Eye Mov. Res. 2017, 10(3), 1-13; https://doi.org/10.16910/jemr.10.3.5 - 5 Jun 2017
Cited by 6 | Viewed by 65
Abstract
Recent research indicates that anodal transcranial direct current stimulation (tDCS) applied over the frontal eye field (FEF) can improve saccadic eye movement control in healthy young adults. The current research set out to determine whether similar results can be produced using a clinically practical protocol, whether tDCS applied over the dorsolateral prefrontal cortex (DLPFC) might also afford benefits, and whether benefits extend to older adults. Twenty young and ten older adults completed two active (FEF and DLPFC) stimulation sessions and one sham session. To aid clinical translation, the electrodes were positioned using simple measurements only. Saccadic performance following anodal tDCS over the FEF or DLPFC did not differ from the sham condition in either age group. Additionally, saccadic performance contralateral to the active electrodes showed no evidence of benefits over ipsilateral performance. These results call into question whether the protocol can be applied effectively using only simple measurements to localize the relevant frontal subregion. Future efforts to develop a clinically practical tDCS protocol for improving saccadic eye movement control should include a sham control condition and consider adjusting the electrode montage and current strength to optimize the chances of conferring benefits in the population under study.

15 pages, 1085 KiB  
Article
Eye-Tracking Analysis of Interactive 3D Geovisualization
by Lukas Herman, Stanislav Popelka and Vendula Hejlova
J. Eye Mov. Res. 2017, 10(3), 1-15; https://doi.org/10.16910/jemr.10.3.2 - 31 May 2017
Cited by 25 | Viewed by 71
Abstract
This paper describes a new tool for analysing eye-tracking data recorded over interactive 3D models. The tool makes such analyses easier than the time-consuming, frame-by-frame inspection of captured screen recordings with superimposed scanpaths. The main function of the tool, called 3DgazeR, is to calculate 3D coordinates (X, Y, Z coordinates in the 3D scene) for individual points of view. These 3D coordinates can be calculated from the position and orientation of the virtual camera and the 2D coordinates of the gaze on the screen. The functionality of 3DgazeR is introduced in a case study using Digital Elevation Models as stimuli. The purpose of the case study was to verify the functionality of the tool and to identify the most suitable visualization methods for geographic 3D models. Five selected methods are presented in the results section of the paper. Most of the output was created in a Geographic Information System. 3DgazeR works with SMI eye trackers and with the low-cost EyeTribe tracker connected to the open-source application OGAMA, and it can compute 3D coordinates from both raw data and fixations.
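The abstract describes computing scene-space (X, Y, Z) gaze coordinates from the virtual camera's position and orientation plus the 2D on-screen gaze point. A minimal way to do that is to unproject the gaze point into a view ray and intersect it with the scene geometry; the sketch below uses a flat ground plane (z = 0) as a stand-in for a Digital Elevation Model. All names and parameters are illustrative assumptions, not taken from 3DgazeR.

```python
import numpy as np

def gaze_ray_to_plane(cam_pos, cam_forward, cam_up, fov_y_deg, aspect, gaze_ndc):
    """Cast a gaze ray from a virtual camera and intersect it with the plane z = 0.

    gaze_ndc: on-screen gaze point in normalized device coordinates, each in [-1, 1].
    Returns the 3D intersection point, or None if the ray misses the plane.
    """
    # Build an orthonormal camera basis (right-handed)
    forward = cam_forward / np.linalg.norm(cam_forward)
    right = np.cross(forward, cam_up)
    right /= np.linalg.norm(right)
    up = np.cross(right, forward)
    # Map the gaze point through the camera frustum into a world-space ray
    half_h = np.tan(np.radians(fov_y_deg) / 2.0)
    half_w = half_h * aspect
    gx, gy = gaze_ndc
    ray = forward + gx * half_w * right + gy * half_h * up
    ray /= np.linalg.norm(ray)
    # Ray-plane intersection with the ground plane z = 0
    if abs(ray[2]) < 1e-9:
        return None  # ray parallel to the plane
    t = -cam_pos[2] / ray[2]
    if t <= 0:
        return None  # plane lies behind the camera
    return cam_pos + t * ray
```

For a real elevation model, the single plane intersection would be replaced by marching the ray against the terrain mesh or height field.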

19 pages, 13511 KiB  
Article
An Inverse-Linear Logistic Model of the Main Sequence
by Andrew T. Duchowski, Krzysztof Krejtz, Cezary Biele, Anna Niedzielska, Peter Kiefer, Ioannis Giannopoulos, Nina Gehrer and Michael Schönenberg
J. Eye Mov. Res. 2017, 10(3), 1-19; https://doi.org/10.16910/jemr.10.3.4 - 29 May 2017
Cited by 3 | Viewed by 52
Abstract
A model of the main sequence is proposed based on the logistic function. The model’s fit to the peak velocity-amplitude relation resembles an S curve, simultaneously allowing control of the curve’s asymptotes at very small and very large amplitudes, as well as its slope over the mid-amplitude range. The proposed inverse-linear logistic model is also able to express the linear relation between duration and amplitude. We demonstrate the utility and robustness of the model when fit to aggregate data in the small- and mid-amplitude ranges, namely when fitting microsaccades, saccades, and a superposition of both. We are confident the model will extend suitably to the large-amplitude range of eye movements.
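The abstract does not give the model's exact parameterization, so the sketch below only illustrates the two properties it describes: an S-shaped (logistic) peak velocity-amplitude curve with controllable asymptotes and mid-range slope, and a linear duration-amplitude relation. Function names and parameter values are illustrative assumptions, not the paper's inverse-linear logistic formulation.

```python
import numpy as np

def peak_velocity(amplitude, v_max, mu, s):
    """S-shaped main sequence: logistic in log-amplitude.

    v_max sets the saturation asymptote at large amplitudes,
    mu the log-amplitude of the inflection, s the mid-range slope.
    """
    return v_max / (1.0 + np.exp(-(np.log(amplitude) - mu) / s))

def duration(amplitude, d0, slope):
    """Linear duration-amplitude relation of the main sequence."""
    return d0 + slope * amplitude

# Evaluate the S curve across microsaccade-to-saccade amplitudes (degrees)
amps = np.array([0.1, 1.0, 10.0, 50.0])
v = peak_velocity(amps, v_max=500.0, mu=1.0, s=0.8)
```

A logistic in log-amplitude captures the near-linear rise for small saccades and the velocity saturation for large ones, which a single power law cannot do simultaneously.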

9 pages, 1938 KiB  
Article
Ways of Improving the Precision of Eye Tracking Data: Controlling the Influence of Dirt and Dust on Pupil Detection
by Wolfgang Fuhl, Thomas C. Kübler, Dennis Hospach, Oliver Bringmann, Wolfgang Rosenstiel and Enkelejda Kasneci
J. Eye Mov. Res. 2017, 10(3), 1-9; https://doi.org/10.16910/jemr.10.3.1 - 12 May 2017
Cited by 7 | Viewed by 43
Abstract
Eye-tracking technology has to date been employed primarily in research. With recent advances in affordable video-based devices, the implementation of gaze-aware smartphones, and marketable driver monitoring systems, a considerable step towards pervasive eye-tracking has been made. However, several new challenges arise with the use of eye-tracking in the wild and will need to be tackled to increase the acceptance of this technology. The main challenge remains the use of eye-tracking together with eyeglasses, which, combined with reflections under changing illumination conditions, can make a subject "untrackable". If we really want to bring the technology to the consumer, we cannot simply exclude 30% of the population as potential users just because they wear eyeglasses, nor can we make them clean their glasses and the device regularly. Instead, pupil detection algorithms need to be made robust to potential sources of noise. We hypothesize that the amount of dust and dirt on the eyeglasses and on the eye-tracker camera has a significant influence on the performance of currently available pupil detection algorithms. Therefore, in this work, we present a systematic study of the effect of dust and dirt on pupil detection by simulating various quantities of dirt and dust on eyeglasses. Our results show (1) an overall high robustness to dust in an off-focus layer, (2) the vulnerability of edge-based methods to even small in-focus dust particles, and (3) a trade-off between tolerated particle size and particle amount, where a small number of rather large particles had only a minor performance impact.
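The study evaluates pupil detection on eye images with simulated dust and dirt. The abstract does not describe the simulation pipeline itself, so the sketch below shows one simple way to overlay in-focus dust-like particles on a grayscale eye image; the function name, opacity model, and defaults are all assumptions.

```python
import numpy as np

def add_dust(image, n_particles, radius, rng=None, opacity=0.8):
    """Overlay dark circular 'dust' particles on a grayscale image.

    Each particle darkens the pixels inside a disk of the given radius
    by the given opacity, mimicking an in-focus dirt speck on the lens.
    """
    if rng is None:
        rng = np.random.default_rng()
    out = image.astype(float).copy()
    h, w = out.shape
    yy, xx = np.mgrid[0:h, 0:w]
    for _ in range(n_particles):
        # Random particle center inside the image
        cy, cx = rng.integers(0, h), rng.integers(0, w)
        mask = (yy - cy) ** 2 + (xx - cx) ** 2 <= radius ** 2
        out[mask] *= (1.0 - opacity)
    return out
```

Varying `n_particles` and `radius` reproduces the size/amount trade-off the study examines: many small specks versus a few large ones.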
