Journal of Eye Movement Research is published by MDPI from Volume 18 Issue 1 (2025). Previous articles were published by another publisher in Open Access under a CC-BY (or CC-BY-NC-ND) licence, and they are hosted by MDPI on mdpi.com as a courtesy and upon agreement with Bern Open Publishing (BOP).

J. Eye Mov. Res., Volume 9, Issue 4 (May 2016) – 6 articles

  • Issues are regarded as officially published after their release is announced to the table of contents alert mailing list.
  • You may sign up for e-mail alerts to receive the table of contents of newly released issues.
  • Papers are published in both HTML and PDF forms; the PDF is the official version. To view a paper in PDF format, click the "PDF Full-text" link and open it with the free Adobe Reader.
11 pages, 1120 KiB  
Article
Performance of a Simple Remote Video-Based Eye Tracker with GPU Acceleration
by Jean-Pierre Du Plessis and Pieter Blignaut
J. Eye Mov. Res. 2016, 9(4), 1-11; https://doi.org/10.16910/jemr.9.4.6 - 15 Aug 2016
Cited by 2 | Viewed by 49
Abstract
Eye tracking is a well-established tool that is often utilised in research. There are currently many different types of eye trackers available, but they are either expensive or provide a relatively low sampling frequency. The focus of this paper was to validate the performance and data quality of a simple remote video-based eye tracker that is capable of attaining higher frame rates than is normally possible with low-cost eye trackers. It utilises the Graphical Processing Unit (GPU) in an attempt to parallelise aspects of the process to localize feature points in eye images. Moreover, the proposed implementation allows for the system to be used on a variety of different GPUs. The developed solution is capable of sampling at frequencies of 200 Hz and higher, while allowing head movements within an area of 10 × 6 × 10 cm and achieving an average accuracy of one degree of visual angle. Full article
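The paper's GPU code is not reproduced in the abstract. As a rough illustration of the feature-localization step it parallelises, the pupil centre can be estimated as the centroid of dark pixels in the eye image; each pixel test is independent, which is what makes the step suited to a GPU. A minimal pure-Python sketch (the threshold value and toy image are assumptions, not taken from the paper):

```python
def pupil_centre(image, threshold=40):
    """Estimate the pupil centre as the centroid of dark pixels.

    image: 2-D list of grey levels (0-255); threshold: grey level
    below which a pixel is treated as pupil. Each pixel test is
    independent, so on a GPU all tests can run in parallel.
    """
    xs = ys = count = 0
    for y, row in enumerate(image):
        for x, value in enumerate(row):
            if value < threshold:
                xs += x
                ys += y
                count += 1
    if count == 0:
        return None
    return (xs / count, ys / count)

# A toy 5x5 "eye image" with a dark 2x2 pupil near the centre.
img = [[200] * 5 for _ in range(5)]
img[2][2] = img[2][3] = img[3][2] = img[3][3] = 10
print(pupil_centre(img))  # → (2.5, 2.5)
```

A real tracker would refine this estimate (e.g. ellipse fitting to the pupil contour), but the centroid step already captures the per-pixel parallelism the abstract refers to.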
13 pages, 8218 KiB  
Article
ScanGraph: A Novel Scanpath Comparison Method Using Visualisation of Graph Cliques
by Jitka Dolezalova and Stanislav Popelka
J. Eye Mov. Res. 2016, 9(4), 1-13; https://doi.org/10.16910/jemr.9.4.5 - 5 Aug 2016
Cited by 42 | Viewed by 77
Abstract
The article describes a new tool for analysing eye-movement data. Many different approaches to scanpath comparison exist. One of the most frequently used approaches is String Edit Distance, where gaze trajectories are replaced by sequences of visited Areas of Interest. In cartographic literature, the most commonly used software for scanpath comparison is eyePatterns. During an analysis of eyePatterns functionality, we found that the tree-graph visualisation of its results is not reliable. Thus, we decided to develop a new tool called ScanGraph. Its computational algorithms are modified to work better with sequences of different length. The output is visualised as a simple graph, and similar groups of sequences are displayed as cliques of this graph. This article describes ScanGraph’s functionality on a simple cartographic eye-tracking study example. Differences in the reading strategy of a simple map between cartographic experts and novices were investigated. The paper should aid researchers who would like to analyse the differences between groups of participants, and who would like to use our tool, available at www.eyetracking.upol.cz/scangraph. Full article
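The approach the abstract describes, String Edit Distance over AOI sequences followed by grouping similar participants into graph cliques, can be sketched in a few lines. This is an illustrative reconstruction, not ScanGraph's actual code: the similarity measure is plain normalised Levenshtein distance, and the clique search is brute force (adequate for the small participant counts typical of eye-tracking studies).

```python
from itertools import combinations

def levenshtein(a, b):
    """Classic string edit distance between two AOI sequences."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1,          # deletion
                           cur[j - 1] + 1,       # insertion
                           prev[j - 1] + (ca != cb)))  # substitution
        prev = cur
    return prev[-1]

def similarity(a, b):
    """Normalised similarity in [0, 1]; 1 means identical sequences."""
    longest = max(len(a), len(b)) or 1
    return 1 - levenshtein(a, b) / longest

def cliques_above(sequences, threshold):
    """Maximal groups whose pairwise similarity is >= threshold."""
    n = len(sequences)
    similar = {(i, j) for i, j in combinations(range(n), 2)
               if similarity(sequences[i], sequences[j]) >= threshold}
    groups = []
    for size in range(n, 1, -1):
        for group in combinations(range(n), size):
            if all((i, j) in similar for i, j in combinations(group, 2)):
                if not any(set(group) <= set(g) for g in groups):
                    groups.append(group)
    return groups

# Each string is one participant's sequence of visited AOIs.
scanpaths = ["ABCD", "ABCE", "ABDE", "XYZW"]
print(cliques_above(scanpaths, 0.7))  # → [(0, 1), (1, 2)]
```

Each output tuple indexes a group of participants whose scanpaths are mutually similar, which is the grouping ScanGraph visualises as cliques.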
15 pages, 283 KiB  
Article
Do Graph Readers Prefer the Graph Type Most Suited to a Given Task? Insights from Eye Tracking
by Benjamin Strobel, Steffani Saß, Marlit Annalena Lindner and Olaf Köller
J. Eye Mov. Res. 2016, 9(4), 1-15; https://doi.org/10.16910/jemr.9.4.4 - 14 Jul 2016
Cited by 20 | Viewed by 62
Abstract
Research on graph comprehension suggests that point differences are easier to read in bar graphs, while trends are easier to read in line graphs. But are graph readers able to detect and use the most suited graph type for a given task? In this study, we applied a dual representation paradigm and eye tracking methodology to determine graph readers’ preferential processing of bar and line graphs while solving both point difference and trend tasks. Data were analyzed using linear mixed-effects models. Results show that participants shifted their graph preference depending on the task type and refined their preference over the course of the graph task. Implications for future research are discussed. Full article
14 pages, 886 KiB  
Article
Pupil Size Affects Measures of Eye Position in Video Eye Tracking: Implications for Recording Vergence Accuracy
by Wolfgang Jaschinski
J. Eye Mov. Res. 2016, 9(4), 1-14; https://doi.org/10.16910/jemr.9.4.2 - 20 Jun 2016
Cited by 20 | Viewed by 65
Abstract
Video eye trackers rely on the position of the pupil centre. However, the pupil centre can shift when the pupil size changes. This pupillary artefact is investigated for binocular vergence accuracy (i.e., fixation disparity) in near vision, where the pupil is smaller in the binocular test phase than in the monocular calibration. A regression between recordings of pupil size and fixation disparity allows correcting the pupillary artefact. This corrected fixation disparity appeared to be favourable with respect to reliability and validity, i.e., the correlation of fixation disparity versus heterophoria. The findings provide a quantitative estimation of the pupillary artefact on measured eye position as a function of viewing distance and luminance, both for measures of monocular and binocular eye position. Full article
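The correction the abstract describes amounts to regressing the measured fixation disparity on pupil size and removing the pupil-dependent component. A minimal pure-Python sketch with ordinary least squares (the variable names and toy numbers are illustrative, not taken from the paper):

```python
def regress_out_pupil(pupil, disparity):
    """Remove the linear pupil-size component from fixation disparity.

    Fits disparity = a + b * pupil by ordinary least squares, then
    returns the disparities with the pupil-dependent term subtracted
    (evaluated relative to the mean pupil size), so that changes in
    pupil size no longer shift the measure.
    """
    n = len(pupil)
    mp = sum(pupil) / n
    md = sum(disparity) / n
    b = (sum((p - mp) * (d - md) for p, d in zip(pupil, disparity))
         / sum((p - mp) ** 2 for p in pupil))
    return [d - b * (p - mp) for p, d in zip(pupil, disparity)]

# Toy data: the true disparity is constant at 0.2 deg, but an
# artefact of 0.1 deg per mm of pupil change leaks into the measure.
pupil = [3.0, 3.5, 4.0, 4.5, 5.0]                 # pupil diameter, mm
measured = [0.2 + 0.1 * (p - 4.0) for p in pupil]  # contaminated signal
corrected = regress_out_pupil(pupil, measured)
print(corrected)  # all values close to 0.2
```

In the toy data the fitted slope recovers the artefact exactly, so the corrected values collapse back to the constant true disparity.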
6 pages, 517 KiB  
Article
A Simple(r) Tool for Examining Fixations
by Francesco Di Nocera, Claudio Capobianco and Simon Mastrangelo
J. Eye Mov. Res. 2016, 9(4), 1-6; https://doi.org/10.16910/jemr.9.4.3 - 17 Jun 2016
Cited by 5 | Viewed by 52
Abstract
This short paper describes an update of A Simple Tool For Examining Fixations (ASTEF) developed for facilitating the examination of eye-tracking data and for computing a spatial statistics algorithm that has been validated as a measure of mental workload (namely, the Nearest Neighbor Index: NNI). The code is based on Matlab® 2013a and is currently distributed on the web as an open-source project. This implementation of ASTEF got rid of many functionalities included in the previous version that are not needed anymore considering the large availability of commercial and open-source software solutions for eye-tracking. That makes it very easy to compute the NNI on eye-tracking data without the hassle of learning complicated tools. The software also features an export function for creating the time series of the NNI values computed on each minute of the recording. This feature is crucial given that the spatial distribution of fixations must be used to test hypotheses about the time course of mental load. Full article
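The Nearest Neighbor Index the tool computes compares the mean distance from each fixation to its nearest neighbour against the distance expected under a random spatial distribution, 0.5 * sqrt(area / n). The sketch below uses the standard Clark-Evans form of the index; ASTEF's exact Matlab implementation may differ in detail.

```python
from math import dist, sqrt

def nearest_neighbor_index(fixations, area):
    """Clark-Evans nearest neighbour index for fixation points.

    fixations: list of (x, y) tuples; area: size of the viewing
    region in the same units squared. NNI < 1 suggests clustered
    fixations, NNI > 1 a more dispersed (regular) distribution.
    """
    n = len(fixations)
    observed = sum(
        min(dist(p, q) for q in fixations if q is not p)
        for p in fixations
    ) / n
    expected = 0.5 * sqrt(area / n)
    return observed / expected

# Four fixations on the corners of a 1x1 screen: maximally dispersed.
points = [(0.0, 0.0), (0.0, 1.0), (1.0, 0.0), (1.0, 1.0)]
print(nearest_neighbor_index(points, area=1.0))  # → 4.0
```

Computing this index per minute of recording, as the export function described above does, yields a time series of spatial dispersion that can be related to mental load.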
11 pages, 1213 KiB  
Article
Recurrence Metrics for Assessing Eye Movements in Perceptual Experiments
by Susan P. Farnand, Preethi Vaidyanathan and Jeff B. Pelz
J. Eye Mov. Res. 2016, 9(4), 1-11; https://doi.org/10.16910/jemr.9.4.1 - 16 May 2016
Cited by 7 | Viewed by 78
Abstract
In a recent study evaluating the impact of image content on the consistency of eye movements in perceptual experiments (Farnand, 2013), image complexity was inversely related to scanpath consistency. This work involved a qualitative analysis of eye movements along with analysis of the number and duration of fixations. No quantitative analysis of scanpath consistency was performed. Recently, Anderson et al. developed a quantitative tool, Recurrence Quantification Analysis (RQA), for analyzing eye movement patterns (Anderson et al., 2013). In the present study, the RQA method was applied to the fixation data of Farnand (2013). The results suggest that RQA analysis provides relevant quantitative information regarding the attentional focus of observers viewing pictorial images and, as such, a potentially powerful tool for eye-tracking studies. Full article
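RQA as applied to fixations (following Anderson et al., 2013) marks which pairs of fixations fall within a given radius of each other; the recurrence rate is the proportion of such recurring pairs. A minimal pure-Python sketch of that basic metric (the radius and coordinates are illustrative, not values from the paper):

```python
from math import dist

def recurrence_rate(fixations, radius):
    """Percentage of fixation pairs (i < j) that recur, i.e. fall
    within `radius` of each other. This is the most basic RQA
    measure; others (determinism, laminarity, CORM) are derived
    from the same recurrence matrix.
    """
    n = len(fixations)
    recurring = sum(
        dist(fixations[i], fixations[j]) <= radius
        for i in range(n) for j in range(i + 1, n)
    )
    pairs = n * (n - 1) // 2
    return 100.0 * recurring / pairs

# A scanpath that returns once to a previously fixated region.
path = [(0.0, 0.0), (5.0, 5.0), (0.5, 0.5), (10.0, 10.0)]
print(recurrence_rate(path, radius=1.0))  # 1 of 6 pairs recurs
```

A high recurrence rate indicates that the observer repeatedly refixated the same regions, which is the quantitative handle on scanpath consistency that the earlier qualitative analysis lacked.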