Next Issue
Volume 14, August
Previous Issue
Volume 14, March
 
 
Journal of Eye Movement Research is published by MDPI from Volume 18 Issue 1 (2025). Previous articles were published by another publisher in Open Access under a CC-BY (or CC-BY-NC-ND) licence, and they are hosted by MDPI on mdpi.com as a courtesy and upon agreement with Bern Open Publishing (BOP).

J. Eye Mov. Res., Volume 14, Issue 3 (April 2021) – 5 articles

  • Issues are regarded as officially published after their release is announced to the table of contents alert mailing list.
  • You may sign up for e-mail alerts to receive the table of contents of newly released issues.
  • Papers are published in both HTML and PDF formats; the PDF is the official version. To view a paper in PDF, click the "PDF Full-text" link and open it with the free Adobe Reader.
19 pages, 15554 KiB  
Article
Determining Which Sine Wave Frequencies Correspond to Signal and Which Correspond to Noise in Eye-Tracking Time-Series
by Mehedi H. Raju, Lee Friedman, Troy M. Bouman and Oleg V. Komogortsev
J. Eye Mov. Res. 2021, 14(3), 1-19; https://doi.org/10.16910/jemr.14.3.5 - 31 Dec 2023
Cited by 4 | Viewed by 138
Abstract
The Fourier theorem states that any time-series can be decomposed into a set of sinusoidal frequencies, each with its own phase and amplitude. The literature suggests that some frequencies are important for reproducing key qualities of eye movements ("signal") and some frequencies are not ("noise"). To investigate what is signal and what is noise, we analyzed our dataset in three ways: (1) visual inspection of plots of saccade, microsaccade and smooth pursuit exemplars; (2) analysis of the percentage of variance accounted for (PVAF) in 1,033 unfiltered saccade trajectories by each frequency band; (3) analysis of the main sequence relationship between saccade peak velocity and amplitude, based on a power law fit. Visual inspection suggested that frequencies up to 75 Hz are required to represent microsaccades. Our PVAF analysis indicated that signals in the 0-25 Hz band account for nearly 100% of the variance in saccade trajectories. Power law coefficients (a, b) return to unfiltered levels for signals low-pass filtered at 75 Hz or higher. We conclude that to maintain eye-movement signal and reduce noise, a cutoff frequency of 75 Hz is appropriate. We explain why, given this finding, a minimum sampling rate of 750 Hz is suggested.
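The PVAF analysis described in the abstract can be sketched as: low-pass filter a trajectory at a candidate cutoff, then compute what percentage of the unfiltered signal's variance the filtered version accounts for. A minimal illustration with NumPy and SciPy (the synthetic trajectory, the 1000 Hz sampling rate, and the specific cutoffs are assumptions for illustration, not the authors' data or pipeline):

```python
import numpy as np
from scipy.signal import butter, filtfilt

FS = 1000  # assumed sampling rate in Hz

def pvaf(unfiltered, cutoff_hz, fs=FS):
    """Percentage of variance in the unfiltered trace accounted for
    by its low-pass-filtered version (zero-phase Butterworth)."""
    b, a = butter(4, cutoff_hz / (fs / 2), btype="low")
    filtered = filtfilt(b, a, unfiltered)
    residual = unfiltered - filtered
    return 100.0 * (1.0 - np.var(residual) / np.var(unfiltered))

# Synthetic saccade-like trace: a smooth 10-degree step plus
# high-frequency noise (invented for illustration).
t = np.arange(0, 0.2, 1 / FS)
rng = np.random.default_rng(0)
trace = 10.0 / (1.0 + np.exp(-(t - 0.1) * 200)) \
        + 0.05 * rng.standard_normal(t.size)

for cutoff in (25, 75, 250):
    print(f"{cutoff:>3} Hz cutoff: PVAF = {pvaf(trace, cutoff):.2f}%")
```

On a trace like this, PVAF climbs toward 100% well below the Nyquist frequency, which is the shape of the argument behind the 75 Hz cutoff.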

16 pages, 6403 KiB  
Article
Filtering Eye-Tracking Data From an EyeLink 1000: Comparing Heuristic, Savitzky-Golay, IIR and FIR Digital Filters
by Mehedi H. Raju, Lee Friedman, Troy M. Bouman and Oleg V. Komogortsev
J. Eye Mov. Res. 2021, 14(3), 1-16; https://doi.org/10.16910/jemr.14.3.6 - 19 Oct 2023
Cited by 5 | Viewed by 208
Abstract
In a prior report (Raju et al., 2023) we concluded that, if the goal was to preserve events such as saccades, microsaccades, and smooth pursuit in eye-tracking recordings, data with sine wave frequencies less than 75 Hz were the signal and data above 75 Hz were noise. Here, we compare five filters in their ability to preserve signal and remove noise. We compared the proprietary STD and EXTRA heuristic filters provided by our EyeLink 1000 (SR-Research, Ottawa, Canada), a Savitzky-Golay (SG) filter, an infinite impulse response (IIR) filter (low-pass Butterworth), and a finite impulse response (FIR) filter. For each of the non-heuristic filters, we systematically searched for optimal parameters. Both the IIR and the FIR filters were zero-phase filters. All filters were evaluated on 216 fixation segments (256 samples), from nine subjects. Mean frequency response profiles and amplitude spectra for all five filters are provided. Also, we examined the effect of our filters on a noisy recording. Our FIR filter had the sharpest roll-off of any filter. Therefore, it maintained the signal and removed noise more effectively than any other filter. On this basis, we recommend the use of our FIR filter. We also report on the effect of these filters on temporal autocorrelation.
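The IIR-versus-FIR comparison can be sketched with SciPy by designing both filter types and comparing their magnitude responses in the stopband (the filter orders, 101-tap length, 75 Hz cutoff, and 1000 Hz sampling rate below are illustrative assumptions, not the authors' optimized parameters):

```python
import numpy as np
from scipy.signal import butter, firwin, freqz

FS = 1000      # assumed sampling rate (Hz)
CUTOFF = 75    # low-pass cutoff (Hz), following the earlier report

# IIR: low-pass Butterworth (zero-phase when applied with filtfilt).
b_iir, a_iir = butter(4, CUTOFF / (FS / 2), btype="low")

# FIR: windowed-sinc design; linear phase by construction.
b_fir = firwin(numtaps=101, cutoff=CUTOFF, fs=FS)

# Compare magnitude responses at 150 Hz, twice the cutoff.
w, h_iir = freqz(b_iir, a_iir, worN=2048, fs=FS)
_, h_fir = freqz(b_fir, 1, worN=2048, fs=FS)
idx = np.argmin(np.abs(w - 150))

print(f"IIR magnitude at 150 Hz: {20 * np.log10(abs(h_iir[idx])):.1f} dB")
print(f"FIR magnitude at 150 Hz: {20 * np.log10(abs(h_fir[idx])):.1f} dB")
```

With these (assumed) parameters the FIR design attenuates more at twice the cutoff than the fourth-order Butterworth, which matches the abstract's point about roll-off sharpness.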

9 pages, 3893 KiB  
Article
Multimodality During Fixation—Part II: Evidence for Multimodality in Spatial Precision-Related Distributions and Impact on Precision Estimates
by Lee Friedman, Timothy Hanson and Oleg V. Komogortsev
J. Eye Mov. Res. 2021, 14(3), 1-9; https://doi.org/10.16910/jemr.14.3.4 - 28 Oct 2021
Cited by 1 | Viewed by 79
Abstract
This paper is a follow-on to our earlier paper (7), which focused on the multimodality of angular offsets. This paper applies the same analysis to the measurement of spatial precision. Following the literature, we refer to these measurements as estimates of device precision, but, in fact, subject characteristics clearly affect the measurements. One typical measure of the spatial precision of an eye-tracking device is the standard deviation (SD) of the position signals (horizontal and vertical) during a fixation. The SD is a highly interpretable measure of spread if the underlying error distribution is unimodal and normal. However, in the context of an underlying multimodal distribution, the SD is less interpretable. We will present evidence that the majority of such distributions are multimodal (68-70% strongly multimodal). Only 21-23% of position distributions were unimodal. We present an alternative method for measuring precision that is appropriate for both unimodal and multimodal distributions. This alternative method produces precision estimates that are substantially smaller than classic measures. We present illustrations of both unimodality and multimodality with either drift or a microsaccade present during fixation. At present, these observations apply only to the EyeLink 1000, and the subjects evaluated herein.
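A toy sketch of why multimodality inflates the classic SD-based precision estimate: when a fixation segment contains two position modes (say, because a microsaccade occurred partway through), the whole-segment SD mixes between-mode separation with within-mode spread. The two-mode synthetic signal and the simple midpoint split below are illustrative assumptions; the paper's actual alternative estimator is more involved:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic horizontal position during "fixation": two modes 0.5 deg
# apart, each with 0.05 deg within-mode spread (invented values).
mode_a = rng.normal(0.0, 0.05, 128)
mode_b = rng.normal(0.5, 0.05, 128)
positions = np.concatenate([mode_a, mode_b])

# Classic precision estimate: SD of the whole segment.
classic_sd = positions.std(ddof=1)

# Toy alternative: split at the grand mean and pool within-mode SDs.
threshold = positions.mean()
groups = [positions[positions < threshold], positions[positions >= threshold]]
within_sd = np.sqrt(sum(((g - g.mean()) ** 2).sum() for g in groups)
                    / (positions.size - len(groups)))

print(f"classic SD: {classic_sd:.3f} deg, within-mode SD: {within_sd:.3f} deg")
```

The whole-segment SD is dominated by the 0.5-degree mode separation, while the pooled within-mode SD stays near the true 0.05-degree spread, which is the direction of the effect the abstract reports.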

19 pages, 9095 KiB  
Article
Angular Offset Distributions During Fixation Are, More Often Than Not, Multimodal
by Lee Friedman, Dillon Lohr, Timothy Hanson and Oleg V. Komogortsev
J. Eye Mov. Res. 2021, 14(3), 1-19; https://doi.org/10.16910/jemr.14.3.2 - 3 Jun 2021
Cited by 4 | Viewed by 72
Abstract
Typically, the position error of an eye-tracking device is measured as the distance of the eye position from the target position in two-dimensional space (angular offset). Accuracy is the mean angular offset. The mean is a highly interpretable measure of central tendency if the underlying error distribution is unimodal and normal. However, in the context of an underlying multimodal distribution, the mean is less interpretable. We will present evidence that the majority of such distributions are multimodal. Only 14.7% of fixation angular offset distributions were unimodal, and of these, only 11.5% were normally distributed. (Of the entire dataset, 1.7% were unimodal and normal.) This multimodality is true even if there is only a single, continuous tracking fixation segment per trial. We present several approaches to measure accuracy in the face of multimodality. We also address the role of fixation drift in partially explaining multimodality.
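The definition in the abstract, accuracy as the mean angular offset, takes only a few lines (the gaze samples and target below are invented illustrative values, not data from the paper):

```python
import math

# Gaze samples and target, in degrees of visual angle (hypothetical).
gaze = [(0.1, -0.2), (0.15, -0.1), (0.05, -0.25), (0.2, -0.15)]
target = (0.0, 0.0)

# Angular offset per sample: Euclidean distance from gaze to target.
offsets = [math.hypot(x - target[0], y - target[1]) for x, y in gaze]

# Accuracy is the mean angular offset -- interpretable only if the
# offset distribution is unimodal, which is the paper's central point.
accuracy = sum(offsets) / len(offsets)
print(f"mean angular offset (accuracy): {accuracy:.3f} deg")
```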

21 pages, 5894 KiB  
Article
A Low-Cost, High-Performance Video-Based Binocular Eye Tracker for Psychophysical Research
by Daria Ivanchenko, Katharina Rifai, Ziad M. Hafed and Frank Schaeffel
J. Eye Mov. Res. 2021, 14(3), 1-21; https://doi.org/10.16910/jemr.14.3.3 - 5 May 2021
Cited by 16 | Viewed by 168
Abstract
We describe a high-performance, pupil-based binocular eye tracker that approaches the performance of a well-established commercial system, but at a fraction of the cost. The eye tracker is built from standard hardware components, and its software (written in Visual C++) can be easily implemented. Because of its fast and simple linear calibration scheme, the eye tracker performs best in the central 10 degrees of the visual field. The eye tracker possesses a number of useful features: (1) automated calibration simultaneously in both eyes while subjects fixate four fixation points sequentially on a computer screen, (2) automated real-time continuous analysis of measurement noise, (3) automated blink detection, and (4) real-time analysis of pupil centration artifacts. This last feature is critical because it is known that pupil diameter changes can be erroneously registered by pupil-based trackers as a change in eye position. We evaluated the performance of our system against that of a well-established commercial system using simultaneous measurements in 10 participants. We propose our low-cost eye tracker as a promising resource for studies of binocular eye movements.
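A four-point linear calibration of the kind the abstract mentions can be sketched as a least-squares affine fit from pupil-camera coordinates to screen coordinates. The pupil coordinates and screen targets below are invented, and the authors' exact calibration model may differ:

```python
import numpy as np

# Four on-screen fixation targets (degrees) and the mean pupil-center
# position recorded while fixating each one (camera pixels; invented).
screen = np.array([[-5.0, -5.0], [5.0, -5.0], [-5.0, 5.0], [5.0, 5.0]])
pupil = np.array([[310.0, 250.0], [370.0, 252.0],
                  [308.0, 205.0], [372.0, 207.0]])

# Affine model: screen = [pupil_x, pupil_y, 1] @ A, solved by least squares.
design = np.hstack([pupil, np.ones((4, 1))])
A, *_ = np.linalg.lstsq(design, screen, rcond=None)

def pupil_to_screen(px, py):
    """Map a pupil-center sample to calibrated screen coordinates (deg)."""
    return np.array([px, py, 1.0]) @ A

# The centroid of the pupil samples maps to the centroid of the targets.
print(pupil_to_screen(340.0, 228.5))
```

A fit this simple is fast enough to run during the calibration sequence itself, which is consistent with the abstract's point that a linear scheme trades wide-field accuracy for speed and simplicity in the central 10 degrees.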
