Journal of Eye Movement Research is published by MDPI from Volume 18 Issue 1 (2025). Previous articles were published by another publisher in Open Access under a CC-BY (or CC-BY-NC-ND) licence, and they are hosted by MDPI on mdpi.com as a courtesy and upon agreement with Bern Open Publishing (BOP).
Article

Sampling Rate Influences Saccade Detection in Mobile Eye Tracking of a Reading Task

by
Alexander Leube
,
Katharina Rifai
and
Siegfried Wahl
Institute for Ophthalmic Research, University of Tuebingen, Tuebingen, Germany
J. Eye Mov. Res. 2017, 10(3), 1-11; https://doi.org/10.16910/jemr.10.3.3
Submission received: 17 January 2017 / Published: 7 June 2017

Abstract

The purpose of this study was to compare saccade detection characteristics between two mobile eye trackers with different sampling rates in a natural task. Gaze data of 11 participants were recorded with one 60 Hz and one 120 Hz mobile eye tracker and compared directly to the saccades detected by a 1000 Hz stationary tracker while a reading task was performed. Saccades and fixations were detected using a velocity-based algorithm, and their properties were analyzed. Results showed no significant difference in the number of detected fixations, but mean fixation durations differed between the 60 Hz mobile and the stationary eye tracker. The 120 Hz mobile eye tracker showed a significantly higher saccade detection rate and an improved estimation of mean saccade duration compared to the 60 Hz eye tracker. In conclusion, for the detection and analysis of fast eye movements such as saccades, a 120 Hz mobile eye tracker is preferable.

Introduction

The investigation of eye movements using eye tracking technology provides a powerful tool for many disciplines. Besides its role in scientific and clinical tasks, eye tracking is widely used for examining visual attention in marketing studies (Lahey & Oxley, 2016; Oliveira et al., 2016; Wedel, 2013), adapting learning behavior in real-time situations (Rosch & Vogel-Walcutt, 2013), or enhancing the control modalities of computer games (Isokoski et al., 2009; Isokoski & Martin, 2006; Vickers et al., 2013). Saccadic eye movements and their statistics are of particular interest.
They are, for instance, used to investigate eye movements during reading, scene perception, and visual search tasks; see Rayner (2009) for a review. Eye movement abnormalities such as corrective saccades in a smooth pursuit task were shown to supplement the clinical diagnosis of schizophrenia (Benson et al., 2012; Sereno & Holzman, 1995; Shulgovskiy et al., 2015) and can be linked to cognitive deficits in word processing in schizophrenia patients (Fernández et al., 2016). Furthermore, saccadic eye movements can serve as objective indicators in screening mental health (Vidal et al., 2012), for instance for dyslexia (Biscaldi et al., 1998; Eden et al., 1994; Nilsson Benfatto et al., 2016) or autism (Kemner et al., 1998; Rosenhall et al., 2007). This clearly shows the scientific and clinical importance of detecting the characteristics of saccades accurately. Such eye movement tests are often conducted under controlled conditions with high-accuracy eye trackers that require head stabilization and presentation of stimuli on a fixed display. This, however, is unlike normal visual perception, and it is therefore important to move to more day-to-day tasks. Eye movements in such tasks can only be measured with mobile eye trackers, and it is unclear how well these can measure and detect saccadic eye movements.
In the analysis of eye tracking data, the algorithm used to detect events such as fixations, blinks, and saccades is the crucial factor. Algorithms can be classified into three main categories based on their threshold criteria: dispersion-, velocity-, or acceleration-based (A. Duchowski, 2007; Nyström & Holmqvist, 2010; Salvucci & Goldberg, 2000), or combinations of these criteria. Velocity-threshold identification is the fastest algorithm (no backtracking is required, as in dispersion algorithms); it differentiates fixations and saccades by their point-by-point velocities and requires only one parameter to be specified, the velocity threshold (Salvucci & Goldberg, 2000). When using velocity-based algorithms to analyze eye tracking data, the sampling rate of the eye tracking signal becomes the limiting factor (Juhola & Pyykko, 1987). During saccades, the eye moves very fast, and at low sampling rates, insufficient samples of these fast movements may be available for correct detection. Because of their velocity characteristics, saccades can be classified as “outliers” in the velocity profile (Engbert & Kliegl, 2003; Inhoff et al., 2010; Liversedge & Findlay, 2000) and serve as a robust criterion in analyzing eye tracking data.
According to the Nyquist theorem, an eye tracker with a higher sampling rate detects saccades of shorter duration than a lower-sampled eye tracker: the minimum detectable saccade duration corresponds to twice the inter-sample interval. Thus, an increase of the sampling frequency from 60 Hz to 120 Hz is specifically expected to increase the detection rate of saccades whose durations lie in the range between approximately 16 and 33 ms. The main sequence of saccades shows a linear relationship between saccade amplitude and duration (Bahill et al., 1975; Engbert & Kliegl, 2003). In reading, which is a common activity and highly important in modern day-to-day life, saccade distributions contain a high number of small saccades (Rayner, 1998), which would not be detected if they fall into the interval between 16 and 33 ms. Saccadic behavior in reading tasks is a well-studied and well-explained characteristic of human eye movements. The task-specific saccade distributions directly impact the detection rate of eye trackers with a limited sampling rate; thus, specifically in this task, an accurate choice of sampling frequency is crucial. We therefore assume that an eye tracker with a higher sampling rate detects more saccades in a reading task, as it will also detect short-duration saccades. We furthermore hypothesize that the estimation of mean saccade duration is more reliable when estimated from gaze data sampled at a higher rate, because more samples are available to reliably detect saccade start and end. Studies examining eye movements typically rely on high-sampling stationary eye trackers, but novel mobile eye trackers allow recordings in more natural scenarios, especially paradigms in which the subject is freely behaving. With the increasing use of mobile eye trackers, it becomes inevitable to evaluate how well they can detect and measure saccadic eye movements.
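The sampling-rate argument above can be made concrete with a small sketch (illustrative only, not code from the paper): a velocity-based detector needs at least two samples inside a saccade, so the shortest resolvable saccade lasts twice the inter-sample interval.

```python
def min_detectable_saccade_ms(sampling_rate_hz: float) -> float:
    """Shortest saccade duration resolvable by a velocity-based detector:
    at least two consecutive samples must fall inside the saccade, i.e.
    twice the inter-sample interval (result in milliseconds)."""
    return 2.0 * 1000.0 / sampling_rate_hz

# 60 Hz resolves saccades down to ~33 ms, 120 Hz down to ~17 ms,
# which is exactly the 16-33 ms window discussed in the text.
print(min_detectable_saccade_ms(60))    # ~33.3
print(min_detectable_saccade_ms(120))   # ~16.7
```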
This study therefore evaluates the impact of increasing the sampling rate of a head-worn eye tracker designed for field studies from 60 Hz to 120 Hz in a real-world task of high research interest: reading (K. Rayner et al., 2012).

Methods

Participants

Eleven eye-healthy participants with a mean age of 34.9 ± 9.9 years were included in the study. The participants had normal or corrected-to-normal vision, and all were naïve to the purpose of the study. All procedures followed the tenets of the Declaration of Helsinki. Informed consent was obtained from all participants after explanation of the nature and possible consequences of the study.

Equipment and Experimental Procedure

Participants wore one of two mobile eye trackers (SMI ETG w, 60 Hz sampling; SMI ETG 2w, 120 Hz sampling; SensoMotoric Instruments GmbH, Teltow, Germany). In order to evaluate saccade detection in these eye trackers, participants placed their head in a chin rest, and a stationary eye tracker (EyeLink 1000, SR Research Ltd., Mississauga, Canada) was used as a reference. This eye tracker was placed below the screen at a distance of 60 cm. For stimulus presentation, a visual display (VIEWPixx /3D, VPixx Technologies, Canada) at a distance of 70 cm was used. Both mobile eye trackers and the stationary eye tracker were calibrated and validated using a 3-point calibration pattern composed of three black rings on a gray background. The stationary and mobile eye trackers recorded the eye positions simultaneously. The stationary eye tracker was set to record binocularly at a sampling rate of 1000 Hz, and the two mobile eye trackers tracked binocularly at either 60 Hz or 120 Hz. To minimize the influence of the IR signal from the stationary eye tracker on the tracking ability of the mobile eye tracking glasses, the power of its IR LED was reduced to a minimum of 50% intensity.
The mobile eye tracking glasses use infrared (IR) LEDs arranged in a ring pattern within the glasses frame, while the IR array of the stationary eye tracker produces a single dot pattern. Because the reflection patterns differ in shape (see Figure 1) and the corneal reflex of the stationary eye tracker is much more intense, both reflections were distinguishable from each other and a simultaneous measurement was possible. Moreover, the 60 Hz and 120 Hz eye trackers are expected to be equally affected by any potential mutual interference.
Prior to the start of the experiment, participants were informed that they would read a text and then answer questions about it, to ensure attentive reading. To enable an offline temporal synchronization between both eye trackers for the data analysis, a peripheral fixation point displaced 25° from the left side of the text was displayed for three seconds on the screen. Subsequently, the sample text was presented. The letter size was set to 23 pixels, which corresponded to an angular size of 0.5° for capital letters in the font Helvetica. A relatively large letter size ensured that every normally sighted participant was able to read the text. Two sample texts were created: text 1 contained information about the human visual system and covered 237 words; text 2 was about Tuebingen and the Eberhard Karls University Tuebingen and contained 276 words. An example of the experiment procedure is given in Figure 2. The participants were instructed to read silently and at their normal reading speed. Subsequently, the participants confirmed or rejected five statements regarding the content of the text by pressing a button on the keyboard. All stimuli were programmed and displayed using the Psychophysics Toolbox (Psychtoolbox 3; Kleiner et al., 2007) in the Matlab programming language (Matlab, MathWorks Inc., Natick, Massachusetts).

Analysis

From the eye tracking data, the average number of fixations, their average duration, and the average number and duration of saccades were calculated. Blinks were excluded from the dataset prior to analysis; they were identified on the basis of the individual eye tracker criteria (pupil size very small or zero). Fixations and saccades were identified using an algorithm based on the velocity profile of the gaze data, see Equation (1), calculated as the difference in horizontal eye position between successive samples divided by the inter-sample time interval (Salvucci & Goldberg, 2000; van der Geest & Frens, 2002), without application of a running-average filter prior to the analysis. According to Equation (2), a fixation is classified as gaze points where the eye-velocity signal V_n remains below a threshold of d = 60°/s for a minimum duration of Δt_Fix = 100 ms (average fixation durations are around 200–300 ms (K. Rayner, 1998; Starr & Rayner, 2001)). A single fixation and its associated duration were defined as the time interval where Equation (2) resulted in a ‘false’ outcome. In addition to the fixation duration, the absolute number of fixations was analyzed.
V_n = (x_{n+1} − x_n) / (t_{n+1} − t_n)  (1)

V_n > d  (2)
Furthermore, the velocity profile of the gaze data was used for saccade detection. A saccade event was identified as a time interval where the condition of Equation (2) was true (i.e., an interval not assigned to a fixation). The local maximum within this time interval was localized using Matlab and marked a saccade event. The saccade duration was calculated as the time interval during which the velocity of the eye remained above the velocity threshold d, i.e., during which Equation (2) was true.
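The velocity-threshold identification described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: the function name, the restriction to a single horizontal trace, and the handling of sub-100 ms low-velocity runs are my assumptions.

```python
import numpy as np

def detect_events(x_deg, fs_hz, vel_thresh=60.0, min_fix_ms=100.0):
    """Velocity-threshold identification on a horizontal gaze trace.

    x_deg: gaze position in degrees; fs_hz: sampling rate in Hz.
    Returns a list of (label, duration_ms) events."""
    dt = 1.0 / fs_hz
    v = np.abs(np.diff(x_deg)) / dt       # Eq. (1): sample-to-sample velocity
    is_sacc = v > vel_thresh              # Eq. (2): velocity above threshold d
    events, start = [], 0
    for i in range(1, len(is_sacc) + 1):
        # close the current run at a label change or at the end of the trace
        if i == len(is_sacc) or is_sacc[i] != is_sacc[start]:
            dur_ms = (i - start) * dt * 1000.0
            if is_sacc[start]:
                label = "saccade"
            elif dur_ms >= min_fix_ms:    # fixations must last at least 100 ms
                label = "fixation"
            else:
                label = "unclassified"
            events.append((label, dur_ms))
            start = i
    return events
```

On a simulated 1000 Hz trace containing a single 20 ms, 5° position step between two stable fixations, this returns a fixation, a 20 ms saccade, and a second fixation.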
To analyze the difference in performance between the 60 Hz and the 120 Hz mobile tracking glasses, the relative differences in the number and duration of detected fixations and saccades between each mobile eye tracker and the stationary eye tracker were calculated and evaluated. All calculations consider the gaze data of the right eye. Normality of the data was assessed using the Shapiro-Wilk test. For normally distributed data, a t-test (power 1-β = 0.80) was performed to test for differences in detection ability; otherwise, a Wilcoxon rank test was performed. The critical p-value (α error) was set to 0.05, and the statistical analyses were performed in IBM SPSS Statistics 22 (IBM, Armonk, USA).
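The decision rule above (normality check, then parametric or rank-based test) can be sketched in Python using SciPy as a stand-in for SPSS; the function name and structure are my assumptions, not the authors' analysis script.

```python
import numpy as np
from scipy import stats

def compare_paired(a, b, alpha=0.05):
    """Paired comparison mirroring the pipeline described in the text:
    Shapiro-Wilk normality check on the paired differences, then a paired
    t-test if they look normal, otherwise a Wilcoxon signed-rank test."""
    diff = np.asarray(a, float) - np.asarray(b, float)
    if stats.shapiro(diff).pvalue > alpha:    # differences look normal
        name, p = "paired t-test", stats.ttest_rel(a, b).pvalue
    else:                                     # fall back to the rank test
        name, p = "wilcoxon", stats.wilcoxon(a, b).pvalue
    return name, p, p < alpha
```

For example, feeding in per-participant saccade counts from the two devices would return the test used, its p-value, and whether the difference is significant at α = 0.05.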

Results

Figure 2 compares the eye movement data from the stationary eye tracker (a) with the data from the 120 Hz mobile eye tracker (the 60 Hz data look similar) in (b), superimposed on the text that was read. In order to plot the mobile eye tracking data (Figure 2b), the data were manually scaled using an empirically defined scaling factor. Figure 2 shows that the reduced sampling rate of the mobile eye tracker leads to a sparse representation of saccade midflight eye positions.
A larger number of saccades was detected by the 120 Hz than by the 60 Hz eye tracker (p = 0.011, two-sided t-test), see Table 1 and Figure 3. The 120 Hz mobile eye tracker also led to a more reliable estimation of mean saccade duration (Δ = 5.91 ms, p = 0.033, two-sided t-test), see Figure 3. Despite these differences, the number of saccades undetected by the stationary eye tracker but detected by the mobile eye trackers was very low, below 1% of the total number of correctly detected saccades. The data therefore show that, although saccade detection was generally adequate in both mobile eye trackers, the 120 Hz eye tracker measured saccade duration better than the 60 Hz eye tracker.
In contrast to the saccades, no significant difference in the number of fixations was found between the 60 Hz and 120 Hz eye trackers (p = 0.110, Wilcoxon test). Statistical analysis also showed no significant difference between the 60 Hz and the 120 Hz devices in fixation durations (p = 0.088, paired t-test). Nevertheless, there is a trend towards more accurate fixation detection in the 120 Hz device when compared to the stationary eye tracker. Means and standard deviations of the number and duration of saccades and fixations are shown in Figure 3.
Figure 4 illustrates the frequency distributions of fixation durations in the 60 Hz (Figure 4a) and the 120 Hz devices (Figure 4b), respectively, compared to the stationary eye tracker. Fixation durations within the silent reading task ranged from 100 ms to 600 ms. The distribution of recorded fixations of the 60 Hz mobile eye tracker showed a shift of its maximum towards smaller fixation durations (p = 0.01, Wilcoxon rank test), while the distribution of the 120 Hz mobile eye tracker revealed a trend towards a better assessment of fixation durations (p = 0.59, Wilcoxon rank test), in comparison to the stationary reference eye tracker.

Discussion

Previous studies have revealed that the use of stationary eye trackers with lower sampling rates results in significantly impoverished detection and measurement of saccadic eye movements, especially at the border of the stimulus screen (Ooms et al., 2015). Generally, high-frequency stationary eye trackers should be preferred in investigations of saccades, and the use of eye trackers with lower sampling rates should be restricted to the examination of fixation behavior and pupil size (Dalmaijer, 2014). While the effects of sampling rate are known for stationary eye trackers, no such information is available for mobile eye trackers, which often place the cameras closer to the participants' eyes and use a different pattern of IR lighting and a different calibration method. In the current study, we compared the impact of the sampling rate of mobile eye trackers on the extraction rates of saccades and fixations in a reading task, as a common task of daily life.

Mobile Eye Tracking and Reading

The development of mobile eye trackers in recent years (Babcock & Pelz, 2004; Li et al., 2006; Pfeiffer & Renner, 2014) has enabled researchers to examine eye movements during reading in a natural context (Rayner, 1998). Mobile eye tracking of reading may enhance clinical diagnosis, for example by differentiating progressive supranuclear palsy from Parkinson's disease (Marx et al., 2012) or by detecting mental or linguistic disorders (Fernández et al., 2016; Vidal et al., 2012). The measurement of eye movements in such tasks has furthered the understanding of learning (Rosch & Vogel-Walcutt, 2013), e.g. in medical and health professions (Kok & Jarodzka, 2017), and can be extended to e-learning applications (Molina et al., 2014). However, research on the reliability of mobile eye trackers in the detection of saccades and fixations, especially in reading, is sparse.
The current analysis of saccadic eye movements demonstrated the benefit of the higher sampling rate of the 120 Hz mobile eye tracker in the detection of saccades. During reading, the amplitude and number of both progressive and return (regression) saccades depend on various intrinsic and extrinsic factors (Rayner, 1998). To evaluate these properties of saccades, it is therefore important to measure the parameters of saccadic eye movements accurately. External aspects such as visual information factors, e.g. the spaces or type of characters between words (Pollatsek & Rayner, 1982; Yang & McConkie, 2001) or the length and orthographic information of the words (Joseph et al., 2009; Rayner & McConkie, 1976; Vitu et al., 1995), impact saccadic amplitudes. Secondly, higher-level factors such as spatial coding (Liversedge & Findlay, 2000) or the location of attention (Rayner, 2009; Schneider & Deubel, 1995) influence saccadic behavior. The main sequence of saccadic eye movements (Bahill et al., 1975; Harris & Wolpert, 2006) demonstrates a linear correlation between saccadic amplitude and duration. In reading, people often make small saccadic eye movements (e.g. refixations within the same word); to detect these small saccade amplitudes accurately, it is crucial to use high-frequency equipment that facilitates the recording of short saccade durations. Our results revealed that saccades are better detected with a 120 Hz sampling rate and that the distribution of saccade amplitudes is better measured at this higher sampling rate.

Event Detection Algorithms for Eye Movement Data

In the analysis of saccadic eye movements, we used the standard approach based on the velocity profile of the gaze traces (Salvucci & Goldberg, 2000). Engbert and Kliegl (2003) developed a velocity-based algorithm for the detection of microsaccades involving a noise-dependent detection threshold and a temporal overlap criterion for the binocular occurrence of saccades. The advantage of a noise-dependent algorithm is that it can easily be adapted to different eye tracking technologies and inter-individual differences (Engbert & Kliegl, 2003). In future work, such noise-dependent algorithms could therefore improve the detection performance for low-sampled eye tracking data if the internal noise distribution differs between eye trackers. A further approach for future work is to use data from both eyes in the analysis (the method by Engbert and Kliegl requires saccades to overlap in both eyes); in a real-world application, a saccade detection algorithm that accounts for binocularity could increase the accuracy of detecting saccades. One further extension is to use acceleration in addition to velocity to detect saccades (Behrens & Weiss, 1992) and to combine this with noise-dependent saccade thresholds (Behrens et al., 2010). Future work can also examine more complex algorithms, including continuous wavelet analysis and principal component analysis (PCA), or exploit the fact that saccades can be identified as local singularities (Bettenbühl et al., 2010). Although the focus of the present study was not the comparison of algorithms for event detection in mobile eye tracking, advanced computations might improve the performance of event detection.
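A noise-dependent velocity threshold of this kind can be sketched as follows. This is a simplified, one-dimensional illustration in the spirit of Engbert & Kliegl (2003); the function name and the default multiplier lam are my assumptions.

```python
import numpy as np

def noise_dependent_threshold(v, lam=6.0):
    """Velocity threshold scaled to the recording's own noise level:
    a robust, median-based estimate of the velocity standard deviation
    (as in Engbert & Kliegl, 2003), multiplied by a factor lam."""
    v = np.asarray(v, float)
    sigma = np.sqrt(np.median(v**2) - np.median(v)**2)
    return lam * sigma
```

Because the estimate is median-based, isolated high-velocity saccade samples barely move the threshold, so the same code adapts automatically to devices with different noise levels.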

Head-Worn vs. Head-Fixed Eye Tracking

The accuracy of an eye tracker strongly depends on the conditions of the planned experiment (Niehorster et al., 2017) and furthermore on restrictions of head movements (Hessels et al., 2015); hence it is important to consider application-oriented parameters as well. Generally, head-worn eye trackers are not restricted to a certain head position, but the eye movements show a more complex pattern compared to a head-fixed situation, as, for instance, the vestibulo-ocular reflex (Fetter, 2007) or optokinetic nystagmus (Crawford et al., 2003) occur. In inter-device comparisons between mobile eye trackers, further studies will have to clarify whether event detection in mobile eye tracking depends on the type of tracking (e.g. pupil/glint tracking vs. 3D eye model), the number of tracked eyes, or the use of advanced calibration methods or algorithms, as discussed above. The current study reports results of laboratory work with a head-fixed measurement setup. Mobile eye trackers enable head-free acquisition of eye movement data, and it was shown that head movements strongly contribute to the processing of the visual input, and thus to oculomotor behavior (Rifai & Wahl, 2016; 't Hart et al., 2009). Furthermore, in head-free scenarios, dislocations of the position or orientation of the eye tracker on the head were shown to have a significant influence on eye tracker accuracy (Niehorster et al., 2017). Given that, upcoming studies will have to investigate the sampling-rate dependence of mobile eye trackers in head-free scenarios and real-world tasks.

Fixation Statistics in Reading

Analysis of the fixation statistics during a common reading task showed no difference in the number of detected fixations or the mean fixation duration between the two mobile eye trackers. The observed mean fixation durations of 220 ms to 240 ms are comparable to other studies using silent reading tasks (Kliegl et al., 2006; Rayner, 1998; Vitu et al., 2001); Yang et al. (2001) reported a shorter fixation duration of 211 ms for a reading task. The higher sampling rate of the 120 Hz mobile eye tracker led to a closer mapping of the frequency distribution of fixation durations, which is also reflected in the significantly smaller mean fixation duration compared to the 60 Hz eye tracker. The frequency distributions of fixation durations showed in all cases the typical right-tailed shape (Inhoff et al., 2010; Yang & McConkie, 2001) with a maximum around 200 ms. The choice of a typical and realistic reading task, which represents a common daily visual duty, suggests that this estimation is also correct for the saccade amplitude distribution in reading and possibly in other tasks.

Future Implications in Virtual Reality Applications

Mobile eye tracking is becoming progressively more important with the introduction of head-mounted displays (HMD) and virtual reality (VR) glasses, which enable more realistic interaction mediated by human-computer interfaces (Boukhalfi et al., 2015; Duchowski et al., 2000; Krejtz et al., 2014; Pfeiffer & Memili, 2016; Quinlivan et al., 2016; Tanriverdi & Jacob, 2000). Thus, there is an increasing need for miniaturized eye trackers that combine usability for field studies with high accuracy and high speed, and for their incorporation into HMD or VR systems (Bulling & Gellersen, 2010; Kassner et al., 2014). Real-time gaze estimation with precise and fast eye tracking enables accurate, and thus comfortable, stereo image presentation in virtual reality simulations and highly interactive virtual reality scenarios. Specifically, the detection of fast eye movements like saccades can enable gaze pointing to virtual objects. Juhola et al. (Juhola & Pyykko, 1987) showed that a velocity-based algorithm for saccade analysis requires data sampled at a minimum of 70 Hz. DiScenna et al. (DiScenna et al., 1995) stated that a reliable measurement of all kinds of eye movements requires video cameras with frame rates above 120 Hz. The results of the current study suggest that a 120 Hz mobile eye tracker also leads to more reliable measurements in a task-specific evaluation of saccade and fixation statistics in reading.

Conclusions

This study reports a relative performance comparison between two mobile video-based eye trackers during reading. Low-sampled eye tracking (60 Hz) led to an underestimation of the number of detected saccades, while 120 Hz sampling resulted in a higher accuracy in the detection of fast eye movements and fixation durations. Reliable detection of short saccade durations, as they occur in reading, requires higher sampling rates of the eye trackers used. Reliable and robust detection of saccades by fast and accurate mobile eye trackers will lead to novel developments in gaze-contingent protocols, e.g. for virtual reality simulations. Furthermore, increased sampling rates in eye tracking technology might enable advancements in new fields, such as clinical applications for eye movement training in visually impaired patients or the analysis of clinical eye movement markers in the diagnosis of diseases.

Ethics and Conflict of Interest

The author(s) declare(s) that the contents of the article are in agreement with the ethics described in http://biblio.unibe.ch/portale/elibrary/BOP/jemr/ethics.html and that there is no conflict of interest regarding the publication of this paper.

Acknowledgments

This work was done in an industry-on-campus cooperation between the University of Tuebingen and Carl Zeiss Vision International GmbH. The work was supported by third-party funding (ZUK 63). The beta version of the 120 Hz mobile eye tracker was developed and kindly provided by SensoMotoric Instruments GmbH, D-14513 Teltow, Germany.

References

  1. Babcock, J. S., and J. B. Pelz. 2004. Building a lightweight eyetracking headgear. Proceedings of the 2004 symposium on Eye tracking research & applications; pp. 109–114. [Google Scholar] [CrossRef]
  2. Bahill, A. T., M. R. Clark, and L. Stark. 1975. The main sequence, a tool for studying human eye movements. Mathematical Biosciences 24, 3-4: 191–204. [Google Scholar]
  3. Behrens, F., M. Mackeben, and W. Schroder-Preikschat. 2010. An improved algorithm for automatic detection of saccades in eye movement data and for calculating saccade parameters. Behav Res Methods 42, 3: 701–708. [Google Scholar] [CrossRef] [PubMed]
  4. Behrens, F., and L. R. Weiss. 1992. An algorithm separating saccadic from nonsaccadic eye movements automatically by use of the acceleration signal. Vision Res 32, 5: 889–893. [Google Scholar] [PubMed]
  5. Benson, P. J., S. A. Beedie, E. Shephard, I. Giegling, D. Rujescu, and D. St. Clair. 2012. Simple Viewing Tests Can Detect Eye Movement Abnormalities That Distinguish Schizophrenia Cases from Controls with Exceptional Accuracy. Biological Psychiatry 72, 9: 716–724. [Google Scholar] [CrossRef]
  6. Bettenbühl, M., C. Paladini, K. Mergenthaler, R. Kliegl, R. Engbert, and M. Holschneider. 2010. Microsaccade characterization using the continuous wavelet transform and principal component analysis. 3, 5. [Google Scholar] [CrossRef]
  7. Biscaldi, M., S. Gezeck, and V. Stuhr. 1998. Poor saccadic control correlates with dyslexia. Neuropsychologia 36, 11: 1189–1202. [Google Scholar] [CrossRef]
  8. Boukhalfi, T., C. Joyal, S. Bouchard, S. M. Neveu, and P. Renaud. 2015. Tools and Techniques for Real-time Data Acquisition and Analysis in Brain Computer Interface studies using qEEG and Eye Tracking in Virtual Reality Environment. IFAC-PapersOnLine 48, 3: 46–51. [Google Scholar] [CrossRef]
  9. Bulling, A., and H. Gellersen. 2010. Toward mobile eye-based human-computer interaction. IEEE Pervasive Computing 9, 4: 8–12. [Google Scholar]
  10. Crawford, J. D., J. C. Martinez-Trujillo, and E. M. Klier. 2003. Neural control of three-dimensional eye and head movements. Current Opinion in Neurobiology 13, 6: 655–662. [Google Scholar] [CrossRef]
  11. Dalmaijer, E. 2014. Is the low-cost EyeTribe eye tracker any good for research? PeerJ PrePrints 2: e585v1. [Google Scholar] [CrossRef]
  12. DiScenna, A. O., V. Das, A. Z. Zivotofsky, S. H. Seidman, and R. J. Leigh. 1995. Evaluation of a video tracking device for measurement of horizontal and vertical eye rotations during locomotion. Journal of Neuroscience Methods 58, 1–2: 89–94. [Google Scholar] [CrossRef]
  13. Duchowski, A. T., V. Shivashankaraiah, T. Rawls, A. K. Gramopadhye, B. J. Melloy, and B. Kanki. 2000. Binocular eye tracking in virtual reality for inspection training. Paper presented at the Proceedings of the 2000 symposium on Eye tracking research & applications, Palm Beach Gardens, Florida, USA. [Google Scholar]
  14. Duchowski, A. 2007. Eye Tracking Methodology: Theory and Practice. Springer. [Google Scholar]
  15. Eden, G. F., J. F. Stein, H. M. Wood, and F. B. Wood. 1994. Differences in eye movements and reading problems in dyslexic and normal children. Vision Research 34, 10: 1345–1358. [Google Scholar] [CrossRef]
  16. Engbert, R., and R. Kliegl. 2003. Microsaccades uncover the orientation of covert attention. Vision Research 43, 9: 1035–1045. [Google Scholar] [CrossRef]
  17. Fernández, G., M. Sapognikoff, S. Guinjoan, D. Orozco, and O. Agamennoni. 2016. Word processing during reading sentences in patients with schizophrenia: evidences from the eyetracking technique. Comprehensive Psychiatry 68: 193–200. [Google Scholar] [CrossRef]
  18. Fetter, M. 2007. Vestibulo-ocular reflex. Neuro-ophthalmology. Karger Publishers: Vol. 40, pp. 35–51. [Google Scholar]
  19. Harris, C. M., and D. M. Wolpert. 2006. The main sequence of saccades optimizes speed-accuracy trade-off. Biological Cybernetics 95, 1: 21–29. [Google Scholar] [CrossRef]
  20. Hessels, R. S., T. H. W. Cornelissen, C. Kemner, and I. T. C. Hooge. 2015. Qualitative tests of remote eyetracker recovery and performance during head rotation. Behavior Research Methods 47, 3: 848–859. [Google Scholar] [CrossRef] [PubMed]
  21. Inhoff, A. W., B. A. Seymour, D. Schad, and S. Greenberg. 2010. The size and direction of saccadic curvatures during reading. Vision Res 50, 12: 1117–1130. [Google Scholar] [CrossRef]
  22. Isokoski, P., M. Joos, O. Spakov, and B. Martin. 2009. Gaze controlled games. Universal Access in the Information Society 8, 4: 323–337. [Google Scholar]
  23. Isokoski, P., and B. Martin. 2006. Eye tracker input in first person shooter games. Paper presented at the Proceedings of the 2nd Conference on Communication by Gaze Interaction: Communication by Gaze Interaction-COGAIN 2006: Gazing into the Future. [Google Scholar]
  24. Joseph, H. S., S. P. Liversedge, H. I. Blythe, S. J. White, and K. Rayner. 2009. Word length and landing position effects during reading in children and adults. Vision Res 49, 16: 2078–2086. [Google Scholar] [CrossRef]
  25. Juhola, M., and I. Pyykko. 1987. Effect of sampling frequencies on the velocity of slow and fast phases of nystagmus. Int J Biomed Comput 20, 4: 253–263. [Google Scholar]
  26. Kassner, M., W. Patera, and A. Bulling. 2014. Pupil: an open source platform for pervasive eye tracking and mobile gaze-based interaction. Paper presented at the Proceedings of the 2014 ACM international joint conference on pervasive and ubiquitous computing: Adjunct publication. [Google Scholar]
  27. Kemner, C., M. N. Verbaten, J. M. Cuperus, G. Camfferman, and H. van Engeland. 1998. Abnormal Saccadic Eye Movements in Autistic Children. Journal of Autism and Developmental Disorders 28, 1: 61–67. [Google Scholar] [CrossRef] [PubMed]
  28. Kliegl, R., A. Nuthmann, and R. Engbert. 2006. Tracking the mind during reading: the influence of past, present, and future words on fixation durations. Journal of experimental psychology: General 135, 1: 12. [Google Scholar]
  29. Kok, E. M., and H. Jarodzka. 2017. Before your very eyes: the value and limitations of eye tracking in med-ical education. Medical Education 51, 1: 114–122. [Google Scholar] [CrossRef]
  30. Krejtz, K., C. Biele, D. Chrzastowski, A. Kopacz, A. Niedzielska, P. Toczyski, and A. Duchowski. 2014. Gaze-controlled gaming: immersive and diffi-cult but not cognitively overloading. Paper presented at the Proceedings of the 2014 ACM International Joint Conference on Pervasive and Ubiquitous Com-puting: Adjunct Publication. [Google Scholar]
  31. Lahey, J. N., and D. Oxley. 2016. The Power of Eye Tracking in Economics Experiments. American Eco-nomic Review 106, 5: 309–313. [Google Scholar] [CrossRef]
  32. Li, D., J. Babcock, and D. J. Parkhurst. 2006. openEyes: a low-cost head-mounted eye-tracking solution. Paper presented at the Proceedings of the 2006 symposium on Eye tracking research & applications, San Di-ego, California. [Google Scholar]
  33. Liversedge, S. P., and J. M. Findlay. 2000. Saccadic eye movements and cognition. Trends in Cognitive Sciences 4, 1: 6–14.
  34. Marx, S., G. Respondek, M. Stamelou, S. Dowiasch, J. Stoll, F. Bremmer, and W. Einhauser. 2012. Validation of mobile eye-tracking as novel and efficient means for differentiating progressive supranuclear palsy from Parkinson's disease. Frontiers in Behavioral Neuroscience 6, 88.
  35. Molina, A. I., M. A. Redondo, C. Lacave, and M. Ortega. 2014. Assessing the effectiveness of new devices for accessing learning materials: An empirical analysis based on eye tracking and learner subjective perception. Computers in Human Behavior 31: 475–490.
  36. Niehorster, D. C., T. H. W. Cornelissen, K. Holmqvist, I. T. C. Hooge, and R. S. Hessels. 2017. What to expect from your remote eye-tracker when participants are unrestrained. Behavior Research Methods, 1–15.
  37. Nilsson Benfatto, M., G. Öqvist Seimyr, J. Ygge, T. Pansell, A. Rydberg, and C. Jacobson. 2016. Screening for Dyslexia Using Eye Tracking during Reading. PLoS One 11, 12: e0165508.
  38. Nyström, M., and K. Holmqvist. 2010. An adaptive algorithm for fixation, saccade, and glissade detection in eyetracking data. Behavior Research Methods 42, 1: 188–204.
  39. Oliveira, D., L. Machín, R. Deliza, A. Rosenthal, E. H. Walter, A. Giménez, and G. Ares. 2016. Consumers' attention to functional food labels: Insights from eye-tracking and change detection in a case study with probiotic milk. LWT-Food Science and Technology 68: 160–167.
  40. Ooms, K., L. Dupont, L. Lapon, and S. Popelka. 2015. Accuracy and precision of fixation locations recorded with the low-cost Eye Tribe tracker in different experimental setups. Journal of Eye Movement Research 8, 1.
  41. Pfeiffer, T., and C. Memili. 2016. Model-based real-time visualization of realistic three-dimensional heat maps for mobile eye tracking and eye tracking in virtual reality. Paper presented at the Proceedings of the Ninth Biennial ACM Symposium on Eye Tracking Research & Applications, Charleston, South Carolina.
  42. Pfeiffer, T., and P. Renner. 2014. EyeSee3D: a low-cost approach for analyzing mobile 3D eye tracking data using computer vision and augmented reality technology. Paper presented at the Proceedings of the Symposium on Eye Tracking Research and Applications, Safety Harbor, Florida.
  43. Pollatsek, A., and K. Rayner. 1982. Eye movement control in reading: The role of word boundaries. Journal of Experimental Psychology: Human Perception and Performance 8, 6: 817–833.
  44. Quinlivan, B., J. S. Butler, I. Beiser, L. Williams, E. McGovern, S. O'Riordan, and R. B. Reilly. 2016. Application of virtual reality head mounted display for investigation of movement: a novel effect of orientation of attention. Journal of Neural Engineering 13, 5: 056006.
  45. Rayner, K. 1998. Eye movements in reading and information processing: 20 years of research. Psychological Bulletin 124, 3: 372–422.
  46. Rayner, K. 2009. Eye movements and attention in reading, scene perception, and visual search. The Quarterly Journal of Experimental Psychology 62, 8: 1457–1506.
  47. Rayner, K., and G. W. McConkie. 1976. What guides a reader's eye movements? Vision Research 16, 8: 829–837.
  48. Rayner, K., A. Pollatsek, J. Ashby, and C. Clifton, Jr. 2012. Psychology of Reading. Psychology Press.
  49. Rifai, K., and S. Wahl. 2016. Specific eye–head coordination enhances vision in progressive lens wearers. Journal of Vision 16, 11: 5.
  50. Rosch, J. L., and J. J. Vogel-Walcutt. 2013. A review of eye-tracking applications as tools for training. Cognition, Technology & Work 15, 3: 313–327.
  51. Rosenhall, U., E. Johansson, and C. Gillberg. 2007. Oculomotor findings in autistic children. The Journal of Laryngology & Otology 102, 5: 435–439.
  52. Salvucci, D. D., and J. H. Goldberg. 2000. Identifying fixations and saccades in eye-tracking protocols. Paper presented at the Proceedings of the 2000 Symposium on Eye Tracking Research & Applications, Palm Beach Gardens, Florida, USA.
  53. Schneider, W. X., and H. Deubel. 1995. Visual Attention and Saccadic Eye Movements: Evidence for Obligatory and Selective Spatial Coupling. In Studies in Visual Information Processing. Edited by J. M. Findlay, R. Walker, and R. W. Kentridge. North-Holland, Vol. 6, pp. 317–324.
  54. Sereno, A. B., and P. S. Holzman. 1995. Antisaccades and smooth pursuit eye movements in schizophrenia. Biological Psychiatry 37, 6: 394–401.
  55. Shulgovskiy, V. V., M. V. Slavutskaya, I. S. Lebedeva, S. A. Karelin, V. V. Moiseeva, A. P. Kulaichev, and V. G. Kaleda. 2015. Saccadic responses to consecutive visual stimuli in healthy people and patients with schizophrenia. Human Physiology 41, 4: 372–377.
  56. t Hart, B. M., J. Vockeroth, F. Schumann, K. Bartl, E. Schneider, P. König, and W. Einhäuser. 2009. Gaze allocation in natural stimuli: Comparing free exploration to head-fixed viewing conditions. Visual Cognition 17, 6-7: 1132–1158.
  57. Tanriverdi, V., and R. J. K. Jacob. 2000. Interacting with eye movements in virtual environments. Paper presented at the Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, The Hague, The Netherlands.
  58. van der Geest, J. N., and M. A. Frens. 2002. Recording eye movements with video-oculography and scleral search coils: a direct comparison of two methods. Journal of Neuroscience Methods 114, 2: 185–195.
  59. Vickers, S., H. Istance, and A. Hyrskykari. 2013. Performing locomotion tasks in immersive computer games with an adapted eye-tracking interface. ACM Transactions on Accessible Computing (TACCESS) 5, 2.
  60. Vidal, M., J. Turner, A. Bulling, and H. Gellersen. 2012. Wearable eye tracking for mental health monitoring. Computer Communications 35, 11: 1306–1311.
  61. Vitu, F., G. W. McConkie, P. Kerr, and J. K. O'Regan. 2001. Fixation location effects on fixation durations during reading: an inverted optimal viewing position effect. Vision Research 41, 25–26: 3513–3533.
  62. Vitu, F., J. K. O'Regan, A. W. Inhoff, and R. Topolski. 1995. Mindless reading: eye-movement characteristics are similar in scanning letter strings and reading texts. Perception & Psychophysics 57, 3: 352–364.
  63. Wedel, M. 2013. Attention research in marketing: A review of eye tracking studies. Robert H. Smith School Research Paper No. RHS, 2460289.
  64. Yang, S. N., and G. W. McConkie. 2001. Eye movements during reading: a theory of saccade initiation times. Vision Research 41, 25–26: 3567–3585.
Figure 1. Comparison of the corneal infrared reflections from the stationary (red arrow) and the mobile (blue arrows) eye tracker. (a) represents the image from the stationary and (b) from the mobile eye tracker in simultaneous use.
Figure 2. Raw gaze traces superimposed on the stimulus text. (a) shows the eye position recorded with the 1000 Hz stationary eye tracker (EyeLink 1000) and (b) at a sampling rate of 120 Hz (SMI mobile glasses). Gaze data and text were aligned manually by the author (horizontal and vertical stretching).
Figure 3. Mean number of fixations (a) and saccades (b), +/- standard deviation (SD). (c) and (d) present the mean fixation and saccade durations, respectively. Asterisks indicate the significance level: * α < 0.05, *** α < 0.001.
Figure 4. Mean frequency distribution (n = 11) of fixation durations for the stationary and the (a) 60 Hz and (b) 120 Hz mobile eye tracker in milliseconds (ms).
Table 1. Relative comparison of two mobile eye trackers to a stationary eye tracker. Mean and standard deviation (SD) for the number and the duration of saccades and fixations. Asterisks indicate the significance level: * α < 0.05; n = 11.
| Mean ± SD | 60 Hz mobile eye tracker | 120 Hz mobile eye tracker | Relative difference between mobile eye trackers |
| --- | --- | --- | --- |
| Number of saccades | 56.11 ± 12.44 % | 68.37 ± 13.97 % | 12.25 % * |
| Duration of saccades (ms) | −10.81 ± 7.51 ms | −4.89 ± 2.76 ms | 5.91 ms * |
| Number of fixations | 76.72 ± 18.67 % | 86.41 ± 15.43 % | 9.69 % |
| Duration of fixations (ms) | 10.55 ± 10.13 ms | 4.30 ± 14.33 ms | 6.25 ms |