A Low-Cost, High-Performance Video-Based Binocular Eye Tracker for Psychophysical Research
Abstract
Introduction
Methods
Set-up and hardware
Software and estimated achievable spatial resolution
Real-time noise analysis
Calibration procedure
Effects of pupil size on pupil center positions
Automated tests of gaze accuracy and gaze precision, and comparisons to the commercial EyeLink system
Data recording
Participants
Measurements using artificial eyes
Binocular vergence eye movement measurements
Data analysis
Results
Precision and accuracy using artificial eyes
Raw data plots (human participants)
Precision and accuracy with participants
Saccade and microsaccade metrics
Binocular measurements
Discussion
Ethics and Conflict of Interest
Acknowledgments
Appendix
- screen resolution (here 1920×1080 pixels),
- video magnification (pix/mm, here 35.5),
- distance of subject to screen (550 mm),
- horizontal distance between cameras (here 80 mm),
- distance from the camera to the LEDs (here 80 mm),
- distance from the camera to the eye (here 250 mm),
- all calibration parameters,
- frame number,
- time determined from frame rate,
- pupil diameters,
- eye positions in screen coordinates (in floating-point pixel coordinates) in the x and y directions,
- vergence determined from eye positions, after automated correction for pupil centration artifacts (in arcmin),
- the timing of a trigger signal that is linked to the appearance of a new fixation target and was used in the current study to synchronize our eye tracker to the EyeLink 1000 Plus for comparison.
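As an illustration of how the geometry parameters listed above enter the gaze computation, the sketch below converts a screen-pixel offset at the 550 mm viewing distance into visual angle and computes the vergence demand at the screen. The screen pixel pitch and the interpupillary distance are assumed values for illustration only; they are not given in the parameter list above.

```python
import math

# Geometry from the set-up description; pixel pitch and IPD are
# assumed illustration values, not taken from the parameter list.
VIEW_DIST_MM = 550.0   # distance of subject to screen
PIX_PITCH_MM = 0.277   # assumed pixel pitch of a 24" 1920x1080 display (mm/pixel)
IPD_MM = 63.0          # assumed interpupillary distance (mm)

def pixels_to_deg(dx_pix):
    """Convert a horizontal on-screen offset (pixels) to visual angle (deg)."""
    return math.degrees(math.atan((dx_pix * PIX_PITCH_MM) / VIEW_DIST_MM))

def vergence_demand_arcmin(target_dist_mm):
    """Vergence angle (arcmin) required to binocularly fixate a target."""
    half_angle = math.atan((IPD_MM / 2.0) / target_dist_mm)
    return math.degrees(2.0 * half_angle) * 60.0

# Vergence demand when fixating on the screen plane:
print(round(vergence_demand_arcmin(VIEW_DIST_MM)))  # prints 393 (arcmin)
```

With these assumed values, one screen pixel subtends roughly 0.03 deg, which is why vergence is more conveniently reported in arcmin, as in the data columns above.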
- a video showing the procedures can be downloaded here: https://www.dropbox.com/s/7k6c6h37nljzl3i/DEMO%20eye%20tracker%20Feb%202021.wmv?dl=0
- the software of the eye tracker, with libraries, camera drivers, and IC Imaging Control 3.1: https://www.dropbox.com/sh/kpejv5p8ud6bxwl/AABRs6950UOxUU2FmA8Er0ya?dl=0
- instructions for the eye tracker set-up: https://www.dropbox.com/s/e8dck6ld6hg91v6/Instructions%20binocular%20eye%20tracker.pdf?dl=0
**Precision measured with artificial eyes:**

| | Custom-built eye tracker | EyeLink 1000 Plus |
|---|---|---|
| **Precision (RMS)** | | |
| horizontal | 0.0353 (0.0028) | 0.0406 (0.0091) |
| vertical | 0.003 (1.6092e-04) | 0.0032 (1.3606e-04) |
| **Precision (standard deviation)** | | |
| horizontal | 0.0252 (0.0018) | 0.0361 (0.0062) |
| vertical | 0.0061 (3.2828e-04) | 0.0074 (0.0022) |
**Precision and accuracy measured with participants:**

| | Custom-built eye tracker | EyeLink 1000 Plus |
|---|---|---|
| **Precision (RMS)** | | |
| horizontal | 0.0457 (0.0301) | 0.0202 (0.0297) |
| vertical | 0.0467 (0.0310) | 0.0271 (0.0403) |
| **Precision (standard deviation)** | | |
| horizontal | 0.1953 (0.1861) | 0.1746 (0.1972) |
| vertical | 0.1984 (0.1812) | 0.2160 (0.1944) |
| **Accuracy** | | |
| horizontal | 0.3858 (0.2488) | 0.5504 (0.3051) |
| vertical | 0.4750 (0.4718) | 1.0192 (0.7170) |
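The two precision measures reported in the tables above follow the standard eye-tracking data-quality definitions: RMS precision is computed from sample-to-sample differences during fixation, the standard-deviation measure describes dispersion of samples around their mean, and accuracy is the offset of mean gaze from the target position. A minimal sketch for a single gaze coordinate (the function names are illustrative, not the paper's implementation):

```python
import math

def precision_rms_s2s(samples):
    """RMS sample-to-sample precision: RMS of successive-sample differences."""
    diffs = [b - a for a, b in zip(samples, samples[1:])]
    return math.sqrt(sum(d * d for d in diffs) / len(diffs))

def precision_sd(samples):
    """Precision as the standard deviation of samples around their mean."""
    mean = sum(samples) / len(samples)
    return math.sqrt(sum((s - mean) ** 2 for s in samples) / len(samples))

def accuracy(samples, target):
    """Accuracy: absolute offset of the mean gaze position from the target."""
    return abs(sum(samples) / len(samples) - target)
```

RMS-S2S emphasizes high-frequency, sample-to-sample noise, whereas the standard deviation also reflects slow drifts within the fixation window, which is why the two rows can differ for the same recording.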
| Characteristics | Custom-built eye tracker | EyeLink 1000 Plus |
|---|---|---|
| Spatial precision (artificial eyes) | 0.0191 | 0.0219 |
| Spatial precision (participants) | 0.0462 | 0.0236 |
| Spatial accuracy | 0.4304 | 0.7848 |
| Sampling rate | 395 Hz | 1 kHz (binocular), 2 kHz (monocular) |
| Real-time automated noise analysis | yes | no |
| Real-time pupil artifact correction | yes | no |
| Gaze-contingent experiments | yes | no |
Copyright © 2021. This article is licensed under a Creative Commons Attribution 4.0 International License.
Share and Cite
Ivanchenko, D.; Rifai, K.; Hafed, Z.M.; Schaeffel, F. A Low-Cost, High-Performance Video-Based Binocular Eye Tracker for Psychophysical Research. J. Eye Mov. Res. 2021, 14, 1-21. https://doi.org/10.16910/jemr.14.3.3