Measuring the Spatial Noise of a Low-Cost Eye Tracker to Enhance Fixation Detection
Abstract
1. Introduction
2. Related Work
2.1. The Eye Tribe and its Low-Cost Eye Tracker
2.2. Research with Webcams or DIY Eye Trackers
2.3. Eye Tracking Data Quality
2.4. Fixation Identification Algorithms and Thresholds
3. Materials and Methods
3.1. Participants
3.2. Apparatus
3.3. Stimuli and Tasks
3.4. Procedure
- The recordings during the first second of each trial are removed, as the participant still has to redirect their gaze towards the target point.
- The recordings flagged with state “8” are removed; this state corresponds to the situation where the eye tracker was unable to determine the position of either the left or the right eye. (A code sketch of these two cleaning steps follows below.)
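The sketch below illustrates these two cleaning steps in Python, assuming the raw Eye Tribe samples of a single trial are loaded into a pandas DataFrame; the column names (`timestamp` in ms, `state`, `x`, `y`) are assumptions made for the illustration and do not reflect the authors' actual processing scripts.

```python
import pandas as pd

def clean_trial(samples: pd.DataFrame) -> pd.DataFrame:
    """Apply the two cleaning steps described above to one trial.

    `samples` is assumed to hold the raw output of a single trial with
    columns 'timestamp' (ms), 'state', 'x' and 'y' (assumed names).
    """
    # 1. Drop the first second of the trial, during which the
    #    participant is still redirecting gaze towards the target.
    start = samples["timestamp"].min()
    samples = samples[samples["timestamp"] >= start + 1000]

    # 2. Drop samples flagged with state 8, i.e. samples for which the
    #    tracker could not locate either eye.
    samples = samples[samples["state"] != 8]

    return samples.reset_index(drop=True)
```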
4. Results
4.1. Artificial Participant
4.2. Real Participants
5. Discussion
6. Conclusions & Future Work
Author Contributions
Funding
Conflicts of Interest
References
Specification | Eye Tribe Tracker
---|---
Eye tracking principle | Non-invasive, image-based eye tracking (pupil with corneal reflection)
Sampling rate | 30 Hz or 60 Hz
Accuracy | 0.5°–1°
Spatial resolution | 0.1° (RMS)
Latency | <20 ms at 60 Hz
Calibration | 9, 12 or 16 points
Operating range | 45 cm–75 cm
Tracking area | 40 cm × 30 cm at 65 cm distance (30 Hz)
Gaze tracking range | Up to 24″
Dimensions | 20 cm × 1.9 cm × 1.9 cm
Weight | 70 g
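The accuracy and spatial-resolution figures above are given in degrees of visual angle, whereas the tracker reports gaze positions in screen pixels. The following minimal sketch shows the standard pixel-to-degree conversion for a flat screen viewed head-on; the screen width (0.53 m), horizontal resolution (1680 px) and viewing distance (0.65 m) used as defaults are illustrative assumptions, not values taken from this study.

```python
import math

def pixels_to_degrees(offset_px: float,
                      screen_width_m: float = 0.53,
                      screen_width_px: int = 1680,
                      viewing_distance_m: float = 0.65) -> float:
    """Convert an on-screen gaze offset in pixels to degrees of visual angle.

    All default values are illustrative assumptions (roughly a 24-inch
    display viewed from the middle of the tracker's operating range).
    """
    offset_m = offset_px * screen_width_m / screen_width_px
    return math.degrees(2 * math.atan(offset_m / (2 * viewing_distance_m)))

# Example: a 30 px offset corresponds to roughly 0.8 degrees here,
# i.e. within the 0.5°-1° accuracy stated by the manufacturer.
print(round(pixels_to_degrees(30), 2))
```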
No. (13-target grid) | No. (5-target grid) | X (px) | Y (px)
---|---|---|---
1 | 1 | 75.0 | 75
2 | | 840.0 | 75
3 | 2 | 1605.0 | 75
4 | | 457.5 | 300
5 | | 1222.5 | 300
6 | | 75.0 | 525
7 | 3 | 840.0 | 525
8 | | 1605.0 | 525
9 | | 457.5 | 750
10 | | 1222.5 | 750
11 | 4 | 75.0 | 975
12 | | 840.0 | 975
13 | 5 | 1605.0 | 975
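Given such target coordinates, the spatial noise around each target can be summarised by an accuracy measure (mean offset from the target) together with the two precision measures commonly used in eye-tracking data-quality research: the standard deviation of the samples and the root mean square of sample-to-sample distances. The NumPy sketch below is a generic illustration of these definitions and is not claimed to be the exact computation used in the paper.

```python
import numpy as np

def precision_metrics(x: np.ndarray, y: np.ndarray,
                      target_x: float, target_y: float) -> dict:
    """Summarise the spatial noise of gaze samples recorded on one target.

    x, y: cleaned gaze coordinates recorded while fixating the target.
    Returns the mean offset from the target (accuracy) and two common
    precision measures (SD and RMS of sample-to-sample distances).
    """
    # Accuracy: mean Euclidean distance between samples and the target.
    accuracy = np.mean(np.hypot(x - target_x, y - target_y))

    # Precision (SD): dispersion of the samples around their own centroid.
    sd = np.hypot(np.std(x), np.std(y))

    # Precision (RMS-S2S): root mean square of successive sample distances.
    rms_s2s = np.sqrt(np.mean(np.diff(x) ** 2 + np.diff(y) ** 2))

    return {"accuracy": accuracy, "sd": sd, "rms_s2s": rms_s2s}
```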
| | Unfiltered | | | | | | | | Filtered | | | | | | | |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| | Org | Fix 0.7 | Fix 0.8 | Fix 0.9 | Fix 1.0 | Fix 1.1 | Fix 1.2 | Fix 1.3 | Org | Fix 0.7 | Fix 0.8 | Fix 0.9 | Fix 1.0 | Fix 1.1 | Fix 1.2 | Fix 1.3 |
| raw | x | x | x | x | x | x | x | x | x | x | x | x | x | x | x | x |
| avg | x | x | x | x | x | x | x | x | x | x | x | x | x | x | x | x |
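The Fix 0.7 to Fix 1.3 conditions in the table denote different spatial threshold settings for fixation identification, evaluated alongside the original (Org) configuration. As a point of reference, the sketch below implements a dispersion-type (I-DT style) detector with a configurable spatial threshold; the 80 ms minimum duration and the parameter names are assumptions, and the authors' analysis pipeline may differ in its details.

```python
import numpy as np

def idt_fixations(x, y, t, dispersion_threshold, min_duration=80.0):
    """Dispersion-threshold (I-DT style) fixation detection sketch.

    x, y, t: gaze coordinates and timestamps (ms) as 1-D arrays.
    dispersion_threshold: maximum allowed dispersion, in the same units
        as x and y (pixels or degrees).
    min_duration: minimum fixation duration in ms (assumed value).
    Returns a list of (start_index, end_index, centroid_x, centroid_y).
    """
    x, y, t = np.asarray(x), np.asarray(y), np.asarray(t)
    fixations, start, n = [], 0, len(x)
    while start < n:
        end = start
        # Grow the window while its dispersion stays below the threshold.
        while end + 1 < n:
            wx, wy = x[start:end + 2], y[start:end + 2]
            dispersion = (wx.max() - wx.min()) + (wy.max() - wy.min())
            if dispersion > dispersion_threshold:
                break
            end += 1
        if t[end] - t[start] >= min_duration:
            fixations.append((start, end,
                              float(np.mean(x[start:end + 1])),
                              float(np.mean(y[start:end + 1]))))
            start = end + 1
        else:
            start += 1
    return fixations
```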
Univariate ANOVA | F | p |
---|---|---|
uf_f | 9.769 | 0.003 * |
avg_raw | 0.000 | 0.984 |
13pt_5pt | 2.570 | 0.114 |
uf_f * avg_raw | 0.001 | 0.974 |
uf_f * 13pt_5pt | 8.684 | 0.004 * |
avg_raw * 13pt_5pt | 0.000 | 0.996 |
uf_f * avg_raw * 13pt_5pt | 0.001 | 0.975 |
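For readers who wish to run a comparable three-factor univariate ANOVA on their own recordings, a minimal statsmodels sketch is given below; the long-format file, the response column `noise` and the factor names `uf_f`, `avg_raw` and `pt13_5` are hypothetical placeholders and do not describe the authors' actual data files or statistical software.

```python
import pandas as pd
import statsmodels.api as sm
from statsmodels.formula.api import ols

# Hypothetical long-format data: one row per measurement, with the
# response ('noise') and the three factors shown in the table above.
df = pd.read_csv("spatial_noise_long_format.csv")  # assumed file layout

# Full factorial model: main effects plus all interactions.
model = ols("noise ~ C(uf_f) * C(avg_raw) * C(pt13_5)", data=df).fit()
anova_table = sm.stats.anova_lm(model, typ=2)
print(anova_table)  # F and p values analogous to those reported above
```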
| | 13 pt | | | | 5 pt | | | |
|---|---|---|---|---|---|---|---|---|
| | avg | | raw | | avg | | raw | |
| | df = 7 | df = 6 | df = 7 | df = 6 | df = 7 | df = 6 | df = 7 | df = 6 |
| Unfiltered | p = 0.006 *, F = 3.246 | p = 0.618, F = 0.742 | p = 0.001 *, F = 4.252 | p = 0.069, F = 2.096 | p = 0.236, F = 1.482 | p = 0.459, F = 0.990 | p = 0.154, F = 1.778 | p = 0.171, F = 1.717 |
| Filtered | p = 0.330, F = 1.178 | p = 0.396, F = 1.053 | p = 0.041 *, F = 2.275 | p = 0.256, F = 1.336 | p = 0.118, F = 1.665 | p = 0.114, F = 2.033 | p = 0.112, F = 2.044 | p = 0.107, F = 2.081 |