Eye Tracking in Virtual Reality: Vive Pro Eye Spatial Accuracy, Precision, and Calibration Reliability
Abstract
Introduction
Methods
Participants
Apparatus
Procedure
Data Processing
Analysis
Results
Gaze Sample Validity
Raw Fixation Data and Outlier Correction
Overall Accuracy and Precision
Data Quality for Participants and Sessions
Vision Correction and HMD Hardware
Inter-Pupillary Distance
Discussion
Ethics and Conflict of Interest
Acknowledgements
References
© 2022 by the authors. This article is licensed under a Creative Commons Attribution 4.0 International License.
Share and Cite
Schuetz, I.; Fiehler, K. Eye Tracking in Virtual Reality: Vive Pro Eye Spatial Accuracy, Precision, and Calibration Reliability. J. Eye Mov. Res. 2022, 15, 1-18. https://doi.org/10.16910/jemr.15.3.3