Testing Different Function Fitting Methods for Mobile Eye-Tracker Calibration
Abstract
Introduction
- A simulation of gaze vectors during simulated calibration. Horizontal and vertical angles between the optical axis and the line from the centre of the eyeball to the eye camera are simulated and enriched with controlled patterns of measurement noise. We use these gaze angles because they have proven to be relatively stable in terms of device slippage (Santini et al., 2019).
- A simulation of different calibration patterns. Different target patterns (e.g., circular pursuit, 9-point calibration) are investigated in different conditions. In particular, the difference in performance for inter- and extrapolation is addressed.
- To transfer the results of the simulation to the real world, we conducted a study in which we recorded calibration data from 7 subjects. This provides real calibration patterns and lets us verify that the simulated results transfer to a real application. Specifically, we compare the performance of the calibration methods on simulated versions of the real patterns with their performance on the data from the experiment.
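The first point above, simulating noisy gaze angles for a set of calibration targets, could look like the following in outline. The noise level, target angles, and function name are hypothetical placeholders, not the paper's actual parameters:

```python
import numpy as np

def simulate_gaze_angles(targets_deg, noise_sd_deg=0.5, seed=0):
    """Add Gaussian measurement noise to the ideal horizontal/vertical
    gaze angles (in degrees) for each calibration target."""
    rng = np.random.default_rng(seed)
    targets = np.asarray(targets_deg, dtype=float)
    return targets + rng.normal(0.0, noise_sd_deg, size=targets.shape)

# Ideal angles for a 5-point pattern (four corners plus centre),
# here assumed to span +/-20 degrees.
five_point = [(-20, -20), (20, -20), (0, 0), (-20, 20), (20, 20)]
noisy = simulate_gaze_angles(five_point)
```

Structured noise patterns (e.g. direction-dependent precision error) would replace the single Gaussian term here.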
Related Work
Dataset
Simulated Data
- 5-point calibration (5p) This is a normal 5-point calibration pattern with dots in the corners and one in the centre. It is usually used to fit a homography for gaze mapping.
- 9-point calibration (9p) This is a normal 9-point calibration pattern with three rows of three points each. It is popular for calibrating desktop eye-tracking devices.
- Centre calibration (Centre) Snake pattern, simulating a smooth-pursuit calibration, in the centre of the seen field.
- Full calibration (Full) Snake pattern across the entire field of view.
- Subject huge (huge) This pattern is extracted from the experiment to compare the simulation with the real data. It covers almost the entire field.
- Subject small (small) This pattern is extracted from the experiment to compare the simulation with the real data. It covers only the centre of the field.
- 20 × 20 full field This is a pattern consisting of 20 × 20 dots evenly distributed over the entire field. It is used to evaluate the performance of calibrations.
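The grid-style patterns above (e.g. the 9-point calibration and the 20 × 20 evaluation grid) can be generated as evenly spaced targets in visual angle. A minimal sketch; the field extents in degrees are hypothetical placeholders:

```python
import numpy as np

def grid_pattern(n_x, n_y, half_width_deg=20.0, half_height_deg=15.0):
    """Evenly spaced target grid in visual angle (degrees),
    e.g. 3x3 for a 9-point calibration, 20x20 for evaluation."""
    xs = np.linspace(-half_width_deg, half_width_deg, n_x)
    ys = np.linspace(-half_height_deg, half_height_deg, n_y)
    gx, gy = np.meshgrid(xs, ys)
    return np.column_stack([gx.ravel(), gy.ravel()])

nine_point = grid_pattern(3, 3)    # 9-point calibration pattern
eval_grid = grid_pattern(20, 20)   # 20 x 20 full-field evaluation pattern
```

The snake (smooth-pursuit) patterns would instead be continuous trajectories sampled over time rather than discrete fixation targets.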
Data collected in real-world experiment
Methods
- Calculate the error for each simulation:
- Fit the method with the gaze vectors to the target points of the calibration pattern used.
- Estimate the gaze points with the gaze vectors of the full field pattern (Figure 4 (g)).
- Calculate the angle between the vector from the camera to the estimated point and the vector from the camera to the true point.
- Calculate the mean of the absolute values of the angles. This is the mean angular error of the simulation.
- Calculate the mean of the mean error angles of all simulations.
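The error-calculation steps above can be sketched as follows, assuming the scene camera sits at the origin and both the estimated and true gaze points are available as 3D positions (function names are illustrative, not from the paper):

```python
import numpy as np

def angular_error_deg(p_est, p_true, cam=np.zeros(3)):
    """Angle in degrees between the vector camera -> estimated point
    and the vector camera -> true point, per sample."""
    v1 = np.atleast_2d(p_est) - cam
    v2 = np.atleast_2d(p_true) - cam
    v1 = v1 / np.linalg.norm(v1, axis=1, keepdims=True)
    v2 = v2 / np.linalg.norm(v2, axis=1, keepdims=True)
    cos = np.clip(np.sum(v1 * v2, axis=1), -1.0, 1.0)
    return np.degrees(np.arccos(cos))

def mean_error_over_simulations(per_sim_errors):
    """Mean angular error within each simulation, then the
    mean of those means across all simulations."""
    return float(np.mean([np.mean(e) for e in per_sim_errors]))

# Orthogonal directions are 90 degrees apart.
err = angular_error_deg([[1.0, 0.0, 0.0]], [[0.0, 1.0, 0.0]])
```

The fitting step itself (mapping gaze vectors to target points) is method-specific and is abstracted away here.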
Evaluations
Comparative view on all methods
Polynomial Regression
Ridge Regression
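As a sketch of the regularised polynomial fit compared in these two sections: the paper relies on scikit-learn (Pedregosa et al., 2011), but the closed-form ridge solution can be written in a few lines of NumPy. The polynomial degree, regularisation strength, and synthetic calibration data below are hypothetical placeholders:

```python
import numpy as np

def poly_features(xy, degree=2):
    """Bivariate polynomial feature expansion up to total degree 'degree'."""
    x, y = xy[:, 0], xy[:, 1]
    cols = [x**i * y**j for i in range(degree + 1)
            for j in range(degree + 1 - i)]
    return np.column_stack(cols)

def ridge_fit(X, Y, alpha=1.0):
    """Closed-form ridge regression: (X^T X + alpha*I)^-1 X^T Y.
    With alpha = 0 this reduces to ordinary polynomial regression."""
    n = X.shape[1]
    return np.linalg.solve(X.T @ X + alpha * np.eye(n), X.T @ Y)

# Hypothetical calibration data: noisy gaze angles -> target angles (deg).
rng = np.random.default_rng(0)
gaze = rng.uniform(-20, 20, size=(9, 2))
targets = 1.02 * gaze + 0.3 + rng.normal(0, 0.2, size=gaze.shape)

W = ridge_fit(poly_features(gaze), targets, alpha=1.0)
pred = poly_features(gaze) @ W
```

The ridge penalty (Hoerl & Kennard, 1970) shrinks the coefficients and stabilises the fit when calibration points are few or poorly spread, which is exactly the inter-/extrapolation setting examined here.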
Intra- and Extrapolation
Experimental Results
Limitations
Conclusions
Acknowledgments
Appendix A
Calculation of the error in degrees
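Following the error definition in the Methods section, the angular error in degrees between the estimated gaze point p_est and the true point p_true, both seen from the camera position c, can be written as (a reconstruction from the text, not necessarily the paper's exact notation):

```latex
\theta = \arccos\!\left(
  \frac{(\mathbf{p}_{\mathrm{est}} - \mathbf{c}) \cdot (\mathbf{p}_{\mathrm{true}} - \mathbf{c})}
       {\lVert \mathbf{p}_{\mathrm{est}} - \mathbf{c} \rVert \,
        \lVert \mathbf{p}_{\mathrm{true}} - \mathbf{c} \rVert}
\right) \cdot \frac{180}{\pi}
```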
Unity
- Create a scene by drag-and-drop of game objects like spheres and cubes.
- Manipulate the scene using C# scripts.
- Easily switch between the world coordinate system and the camera's point of view.
- Efficient calculation and extraction of data with Unity's integrated functions.
Aruco-marker
Influence of the precision error on the Methods
References
- Atchison, D. A. 2017. Axes and angles of the eye, volume one. In P. Artal, Handbook of visual optics. CRC Press: pp. 455–467. [Google Scholar] [CrossRef]
- Blignaut, P. 2014. Mapping the Pupil-Glint Vector to Gaze Coordinates in a Simple Video-Based Eye Tracker. Journal of Eye Movement Research 7. [Google Scholar] [CrossRef]
- Blignaut, P. 2016. Idiosyncratic feature-based gaze mapping. Journal of Eye Movement Research 9. [Google Scholar] [CrossRef]
- Cortes, C., and V. Vapnik. 1995. Support-vector networks. Machine learning 20: 273–297. [Google Scholar] [CrossRef]
- Drewes, H., K. Pfeuffer, and F. Alt. 2019. Time-and Space-Efficient Eye Tracker Calibration. In Proceedings of the 11th ACM Symposium on Eye Tracking Research & Applications. New York, NY, USA: Association for Computing Machinery. [Google Scholar] [CrossRef]
- Drucker, H., C. J. Burges, L. Kaufman, A. Smola, and V. Vapnik. 1997. Support vector regression machines. Advances in neural information processing systems 9: 155–161. [Google Scholar]
- Fischler, M. A., and R. C. Bolles. 1981. Random Sample Consensus: A Paradigm for Model Fitting with Applications to Image Analysis and Automated Cartography. Commun. ACM 24: 381–395. [Google Scholar] [CrossRef]
- Fuhl, W., J. Schneider, and E. Kasneci. 2021. 1000 Pupil Segmentations in a Second using Haar Like Features and Statistical Learning. International Conference on Computer Vision Workshops, ICCVW. [Google Scholar] [CrossRef]
- Hassoumi, A., V. Peysakhovich, and C. Hurter. 2019. Improving eye-tracking calibration accuracy using symbolic regression. Plos one 14: e0213675. [Google Scholar] [CrossRef] [PubMed]
- Hoerl, A. E., and R. W. Kennard. 1970. Ridge regression: Biased estimation for nonorthogonal problems. Technometrics 12: 55–67. [Google Scholar] [CrossRef]
- Kasprowski, P., K. Harezlak, and M. Stasch. 2014. Guidelines for eye tracker calibration using points of regard. June, 284. [Google Scholar] [CrossRef]
- Kim, J., M. Stengel, A. Majercik, S. De Mello, D. Dunn, S. Laine, and D. Luebke. 2019. NVGaze: An anatomically-informed dataset for low-latency, near-eye gaze estimation. Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems, pp. 1–12. [Google Scholar] [CrossRef]
- Kübler, T. C. 2021. Look! Blickschulungsbrille: Technical specifications. Tech. rep., Look! ET, December. [Google Scholar]
- Nair, N., R. Kothari, A. K. Chaudhary, Z. Yang, G. J. Diaz, J. B. Pelz, and R. J. Bailey. 2020. RIT-Eyes: Rendering of near-eye images for eye-tracking applications. ACM Symposium on Applied Perception, pp. 1–9. [Google Scholar] [CrossRef]
- Narcizo, F. B., F. E. dos Santos, and D. W. Hansen. 2021. High-Accuracy Gaze Estimation for Interpolation-Based Eye-Tracking Methods. Vision 5. [Google Scholar] [CrossRef]
- Pedregosa, F., G. Varoquaux, A. Gramfort, V. Michel, B. Thirion, O. Grisel, and E. Duchesnay. 2011. Scikit-learn: Machine Learning in Python. Journal of Machine Learning Research 12: 2825–2830. [Google Scholar]
- Santini, T., W. Fuhl, D. Geisler, and E. Kasneci. 2017. EyeRecToo: Open-Source Software for Real-Time Pervasive Head-Mounted Eye-Tracking. 12th Joint Conference on Computer Vision, Imaging and Computer Graphics Theory and Applications (VISIGRAPP 2017), February. [Google Scholar] [CrossRef]
- Santini, T., D. C. Niehorster, and E. Kasneci. 2019. Get a Grip: Slippage-Robust and Glint-Free Gaze Estimation for Real-Time Pervasive Head-Mounted Eye Tracking. Proceedings of the 2019 ACM Symposium on Eye Tracking Research & Applications (ETRA), June. [Google Scholar] [CrossRef]
- Tibshirani, R. 1996. Regression Shrinkage and Selection via the Lasso. Journal of the Royal Statistical Society. Series B (Methodological) 58: 267–288. [Google Scholar] [CrossRef]
- Wood, E., T. Baltrušaitis, L.-P. Morency, P. Robinson, and A. Bulling. 2016. Learning an appearance-based gaze estimator from one million synthesised images. Proceedings of the Ninth Biennial ACM Symposium on Eye Tracking Research & Applications, pp. 131–138. [Google Scholar] [CrossRef]
Copyright © 2023. This article is licensed under a Creative Commons Attribution 4.0 International License.
Severitt, B.R.; Kübler, T.C.; Kasneci, E. Testing Different Function Fitting Methods for Mobile Eye-Tracker Calibration. J. Eye Mov. Res. 2023, 16, 1-17. https://doi.org/10.16910/jemr.16.4.2