Electro-Oculography and Proprioceptive Calibration Enable Horizontal and Vertical Gaze Estimation, Even with Eyes Closed
Abstract
1. Introduction
- We introduce a proprioceptive calibration method that enables EOG calibration in the absence of visual input.
- We propose a novel approach for estimating gaze direction from EOG signals when the eyes are closed.
- We establish eye tracking as a benchmark for evaluating EOG-based gaze direction estimation by comparing both modalities under eyes-open conditions.
- We validate the effectiveness of EOG-based gaze estimation under eyes-closed conditions by comparing aligned EOG trajectory data with shuffled (chance-level) control data.
2. Related Works
3. Materials and Methods
3.1. Participants
3.2. Experiment Design
3.3. Apparatus
3.4. Procedure
4. Analysis
4.1. Processing
4.2. Comparison of EOG and Eye-Tracking Similarity Between Eyes-Open Lights-On and Lights-Off Conditions
4.3. Similarity Between Ground Truth, Eye Tracking, and EOG
4.4. Statistical Comparisons
4.4.1. Comparing EOG and Eye-Tracking Signals for the Eyes-Open Condition
4.4.2. Comparing EOG Signals and the Ground Truth Trajectory for the Eyes-Closed Condition
5. Results
5.1. Effect of Lighting Conditions on Similarity Between EOG and Eye-Tracking Signals Under Eyes-Open Conditions
5.2. Grand Average of Instructed Trajectory vs. Sensed Gaze Across Lighting Conditions
5.3. Cross-Correlation Between Instructed Trajectory and Sensed Gaze
5.4. Comparison of EOG and Eye-Tracking Signals Under Eyes-Open Conditions
5.5. Control Analysis of EOG Accuracy Under Eyes-Closed Conditions Using Surrogate Data
6. Discussion
7. Limitations and Future Work
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Acknowledgments
Conflicts of Interest
References
- Chang, W.D.; Cha, H.S.; Kim, D.Y.; Kim, S.H.; Im, C.H. Development of an electrooculogram-based eye-computer interface for communication of individuals with amyotrophic lateral sclerosis. J. Neuroeng. Rehabil. 2017, 14, 89. [Google Scholar] [CrossRef] [PubMed]
- Chieh, T.C.; Mustafa, M.M.; Hussain, A.; Hendi, S.F.; Majlis, B.Y. Development of vehicle driver drowsiness detection system using electrooculogram (EOG). In Proceedings of the 2005 1st International Conference on Computers, Communications, & Signal Processing with Special Track on Biomedical Engineering, Kuala Lumpur, Malaysia, 14–16 November 2005; pp. 165–168. [Google Scholar] [CrossRef]
- Fukuda, K.; Stern, J.A.; Brown, T.B.; Russo, M.B. Cognition, blinks, eye-movements, and pupillary movements during performance of a running memory task. Aviat. Space Environ. Med. 2005, 76, C75–C85. [Google Scholar] [PubMed]
- Tsai, Y.F.; Viirre, E.; Strychacz, C.; Chase, B.; Jung, T.P. Task performance and eye activity: Predicting behavior relating to cognitive workload. Aviat. Space Environ. Med. 2007, 78, B176–B185. [Google Scholar] [PubMed]
- Lee, K.R.; Chang, W.D.; Kim, S.; Im, C.H. Real-Time “Eye-Writing” Recognition Using Electrooculogram. IEEE Trans. Neural Syst. Rehabil. Eng. 2017, 25, 37–48. [Google Scholar] [CrossRef]
- Katz, B.F.G.; Kammoun, S.; Parseihian, G.; Gutierrez, O.; Brilhault, A.; Auvray, M.; Truillet, P.; Denis, M.; Thorpe, S.; Jouffrais, C. NAVIG: Augmented reality guidance system for the visually impaired. Virtual Real. 2012, 16, 253–269. [Google Scholar] [CrossRef]
- Bleau, M.; Kafle, K.; Wang, M.; Kabore, S.S.; Cueva-Vargas, J.L.; Nemargut, J.P. International prevalence of tactile map usage and its impact on navigational independence and well-being of people with visual impairments. Sci. Rep. 2025, 15, 27245. [Google Scholar] [CrossRef]
- Xu, C.; Israr, A.; Poupyrev, I.; Bau, O.; Harrison, C. Tactile display for the visually impaired using TeslaTouch. In Proceedings of the CHI ’11 Extended Abstracts on Human Factors in Computing Systems, Vancouver, BC, Canada, 7–12 May 2011; pp. 317–322. [Google Scholar] [CrossRef]
- Duchowski, A.T.; House, D.H.; Gestring, J.G. Comparing Estimated Gaze Depth in Virtual and Physical Environments. In Proceedings of the Symposium on Eye Tracking Research & Applications (ETRA), Safety Harbor, FL, USA, 26–28 March 2014; Association for Computing Machinery: New York, NY, USA, 2014. [Google Scholar] [CrossRef]
- Kuo, T.; Shih, K.; Chung, S.; Chen, H.H. Depth from Gaze. In Proceedings of the IEEE International Conference on Image Processing (ICIP), Athens, Greece, 7–10 October 2018; IEEE: Piscataway, NJ, USA, 2018; pp. 2910–2914. [Google Scholar]
- Schirm, J.; Gómez-Vargas, A.R.; Perusquía-Hernández, M.; Skarbez, R.T.; Isoyama, N.; Uchiyama, H.; Kiyokawa, K. Identification of Language-Induced Mental Load from Eye Behaviors in Virtual Reality. Sensors 2023, 23, 6667. [Google Scholar] [CrossRef]
- Batmaz, A.U.; Turkmen, R.; Sarac, M.; Barrera Machuca, M.D.; Stuerzlinger, W. Re-investigating the Effect of the Vergence-Accommodation Conflict on 3D Pointing. In Proceedings of the 29th ACM Symposium on Virtual Reality Software and Technology, VRST ’23, New York, NY, USA, 9–11 October 2023. [Google Scholar]
- Hansen, D.W.; Ji, Q. In the Eye of the Beholder: A Survey of Models for Eyes and Gaze. IEEE Trans. Pattern Anal. Mach. Intell. 2010, 32, 478–500. [Google Scholar] [CrossRef]
- Constable, P.A.; Bach, M.; Frishman, L.; Jeffrey, B.G.; Robson, A.G. Request for Input from the ISCEV Membership: ISCEV Standard EOG (2017) Update–October 2016 Draft. Available online: https://www.iscev.org/standards/pdfs/ISCEV-EOG-Standard-2017-Draft-01.pdf (accessed on 24 June 2025).
- Zibandehpoor, M.; Alizadehziri, F.; Larki, A.A.; Teymouri, S.; Delrobaei, M. Electrooculography Dataset for Objective Spatial Navigation Assessment in Healthy Participants. Sci. Data 2025, 12, 553. [Google Scholar] [CrossRef]
- Robert, F.M.; Otheguy, M.; Nourrit, V.; de Bougrenet de la Tocnaye, J.L. Potential of a Laser Pointer Contact Lens to Improve the Reliability of Video-Based Eye-Trackers in Indoor and Outdoor Conditions. J. Eye Mov. Res. 2024, 17, 1–16. [Google Scholar] [CrossRef]
- Kourkoumelis, N.; Tzaphlidou, M. Eye Safety Related to Near Infrared Radiation Exposure to Biometric Devices. Sci. World J. 2011, 11, 902610. [Google Scholar] [CrossRef]
- Gowrisankaran, S.; Sheedy, J.E.; Hayes, J.R. Eyelid Squint Response to Asthenopia-Inducing Conditions. Optom. Vis. Sci. 2007, 84, 611. [Google Scholar] [CrossRef]
- Bacharach, J.; Lee, W.W.; Harrison, A.R.; Freddo, T.F. A review of acquired blepharoptosis: Prevalence, diagnosis, and current treatment options. Eye 2021, 35, 2468–2481. [Google Scholar] [CrossRef]
- Ripa, M.; Cuffaro, G.; Pafundi, P.C.; Valente, P.; Battendieri, R.; Buzzonetti, L.; Mattei, R.; Rizzo, S.; Savino, G. An epidemiologic analysis of the association between eyelid disorders and ocular motility disorders in pediatric age. Sci. Rep. 2022, 12, 8840. [Google Scholar] [CrossRef]
- Yan, C.; Pan, W.; Dai, S.; Xu, B.; Xu, C.; Liu, H.; Li, X. FSKT-GE: Feature maps similarity knowledge transfer for low-resolution gaze estimation. IET Image Process. 2024, 18, 1642–1654. [Google Scholar] [CrossRef]
- Yan, C.; Pan, W.; Xu, C.; Dai, S.; Li, X. Gaze Estimation via Strip Pooling and Multi-Criss-Cross Attention Networks. Appl. Sci. 2023, 13, 5901. [Google Scholar] [CrossRef]
- Robinson, D.A. A Method of Measuring Eye Movement Using a Scleral Search Coil in a Magnetic Field. IEEE Trans. Bio-Med. Electron. 1963, 10, 137–145. [Google Scholar]
- Sprenger, A.; Neppert, B.; Köster, S.; Gais, S.; Kömpf, D.; Helmchen, C.; Kimmig, H. Long-term eye movement recordings with a scleral search coil-eyelid protection device allows new applications. J. Neurosci. Methods 2008, 170, 305–309. [Google Scholar] [CrossRef] [PubMed]
- Creel, D.J. The electrooculogram. In Handbook of Clinical Neurology; Clinical Neurophysiology: Basis and Technical Aspects; Levin, K.H., Chauvel, P., Eds.; Elsevier: Amsterdam, The Netherlands, 2019; Volume 160, Chapter 33; pp. 495–499. [Google Scholar] [CrossRef]
- Majaranta, P.; Bulling, A. Eye Tracking and Eye-Based Human–Computer Interaction. In Advances in Physiological Computing; Fairclough, S.H., Gilleade, K., Eds.; Human–Computer Interaction Series; Springer: London, UK, 2014; pp. 39–65. [Google Scholar] [CrossRef]
- Mack, D.J.; Schönle, P.; Fateh, S.; Burger, T.; Huang, Q.; Schwarz, U. An EOG-based, head-mounted eye tracker with 1 kHz sampling rate. In Proceedings of the 2015 IEEE Biomedical Circuits and Systems Conference (BioCAS), Atlanta, GA, USA, 22–24 October 2015; pp. 1–4. [Google Scholar] [CrossRef]
- Le Meur, O.; Le Callet, P.; Barba, D. Predicting visual fixations on video based on low-level visual features. Vis. Res. 2007, 47, 2483–2498. [Google Scholar] [CrossRef]
- Andersson, R.; Nyström, M.; Holmqvist, K. Sampling Frequency and Eye-Tracking Measures: How Speed Affects Durations, Latencies, and More. J. Eye Mov. Res. 2009, 3, 1–12. [Google Scholar] [CrossRef]
- Darling, W.G.; Wall, B.M.; Coffman, C.R.; Capaday, C. Pointing to One’s Moving Hand: Putative Internal Models Do Not Contribute to Proprioceptive Acuity. Front. Hum. Neurosci. 2018, 12, 177. [Google Scholar] [CrossRef] [PubMed]
- Dollack, F.; Perusquía-Hernández, M.; Kadone, H.; Suzuki, K. Head Anticipation During Locomotion with Auditory Instruction in the Presence and Absence of Visual Input. Front. Hum. Neurosci. 2019, 13, 293. [Google Scholar] [CrossRef] [PubMed]
- van Gorp, H.; van Gilst, M.M.; Overeem, S.; Dujardin, S.; Pijpers, A.; van Wetten, B.; Fonseca, P.; van Sloun, R.J.G. Single-channel EOG sleep staging on a heterogeneous cohort of subjects with sleep disorders. Physiol. Meas. 2024, 45, 055007. [Google Scholar] [CrossRef] [PubMed]
- Herman, J.H.; Erman, M.; Boys, R.; Peiser, L.; Taylor, M.E.; Roffwarg, H.P. Evidence for a Directional Correspondence Between Eye Movements and Dream Imagery in REM Sleep. Sleep 1984, 7, 52–63. [Google Scholar] [CrossRef]
- LaBerge, S.; Baird, B.; Zimbardo, P.G. Smooth tracking of visual targets distinguishes lucid REM sleep dreaming and waking perception from imagination. Nat. Commun. 2018, 9, 3298. [Google Scholar] [CrossRef]
- Senzai, Y.; Scanziani, M. A cognitive process occurring during sleep is revealed by rapid eye movements. Science 2022, 377, 999–1004. [Google Scholar] [CrossRef]
- Bulling, A.; Roggen, D.; Tröster, G. It’s in your eyes: Towards context-awareness and mobile HCI using wearable EOG goggles. In Proceedings of the 10th International Conference on Ubiquitous Computing, UbiComp ’08, Seoul, Republic of Korea, 21–24 September 2008; Association for Computing Machinery: New York, NY, USA, 2008; pp. 84–93. [Google Scholar]
- Bulling, A.; Roggen, D.; Tröster, G. Wearable EOG goggles: Eye-based interaction in everyday environments. In Proceedings of the CHI ’09 Extended Abstracts on Human Factors in Computing Systems, CHI EA ’09, Boston, MA, USA, 4–9 April 2009; Association for Computing Machinery: New York, NY, USA, 2009; pp. 3259–3264. [Google Scholar]
- Chang, W.D. Electrooculograms for Human–Computer Interaction: A Review. Sensors 2019, 19, 2690. [Google Scholar] [CrossRef]
- Bulling, A.; Ward, J.A.; Gellersen, H.; Troster, G. Eye Movement Analysis for Activity Recognition Using Electrooculography. IEEE Trans. Pattern Anal. Mach. Intell. 2011, 33, 741–753. [Google Scholar] [CrossRef]
- Merino, M.; Rivera, O.; Gómez, I.; Molina, A.; Dorronzoro, E. A Method of EOG Signal Processing to Detect the Direction of Eye Movements. In Proceedings of the 2010 First International Conference on Sensor Device Technologies and Applications, Venice, Italy, 18–25 July 2010; pp. 100–105. [Google Scholar]
- Hládek, L.; Porr, B.; Brimijoin, W.O. Real-time estimation of horizontal gaze angle by saccade integration using in-ear electrooculography. PLoS ONE 2018, 13, e0190420. [Google Scholar] [CrossRef]
- Toivanen, M.; Pettersson, K.; Lukander, K. A probabilistic real-time algorithm for detecting blinks, saccades, and fixations from EOG data. J. Eye Mov. Res. 2015, 8, 1–14. [Google Scholar] [CrossRef]
- Chang, W.D.; Cha, H.S.; Im, C.H. Removing the Interdependency between Horizontal and Vertical Eye-Movement Components in Electrooculograms. Sensors 2016, 16, 227. [Google Scholar] [CrossRef]
- Ryu, J.; Lee, M.; Kim, D.H. EOG-based eye tracking protocol using baseline drift removal algorithm for long-term eye movement detection. Expert Syst. Appl. 2019, 131, 275–287. [Google Scholar] [CrossRef]
- Yan, M.; Tamura, H.; Tanno, K. A Study on Gaze Estimation System using Cross-Channels Electrooculogram Signals. In Proceedings of the International MultiConference of Engineers and Computer Scientists 2014, Hong Kong, 12–14 March 2014. [Google Scholar]
- Manabe, H.; Fukumoto, M.; Yagi, T. Direct Gaze Estimation Based on Nonlinearity of EOG. IEEE Trans. Biomed. Eng. 2015, 62, 1553–1562. [Google Scholar] [CrossRef] [PubMed]
- Barbara, N.; Camilleri, T.A.; Camilleri, K.P. EOG-Based Gaze Angle Estimation Using a Battery Model of the Eye. In Proceedings of the 2019 41st Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC), Berlin, Germany, 23–27 July 2019; pp. 6918–6921. [Google Scholar]
- Leigh, R.; Zee, D. Eye movements of the blind. Investig. Ophthalmol. Vis. Sci. 1980, 19, 328–331. [Google Scholar]
- Hsieh, C.W.; Chen, H.S.; Jong, T.L. The study of the relationship between electro-oculogram and the features of closed eye motion. In Proceedings of the 2008 International Conference on Information Technology and Applications in Biomedicine, Shenzhen, China, 30–31 May 2008; pp. 420–422. [Google Scholar]
- Findling, R.; Nguyen, L.; Sigg, S. Closed-Eye Gaze Gestures: Detection and Recognition of Closed-Eye Movements with Cameras in Smart Glasses. In Proceedings of the Advances in Computational Intelligence—15th International Work-Conference on Artificial Neural Networks, IWANN 2019, Gran Canaria, Spain, 12–14 June 2019; pp. 322–334. [Google Scholar] [CrossRef]
- Findling, R.D.; Quddus, T.; Sigg, S. Hide my Gaze with EOG! Towards Closed-Eye Gaze Gesture Passwords that Resist Observation-Attacks with Electrooculography in Smart Glasses. In Proceedings of the 17th International Conference on Advances in Mobile Computing & Multimedia, MoMM2019, Munich, Germany, 2–4 December 2019; Association for Computing Machinery: New York, NY, USA, 2020; pp. 107–116. [Google Scholar] [CrossRef]
- Tamaki, D.; Fujimori, H.; Tanaka, H. An Interface using Electrooculography with Closed Eyes. Int. Symp. Affect. Sci. Eng. 2019, ISASE2019, 1–4. [Google Scholar] [CrossRef]
- Ben Barak-Dror, O.; Hadad, B.; Barhum, H.; Haggiag, D.; Tepper, M.; Gannot, I.; Nir, Y. Touchless short-wave infrared imaging for dynamic rapid pupillometry and gaze estimation in closed eyes. Commun. Med. 2024, 4, 1–12. [Google Scholar] [CrossRef]
- MacNeil, R.R.; Gunawardane, P.; Dunkle, J.; Zhao, L.; Chiao, M.; de Silva, C.W.; Enns, J.T. Using electrooculography to track closed-eye movements. J. Vis. 2021, 21, 1898. [Google Scholar] [CrossRef]
- MacNeil, R.R. Tracking the Closed Eye by Calibrating Electrooculography with Pupil-Corneal Reflection. Ph.D. Thesis, University of British Columbia, Vancouver, BC, Canada, 2020. [Google Scholar] [CrossRef]
- Kassner, M.; Patera, W.; Bulling, A. Pupil: An Open Source Platform for Pervasive Eye Tracking and Mobile Gaze-based Interaction. arXiv 2014, arXiv:1405.0006. [Google Scholar] [CrossRef]
- openFrameworks Community. openFrameworks. Available online: https://openframeworks.cc/ (accessed on 31 August 2025).
- De Tommaso, D.; Wykowska, A. TobiiGlassesPySuite: An Open-source Suite for Using the Tobii Pro Glasses 2 in Eye-tracking Studies. In Proceedings of the 11th ACM Symposium on Eye Tracking Research & Applications, ETRA ’19, Denver, CO, USA, 25–28 June 2019; Association for Computing Machinery: New York, NY, USA, 2019; pp. 46:1–46:5. [Google Scholar] [CrossRef]
- Virtanen, P.; Gommers, R.; Oliphant, T.E.; Haberland, M.; Reddy, T.; Cournapeau, D.; Burovski, E.; Peterson, P.; Weckesser, W.; Bright, J.; et al. SciPy 1.0: Fundamental algorithms for scientific computing in Python. Nat. Methods 2020, 17, 261–272. [Google Scholar] [CrossRef]
- Lakens, D.; Scheel, A.M.; Isager, P.M. Equivalence Testing for Psychological Research: A Tutorial. Adv. Methods Pract. Psychol. Sci. 2018, 1, 259–269. [Google Scholar] [CrossRef]
- Maxwell, S.E.; Lau, M.Y.; Howard, G.S. Is psychology suffering from a replication crisis? What does “failure to replicate” really mean? Am. Psychol. 2015, 70, 487–498. [Google Scholar] [CrossRef]
- Cohn, J.F.; Schmidt, K.L. The timing of facial motion in posed and spontaneous smiles. Int. J. Wavelets Multiresolution Inf. Process. 2004, 2, 121–132. [Google Scholar] [CrossRef]
- Perusquía-Hernández, M.; Ayabe-Kanamura, S.; Suzuki, K. Human perception and biosignal-based identification of posed and spontaneous smiles. PLoS ONE 2019, 14, e0226328. [Google Scholar] [CrossRef]
- Händel, B.F.; Chen, X.; Inbar, M.; Kusnir, F.; Landau, A.N. A quantitative assessment of EOG eye tracking during free viewing in sighted and in congenitally blind. Brain Res. 2025, 1864, 149794. [Google Scholar] [CrossRef]
- Cooray, N. Proof of concept: Screening for REM sleep behaviour disorder with a minimal set of sensors. Clin. Neurophysiol. 2021, 132, 904–913. [Google Scholar] [CrossRef]
- Antoniades, C.A.; Spering, M. Eye movements in Parkinson’s disease: From neurophysiological mechanisms to diagnostic tools. Trends Neurosci. 2024, 47, 71–83. [Google Scholar] [CrossRef]
| Criteria | EOG | Eye Tracking |
|---|---|---|
| Measurements [26] | Measures the corneoretinal standing potential using electrodes around the eyes | Analyzes infrared light reflected from the eye using video-based cameras |
| Occlusion [13,26] | Can track under eyelids | Cannot track when eyes are closed or occluded |
| Illumination sensitivity [14,15,16] | Ambient lighting might cause small drifts [25] | Performance may degrade in bright outdoor environments or in dark conditions without infrared illumination |
| Optical interface sensitivity [17] | Not affected by reflections | Performance may degrade due to optical interference from ambient infrared light and reflective surfaces |
| Spatial accuracy [27,28] | Lower (approx. 1–2° error) | Higher (up to 0.5° or better) |
| Temporal resolution [27,29] | Higher (up to 1 kHz) | Lower (typically 60–300 Hz; high-end systems can reach up to 1000 Hz) |
| Computational load [26] | Lower (signal processing) | Higher (video acquisition and image analysis) |
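The table's note on EOG's lower computational load reflects that, after calibration, gaze estimation often reduces to a linear mapping from electrode voltage to gaze angle. A minimal least-squares sketch of such a calibration; the voltages, angles, and function names below are illustrative assumptions, not the authors' pipeline:

```python
import numpy as np

# Calibration data: known fixation-target angles (degrees) and the EOG
# voltages (microvolts) recorded at each target. Values are toy examples.
target_deg = np.array([-20.0, -10.0, 0.0, 10.0, 20.0])
eog_uv = np.array([-205.0, -98.0, 3.0, 101.0, 198.0])

# Fit gaze_deg ~ gain * eog_uv + offset by ordinary least squares.
gain, offset = np.polyfit(eog_uv, target_deg, deg=1)

def eog_to_gaze(voltage_uv):
    """Map a raw EOG voltage to an estimated gaze angle in degrees."""
    return gain * voltage_uv + offset

print(round(eog_to_gaze(50.0), 1))  # roughly 5 degrees for this toy data
```

In practice one such mapping is fitted per channel (horizontal and vertical), and baseline drift is removed before applying it.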
| Direction | Condition | Shapiro–Wilk (W, p-Value) | Wilcoxon (W, p-Value) | Cohen’s d |
|---|---|---|---|---|
| Horizontal | Lights on | 0.775, 0.011 | 16, 0.496 | 0.45 |
| | Lights off | 0.731, 0.003 | | |
| Vertical | Lights on | 0.699, 0.001 | 19, 0.734 | −0.25 |
| | Lights off | 0.889, 0.199 | | |
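The statistics in this table follow a standard pattern: Shapiro–Wilk tests normality within each condition, a Wilcoxon signed-rank test compares the paired conditions when normality is rejected, and Cohen's d reports effect size. A hedged sketch on synthetic data (the scores and the `cohens_d` helper are ours, not the study data):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Synthetic per-participant similarity scores for two lighting conditions.
lights_on = rng.normal(0.80, 0.05, size=12)
lights_off = rng.normal(0.78, 0.05, size=12)

# Shapiro–Wilk normality test per condition (returns W statistic, p-value).
w_on, p_on = stats.shapiro(lights_on)
w_off, p_off = stats.shapiro(lights_off)

# Paired non-parametric comparison of the two conditions.
w_stat, p_val = stats.wilcoxon(lights_on, lights_off)

def cohens_d(a, b):
    """Cohen's d using the pooled standard deviation of the two samples."""
    pooled = np.sqrt((np.var(a, ddof=1) + np.var(b, ddof=1)) / 2)
    return (np.mean(a) - np.mean(b)) / pooled

d = cohens_d(lights_on, lights_off)
```

With the table's non-significant Wilcoxon p-values (0.496 and 0.734), EOG/eye-tracking similarity did not differ reliably between lighting conditions.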
| Participant | W-Stat X | p-Val X | Equiv Bound X | 90% CI X | Equiv X | W-Stat Y | p-Val Y | Equiv Bound Y | 90% CI Y | Equiv Y |
|---|---|---|---|---|---|---|---|---|---|---|
| All | 124,940 | 1.000 | | | Yes | 124,130 | 1.000 | | | Yes |
| P01 | 125,160 | 1.000 | | | Yes | 116,194 | 0.359 | | | Yes |
| P02 | 117,783 | 0.759 | | | Yes | 121,977 | 1.000 | | | Yes |
| P03 | 121,785 | 1.000 | | | Yes | 124,363 | 1.000 | | | Yes |
| P04 | 118,439 | 1.000 | | | Yes | 129,296 | 1.000 | | | Yes |
| P05 | 120,971 | 1.000 | | | Yes | 128,942 | 1.000 | | | Yes |
| P06 | 120,104 | 1.000 | | | Yes | 127,549 | 1.000 | | | Yes |
| P07 | 117,370 | 0.629 | | | Yes | 127,024 | 1.000 | | | Yes |
| P08 | 129,136 | 1.000 | | | Yes | 121,666 | 1.000 | | | Yes |
| P09 | 128,758 | 1.000 | | | Yes | 115,432 | 0.244 | | | Yes |
| P10 | 123,344 | 1.000 | | | Yes | 123,664 | 1.000 | | | Yes |
| P11 | 128,490 | 1.000 | | | Yes | 124,597 | 1.000 | | | Yes |
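The "Equiv" columns reflect a TOST-style equivalence test: two one-sided tests against a lower and an upper equivalence bound, with equivalence claimed when both one-sided p-values fall below alpha (see the Lakens et al. tutorial cited in the references). The sketch below substitutes a simple parametric t-based TOST for the rank-based version in the table; the bounds and data are illustrative assumptions:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
# Synthetic paired differences between EOG and eye-tracking gaze (degrees).
diff = rng.normal(0.0, 0.5, size=200)

def tost(d, low, high):
    """Two one-sided t-tests: is the mean difference inside (low, high)?"""
    n = len(d)
    mean = np.mean(d)
    se = np.std(d, ddof=1) / np.sqrt(n)
    p_low = 1 - stats.t.cdf((mean - low) / se, df=n - 1)   # H0: mean <= low
    p_high = stats.t.cdf((mean - high) / se, df=n - 1)     # H0: mean >= high
    return max(p_low, p_high)  # equivalence if this is below alpha

p_equiv = tost(diff, low=-0.2, high=0.2)
equivalent = p_equiv < 0.05
```

Note that in equivalence testing a small p-value supports similarity, which is why rows with p = 1.000 in a standard difference test can still read "Yes" under their own equivalence bounds.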
| Axis | Condition | MAE (Mean ± SD) | Shapiro–Wilk (W, p-Value) | Wilcoxon (W, p-Value) | Cohen’s d |
|---|---|---|---|---|---|
| X | Paired | 0.334 ± 0.123 | 0.828, <0.001 | 0, <0.001 | −4.58 |
| | Shuffled | 0.565 ± 0.128 | 0.904, <0.001 | | |
| Y | Paired | 0.327 ± 0.135 | 0.536, <0.001 | 107, <0.001 | −1.85 |
| | Shuffled | 0.465 ± 0.120 | 0.582, <0.001 | | |
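The surrogate control behind this table compares each EOG trace's mean absolute error (MAE) against its own ground-truth trajectory with the MAE against a randomly shuffled pairing; the shuffled pairing estimates chance-level error. A hedged sketch of the idea on toy trajectories (array shapes and noise levels are our assumptions):

```python
import numpy as np

rng = np.random.default_rng(2)
n_trials, n_samples = 20, 500

# Toy ground-truth gaze trajectories and noisy "EOG" estimates that track them.
truth = np.cumsum(rng.normal(0, 0.05, (n_trials, n_samples)), axis=1)
eog = truth + rng.normal(0, 0.1, (n_trials, n_samples))

def mae(a, b):
    """Per-trial mean absolute error between two sets of trajectories."""
    return np.mean(np.abs(a - b), axis=1)

paired_mae = mae(eog, truth)              # correct trial pairing
perm = rng.permutation(n_trials)          # surrogate (shuffled) pairing
shuffled_mae = mae(eog, truth[perm])

# Tracking is meaningful if paired MAE beats the chance-level shuffled MAE.
print(paired_mae.mean() < shuffled_mae.mean())
```

The large negative Cohen's d values in the table (−4.58 and −1.85) indicate that paired MAE was far below the shuffled chance level on both axes, supporting genuine eyes-closed tracking.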
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
© 2025 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Wei, X.; Dollack, F.; Kiyokawa, K.; Perusquía-Hernández, M. Electro-Oculography and Proprioceptive Calibration Enable Horizontal and Vertical Gaze Estimation, Even with Eyes Closed. Sensors 2025, 25, 6754. https://doi.org/10.3390/s25216754

