Article

ARETT: Augmented Reality Eye Tracking Toolkit for Head Mounted Displays

1 Department of Physics, Technische Universität Kaiserslautern, Erwin-Schrödinger-Str. 46, 67663 Kaiserslautern, Germany
2 German Research Center for Artificial Intelligence (DFKI), Interactive Machine Learning Department, Stuhlsatzenhausweg 3, Saarland Informatics Campus D3_2, 66123 Saarbrücken, Germany
3 Applied Artificial Intelligence, Oldenburg University, Marie-Curie Str. 1, 26129 Oldenburg, Germany
* Author to whom correspondence should be addressed.
Academic Editor: Jamie A Ward
Sensors 2021, 21(6), 2234; https://doi.org/10.3390/s21062234
Received: 25 February 2021 / Revised: 11 March 2021 / Accepted: 17 March 2021 / Published: 23 March 2021
(This article belongs to the Special Issue Wearable Technologies and Applications for Eye Tracking)
An increasing number of head-mounted displays (HMDs) for virtual and augmented reality (VR/AR) are equipped with integrated eye trackers. Use cases for these integrated eye trackers include rendering optimization and gaze-based user interaction. In addition, visual attention in VR and AR is of interest for applied eye tracking research, for example in the cognitive or educational sciences. While several research toolkits for VR already exist, only a few target AR scenarios. In this work, we present an open-source eye tracking toolkit for reliable gaze data acquisition in AR, based on Unity 3D and the Microsoft HoloLens 2, as well as an R package for seamless data analysis. Furthermore, we evaluate the spatial accuracy and precision of the integrated eye tracker for fixation targets at different distances and angles to the user (n = 21). On average, we found that gaze estimates are reported with an angular accuracy of 0.83 degrees and a precision of 0.27 degrees while the user is at rest, which is on par with state-of-the-art mobile eye trackers.
Keywords: augmented reality; eye tracking; toolkit; accuracy; precision
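The two metrics reported in the abstract are standard in eye tracking evaluation: angular accuracy is commonly computed as the mean angular offset between recorded gaze directions and the known target direction, and precision as the root mean square (RMS) of the angular distances between successive gaze samples. The sketch below illustrates these definitions on 3D gaze direction vectors; it is a hypothetical illustration of the metrics, not the ARETT implementation, and the function names are our own.

```python
import numpy as np

def angle_between(v1, v2):
    """Angle in degrees between two 3D direction vectors."""
    v1 = v1 / np.linalg.norm(v1)
    v2 = v2 / np.linalg.norm(v2)
    # Clip guards against floating-point drift outside [-1, 1].
    return np.degrees(np.arccos(np.clip(np.dot(v1, v2), -1.0, 1.0)))

def accuracy(gaze_dirs, target_dir):
    """Mean angular offset (deg) between gaze samples and the target direction."""
    return float(np.mean([angle_between(g, target_dir) for g in gaze_dirs]))

def precision_rms(gaze_dirs):
    """RMS of angular distances (deg) between successive gaze samples."""
    diffs = [angle_between(gaze_dirs[i], gaze_dirs[i + 1])
             for i in range(len(gaze_dirs) - 1)]
    return float(np.sqrt(np.mean(np.square(diffs))))
```

Under these definitions, a systematic calibration offset inflates the accuracy value while leaving precision untouched, whereas sensor noise inflates precision, which is why the two numbers are reported separately.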
MDPI and ACS Style

Kapp, S.; Barz, M.; Mukhametov, S.; Sonntag, D.; Kuhn, J. ARETT: Augmented Reality Eye Tracking Toolkit for Head Mounted Displays. Sensors 2021, 21, 2234. https://doi.org/10.3390/s21062234

