Article

Ambiguity-Free Optical–Inertial Tracking for Augmented Reality Headsets

Information Engineering Department, University of Pisa, Via G. Caruso 16, 56122 Pisa, Italy
*
Author to whom correspondence should be addressed.
Sensors 2020, 20(5), 1444; https://doi.org/10.3390/s20051444
Received: 13 February 2020 / Revised: 4 March 2020 / Accepted: 4 March 2020 / Published: 6 March 2020
(This article belongs to the Special Issue Sensors Fusion for Human-Centric 3D Capturing)
Increasing computing power and mobile graphics capabilities have made possible the release of self-contained augmented reality (AR) headsets featuring efficient head-anchored tracking solutions. Ego-motion estimation based on well-established infrared tracking of markers ensures sufficient accuracy and robustness. Unfortunately, wearable visible-light stereo cameras with a short baseline, operating under uncontrolled lighting conditions, suffer from tracking failures and ambiguities in pose estimation. To improve the accuracy of optical self-tracking and its resiliency to marker occlusions, degraded camera calibrations, and inconsistent lighting, in this work we propose a sensor fusion approach based on Kalman filtering that integrates optical tracking data with inertial tracking data when computing motion correlation. In order to measure improvements in AR overlay accuracy, experiments are performed with a custom-made AR headset designed for supporting complex manual tasks performed under direct vision. Experimental results show that the proposed solution improves the head-mounted display (HMD) tracking accuracy by one third and improves robustness by also capturing the orientation of the target scene when some of the markers are occluded and when the optical tracking yields unstable and/or ambiguous results due to the limitations of using head-anchored stereo tracking cameras under uncontrollable lighting conditions.
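To illustrate the general idea of the fusion scheme described in the abstract, the sketch below shows a minimal scalar Kalman filter that predicts orientation from an inertial (gyroscope) rate and corrects it with an optical (marker-based) orientation measurement. This is a simplified illustration of optical–inertial fusion in one dimension, not the authors' actual filter; the function name, noise parameters, and scalar state are assumptions for demonstration purposes.

```python
def kf_orientation_step(theta, P, gyro_rate, dt, optical_theta,
                        q_gyro=1e-4, r_optical=1e-2):
    """One predict/update cycle for a scalar orientation state (radians).

    Predict with the inertial (gyro) rate; correct with the optical
    (marker-based) orientation measurement. q_gyro and r_optical are
    illustrative process and measurement noise variances.
    """
    # Predict: integrate the gyro rate; process noise grows the uncertainty.
    theta_pred = theta + gyro_rate * dt
    P_pred = P + q_gyro

    # Update: blend in the optical measurement via the Kalman gain.
    K = P_pred / (P_pred + r_optical)
    theta_new = theta_pred + K * (optical_theta - theta_pred)
    P_new = (1.0 - K) * P_pred
    return theta_new, P_new

# Example: start uncertain (P = 1.0); the gyro predicts a small rotation
# and an optical measurement refines it, sharply reducing the variance.
theta, P = kf_orientation_step(0.0, 1.0, gyro_rate=0.1, dt=0.01,
                               optical_theta=0.002)
```

When a marker is occluded, the update step is simply skipped and the state continues to evolve from the inertial prediction alone, which is the behavior the abstract describes for handling occlusions and ambiguous optical poses.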
Keywords: augmented reality; optical tracking; computer vision; perspective 3-point problem; inertial tracking; hand–eye calibration; sensor fusion; Kalman filter; head-mounted display
MDPI and ACS Style

Cutolo, F.; Mamone, V.; Carbonaro, N.; Ferrari, V.; Tognetti, A. Ambiguity-Free Optical–Inertial Tracking for Augmented Reality Headsets. Sensors 2020, 20, 1444. https://doi.org/10.3390/s20051444

AMA Style

Cutolo F, Mamone V, Carbonaro N, Ferrari V, Tognetti A. Ambiguity-Free Optical–Inertial Tracking for Augmented Reality Headsets. Sensors. 2020; 20(5):1444. https://doi.org/10.3390/s20051444

Chicago/Turabian Style

Cutolo, Fabrizio, Virginia Mamone, Nicola Carbonaro, Vincenzo Ferrari, and Alessandro Tognetti. 2020. "Ambiguity-Free Optical–Inertial Tracking for Augmented Reality Headsets" Sensors 20, no. 5: 1444. https://doi.org/10.3390/s20051444

