Open Access Article
Sensors 2015, 15(7), 16448-16465; doi:10.3390/s150716448

An Inertial and Optical Sensor Fusion Approach for Six Degree-of-Freedom Pose Estimation

1 Beijing Engineering Research Center of Mixed Reality and Advanced Display, School of Optoelectronics, Beijing Institute of Technology, Beijing 100081, China
2 Department of Computer Science, Johns Hopkins University, Baltimore, MD 21218, USA
* Author to whom correspondence should be addressed.
Academic Editor: Gert F. Trommer
Received: 14 May 2015 / Revised: 21 June 2015 / Accepted: 24 June 2015 / Published: 8 July 2015
(This article belongs to the Special Issue Inertial Sensors and Systems)
Abstract

Optical tracking provides relatively high accuracy over a large workspace but requires line-of-sight between the camera and the markers, which may be difficult to maintain in actual applications. In contrast, inertial sensing does not require line-of-sight but is subject to drift, which may cause large cumulative errors, especially during the measurement of position. To handle cases where some or all of the markers are occluded, this paper proposes an inertial and optical sensor fusion approach in which the bias of the inertial sensors is estimated when the optical tracker provides full six degree-of-freedom (6-DOF) pose information. As long as the position of at least one marker can be tracked by the optical system, the 3-DOF position can be combined with the orientation estimated from the inertial measurements to recover the full 6-DOF pose information. When all the markers are occluded, the position tracking relies on the inertial sensors that are bias-corrected by the optical tracking system. Experiments are performed with an augmented reality head-mounted display (ARHMD) that integrates an optical tracking system (OTS) and inertial measurement unit (IMU). Experimental results show that under partial occlusion conditions, the root mean square errors (RMSE) of orientation and position are 0.04° and 0.134 mm, and under total occlusion conditions for 1 s, the orientation and position RMSE are 0.022° and 0.22 mm, respectively. Thus, the proposed sensor fusion approach can provide reliable 6-DOF pose under long-term partial occlusion and short-term total occlusion conditions.
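The three operating modes described in the abstract can be sketched in simplified form below. This is a minimal illustration, not the authors' implementation: the class and method names are hypothetical, the paper uses an Extended Kalman Filter for bias estimation whereas a simple complementary-style correction stands in for it here, and gravity compensation and quaternion kinematics are omitted for brevity.

```python
import numpy as np

class HybridPoseTracker:
    """Hypothetical sketch of the mode-switching fusion logic:
    full optical -> correct pose and estimate inertial bias;
    partial occlusion -> optical 3-DOF position + inertial orientation;
    total occlusion -> bias-corrected inertial dead reckoning."""

    def __init__(self):
        self.gyro_bias = np.zeros(3)    # rad/s, re-estimated during full optical tracking
        self.accel_bias = np.zeros(3)   # m/s^2
        self.orientation = np.zeros(3)  # roll/pitch/yaw, small-angle approximation
        self.position = np.zeros(3)     # m
        self.velocity = np.zeros(3)     # m/s

    def update(self, gyro, accel, dt, optical_pose=None, marker_pos=None):
        """One fusion step. optical_pose is an (orientation, position) pair when
        all markers are visible; marker_pos is a 3-DOF position when at least
        one marker is visible; both None means total occlusion."""
        # Inertial propagation with bias correction (no gravity model).
        self.orientation += (gyro - self.gyro_bias) * dt
        self.velocity += (accel - self.accel_bias) * dt
        self.position += self.velocity * dt

        if optical_pose is not None:
            # Full 6-DOF optical fix: correct the pose and nudge the gyro bias
            # toward the value that explains the inertial/optical mismatch
            # (the paper does this properly with an EKF).
            opt_ori, opt_pos = optical_pose
            self.gyro_bias += 0.1 * (self.orientation - opt_ori) / dt
            self.orientation = opt_ori.copy()
            self.position = opt_pos.copy()
            self.velocity[:] = 0.0
            return "full-optical"
        if marker_pos is not None:
            # Partial occlusion: optical 3-DOF position, inertial orientation.
            self.position = marker_pos.copy()
            self.velocity[:] = 0.0
            return "partial-occlusion"
        # Total occlusion: pure bias-corrected inertial dead reckoning.
        return "total-occlusion"
```

In use, the caller would pick the mode per frame from how many markers the optical tracker currently sees; the abstract's results suggest this keeps drift small for short total occlusions (about 1 s) because the biases were estimated while optical tracking was available.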
Keywords: optical tracking; inertial tracking; hybrid tracking; Extended Kalman Filter
This is an open access article distributed under the Creative Commons Attribution License (CC BY 4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Share & Cite This Article

MDPI and ACS Style

He, C.; Kazanzides, P.; Sen, H.T.; Kim, S.; Liu, Y. An Inertial and Optical Sensor Fusion Approach for Six Degree-of-Freedom Pose Estimation. Sensors 2015, 15, 16448-16465.

Sensors EISSN 1424-8220, Published by MDPI AG, Basel, Switzerland