Open Access Article
Sensors 2017, 17(5), 1037; doi:10.3390/s17051037

Real-Time Motion Tracking for Mobile Augmented/Virtual Reality Using Adaptive Visual-Inertial Fusion

1. School of Mechanical Engineering and Automation, Beihang University, Xueyuan Road, Haidian District, Beijing 100191, China
2. Beijing Baofengmojing Technologies Co., Ltd., Zhichun Road, Haidian District, Beijing 100191, China
* Author to whom correspondence should be addressed.
Academic Editors: Ruqiang Yan, Subhas Chandra Mukhopadhyay and Gui Yun Tian
Received: 19 February 2017 / Revised: 28 April 2017 / Accepted: 2 May 2017 / Published: 5 May 2017

Abstract

In mobile augmented/virtual reality (AR/VR), real-time 6-degree-of-freedom (6-DoF) motion tracking is essential for registering virtual scenes to the real world. However, because of the limited computational capacity of today's mobile terminals, the latency between consecutive pose updates degrades the user experience in mobile AR/VR. This paper therefore proposes a visual-inertial method for real-time motion tracking in mobile AR/VR. By exploiting the high-frequency, passive outputs of the inertial sensor, real-time delivery of poses for mobile AR/VR is achieved. In addition, to alleviate jitter during visual-inertial fusion, an adaptive filter framework is established that copes with different motion situations automatically, enabling real-time 6-DoF motion tracking that balances jitter against latency. Moreover, the robustness of traditional visual-only motion tracking is enhanced, yielding better mobile AR/VR performance when motion blur is encountered. Finally, experiments demonstrate the proposed method, and the results show that it provides smooth and robust 6-DoF motion tracking for mobile AR/VR in real time.
Keywords: real-time motion tracking; adaptive filter; visual-inertial fusion; mobile AR/VR; pose estimation
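The adaptive fusion outlined in the abstract — high-frequency inertial propagation corrected by slower, drift-free visual poses, with a gain that adapts to the current motion — can be sketched as a minimal 1-D complementary filter. This is an illustrative assumption, not the paper's actual filter: the names (`VisualInertialFilter`, `adaptive_gain`), the scalar state, and the gain schedule are all hypothetical simplifications of a full 6-DoF formulation.

```python
def adaptive_gain(motion_speed, k_min=0.02, k_max=0.5, thresh=0.1):
    """Hypothetical gain schedule: near-static motion gets a small gain
    (suppressing jitter from noisy visual poses), while fast motion gets
    a large gain (reducing latency by trusting fresh visual corrections)."""
    return k_min + (k_max - k_min) * min(motion_speed / thresh, 1.0)

class VisualInertialFilter:
    """Minimal 1-D complementary filter: IMU acceleration is integrated
    at high rate for low-latency pose prediction, and slower visual poses
    correct the accumulated drift with a motion-adaptive blend weight."""

    def __init__(self):
        self.pos = 0.0  # tracked pose (1-D stand-in for 6-DoF state)
        self.vel = 0.0  # integrated velocity from the inertial sensor

    def predict(self, accel, dt):
        # High-frequency IMU propagation (dead reckoning between frames).
        self.vel += accel * dt
        self.pos += self.vel * dt
        return self.pos

    def correct(self, visual_pos):
        # Low-frequency visual update; the gain adapts to motion speed.
        k = adaptive_gain(abs(self.vel))
        self.pos = (1.0 - k) * self.pos + k * visual_pos
        return self.pos
```

In this sketch, `predict` runs at the IMU rate so a fresh pose is always available for rendering, while `correct` runs only when a camera-based pose arrives; the adaptive gain is what lets one filter serve both near-static viewing (where jitter dominates) and rapid head motion (where latency dominates).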

This is an open access article distributed under the Creative Commons Attribution License (CC BY 4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.


Share & Cite This Article

MDPI and ACS Style

Fang, W.; Zheng, L.; Deng, H.; Zhang, H. Real-Time Motion Tracking for Mobile Augmented/Virtual Reality Using Adaptive Visual-Inertial Fusion. Sensors 2017, 17, 1037.


Note that from the first issue of 2016, MDPI journals use article numbers instead of page numbers.


Sensors EISSN 1424-8220, published by MDPI AG, Basel, Switzerland.