# A Low-Cost, Wearable Opto-Inertial 6-DOF Hand Pose Tracking System for VR

## Abstract

## 1. Introduction

The integration of hand tracking systems such as the Leap Motion Controller™ into VR headsets enables visually realistic interaction with virtual objects. However, most of these commercial devices do not provide touch feedback (haptics). Integrating haptics into VR devices improves interactivity and immersion [4]. Fully optical devices like the Leap Motion have limited applicability for VR haptic devices: the haptic setup worn on the hand can occlude part of the skin, which degrades the performance of the tracker. This motivates the development of a low-cost hand tracking system that can be integrated with a lightweight, low-cost, wearable and wireless exoskeleton for force feedback.

## 2. Materials and Methods

#### 2.1. Optical Tracking System

- Find the linear fitting values from the graphs of Z vs. X and Z vs. Y while moving the IR tracker along the Z axis. The slope and intercept are calculated over 10 sample points. To calculate the slope and intercept of a data sequence $(x_i, y_i)$ with a least-squares solution, the LabVIEW Linear Fit Virtual Instrument (VI) uses the iterative general least-squares method to fit the points to a straight line of the form $$f = mx + b$$ so that each fitted value is $$f_i = m x_i + b$$ The least-squares method finds the slope and intercept that minimize the residue $$\frac{1}{N}\sum_{i=0}^{N-1}\left(f_i - y_i\right)^2$$
- Feed back and update the new slope values as the coefficients ${m}_{x}$ and ${m}_{y}$ of the calibration matrix.
- Return to Step 1.
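The per-axis calibration fit described above can be sketched in Python. This is a minimal illustration of the least-squares straight-line fit that the LabVIEW Linear Fit VI performs; the function name `linear_fit` and the sample data are illustrative, not part of the original implementation:

```python
import numpy as np

def linear_fit(x, y):
    """Return slope m and intercept b minimizing (1/N) * sum((m*x_i + b - y_i)^2)."""
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    # Degree-1 polynomial fit = least-squares straight line f = m*x + b.
    m, b = np.polyfit(x, y, deg=1)
    return m, b

# Example: 10 sample points on the line y = 2x + 1 recover m = 2, b = 1,
# mirroring the 10-point slope/intercept calculation in the calibration step.
x = np.linspace(0.0, 9.0, 10)
y = 2.0 * x + 1.0
m, b = linear_fit(x, y)
```

In the calibration loop, the recovered slopes would then be fed back as the coefficients $m_x$ and $m_y$ of the calibration matrix before the next iteration.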

#### 2.2. Inertial Tracking System

#### 2.3. Performance Evaluation

## 3. Results

## 4. Conclusions

## Acknowledgments

## Author Contributions

## Conflicts of Interest

## References

- Maereg, A.T.; Secco, E.L.; Agidew, T.F.; Diaz-Nieto, R.; Nagar, A. Wearable haptics for VR stiffness discrimination. In Proceedings of the European Robotics Forum, Edinburgh, UK, 22–24 March 2017.
- Maereg, A.T.; Reid, D.; Nagar, A.; Secco, E.L. Integrated wireless and wearable haptics system for virtual interaction. In Proceedings of the EuroHaptics, London, UK, 4–7 July 2016.
- Li, M.; Konstantinova, J.; Secco, E.L.; Jiang, A.; Liu, H.; Nanayakkara, T.; Seneviratne, L.D.; Dasgupta, P.; Althoefer, K.; Wurdemann, H.A. Using visual cues to enhance haptic feedback for palpation on virtual model of soft tissue. Med. Biol. Eng. Comput. **2015**, 53, 1177–1186.
- Margolis, T.; DeFanti, T.A.; Dawe, G.; Prudhomme, A.; Schulze, J.P.; Cutchin, S. Low cost heads-up virtual reality (HUVR) with optical tracking and haptic feedback. In Proceedings of the Society of Photo-Optical Instrumentation Engineers (SPIE), San Francisco, CA, USA, 23–27 January 2011.
- Foxlin, E.; Altshuler, Y.; Naimark, L.; Harrington, M. FlightTracker: A novel optical/inertial tracker for cockpit enhanced vision. In Proceedings of the 3rd IEEE/ACM International Symposium on Mixed and Augmented Reality, Washington, DC, USA, 2–5 November 2004.
- Gu, X.; Zhang, Y.; Sun, W.; Bian, Y.; Zhou, D.; Kristensson, P.O. Dexmo: An Inexpensive and Lightweight Mechanical Exoskeleton for Motion Capture and Force Feedback in VR. In Proceedings of the CHI Conference on Human Factors in Computing Systems, Santa Clara, CA, USA, 7–12 May 2016.
- LaValle, S.M. Virtual Reality; Cambridge University Press: Cambridge, UK, 2016.
- Secco, E.L.; Sottile, R.; Davalli, A.; Calori, L.; Cappello, A.; Chiari, L. VR-Wheel: A rehabilitation platform for motor recovery. In Proceedings of the Virtual Rehabilitation, Venice, Italy, 27–29 September 2007.
- Zaoui, M.; Wormell, D.; Altshuler, Y.; Foxlin, E.; McIntyre, J. A 6 DOF opto-inertial tracker for virtual reality experiments in microgravity. Acta Astronaut. **2001**, 49, 451–462.
- He, C.; Kazanzides, P.; Sen, H.T.; Kim, S.; Liu, Y. An inertial and optical sensor fusion approach for six degree-of-freedom pose estimation. Sensors **2015**, 15, 16448–16465.
- Cortes, G.; Marchand, É.; Ardouin, J.; Lécuyer, A. Increasing optical tracking workspace of VR applications using controlled cameras. In Proceedings of the IEEE Symposium on 3D User Interfaces (3DUI), Los Angeles, CA, USA, 18–19 March 2017.
- Hogue, A.; Jenkin, M.; Allison, R.S. An optical-inertial tracking system for fully-enclosed VR displays. In Proceedings of the First Canadian Conference on Computer and Robot Vision, London, ON, Canada, 17–19 May 2004.
- Patel, K.; Stuerzlinger, W. Simulation of a virtual reality tracking system. In Proceedings of the IEEE International Conference on Virtual Environments Human-Computer Interfaces and Measurement Systems (VECIMS), Ottawa, ON, Canada, 19–21 September 2011.
- Calloway, T.; Megherbi, D.B. Using 6 DOF vision-inertial tracking to evaluate and improve low cost depth sensor based SLAM. In Proceedings of the IEEE International Conference on Computational Intelligence and Virtual Environments for Measurement Systems and Applications (CIVEMSA), Budapest, Hungary, 27–28 June 2016.
- Pintaric, T.; Kaufmann, H. Affordable infrared-optical pose-tracking for virtual and augmented reality. In Proceedings of the IEEE VR Workshop on Trends and Issues in Tracking for Virtual Environments, Charlotte, NC, USA, 11 March 2007.
- Marchand, E.; Uchiyama, H.; Spindler, F. Pose estimation for augmented reality: A hands-on survey. IEEE Trans. Vis. Comput. Graph. **2016**, 22, 2633–2651.
- Satyavolu, S.; Bruder, G.; Willemsen, P.; Steinicke, F. Analysis of IR-based virtual reality tracking using multiple Kinects. In Proceedings of the IEEE Virtual Reality Short Papers and Posters (VRW), Costa Mesa, CA, USA, 4–8 March 2012.
- Madgwick, S.O.; Harrison, A.J.; Vaidyanathan, R. Estimation of IMU and MARG orientation using a gradient descent algorithm. In Proceedings of the IEEE International Conference on Rehabilitation Robotics (ICORR), Zurich, Switzerland, 29 June–1 July 2011.
- Mahony, R.; Hamel, T.; Pflimlin, J.M. Nonlinear complementary filters on the special orthogonal group. IEEE Trans. Autom. Contr. **2008**, 53, 1203–1218.
- Vasconcelos, J.F.; Elkaim, G.; Silvestre, C.; Oliveira, P.; Cardeira, B. Geometric approach to strapdown magnetometer calibration in sensor frame. IEEE Trans. Aerosp. Electron. Syst. **2011**, 47, 1293–1306.

Position RMSE (mm) and orientation RMSE (degrees), before (Original) and after (Filtered) filtering:

| Axis | Original (mm) | Filtered (mm) | Angle | Original (deg) | Filtered (deg) |
|---|---|---|---|---|---|
| X | 0.288 | 0.148 | Pitch | 0.199 | 0.113 |
| Y | 0.268 | 0.104 | Roll | 0.137 | 0.079 |
| Z | 0.653 | 0.373 | Yaw | 0.831 | 0.486 |
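As a hedged sketch of how RMSE figures like those above are obtained, the root-mean-square error compares tracker estimates against a ground-truth reference over N samples. The function and sample arrays below are illustrative, not the paper's evaluation data:

```python
import numpy as np

def rmse(estimates, ground_truth):
    """Root-mean-square error: sqrt of the mean squared per-sample error."""
    e = np.asarray(estimates, dtype=float) - np.asarray(ground_truth, dtype=float)
    return float(np.sqrt(np.mean(e ** 2)))

# Example: a constant 0.3 mm error on every sample gives an RMSE of 0.3 mm.
est = np.array([1.3, 2.3, 3.3])
ref = np.array([1.0, 2.0, 3.0])
err = rmse(est, ref)
```

The same measure applies per axis for position (in mm) and per Euler angle for orientation (in degrees).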

© 2017 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).

## Share and Cite

**MDPI and ACS Style**

Maereg, A.T.; Secco, E.L.; Agidew, T.F.; Reid, D.; Nagar, A.K.
A Low-Cost, Wearable Opto-Inertial 6-DOF Hand Pose Tracking System for VR. *Technologies* **2017**, *5*, 49.
https://doi.org/10.3390/technologies5030049
