Robust Arm and Hand Tracking by Unsupervised Context Learning
Abstract

Hand tracking in video is an increasingly popular research field due to the rise of novel human-computer interaction methods. However, robust and real-time hand tracking in unconstrained environments remains a challenging task due to the high number of degrees of freedom and the non-rigid character of the human hand. In this paper, we propose an unsupervised method to automatically learn the context in which a hand is embedded. This context includes the arm and any other object that moves coherently with the hand. We introduce two novel methods to incorporate this context information into a probabilistic tracking framework, and present a simple yet effective solution to estimate the position of the arm. Finally, we show that our method greatly increases robustness against occlusion and cluttered backgrounds, without degrading tracking performance when no contextual information is available. The proposed real-time algorithm is shown to outperform the current state-of-the-art by evaluating it on three publicly available video datasets. Furthermore, a novel dataset is created and made publicly available for the research community.
MDPI and ACS Style
Spruyt, V.; Ledda, A.; Philips, W. Robust Arm and Hand Tracking by Unsupervised Context Learning. Sensors 2014, 14, 12023-12058.
AMA Style
Spruyt V, Ledda A, Philips W. Robust Arm and Hand Tracking by Unsupervised Context Learning. Sensors. 2014; 14(7):12023-12058.
Chicago/Turabian Style
Spruyt, Vincent; Ledda, Alessandro; Philips, Wilfried. 2014. "Robust Arm and Hand Tracking by Unsupervised Context Learning." Sensors 14, no. 7: 12023-12058.