Hybrid Orientation Based Human Limbs Motion Tracking Method
Abstract
One of the key technologies behind human–machine interaction and human motion diagnosis is limb motion tracking. To be effective, a tracking method must estimate a precise and unambiguous position for each tracked human joint and the resulting body-part pose. In recent years, body pose estimation has become very popular and broadly available to home users thanks to easy access to cheap tracking devices. Their robustness can be improved by fusing data from different tracking modes. This paper proposes a novel orientation-based data fusion approach, in contrast to the position-based approach that dominates the literature, for two classes of tracking devices: depth sensors (e.g., Microsoft Kinect) and inertial measurement units (IMUs). A detailed analysis of their working characteristics led to a new method that fuses limb orientation data from both devices more precisely and compensates for their individual inaccuracies. The paper presents a series of experiments that verified the method's accuracy: the novel approach outperformed the precision of position-based joint tracking, the dominant method in the literature, by up to 18%.
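The abstract does not give the fusion formula, but orientation fusion between a depth-sensor skeleton and an IMU is commonly done by blending the two limb-orientation quaternions, e.g. with spherical linear interpolation (slerp). The sketch below is a minimal illustration of that general idea, not the authors' published method; `fuse_orientation` and its trust weight `w_imu` are hypothetical names chosen for this example.

```python
import numpy as np

def slerp(q0, q1, t):
    """Spherical linear interpolation between unit quaternions q0 and q1."""
    q0 = q0 / np.linalg.norm(q0)
    q1 = q1 / np.linalg.norm(q1)
    dot = np.dot(q0, q1)
    if dot < 0.0:            # take the shorter arc on the quaternion sphere
        q1, dot = -q1, -dot
    if dot > 0.9995:         # nearly parallel: fall back to normalized lerp
        q = q0 + t * (q1 - q0)
        return q / np.linalg.norm(q)
    theta = np.arccos(np.clip(dot, -1.0, 1.0))
    return (np.sin((1 - t) * theta) * q0 + np.sin(t * theta) * q1) / np.sin(theta)

def fuse_orientation(q_kinect, q_imu, w_imu=0.7):
    """Blend depth-sensor and IMU limb orientations (illustrative only).

    w_imu is a hypothetical trust weight: values near 1 favour the IMU
    (robust to visual occlusion), values near 0 favour the Kinect skeleton
    (drift-free but noisier and occlusion-prone).
    """
    return slerp(q_kinect, q_imu, w_imu)
```

In practice such a weight would be tuned per joint, or adapted online from each sensor's momentary reliability (e.g., Kinect joint-tracking confidence, IMU drift estimates).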
Cite This Article
Glonek, G.; Wojciechowski, A. Hybrid Orientation Based Human Limbs Motion Tracking Method. Sensors 2017, 17, 2857.
Note that from the first issue of 2016, MDPI journals use article numbers instead of page numbers.