Open Access Article

Vision-Based Pose Estimation for Robot-Mediated Hand Telerehabilitation

Department of Control and Computer Engineering, Politecnico di Torino, Corso Duca degli Abruzzi 24, Turin 10129, Italy
Institute of Electronics, Computer and Telecommunication Engineering, National Research Council, Corso Duca degli Abruzzi 24, Turin 10129, Italy
The BioRobotics Institute, Scuola Superiore Sant’Anna, viale Rinaldo Piaggio 34, Pontedera 56025, Italy
Author to whom correspondence should be addressed.
These authors contributed equally to this work.
Academic Editor: Yajing Shen
Sensors 2016, 16(2), 208;
Received: 16 December 2015 / Revised: 27 January 2016 / Accepted: 29 January 2016 / Published: 5 February 2016
(This article belongs to the Special Issue Sensors for Robots)


Vision-based Pose Estimation (VPE) represents a non-invasive solution that allows smooth and natural interaction between a human user and a robotic system, without requiring complex calibration procedures. Moreover, VPE interfaces are gaining momentum because they are highly intuitive, so much so that they can be used by untrained personnel (e.g., a generic caregiver) even in delicate tasks such as rehabilitation exercises. In this paper, we present a novel master–slave setup for hand telerehabilitation with an intuitive and simple interface for the remote control of a wearable hand exoskeleton, named HX. While rehabilitative exercises are performed, the master unit evaluates the 3D positions of a human operator's hand joints in real time using only an RGB-D camera, and remotely commands the slave exoskeleton. Within the slave unit, the exoskeleton replicates the hand movements, and an external grip sensor records the interaction forces, which are fed back to the operator-therapist, allowing a direct real-time assessment of the rehabilitative task. Experimental data collected with one operator and six volunteers are provided to demonstrate the feasibility and performance of the proposed system. The results show that, using our system, the operator was able to directly control the volunteers' hand movements.
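As a rough illustration of the control flow the abstract describes (master-side vision tracking, slave-side exoskeleton actuation, and grip-force feedback), the sketch below mocks each stage in Python. All names here (`HandTracker`, `Exoskeleton`, `grip_force`) are hypothetical placeholders invented for illustration; they do not reflect the authors' actual HX implementation or API.

```python
import math
from dataclasses import dataclass

@dataclass
class JointPose:
    """3D position of one hand joint (metres, camera frame)."""
    x: float
    y: float
    z: float

class HandTracker:
    """Stand-in for the master-side RGB-D vision-based pose estimator."""
    def read_joints(self, frame_id: int) -> list:
        # A real tracker would run pose estimation on an RGB-D frame;
        # here we synthesize a slow finger-flexion trajectory instead.
        flex = 0.01 * math.sin(frame_id / 10.0)
        return [JointPose(0.1 * i, flex, 0.5) for i in range(5)]

class Exoskeleton:
    """Stand-in for the slave-side hand exoskeleton."""
    def __init__(self) -> None:
        self.last_command = []
    def command(self, joints: list) -> None:
        # A real device would actuate its joints to match the poses.
        self.last_command = joints

def grip_force(joints: list) -> float:
    """Placeholder for the external grip sensor reading (N)."""
    return sum(abs(j.y) for j in joints) * 100.0

def telerehab_step(tracker: HandTracker, exo: Exoskeleton, frame_id: int) -> float:
    """One master -> slave -> feedback cycle."""
    joints = tracker.read_joints(frame_id)  # master: VPE from RGB-D frame
    exo.command(joints)                     # slave: replicate hand movement
    return grip_force(exo.last_command)     # force fed back to the therapist

tracker, exo = HandTracker(), Exoskeleton()
forces = [telerehab_step(tracker, exo, f) for f in range(100)]
```

The point of the sketch is the closed loop: pose estimates flow from master to slave once per camera frame, and the sensed interaction force flows back, giving the therapist real-time feedback on each cycle.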
Keywords: hand telerehabilitation; hand exoskeleton; motion tracking; upper limb rehabilitation


This is an open access article distributed under the Creative Commons Attribution License (CC BY 4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Cite This Article (MDPI and ACS Style)

Airò Farulla, G.; Pianu, D.; Cempini, M.; Cortese, M.; Russo, L.O.; Indaco, M.; Nerino, R.; Chimienti, A.; Oddo, C.M.; Vitiello, N. Vision-Based Pose Estimation for Robot-Mediated Hand Telerehabilitation. Sensors 2016, 16, 208.




Sensors EISSN 1424-8220, published by MDPI AG, Basel, Switzerland.