Human-Computer Interaction Based on Hand Gestures Using RGB-D Sensors
Abstract: In this paper we present a new method for hand gesture recognition based on an RGB-D sensor. The proposed approach takes advantage of depth information to cope with the most common problems of traditional video-based hand segmentation methods: cluttered backgrounds and occlusions. The algorithm also uses colour and semantic information to accurately identify any number of hands present in the image. Ten different static hand gestures are recognised, covering all combinations of spread fingers. Additionally, movements of an open hand are tracked and six dynamic gestures are identified. The main advantage of our approach is that the user's hands may be at any position in the image, with no need to wear specific clothing or additional devices. Moreover, the whole method runs without any initial training or calibration. Experiments carried out with different users and in different environments demonstrate the accuracy and robustness of the method, which can also run in real time.
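To illustrate the core idea of exploiting depth to sidestep cluttered backgrounds, the sketch below segments a hand by keeping only pixels within a fixed depth band of the closest valid reading. This is a minimal assumption-laden illustration, not the paper's actual pipeline: the function name `segment_hand_by_depth`, the 100 mm band, and the "hand is the nearest object" heuristic are all illustrative choices, and the real method additionally uses colour and semantic cues to handle multiple hands.

```python
import numpy as np

def segment_hand_by_depth(depth, band=100):
    """Illustrative depth-band segmentation (not the paper's algorithm).

    Assumes the hand is the closest valid object to the sensor and keeps
    pixels within `band` millimetres of the nearest depth reading.
    """
    valid = depth > 0                       # zero encodes a missing reading
    nearest = depth[valid].min()            # closest valid point to the sensor
    return valid & (depth <= nearest + band)

# Synthetic 5x5 depth map (mm): "hand" at 600 mm, background at 2000 mm.
depth = np.full((5, 5), 2000, dtype=np.int32)
depth[1:3, 1:3] = 600                       # 4 hand pixels
depth[0, 0] = 0                             # one missing reading
mask = segment_hand_by_depth(depth)
print(int(mask.sum()))  # → 4
```

Because segmentation operates on depth rather than appearance, background clutter at a different range is rejected outright, which is why no per-user calibration or specific clothing is needed.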
Cite This Article
Palacios, J.M.; Sagüés, C.; Montijano, E.; Llorente, S. Human-Computer Interaction Based on Hand Gestures Using RGB-D Sensors. Sensors 2013, 13, 11842-11860.