Open Access Article

Person Independent Recognition of Head Gestures from Parametrised and Raw Signals Recorded from Inertial Measurement Unit

Faculty of Electrical, Electronic, Computer and Control Engineering, Institute of Electronics, Lodz University of Technology, 211/215 Wolczanska Str., 90-924 Lodz, Poland
* Author to whom correspondence should be addressed.
Appl. Sci. 2020, 10(12), 4213; https://doi.org/10.3390/app10124213
Received: 10 May 2020 / Revised: 14 June 2020 / Accepted: 17 June 2020 / Published: 19 June 2020
(This article belongs to the Special Issue Machine Learning for Biomedical Application)
Numerous applications of human–machine interfaces, e.g., those dedicated to persons with disabilities, require contactless handling of devices or systems. The purpose of this research is to develop a hands-free, head-gesture-controlled interface that can help persons with disabilities communicate with other people and devices, e.g., allowing the paralyzed to signal messages or the visually impaired to operate travel aids. The hardware of the interface consists of a small stereovision rig with a built-in inertial measurement unit (IMU), positioned on the user’s forehead. Two approaches to recognizing head movements were considered. In the first approach, statistical parameters were calculated over various time window sizes of the signals recorded from a three-axis accelerometer and a three-axis gyroscope: the average, minimum and maximum amplitude, standard deviation, kurtosis, correlation coefficient, and signal energy. The second approach focused on direct analysis of the signal samples recorded from the IMU. In both approaches, the accuracies of 16 different classifiers in distinguishing the head movements pitch, roll, yaw, and immobility were evaluated. Recordings of head gestures were collected from 65 individuals. The best results on the testing data were obtained for the non-parametric approach, i.e., direct classification of unprocessed samples of the IMU signals, with the Support Vector Machine (SVM) classifier (95% correct recognitions). Slightly worse results in this approach were obtained for the random forest classifier (93%). The achieved high recognition rates of head gestures suggest that a person with a physical or sensory disability can efficiently communicate with other people or operate applications using simple head-gesture sequences.
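The parametric approach described in the abstract (per-window statistics of the six IMU channels fed to a classifier) can be sketched as below. This is an illustrative reconstruction, not the authors' implementation: the function name `window_features`, the window shape, and the exact feature ordering are assumptions; the feature set (average, minimum and maximum amplitude, standard deviation, kurtosis, correlation coefficients, and signal energy) follows the abstract.

```python
import numpy as np

def window_features(window):
    """Per-window statistical features for a 6-channel IMU signal.

    window: array of shape (n_samples, 6) -- three accelerometer axes
    followed by three gyroscope axes (an assumed layout).
    Returns a 1-D feature vector of length 6*6 + 15 = 51.
    """
    feats = []
    for axis in window.T:
        mu = axis.mean()
        sigma = axis.std()
        # Fisher (excess) kurtosis; ~0 for a Gaussian signal
        kurt = ((axis - mu) ** 4).mean() / sigma**4 - 3.0 if sigma > 0 else 0.0
        # mean signal energy of the channel within the window
        energy = np.sum(axis**2) / len(axis)
        feats.extend([mu, axis.min(), axis.max(), sigma, kurt, energy])
    # pairwise correlation coefficients between the six channels
    # (upper triangle only, 15 values)
    corr = np.corrcoef(window.T)
    iu = np.triu_indices(6, k=1)
    feats.extend(corr[iu])
    return np.asarray(feats)
```

Such feature vectors, computed over sliding windows of the accelerometer and gyroscope streams, would then be passed to any of the evaluated classifiers (e.g., an SVM) to label each window as pitch, roll, yaw, or immobility.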
Keywords: electronic human-machine interface; blindness; gesture recognition; inertial sensors; IMU
Figure 1

Borowska-Terka, A.; Strumillo, P. Person Independent Recognition of Head Gestures from Parametrised and Raw Signals Recorded from Inertial Measurement Unit. Appl. Sci. 2020, 10, 4213.

