Open Access Article

A Brain-Machine Interface Based on ERD/ERS for an Upper-Limb Exoskeleton Control

1 Industrial Design Institute, Zhejiang University of Technology, Hangzhou 310023, China
2 College of Computer Science and Technology, Zhejiang University, Hangzhou 310027, China
3 Fashion Art Design Institute, Donghua University, Shanghai 200051, China
* Author to whom correspondence should be addressed.
Academic Editors: Steffen Leonhardt and Daniel Teichmann
Sensors 2016, 16(12), 2050; https://doi.org/10.3390/s16122050
Received: 29 August 2016 / Revised: 22 October 2016 / Accepted: 8 November 2016 / Published: 2 December 2016
(This article belongs to the Special Issue Wearable Biomedical Sensors)
To recognize the user’s motion intention, brain-machine interfaces (BMIs) usually decode movements from cortical activity to control exoskeletons and neuroprostheses for daily activities. The aim of this paper is to investigate whether self-induced variations of the electroencephalogram (EEG) can serve as control signals for an upper-limb exoskeleton developed by us. A BMI based on event-related desynchronization/synchronization (ERD/ERS) is proposed. In the decoder-training phase, we investigate the offline classification performance of left hand versus right hand and of left hand versus both feet, using motor execution (ME) or motor imagery (MI). The results indicate that the accuracies of the ME sessions are higher than those of the MI sessions, and that the left hand versus both feet paradigm achieves better classification performance; this paradigm was therefore used in the online-control phase. In the online-control phase, the trained decoder is tested in two scenarios (with and without wearing the exoskeleton). The MI and ME sessions with the exoskeleton achieve mean classification accuracies of 84.29% ± 2.11% and 87.37% ± 3.06%, respectively. The present study demonstrates that the proposed BMI effectively controls the upper-limb exoskeleton, and provides a practical method, based on non-invasive EEG signals associated with natural human behavior, for clinical applications.
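ERD/ERS, the phenomenon the proposed BMI decodes, is conventionally quantified as the percentage change of EEG band power during a task relative to a rest baseline: a power decrease (negative value) indicates desynchronization (ERD), a power increase (positive value) indicates synchronization (ERS). The following is a minimal illustrative sketch of that computation, not the authors' implementation; the sampling rate, the 8–12 Hz mu band, and the FFT-based power estimate are assumptions for the example.

```python
import numpy as np

def band_power(signal, fs, lo, hi):
    """Mean spectral power of `signal` within the [lo, hi] Hz band (via FFT)."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    psd = np.abs(np.fft.rfft(signal)) ** 2 / len(signal)
    mask = (freqs >= lo) & (freqs <= hi)
    return psd[mask].mean()

def erd_percent(baseline, task, fs, lo=8.0, hi=12.0):
    """ERD/ERS as a percentage change of band power relative to baseline.

    Negative values indicate desynchronization (ERD),
    positive values indicate synchronization (ERS).
    """
    r = band_power(baseline, fs, lo, hi)  # reference (rest) power
    a = band_power(task, fs, lo, hi)      # power during ME/MI task
    return (a - r) / r * 100.0

# Synthetic demo: a strong 10 Hz mu rhythm at rest that is attenuated
# during the task should yield a negative (ERD) value.
fs = 250                                   # assumed sampling rate in Hz
t = np.arange(0, 2, 1.0 / fs)
rng = np.random.default_rng(0)
baseline = np.sin(2 * np.pi * 10 * t) + 0.1 * rng.standard_normal(len(t))
task = 0.3 * np.sin(2 * np.pi * 10 * t) + 0.1 * rng.standard_normal(len(t))
print(erd_percent(baseline, task, fs))     # negative, i.e. ERD
```

In practice the band-power features of such left-hand and both-feet trials would be fed to a classifier, which is the decoder the paper trains offline and then tests online.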
Keywords: BMI; ERD; ERS; upper-limb exoskeleton; motor imagery; motor execution
MDPI and ACS Style

Tang, Z.; Sun, S.; Zhang, S.; Chen, Y.; Li, C.; Chen, S. A Brain-Machine Interface Based on ERD/ERS for an Upper-Limb Exoskeleton Control. Sensors 2016, 16, 2050. https://doi.org/10.3390/s16122050
