Open Access Article
Sensors 2016, 16(12), 2050;

A Brain-Machine Interface Based on ERD/ERS for an Upper-Limb Exoskeleton Control

Industrial Design Institute, Zhejiang University of Technology, Hangzhou 310023, China
College of Computer Science and Technology, Zhejiang University, Hangzhou 310027, China
Fashion Art Design Institute, Donghua University, Shanghai 200051, China
Author to whom correspondence should be addressed.
Academic Editors: Steffen Leonhardt and Daniel Teichmann
Received: 29 August 2016 / Revised: 22 October 2016 / Accepted: 8 November 2016 / Published: 2 December 2016
(This article belongs to the Special Issue Wearable Biomedical Sensors)


To recognize a user’s motion intention, brain-machine interfaces (BMIs) usually decode movements from cortical activity to control exoskeletons and neuroprostheses during daily activities. The aim of this paper is to investigate whether self-induced variations of the electroencephalogram (EEG) can serve as control signals for an upper-limb exoskeleton developed by us. A BMI based on event-related desynchronization/synchronization (ERD/ERS) is proposed. In the decoder-training phase, we investigate the offline classification performance of left hand versus right hand and of left hand versus both feet, using motor execution (ME) or motor imagery (MI). The results indicate that the accuracies of the ME sessions are higher than those of the MI sessions, and that the left hand versus both feet paradigm achieves better classification performance; this paradigm is therefore used in the online-control phase. In the online-control phase, the trained decoder is tested in two scenarios (with and without wearing the exoskeleton). The MI and ME sessions with the exoskeleton worn achieve mean classification accuracies of 84.29% ± 2.11% and 87.37% ± 3.06%, respectively. The present study demonstrates that the proposed BMI is effective for controlling the upper-limb exoskeleton, and it provides a practical method, based on non-invasive EEG signals associated with natural human behavior, for clinical applications.
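ERD/ERS is conventionally quantified as the percentage change of band power (e.g., in the mu band, roughly 8–12 Hz) in a task window relative to a pre-stimulus baseline: a power decrease is ERD, an increase is ERS. The sketch below illustrates this standard computation on a single-channel epoch; it is a minimal, self-contained example using NumPy, not the authors' actual pipeline, and the window and band parameters are illustrative assumptions.

```python
import numpy as np

def erd_ers_percent(epoch, fs, baseline_window, task_window, band=(8.0, 12.0)):
    """Percentage band-power change of the task window relative to baseline.

    epoch            -- 1-D array, one EEG channel for one trial
    fs               -- sampling rate in Hz
    baseline_window  -- (start_s, end_s) of the reference interval, in seconds
    task_window      -- (start_s, end_s) of the task interval, in seconds
    band             -- frequency band of interest in Hz (default: mu band)

    Negative result = ERD (power decrease); positive = ERS (power increase).
    """
    def band_power(signal):
        # Periodogram restricted to the band of interest.
        freqs = np.fft.rfftfreq(signal.size, d=1.0 / fs)
        psd = np.abs(np.fft.rfft(signal)) ** 2 / signal.size
        mask = (freqs >= band[0]) & (freqs <= band[1])
        return psd[mask].sum()

    b0, b1 = (int(t * fs) for t in baseline_window)
    t0, t1 = (int(t * fs) for t in task_window)
    p_base = band_power(epoch[b0:b1])
    p_task = band_power(epoch[t0:t1])
    return 100.0 * (p_task - p_base) / p_base

# Illustrative usage on a synthetic epoch: a 10 Hz mu rhythm that is
# attenuated during the task window, mimicking ERD over motor cortex.
fs = 250
t = np.arange(0, 4, 1.0 / fs)
sig = np.sin(2 * np.pi * 10 * t)
sig[2 * fs:3 * fs] *= 0.3  # suppress mu power during the task
erd = erd_ers_percent(sig, fs, baseline_window=(0, 1), task_window=(2, 3))
# erd is strongly negative here, indicating desynchronization
```

In practice such per-trial ERD/ERS values (or the underlying band-power features) are what a decoder like the one described in the paper would classify, e.g., left hand versus both feet.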
Keywords: BMI; ERD; ERS; upper-limb exoskeleton; motor imagery; motor execution


This is an open access article distributed under the Creative Commons Attribution License (CC BY 4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

MDPI and ACS Style

Tang, Z.; Sun, S.; Zhang, S.; Chen, Y.; Li, C.; Chen, S. A Brain-Machine Interface Based on ERD/ERS for an Upper-Limb Exoskeleton Control. Sensors 2016, 16, 2050.


Note that from the first issue of 2016, MDPI journals use article numbers instead of page numbers.

Sensors EISSN 1424-8220, published by MDPI AG, Basel, Switzerland