Open Access Article
Sensors 2016, 16(12), 2050; doi:10.3390/s16122050

A Brain-Machine Interface Based on ERD/ERS for an Upper-Limb Exoskeleton Control

1 Industrial Design Institute, Zhejiang University of Technology, Hangzhou 310023, China
2 College of Computer Science and Technology, Zhejiang University, Hangzhou 310027, China
3 Fashion Art Design Institute, Donghua University, Shanghai 200051, China
* Author to whom correspondence should be addressed.
Academic Editors: Steffen Leonhardt and Daniel Teichmann
Received: 29 August 2016 / Revised: 22 October 2016 / Accepted: 8 November 2016 / Published: 2 December 2016
(This article belongs to the Special Issue Wearable Biomedical Sensors)

Abstract

To recognize a user's motion intention, brain-machine interfaces (BMIs) usually decode movements from cortical activity to control exoskeletons and neuroprostheses for daily activities. The aim of this paper is to investigate whether self-induced variations of the electroencephalogram (EEG) can serve as control signals for an upper-limb exoskeleton developed by us. A BMI based on event-related desynchronization/synchronization (ERD/ERS) is proposed. In the decoder-training phase, we investigate the offline classification performance of left hand versus right hand and left hand versus both feet, using either motor execution (ME) or motor imagery (MI). The results indicate that the accuracies of ME sessions are higher than those of MI sessions, and that the left hand versus both feet paradigm achieves better classification performance; this paradigm is therefore used in the online-control phase. In the online-control phase, the trained decoder is tested in two scenarios (with and without wearing the exoskeleton). The MI and ME sessions with the exoskeleton achieve mean classification accuracies of 84.29% ± 2.11% and 87.37% ± 3.06%, respectively. The present study demonstrates that the proposed BMI is effective for controlling the upper-limb exoskeleton and provides a practical, non-invasive EEG-based method, associated with natural human behavior, for clinical applications.
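The abstract does not detail the decoding pipeline, but a typical ERD/ERS decoder computes band power in the mu and beta bands over motor-cortex channels and feeds it to a linear classifier. The following minimal Python sketch illustrates that general approach only; the sampling rate, band limits, epoch layout, and choice of LDA are illustrative assumptions, not the authors' implementation.

# Minimal sketch (not the authors' implementation): ERD/ERS-style band-power
# features from EEG epochs, classified with LDA. Sampling rate, band limits,
# and labels are illustrative assumptions.
import numpy as np
from scipy.signal import butter, filtfilt
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

FS = 250                                  # assumed sampling rate (Hz)
MU_BAND, BETA_BAND = (8, 12), (18, 26)    # assumed mu/beta bands (Hz)

def band_power(epochs, band, fs=FS):
    """epochs: (n_trials, n_channels, n_samples) -> (n_trials, n_channels) log band power."""
    b, a = butter(4, [band[0] / (fs / 2), band[1] / (fs / 2)], btype="band")
    filtered = filtfilt(b, a, epochs, axis=-1)
    return np.log(np.mean(filtered ** 2, axis=-1))

def extract_features(epochs):
    # Concatenate mu- and beta-band log power for every channel.
    return np.hstack([band_power(epochs, MU_BAND), band_power(epochs, BETA_BAND)])

# Decoder-training phase (offline): epochs_train has shape
# (n_trials, n_channels, n_samples); labels_train uses 0 = left hand,
# 1 = both feet, i.e. the paradigm selected in the paper.
#   clf = LinearDiscriminantAnalysis().fit(extract_features(epochs_train), labels_train)
# Online-control phase: classify each new epoch to produce an exoskeleton command.
#   command = clf.predict(extract_features(new_epoch[None]))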
Keywords: BMI; ERD; ERS; upper-limb exoskeleton; motor imagery; motor execution
This is an open access article distributed under the Creative Commons Attribution License (CC BY 4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Share & Cite This Article

MDPI and ACS Style

Tang, Z.; Sun, S.; Zhang, S.; Chen, Y.; Li, C.; Chen, S. A Brain-Machine Interface Based on ERD/ERS for an Upper-Limb Exoskeleton Control. Sensors 2016, 16, 2050.


Note that from the first issue of 2016, MDPI journals use article numbers instead of page numbers.
