Integration of Virtual Reality-Enhanced Motor Imagery and Brain-Computer Interface for a Lower-Limb Rehabilitation Exoskeleton Robot
Abstract
1. Introduction
- The proposed framework integrates VR glasses with the BCI system, enabling EEG collection inside a VR environment; the quality of each EEG channel is additionally captured and recorded through image processing. Compared with commercial VR devices, this VR-goggle integration approach has a lower development cost, since a smartphone can serve as the screen of the head-mounted display.
- The effects of different action observation (AO) contents and subject statuses on MI classification were preliminarily investigated.
- We also propose an open-loop automatic threshold-correction procedure for the MI-BCI, to facilitate applying the LLRER to closed-loop gait rehabilitation in subsequent studies.
- The MI-BCI devices, treadmill control system, and LLRER control system were integrated through a local network, and the practicality of this integration framework was verified in the closed-loop experimental session (see the sketch after this list).
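As a rough illustration of the local-network integration, the sketch below shows a hypothetical TCP exchange in which the MI-BCI side sends the %command value (1 = gait activation, 0 = idle) and polls the treadmill-state flag (1.5 = gait action in progress, 0 = idle) defined in the notation table later in this article. The host address, port, and plain-text message framing are assumptions, not the authors' actual protocol.

```python
# Hypothetical sketch of the MI-BCI -> LLRER/treadmill link over the local network.
# Only the command values come from the paper's notation table (%command: 1 = gait
# activation, 0 = idle; treadmill-state: 1.5 = gait in progress, 0 = idle); the
# address, port, and newline-delimited text framing are assumptions.
import socket
import time

LLRER_ADDR = ("192.168.0.10", 5005)  # assumed address of the LLRER control PC

def send_command(sock: socket.socket, command: float) -> float:
    """Send a %command value and return the reported treadmill-state."""
    sock.sendall(f"{command}\n".encode())
    reply = sock.recv(64).decode().strip()
    return float(reply)

with socket.create_connection(LLRER_ADDR, timeout=5.0) as sock:
    state = send_command(sock, 1.0)      # request one gait action
    while state == 1.5:                  # gait action still in progress
        time.sleep(0.1)
        state = send_command(sock, 0.0)  # idle polling
    print("LLRER idle again, ready for the next MI trigger.")
```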
2. Methods
2.1. Overall System Architecture
2.2. BCI Integrated with Virtual Reality (VR) and Channel Quality Acquisition
2.3. Virtual Reality (VR) and MI-BCI System Integration
3. Electrode Selection Experiment for High Focused Gait Mental Task
4. Closed-Loop Application of VR-Enhanced Gait MI Classification Model in LLRER
4.1. VR-Enhanced BCI, Exploring the Differences in Subjects’ Perspectives
4.2. MI Threshold Auto-Leveling Based on CDF
4.3. Open-Loop Experiment of MI-BCI System
4.4. MI-BCI-Driven LLRER Closed-Loop Experiments
5. Discussion and Conclusions
Author Contributions
Funding
Data Availability Statement
Conflicts of Interest
References
Item | Type | Specification
---|---|---
NI sbRIO-9631 | Embedded controller | Analog and digital I/O; 266 MHz CPU; 64 MB DRAM; 128 MB storage; 1M-gate FPGA
NI 9516 | Servo drive interface module | Servo, 1-axis, dual encoder
MPYE-5-M5-010-B | Proportional directional control valve | Pressure range: 0–10 bar; input voltage range: 0–10 V
MAS-20-300N-AA-MC-O-ER-BG | Pneumatic artificial muscle | Operating pressure: 0–6 bar; maximal permissible contraction: 25% of nominal length
Maxon EC60flat | Flat brushless DC motor | Nominal speed: 3730 rpm; nominal torque: 269 mNm
CSG-17-100-2UH-LW | Harmonic drive with cross-roller bearing | Limit for average torque: 51 Nm; limit for momentary torque: 143 Nm
SPAB-P10R-G18-NB-K1 | Air pressure sensor | Pressure range: 0–10 bar; electrical output: 1–5 V analog voltage
MONTROL SM23165DT-CDS7 | Treadmill servo motor | No-load speed: 5200 rpm; nominal torque: 0.52 Nm
Omron E6B2-CWZ3E | Incremental encoder | Resolution: 1000 P/R; input voltage range: 5–12 V
EAmp-0001 | Description
---|---
Analog input | 8 unipolar channels
Sampling rate | 500 Hz per channel
Sampling method | 8 channels sampled simultaneously
Frequency response | 0.5–100 Hz
A/D resolution | 24 bits
Full-scale input range | 300 mV
Bandwidth | 0.5–50 Hz
Supported electrodes | 8 channels
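Given the amplifier's 500 Hz per-channel sampling rate and 0.5–50 Hz bandwidth, a natural preprocessing step is a matching band-pass filter. The sketch below is a minimal example using SciPy; the 4th-order zero-phase Butterworth design is an assumption, not the authors' documented pipeline.

```python
# Minimal sketch: band-pass filter 8-channel EEG to the amplifier's 0.5-50 Hz band.
# The 4th-order zero-phase Butterworth filter is an assumption, not the authors'
# published preprocessing.
import numpy as np
from scipy.signal import butter, filtfilt

FS = 500.0  # per-channel sampling rate (Hz), from the EAmp-0001 specification

def bandpass(eeg: np.ndarray, low: float = 0.5, high: float = 50.0) -> np.ndarray:
    """eeg: array of shape (n_channels, n_samples); returns the filtered copy."""
    b, a = butter(4, [low / (FS / 2), high / (FS / 2)], btype="bandpass")
    return filtfilt(b, a, eeg, axis=-1)

# Usage: 8 channels x 10 s of synthetic data
eeg = np.random.randn(8, int(10 * FS))
filtered = bandpass(eeg)
```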
Symbol | Description
---|---
nVR-SIT | Data collection: nVR; subject status: seated
state | Real mental-task status label; RT: 1; MI: 2; Prep (black cross): 3; None: 4
channel quality | Average quality of the 8 channels, from 1 (poor) to 5 (great)
treadmill-state | Current state of the LLRER, returned via TCP; gait action in progress: 1.5; idle: 0
%command | Command for BCI-driven LLRER; gait activation: 1; idle: 0
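The channel-quality score above is captured through image processing of the EEG software's on-screen quality indicators, as noted in the contribution list. The sketch below is a hypothetical implementation: the screen coordinates, reference colors, and the use of Pillow/NumPy are all assumptions.

```python
# Hypothetical sketch: read the 8 on-screen channel-quality indicators and map
# their colors to the 1 (poor) .. 5 (great) scale, then log the 8-channel average.
# Screen coordinates and the color-to-score mapping are assumptions.
import numpy as np
from PIL import ImageGrab

# Assumed bounding boxes (left, top, right, bottom) of the 8 indicator patches.
CHANNEL_BOXES = [(100 + 30 * i, 50, 120 + 30 * i, 70) for i in range(8)]

# Assumed reference colors for quality scores 1..5 (red -> green).
REFERENCE = {1: (220, 40, 40), 2: (220, 140, 40), 3: (220, 220, 40),
             4: (140, 220, 40), 5: (40, 220, 40)}

def patch_score(box) -> int:
    """Classify one indicator patch by its nearest reference color."""
    rgb = np.asarray(ImageGrab.grab(bbox=box).convert("RGB")).reshape(-1, 3).mean(axis=0)
    return min(REFERENCE, key=lambda s: np.linalg.norm(rgb - np.asarray(REFERENCE[s])))

quality = [patch_score(b) for b in CHANNEL_BOXES]
print("channel quality:", quality, "average:", sum(quality) / len(quality))
```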
Participant (Subject ID) | Body Height (m) | Body Mass (kg) | Age (Years) | Leg Length (m) | Thigh Length (m) | Calf Length (m) | Foot Length (m) |
---|---|---|---|---|---|---|---|
1 | 1.77 | 78 | 24 | 0.847 | 0.437 | 0.41 | 0.264 |
2 | 1.66 | 60 | 25 | 0.81 | 0.43 | 0.38 | 0.242 |
3 | 1.64 | 60 | 24 | 0.85 | 0.45 | 0.4 | 0.23 |
4 | 1.73 | 64 | 24 | 0.821 | 0.426 | 0.395 | 0.262 |
5 | 1.75 | 72 | 23 | 0.866 | 0.456 | 0.41 | 0.259 |
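In every row, the listed leg length equals the sum of the thigh and calf lengths (e.g., participant 1: 0.437 m + 0.410 m = 0.847 m), so the leg-length column appears to be the combined thigh–calf measure rather than an independent measurement.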
Open-Loop Validation
- Channel selection: channel_selection = (["CZ", "FT7", "FT8", "C3", "C4", "FC3", "FC4", "FCZ"]);
- Subject's perspective: subject's own legs
- Subject's status: subject standing idle while wearing the LLRER
- Model: CSP + MBF + TreeBagger + PSO (SPS & nTrees)

Participant (Subject ID) | nVR-SIT CDF | nVR-SIT MI Prob | nVR-SIT %CA | nVR-STD CDF | nVR-STD MI Prob | nVR-STD %CA | wa-SIT CDF | wa-SIT MI Prob | wa-SIT %CA | wa-STD CDF | wa-STD MI Prob | wa-STD %CA
---|---|---|---|---|---|---|---|---|---|---|---|---
1 | 0.48 | 0.080 | 78.81 | 0.52 | 0.316 | 79.66 | 0.47 | 0.368 | 82.20 | 0.66 | 0.391 | 70.34
2 | 0.61 | 0.626 | 71.01 | 0.63 | 0.557 | 69.57 | 0.61 | 0.525 | 70.29 | 0.90 | 0.909 | 68.12
3 | 0.47 | 0.173 | 71.21 | 0.34 | 0.035 | 63.64 | 0.38 | 0.210 | 65.91 | 0.51 | 0.452 | 69.70
4 | 0.53 | 0.496 | 81.62 | 0.48 | 0.413 | 77.94 | 0.68 | 0.570 | 80.15 | 0.49 | 0.310 | 80.15
5 | 0.57 | 0.400 | 74.10 | 0.76 | 0.820 | 71.22 | 0.70 | 0.542 | 72.66 | 0.83 | 0.651 | 70.50
Avg. | 0.53 | 0.355 | 75.35 | 0.55 | 0.428 | 72.41 | 0.57 | 0.443 | 74.24 | 0.68 | 0.542 | 71.76
Std. | 0.05 | 0.202 | 4.21 | 0.14 | 0.260 | 5.82 | 0.12 | 0.136 | 6.10 | 0.17 | 0.215 | 4.28
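Section 4.2's threshold auto-leveling derives the MI threshold from the cumulative distribution function of the classifier's MI probability outputs; pairing each CDF level with its MI-probability value, as in the columns above, amounts to evaluating an empirical quantile. The sketch below illustrates that reading, assuming calibration-run probabilities are available as an array; it is an illustration of the CDF idea, not the authors' exact correction procedure.

```python
# Minimal sketch of CDF-based MI threshold auto-leveling: pick the MI-probability
# value whose empirical CDF equals a chosen level. That a simple empirical quantile
# is used is an assumption; the paper's exact correction procedure may differ.
import numpy as np

def mi_threshold(calib_probs: np.ndarray, cdf_level: float) -> float:
    """Return the MI probability at the given empirical-CDF level."""
    return float(np.quantile(calib_probs, cdf_level))

# Usage: with synthetic calibration probabilities, a CDF level of 0.61 picks the
# threshold below which 61% of calibration outputs fall.
rng = np.random.default_rng(0)
calib = rng.beta(2, 2, size=200)  # stand-in for classifier MI probabilities
print(mi_threshold(calib, 0.61))
```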
Closed-Loop Rehabilitation
- Channel selection: channel_selection = (["CZ", "FT7", "FT8", "C3", "C4", "FC3", "FC4", "FCZ"]);
- Subject's perspective: subject's own legs
- Subject's status: subjects wore the LLRER and were driven to walk
- Model: CSP + MBF + TreeBagger + PSO (SPS & nTrees)

nVR-SIT condition:

Participant (Subject ID) | MI_TH | RT Triggered | RT False Triggered | MI Triggered | MI False Triggered | %TA
---|---|---|---|---|---|---
1 | 0.370 | 4 | 1 | 2 | 3 | 60
2 | 0.626 | 3 | 2 | 4 | 1 | 70
3 | 0.498 | 4 | 1 | 3 | 2 | 70
4 | 0.232 | 4 | 1 | 5 | 0 | 90
5 | 0.400 | 5 | 0 | 3 | 2 | 80
Avg. | 0.425 | 4.0 | 1.0 | 3.4 | 1.6 | 74.0
Std. | 0.13 | 0.6 | 0.6 | 1.0 | 1.0 | 10.2
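The %TA figures are consistent with counting (RT Triggered + MI Triggered) correct responses out of 10 cues per run, assuming five RT and five MI cues each; for example, participant 4: (4 + 5)/10 = 90%.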