Movement Intent Detection for Upper-Limb Rehabilitation Exoskeleton Based on Series Elastic Actuator as Force Sensor
Abstract
1. Introduction
2. Materials and Methods
2.1. Upper-Limb Rehabilitation Exoskeleton
2.1.1. Mechanical Design
2.1.2. Sensors and Actuators
2.1.3. Dynamical Model of the Exoskeleton
2.2. Control Strategy
2.3. Movement Intent Detection
2.4. Exoskeleton Motion Start Algorithm
2.5. Experimental Protocol
- In the first stage, each subject was asked to wear the glenohumeral joint rehabilitation exoskeleton and perform a series of passive rehabilitation routines in order to become familiar with the exercise. This stage was used to determine the movement amplitude and duration that were comfortable for each subject. The passive rehabilitation routine consisted of 5 flexion–extension movements followed by 5 shoulder abduction–adduction movements, with a 5 s rest between movements. The final movement amplitudes ranged from 0.6 to 0.9 rad for flexion–extension and from 0.6 to 0.7 rad for abduction–adduction, while the duration of each movement varied between 8.3 and 15.5 s.
- In the second stage, the exoskeleton was switched to the “movement intent detection” mode (Figure 7). In this mode, the subjects were asked to attempt the same movements they had performed in the previous stage. Although they were advised to rest 5 s between exercises, the subjects were free to choose when to start each movement, as long as the rest lasted longer than 2 s. This series of exercises was repeated twice. Each attempt began from the natural anatomical position (arms relaxed at the sides of the torso), referred to as the “initial position”; when a movement intent was detected, the exoskeleton performed one movement cycle with the parameters captured in the training stage and returned to the initial position to await the subject’s next attempt (an illustrative sketch of this trigger logic is given after this list). Throughout this phase, subjects were asked to report the robot’s correct and incorrect detections; that is, whenever the robot executed a rehabilitation routine, the subject indicated whether or not they had attempted the movement, and they also reported any movement they attempted that the robot failed to detect. To reduce incorrect detections, before the two series of exercises were carried out, the threshold offset parameter was manually adjusted for each participant through 3 offline detection tests; its adjustment range was between 0.3 and 1.
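The following Python sketch illustrates one possible reading of this detection mode; it is not the authors’ implementation. It assumes that movement intent is declared when an interaction-torque estimate derived from the series elastic actuator deflection exceeds a subject-specific baseline by more than the tuned threshold offset, and the callbacks `read_sea_torque`, `run_recorded_routine`, and `return_to_initial_position` are hypothetical placeholders.

```python
import time

THRESHOLD_OFFSET = 0.3   # tuned per subject in the experiments, within the range 0.3-1
MIN_REST_TIME = 2.0      # minimum rest (s) between attempts required by the protocol


def detection_mode(read_sea_torque, run_recorded_routine, return_to_initial_position,
                   baseline_torque):
    """Hypothetical motion-start loop for the "movement intent detection" mode.

    read_sea_torque(): interaction torque estimated from SEA deflection (assumed interface).
    run_recorded_routine(): plays back the movement cycle captured in the training stage.
    return_to_initial_position(): returns the arm to the natural anatomical position.
    """
    last_cycle_end = time.monotonic()
    while True:
        torque = read_sea_torque()
        rested_long_enough = (time.monotonic() - last_cycle_end) >= MIN_REST_TIME
        # Assumed trigger rule: the interaction torque must rise above the
        # subject-specific baseline by more than the tuned threshold offset.
        if rested_long_enough and abs(torque - baseline_torque) > THRESHOLD_OFFSET:
            run_recorded_routine()        # movement cycle with training-stage parameters
            return_to_initial_position()  # end in the initial position
            last_cycle_end = time.monotonic()  # await the subject's next attempt
```

In the protocol only the threshold offset was tuned per subject (via 3 offline detection tests), which is why it appears here as the single adjustable constant.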
3. Experimental Results
3.1. Movement Intent Detection Experimental Results
- Sensitivity: the proportion of real movement attempts (positive cases) that were correctly detected, i.e., TP/(TP + FN).
- Positive predictive value (PPV): the proportion of detections that corresponded to real movement attempts, i.e., TP/(TP + FP).
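As a minimal illustration, the sketch below applies these definitions to the totals of the two results tables in Section 3.1 (the first table is taken as flexion–extension and the second as abduction–adduction, matching the amplitude ranges reported in Section 2.5); the function names are illustrative only, not part of the authors’ implementation.

```python
def sensitivity(tp: int, fn: int) -> float:
    """Proportion of real movement attempts that were detected: TP / (TP + FN)."""
    return tp / (tp + fn)


def ppv(tp: int, fp: int) -> float:
    """Proportion of detections that were real attempts: TP / (TP + FP)."""
    return tp / (tp + fp)


# Totals from the detection results tables.
results = {
    "flexion-extension": {"tp": 40, "fp": 2, "fn": 0},
    "abduction-adduction": {"tp": 40, "fp": 0, "fn": 0},
}

for movement, c in results.items():
    print(f"{movement}: sensitivity = {sensitivity(c['tp'], c['fn']):.1%}, "
          f"PPV = {ppv(c['tp'], c['fp']):.1%}")
# flexion-extension: sensitivity = 100.0%, PPV = 95.2%
# abduction-adduction: sensitivity = 100.0%, PPV = 100.0%
```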
3.2. Control Law Performance
4. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Acknowledgments
Conflicts of Interest
References
Table: Dynamical model parameters of the exoskeleton. Units: m (link lengths), kg (masses), kg m2 (moments of inertia), kg m2/s (viscous friction coefficients), and m/s2 (gravitational acceleration g).
Table: Control parameters (2 × 2 diagonal matrices): diag(0.1, 0.1), diag(30, 100), diag(500, 500), and diag(25, 25).
Subject | Amplitude | Period | Threshold Offset | Real Attempts | True Positive (TP) | False Positive (FP) | False Negative (FN) |
---|---|---|---|---|---|---|---|
1 | 0.9 rad | 9.8 s | 0.3 | 10 | 10 | 1 | 0 |
2 | 0.8 rad | 8.3 s | 1 | 10 | 10 | 0 | 0 |
3 | 0.8 rad | 10.5 s | 0.3 | 10 | 10 | 1 | 0 |
4 | 0.6 rad | 10.5 s | 0.3 | 10 | 10 | 0 | 0 |
Total | | | | 40 | 40 | 2 | 0 |
Subject | Amplitude | Period | Threshold Offset | Real Attempts | True Positive (TP) | False Positive (FP) | False Negative (FN) |
---|---|---|---|---|---|---|---|
1 | 0.7 rad | 12.8 s | 0.3 | 10 | 10 | 0 | 0 |
2 | 0.6 rad | 10.3 s | 0.7 | 10 | 10 | 0 | 0 |
3 | 0.7 rad | 15.5 s | 0.3 | 10 | 10 | 0 | 0 |
4 | 0.6 rad | 11.5 s | 0.3 | 10 | 10 | 0 | 0 |
Total | | | | 40 | 40 | 0 | 0 |
© 2024 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).