Article

ARMIA: A Sensorized Arm Wearable for Motor Rehabilitation

Human Robotics Group, University of Alicante, 03690 San Vicente del Raspeig, Spain
*
Author to whom correspondence should be addressed.
Biosensors 2022, 12(7), 469; https://doi.org/10.3390/bios12070469
Submission received: 19 May 2022 / Revised: 22 June 2022 / Accepted: 27 June 2022 / Published: 29 June 2022
(This article belongs to the Special Issue Biosensors in Rehabilitation and Assistance Robotics)

Abstract

In this paper, we present ARMIA: a sensorized arm wearable that combines inertial and sEMG sensors to interact with serious games in telerehabilitation setups. This device reduces the cost of robot-assisted rehabilitation technologies, making them affordable for end-users at home and at rehabilitation centers. Hardware and acquisition software specifications are described together with potential applications of ARMIA in real-life rehabilitation scenarios. A detailed comparison with similar medical technologies is provided, with a specific focus on wearable devices and virtual and augmented reality approaches. The potential advantages of the proposed device are also described, showing that ARMIA could provide similar, if not better, effectiveness than conventional physical therapy while also enabling home-based rehabilitation.

1. Introduction

The number of people with motor disabilities has increased considerably in recent years due to global aging and general improvements in clinical care and health technology. For this reason, public and private health care systems are investing more in rehabilitation technologies. Recent events, such as the COVID-19 pandemic, have stressed the importance of telerehabilitation both at home and at clinical facilities [1,2]. The use of telerehabilitation systems could avoid unnecessary physical contact between patients and therapists as well as enhance the rehabilitation process by increasing repeatability and intensity.
Upper limb motor impairment is one of the most limiting conditions for activities of daily living (ADLs), so efficient rehabilitation is critical to recovering quality of life. This impairment can be caused by a variety of neuromuscular conditions such as stroke, spinal cord injury, neurodegenerative diseases, surgical errors or aging. In the last few decades, a variety of robot-assisted technologies have been developed to improve on the efficacy of conventional manual therapies [3,4]. End-effector systems are the most typical devices for providing the patient with repetitive and continuous tasks; examples are MIT-Manus [5], GENTLE [6] or, more recently, reachMAN2 [7]. Anthropomorphic devices such as upper limb exoskeletons are also common, even though they are biomechanically more complex [8]. However, all these technologies are expensive and not affordable for individuals or for most rehabilitation centers.
Another key factor in upper limb motor rehabilitation is patient motivation. Studies over many years have concluded that motivational input increases patient involvement in the requested motor tasks. For this purpose, a very common approach is the use of serious games: virtual leisure activities with a therapeutic purpose that promote engagement. There are many applications of serious games to upper limb rehabilitation [9], and they can complement the use of assistive technologies. For instance, ArmAssist is a robotic device that improves arm mobility by using serious games [10], and VirtualRehab provides different interfaces for individuals with motor disabilities [11].
In addition to robot assistance, a good way to interact with these virtual environments is to sensorize the patient. Initially, a common and simple approach was to use accelerometers to measure arm kinematics in the sagittal plane [12], but this did not allow three-dimensional information to be inferred. With the development of modern inertial sensors (inertial measurement units, IMUs), it is now possible to characterize the whole arm kinematics with high precision. Moreover, kinematic metrics correlate well with conventional clinical scales, which provides an additional way of enhancing motor therapy [13]. Recent studies also show how ADLs can benefit from the application of this technology in rehabilitation [14,15]. Upper-limb motor rehabilitation can also benefit from the assessment of neuromuscular behavior. Surface electromyography (sEMG) measures the electrical activity of muscle contractions. This technique has been widely used to evaluate factors such as muscle fatigue [16] or movement coordination [17]. sEMG can also serve as a control input for virtual environments by processing the activity of muscle contractions [18]. sEMG provides several advantages compared to other electrophysiological measurements, including non-invasiveness, real-time monitoring and on-site application with relatively affordable equipment.
In this paper, we present ARMIA: a sensorized arm wearable that combines inertial and sEMG sensors to interact with serious games in telerehabilitation setups. This device reduces the cost of robot-assisted rehabilitation technologies, making them affordable for end-users at home and at rehabilitation centers. Hardware and acquisition software specifications are described together with potential applications of ARMIA in real-life rehabilitation scenarios. Similar wearable devices have already been applied in the telerehabilitation of specific body parts such as the fingers [19] or the elbow [20]. In contrast, ARMIA provides rehabilitation for the whole arm by tracking upper-limb kinematics through inertial sensors. It also includes the real-time assessment of neuromuscular function by extracting meaningful biomarkers of motor control such as muscular fatigue. ARMIA is meant to be combined with virtual activities designed to track the whole rehabilitation process and optimize it based on the recorded physiological data.

2. Hardware

ARMIA is a sensorized sleeve that allows kinematics and muscular activity to be measured during upper limb movements (Figure 1). In order to determine arm kinematics, three inertial sensors are placed on the kinematic chain: one on the thorax (reference), one on the arm and one on the forearm. The last two sensors provide arm link orientation and elbow and shoulder joint angles. ARMIA does not provide wrist orientation, but it is possible to obtain Cartesian coordinates of the arm end effector (the hand).
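As an illustration of how joint angles can be derived from segment orientations, the minimal sketch below computes an elbow flexion angle from the arm and forearm IMU orientations. It assumes each orientation is available as a rotation matrix in a common frame after calibration; this is not the ARMIA implementation, only an example of the underlying geometry.

```python
import numpy as np

def relative_rotation(R_arm: np.ndarray, R_forearm: np.ndarray) -> np.ndarray:
    """Orientation of the forearm expressed in the arm frame."""
    return R_arm.T @ R_forearm

def elbow_flexion_angle(R_arm: np.ndarray, R_forearm: np.ndarray) -> float:
    """Elbow flexion/extension angle in degrees (0 deg = full extension)."""
    R_rel = relative_rotation(R_arm, R_forearm)
    # Rotation angle recovered from the trace of the relative rotation matrix.
    cos_theta = np.clip((np.trace(R_rel) - 1.0) / 2.0, -1.0, 1.0)
    return float(np.degrees(np.arccos(cos_theta)))

# Example: forearm rotated 60 degrees about the (assumed) elbow axis x.
theta = np.radians(60.0)
R_arm = np.eye(3)
R_forearm = np.array([[1.0, 0.0, 0.0],
                      [0.0, np.cos(theta), -np.sin(theta)],
                      [0.0, np.sin(theta), np.cos(theta)]])
print(f"elbow flexion: {elbow_flexion_angle(R_arm, R_forearm):.1f} deg")
```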
Muscular activity is measured from three of the main muscles responsible for upper limb movements: biceps, triceps and pronator teres. These sensors are used to determine the signal amplitude (contraction force) and other factors such as muscular fatigue. The placement of these sensors is critical, as the bipolar electrodes must be placed parallel to the muscle fibers close to the middle of the muscle belly, and the reference electrode must be firmly in contact with the skin.
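As a hedged illustration of how such descriptors can be obtained, the sketch below computes an RMS amplitude envelope and the median frequency of the power spectrum, a commonly used fatigue indicator. The exact metrics and parameters used by ARMIA are not specified here; the 1 kHz sampling rate and window length are assumptions.

```python
import numpy as np

FS = 1000  # assumed sEMG sampling frequency in Hz

def rms_envelope(emg: np.ndarray, window: int = 100) -> np.ndarray:
    """Moving RMS amplitude over `window` samples (100 ms at 1 kHz)."""
    power = np.convolve(emg ** 2, np.ones(window) / window, mode="same")
    return np.sqrt(power)

def median_frequency(emg: np.ndarray, fs: int = FS) -> float:
    """Frequency that splits the power spectrum into two equal halves;
    it decreases as the muscle fatigues."""
    spectrum = np.abs(np.fft.rfft(emg)) ** 2
    freqs = np.fft.rfftfreq(len(emg), d=1.0 / fs)
    cumulative = np.cumsum(spectrum)
    return float(freqs[np.searchsorted(cumulative, cumulative[-1] / 2.0)])

# Example with a synthetic two-second burst of random activity.
rng = np.random.default_rng(0)
emg = rng.standard_normal(2 * FS) * np.hanning(2 * FS)
print(rms_envelope(emg).max(), median_frequency(emg))
```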
Considering these aspects, all hardware elements were selected according to a series of implementation requirements for the sleeve. The sleeve needs to adapt easily to variable morphologies, and the sensors, electronics and connections should be protected from external harm. All components were selected from low-cost and lightweight options to make the device affordable and ergonomic. The different parts of ARMIA are described next.

2.1. Microcontroller

In order to acquire all analog signals from the device, an ESP32 board (Espressif Systems) was chosen. The ESP32 is an SoC (System on Chip) that can perform as a complete standalone system, connects with multiple peripheral devices, including a variety of sensors, and has very low energy consumption. This element communicates wirelessly with the computer to transmit the data acquired from the inertial and EMG sensors.

2.2. Sensors

Muscular information (sEMG signals) is acquired via surface bipolar electrodes using MyoWare sensors (Sparkfun). These sensors provide a compact approach to the analysis of muscular activity and include the possibility of recording both raw signals and preprocessed signals (rectification and linear envelope). Kinematic information is acquired via Inertial Measurement Units (IMUs) (model LSM6DS33). Each sensor provides accelerometer and gyroscope data and allows measuring both orientation and relative position of arm joints in space.

2.3. Battery

The overall power consumption of each element was analyzed to determine the proper power supply for all the circuitry and sensors. The power supply needs to guarantee wireless communication for a period of several hours. A two-cell Li-Po battery was finally selected due to its small size and weight. This type of battery also provides a higher capacity than other options and allows the system to work for 3 h at full charge. Additionally, a step-down DC–DC converter was included to reduce the battery output voltage from 7.4 V to 5 V to feed the microcontroller and all the connected components.
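As a back-of-envelope check of the stated 3 h runtime, the short calculation below estimates runtime from battery energy and load power. The capacity, average current draw and converter efficiency are hypothetical figures chosen for illustration, not measured values from the prototype.

```python
# Hypothetical figures, not measured values from the paper.
battery_voltage = 7.4   # V, nominal voltage of a two-cell Li-Po
capacity_ah = 1.0       # Ah, assumed battery capacity
load_voltage = 5.0      # V, output of the step-down converter
load_current = 0.40     # A, assumed average draw of ESP32 + sensors
efficiency = 0.85       # assumed converter efficiency

energy_wh = battery_voltage * capacity_ah        # stored energy
load_power_w = load_voltage * load_current       # average load power
runtime_h = energy_wh * efficiency / load_power_w
print(f"estimated runtime: {runtime_h:.1f} h")   # about 3 h with these figures
```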

2.4. Structural Elements

Three different elements were designed to support the electrical components of the sleeve (Figure 2). CAD models were designed in Autodesk Inventor and 3D printed in PLA. The first element is the base structure for the main circuit, which includes its cover. The other two elements are protective housings for the EMG and inertial sensors. The base structure closes without screws and houses the ESP32 microcontroller, the battery and the DC–DC converter; every element is visible and easy to access. For the EMG sensors, the housing improves the contact of the electrodes with the skin and has a free space on the bottom to easily place the sensor over the skin. Finally, the IMU housing was designed to completely encapsulate the electronics and only give access to the connection pins.

2.5. Garment

In order to design an adequate garment to hold the sensors, morphological restrictions were applied in the selection of the textile and the size of the garment. The garment consists of two main elements: the sleeve that covers the arm up to the shoulder and the chest band that fixes the wearable and holds the main electronics (see Figure 1). On the shoulder, there is a connecting textile. The primary material is neoprene with additional textile elements.
For the initial prototype, we assumed an average sleeve length of 32 to 33 inches (around 81–84 cm) for men and 30 to 31 inches (around 76–79 cm) for women. The first prototype was built under this assumption. However, a wider range of garment sizes will be developed to cover different populations, including children.

3. Software

The ARMIA software is structured into three levels. The low-level layer establishes the physical communication protocols between the different sensors and the microcontroller and the wireless communication between the microcontroller and the computer. The medium-level layer manages data acquisition and storage from the sensors. Finally, the high-level layer provides the processing and visualization modules.

3.1. Communication Layer

This layer manages communication between the sensors and the ESP32 microcontroller and between the ESP32 and the computer. The IMUs use the I2C communication protocol; the original library was modified to work with multiple addresses without the need for a multiplexer. The EMG sensors already offer an analog output, so they are directly connected to the microcontroller’s analog inputs. Communication with the computer is performed wirelessly over WiFi using the UDP protocol due to its simplicity. This protocol allows the high-frequency information exchange necessary for real-time analysis of kinematic and muscular data.
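The computer-side end of this link can be as simple as a UDP socket that parses each incoming datagram. The sketch below is illustrative only: the port number and the packet layout (comma-separated floats, three EMG channels followed by IMU values) are assumptions, since the actual ARMIA packet format is not described here.

```python
import socket

UDP_IP = "0.0.0.0"   # listen on all interfaces
UDP_PORT = 5005      # assumed port number

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.bind((UDP_IP, UDP_PORT))

while True:
    data, addr = sock.recvfrom(1024)                 # one sensor frame per datagram
    values = [float(v) for v in data.decode().split(",")]
    emg = values[:3]                                 # biceps, triceps, pronator teres
    imu = values[3:]                                 # accelerometer/gyroscope samples
    print(addr, emg, imu)
```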

3.2. Acquisition Layer

Once all sensor information is captured by the ESP32 and sent to the computer, a database manages all the data through a visual interface. This interface offers three options: “Capture”, “Query” and “Exit”. “Capture” starts data acquisition through the UDP socket and stores all the data in the database. It also implements user management options, including parameters such as user name or capture time. Additionally, the received data are sent to another port to allow an external application to, for instance, visualize them. “Query” is used to search for previously stored information. It is possible to search data by user name or date. Once selected, CSV files are generated with all the requested data. Finally, “Exit” ends the execution of the acquisition interface.
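The following sketch outlines how the “Capture” and “Query” options could map onto a simple database and CSV export, using SQLite for storage. The table name, columns and file layout are illustrative assumptions rather than the actual ARMIA schema.

```python
import csv
import sqlite3
from datetime import datetime

conn = sqlite3.connect("armia.db")
conn.execute("""CREATE TABLE IF NOT EXISTS captures
                (user TEXT, timestamp TEXT, channel TEXT, value REAL)""")

def store_sample(user: str, channel: str, value: float) -> None:
    """'Capture': store one incoming sample tagged with user and time."""
    conn.execute("INSERT INTO captures VALUES (?, ?, ?, ?)",
                 (user, datetime.now().isoformat(), channel, value))
    conn.commit()

def query_to_csv(user: str, path: str) -> None:
    """'Query': export all samples of a given user to a CSV file."""
    rows = conn.execute("SELECT * FROM captures WHERE user = ?", (user,))
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["user", "timestamp", "channel", "value"])
        writer.writerows(rows)

store_sample("P1", "biceps_emg", 0.42)
query_to_csv("P1", "P1_session.csv")
```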

3.3. Processing and Visualization Layer

Measurements from the different sensors are sent to the computer and then processed and visualized in real time. In the case of the inertial sensors, the processing combines information from the gyroscope and the accelerometer included in each IMU. Accelerometer-based angle estimates are noisy and cannot separate external accelerations from gravity, while gyroscope-based estimates drift significantly over time. In order to reduce these errors, a complementary filter was designed following Equation (1):
$\theta_{\mathrm{angle}} = \alpha\,(\theta_{\mathrm{angle}} + \omega_{\mathrm{gyro}}\, dt) + (1 - \alpha)\, a_{\mathrm{acc}}$ (1)
where $\theta_{\mathrm{angle}}$ is the estimated orientation angle, $\alpha$ is the filter coefficient, $\omega_{\mathrm{gyro}}$ is the angular velocity from the gyroscope, $dt$ is the sampling period and $a_{\mathrm{acc}}$ is the angle obtained from the accelerometer measurements. The complementary filter relies on the gyroscope in the short term, which is precise, and uses the accelerometer data to avoid drift in the long term. However, the IMUs need to be calibrated before each use to avoid erroneous measurements. Figure 3 (top) shows an example of an IMU in different orientations; for each orientation, a 3D visualization was computed. EMG signals, on the other hand, only need to be scaled for visualization (Figure 3, bottom-right). As mentioned in the previous section, MyoWare sensors provide both raw and processed data. Figure 3 also shows the prototype of the ARMIA device, including the three recorded muscles (biceps, triceps and pronator teres) and the three IMUs (thorax, arm and forearm).
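A direct implementation of Equation (1) is straightforward; the sketch below applies one filter update per sample. The coefficient and sampling period are assumed values, as the ones used in ARMIA are not reported.

```python
ALPHA = 0.98   # assumed filter coefficient (weight given to the gyroscope)
DT = 0.01      # assumed sampling period in seconds (100 Hz)

def complementary_filter(theta_prev: float, gyro_rate: float,
                         accel_angle: float,
                         alpha: float = ALPHA, dt: float = DT) -> float:
    """One update step: integrate the gyroscope, correct drift with the accelerometer."""
    return alpha * (theta_prev + gyro_rate * dt) + (1.0 - alpha) * accel_angle

# Example: a constant 10 deg/s rotation with an ideal accelerometer reference.
theta = 0.0
for k in range(100):                       # one second of samples
    accel_angle = 10.0 * (k + 1) * DT      # angle reported by the accelerometer
    theta = complementary_filter(theta, 10.0, accel_angle)
print(f"estimated angle after 1 s: {theta:.2f} deg")  # close to 10 deg
```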

4. System Validation

In order to validate the first prototype of ARMIA, several experiments were performed. The main goal was to evaluate how accurate the recorded signals from the prototype sensors are and how they can be translated into meaningful inputs for the virtual activities. A description of some of the virtual activities currently proposed to test ARMIA is presented in Section 5.3. These activities have simple inputs, such as the flexo-extension angle or muscular amplitude, which are the two main parameters evaluated in the experiments.

4.1. Experiments

Five subjects participated in the study (all male; age 24.6 ± 6.4 years; height 177 ± 7 cm). None of them had neuromuscular disorders that could affect the recordings, and all of them signed the corresponding informed consent according to the Declaration of Helsinki. Data protection documents were also signed. Two experiments were performed:
  • Experiment 1 consisted of tracking a specific angle profile by performing arm flexion movements. The duration of the run was 60 s. Four different angle levels were evaluated: 0° (full extension), 30°, 60° and 90° (arm flexed). Transitions between levels were interpolated to maintain the continuity of the movement;
  • Experiment 2 consisted of performing five short contractions of the biceps muscle at full capacity during a recording of 40 s. Participants were given timing feedback and asked to perform the contractions after the first 5 s. For evaluation purposes, the first and last 5 s were removed.

4.2. Results

Figure 4 shows the results of the system validation. The inertial sensors are very reliable, and participants were able to track the visually presented angle profile with very small error (Figure 4, left). The tracking error was always below 2° and, in most cases, close to or below 1°. Participants had some difficulty slowing down and adapting to constant angles, but the readjustment was very fast. EMG recordings are, in general, less reliable due to the intrinsic difficulties of biosignal measurement: the EMG signal is noisy, and the electrodes need to be properly adjusted on the participant’s skin. However, our system provides enough quality to detect single contractions of the measured muscles, in this case, the biceps. As observed in Figure 4 (right), most participants showed a very clean muscle activation in each contraction. Only P2 and P5 had higher baseline noise, probably due to movement of the electrodes on the skin, poor fixation or the need for additional skin cleaning to increase electrode–skin conductivity.
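For reference, the two quantities evaluated above can be computed as sketched below: the mean absolute tracking error for Experiment 1 and a threshold-based count of contractions on the EMG envelope for Experiment 2. The threshold and the sample data are illustrative assumptions, not values from the study.

```python
import numpy as np

def mean_tracking_error(target_deg: np.ndarray, measured_deg: np.ndarray) -> float:
    """Mean absolute difference between target and recorded angle profiles."""
    return float(np.mean(np.abs(target_deg - measured_deg)))

def count_contractions(envelope: np.ndarray, threshold: float) -> int:
    """Count rising crossings of the EMG envelope above `threshold`."""
    above = envelope > threshold
    return int(np.sum(above[1:] & ~above[:-1]))

# Illustrative data only.
target = np.array([0.0, 30.0, 60.0, 90.0, 60.0])
measured = np.array([0.5, 29.0, 61.2, 88.7, 60.4])
print(mean_tracking_error(target, measured))        # below 1 deg

envelope = np.array([0.1, 0.2, 0.9, 0.8, 0.1, 0.7, 0.9, 0.2])
print(count_contractions(envelope, threshold=0.5))  # 2 contractions
```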

5. Discussion

In this section, we address current issues in similar rehabilitation technologies and describe the target population of our approach. We also analyze other tools for enhancing physical therapy that can be combined with our device, focusing in particular on the use of Virtual Reality (VR) and Augmented Reality (AR) to complement the rehabilitation activities with ARMIA. Finally, we describe the in-progress and future work proposed to improve the whole rehabilitation system.

5.1. Comparison with Current and Previous Technology

Previous technologies commonly rely on robotic support to enhance rehabilitation intensity and dose. Rehabilitation robots offer optimal repeatability, multiple programs and high reliability and are ideal for large hospitals, but their cost is prohibitive for small clinical centers or patient associations, not to mention individual users. These devices are also not portable, and their rehabilitation setups are fixed to a particular setting with specialized personnel, preventing their use for telerehabilitation. To overcome these limitations, we propose the use of wearable technology such as ARMIA, which has proven to be a less cumbersome and more economical way to approach physical rehabilitation [21,22].
Many wearable alternatives have been developed in recent years to provide ways of rehabilitating upper limb function from home. An interesting approach is GameRehab, a telerehabilitation system that combines serious games and wearable kinematic sensors [23]. Other systems explore the use of accelerometers to detect arm movements [24,25]. An alternative for kinematic recognition is the use of strain sensors, as in [26]. In other studies, muscular information from sEMG sensors was used to detect upper-limb fatigue levels [27,28]. Haptic feedback has also attracted interest, showing quantitative improvements in children with neuromotor disorders [29]. Overall, there is a consensus that this kind of device offers improvements in physical rehabilitation [21,22,30,31].
The proposed device, ARMIA, combines the advantages of the previously cited works, as it includes both kinematic and muscular information. The initial aim of this device is to use EMG and kinematic inputs in a simultaneous but independent way. However, future developments will explore the possibility of sensor fusion, for instance, by inferring correlations between EMG activity and kinematics. This will provide a better understanding of how motor control behaves during a rehabilitation task and allow the physical therapy to be readapted more efficiently, improving functional outcomes. ARMIA also offers compatibility with VR and AR applications, as well as virtual on-screen applications. The advantage of this combination with external activities (serious games) is further analyzed in Section 5.3. An additional benefit of the device is its ergonomics, as all the sensors are embedded in a textile that fits comfortably on the user’s arm.

5.2. Clinical Scope

ARMIA is suitable for a wide range of people with motor limitations, although it could be complemented with cognitive tasks if combined with other tools (see Section 5.3). The main groups that can benefit from this technology are:
  • People who have suffered recent brain damage due to stroke, traumatism or any other condition;
  • People with neurodegenerative or neuromuscular diseases of any kind who need regular physical therapy;
  • Elderly people with mobility problems in the upper limb and in need of maintaining physical activity;
  • People affected by post-traumatic or post-surgical complications in arms or hands that require rehabilitation.

5.3. Current and Future Developments

From the sensor data, ARMIA software extracts kinematic and muscular parameters to be used with serious games compatible with Virtual Reality (VR) and Augmented Reality (AR) technology. VR is a simulated experience that incorporates different sources of sensory feedback, including auditory, visual or, in some cases, haptic stimuli. In the case of AR, this simulation directly overlays the real world, providing computer-generated perceptual information that complements our view of the surrounding environment. Examples of applications of these technologies to a wide range of motor pathologies can be found in a variety of studies which include approaches with virtual environments [32,33,34,35,36,37,38,39], immersive environments using head-mounted devices [40,41,42,43,44] or the more recent use of AR [45,46,47,48,49,50].
Figure 5 describes how the extracted kinematic and muscular parameters could be translated into meaningful control inputs for the games. To manage movement in the virtual environment, users can provide control at different levels of complexity, from a single joint angle to the two- or three-dimensional Cartesian position of the hand. The first proposed input is the arm flexion and extension angle, obtained by comparing arm and forearm orientations. Three-dimensional Cartesian coordinates of the hand are obtained with respect to the reference sensor (thorax) by resolving the direct kinematics of the arm. The two-dimensional position is obtained by projecting the 3D hand coordinates onto the transverse plane. This is a relatively short set of kinematic inputs, but one that is very suitable for common motor rehabilitation activities in daily therapy. Muscular inputs can trigger special actions or determine a force level. In-game actions could be related to a sustained force, for instance, to achieve a particular goal, or to short muscle activations, to control additional actions while performing the correct arm movements. Moreover, parameters such as muscle fatigue could be considered to decrease difficulty at some points.
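A minimal sketch of this mapping is given below: a planar two-link reduction of the arm for the hand position and a normalized force level from the EMG envelope. The segment lengths and normalization value are assumptions for illustration; the full 3D direct kinematics used by ARMIA is not reproduced here.

```python
import numpy as np

UPPER_ARM = 0.30   # m, assumed segment length
FOREARM = 0.25     # m, assumed segment length

def hand_position_2d(shoulder_deg: float, elbow_deg: float) -> tuple[float, float]:
    """Hand coordinates on the transverse plane from two joint angles
    (direct kinematics of a planar two-link chain)."""
    s = np.radians(shoulder_deg)
    e = np.radians(elbow_deg)
    x = UPPER_ARM * np.cos(s) + FOREARM * np.cos(s + e)
    y = UPPER_ARM * np.sin(s) + FOREARM * np.sin(s + e)
    return float(x), float(y)

def force_level(emg_rms: float, max_voluntary: float) -> float:
    """Contraction force input normalized to the [0, 1] range."""
    return float(np.clip(emg_rms / max_voluntary, 0.0, 1.0))

# One packaged game input frame.
game_input = {
    "flexion_angle": 45.0,
    "hand_xy": hand_position_2d(shoulder_deg=30.0, elbow_deg=45.0),
    "force": force_level(emg_rms=0.6, max_voluntary=1.0),
}
print(game_input)
```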
By considering these possible control inputs, two different games are being developed for the ARMIA device. These games are particularly focused on planar movements and are a good starting point for the device validation:
  • Whac-a-Mole: this game focuses on the rehabilitation of reaching tasks on the transverse plane. The players grab a virtual ball and smash the moles that appear on the screen by reaching their position and hitting them. The score increases with each correct hit. Players have different possibilities for hitting depending on their level of impairment and pathology. One option is to use an external input such as a button to hit the moles, or to use arm co-contraction to hit the moles at the precise moment. If players have severe movement limitations in the hand, the hit can be performed using the so-called dwell click, which triggers the action automatically when the ball stops moving for a certain amount of time within a certain range of the target (a minimal sketch of this mechanic follows this list). The game will have different difficulty levels that change dynamically depending on the player’s performance or that can be customized by therapists. This difficulty can be based on increasing the number of moles that show up per minute, decreasing the time they are visible, increasing the score required to advance to the next level or including cognitive challenges such as hitting a mole with a particular appearance to obtain extra points.
  • Harvest Truck: this game only takes two control inputs from the wearable: the flexion and extension angle and an action input corresponding to the contraction force level measured on one of the recorded muscles. The players are on the back of a harvest truck traveling on a rural road. On the side of the road, different vegetable boxes of different weights are waiting for pickup. Players must extend the arm to grab the boxes and apply a different amount of contraction force depending on the weight or size of the boxes to be capable of moving them. In this case, in-game difficulty levels change by adding or removing total boxes, increasing box weight or changing the distance to the harvest truck. The score will be computed depending on the number and weight of the collected boxes on each level.
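The dwell-click mechanic mentioned for Whac-a-Mole can be sketched as a small timer that fires when the ball remains within a given radius of the target for a set dwell time (a simple approximation of “stops moving near the target”). The radius, dwell time and update rate below are assumed values.

```python
import math

DWELL_TIME = 1.0     # s, assumed time the ball must stay near the target
DWELL_RADIUS = 0.05  # m, assumed capture radius around the mole

class DwellClick:
    def __init__(self) -> None:
        self.time_inside = 0.0

    def update(self, ball_xy, target_xy, dt: float) -> bool:
        """Return True once the ball has dwelled near the target long enough."""
        if math.dist(ball_xy, target_xy) <= DWELL_RADIUS:
            self.time_inside += dt
        else:
            self.time_inside = 0.0            # moved away: restart the timer
        if self.time_inside >= DWELL_TIME:
            self.time_inside = 0.0            # fire once, then reset
            return True
        return False

dwell = DwellClick()
for step in range(120):                       # 1.2 s of 100 Hz updates near the mole
    if dwell.update((0.31, 0.22), (0.30, 0.20), dt=0.01):
        print(f"mole hit at t = {step / 100:.2f} s")
```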
Future developments of ARMIA will focus on evaluating how motor rehabilitation can benefit from the proposed gamified activities and how our device can provide the necessary control inputs to enable both clinical rehabilitation and home-based telerehabilitation in an entertaining yet effective manner.

Author Contributions

Conceptualization, G.J.G., C.A.J. and A.U.; design and implementation, J.M., S.S., E.V., C.A.J., J.P. and J.L.R.; software, G.J.G., A.A., L.B. and V.M.; biomechanical analysis, G.B. and A.U.; validation, J.M., S.S., E.V., A.A., L.B. and G.B.; writing—original draft preparation, G.J.G., J.M., S.S., E.V., A.A., L.B. and A.U.; writing—review and editing, G.J.G., C.A.J., V.M., J.P., J.L.R. and A.U.; supervision, G.J.G. and A.U. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by University of Alicante, project "Exoforge: Laboratorio docente interdisciplinar para desarrollo de proyectos en tecnologías asistivas", grant number 5487.

Institutional Review Board Statement

The study was conducted in accordance with the Declaration of Helsinki, and approved by the Ethics Committee of University of Alicante (protocol code UA-2021-06-21_01, approved on 29 June 2021).

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study.

Data Availability Statement

Data is available upon request. Please contact: [email protected].

Acknowledgments

The authors want to thank Luis José García Burgos and Ángela Sánchez Pérez for their participation in the project.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Johansson, T.; Wild, C. Telerehabilitation in stroke care-a systematic review. J. Telemed. Telecare 2010, 17, 1–6. [Google Scholar] [CrossRef]
  2. Velayati, F.; Ayatollahi, H.; Henmat, M. A systematic review of the effectiveness of telerehabilitation interventions for therapeutic purposes in the elderly. Methods Inf. Med. 2020, 59, 104–109. [Google Scholar] [CrossRef]
  3. Kwakkel, G.; Kollen, B.J.; Krebs, H.I. Effects of robot-assisted therapy on upper limb recovery after stroke: A systematic review. Neurorehabil. Neural Repair 2008, 22, 111–121. [Google Scholar] [CrossRef]
  4. Bertani, R.; Melegari, C.; De Cola, M.C.; Bramanti, A.; Bramanti, P.; Calabrò, R.S. Effects of robot-assisted upper limb rehabilitation in stroke patients: A systematic review with meta-analysis. Neurol Sci. 2017, 38, 1561–1569. [Google Scholar] [CrossRef]
  5. Krebs, H.I. Robot-aided neurorehabilitation. IEEE Trans. Rehabil. Eng. 1998, 6, 75–87. [Google Scholar] [CrossRef] [Green Version]
  6. Richardson, R.; Brown, M.; Bhakta, B.; Levesley, M.C. Design and control of a three degree of freedom pneumatic physiotherapy robot. Robotica 2003, 21, 589–604. [Google Scholar] [CrossRef]
  7. Zhu, T.L.; Klein, J.; Dual, S.A. ReachMAN2: A compact rehabilitation robot to train reaching and manipulation. In Proceedings of the 2014 IEEE/RSJ International Conference on Intelligent Robots and Systems, Chicago, IL, USA, 14–18 September 2014. [Google Scholar] [CrossRef]
  8. Gull, M.A.; Bai, S.; Bak, T. A Review on Design of Upper Limb Exoskeletons. Robotics 2020, 9, 16. [Google Scholar] [CrossRef] [Green Version]
  9. Levin, M.F.; Weiss, P.L.; Keshner, E.A. Emergence of virtual reality as a tool for upper limb rehabilitation: Incorporation of motor control and motor learning principles. Phys. Ther. 2015, 95, 415–425. [Google Scholar] [CrossRef] [Green Version]
  10. Tomic, T.J.D.; Savic, A.M.; Vidakovic, A.S.; Rodic, A.Z.; Isakovic, M.S.; Rodriguez-de-Pablo, C.; Keller, T.; Konstantinovic, L.M. ArmAssist robotic system versus matched conventional therapy for poststroke upper limb rehabilitation: A randomized clinical trial. Biomed. Res. Int. 2017, 7659893. [Google Scholar] [CrossRef]
  11. O’Neil, O.; Fernandez, M.M.; Herzog, J.; Beorchia, M.; Gower, V.; Gramatica, F.; Starrost, K.; Kiwull, L. Virtual reality for neurorehabilitation: Insights from 3 European clinics. PMR 2018, 10, 198–206. [Google Scholar] [CrossRef]
  12. Leuenberger, K.; Gonzanbach, R.; Wachter, S.; Luft, A.; Gassert, R. A method to qualitatively assess arm use in stroke survivors in the home environment. Med. Biol. Eng. Comput. 2016, 55, 141–150. [Google Scholar] [CrossRef] [Green Version]
  13. Colombo, R. Performance measures in robot assisted assessment of sensorimotor functions. In Rehabilitation Robotics; Colombo, R., Sanguineti, V., Eds.; Academic Press: London, UK, 2018; pp. 101–115. [Google Scholar] [CrossRef]
  14. Lee, S.I.; Adans-Dester, C.P.; Obrien, A.T.; Vergara-Diaz, G.P.; Black-Schaffer, R.; Zafonte, R.; Dy, J.G.; Bonato, P. Predicting and monitoring upper-limb rehabilitation outcomes using clinical and wearable sensor data in brain injury survivors. IEEE Trans. Biomed. Eng. 2021, 68, 1871–1881. [Google Scholar] [CrossRef]
  15. Held, J.P.O.; Klaassen, B.; Eenhoorn, A.; van Beijnum, B.-J.F.; Buurke, J.H.; Veltnik, P.H.; Luft, A.R. Inertial sensor measurements of upper-limb kinematics in stroke patients in clinic and home environment. Front. Bioeng. Biotechnol. 2018, 6, 27. [Google Scholar] [CrossRef] [Green Version]
  16. Cifrek, M.; Medved, V.; Tonkovic, S.; Ostojic, S. Surface EMG based muscle fatigue evaluation in biomechanics. Clin. Biomech. 2009, 24, 327–340. [Google Scholar] [CrossRef]
  17. Pan, B.; Sun, Y.; Huang, Z.; Wu, J.; Hou, J.; Liu, Y.; Huang, Z.; Zhang, Z. Alterations of muscle synergies during voluntary arm reaching movement in subacute stroke survivors at different levels of impairment. Front. Comput. Neurosci. 2018, 12, 69. [Google Scholar] [CrossRef]
  18. Kwon, Y.; Dwivedi, A.; Mcdaid, A.; Liarokapis, M. Electromyography-based decoding of dexterous, in-hand manipulation of objects: Comparing task execution in real world and virtual reality. IEEE Access 2021, 9, 37297–37310. [Google Scholar] [CrossRef]
  19. Lockery, D.; Peters, J.F.; Ramanna, S.; Shay, B.L.; Szturm, T. Store-and-feedforward adaptive gaming system for hand-finger motion tracking in telerehabilitation. IEEE Trans. Inf. Technol. Biomed. 2011, 15, 467–473. [Google Scholar] [CrossRef]
  20. Park, H.S.; Peng, Q.; Zhang, L.Q. A portable telerehabilitation system for remote evaluations of impaired elbows in neurological disorders. IEEE Trans. Neural Syst. Rehabil. Eng. 2008, 16, 245–254. [Google Scholar] [CrossRef]
  21. Wang, Q.; Markopoulos, P.; Yu, B.; Chen, W.; Timmermans, A. Interactive wearable systems for upper body rehabilitation: A systematic review. J. Neuroeng. Rehabil. 2017, 14, 20. [Google Scholar] [CrossRef] [Green Version]
  22. Maceira-Elvira, P.; Popa, T.; Schmid, A.C.; Hummel, F.C. Wearable technology in stroke rehabilitation: Towards improved diagnosis and treatment of upper limb impairment. J. Neuroeng. Rehabil. 2019, 16, 142. [Google Scholar] [CrossRef]
  23. Tannous, H.; Istrate, D.; Perrochon, A.; Daviet, J.C.; Benlarbi-Delai, A.; Sarrazin, J.; Tho, M.C.H.B.; Dao, T.T. GAMEREHAB@HOME: A New Engineering System Using Serious Game and Multisensor Fusion for Functional Rehabilitation at Home. IEEE Trans Games 2021, 13, 89–98. [Google Scholar] [CrossRef]
  24. Lim, C.K.; Chen, I.M.; Luo, Z.; Yeo, S.H. A low cost wearable wireless sensing system for upper limb home rehabilitation. In Proceedings of the 2010 IEEE Conference on Robotics, Automation and Mechatronics, Singapore, 28–30 June 2010. [Google Scholar] [CrossRef]
  25. Zhou, H.; Stone, T.; Hu, H.; Harris, N. Use of multiple wearable inertial sensors in upper limb motion tracking. Med. Eng. Phys. 2008, 30, 123–133. [Google Scholar] [CrossRef] [PubMed]
  26. Tognetti, A.; Lorussi, F.; Bartalesi, R.; Quaglini, S.; Tesconi, M.; Zupone, G.; De Rossi, D. Wearable kinesthetic system for capturing and classifying upper limb gesture in post-stroke rehabilitation. J. Neuroeng. Rehabil. 2005, 2, 8. [Google Scholar] [CrossRef] [PubMed]
  27. Montoya, M.F.; Munoz, J.; Henao, O.A. Fatigue-aware videogame using biocybernetic adaptation: A pilot study for upper-limb rehabilitation with sEMG. Virtual Real. 2021, 1–14. [Google Scholar] [CrossRef]
  28. Zhao, S.; Liu, J.; Gong, Z.; Lei, Y.; Yang, X.O.; Chan, C.C.; Ruan, S. Wearable Physiological Monitoring System Based on Electrocardiography and Electromyography for Upper Limb Rehabilitation Training. Sensors 2020, 10, 4861. [Google Scholar] [CrossRef]
  29. Bortone, I.; Barsotti, M.; Leonardis, D.; Crecchi, A.; Tozzini, A.; Bonfiglio, L.; Frisoli, A. Immersive Virtual Environments and Wearable Haptic Devices in rehabilitation of children with neuromotor impairments: A single-blind randomized controlled crossover pilot study. J. Neuroeng. Rehabil. 2020, 17, 144. [Google Scholar] [CrossRef]
  30. Simpson, L.A.; Menon, C.; Hodgson, A.J.; Ben Mortenson, W.; Eng, J.J. Clinicians’ perceptions of a potential wearable device for capturing upper limb activity post-stroke: A qualitative focus group study. J. Neuroeng. Rehabil. 2021, 18, 135. [Google Scholar] [CrossRef]
  31. Hayward, K.S.; Eng, J.J.; Boyd, L.A.; Lakhani, B.; Bernhardt, J.; Lang, C.E. Exploring the role of accelerometers in the measurement of real world upper-Limb use after stroke. Brain Impair. 2016, 17, 16–33. [Google Scholar] [CrossRef] [Green Version]
  32. Saposnik, G.; Teasell, R.; Mamdani, M.; Hall, J.; McIlroy, W.; Cheung, D.; Thorpe, K.E.; Cohen, L.G.; Bayley, M. Effectiveness of Virtual Reality Using Wii Gaming Technology in Stroke Rehabilitation A Pilot Randomized Clinical Trial and Proof of Principle. Stroke 2010, 41, 1477–1484. [Google Scholar] [CrossRef] [Green Version]
  33. Saposnik, G.; Cohen, L.G.; Mamdani, M.; Pooyania, S.; Ploughman, M.; Cheung, D.; Shaw, J.; Hall, J.; Nord, P.; Dukelow, S.; et al. Efficacy and safety of non-immersive virtual reality exercising in stroke rehabilitation (EVREST): A randomised, multicentre, single-blind, controlled trial. Lancet Neurol. 2016, 15, 1019–1027. [Google Scholar] [CrossRef] [Green Version]
  34. Adie, K.; Schofield, C.; Berrow, M.; Wingham, J.; Humfryes, J.; Pritchard, C.; James, M.; Allison, R. Does the use of Nintendo Wii Sports TM improve arm function? Trial of Wii TM in Stroke: A randomized controlled trial and economics analysis. Clin. Rehabil. 2017, 31, 173–185. [Google Scholar] [CrossRef] [PubMed]
  35. Jonsdottir, J.; Bertoni, R.; Lawo, M.; Montesano, A.; Bowman, T.; Gabrielli, S. Serious games for arm rehabilitation of persons with multiple sclerosis. A randomized controlled pilot study. Mult. Scler. Relat. Disord. 2017, 19, 25–29. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  36. Türkbey, T.A.; Kutlay, S.; Gök, H. Clinical feasibility of Xbox KinectTM training for stroke rehabilitation: A single-blind randomized controlled pilot study. J. Rehabil. Med. 2017, 49, 22–29. [Google Scholar] [CrossRef] [Green Version]
  37. Fernandez-Gonzalez, P.; Carratala-Tejada, M.; Monge-Pereira, E.; Collado-Vazquez, S.; Sanchez-Herrera Baeza, P.; Cuesta-Gomez, A.; Ona-Simbana, E.D.; Jardon-Huete, A.; Molina-Rueda, F.; Balaguer-Bernaldo de Quiros, C.; et al. Leap motion controlled video game-based therapy for upper limb rehabilitation in patients with Parkinson’s disease: A feasibility study. J. Neuroeng. Rehabil. 2019, 16, 133. [Google Scholar] [CrossRef]
  38. Aguilera-Rubio, A.; Alguacil-Diego, I.M.; Mallo-Lopez, A.; Cuesta-Gomez, A. Use of the Leap Motion Controller (R) System in the Rehabilitation of the Upper Limb in Stroke. A Systematic Review. J. Stroke Cerebrovasc. Dis. 2022, 31, 106174. [Google Scholar] [CrossRef]
  39. Sucar, L.E.; Orihuela-Espina, F.; Velazquez, R.L.; Reinkensmeyer, D.J.; Leder, R.; Hernández-Franco, J. Gesture therapy: An upper limb virtual reality-based motor rehabilitation platform. IEEE Trans. Neural. Syst. Rehabil. Eng. 2014, 22, 634–643. [Google Scholar] [CrossRef] [Green Version]
  40. Burke, J.W.; McNeill, M.; Charles, D.; Morrow, P.; Crosbie, J.; McDonough, S. Serious Games for Upper Limb Rehabilitation Following Stroke. In Proceedings of the 2009 Conference in Games and Virtual Worlds for Serious Applications, Coventry, UK, 23–24 March 2009. [Google Scholar] [CrossRef]
  41. Ma, M.; Bechkoum, K. Serious Games for Movement Therapy after Stroke. In Proceedings of the 2008 IEEE International Conference on Systems, Man and Cybernetics (SMC), Singapore, 12–15 October 2008. [Google Scholar] [CrossRef] [Green Version]
  42. Deutsch, J.E.; McCoy, S.W. Virtual Reality and Serious Games in Neurorehabilitation of Children and Adults: Prevention, Plasticity, and Participation. Pediatr. Phys. Ther. 2017, 29, S23–S36. [Google Scholar] [CrossRef]
  43. Lee, S.H.; Jung, H.Y.; Yun, S.J.; Oh, B.M.; Seo, H.G. Upper Extremity Rehabilitation Using Fully Immersive Virtual Reality Games with a Head Mount Display: A Feasibility Study. PMR 2020, 12, 257–262. [Google Scholar] [CrossRef]
  44. Woodward, R.B.; Hardgrove, L.J. Adapting myoelectric control in real-time using a virtual environment. J. Neuroeng. Rehabil. 2019, 16, 11. [Google Scholar] [CrossRef]
  45. Trojan, J.; Diers, M.; Fuchs, X.; Bach, F.; Bekarter-Bodmann, R.; Foell, J.; Kamping, S.; Rance, M.; Maab, H.; Flor, H. An augmented reality home-training system based on the mirror training and imagery approach. Behav. Res. Methods 2014, 46, 634–640. [Google Scholar] [CrossRef] [Green Version]
  46. Pinches, J.; Hoermann, S. Automated instructions and real time feedback for upper limb computerized mirror therapy with augmented reflection technology. J. Altern. Med. Res. 2018, 10, 37–46. [Google Scholar]
  47. Mousavi, H.; Khademi, M.; Dodakian, L.; McKenzie, A.; Lopes, C.V.; Cramer, S.C. Choice of human-computer interaction mode in stroke rehabilitation. Neurorehabil. Neural Repair 2016, 30, 258–265. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  48. Colomer, C.; Llorens, R.; Noé, E.; Alcañiz, M. Effect of a mixed reality-based intervention on arm, hand, and finger function on chronic stroke. J. Neuroeng. Rehabil. 2016, 13, 45. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  49. Condino, S.; Turini, G.; Viglialoro, R.; Gesi, M.; Ferrari, V. Wearable Augmented Reality Application for Shoulder Rehabilitation. Electronics 2019, 8, 1178. [Google Scholar] [CrossRef] [Green Version]
  50. Prahm, C.; Bressler, M.; Eckstein, K.; Kuzuoka, H.; Daigeler, A.; Kolbenschlag, J. Developing a wearable Augmented Reality for treating phantom limb pain using the Microsoft Hololens 2. Augment. Hum. 2022, 2022, 309–312. [Google Scholar] [CrossRef]
Figure 1. General appearance of the ARMIA wearable technology. A textile arm sleeve holds the inertial and sEMG sensors and is fastened to the chest for better fixation. The back of the wearable holds the battery and the microcontroller that communicates with the computer or device where serious games are running.
Figure 2. CAD models of the structural elements, including base structure and sensor holders.
Figure 3. Current ARMIA prototype showing actual location of sensors (bottom-left). EMG signals obtained from the muscles of interest (bottom-right). Inertial sensor measurements (top).
Figure 4. Results obtained for participants P1–P5. Experiment 1: Tracked angle vs. recorded angle, including mean tracking error. Experiment 2: Performance of 5 biceps contractions and proposed threshold range.
Figure 5. Control inputs for the gamification activities. Kinematic data (3D hand position and flexo-extension angle) and muscular data (fatigue and contraction) are used to command serious games and evaluate motor performance.
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
