Article

Low-Cost Robotic Guide Based on a Motor Imagery Brain–Computer Interface for Arm Assisted Rehabilitation

Eduardo Quiles, Ferran Suay, Gemma Candela, Nayibe Chio, Manuel Jiménez and Leandro Álvarez-Kurogi
1 Instituto de Automática e Informática Industrial, Universitat Politècnica de València, 46022 València, Spain
2 Departament de Psicobiologia, Facultat de Psicologia, Universitat de València, 46010 València, Spain
3 Facultad de Ingeniería, Ingeniería Mecatrónica, Universidad Autónoma de Bucaramanga, Bucaramanga 680003, Colombia
4 Facultad de Educación, Universidad Internacional de La Rioja, 26006 Logroño, Spain
* Author to whom correspondence should be addressed.
Int. J. Environ. Res. Public Health 2020, 17(3), 699; https://doi.org/10.3390/ijerph17030699
Submission received: 23 December 2019 / Revised: 10 January 2020 / Accepted: 17 January 2020 / Published: 21 January 2020
(This article belongs to the Special Issue Application of Robotic Devices for Neurologic Rehabilitation)

Abstract

Motor imagery has been suggested as an efficient alternative to improve the rehabilitation process of affected limbs. In this study, a low-cost robotic guide is implemented so that its linear position can be controlled via the user’s motor imagination of movement intention. The patient can use this device to move the arm attached to the guide according to their own intentions. The first objective of this study was to check the feasibility and safety of the designed robotic guide controlled via a motor imagery (MI)-based brain–computer interface (MI-BCI) in healthy individuals, with the ultimate aim of applying it to rehabilitation patients. The second objective was to determine the most convenient MI strategies for controlling the different assisted rehabilitation arm movements. The results of this study show better performance when the BCI task is controlled with an action–action MI strategy than with an action–relaxation one. No statistically significant difference was found between the two action–action MI strategies.

1. Introduction

Motor deficiencies are a great handicap for many disabled people. Such disabilities are caused by various medical problems (stroke, trauma, neurodegenerative diseases, etc.) and affect millions of people worldwide. After a motor disability has been acquired, rehabilitation of motor function becomes a necessity. Rehabilitation of hands and arms is fundamental to improve the independence of affected patients when performing everyday tasks [1,2]. Many of the rehabilitation strategies applied, such as active motor training (AMT) [3], force the use of paralyzed limbs. AMT can produce an improvement in motor performance and cortical reorganization. However, the usefulness of AMT is limited since it depends on a patient’s residual motor activity, which excludes many patients from its potential benefits.
Motor imagery (MI) may be an alternative rehabilitation strategy. MI relies on the imagination of body movements and does not depend on paretic or residual motor abilities [4,5,6]. Various studies point to the positive effect that motor imagination training can produce on motor recovery [7,8].
Brain–computer interfaces (BCIs) are communication/control systems that can be employed to transform the user’s intention into different actions by means of brain activity [9,10,11,12]. BCIs include sensors that record brain activity and software that processes this information in order to interact with the environment by means of actuators. Patients with limited communication and movement capabilities can benefit from this technology, which includes communication protocols such as spellers [13], control of robot arms [14] and neuro-prostheses [15], control of motorized wheelchairs [16], and home automation systems [17]. Although BCI technology has attracted extensive research interest for several decades, it has not yet become a common technology in medical protocols [18]. Barriers that must be removed before BCI technology is ready for commercial purposes include the individual customization of BCI applications (i.e., the need for individual and recurrent calibration, the standardization of protocols and procedures, as well as patient convenience and comfort in using BCI devices such as electroencephalogram (EEG) electrodes and caps) [19]. Current research aims to solve the problems that prevent the generalization of medical and recreational BCI use [11,19,20]. The success of BCI technology will depend on improving its reliability and accuracy, i.e., on increasing the proportion of times in which the system actually performs the intended action.
BCI technology can also be used to assist in rehabilitation, since BCIs aim to translate the patient’s MI into external commands to control a helping device. The thought or realization of a motor action is generated in the sensory–motor areas of the cerebral cortex, which causes a variation in the EEG signal [21]. Specifically, imagining motor actions usually modifies the amplitude of the mu/beta rhythms in the sensory–motor cortex [22]. These variations in the spectral content of EEG signals can be used to control a BCI system [23,24].
Non-invasive EEG-based BCI systems have been used to control robotic systems that can improve the daily life of affected individuals, as well as for the rehabilitation and recovery of motor skills [25]. A quadriplegic individual who only had some left biceps mobility was connected to a robotic hand prosthesis by means of MI [26]. After five months of training, the subject acquired some control over the prosthesis. Interestingly, shorter training periods have also been reported to be successful [27]. Moreover, several BCI paradigms have been combined simultaneously, for example to control a robot arm that simulates a human upper limb joint with two degrees of freedom [28]. This application used MI to control the gripping function and steady-state visual evoked potentials (SSVEP) for the joint control.
In [29,30], extensive reviews of the state of the art in exoskeletons developed for upper limb rehabilitation are presented. Some exoskeletons for upper limb rehabilitation are already commercially available, e.g., Aupa, JACE S600, JACE S603 and Armeo® Spring. The main disadvantages of these devices are that the anthropometric measures on which their designs are based may not correspond to the population in which they are to be applied, and that their high acquisition and maintenance costs make them inaccessible to most institutions. These limitations motivated the research carried out in this study, aimed at producing a customized, low-cost rehabilitation device.
Motor imagery based BCI (MI-BCI) has been used to carry out robotics-assisted rehabilitation in several studies [31,32]. In one of them [32], a group of patients who underwent standard robotic rehabilitation was compared with another group using MI-BCI, the latter of which achieved superior performance. A combination of MI-BCI and conventional physical therapy [26] to rehabilitate patients who had suffered a stroke has also shown that patients treated with MI-BCI improve more than those treated with random BCI.
MI-BCI has proven its potential to help in the rehabilitation process, but its learning curve is steep and weeks of training are required to gain a satisfactory level of control [33]. Many people only achieve a deficient level of BCI control, even when extended training is provided, a phenomenon that has been labelled ‘BCI illiteracy’ [34].
Several MI strategies have been studied, such as hand and foot movements, mental mathematical operations, and mental rotation of objects [35,36,37,38]. MI strategies based on mental imagination of limb movements seem to be more appropriate for limb rehabilitation [39]. Pfurtscheller et al. [26] used brain oscillations to control an electrically driven hand orthosis for restoring hand grasp function. The subjects imagined left versus right hand movements, left and right hands versus no specific imagination, and both feet versus right hand, and achieved an average classification accuracy of approximately 65%, 75% and 95%, respectively. Which particular motor imaginations allow for a better control, though, remains an open issue. In this paper, different motor imagination strategies are compared.
This work implements a low-cost robotic rehabilitation assistance system. The subjects’ intention to move their arms is interpreted from their motor imagery by processing the EEG signal recorded over the motor cortex. The rehearsed movements of the users are decoded from their MI and then translated into the real movement of the rehabilitation device. BCI performance in the control of the rehabilitation device with different motor imagery tasks is assessed. For this purpose, two experiments were planned. In the first experiment, action instructions (imagined movement of hands or feet) are compared with non-action instructions (imagined motionless hands or feet), while in the second experiment, two different action strategies (right/left hand movements versus hand/feet movements) are compared.
In the second section of this paper, the design and construction of the rehabilitation guide is presented. The guide control system based on a motor imagery BCI is explained. The experiment procedure is described as well as the different hypotheses tested. In Section 3 and Section 4, results of the experiments are shown and discussed. Finally, some conclusions are drawn.

2. Materials and Methods

2.1. Rehabilitation Guide

The first objective of this work was to design a low-cost robotic device useful for the rehabilitation of arm movements in affected patients. The system was designed to enable various types of rehabilitation depending on the placement of the guide. Thus, it should be possible to rehabilitate vertical, horizontal, and longitudinal movements as shown in Figure 1. The position, speed and direction of a carriage attached to the guide are controlled. The patient places his or her hand on the mobile carriage to rehabilitate the arm’s motor function.
The structure of the guide consists of an aluminum profile with a square cross-section of 4.5 cm and a length of 110 cm. A carriage produced by three-dimensional (3D) printing is mounted on this profile and can slide longitudinally along it. Wrist attachments that adapt to the subject’s arm are fixed on this carriage in order to perform the rehabilitation.
The transmission system of the mobile carriage consists of a drive gear (a double helical gear with 21 teeth) that rotates with the motor shaft. A double helical gear with 10 teeth rotates around the driving pulley and drags the carriage on a belt. The drive pulley has a radius of 25.21 mm and 0.254 mm teeth, and the pulley at the right end of the guide is identical to the drive pulley.
The relationship of movement between the carriage and the motor revolutions is obtained from Equation (1).
$\frac{21}{10}\cdot 2\pi \cdot 25.464 = 336.46~\mathrm{mm/rev}$   (1)
The Direct Current (DC) motor that drives the gear system has a nominal supply voltage of 12 V and, in nominal operation, can vertically raise 3 kg (as measured on the guide carriage). This motor has a coupled reducer and an incremental encoder. The encoder’s output determines the carriage’s direction of movement and the number of movement steps. A limit switch was placed at the extreme left end of the guide to enable the system to home at the beginning of the program execution and thus obtain a carriage position reference.
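To illustrate how the Equation (1) ratio and the encoder feedback combine in practice, the following minimal sketch (in Python) converts encoder counts into a carriage position relative to the homing switch; the encoder resolution used here is a hypothetical value, not a specification of the actual motor.

```python
# Sketch (assumptions): converts incremental-encoder counts into carriage
# position using the 336.46 mm/rev relation of Equation (1). The encoder
# resolution (COUNTS_PER_REV) is a hypothetical value, not taken from the paper.

MM_PER_MOTOR_REV = 336.46   # carriage travel per motor revolution, Equation (1)
COUNTS_PER_REV = 1024       # hypothetical encoder counts per motor revolution

def carriage_position_mm(encoder_counts: int, home_offset_mm: float = 0.0) -> float:
    """Carriage position relative to the left limit switch (homing reference)."""
    revolutions = encoder_counts / COUNTS_PER_REV
    return home_offset_mm + revolutions * MM_PER_MOTOR_REV

# Example: 512 counts after homing -> half a motor revolution -> ~168.2 mm
print(carriage_position_mm(512))
```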
To enable the carriage to move in both directions, the motor is driven by an H-bridge controlled by digital signals from the PC. Specifically, the BD6231F-E2 integrated circuit is an H-bridge that also enables control of the output voltage through pulse width modulation (PWM), so that the same component can control both movement speed and direction. A LabVIEW program commands the driver, and thus moves the carriage, via an NI-6210 acquisition card. The carriage travel is limited at the beginning of every trial according to the desired span for each user.
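As an illustration of the driver logic described above, the sketch below expresses a direction/speed command for an H-bridge such as the BD6231F-E2; the write_line and write_pwm_duty helpers and the IN1/IN2 naming are hypothetical stand-ins for the LabVIEW/NI-6210 outputs actually used.

```python
# Sketch (assumptions): direction/speed command for an H-bridge driver, expressed
# as two digital lines (IN1/IN2) and a PWM duty cycle. write_line() and
# write_pwm_duty() are hypothetical placeholders for the real hardware outputs.

def write_line(name: str, value: bool) -> None:
    print(f"{name} <- {value}")          # placeholder for a digital output write

def write_pwm_duty(duty: float) -> None:
    print(f"PWM duty <- {duty:.2f}")     # placeholder for a PWM output update

def drive_carriage(direction: int, speed: float) -> None:
    """direction: +1 (right), -1 (left), 0 (stop); speed: 0.0-1.0 duty cycle."""
    if direction == 0 or speed <= 0.0:
        write_line("IN1", False)
        write_line("IN2", False)         # both lines low -> motor stopped
        write_pwm_duty(0.0)
        return
    write_line("IN1", direction > 0)     # one line high selects rotation sense
    write_line("IN2", direction < 0)
    write_pwm_duty(min(speed, 1.0))      # duty cycle sets carriage speed

drive_carriage(+1, 0.4)   # move right at 40% duty
```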
The rehabilitation guide is powered from batteries to ensure a stable voltage, to facilitate portability, and to reduce the electrical risk to the patient. One of the design requirements of this work was the construction of a low-cost rehabilitation device; for that reason, many of the components of the guide have been 3D-printed (Figure 2).
To begin the rehabilitation, the user places his or her hand on a wristband made of a thermoplastic insole molded specifically for the patient. The user then tries to move the carriage to the right or left, backwards or forwards, or up or down, depending on the rehabilitation exercise and, hence, on the position of the guide. These configuration options allow the same guide to perform different rehabilitation exercises, so that it can be adapted to the requirements of each patient.
Once the user begins to attempt the movement (even though mobility in this joint may be very limited or fully absent), processing of the EEG signals makes it possible to detect the direction of the user’s intended movement; this information is used to start the guide motor and move the carriage accordingly. The user therefore receives movement feedback that reflects his or her intention. The guide’s motor and transmission system prevent any actual movement of the carriage if no user intention is detected.

2.2. EEG-Based Control

EEG signals from the motor areas of the patient are recorded to control the guide movements. These signals are processed and a control action (movement in one direction, in the other, or stop) is sent to the guide. The speed of movement is regulated through PWM. This speed control is external to the patient and performed by the person responsible for the rehabilitation.
Motor imagery is used to control the rehabilitation guide movements. The subject being monitored is asked to think of a type of movement to detect variations in the EEG related to this task with the electrodes placed in the sensorimotor areas of the brain. The concept of building a BCI with motor imagery lies in creating a computer algorithm that detects the changes in brain waves associated with the patient’s movement intention and translates these changes into computer signals [40,41]. The guide then moves according to the user’s intention, providing visual and somatosensory feedback (Figure 3).
Several factors were taken into account when selecting hardware and software components for the EEG-based BCI. Given the interest in a portable and compact solution for BCI control, the Enobio digital amplifier [42] from Neuroelectrics was selected to acquire the EEG signals. The Enobio amplifier was developed for BCI research and was chosen because of its wireless technology and dry electrodes, which facilitate the experimental setup. The EEG signal was acquired through channels F3, F4, C3, Cz, C4, T7, T8 and Pz around the sensorimotor areas according to the standard 10–20 electrode location system (Figure 4). Ground and reference electrodes were placed on the subject’s earlobe. The EEG signal was recorded using a sampling rate of 500 Hz and band-pass filtered between 2 and 100 Hz with a notch filter activated at 50 Hz. The sampled and amplified EEG signal is then sent to the computer via Bluetooth.
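For reference, an offline equivalent of these acquisition settings (500 Hz sampling, 2–100 Hz band-pass, 50 Hz notch) can be sketched with SciPy as follows; the amplifier performs this conditioning on-line, and the filter order and Q factor shown here are assumptions.

```python
# Sketch (assumptions): offline reproduction of the acquisition settings described
# above with SciPy. Filter order (4) and notch quality factor (30) are assumed,
# not taken from the Enobio configuration.
import numpy as np
from scipy import signal

FS = 500.0                                   # sampling rate (Hz)
CHANNELS = ["F3", "F4", "C3", "Cz", "C4", "T7", "T8", "Pz"]

def preprocess(eeg: np.ndarray) -> np.ndarray:
    """eeg: array of shape (n_channels, n_samples) in the order of CHANNELS."""
    b_bp, a_bp = signal.butter(4, [2.0, 100.0], btype="bandpass", fs=FS)
    b_n, a_n = signal.iirnotch(50.0, Q=30.0, fs=FS)      # mains notch at 50 Hz
    filtered = signal.filtfilt(b_bp, a_bp, eeg, axis=1)  # zero-phase band-pass
    return signal.filtfilt(b_n, a_n, filtered, axis=1)

# Example with 2 s of synthetic data
raw = np.random.randn(len(CHANNELS), int(2 * FS))
clean = preprocess(raw)
```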
To implement the rehabilitation guide control, the open-source BCI software BCI2000 (Wadsworth Center, New York State Department of Health, Albany, NY, USA) was used [43]. The carriage movement followed the cursor movement on the computer screen in the BCI2000 cursor task [44]. The cursor task is based on the modulation of mu and beta rhythms and allows the participants to control the position of a cursor on the computer screen. The participants’ intentions affect the cursor position by means of imagining motor actions. In this study, the participants had to follow the instructions received to direct the cursor towards a bar that could appear in different parts of the screen. Directing the cursor so that it reached the bar within a pre-specified time interval was considered a successful attempt. The participants had to control the direction in which the cursor was moving in order to reach the bar. EEG signal processing, feature extraction and classification followed the procedure described below [22,45,46,47].
The EEG channels are spatially filtered to improve the signal-to-noise ratio. A large Laplacian filter is chosen as shown in Equation (2) [48]:
$C_{out} = \sum_{k=1}^{n} C_k W_k$   (2)
where Cout is the filtered output electrode, Ck is the signal of electrode k, n is the total number of electrodes, and Wk is the weight of electrode k. In this study, the filtered channels C3, Cz and C4 are chosen as output channels according to the weights shown in Table 1. These channels are located over the sensorimotor cortex areas corresponding to the right hand, feet, and left hand, respectively.
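A minimal sketch of this large Laplacian filter with the Table 1 weights, assuming an EEG array whose rows follow the Table 1 electrode order, is given below.

```python
# Sketch (assumptions): large Laplacian spatial filter with the Table 1 weights,
# applied to an (n_channels, n_samples) EEG array in the Table 1 electrode order.
import numpy as np

CHANNELS = ["F3", "F4", "T7", "C3", "Cz", "C4", "T8", "Pz"]

# Rows: output channels C3, Cz, C4; columns follow CHANNELS (Table 1).
LAPLACIAN_W = np.array([
    [-0.25,  0.00, -0.25,  1.00, -0.25,  0.00,  0.00, -0.25],  # C3
    [-0.20, -0.20,  0.00, -0.20,  1.00, -0.20,  0.00, -0.20],  # Cz
    [ 0.00, -0.25,  0.00,  0.00, -0.25,  1.00, -0.25, -0.25],  # C4
])

def laplacian_filter(eeg: np.ndarray) -> np.ndarray:
    """Returns the three spatially filtered output channels (C3, Cz, C4)."""
    return LAPLACIAN_W @ eeg              # weighted sum over input electrodes

filtered = laplacian_filter(np.random.randn(len(CHANNELS), 1000))
print(filtered.shape)                     # (3, 1000)
```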
BCI2000 uses an autoregressive (AR) model to estimate the sensorimotor rhythm amplitudes used for control, according to Equation (3) [49]:
$Y_t = \sum_{i=1}^{p} a_i Y_{t-i} + e_t$   (3)
where Yt is the predicted signal at time t, a is the vector of p estimated filter coefficients of an all-pole model of order p, and et is the prediction error. The power spectrum of this model is found as shown in Equation (4).
$\hat{P}(e^{j\omega}) = \dfrac{1}{\left| 1 - \sum_{k=1}^{p} a_p(k)\, e^{-jk\omega} \right|^{2}}$   (4)
In order to evaluate the model coefficients, BCI2000 employs the maximum entropy (Burg) method [50]. BCI2000 is configured to calculate the power in adjacent bins of 3 Hz width. Within each bin, the power is estimated at evenly spaced frequencies and averaged. In this case, a 3 Hz bin from 9 to 12 Hz is evaluated at 0.2 Hz intervals to obtain the bin power (Equation (5)).
$\hat{P}_{9\text{--}12} = \sum_{\omega_i = 9,\ 0.2}^{12} \left[\, \left| 1 - \sum_{k=1}^{p} a_p(k)\, e^{-jk\omega_i} \right|^{-2} \right]$   (5)
Each of the three output channels thus yields a set of binned power spectrum amplitudes. The linear classifier subsequently translates these features into output control signals. Its output is normalized with respect to mean and variance and used to determine the virtual cursor movement on the computer screen [43].
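To make Equations (3)–(5) concrete, the following sketch estimates the AR coefficients with the Burg method and averages the resulting power over the 9–12 Hz bin in 0.2 Hz steps; the model order used here is a hypothetical choice, not the value configured in BCI2000 for this study.

```python
# Sketch (assumptions): Burg-method AR spectral estimation of the 9-12 Hz bin,
# mirroring Equations (3)-(5). ORDER is a hypothetical AR model order.
import numpy as np

FS = 500.0      # sampling rate (Hz)
ORDER = 16      # hypothetical AR model order p

def burg_ar(x: np.ndarray, p: int):
    """Burg (maximum entropy) estimate of the AR polynomial [1, a1..ap] and
    the prediction-error variance."""
    f = np.asarray(x, float).copy()
    b = f.copy()
    a = np.array([1.0])
    e = np.dot(f, f) / f.size
    for _ in range(p):
        f, b = f[1:], b[:-1]                       # align forward/backward errors
        k = -2.0 * np.dot(f, b) / (np.dot(f, f) + np.dot(b, b))
        f, b = f + k * b, b + k * f                # update prediction errors
        a = np.concatenate([a, [0.0]])
        a = a + k * a[::-1]                        # Levinson update of coefficients
        e *= 1.0 - k * k
    return a, e

def bin_power(x: np.ndarray, f_lo=9.0, f_hi=12.0, step=0.2) -> float:
    """Average AR power over evenly spaced frequencies in [f_lo, f_hi]."""
    a, e = burg_ar(x, ORDER)
    freqs = np.arange(f_lo, f_hi + 1e-9, step)
    w = 2.0 * np.pi * freqs / FS                   # digital frequencies (rad/sample)
    k = np.arange(a.size)
    A = np.exp(-1j * np.outer(w, k)) @ a           # A(e^{jw}) at each frequency
    return float(np.mean(e / np.abs(A) ** 2))      # Equations (4)-(5), scaled by e

# Example on 2 s of one filtered output channel (e.g., C3 after the Laplacian)
mu_power = bin_power(np.random.randn(int(2 * FS)))
```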
Since the cursor task is implemented in BCI2000 and the rehabilitation guide control program was written in LabVIEW [51], a gateway between the two software tools was developed. The connection between LabVIEW and BCI2000 is based on the UDP protocol, with BCI2000 acting as the server and LabVIEW as the client.
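A minimal sketch of such a UDP client, written in Python rather than LabVIEW, is shown below; the port number and the "Name value" line format are illustrative assumptions, not the actual connector configuration used in the study.

```python
# Sketch (assumptions): a minimal UDP receiver standing in for the LabVIEW client
# that listens to BCI2000's connector output. The port (20320) and the
# "Name value" line format are illustrative assumptions.
import socket

HOST, PORT = "127.0.0.1", 20320            # hypothetical connector address

def listen_for_control():
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind((HOST, PORT))                # receive datagrams sent by BCI2000
    while True:
        data, _ = sock.recvfrom(1024)
        for line in data.decode("ascii", errors="ignore").splitlines():
            parts = line.split()
            if len(parts) != 2:
                continue
            try:
                value = float(parts[1])
            except ValueError:
                continue
            # e.g. translate a cursor/control value into a carriage command
            print(parts[0], value)

# listen_for_control()   # would block; shown for illustration only
```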

2.3. Participants

Although the ultimate goal is the therapeutic use of the proposed rehabilitation device, healthy non-disabled participants were selected for this study in order to validate the safety and feasibility of the rehabilitation guide and to determine the most convenient MI strategies.
All procedures performed involving human participants were in accordance with the ethical standards of the Universitat de València research committee and comparable ethical standards. Informed consent was obtained from all individual participants included in the study.
The participants were students from the Universitat de València. They each completed several questionnaires and tests and performed two different BCI tasks. None of them had previous experience with BCIs. A medical history of epilepsy or the intake of psychoactive drugs were exclusion criteria for this experiment; no participant was excluded for these reasons.
Two different experiments concerning BCI task MI strategies were carried out. Fifty participants (10 males, 40 females) with a mean age of 20.18 years (standard deviation (SD) = 3.04) took part in the first experiment. For the second experiment, 127 participants (11 males, 116 females) with a mean age of 18.01 years (SD = 0.59) were selected. Data from participants who did not complete the whole protocol were excluded; the initial number of participants was 90 in the first experiment and 183 in the second.
In order to investigate correlations between BCI performance and user traits, the participants completed a demographic survey and several psychological tests. An initial questionnaire designed by the authors explored participants’ traits, psychological variables, and habitually performed daily activities (physical exercise, video games, musical training, etc.). These activities have been hypothesized to relate to the ability to manage a BCI device [35]. Results regarding these psychological traits and their correlation with BCI performance were reported in [52].

2.4. Experimental Procedure

The session in each experiment lasted for approximately 60 min and was organized as shown in Table 2.
Preparation: Participants were informed about the experiment procedure and signed the informed consent. The Enobio helmet was properly placed on their heads [53]. The task instructions were provided during the habituation period.
Relaxation: Immediately before starting the tests, participants performed a Jacobson’s progressive facial relaxation procedure guided by recorded verbal instructions for 180 s. The role of this relaxation procedure was to induce a relaxed state in the participants. It was conducted because tension has been shown to correlate negatively with motor imagery BCI performance [40].
BCI tasks: Each participant performed two control tasks that differed in the MI strategy. The task order was randomized between subjects.
In the first experiment, a vertical arm rehabilitation was rehearsed (Figure 5). The guide moved the subject’s arm up and down according to the MI performed. In the first strategy, the users followed an action–relaxation instruction, and in the second strategy, they followed an action–action instruction. For the action–relaxation instruction (hand/relax task (HRT)), subjects had to imagine moving their hand to move the guide up and to relax if they wanted to move it downwards. In the action–action task (hand/feet task (HFT)), they had to imagine moving their hand to move the guide up; if they wanted to move the guide down, they were instructed to imagine stretching their feet.
To provide a task reference and feedback to the subjects, a cursor task was shown on the computer screen [43]. Targets appear on the screen and participants were asked to imagine the instructed movements to move the cursor towards the targets.
Each participant performed six 150 s tests (three per type of task), divided into 20 s trials. In each trial, the cursor was visible for a maximum of 20 s, during which participants could succeed (the cursor reached the target) or fail (the cursor did not reach the target); in both cases, a new trial was subsequently initiated. If the 20 s period finished without the cursor reaching either side, a new trial was also started. Hence, the number of trials for each participant depended on the time spent in each trial. The carriage of the rehabilitation guide followed the cursor position between two prefixed limit positions. Participants were asked to imagine repetitive movements for as long as they wanted the cursor, and the guide, to keep moving towards the target.
In the second experiment, a horizontal arm rehabilitation was rehearsed. The guide moved the subject’s arm to the right and to the left according to the MI performed. Two different action–action tasks were compared. The previous HFT was compared with a right hand/left hand task (RLT) for moving the guide in a horizontal position. In this RLT, targets appear on the right and left sides of the screen and participants were asked to imagine right-hand movements to direct the cursor to the right, and the opposite for the left side (Table 3).
After performing the BCI tasks, the participants completed a test to evaluate their subjective experience of the experimental procedure.
To analyze the results, IBM SPSS Statistics software v. 16.0 was used. Paired-samples and independent-samples t-tests, as well as univariate analyses of variance, were performed.
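As an illustration, the paired comparison reported in Section 3 (HRT vs. HFT success rates) can be sketched with SciPy as follows; the per-participant values are placeholders, not the study’s data.

```python
# Sketch (assumptions): paired-samples t-test on per-participant success rates,
# mirroring the HRT vs. HFT comparison; hrt and hft are placeholder data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
hrt = rng.normal(50, 12, size=50)               # placeholder HRT success (%)
hft = rng.normal(58, 12, size=50)               # placeholder HFT success (%)

t_stat, p_value = stats.ttest_rel(hrt, hft)     # paired-samples t-test
print(f"t({len(hrt) - 1}) = {t_stat:.3f}, p = {p_value:.4f}")
```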

3. Results

In the first experiment, an action–action strategy (HFT) was compared with an action–relaxation strategy (HRT) in the MI-BCI task. Figure 6 shows the individual task success comparing the HRT and HFT control strategies. In Figure 6, results for all subjects are ordered according to their performance at the HRT. This ordered structure facilitates the comparison between ‘good’ and ‘bad’ performers for each condition and it shows the variability of the data. Figure 7 shows the group performance averages in each strategy. These percentages are averaged over the three trials in each condition.
Figure 6 and Figure 7 show that the HFT strategy produced better performance: 76% of the participants (38 out of 50) achieved better results with HFT than with HRT. A statistically significant difference between the strategies was found: control in HRT trials was significantly lower than in HFT trials (t(49) = −4.667, p < 0.001).
As expected, average task performance was low for subjects without previous training in MI-BCI. For the HRT, no learning was observed among the participants (operationalized as an increase in the percentage of successful attempts between the first and the last trial); the difference between both attempts was not significant. For the HFT, though, an improvement was observed between the first and the third trial (t = −2.425; p = 0.010).
As shown in the results of this first experiment, for most subjects, it was easier to use an action–action motor imagery strategy than an action–relaxation one.
To compare different action–action motor imagery strategies, a second experiment was designed with a different group of participants (n = 127). In this experiment, the HFT and RLT strategies to control the horizontal movement of the rehabilitation guide were compared. Figure 8 shows the individual task success comparing both strategies. Individual results for all subjects are ordered according to their performance in the RLT. Figure 9 shows the group performance averages for each strategy.
Figure 8 and Figure 9 suggest that the general performances with both action–action strategies (RLT and HFT) are similar. This was confirmed with a mean difference hypothesis test that showed no statistically significant differences between both strategies.
A unifactorial between-groups ANOVA was performed using the group variable as the independent variable (first group, n = 50; second group, n = 127) and the HFT performance index as the dependent variable. Levene’s test indicated compliance with homoscedasticity (F(1,175) = 3.597; p = 0.060). The ANOVA showed no statistically significant differences between the groups in HFT performance (F(1,175) = 2.074; p = 0.152; partial η² = 0.012; 1 − β = 0.299). Consequently, the group means did not indicate greater efficacy in one group (n = 50; mean = 57.91; SD = 11.30) with respect to the other (n = 127; mean = 54.39; SD = 15.80).
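For illustration, the same check (Levene’s test followed by a one-way ANOVA on the HFT performance index) can be sketched with SciPy; the group samples below are placeholders, not the raw scores of the study.

```python
# Sketch (assumptions): Levene's test plus a one-way ANOVA on HFT performance;
# group1 and group2 are placeholder samples, not the study's data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
group1 = rng.normal(57.9, 11.3, size=50)    # experiment 1 participants
group2 = rng.normal(54.4, 15.8, size=127)   # experiment 2 participants

lev_F, lev_p = stats.levene(group1, group2)   # homogeneity of variances
F, p = stats.f_oneway(group1, group2)         # unifactorial (one-way) ANOVA
print(f"Levene F = {lev_F:.3f} (p = {lev_p:.3f}); ANOVA F = {F:.3f} (p = {p:.3f})")
```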
Table 4 shows the results of the initial questionnaire completed by the participants.
Feedback was obtained from the participants about their experience in wearing the EEG cap in the BCI tasks in relation to experienced pain and comfort (Table 5 and Table 6).

4. Discussion

This study presents a low-cost rehabilitation device controlled via an MI-BCI system. The construction of the guide with rapid prototyping techniques such as 3D printing makes it compact, lightweight and economical. The device holding the patient’s hand to the guide was printed in a rigid material, and a thermoforming process was applied to allow a customized fit. The use of open-source BCI software, such as BCI2000, also contributes to the easy replicability of the prototype.
The proposed robotic guide permits different rehabilitation exercises (shoulder and elbow in vertical, horizontal and longitudinal movements), depending on its positioning in relation to the subject. The range and speed of the rehabilitation movements of the guide are adaptable to each patient.
This research was subject to several limitations. Although a higher number of sessions would have been desirable, our participants were students enrolled in a course taught by one of the authors and their participation was part of the course’s program. The end date of the course, as well as the fact that the BCI-related practice was part of the syllabus, precluded further experimental sessions. An additional limitation relates to the present prototype, which only allows unidimensional movements. We are currently working on a robotic device with similar characteristics that will hopefully allow for the rehabilitation of two-dimensional and three-dimensional movements.
Mean contrasts were performed for independent groups using Student’s t-test for all the dichotomous variables of the initial questionnaire versus the performance of each subject in the BCI tasks. No statistically significant differences were detected in any of the variables except for Q4 (t (127) = 2.8981; p = 0.005; SD = 7.62).
From the data of the opinion questionnaire about the experiment, it can be concluded that the experience of the subjects has been good, considering that almost 95% of the respondents reported that wearing the EEG helmet with dry electrodes produced little or no discomfort (Table 5). Similarly, approximately two-thirds of the participants estimated that they could wear the EEG helmet for up to two hours continuously (Table 6). It can be concluded that the use of dry electrodes and wireless EEG signal transmission made the experimental setup tolerable and even comfortable.
In this work, several MI strategies to control the device in different rehabilitation exercises have been compared. The fact that an action–action MI strategy provides better results than an action–relaxation MI strategy can be related to the nature of the EEG signals and their distribution over the scalp. Switching between two different MI tasks associated with opposite sides of the body (e.g., left hand vs. right hand) triggers the activation of different areas of the motor cortex. This makes it easier for the classification algorithms to detect the changes in the EEG data in an action vs. action condition than in an action vs. rest task, in which there are only variations of EEG power in a single brain area.
In this study, an active BCI paradigm was used where the user performed mental tasks voluntarily (thinking about the movement of the right hand, left hand, feet, or relax). Motor imagery was well accepted by the users because it provided a sense of agency compared to other reactive or passive paradigms [54]. Reactive BCI paradigms such as SSVEP occur when the user’s brain reacts to external stimuli (visual, auditory or tactile). In passive BCI paradigms, the user’s mental state is analyzed in real time such as in workload estimations.
The MI rehabilitation paradigm applied in this study is not limited to a specific type of patient condition. According to similar studies [8,55,56], it could be used for post-stroke treatment, spinal cord injury (SCI), trauma, etc. Moreover, as shown in [57], simultaneously combining motor imagery and action observation, compared with simply observing the action, enhances corticomotor excitability, which might result from the activation of mirror neurons. This promising finding suggests that the results obtained in this study might also be applied to the rehabilitation of patients with spinal cord injuries.

5. Conclusions

An MI-BCI-controlled low-cost robotic system for the rehabilitation of arms has been designed. The feasibility and safety of the system were tested with a large sample of healthy participants. All subjects could perform the intended arm movements, and the rehabilitation guide was controlled by the users according to their intentions via MI-EEG. Validation of the BCI control of the rehabilitation guide with a healthy population, as well as the selection of the most effective MI strategy, were necessary steps before applying it to patients.
These results show a better performance with an action–action MI strategy than with an action–relaxation MI strategy. A statistically significant difference between the two action–action motor imagery strategies was not found.
Once the feasibility and safety of the robotic guide for arm rehabilitation have been verified with healthy subjects, the next stage will be to apply the system to patient populations. Since MI-BCI has been used in previous studies with post-stroke patients and the rehabilitation guide does not require active muscle exercise, the authors are confident that this system may become a useful tool in the rehabilitation of arm movements.

Author Contributions

Conceptualization and methodology, E.Q.; experiments design and validation, E.Q., F.S. and G.C.; formal analysis, G.C. and N.C.; data curation G.C.; writing—original draft preparation, E.Q. and N.C.; writing—review and editing, E.Q. and F.S.; supervision, E.Q., F.S., M.J. and L.Á.-K.; funding acquisition E.Q., F.S., M.J. and L.Á.-K. All authors have read and agreed to the published version of the manuscript.

Funding

This research was partially funded by the Instituto de Automática e Informática Industrial AI2, Universitat Politècnica de València. The APC was funded by the Facultad de Educación, Universidad Internacional de La Rioja.

Acknowledgments

The authors want to acknowledge the contribution of Álvaro Uriel and Carlos Catalán in the construction of the prototype. This work has been partially funded by the Instituto de Automática e Informática Industrial AI2, Universitat Politècnica de València.

Conflicts of Interest

The authors declare no conflict of interest.

Abbreviations

The following abbreviations are used in this paper:
AMT     Active motor training
BCI     Brain–computer interface
EEG     Electroencephalography
HFT     Hand/feet task
HRT     Hand/relax task
MI      Motor imagery
PWM     Pulse width modulation
RLT     Right hand/left hand task
SCI     Spinal cord injury
SPSS    Statistical Package for the Social Sciences
SSVEP   Steady-state visual evoked potentials
UDP     User datagram protocol

References

  1. Ang, K.K.; Guan, C. Brain-computer interface in stroke rehabilitation. J. Comput. Sci. Eng. 2013, 7, 139–146. [Google Scholar] [CrossRef] [Green Version]
  2. Langhorne, P.; Bernhardt, J.; Kwakkel, G. Stroke rehabilitation. Lancet 2015, 377, 1693–1702. [Google Scholar] [CrossRef]
  3. Noser, A.; Ro, T.; Boake, C.; Levin, H.; Aronowski, J.; Schallert, T. Constraint-induced movement therapy. Stroke 2004, 32, 2699–2701. [Google Scholar]
  4. Pfurtscheller, G.; Neuper, C. Motor imagery activates primary sensorimotor area in humans. Neurosci. Lett. 1997, 239, 65–68. [Google Scholar] [CrossRef]
  5. McFarland, D.; Miner, L.; Vaughan, T.; Wolpaw, J. Mu and Beta Rhythm Topographies During Motor Imagery and Actual Movements. Brain Topogr. 2000, 12, 177–186. [Google Scholar] [CrossRef]
  6. Kilteni, K.; Andersson, B.; Houborg, C.; Ehrsson, H. Motor imagery involves predicting the sensory consequences of the imagined movement. Nat. Commun. 2018, 1617, 1–9. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  7. Sharma, N.; Pomeroy, V.M.; Baron, J.C. Motor imagery a backdoor to the motor system after stroke? Stroke 2006, 37, 1941–1952. [Google Scholar] [CrossRef] [Green Version]
  8. Tong, Y.; Pendy, J., Jr.; Li, W.; Du, H.; Zhang, T.; Geng, X.; Ding, Y. Motor Imagery-Based Rehabilitation: Potential Neural Correlates and Clinical Application for Functional Recovery of Motor Deficits after Stroke. Aging Dis. 2017, 8, 364–371. [Google Scholar] [CrossRef] [Green Version]
  9. Nicolas-Alonso, L.F.; Gomez-Gil, J. Brain computer interfaces, a review. Sensors 2012, 12, 1211–1279. [Google Scholar] [CrossRef]
  10. Clerc, M.; Bougrain, L.; Lotte, F. Brain-Computer Interfaces 1: Foundations and Methods; ISTE-Wiley: London, UK; Hoboken, NJ, USA, 2016. [Google Scholar]
  11. Clerc, M.; Bougrain, L.; Lotte, F. Brain-Computer Interfaces 2: Technology and Applications; ISTE-Wiley: London, UK; Hoboken, NJ, USA, 2016. [Google Scholar]
  12. Wolpaw, J.R.; Birbaumer, N.; McFarland, D.J.; Pfurtscheller, G.; Vaughan, T.M. Brain-computer interfaces for communication and control. Clin. Neurophysiol. 2002, 113, 767–791. [Google Scholar] [CrossRef]
  13. Placidi, G.; Petracca, A.; Spezialetti, M.; Iacoviello, D. A Modular Framework for EEG Web Based Binary Brain Computer Interfaces to Recover Communication Abilities in Impaired People. J. Med. Syst. 2016, 40, 34. [Google Scholar] [CrossRef] [PubMed]
  14. Hortal, E.; Planelles, D.; Costa, A.; Iáñez, E.; Úbeda, A.; Azorín, J.M.; Fernández, E. SVM-Based Brain-Machine interface for controlling a Robot Arm through four Mental Tasks. Neurocomputing 2015, 151, 116–121. [Google Scholar] [CrossRef]
  15. Collinger, J.L.; Wodlinger, B.; Downey, J.; Wang, W.; Tyler-Kabara, E.C.; Weber, D.J.; Schwartz, A.B. High-performance neuroprosthetic control by an individual with tetraplegia. Lancet 2013, 381, 557–564. [Google Scholar] [CrossRef] [Green Version]
  16. Iturrate, I.; Antelis, J.M.; Kubler, A.; Minguez, J.A. A noninvasive brain-actuated wheelchair based on a P300 neurophysiological protocol and automated navigation. IEEE Trans. Robot. 2009, 25, 614–627. [Google Scholar] [CrossRef] [Green Version]
  17. Edlinger, G.; Holzner, C.; Guger, C.; Groenegress, C.; Slater, M. Brain-computer interfaces for goal orientated control of a virtual smart home environment. In Proceedings of the 4th International IEEE/EMBS Conference on Neural Engineering, Antalya, Turkey, 29 April–2 May 2009. [Google Scholar]
  18. Mihajlović, V.; Grundlehner, B.; Vullers, R.; Penders, J. Wearable, wireless EEG solutions in daily life applications: What are we missing? IEEE J. Biomed. Health Inf. 2015, 19, 6–21. [Google Scholar] [CrossRef]
  19. Lightbody, G.; Galway, L.; McCullagh, P. The brain computer interface: Barriers to becoming pervasive. In Pervasive Health; Holzinger, A., Ziefle, M., Röcker, C., Eds.; Springer: London/England, UK, 2014; pp. 101–129. [Google Scholar]
  20. Tangermann, M.; Lotte, F.; Van Erp, J. Brain-Computer Interfaces: Beyond Medical Applications. IEEE Comput. Soc. 2012, 45, 26–34. [Google Scholar]
  21. Úbeda, A.; Azorín, J.M.; Chavarriaga, R.; Millán, J.d.R. Classification of upper limb center-out reaching tasks by means of EEG-based continuous decoding techniques. J. Neuroeng. Rehabil. 2017, 14, 9. [Google Scholar] [CrossRef] [Green Version]
  22. Wolpaw, J.R.; McFarland, D.J. Control of a two-dimensional movement signal by a noninvasive brain-computer interface in humans. Proc. Natl. Acad. Sci. USA 2004, 101, 17849–17854. [Google Scholar] [CrossRef] [Green Version]
  23. Lotte, F.; Congedo, M.; Lécuyer, A.; Lamarche, F.; Arnaldi, B. A review of classification algorithms for eeg-based brain-computer interfaces. J. Neural Eng. 2007, 4, R1–R14. [Google Scholar] [CrossRef]
  24. Li, M.; Luo, X.; Yang, J.; Sun, Y. Applying a locally linear embedding algorithm for feature extraction and visualization of MI-EEG. J. Sens. 2016, 2016, 7481946. [Google Scholar] [CrossRef]
  25. Soekadar, S.R.; Birbaumer, N.; Slutzky, M.W.; Cohen, L.G. Brain-machine interfaces in neurorehabilitation of stroke. Neurobiol. Dis. 2015, 83, 172–179. [Google Scholar] [CrossRef] [Green Version]
  26. Pfurtscheller, G.; Guger, C.; Müller, G.; Krausz, G.; Neuper, C. Brain oscillations control hand orthosis in a tetraplegic. Neurosci. Lett. 2000, 292, 211–214. [Google Scholar] [CrossRef]
  27. Müller-Putz, G.R.; Scherer, R.; Pfurtscheller, G.; Rupp, R. EEG-based neuroprosthesis control: A step towards clinical practice. Neurosci. Lett. 2005, 382, 169–174. [Google Scholar] [CrossRef] [PubMed]
  28. Horki, P.; Solis-Escalante, T.; Neuper, C.; Müller-Putz, G. Combined motor imagery and SSVEP based BCI control of a 2 DoF artificial upper limb. Med. Biol. Eng. Comput. 2011, 49, 567–577. [Google Scholar] [CrossRef]
  29. Lo, H.; Xie, S. Exoskeleton robots for upper limb rehabilitation: State of the art and future prospects. Med. Eng. Phys. 2012, 34, 261–268. [Google Scholar] [CrossRef] [PubMed]
  30. Sheng, B.; Zhang, Y.; Meng, W.; Xie, C.D.S. Bilateral robots for upper limb stroke rehabilitation: State of the art and future prospects. Med. Eng. Phys. 2012, 38, 587–606. [Google Scholar] [CrossRef]
  31. Tang, Z.; Sun, S.; Zhang, S.; Chen, Y.; Li, C.; Chen, S. A Brain-Machine Interface Based on ERD/ERS for an Upper-Limb Exoskeleton Control. Sensors 2016, 16, 2050. [Google Scholar] [CrossRef] [Green Version]
  32. Ang, K.K.; Guan, C.; Chua, K.; Ang, B.T.; Kuah, C.; Wang, C.; Phua, K.S.; Chin, Z.Y.; Zhang, H. A clinical study of motor imagery-based brain-computer interface for upper limb robotic rehabilitation. In Proceedings of the Annual International Conference of the IEEE Engineering in Medicine and Biology Society, Minneapolis, MN, USA, 3–6 September 2009. [Google Scholar]
  33. Blankertz, B.; Sannelli, C.; Halder, S.; Hammer, E.M.; Kübler, A.; Müller, K.R.; Curio, G.; Dickhaus, T. Neurophysiological predictor of SMR-based BCI performance. Neuroimage 2010, 51, 1303–1309. [Google Scholar] [CrossRef] [Green Version]
  34. Allison, B.Z.; Neuper, C. Could Anyone Use a BCI. In Brain-Computer Interfaces; Applying our minds to Human Computer Interaction; Springer: London, UK, 2010; pp. 33–54. [Google Scholar]
  35. Jeunet, C.; N’Kaoua, B.; Subramanian, S.; Hachet, M.; Lotte, F. Predicting Mental Imagery-Based BCI Performance from Personality, Cognitive Profile and Neurophysiological Patterns. PLoS ONE 2015, 10, e0143962. [Google Scholar] [CrossRef]
  36. Friedrich, E.V.; Neuper, C.; Scherer, R. Whatever works: A systematic user-centered training protocol to optimize brain-computer interfacing individually. PLoS ONE 2013, 8, e76214. [Google Scholar] [CrossRef] [Green Version]
  37. Pfurtscheller, G.; Neuper, C. Motor Imagery and Direct Brain-Computer Communication. Proc. IEEE 2001, 89, 1123–1134. [Google Scholar] [CrossRef]
  38. Vuckovic, A.; Osuagwu, B.A. Using a motor imagery questionnaire to estimate the performance of a Brain-Computer Interface based on object oriented motor imagery. Clin. Neurophysiol. 2013, 124, 1586–1595. [Google Scholar] [CrossRef] [Green Version]
  39. Chaudhary, U.; Birbaumer, N.; Ramos-Murguialday, A. Brain computer interfaces for communication and rehabilitation. Nat. Rev. Neurol. 2016, 12, 513–525. [Google Scholar] [CrossRef] [Green Version]
  40. Jeunet, C.; N’Kaoua, B.; Lotte, F. Advances in User-Training for Mental-Imagery Based BCI Control: Psychological and Cognitive Factors and their Neural Correlates. Progress Brain Res. 2016, 228, 3–35. [Google Scholar]
  41. Schalk, G.; Brunner, P.; Gerhardt, L.A.; Bischof, H.; Wolpaw, J.R. Brain-computer interfaces (BCIs): detection instead of classification. J. Neurosci. Methods 2008, 167, 51–62. [Google Scholar] [CrossRef]
  42. Cester, I.; Dunne, S.; Riera, A.; Ruffini, G. ENOBIO: Wearable, Wireless, 4-channel electrophysiology recording system optimized for dry electrodes. In Proceedings of the International Workshop on Wearable Micro and Nanosystems for Personalised Health, Valencia, España, 21–23 May 2008. [Google Scholar]
  43. Schalk, G.; Mellinger, J. A Practical Guide to Brain-Computer Interfacing with Bci2000; Springer: London, UK, 2010. [Google Scholar]
  44. Schalk, G.; McFarland, D.J.; Hinterberger, T.; Birbaumer, N.; Wolpaw, J.R. BCI2000: A general-purpose brain-computer interface (BCI) system. IEEE Trans. Biomed. Eng. 2004, 51, 1034–1043. [Google Scholar] [CrossRef]
  45. Wolpaw, J.R.; McFarland, D.J.; Neat, G.W.; Forneris, C.A. An EEG-based brain-computer interface for cursor control. Electroencephalogr. Clin. Neurophysiol. 1991, 78, 252–259. [Google Scholar] [CrossRef]
  46. McFarland, D.J.; Neat, G.W.; Read, R.F.; Wolpaw, J.R. An EEG-based method for graded cursor control. Psychobiology 1993, 21, 77–81. [Google Scholar]
  47. Fabiani, G.E.; McFarland, D.J.; Wolpaw, J.R.; Pfurtscheller, G. Conversion of EEG Activity into Cursor Movement by a Brain-Computer Interface (BCI). IEEE Trans. Neural Syst. Rehabil. Eng. 2004, 12, 331–338. [Google Scholar] [CrossRef]
  48. McFarland, D.J.; McCane, L.M.; David, S.V.; Wolpaw, J.R. Spatial filter selection for EEG-based communication. Electroencephalogr. Clin. Neurophysiol. 1997, 103, 386–394. [Google Scholar] [CrossRef]
  49. Zhang, Y.; Ji, X.; Zhang, Y. Classification of EEG Signals Based on AR Model and Approximate Entropy. In Proceedings of the International Joint Conference on Neural Networks, Killarney, Ireland, 12–17 July 2015. [Google Scholar]
  50. Wilson, J.; Williams, J. Massively parallel signal processing using the graphics processing unit for real-time brain-computer interface feature extraction. Front. Neuroeng. 2009, 2, 11. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  51. Bell, I.; Falcon, J.; Limroth, J.; Robinson, K. Integration of hardware into the Labview environment for rapid prototyping and the development of control design applications. In Proceedings of the Mini Symposia UKACC Control, Bath, England, 8 September 2004. [Google Scholar]
  52. Candela, G.; Quiles, E.; Chio, N.; Suay, F. Attentional Variables and BCI Performance: Comparing Two Strategies. In Psychology Applications & Developments IV; Science Press: Lisboa, Portugal, 2018; pp. 130–139. [Google Scholar]
  53. Wilson, J.A.; Schalk, G.; Walton, L.M.; Williams, J.C. Using an EEG-Based Brain-Computer Interface for Virtual Cursor Movement with BCI2000. J. Vis. Exp. JoVE 2009, 29, 1319. [Google Scholar] [CrossRef] [PubMed]
  54. Clerc, M.; Bougrain, L.; Lotte, F. Physiological Markers for Controlling Active and Reactive BCIs. In Brain-Computer Interfaces 1: Foundations and Methods; ISTE-Wiley: London, UK; Hoboken, NJ, USA, 2016. [Google Scholar]
  55. Alonso-Valerdi, L.; Salido-Ruiz, R.; Ramirez-Mendoza, R. Motor imagery based brain-computer interfaces: An emerging technology to rehabilitate motor deficits. Neuropsychologia 2015, 79, 354–363. [Google Scholar] [CrossRef] [PubMed]
  56. Monge-Pereira, E.; Ibañez-Pereda, J.; Alguacil-Diego, I.; Serrano, J.; Spottorno-Rubio, M.; Molina-Rueda, F. Use of Electroencephalography Brain-Computer Interface Systems as a Rehabilitative Approach for Upper Limb Function After a Stroke: A System Review. PmR 2017, 9, 918–932. [Google Scholar] [CrossRef]
  57. Cengiz, B.; Vuralli, D.; Zinnuroğlu, M.; Bayer, G.; Golmohammadzadeh, H.; Günendi, Z.; Turgut, A.; İrfanoğlu, B.; Arikan, K. Analysis of mirror neuron system activation during action observation alone and action observation with motor imagery tasks. Exp. Brain Res. 2018, 236, 497–503. [Google Scholar] [CrossRef]
Figure 1. Rehabilitation guide operation modes. EEG = electroencephalogram. (a) Transversal movement; (b) longitudinal movement; (c) vertical movement.
Figure 2. Rehabilitation guide.
Figure 3. Motor imagery brain–computer interface (MI-BCI) for the rehabilitation process.
Figure 4. Registered EEG channels.
Figure 5. Vertical movement rehabilitation.
Figure 6. Individual task performance in HRT vs. HFT.
Figure 7. Task performance average in HRT vs. HFT.
Figure 8. Individual task performance for RLT vs. HFT.
Figure 9. Task performance average for RLT vs. HFT.
Table 1. Laplacian filter coefficients. Ck = electrode k signal.

Cout    F3      F4      T7      C3      CZ      C4      T8      PZ
C3      −0.25   0       −0.25   1       −0.25   0       0       −0.25
CZ      −0.2    −0.2    0       −0.2    1       −0.2    0       −0.2
C4      0       −0.25   0       0       −0.25   1       −0.25   −0.25
Table 2. Experimental procedure.

Step                          Time (Minutes)   Activities
Preparation and information   15               Filling out consent form; general information (posture, stopping the experiment); initial questionnaire; electrode placement
Relaxation                    5                Jacobson’s progressive facial relaxation procedure
MI-BCI tasks                  30               BCI tasks
Opinion questionnaire         5                Experiment evaluation test
Experimental end              5                —
Table 3. BCI tasks.

Experiment   Task                                          Visual Cue/Description
1            Hand movement versus relax (HRT)              “↑” imagine opening and closing hand; “↓” no movement and relax
1            Hand versus feet movement (HFT)               “↑” imagine opening and closing hand; “↓” imagine both feet flexion
2            Right hand versus left hand movement (RLT)    “←” imagine opening and closing left hand; “→” imagine opening and closing right hand
2            Hand versus feet movement (HFT)               “←” imagine both feet flexion; “→” imagine opening and closing hand
Table 4. Initial questionnaire.

Question                                            Answer   Percentage
Q1: Dexterity                                       Right    92.6
                                                    Left     7.4
Q2: Do you play any musical instrument?             Yes      17.1
                                                    No       82.9
Q3: Do you consider yourself a bilingual person?    Yes      70.9
                                                    No       29.1
Q4: Did you sleep well last night?                  Yes      64.2
                                                    No       35.8
Table 5. Pain perception.

Discomfort         None         Little       Moderate    A Lot    Too Much
Participants (%)   88 (49.7%)   79 (44.6%)   10 (5.7%)   0 (0%)   0 (0%)
Table 6. Comfort perception.

Tolerance Time     <1 h         1–2 h        2–4 h        Almost All Day   All Day
Participants (%)   30 (17.0%)   79 (44.6%)   29 (16.4%)   29 (16.4%)       10 (5.6%)
