Review

A Review of Brain Activity and EEG-Based Brain–Computer Interfaces for Rehabilitation Application

1 School of Biomedical Engineering (Suzhou), Division of Life Sciences and Medicine, University of Science and Technology of China, Hefei 230026, China
2 Suzhou Institute of Biomedical Engineering and Technology, Chinese Academy of Sciences, Suzhou 215163, China
3 Mechanical Engineering Department, Faculty of Engineering at Shoubra, Banha University, Cairo 13518, Egypt
4 Mechatronics and Robotics Department, School of Innovative Engineering Design, Egypt-Japan University of Science and Technology (E-JUST), Alexandria 21934, Egypt
* Authors to whom correspondence should be addressed.
Bioengineering 2022, 9(12), 768; https://doi.org/10.3390/bioengineering9120768
Submission received: 26 October 2022 / Revised: 29 November 2022 / Accepted: 30 November 2022 / Published: 5 December 2022
(This article belongs to the Special Issue Bioengineering for Physical Rehabilitation)

Abstract

Patients with severe central nervous system (CNS) injuries struggle primarily with their sensorimotor function and communication with the outside world. There is an urgent need for advanced neural rehabilitation and intelligent interaction technology to help patients with nerve injuries. Recent studies have established the brain–computer interface (BCI) to provide patients with appropriate interaction methods or more intelligent rehabilitation training. This paper reviews the most recent research on non-invasive BCI-based rehabilitation systems. Various endogenous and exogenous methods, together with their advantages, limitations, and challenges, are discussed. In addition, the paper discusses the BCI modes that allow severely paralyzed and locked-in patients to communicate with the surrounding environment, particularly brain–computer interaction systems utilizing exogenous (evoked) EEG signals such as P300 and SSVEP. This discussion includes an examination of the interfaces for collecting EEG signals, EEG components, and signal post-processing. Furthermore, the paper describes the development of natural interaction strategies, with a focus on signal acquisition, data processing, pattern recognition algorithms, and control techniques.

1. Introduction

Stroke has become one of the leading causes of death worldwide. According to global disease research records, more than 10 million people worldwide suffer from stroke, and up to 116 million people are left with disabilities. This disability affects the daily lives of patients and their families [1]. Stroke damages the central nervous system, and one of its most common consequences is the loss of limb motion. Rehabilitation training is therefore critical for stroke patients. In recent years, the survival rate of stroke cases has increased noticeably. However, there is still high demand for advanced rehabilitation methods that shorten the recovery period and improve motor recovery in post-stroke patients, which makes the concept of brain–computer interface (BCI) rehabilitation systems increasingly relevant. BCIs have been given priority over conventional neuromuscular pathways because they enable stroke patients to communicate with the surrounding environment using their brain signals, bypassing the movement disability of the limbs [2]. This advantage has generated growing interest in the field of rehabilitation. Additionally, the ability to decode the intentions of patients diagnosed with motor disabilities has enabled the use of external rehabilitative or assistive devices, and BCI systems have proved able to apply the neural plasticity concept using neurofeedback [3,4,5].
Furthermore, BCI rehabilitation systems have a great advantage over traditional rehabilitation (e.g., physiotherapist-guided, constraint-induced movement therapy): they are closed-loop, patient-oriented, and stimulate motion, with no need for the paralyzed limbs to retain residual movement [6]. Many clinical studies have reported remarkable enhancements in motor recovery using BCI rehabilitation systems, and recent review articles stated that, among rehabilitation methods for stroke patients, clinical research using BCI rehabilitation systems recorded higher clinical scores under controlled conditions [7,8,9]. Despite these inspiring advantages, some obstacles remain, such as the accuracy of detecting the patient's motor intention, system stability across subjects and across rehabilitation sessions, and real-time brain data processing techniques [10,11,12,13]. Therefore, there is an urgent need to develop innovative BCI paradigms to further improve the practicability and effectiveness of BCI rehabilitation systems.
There are two main approaches to measuring a subject's intention from brain signals: invasive and non-invasive BCI systems. Electrocorticography (ECoG) and electroencephalography (EEG) have risen to prominence as the most often used invasive and non-invasive methods, respectively [14,15]. Invasive recordings capture single-neuron action potentials (single units), multiunit activity (MUA), and local field potentials (LFPs) [16,17]. These approaches successfully exploit the high-quality spatial and temporal properties of these signals to decode biomechanical parameters [18]. However, invasive electrodes have substantial disadvantages due to the risks of surgery and the progressive deterioration of the recorded signals [19]. Therefore, non-invasive techniques have become more common in human subjects. The main non-invasive techniques are magnetoencephalography (MEG), near-infrared spectroscopy (NIRS), functional magnetic resonance imaging (fMRI), and electroencephalography (EEG).
Whichever method is used, invasive or non-invasive, it is very important to monitor the effects of rehabilitation afterwards. Diffusion tensor imaging (DTI) is a powerful method that allows the investigation of tissue microstructure in vivo and non-invasively [20]. It is also a complementary tool that can help verify the quality and effects of rehabilitation treatment, particularly when using a technique that eliminates the impact of systematic errors (BSD-DTI). A high-quality DTI scan can show whether neural fiber tracts are being restored.
Furthermore, new technological advances such as wireless recording, real-time temporal resolution, and machine learning analysis have sparked a surge of interest in non-invasive technologies, particularly EEG-based BCI approaches. EEG is considered the most realistic and practical non-invasive brain–machine interface technique because the alternatives are not portable, are expensive, or are technically challenging. EEG has proven to be the preferred approach because of its direct measurement of cerebral activity, low cost, mobility for clinical usage, ease of use, and adaptability to multiple experimental paradigms [14]. EEG signals can act as a connection between the brain and various external devices, leading to brain-controlled assistive and rehabilitation devices for disabled people and patients with stroke and other neurological deficits [21]. EEG-based BCI solutions depend mainly on the EEG signal properties and on the processing of the EEG signal, as shown in Figure 1, since the strength of the signal and its subsequent processing affect the accuracy of the controlled rehabilitation device.
EEG-BCI illiteracy, meaning that a fraction of users (about 20%) cannot modulate their EEG signals well enough to control a BCI system, may impede the broad application of EEG-BCI technology [22]. Nevertheless, in 2015, K. Ang et al. [23] reported that 103 of 125 stroke patients successfully used EEG to modulate and control a BCI neurorehabilitation system, demonstrating the practicability of EEG-BCI rehabilitation systems for stroke patients during rehabilitation sessions.
A suitable paradigm and protocol should be carefully established for all experimental phases when constructing an EEG-based BCI system for a given application. Each paradigm has advantages and drawbacks with respect to the patient's physical state and user friendliness. In each paradigm, the participant performs a specific endogenous or exogenous task (e.g., an imagery or visual task) to learn how to modify their brain activity while EEG signals are captured from the scalp. A neural decoder for the paradigm is created using the recorded EEG activity as training data. The individual then repeats the task, and the neural decoder is employed for BCI control.
BCI systems have been used in two directions. The first examines brain activity to see whether a feed-forward route can be utilized to control external devices, without targeting rehabilitation [12]. The other dominant direction is closed-loop BCI systems for neurorehabilitation, in which the feedback loop plays a critical role in restoring neural plasticity through training or in regulating brain processes [12].
Limb dysfunction is one of the most common symptoms in stroke patients; according to recent statistics, 80% of stroke patients suffer from limb dysfunction [24,25], affecting the daily lives of patients and their families. It also greatly impacts patients' self-confidence, their economic and social lives, and the country's labor force. For all of these reasons, finding an innovative rehabilitation method has become an urgent need, both to replace the traditional approach, whose main weakness is the slow restoration of motor function, and to provide a rehabilitation method that does not require the dysfunctional limb's contribution. Due to the plasticity of the central nervous system, repetitive rehabilitation training based on brain signals allows the brain to establish new connections between the central nervous system and the dysfunctional limb, which can positively contribute to treating stroke patients. The role of BCI systems is to help patients perform practical rehabilitation training for the dysfunctional limb without requiring any contribution from that limb, which is a crucial point for stroke and post-stroke patients. To address the problems of traditional rehabilitation, different rehabilitation robotic systems have been built to help patients complete repetitive training through external rehabilitation robots combined with BCI systems [26,27]. Owing to continuous research on rehabilitation robots, the United States and European medical BCI system markets are growing steadily, and more stroke patients with hand dysfunction benefit from rehabilitation robotics with BCI systems. As shown in Figure 2, the annual trend of research publications related to BCI systems and rehabilitation robots in the Web of Science database indicates the promise of BCI systems. Moreover, the scarcity of review papers shows a need for reviews that help researchers better understand the field and keep up with new trends and achievements.
Several review papers have been published on BCI rehabilitation systems [28,29,30,31], but few of them describe in detail the applications of BCI systems in different fields, how the BCI concept was applied in these systems, the drive modes, and the control strategies. In this paper, we review the development of BCI systems over the last five years and provide an overview of electroencephalography (EEG), BCIs, and the different paradigms based on exogenous and endogenous EEG signals, including their advantages and disadvantages and their suitability for different applications, together with the control strategies, starting with EEG signal pre-processing and classification. Finally, we review the latest BCI applications in rehabilitation based on assistive robots and virtual reality technology. Figure 3 illustrates the various applications of BCI systems and shows the importance of BCIs in different sectors, such as gaming, rehabilitation, and industry. The rest of the paper is organized as follows: Section 2.1 describes EEG signals and their characteristics. Section 2.2 describes the different paradigms used in BCI systems based on endogenous and exogenous EEG signal reactions. Section 3.1 and Section 3.2 describe the control strategies, including signal processing and classification methods for EEG signals. Section 4.1 and Section 4.2 analyze the state of the art of BCI systems based on assistive robot technology and virtual reality technology. Finally, the conclusion summarizes the whole paper.

2. Overview of EEG and BCI

2.1. Electroencephalography (EEG)

Electroencephalography (EEG) is the most often used brain signal in brain–machine interface applications. EEG measures the electrical activity generated by currents flowing within neurons of the brain. By placing electrodes on the scalp, the EEG signal can be detected non-invasively [32]. Several electrode placement systems exist, namely the (10-5), (10-10), and (10-20) EEG systems, of which the (10-20) system is the most widely used. The 10-20 system concept is illustrated in Figure 4.
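As a minimal illustration (not drawn from any reviewed study), 10-20 electrode labels can be treated as a simple lookup when selecting channels of interest; the 19-channel cap ordering below is an assumption made for this sketch, not a fixed standard:

```python
# A hypothetical 19-channel cap laid out with international 10-20 labels;
# the ordering below is an assumption for this sketch, not a standard.
CHANNELS_10_20 = [
    "Fp1", "Fp2", "F7", "F3", "Fz", "F4", "F8",
    "T3", "C3", "Cz", "C4", "T4",
    "T5", "P3", "Pz", "P4", "T6", "O1", "O2",
]

def channel_indices(labels):
    """Map 10-20 channel labels to row indices in the recording array."""
    return [CHANNELS_10_20.index(lab) for lab in labels]

# Motor-imagery studies typically analyse the sensorimotor strip:
motor_idx = channel_indices(["C3", "Cz", "C4"])
```

Selecting channels by standardized label rather than by raw index keeps analysis code portable across caps and amplifiers.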
Several factors contribute to this popularity compared with other brain wave measurement methods: EEG is non-invasive, low cost, compatible with other modalities, portable, and has a high temporal resolution (on the order of 1 ms). This explains why EEG is the most widely used tool to measure brain activity [34]. However, it has a poor signal-to-noise ratio, is prone to artifacts, and has a low spatial resolution [35]. The waveforms measured in an EEG test reflect the electrical activity of the brain. EEG signal amplitudes are relatively low (measured in microvolts), and EEG rhythms are classified into frequency bands ranging from theta (θ) to gamma [36] depending on their frequency range, as presented in Table 1.
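The band decomposition in Table 1 can be sketched numerically: a common feature is the power of an EEG trace inside each canonical band, estimated from a Welch power spectral density. The band boundaries below are illustrative assumptions (they vary slightly across the literature and may differ from Table 1):

```python
import numpy as np
from scipy.signal import welch

# Canonical EEG frequency bands in Hz (boundaries are assumptions and
# vary slightly across the literature).
BANDS = {"delta": (0.5, 4), "theta": (4, 8), "alpha": (8, 13),
         "beta": (13, 30), "gamma": (30, 45)}

def band_powers(signal, fs):
    """Summed Welch PSD per band (a simple band-power feature)."""
    freqs, psd = welch(signal, fs=fs, nperseg=min(len(signal), 2 * fs))
    return {name: psd[(freqs >= lo) & (freqs < hi)].sum()
            for name, (lo, hi) in BANDS.items()}

fs = 250                                # a typical EEG sampling rate
t = np.arange(0, 4, 1 / fs)
mu_rhythm = np.sin(2 * np.pi * 10 * t)  # synthetic 10 Hz oscillation
powers = band_powers(mu_rhythm, fs)     # the alpha band dominates
```

For a pure 10 Hz oscillation, nearly all spectral power falls in the alpha band, mirroring how band power separates the rhythms listed in Table 1.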

2.2. BCI Has Different Paradigms Based on Exogenous and Endogenous EEG Signals

The BCI system based on EEG signals mainly depends on how the EEG signal is elicited, i.e., whether the brain activity is generated internally or by external stimulation. The most used internal and external paradigms are as follows: motor imagery paradigms (the imagined body kinematics paradigm and the sensorimotor rhythms (SMR) paradigm); external stimulation paradigms (visual P300 paradigms, steady-state visual evoked potential (SSVEP) paradigms, and error-related potentials); hybrid paradigms; and others, such as the discrete movement intention paradigm, the auditory paradigm, the somatosensory (tactile) paradigm, and the reflexive semantic conditioning paradigm.

2.2.1. Endogenous EEG Signal

In the endogenous BCI technique, the EEG signal is generated independently of external stimulation and may be fully and freely regulated by the individual. It is also helpful for patients with neurological problems, as it allows more natural and spontaneous interactions because the neuroprosthesis is controlled directly [51]. However, it often needs a longer training time and offers a lower bit rate than SSVEP and P300. Sensorimotor rhythms (SMR) and slow cortical potentials (SCP) are two examples of endogenous EEG signals [56]. SMR exhibits two types of amplitude modulation: event-related synchronization (ERS) and event-related desynchronization (ERD). Sensorimotor rhythms are composed of mu and beta rhythms, which occur in the mu (7–13 Hz) and beta (13–30 Hz) bands, respectively. In both motor imagery and active movement, ERD manifests as a reduction in EEG power associated with motion-related activities. Sensorimotor rhythms are crucial to motor imagery tasks, even when no actual movement is present [57]. Practicality can be improved by using sensorimotor rhythms to create endogenous BCIs, which are more helpful than exogenous BCIs in practical applications. In their survey, Maged S. AL-Quraish et al. concluded that 57 percent of the selected studies employed motor imagery tasks. ERD is the most prevalent signal used to operate assistive devices, as in the following studies [58,59,60,61,62,63]. Movement-related cortical potentials (MRCPs) reflect fundamental processes proportional to motor execution and are connected to both active and imagined motor tasks [64]. Xu et al. [65] showed that MRCPs depend on force-related characteristics that can be used to generate control signals for impaired people controlling assistive robots, and also used MRCPs to identify imagined ankle dorsiflexion motions from scalp EEG with a short delay.
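The ERD definition above can be sketched as a relative band-power change between a resting baseline and a task interval. This is a minimal sketch on synthetic signals; the sampling rate, band, and amplitudes are assumptions:

```python
import numpy as np
from scipy.signal import welch

def band_power(x, fs, lo, hi):
    """Summed Welch PSD inside [lo, hi) Hz."""
    f, p = welch(x, fs=fs, nperseg=min(len(x), fs))
    return p[(f >= lo) & (f < hi)].sum()

def erd_percent(baseline, task, fs, band=(8, 13)):
    """ERD/ERS as the relative band-power change of a task interval
    against a resting baseline; negative values = desynchronization."""
    r = band_power(baseline, fs, *band)
    return 100.0 * (band_power(task, fs, *band) - r) / r

fs = 250
t = np.arange(0, 2, 1 / fs)
rest = np.sin(2 * np.pi * 10 * t)           # strong mu rhythm at rest
imagery = 0.5 * np.sin(2 * np.pi * 10 * t)  # attenuated during imagery
change = erd_percent(rest, imagery, fs)     # about -75 (power scales with amplitude squared)
```

Halving the mu amplitude quarters the mu power, so the ERD comes out near -75%, the kind of drop a motor-imagery classifier looks for.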
MRCPs may also be combined with other EEG modalities to regulate the ambulation of a lower-extremity exoskeleton while walking, as demonstrated in [66].

2.2.2. Exogenous Evoked Potentials, EEG Signal

Exogenous BCI EEG signals are produced in response to external inputs such as visual or auditory cues. External stimuli, such as flashing LEDs or music, can influence brain activity; the altered EEG activity is recorded and processed to control real or virtual items. External stimulation can take the form of visual [67,68], auditory [68,69], or somatosensory [70] stimulation. Visual P300 paradigms and the steady-state visual evoked potential are the most popular forms of external stimulation paradigms.
This technique has the advantage of requiring little subject training and achieving a high bit rate of up to 60 bits/min [71]. However, users must constantly focus on the external cue or stimulus, which limits its application, and the continuous stimulation can quickly fatigue users [72]. Exogenous EEG signals include the steady-state visual evoked potential (SSVEP) and the P300 [73]. SSVEP is the natural response to visual stimuli at specific frequencies [74]: when a person looks at a light flashing at a given frequency, the visual cortex responds with an EEG signal at the same frequency. SSVEPs are used in exoskeleton robots to provide control signals: the user is presented with several control inputs on a screen, such as moving forward or moving left, which the subject may select by focusing on them. For example, Kwak et al. [75] used a visual stimulation screen with five LEDs attached to the exoskeleton to elicit SSVEPs, with each LED denoting a different control instruction (e.g., standing, walking ahead, and turning left/right). Another type of exogenous EEG signal is the P300, which occurs approximately 300 milliseconds after the subject notices an external stimulus. Like SSVEP, P300-based control lets the user select one of several potential instructions by attending to the stimulus intended to trigger a P300 response. The P300 does not require any training; however, it has a slower data transfer rate than SSVEP.
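The SSVEP selection principle described above (the visual cortex follows the flicker frequency) can be sketched as a spectral-peak comparison across the candidate stimulus frequencies. This is a simplified illustration on synthetic data; real decoders average channels, score harmonics, and use longer calibration:

```python
import numpy as np

def detect_ssvep(signal, fs, candidate_freqs, tol=0.2):
    """Pick the candidate stimulus frequency with the largest spectral
    peak (fundamental only; practical decoders also score harmonics)."""
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), 1 / fs)
    scores = [spectrum[np.abs(freqs - f0) <= tol].max()
              for f0 in candidate_freqs]
    return candidate_freqs[int(np.argmax(scores))]

rng = np.random.default_rng(0)
fs = 250
t = np.arange(0, 4, 1 / fs)
# Simulated occipital EEG: response to a 12 Hz flicker buried in noise.
eeg = np.sin(2 * np.pi * 12 * t) + 0.5 * rng.standard_normal(len(t))
command = detect_ssvep(eeg, fs, [8, 10, 12, 15])
```

Each candidate frequency maps to one command (e.g., one of Kwak et al.'s five LED instructions), so picking the strongest spectral peak is equivalent to decoding the user's gaze target.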

3. EEG Control Strategies

Different control mechanisms for human–robot interaction have been established. The first approach employs a control scheme that predicts or follows the subject's intention based on data acquired from the exoskeleton; only the information acquired from the exoskeleton is used to predict the user's movement intention. Two closed loops are needed for this scheme, and the first controller reflects the effect of the user and the actuator on the exoskeleton [51]. The second control strategy uses a scheme based on the interaction force, which can be measured from the deformation of an elastic transmission element or structure coupled to an exoskeleton robot link. Low-level control techniques (direct control) were used to a large extent in previously demonstrated EEG-based BCI-controlled robotic arm systems. However, under a low-level control method users must issue control commands frequently, which can lead to user fatigue [76]. EEG is a popular non-invasive technology for capturing brain activity, and EEG signals are analyzed and translated into control commands [77]. The interface between the human and the wearable robot is crucial for an efficient and successful control scheme that predicts the user's intention to move. Consequently, control schemes can be categorized according to the human–robot interaction: exoskeleton information is obtained from the interaction force measured between the exoskeleton and the human, while physiological signals measured from the human body reflect the user's movement intention. Electroencephalography (EEG) signals significantly impact the development of assistive rehabilitation devices and have recently been employed as a common way to explore human motion behavior and functions [51].
Human motion intention (HMI) decoded from EEG can control different kinds of robots to assist paralyzed persons with neuromuscular diseases, such as amyotrophic lateral sclerosis and stroke, in rehabilitation training. Compared with the traditional approach of repeated motion, a large body of research suggests that EEG-based assistive robots enhance patients' recovery, essentially by helping to reestablish the neural circuit between the brain and the muscles [78]. Brain potentials captured by scalp electrodes are converted into commands for controlling robot arms, exoskeletons, wheelchairs, or other robots through brain–computer interface algorithms. Slow cortical potentials, the event-related P300, and steady-state visual evoked potentials are several EEG processes that distinguish EEG-based brain–computer interfaces [77]. In terms of reliance on residual motor output, BCIs can be divided into dependent and independent BCIs. A dependent brain–computer interface requires people to use some form of motor control, such as gaze. The brain–computer interface based on motor imagery is one of the most commonly used BCI paradigms [79]. Independent BCIs, such as P300 evoked potentials, steady-state visual evoked potentials (SSVEPs), sensorimotor rhythms, motion-onset visual evoked potentials, and slow cortical potentials, can be utilized to extract control signals. SSVEPs are periodic evoked potentials (PEPs) generated by rapidly repeating visual stimulation, particularly at frequencies greater than 6 Hz; stimulation frequencies of 5–20 Hz produce the most significant response to visual inputs. SSVEPs are most abundant in the occipital and parietal lobes, and their frequency corresponds to the fundamental frequency and harmonics of the frequency-coded inputs. By extracting frequency information from this signal, an SSVEP-BCI system can identify the user's intended command, such as moving a cursor on a computer screen or operating a robot arm [80].
SSVEP-based BCIs have a high information transfer rate (ITR) and need little user training. They are easier to encode with more instructions without much training and show good promise for high-speed communication [76], but are limited by a small number of controls. In other words, SSVEP-BCIs of various classes can be realized using flickering lights of different frequencies. These flashing stimuli, presented via light-emitting diodes (LEDs) or a computer display, modulate EEG signals at the stimulation frequency and its harmonics. The frequency components of SSVEP can be calculated using the lock-in analyzer system (LAS), power spectral density analysis, or canonical correlation analysis (CCA) [81].
BCIs based on SSVEP and the P300 component can be set up with little or no training, but they require external stimuli. In contrast, BCIs based on sensorimotor rhythms (SMR) and slow cortical potentials (SCP) do not require external input but do require significant user training [81]. Assistive robots can be controlled more easily via event-related potentials (ERPs), which are brain voltage fluctuations in response to specific stimuli such as sights or sounds. A lower limb prosthesis based on the P300, the peak detected around 300 ms (250–500 ms) after a given event, has been developed to assist persons in walking. Motor imagery (MI) has also been explored to tightly connect brain commands and bodily movement responses; a method for post-stroke rehabilitation activities that uses MI to control a robot driving the arm, by allowing individuals to visualize moving their hands, has been demonstrated [77]. Because of their effectiveness over traditional BCIs, hybrid BCIs (hBCIs), which "detect at least two brain modalities in a simultaneous or sequential pattern," have been emphasized for control applications [82]. Researchers have examined multiple regions of the brain to increase the number of commands, improve classification accuracy, reduce signal detection time, and shorten brain command detection time. For example, SSVEPs and event-related potentials (ERPs) have been mixed to generate a hybrid EEG paradigm; the combination of SSVEP and P300 signals for BCI is a good example. SSVEP has also been paired with motor imagery (MI) [83]. EEG has also been hybridized with electrooculography (EOG), functional near-infrared spectroscopy (fNIRS), electromyography (EMG), and eye trackers [84].

3.1. EEG Signal Preparation Overview

To operate external devices such as an upper or lower limb exoskeleton using an EEG signal, the individual must generate various brain activity patterns (motor imagery or motor execution), which are identified and translated into control commands [84]. The detected brain signal is preprocessed to remove artifacts and prepare it for machine learning, turning EEG signals into control commands that operate terminal devices. Feature extraction then begins, and the extracted features are submitted to a feature reduction procedure if necessary. Finally, the new projected feature vectors are divided into classes according to the task. A BCI system thus has four major components: signal acquisition, signal preprocessing, feature extraction, and classification, as depicted in Figure 5.
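These four stages can be chained end-to-end in a few lines. The sketch below simulates acquisition with synthetic trials and connects a band-pass preprocessing step, band-power feature extraction, and an LDA classifier; the 8–30 Hz band, signal parameters, and trial counts are illustrative assumptions, not values from the reviewed studies:

```python
import numpy as np
from scipy.signal import butter, filtfilt, welch
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

def preprocess(x, fs):
    """Stage 2: 8-30 Hz band-pass (mu + beta), removes drift and EMG."""
    b, a = butter(4, [8 / (fs / 2), 30 / (fs / 2)], btype="band")
    return filtfilt(b, a, x)

def extract_features(x, fs):
    """Stage 3: mu (8-13 Hz) and beta (13-30 Hz) band power."""
    f, p = welch(x, fs=fs, nperseg=fs)
    return [p[(f >= 8) & (f < 13)].sum(), p[(f >= 13) & (f < 30)].sum()]

rng = np.random.default_rng(0)
fs = 250
t = np.arange(0, 2, 1 / fs)

def acquire(mu_amplitude):
    """Stage 1 (simulated): a noisy trial with a given mu-rhythm strength."""
    return mu_amplitude * np.sin(2 * np.pi * 10 * t) + rng.standard_normal(len(t))

# Rest trials show a strong mu rhythm; imagery trials show ERD (weaker mu).
trials = [acquire(1.0) for _ in range(20)] + [acquire(0.3) for _ in range(20)]
X = np.array([extract_features(preprocess(tr, fs), fs) for tr in trials])
y = np.array([0] * 20 + [1] * 20)

clf = LinearDiscriminantAnalysis().fit(X, y)  # stage 4: classification
```

In a deployed system, stage 1 would come from the amplifier in real time and the classifier's output would be mapped to device commands; the pipeline structure stays the same.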
The user conducts MI of the limb, which is encoded in EEG readings; features describing the task are deciphered, processed, and transformed into commands to control the assistive robot equipment.
Brain signals are captured in the signal acquisition stage, which may also include noise reduction and artifact processing. Skin impedance fluctuations, electrooculographic activity, eye blinks, electrocardiographic activity, facial/body EMG activity, and respiration can all cause EEG artifacts. A bandpass filter can be an effective preprocessing tool because the frequency ranges of these physiological signals are typically known.

3.2. Feature Extraction

The purpose of the feature extraction stage is to identify distinguishable information in the recorded brain signals. The EEG signals can then be mapped onto feature vectors that capture the discriminative characteristics of the measured signals. Some methods divide the signals into short segments from which the parameters can be calculated; the segment length impacts the accuracy of the estimated features. Wavelet transforms or adaptive autoregressive components are preferred for highlighting non-stationary time changes in brain signals [14].
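The segmentation step mentioned above is usually a sliding window over the continuous recording, with the window and step lengths as free parameters (the 1 s window with 50% overlap below is an assumption for illustration):

```python
import numpy as np

def sliding_windows(signal, fs, win_s=1.0, step_s=0.5):
    """Split a continuous recording into overlapping segments;
    features are then computed per segment."""
    win, step = int(win_s * fs), int(step_s * fs)
    starts = range(0, len(signal) - win + 1, step)
    return np.stack([signal[s:s + win] for s in starts])

fs = 250
x = np.arange(5 * fs, dtype=float)        # 5 s of samples
segments = sliding_windows(x, fs)         # 1 s windows, 50% overlap
```

Shorter windows track non-stationarity better but yield noisier feature estimates, which is exactly the trade-off the paragraph above refers to.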
Several distinct feature extraction techniques have been utilized, including the autoregressive model, the discrete wavelet transform, the wavelet packet transform, and sample entropy. Redundant and irrelevant information is managed by feature selection methods, which benefits classification. To improve the performance of feature selection, a global optimization strategy based on binary particle swarm optimization (BPSO) has been presented [85,86]. To evaluate the efficacy of feature extraction, class separability experiments have been conducted: using a 14-channel EEG machine, scalp EEG data were recorded from 21 healthy subjects aged 12 to 14 years who viewed images containing one of four distinct emotional stimuli (happy, calm, sad, or scared).
A balanced one-way ANOVA was then used to determine the most useful EEG characteristics; statistics-based feature selection outperformed manual or multiple-variable selection. Support vector machines, k-nearest neighbors, linear discriminant analysis, naive Bayes, random forests, deep learning, and four ensembles were used to classify emotions using the most effective features [87,88]. In addition, a Markov model has been employed to process simulated EEG signals based on actual EEG signals; simulated and experimental results demonstrate that the performance of the proposed method is superior to that of widely used methods [89,90]. The proposed method can prevent the mixing of components of EEG signals with complex structures and can extract brain rhythms from EEG signals with a low SNR.
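The ANOVA-based selection described above amounts to ranking features by their between-class F-statistic. A minimal sketch on synthetic data (the class structure and feature count are assumptions, not the cited study's data):

```python
import numpy as np
from scipy.stats import f_oneway

def rank_features(X, y):
    """Rank feature columns by the one-way ANOVA F-statistic across
    classes (a larger F suggests better class separability)."""
    classes = np.unique(y)
    fvals = [f_oneway(*[X[y == c, j] for c in classes]).statistic
             for j in range(X.shape[1])]
    return np.argsort(fvals)[::-1]

rng = np.random.default_rng(1)
y = np.repeat([0, 1, 2, 3], 30)        # four emotion classes
X = rng.standard_normal((120, 5))      # five candidate features
X[:, 2] += y                           # only feature 2 carries class info
ranking = rank_features(X, y)          # feature 2 should rank first
```

The top-ranked features would then be passed to the classifiers listed above, mirroring the statistics-based selection pipeline.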
The most common features in EEG-based BCIs include spatial filters, band power, and time points [91]. In addition, stationary subspace analysis (SSA), which decomposes multivariate time series into stationary and non-stationary components, has recently been presented to cope with the non-stationarity of EEG data [14]. The retrieved feature vector is then used to train a classifier [92].

3.3. Classification

The complex structure of EEG during human cognition is easily affected by small changes; as a result, a highly efficient and robust classifier is required. In a BCI system, the objective of the classification step is to recognize the user's intention from the feature vector that characterizes the brain activity produced in the feature step. This goal can be achieved using regression or classification methods, although classification techniques are currently the preferred option. Regression methods use features retrieved from EEG signals as independent variables to predict user intentions, whereas classification algorithms use the extracted features as independent variables to create boundaries between targets in the feature space [14].
Classification algorithms turn the extracted data into distinct motor activities, such as hand gestures, foot movements, and word production, in motor imagery brain–computer interfaces [79]. Combining several signal characteristics from different modalities/devices for the same brain activity can increase classification accuracy; for example, finger-tapping and hand/arm movement have been detected using a combination of EEG and fNIRS [84]. Machine learning (ML) and deep learning (DL) techniques have been used to decode EEG-based BCIs; with each successive session, machine learning techniques allow the brain–computer interface to learn from the subject's brain, modifying the generated classification rules and thereby increasing the effectiveness of the system [79]. Machine learning algorithms are divided into three groups: supervised, unsupervised, and reinforcement learning [79]. Moreover, deep learning approaches have been shown to improve classification accuracy; deep networks can also detect latent structures or patterns in raw data [92], and, combined with machine learning algorithms, robots can learn innate movement patterns and estimate human intentions [93].
Various classification algorithms have been implemented, including k-nearest neighbors (k-NN), multilayer perceptron (MLP), decision trees [92], convolutional neural networks (CNN) [83], linear discriminant analysis (LDA), and support vector machines (SVM) [79], with the SVM classifier outperforming classifiers such as LDA and k-NN [79]. When the classification accuracies of LDA, SVM, and a backpropagation neural network (BPNN) were compared, the former two classifiers produced similarly high accuracies, both significantly higher than that of the BPNN [59]. Compared with PCA, a recurrent neural network (RNN) obtained a control accuracy of 94.5% at a time cost of 0.61 ms, whereas the PCA algorithm achieved a control accuracy of 93.1% at a time cost of 0.48 ms [94].
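The classifier comparison described above can be reproduced in outline with scikit-learn; the synthetic two-class features below stand in for real motor-imagery band powers, so the accuracies are illustrative rather than those reported in [79].

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

# Synthetic two-class features standing in for left- vs right-hand MI band powers
rng = np.random.default_rng(42)
X = np.vstack([rng.normal(0.0, 1.0, (100, 16)),    # class 0 epochs
               rng.normal(1.0, 1.0, (100, 16))])   # class 1 epochs
y = np.array([0] * 100 + [1] * 100)

results = {}
for name, clf in [("LDA", LinearDiscriminantAnalysis()),
                  ("SVM", SVC(kernel="rbf")),
                  ("k-NN", KNeighborsClassifier(n_neighbors=5))]:
    results[name] = cross_val_score(clf, X, y, cv=5).mean()  # 5-fold accuracy
    print(f"{name}: {results[name]:.2f}")
```

On real EEG data, relative rankings depend heavily on the feature set and the amount of training data per subject.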
A convolutional neural network (CNN)-based deep learning framework has been employed for inter-subject continuous decoding of motor-imagery-related electroencephalographic (EEG) signals. Results obtained on the publicly available BCI Competition IV-2b dataset show that the two training methods, adaptive moment estimation (Adam) and stochastic gradient descent (SGD), yield average continuous decoding accuracies of 71.49% (±0.42) and 70.84% (±0.42), respectively [95,96].
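A minimal CNN of the kind used for MI decoding might look as follows; this is a PyTorch sketch with illustrative layer sizes, not the architecture of [95,96]. Either optimizer from the study can then be attached.

```python
import torch
import torch.nn as nn

class ShallowMICNN(nn.Module):
    """Minimal CNN for 2-class motor-imagery decoding of (channels x samples) epochs."""
    def __init__(self, n_channels=3, n_samples=250, n_classes=2):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 8, kernel_size=(1, 25), padding=(0, 12)),  # temporal filters
            nn.Conv2d(8, 16, kernel_size=(n_channels, 1)),          # spatial filters
            nn.BatchNorm2d(16), nn.ELU(),
            nn.AvgPool2d(kernel_size=(1, 15), stride=(1, 15)),      # temporal smoothing
        )
        self.classify = nn.Linear(16 * (n_samples // 15), n_classes)

    def forward(self, x):                  # x: (batch, 1, channels, samples)
        return self.classify(self.features(x).flatten(1))

model = ShallowMICNN()
x = torch.randn(4, 1, 3, 250)              # batch of 4 synthetic EEG epochs
logits = model(x)                          # shape (4, 2)

# The two training methods compared in the study:
opt_adam = torch.optim.Adam(model.parameters(), lr=1e-3)
opt_sgd = torch.optim.SGD(model.parameters(), lr=1e-2, momentum=0.9)
```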
Pattern recognition follows feature classification: once the EEG signal has been assigned to distinct classes, a further step determines the recognized pattern. Pattern recognition has its roots in statistics and engineering, and many contemporary approaches rely on machine learning; its methods are also used in statistical data analysis, signal processing, image analysis, information retrieval, bioinformatics, data compression, and computer graphics.

4. Application of EEG in BCI Systems

EEG signals can offer a channel from the brain to external devices, providing brain-controlled assistive devices for disabled users and brain-controlled rehabilitation equipment for patients with stroke and neurological abnormalities [97,98]. Control equipment such as wheelchairs [97] and communication aid systems [98] have been programmed using EEG signals, as shown in Figure 6. Throughout the last decade, various EEG approaches have proven to be a viable strategy for controlling rehabilitative and assistive equipment.

4.1. BCI-Assistive Robot Rehabilitation Application

In 2016, the Applied Neurotechnology Laboratory, University Hospital Tübingen, Germany, used a hybrid non-invasive neural hand exoskeleton with six paraplegic subjects (aged 30 ± 14 years) to control flexion/extension of the paralyzed wrist and fingers, as shown in Figure 7. The exoskeleton was controlled by wirelessly transmitting the EOG and EEG signals to a tablet, which performed the signal preprocessing, converted the final signal record into a control command, and sent it to the control box and then to the actuator, which moved the hand mechanism via a flexible cable system. Their system demonstrated that assistive brain/neural systems can help paraplegic patients perform daily activities independently, such as holding a cup and drinking, eating with cutlery, or manipulating different objects [99].
In 2019, considering the needs of hand rehabilitation, Zhang Jinhua et al. created a multimodal human-machine interface system using three bio-signals: electroencephalography (EEG), electrooculography (EOG), and electromyography (EMG), as shown in Figure 8. These bio-signals generate multiple control commands for a multitask, real-time soft assistive robot; the authors also investigated patient acceptance of a wearable robotic hand for assisted hand movement, as shown in Figure 9. To apply the concept of using EEG, EOG, and EMG together, six subjects were recruited to perform motor imagery for EEG, look left/right for EOG, and make different hand movements for EMG. Each subject spent under 2 min of training to set the EEG/EOG mode parameters. The experiment scenario was as follows: a black screen for 2 s, then a cross at the screen center lasting until 4 s, after which a cue picture in a dashed border appeared for 2 s. In the EOG mode, a left or right arrow appeared on the screen to guide the subject to track the arrow by looking left or right and then blinking. In the EEG mode, the subject imagined left- or right-hand movement in response to a 2 s on-screen cue. For training, the EOG and EEG models each comprised ten trials: five left and five right arrows, and five left-hand and five right-hand motor imagery trials, respectively. With this model, the number of achievable control commands is significantly greater than in single-mode systems. The multimodal system achieved a classification precision of 93.83% with an information transfer rate of 47.41 bit/min, meaning the user can issue 17 actions per minute, which is convenient for disabled patients [100].
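A hybrid interface of this kind ultimately needs a decision-fusion layer that turns per-modality classifier outputs into one command stream. The sketch below is hypothetical: the label names and priority order are illustrative assumptions, not the actual command set of [100].

```python
# Hypothetical decision-fusion layer for a hybrid EEG/EOG/EMG interface.
def fuse(eeg: str, eog: str, emg: str) -> str:
    """Map per-modality classifier outputs to a single robot command."""
    if emg == "fist":                       # muscle activity takes priority
        return "grasp"
    if eog in ("left", "right"):            # gaze shifts steer the device
        return f"move_{eog}"
    if eeg in ("left_mi", "right_mi"):      # motor imagery as fallback channel
        return f"rotate_{eeg.split('_')[0]}"
    return "idle"

assert fuse("rest", "none", "fist") == "grasp"
assert fuse("left_mi", "none", "relax") == "rotate_left"
```

Stacking modalities this way is what multiplies the number of distinguishable commands relative to a single-mode BCI.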
In 2020, N. Cheng et al. studied the effect of a BCI-based soft robotic glove on rehabilitation progress compared with a soft robotic glove alone for stroke patient rehabilitation. A total of 11 chronic stroke patients were recruited for the 24-week experiment and divided into two groups (six patients with the BCI soft robotic glove and five with the soft robotic glove alone) to perform daily-life-oriented tasks. The BCI group used a motor-imagery-based BCI with the glove, while the glove-only group used the soft robotic glove to assist the affected hand in daily activity tasks. In the BCI condition, the patient is shown the intended task on a computer screen and performs motor imagery by imagining the task; the patient's EEG is then collected with an EEG cap, and ERD/ERS is detected from the signals. The acquisition system sends two control signals: one to the robotic glove to activate the actuator and assist the hand in performing finger-specific movement tasks, and one to the computer screen to play an animation of the successfully completed hand movement, as shown in Figure 10. During the first six weeks, the two groups showed no significant changes, but afterwards all patients in the BCI group reported a sense of slight movement in the stroke-impaired hand, a sense that persisted in three out of five of these patients until the 24th week. None of the patients using only the soft robotic glove reported this sensation during the 24 weeks. These results indicate that BCI combined with soft robotic training for ADL-oriented stroke recovery has the potential to provide long-term benefits and prompt perception of motor actions [101].
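The ERD/ERS trigger in such a system amounts to thresholding the relative change in mu-band power against a rest baseline. The sketch below uses synthetic signals and a hypothetical threshold; the actual detector in [101] differs.

```python
import numpy as np
from scipy.signal import welch

def erd_percent(baseline, trial, fs, band=(8, 12)):
    """ERD/ERS as the percent change in band power relative to a rest baseline.
    Negative values indicate desynchronization (ERD), as during motor imagery."""
    def band_power(x):
        f, p = welch(x, fs=fs, nperseg=fs)
        m = (f >= band[0]) & (f <= band[1])
        return p[m].mean()
    return 100.0 * (band_power(trial) - band_power(baseline)) / band_power(baseline)

fs = 250
t = np.arange(fs * 2) / fs
rng = np.random.default_rng(1)
rest = np.sin(2 * np.pi * 10 * t) + 0.3 * rng.standard_normal(t.size)           # strong mu rhythm
imagery = 0.4 * np.sin(2 * np.pi * 10 * t) + 0.3 * rng.standard_normal(t.size)  # attenuated mu

erd = erd_percent(rest, imagery, fs)
trigger_glove = erd < -30.0   # hypothetical threshold for sending the actuation command
```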
In 2021, Mads Jochumsen et al. developed a low-cost BCI system in which an inexpensive 3D-printed wrist exoskeleton was controlled by an open-source, low-cost BCI. The aim was to overcome the major obstacles impeding BCI use in rehabilitation, namely cost and usability, and to check whether the system could induce neural plasticity. Eleven healthy subjects joined the experiment, including four females, aged 28 ± 3 years. EEG was recorded from seven channels (OpenBCI, Brooklyn, NY, USA) at F1, F2, C3, Cz, C4, P1, and P2 according to the International 10-20 System. The subjects were seated in a chair, instructed in how to perform motor imagination, and given a 5 min training session. They were then asked to imagine 30 extensions of the right-hand wrist following a specific timeline, as shown in Figure 11 [102].
During the experiment, the subject was instructed through a screen; after every 30 successful motor imageries, "REST" was displayed, and 30 motor imageries were recorded. Each imagined right-hand wrist movement lasted 4 s. The subject wore the wrist exoskeleton on the right forearm and hand throughout the experiment, with the forearm resting comfortably on the chair armrest. The experiment was considered successful when 50 correct exoskeleton movements were obtained from 50 right-hand wrist imaginations. TMS was measured before, immediately after, and 30 min after BCI training. The authors found that the BCI system had an 86.12% true positive rate and 1.20 ± 0.57 false detections per minute. The MEPs increased by 35–60% immediately after the BCI training and by 67–60% 30 min later, relative to the measurement before BCI training. There was no correlation between BCI performance and plasticity induction.
In summary, an open-source BCI system can detect imagined motions and operate a low-cost 3D-printed exoskeleton, which may induce brain plasticity when paired with the BCI. These findings may help BCI technology become widely used in home rehabilitation, as shown in Figure 12. However, usability must be improved, and further experiments with stroke subjects are required [102].
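Performance figures such as the true-positive rate and false detections per minute reported above can be computed as sketched below; the tolerance window and the onset/detection times are illustrative assumptions, not values from [102].

```python
def bci_detection_metrics(true_onsets, detections, tol=1.0, duration_s=600.0):
    """True-positive rate and false detections per minute for asynchronous MI
    detection. A detection within `tol` seconds of a cued onset counts as a hit."""
    hits, used = 0, set()
    for t0 in true_onsets:
        for i, d in enumerate(detections):
            if i not in used and abs(d - t0) <= tol:
                hits += 1
                used.add(i)
                break
    tpr = hits / len(true_onsets)
    false_per_min = (len(detections) - hits) / (duration_s / 60.0)
    return tpr, false_per_min

# Four cued movements in a one-minute run; onset/detection times are made up
tpr, fpm = bci_detection_metrics([10, 20, 30, 40], [10.2, 19.5, 30.8, 55.0],
                                 duration_s=60.0)   # tpr = 0.75, fpm = 1.0
```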

4.2. BCI-Virtual Reality Rehabilitation Application

In 2015, Käthner et al. investigated whether VR devices can achieve the same precision and data-transmission speed as the conventional displays used in P300-BCI systems. They conducted an experiment in which 18 subjects performed an online spelling task using three different presentation methods, as shown in Figure 13. The first was a standard thin-film-transistor display showing a five-by-five matrix. The second showed the same five-by-five matrix in a virtual reality scenario filling the subject's field of view. The third was also in VR, but only one letter of the five-by-five matrix filled the subject's field of view at a time. The empirical findings revealed equivalent online spelling accuracies (96%, 96%, and 94%, respectively). VR devices can therefore deliver the same precision as conventional flat-panel displays while still supporting fast P300-BCI transmission [103].
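In a row/column P300 speller of this kind, the attended letter is typically identified by intersecting the row and the column whose flashes evoke the strongest averaged classifier response. A minimal sketch with hypothetical scores:

```python
import numpy as np

def p300_select(row_scores, col_scores, matrix):
    """Return the letter at the intersection of the row and column whose
    flashes evoked the largest averaged P300 classifier score."""
    return matrix[int(np.argmax(row_scores))][int(np.argmax(col_scores))]

matrix = [list("ABCDE"), list("FGHIJ"), list("KLMNO"), list("PQRST"), list("UVWXY")]
row_scores = np.array([0.1, 0.2, 0.9, 0.1, 0.3])   # hypothetical averaged scores
col_scores = np.array([0.2, 0.1, 0.1, 0.8, 0.2])
letter = p300_select(row_scores, col_scores, matrix)   # row 2, column 3 -> "N"
```

The presentation medium (flat panel vs. VR headset) changes only how the matrix is rendered, not this selection logic, which is why comparable accuracies across displays are plausible.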
Ortner et al. from Johannes Kepler University Linz, Austria, found that fusing VR systems with MI-based BCI can improve the efficiency of rehabilitation training for patients with neurological diseases, particularly motor impairment. MI can serve as a common technique for motor rehabilitation of stroke patients by using the MI-BCI system to train them to imagine left- and right-hand movements in VR scenes. The researchers also optimized an algorithm that decreased the average classification error to 9.6% [104]. Moreover, other researchers have provided additional evidence that patients with neurological diseases can use motor-imagery-based BCI systems to imagine and drive a virtual or actual device in VR scenes, performing and repeating different movements as rehabilitation training sessions; such training targets neural plasticity and helps the recovery of injured motor nerve pathways.
In 2018, Robert Lupu et al., Technical University of Iasi, Romania, presented a stroke rehabilitation therapy method based on a novel technique. The suggested approach places the patient in a virtual situation in which a virtual therapist organizes activities to restore brain function using a virtual reality Oculus Rift device. An electrical stimulator assists the patient in rehabilitation activities, and a BCI system with an EEG device verifies whether the exercises are being performed correctly [105]. The BCI-FES TRAVEE subsystem consists of a stimulation part (FES), BCI monitoring devices, electrooculography (EOG), a VR headset (Oculus Rift), and a computer, as shown in Figure 14 [105].
This system focused on flexion and extension. The experiment scenario was as follows: the patient sat in a normal chair or wheelchair, and FES electrodes were placed on the forearm extensor muscles, as shown in Figure 13. The EEG helmet and EOG electrodes were attached before the VR headset. The therapist sat in front of the patient and showed him images describing what he would see: the virtual therapist raising a hand, as shown in Figure 15 (the therapist's left hand corresponds to the patient's right hand); a large arrow appearing on the upper left or right of the screen, depending on the virtual therapist's indications, accompanied by sounds from the left or right. Once the VR headset is fitted and the EOG system calibrated, the recovery exercise may begin, but only after the actual therapist informs the patient that he may choose between two views: a frontal perspective (the virtual therapist located in front of the patient) or a mirror view (the virtual therapist on the left side, with a mirror in front of them, as in a dance studio), as shown in Figure 14. In this system, the subject is not disturbed by surrounding events or the real environment; immersed in VR, the subject watches the VR therapist demonstrate every exercise, with a large red arrow shown each time. The eye-tracking system detects when the patient loses concentration and issues a warning. The system achieved low control error rates compared with those reported in the investigation. Table 2 summarizes details of the presented papers, such as subjects, signal types, and electrode locations.

5. Conclusions

  • The P300-BCI system is convenient for rehabilitation due to its cost-effectiveness, reliable performance, and variety of applications. Furthermore, many research groups have integrated P300 with VR technology to provide an immersive rehabilitation experience for neurological diseases. MI offers a solid basis for BCI research and implementation, and the combination of MI-based BCI and VR systems increases the effectiveness of rehabilitation training for people with neurological diseases, particularly motor impairment. VR feedback still faces development and implementation obstacles: people may struggle to focus on goals while ignoring the immersive virtual world, which can be distracting, and the use of VR equipment is not consistent across the duration of experiments. Both characteristics diminish the efficacy of rehabilitation training. Researchers have therefore run tests on several BCI feedback and VR platforms to discover a reliable approach.
  • The most promising paradigm uses a novel MI-VR multiplatform prototype that improves attention by providing multimodal feedback in VR settings using cutting-edge head-mounted displays. By integrating an immersive VR environment, sensory stimulation, and MI, the NeuRow system is a promising VR BCI system that can offer a holistic approach to MI-driven BCI.

Author Contributions

Conceptualization, H.Y. and K.G.; methodology, M.O.; writing—original draft preparation, M.O., M.E. and K.G.; writing—review and editing, M.O., M.E. and S.Z.; supervision, H.Y.; project administration, K.G.; funding acquisition, H.Y. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the Key Research and Development Program of Jiangsu Province (Grant number: BE2021012-1) and the International Partnership Program of the Chinese Academy of Sciences (Grant number: 154232KYSB20200016).

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study.

Data Availability Statement

The data presented in this study are available on request from the corresponding author.

Acknowledgments

This research was supported in part by the Key Research and Development Program of Jiangsu Province (Grant number: BE2021012-1) and the International Partnership Program of the Chinese Academy of Sciences (Grant number: 154232KYSB20200016).

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Lozada-Martínez, I.; Maiguel-Lapeira, J.; Torres-Llinás, D.; Moscote-Salazar, L.; Rahman, M.M.; Pacheco-Hernández, A. Letter: Need and Impact of the Development of Robotic Neurosurgery in Latin America. Neurosurgery 2021, 88, E580–E581. [Google Scholar] [CrossRef] [PubMed]
  2. He, B.; Yuan, H.; Meng, J.; Gao, S. Brain–computer interfaces. In Neural Engineering; Springer: Berlin/Heidelberg, Germany, 2020; pp. 131–183. [Google Scholar]
  3. Mrachacz-Kersting, N.; Jiang, N.; Stevenson, A.J.; Niazi, I.K.; Kostic, V.; Pavlovic, A.; Radovanovic, S.; Djuric-Jovicic, M.; Agosta, F.; Dremstrup, K.; et al. Efficient neuroplasticity induction in chronic stroke patients by an associative brain-computer interface. J. Neurophysiol. 2016, 115, 1410–1421. [Google Scholar] [CrossRef] [Green Version]
  4. Floriana, P.; Mattia, D. Brain-computer interfaces in neurologic rehabilitation practice. Handb. Clin. Neurol. 2020, 168, 101–116. [Google Scholar] [CrossRef]
  5. Mrachacz-Kersting, N.; Ibáñez, J.; Farina, D. Towards a mechanistic approach for the development of non-invasive brain-computer interfaces for motor rehabilitation. J. Physiol. 2021, 599, 2361–2374. [Google Scholar] [CrossRef]
  6. Soekadar, S.R.; Birbaumer, N.; Slutzky, M.W.; Cohen, L.G. Brain-machine interfaces in neurorehabilitation of stroke. Neurobiol. Dis. 2015, 83, 172–179. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  7. Yang, S.; Li, R.; Li, H.; Xu, K.; Shi, Y.; Wang, Q.; Yang, T.; Sun, X. Exploring the Use of Brain-Computer Interfaces in Stroke Neurorehabilitation. Biomed Res. Int. 2021, 2021, 9967348. [Google Scholar] [CrossRef] [PubMed]
  8. Cervera, M.A.; Soekadar, S.R.; Ushiba, J.; Millán, J.D.R.; Liu, M.; Birbaumer, N.; Garipelli, G. Brain-computer interfaces for post-stroke motor rehabilitation: A meta-analysis. Ann. Clin. Transl. Neurol. 2018, 25, 651–663. [Google Scholar] [CrossRef]
  9. López-Larraz, E.; Sarasola-Sanz, A.; Irastorza-Landa, N.; Birbaumer, N.; Ramos-Murguialday, A. Brain-machine interfaces for rehabilitation in stroke: A review. NeuroRehabilitation 2018, 43, 77–97. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  10. Cameirão, M.S.; Badia, S.B.; Oller, E.D.; Verschure, P.F. Neurorehabilitation using the virtual reality based Rehabilitation Gaming System: Methodology, design, psychometrics, usability and validation. J. Neuroeng. Rehabil. 2010, 7, 48. [Google Scholar] [CrossRef] [Green Version]
  11. Zhou, Z.; Yin, E.; Liu, Y.; Jiang, J.; Hu, D. A novel task-oriented optimal design for P300-based brain-computer interfaces. J. Neural Eng. 2014, 11, 056003. [Google Scholar] [CrossRef] [PubMed]
  12. Lebedev, M.A.; Nicolelis, M.A. Brain-machine interfaces: From basic science to neuroprostheses and neurorehabilitation. Physiol. Rev. 2017, 97, 767–837. [Google Scholar] [CrossRef] [PubMed]
  13. Khan, M.A.; Das, R.; Iversen, H.K.; Puthusserypady, S. Review on motor imagery based BCI systems for upper limb post-stroke neurorehabilitation: From designing to application. Comput. Biol. Med. 2020, 123, 103843. [Google Scholar] [CrossRef] [PubMed]
  14. Nicolas-Alonso, L.F.; Gomez-Gil, J. Brain computer interfaces, a review. Sensors 2012, 12, 1211–1279. [Google Scholar] [CrossRef]
  15. Ramadan, R.A.; Vasilakos, A.V. Brain computer interface: Control signals review. Neurocomputing 2017, 223, 26–44. [Google Scholar] [CrossRef]
  16. Waldert, S.; Pistohl, T.; Braun, C.; Ball, T.; Aertsen, A.; Mehring, C. A review on directional information in neural signals for brain-machine interfaces. J. Physiol.-Paris 2009, 103, 244–254. [Google Scholar] [CrossRef]
  17. Slutzky, M.W.; Flint, R.D. Physiological properties of brain-machine interface input signals. J. Neurophysiol. 2017, 118, 1329–1343. [Google Scholar] [CrossRef] [Green Version]
  18. Hauschild, M.; Mulliken, G.H.; Fineman, I.; Loeb, G.E.; Andersen, R.A. Cognitive signals for brain–machine interfaces in posterior parietal cortex including continuous 3D trajectory commands. Proc. Natl. Acad. Sci. USA 2012, 109, 17075–17080. [Google Scholar] [CrossRef] [Green Version]
  19. Abiri, R.; Borhani, S.; Sellers, E.W.; Jiang, Y.; Zhao, X. A comprehensive review of EEG-based brain-computer interface paradigms. J. Neural Eng. 2019, 16, 011001. [Google Scholar] [CrossRef]
  20. Borkowski, K.; Krzyżak, A.T. Assessment of the systematic errors caused by diffusion gradient inhomogeneity in DTI-computer simulations. NMR Biomed 2019, 32, e4130. [Google Scholar] [CrossRef]
  21. Birbaumer, N. Breaking the silence: Brain-computer interfaces (BCI) for communication and motor control. Psychophysiology 2006, 43, 517–532. (In English) [Google Scholar] [CrossRef]
  22. Shu, X.; Chen, S.; Yao, L.; Sheng, X.; Zhang, D.; Jiang, N.; Jia, J.; Zhu, X. Fast recognition of BCI-inefficient users using physiological features from EEG signals: A screening study of stroke patients. Front. Neurosci. 2018, 12, 93. [Google Scholar] [CrossRef] [Green Version]
  23. Ang, K.K.; Guan, C. Brain–computer interface for neurorehabilitation of the upper limb after stroke. Proc. IEEE 2015, 103, 944–953. [Google Scholar] [CrossRef]
  24. Cassidy, J.M.; Mark, J.I.; Cramer, S.C. Functional connectivity drives stroke recovery: Shifting the paradigm from correlation to causation. Brain 2021, 145, 1211–1228. [Google Scholar] [CrossRef]
  25. Raghavan, P. The nature of hand motor impairment after stroke and its treatment. Curr. Treat. Options Cardiovasc. Med. 2007, 9, 221–228. [Google Scholar] [CrossRef]
  26. Hakon, J.; Quattromani, M.J.; Sjölund, C.; Tomasevic, G.; Carey, L.; Lee, J.-M.; Ruscher, K.; Wieloch, T.; Bauer, A.Q. Multisensory stimulation improves functional recovery and resting-state functional connectivity in the mouse brain after stroke. NeuroImage Clin. 2018, 17, 717–730. [Google Scholar] [CrossRef]
  27. Nijenhuis, S.M.; Prange, G.B.; Amirabdollahian, F.; Sale, P.; Infarinato, F.; Nasr, N.; Mountain, G.; Hermens, H.J.; Stienen, A.H.; Buurke, J.H. Feasibility study of self-administered training at home using an arm and hand device with motivational gaming environment in chronic stroke. J. Neuroeng. Rehabil. 2015, 12, 89. [Google Scholar] [CrossRef] [Green Version]
  28. Jamil, N.; Belkacem, A.N.; Ouhbi, S.; Lakas, A. Noninvasive Electroencephalography Equipment for Assistive, Adaptive, and Rehabilitative Brain–Computer Interfaces: A Systematic Literature Review. Sensors 2021, 21, 4754. [Google Scholar] [CrossRef]
  29. Camargo-Vargas, D.; Callejas-Cuervo, M.; Mazzoleni, S. Brain-Computer Interface Systems for Upper and Lower Limb Rehabilitation: A Systematic Review. Sensors 2021, 21, 4312. [Google Scholar] [CrossRef]
  30. Mane, R.; Chouhan, T.; Guan, C. BCI for stroke rehabilitation: Motor and beyond. J. Neural Eng. 2020, 17, 041001. [Google Scholar] [CrossRef] [PubMed]
  31. Baniqued, P.D.E.; Stanyer, E.C.; Awais, M.; Alazmani, A.; Jackson, A.E.; Mon-Williams, M.A.; Mushtaq, F.; Holt, R.J. Brain–computer interface robotics for hand rehabilitation after stroke: A systematic review. J. Neuroeng. Rehabil. 2021, 18, 15. [Google Scholar] [CrossRef]
  32. He, Y.; Luu, T.P.; Nathan, K.; Nakagome, S.; Contreras-Vidal, J.L. Data Descriptor: A mobile brainbody imaging dataset recorded during treadmill walking with a brain-computer interface. Sci. Data 2018, 5, 180074. [Google Scholar] [CrossRef] [Green Version]
  33. Morley, A.; Hill, L.; Kaditis, A.G. 10–20 System EEG Placement. 2016, 34. Available online: https://www.ers-education.org/lrmedia/2016/pdf/298830.pdf (accessed on 25 October 2022).
  34. Liu, D.; Chen, W.; Pei, Z.; Wang, J. A brain-controlled lower-limb exoskeleton for human gait training. Rev. Sci. Instrum. 2017, 88, 104302. [Google Scholar] [CrossRef]
  35. Gordleeva, S.Y.; Lukoyanov, M.V.; Mineev, S.A.; Khoruzhko, M.A.; Mironov, V.I.; Kaplan, A.Y.; Kazantsev, V.B. Exoskeleton control system based on motor-imaginary brain-computer interface. Sovrem. Tehnol. Med. 2017, 9, 31–36. [Google Scholar] [CrossRef] [Green Version]
  36. Comani, S.; Velluto, L.; Schinaia, L.; Cerroni, G.; Serio, A.; Buzzelli, S.; Sorbi, S.; Guarnieri, B. Monitoring Neuro-Motor Recovery from Stroke with High-Resolution EEG, Robotics and Virtual Reality: A Proof of Concept. IEEE Trans. Neural Syst. Rehabil. Eng. 2015, 23, 1106–1116. [Google Scholar] [CrossRef]
  37. Lechat, B.; Hansen, K.L.; Melaku, Y.A.; Vakulin, A.; Micic, G.; Adams, R.J.; Appleton, S.; Eckert, D.J.; Catcheside, P.; Zajamsek, B. A Novel Electroencephalogram-derived Measure of Disrupted Delta Wave Activity during Sleep Predicts All-Cause Mortality Risk. Ann. Am. Thorac. Soc. 2022, 19, 649–658. [Google Scholar] [CrossRef] [PubMed]
  38. Hussain, I.; Hossain, M.A.; Jany, R.; Bari, M.A.; Uddin, M.; Kamal, A.R.M.; Ku, Y.; Kim, J.S. Quantitative Evaluation of EEG-Biomarkers for Prediction of Sleep Stages. Sensors 2022, 22, 3079. [Google Scholar] [CrossRef]
  39. Tarokh, L.; Carskadon, M.A. Developmental changes in the human sleep EEG during early adolescence. Sleep 2010, 33, 801–809. [Google Scholar] [CrossRef]
  40. Grigg-Damberger, M.; Gozal, D.; Marcus, C.L.; Quan, S.F.; Rosen, C.L.; Chervin, R.D.; Wise, M.; Picchietti, D.L.; Sheldon, S.H.; Iber, C. The visual scoring of sleep and arousal in infants and children. J. Clin. Sleep Med. 2007, 3, 201–240. [Google Scholar] [CrossRef] [PubMed]
  41. Sekimoto, M.; Kajimura, N.; Kato, M.; Watanabe, T.; Nakabayashi, T.; Takahashi, K.; Okuma, T. Laterality of delta waves during all-night sleep. Psychiatry Clin. Neurosci. 1999, 53, 149–150. [Google Scholar] [CrossRef]
  42. Schechtman, L.; Harper, R.K.; Harpe, R.M. Distribution of slow-wave EEG activity across the night in developing infants. Sleep 1994, 17, 316–322. [Google Scholar] [CrossRef]
  43. Anderson, A.J.; Perone, S.; Gartstein, M.A. Context matters: Cortical rhythms in infants across baseline and play. Infant Behav. Dev. 2022, 66, 101665. [Google Scholar] [CrossRef]
  44. Orekhova, E.; Stroganova, T.; Posikera, I.; Elam, M. EEG theta rhythm in infants and preschool children. Clin. Neurophysiol. 2006, 117, 1047–1062. [Google Scholar] [CrossRef]
  45. Mateos, D.M.; Krumm, G.; Arán Filippetti, V.; Gutierrez, M. Power Spectrum and Connectivity Analysis in EEG Recording during Attention and Creativity Performance in Children. Neuroscience 2022, 3, 25. [Google Scholar] [CrossRef]
  46. Goldman, R.I.; Stern, J.M.; Engel, J., Jr.; Cohen, M.S. Simultaneous EEG and fMRI of the alpha rhythm. Neuroreport 2002, 13, 2487–2492. [Google Scholar] [CrossRef]
  47. Tuladhar, M.; Nt, H.; Schoffelen, J.M.; Maris, E.; Oostenveld, R.; Jensen, O. Parieto-occipital sources account for the increase in alpha activity with working memory load. Hum. Brain Mapp. 2007, 28, 785–792. [Google Scholar] [CrossRef]
  48. Rakhshan, V.; Hassani-Abharian, P.; Joghataei, M.; Nasehi, M.; Khosrowabadi, R. Effects of the Alpha, Beta, and Gamma Binaural Beat Brain Stimulation and Short-Term Training on Simultaneously Assessed Visuospatial and Verbal Working Memories, Signal Detection Measures, Response Times, and Intrasubject Response Time Variabilities: A Within-Subject Randomized Placebo-Controlled Clinical Trial. Biomed Res. Int. 2022, 2022, 8588272. [Google Scholar] [CrossRef]
  49. Kilavik, B.E.; Zaepffel, M.; Brovelli, A.; Mackay, W.A.; Riehle, A. The ups and downs of beta oscillations in the sensorimotor cortex. Exp. Neurol. 2013, 245, 15–26. [Google Scholar] [CrossRef] [Green Version]
  50. Darch, H.T.; Cerminara, N.L.; Gilchrist, I.D.; Apps, R. Pre-movement changes in sensorimotor beta oscillations predict motor adaptation drive. Sci. Rep. 2020, 10, 17946. [Google Scholar] [CrossRef]
  51. Al-Quraishi, M.S.; Elamvazuthi, I.; Daud, S.A.; Parasuraman, S.; Borboni, A. EEG-Based Control for Upper and Lower Limb Exoskeletons and Prostheses: A Systematic Review. Sensors 2018, 18, E3342. [Google Scholar] [CrossRef] [Green Version]
  52. Benasich, A.A.; Gou, Z.; Choudhury, N.; Harris, K.D. Early cognitive and language skills are linked to resting frontal gamma power across the first 3 years. Behav. Brain Res. 2008, 195, 215–222. [Google Scholar] [CrossRef]
  53. Ulloa, J.L. The Control of Movements via Motor Gamma Oscillations. Front. Hum. Neurosci. 2022, 15, 787157. [Google Scholar] [CrossRef]
  54. Fitzgibbon, S.P.; Pope, K.J.; Mackenzie, L.; Clark, C.R.; Willoughby, J.O. Cognitive tasks augment gamma EEG power. Clin. Neurophysiol. 2004, 115, 1802–1809. [Google Scholar] [CrossRef]
  55. Cannon, J.; McCarthy, M.M.; Lee, S.; Lee, J.; Börgers, C.; Whittington, M.A.; Kopell, N. Neurosystems: Brain rhythms and cognitive processing. Eur. J. Neurosci. 2014, 39, 705–719. [Google Scholar] [CrossRef]
  56. Millán, J.d.R.; Ferrez, P.W.; Galán, F.; Lew, E.; Chavarriaga, R. Non-invasive brain-machine interaction. Int. J. Pattern Recognit. Artif. Intell. 2008, 22, 959–972. [Google Scholar]
  57. Pfurtscheller, G.; Neuper, C. Motor imagery and direct brain- computer communication. Proc. IEEE 2001, 89, 1123–1134. [Google Scholar] [CrossRef]
  58. Pfurtscheller, G.; Neuper, C.; Flotzinger, D.; Pregenzer, M. EEG-based discrimination between imagination of right and left hand movement. Electroencephalogr. Clin. Neurophysiol. 1997, 103, 642–651. [Google Scholar] [CrossRef]
  59. Tang, Z.; Sun, S.; Zhang, S.; Chen, Y.; Li, C.; Chen, S. A brain-machine interface based on ERD/ERS for an upper-limb exoskeleton control. Sensors 2016, 16, 2050. [Google Scholar] [CrossRef] [Green Version]
  60. Grimm, F.; Walter, A.; Spüler, M.; Naros, G.; Rosenstiel, W.; Gharabaghi, A. Hybrid neuroprosthesis for the upper limb: Combining brain-controlled neuromuscular stimulation with a multi-joint arm exoskeleton. Front. Neurosci. 2016, 10, 367. [Google Scholar] [CrossRef] [Green Version]
  61. Formaggio, E.; Masiero, S.; Bosco, A.; Izzi, F.; Piccione, F.; Del Felice, A. Quantitative EEG Evaluation during Robot-Assisted Foot Movement. IEEE Trans. Neural Syst. Rehabil. Eng. 2017, 25, 1633–1640. [Google Scholar] [CrossRef]
  62. Murphy, D.P.; Bai, O.; Gorgey, A.S.; Fox, J.; Lovegreen, W.T.; Burkhardt, B.W.; Atri, R.; Marquez, J.S.; Li, Q.; Fei, D.Y. Electroencephalogram-based brain-computer interface and lower-limb prosthesis control: A case study. Front. Neurol. 2017, 8, 696. [Google Scholar] [CrossRef] [Green Version]
  63. Meng, J.; Zhang, S.; Bekyo, A.; Olsoe, J.; Baxter, B.; He, B. Noninvasive Electroencephalogram-Based Control of a Robotic Arm for Reach and Grasp Tasks. Sci. Rep. 2016, 6, 38565. [Google Scholar] [CrossRef]
  64. Hortal, E.; Planelles, D.; Resquin, F.; Climent, J.M.; Azorín, J.M.; Pons, J.L. Using a brain-machine interface to control a hybrid upper limb exoskeleton during rehabilitation of patients with neurological conditions. J. Neuroeng. Rehabil. 2015, 12, 92. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  65. Kirchner, E.A.; Tabie, M.; Seeland, A. Multimodal movement prediction—Towards an individual assistance of patients. PLoS ONE 2014, 9, e85060. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  66. Xu, R.; Jiang, N.; Mrachacz-Kersting, N.; Lin, C.; Prieto, G.A.; Moreno, J.C.; Pons, J.L.; Dremstrup, K.; Farina, D. A Closed-Loop Brain-Computer Interface Triggering an Active Ankle-Foot Orthosis for Inducing Cortical Neural Plasticity. IEEE Trans. Biomed. Eng. 2014, 61, 2092–2101. [Google Scholar] [PubMed]
  67. López-Larraz, E.; Trincado-Alonso, F.; Rajasekaran, V.; Pérez-Nombela, S.; del-Ama, A.J.; Aranda, J.; Minguez, J.; Gil-Agudo, A.; Montesano, L. Control of an ambulatory exoskeleton with a brain-machine interface for spinal cord injury gait rehabilitation. Front. Neurosci. 2016, 10, 359. [Google Scholar] [CrossRef] [Green Version]
  68. Kapgate, D.; Kalbande, D. A Review on Visual Brain Computer Interface. In Advancements of Medical Electronics; Springer: Berlin/Heidelberg, Germany, 2015; pp. 193–206. [Google Scholar]
  69. Gao, S.; Wang, Y.; Gao, X.; Hong, B. Visual and auditory brain–computer interfaces. IEEE Trans. Biomed. Eng. 2014, 61, 1436–1447. [Google Scholar]
  70. Nijboer, F.; Furdea, A.; Gunst, I.; Mellinger, J.; McFarland, D.J.; Birbaumer, N.; Kübler, A. An auditory brain–computer interface (BCI). J. Neurosci. Methods 2008, 167, 43–50. [Google Scholar] [CrossRef]
  71. Yao, L.; Sheng, X.; Zhang, D.; Jiang, N.; Farina, D.; Zhu, X. A BCI System Based on Somatosensory Attentional Orientation. IEEE Trans. Neural Syst. Rehabil. Eng. 2017, 25, 81–90. [Google Scholar] [CrossRef] [Green Version]
  72. Wolpaw, J.R.; McFarland, D.J.; Vaughan, T.M. Brain-computer interface research at the Wadsworth Center. IEEE Trans. Rehabil. Eng. 2000, 8, 222–226. [Google Scholar] [CrossRef] [Green Version]
  73. Lee, K.; Liu, D.; Perroud, L.; Chavarriaga, R.; Millán, J.D.R. A brain-controlled exoskeleton with cascaded event-related desynchronization classifiers. Robot. Autom. Syst. 2017, 90, 15–23. [Google Scholar] [CrossRef] [Green Version]
  74. Polich, J. Updating P300: An integrative theory of P3a and P3b. Clin. Neurophysiol. 2007, 118, 2128–2148. [Google Scholar] [CrossRef] [Green Version]
  75. Kwak, N.-S.; Müller, K.-R.; Lee, S.-W. A lower limb exoskeleton control system based on steady state visual evoked potentials. J. Neural Eng. 2015, 12, 056009. [Google Scholar] [CrossRef]
  76. Chen, X.; Zhao, B.; Wang, Y.; Gao, X. Combination of high-frequency SSVEP-based BCI and computer vision for controlling a robotic arm. J. Neural Eng. 2019, 16, 026012. [Google Scholar] [CrossRef]
  77. Tariq, M.; Trivailo, P.M.; Simic, M. EEG-based BCI control schemes for lower-limb assistive-robots. Front. Hum. Neurosci. 2018, 12, 312. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  78. Song, Y.; Cai, S.; Yang, L.; Li, G.; Wu, W.; Xie, L. A practical EEG-based human-machine interface to online control an upper-limb assist robot. Front. Neurorobot. 2020, 14, 32. [Google Scholar] [CrossRef]
  79. Aggarwal, S.; Chugh, N. Review of Machine Learning Techniques for EEG Based Brain Computer Interface. Arch. Comput. Methods Eng. 2022, 29, 3001–3020. [Google Scholar] [CrossRef]
  80. Qiu, S.; Li, Z.; He, W.; Zhang, L.; Yang, C.; Su, C.Y. Brain–machine interface and visual compressive sensing-based teleoperation control of an exoskeleton robot. IEEE Trans. Fuzzy Syst. 2016, 25, 58–69. [Google Scholar] [CrossRef] [Green Version]
  81. Horki, P.; Solis-Escalante, T.; Neuper, C.; Müller-Putz, G. Combined motor imagery and SSVEP based BCI control of a 2 DoF artificial upper limb. Med. Biol. Eng. Comput. 2011, 49, 567–577. [Google Scholar] [CrossRef]
  82. Bhattacharyya, S.; Konar, A.; Tibarewala, D. Motor imagery, P300 and error-related EEG-based robot arm movement control for rehabilitation purpose. Med. Biol. Eng. Comput. 2014, 52, 1007–1017. [Google Scholar] [CrossRef]
  83. Cecotti, H.; Graser, A. Convolutional neural networks for P300 detection with application to brain-computer interfaces. IEEE Trans. Pattern Anal. Mach. Intell. 2010, 33, 433–445. [Google Scholar] [CrossRef]
  84. Hong, K.-S.; Khan, M.J. Hybrid brain–computer interface techniques for improved classification accuracy and increased number of commands: A review. Front. Neurorobot. 2017, 35. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  85. Ren, W.; Han, M.; Wang, J.; Wang, D.; Li, T. Efficient feature extraction framework for EEG signals classification. In Proceedings of the 2016 Seventh International Conference on Intelligent Control and Information Processing (ICICIP), Siem Reap, Cambodia, 1–4 December 2016; pp. 167–172. [Google Scholar] [CrossRef]
  86. Kumar, S.; Kumar, V.; Gupta, B. Feature extraction from EEG signal through one electrode device for medical application. In Proceedings of the 2015 1st International Conference on Next Generation Computing Technologies (NGCT), Dehradun, India, 4–5 September 2015; pp. 555–559. [Google Scholar] [CrossRef]
  87. Mehmood, R.M.; Du, R.; Lee, H.J. Optimal Feature Selection and Deep Learning Ensembles Method for Emotion Recognition From Human Brain EEG Sensors. IEEE Access 2017, 5, 14797–14806. [Google Scholar] [CrossRef]
  88. Roshdy, A.; Alkork, S.; Karar, A.S.; Mhalla, H.; Beyrouthy, T.; al Barakeh, Z.; Nait-ali, A. Statistical Analysis of Multi-channel EEG Signals for Digitizing Human Emotions. In Proceedings of the 2021 4th International Conference on Bio-Engineering for Smart Technologies (BioSMART), Virtual, 8–10 December 2021; pp. 1–4. [Google Scholar] [CrossRef]
  89. Xu, S.; Hu, H.; Ji, L.; Wang, P. An Adaptive Graph Spectral Analysis Method for Feature Extraction of an EEG Signal. IEEE Sens. J. 2019, 19, 1884–1896. [Google Scholar] [CrossRef]
  90. Kang, W.-S.; Kwon, H.-O.; Moon, C.; Kim, J.K.; Yun, S.; Kim, S. EEG-fMRI features analysis of odorants stimuli with citralva and 2-mercaptoethanol. In Proceedings of the Sensors, 2013 IEEE, Baltimore, MD, USA, 3–6 November 2013; pp. 1–4. [Google Scholar] [CrossRef]
91. Zhang, K.; Xu, G.; Zheng, X.; Li, H.; Zhang, S.; Yu, Y.; Liang, R. Application of transfer learning in EEG decoding based on brain-computer interfaces: A review. Sensors 2020, 20, 6321. [Google Scholar] [CrossRef] [PubMed]
92. Thomas, J.; Maszczyk, T.; Sinha, N.; Kluge, N.; Dauwels, J. Deep learning-based classification for brain-computer interfaces. In Proceedings of the 2017 IEEE International Conference on Systems, Man, and Cybernetics (SMC), Banff, AB, Canada, 5–8 October 2017; IEEE: Piscataway, NJ, USA, 2017. [Google Scholar]
93. Ai, Q.; Liu, Z.; Meng, W.; Liu, Q.; Xie, S.Q. Machine Learning in Robot Assisted Upper Limb Rehabilitation: A Focused Review. IEEE Trans. Cogn. Dev. Syst. 2021. [Google Scholar]
  94. Lu, W.; Wei, Y.; Yuan, J.; Deng, Y.; Song, A. Tractor assistant driving control method based on EEG combined with RNN-TL deep learning algorithm. IEEE Access 2020, 8, 163269–163279. [Google Scholar] [CrossRef]
  95. Roy, S.; Chowdhury, A.; Mccreadie, K.; Prasad, G. Deep learning based inter-subject continuous decoding of motor imagery for practical brain-computer interfaces. Front. Neurosci. 2020, 14, 918. [Google Scholar] [CrossRef]
96. Choi, J.; Kim, K.; Lee, J.; Lee, S.J.; Kim, H. Robust semi-synchronous BCI controller for brain-actuated exoskeleton system. In Proceedings of the 2020 8th International Winter Conference on Brain-Computer Interface (BCI), Gangwon, Republic of Korea, 26–28 February 2020; IEEE: Piscataway, NJ, USA, 2020. [Google Scholar]
97. Bi, L.; Fan, X.-A.; Liu, Y. EEG-based brain-controlled mobile robots: A survey. IEEE Trans. Hum.-Mach. Syst. 2013, 43, 161–176. [Google Scholar]
  98. Birbaumer, N.; Ghanayim, N.; Hinterberger, T.; Iversen, I.; Kotchoubey, B.; Kübler, A.; Perelmouter, J.; Taub, E.; Flor, H. A spelling device for the paralysed. Nature 1999, 398, 297–298. [Google Scholar] [CrossRef] [PubMed]
  99. Soekadar, S.R.; Witkowski, M.; Gómez, C.; Opisso, E.; Medina, J.; Cortese, M.; Cempini, M.; Carrozza, M.C.; Cohen, L.G.; Birbaumer, N.; et al. Hybrid EEG/EOG-based brain/neural hand exoskeleton restores fully independent daily living activities after quadriplegia. Sci. Robot. 2016, 1, eaag3296. [Google Scholar] [CrossRef] [PubMed]
  100. Zhang, J.; Wang, B.; Zhang, C.; Xiao, Y.; Wang, M.Y. An EEG/EMG/EOG-Based Multimodal Human-Machine Interface for Real-Time Control of a Soft Robot Hand. Front. Neurorobot. 2019, 13, 7. [Google Scholar] [CrossRef] [Green Version]
  101. Cheng, N.; Phua, K.S.; Lai, H.S.; Tam, P.K.; Tang, K.Y.; Cheng, K.K.; Yeow, R.C.; Ang, K.K.; Guan, C.; Lim, J.H. Brain-Computer Interface-Based Soft Robotic Glove Rehabilitation for Stroke. IEEE Trans. Biomed. Eng. 2020, 67, 3339–3351. [Google Scholar] [CrossRef] [PubMed]
  102. Jochumsen, M.; Janjua, T.A.M.; Arceo, J.C.; Lauber, J.; Buessinger, E.S.; Kæseler, R.L. Induction of Neural Plasticity Using a Low-Cost Open Source Brain-Computer Interface and a 3D-Printed Wrist Exoskeleton. Sensors 2021, 21, 572. [Google Scholar] [CrossRef]
  103. Käthner, I.; Kübler, A.; Halder, S. Rapid P300 brain-computer interface communication with a head-mounted display. Front. Neurosci. 2015, 9, 207. [Google Scholar] [CrossRef] [PubMed] [Green Version]
104. Ortner, R.; Irimia, D.; Scharinger, J.; Guger, C. A motor imagery-based brain-computer interface for stroke rehabilitation. Stud. Health Technol. Inform. 2012, 181, 319–323. [Google Scholar] [CrossRef]
105. Lupu, R.G.; Irimia, D.C.; Ungureanu, F.; Poboroniuc, M.; Moldoveanu, A. BCI and FES based therapy for stroke rehabilitation using VR facilities. Wirel. Commun. Mob. Comput. 2018, 2018, 4798359. [Google Scholar] [CrossRef]
Figure 1. The role of the EEG signal as the main element in the BCI-EEG rehabilitation system.
Figure 2. The annual number of research and review articles on BCI rehabilitation devices.
Figure 3. The different divisions of BCI systems based on their applications, with product examples.
Figure 4. The international 10–20 system for placing EEG surface electrodes on the scalp [33].
Figure 5. EEG signal processing for HMI.
Figure 6. Application of the BCI system.
Figure 7. Scheme of the process pipeline to control the exoskeleton of the hand. Adapted with permission from Ref. [99]. 2022, S.R. Soekadar et al.
Figure 8. (A) The BCI prototype and its experimental setup; (B) the control scheme of the soft robot hand [100].
Figure 9. (A–I) Results of the hand actions. For example, in (A) the subject quickly grasps various everyday objects according to his/her intention with the help of the soft robot hand [100].
Figure 10. The setup of the BCI-assisted soft robotic glove for stroke rehabilitation at (a) a local hospital, with (b) depicting an illustrated overview. The setup comprises an EEG cap, an EEG amplifier, and a soft robotic glove. Adapted with permission from Ref. [101]. 2022, Cheng N et al.
Figure 11. Timeline of the experiment [102].
Figure 12. (A) The hardware setup. The Arduino control board and linear actuator were mounted on the exoskeleton. The EEG signals were transferred to a PC running OpenViBE via a wireless connection. When an imagined wrist extension was identified, a trigger was transmitted wirelessly to the Arduino on the exoskeleton. The Arduino was wired to the linear actuator control board, which was powered by a 12 V supply and wired to the motor. (B) The 3D-printed exoskeleton. The contact surfaces of the forearm were padded with foam to improve comfort, and Velcro straps fixed the exoskeleton to the subject's forearm [102].
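The trigger step of this pipeline (one imagined wrist extension detected by the classifier causes one wireless trigger to the Arduino) can be sketched on the host side as follows. This is an illustrative reconstruction, not code from the study: the `MITriggerGate` class, the detection threshold, the refractory period, and the `send_trigger` callback are all hypothetical.

```python
import time


class MITriggerGate:
    """Turn a stream of motor-imagery classifier outputs into discrete
    actuator triggers. A refractory period ensures one imagined wrist
    extension fires the exoskeleton only once (hypothetical sketch;
    threshold and timing values are illustrative)."""

    def __init__(self, send_trigger, threshold=0.8, refractory_s=3.0,
                 clock=time.monotonic):
        self.send_trigger = send_trigger   # e.g. a serial/wireless write to the Arduino
        self.threshold = threshold
        self.refractory_s = refractory_s
        self.clock = clock
        self._last_fire = -float("inf")

    def update(self, probability):
        """Feed one classifier probability; return True if a trigger was sent."""
        now = self.clock()
        if probability >= self.threshold and now - self._last_fire >= self.refractory_s:
            self._last_fire = now
            self.send_trigger()
            return True
        return False


# Simulated session: a fake clock and a list standing in for the wireless link.
sent = []
t = {"now": 0.0}
gate = MITriggerGate(lambda: sent.append("TRIGGER"), clock=lambda: t["now"])
for now, p in [(0.0, 0.2), (1.0, 0.9), (1.5, 0.95), (5.0, 0.9)]:
    t["now"] = now
    gate.update(p)
print(len(sent))  # 2: the detection at 1.5 s falls inside the refractory window
```

The refractory window is the design point: without it, a sustained above-threshold classifier output would re-trigger the actuator on every sample.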
Figure 13. Field of view of the glasses. In (A), the user sees the full five-by-five matrix on a fixed screen. In (B), the user sees only one letter of the five-by-five matrix at a time and moves his head to focus on a specific letter [103].
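In a row/column P300 speller such as the five-by-five matrix above, the target letter is decoded as the intersection of the row and the column whose flashes elicited the strongest averaged P300 response. A minimal sketch (the letter layout and the score values are illustrative, not taken from the study):

```python
import numpy as np

# A 5x5 speller matrix, as in Figure 13 (illustrative letter layout).
MATRIX = np.array([list(row) for row in ["ABCDE", "FGHIJ", "KLMNO", "PQRST", "UVWXY"]])


def decode_p300(row_scores, col_scores):
    """Pick the letter at the intersection of the row and the column whose
    flashes elicited the largest averaged P300 amplitude."""
    return MATRIX[int(np.argmax(row_scores)), int(np.argmax(col_scores))]


# Simulated averaged P300 amplitudes: row 2 and column 3 stand out,
# so the decoded target is MATRIX[2, 3].
row_scores = [0.10, 0.20, 0.90, 0.15, 0.10]
col_scores = [0.20, 0.10, 0.10, 0.80, 0.20]
print(decode_p300(row_scores, col_scores))  # N
```

Averaging over repeated flashes before taking the argmax is what gives the paradigm its robustness at the cost of spelling speed.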
Figure 14. BCI-FES subsystem and the patient executing the control command for a rehabilitation exercise [105].
Figure 15. The VR environment: (a,c) patient views; (b,d) world views; (a) the therapist in front of the patient; (c) the therapist on the left side of the patient with a mirror in front [105].
Table 1. EEG rhythms and their frequency bands.

| Rhythm | Frequency Band (Hz) | Related Functions |
|---|---|---|
| Delta (δ) | 0.5–4 | Appears in infants and during deep sleep [37,38,39,40,41,42]. |
| Theta (θ) | 4–8 | Occurs in the parietal and temporal areas in children [43,44,45]. |
| Alpha (α) | 8–13 | Found in awake adults; most prominent in the occipital area, but also detectable over the frontal and parietal regions of the scalp [46,47,48]. |
| Beta (β) | 13–30 | A decrease in beta rhythm reflects movement, movement planning, motor imagery, or movement preparation; this decrease is most dominant in the contralateral motor cortex. Beta waves occur during movements and can be detected over the central and frontal scalp [49,50,51]. |
| Gamma (γ) | >30 | The highest-frequency rhythms, related to the formation of ideas, language processing, and various types of learning [52,53,54,55]. |
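The rhythms in Table 1 are conventionally isolated by band-pass filtering the raw EEG and comparing band powers. A minimal Python sketch, assuming a synthetic signal; the band edges follow Table 1, with gamma capped at 45 Hz purely for illustration:

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt

# Frequency bands from Table 1 (gamma upper edge capped for this example).
BANDS = {
    "delta": (0.5, 4), "theta": (4, 8), "alpha": (8, 13),
    "beta": (13, 30), "gamma": (30, 45),
}


def band_power(signal, fs, low, high, order=4):
    """Band-pass filter the signal and return its mean power in the band."""
    sos = butter(order, [low, high], btype="bandpass", fs=fs, output="sos")
    return float(np.mean(sosfiltfilt(sos, signal) ** 2))


# Synthetic 2-second "EEG" trace: a 10 Hz (alpha-band) oscillation plus noise.
fs = 250
t = np.arange(0, 2, 1 / fs)
rng = np.random.default_rng(0)
eeg = np.sin(2 * np.pi * 10 * t) + 0.2 * rng.standard_normal(t.size)

powers = {name: band_power(eeg, fs, lo, hi) for name, (lo, hi) in BANDS.items()}
dominant = max(powers, key=powers.get)
print(dominant)  # alpha, as expected for a 10 Hz signal
```

Second-order-sections (`output="sos"`) filtering is used because transfer-function coefficients become numerically fragile for narrow low-frequency bands such as delta.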
Table 2. Summary of the presented papers, including BCI paradigm, number of subjects, signal type, electrode locations, and accuracy.

| Type of Application | Representative Work | BCI Paradigm | Description | No. of Subjects | Signal Type | Electrodes | Accuracy |
|---|---|---|---|---|---|---|---|
| BCI-assistive robot for rehabilitation | Soekadar, S.R. et al. [99] | MI-EEG; HOV-EOG | Helps quadriplegic patients control a hand exoskeleton for daily life activities | 6 | EEG-EOG | C3 | 84.96 ± 7.19% |
| | Zhang, J. et al. [100] | MI-EEG; left/right-looking EOG | — | 6 | EEG-EOG-EMG | 40 Ag/AgCl channels placed per the 10–20 system | 93.83% |
| | Cheng, N. et al. [101] | MI | Studied the applicability of a BCI-based soft robotic glove for stroke patient rehabilitation in daily life activities | 11 | EEG | 24 Ag/AgCl channels placed per the 10–20 system | — |
| | Jochumsen, M. et al. [102] | MI | Induction of neural plasticity using a low-cost open-source BCI and a 3D-printed wrist exoskeleton | 11 | EEG | F1, F2, C3, Cz, C4, P1, P2 | 86 ± 12% |
| | Käthner, I. et al. [103] | P300 | Tested whether VR devices can achieve the same precision and rapid data transmission as regular displays | 18 + 1 person with ALS (80 years) | EEG-VR | Fz, Cz, P3, P4, PO7, POz, PO8, Oz | 96% |
| BCI-virtual-reality-based rehabilitation | Ortner, R. et al. [104] | MI | Training stroke patients to imagine left- and right-hand movements in VR scenes | 3 | EEG-VR | 63 positions | mean 90.4% |
| | Lupu, R.G. et al. [105] | MI | Following a virtual therapist's instructions to control virtual characters in VR scenes using MI; motor function was improved | 7 | EEG-FES; EOG | 16 channels over the sensorimotor areas | mean 85.44% |
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Share and Cite

MDPI and ACS Style

Orban, M.; Elsamanty, M.; Guo, K.; Zhang, S.; Yang, H. A Review of Brain Activity and EEG-Based Brain–Computer Interfaces for Rehabilitation Application. Bioengineering 2022, 9, 768. https://doi.org/10.3390/bioengineering9120768


