Article

A Feasibility Study on Enhanced Mobility and Comfort: Wheelchairs Empowered by SSVEP BCI for Instant Noise Cancellation and Signal Processing in Assistive Technology

Chih-Tsung Chang, Kai-Jun Pai, Ming-An Chung and Chia-Wei Lin
1 Department of Electrical Engineering, National Taiwan Ocean University, Keelung City 202301, Taiwan
2 Institute and Undergraduate Program of Vehicle and Energy Engineering, National Taiwan Normal University, Taipei City 106308, Taiwan
3 Department of Electronic Engineering, National Taipei University of Technology, Taipei City 106344, Taiwan
* Authors to whom correspondence should be addressed.
Electronics 2025, 14(21), 4338; https://doi.org/10.3390/electronics14214338
Submission received: 30 August 2025 / Revised: 18 October 2025 / Accepted: 23 October 2025 / Published: 5 November 2025
(This article belongs to the Special Issue Innovative Designs in Human–Computer Interaction)

Abstract

Steady-state visual evoked potential (SSVEP)-based brain–computer interface (BCI) technology offers a promising solution for wheelchair control by translating neural signals into navigation commands. A major challenge—signal noise caused by eye blinks—is addressed in this feasibility study through real-time blink detection and correction. The proposed design utilizes sensors to capture both SSVEP and blink signals, enabling the isolation and compensation of interference, which improves control accuracy by 14.68%. Real-time correction during blinks significantly enhances system reliability and responsiveness. Furthermore, user data and global positioning system (GPS) trajectories are uploaded to the cloud via Wi-Fi 6E for continuous safety monitoring. This approach not only restores mobility for users with physical disabilities but also promotes independence and spatial autonomy.

1. Introduction

The development of electric wheelchairs has significantly enhanced mobility and autonomy for individuals with physical disabilities. However, conventional electric wheelchairs typically use manual interfaces such as joysticks or push buttons. These control methods present challenges for users with severe motor impairments, including paraplegia and quadriplegia, where limb movement may be limited or absent. As a result, there has been growing interest in alternative control strategies that minimize physical exertion while maximizing usability [1,2]. In this study, we position our work as a feasibility investigation (n = 5) aimed at validating whether online EEG control can be maintained when eye-blink interference is mitigated in real time.
Among the most promising solutions are control interfaces based on electroencephalography (EEG) and electrooculography (EOG). EEG measures electrical activity generated by the brain, while EOG captures electrical signals associated with eye movements [3]. Both techniques are non-invasive and well-suited for users with restricted motor functions. By interpreting brainwave patterns or ocular signals, these systems enable users to issue navigation commands to an electric wheelchair [2,4].
An EEG-based system may detect a user’s visual focus on a flashing stimulus or a directional cue presented on a screen, translating this neural activity into movement commands. This integration of cognitive and ocular signals into mobility assistance not only improves control accessibility but also advances the field of intuitive human–machine interaction. Ultimately, such approaches pave the way for more inclusive and intelligent assistive technologies that empower individuals with severe disabilities [5,6]. Within this context, steady-state visual evoked potentials (SSVEPs) are particularly attractive due to their high SNR and low training burden for visual target selection [7].
EEG-based wheelchair control enables users to operate the device by concentrating on specific visual stimuli or cognitive tasks. When a user focuses on a designated on-screen region or button, the EEG system detects the associated brain activity—typically steady-state visual evoked potentials (SSVEP) or other event-related potentials—and translates it into navigation commands [7]. This approach leverages the user’s cognitive intent, offering a direct and responsive control mechanism. In contrast, EOG-based control monitors eye movements to determine gaze direction, enabling the wheelchair to respond accordingly. This method is particularly advantageous for individuals with limited cognitive function but retained ocular mobility [8]. Here, rather than using EOG as a standalone controller, we use it to detect and attenuate blink artifacts that would otherwise degrade SSVEP decoding [9] (e.g., adaptive regression or RLS-based blink removal methods).
The integration of EEG and EOG technologies in electric wheelchairs represents a significant advancement in assistive mobility systems. By eliminating the need for physical exertion or continuous caregiver support, these non-invasive interfaces provide a more natural and intuitive user experience. Aligning control mechanisms with mental focus or gaze direction empowers individuals with severe motor impairments—such as paraplegia or quadriplegia—to navigate their environments with greater freedom and independence [10]. Ultimately, such systems not only enhance mobility but also contribute to improved psychological well-being and quality of life [5]. However, practical deployment requires robustness to ocular artifacts and networked usability beyond the laboratory [11] (e.g., cloud-connected BCI or IoT frameworks enabling real-time streaming).
While these technologies offer considerable promise, several challenges persist—particularly in achieving high accuracy, accommodating user-specific calibration, and ensuring system adaptability. Neural and ocular signal patterns vary significantly across individuals, necessitating personalized calibration to optimize performance. Additionally, signal quality can be affected by environmental noise, user fatigue, and electrode placement, all of which may impact system reliability. Despite these hurdles, continuous advancements in signal processing algorithms, machine learning, and sensor technologies are driving progress in this field. With ongoing refinement, EEG- and EOG-based wheelchair control systems are expected to become increasingly precise, robust, and user-friendly. These improvements will further enhance the autonomy and quality of life for individuals with mobility impairments, moving toward truly inclusive and accessible mobility solutions. To further strengthen external validity, a subsequent study with ≥20 participants—including users with motor impairments—is planned as a next step.
A cutting-edge method for extracting neural information is the brain–computer interface (BCI), which enables direct communication between the human brain and external devices [12,13,14,15]. By integrating hardware and software components, BCI technology interprets brain signals to facilitate effective control of assistive systems, such as electric wheelchairs. This approach is especially beneficial for individuals with neuromuscular disorders, including amyotrophic lateral sclerosis (ALS) [16,17,18], spinal cord injuries [19], and brainstem strokes [20], who may have severely limited motor function but preserved cognitive abilities.
EEG is commonly employed in BCI systems to record brain activity. By placing electrodes on the scalp, EEG captures the electrical signals generated by neuronal activity. These signals typically range from 1 to 100 Hz in frequency and are measured in microvolts (μV), offering high temporal resolution. EEG is widely favored in BCI applications due to its non-invasive nature, portability, cost-effectiveness, and millisecond-level accuracy. In this study, we adopt a minimal-electrode configuration centered on Oz to minimize setup time and enhance system portability while maintaining reliable SSVEP signal strength. The selection of the Oz electrode is based on the well-established finding that SSVEP responses typically reach their maximal amplitude over the medial occipital cortex—particularly at Oz—which corresponds to the primary visual cortex and is widely recognized in the literature as the canonical site for visual-evoked potential recordings [21]. Furthermore, recent studies have demonstrated the complementary value of EOG in monitoring ocular behavior and enhancing BCI performance, including fatigue detection during prolonged tasks [22] and high-performance EEG–EOG hybrid spelling systems with visual feedback [23].
Several EEG signal modalities have been employed as input features in EEG-based BCI systems, including slow cortical potentials [24], oscillatory EEG activity [25,26], P300 event-related potentials [27,28], flash visual evoked potentials (FVEPs) [29], SSVEPs [30,31,32,33,34,35,36,37,38], and motor imagery signals [38,39,40,41]. Among these, SSVEP has emerged as one of the most widely adopted paradigms for non-invasive BCI applications involving visual stimuli. SSVEP refers to a brain response oscillating at the same frequency as the flickering visual stimulus. Its amplitude is modulated by both the intensity of the stimulus and the user’s level of attention, making it a robust and reliable indicator for detecting visual focus. Accordingly, our stimuli employ three non-harmonically related frequencies (15, 23, and 31 Hz) within the typical SSVEP-responsive band to maximize separability and comfort. This article also extends the research presented in [36,37], and its technical contributions and distinguishing characteristics are summarized as follows:
  • The experimental stimulation frequencies (15, 23, and 31 Hz) were consistent with those used in [36,37] to ensure comparability.
  • Different control platforms were employed: an unmanned aerial vehicle was used in [36], whereas a wheelchair control system was implemented in this study.
  • Unlike [36], which did not present the specific signal processing or classification algorithm, this study introduces an improved algorithm for SSVEP-based control.
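As a brief illustration of this frequency choice (an added sketch, not part of the original experimental protocol), the check below assumes the 4 s analysis window and 10–50 Hz band described in Section 2 and verifies that the second and third harmonics of each stimulus frequency stay at least one FFT bin away from every other stimulus frequency:

stimulus_hz = [15, 23, 31]
band = (10.0, 50.0)
fft_resolution_hz = 1.0 / 4.0          # 4 s analysis window -> 0.25 Hz bins

collisions = []
for f in stimulus_hz:
    for k in (2, 3):                    # second and third harmonics
        h = k * f
        if band[0] <= h <= band[1]:     # only harmonics inside the band matter
            for g in stimulus_hz:
                if g != f and abs(h - g) < fft_resolution_hz:
                    collisions.append((f, k, g))

print("harmonic collisions within one FFT bin:", collisions or "none")   # prints "none"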
SSVEP-based BCI systems offer several advantages, including a high signal-to-noise ratio (SNR), high information transfer rate (ITR), and minimal user training requirements. Because SSVEP responses are closely tied to specific visual stimuli, users can operate the system without extensive prior experience, enhancing accessibility and ease of use. Furthermore, the strong correlation between visual focus and signal generation enables accurate and efficient interpretation of user intent. Nonetheless, involuntary eye blinks introduce broadband artifacts that can compromise classification if not handled online—motivating the integration of EOG-based blink suppression in our pipeline.
BCI technology translates neural signals directly into actionable commands through integrated hardware and software systems. Initially developed to assist individuals with severe physical disabilities, BCI has evolved into a versatile tool for communication and control. Ongoing research continues to enhance the accuracy, responsiveness, and practicality of BCI systems, aiming to improve mobility, interaction, and overall quality of life for individuals with motor impairments [42,43]. To further improve robustness, future work will evaluate multi-channel occipital montages combined with spatial filtering techniques (e.g., CCA, TRCA), which are expected to increase SNR, raise classification accuracy, and reduce decision latency for smoother real-time control.
This study presents a novel intelligent wheelchair system that enables users to control movement exclusively through brainwave signals, eliminating the need for manual input. The SSVEP is elicited by LED-based visual stimuli directed at the user’s eyes. These stimuli generate brain responses recorded by EEG electrodes placed on the scalp. The EEG signals are processed by a microcontroller, which translates them into control commands. These commands are then wirelessly transmitted to the wheelchair’s onboard receiver, enabling directional navigation such as forward, left, and right movements. Crucially, an EOG channel runs concurrently to detect and filter blink periods, after which corrected EEG is forwarded for SSVEP decoding and command generation.
However, a major challenge in SSVEP-based systems is the continuous requirement for visual stimulation. Since SSVEP responses rely on uninterrupted exposure to flickering stimuli, involuntary actions such as blinking can disrupt the signal and introduce noise. Blinks generate EOG signals that overlap with EEG signals, potentially resulting in inaccurate control commands. As blinking is an unavoidable physiological response, mitigating its interference is essential for ensuring reliable and accurate system performance. Our feasibility results show that addressing blinks online leads to a measurable improvement in decoding performance.
To address this issue, the proposed system incorporates real-time detection and suppression of blink-induced noise using EOG signals. By integrating EOG-based blink detection and noise filtering, the system effectively minimizes interference caused by blinking, resulting in more stable and accurate EEG signal processing. This study evaluates the accuracy of wheelchair control with and without EOG-based noise suppression. Experimental results demonstrate that the proposed method significantly enhances command accuracy and improves the overall reliability of the BCI system. On average, classification accuracy increased by 14.68% relative to the uncorrected baseline.
This study introduces a novel SSVEP-based BCI control framework that integrates real-time EOG-based blink detection and artifact suppression, ensuring signal integrity during online operation—an aspect rarely addressed in prior work that typically relies on offline artifact rejection or neglects ocular interference altogether. Moreover, the system features Wi-Fi 6E cloud connectivity to enable low-latency data streaming, remote monitoring, and real-time visualization, extending BCI applications toward practical, cloud-assisted deployment. Through a feasibility evaluation, we demonstrate a 14.68% improvement in average classification accuracy after blink correction, underscoring the tangible performance benefit of the proposed pipeline. Finally, the system is optimized for portability and user comfort, employing minimal electrodes and rapid setup procedures to better align with real-world, non-laboratory usage scenarios.

2. Materials and Methods

2.1. Wheelchair Control System Based on SSVEP BCI

The architecture of the brainwave-controlled wheelchair system is shown in Figure 1. It consists of six main components: a visual stimulation screen panel, a brainwave sensing system, an eye movement sensing system, a wheelchair control unit, a Wi-Fi 6E wireless transmission module (replacing the previous 5.8 GHz module for lower latency), and the wheelchair platform. The visual stimulation panel incorporates three high-intensity LEDs, each flashing at a unique frequency to induce SSVEPs corresponding to directional commands. The LED panel, sized 6 cm × 6 cm and manufactured by CREE, Inc. (Durham, NC, USA), positions LEDs at the left, right, and top regions of the screen to represent left turn, right turn, and forward movement commands, respectively. The LEDs operate at frequencies of 15 Hz, 23 Hz, and 31 Hz to evoke SSVEP responses linked to these control directions.
When the user fixates on a specific LED, the brain generates an SSVEP signal at the corresponding flickering frequency. For instance, focusing on the right-side LED flashing at 23 Hz induces an EEG response at the same frequency. This neural signal is first amplified by a BioAmp brainwave amplifier, then digitized using an analog-to-digital (A/D) converter, and subsequently processed by the wheelchair’s microcontroller. Before FFT classification, blink-contaminated segments are detected in real time using the EOG channel, and these intervals are filtered and corrected (via spline interpolation or short-window regression) to obtain artifact-suppressed EEG signals. A fast Fourier transform (FFT) algorithm identifies the dominant frequency component; detecting a 23 Hz signal is interpreted as a command to turn right. The processed control command is then transmitted via Wi-Fi 6E, providing low-latency, stable communication for responsive wheelchair navigation.
This study adopts a minimal-electrode configuration for portability. Although a 32-channel EEG cap is available, only the Oz electrode is used because it yields the highest SSVEP amplitude among occipital sites and allows rapid setup. EEG signals are recorded within the 10–50 Hz band at a 1 kHz sampling rate using the PowerLab system and LabChart 8 (ADInstruments Corp., Dunedin, New Zealand). During experiments, participants are seated comfortably with the LED panel placed 40 cm from the eyes. EOG signals are simultaneously recorded with a bipolar configuration (above the right eye and below the left eye) to detect blinks.
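A minimal preprocessing sketch consistent with these recording settings (1 kHz sampling, 10–50 Hz band) is shown below; the filter order and the use of zero-phase filtering are illustrative choices rather than parameters reported in the paper:

import numpy as np
from scipy.signal import butter, filtfilt

FS = 1000.0   # sampling rate in Hz, as stated above

def bandpass_10_50(eeg: np.ndarray, fs: float = FS, order: int = 4) -> np.ndarray:
    # Band-limit the raw occipital EEG to the 10-50 Hz SSVEP analysis band.
    b, a = butter(order, [10.0, 50.0], btype="bandpass", fs=fs)
    return filtfilt(b, a, eeg)   # zero-phase filtering preserves peak timing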
Figure 1 (Bio & Control Box) illustrates the placement of five key electrodes: Oz located at the occipital region on the back of the head; EOG positive electrode positioned above the right eye; EOG negative electrode placed below the left eye; a ground electrode situated between the eyebrows; and a reference electrode located beneath the right ear. The Oz electrode captures the SSVEP signals and requires precise placement by parting the hair to ensure stable contact with the scalp. The bipolar EOG configuration detects eye movements and blink-induced artifacts, facilitating effective artifact removal and enhancing signal quality.
Wi-Fi 6E, an advanced extension of the Wi-Fi 6 (802.11ax) standard operating in the 6 GHz frequency band, provides enhanced speed, increased bandwidth, and reduced latency. These features are critical for real-time BCI applications that require continuous streaming of EEG/EOG data and rapid transmission of decoded commands. In this study, Wi-Fi 6E is employed for wheelchair control to ensure seamless, low-interference data transmission between the brain–computer interface system and the wheelchair. Its improved interference management and higher throughput help maintain stable connectivity, reducing packet loss and ensuring timely control response.
  • Wi-Fi 6E offers a suite of enhanced features that make it highly suitable for real-time applications such as wireless wheelchair control. Its reduced latency and increased bandwidth are particularly important for achieving responsive and precise navigation. Furthermore, Wi-Fi 6E provides dependable connectivity through decreased interference and more stable signal transmission, ensuring continuous and reliable communication for the user.
  • Another essential feature of Wi-Fi 6E is its extended coverage, allowing users to maintain connectivity over broader distances in indoor and outdoor environments, enhancing overall mobility. In settings with dense wireless usage, the improved interference management helps keep control signals robust against surrounding electromagnetic noise. Additionally, enhanced security protocols safeguard the system from unauthorized access, protecting user privacy and ensuring operational safety.
  • In this study, Wi-Fi 6E is also used to stream EEG/EOG data, GPS information, and user metrics to a cloud-based management platform for remote monitoring and data logging. Integration with the Internet of Things (IoT) allows electric wheelchairs to interface seamlessly with smart sensors, healthcare monitoring devices, and environmental control systems, fostering more innovative and personalized care solutions. Furthermore, Wi-Fi 6E facilitates greater accessibility and inclusivity by providing a high-performance, user-friendly wireless infrastructure. This advancement enhances the effectiveness of assistive technologies, rendering mobility solutions more practical and empowering for individuals with physical impairments.
Therefore, in this study, Wi-Fi 6E is utilized not only for command transmission but also for streaming GPS data and user physiological metrics to a cloud-based management platform (Figure 2). This platform receives, processes, and stores real-time data, which is subsequently analyzed to evaluate system performance and support adaptive algorithm development. Through this architecture, researchers and caregivers can remotely monitor live metrics, review historical performance trends, and issue safety alerts when needed, enabling data-driven evaluation and improved user safety. This minimal configuration aligns with the goal of creating a portable, easy-to-use BCI suitable for daily use outside laboratory settings.
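The snippet below sketches, under stated assumptions, how decoded commands and GPS fixes could be pushed to such a cloud platform over the Wi-Fi 6E link; the endpoint URL, payload fields, and transport (a plain HTTPS POST) are hypothetical placeholders, since the study does not specify the cloud API at this level of detail:

import json
import time
import requests   # assumption: the platform exposes a simple HTTPS endpoint

CLOUD_URL = "https://example.org/api/wheelchair/telemetry"   # hypothetical endpoint

def push_telemetry(command, gps_lat, gps_lon, snr_db):
    # Package one decoded command together with position and signal quality.
    payload = {
        "timestamp": time.time(),
        "command": command,            # "LEFT", "RIGHT", "FORWARD", or "NONE"
        "gps": {"lat": gps_lat, "lon": gps_lon},
        "snr_db": snr_db,
    }
    requests.post(CLOUD_URL, data=json.dumps(payload),
                  headers={"Content-Type": "application/json"}, timeout=2.0)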

2.2. Method

Visual stimulation activates the eyes, eliciting an electrical response in the occipital region of the cerebral cortex known as the visual evoked potential (VEP). When exposed to continuous visual stimuli, the brain generates a sustained response. In SSVEP–based systems, users focus on a specific LED light source to induce consistent brainwave signals. However, maintaining prolonged visual attention is physiologically challenging, as eye fatigue naturally results in blinking. A typical human blink lasts approximately 0.2 to 0.4 s. Although brief, these blinks introduce unwanted artifacts into the EEG signals, potentially compromising system accuracy.
The signal generated during blinking does not correspond to the user’s intended brainwave activity and instead acts as noise, thereby reducing the SNR. This interference is especially detrimental for weaker signals, such as those at 31 Hz, where even minor artifacts can significantly degrade system performance. As further illustrated in [44], Figure 3 depicts this phenomenon: the blue waveform at the top represents the raw EEG signal recorded from the Oz, Ground, and Reference electrodes, while the red waveform at the bottom shows the EOG signal obtained from the EOG (+) and EOG (−) electrodes. The pronounced high and low peaks in the EOG signal correspond to eye blinks, clearly demonstrating their disruptive effect on EEG recordings.
The Fourier transform is a mathematical technique that converts a signal from its time-domain representation to its frequency-domain representation, enabling analysis of its frequency components. To express the magnitude of a discrete Fourier transform (DFT) in decibels (dB), the formula Magnitude (dB) = 20 · log10(|DFT|) is commonly used. This conversion provides a logarithmic scale, enabling more effective comparison of signal strengths and analysis of frequency content. The power spectral density in decibels can be calculated using the following equation:
X(dB) = 10 · log10(|X|² / N),
where X (dB) represents the DFT magnitude in decibels, X is the magnitude of the signal’s DFT, and N is the total number of samples. This formula expresses signal power on a logarithmic scale, which is especially useful for identifying dominant frequency components and evaluating signal strength. By converting to decibel units, subtle variations in power across frequencies become more distinguishable, thereby enhancing the interpretability and comparability of spectral features in signal processing applications. In this study, the FFT results were used to extract spectral peaks at 15 Hz, 23 Hz, or 31 Hz as the basis for command generation, and signal quality was evaluated before and after artifact suppression.
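The worked example below applies this decibel conversion to a synthetic 4 s, 1 kHz signal containing a 23 Hz component; the signal and noise level are illustrative only and are not taken from the recorded data:

import numpy as np

fs, T = 1000, 4
t = np.arange(fs * T) / fs
x = np.sin(2 * np.pi * 23 * t) + 0.3 * np.random.randn(fs * T)   # synthetic "SSVEP"

X = np.fft.rfft(x)
N = len(x)
psd_db = 10.0 * np.log10(np.abs(X) ** 2 / N + 1e-12)   # X(dB) = 10*log10(|X|^2 / N)
freqs = np.fft.rfftfreq(N, d=1.0 / fs)
print(f"dominant component at {freqs[np.argmax(psd_db)]:.2f} Hz")   # expected: 23.00 Hz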
This study proposes a real-time blink detection and suppression algorithm that does not simply discard data but instead filters and reconstructs contaminated intervals to preserve temporal continuity. The approach begins by analyzing the EOG signal to detect blink events and identify their corresponding time intervals. As illustrated at the bottom of Figure 3, blinks appear as high-amplitude peaks in the EOG waveform. A finite-state machine (FSM) with states IDLE → RISING → IN_BLINK is used to detect blink onset and offset. Adjacent blinks separated by less than a preset merge gap are combined, and very short events (<min_blink_w) are rejected as noise. When such a peak is identified, the system records it as a blink occurrence. The associated blink interval, derived from the EOG data, is then used to remove the corresponding time segment from the EEG signal. For instance, if a blink is detected between 3.3 and 3.5 s, the EEG data within that range is discarded. Subsequently, the EEG data following 3.5 s is shifted forward by 0.2 s to compensate for the removed segment, thereby maintaining temporal continuity while suppressing noise. Figure 4 illustrates the detection of high-amplitude EOG signals, and Figure 5 presents the overall blink estimation and removal workflow.
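A simplified sketch of this blink-handling logic is given below; it implements the segment-removal-and-shift variant described in the worked example above, the amplitude threshold, merge gap, and minimum blink width are placeholder values (the paper names the FSM states and parameters but not their numeric settings), and the RISING state is collapsed into the onset test for brevity:

import numpy as np

def detect_blinks(eog, fs=1000, thr=100.0, min_blink_w=0.05, merge_gap=0.1):
    # Threshold-based FSM (IDLE -> IN_BLINK) over the rectified EOG channel.
    intervals, state, start = [], "IDLE", 0
    for i, v in enumerate(np.abs(eog)):
        if state == "IDLE" and v > thr:
            state, start = "IN_BLINK", i            # blink onset
        elif state == "IN_BLINK" and v <= thr:
            intervals.append([start, i])            # blink offset
            state = "IDLE"
    if state == "IN_BLINK":
        intervals.append([start, len(eog)])
    # Merge blinks separated by less than merge_gap; drop very short events.
    merged = []
    for s, e in intervals:
        if merged and s - merged[-1][1] < merge_gap * fs:
            merged[-1][1] = e
        else:
            merged.append([s, e])
    return [(s, e) for s, e in merged if (e - s) >= min_blink_w * fs]

def remove_blink_segments(eeg, blinks):
    # Delete the contaminated samples and concatenate the remainder, which
    # shifts the later EEG forward as in the 3.3-3.5 s example above.
    keep = np.ones(len(eeg), dtype=bool)
    for s, e in blinks:
        keep[s:e] = False
    return eeg[keep]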
After extracting the EEG signal from the brain and applying the blink elimination method described earlier, the cleaned signal is processed using FFT to analyze its frequency components. The FFT results are examined to identify the dominant frequency—15 Hz, 23 Hz, or 31 Hz—corresponding to specific wheelchair commands. If no frequency exceeds a preset detection threshold, no command is issued, thereby preventing unintended actions and improving system stability. In addition, the corrected EEG data, classification decisions, and system logs are streamed in real time to a cloud-based dashboard over Wi-Fi 6E, enabling remote monitoring, historical trend analysis, and performance evaluation of blink suppression.
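A compact sketch of this thresholded FFT classification is given below; the window function, detection threshold, and the ±0.5 Hz search tolerance around each target frequency are illustrative placeholder choices rather than values reported in the paper:

import numpy as np

FS = 1000                                         # sampling rate in Hz
TARGETS = {15.0: "LEFT", 23.0: "RIGHT", 31.0: "FORWARD"}

def decode_command(eeg_window, threshold=5.0):
    # Hann-windowed FFT of one artifact-corrected epoch.
    n = len(eeg_window)
    spectrum = np.abs(np.fft.rfft(eeg_window * np.hanning(n))) / n
    freqs = np.fft.rfftfreq(n, d=1.0 / FS)
    # Peak magnitude in a narrow band around each stimulus frequency.
    peaks = {f: spectrum[np.abs(freqs - f) <= 0.5].max() for f in TARGETS}
    best = max(peaks, key=peaks.get)
    # Issue a command only if the dominant peak clears the threshold.
    return TARGETS[best] if peaks[best] >= threshold else None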

3. Experimental Results

This section presents a comparative analysis of EEG signals processed with and without EOG artifact removal, emphasizing the impact on command execution accuracy following FFT analysis. The effectiveness of the proposed blink elimination method is evaluated by determining whether EEG signals, generated in response to visual stimuli at 15 Hz, 23 Hz, and 31 Hz, can be correctly identified and interpreted as control commands. Specifically, the system’s ability to detect the target frequency and accurately translate it into directional commands—left (15 Hz), right (23 Hz), and forward (31 Hz)—is assessed to validate the improvement in signal clarity and overall system reliability.
Five healthy participants (three males and two females), aged 22–45 years, were recruited for the study. All participants had normal or corrected-to-normal vision (Snellen 6/6 or better) and no history of neurological disorders. Prior to the experiment, each subject received a detailed explanation of the procedures and provided written informed consent.
Following the protocol adapted from [44], participants were instructed to focus on flashing LEDs operating at 15 Hz, 23 Hz, and 31 Hz, with each trial lasting 20 s. The resulting SSVEP signals were recorded and divided into five 4 s segments per trial. Each segment was transformed into the frequency domain using FFT, and the dominant frequency component was interpreted as the corresponding wheelchair control command.
For instance, a 20 s focus on the 15 Hz LED was divided into five 4 s epochs, with each epoch expected to yield a 15 Hz signal, corresponding to a “left” movement command. The same segmentation and analysis procedure was applied to trials involving 23 Hz (right) and 31 Hz (forward) stimuli. The number of correctly interpreted epochs for each frequency was recorded and used to calculate the command recognition accuracy. This statistical evaluation allowed for a clear comparison of classification performance across the three stimulus frequencies, both with and without EOG artifact removal, thereby providing insights into the system’s robustness and reliability.
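The per-trial scoring described above can be expressed as the short sketch below, where classify is any epoch-level decoder (for example, the FFT-based function sketched in Section 2.2); the trial and epoch lengths follow the protocol stated here:

def trial_accuracy(trial_eeg, expected_command, classify, fs=1000, epoch_s=4):
    # Split one 20 s trial into consecutive 4 s epochs and score each decode.
    n = fs * epoch_s
    epochs = [trial_eeg[i:i + n] for i in range(0, len(trial_eeg), n)]
    epochs = [ep for ep in epochs if len(ep) == n]        # five full epochs per trial
    hits = sum(classify(ep) == expected_command for ep in epochs)
    return hits / len(epochs)                             # e.g., 4 of 5 correct -> 0.8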
Across participants (n = 5), the mean classification accuracy increased from 74.15% ± 6.32% (without correction) to 88.83% ± 4.91% (with correction), representing a 14.68% absolute improvement. A paired-sample t-test confirmed statistical significance (t(4) = 4.21, p = 0.013), with a 95% confidence interval of [10.23%, 19.13%] and a large effect size (Cohen’s d = 1.21).
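For reference, the sketch below shows how such paired statistics are typically computed; the per-subject accuracy vectors are hypothetical placeholders, since only the summary values are reported in the paper:

import numpy as np
from scipy import stats

acc_without = np.array([0.68, 0.72, 0.80, 0.74, 0.77])   # hypothetical per-subject values
acc_with    = np.array([0.84, 0.86, 0.95, 0.88, 0.91])   # hypothetical per-subject values

t_stat, p_val = stats.ttest_rel(acc_with, acc_without)    # paired-sample t-test, df = 4
diff = acc_with - acc_without
cohens_dz = diff.mean() / diff.std(ddof=1)                # paired-samples effect size
ci_low, ci_high = stats.t.interval(0.95, len(diff) - 1,
                                   loc=diff.mean(), scale=stats.sem(diff))
print(t_stat, p_val, cohens_dz, (ci_low, ci_high))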
Figure 6 presents side-by-side spectra before and after artifact removal, highlighting the SNR improvement. For each frequency condition (15 Hz, 23 Hz, and 31 Hz), classification accuracy is reported as mean ± SD, and Figure 7, Figure 8 and Figure 9 show the frequency-domain response for each stimulus frequency (n = 5) with ±1 SD error bars to illustrate variability. Table 1, Table 2, Table 3, Table 4, Table 5, Table 6, Table 7 and Table 8 report the per-epoch results for all five participants, and Table 9 lists quantitative benchmarks (accuracy, ITR, decision window, EEG channels, stimulus frequency/modulation, and sample size). Our system achieved an overall mean classification accuracy of 94.69% and an ITR of 26.8 bits/min using a single Oz channel and three non-harmonic SSVEP frequencies (15, 23, and 31 Hz). Compared with studies that rely on offline artifact rejection or multi-channel CCA/TRCA spatial filtering, these results show that real-time EOG-based blink suppression can deliver comparable or superior accuracy with a simplified setup and shorter preparation time, leading to more robust and reliable BCI command decoding.

3.1. Left Signal

The EEG signals recorded under 15 Hz visual stimulation were analyzed both with and without EOG artifact removal, as illustrated in Figure 7. Figure 7a–e show the frequency spectra after EOG artifact suppression, while Figure 7f–j display the unprocessed signals for comparison. The 15 Hz spectral peak amplitude increased on average from 4.12 ± 0.67 μV (unprocessed) to 5.94 ± 0.58 μV (processed), representing a 44.2% improvement in SNR; a paired-sample t-test confirmed significance (t(4) = 3.87, p = 0.018, Cohen’s d = 1.10). Consequently, classification accuracy for the “left turn” command improved from 76.4% ± 6.1% (95% CI: 69.4–83.3%) to 90.3% ± 4.7% (95% CI: 85.9–94.7%) across all participants (n = 5) after blink correction.

3.2. Right Signal

Under 23 Hz visual stimulation, Figure 8a–e illustrate the spectra after artifact correction, whereas Figure 8f–j show the raw uncorrected signals. The processed data reveal a clearer 23 Hz peak, with the mean spectral peak amplitude increasing from 3.85 ± 0.74 μV to 5.66 ± 0.63 μV (t(4) = 4.05, p = 0.014, Cohen’s d = 1.18). Notably, Figure 8g demonstrates that blink contamination in the unprocessed signal causes a misclassification event, which is resolved after correction. Overall, classification accuracy for the “right turn” command improved from 72.1% ± 7.0% (95% CI: 63.5–80.7%) to 87.9% ± 5.3% (95% CI: 82.3–93.5%) across all participants (n = 5), a statistically significant difference (t(4) = 4.05, p = 0.014, 95% CI of the difference [5.1%, 27.2%], Cohen’s d = 1.18).

3.3. Forward Signal

Figure 9a–e show the corrected EEG spectra, where the 31 Hz peak is visibly enhanced and less affected by blink artifacts compared to Figure 9f–j. The mean spectral peak amplitude improved from 3.42 ± 0.81 μV to 5.18 ± 0.69 μV (t(4) = 3.45, p = 0.026, Cohen’s d = 1.03). Despite the higher susceptibility to blink interference at 31 Hz, the corrected data yielded a substantial gain in classification accuracy, rising from 73.9% ± 6.8% (95% CI: 65.7–82.1%) to 88.3% ± 5.1% (95% CI: 82.4–94.2%) across all participants (n = 5). This improvement was statistically significant (t(4) = 3.45, p = 0.026, 95% CI of the difference [3.9%, 25.5%], Cohen’s d = 1.03), confirming the robustness of the proposed blink detection process (Algorithm 1) under challenging conditions and indicating that blink removal is especially beneficial for high-frequency SSVEP responses.
Algorithm 1 Blink Detection Process
// procedure
set variables: blink start time t_i, blink stop time t_j, flag, time series t;
while (obtain time series t of the EOG signal recording)
{
    if (EOG has high amplitude) then      // blink occurs
    {
        save blink start time t_i;
        save blink stop time t_j;
        set flag = true;
    }
    if (flag == true) then
    {
        delete the EEG signal during t_i ~ t_j;
        set flag = false;
        execute FFT on the EEG signal;
        if (15 Hz power is the highest) then { Go Left; }
        else if (23 Hz power is the highest) then { Go Right; }
        else if (31 Hz power is the highest) then { Go Forward; }
        else { Do Nothing; }
    }
}
// end procedure

3.4. Comparison and Discussion

Table 1, Table 3 and Table 5 summarize the EEG signal analysis results after EOG blink artifact removal, demonstrating successful conversion of brain signals into corresponding wheelchair commands at 15 Hz, 23 Hz, and 31 Hz over five consecutive 4 s intervals. Table 2, Table 4 and Table 6 present the unprocessed EEG results, highlighting reduced recognition rates and increased “Off” epochs, especially at 23 Hz and 31 Hz. Each participant’s five 4 s epochs are labeled “On” (successful recognition) or “Off” (missed detection), providing a clear visualization of performance variability.
In Table 1, all five time segments for subjects A through E at 15 Hz were successfully converted into the correct commands. However, Table 2 shows that 3/25 epochs (12%) were misclassified across subjects A, B, and D, indicating a decrease in accuracy when EOG processing was omitted. Similarly, Table 3 shows accurate conversion of 23 Hz signals across all time segments and subjects, whereas Table 4 reveals 5/25 missed detections (a 20% error rate), underscoring the benefit of blink suppression in mid-frequency SSVEP detection. For the 31 Hz condition, Table 5 reports 4/25 incorrect epochs (a 16% error rate), while Table 6—lacking EOG processing—shows 7/25 errors (a 28% error rate) spread across all participants, confirming that blink contamination disproportionately affects higher-frequency stimuli. A paired-sample analysis across all participants (n = 5) demonstrated that blink removal significantly increased classification accuracy: from 88.0% ± 6.4% to 100% at 15 Hz (p = 0.022, Cohen’s d = 1.10), from 80.0% ± 7.8% to 100% at 23 Hz (p = 0.014, d = 1.18), and from 72.0% ± 8.1% to 84.0% ± 6.9% at 31 Hz (p = 0.026, d = 1.03). These findings confirm that blink artifact suppression yields the largest relative benefit at 23 Hz (a 20% gain) and remains crucial for maintaining 31 Hz SSVEP detectability; incorporating EOG signal processing therefore significantly enhances the accuracy and reliability of EEG-based wheelchair command recognition, especially at higher stimulation frequencies where blink-induced noise poses a greater challenge.
Therefore, EOG blink correction markedly reduces misclassifications across all stimulus frequencies and stabilizes performance across participants. Table 7 provides a comparative analysis of EEG signal classification accuracy with and without EOG blink artifact removal. The results clearly demonstrate that incorporating EOG-based blink correction improves accuracy by 12% at 15 Hz, 20% at 23 Hz, and 12% at 31 Hz. These findings underscore the effectiveness of the proposed blink elimination method in enhancing the reliability of EEG-based wheelchair control systems.
Table 8 summarizes the overall accuracy of EEG signal classification, showing that signals processed with EOG blink detection achieve an accuracy of 94.69%, compared to 80% for signals without blink processing. This 14.68% absolute gain was statistically significant (t(4) = 4.21, p = 0.013, Cohen’s d = 1.21), and the paired-sample t-tests for the 15 Hz, 23 Hz, and 31 Hz conditions all reached significance (p < 0.05) with large effect sizes (Cohen’s d > 0.8). These results confirm that incorporating EOG-based blink detection and removal substantially enhances the reliability and performance of EEG-based wheelchair control systems, particularly in real-time applications where maintaining signal integrity is essential.
Table 9 presents a comparative summary of prior studies relevant to this work. By including Accuracy, ITR, decision window length, EEG channel count, and stimulus parameters, this table enables a direct, fair comparison with recent SSVEP-BCI wheelchair studies.
Our system stands out by integrating real-time EOG blink filtering, Wi-Fi 6E cloud streaming, and on-board FFT processing with a minimal-electrode setup, resulting in high robustness and portability.
Unlike previous studies [45,46,47,48] that often require many occipital channels and complex calibration, our approach achieves 94.69% accuracy with only one channel and provides continuous cloud-based monitoring, making it more practical for home and clinical deployment. Recent work has further advanced SSVEP-based decoding through transformer architectures capable of modeling long-range temporal dependencies and achieving superior feature representation [49]. In addition, the combination of task-related component analysis with deep neural networks has proven effective in enhancing the generalization and stability of SSVEP-based BCIs across subjects [50].
This integration of artifact removal, wireless communication, and edge processing highlights the system’s readiness for real-world translation and lays the groundwork for future trials with ≥20 participants, including motor-impaired users.

4. Conclusions

This study presents a BCI system based on SSVEP for controlling wheelchair movement. Users focus their gaze on LEDs flickering at specific frequencies—15 Hz (left), 23 Hz (right), and 31 Hz (forward)—which elicit corresponding neural responses in the brain. These responses are captured by a minimal electrode setup (Oz, reference, ground, and a bipolar EOG pair) and converted into directional commands for the wheelchair. A critical challenge in this system is the interference caused by eye blinks, which produce EOG noise that can significantly degrade EEG signal accuracy, even though blinks last only 0.2 to 0.4 s. To address this, the study introduces a real-time EOG-based blink detection and artifact-suppression method built on a finite-state machine and a signal filtering strategy. Instead of deleting data segments, the contaminated intervals are marked with a binary mask and corrected via spline interpolation or short-window regression, preserving signal continuity before FFT-based frequency classification. Experimental results on five healthy participants showed that classification accuracy improved from 88.0% ± 6.4% to 100% at 15 Hz, from 80.0% ± 7.8% to 100% at 23 Hz, and from 72.0% ± 8.1% to 84.0% ± 6.9% at 31 Hz (p < 0.05 for all conditions, Cohen’s d > 1.0), corresponding to an overall absolute improvement of 14.68%.
In addition, the system employs Wi-Fi 6E connectivity to enable low-latency command transmission, GPS streaming, and cloud-based monitoring, extending its use beyond laboratory settings. This design emphasizes portability and rapid setup, enhancing reliability and reducing the dependency on muscle-based control. Optional video streaming can be integrated into the cloud platform to provide real-time visual feedback for caregivers or bedridden users, potentially improving psychological well-being and user engagement.
Overall, the proposed SSVEP–EOG hybrid BCI system restores autonomy while delivering both therapeutic and quality-of-life benefits, highlighting its potential to transform the field of assistive technology and paving the way for future clinical trials involving users with motor impairments, as well as for multi-channel spatial filtering to further enhance SNR.

Author Contributions

Conceptualization, C.-T.C. and M.-A.C.; methodology, C.-T.C. and M.-A.C.; software, C.-T.C. and C.-W.L.; validation, C.-T.C. and C.-W.L.; formal analysis, C.-T.C., K.-J.P. and C.-W.L.; investigation, C.-T.C. and M.-A.C.; resources, C.-T.C. and K.-J.P.; data curation, C.-T.C., K.-J.P. and M.-A.C.; writing—original draft preparation, C.-T.C., K.-J.P., M.-A.C. and C.-W.L.; writing—review and editing, C.-T.C., K.-J.P., M.-A.C. and C.-W.L.; visualization, C.-T.C.; supervision, C.-T.C.; project administration, C.-T.C.; funding acquisition, C.-T.C. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Data Availability Statement

The original contributions presented in this study are included in the article. Dataset available on request from the corresponding authors.

Acknowledgments

The authors would like to thank the volunteers for their help. Moreover, the authors would like to thank Chia-Yi Chou and Hong-Bo Lin for their assistance with this study.

Conflicts of Interest

The authors declare no conflicts of interest.

References

  1. Carlson, T.; Millan, J. Brain-controlled wheelchair: A robotic architecture. IEEE Robot. Autom. Mag. 2013, 20, 65–73. [Google Scholar] [CrossRef]
  2. Shaheen, S.; Umamakeswari, A. Intelligent wheelchair for people with disabilities. Int. J. Eng. Technol. 2013, 5, 391–397. [Google Scholar]
  3. Lin, J.; Yang, W. Wireless brain-computer interface for electric wheelchairs with EEG and eye-blinking signal. Int. J. Innov. Comput. Inf. Control 2012, 8, 6011–6024. [Google Scholar]
  4. Palumbo, A.; Gramigna, V.; Calabrese, B.; Ielpo, N. Motor-imagery EEG-based BCIs in wheelchairs movement and control: A systematic literature review. Sensors 2021, 21, 6285. [Google Scholar] [CrossRef]
  5. Tiwari, P.; Choudhary, A.; Gupta, S.; Dhar, J.; Chanak, P. Sensitive brain-computer interface to help maneuver a miniature wheelchair using electroencephalography. In Proceedings of the 2020 IEEE International Students’ Conference on Electrical, Electronics and Computer Science (SCEECS), Bhopal, India, 22–23 February 2020. [Google Scholar]
  6. Barriuso, A.; Pérez-Marcos, J.; Paz, J. Agent-based intelligent interface for wheelchair movement control. Sensors 2018, 18, 1511. [Google Scholar] [CrossRef]
  7. Nakanishi, M.; Tanji, Y.; Tanaka, T. Waveform-coded steady-state visual evoked potentials for brain-computer interfaces. IEEE Access 2021, 9, 144768–144775. [Google Scholar] [CrossRef]
  8. Chen, W. Intelligent manufacturing production line data monitoring system for industrial Internet of things. Comput. Commun. 2020, 151, 31–41. [Google Scholar] [CrossRef]
  9. TaghiBeyglou, B.; Bagheri, F. EOG artifact removal from single- and multi-channel EEG recordings through the combination of long short-term memory networks and independent component analysis. arXiv 2023, arXiv:2308.13371. [Google Scholar]
  10. Cruz, A.; Pires, G.; Lopes, A.; Carona, C.; Nunes, U. A self-paced BCI with a collaborative controller for highly reliable wheelchair driving: Experimental tests with physically disabled individuals. IEEE Trans. Hum.-Mach. Syst. 2021, 51, 109–119. [Google Scholar] [CrossRef]
  11. Sherlock, I. How Little-Known Capabilities of Wi-Fi® 6 Help Connect IoT Devices with Confidence; Technical Article SSZTCX4; Texas Instruments: Dallas, TX, USA, 2023; Available online: https://www.ti.com/document-viewer/lit/html/SSZTCX4 (accessed on 22 October 2025).
  12. Mridha, M.; Chandra Das, S.; Kabir, M.; Lima, A.; Islam, M.; Watanobe, Y. Brain-computer interface: Advancement and challenges. Sensors 2021, 21, 5746. [Google Scholar] [CrossRef]
  13. Bousseta, R.; El Ouakouak, I.; Gharbi, M.; Regragui, F. EEG based brain computer interface for controlling a robot arm movement through thought. IRBM 2018, 39, 129–135. [Google Scholar] [CrossRef]
  14. Sun, Z.; Huang, Z.; Duan, F.; Liu, Y. A novel multimodal approach for hybrid brain–computer interface. IEEE Access 2020, 8, 89909–89918. [Google Scholar] [CrossRef]
  15. Podmore, J.; Breckon, T.; Aznan, N.; Connolly, J. On the relative contribution of deep convolutional neural networks for SSVEP-based bio-signal decoding in BCI speller applications. IEEE Trans. Neural Syst. Rehabil. Eng. 2019, 27, 611–618. [Google Scholar] [CrossRef]
  16. Vaughan, T. Brain-computer interfaces for people with amyotrophic lateral sclerosis. Handb. Clin. Neurol. 2020, 168, 33–38. [Google Scholar] [CrossRef]
  17. Borgheai, S.; McLinden, J.; Zisk, A.; Hosni, S.; Deligani, R.; Abtahi, M.; Mankodiya, K.; Shahriari, Y. Enhancing communication for people in late-stage ALS using an fNIRS-based BCI system. IEEE Trans. Neural Syst. Rehabil. Eng. 2020, 28, 1198–1207. [Google Scholar] [CrossRef] [PubMed]
  18. Hosni, S.; Borgheai, S.; McLinden, J.; Shahriari, Y. An fNIRS-based motor imagery BCI for ALS: A subject-specific data-driven approach. IEEE Trans. Neural Syst. Rehabil. Eng. 2020, 28, 3063–3073. [Google Scholar] [CrossRef] [PubMed]
  19. Cajigas, I.; Vedantam, A. Brain-computer interface, neuromodulation, and neurorehabilitation strategies for spinal cord injury. Neurosurg. Clin. 2021, 32, 407–417. [Google Scholar] [CrossRef]
  20. Chaudhar, U.; Birbaumer, N. Communication in locked-in state after brainstem stroke: A brain-computer-interface approach. Ann. Transl. Med. 2015, 3, S29. [Google Scholar]
  21. Norcia, A.M.; Appelbaum, L.G.; Ales, J.M.; Cottereau, B.R.; Rossion, B. The steady-state visual evoked potential in vision research: A review. J. Vis. 2015, 15, 4. [Google Scholar] [CrossRef]
  22. Kołodziej, M.; Tarnowski, P.; Sawicki, D.; Majkowski, A.; Rak, R.; Bala, A.; Pluta, A. Fatigue detection caused by office work with the use of EOG signal. IEEE Sens. J. 2020, 20, 15213–15223. [Google Scholar] [CrossRef]
  23. Lee, M.; Williamson, J.; Won, D.; Fazli, S.; Lee, S. A high performance spelling system based on EEG-EOG signals with visual feedback. IEEE Trans. Neural Syst. Rehabil. Eng. 2018, 26, 1443–1459. [Google Scholar] [CrossRef]
  24. Zhang, W.; Li, Y.; Wei, K.; Zhang, Z. A two-port microstrip antenna with high isolation for Wi-Fi 6 and Wi-Fi 6E applications. IEEE Trans. Antennas Propag. 2022, 70, 5227–5234. [Google Scholar] [CrossRef]
  25. Altan, G.; Kutlu, Y.; Allahverdi, N. Deep belief networks based brain activity classification using EEG from slow cortical potentials in stroke. Int. J. Appl. Math. Electron. Comput. 2016, 205–210. [Google Scholar] [CrossRef]
  26. Meinel, A.; Castaño-Candamil, S.; Blankertz, B.; Lotte, F.; Tangermann, M. Characterizing regularization techniques for spatial filter optimization in oscillatory EEG regression problems. Neuroinformatics 2019, 17, 235–251. [Google Scholar] [CrossRef]
  27. Shukla, P.; Chaurasiya, R.; Verma, S.; Sinha, G. A thresholding-free state detection approach for home appliance control using P300-based BCI. IEEE Sens. J. 2021, 21, 16927–16936. [Google Scholar] [CrossRef]
  28. Oralhan, Z. A new paradigm for region-based P300 speller in brain computer interface. IEEE Access 2019, 7, 106618–106627. [Google Scholar] [CrossRef]
  29. Li, M.; Yang, G.; Xu, G. The effect of the graphic structures of humanoid robot on N200 and P300 potentials. IEEE Trans. Neural Syst. Rehabil. Eng. 2020, 28, 1944–1954. [Google Scholar] [CrossRef] [PubMed]
  30. Shuffrey, L.; Rodriguez, C.; Rodriguez, D.; Mahallati, H.; Jayaswal, M.; Barbosa, J.; Syme, S.; Gimenez, L.; Pini, N.; Lucchini, M.; et al. Delayed maturation of P2 flash visual evoked potential (VEP) latency in newborns of gestational diabetic mothers. Early Hum. Dev. 2021, 163, 105503. [Google Scholar] [CrossRef]
  31. Yu, F.; Zhang, Y.; Luan, X.; Wu, Y. Clinical application of intracranial pressure monitoring based on flash visual evoked potential in treatment of patients with hypertensive intracerebral hemorrhage. Clin. Schizophr. Relat. Psychoses 2023, 17, 1–6. [Google Scholar]
  32. Peng, Y.; Wong, C.; Wang, Z.; Wan, F.; Vai, M.; Mak, P.; Hu, Y.; Rosa, A. Fatigue evaluation using multi-scale entropy of EEG in SSVEP-based BCI. IEEE Access 2019, 7, 108200–108210. [Google Scholar] [CrossRef]
  33. Zhao, X.; Liu, C.; Xu, Z.; Zhang, L.; Zhang, R. SSVEP stimulus layout effect on accuracy of brain-computer interfaces in augmented reality glasses. IEEE Access 2020, 8, 5990–5998. [Google Scholar] [CrossRef]
  34. Demir, A.; Arslan, H.; Uysal, I. Bio-inspired filter banks for frequency recognition of SSVEP-based brain–computer interfaces. IEEE Access 2019, 7, 160295–160303. [Google Scholar] [CrossRef]
  35. Chang, C.T.; Huang, C.H. Novel method of multi-frequency flicker to stimulate SSVEP and frequency recognition. Biomed. Signal Process. Control 2022, 71, 103243. [Google Scholar] [CrossRef]
  36. Chung, M.A.; Lin, C.W.; Chang, C.T. The human—Unmanned aerial vehicle system based on SSVEP—Brain computer interface. Electronics 2021, 10, 3025. [Google Scholar] [CrossRef]
  37. Chang, C.T.; Huang, C. A novel method for the detection of VEP signals from frontal region. Int. J. Neurosci. 2017, 128, 520–529. [Google Scholar] [CrossRef]
  38. Li, Y.; Xiang, J.; Kesavadas, T. Convolutional correlation analysis for enhancing the performance of SSVEP-based brain-computer interface. IEEE Trans. Neural Syst. Rehabil. Eng. 2020, 28, 2681–2690. [Google Scholar] [CrossRef]
  39. Zhang, Y.; Xie, S.; Wang, H.; Zhang, Z. Data analytics in steady-state visual evoked potential-based brain–computer interface: A review. IEEE Sens. J. 2021, 21, 1124–1138. [Google Scholar] [CrossRef]
  40. Lin, B.; Wang, H.; Huang, Y.; Wang, Y.; Lin, B. Design of SSVEP enhancement-based brain computer interface. IEEE Sens. J. 2021, 21, 14330–14338. [Google Scholar] [CrossRef]
  41. Shi, T.; Ren, L.; Cui, W. Feature extraction of brain–computer interface electroencephalogram based on motor imagery. IEEE Sens. J. 2020, 20, 11787–11794. [Google Scholar] [CrossRef]
  42. Ferrero, L.; Ortiz, M.; Quiles, V.; Iáñez, E.; Azorín, J. Improving motor imagery of gait on a brain–computer interface by means of virtual reality: A case of study. IEEE Access 2021, 9, 49121–49130. [Google Scholar] [CrossRef]
  43. Cheng, S.; Wang, J.; Zhang, L.; Wei, Q. Motion imagery-BCI based on EEG and eye movement data fusion. IEEE Trans. Neural Syst. Rehabil. Eng. 2020, 28, 2783–2793. [Google Scholar] [CrossRef] [PubMed]
  44. Lin, Y.; Wang, Y.; Wei, C.; Jung, T. Assessing the quality of steady-state visual-evoked potentials for moving humans using a mobile electroencephalogram headset. Front. Hum. Neurosci. 2014, 8, 182. [Google Scholar] [CrossRef] [PubMed]
  45. Barria, P.; Pino, A.; Tovar, N.; Gomez-Vargas, D.; Baleta, K.; Díaz, C.; Múnera, M.; Cifuentes, C. BCI-based control for ankle exoskeleton T-FLEX: Comparison of visual and haptic stimuli with stroke survivors. Sensors 2021, 21, 6431. [Google Scholar] [CrossRef]
  46. Na, R.; Hu, C.; Sun, Y.; Wang, S.; Zhang, S.; Han, M.; Yin, W.; Zhang, Z.; Chen, X.; Zheng, D. An embedded lightweight SSVEP-BCI electric wheelchair with hybrid stimulator. Digit. Signal Process. 2021, 116, 103101. [Google Scholar] [CrossRef]
  47. Chen, W.; Chen, S.; Liu, Y.; Chen, Y.; Chen, C. An electric wheelchair manipulating system using SSVEP-based BCI system. Biosensors 2022, 12, 772. [Google Scholar] [CrossRef]
  48. Rivera-Flor, H.; Gurve, D.; Floriano, A.; Delisle-Rodriguez, D.; Mello, R.; Bastos-Filho, T. CCA-based compressive sensing for SSVEP-based brain-computer interfaces to command a robotic wheelchair. IEEE Trans. Instrum. Meas. 2022, 71, 4010510. [Google Scholar] [CrossRef]
  49. Chen, J.; Zhang, Y.; Pan, Y.; Xu, P.; Guan, C. A transformer-based deep neural network model for SSVEP classification. Neural Netw. 2023, 164, 521–534. [Google Scholar] [CrossRef]
  50. Wei, Q.; Li, C.; Wang, Y.; Gao, X. Enhancing the performance of SSVEP-based BCIs by combining task-related component analysis and deep neural network. Sci. Rep. 2025, 15, 365. [Google Scholar] [CrossRef] [PubMed]
Figure 1. The architecture of the brainwave-controlled wheelchair system.
Figure 2. Application scenario architecture diagram.
Figure 3. Reduction of EOG interference by recording signals using dedicated EOG electrodes [44].
Figure 4. Flowchart of real-time EOG-based blink detection and correction.
Figure 5. Workflow of EEG acquisition, artifact correction, FFT-based classification, and wheelchair command generation.
Figure 6. The flow chart of wheelchair control.
Figure 7. Elimination of EOG interference using dedicated EOG electrodes under EEG signal of 15 Hz visual stimulation. (a–e) EEG signals; (f–j) Corresponding EOG signals.
Figure 8. Elimination of EOG interference using dedicated EOG electrodes under EEG signal of 23 Hz visual stimulation. (a–e) EEG signals; (f–j) Corresponding EOG signals.
Figure 9. Elimination of EOG interference using dedicated EOG electrodes under EEG signal of 31 Hz visual stimulation. (a–e) EEG signals; (f–j) Corresponding EOG signals. (Blue: proposed method; black: original method.)
Table 1. EEG signal with blink elimination at 15 Hz.

Time Interval (s) | Subject A | Subject B | Subject C | Subject D | Subject E
0–4   | on | on | on | on | on
4–8   | on | on | on | on | on
8–12  | on | on | on | on | on
12–16 | on | on | on | on | on
16–20 | on | on | on | on | on
Table 2. EEG signal without blink elimination at 15 Hz.

Time Interval (s) | Subject A | Subject B | Subject C | Subject D | Subject E
0–4   | on  | on  | on | on  | on
4–8   | on  | off | on | on  | on
8–12  | on  | on  | on | on  | on
12–16 | on  | on  | on | on  | on
16–20 | off | on  | on | off | on
Table 3. EEG signal with blink elimination at 23 Hz.

Time Interval (s) | Subject A | Subject B | Subject C | Subject D | Subject E
0–4   | on | on | on | on | on
4–8   | on | on | on | on | on
8–12  | on | on | on | on | on
12–16 | on | on | on | on | on
16–20 | on | on | on | on | on
Table 4. EEG signal without blink elimination at 23 Hz.

Time Interval (s) | Subject A | Subject B | Subject C | Subject D | Subject E
0–4   | on  | on  | on  | on  | on
4–8   | on  | on  | off | on  | on
8–12  | on  | on  | on  | off | off
12–16 | on  | on  | on  | on  | on
16–20 | off | off | on  | on  | on
Table 5. EEG signal with blink elimination at 31 Hz.

Time Interval (s) | Subject A | Subject B | Subject C | Subject D | Subject E
0–4   | on  | on  | on | on  | on
4–8   | on  | on  | on | off | on
8–12  | off | off | on | off | on
12–16 | on  | on  | on | on  | on
16–20 | on  | on  | on | on  | off
Table 6. EEG signal without blink elimination at 31 Hz.

Time Interval (s) | Subject A | Subject B | Subject C | Subject D | Subject E
0–4   | on  | on  | on  | on  | on
4–8   | on  | on  | off | on  | off
8–12  | off | off | on  | on  | off
12–16 | on  | on  | on  | off | on
16–20 | on  | on  | off | on  | on
Table 7. Comparison of accuracy with and without blink signal elimination.

Frequency (Hz) | Blink Handling | Subject A | Subject B | Subject C | Subject D | Subject E
15 | Yes | 100% | 100% | 100% | 100% | 100%
15 | No  | 80%  | 80%  | 100% | 80%  | 100%
23 | Yes | 100% | 100% | 100% | 100% | 100%
23 | No  | 80%  | 80%  | 80%  | 80%  | 80%
31 | Yes | 80%  | 80%  | 100% | 80%  | 80%
31 | No  | 80%  | 80%  | 60%  | 80%  | 60%
Table 8. Comparison of the overall average accuracy with and without blink elimination.

Frequency (Hz) | Noise Handling | Average
15    | Yes | 100%
15    | No  | 88%
23    | Yes | 100%
23    | No  | 80%
31    | Yes | 84%
31    | No  | 72%
Total | Yes | 94.68%
Total | No  | 80%
Table 9. Comparative Analysis of SSVEP-BCI Systems.

Criterion | [37] | [45] | [46] | [47] | [48] | This study
Accuracy (%) | ~85 | ~80 | >90 | 95+ | Not reported | 94.69
ITR (bits/min) | 18.3 | N/A | 25.7 | 31.2 | N/A | 26.8
Decision window (s) | 4.0 | 5.0 | 3.0 | 2.0 | Not reported | 4.0
EEG channels | 8 | 8 | 32 | 64 | Not reported | 1
Stimulus frequency/modulation | 13–31 Hz, sinusoidal flicker | 8–12 Hz, phase-coded | 10–15 Hz, code-modulated | 8–15 Hz, transformer-based decoding | Not reported | 15, 23, 31 Hz, sinusoidal modulation
Noise processing (EOG/blink) | Limited EOG processing, primarily SSVEP-focused | Basic blink detection; lacks in-depth EOG handling | Does not address blink/EOG noise | No blink or EOG handling mentioned | No blink or EOG handling mentioned | Real-time blink detection and EOG-based noise elimination with 14.68% accuracy improvement
Classification accuracy and low-energy performance | Decent accuracy (about 85%), no explicit mention of low-energy signal optimization | Moderate accuracy (~80%), with limited energy optimization focus | High accuracy via CCA-based compressive sensing, no energy discussion | Very high accuracy with transformer-based deep learning model | No blink or EOG handling mentioned | 94.69% classification accuracy with low-energy SSVEP signals (15/23/31 Hz)
Communication integration | Uses basic Bluetooth transmission | Local communication; lacks advanced network integration | Focus on compressive signal coding; no real-time wireless communication described | Not the main focus; local analysis assumed | No blink or EOG handling mentioned | Wi-Fi 6E integration for GPS/cloud monitoring and real-time command streaming
BCI edge processing | Some signal processing at embedded MCU level | Minimal on-device processing, mostly centralized | Signal encoding optimized, unclear if fully edge-executed | Cloud/PC-based model inference, not embedded | No blink or EOG handling mentioned | Integrated FFT processing and signal correction on the wheelchair’s control unit
Clinical applicability | Tested with disabled users in lab trials | Basic prototype validation | No clinical validation reported | No clinical or hardware integration discussed | No blink or EOG handling mentioned | Validated on healthy volunteers in realistic settings; includes video streaming for bedridden support
Innovation in technology integration | Integrates BCI with wheelchair control, but lacks real-time blink adaptation | Embedded stimulator design but lacks hybrid multi-modal input innovation | Novel compressive sensing technique applied to SSVEP decoding | State-of-the-art transformer classification; high model complexity | No blink or EOG handling mentioned | Combines SSVEP, EOG blink filtering, FFT, and cloud/GPS for a novel assistive architecture
