Article

Architectural Proposal for Low-Cost Brain–Computer Interfaces with ROS Systems for the Control of Robotic Arms in Autonomous Wheelchairs

by Fernando Rivas 1,*, Jesús Enrique Sierra 2,* and Jose María Cámara 1
1 Department of Electromechanical Engineering, University of Burgos, 09006 Burgos, Spain
2 Department of Digitalization, University of Burgos, 09006 Burgos, Spain
* Authors to whom correspondence should be addressed.
Electronics 2024, 13(6), 1013; https://doi.org/10.3390/electronics13061013
Submission received: 8 January 2024 / Revised: 18 February 2024 / Accepted: 22 February 2024 / Published: 7 March 2024
(This article belongs to the Special Issue Intelligent Control and Computing in Advanced Robotics)

Abstract:
Neurodegenerative diseases present significant challenges in terms of mobility and autonomy for patients. In the current context of technological advances, brain–computer interfaces (BCIs) emerge as a promising tool to improve the quality of life of these patients. Therefore, in this study, we explore the feasibility of using low-cost commercial EEG headsets, such as Neurosky and Brainlink, for the control of robotic arms integrated into autonomous wheelchairs. These headbands, which also offer attention and meditation values, have been adapted to provide intuitive control based on the eight EEG band values, from Delta to Gamma (low and mid Gamma), collected from the user’s prefrontal area using only two non-invasive electrodes. To ensure precise and adaptive control, we have incorporated a neural network that interprets these values in real time so that the response of the robotic arm matches the user’s intentions. The results suggest that this combination of BCIs, robotics, and machine learning techniques, such as neural networks, is not only technically feasible but also has the potential to radically transform the interaction of patients with neurodegenerative diseases with their environment.

1. Introduction

Over the past decade, the intersection of neuroscience and robotics has experienced exponential growth, largely driven by advances in brain–computer interfaces (BCIs). These technologies, which translate brain activity into commands for external devices, have opened new possibilities for enhancing the quality of life of individuals with severe motor disabilities [1,2]. Unlike conventional interfaces, such as those based on eye movement (electrooculography, EOG) or facial muscle contractions (electromyography, EMG) [3,4], BCIs require no connection to peripheral muscles or nerves, allowing control of devices without verbal or physical interaction [5,6]. This is particularly relevant for patients in advanced stages of diseases that preclude any movement, such as subcortical stroke, amyotrophic lateral sclerosis, or cerebral palsy.
Specifically, the use of BCIs in assisted mobility, like wheelchair control, represents a rapidly developing and promising research field [7]. This study focuses on the integration of low-cost BCIs, specifically commercial electroencephalography (EEG) headbands with a single electrode, for controlling robotic arms on autonomous wheelchairs, an area that has not yet been exhaustively explored [8].
Controlling assistive devices through brain signals not only offers new hope for individuals with physical limitations but also poses significant technical challenges. Accuracy, ease of use, and adaptability are crucial aspects that must be addressed to ensure the feasibility of these systems in real-world environments. In this context, our work focuses on the use of commercial EEG headbands, providing a non-invasive and accessible way to capture brain signals [1].
Integrating these technologies with autonomous robotic systems presents a series of unique challenges. The architecture of the system proposed in this study includes a development board capable of hosting both artificial intelligence (AI) and running the Robot Operating System (ROS), along with a microcontroller to control the drivers associated with the stepper motors of a six-axis cobot and wheelchair movements. This setup allows real-time interpretation of EEG signals and an adaptive response of the system, ensuring that the robotic arm’s action aligns with the user’s intentions [9].
The incorporation of the AI development board is crucial for the efficient real-time processing of EEG data. The ability of these boards to run deep learning models, such as Convolutional Neural Networks (CNNs) and Recurrent Neural Networks (RNNs), facilitates the accurate interpretation of brain signals [10]. These models are essential for decoding complex patterns of brain activity and converting them into specific commands for robotic control.
Moreover, the use of ROS as a software architecture underpins the system’s flexibility and scalability. ROS provides a robust platform for integrating various software and hardware modules, enabling efficient communication between the development board and other system components, such as sensors and actuators [11]. This modular architecture facilitates the implementation of improvements and adaptation to different types of BCI applications.
On the other hand, the microcontroller plays a fundamental role in controlling the physical components of the system. Its capability to handle multiple inputs and outputs makes it ideal for controlling the stepper motors and other mechanical elements of the robotic arm and wheelchair. The combination of the development board with the microcontroller and ROS creates an integrated system capable of performing complex tasks efficiently and reliably.
This paper details the design and evaluation of a BCI system architecture that combines EEG signals, machine learning techniques with neural networks, and robotic systems, focused on providing solutions for individuals with neurodegenerative diseases, spinal problems, or any other reason that may lead to a reduction in their natural motor abilities. With this architecture, we aim not only to demonstrate the feasibility of the proposed development but also to make a significant contribution to the emerging field of BCI-assisted mobility, paving new paths for the research and development of more advanced and accessible BCI mobility solutions.
The content of this article is structured as follows: Section 2 reviews the state of the art of BCI systems; Section 3, the Research Methodology section, describes the iterative process of defining the architecture; Section 4 describes the proposed architecture, broken down into its functional, hardware, and software parts; Section 5 presents the results of the development, compliance with the identified requirements, and the technical, economic, and legal feasibility of the proposal; Section 6 presents the discussion; and Section 7 closes with conclusions and future work.

2. State of the Art

In recent years, as evidenced in published research, there have been significant advancements in the field of BCIs applied to wheelchair control and other assistive devices. In 2021, a notable study by Banach et al. explored the use of Alpha wave-based EEG signals for controlling electric wheelchairs, offering a new mode of interaction for individuals with incurable diseases that severely limit their communication and mobility [12]. This approach marks a critical advancement in aiding individuals “locked-in” their bodies due to severe conditions.
Furthermore, Antoniou et al. (2021) introduced an innovative system using a brain–computer interface to capture EEG signals during eye movement, classifying them into six categories using a random forest classification algorithm [13]. This development is significant in the interpretation of EEG signals for BCI applications [14]. On another note, a 2019 study, extending into 2021, demonstrated a hybrid brain–computer interface (hBCI) combining EEG and EOG signals to control an integrated wheelchair and robotic arm system, showing satisfactory control accuracy and highlighting the potential of BCI-controlled systems in complex daily tasks.
In a similar vein, a 2021 study by researchers explored a facial–machine interface system based on EEG artifacts to enhance mobility in individuals with paraplegia using the Emotiv Neuroheadset. Results indicated that combining eye and jaw movements can be highly efficient, suggesting a practical hybrid BCI system for wheelchair control [15].
Another significant study in 2021 focused on developing an operative motor controller that can extract and read brain signals to convert them into usable commands to act upon the wheels of a wheelchair [16]. Also, a semi-autonomous navigation control system for an intelligent wheelchair was presented, based on asynchronous Steady State Visually Evoked Potential (SSVEP), demonstrating high stability and flexibility [17].
Finally, a work by Olesen et al. (2021) investigated the possibility of a hybrid BCI system combining EEG and EOG signals for the remote control of a vehicle, such as a wheelchair, using machine learning techniques to design a robust and computationally efficient system [18]. These studies represent a significant body of work in the development of BCI technologies to enhance the quality of life for individuals with severe physical disabilities.
Recent research in the realm of EEG-based BCIs has made significant strides, especially in applications for controlling robotic devices like robotic arms and wheelchairs. Here is a structured summary of key developments in this field, reflecting the current state of the art and highlighting principal focus areas, as well as significant challenges and achievements.
Advancements and Utilization of BCI Technologies: BCI systems aim to establish a channel between the human brain and external devices. Prior investigations have underscored the potential of BCIs in controlling both virtual entities and tangible objects, such as wheelchairs and quadcopters. Yet, the application of non-invasive BCIs in directing robotic limbs for reach-and-grasp tasks still poses significant hurdles [13,19].
BCIs for Individuals with Motor Impairments: Persons afflicted with severe neuromuscular conditions or motor system impairments often experience a loss of voluntary muscle control. Nonetheless, many retain the ability to generate neural signals pertinent to motor functions akin to those of unimpaired individuals. This revelation has propelled the exploration of BCIs as a nascent technology capable of decoding cerebral activity in real time, thereby facilitating anthropomorphic manipulation of prosthetic or exoskeletal assistive devices [19].
Non-Invasive BCIs in Robotic Limb Control: The aspiration to control robotic limbs via non-invasive BCIs presents a compelling alternative, albeit with constraints in achieving adept multidimensional manipulation within a three-dimensional ambit. While previous endeavors have predominantly focused on virtual and tangible object control, scant research has ventured into the domain of prosthetic or robotic limb manipulation through scalp-based EEG BCIs [13].
Experimental Framework and Reach and Grasp Tasks: Addressing this gap, experiments were carefully designed, incorporating progressively challenging reach-and-grasp tasks. A cohort of healthy participants was enlisted to navigate a robotic limb through intricate tasks using non-invasive BCIs, segmented into two phases: initial guidance of the cursor/robotic appendage across a two-dimensional plane toward an object positioned in three-dimensional space, followed by the downward maneuver of the robotic limb to secure the object. This sequential experimental approach effectively minimized the degrees of freedom required for BCI interpretation, thereby simplifying the grasping mechanism within a three-dimensional context [13].
User Proficiency and Control Retention: Participants exhibited an exceptional aptitude for modulating their cerebral rhythms to govern a robotic limb using the two-stage non-invasive control scheme. They adeptly mastered the manipulation of a robotic limb to seize and reposition arbitrarily located objects within a confined three-dimensional space, consistently maintaining control proficiency over multiple sessions spanning 2–3 months [20].
This summary reflects the evolution and current state of the art in the development of BCIs for controlling robotic devices, underscoring both the achievements and the outstanding challenges in this research area. Continuation in innovation and refinement of these technologies is essential to overcome current limitations and expand the practical applications of BCIs in assisting individuals with motor disabilities and enhancing human interaction with advanced technologies. Several key challenges for future research in the field of BCIs for controlling robotic limbs can be identified.
Improvement in Accuracy and Speed: Although significant advancements have been made, the accuracy and response speed of BCI systems in controlling robotic limbs can still be limited. Investigating methods to enhance the decoding speed of brain signals and the precision of control would be a valuable area of research.
Integration of Sensory Feedback: Most current BCI systems primarily focus on translating user intentions into commands for the robotic device. Incorporating sensory feedback, such as touch or pressure, into the BCI system could significantly improve natural interaction and control efficacy.
System Adaptability and Learning: Developing BCI systems that can adapt and learn from individual user interactions could enhance personalization and efficiency of control. This includes adapting to changes in the user’s brain signals over time or between sessions [13].
Reduction in User’s Mental Load: Controlling complex devices via BCIs can be mentally demanding for users. Investigating ways to reduce this mental load, possibly through hybrid or shared control systems, is crucial for the long-term viability of BCIs in practical applications.
Improvement in Non-Invasiveness and Comfort: Although non-invasive BCI systems have advanced, they can still be uncomfortable or impractical for prolonged use. Investigating new sensing technologies and algorithms that improve comfort and ease of use without compromising performance would be beneficial [20].
Applications in Rehabilitation and Assistance: Further exploring how BCI systems can be customized and optimized for rehabilitation and assistance applications, especially for individuals with severe motor disabilities, is an area of great potential.
Enhanced Brain–Robot Interface: Developing more intuitive and natural interfaces between the human brain and robotic devices, possibly through improved artificial intelligence and machine learning algorithms, could significantly enhance the usability and acceptance of BCI systems [20].
Ethics and Privacy: As BCI systems become more advanced, ethical and privacy concerns related to the access and use of brain data arise. Addressing these concerns and developing clear ethical guidelines will be crucial for the widespread adoption of BCI technology.
These challenges represent exciting and fundamental areas for future research, with the potential to significantly advance the field of brain–computer interfaces and their application in controlling robotic devices.
Considering the comprehensive review of existing work, our proposed approach is designed to overcome the identified limitations and challenges. Leveraging advanced algorithms and innovative hardware integration, we aim to offer a more robust, efficient, and user-centric solution. By adopting a holistic and adaptive framework, our method not only addresses immediate technical limitations but also paves the way for future improvements. In this way, our approach remains at the forefront of technological advances and offers a scalable and flexible solution that adapts to changing user needs and new challenges in the field.

3. Research Methodology, Iterative Process of Defining the Architecture

We begin by outlining the workflow followed, shown in Figure 1, to define the proposed architecture and achieve the results presented.
The workflow commences with an extensive bibliographic review and analysis of existing studies to understand best practices and available technologies in BCIs, robotics, and assistive systems. Based on the research, functional and technical requirements that need to be met are defined. This phase focuses on theory and technical possibilities.
Next, a theoretical model of how the functional components interact to fulfil the functional requirements is designed. A theoretical hardware structure is then proposed, selecting ideal components based on theoretical and comparative analyses. From the hardware structure, a theoretical software architecture is designed, choosing frameworks, languages, and algorithms based on their theoretical suitability.
Once the architectures are proposed, the theoretical models are evaluated to verify their feasibility. The feasibility analysis examines in detail the technical, economic, and operational viability of the proposed architecture. The requirements defined in the design stage, both functional and technical, are then validated.
This approach ensures a thorough and well-founded development of the architecture before any physical implementation, minimizing risks and ensuring that the design is suitable for the end user’s needs.
A conceptual diagram of the basic architecture can be seen in Figure 2. It is essential to note that we are combining brain data capture technologies, through EEG signals from a specific area of the brain, to ultimately control a collaborative robot (cobot) driven by six stepper motors and positioned at a point in space, in conjunction with a motorized wheelchair.
For this development, the functional requirements of the system and the technical requirements have been identified separately to ensure, firstly, the suitability of the architecture and, secondly, the feasibility of the system and its value proposition.

3.1. Functional Requirements

Table 1 below sets out the functional requirements identified for the development of the research and the necessary architecture, which can be seen in the functional architectures, hardware, and software section of this paper (Figure 3, Figure 4 and Figure 13).
Further information on each of the functional requirements identified in Table 1 can be found in Appendix A.
To group the functional requirements into typologies, we can consider several aspects, such as the main functionality of the system, usability, adaptability, security, and portability, as outlined below. The proposed grouping comprises five typologies with at least two requirements in each:
  • Functionality and Integration: EEG signal capture; real-time interpretation of EEG signals; system integration with commercial chairs.
  • Control and Operability: Intuitive control of the robotic arm and wheelchair; control of wheelchair movements.
  • Adaptability and Usability: Adaptability to different users; preset routines in robot and robot + chair actions.
  • Safety and Stability: Safety and stability in motion; EEG/ECG signal quality analysis.
  • Portability and Compatibility: Portability and low weight (<100 g); commercial wireless headsets.
Each of these groups covers a set of key aspects of the system, from how it interacts with the user and the environment to how security and usability are ensured. This grouping provides a clear structure for organizing and addressing the functional requirements of the project. These requirement groups will be used to compare the architecture with other previous works.

3.2. Technological Requirements

In Table 2, we can find the identified technical requirements necessary for the development of the proposed architecture.
Further information on each of the technical requirements identified in Table 2 can be found in Appendix A.
To group the technical requirements, we can consider aspects such as integration and compatibility, system efficiency and performance, usability and accessibility, autonomy and economic sustainability, and regulatory compliance. The proposed grouping comprises five typologies with at least two requirements in each:
  • Integration and Compatibility: Effective integration of BCI with chair + robot; integration with commercial motorized wheelchairs; multi-platform headset support.
  • Efficiency and Performance: Efficient EEG data processing (t ≤ 1 s); real-time interpretation of EEG signals; robust control system for robotic arm and wheelchair.
  • Usability and Accessibility: Accessible and user-friendly interface; user feedback system.
  • Autonomy and Economic Sustainability: Autonomy of use (≥3 h); low integration cost (<EUR 1500).
  • Regulatory Compliance: Compliance with applicable regulations.
Each of these groups addresses different technical aspects that are fundamental to the development and effective implementation of the system.
These tables and the detailed information for each requirement will serve as a reference for determining the feasibility of the proposed architecture.

4. Architecture

This section presents the proposed architecture to achieve the research objectives and requirements presented in Section 3.1 and Section 3.2 of this document.
The architecture is described by levels of abstraction, from functional levels through hardware elements to software components.

4.1. Functional Architecture

Figure 3 illustrates the developed functional architecture of the BCI system to control a robot and a motorized wheelchair. The process begins with the capture of EEG signals from the brain, measured in microvolts (µV) and segmented by frequency ranges: Delta, Theta, Alpha, Beta, and Gamma signals. These raw signals, filtered by the TGAM, are then processed, and their values are used as training data in an initial stage and subsequently utilized in real time. LSTM networks are deep learning models that require a previous training process before they can be used to make predictions or classifications. This training allows the LSTM network to learn patterns and relationships in the training data.
Precise hyperparameter tuning is crucial as it can drastically impact LSTM convergence, overfitting, and generalization. Factors like the number of layers, nodes per layer, activations, dropouts, batch size, and optimizers each play a key role. These hyperparameters can be manually adjusted, although an automatic method such as random search is recommended.
Once the LSTM network is trained, the optimized weights and parameters of the model are saved in a file with a .h5 extension. This file basically contains all the “memory” and knowledge captured by the LSTM network during training.
If we wanted to use the LSTM network to classify new data, it is absolutely necessary to load the .h5 file with the trained weights. Otherwise, we would be using an untrained LSTM network, which would provide completely random or erroneous classification results.
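To make the training and reuse workflow concrete, the following is a minimal sketch assuming a TensorFlow/Keras implementation; the array shapes, hyperparameter values, placeholder data, and the file name eeg_lstm.h5 are illustrative assumptions rather than the project’s actual configuration.

```python
# Minimal sketch (assumptions: TensorFlow/Keras; the training arrays are
# random placeholders standing in for labelled EEG windows of the eight
# band values from Delta to mid-Gamma).
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

N_BANDS = 8      # Delta ... mid-Gamma band values per sample
TIMESTEPS = 32   # hypothetical window length
N_CLASSES = 8    # e.g., four wheelchair commands + robot axis commands

def build_lstm(units=64, dropout=0.3, lr=1e-3):
    """Build an LSTM classifier; units, dropout, and lr are the kinds of
    hyperparameters a random search would sample."""
    model = keras.Sequential([
        layers.Input(shape=(TIMESTEPS, N_BANDS)),
        layers.LSTM(units),
        layers.Dropout(dropout),
        layers.Dense(N_CLASSES, activation="softmax"),
    ])
    model.compile(optimizer=keras.optimizers.Adam(learning_rate=lr),
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    return model

# Placeholder data standing in for recorded, labelled EEG windows.
X_train = np.random.rand(256, TIMESTEPS, N_BANDS).astype("float32")
y_train = np.random.randint(0, N_CLASSES, size=256)

# Training phase (offline): fit on labelled windows, then persist weights.
model = build_lstm()
model.fit(X_train, y_train, batch_size=32, epochs=5, validation_split=0.2)
model.save("eeg_lstm.h5")  # the .h5 file holding the network's "memory"

# Real-time phase: the trained network must be reloaded before classifying
# new data; otherwise predictions would be random, as noted above.
model = keras.models.load_model("eeg_lstm.h5")
```

In practice, a random search would repeatedly call build_lstm with sampled values of units, dropout, and lr, keeping the configuration with the best validation accuracy, as recommended above.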
The next stage is classification, in this case through a DL model or RNN. The model interprets the EEG signals and issues a movement order, which is transmitted to the robot’s control system. This order can be as specific as “M1 + 500”, encoding a particular action of the robot or wheelchair, with the associated values varying according to the number of degrees of freedom of the system (M1 and M2 for the wheelchair motors and J1 to J6 for the robot axes). In this case, a total of eight degrees of freedom are considered: six from the robot and two from the motorized wheelchair, as the combination of turning the two motors, right and left, provides the complete mobility of the wheelchair according to the user’s commands.
Figure 3 presents the functional architecture of our brain–computer interface (BCI) system, which is designed for controlling robotic devices and motorized wheelchairs. It begins with capturing EEG signals from the brain, which are categorized into frequency bands: Delta, Theta, Alpha, Beta, and Gamma. The RNN model receives the raw data of these signals as inputs to generate the movement orders. Different DL neural networks can be used, but Long Short-Term Memory (LSTM) networks are recommended due to their proven efficiency in signal classification tasks. In this architecture, the RNN has been previously trained (weights stored in the .h5 file) and is utilized here for real-time decision making.
The movement orders are received by the control module, which communicates with both the wheelchair and the robot control module. This component adapts the abstract commands issued by the classification system into the low-level commands or control signals of the devices. In addition, it oversees all the actions carried out by the system.
On the other hand, the robot control module is in charge of the low-level control of the robotic arm. It generates the speed references for the motors that control the position of each joint, receiving the encoder signal of each joint as position feedback. The control of the robot is managed through a dedicated module, implemented in C++ (the .h extension indicating a header file). Finally, the energy source provides the required power to all the components in the architecture.
Finally, the system provides feedback to the user, likely indicating whether an action has been executed or if there is any error or state that the user needs to be aware of. This feedback loop is essential for the interactivity and real-time error correction of the BCI system.

4.2. Hardware Architecture

The elements that make up the proposed architecture can be seen in Figure 4, which shows, alongside a photograph of each hardware device, the main functional characteristics of each element.
Figure 4 shows the hardware architecture of a BCI system, with a complex network of components designed to control robotic devices and motorized wheelchairs. At the core of the acquisition process, EEG signals are captured by the Neurosky and/or Brainlink devices, which capture and read the user’s brainwaves. These signals are then transmitted to the Jetson Nano device, a compact hardware device capable of performing computationally intensive tasks. The signals are transmitted via Bluetooth, which ensures wireless communication for the user’s convenience and mobility.
The Jetson Nano processes EEG signals through an LSTM neural network, a type of recurrent neural network recommended for its ability to handle time series data. This LSTM is responsible for managing the EEG data workflow and generating movement commands based on the analysis performed. The ROS operating system acts as a middleware that enables communication between the various hardware and software components. This process is carried out within the Jetson Nano, optimizing the process of analysis, communication, and management of the orders associated with the EEG signal values.
The signals resulting from the above process are routed to various ports depending on the function to be performed. The USB serial port facilitates direct communication with the Arduino modules responsible for controlling the robot and the system’s safety chain, while the GPIO ports will focus on the direct control of the motorized wheelchair’s movements. The safety signals (ξs) and encoder signals (ξe) are crucial to the operation of the robot: the safety signal serves as an emergency or caution interrupt redundantly, and the encoder signal provides information about the positions of the robot’s joints, controlling the actual position of each of the robot’s axes.
The Arduino Mega + Nano configuration is the microcontroller platform that receives commands from the ROS part of the Jetson Nano, processes these signals, and translates them into physical movements of the NIRYO robot, which is shown with a blue outline, indicating its status as the central element of the system. The Arduino setup is also responsible for handling encoder feedback and issuing safety signals as needed.
The wheelchair interface is represented by a digital/analog signal line (4–20 mA), common in industrial control systems, to manage its operation. Power is supplied by a 12 Vdc supply, and there is a secondary 5 Vdc/4 Amp power supply line, highlighting the system’s need for different voltage levels for various components.
In adherence to stringent safety standards, a dual redundant system based on Arduino technology has been incorporated, which integrates an Arduino Mega with an Arduino Nano, both connected to the Jetson Nano. In turn, these Arduino devices are connected to the safety chain of the electrical wheelchair. This configuration ensures that in the event of a malfunction in any of the devices or an anomaly within the safety chain, the entire system will halt, thereby activating the safety mechanisms of the motorized wheelchair. This setup guarantees continuous monitoring of the system’s performance, effectively precluding any safety hazards during operation. Such a redundant system not only reinforces the reliability of our setup but also aligns with the highest safety norms, ensuring that our design is safe and functional.

4.2.1. EEG Data Acquisition

In our research, the capture of EEG data is conducted using low-cost devices such as Neurosky and Brainlink commercial headbands. This represents a significant advancement in the accessibility, application, and democratization of EEG technology in BCIs. These headbands, unlike traditional EEG systems, offer a practical and non-invasive solution for capturing brain signals, although with certain limitations in terms of resolution and spatial precision since they only collect signals from a specific area of the user’s brain, as described in Figure 5. On the other hand, the use of these headbands, due to their ease of use, favours the comfort and portability of the device for the user, eliminating the need to wear bulky devices or caps with multiple electrodes.
These devices capture a wide range of brain signals from the prefrontal area of the user’s skull, specifically from point Fp1, as observed in Figure 6, obtaining Delta (1–3 Hz), Theta (4–7 Hz), Alpha (8–13 Hz, subdivided into high and low), Beta (14–30 Hz, subdivided into high and low), and Gamma (31–100 Hz, subdivided into high and low) waves [14].
These devices feature a chip called TGAM integrated into the headbands, which performs essential preprocessing of the data, filtering noise in the extraction of these characteristics [15]. In addition to the brain signals, they provide attention and meditation indices calculated from the processed EEG signals. The voltage amplitudes of these signals vary between 20 and 200 μV [16], with the voltage ranges of the EEG signals presented in Table 3.
A crucial technical aspect of these headbands is their signal capture method. The Neurosky headband, for instance, uses a dry electrode on the forehead and a reference point on the left earlobe to establish a baseline voltage of 0. In contrast, the Brainlink headband uses a reference point on the user’s own forehead. This simplified signal capture approach, though it limits spatial resolution, makes the headbands ideal for BCI applications where the complexity and cost of traditional medical EEG systems are very high.
The operation of the Neurosky headset and the preprocessing of EEG signals involve several stages. First, the raw EEG data are subjected to a common reference and baseline correction process [21]. Subsequently, artifact detection algorithms can be applied to identify and remove unwanted signals, such as those produced by facial gestures or muscle movements [22]. In addition, relevant feature extraction steps are performed on the pre-processed signals for further analysis [23]. These processes are essential to ensure the quality of EEG data and their usefulness in brain–computer interface applications and other research areas.
ThinkGear technology in the Neurosky headband is a system that amplifies the raw brainwave signal, eliminates noise and artifacts, and delivers a digitized brainwave signal [20]. This technology, together with a dry electrode, enables the acquisition of EEG signals for further processing and analysis in brain–computer interface applications and other areas of research. ThinkGear and TGAM are Neurosky’s patented technologies.
Figure 7 provides a comprehensive visualization of EEG signal fluctuations over time across various frequency bands: Delta, Theta, Alpha (low and high), Beta (low and high), and Gamma (low and high), alongside metrics for attention and meditation. The figure demonstrates the complexity and dynamic nature of brain activity, as well as the capability of our system to capture and distinguish between different mental states and cognitive loads. These data are crucial for our analysis as they validate the sensitivity and specificity of our BCI system in detecting and interpreting nuanced patterns of neural activity, which is foundational for accurate system responses.
The eight base signals, from Delta to Gamma, can be observed, along with the attention and meditation values calculated by the Neurosky algorithm [24].
The application of these headbands in signal labelling for BCIs is particularly promising. Initially, our focus was on using motor imagery to label EEG signals. This technique, which involves mentally simulating movements without physical execution, has the potential to enable users to control devices by simply imagining movements. Additionally, we are exploring models based on evoked potentials, such as P300 and SSVEPs, for cases where motor imagery does not provide the desired level of control [25].
However, low-cost EEG headbands present challenges related to the quality and accuracy of the captured signals. Limitations to a single capture point and the use of dry electrodes reduce spatial resolution and data fidelity. Despite this, advances in machine learning algorithms and signal processing are improving the interpretation of these data. For example, the article “Methodologies and Wearable Devices to Monitor Biophysical Parameters Related to Sleep Dysfunctions: An Overview” (2022) by Roberto De Fazio et al. provides an overview of the use of EEG devices in monitoring applications, which can be extrapolated to BCI applications [26].
Looking forward, the combination of these low-cost technologies with sophisticated algorithms promises broader integration of BCIs in various applications. Future research should focus on improving the accuracy of these devices and exploring new methodologies for signal classification. The democratization of access to EEG technology opens a world of possibilities in the field of neuroscience and human–computer interaction, promising significant advances in understanding and manipulating brain activity.

4.2.2. Hardware for Classification and Control

The NVIDIA Jetson Nano (Figure 8), pivotal for classification and control tasks in our BCI project, is a compact AI powerhouse. With its quad-core ARM Cortex-A57 CPU and 128-core Maxwell GPU, it is adept at running parallel neural networks and handling multiple high-resolution streams, courtesy of 4 GB LPDDR4 RAM. Its compatibility with a plethora of interfaces like GPIO and I2C, along with its real-time computational prowess, makes it ideal for real-time EEG signal processing with LSTM networks and robotic control via ROS, providing the necessary speed and parallel processing capabilities for sophisticated robotics and BCI applications.
Table 4 summarizes the key specifications and features of the NVIDIA Jetson Nano.
These specs highlight the Jetson Nano’s capacity for complex, high-speed AI computations and its suitability for advanced BCIs and robotics systems.
The Arduino Mega, powered by the ATmega 2560 (Figure 9), is a versatile microcontroller with 54 digital I/O pins, 16 analogue inputs, and 4 UARTs, suitable for complex projects involving multiple I/O operations, such as robotic control. It is complemented by the Ramps 1.4 shield, which can manage five stepper motors, typical in precision applications such as 3D printing. This shield also supports power controller drivers for high endurance operation and variable power requirements. Together, the Arduino Mega and Ramps 1.4 shield form a powerful duo for a six-axis robotic system, offering the control and power needed to drive multiple motors and axes with precision. This cost-effective and customizable combination is capable of the detailed motion control essential to the requirements of the BCI project, allowing for custom tuning and scalability of the system.
The Arduino Nano, a compact microcontroller board based on the ATmega 328P, plays a critical role in our redundant safety system. Its small footprint and robust functionality make it an ideal choice for embedding within complex systems. In our setup, the Nano serves as a backup controller, working in conjunction with the Arduino Mega to ensure system reliability. Through seamless communication with the Mega, it provides an additional layer of monitoring and control, enhancing the safety and stability of the motorized wheelchair’s operation.

4.2.3. Cobot and Motorized Wheelchair

The integration of advanced technologies in motorized wheelchairs and cobots, such as the Niryo (Figure 10) open-source model, represents a significant advance in assistance for people with reduced mobility [27,28]. This integration not only improves the autonomy of users but also opens new possibilities in terms of control and adaptability.
Modern motorized wheelchairs (Figure 11), offer various control options to suit the individual needs of users. Control methods include joysticks, manual controls, and remote control, each with its own advantages in terms of accessibility and ease of use. For example, a study on “Modelling and control strategies for a motorized wheelchair with hybrid locomotion systems” highlights the importance of versatile control systems in motorized wheelchairs [29].
The proportional control of power wheelchairs (Figure 12) improves manoeuvrability and user comfort by adjusting the movement to the force applied, which is vital for a safe and pleasant driving experience. Studies, such as one on motorized wheelchair implementation for the handicapped, affirm the efficacy of proportional controls. Wheelchair access can be through parallel joystick control or directly through the motor driver, allowing movement control via digital or analogue setpoint signals. Collaborative robots (cobots) like the open-source Niryo model offer adaptability to various environments and tasks, with precise programming capabilities making them suitable for daily assistance and improving users’ quality of life [27]. Integrating cobots with motorized wheelchairs, such as the Niryo One, a six-axis robot designed for education, research, and industry, opens new avenues for assistance and autonomy. This integration allows cobots to perform tasks while the user navigates, enhancing efficiency and independence.
The Niryo One’s simple, efficient mechanical design, its ease of programming through Niryo Studio, and 3D Unity visualization cater to users across technical proficiencies, enabling immersive learning and operation understanding. Its precision and repeatability are pivotal for EEG signal processing and LSTM network classifications, requiring sensitivity and specificity in tasks involving brain signals. ROS integration expands its application to advanced robotics and BCI projects, making Niryo One an exemplary platform for BCI endeavors that demand seamless hardware–software integration. The combination of its technical capabilities, user-friendly software, and precision control positions Niryo One as an ideal tool for high-level BCI project integration, underscoring the potential of combining advanced control systems and collaborative robotics to enhance autonomy and assistance in BCI applications [30].
Despite advances, there are challenges in integrating these technologies, especially in terms of user interface and accessibility. However, studies such as “Design and Implementation of Hybrid BCI based Wheelchair” show how continuous innovation in wheelchair control can overcome these challenges [31].

4.3. Software Architecture

In this section, we address the integration of EEG signal processing with the robotic control system through ROS. We detail how brain signals are processed using machine learning algorithms and neural networks, enabling precise interpretation of user intentions. This crucial process ensures safe and effective commands for the dynamic control of robotic devices and motorized wheelchairs, highlighting technical innovations and challenges in the brain–computer interface and robotics, based on ROS by the Niryo robot, and other similar robots [27].
As a first approach, Figure 13 presents the diagram of the proposed software architecture, as well as the flow of signals and data necessary for the implementation of the proposed structure.
For this integration, the Jetson Nano device has been used, which allows all the requirements of this development to be met on a single device. Table 5 details each component of the process, its function, specific details, the type of output generated, and the necessary software or libraries.
Having identified the steps for processing the signals in Table 5, we have opted for the NVIDIA Jetson Nano device as our module for execution and data processing.
One of the main advantages of the Jetson Nano is its ability to perform deep learning inferences at the edge, meaning it can process data directly on the device without needing a connection to a central server. This is crucial in BCI applications, where low latency and real-time processing are essential for smooth and effective interaction. This is necessary in our research for the application of controlling wheelchairs or robotic prosthetics, given its capacity to process EEG signals and make real-time decisions, which is fundamental for the safety and efficacy of the system.
Another factor that influenced our choice of the Jetson Nano is its energy efficiency, making it suitable for portable and mobile applications. This is especially relevant in the context of low-cost EEG headbands like Neurosky or Brainlink, where portability and energy efficiency are key considerations. Combining these headbands with a device like the Jetson Nano can facilitate the implementation of BCI systems in non-laboratory environments, increasing their accessibility and applicability in real-life situations.

4.3.1. Real-Time Signal Preprocessing (Jetson Nano)

The Bluetooth connection between the EEG headband and the Jetson Nano initiates the preprocessing of EEG signals, a crucial step in implementing efficient BCIs. This process converts raw EEG signals into a format suitable for real-time analysis and classification, essential for the precision and effectiveness of brainwave classification.
In preprocessing, normalization standardizes the EEG data, while the Fourier transform and wavelet transforms decompose the signals into frequency components, which is crucial for identifying different cognitive and emotional states. Normalization enhances the robustness of the model, the Fast Fourier Transform exposes the spectral content efficiently, and the wavelet transform allows for precise temporal localization of frequencies.
Subasi (2021), in his article, highlights the importance of these techniques in BCIs [32]. Subsequently, the pre-processed EEG signals are classified by neural networks. CNNs and RNNs have proven effective for this, as per Roy et al. (2022) [33].
The integration of these preprocessing techniques with advanced neural networks promises to improve the accuracy and efficacy of BCI systems, opening new possibilities in human–computer interaction and assistance to individuals with motor disabilities.
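As an illustration of these steps, here is a minimal Python/NumPy sketch; the 512 Hz sampling rate and the exact band boundaries are assumptions based on typical TGAM-style configurations and may differ from the deployed system.

```python
# Minimal preprocessing sketch (assumption: `raw` is a 1-D NumPy array of
# raw EEG samples streamed from the headband at FS Hz).
import numpy as np

FS = 512  # assumed sampling rate of the raw TGAM output

def normalize(raw):
    """Z-score normalization to standardize the EEG window."""
    return (raw - raw.mean()) / (raw.std() + 1e-8)

def band_powers(raw, fs=FS):
    """FFT-based decomposition into the frequency bands used by the system."""
    spectrum = np.abs(np.fft.rfft(normalize(raw))) ** 2
    freqs = np.fft.rfftfreq(raw.size, d=1.0 / fs)
    bands = {"delta": (1, 3), "theta": (4, 7),
             "low_alpha": (8, 9), "high_alpha": (10, 13),
             "low_beta": (14, 17), "high_beta": (18, 30),
             "low_gamma": (31, 40), "mid_gamma": (41, 50)}  # assumed edges
    return {name: spectrum[(freqs >= lo) & (freqs <= hi)].sum()
            for name, (lo, hi) in bands.items()}

# Example: one second of (placeholder) raw signal -> eight band powers.
features = band_powers(np.random.randn(FS))
```

A wavelet decomposition (e.g., via PyWavelets) could replace the FFT where temporal localization of the frequency content is required.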

4.3.2. Classification

The classification of EEG signals, fundamental for brain–computer interfaces, has made significant progress with the incorporation of deep learning and machine learning techniques. Advanced neural networks like CNNs and LSTMs have enhanced efficiency and precision in EEG classification [34,35]. One of the main innovations of our proposed EEG-based control architecture is the integration of a Long Short-Term Memory (LSTM) network to decode brain activity signals in real time. As LSTMs possess innate capabilities to model temporal sequences and learn long-range contextual patterns, they are ideally suited to handle the dynamic and non-linear relationships of EEG data streams.
Once trained, the LSTM model is integrated into our architecture for classifying EEG segments in real time and mapping predictions to wheelchair commands. As users generate distinctive brain patterns via motor imagery, the LSTM classifier assigns probability scores for each control label. On passing preset confidence thresholds, the categorized intents are programmed to trigger corresponding wheelchair movements (e.g., turning left/right, forward/back). This allows intuitive, adaptive wheelchair navigation based solely on decoded brain activity. Moreover, the adaptation of deep transfer networks has shown advantages over traditional methods [36], while feature selection based on minimum entropy has improved performance [37]. The optimization of deep learning models using evolutionary algorithms has increased energy efficiency [34]. Finally, the implementation of simplified models, such as a version of GoogLeNet, has been effective in EEG signal classification [38], highlighting the advancement and practical application of these technologies in BCIs.
Next, we propose a list of possible labels that the LSTM network could emit and that ROS could interpret to make the system operational; a parsing sketch follows below.
WCL, WCR, WCB, and WCF, followed by a value between 0 and 50. The strings correspond to the acronyms for Wheelchair Left (WCL), Wheelchair Right (WCR), and so on for backward and forward.
Following the same mechanism, we can do the same with the robot, assigning the letter R for Robot and J to identify the moving axis, plus a value from 0 to 360 degrees to determine the movement.
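The sketch below illustrates how such labels could be parsed on the receiving side; the concrete string layout (e.g., “WCL25” or “RJ3120”) is a hypothetical instantiation of the scheme just described.

```python
# Illustrative parser for the proposed label scheme (format is assumed).
def decode_label(label: str):
    """Parse an LSTM output label into (device, command, value)."""
    if label.startswith("WC"):            # wheelchair: WCL/WCR/WCB/WCF
        return ("wheelchair", label[:3], int(label[3:]))  # value 0-50
    if label.startswith("RJ"):            # robot: axes J1..J6
        return ("robot", "J" + label[2], int(label[3:]))  # angle 0-360
    raise ValueError("Unknown label: " + label)

# decode_label("WCL25")  -> ("wheelchair", "WCL", 25)
# decode_label("RJ3120") -> ("robot", "J3", 120)
```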

4.3.3. LSTM Description

LSTM networks, a subclass of recurrent neural networks (Figure 14), are pivotal in our project for their exceptional ability to process and remember information over extended periods, making them ideal for handling the sequential and temporal nature of EEG signals. Unlike traditional neural networks, LSTMs are designed to avoid the long-term dependency problem, enabling them to remember inputs for long durations with their unique architecture comprising three gates: input, output, and forget gates [39].
Figure 14 shows the structure of a cell unit in the LSTM network. The input gate controls the extent to which a new value flows into the cell, the forget gate regulates the retention of previous information, and the output gate determines the value to be outputted based on input and the memory of the cell. This gating mechanism allows LSTMs to make precise decisions about retaining or discarding information, making them highly effective for tasks requiring the understanding of context over time, such as EEG signal analysis in our BCI project.
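For reference, the standard LSTM cell equations implementing this gating mechanism, as commonly defined in the literature, are:

```latex
\begin{aligned}
f_t &= \sigma\left(W_f x_t + U_f h_{t-1} + b_f\right) \quad &\text{(forget gate)}\\
i_t &= \sigma\left(W_i x_t + U_i h_{t-1} + b_i\right) \quad &\text{(input gate)}\\
o_t &= \sigma\left(W_o x_t + U_o h_{t-1} + b_o\right) \quad &\text{(output gate)}\\
c_t &= f_t \odot c_{t-1} + i_t \odot \tanh\left(W_c x_t + U_c h_{t-1} + b_c\right)\\
h_t &= o_t \odot \tanh\left(c_t\right)
\end{aligned}
```

where σ(·) is the logistic sigmoid, ⊙ denotes elementwise multiplication, x_t is the input vector (here, the EEG band values at time t), and h_t and c_t are the hidden and cell states.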
In the context of our work, LSTMs are employed to analyse EEG signals for pattern recognition and classification, which is crucial for translating neural activity into commands for assistive technologies. The ability of LSTMs to learn from the temporal sequence of EEG data, recognizing patterns associated with specific neural activities or intentions, underpins the system’s capability to provide accurate and responsive control to the end users. This makes LSTM networks an integral component of our project, bridging the gap between raw EEG signals and actionable outputs in assistive devices, thereby enhancing the interaction between humans and machines in a seamless, intuitive manner. Their integration within our BCI framework not only exemplifies the cutting-edge neural processing technologies but also solidifies the foundation for future advancements in the field [40].

4.3.4. ROS

ROS is a versatile framework that acts like an operating system for robotics, offering hardware abstraction, device control, functionality implementation, inter-process communication, and package management. Its modularity is essential, allowing robots to be built from reusable software components. ROS 2 further advances robotics by focusing on intelligent systems and compatibility with modern technologies.
Integration with hardware such as Arduino is simplified by rosserial, which combines the hardware control of Arduino with the high-level functions of ROS. The programming of nodes in ROS, which communicate to perform computations, is vital and emphasizes publish/subscribe mechanisms, services, and actions for robust robotic systems. The ROS environment favours AI algorithms for advanced automation and intelligent robotics. The use of a Jetson Nano for AI, together with an Arduino with TMC controllers, exemplifies the merging of AI with precise motor control for autonomous robots. This innovative integration of ROS, Arduino, and AI is fundamental to the advancement of intelligent robotics [41].
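As a sketch of how the classification stage could hand its labels to the rest of the system through ROS, consider the following publisher node; the node name, the /eeg/label topic, and the classify_window helper are hypothetical placeholders.

```python
#!/usr/bin/env python
# Hypothetical publisher node: forwards LSTM predictions as ROS messages.
import rospy
from std_msgs.msg import String

def classify_window():
    """Placeholder for the trained LSTM inference step; returns a label
    string such as 'WCF30', or None when below the confidence threshold."""
    return None

def main():
    rospy.init_node("eeg_classifier")
    pub = rospy.Publisher("/eeg/label", String, queue_size=10)
    rate = rospy.Rate(1)  # one prediction per second (t <= 1 s requirement)
    while not rospy.is_shutdown():
        label = classify_window()
        if label is not None:  # publish only confident predictions
            pub.publish(String(label))
        rate.sleep()

if __name__ == "__main__":
    main()
```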

4.3.5. ROS Nodes on Jetson Nano

Next, in Table 6, we outline the necessary implementation in ROS, focusing on nodes, topics, and basic functions for controlling the robotic arm and wheelchair.
These components and configurations form a comprehensive guide to implementing the system on the Jetson Nano, utilizing ROS for EEG signal processing, control of the robotic arm, and wheelchair navigation [42].
Controlling an electric wheelchair involves using a propulsion and steering system, which can be controlled via electrical signals. In the context of ROS, a node could be developed that subscribes to a specific topic where LSTM network labels are published. This node would interpret the received labels and generate the appropriate control signals for the GPIO terminals that control the wheelchair. The detailed implementation of this node will depend on the specific configuration of the wheelchair and how it is electrically controlled. For example, if the wheelchair uses DC (Direct Current) motors for propulsion, the node will need to generate the appropriate PWM (Pulse Width Modulation) signals to control the speed and direction of the motors. If the wheelchair uses a joystick for control, the node will need to map the LSTM network labels to joystick movements and generate the corresponding electrical signals.
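A minimal sketch of such a node follows, assuming Python with rospy and the Jetson.GPIO library; the topic name, the pin numbers, and the label-to-duty-cycle mapping are illustrative assumptions that would depend on the actual wheelchair electronics.

```python
#!/usr/bin/env python
# Hypothetical subscriber node: maps LSTM labels to wheelchair motor PWM.
import rospy
from std_msgs.msg import String
import Jetson.GPIO as GPIO

LEFT_PIN, RIGHT_PIN = 32, 33  # assumed PWM-capable header pins

def on_label(msg):
    """Translate a label such as 'WCF30' into left/right duty cycles."""
    cmd, value = msg.data[:3], int(msg.data[3:] or 0)
    duty = min(100, value * 2)  # scale the 0-50 label value to 0-100%
    if cmd == "WCF":            # forward: drive both motors
        left.ChangeDutyCycle(duty); right.ChangeDutyCycle(duty)
    elif cmd == "WCL":          # turn left: right motor only
        left.ChangeDutyCycle(0); right.ChangeDutyCycle(duty)
    elif cmd == "WCR":          # turn right: left motor only
        left.ChangeDutyCycle(duty); right.ChangeDutyCycle(0)
    else:                       # WCB or unknown label: stop (safe default)
        left.ChangeDutyCycle(0); right.ChangeDutyCycle(0)

if __name__ == "__main__":
    GPIO.setmode(GPIO.BOARD)
    GPIO.setup([LEFT_PIN, RIGHT_PIN], GPIO.OUT)
    left = GPIO.PWM(LEFT_PIN, 100)   # 100 Hz PWM carrier, assumed
    right = GPIO.PWM(RIGHT_PIN, 100)
    left.start(0); right.start(0)
    rospy.init_node("wheelchair_controller")
    rospy.Subscriber("/eeg/label", String, on_label)
    rospy.spin()
```

A backward (WCB) command is deliberately mapped to a stop in this sketch, since reversing would additionally require direction-control outputs specific to the motor driver.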

4.3.6. Coding Orders

The development of an advanced robotic application that integrates EEG data capture, interpretation, and prediction using AI on a Jetson Nano platform, using ROS to control a motorized wheelchair in conjunction with a six-degrees-of-freedom cobot and an Arduino Mega as a microcontroller, is a complex process involving multiple stages and components.
The final architecture, which meets all the requirements defined in this research, is shown in Table 7. This table relates the selected physical components with the input and output signals between the devices and the necessary software for their implementation.
Having identified both the functional and technical requirements, as well as the necessary hardware and software devices to meet the requirements and define the proposed architecture in detail, we will proceed to elaborate on each part. To make sense of this definition of methods and methodology, we will follow the timeline of the process and, thus, the data flow.
In summary, this proposed architecture integrates cutting-edge technologies in BCIs, machine learning, robotics, and hardware control to create a solution that can significantly improve the quality of life for patients with neurodegenerative diseases. The key to the success of this project lies in the effective integration of these technologies and in the thorough validation of their functionality and safety, in addition to a substantial cost reduction, focusing on the use of low-cost and/or open-source devices, democratizing this technology.

5. Results

In this section, we present a theoretical examination of the proposed method in comparison with established baselines. It is imperative to note that the scope of this study encompasses theoretical analyses, eschewing empirical evaluations of the proposed system. Such empirical assessments are earmarked for subsequent investigations. This delineation ensures a clear understanding of our current findings, which are intended to illustrate potential improvements and theoretical advancements over existing frameworks. Our aim is to lay a robust groundwork for future empirical studies, thereby contributing both to the academic discourse and to practical applications in the field.

5.1. Comparison with Other Works

After a comparison of the proposed architecture with nine similar developments, in which both the objectives and the way the research was carried out were evaluated, the results are conclusive, especially in the achievement of the technical requirements.
These works were selected after a search of references in Scopus and Google Scholar, filtered by terms related to this research. The combination of the keywords “EEG”, “ML”, “ROS”, and “wheelchair” yielded the following studies, close to ours in concept and format, which were used for comparison.
  • Ref 1.—SSVEP-Based BCI Wheelchair Control System [43].
  • Ref 2.—Brain-Computer Interface Controlled Robotic Gait Orthosis [44].
  • Ref 3.—Wheelchair Automation by a Hybrid BCI System Using SSVEP and Eye Blinks [45].
  • Ref 4.—EEG-Based BCIs: A Survey [46].
  • Ref 5.—EEG Wheelchair for People of Determination [47].
  • Ref 6.—BCI-Controlled Hands-Free Wheelchair Navigation with Obstacle Avoidance [48].
  • Ref 7.—A Literature Review on the Smart Wheelchair Systems [49].
  • Ref 8.—A Real-Time Control Approach for Unmanned Aerial Vehicles Using Brain-Computer Interface [50].
  • Ref 9.—Real-Time Brain Machine Interaction via Social Robot Gesture Control [51].
The spider chart provided in Figure 15 presents a comparison of functional requirements across a proposed solution and eight references. Notably, both the proposal and reference 3 exhibit remarkable alignment across all the metrics, suggesting a close or identical prioritization of functional aspects. These encompass functionality and integration, control and operability, adaptability and usability, as well as security and stability, and finally, portability and compatibility. In these domains, the proposal and reference 3 reach parallel scores, implying that they may share similar design philosophies or operational objectives.
Contrastingly, the other references display varied degrees of divergence from the proposed solution, with each reference presenting a unique profile of strengths and weaknesses. This diversity in scoring indicates differing emphases on the functional requirements, which may stem from alternative strategic focuses or target user needs. Such discrepancies underscore the necessity for a thorough analysis when benchmarking against multiple frameworks to ensure that the chosen reference aligns with the specific goals and constraints of the project at hand.
In Figure 16, the spider chart delineates a comparative assessment of technical requirements for a proposal against a suite of references. It becomes apparent that the proposal is particularly strong in regulatory compliance, suggesting a conscientious alignment with relevant laws and standards. However, in other technical domains, such as efficiency and performance, usability and accessibility, and autonomy and economic sustainability, the proposal does not reach the benchmark set by some references. This is indicative of a potential trade-off that has been made in favour of compliance over other technical merits.
Differences among the references themselves are also noticeable, with some excelling in areas where others are less capable. Reference 7, for example, shows substantial scores in integration and compatibility, indicating a focus on harmonious system incorporation. In contrast, other references, such as Ref 1 and Ref 9, seem to prioritize efficiency and performance more highly. These variances highlight the diversity of technical strategies and priorities that exist within the field and the importance of carefully selecting a reference that aligns with the specific technical aspirations of a project.
As can be seen in Table 8 and Table 9, as well as in the diagrams in Figure 15 and Figure 16, some of the papers come close on the functional requirements. Specifically, the work referenced as nº 3 could be considered to meet the functional parameters proposed by our architecture; however, this same paper, Ref. 3, falls substantially short of the technical objectives. We can therefore conclude that none of the compared works meets the full set of specifications and requirements that our architecture proposes as a solution.

5.2. Analysis of the Requirement Fulfillment

Our architecture meets all the functional and technical requirements detailed in Table 1 and Table 2 of this publication, as shown in the checklists of Table 8 and Table 9. Table 10 shows the degree of compliance with the functional requirements and the justification for compliance.
Appendix B details the comparative results on the fulfillment of the functional and technical requirements with respect to other works.
Appendix C details each of the requirements and their degree of compliance in relation to the functional issues.
Next, Table 11 verifies that compliance with the technical requirements is also achieved by the proposed architecture.
Appendix D details each of these requirements and their degree of compliance in relation to technical issues.

5.3. Economic Viability of the Proposed Architecture

An economic and financial analysis was carried out, considering a scenario of device sales, with the results shown in the following figures and tables.
Figure 17 and Figure 18 show the cost results for the two options evaluated.
The NPV, IRR, and ROI were calculated for the sales scenario with an initial investment of EUR 6000, necessary for the first year of activity, assuming a rate of return of 10%.
In Table 12, you can see the results of the analysis during the first 5 years, which are fundamental for the survival of the business model.
Figure 19 shows the evolution of annual cash flow for each of the 5 years with the updated initial investment of EUR 6000.
The annual cash flow reflects the net result after subtracting variable costs from the revenue generated by unit sales each year. The initial investment appears in Year 0 as a negative cash flow, followed by a progressive increase in annual cash flow as sales grow.
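To make these financial metrics concrete, the following minimal Python sketch computes the NPV, IRR, and ROI for a cash flow series. Only the EUR 6000 initial investment and the 10% rate of return come from the analysis above; the annual net flows shown are hypothetical placeholders, not the values reported in Table 12:

import numpy_financial as npf

RATE = 0.10  # required rate of return used in the analysis
# Year 0 investment (EUR 6000, from the text) followed by hypothetical annual net flows
flows = [-6000, 1200, 2400, 3600, 4800, 6000]

npv = npf.npv(RATE, flows)                          # discounted sum of all flows, t = 0..5
irr = npf.irr(flows)                                # rate that makes the NPV zero
roi = (sum(flows[1:]) + flows[0]) / abs(flows[0])   # net gain over the initial outlay
print(f"NPV = {npv:,.2f} EUR, IRR = {irr:.1%}, ROI = {roi:.1%}")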
Moving beyond numbers, the intangible benefits of BCI technology are equally compelling. The independence it affords to individuals with physical disabilities is immeasurable, enhancing their quality of life and autonomy. A BCI system provides constant support, promoting self-reliance and emotional well-being, an advantage that cannot be matched by human assistants. Although the upfront cost is substantial, potential grants and funding opportunities can alleviate this financial challenge. In sum, the long-term economic and emotional gains from a BCI system make a strong case for its adoption, transcending mere financial metrics and profoundly impacting users’ lives.

5.4. Legal and Technical Viability

To assess the legal and technical viability of a BCI project with applications in robotics and assisted mobility, it is essential to carefully consider the following aspects.
  • Intellectual Property Protection:
Patentability of the development is not anticipated, nor is any other type of intellectual property protection at the moment. After reviewing existing patents, no similar models have been found, thus avoiding potential litigation for intellectual property infringement.
Possible patents and/or utility models were reviewed in the European Patent Office (EPO) and, specifically, in PATENTSCOPE (Worldwide Inventions), among the available web patent search engines, using the following search parameters [52,53]:
  • Search 1: “Assistive cobot in an electric wheelchair”; 0 results.
  • Search 2: “Assistive cobot in a motorised wheelchair”; 0 results.
  • Search 3: “Motorised wheelchair with integrated robot”; 33 results, none similar.
Figure 20, Figure 21 and Figure 22 show the result of the search of the patents found in the PATENTSCOPE browser.

6. Discussion

The use of low-cost commercial EEG headbands like Neurosky or Brainlink represents a significant advancement in the accessibility of BCI technology. These devices, although limited in terms of the number of electrodes and precision compared to more advanced EEG systems, offer a viable solution for applications where portability and cost are critical factors. The ability of these headbands to capture a range of brain signals, including Delta, Theta, Alpha, Beta, and Gamma waves, as well as attention and meditation metrics, makes them suitable for a variety of BCI applications, from device control to monitoring mental well-being.
Real-time preprocessing and classification of EEG signals are fundamental to the effectiveness of BCI systems. Techniques such as normalization, Fourier transformation, and wavelet transformation are essential for filtering noise and extracting significant features from EEG signals. These methods allow for more accurate and efficient classification, crucial for real-time applications like the control of motorized wheelchairs and cobots. Research in this field has demonstrated various strategies to improve accuracy and processing speed, vital for the practical implementation of these technologies.
The integration of systems such as Arduino Mega and Jetson Nano in BCI applications presents unique opportunities for device control. Arduino Mega, with its ability to handle multiple inputs and outputs, is ideal for controlling the mechanical aspects of wheelchairs and cobots. On the other hand, Jetson Nano, with its powerful data processing and machine learning capabilities, is suitable for real-time analysis of EEG signals. This combination of hardware allows for a smooth and efficient interaction between the user and the device, opening new possibilities in assistance for people with disabilities and automation.
In summary, research in the field of brain–computer interfaces using low-cost EEG headbands and their integration with control systems like Arduino and Jetson Nano shows great potential. Although there are limitations in terms of precision and processing capacity, advances in signal processing and classification techniques are opening new avenues for practical and accessible applications in this field.

7. Conclusions and Future Work Lines

Research in the field of BCIs using low-cost EEG headbands, such as Neurosky or Brainlink, focused on reading brain signals from the prefrontal area, has proven to be promising for a variety of practical applications. These devices, accessible, portable, and non-invasive, offer a viable solution for capturing brain signals, including a range of brain waves and attention and meditation metrics, calculated thanks to their own algorithm. Despite their limitations compared to more advanced EEG systems, these headbands are sufficient for many BCI applications, especially in contexts where portability, ease of use, and cost are critical considerations.
Real-time preprocessing and classification of EEG signals are crucial aspects of the effectiveness of BCI systems. Brain signal processing techniques, such as normalization, Fourier transformation, and wavelet transformation, play an essential role in improving the accuracy and efficiency of classification. These methods extract significant features from EEG signals, which is fundamental for real-time applications, considering that the filtering stage is carried out in the EEG headband itself thanks to its TGAM device. Research in this field has provided valuable insights on how to improve the speed and accuracy of signal processing, which is vital for the practical implementation of BCI technologies.
The integration of systems like Arduino Mega and Jetson Nano in BCI applications has opened new possibilities for device control. While Arduino Mega is ideal for handling the mechanical aspects of devices like motorized wheelchairs and cobots, Jetson Nano, with its data processing and machine learning capabilities, is suitable for real-time analysis of EEG signals and management of the control environment through ROS and ROS-Neuro. This combination of hardware facilitates a smooth and efficient interaction between the user and the device, which is crucial for practical applications in assistance to people with disabilities and in the automation process.
Regarding the control of motorized wheelchairs, research has shown that the integration of BCI systems with proportional controls demonstrates the ability to control a wheelchair using brain signals, opening new possibilities for people with severe motor disabilities and offering greater independence and quality of life.
After this research, we can affirm that the use of cobots, such as the open-source model from Niryo, controlled by open-source ROS platforms (ROS-Neuro) in combination with EEG/BCI systems, is theoretically demonstrated to be a promising area of application. These cobots, designed to work alongside humans, can be controlled by brain signals to perform complex tasks, which has significant implications in industry and, especially, in personal assistance.
In conclusion, research in the field of brain–computer interfaces using low-cost EEG headbands and their integration with control systems like Arduino and Jetson Nano shows great potential for practical and accessible applications. These developments, as shown in this publication, not only improve the quality of life of people with neurodegenerative diseases but also offer new opportunities in automation and the control of physical devices.
While the current research has focused on mobility assistance and robotic control, future work could explore other potential applications of low-cost BCI systems. This includes areas such as mental health monitoring, neurofeedback therapy, and integration with virtual and augmented reality for various educational and entertainment purposes.
By focusing on this area, the field of brain–computer interfaces can continue to evolve, offering increasingly effective and user-friendly solutions that enhance the lives of individuals, particularly those with disabilities, and pave the way for innovative applications across various sectors.

Author Contributions

Conceptualization, F.R.; methodology, J.E.S. and J.M.C.; formal analysis, F.R.; research, F.R., J.E.S. and J.M.C.; data preservation, F.R.; data, F.R.; writing the original draft, F.R.; writing, revising and editing, J.E.S. and J.M.C.; visualization, J.E.S.; supervision, J.E.S. and J.M.C.; project administration, F.R. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Statement of the Bioethics Committee

The study has been conducted in accordance with ethical standards and, in particular, with the provisions of the Declaration of Helsinki and other applicable regulations, as well as with the provisions of the Universal Declaration on Bioethics and Human Rights (UNESCO, 2005), the General Data Protection Regulation of the European Union (Regulation 2016/679) and Organic Law 3/2018, of 5 December, on the Protection of Personal Data and Guarantee of Digital Rights and approved by the Bioethics Committee of the University of Burgos, U05100001, Burgos (Spain). Approval Code: REGAGE24s00009305342 of Date of approval: 5 February 2024.

Informed Consent Statement

Informed consent was obtained from all volunteers participating in the study.

Data Availability Statement

The data presented in this study are available on request from the corresponding author.

Conflicts of Interest

The authors declare no conflicts of interest.

Abbreviations

The following abbreviations are used in this manuscript:
BCI: Brain–Computer Interface
hBCI: Hybrid Brain–Computer Interface
TGAM: ThinkGear ASIC Module
MI: Motor Imagery
SSVEP: Steady-State Visually Evoked Potential
EEG: Electroencephalography
EMG: Electromyography
EOG: Electrooculography
AI: Artificial Intelligence
CNNs: Convolutional Neural Networks
RNNs: Recurrent Neural Networks
LSTM: Long Short-Term Memory
PID: Proportional–Integral–Derivative
GPUs: Graphics Processing Units
GPIO: General-Purpose Input/Output
STL: Standard Triangle Language
ISO: International Organization for Standardization

Appendix A. Functional and Technical Requirements

In this appendix, you can find the details and additional data associated with the description of the functional and technical requirements identified in the proposed architecture and their corresponding justification.

Appendix A.1. Functional Requirements

  • EEG signal capture
Using high-quality EEG headsets ensures a frustration-free experience where the user’s intentions are correctly translated into actions. This is crucial in terms of ergonomics, as it reduces the need for extra effort or repeated attempts to perform a task, improving user comfort and efficiency. In addition, specialized and comfortable hardware minimizes discomfort during prolonged use, which is especially important for users with disabilities who may have additional sensitivities or limitations in mobility.
  • Real-time interpretation of EEG signals
Real-time interpretation of EEG signals is vital for a smooth and responsive user experience. By minimizing latency in the system’s response, users feel they have immediate and natural control, similar to habitual body movements. This not only increases user confidence in the system but also enhances intuitive interaction, reducing cognitive load and improving system ergonomics. Powerful hardware and advanced software enable this rapid interpretation to make the system more user-friendly and accessible, even for users with limited technological skills.
  • Intuitive control of the robotic arm and wheelchair
Intuitive control of the robotic arm and wheelchair is essential for users with limitations to be able to operate the system independently and with confidence. A simplified user interface, designed with ergonomics in mind, facilitates adaptation and learning of the system, making it accessible even for users with little experience in technology. Integration with commercial motorized wheelchair joysticks offers a familiar control option, which can significantly improve comfort and reduce the stress of learning a new system.
  • Safety and stability of movement
Safety and stability in movement are critical aspects for the user’s peace of mind and confidence. Implementing control systems and safety sensors, such as LiDAR, provides an additional layer of protection, ensuring that the environment is safe to operate in. This preventive approach is essential to avoid accidents, which is of utmost importance for users who may have limited mobility or reduced capacity to react to dangerous situations. The ergonomics of the safety design ensures that users can enjoy an experience without fear of injury or damage.
  • Adaptability to Different Users
The adaptability to different users ensures that the system is inclusive and effective for a wide range of needs and preferences. The system must be usable by patients with neurodegenerative diseases, reduced mobility, individuals who have suffered spinal cord injuries, and even those in postoperative stages. This adaptability will be achieved through machine learning algorithms that adjust to the individual characteristics of each user, allowing for personalization of the system and enhancing the experience and ergonomics. This means that the system is not only easier to use but also adapts to the specific ergonomic needs of each individual, resulting in a more comfortable and effective experience. Precise and personalized adaptation is key to ensuring that all users, regardless of their abilities or limitations, can interact with the system effectively and comfortably.
  • Control of the Wheelchair’s Movements
The chair must perform at least the four basic movements, i.e., the preset movements in the four basic directions, as well as combinations of these basic movements. These directions can be associated with specific values of the Alpha signal, as seen in the study, where controlled variations of the Alpha signal allow a user to select the direction they wish to move. In our case, the selection of the wheelchair's movement takes more than one signal as a reference, including the attention and meditation values produced by the calculation algorithm running on the headband itself [12].
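As a purely illustrative sketch of this multi-signal selection, the following Python function gates the direction predicted by the classifier with the attention and meditation metrics; the threshold values and names are assumptions for the example, not the system's calibrated parameters:

def gate_command(attention, meditation, predicted_command,
                 att_min=60, med_max=40):
    # Illustrative gating rule (thresholds assumed): forward the classifier's
    # direction only when the metrics indicate deliberate engagement
    if attention >= att_min and meditation <= med_max:
        return predicted_command
    return "stop"

A rule of this kind also serves as a safety layer, since a distracted or relaxed user defaults to a stop command rather than an unintended movement.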
  • Integration of the System with Commercial Chairs
The system must be able to integrate with commercial chairs, allowing future users to adapt the motorized chair they have. The adaptation must be possible regardless of the brand and model of the chair, always seeking maximum compatibility with the existing chairs and control forms in the market.
  • Portability and Low Weight (<100 g)
Portability and low weight are necessary conditions to improve the usability of the system. The headband or headbands must be light enough and comfortable for the user to sustain continuous use for at least 3 h, as indicated in the subsequent technical requirements. The maximum bearable weight has been set at 100 g, in line with the weights observed in other wearables, which have approximate maximum values of 100 g.
  • EEG/ECG Signal Quality Analysis
To verify the correct functioning of the device, the headband must provide its own signal-quality indicator. This way, it can even determine whether or not the user is wearing the headband. This value confirms that the system is in use and avoids unwanted signals and, therefore, unwanted or involuntary movements, creating an extra layer of safety.
  • Commercial Wireless Headbands
Maintaining the same philosophy in usage ergonomics, it is proposed to use commercial devices that send the collected data of brain activity wirelessly, facilitating their portability and interconnection with other devices without the need for cables that hinder the user experience.
  • Preset Routines in the Actions of the Robot and Robot + Chair
Given the anticipated need for repeated actions over time, which we might consider typical actions, it becomes necessary to predefine certain types of actions that facilitate the control and adaptation of the user to the new tool.

Appendix A.2. Technical Requirements

  • Effective Integration of BCI with Chair + Robot
The effective integration of BCIs with wheelchair and robotic arm systems is crucial for smooth and accurate communication between the user and the device. This is achieved through standard communication protocols such as Bluetooth or Wi-Fi for wireless communication and serial communication for bidirectional communication between control boards, adapting to the latency and range requirements of each device. Software compatibility is also essential and might involve the use of open APIs (Application Programming Interfaces) or middleware software that allows the integration of various devices and platforms. For instance, the use of ROS is ideal for the integration of different hardware and software modules, providing a flexible and scalable platform for robotic device control.
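As an illustration of the serial leg of this integration, the following minimal rospy sketch forwards BCI commands to the Arduino Mega over a serial link; the port name, baud rate, topic name, and one-line ASCII protocol are assumptions made for the example, not the project's actual configuration:

import rospy
import serial
from std_msgs.msg import String

# Assumed port and protocol for the Arduino Mega that drives the motors
link = serial.Serial("/dev/ttyACM0", 115200, timeout=0.1)

def forward_command(msg):
    # Send one command per line, e.g., "forward" or "stop"
    link.write((msg.data + "\n").encode("ascii"))

rospy.init_node("arduino_bridge")
rospy.Subscriber("/bci/command", String, forward_command)
rospy.spin()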
  • Efficient EEG Data Processing (t ≤ 1 s)
To minimize latency and maximize accuracy in EEG data processing, it is crucial to have optimized hardware. Platforms like NVIDIA Jetson or Raspberry Pi equipped with powerful processors and GPUs (Graphics Processing Units) may be suitable for this purpose. These devices can run complex signal processing and machine learning algorithms, such as CNNs, in real time. Optimizing power consumption is also a crucial aspect, which can be achieved through efficient programming techniques and the selection of low-power components.
  • Real-time Interpretation of EEG Signals
For effective real-time interpretation, high-performance processors, and GPUs capable of executing sophisticated algorithms, like CNNs, which can rapidly identify patterns in EEG data, are used. An example of this could be the use of platforms like NVIDIA Jetson, offering AI processing capabilities in compact, energy-efficient devices. Additionally, selecting appropriate algorithms is crucial; for instance, real-time machine learning algorithms must be optimized to ensure there are no significant delays in signal interpretation.
A significant technical challenge is to minimize noise and interference in the EEG signal, which can be achieved through digital filtering techniques and signal amplification. System calibration for each user is also essential, involving adjusting the algorithms to synchronize correctly with the individual brain signals of the user. This requires a machine learning process that can adapt to variations in EEG signals between different users.
In terms of software, operating systems and development frameworks that can efficiently handle real-time data input and facilitate integration with other system components, such as user interfaces and device control systems, are needed. Here, technologies like ROS can play a crucial role, providing a platform for integrating signal processing with real-time control actions.
In summary, real-time interpretation of EEG signals is an advanced technical requirement that combines specialized hardware, sophisticated algorithms, and integrated software design. Its success depends on the accuracy, speed, and adaptability of the system, factors that are essential for providing a smooth and effective user experience, especially for those who rely on these technologies to interact with the world around them.
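To illustrate the digital filtering step mentioned above, the following sketch applies a zero-phase Butterworth band-pass with SciPy; the 512 Hz rate corresponds to the TGAM's raw output, while the band limits, filter order, and function name are illustrative choices rather than the system's fixed parameters:

import numpy as np
from scipy.signal import butter, filtfilt

FS = 512.0  # Hz; raw sampling rate of the TGAM module

def bandpass(raw_eeg, low_hz=1.0, high_hz=50.0, order=4):
    # Zero-phase Butterworth band-pass: suppresses slow drift and line noise
    nyq = FS / 2.0
    b, a = butter(order, [low_hz / nyq, high_hz / nyq], btype="band")
    return filtfilt(b, a, raw_eeg)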
  • Robust Control System for Robotic Arm and Wheelchair
To ensure precise and safe movements, a robust control system comprising advanced algorithms and reliable hardware components is required. Predictive control algorithms, such as PID (Proportional–Integral–Derivative) controllers, can be used for finely tuning the movements of the robotic arm and wheelchair. Moreover, implementing predefined trajectories for common operations can simplify user interaction and enhance system efficiency. Regarding hardware, high-precision stepper motors and sensors like gyroscopes and accelerometers can be used to improve the precision and stability of movement.
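A minimal discrete PID controller of the kind described here can be sketched as follows; the gains and the wheel-speed usage example are illustrative values, not tuned parameters for the actual platform:

class PID:
    """Discrete PID controller, e.g., for a wheel-speed loop."""
    def __init__(self, kp, ki, kd):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_error = None

    def update(self, setpoint, measured, dt):
        error = setpoint - measured
        self.integral += error * dt
        derivative = 0.0 if self.prev_error is None else (error - self.prev_error) / dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

pid = PID(kp=1.2, ki=0.1, kd=0.05)                           # illustrative gains
effort = pid.update(setpoint=0.30, measured=0.25, dt=0.05)   # m/s error -> motor effort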
  • Accessible and User-Friendly Interface
Developing an accessible and user-friendly interface is crucial for the user experience. This could involve using advanced graphical interface technologies and comprehensive usability testing. The interfaces should be intuitive and tailored to the user’s needs, possibly incorporating elements of universal design to ensure accessibility. For example, the use of large icons and contrasting colours can aid users with limited vision. Additionally, implementing tactile or auditory feedback can enhance interaction for users with different capabilities, providing a valuable UX (User Experience) measure to adjust the overall functioning of the system.
  • User Feedback System
An effective feedback system is essential for keeping the user informed about the system’s status. This can be achieved through visual or auditory feedback devices, such as LCD screens or speakers. The information provided should be clear and easily understandable, using visual or auditory indicators to alert about the system status or potential errors. This improves the user experience and contributes to the safety and efficiency of the system.
  • Integration with Commercial Motorized Wheelchairs
Compatibility with existing motorized wheelchair systems is an important aspect of ensuring the broad applicability of the system. This may include the development of adapters and interface software that allow integration with commercial joysticks. Mechanical and electrical integration of the system must also be considered.
  • Usage Autonomy (≥3 h)
One of the key points in terms of the system’s usability is its energy aspect; hence, a minimum target usage time of 3 h has been set as the energy threshold for the system to be viable, both for the headband and the wheelchair, which will be the limiting aspects of the system.
  • Multi-platform Headband Support, Windows, Apple, Linux, Android
Connectivity between systems must be ensured, given the need for integration of different elements with various architectures and connectivity modes. We must have multi-platform systems. In our case, we focus on the EEG headband, as it is the key element in data collection, which is fundamental for the operation of our development. Additionally, the ability to connect headbands with mobile devices allows for extra verification in signal collection.
  • Low Integration Cost (<EUR 1500)
The cost of the system, excluding the motorized chair, which in many cases the user already possesses for daily use, should allow for integration at a reasonable cost, understood as a cost not exceeding EUR 1500, including all necessary devices: headband, controller, and robot, as well as integration into the chair and possible modifications for this integration. In the case of needing a specific motorized chair, it will depend on the necessary needs and usage characteristics, not evaluating this part due to the numerous varieties of commercial chairs available.
  • Compliance with Applicable Regulations
All current regulations applicable to this type of device must be complied with. As these are elements used by people, the regulations applicable must be followed, not only because of the type of device but also due to its use and target audience.

Appendix B. Comparative Result on the Fulfillment of Functional and Technical Requirements with Other Works

Appendix B.1. Functional Requirements

Compliance with the functional requirements is then determined for each identified reference.
Functional requirements, compared for each reference (Ref 1 to Ref 9):
FR 1 (Functionality and Integration):
  • Ref 1. Compliant: Used SSVEP to capture EEG signals.
  • Ref 2. Compliant: Records EEG during motor imagery tasks.
  • Ref 3. Compliant: Combines SSVEP and flashing for signal acquisition.
  • Ref 4. Compliant: Reviews EEG signal sensing technologies.
  • Ref 5. Compliant: Employs an EEG helmet to capture brain signals.
  • Ref 6. Compliant: Utilizes BCI to interpret and convert brain signals into motion commands.
  • Ref 7. Compliant: Analyzes various BCI systems for signal acquisition.
  • Ref 8. Compliant: Develops a real-time control approach based on BCI.
  • Ref 9. Compliant: Gathers real-time EEG brain waves to control a robot.
FR 2 (Functionality and Integration):
  • Ref 1. Compliant: Processes EEG signals in real time to control the chair.
  • Ref 2. Compliant: EEG prediction model for real-time BCI operation.
  • Ref 3. Compliant: Real-time processing for navigation commands.
  • Ref 4. Compliant: Discusses approaches for real-time data streaming.
  • Ref 5. Compliant: Real-time preprocessing and classification of EEG signals.
  • Ref 6. Compliant: Mental control of the chair’s destination with brief training.
  • Ref 7. Compliant: Discusses the importance of real-time interpretation for software.
  • Ref 8. Compliant: Proposes a classification method for rapid response.
  • Ref 9. Compliant: Real-time interaction with a robot via the BCI platform.
FR 7 (Functionality and Integration):
  • Ref 1. Compliant: System designed to be integrated with electronic chairs.
  • Ref 2. Not applicable: Focused on gait orthoses, not wheelchairs.
  • Ref 3. Compliant: Prototype based on an existing wheelchair.
  • Ref 4. Not applicable: A review study, does not address integration with chairs.
  • Ref 5. Compliant: Designed to couple with commercial wheelchairs.
  • Ref 6. Compliant: Designed for integration with existing wheelchairs.
  • Ref 7. Partial: Discusses the integration of BCI technologies in software but does not detail compatibility with commercial chairs.
  • Ref 8. Not applicable: Focuses on aerial vehicles, not wheelchairs.
  • Ref 9. Not applicable: Focuses on interaction with social robots, not wheelchairs.
FR 3 (Control and Operability):
  • Ref 1. Compliant: Intuitive control through SSVEP visual stimuli.
  • Ref 2. Compliant: KMI for intuitive control of the gait orthosis.
  • Ref 3. Compliant: Use of SSVEP and blinking for intuitive control.
  • Ref 4. Not applicable: A review study, does not implement a system.
  • Ref 5. Compliant: Simple commands to control the wheelchair.
  • Ref 6. Compliant: Mental navigation of the chair with safety features.
  • Ref 7. Partial: Reviews software systems but focuses more on the chair than the robotic arm.
  • Ref 8. Not applicable: Focuses on unmanned aerial vehicles, not chairs or robotic arms.
  • Ref 9. Partial: Controls gestures of a robot, not specifically a robotic arm or chair.
FR 6 (Control and Operability):
  • Ref 1. Compliant: Controls directional movements of the chair.
  • Ref 2. Compliant: Controls gait but focuses on orthoses, not chairs.
  • Ref 3. Compliant: Allows precise navigation within a domestic environment.
  • Ref 4. Not applicable: A review study, does not focus on wheelchairs.
  • Ref 5. Compliant: Enables control of the chair in various directions.
  • Ref 6. Compliant: Allows mental control of the chair’s direction and destination.
  • Ref 7. Partial: Reviews software systems, focusing on navigation and control.
  • Ref 8. Not applicable: Focuses on aerial vehicle control, not wheelchairs.
  • Ref 9. Not applicable: Controls a social robot, not a wheelchair.
FR 5, Adaptability to Different Users (Adaptability and Usability):
  • Ref 1. Compliant: Evaluated on subjects of diverse ages and races.
  • Ref 2. Compliant: Tested on subjects with and without disabilities.
  • Ref 3. Compliant: Designed to be adaptable to different users.
  • Ref 4. Compliant: Discusses the adaptability of BCI technologies.
  • Ref 5. Compliant: Designed for individuals with various paralysis conditions.
  • Ref 6. Compliant: Tested on human subjects with adaptability.
  • Ref 7. Compliant: Discusses the adaptability of BCI systems in software.
  • Ref 8. Compliant: Mentions the adaptability of the proposed classification method.
  • Ref 9. Compliant: Allows adaptation through training in imagined movement tasks.
FR 11, Preset Routines in the Actions of the Robot and Robot + Chair (Adaptability and Usability):
  • Ref 1. Not applicable: Pre-established routines are not mentioned.
  • Ref 2. Not applicable: Focused on gait control, not pre-established routines.
  • Ref 3. Compliant: Use of blinking for actions such as stopping.
  • Ref 4. Not applicable: A review study, does not address pre-established routines.
  • Ref 5. Not applicable: Pre-established routines are not mentioned.
  • Ref 6. Not applicable: Pre-established routines for the wheelchair are not mentioned.
  • Ref 7. Not applicable: A review study, does not specifically address pre-established routines.
  • Ref 8. Not applicable: Focuses on aerial vehicle control, not pre-established routines for chairs or robots.
  • Ref 9. Compliant: Provides specific robot feedback based on the user’s imagined movement performance.
FR 4 (Security and Stability):
  • Ref 1. Compliant: Design focused on the safe operation of the chair.
  • Ref 2. Compliant: System tested for safe gait control.
  • Ref 3. Compliant: Blinking as a stop command for safety.
  • Ref 4. Not applicable: A review study, does not directly address safety.
  • Ref 5. Compliant: Implements clear commands, including stop, for safety.
  • Ref 6. Compliant: Incorporates ultrasonic sensors for obstacle avoidance.
  • Ref 7. Compliant: Addresses the importance of autonomous navigation and assistance.
  • Ref 8. Not applicable: Focuses on aerial vehicles, not wheelchair safety.
  • Ref 9. Compliant: Provides safe interaction with a social robot.
FR 9 (Security and Stability):
  • Ref 1. Compliant: Applies filters and classification methods to ensure signal quality.
  • Ref 2. Compliant: Uses prediction models to ensure signal quality.
  • Ref 3. Compliant: Employs noise reduction and feature extraction methods.
  • Ref 4. Compliant: Reviews technologies and approaches to improve signal quality.
  • Ref 5. Compliant: DWT for feature extraction and quality assurance.
  • Ref 6. Compliant: Addresses the challenge of noisy signals and their analysis.
  • Ref 7. Compliant: Discusses challenges in signal categorization and proposes solutions.
  • Ref 8. Compliant: Proposes a classification method to enhance real-time accuracy.
  • Ref 9. Compliant: Decodes brain waves associated with imagined body kinematics.
FR 8 (Portability and Compatibility):
  • Ref 1. Not applicable: The weight or portability of the system is not mentioned.
  • Ref 2. Not applicable: Focused on gait orthoses, not portability.
  • Ref 3. Compliant: Design considers portability and low weight.
  • Ref 4. Not applicable: A review study, does not address portability.
  • Ref 5. Compliant: Emotiv Epoc is lightweight and portable.
  • Ref 6. Not applicable: The weight or portability of the BCI system is not mentioned.
  • Ref 7. Not applicable: A review study, does not directly address portability.
  • Ref 8. Not applicable: Focuses on aerial vehicle control, not BCI system portability.
  • Ref 9. Not applicable: Focuses on robot interaction, not BCI system portability.
FR 10 (Portability and Compatibility):
  • Ref 1. Compliant: Utilizes wireless EEG for signal capture.
  • Ref 2. Not applicable: The use of wireless headbands is not specified.
  • Ref 3. Compliant: Based on a wireless system to avoid cables.
  • Ref 4. Not applicable: A review study, does not specify wireless headbands.
  • Ref 5. Compliant: Emotiv Epoc is a wireless headband.
  • Ref 6. Not applicable: The use of commercial wireless headbands is not mentioned.
  • Ref 7. Not applicable: A review study, does not specify wireless headbands.
  • Ref 8. Not applicable: Focuses on aerial vehicle control, not wireless headbands.
  • Ref 9. Not applicable: Focuses on robot interaction, not the type of headband used.
This table provides a detailed overview of how each study addresses the established functional requirements, with a justification for compliance or non-compliance in each case.

Appendix B.2. Technical Requirements

Detailed table including justifications for each of the technical requirements to be compared.
Technical requirements, compared for each reference (Ref 1 to Ref 9):
TR 1 (Integration and Compatibility):
  • Ref 1. Compliant: Integrates BCI with wheelchair control via SSVEP.
  • Ref 2. Partial: Focuses more on gait orthosis than integration with chairs.
  • Ref 3. Compliant: Integrates EEG signals with wheelchair control.
  • Ref 4. Not applicable: A review study, does not implement systems.
  • Ref 5. Compliant: Integrates BCI with wheelchair control.
  • Ref 6. Compliant: Integrates BCI with wheelchair and sensors for navigation.
  • Ref 7. Compliant: Analyzes the integration of BCI systems in software.
  • Ref 8. Not applicable: Focuses on aerial vehicles, not chairs or robots.
  • Ref 9. Compliant: Integrates BCI to control a social robot.
TR 7 (Integration and Compatibility):
  • Ref 1. Compliant: Designed to integrate with electronic wheelchairs.
  • Ref 2. Not applicable: Focused on gait orthoses, not wheelchairs.
  • Ref 3. Compliant: Prototype based on an existing wheelchair.
  • Ref 4. Not applicable: A review study, does not address integration with chairs.
  • Ref 5. Compliant: Designed to couple with commercial wheelchairs.
  • Ref 6. Compliant: Designed to integrate with existing wheelchairs.
  • Ref 7. Partial: Discusses the integration of BCI in software but does not detail compatibility with specific chairs.
  • Ref 8. Not applicable: Focuses on aerial vehicles.
  • Ref 9. Not applicable: Focuses on interaction with a social robot.
TR 9 (Integration and Compatibility):
  • Ref 1. Not applicable: Multi-platform support is not mentioned.
  • Ref 2. Not applicable: Multi-platform support is not mentioned.
  • Ref 3. Not applicable: Multi-platform support is not mentioned.
  • Ref 4. Not applicable: A review study, does not address multi-platform support.
  • Ref 5. Not applicable: Multi-platform support is not mentioned.
  • Ref 6. Not applicable: Multi-platform support is not mentioned.
  • Ref 7. Not applicable: A review study, does not address multi-platform support.
  • Ref 8. Not applicable: Focuses on aerial vehicle control.
  • Ref 9. Not applicable: Focuses on interaction with a robot, not on multi-platform support.
TR 2 (Efficiency and Performance):
  • Ref 1. Compliant: Integrates BCI with wheelchair control via SSVEP.
  • Ref 2. Compliant: Designed for real-time operation with low latency.
  • Ref 3. Compliant: Efficient processing for rapid commands.
  • Ref 4. Not applicable: A review study, does not directly address data processing.
  • Ref 5. Compliant: Utilizes real-time processing techniques.
  • Ref 6. Compliant: Enables efficient mental control with brief training.
  • Ref 7. Compliant: Discusses BCI technologies for efficient processing.
  • Ref 8. Compliant: Proposes a classification method for rapid processing.
  • Ref 9. Compliant: Processes EEG signals in real time for interaction.
TR 3 (Efficiency and Performance):
  • Ref 1. Compliant: Interprets EEG signals in real time for control.
  • Ref 2. Compliant: Uses EEG prediction models for real-time control.
  • Ref 3. Compliant: Interprets signals in real time for navigation.
  • Ref 4. Compliant: Discusses approaches for real-time interpretation.
  • Ref 5. Compliant: Interprets EEG signals in real time for wheelchair control.
  • Ref 6. Compliant: Real-time control of the wheelchair.
  • Ref 7. Compliant: Emphasizes the importance of real-time interpretation.
  • Ref 8. Compliant: Focus on rapid classification for real-time control.
  • Ref 9. Compliant: Real-time interaction with the robot.
TR 4 (Efficiency and Performance):
  • Ref 1. Compliant: Precise control of the chair; robotic arm not mentioned.
  • Ref 2. Partial: Focused on gait orthosis, not on chairs or robotic arms.
  • Ref 3. Compliant: Precise control of the chair; robotic arm not mentioned.
  • Ref 4. Not applicable: A review study, does not directly address control systems.
  • Ref 5. Compliant: Precise control of the chair; robotic arm not mentioned.
  • Ref 6. Compliant: Safe navigation system with obstacle avoidance.
  • Ref 7. Partial: Discusses autonomous navigation systems, not specifically robust control.
  • Ref 8. Not applicable: Focuses on aerial vehicle control.
  • Ref 9. Not applicable: Controls a social robot, not specifically a robotic arm or chair.
TR 5 (Usability and Accessibility):
  • Ref 1. Partial: Mentions the use of visual stimuli, but interface details are not provided.
  • Ref 2. Not applicable: User interface details are not provided.
  • Ref 3. Compliant: Design considers ease of use with SSVEP and blinking.
  • Ref 4. Not applicable: A review study, does not address user interfaces.
  • Ref 5. Partial: Mentions control via thoughts, but interface details are not provided.
  • Ref 6. Compliant: Intuitive mental control of the wheelchair.
  • Ref 7. Compliant: Discusses adaptive intelligent interfaces for software.
  • Ref 8. Not applicable: Focuses on aerial vehicle control.
  • Ref 9. Compliant: Intuitive interaction with the robot via BCI.
TR 6 (Usability and Accessibility):
  • Ref 1. Not applicable: An explicit feedback system is not mentioned.
  • Ref 2. Not applicable: A feedback system is not detailed.
  • Ref 3. Partial: Blinking may provide an implicit form of feedback.
  • Ref 4. Not applicable: A review study, does not address feedback systems.
  • Ref 5. Not applicable: A feedback system is not mentioned.
  • Ref 6. Not applicable: An explicit feedback system is not mentioned.
  • Ref 7. Compliant: Addresses the importance of feedback in software.
  • Ref 8. Not applicable: Focuses on aerial vehicle control.
  • Ref 9. Compliant: Provides feedback through robot gestures and eye color changes.
TR 8 (Autonomy and Economic Sustainability):
  • Ref 1. Not applicable: System autonomy is not mentioned.
  • Ref 2. Not applicable: System autonomy is not mentioned.
  • Ref 3. Not applicable: System autonomy is not mentioned.
  • Ref 4. Not applicable: A review study, does not address system autonomy.
  • Ref 5. Not applicable: System autonomy is not mentioned.
  • Ref 6. Not applicable: System autonomy is not mentioned.
  • Ref 7. Not applicable: A review study, does not address autonomy.
  • Ref 8. Not applicable: Focuses on aerial vehicle control.
  • Ref 9. Not applicable: Focuses on interaction with a robot, not on system autonomy.
TR 10 (Autonomy and Economic Sustainability):
  • Ref 1. Not applicable: Integration cost is not mentioned.
  • Ref 2. Not applicable: Integration cost is not mentioned.
  • Ref 3. Not applicable: Integration cost is not mentioned.
  • Ref 4. Not applicable: A review study, does not address integration cost.
  • Ref 5. Not applicable: Integration cost is not mentioned.
  • Ref 6. Not applicable: Integration cost is not mentioned.
  • Ref 7. Not applicable: A review study, does not address integration cost.
  • Ref 8. Not applicable: Focuses on aerial vehicle control.
  • Ref 9. Not applicable: Focuses on interaction with a robot, not on integration cost.
TR 11, Compliance with Applicable Regulations (Regulatory Compliance):
  • Ref 1. Not applicable: Compliance with regulations is not mentioned.
  • Ref 2. Not applicable: Compliance with regulations is not mentioned.
  • Ref 3. Not applicable: Compliance with regulations is not mentioned.
  • Ref 4. Not applicable: A review study, does not address compliance with regulations.
  • Ref 5. Not applicable: Compliance with regulations is not mentioned.
  • Ref 6. Not applicable: Compliance with regulations is not mentioned.
  • Ref 7. Not applicable: A review study, does not address compliance with regulations.
  • Ref 8. Not applicable: Focuses on aerial vehicle control.
  • Ref 9. Not applicable: Focuses on interaction with a robot, not on compliance with regulations.
This table provides a detailed overview of how each study addresses the established technical requirements, with a justification for compliance or non-compliance in each case.

Appendix C. Compliance with the Functional Requirements

We will now detail each of the requirements and their degree of compliance.

Appendix C.1. EEG Signals Capture

NeuroSky's EEG biosensor digitizes and amplifies raw analog brain signals to deliver concise inputs to games, toys, and devices running health and wellness, educational, and research applications. Its brainwave algorithms, developed by NeuroSky neuroscientists and partner research institutions, have revealed many new ways to interact with our world [54].

Appendix C.2. Real-Time Interpretation of EEG Signals

For the real-time interpretation of EEG signals, generic ROS modules are used for data capture and transmission, such as rosbag for recording sensor data and topic_tools for transforming and filtering data streams in real time. Integration with Python libraries for signal processing and machine learning (such as TensorFlow or PyTorch) is essential to implement and run LSTM or GRU models on EEG data.
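A minimal sketch of such a node is shown below; the topic names, message layout (eight band powers per message), window length, and model file are assumptions made for illustration, not the project's actual configuration:

import collections
import numpy as np
import rospy
from std_msgs.msg import Float32MultiArray, String
from tensorflow import keras

WINDOW = 10  # samples of history fed to the recurrent model (assumed, one per second)
LABELS = ["none", "forward", "backward", "left", "right"]

model = keras.models.load_model("eeg_lstm.h5")   # hypothetical pre-trained model file
history = collections.deque(maxlen=WINDOW)
pub = rospy.Publisher("/bci/command", String, queue_size=1)

def on_sample(msg):
    history.append(msg.data)                     # eight band powers per message
    if len(history) == WINDOW:
        x = np.asarray(history, dtype=np.float32)[np.newaxis, ...]  # shape (1, 10, 8)
        pub.publish(LABELS[int(np.argmax(model.predict(x, verbose=0)[0]))])

rospy.init_node("eeg_interpreter")
rospy.Subscriber("/eeg/band_powers", Float32MultiArray, on_sample)
rospy.spin()

In this scheme, rosbag recordings of the same topic can also supply labeled training data for the model offline.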

Appendix C.3. Intuitive Control of the Robotic Arm and Wheelchair

The key to achieving intuitive control of the robotic arm and wheelchair through ROS lies in the system’s ability to seamlessly integrate BCI signals with mechanical control commands. The moveit module in ROS, for example, enables advanced planning and simulation of robotic arm movements, facilitating the creation of movement trajectories that mimic natural human movements. This means that commands derived from EEG signals can be translated into movements that are intuitively understood and expected by the user, like reaching for an object or moving the wheelchair in a particular direction.
For the wheelchair, the use of nav_stack in ROS allows for the implementation of autonomous and assisted navigation systems. This module can process BCI commands and translate them into smooth and predictable wheelchair movements, adapting to environments and avoiding obstacles, resulting in a navigation experience that feels natural and easy to control for the user.

Appendix C.4. Safety and Stability in Movement

For safety and stability, ROS offers modules like diagnostic_aggregator, which collects and analyzes diagnostic data from various sensors and actuators, and tf for managing space transformations between different system components. These modules help continuously monitor the system’s status and ensure that movements are performed within established safety parameters.

Appendix C.5. Adaptability to Different Users

The headbands have been used with a sample of five different users of varying sex and age ranges, with no notable difficulties in reading data. Additionally, the prefrontal cortex is associated with higher cognitive functions such as decision making, planning, and attention. These functions are often preserved in many neurodegenerative diseases and spinal injuries, allowing patients to use these capabilities to interact with a BCI system.
In diseases like ALS (Amyotrophic Lateral Sclerosis), although motor function is severely affected, cognitive function, including operations in the prefrontal cortex, often remains intact [4].
Different neurodegenerative diseases affect different areas of the brain. For example, Alzheimer’s disease mainly affects memory and cognitive processing, while ALS affects motor function.
The location for EEG data reading should be selected based on the brain areas least likely to be affected by the user’s specific disease. In many cases, the prefrontal cortex remains functional even when other areas, such as motor areas, are compromised.
Finally, placing electrodes in the prefrontal region is relatively easy and comfortable for the user. This accessibility is crucial for systems designed for prolonged or everyday use.

Appendix C.6. Control of Chair Movements

The control of wheelchair movements in the four main directions—forward, backward, left, and right—and their combinations through a BCI relies on advanced interpretation of EEG signals. By using machine learning algorithms such as LSTM or GRU networks, real-time analysis of brain signals is performed to map them into specific movement commands. This technology allows for precise translation of the user’s intentions into concrete actions. Integrated actuation systems within the wheelchair, managed by advanced control software, possibly based on ROS, dynamically respond to these commands, enabling smooth movements in multiple directions [55]. To ensure safety and reliability, the wheelchair is equipped with obstacle detection systems and safety protocols that allow for quick responses to changes in the environment and provide safe operation. This approach not only enhances user autonomy and mobility but also ensures an optimal level of safety and comfort, reflecting a significant advancement in personalized and technological assistance for individuals with mobility limitations.
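As an illustrative sketch of the final step, translating classified commands into chair motion, the following rospy node maps direction labels onto geometry_msgs/Twist messages on the conventional /cmd_vel topic; the speed values and the input topic name are assumptions for the example:

import rospy
from geometry_msgs.msg import Twist
from std_msgs.msg import String

LINEAR = 0.3   # m/s, illustrative cruise speed
ANGULAR = 0.5  # rad/s, illustrative turn rate
MAPPING = {
    "forward":  (LINEAR, 0.0),
    "backward": (-LINEAR, 0.0),
    "left":     (0.0, ANGULAR),
    "right":    (0.0, -ANGULAR),
}

pub = rospy.Publisher("/cmd_vel", Twist, queue_size=1)

def on_command(msg):
    twist = Twist()
    twist.linear.x, twist.angular.z = MAPPING.get(msg.data, (0.0, 0.0))  # unknown label -> stop
    pub.publish(twist)

rospy.init_node("wheelchair_motion")
rospy.Subscriber("/bci/command", String, on_command)
rospy.spin()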

Appendix C.7. Portability and Light Weight (<100 g)

In the following Table A1, we can see that both of the selected devices comply with the weight restriction, guaranteeing the usability and user experience defined in the functional requirements tables.
Table A1. Headband weight.
  • Neurosky Mindwave Mobile 2: weight, 90 g.
  • Macrotellect Brainlink Lite: net weight, 39 g.

Appendix C.8. EEG/ECG Signal Quality Analysis

As can be seen in Table A2, obtained from the manufacturers’ datasheets, both devices have an indicator and a value called “Signal quality”.
Table A2. Signal quality control.
  • Neurosky Mindwave Mobile 2: can be used to detect poor contact and whether the device is off the head.
  • Macrotellect Brainlink Lite: can be used to detect poor contact and whether the device is off the head.

Appendix C.9. Commercial Wireless Headsets

As can be seen from the datasheets of the two headsets, both meet the requirements for wireless systems (Table A3).
Table A3. Wireless headsets.
Neurosky Mindwave Mobile 2:
  • Bluetooth BT/BLE dual-mode module.
  • BT (SPP) for PC, Mac, and Android.
  • BLE (GATT) for iOS.
  • BT range: 10 m.
Macrotellect Brainlink Lite:
  • Transfer method: Bluetooth.
  • Transmission range: <10 m.
  • Baud rate: 57,600.
  • Bluetooth compatibility: Bluetooth 2.0, Bluetooth 3.0, and Bluetooth 4.0 mobile devices.
  • Preset Routines in Robot and Robot + Chair Actions
To implement preset routines in the robot actions and the robot + wheelchair combination, the capabilities of ROS in terms of motion planning and task sequencing are used. The functional justification for this approach is based on several key ROS components and design considerations.

Appendix C.10. Motion Planning and Task Sequencing

Use of the moveit module in ROS: moveit is a widely used ROS package for robot arm motion planning. It allows the definition of preset routines by configuring specific trajectories and articulation points. This is essential for programming complex robot movements efficiently and accurately.
Implementation of actionlib: actionlib is another package in ROS that is used for the execution of long-running tasks, such as moving the wheelchair from one point to another. It allows for defining pre-configured actions and monitoring their progress, which is crucial to ensure that tasks are performed as intended.
User Interface and Routine Selection: Integration of a user interface with the BCI system allows users to select and activate preset routines. This is performed by specific commands interpreted by the BCI, which then activate the corresponding routines in the robot or in the robot + wheelchair system [56].
The preset routines are customized according to the user’s needs and preferences. This includes adjusting the speed, range of motion, and navigation patterns for the wheelchair, which increases functionality and user comfort.
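A minimal sketch of a preset routine built on moveit_commander could look as follows; the planning group and routine names are hypothetical and would come from the robot's SRDF configuration rather than from this publication:

import sys
import rospy
import moveit_commander

moveit_commander.roscpp_initialize(sys.argv)
rospy.init_node("preset_routines")
arm = moveit_commander.MoveGroupCommander("arm")   # planning group name from the SRDF

def run_routine(named_pose):
    # A preset routine stored as a named pose target (e.g., defined in the SRDF)
    arm.set_named_target(named_pose)
    arm.go(wait=True)   # plan and execute the trajectory
    arm.stop()          # ensure no residual movement

run_routine("home")             # hypothetical routine names
run_routine("pick_from_table")

Longer sequences, such as chair displacement followed by an arm action, would be chained through actionlib goals in the same fashion.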

Appendix C.11. Security and Efficiency

Using packages such as diagnostic_aggregator and tf, ROS provides a real-time monitoring system to ensure safety during the execution of routines. Protocols can be implemented to detect and respond to unforeseen or dangerous situations.
Optimization of Routes and Movements: Advanced motion planning in ROS allows the optimization of robot routes and actions to maximize efficiency and minimize wear and tear.
In short, the implementation of preset routines in robot actions and the robot + wheelchair system through ROS facilitates precise, customizable, and safe control. This significantly improves the user experience, providing a smooth and efficient interaction with the assisted system.

Appendix D. Compliance of the Technical Requirements

We will now detail each of the requirements and their degree of compliance regarding technical issues.
  • Effective Integration of BCI with Wheelchair + Robot
In this context, ROS acts as the central nexus, coordinating communication between the BCI and the wheelchair and robot systems. The BCI data, processed by LSTM or GRU networks, is translated into specific commands that are sent via ROS to the actuators of the robot and wheelchair. The modularity and scalability provided by ROS are fundamental to ensuring fluid and precise interaction between the brain–computer interface and the mechanical systems. Additionally, ROS facilitates the integration of safety and feedback systems, which is essential for reliable and safe control of assisted devices [55].
At this point, a specific node has been identified within the ROS ecosystem, known as ROS-Neuro, which plays a crucial role in this integration. ROS-Neuro specializes in interacting with BCIs, processing and translating EEG signals into useful commands for the system. This node is used as an alternative method to validate the compatibility and effectiveness of the Neurosky Mindwave and Macrotellect Brainlink Lite headsets, which were not originally evaluated with this node. The implementation and testing of ROS-Neuro, especially on a Jetson Nano board, presents a unique challenge since the ROS-Neuro node has not been evaluated in embedded applications on this type of board. Therefore, this provides an opportunity to validate and potentially optimize its functionality with these specific BCI devices and on the Jetson Nano platform [57]. This approach will not only allow the validation of the mentioned BCI devices in a new environment but will also contribute to the robustness and versatility of the ROS-Neuro node in assistive applications [58].
  • Efficient EEG data processing (t ≤ 1 s)
As we can see in Table A4, the data sending rate is less than 1 s, as shown by the “Timestamp” values of the time series of data captured by the EEG headsets. Therefore, the data are processed within the specified time ranges.
Table A4. Real data sample.
Timestamp | Atten. | Medit. | Delta | Theta | L.Alpha | H.Alpha | L.Beta | H.Beta | L.Gamma | H.Gamma | Signal | Key
54,127 *844825,25314,2253441417218819630686357450Lbutton **
55,056786049465021887913,0028796367431431500None **
56,0386675102,89614,83812,8178565289014,525914423,0720None
61,562515348,23910,9655537626621814752534522220None
62,073774453,75817,5639654051375010,528890439300None
63,11477441,284,256316,37987,06544,60512,99413,04616,701547826None
64,01277442,713,78764,145923939,990686427,5078148187151None
65,0016747277,59721,74616578398380410,485445922300Up **
66,0475463721672229248449549865293190619210Up
67,0176648379977181745169822338488254517820Up
* Time values in [ms]. ** Values labeled for classification.
Following the order defined in the functional requirements table, which matches the temporal requirement of the process, Figure 3 shows that the signals taken from one subject of an initial sample of six using the Brainlink headband meet the requested specifications, both functional and technical. The signals are well defined and structured, without gaps in the sample, and the attention and meditation values calculated by the headband's algorithm embedded in the TGAM are coherent [59]. It can also be observed that the filtering process and the elimination of noise and artifacts (understanding artifacts as involuntary movements of the subject, which often alter or distort the signals obtained by the headband) are correct and fall within the expected values.
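As a simple illustration, the t ≤ 1 s requirement can be verified directly from logged timestamps such as those in Table A4; the helper below is a generic sketch (its name and interface are illustrative, not part of the system's code):

import numpy as np

def meets_latency_requirement(timestamps_ms, limit_s=1.0):
    # True when every gap between consecutive captured samples is within the limit
    gaps_s = np.diff(np.asarray(timestamps_ms, dtype=float)) / 1000.0
    return bool((gaps_s <= limit_s).all())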
  • Real-Time Interpretation of EEG Signals
Real-time interpretation of EEG signals is achieved using recurrent neural networks, such as LSTM or GRU, which are highly effective at processing sequential, temporal data, such as the EEGs obtained from the headsets [35]. These networks have the capability to remember past information, which is crucial for interpreting EEG signals that vary over time. This allows quick and accurate real-time classification of signals, essential for applications requiring immediate responses, such as assisted device control. The architecture of these networks is specially designed to handle the inherent complexities of EEG data, such as variations in amplitude and frequency, ensuring reliable and continuous interpretation.
  • Robust Control System for Robotic Arm and Wheelchair
The control system for the robotic arm and wheelchair greatly benefits from the precision and adaptability of LSTM or GRU networks. These networks can learn and differentiate between a wide range of EEG signal patterns, allowing for detailed and personalized control [60]. The robustness of the system stems from these networks’ ability to handle noisy signals and individual variations in EEG patterns, ensuring consistent and precise control. Furthermore, the learning and adaptation capabilities of these networks mean the system can improve with use, adjusting to the particularities of the user’s EEG signals, which is crucial for sensitive applications such as controlling a robotic arm or a wheelchair.
An LSTM network is a form of RNN designed to handle long-term dependencies in sequential data. Each LSTM cell has the ability to maintain or discard information using structures called gates [61]. These gates are essentially neural network layers that decide what information is relevant to keep or discard during learning.
For instance, if a particular sequence of EEG signals is associated with moving forward, the LSTM network, after being trained on these signals, would learn to associate these sequences with the corresponding activation to move the wheelchair forward [54].
This process allows for fine and adaptive control, as the network can learn and respond to the user’s specific signals, which is crucial for creating a robust and personalized control system for assistive applications.
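A compact example of such a network, under the assumption of ten one-second steps of the eight band powers as input and five output commands, could be defined in Keras as follows; the layer sizes and dropout rate are illustrative choices, not the trained configuration:

from tensorflow import keras
from tensorflow.keras import layers

model = keras.Sequential([
    layers.Input(shape=(10, 8)),               # 10 time steps x 8 EEG band powers
    layers.LSTM(64),                           # gated memory over the input sequence
    layers.Dropout(0.3),                       # regularization against noisy signals
    layers.Dense(5, activation="softmax"),     # none/forward/backward/left/right
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])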

Appendix D.1. Application in Robotic Arm and Wheelchair Control

In the context of controlling a robotic arm or a wheelchair, the EEG signals would be the inputs to the LSTM network. These signals would be processed sequentially, with the network updating its internal state based on the incoming signals and its own past state. The output of the network at each time step could be interpreted as a command or a series of commands for the device.
  • Accessible and User-friendly Interface
The accessibility and friendliness of the user interface are based on the implementation of universal design principles, ensuring that the interface is understandable and usable by people with a wide range of abilities and experiences. This includes the use of clear iconography, legible text, and contrasting colours for easy visual interpretation. The interface will also incorporate adaptive controls, which can be configured according to the specific needs of the user, such as sensitivity adjustments or customization of input methods, which are crucial for users with motor or cognitive limitations.
Moreover, simplicity in interface design is a priority [62]. This means minimizing visual and functional complexity and avoiding overwhelming the user with unnecessary options or confusing information. Navigation will be intuitive, focusing on easy access to the most important functions. This is achieved through a clear hierarchy of menus and the implementation of step-by-step setup assistants, guiding the user in customizing the system according to their preferences and needs.
Finally, the interface will offer interactive tutorials and integrated support, particularly useful for users who may be new to technology or require additional guidance.
  • User Feedback System
The feedback system is a critical component for enhancing the user’s interaction with the system. It will provide visual and tactile responses to inform the user that their commands have been received and executed correctly. This is especially important in assistive applications, where the user must have confidence in the system. Tactile feedback can be particularly useful for users with visual or auditory limitations [63]. Additionally, the feedback system can include adaptive learning elements, where the user receives feedback on how to improve the effectiveness of their commands, increasing the efficiency and satisfaction of the user with the system over time.
  • Integration with Commercial Motorized Wheelchairs
Two paths can be proposed for system integration: first, acting on the wheelchair's joystick control by sending parallel information to its controller; second, sending control signals directly to the motor regulator (driver), provided it accepts this type of external actuation. Given the wide range of motorized wheelchairs and manufacturers, the system's universality must be ensured for integration. Integrating a ROS-based control system with commercial wheelchairs requires attention to both mechanical and electrical aspects: hardware and software must be adapted to ensure smooth interaction between the BCI-ROS system and the wheelchair.
Well-known suppliers and manufacturers of power wheelchairs include Quirumed, Ortopedia ITOMI, Karma Mobility, and Sunrise Medical, which offer a variety of models, including compact, foldable, and lightweight chairs with various functionalities [64,65,66]. For example, Quirumed and Ortopedia ITOMI offer power wheelchairs at factory prices, while Karma Mobility specializes in top-quality manual and power wheelchairs [64,66,67]. The MARTINIKA EVO power wheelchair, offered by Ortopedia Silvio, is another example of a comfortable, foldable, and easy-to-manage model [68].
Sunrise Medical manufactures a variety of power wheelchairs under its Magic Mobility and Quickie brands [69]. Some of the models include the following:

Appendix D.2. Magic Mobility

X8 Extreme 4 × 4: An all-terrain power wheelchair that adapts to difficult terrain such as sand, snow, and unpaved roads.

Appendix D.3. Quickie

Sedeo Pro: A high-end standing wheelchair with front-wheel drive and a multi-position seating system.
F35: An economical, detachable model, made in Spain, with 50 Ah batteries that offer great autonomy.

Appendix D.4. Mechanical Integration

Mechanically, adaptation generally requires installing additional actuators and sensors compatible with ROS. These may include stepper motors or servo motors for motion control and sensors like optical encoders for position feedback. Mechanical integration should consider the following:
Mounting Compatibility: Ensure that new mechanical components can be mounted on the wheelchair without affecting its basic functionality.
Load Capacity: Assess whether the chair can support the weight and additional strain of the new components.
Modular Design: Prefer solutions that can be adapted to different wheelchair models.
This integration implies the ability to modify the robot itself, so an open-source robot that permits such modifications was chosen. Niryo, in its One model, provides not only the design source and the STL files needed for 3D printing but also a full breakdown of the robot's assembly and adjustment, making it feasible to redesign the support pieces. Figure A1 shows the robot's base part (STL format) and Figure A2 its assembly; in a simple design phase, the base can be redesigned to fit the chair and then printed. This eases integration, adjustment, and maintenance, adapting the robot to the chair rather than the chair to the robot, and gaining versatility.
Figure A1. Robot base (STL format).
Figure A2. Assembling base.

Appendix D.5. Electrical Integration

Electrical integration involves connecting the ROS-based control system to the wheelchair’s electronics. This may include the following:
Interface with the Wheelchair Controller: Many commercial wheelchairs have their own controllers. A package like rosserial can be used to establish communication between the ROS system and these controllers (a minimal sketch follows this list).
Power Supply Adaptation: Ensure that the wheelchair’s power supply can handle the additional load of the new electronic components.
Electrical Safety: Implement safety measures to protect both the electronic system and the user, including fuses, emergency switches, and adequate insulation.
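As an illustration of the controller interface, the sketch below shows a minimal rospy bridge node; the topic names and the one-byte command encoding are assumptions for demonstration and do not correspond to any documented wheelchair protocol.

```python
#!/usr/bin/env python
# Sketch of a ROS node bridging classified BCI commands to a wheelchair
# controller behind a rosserial link. Topic names (/control/commands,
# /arduino/commands) and the byte encoding are illustrative assumptions.
import rospy
from std_msgs.msg import String, UInt8

# Hypothetical mapping from high-level commands to controller bytes.
COMMAND_BYTES = {"forward": 0x01, "backward": 0x02,
                 "left": 0x03, "right": 0x04, "stop": 0x00}

class WheelchairBridge(object):
    def __init__(self):
        self.pub = rospy.Publisher("/arduino/commands", UInt8, queue_size=10)
        rospy.Subscriber("/control/commands", String, self.on_command)

    def on_command(self, msg):
        # Unknown commands default to "stop" as a conservative safety choice.
        byte = COMMAND_BYTES.get(msg.data, COMMAND_BYTES["stop"])
        self.pub.publish(UInt8(byte))  # rosserial forwards this to the MCU

if __name__ == "__main__":
    rospy.init_node("wheelchair_bridge")
    WheelchairBridge()
    rospy.spin()
```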

Appendix D.6. General Considerations

Software Interoperability: Use ROS to develop custom controllers that can interact with the existing hardware in the wheelchair. This may involve developing specific ROS nodes that communicate with the wheelchair’s protocols.
Scientific Publications and Regulations: Consult the relevant literature and safety regulations to ensure that the integration complies with safety and accessibility standards. This includes following IEEE guidelines in robotics and assistive systems.
In conclusion, the effective integration of a BCI-ROS system with commercial wheelchairs is a multidisciplinary process that requires attention to mechanical and electrical compatibility, the use of neurorobotics middleware such as ROS-Neuro, the development of custom software, and adherence to safety regulations [70].
  • Usage Autonomy (≥3 h)
Two elements need to be evaluated here: the headband, which has its own power source (rechargeable or not), and the powered elements, in this case the robot, wheelchair, and controller; the most limiting of these sets the maximum operating time of the system.
According to the headband datasheets, the Neurosky model runs for an estimated 9 h on a non-rechargeable AAA battery, while the Brainlink Lite by Macrotellect uses a small rechargeable lithium battery with a maximum duration of 180 min. For wheelchairs, battery type and usage vary, with the following typical ranges (source: www.onlinemedical.es/, accessed on 26 December 2023):
  • Lithium batteries 20 Ah—4 kg—maximum autonomy 25 km.
  • Gel/AGM batteries 12 Ah—10 kg—maximum autonomy 16 km.
  • Gel/AGM batteries 20 Ah—17 kg—maximum autonomy 20 km.
Considering that, by regulation, a wheelchair cannot exceed 6 km/h, taken here as the average speed in use, and assuming a 20 Ah gel/AGM battery with a maximum range of 20 km, a simple calculation in Equations (A1) and (A2) gives the maximum usage time and the expected current draw:

$$T = \frac{20\ \text{km}}{6\ \text{km/h}} = 3.33\ \text{h}\quad \text{(maximum autonomy)} \tag{A1}$$

$$C = \frac{20\ \text{Ah}}{3.33\ \text{h}} \approx 6\ \text{A}\quad \text{(expected consumption)} \tag{A2}$$
Furthermore, the power consumption of the control unit and the robot must be considered. Since they do not have their own power supply system, they will need to be connected to the wheelchair’s battery for operation.
Considering the maximum consumption forecast according to the manufacturers of the controller and the robot: the Jetson Nano draws between 5 and 10 W depending on the power mode, supplied either over USB (5 V/2 A) or the DC jack; the Niryo robot's consumption is similar, about 10 W, also at 5 V/2 A for its Arduino Mega system. Summing the two and applying a high simultaneity coefficient of 80%, the expected draw is about 4 A at 5 V, i.e., 20 W × 0.8 = 16 W.
In any case, given the low consumption of these devices, it would also be possible to use a power bank like those used by mobile devices that provide this rate of energy.
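As a back-of-the-envelope check of these figures, the sketch below reproduces the arithmetic of Equations (A1) and (A2) and the electronics budget; DC-DC conversion losses down to the 5 V rail are ignored, and all inputs are the nominal values quoted above.

```python
# Numeric check of the autonomy estimate; all inputs are the nominal
# values quoted in the text, conversion losses ignored.
BATTERY_AH = 20.0    # gel/AGM battery capacity (Ah)
RANGE_KM = 20.0      # rated range (km)
SPEED_KMH = 6.0      # regulatory maximum speed (km/h)

max_autonomy_h = RANGE_KM / SPEED_KMH           # Eq. (A1): 3.33 h
drive_current_a = BATTERY_AH / max_autonomy_h   # Eq. (A2): ~6 A

JETSON_W, NIRYO_W, SIMULTANEITY = 10.0, 10.0, 0.8
electronics_w = (JETSON_W + NIRYO_W) * SIMULTANEITY  # 16 W extra load

print(f"Autonomy {max_autonomy_h:.2f} h, drive draw {drive_current_a:.1f} A, "
      f"electronics {electronics_w:.0f} W")
```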
  • Multi-platform headset support, Windows, Apple, Linux, Android
Consulting the data supplied by the manufacturers Neurosky and Macrotellect, we find that both selected models, the Neurosky Mindwave Mobile 2 and the Macrotellect Brainlink Lite, support all the required platforms. An extract of the data obtained from the manufacturers' websites is given in Table A5 below [71,72].
Table A5. Multi-platform compatibility.

| Neurosky Mindwave Mobile 2 | Macrotellect Brainlink Lite |
| Supported platforms: Windows (XP/7/8/10), Mac (OSX 10.8 or later), iOS (iOS 8 or later), and Android (Android 2.3 or later) | Compatibility: iOS/Android, with other SDK platforms |
  • Low integration costs (<EUR 1500)
To verify this requirement, we assume, as stated in the requirement's definition, that the cost of the motorized wheelchair itself is not considered.
Table A6 gives a breakdown of the costs.
Table A6. Cost scenario.

| Devices | Total Cost (VAT Incl.) | Store |
| Headband Neurosky Mindwave Mobile 2 | EUR 169.90 (Op1) | Amazon |
| Headband Macrotellect Brainlink Lite | EUR 239.00 (Op2) | Mindtecstore |
| Jetson Nano Developer Kit | EUR 194.96 | RS-online |
| Arduino Mega + Shield Ramps 1.4 | EUR 51.40 | Bricogeek |
| 5 drivers TMC 2208 | EUR 22.99 | Amazon |
| Robot Niryo One (Open Source) | EUR 450.00 | DIY |
| Material Total Cost | EUR 889.25 (Op1) / EUR 958.35 (Op2) | |
The cost breakdown shows that both options fall within the projected cost limit. Note that this does not include personnel or development costs, whose allocation will depend on the potential monetization of the finished product.
  • Compliance with Applicable Regulations
This section verifies that all devices comply with the applicable regulations, detailing the standards that apply to each device.
Firstly, although not a limiting factor of this study, since we largely depend on the commercial wheelchair available to the user, we address the motorized wheelchair and its regulations. For this purpose, we consulted the certification entity TÜV Rheinland [73], which indicates that motorized (electric) wheelchairs must undergo the following certifications for approval as a medical product. In our case, we integrate with an already certified chair, so not all of these certifications apply; the selection of partial certification will depend on the degree of modification and the criteria of the certifying entity.
Table A7 shows the regulations applicable to powered wheelchairs, as well as the recommended certifications for their commercialization.
Table A7. Applicable standards for testing of wheelchairs and powered scooters for disabled people.

| Standard | Description |
| ISO 7176 [74] | Mechanical tests for compliance with various sections of the ISO 7176 series of standards describing different test methods for wheelchairs and electric scooters |
| EN 12182 [75] | Assistive products for persons with disability. General requirements and test methods |
| EN 12183 [76] | Manual wheelchairs. Requirements and test methods |
| EN 12184 [77] | Electrically powered wheelchairs, scooters, and their chargers. Requirements and test methods |
| ISO 7176-14 [78] | Power and control systems for electrically powered wheelchairs and scooters. Requirements and test methods |
| IEC 60601-1 [79] | Electrical safety tests in accordance with standards describing general safety of electrical medical devices. Requirements and test methods |
| ISO 7176-21 [80] and IEC 60601-1-2 [79] | Electromagnetic compatibility (EMC). Requirements and test methods |
| ISO 7176-19 [81] | Impact test. Requirements and test methods |
| EN 1021 [82] | Fire resistance tests |
| Cadmium and PAH | Tests based on European standards |
| Proposals | In addition to these specific standards for wheelchairs and electric scooters, quality management system certification according to ISO 13485 [83] and/or ISO 9001 [84] is also recommended |
The next element subject to standardization is the Niryo robot, covered by the standards in Table A8 (UNE safety standards in the robotics sector), extracted from the normative presentation given at AENOR on 28 April 2017 by Mr. Francisco Arribas of the Directorate of Standardization [85].
Table A8. Applicable standards for robots and robotic devices.

| Standard | Description |
| UNE-EN ISO 10218-1:2012 [86] | Robots and robotic devices. Safety requirements for industrial robots. Part 1: Robots. (ISO 10218-1:2011) |
| UNE-EN ISO 10218-2:2011 [87] | Robots and robotic devices. Safety requirements for industrial robots. Part 2: Robot systems and integration. (ISO 10218-2:2011) |
| UNE-EN ISO 13482:2014 [88] | Robots and robotic devices. Safety requirements for non-industrial robots. Non-medical personal assistance robots. (ISO 13482:2014) |
Given its use as a personal assistance robot, our case falls under UNE-EN ISO 13482:2014. However, in light of the collaborative robot (cobot) considerations brought by Industry 4.0, the standards in Table A9 must also be considered.
Table A9. ISO/TS 15066:2016 robots and robotic devices—collaborative robots.

| Standard | Description |
| ISO/TS 15066:2016 [89] | Specifies safety requirements for collaborative industrial robot systems and the work environment and supplements the requirements and guidance on collaborative industrial robot operation given in ISO 10218-1 and ISO 10218-2 |
| ISO/TS 15066:2016 [89] | Applies to industrial robot systems as described in ISO 10218-1 and ISO 10218-2. It does not apply to non-industrial robots, although the safety principles presented can be useful to other areas of robotics |
The next item we analyse is the NVIDIA Jetson Nano control device, which, as an electronic device, must comply with the Declaration of Conformity with the RoHS Directive; compliance is documented at the link provided by an authorized distributor [90].
Last but not least, Table A10 shows the regulations applicable to the headbands.
Table A10. Applicable regulations.

| Neurosky Mindwave Mobile 2 | Macrotellect Brainlink Lite |
| (Certification marks shown as an image in the original) | Safety Certification Bluetooth SIG; FCC/CE/SRRC/ROHS; UN38.3 Lithium Battery; Apple Made for iOS (MFi) |

References

1. Bhattacharyya, S.; Konar, A.; Tibarewala, D.N. Motor Imagery, P300 and Error-Related EEG-Based Robot Arm Movement Control for Rehabilitation Purpose. Med. Biol. Eng. Comput. 2014, 52, 1007–1017.
2. Chen, X.; Wang, Y.; Nakanishi, M.; Gao, X.; Jung, T.-P.; Gao, S. High-Speed Spelling with a Noninvasive Brain–Computer Interface. Proc. Natl. Acad. Sci. USA 2015, 112, E6058–E6067.
3. Frontiers|Hybrid Brain–Computer Interface Techniques for Improved Classification Accuracy and Increased Number of Commands: A Review. Available online: https://www.frontiersin.org/articles/10.3389/fnbot.2017.00035/full?ref=https://githubhelp.com (accessed on 8 January 2024).
4. Huang, Q.; Zhang, Z.; Yu, T.; He, S.; Li, Y. An EEG-/EOG-Based Hybrid Brain-Computer Interface: Application on Controlling an Integrated Wheelchair Robotic Arm System. Front. Neurosci. 2019, 13, 1243.
5. Jin, J.; Zhang, H.; Daly, I.; Wang, X.; Cichocki, A. An Improved P300 Pattern in BCI to Catch User’s Attention. J. Neural Eng. 2017, 14, 36001.
6. Khan, M.J.; Hong, K.-S. Hybrid EEG–fNIRS-Based Eight-Command Decoding for BCI: Application to Quadcopter Control. Front. Neurorobotics 2017, 11, 6.
7. Quadcopter Control in Three-Dimensional Space Using a Noninvasive Motor Imagery-Based Brain–Computer Interface—IOPscience. Available online: https://iopscience.iop.org/article/10.1088/1741-2560/10/4/046003/meta (accessed on 8 January 2024).
8. Single-Trial Analysis and Classification of ERP Components—A Tutorial—ScienceDirect. Available online: https://www.sciencedirect.com/science/article/abs/pii/S1053811910009067?via%3Dihub (accessed on 8 January 2024).
9. A Brain Controlled Wheelchair to Navigate in Familiar Environments|IEEE Journals & Magazine|IEEE Xplore. Available online: https://ieeexplore.ieee.org/abstract/document/5462915?casa_token=a8MeRLms4mwAAAAA:uCYHeESHaaFyMvjT_4UeGJOG-nNdx215rOo5S_Moot09Tgj1x5Xg3wEDAIaxMYxQOh_fLfoP4Q (accessed on 8 January 2024).
10. Soekadar, S.R.; Witkowski, M.; Vitiello, N.; Birbaumer, N. An EEG/EOG-Based Hybrid Brain-Neural Computer Interaction (BNCI) System to Control an Exoskeleton for the Paralyzed Hand. Biomed. Eng. Biomed. Tech. 2015, 60, 199–205.
11. Witkowski, M.; Cortese, M.; Cempini, M.; Mellinger, J.; Vitiello, N.; Soekadar, S.R. Enhancing Brain-Machine Interface (BMI) Control of a Hand Exoskeleton Using Electrooculography (EOG). J. NeuroEngineering Rehabil. 2014, 11, 165.
12. Banach, K.; Małecki, M.; Rosół, M.; Broniec, A. Brain-Computer Interface for Electric Wheelchair Based on Alpha Waves of EEG Signal. Bio-Algorithms Med-Syst. 2021, 17, 165–172.
13. Antoniou, E.; Bozios, P.; Christou, V.; Tzimourta, K.D.; Kalafatakis, K.; Tsipouras, M.G.; Giannakeas, N.; Tzallas, A.T. EEG-Based Eye Movement Recognition Using the Brain–Computer Interface and Random Forests. Sensors 2021, 21, 2339.
14. Belo, J.; Clerc, M.; Schön, D. EEG-Based Auditory Attention Detection and Its Possible Future Applications for Passive BCI. Front. Comput. Sci. 2021, 3.
15. Saichoo, T.; Boonbrahm, P.; Punsawad, Y. Facial-Machine Interface-Based Virtual Reality Wheelchair Control Using EEG Artifacts of Emotiv Neuroheadset. In Proceedings of the 2021 18th International Conference on Electrical Engineering/Electronics, Computer, Telecommunications and Information Technology (ECTI-CON), Chiang Mai, Thailand, 19–22 May 2021; pp. 781–784.
16. Vélez, L.; Kemper, G. Algorithm for Detection of Raising Eyebrows and Jaw Clenching Artifacts in EEG Signals Using Neurosky Mindwave Headset. In Proceedings of the 5th Brazilian Technology Symposium; Smart Innovation, Systems and Technologies; Springer Science and Business Media Deutschland GmbH: Berlin/Heidelberg, Germany, 2021; Volume 202, pp. 99–110.
17. Ping, J.; Wang, F.; Xu, Z.; Bi, J.; Xiao, L. Semi-Autonomous Navigation Control System of Intelligent Wheelchair Based on Asynchronous SSVEP-BCI. In Proceedings of the 2021 IEEE 11th Annual International Conference on CYBER Technology in Automation, Control, and Intelligent Systems (CYBER), Jiaxing, China, 27–31 July 2021; pp. 1–6.
18. Olesen, S.D.T.; Das, R.; Olsson, M.D.; Khan, M.A.; Puthusserypady, S. Hybrid EEG-EOG-Based BCI System for Vehicle Control. In Proceedings of the 9th IEEE International Winter Conference on Brain-Computer Interface, BCI 2021, Gangwon, Republic of Korea, 22 February 2021; Institute of Electrical and Electronics Engineers Inc.: Piscataway, NJ, USA, 2021.
19. Zhang, M.; Zhang, Q. Conditions for Prosperity and Depression of a Stochastic R&D Model under Regime Switching. Adv. Differ. Equ. 2020, 2020, 173.
20. Xu, B.; Li, W.; Liu, D.; Zhang, K.; Miao, M.; Xu, G.; Song, A. Continuous Hybrid BCI Control for Robotic Arm Using Noninvasive Electroencephalogram, Computer Vision, and Eye Tracking. Mathematics 2022, 10, 618.
21. Cao, L.; Li, G.; Xu, Y.; Zhang, H.; Shu, X.; Zhang, D. A Brain-Actuated Robotic Arm System Using Non-Invasive Hybrid Brain-Computer Interface and Shared Control Strategy. J. Neural Eng. 2021, 18, 46045.
22. Park, S.; Han, C.-H.; Im, C.-H. Design of Wearable EEG Devices Specialized for Passive Brain–Computer Interface Applications. Sensors 2020, 20, 4572.
23. Michel, C.M.; Brunet, D. EEG Source Imaging: A Practical Review of the Analysis Steps. Front. Neurol. 2019, 10, 325.
24. Rușanu, O.-A. Python Implementation for Brain-Computer Interface Research by Acquiring and Processing the NeuroSky EEG Data for Classifying Multiple Voluntary Eye-Blinks. In Proceedings of the 5th International Conference on Nanotechnologies and Biomedical Engineering, Chisinau, Moldova, 3–5 November 2021; Tiginyanu, I., Sontea, V., Railean, S., Eds.; Springer International Publishing: Cham, Switzerland; pp. 666–672.
25. Alvarado, O.; Tinoco, D.; Veintimilla, J. Sistema Embebido Para Detección de Somnolencia En Conductores Mediante Señal EEG. In Avances y Aplicaciones de Sistemas Inteligentes y Nuevas Tecnologías; Universidad de Los Andes (ULA): Merida, Venezuela, 2016; p. 9. ISBN 978-980-11-1836-7.
26. De Fazio, R.; Mattei, V.; Al-Naami, B.; De Vittorio, M.; Visconti, P. Methodologies and Wearable Devices to Monitor Biophysical Parameters Related to Sleep Dysfunctions: An Overview. Micromachines 2022, 13, 1335.
27. Niryo One Documentation. Niryo. 2018. Available online: https://niryo.com/docs/niryo-one/ (accessed on 21 February 2024).
28. Groshev, M.; Sacido, J.; Martín-Pérez, J. FoReCo: A Forecast-Based Recovery Mechanism for Real-Time Remote Control of Robotic Manipulators. In Proceedings of the SIGCOMM ’22 Poster and Demo Sessions, Amsterdam, The Netherlands, 22–26 August 2022; pp. 7–9.
29. Jorge, A.A.; Riascos LA, M.; Miyagi, P.E. Modelling and Control Strategies for a Motorized Wheelchair with Hybrid Locomotion Systems. J. Braz. Soc. Mech. Sci. Eng. 2021, 43, 46.
30. Ruman, M.R.; Barua, A.; Mohajan, S.; Paul, D.; Sarker, A.K.; Rabby, M.R. An Implementation of Motorized Wheelchair for Handicapped Persons. In Proceedings of the 2019 International Conference on Computing, Communication, and Intelligent Systems (ICCCIS), Greater Noida, India, 18–19 October 2019; pp. 301–305.
31. Design and Implementation of Hybrid BCI Based Wheelchair|IEEE Conference Publication|IEEE Xplore. Available online: https://ieeexplore.ieee.org/document/9591796 (accessed on 22 December 2023).
32. Subasi, A.; Tuncer, T.; Dogan, S.; Tanko, D.; Sakoglu, U. EEG-Based Emotion Recognition Using Tunable Q Wavelet Transform and Rotation Forest Ensemble Classifier. Biomed. Signal Process. Control 2021, 68, 102648.
33. Roy, A.M. Adaptive Transfer Learning-Based Multiscale Feature Fused Deep Convolutional Neural Network for EEG MI Multiclassification in Brain–Computer Interface. Eng. Appl. Artif. Intell. 2022, 116, 105347.
34. Gong, S.; Xing, K.; Cichocki, A.; Li, J. Deep Learning in EEG: Advance of the Last Ten-Year Critical Period. IEEE Trans. Cogn. Dev. Syst. 2021, 14, 348–365.
35. Aquino-Brítez, D.; Ortiz, A.; Ortega, J.; León, J.; Formoso, M.; Gan, J.Q.; Escobar, J.J. Optimization of Deep Architectures for EEG Signal Classification: An AutoML Approach Using Evolutionary Algorithms. Sensors 2021, 21, 2096.
36. Nagabushanam, P.; Thomas George, S.; Radha, S. EEG Signal Classification Using LSTM and Improved Neural Network Algorithms. Soft Comput. 2020, 24, 9981–10003.
37. Xu, G.; Shen, X.; Chen, S.; Zong, Y.; Zhang, C.; Yue, H.; Liu, M.; Chen, F.; Che, W. A Deep Transfer Convolutional Neural Network Framework for EEG Signal Classification. IEEE Access 2019, 7, 112767–112776.
38. Rahman, M.A.; Khanam, F.; Ahmad, M.; Uddin, M.S. Multiclass EEG Signal Classification Utilizing Rényi Min-Entropy-Based Feature Selection from Wavelet Packet Transformation. Brain Inform. 2020, 7, 7.
39. Wang, L.; Wang, J.; Wen, B.; Mu, W.; Liu, L.; Han, J.; Zhang, L.; Jia, J.; Kang, X. Enhancing Motor Imagery EEG Signal Classification with Simplified GoogLeNet. In Proceedings of the 2023 11th International Winter Conference on Brain-Computer Interface (BCI), Gangwon, Republic of Korea, 20–22 February 2023; pp. 1–6.
40. Zargar, S.A. Introduction to Sequence Learning Models: RNN, LSTM, GRU. 2021. Available online: https://www.researchgate.net/profile/Sakib-Zargar-2/publication/350950396_Introduction_to_Sequence_Learning_Models_RNN_LSTM_GRU/links/607b41c0907dcf667ba83ade/Introduction-to-Sequence-Learning-Models-RNN-LSTM-GRU.pdf (accessed on 21 February 2024).
41. Ko, D.-H.; Shin, D.-H.; Kam, T.-E. Attention-Based Spatio-Temporal-Spectral Feature Learning for Subject-Specific EEG Classification. In Proceedings of the 2021 9th International Winter Conference on Brain-Computer Interface (BCI), Gangwon, Republic of Korea, 22–24 February 2021; pp. 1–4.
42. Meng, J.; Zhang, S.; Bekyo, A.; Olsoe, J.; Baxter, B.; He, B. Noninvasive Electroencephalogram Based Control of a Robotic Arm for Reach and Grasp Tasks. Sci. Rep. 2016, 6, 38565.
43. Zhou, C. SSVEP-Based BCI Wheelchair Control System. arXiv 2023, arXiv:2307.08703.
44. Do, A.H.; Wang, P.T.; King, C.E.; Chun, S.N.; Nenadic, Z. Brain-Computer Interface Controlled Robotic Gait Orthosis. J. Neuroeng. Rehabil. 2013, 10, 111.
45. Kanungo, L.; Garg, N.; Bhobe, A.; Rajguru, S.; Baths, V. Wheelchair Automation by a Hybrid BCI System Using SSVEP and Eye Blinks. In Proceedings of the 2021 IEEE International Conference on Systems, Man, and Cybernetics (SMC), Melbourne, VIC, Australia, 17–20 October 2021.
46. Gu, X.; Cao, Z.; Jolfaei, A.; Xu, P.; Wu, D.; Jung, T.-P.; Lin, C.-T. EEG-Based Brain-Computer Interfaces (BCIs): A Survey of Recent Studies on Signal Sensing Technologies and Computational Intelligence Approaches and Their Applications. IEEE/ACM Trans. Comput. Biol. Bioinform. 2020, 18, 1645–1666.
47. AlAbboudi, M.; Majed, M.; Hassan, F.; Nassif, A.B. EEG Wheelchair for People of Determination. In Proceedings of the 2020 Advances in Science and Engineering Technology International Conferences (ASET), Dubai, United Arab Emirates, 4 February–9 April 2020.
48. Mounir, R.; Alqasemi, R.; Dubey, R. BCI-Controlled Hands-Free Wheelchair Navigation with Obstacle Avoidance. arXiv 2020, arXiv:2005.04209.
49. Kim, Y.; Velamala, B.; Choi, Y.; Kim, Y.; Kim, H.; Kulkarni, N.; Lee, E.-J. A Literature Review on the Smart Wheelchair Systems. arXiv 2023, arXiv:2312.01285.
50. Vishwanath, R.M.; Kumaar, S.; Omkar, S.N. A Real-Time Control Approach for Unmanned Aerial Vehicles Using Brain-Computer Interface. arXiv 2018, arXiv:1809.00346.
51. Abiri, R.; Borhani, S.; Zhao, X.; Jiang, Y. Real-Time Brain Machine Interaction via Social Robot Gesture Control. In Dynamic Systems and Control Conference; American Society of Mechanical Engineers: New York, NY, USA, 2017.
52. OMPI—Búsqueda En Las Colecciones de Patentes Nacionales e Internacionales. Available online: https://patentscope.wipo.int/search/es/search.jsf (accessed on 8 January 2024).
53. Homepage|Epo.Org. Available online: https://www.epo.org/en (accessed on 8 January 2024).
54. EEG—Electroencephalography—BCI|NeuroSky. Available online: https://developer.neurosky.com/docs/doku.php?id=neurosky_101 (accessed on 8 January 2024).
55. Sivakanthan, S.; Candiotti, J.L.; Sundaram, A.S.; Duvall, J.A.; Sergeant, J.J.G.; Cooper, R.; Satpute, S.; Turner, R.L.; Cooper, R.A. Mini-Review: Robotic Wheelchair Taxonomy and Readiness. Neurosci. Lett. 2022, 772, 136482.
56. Cebolla Arroyo, R.; de León Rivas, J.; Barrientos, A. Estructura de control en ROS y modos de marcha basados en máquinas de estados de un robot hexápodo. In Actas de las XXXVIII Jornadas de Automática; Servicio de Publicaciones de la Universidad de Oviedo: Oviedo, Spain, 2017; pp. 686–693.
57. Tonin, L.; Beraldo, G.; Tortora, S.; Menegatti, E. ROS-Neuro: An Open-Source Platform for Neurorobotics. Front. Neurorobotics 2022, 16, 886050.
58. ROS-Neuro. Available online: https://github.com/rosneuro (accessed on 20 December 2023).
59. Rușanu, O.-A. A Brain-Computer Interface for Controlling a Mobile Assistive Device by Using the NeuroSky EEG Headset and Raspberry Pi. In Proceedings of the 5th International Conference on Nanotechnologies and Biomedical Engineering, ICNBME 2021, Chisinau, Moldova, 3–5 November 2021; Tiginyanu, I., Sontea, V., Railean, S., Eds.; Springer International Publishing: Cham, Switzerland, 2022; pp. 231–238.
60. Bala, P.; Amob, R.; Islam, M.; Hasan, F.; Uddin, M.N. EEG—Based Load Control System for Physically Challenged People. In Proceedings of the 2021 2nd International Conference on Robotics, Electrical and Signal Processing Techniques (ICREST), Dhaka, Bangladesh, 5–7 January 2021; pp. 603–606.
61. Alfredo, C.S.; Adytia, D.A. Time Series Forecasting of Significant Wave Height Using GRU, CNN-GRU, and LSTM. J. RESTI (Rekayasa Sist. Teknol. Inf.) 2022, 6, 776–781.
62. Memmott, T.; Koçanaoğulları, A.; Lawhead, M.; Klee, D.; Dudy, S.; Fried-Oken, M.; Oken, B. BciPy: Brain–Computer Interface Software in Python. Brain-Comput. Interfaces 2021, 8, 137–153.
63. Nieto-Vallejo, A.E.; Ramírez-Pérez, O.F.; Ballesteros-Arroyave, L.E.; Aragón, A. Design of a Neurofeedback Training System for Meditation Based on EEG Technology. Rev. Fac. Ing. 2021, 30, e12489.
64. Sillas de Ruedas Eléctricas Con Motor (22 Productos)|Quirumed. Available online: https://www.quirumed.com/es/ortopedia/sillas-de-ruedas/sillas-de-ruedas-electricas (accessed on 8 January 2024).
65. Todos los Tipos de Sillas de Ruedas Ortopédicas Eléctrica. Available online: https://www.ortopediamimas.com/movilidad/sillas-de-ruedas-electricas.html (accessed on 8 January 2024).
66. Sillas de ruedas KARMA|Distribuidor y Fabricante 2019. Available online: https://www.karmamobility.es/ (accessed on 21 February 2024).
67. Sillas de Ruedas Eléctricas de Todos Los Tipos—Ortopedia ITOMI. Available online: https://www.ortopediaitomi.es/venta-articulos-ortopedia/movilidad/sillas-de-ruedas-electricas (accessed on 8 January 2024).
68. Silla de ruedas MARTINIKA EVO, eléctrica plegable. Available online: https://www.ortopediasilvio.com/es/sre-000-silla-de-ruedas-electrica/8783-177456-silla-de-ruedas-martinika-evo-electrica-plegable.html (accessed on 8 January 2024).
69. Sillas de Ruedas, Grúas y Scooters Eléctricos. Available online: https://www.sunrisemedical.es/ (accessed on 8 January 2024).
70. Valenti, A.; Barsotti, M.; Brondi, R.; Bacciu, D.; Ascari, L. ROS-Neuro Integration of Deep Convolutional Autoencoders for EEG Signal Compression in Real-Time BCIs. In Proceedings of the 2020 IEEE International Conference on Systems, Man, and Cybernetics (SMC), Toronto, ON, Canada, 11–14 October 2020.
71. MindWave. Available online: https://store.neurosky.com/pages/mindwave (accessed on 8 January 2024).
72. BrainLink Lite. Available online: https://o.macrotellect.com/2020/BrainLink_Lite.html (accessed on 8 January 2024).
73. Rheinland, T. Ensayos de Sillas de ruedas y Scooters Eléctricos. Available online: https://www.tuv.com/spain/es/ensayos-en-sillas-de-ruedas-y-scooters-el%C3%A9ctricas.html (accessed on 8 January 2024).
74. ISO 7176; Wheelchairs: Requirements and Test Methods. Available online: https://landingpage.bsigroup.com/LandingPage/Series?UPI=BS%20ISO%207176 (accessed on 8 January 2024).
75. UNE-EN 12182:2012; Productos de Apoyo para Personas con Discapacidad. Available online: https://www.une.org/encuentra-tu-norma/busca-tu-norma/norma?c=N0050343 (accessed on 8 January 2024).
76. UNE-EN 12183:2012; Manual Wheelchairs—Requirements and Test Methods. Available online: https://www.une.org/encuentra-tu-norma/busca-tu-norma/norma?c=N0054120 (accessed on 8 January 2024).
77. UNE-EN 12184:2012; Electrically Powered Wheelchairs, Scooters and Their Chargers—Requirements and Test Methods. Available online: https://www.une.org/encuentra-tu-norma/busca-tu-norma/norma?c=N0054121 (accessed on 8 January 2024).
78. ISO 7176-14; Wheelchairs—Part 14: Power and Control Systems for Electrically Powered Wheelchairs and Scooters—Requirements and Test Methods. Available online: https://www.iso.org/standard/72408.html (accessed on 8 January 2024).
79. UNE-EN 60601-1:2008; Equipos electromédicos: Parte 1: Requisitos generales para la seguridad básica y funcionamiento esencial. Available online: https://www.une.org/encuentra-tu-norma/busca-tu-norma/norma?c=N0041083 (accessed on 8 January 2024).
80. ISO 7176-21:2009; Wheelchairs—Part 21. Requirements and Test Methods for Electromagnetic Compatibility of Electrically Powered Wheelchairs and Scooters, and Battery Chargers. Available online: https://www.iso.org/standard/51048.html (accessed on 8 January 2024).
81. ISO 7176-19:2022; Wheelchairs—Part 19: Wheelchairs for Use as Seats in Motor Vehicles. Available online: https://www.iso.org/standard/71919.html (accessed on 8 January 2024).
82. UNE-EN 1021-1:2015; Mobiliario. Valoración de la Inflamabilidad. Available online: https://www.une.org/encuentra-tu-norma/busca-tu-norma/norma?c=N0054530 (accessed on 8 January 2024).
83. UNE-EN ISO 13485:2018; Medical devices—Quality management systems—Requirements for regulatory purposes (ISO 13485:2016). (Consolidated Version). Available online: https://www.une.org/encuentra-tu-norma/busca-tu-norma/norma?c=N0060449 (accessed on 8 January 2024).
84. ISO 9001:2015(es); Sistemas de gestión de la calidad—Requisitos. Available online: https://www.iso.org/obp/ui/#iso:std:iso:9001:ed-5:v1:es (accessed on 8 January 2024).
85. Arribas, F. Normas Técnicas en Seguridad Robótica. 2017. Available online: https://es.scribd.com/document/371727317/Robot-Norma (accessed on 8 January 2024).
86. UNE-EN ISO 10218-1; Robots y Dispositivos Robóticos. Requisitos de Seguridad para Robots Industriales. Parte 1: Robots. Available online: https://www.une.org/encuentra-tu-norma/busca-tu-norma/norma?c=N0049289 (accessed on 8 January 2024).
87. UNE-EN ISO 10218-2:2011; Robots y Dispositivos Robóticos. Requisitos de Seguridad para Robots Industriales. Parte 2: Sistemas Robot e Integración. Available online: https://www.une.org/encuentra-tu-norma/busca-tu-norma/norma?c=N0048668 (accessed on 8 January 2024).
88. UNE-EN ISO 13482:2014; Robots y Dispositivos Robóticos. Requisitos de Seguridad para Robots no Industriales. Robots de Asistencia Personal no Médicos. Available online: https://www.une.org/encuentra-tu-norma/busca-tu-norma/norma?c=N0053216 (accessed on 8 January 2024).
89. ISO/TS 15066:2016; Robots and Robotic Devices—Collaborative Robots. Available online: https://tienda.aenor.com/norma-iso-ts-15066-2016-062996 (accessed on 8 January 2024).
90. 945-13450-0000-100|JETSON NANO DEVELOPMENT KIT|RS. Available online: https://es.rs-online.com/web/p/kits-de-desarrollo-de-procesadores/1999831 (accessed on 8 January 2024).
Figure 1. Workflow.
Figure 2. Conceptual diagram of the basic architecture for control of a wheelchair and a cobot.
Figure 3. Functional architecture.
Figure 4. Hardware architecture.
Figure 5. Brainlink headband from Macrotellect (left). Brainwave headband from Neurosky (right).
Figure 6. Electrode position (researchgate source).
Figure 7. Brain signals obtained with Brainlink headset.
Figure 8. Jetson Nano device.
Figure 9. Arduino Mega + Shield Ramps 1.4 (left); Arduino Nano 33 (right).
Figure 10. Open-source NIRYO ONE robot.
Figure 11. Basic model of motorized wheelchair.
Figure 12. Manual control of motorized wheelchair.
Figure 13. Software architecture.
Figure 14. LSTM diagram.
Figure 15. Functional requirements compliance diagram. Ref 1.—SSVEP-Based BCI Wheelchair Control System [42]; Ref 2.—Brain-Computer Interface Controlled Robotic Gait Orthosis [43]. Ref 3.—Wheelchair Automation by a Hybrid BCI System Using SSVEP and Eye Blinks [44]. Ref 4.—EEG-Based BCIs: A Survey [45]. Ref 5.—EEG Wheelchair for People of Determination [46]. Ref 6.—BCI-Controlled Hands-Free Wheelchair Navigation with Obstacle Avoidance [47]. Ref 7.—A Literature Review on the Smart Wheelchair Systems [48]. Ref 8.—A Real-Time Control Approach for Unmanned Aerial Vehicles Using Brain-Computer Interface [49]. Ref 9.—Real-Time Brain Machine Interaction via Social Robot Gesture Control [50].
Figure 16. Technical requirements compliance diagram. Ref 1.—SSVEP-Based BCI Wheelchair Control System [42]; Ref 2.—Brain-Computer Interface Controlled Robotic Gait Orthosis [43]. Ref 3.—Wheelchair Automation by a Hybrid BCI System Using SSVEP and Eye Blinks [44]. Ref 4.—EEG-Based BCIs: A Survey [45]. Ref 5.—EEG Wheelchair for People of Determination [46]. Ref 6.—BCI-Controlled Hands-Free Wheelchair Navigation with Obstacle Avoidance [47]. Ref 7.—A Literature Review on the Smart Wheelchair Systems [48]. Ref 8.—A Real-Time Control Approach for Unmanned Aerial Vehicles Using Brain-Computer Interface [49]. Ref 9.—Real-Time Brain Machine Interaction via Social Robot Gesture Control [50].
Figure 17. Cost of production scenario.
Figure 18. Selling scenario (5 years).
Figure 19. Annual cash flow (5 years).
Figure 20. Results of the 1st search on Patentscope.
Figure 21. Results of the 2nd search on Patentscope.
Figure 22. Results of the 3rd search on Patentscope.
Table 1. Functional requirements table.

| Functional Requirements | Why |
| Capture of EEG signals | Correctly interpret user intentions. |
| Real-time interpretation of EEG signals | Immediate response of the system. |
| Intuitive control of the robotic arm and wheelchair | Facilitate operation by users with limitations. |
| Safety and stability in movement | Ensure user safety. |
| Adaptability to different users | Usable by a wide range of patients: neurodegenerative diseases, reduced mobility, spinal cord injuries, postoperative. |
| Control of the wheelchair’s movements | Precise movements in the displacement of the chair: forward, backward, left, right, diagonal forward left, forward right. |
| Integration of the system with commercial chairs | Ability to control commercial motorized chairs without substantial modifications. |
| Portability and low weight (<100 g) | Improved user experience and ease of use. |
| EEG/ECG signal quality analysis | Guarantee of reading and improvement in the usability of the system (EEG headband). |
| Commercial wireless headbands | Wireless connection to avoid cables and problems of tangling and/or accidents. BLE connection. |
| Preset routines in the actions of the robot and robot + chair | Facilitate repetitive actions, improve the usability of the system. |
Table 2. Technical requirements table.

| Technical Requirement | Why |
| Effective integration of BCI with chair + robot | Smooth communication between BCI and control system |
| Efficient EEG data processing (t ≤ 1 s) | Minimize latency and maximize accuracy |
| Real-time interpretation of EEG signals | Immediate response of the system |
| Robust control system for robotic arm and wheelchair | Precise and safe movements |
| Accessible and user-friendly interface | Facilitate user interaction with the system |
| User feedback system | Inform the user about the system’s status |
| Integration with commercial motorized wheelchairs | Compatibility with existing systems |
| Usage autonomy (≥3 h) | Limitation of use due to the autonomy of the wheelchair and the headband |
| Support for multi-platform headbands: Windows, Apple, Linux, Android | Possibility of connecting the headband with multiple platforms to facilitate the interconnection between devices |
| Low integration cost (<EUR 1500) | The cost of integration should not be a barrier to entry for the end user |
| Compliance with applicable regulations | Compliance with current regulations and CE marking |
Table 3. Frequencies and voltages of captured brain signals.

| Brain Signal | Frequency (Hz) | Voltage |
| Delta Waves | 1–3 Hz | 20–200 μV |
| Theta Waves | 4–7 Hz | 10–100 μV |
| Alpha Waves | 8–13 Hz | 20–100 μV 1 |
| Beta Waves | 14–30 Hz | 5–30 μV 2 |
| Gamma Waves | 31–100 Hz | >20 μV 3 |

1 Alpha waves are prominent when a person is relaxed but awake, especially with eyes closed. 2 Beta waves are associated with concentration and active thinking. 3 Gamma waves are associated with higher cognitive processes.
Table 4. Jetson Nano specifications.

| Feature | Specification |
| CPU | Quad-core ARM Cortex-A57 |
| GPU | 128-core NVIDIA Maxwell |
| Memory | 4 GB LPDDR4 RAM |
| Interfaces | GPIO, I2C, I2S, SPI, UART |
| Suitable Applications | Robotics, BCI projects |
| Specialties | Real-time processing, AI tasks, multiple neural networks |
Table 5. EEG to ROS processing and control structure.

| Component | Function | Details | Output | Software and Libraries |
| EEG Signal Acquisition | Captures raw EEG data from the headset | Direct interface with EEG headband | Raw EEG data | EEG headset SDK or API |
| Signal Preprocessing | Filters and normalizes EEG signals | Bandpass filters, normalization techniques | Processed EEG data | Python 3.11 libraries (e.g., scipy, numpy) |
| Feature Extraction | Extracts relevant features from EEG signals | Wavelet transforms, extraction of statistical features | Feature vectors | Python 3.11 libraries (e.g., pywavelets) |
| Neural Network Model | Classifies signals using a deep learning model | LSTM or Convolutional Neural Network (CNN) | Classification results | TensorFlow (V2.13), PyTorch (2.2.1) |
| Output Interpretation | Interprets the output of the neural network | Mapping of network output to specific commands | Interpreted commands | Custom script or module |
| Command Generation | Generates appropriate commands for robotic control | Conversion of network output to control signals | Control commands for ROS | Custom script for integration with ROS |
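As a minimal sketch of the Signal Preprocessing stage in Table 5 (the sampling rate, filter order, and band edges are illustrative assumptions, not headset specifications), a Butterworth bandpass per band followed by normalization could look as follows:

```python
# Sketch of the "Signal Preprocessing" stage: per-band Butterworth
# bandpass filtering plus z-score normalization of the band powers.
# Sampling rate and band edges are illustrative assumptions.
import numpy as np
from scipy.signal import butter, filtfilt

FS = 512.0  # assumed raw sampling rate (Hz)
BANDS = {"delta": (1, 3), "theta": (4, 7), "alpha": (8, 13),
         "beta": (14, 30), "gamma": (31, 100)}

def bandpass(x, low, high, fs=FS, order=4):
    b, a = butter(order, [low / (fs / 2), high / (fs / 2)], btype="band")
    return filtfilt(b, a, x)  # zero-phase filtering avoids lag

def band_powers(x):
    """Mean power per band, z-scored across bands."""
    p = np.array([np.mean(bandpass(x, lo, hi) ** 2) for lo, hi in BANDS.values()])
    return (p - p.mean()) / (p.std() + 1e-12)

raw = np.random.randn(int(2 * FS))  # placeholder: 2 s of raw EEG
print(dict(zip(BANDS, band_powers(raw))))
```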
Table 6. ROS nodes and topics.

| Component | Function | ROS Nodes | Topics | Additional Notes |
| EEG Signal Processing Node | | | | |
| Neural Network Node | Interprets EEG signals using LSTM/RNN to generate commands | neural_network_controller | /eeg/commands, /control/commands | Utilizes Deep Learning frameworks such as TensorFlow (V2.13) or PyTorch (2.2.1) |
| Motor Control Node | Processes commands to control robotic arm motors | motor_control | /motor/commands | Requires knowledge of motor control |
| Servo Control Node | Controls servos for precise movements | servo_control | /servo/commands | Servo control algorithms necessary |
| Communication with Arduino | Manages communication between ROS and Arduino Mega | arduino_communicator | /arduino/commands, /arduino/responses | rosserial package can be used for communication |
| Sensor Data Processing | Processes sensor data in robotic arm and wheelchair | sensor_data_processor | /sensors/raw_data, /sensors/processed_data | Sensor data processing for feedback and safety |
| Robotic Arm Control | Coordinates movements of the robotic arm based on commands | arm_movement_coordinator | /arm/commands, /arm/status | Integration with moveit or similar packages |
| Wheelchair Navigation | Manages autonomous navigation of the wheelchair | wheelchair_navigator | /navigation/commands, /navigation/status | Includes route planning and obstacle avoidance algorithms |
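The neural_network_controller node from Table 6 could be sketched as follows; the message types, buffering scheme, model file name, and command labels are assumptions for illustration.

```python
#!/usr/bin/env python
# Sketch of the neural_network_controller node: buffers incoming
# band-power readings and publishes the classified command. Message
# types, topic payloads, and the model path are illustrative.
import collections
import numpy as np
import rospy
import tensorflow as tf
from std_msgs.msg import Float32MultiArray, String

COMMANDS = ["stop", "forward", "backward", "left", "right"]  # assumed labels
WINDOW = 10  # consecutive 8-value EEG readings per inference (assumed)

class NeuralNetworkController(object):
    def __init__(self):
        self.model = tf.keras.models.load_model("eeg_lstm.h5")  # hypothetical file
        self.buffer = collections.deque(maxlen=WINDOW)
        self.pub = rospy.Publisher("/control/commands", String, queue_size=10)
        rospy.Subscriber("/eeg/commands", Float32MultiArray, self.on_eeg)

    def on_eeg(self, msg):
        self.buffer.append(msg.data)  # one reading of the eight band powers
        if len(self.buffer) == WINDOW:
            x = np.array(self.buffer, dtype="float32")[None, ...]
            probs = self.model.predict(x, verbose=0)
            self.pub.publish(String(COMMANDS[int(np.argmax(probs))]))

if __name__ == "__main__":
    rospy.init_node("neural_network_controller")
    NeuralNetworkController()
    rospy.spin()
```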
Table 7. Hardware/signals/software relationship.

| Component | Hardware | Input Signals | Output Signals | Software |
| EEG Headsets (Neurosky, Brainlink) | Neurosky, Brainlink headsets | User’s EEG signals | Delta to Gamma EEG signals | Signal processing software |
| Prefrontal EEG Signal | EEG electrodes in headsets | Delta to Gamma EEG signals | Processed EEG signals for interpretation | Signal classification algorithms |
| Neural Network (Jetson Nano) | Jetson Nano | Processed EEG signals | Control commands for robotic arm/wheelchair | Deep Learning frameworks (e.g., TensorFlow (V2.13), PyTorch (2.2.1)), LSTM/RNN |
| ROS Nodes on Jetson Nano | Jetson Nano | Commands from Neural Network | Control signals to Arduino | ROS, rosserial, moveit, navigation stack |
| Arduino Mega | Arduino Mega controller | Commands from ROS Nodes | Motion commands to robotic arm | Firmware for motor control and feedback |
| Robotic Arm | Motors, sensors, actuators | Motion commands from Arduino | Movement actions | Control algorithms for robotic arm |
| Autonomous Wheelchair | Motors, control systems | Navigation commands from ROS | Movement actions | Wheelchair control and navigation systems |
Table 8. Comparative table of functional requirements.

| Functional Requirements (grouped) | Proposal | Ref 1 | Ref 2 | Ref 3 | Ref 4 | Ref 5 | Ref 6 | Ref 7 | Ref 8 | Ref 9 |
| FR 1, FR 2, FR 7: Functionality and Integration | 100% | 100% | 66% | 100% | 66% | 100% | 100% | 100% | 66% | 66% |
| FR 3, FR 6: Control and Operability | 100% | 100% | 100% | 100% | 0% | 100% | 100% | 50% | 0% | 50% |
| FR 5, FR 11: Adaptability and Usability | 100% | 50% | 50% | 100% | 50% | 50% | 50% | 50% | 50% | 100% |
| FR 4, FR 9: Security and Stability | 100% | 100% | 100% | 100% | 50% | 100% | 100% | 100% | 50% | 100% |
| FR 8, FR 10: Portability and Compatibility | 100% | 50% | 0% | 100% | 0% | 100% | 0% | 0% | 0% | 0% |
| % Total | 100% | 82% | 64% | 100% | 36% | 91% | 73% | 64% | 36% | 55% |
Table 9. Comparative table of technical requirements.

| Technical Requirements (grouped) | Proposal | Ref 1 | Ref 2 | Ref 3 | Ref 4 | Ref 5 | Ref 6 | Ref 7 | Ref 8 | Ref 9 |
| TR 1, TR 7, TR 9: Integration and Compatibility | 100% | 66% | 0% | 66% | 0% | 66% | 66% | 33% | 0% | 33% |
| TR 2, TR 3, TR 4: Efficiency and Performance | 100% | 100% | 66% | 100% | 33% | 100% | 100% | 66% | 66% | 66% |
| TR 5, TR 6: Usability and Accessibility | 100% | 0% | 0% | 50% | 0% | 0% | 50% | 100% | 0% | 100% |
| TR 8, TR 10: Autonomy and Economic Sustainability | 100% | 0% | 0% | 0% | 0% | 0% | 0% | 0% | 0% | 0% |
| TR 11: Regulatory Compliance | 100% | 0% | 0% | 0% | 0% | 0% | 0% | 0% | 0% | 0% |
| % Total | 100% | 45% | 19% | 55% | 9% | 45% | 55% | 45% | 19% | 45% |
Table 10. Functional requirements compliance table.

| Functional Requirement | Achieved | Justification |
| EEG signal capture | Yes | Neurosky and Brainlink EEG headbands |
| Real-time interpretation of EEG signals | Yes | Signal processing algorithms, embedded in Jetson Nano and associated with ROS module execution |
| Intuitive control of the robotic arm and wheelchair | Yes | Classification by MI (Motor Imagery) and/or P300 |
| Safety and stability in movement | Yes | Control systems and safety sensors in the prototype |
| Adaptability to different users | Yes | Machine learning algorithms that adapt to each user, supported by ROS |
| Control of the wheelchair’s movements | Yes | Rational control of movements |
| Portability and low weight (<100 g) | Yes | Commercial headbands with weights less than 100 g |
| EEG/ECG signal quality analysis | Yes | The selected headbands provide and monitor this signal quality value |
| Commercial wireless headbands | Yes | Bluetooth connection |
| Integration of the robot in commercial chairs | Yes | Possibility of modifying the robot’s base support as it is an open-source model with Creative Commons STLs |
| Preset routines in the actions of the robot and robot + chair | Yes | Implementation of preset movements in ROS nodes |
Table 11. Technical requirements compliance table.

| Technical Requirement | Achieved | Justification |
| Effective integration of BCI with wheelchair + robot | Yes | The ROS module allows this integration |
| Efficient EEG data processing (t ≤ 1 s) | Yes | Processing times on Jetson Nano with ROS (benchmark comparison) |
| Real-time interpretation of EEG signals | Yes | The use of the open-source ROS module provides robustness for wheelchair and robot control |
| Robust control system for robotic arm and wheelchair | Yes | Graphical interface and usability tests in the field |
| Accessible and user-friendly interface | Yes | Integrated feedback system in the Jetson Nano control device, with ROS nodes |
| Feedback system for the user | Yes | Testing with various commercial models |
| Integration with commercial motorized wheelchairs | Yes | Mechanical and electrical integration |
| Usage autonomy (≥3 h) | Yes | Calculation of predicted consumption values |
| Support for multi-platform headbands: Windows, Apple, Linux, Android | Yes | Multi-platform headbands with Windows, Linux, iOS, and Android connectivity |
| Low integration cost (<EUR 1500) | Yes | Cost estimation less than EUR 1500 |
| Compliance with applicable regulations | Yes | Compliance with the regulations applicable to these devices according to the current applicable standard |
Table 12. Financial results.

| Results | Data |
| NPV | EUR 2448.22 |
| IRR | 19.49% |
| ROI | 210.75% |
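For reproducibility, the indicators in Table 12 can be recomputed with numpy-financial; the cash-flow vector below is hypothetical (the article's actual five-year flows are shown in Figure 19), so only the formulas, not the numbers, are the point.

```python
# Illustrative NPV/IRR/ROI computation with numpy-financial.
# The discount rate and cash flows are hypothetical placeholders.
import numpy_financial as npf

rate = 0.10                                          # assumed discount rate
cash_flows = [-9000, 2000, 2500, 3000, 3500, 4000]   # year 0..5, hypothetical

npv = npf.npv(rate, cash_flows)   # discounted sum, year 0 undiscounted
irr = npf.irr(cash_flows)         # rate at which NPV equals zero
roi = (sum(cash_flows[1:]) + cash_flows[0]) / -cash_flows[0]

print(f"NPV = {npv:.2f} EUR, IRR = {irr:.2%}, ROI = {roi:.2%}")
```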