  • Proceeding Paper
  • Open Access

28 June 2023

Control of Unmanned Vehicles in Smart Cities Using a Multi-Modal Brain–Computer Interface †

V.A. Trapeznikov Institute of Control Sciences of Russian Academy of Sciences, Profsoyuznaya Street 65, Moscow 117342, Russia
* Author to whom correspondence should be addressed.
Presented at the 15th International Conference “Intelligent Systems” (INTELS’22), Moscow, Russia, 14–16 December 2022.

Abstract

The article presents an overview of several studies in the field of Brain–Computer Interfaces (BCIs) and the requirements for the architecture of such promising devices, and describes a multi-modal BCI for drone control in a smart-city environment. Distinctive features of the proposed solution are the simplicity of the architecture (the use of only one smartphone both for receiving and processing bio-signals from the headset and for transmitting commands to the drone), an open-source software solution for processing signals and for generating and sending commands to the unmanned aerial vehicle (UAV), and the multimodality of the BCI (the use of both electroencephalographic (EEG) and electrooculographic (EOG) signals of the operator). For bio-signal acquisition, we used the NeuroSky MindWave Mobile 2 headset, which is connected to an Android-based smartphone via Bluetooth. The developed Android application (Tello NeuroSky) processes signals from the headset and generates and transmits commands to the DJI Tello UAV via Wi-Fi. The decrease (depression) and increase (amplification) of the α- and β-rhythms of the brain, as well as the EOG signals that occur during blinking, serve as triggers for UAV commands. The developed software allows manual setting of the minimum, maximum, and threshold values for the processed bio-signals. The following UAV commands were implemented: take-off, landing, forward movement, and backward movement. To increase performance, the software utilizes two threads of the smartphone's central processing unit (CPU): one for signal processing (1-D Daubechies 2 (db2) wavelet transform) and updating data on the diagrams, and one for generating and transmitting commands to the drone.

1. Introduction

The current concept of a smart city assumes increasing self-sufficiency of unmanned vehicles (UVs), as well as growth in the amount of data generated and transmitted. Decisions on changing the trajectory and mode of movement will be made either by the UVs themselves or by traffic control centers. Operators in these centers will monitor and control the current road/air situation. In addition, in the case of an emergency, operators will be able to take control of one or more UVs to prevent traffic/air incidents. The rapid development of BCI-based control of robots, UAVs, and other objects (including in smart environments [1,2,3,4]) suggests implementing such control of UVs in traffic control centers, including as a backup option. UV control using a BCI has the potential to reduce command-transmission time and to enable simultaneous control of multiple vehicles by one operator. Thus, the problem of developing methods, techniques, algorithms, and software for the control of UVs in smart cities using a BCI is relevant. Note that this paper focuses exclusively on the control of aerial objects (UAVs), but the given solution can be adapted for other similar objects, including ground and surface UVs.
It is customary to allocate two main data-processing and transmission nodes in the loop of UV/UAV control using the operator’s bio-signals:
  • the BCI, designed to acquire, convert, and process these bio-signals, classify them, and detect features;
  • the Computer–Machine Interface (CMI), designed to convert the output of the BCI into drone/robot/machine-compatible control commands and transmit them to the controlled object.
Although the general solution for operating a robot/UAV using human bio-signals would more accurately be called a Brain–Machine Interface, the literature also refers to it as a BCI (a combination of the Brain–Computer Interface itself and the CMI). Hereinafter, the term BCI is used [5].
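The two-stage split above can be sketched as follows. This is an illustrative outline only, not the paper's software: the function names, the feature dictionary, and the intent labels are all hypothetical.

```python
from typing import Optional

def bci_stage(features: dict) -> str:
    """BCI stage: classify processed bio-signal features into an abstract intent.
    The 'blinks' key is a hypothetical feature extracted upstream."""
    if features.get("blinks") == 2:
        return "takeoff_intent"
    if features.get("blinks") == 3:
        return "land_intent"
    return "idle"

def cmi_stage(intent: str) -> Optional[str]:
    """CMI stage: convert the BCI output into a machine-compatible command string."""
    mapping = {"takeoff_intent": "takeoff", "land_intent": "land"}
    return mapping.get(intent)  # None when no command should be sent

# Example: two detected blinks pass through both stages and become a take-off command.
command = cmi_stage(bci_stage({"blinks": 2}))
```

The point of the split is that the BCI stage knows nothing about the controlled object, and the CMI stage knows nothing about bio-signals; either side can be swapped independently.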
The work is structured as follows. Section 2 presents a description of existing BCI-based UAV control solutions (related work), the types of bio-signals used, and their analysis. Section 3 provides the requirements for promising BCIs for UV control in a smart-city environment, as well as the aim of the research. Section 4 presents the description of the architecture of the proposed BCI, its hardware, and software. Section 5 presents a discussion on the current state of BCI-based solutions for UAV control, including in smart cities.

3. Requirements for a Promising BCI and Problem Statement

Based on the review of the considered studies, which propose different approaches, methods, and algorithms for BCI-based drone control, as well as the shortcomings identified, we highlight the following four main requirements for promising solutions:
1. The implementation of at least 10 control commands (take-off, land, move forward, move backward, move left, move right, move up, move down, turn left, and turn right).
2. Ease of control and of switching between actions, for example, using different bio-signals to generate different commands within a multi-modal BCI.
3. Simplicity of the BCI architecture and the use of the minimum necessary hardware and software stack for drone control; open-source software solutions are preferable.
4. Taking into account the unique features of each operator's brain activity, including adaptive adjustment via machine-learning methods when processing bio-signals.
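The ten commands of requirement 1 map naturally onto the DJI Tello's text-based SDK, in which the drone accepts plain-text commands over UDP. The mapping below is a sketch under that assumption; the specific distance (20 cm, matching the step size used later in the paper) and rotation angle (90°) are illustrative choices, not values prescribed by the requirement.

```python
# Requirement 1's ten control commands expressed as Tello SDK command strings.
# Distances are in cm, rotations in degrees; both values are illustrative.
TELLO_COMMANDS = {
    "take_off":      "takeoff",
    "land":          "land",
    "move_forward":  "forward 20",
    "move_backward": "back 20",
    "move_left":     "left 20",
    "move_right":    "right 20",
    "move_up":       "up 20",
    "move_down":     "down 20",
    "turn_left":     "ccw 90",   # counter-clockwise rotation
    "turn_right":    "cw 90",    # clockwise rotation
}
```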

Problem Statement

In this paper, we aim to develop a BCI architecture and the corresponding software that meet the third of the above requirements and, partially, the second. The aim of the work is to simplify the hardware component of the BCI and to develop an open-source solution for processing bio-signals and generating commands for a drone. The multimodality of the developed BCI involves the use of both EEG and EOG signals.

4. BCI Architecture, Hardware and Software

This paper uses the previously proposed concept of a non-invasive BCI, as well as methods of EEG signal retrieval and processing [22,23]. To acquire EEG and EOG signals, we used the NeuroSky MindWave Mobile 2 headset, which was connected to an Android-based smartphone via Bluetooth. The developed Android application, Tello NeuroSky (v. 1.0), processes the signals received from the headset, generates drone-compatible commands, and transmits them to the DJI Tello UAV via Wi-Fi. The architecture of the proposed BCI and the equipment used are shown in Figure 1 and Figure 2, respectively.
Figure 1. The architecture of the proposed BCI-based UAV control solution.
Figure 2. BCI equipment and hardware.
The developed software allows the use of the increase and decrease of α- and β-rhythms, as well as EOG signals from blinking, as triggers to generate commands for the UAV. The application implements manual adjustment of the minimum, maximum, and threshold values for the α- and β-rhythms, as well as for the EOG signals. The following commands have been implemented: two blinks in a row triggers take-off; three blinks in a row triggers landing; amplification of the β-rhythms (above both the threshold and the α-rhythm level) triggers a 20 cm forward movement; amplification of the α-rhythms (above both the threshold and the β-rhythm level) triggers a 20 cm backward movement. Due to the high computational load on the smartphone CPU, the software uses two separate threads: one for signal processing (1-D Daubechies 2 (db2) wavelet transform) and visualizing data on charts, and another for generating and transmitting commands to the drone. Figure 3 shows the graphical user interface (GUI) of the developed software.
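The four trigger rules above can be sketched as a single mapping function. This is an illustrative reconstruction, assuming the blink count and the α/β band-power values have already been extracted from the headset signals; the function and parameter names are hypothetical, and the returned strings follow the Tello SDK's plain-text command format.

```python
from typing import Optional

def trigger_to_command(blinks: int, alpha: float, beta: float,
                       alpha_thr: float, beta_thr: float) -> Optional[str]:
    """Map bio-signal triggers to Tello command strings (None = no trigger)."""
    if blinks == 2:
        return "takeoff"        # two blinks in a row: take-off
    if blinks == 3:
        return "land"           # three blinks in a row: land
    if beta > beta_thr and beta > alpha:
        return "forward 20"     # beta amplification: move forward 20 cm
    if alpha > alpha_thr and alpha > beta:
        return "back 20"        # alpha amplification: move backward 20 cm
    return None
```

In the actual application, a call like this would run on the command-generation thread, with the thresholds taken from the manually adjustable settings described above.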
Figure 3. Software GUI: (a) detection of two blinks in a row; (b) detection of three blinks in a row; (c) detection of α-rhythm amplification and β-rhythm depression; (d) detection of β-rhythm amplification and α-rhythm depression.
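To illustrate the 1-D db2 wavelet transform used in the signal-processing thread, a single decomposition level can be sketched as below. This pure-Python version (using the standard db2 filter coefficients and periodic boundary extension) is a minimal sketch, not the paper's Android implementation.

```python
import math

# Orthonormal Daubechies 2 (db2) decomposition filters.
S3 = math.sqrt(3)
NORM = 4 * math.sqrt(2)
LO = [(1 + S3) / NORM, (3 + S3) / NORM, (3 - S3) / NORM, (1 - S3) / NORM]  # low-pass
HI = [LO[3], -LO[2], LO[1], -LO[0]]                                         # high-pass

def db2_step(signal):
    """One level of the db2 DWT: convolve with the low-/high-pass filters
    and downsample by 2, returning (approximation, detail) coefficients."""
    n = len(signal)
    approx, detail = [], []
    for i in range(0, n, 2):  # step of 2 performs the downsampling
        approx.append(sum(LO[k] * signal[(i + k) % n] for k in range(4)))
        detail.append(sum(HI[k] * signal[(i + k) % n] for k in range(4)))
    return approx, detail

# Example: a constant signal puts all its energy into the approximation band,
# so every detail coefficient is (numerically) zero.
approx, detail = db2_step([1.0] * 8)
```

In practice the detail coefficients at an appropriate level carry the sharp EOG blink transients, while the smoother approximation tracks slower rhythm changes, which is why a wavelet decomposition suits this multi-modal setting.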

5. Discussion

Within the formed list of requirements for a promising BCI-based solution for UAV control (including in smart cities), we managed to implement the third requirement as well as part of the second. The simplicity of the architecture of the proposed solution lies in the use of only one smartphone both for retrieving and processing bio-signals and for transmitting control commands to the drone; the software solution is open-source and publicly available. The developed BCI is multi-modal: the tracking of changes in α- and β-rhythms and of eye blinking is used to form different commands for the UAV. We plan to introduce additional commands for the UAV: changes in the μ-rhythms of the brain can be used to implement left/right movement. In addition, machine-learning methods and a neural network will be introduced to better detect blinking and to take into account the individual characteristics of each operator's bio-signals. This paper focuses exclusively on the control of aerial objects (UAVs), though the proposed solution can be adapted for other objects, including ground and surface UVs. The software code for the presented solution is available at [24].

6. Conclusions

The article presents an overview of the current state of research on BCIs for UAV control and the requirements for promising solutions, and proposes a multi-modal BCI to control drones in a smart-city environment. The developed software allows the use of the depression and amplification of the α- and β-rhythms of the brain, as well as EOG signals that occur during blinking, as commands to control the flight of aerial objects. Four commands are implemented: take-off, land, move forward, and move backward.
The distinctive feature of the proposed solution is the simplicity of the architecture: only one Android-based smartphone is used both for receiving and processing signals from the headset and for generating and transmitting commands to the drone. These functions are implemented in the developed open-source, publicly available software. The proposed BCI is multi-modal since both EEG and EOG signals are processed. It is planned to introduce additional UAV commands (move left/right) and to implement a neural network and machine-learning methods to take into account the individual characteristics of each operator's bio-signals. The presented solution can be adapted to control other mobile objects in smart cities, including ground and surface UVs.

Author Contributions

Data curation, D.W.; formal analysis, D.W. and M.M.; funding acquisition, E.J.; investigation, D.W.; methodology, E.J.; project administration, E.J.; resources, D.W.; software, D.W.; supervision, E.J.; validation, M.M.; visualization, M.M.; writing—original draft, M.M.; writing—review and editing, M.M. All authors have read and agreed to the published version of the manuscript.

Funding

The reported study was partially funded by RFBR, project number 19-29-06044.

Institutional Review Board Statement

Not applicable.

Data Availability Statement

Data available in a publicly accessible repository that does not issue DOIs. URL: https://github.com/Runsolar/tello-neurosky.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Latif, M.Y.; Naeem, L.; Hafeez, T.; Raheel, A.; Saeed, S.M.U.; Awais, M.; Alnowami, M.; Anwar, S.M. Brain computer interface based robotic arm control. In Proceedings of the 2017 International Smart Cities Conference (ISC2), Wuxi, China, 14–17 September 2017; pp. 1–5.
  2. Al-Turabi, H.; Al-Junaid, H. Brain computer interface for wheelchair control in smart environment. In Proceedings of the Smart Cities Symposium 2018, Zallaq, Bahrain, 22–23 April 2018; pp. 1–6.
  3. Li, Y.; Zhang, F.; Yang, Y. Smart House Control System Controlled by Brainwave. In Proceedings of the 2019 International Conference on Intelligent Transportation, Big Data & Smart City (ICITBS), Changsha, China, 12–13 January 2019; pp. 536–539.
  4. Thum, G.E.; Gaffar, A. The future of brain-computer interaction: How future cars will interact with their passengers. In Proceedings of the 2017 IEEE SmartWorld, Ubiquitous Intelligence & Computing, Advanced & Trusted Computed, Scalable Computing & Communications, Cloud & Big Data Computing, Internet of People and Smart City Innovation (SmartWorld/SCALCOM/UIC/ATC/CBDCom/IOP/SCI), San Francisco, CA, USA, 4–8 August 2017; pp. 1–5.
  5. Duarte, R.M. Low Cost Brain Computer Interface System for AR.Drone Control. Master’s Thesis, Universidade Federal de Santa Catarina, Centro Tecnológico, Programa de Pós-Graduação em Engenharia de Automação e Sistemas, Florianópolis, Brazil, 2017.
  6. Hekmatmanesh, A.; Nardelli, P.H.J.; Handroos, H. Review of the State-of-the-Art of Brain-Controlled Vehicles. IEEE Access 2021, 9, 110173–110193.
  7. Värbu, K.; Muhammad, N.; Muhammad, Y. Past, Present, and Future of EEG-Based BCI Applications. Sensors 2022, 22, 3331.
  8. Villegas, I.D.; Camargo, J.R.; Perdomo, C.C.A. Recognition and Characteristics EEG Signals for Flight Control of a Drone. IFAC-PapersOnLine 2021, 54, 50–55.
  9. Sun, S.; Ma, J. Brain Wave Control Drone. In Proceedings of the 2019 International Conference on Artificial Intelligence and Advanced Manufacturing (AIAM), Dublin, Ireland, 16–18 October 2019; pp. 300–304.
  10. Tezza, D.; Garcia, S.; Hossain, T.; Andujar, M. Brain eRacing: An Exploratory Study on Virtual Brain-Controlled Drones. In Virtual, Augmented and Mixed Reality. Applications and Case Studies, Proceedings of the 11th International Conference, VAMR 2019, Held as Part of the 21st HCI International Conference, HCII 2019, Orlando, FL, USA, 26–31 July 2019; Springer International Publishing: Cham, Switzerland, 2019; Volume 11575, pp. 150–162.
  11. Al-Nuaimi, F.A.; Al-Nuaimi, R.J.; Al-Dhaheri, S.S.; Ouhbi, S.; Belkacem, A.N. Mind Drone Chasing Using EEG-Based Brain Computer Interface. In Proceedings of the 2020 16th International Conference on Intelligent Environments (IE), Madrid, Spain, 20–23 July 2020; pp. 74–79.
  12. Chiuzbaian, A.; Jakobsen, J.; Puthusserypady, S. Mind Controlled Drone: An Innovative Multiclass SSVEP based Brain Computer Interface. In Proceedings of the 2019 7th International Winter Conference on Brain-Computer Interface (BCI), Gangwon, Republic of Korea, 18–20 February 2019; pp. 1–5.
  13. Kim, S.; Lee, S.; Kang, H.; Kim, S.; Ahn, M. P300 Brain–Computer Interface-Based Drone Control in Virtual and Augmented Reality. Sensors 2021, 21, 5765.
  14. Dumitrescu, C.; Costea, I.-M.; Semenescu, A. Using Brain-Computer Interface to Control a Virtual Drone Using Non-Invasive Motor Imagery and Machine Learning. Appl. Sci. 2021, 11, 11876.
  15. Reddy, M.H.H.N. Brain Computer Interface Drone. In Brain-Computer Interface; IntechOpen: Rijeka, Croatia, 2021; pp. 1–19.
  16. Peining, P.; Tan, G.; Aung, A.; Phyo Wai, A. Evaluation of Consumer-Grade EEG Headsets for BCI Drone Control. In Proceedings of the IRC Conference on Science, Engineering, and Technology, Singapore, 10 August 2017; pp. 1–6.
  17. Rosca, S.; Leba, M.; Ionica, A.; Gamulescu, O. Quadcopter control using a BCI. IOP Conf. Ser. 2018, 294, 0120485.
  18. Duan, X.; Xie, S.; Xie, X.; Meng, Y.; Xu, Z. Quadcopter Flight Control Using a Non-invasive Multi-Modal Brain Computer Interface. Front. Neurorobot. 2019, 13, 23.
  19. Lee, D.-H.; Ahn, H.-J.; Jeong, J.-H.; Lee, S.-W. Design of an EEG-based Drone Swarm Control System using Endogenous BCI Paradigms. In Proceedings of the 2021 9th International Winter Conference on Brain-Computer Interface (BCI), Gangwon, Republic of Korea, 22–24 February 2021; pp. 1–5.
  20. Marin, I.; Al-Battbootti, M.J.H.; Goga, N. Drone Control based on Mental Commands and Facial Expressions. In Proceedings of the 2020 12th International Conference on Electronics, Computers and Artificial Intelligence (ECAI), Bucharest, Romania, 25–27 June 2020; pp. 1–4.
  21. Abdulwahhab, A.H. Improved Algorithms for EEG-Based BCI Application. Master’s Thesis, Istanbul Gelisim University, Institute of Graduate Studies, Istanbul, Turkey, 2021.
  22. Turovskiy, Y.; Volf, D.; Iskhakova, A.; Iskhakov, A.Y. Neuro-Computer Interface Control of Cyber-Physical Systems. In High-Performance Computing Systems and Technologies in Scientific Research, Automation of Control and Production, Proceedings of the 11th International Conference, HPCST 2021, Barnaul, Russia, 21–22 May 2021; Springer International Publishing: Cham, Switzerland, 2022; Volume 1526, pp. 338–353.
  23. Kharchenko, S.; Meshcheryakov, R.; Turovsky, Y.; Volf, D. Implementation of Robot–Human Control Bio-Interface When Highlighting Visual-Evoked Potentials Based on Multivariate Synchronization Index. In Proceedings of the 15th International Conference on Electromechanics and Robotics “Zavalishin’s Readings”, Ufa, Russia, 15–18 April 2020; Smart Innovation, Systems and Technologies; Springer: Singapore, 2021; Volume 187, pp. 225–236.
  24. GitHub Repository. Tello-Neurosky. 2022. Available online: https://github.com/Runsolar/tello-neurosky (accessed on 1 September 2022).
