Brain-Computer Interface-Based Humanoid Control: A Review
Abstract
1. Introduction
- This paper reviews various applications in which a humanoid is controlled using brain signals to perform a wide variety of tasks, such as grasping objects, navigation, and telepresence;
- For each application, we provide an overview of the application, its system design, and the results of the associated experiments;
- Specifically, in this review we consider BCI applications that use only EEG signals (Section 3), applications that use multisensor fusion, wherein sensor inputs in addition to EEG are considered for executing the desired task (Section 4), as well as augmented reality-assisted BCI (Section 5);
- To the best of our knowledge, this work is the first review on BCI-controlled humanoids.
2. Preliminary Knowledge
2.1. Brain-Computer Interface
2.2. Hybrid BCI
- Brain signals must be used in the BCI system;
- The user should be able to intentionally control at least one of the brain signals;
- The BCI system should process the signal in real time;
- The user must be provided with feedback of the BCI output.
2.3. Classification Algorithms
2.4. Humanoids
- Nao Humanoid (Softbank Robotics) [62];
- HRP-2 Humanoid (Kawada Industries) [63];
- KT-X Humanoid (Kumotek Robotics) [24];
- DARwIn-OP (Robotis) [64].
- 17–30 degrees of freedom;
- Multiple sensors, such as gyroscopes and force sensors, on different body parts (head, torso, arms, and legs);
- Microphones and speakers to interact with humans;
- Two cameras for object detection and recognition (in NAO);
- An open architecture that enables custom application development (see the SDK sketch below).
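This open architecture is what makes the BCI integrations reviewed below possible: each platform exposes an SDK through which an external pipeline can issue motion and speech commands. Below is a hedged sketch of how such a custom application might look with NAO's NAOqi Python SDK; `ALProxy` is the SDK's real entry point, but the IP address is a placeholder and module availability varies across NAOqi versions.

```python
from naoqi import ALProxy   # NAOqi Python SDK (Python 2 on older releases)

NAO_IP, NAO_PORT = "192.168.1.10", 9559   # placeholder address, default port

motion = ALProxy("ALMotion", NAO_IP, NAO_PORT)
tts = ALProxy("ALTextToSpeech", NAO_IP, NAO_PORT)

motion.wakeUp()                 # stiffen the joints and stand up
motion.moveTo(0.2, 0.0, 0.0)    # walk 0.2 m forward (x, y, theta)
tts.say("Command executed")     # audio feedback to the user
```

In the applications reviewed below, a BCI selection is ultimately translated into calls of this kind behind the command interface.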
3. BCI-Controlled Humanoid Applications Using Only EEG
3.1. Grasp a Glass of Water Using NAO (Type: Rehabilitation)
- Teleoperated Mode: In this mode, the user controls the movement of the robot and also gives the commands to grasp and hand over a glass of water;
- Autonomous Mode: In this mode, the user gives only abstract commands, and the humanoid plans its actions according to the current state.
3.2. Telepresence by Humanoid Using P300 Signal (Type: Entertainment)
3.3. BCI-Operated Museum Guide (Type: Entertainment)
4. BCI-Controlled Humanoid Applications Using Hybrid BCI
4.1. Picking Objects Using Neuro-Biological Feedback Fusion (Type: Rehabilitation)
- BCI system: Visual Evoked Potentials (VEPs) and the P300 are used, with an oddball paradigm for eliciting the ERPs. The salient features of the system are as follows (a preprocessing sketch follows this list):
  - Signal processing: A g.USBamp device was used for recording the signals, with electrodes placed according to the standard 10–20 system. The signal was digitised at 256 Hz. A Butterworth filter was used to reduce artefacts, and a temporal filter averaged the samples to reduce noise. In this study, 6 epochs, each with a window of 800 ms, were used.
  - Feature extraction: Fisher's stepwise Linear Discriminant is used during training to adapt the classifier to the user's brain. LDA separates the different classes using hyperplanes. In this application, LDA scores the stimuli recorded for every action on the grid and then selects the most prominent action on the grid.
  - User interface: It is similar to the 3 × 3 grid used in [66] (Figure 6). Low-level behaviours include controlling all possible directional movements of the humanoid, whereas high-level behaviours include issuing control commands such as holding an item and handing over the held item, similar to those considered in [66].
- Biofeedback system using neurological states and gaze: The biofeedback system takes into account the user's eye and brain activity. It includes four parameters: mental intention, attention, visual focus, and stress. An action is executed only when the biofeedback factor (B) is greater than 60% (a gating sketch follows this list). The modules of the biofeedback system are explained below:
  - Attention module: Since there are nine commands, Fisher's Linear Discriminant (FLD) is used with a one-versus-rest approach. Attention is expressed as a percentage and is based on the power of the P300 waves measured while performing the task.
  - Intention module: The correlation factor of the P300 wave is used to measure intention; it is based on the precision of the system.
  - Visual focus module: It is calculated by evaluating the user's gaze through eye-tracking, as shown in Figure 10a, which distinguishes central, lateral, and outer focus; all values are expressed as percentages.
  - Entropy module: A stressful condition corresponds to high entropy in the brain signals, and signal processing steps are performed to extract a normalised entropy value. Finally, the value B is calculated as a weighted average of the attention, intention, and visual focus values.
- Connection of the subject to the robot: Commands from the BCI are received through a User Datagram Protocol (UDP) connection to the control interface, while the connection to the robotic system is made through a TCP/IP socket for reliability (see the bridge sketch after this list).
- Controlling the behaviour of the robot: Two control modes are proposed by the authors:
  - Navigation mode: NAO can move in six ways, namely walking (forward and reverse), turning (left and right), and rotating (clockwise and anti-clockwise).
  - High-level mode: It includes complex tasks such as holding an object and giving the object to the user after identifying the user's location.
The distance metric (O) is also used to avoid collisions based on a threshold value: if the distance metric is below the threshold, it is considered safe to execute a command. Once that is ensured, the corresponding safe reaction command is activated and, together with the biological factor B and O, is passed to a function that finally executes the command R corresponding to the control command (see the gating sketch below).
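To make the signal-processing step above concrete, here is a minimal sketch of a P300 preprocessing chain of the kind described: band-pass filtering followed by averaging six 800 ms post-stimulus epochs. The 256 Hz rate, 800 ms window, and six-epoch average come from the paper; the band edges, channel count, and stimulus timing are illustrative assumptions.

```python
import numpy as np
from scipy.signal import butter, filtfilt

FS = 256                   # sampling rate (Hz), as reported in the study
EPOCH_S = int(0.8 * FS)    # 800 ms window per epoch
N_EPOCHS = 6               # epochs averaged per decision

def bandpass(eeg, lo=0.5, hi=30.0, order=4):
    """Zero-phase Butterworth band-pass; band edges are illustrative."""
    b, a = butter(order, [lo / (FS / 2), hi / (FS / 2)], btype="band")
    return filtfilt(b, a, eeg, axis=-1)

def average_epochs(eeg, stim_onsets):
    """Cut an 800 ms epoch after each stimulus onset and average them
    to suppress background EEG and enhance the P300 component."""
    epochs = [eeg[:, s:s + EPOCH_S] for s in stim_onsets[:N_EPOCHS]]
    return np.mean(epochs, axis=0)      # (channels, samples)

# toy usage: 8 channels, 10 s of synthetic EEG, one stimulus per second
eeg = np.random.randn(8, 10 * FS)
onsets = np.arange(N_EPOCHS) * FS
erp = average_epochs(bandpass(eeg), onsets)
feature_vector = erp.flatten()          # fed to the (stepwise) LDA classifier
```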
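The biofeedback gating can be sketched in the same spirit. The paper specifies only that B is a weighted average of the attention, intention, and visual focus values and that a command executes only when B exceeds 60% and the distance metric O indicates a safe state; the weights, the O threshold, and the robot interface below are placeholders.

```python
# Hypothetical weights and O threshold; only the 60% bound on B is from the paper.
W_ATTENTION, W_INTENTION, W_FOCUS = 0.4, 0.3, 0.3   # illustrative weights
B_THRESHOLD = 60.0      # percent, from the paper
O_THRESHOLD = 0.5       # illustrative distance threshold

def biofeedback_factor(attention, intention, visual_focus):
    """Weighted average of the three biofeedback modules (percentages)."""
    return (W_ATTENTION * attention
            + W_INTENTION * intention
            + W_FOCUS * visual_focus)

def execute_if_safe(command, attention, intention, visual_focus, o_metric, robot):
    """Gate a BCI command on the biofeedback factor B and distance metric O."""
    b = biofeedback_factor(attention, intention, visual_focus)
    if b > B_THRESHOLD and o_metric < O_THRESHOLD:  # safety check as described
        robot.run(command)      # hypothetical robot interface
        return True
    return False                # suppressed: low confidence or unsafe state
```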
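Finally, a minimal sketch of the described connection scheme: a connectionless UDP socket for incoming BCI selections and a TCP socket toward the robotic system, where reliable, ordered delivery matters. The ports, addresses, and message format are assumptions.

```python
import socket

BCI_UDP_PORT = 5005                    # assumed port for incoming BCI selections
ROBOT_ADDR = ("192.168.1.10", 9559)    # assumed robot host and port

# UDP: low-latency, connectionless input from the BCI control interface
udp = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
udp.bind(("0.0.0.0", BCI_UDP_PORT))

# TCP: reliable, ordered delivery of commands to the robotic system
tcp = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
tcp.connect(ROBOT_ADDR)

while True:
    data, _ = udp.recvfrom(1024)            # one BCI selection per datagram
    command = data.decode().strip()         # e.g., "walk_forward" (assumed format)
    tcp.sendall((command + "\n").encode())  # forward over the reliable link
```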
4.2. Humanoid Control Using Facial Signals (Type: Entertainment)
5. Applications Using BCI Supported by Augmented Reality (AR)/Virtual Reality (VR)
6. Summary of Applications
7. Conclusions
Author Contributions
Funding
Conflicts of Interest
References
- Mantri, S.; Dukare, V.; Yeole, S.; Patil, D.; Wadhai, V.M. A Survey: Fundamental of EEG. Int. J. Adv. Res. Comput. Sci. Manag. Stud. 2013, 1, 1–7. [Google Scholar]
- Pfurtscheller, G.; Neuper, C.; Guger, C.; Harkam, W.; Ramoser, H.; Schlögl, A.; Obermaier, B.; Pregenzer, M. Current trends in Graz Brain-Computer Interface (BCI) research. IEEE Trans. Rehabil. Eng. 2000, 8, 216–219. [Google Scholar] [CrossRef]
- Nicolas-Alonso, L.F.; Gomez-Gil, J. Brain Computer Interfaces, a Review. Sensors 2012, 12, 1211–1279. [Google Scholar] [CrossRef]
- Hirai, K.; Hirose, M.; Haikawa, Y.; Takenaka, T. The development of Honda humanoid robot. In Proceedings of the 1998 IEEE International Conference on Robotics and Automation (Cat. No.98CH36146), Leuven, Belgium, 16–20 May 1998; IEEE: Piscataway, NJ, USA, 1998; Volume 2, pp. 1321–1326. [Google Scholar]
- Brooks, R.; Breazeal, C.; Marjanović, M.; Scassellati, B.; Williamson, M.M. The Cog Project: Building a Humanoid Robot. In Computer Vision; Springer: Berlin/Heidelberg, Germany, 1999; Volume 1562, pp. 52–87. [Google Scholar]
- George, M.; Tardif, J.-P.; Kelly, A. Visual and inertial odometry for a disaster recovery humanoid. In Field and Service Robotics; Springer: Cham, Switzerland, 2015; pp. 501–514. [Google Scholar]
- Kakiuchi, Y.; Kojima, K.; Kuroiwa, E.; Noda, S.; Murooka, M.; Kumagai, I.; Ueda, R.; Sugai, F.; Nozawa, S.; Okada, K.; et al. Development of humanoid robot system for disaster response through team nedo-jsk’s approach to darpa robotics challenge finals. In Proceedings of the 2015 IEEE-RAS 15th International Conference on Humanoid Robots (Humanoids), Seoul, Korea, 3–5 November 2015; IEEE: Piscataway, NJ, USA, 2015; pp. 805–810. [Google Scholar]
- Vukobratović, M. Humanoid Robotics, Past, Present State, Future; Robotics Center, Mihailo Pupin Institute: 11000 Belgrade, Serbia, 2006; pp. 13–27. [Google Scholar]
- Vukobratović, M. Active exoskeletal systems and beginning of the development of humanoid robotics. Facta Univ.-Ser. Mech. Autom. Control. Robot. 2008, 7, 243–262. [Google Scholar]
- Shajahan, J.A.; Jain, S.; Joseph, C.; Keerthipriya, G.; Raja, P.K. Target detecting defence humanoid sniper. In Proceedings of the 2012 Third International Conference on Computing, Communication and Networking Technologies (ICCCNT’12), Coimbatore, India, 26 July 2012; IEEE: Piscataway, NJ, USA, 2012; pp. 1–6. [Google Scholar]
- Alladi, T.; Chamola, V.; Sikdar, B.; Choo, K.K. Consumer iot: Security vulnerability case studies and solutions. IEEE Consum. Electron. Mag. 2020, 9, 17–25. [Google Scholar] [CrossRef]
- Hassija, V.; Chamola, V.; Saxena, V.; Jain, D.; Goyal, P.; Sikdar, B. A Survey on IoT Security: Application Areas, Security Threats, and Solution Architectures. IEEE Access 2019, 7, 82721–82743. [Google Scholar] [CrossRef]
- Alladi, T.; Chamola, V.; Zeadally, S. Industrial Control Systems: Cyberattack trends and countermeasures. Comput. Commun. 2020, 155, 1–8. [Google Scholar] [CrossRef]
- Luo, R.C.; Chang, C.-C. Multisensor Fusion and Integration: A Review on Approaches and Its Applications in Mechatronics. IEEE Trans. Ind. Inf. 2011, 8, 49–60. [Google Scholar] [CrossRef]
- Novak, D.; Riener, R. A survey of sensor fusion methods in wearable robotics. Robot. Auton. Syst. 2015, 73, 155–170. [Google Scholar] [CrossRef]
- Wolpaw, J.R.; Birbaumer, N.; Heetderks, W.; McFarland, D.; Peckham, P.; Schalk, G.; Donchin, E.; Quatrano, L.; Robinson, C.; Vaughan, T. Brain-computer interface technology: A review of the first international meeting. IEEE Trans. Rehabil. Eng. 2000, 8, 164–173. [Google Scholar] [CrossRef] [PubMed]
- Fabiani, G.; McFarland, D.; Wolpaw, J.R.; Pfurtscheller, G. Conversion of EEG Activity Into Cursor Movement by a Brain–Computer Interface (BCI). IEEE Trans. Neural Syst. Rehabil. Eng. 2004, 12, 331–338. [Google Scholar] [CrossRef] [PubMed]
- Minguillon, J.; Lopez-Gordo, M.A.; Pelayo, F. Trends in EEG-BCI for daily-life: Requirements for artifact removal. Biomed. Signal Process. Control. 2017, 31, 407–418. [Google Scholar] [CrossRef]
- Abdulkader, S.N.; Atia, A.; Mostafa, M.-S. Brain computer interfacing: Applications and challenges. Egypt. Inf. J. 2015, 16, 213–230. [Google Scholar] [CrossRef] [Green Version]
- Gao, X.; Xu, D.; Cheng, M.; Gao, S. A bci-based environmental controller for the motion-disabled. IEEE Trans. Neural Syst. Rehabil. Eng. 2003, 11, 137–140. [Google Scholar] [CrossRef]
- Rebsamen, B.; Burdet, E.; Guan, C.; Zhang, H.; Teo, C.L.; Zeng, Q.; Laugier, C.; Ang, M. Controlling a Wheelchair Indoors Using Thought. IEEE Intell. Syst. 2007, 22, 18–24. [Google Scholar] [CrossRef]
- Reuderink, B. Games and Brain-Computer Interfaces: The State of the Art; WP2 BrainGain Deliverable, HMI, University of Twente: Enschede, The Netherlands, September 2008; pp. 1–11. [Google Scholar]
- Finke, A.; Lenhardt, A.; Ritter, H. The MindGame: A P300-based brain–computer interface game. Neural Netw. 2009, 22, 1329–1333. [Google Scholar] [CrossRef]
- Li, W.; Jaramillo, C.; Li, Y. Development of mind control system for humanoid robot through a brain computer interface. In Proceedings of the 2012 Second International Conference on Intelligent System Design and Engineering Application, Sanya, China, 6–7 January 2012; IEEE: Piscataway, NJ, USA, 2012; pp. 679–682. [Google Scholar]
- Millán, J.D.; Rupp, R.; Müller-Putz, G.; Murray-Smith, R.; Giugliemma, C.; Tangermann, M.; Vidaurre, C.; Cincotti, F.; Kubler, A.; Leeb, R.; et al. Combining brain–computer interfaces and assistive technologies: State-of-the-art and challenges. Front. Mol. Neurosci. 2010, 4, 161. [Google Scholar] [CrossRef]
- Cortes, A.M.; Manyakov, N.V.; Chumerin, N.; Van Hulle, M.M. Language Model Applications to Spelling with Brain-Computer Interfaces. Sensors 2014, 14, 5967–5993. [Google Scholar] [CrossRef] [Green Version]
- Gomez-Gil, J.; San-Jose-Gonzalez, I.; Nicolas-Alonso, L.F.; Alonso-Garcia, S. Steering a Tractor by Means of an EMG-Based Human-Machine Interface. Sensors 2011, 11, 7110–7126. [Google Scholar] [CrossRef] [Green Version]
- Wang, F.; Zhang, X.; Fu, R.; Sun, G. Study of the Home-Auxiliary Robot Based on BCI. Sensors 2018, 18, 1779. [Google Scholar] [CrossRef] [PubMed] [Green Version]
- Ahn, M.; Lee, M.; Choi, J.; Jun, S.C. A Review of Brain-Computer Interface Games and an Opinion Survey from Researchers, Developers and Users. Sensors 2014, 14, 14601–14633. [Google Scholar] [CrossRef] [PubMed]
- Sung, Y.; Cho, K.; Um, K. A Development Architecture for Serious Games Using BCI (Brain Computer Interface) Sensors. Sensors 2012, 12, 15671–15688. [Google Scholar] [CrossRef] [Green Version]
- Schalk, G.; McFarland, D.; Hinterberger, T.; Birbaumer, N.; Wolpaw, J.R. BCI2000: A General-Purpose Brain-Computer Interface (BCI) System. IEEE Trans. Biomed. Eng. 2004, 51, 1034–1043. [Google Scholar] [CrossRef]
- Chae, Y.; Jeong, J.; Jo, S. Toward Brain-Actuated Humanoid Robots: Asynchronous Direct Control Using an EEG-Based BCI. IEEE Trans. Robot. 2012, 28, 1131–1144. [Google Scholar] [CrossRef]
- Güneysu, A.; Akin, H.L. An SSVEP based BCI to control a humanoid robot by using portable EEG device. In Proceedings of the 2013 35th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC), Osaka, Japan, 3–7 July 2013; IEEE: Piscataway, NJ, USA, 2013; pp. 6905–6908. [Google Scholar]
- Zander, T.O.; Kothe, C.; Jatzev, S.; Gaertner, M. Enhancing Human-Computer Interaction with Input from Active and Passive Brain-Computer Interfaces. In Evaluating User Experience in Games; Springer: London, UK, 2010; pp. 181–199. [Google Scholar]
- Shenoy, P.; Krauledat, M.; Blankertz, B.; Rao, R.P.N.; Müller, K.-R. Towards adaptive classification for BCI. J. Neural Eng. 2006, 3, R13–R23. [Google Scholar] [CrossRef] [Green Version]
- Lee, M.-H.; Fazli, S.; Mehnert, J.; Lee, S.-W. Subject-dependent classification for robust idle state detection using multi-modal neuroimaging and data-fusion techniques in BCI. Pattern Recognit. 2015, 48, 2725–2737. [Google Scholar] [CrossRef]
- Bansal, G.; Chamola, V.; Narang, P.; Kumar, S.; Raman, S. Deep3DSCan: Deep residual network and morphological descriptor based framework for lung cancer classification and 3D segmentation. IET Image Process. 2020, 14, 1240–1247. [Google Scholar] [CrossRef]
- Chamola, V.; Hassija, V.; Gupta, V.; Guizani, M. A Comprehensive Review of the COVID-19 Pandemic and the Role of IoT, Drones, AI, Blockchain, and 5G in Managing Its Impact. IEEE Access 2020, 8, 90225–90265. [Google Scholar] [CrossRef]
- Hassija, V.; Gupta, V.; Garg, S.; Chamola, V. Traffic Jam Probability Estimation Based on Blockchain and Deep Neural Networks. IEEE Trans. Intell. Transp. Syst. 2020, 1–10. [Google Scholar] [CrossRef]
- Hong, K.-S.; Khan, M.J. Hybrid Brain–Computer Interface Techniques for Improved Classification Accuracy and Increased Number of Commands: A Review. Front. Neurorobot. 2017, 11. [Google Scholar] [CrossRef] [Green Version]
- Choi, B.; Jo, S. A Low-Cost EEG System-Based Hybrid Brain-Computer Interface for Humanoid Robot Navigation and Recognition. PLoS ONE 2013, 8, e74583. [Google Scholar] [CrossRef]
- Fazli, S.; Dähne, S.; Samek, W.; Bießmann, F.; Müller, K.-R. Learning From More Than One Data Source: Data Fusion Techniques for Sensorimotor Rhythm-Based Brain–Computer Interfaces. Proc. IEEE 2015, 103, 891–906. [Google Scholar] [CrossRef]
- Pfurtscheller, G.; Allison, B.Z.; Brunner, C.; Bauernfeind, G.; Escalante, T.S.; Scherer, R.; Zander, T.O.; Mueller-Putz, G.; Neuper, C.; Birbaumer, N. The Hybrid BCI. Front. Mol. Neurosci. 2010, 4. [Google Scholar] [CrossRef]
- Aswath, S.; Tilak, C.K.; Suresh, A.; Udupa, G. Human Gesture Recognition for Real-Time Control of Humanoid Robot. Int. J. Adv. Mech. Automob. Engg. 2014, 1, 96–100. [Google Scholar]
- Yun, S.-J.; Lee, M.-C.; Cho, S.-B. P300 BCI based planning behavior selection network for humanoid robot control. In Proceedings of the 2013 Ninth International Conference on Natural Computation (ICNC), Shenyang, China, 23–25 July 2013; IEEE: Piscataway, NJ, USA, 2013; pp. 354–358. [Google Scholar]
- Horki, P.; Solis-Escalante, T.; Neuper, C.; Müller-Putz, G.R. Combined motor imagery and SSVEP based BCI control of a 2 DoF artificial upper limb. Med. Biol. Eng. 2011, 49, 567–577. [Google Scholar] [CrossRef]
- Ramadan, R.A.; Vasilakos, A.V. Brain computer interface: Control signals review. Neurocomputing 2017, 223, 26–44. [Google Scholar] [CrossRef]
- Guger, C.; Daban, S.; Sellers, E.; Holzner, C.; Krausz, G.; Carabalona, R.; Gramatica, F.; Edlinger, G. How many people are able to control a P300-based brain–computer interface (BCI)? Neurosci. Lett. 2009, 462, 94–98. [Google Scholar] [CrossRef]
- Mellinger, J.; Schalk, G.; Braun, C.; Preissl, H.; Rosenstiel, W.; Birbaumer, N.; Kübler, A. An MEG-based brain–computer interface (BCI). NeuroImage 2007, 36, 581–593. [Google Scholar] [CrossRef] [Green Version]
- Müller-Putz, G.; Scherer, R.; Brunner, C.; Leeb, R.; Pfurtscheller, G. Better than random: A closer look on BCI results. Int. J. Bioelectromagn. 2008, 10, 52–55. [Google Scholar]
- Ebenuwa, S.H.; Sharif, M.S.; Alazab, M.; Al-Nemrat, A. Variance Ranking Attributes Selection Techniques for Binary Classification Problem in Imbalance Data. IEEE Access 2019, 7, 24649–24666. [Google Scholar] [CrossRef]
- Lotte, F.; Congedo, M.; Lecuyer, A.; Lamarche, F.; Arnaldi, B. A review of classification algorithms for EEG-based brain–computer interfaces. J. Neural Eng. 2007, 4, R1–R13. [Google Scholar] [CrossRef]
- Müller, K.R.; Krauledat, M.; Dornhege, G.; Curio, G.; Blankertz, B. Machine learning techniques for brain-computer interfaces. Biomed. Tech. 2004, 49, 11–22. [Google Scholar]
- Müller, K.-R.; Tangermann, M.; Dornhege, G.; Krauledat, M.; Curio, G.; Blankertz, B. Machine learning for real-time single-trial EEG-analysis: From brain–computer interfacing to mental state monitoring. J. Neurosci. Methods 2008, 167, 82–90. [Google Scholar] [CrossRef] [PubMed]
- Krusienski, D.J.; Sellers, E.W.; Cabestaing, F.; Bayoudh, S.; McFarland, D.; Vaughan, T.M.; Wolpaw, J.R. A comparison of classification techniques for the P300 Speller. J. Neural Eng. 2006, 3, 299–305. [Google Scholar] [CrossRef] [Green Version]
- Bi, L.; Fan, X.-A.; Liu, Y. EEG-Based Brain-Controlled Mobile Robots: A Survey. IEEE Trans. Hum.-Mach. Syst. 2013, 43, 161–176. [Google Scholar] [CrossRef]
- Subasi, A.; Gursoy, M.I. EEG signal classification using PCA, ICA, LDA and support vector machines. Expert Syst. Appl. 2010, 37, 8659–8666. [Google Scholar] [CrossRef]
- Millan, J.D.R.; Mouriño, J. Asynchronous bci and local neural classifiers: An overview of the adaptive brain interface project. IEEE Trans. Neural Syst. Rehabil. Eng. 2003, 11, 159–161. [Google Scholar] [CrossRef] [Green Version]
- Sturm, I.; Lapuschkin, S.; Samek, W.; Müller, K.-R. Interpretable deep neural networks for single-trial EEG classification. J. Neurosci. Methods 2016, 274, 141–145. [Google Scholar] [CrossRef] [Green Version]
- Kaper, M.; Meinicke, P.; Grossekathoefer, U.; Lingner, T.; Ritter, H. BCI Competition 2003—Data Set IIb: Support Vector Machines for the P300 Speller Paradigm. IEEE Trans. Biomed. Eng. 2004, 51, 1073–1076. [Google Scholar] [CrossRef] [PubMed]
- Kawanabe, M.; Krauledat, M.; Blankertz, B. A Bayesian Approach for Adaptive BCI Classification. In Proceedings of the 3rd International Brain-Computer Interface Workshop and Training Course, Graz, Austria, 21–24 September 2006; pp. 1–2. [Google Scholar]
- Gouaillier, D.; Hugel, V.; Blazevic, P.; Kilner, C.; Monceaux, J.; Lafourcade, P.; Marnier, B.; Serre, J.; Maisonnier, B. The nao humanoid: A combination of performance and affordability. arXiv 2008, arXiv:0807.3223. [Google Scholar]
- Kaneko, K.; Kanehiro, F.; Kajita, S.; Hirukawa, H.; Kawasaki, T.; Hirata, M.; Akachi, K.; Isozumi, T.T. Humanoid robot HRP-2. In Proceedings of the ICRA 2004 IEEE International Conference on Robotics and Automation 2004, New Orleans, LA, USA, 26 April–1 May 2004; Volume 2, pp. 1083–1090. [Google Scholar]
- Ha, I.; Tamura, Y.; Asama, H.; Han, J.; Hong, D.W. Development of open humanoid platform DARwIn-OP. In Proceedings of the SICE Annual Conference 2011, Tokyo, Japan, 13–18 September 2011; IEEE: Piscataway, NJ, USA, 2011; pp. 2178–2181. [Google Scholar]
- Wirth, C.; Toth, J.; Arvaneh, M. “You Have Reached Your Destination”: A Single Trial EEG Classification Study. Front. Mol. Neurosci. 2020, 14, 66. [Google Scholar] [CrossRef] [PubMed]
- Spataro, R.; Chella, A.; Allison, B.; Giardina, M.; Sorbello, R.; Tramonte, S.; Guger, C.; La Bella, V. Reaching and grasping a glass of water by locked-in ALS patients through a BCI-controlled humanoid robot. Front. Hum. Neurosci. 2017, 11, 68. [Google Scholar] [CrossRef] [PubMed] [Green Version]
- Farwell, L.; Donchin, E. Talking off the top of your head: Toward a mental prosthesis utilizing event-related brain potentials. Electroencephalogr. Clin. Neurophysiol. 1988, 70, 510–523. [Google Scholar] [CrossRef]
- Saduanov, B.; Alizadeh, T.; An, J.; Abibullaev, B. Trained by demonstration humanoid robot controlled via a BCI system for telepresence. In Proceedings of the 2018 6th International Conference on Brain-Computer Interface (BCI), GangWon, Korea, 15–17 January 2018; IEEE: Piscataway, NJ, USA, 2018; pp. 1–4. [Google Scholar]
- Chella, A.; Pagello, E.; Menegatti, E.; Sorbello, R.; Anzalone, S.M.; Cinquegrani, F.; Tonin, L.; Piccione, F.; Prifitis, K.; Blanda, C.; et al. A BCI Teleoperated Museum Robotic Guide. In Proceedings of the 2009 International Conference on Complex, Intelligent and Software Intensive Systems, Fukuoka, Japan, 16–19 March 2009; IEEE: Piscataway, NJ, USA, 2009; pp. 783–788. [Google Scholar]
- Sorbello, R.; Tramonte, S.; Giardina, M.E.; La Bella, V.; Spataro, R.; Allison, B.Z.; Guger, C.; Chella, A. A Human–Humanoid Interaction Through the Use of BCI for Locked-In ALS Patients Using Neuro-Biological Feedback Fusion. IEEE Trans. Neural Syst. Rehabil. Eng. 2017, 26, 487–497. [Google Scholar] [CrossRef] [PubMed]
- Alimardani, M.; Nishio, S.; Ishiguro, H. The Importance of Visual Feedback Design in BCIs; From Embodiment to Motor Imagery Learning. PLoS ONE 2016, 11, e0161945. [Google Scholar] [CrossRef] [Green Version]
- Tidoni, E.; Gergondet, P.; Kheddar, A.; Aglioti, S.M. Audio-visual feedback improves the BCI performance in the navigational control of a humanoid robot. Front. Neurorobot. 2014, 8. [Google Scholar] [CrossRef] [Green Version]
- Nam, Y.; Koo, B.; Cichocki, A.; Choi, S. GOM-Face: GKP, EOG, and EMG-Based Multimodal Interface With Application to Humanoid Robot Control. IEEE Trans. Biomed. Eng. 2014, 61, 453–462. [Google Scholar] [CrossRef]
- Zhang, H.; Jolfaei, A.; Alazab, M. A Face Emotion Recognition Method Using Convolutional Neural Network and Image Edge Computing. IEEE Access 2019, 7, 159081–159089. [Google Scholar] [CrossRef]
- Petit, D.; Gergondet, P.; Cherubini, A.; Meilland, M.; Comport, A.I.; Kheddar, A. Navigation assistance for a BCI-controlled humanoid robot. In Proceedings of the 4th Annual IEEE International Conference on Cyber Technology in Automation, Control and Intelligent, Hong Kong, China, 4–7 June 2014; IEEE: Piscataway, NJ, USA, 2014; pp. 246–251. [Google Scholar]
- Durrant-Whyte, H.; Bailey, T. Simultaneous localization and mapping: Part I. IEEE Robot. Autom. Mag. 2006, 13, 99–110. [Google Scholar] [CrossRef] [Green Version]
- Gergondet, P.; Kheddar, A.; Hintermüller, C.; Guger, C.; Slater, M. Multitask Humanoid Control with a Brain-Computer Interface: User Experiment with HRP-2. In Experimental Robotics; Springer: Berlin, Germany, 2012. [Google Scholar]
- Weisz, J.; Elvezio, C.; Allen, P.K. A user interface for assistive grasping. In Proceedings of the 2013 IEEE/RSJ International Conference on Intelligent Robots and Systems, Tokyo, Japan, 3–7 November 2013; IEEE: Piscataway, NJ, USA, 2013; pp. 3216–3221. [Google Scholar]
- Çağlayan, O.; Arslan, R.B. Humanoid robot control with SSVEP on embedded system. In Proceedings of the 5th International Brain-Computer Interface Meeting: Defining the Future, Taylor & Francis Conference, Pacific Grove, CA, USA, 3–7 June 2013; pp. 260–261. [Google Scholar]
- Hochberg, L.R.; Bacher, D.; Jarosiewicz, B.; Masse, N.Y.; Simeral, J.D.; Vogel, J.; Haddadin, S.; Liu, J.; Cash, S.S.; Van Der Smagt, P.; et al. Reach and grasp by people with tetraplegia using a neurally controlled robotic arm. Nature 2012, 485, 372–375. [Google Scholar] [CrossRef] [PubMed] [Green Version]
- Escolano, C.; Antelis, J.M.; Minguez, J. A Telepresence Mobile Robot Controlled With a Noninvasive Brain–Computer Interface. IEEE Trans. Syst. Man Cybern. Part B (Cybern.) 2011, 42, 793–804. [Google Scholar] [CrossRef] [PubMed]
- Zhao, J.; Li, W.; Mao, X.; Hu, H.; Niu, L.; Chen, G. Behavior-Based SSVEP Hierarchical Architecture for Telepresence Control of Humanoid Robot to Achieve Full-Body Movement. IEEE Trans. Cogn. Dev. Syst. 2017, 9, 197–209. [Google Scholar] [CrossRef]
- Beraldo, G.; Antonello, M.; Cimolato, A.; Menegatti, E.; Tonin, L. Brain-Computer Interface Meets ROS: A Robotic Approach to Mentally Drive Telepresence Robots. In Proceedings of the 2018 IEEE International Conference on Robotics and Automation (ICRA), Brisbane, Australia, 21–25 May 2018; IEEE: Piscataway, NJ, USA, 2018; pp. 1–6. [Google Scholar]
- Aznan, N.K.N.; Connolly, J.D.; Al Moubayed, N.; Breckon, T.P. Using Variable Natural Environment Brain-Computer Interface Stimuli for Real-time Humanoid Robot Navigation. In Proceedings of the 2019 International Conference on Robotics and Automation (ICRA), Montreal, QC, Canada, 20–24 May 2019; IEEE: Piscataway, NJ, USA, 2019; pp. 4889–4895. [Google Scholar]
- Zhao, J.; Li, W.; Li, M. Comparative Study of SSVEP- and P300-Based Models for the Telepresence Control of Humanoid Robots. PLoS ONE 2015, 10, e0142168. [Google Scholar] [CrossRef]
- Thobbi, A.; Kadam, R.; Sheng, W. Achieving remote presence using a humanoid robot controlled by a non-invasive BCI device. Int. J. Artif. Intell. Mach. Learn. 2010, 10, 41–45. [Google Scholar]
- Leeb, R.; Tonin, L.; Rohm, M.; Desideri, L.; Carlson, T.; Millan, J.D.R. Towards Independence: A BCI Telepresence Robot for People With Severe Motor Disabilities. Proc. IEEE 2015, 103, 969–982. [Google Scholar] [CrossRef] [Green Version]
- Escolano, C.; Antelis, J.; Mínguez, J. Human brain-teleoperated robot between remote places. In Proceedings of the 2009 IEEE International Conference on Robotics and Automation, Kobe, Japan, 12–17 May 2009; IEEE: Piscataway, NJ, USA, 2009; pp. 4430–4437. [Google Scholar]
- Stawicki, P.; Gembler, F.; Volosyak, I. Driving a Semiautonomous Mobile Robotic Car Controlled by an SSVEP-Based BCI. Comput. Intell. Neurosci. 2016, 2016, 1–14. [Google Scholar] [CrossRef] [Green Version]
- Ma, J.; Zhang, Y.; Cichocki, A.; Matsuno, F. A Novel EOG/EEG Hybrid Human–Machine Interface Adopting Eye Movements and ERPs: Application to Robot Control. IEEE Trans. Biomed. Eng. 2015, 62, 876–889. [Google Scholar] [CrossRef]
- Kim, B.H.; Kim, M.; Jo, S. Quadcopter flight control using a low-cost hybrid interface with EEG-based classification and eye tracking. Comput. Biol. Med. 2014, 51, 82–92. [Google Scholar] [CrossRef]
- Stawicki, P.; Gembler, F.; Rezeika, A.; Volosyak, I. A Novel Hybrid Mental Spelling Application Based on Eye Tracking and SSVEP-Based BCI. Brain Sci. 2017, 7, 35. [Google Scholar] [CrossRef]
- Dong, X.; Wang, H.; Chen, Z.; Shi, B.E. Hybrid Brain Computer Interface via Bayesian integration of EEG and eye gaze. In Proceedings of the 2015 7th International IEEE/EMBS Conference on Neural Engineering (NER), Montpellier, France, 22–24 April 2015; IEEE: Piscataway, NJ, USA, 2015; pp. 150–153. [Google Scholar]
- Nam, Y.; Zhao, Q.; Cichocki, A.; Choi, S. Tongue-Rudder: A Glossokinetic-Potential-Based Tongue–Machine Interface. IEEE Trans. Biomed. Eng. 2011, 59, 290–299. [Google Scholar] [CrossRef] [PubMed]
- Navarro, R.B.; Boquete, L.; Mazo, M.; Lopez, E.; Elena, L. System for assisted mobility using eye movements based on electrooculography. IEEE Trans. Neural Syst. Rehabil. Eng. 2002, 10, 209–218. [Google Scholar] [CrossRef]
- Tsui, C.S.L.; Jia, P.; Gan, J.Q.; Hu, H.; Yuan, K. EMG-based hands-free wheelchair control with EOG attention shift detection. In Proceedings of the 2007 IEEE International Conference on Robotics and Biomimetics (ROBIO), Sanya, China, 15–18 December 2007; IEEE: Piscataway, NJ, USA, 2007; pp. 1266–1271. [Google Scholar]
- Usakli, A.B.; Gürkan, S.; Aloise, F.; Vecchiato, G.; Babiloni, F. A hybrid platform based on EOG and EEG signals to restore communication for patients afflicted with progressive motor neuron diseases. In Proceedings of the 2009 Annual International Conference of the IEEE Engineering in Medicine and Biology Society, Minneapolis, MN, USA, 3–6 September 2009; IEEE: Piscataway, NJ, USA, 2009; pp. 543–546. [Google Scholar]
- Postelnicu, C.-C.; Girbacia, F.; Talaba, D. EOG-based visual navigation interface development. Expert Syst. Appl. 2012, 39, 10857–10866. [Google Scholar] [CrossRef]
- Ramli, R.; Arof, H.; Ibrahim, F.; Mokhtar, N.; Idris, M.Y.I. Using finite state machine and a hybrid of EEG signal and EOG artifacts for an asynchronous wheelchair navigation. Expert Syst. Appl. 2015, 42, 2451–2463. [Google Scholar] [CrossRef]
- Martens, N.; Jenke, R.; Abu-Alqumsan, M.; Kapeller, C.; Hintermüller, C.; Guger, C.; Peer, A.; Buss, M. Towards robotic re-embodiment using a Brain-and-Body-Computer Interface. In Proceedings of the 2012 IEEE/RSJ International Conference on Intelligent Robots and Systems, Vilamoura, Portugal, 7–12 October 2012; IEEE: Piscataway, NJ, USA, 2012; pp. 5131–5132. [Google Scholar]
- Acar, D.; Miman, M.; Akirmak, O.O. Treatment of anxiety disorders patients through eeg and augmented reality. Eur. Soc. Sci. Res. J. 2014, 3, 18–27. [Google Scholar]
- Lenhardt, A.; Ritter, H. An Augmented-Reality Based Brain-Computer Interface for Robot Control. In International Conference on Neural Information Processing; Springer: Berlin/Heidelberg, Germany, 2010; pp. 58–65. [Google Scholar]
- Takano, K.; Hata, N.; Kansaku, K. Towards Intelligent Environments: An Augmented Reality–Brain–Machine Interface Operated with a See-Through Head-Mount Display. Front. Mol. Neurosci. 2011, 5. [Google Scholar] [CrossRef] [PubMed] [Green Version]
- Faller, J.; Allison, B.Z.; Brunner, C.; Scherer, R.; Schmalstieg, D.; Pfurtscheller, G.; Neuper, C. A feasibility study on SSVEP-based interaction with motivating and immersive virtual and augmented reality. arXiv 2017, arXiv:1701.03981. [Google Scholar]
- Faller, J.; Leeb, R.; Pfurtscheller, G.; Scherer, R. Avatar navigation in virtual and augmented reality environments using an ssvep bci icabb-2010. In Proceedings of the Brain-Computer Interfacing and Virtual Reality Workshop W, Venice, Italy, 14–17 October 2010; Volume 1. [Google Scholar]
- Kerous, B.; Liarokapis, F. BrainChat—A Collaborative Augmented Reality Brain Interface for Message Communication. In Proceedings of the 2017 IEEE International Symposium on Mixed and Augmented Reality (ISMAR-Adjunct), Nantes, France, 9–13 October 2017; IEEE: Piscataway, NJ, USA, 2017; pp. 279–283. [Google Scholar]
Signal acquisition methods used in BCI systems:

S.No. | Method | Description | Characteristics
---|---|---|---
(1) | Electroencephalography (EEG) | Measures the electrical signals produced by the human brain | Commonly used method; safe and affordable; poor spatial resolution
(1a) | Evoked signals: SSVEP | Brain signal generated in response to looking at a source flickering at a specific frequency | Short training time; requires continuous attention to the stimuli; exhausting for the user after long sessions
(1a) | Evoked signals: P300 | Signal generated in response to an infrequent stimulus, recorded with a latency of 250–500 ms | Short training time; requires continuous attention to the stimuli; exhausting for the user after long sessions
(1b) | Spontaneous signals | Voluntary signals generated without an external stimulus | No external stimuli required; long training required
(2) | Electromyography (EMG) | Measures the electrical activity produced by skeletal muscles | Easy to record; more noise contamination
(3) | Electrocorticography (ECoG) | Measures the electrical signals via electrodes placed beneath the skull | Better signal quality than EEG; risky (semi-invasive); less common
(4) | Functional magnetic resonance imaging (fMRI) | Measures changes in the metabolism of the brain (e.g., oxygen saturation) | Good spatial resolution; poor temporal resolution (1–2 s); sensitive to motion
(5) | Near-infrared spectroscopy (NIRS) | Measures changes in blood oxygenation using near-infrared light | Good spatial resolution; poor temporal resolution (2–5 s)
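To make the SSVEP row concrete: the canonical detection strategy compares spectral power at each candidate flicker frequency and selects the strongest. A minimal single-channel sketch; the sampling rate and candidate frequencies are illustrative.

```python
import numpy as np

FS = 256   # sampling rate in Hz (illustrative)

def ssvep_target(signal, candidate_hz=(8.0, 10.0, 12.0, 15.0)):
    """Pick the flicker frequency with the highest spectral power."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / FS)
    power = np.abs(np.fft.rfft(signal)) ** 2
    scores = [power[np.argmin(np.abs(freqs - f))] for f in candidate_hz]
    return candidate_hz[int(np.argmax(scores))]

# toy usage: a noisy 12 Hz oscillation should be identified as the target
t = np.arange(2 * FS) / FS
x = np.sin(2 * np.pi * 12.0 * t) + 0.5 * np.random.randn(t.size)
print(ssvep_target(x))   # -> 12.0 (typically)
```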
Common classification algorithms for BCI:

Classifier | Mechanism | Properties | Choice Consideration
---|---|---|---
Linear Discriminant Analysis (LDA) | The decision boundary is placed by maximising the distance between the two class means while minimising the variance within each class | Simple; computationally light; linear decision boundary | Suited for online sessions; works with smaller training sets
Artificial Neural Networks (ANN) | Minimise the classification error on training data by adjusting the weights of neural connections | Many parameters to set; computationally heavy; non-linear decision boundary; prone to overfitting | Suitable for a variety of applications; sensitive to noisy data
Support Vector Machines (SVM) | The decision boundary maximises the margin between the two classes | Linear or non-linear decision boundary; less prone to overfitting; high computation for non-linear cases | Appropriate for high-dimensional data; less sensitive to noisy data
Statistical classifiers | Estimate the probability of each class and select the class with the highest probability | Non-linear decision boundary; efficient for uncertain samples | Suited as adaptive algorithms; account for variation in brain dynamics (e.g., fatigue)
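As a concrete illustration of the LDA row, the classifier most of the reviewed P300 systems rely on, here is a minimal two-class training/decoding sketch on synthetic feature vectors; the data, dimensions, and grid size are placeholders.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(0)

# synthetic "target" vs "non-target" P300 feature vectors (placeholders)
X_target = rng.normal(loc=1.0, size=(100, 64))
X_nontarget = rng.normal(loc=0.0, size=(100, 64))
X = np.vstack([X_target, X_nontarget])
y = np.array([1] * 100 + [0] * 100)

lda = LinearDiscriminantAnalysis()   # linear boundary, cheap to train and run
lda.fit(X, y)

# online use: score the epoch for each command and pick the most target-like
epochs = rng.normal(size=(9, 64))        # one epoch per grid command
scores = lda.decision_function(epochs)
chosen_command = int(np.argmax(scores))  # index into the 3 x 3 command grid
```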
Experimental sessions for the water-fetching application (Section 3.1):

Session | Trials | Threshold & Feedback | Purpose | Accuracy (%) (Mean ± Standard Deviation)
---|---|---|---|---
Calibration | 9 | 100%, no feedback | Tune signal processing parameters | -
Online | 20 | 55%, with feedback | Train the classifier | Healthy: 74.5 ± 5.3; Amyotrophic Lateral Sclerosis (ALS) patients: 69.75 ± 15.8
Robotic | 10 | N.A., with feedback | Robot executes the selected command | Healthy: 72.4 ± 9.4; ALS patients: 71.25 ± 17.3
Experimental sessions for the telepresence application (Section 3.2):

Session | Trials | Feedback | Purpose | Accuracy (%)
---|---|---|---|---
Calibration | 5 | With feedback | Tune signal processing parameters & train the classifier | -
Real-Time | - | With feedback | Control the humanoid robot | 78
Experimental sessions for the biofeedback-fusion picking application (Section 4.1):

Session | Trials | Threshold & Feedback | Purpose | Success | Bio-Feedback Factor
---|---|---|---|---|---
Calibration | Until 100% correctness (avg.: 3) | 100%, no feedback | Calibrate the BCI system on the neural response | - | -
Online | 10 | -, with feedback | Select the command with visual feedback | Healthy: 100%; ALS: 97.22% | Healthy: 78.15%; ALS: 79.61%
Robotic | 5 | -, with feedback | Select the command with robotic feedback | Healthy: 100%; ALS: 96.97% | Healthy: 75.83%; ALS: 84.25%
Experimental sessions for the facial-signal control application (Section 4.2):

Session | Trials | Purpose | Accuracy
---|---|---|---
Training | 7 (eye & tongue) | Train the detection model | -
Online | 1 | Evaluate the performance of the system | 86.7 ± 8.28%
Summary of the reviewed applications:

Name | Related Works | Used Signal | Classifier | Humanoid Used | Description
---|---|---|---|---|---
Fetching Water (Rossella et al., 2017) [66] | [77,78,79,80] | P300 | Stepwise LDA | NAO humanoid | The humanoid fetches a glass of water for a patient using a P300-based BCI
Telepresence (Batyrkhan et al., 2018) [68] | [81,82,83,84,85,86,87] | P300 | Logistic regression | NAO humanoid | A user interacts with the world remotely through a BCI-controlled humanoid
Museum Guide (Antonio et al., 2009) [69] | [88,89] | P300 | N.A. | PeopleBot & Pioneer3 | A user controls a robot to visit a museum remotely
Picking Objects (Bio-Feedback) (Rosario et al., 2018) [70] | [90,91,92,93] | P300 + eyeball tracking | Stepwise LDA | NAO humanoid | Picking and placing objects; control signals are generated from biological feedback fused with brain signals
Control by Facial Signals (Yunjun et al., 2014) [73] | [94,95,96,97,98,99] | EOG, EMG, GKP | SVM | NAO humanoid | The humanoid is controlled by facial signals, which do not depend on the spine for signal delivery
Navigational Assistance (Damien et al., 2014) [75] | [100,101,102,103,104,105,106] | SSVEP | N.A. | HRP-2 humanoid | A navigation scheme that provides greater precision while performing actions with the humanoid