Human–Machine Interface: Multiclass Classification by Machine Learning on 1D EOG Signals for the Control of an Omnidirectional Robot
Abstract
1. Introduction
2. Materials and Methods
2.1. Classifiers
2.1.1. Multilayer Perceptron (MLP)
2.1.2. Tree-Type Classifiers
2.1.3. Naïve Bayes (NB)
2.1.4. K-Nearest Neighbors (K-NN)
2.1.5. Logistic Classifier (Logistic)
2.1.6. Support Vector Machines (SVM)
2.1.7. Performance Measures
2.1.8. Ranking Metric Results
- A newly created dataset for each individual;
- The resulting classifier model implemented in an embedded system with less memory than a personal computer;
- To determine the most appropriate classifier, the computational cost and processing time of each classifier were considered. Because both increase with classifier accuracy, the multilayer perceptron represents a balance between computational resources and accuracy (see the comparison sketch after this list).
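As a rough illustration of how such a per-user comparison could be scripted (a minimal scikit-learn sketch; the dataset file name, feature layout, and hyperparameters are illustrative assumptions and not necessarily the authors' exact tooling):

```python
# Minimal sketch: ranking candidate classifiers on a per-user EOG feature set.
# The file name, feature layout, and hyperparameters below are illustrative assumptions.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.ensemble import RandomForestClassifier
from sklearn.tree import DecisionTreeClassifier
from sklearn.neighbors import KNeighborsClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.neural_network import MLPClassifier
from sklearn.svm import SVC
from sklearn.naive_bayes import GaussianNB

data = np.loadtxt("user01_eog_features.csv", delimiter=",")  # hypothetical per-user dataset
X, y = data[:, :-1], data[:, -1].astype(int)                  # features, class label

candidates = {
    "Random Forest": RandomForestClassifier(n_estimators=100),
    "Decision Tree": DecisionTreeClassifier(),
    "KNN-1": KNeighborsClassifier(n_neighbors=1),
    "Logistic": LogisticRegression(max_iter=1000),
    "MLP": MLPClassifier(hidden_layer_sizes=(20,), max_iter=2000),
    "SVM": SVC(),
    "Naive Bayes": GaussianNB(),
}

# Balanced accuracy under 10-fold cross-validation for each candidate.
for name, clf in candidates.items():
    scores = cross_val_score(clf, X, y, cv=10, scoring="balanced_accuracy")
    print(f"{name:15s} balanced accuracy = {scores.mean():.3f} +/- {scores.std():.3f}")
```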
2.2. EOG Signal
Algorithm 1: MLP algorithm implemented for the EOG.
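The body of the algorithm box is not reproduced in this extract. As a rough Python sketch of a single-hidden-layer MLP trained on labeled EOG feature vectors (layer size, learning rate, loss, and the 0-4 label coding are assumptions, not the paper's exact settings):

```python
# Rough sketch of an MLP for 5-class EOG classification (cf. Algorithm 1).
# Layer size, learning rate, and epoch count are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def one_hot(labels, n_classes=5):
    # labels are assumed to be integers in 0..n_classes-1
    out = np.zeros((labels.size, n_classes))
    out[np.arange(labels.size), labels] = 1.0
    return out

def train_mlp(X, y, hidden=20, lr=0.1, epochs=500):
    """Single-hidden-layer MLP trained with plain batch gradient descent."""
    n_in, n_out = X.shape[1], 5
    W1 = rng.normal(0, 0.1, (n_in, hidden)); b1 = np.zeros(hidden)
    W2 = rng.normal(0, 0.1, (hidden, n_out)); b2 = np.zeros(n_out)
    T = one_hot(y, n_out)
    for _ in range(epochs):
        # Forward pass
        H = sigmoid(X @ W1 + b1)
        Y = sigmoid(H @ W2 + b2)
        # Backward pass (squared-error loss, sigmoid derivatives)
        dY = (Y - T) * Y * (1 - Y)
        dH = (dY @ W2.T) * H * (1 - H)
        W2 -= lr * H.T @ dY / len(X); b2 -= lr * dY.mean(axis=0)
        W1 -= lr * X.T @ dH / len(X); b1 -= lr * dH.mean(axis=0)
    return W1, b1, W2, b2

def predict(X, W1, b1, W2, b2):
    """Return the predicted class index (0..4) for each feature vector."""
    H = sigmoid(X @ W1 + b1)
    return np.argmax(sigmoid(H @ W2 + b2), axis=1)
```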
2.3. Design of the HMI EOG
2.3.1. Analog Signal Processing
- Use operational amplifiers with a high common-mode rejection ratio (CMRR);
- Use a reference electrode placed on the forehead to reduce induced noise and the DC component;
- Fix the electrodes firmly to the skin; the best locations are around the periphery of the eye, over areas with a greater proportion of underlying bone.
2.3.2. Digital Signal Processing
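As a hedged illustration of the kind of digital conditioning typically applied to a sampled EOG channel (the sampling rate and cutoff below are assumptions, not necessarily the values used in the paper):

```python
# Plausible digital conditioning for a sampled EOG channel: a low-order low-pass filter
# to suppress mains/EMG noise, plus a simple baseline (DC drift) removal step.
# The sampling rate and cutoff are assumptions, not the paper's exact values.
import numpy as np
from scipy.signal import butter, filtfilt

FS = 250.0       # assumed sampling rate, Hz
CUTOFF = 30.0    # EOG energy is concentrated well below ~30 Hz

def condition_eog(raw):
    """Low-pass filter an EOG segment and remove its baseline offset."""
    b, a = butter(4, CUTOFF / (FS / 2), btype="low")
    filtered = filtfilt(b, a, raw)          # zero-phase filtering
    return filtered - np.median(filtered)   # crude drift/offset removal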
2.4. Classification of the EOG Signal by Multilayer Perceptron
2.4.1. Omnidirectional Robot
Algorithm 2: Neural network pseudocode.
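Only the caption of this algorithm box survives in the extract. A minimal sketch of the kind of online inference step it presumably describes, with the trained weights stored on the embedded board and only the forward pass running in real time (the confidence-threshold rejection and array shapes are assumptions):

```python
# Sketch of the real-time inference step: only the forward pass runs on the embedded board.
# The confidence threshold and shapes are illustrative assumptions.
import numpy as np

def classify_window(features, W1, b1, W2, b2, threshold=0.6):
    """Return an EOG class (1..5) for one feature window, or None if no output is confident."""
    h = 1.0 / (1.0 + np.exp(-(features @ W1 + b1)))   # hidden layer, sigmoid
    y = 1.0 / (1.0 + np.exp(-(h @ W2 + b2)))          # one-hot style output layer
    k = int(np.argmax(y))
    return k + 1 if y[k] >= threshold else None        # reject low-confidence windows
```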
2.4.2. State Machine
3. Results and Discussion
3.1. EOG Acquisition System Evaluation
Virtual Test
3.2. Performance Test
4. Conclusions
Author Contributions
Funding
Acknowledgments
Conflicts of Interest
References
Classifier | Sensitivity | Specificity | Balanced Accuracy | Precision | ROC Area |
---|---|---|---|---|---|
Random Forest | 0.986 | 0.982 | 0.984 | 0.986 | 0.999 |
Random Tree | 0.986 | 0.982 | 0.984 | 0.986 | 0.996 |
J48 | 0.977 | 0.973 | 0.975 | 0.977 | 0.996 |
KNN-1 | 0.979 | 0.975 | 0.977 | 0.979 | 0.997 |
KNN-2 | 0.966 | 0.960 | 0.963 | 0.966 | 0.997 |
KNN-3 | 0.958 | 0.952 | 0.955 | 0.958 | 0.996 |
Logistic | 0.683 | 0.805 | 0.744 | 0.683 | 0.889 |
Multilayer Perceptron | 0.755 | 0.836 | 0.795 | 0.755 | 0.889 |
Support Vector Machine | 0.669 | 0.853 | 0.761 | 0.669 | 0.882 |
Naive Bayes | 0.714 | 0.849 | 0.782 | 0.714 | 0.905 |
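For reference, the tabulated measures can be recovered from a multiclass confusion matrix as macro-averages. A sketch (the exact per-class weighting used by the authors' tooling may differ):

```python
import numpy as np

def summarize(cm):
    """Macro-averaged sensitivity, specificity, balanced accuracy, and precision
    from a multiclass confusion matrix (rows = true class, columns = predicted)."""
    cm = np.asarray(cm, dtype=float)
    tp = np.diag(cm)
    fn = cm.sum(axis=1) - tp
    fp = cm.sum(axis=0) - tp
    tn = cm.sum() - tp - fn - fp
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    precision = tp / (tp + fp)
    return {
        "sensitivity": sensitivity.mean(),
        "specificity": specificity.mean(),
        "balanced_accuracy": ((sensitivity + specificity) / 2).mean(),
        "precision": precision.mean(),
    }
```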
One-hot encoding of each EOG class onto the state machine:

Ocular Movement (Class) | S3 (Left) | S2 (Down) | S4 (Right) | S1 (Up) | S0 (Stop) |
---|---|---|---|---|---|
3 (Left) | 1 | 0 | 0 | 0 | 0 |
5 (Down) | 0 | 1 | 0 | 0 | 0 |
2 (Right) | 0 | 0 | 1 | 0 | 0 |
4 (Up) | 0 | 0 | 0 | 1 | 0 |
1 (Stop) | 0 | 0 | 0 | 0 | 1 |
State | EOG Class | Desired Velocity (vx, vy, ω) | Desired Movement |
---|---|---|---|
S3 | 3 | (−0.15 m/s, 0 m/s, 0 rad/s) | Left |
S2 | 5 | (0 m/s, −0.15 m/s, 0 rad/s) | Down |
S4 | 2 | (0.15 m/s, 0 m/s, 0 rad/s) | Right |
S1 | 4 | (0 m/s, 0.15 m/s, 0 rad/s) | Up |
S0 | 1 | (0 m/s, 0 m/s, 0 rad/s) | Stop |
Combined and sequential linear movements | |||
S5 | 3, 4 | (−0.15 m/s, 0.15 m/s, 0 rad/s) | Upper-Left Diagonal |
S6 | 2, 4 | (0.15 m/s, 0.15 m/s, 0 rad/s) | Upper-Right Diagonal |
S7 | 3, 5 | (−0.15 m/s, −0.15 m/s, 0 rad/s) | Lower-Left Diagonal |
S8 | 2, 5 | (0.15 m/s, −0.15 m/s, 0 rad/s) | Lower-Right Diagonal |
Combined and sequential rotational movements | |||
| 4, 2, 5, 3 | (0 m/s, 0 m/s, 0.8 rad/s) | Counterclockwise rotation
| 3, 5, 2, 4 | (0 m/s, 0 m/s, −0.8 rad/s) | Clockwise rotation
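Read as code, the table above is a lookup from recognized EOG classes (or short class sequences) to commanded robot velocities. A minimal sketch; the sequence-detection logic and the unlabeled rotation states are assumptions where the table leaves them implicit:

```python
# State machine from the table above: a single EOG class selects a basic state, a pair of
# classes in sequence selects a diagonal, and a four-class sequence selects a rotation.
# Velocities are (vx, vy, w) in m/s, m/s, rad/s.
BASIC = {
    3: ("S3", (-0.15, 0.0, 0.0)),   # Left
    5: ("S2", (0.0, -0.15, 0.0)),   # Down
    2: ("S4", (0.15, 0.0, 0.0)),    # Right
    4: ("S1", (0.0, 0.15, 0.0)),    # Up
    1: ("S0", (0.0, 0.0, 0.0)),     # Stop
}
DIAGONAL = {
    frozenset({3, 4}): ("S5", (-0.15, 0.15, 0.0)),   # Upper-Left Diagonal
    frozenset({2, 4}): ("S6", (0.15, 0.15, 0.0)),    # Upper-Right Diagonal
    frozenset({3, 5}): ("S7", (-0.15, -0.15, 0.0)),  # Lower-Left Diagonal
    frozenset({2, 5}): ("S8", (0.15, -0.15, 0.0)),   # Lower-Right Diagonal
}
ROTATION = {
    (4, 2, 5, 3): (0.0, 0.0, 0.8),    # counterclockwise rotation
    (3, 5, 2, 4): (0.0, 0.0, -0.8),   # clockwise rotation
}

def command(history):
    """Map the most recent EOG classes (a list of ints 1..5) to a velocity command."""
    if tuple(history[-4:]) in ROTATION:
        return ROTATION[tuple(history[-4:])]
    pair = frozenset(history[-2:])
    if len(history) >= 2 and pair in DIAGONAL:
        return DIAGONAL[pair][1]
    return BASIC[history[-1]][1]
```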
Test Number | Test Board 1 | Test Board 2 | Test Board 3 |
---|---|---|---|
1 | 343.50 | 430.30 | 532.90 |
5 | 245.12 | 365.10 | 499.12 |
10 | 189.80 | 306.00 | 400.20 |
15 | 150.32 | 246.60 | 340.90 |
20 | 108.87 | 188.50 | 278.40 |
25 | 88.65 | 147.1 | 200.23 |
30 | 84.32 | 99.32 | 154.23 |
Test Number | Test Board 1 | Test Board 2 | Test Board 3 |
---|---|---|---|
1 | 15 | 10 | 18 |
5 | 11 | 7 | 14 |
10 | 9 | 5 | 11 |
15 | 7 | 2 | 8 |
20 | 5 | 1 | 6 |
25 | 3 | 1 | 3 |
30 | 0 | 0 | 0 |
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
© 2021 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).