Driving Attention State Detection Based on GRU-EEGNet
Abstract
1. Introduction
2. Materials and Methods
2.1. Experiments
2.1.1. Subjects
2.1.2. Experimental Device
2.1.3. Experimental Paradigm
2.2. Data Acquisition and Preprocessing
2.3. Driving Attention State EEG Power Spectrum Feature Extraction
2.3.1. Power Spectrum Calculation Method
2.3.2. EEG Power Spectrum Analysis of Driving Attention States Based on Welch’s Method
2.4. GRU-EEGNet Basic Architecture
2.4.1. EEGNet Network Architecture
2.4.2. Gated Recurrent Unit
2.4.3. GRU-EEGNet
3. Results
3.1. Driving Attention State Detection Based on EEG Power Spectrum Features
3.1.1. Performance Comparison of Common Classification Methods
3.1.2. SVM-Based Driving Attention State Detection
3.2. EEGNet and GRU-EEGNet Classification Results
3.3. Online Detection System of Driving Attention State Based on EEG
3.3.1. System Structure and Function
3.3.2. System Module Realization
- (1) Human–computer interface module. The interface was designed with Qt Designer: the left side handles model selection, and the right side handles experimental control and the test-result display, as illustrated in Figure 13. Model selection is realized with a QFileDialog control, so the model path can be chosen through a file-selection dialog box. Experimental control is implemented with push-button controls; each button modifies the trigger signal and calls its associated slot function to execute the corresponding action. The test result is displayed in a QLineEdit text box, which converts the predicted value passed in from the data processing module into the corresponding driver state. A minimal interface sketch is given after this list.
- (2) EEG data acquisition module. This module transmits EEG signals to the computer in real time through the Remote Data Access (RDA) interface provided by the Pycorder software (https://brainlatam.com/manufacturers/brain-products/pycorder-127, accessed on 8 August 2022). The signal travels over TCP: once the acquisition module has collected data of the specified length (here, 1 s), it raises a transmit signal and forwards the buffer to the data processing module. A schematic client is sketched after this list.
- (3) Data processing module. The EEG data captured by the ActiCHamp amplifier in real time arrive as a plain list without channel information. They are converted to a RawArray with Python's MNE toolkit and then preprocessed in sequence: channel selection, re-referencing, filtering, and downsampling. The online preprocessing is identical to the offline pipeline, using the same channel selections (59 channels or 10 channels), reference electrodes (TP9, TP10), filter band (0.1–45 Hz), and downsampling rate (100 Hz). The preprocessed data are fed directly into the trained GRU-EEGNet model for feature extraction and classification; see the preprocessing sketch after this list.
- (4) Display module. This module converts the predicted values produced by the data processing module into the corresponding driver states and shows them in the text box; its logic is folded into the interface sketch below.
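To make module (1) and the display logic of module (4) concrete, the following is a minimal PyQt5 sketch. The widget names, layout, and the class-index-to-state mapping are illustrative assumptions, not the authors' exact interface:

```python
import sys
from PyQt5.QtWidgets import (QApplication, QFileDialog, QLineEdit,
                             QPushButton, QVBoxLayout, QWidget)

# Hypothetical mapping from predicted class index to driver state.
STATE_NAMES = {0: "visual distraction", 1: "auditory distraction",
               2: "cognitive distraction", 3: "focused driving"}

class DetectionWindow(QWidget):
    def __init__(self):
        super().__init__()
        self.model_path = None
        self.running = False
        select_btn = QPushButton("Select model")
        select_btn.clicked.connect(self.select_model)   # slot: model selection
        start_btn = QPushButton("Start/stop test")
        start_btn.clicked.connect(self.toggle_test)     # slot: trigger signal
        self.result_box = QLineEdit()
        self.result_box.setReadOnly(True)               # test-result display
        layout = QVBoxLayout(self)
        for widget in (select_btn, start_btn, self.result_box):
            layout.addWidget(widget)

    def select_model(self):
        # QFileDialog returns (path, selected_filter); keep the path if chosen.
        path, _ = QFileDialog.getOpenFileName(self, "Choose trained model")
        if path:
            self.model_path = path

    def toggle_test(self):
        # Flip the trigger flag polled by the acquisition/processing thread.
        self.running = not self.running

    def show_prediction(self, pred):
        # Module (4): translate the predicted value into a readable state.
        self.result_box.setText(STATE_NAMES.get(int(pred), "unknown"))

if __name__ == "__main__":
    app = QApplication(sys.argv)
    window = DetectionWindow()
    window.show()
    sys.exit(app.exec_())
```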
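The acquisition loop of module (2) can be sketched as a plain TCP client. The port and the RDA message layout below follow the Brain Products RDA documentation as we understand it and should be treated as assumptions; `n_channels` and `sfreq` are supplied by the caller:

```python
import socket
import struct

# Default RDA port for 32-bit float data (assumption); host is illustrative.
RDA_HOST, RDA_PORT = "localhost", 51244

def recv_exact(sock, n):
    """Read exactly n bytes from the TCP stream."""
    chunks = []
    while n > 0:
        part = sock.recv(n)
        if not part:
            raise ConnectionError("RDA stream closed")
        chunks.append(part)
        n -= len(part)
    return b"".join(chunks)

def stream_seconds(n_channels, sfreq):
    """Yield one buffer per second of EEG, mirroring module (2)."""
    with socket.create_connection((RDA_HOST, RDA_PORT)) as sock:
        samples, needed = [], int(sfreq) * n_channels
        while True:
            # Assumed message layout: 16-byte GUID, uint32 total size, uint32 type.
            header = recv_exact(sock, 24)
            msg_size, msg_type = struct.unpack("<LL", header[16:])
            body = recv_exact(sock, msg_size - 24)
            if msg_type != 4:          # 4 = 32-bit float data block (assumption)
                continue
            # Data body: block number, number of points, number of markers,
            # then n_points * n_channels float32 samples, channel-interleaved.
            _, n_points, _ = struct.unpack_from("<LLL", body)
            samples.extend(struct.unpack_from(f"<{n_points * n_channels}f", body, 12))
            while len(samples) >= needed:
                yield samples[:needed]  # hand off 1 s to the processing module
                samples = samples[needed:]
```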
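Module (3) maps directly onto MNE's API. The sketch below uses placeholder channel names and an assumed acquisition rate, and assumes a Keras-style `predict` on the trained model; an IIR filter is chosen so the 0.1–45 Hz band-pass can run on a 1 s buffer:

```python
import numpy as np
import mne

# Placeholder montage: the real system uses the 59- or 10-channel selections
# from the offline pipeline; names and the raw sampling rate are assumptions.
CH_NAMES = ["Fz", "Cz", "Pz", "Oz", "TP9", "TP10"]
SFREQ_IN = 1000.0   # acquisition rate in Hz (placeholder)

def classify_second(chunk, model):
    """Preprocess one 1 s buffer as described in module (3) and classify it."""
    # Assume point-major interleaving in the incoming list.
    data = np.asarray(chunk, dtype=np.float64).reshape(-1, len(CH_NAMES)).T
    info = mne.create_info(CH_NAMES, SFREQ_IN, ch_types="eeg")
    raw = mne.io.RawArray(data, info, verbose=False)
    raw.pick(CH_NAMES)                                      # channel selection
    raw.set_eeg_reference(["TP9", "TP10"], verbose=False)   # re-reference
    raw.filter(0.1, 45.0, method="iir", verbose=False)      # band-pass 0.1–45 Hz
    raw.resample(100.0, verbose=False)                      # downsample to 100 Hz
    x = raw.get_data()[np.newaxis, ...]                     # (1, channels, samples)
    return model.predict(x)                                 # trained GRU-EEGNet
```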
3.3.3. Online Experiment and Effect
4. Discussion
5. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Acknowledgments
Conflicts of Interest
References
| Block | Layer | # Filters | Size | Output | Activation |
|---|---|---|---|---|---|
| 1 | Input | | | (C, T) | |
| | Reshape | | | (1, C, T) | |
| | Conv2D | F1 | (1, 64) | (F1, C, T) | Linear |
| | BatchNorm | | | (F1, C, T) | |
| | DepthwiseConv2D | D * F1 | (C, 1) | (D * F1, 1, T) | Linear |
| | BatchNorm | | | (D * F1, 1, T) | |
| | Activation | | | (D * F1, 1, T) | ELU |
| | AveragePool2D | | (1, 4) | (D * F1, 1, T//4) | |
| | Dropout * | | | (D * F1, 1, T//4) | |
| 2 | SeparableConv2D | F2 | (1, 16) | (F2, 1, T//4) | Linear |
| | BatchNorm | | | (F2, 1, T//4) | |
| | Activation | | | (F2, 1, T//4) | ELU |
| | AveragePool2D | | (1, 8) | (F2, 1, T//32) | |
| | Dropout * | | | (F2, 1, T//32) | |
| | Flatten | | | (F2 * (T//32)) | |
| Classifier | Dense | N * (F2 * T//32) | | N | Softmax |

C: number of EEG channels; T: number of time samples; N: number of classes.
| Model Structure | Convolution Kernel Size | Number of Convolution Kernels | Step Size |
|---|---|---|---|
| Conv2D | (1, 50) | 16 | 1 |
| DepthwiseConv2D | (59, 1) | 32 | 1 |
| AveragePool2D | (1, 4) | - | 4 |
| SeparableConv2D | (1, 16) | 32 | 1 |
| AveragePool2D | (1, 8) | - | 8 |
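The two tables pin down the EEGNet backbone used here: a (1, 50) temporal convolution with 16 filters, a (59, 1) depthwise spatial convolution yielding 32 maps, a (1, 16) separable convolution with 32 filters, and (1, 4) and (1, 8) average pooling. A minimal PyTorch sketch of this backbone follows; the framework choice, "same" padding, dropout rate, and the 100-sample window (1 s at 100 Hz) are our assumptions, and the GRU branch of GRU-EEGNet is omitted:

```python
import torch
from torch import nn

class EEGNet(nn.Module):
    """Sketch of the EEGNet backbone from the tables above (GRU branch omitted)."""
    def __init__(self, n_channels=59, n_samples=100, n_classes=4,
                 f1=16, d=2, f2=32, dropout=0.5):
        super().__init__()
        self.block1 = nn.Sequential(
            nn.Conv2d(1, f1, (1, 50), padding="same", bias=False),  # temporal conv (1, 50)
            nn.BatchNorm2d(f1),
            nn.Conv2d(f1, d * f1, (n_channels, 1), groups=f1,
                      bias=False),                                  # depthwise spatial conv (59, 1)
            nn.BatchNorm2d(d * f1),
            nn.ELU(),
            nn.AvgPool2d((1, 4)),                                   # pool (1, 4)
            nn.Dropout(dropout),
        )
        self.block2 = nn.Sequential(
            nn.Conv2d(d * f1, d * f1, (1, 16), padding="same",
                      groups=d * f1, bias=False),                   # separable conv: depthwise part
            nn.Conv2d(d * f1, f2, 1, bias=False),                   # separable conv: pointwise part
            nn.BatchNorm2d(f2),
            nn.ELU(),
            nn.AvgPool2d((1, 8)),                                   # pool (1, 8)
            nn.Dropout(dropout),
        )
        self.classifier = nn.Linear(f2 * (n_samples // 32), n_classes)

    def forward(self, x):                 # x: (batch, channels, samples)
        x = x.unsqueeze(1)                # reshape to (batch, 1, C, T)
        x = self.block2(self.block1(x))
        return self.classifier(x.flatten(1))  # logits; softmax applied in the loss

# Sanity check: a batch of four 1 s windows at 100 Hz over 59 channels.
if __name__ == "__main__":
    out = EEGNet()(torch.randn(4, 59, 100))
    print(out.shape)                      # torch.Size([4, 4])
```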
| Metric | Class | Sub1 | Sub2 | Sub3 | Sub4 | Sub5 | Sub6 | Sub7 | Sub8 | Sub9 | Sub10 | Avg. |
|---|---|---|---|---|---|---|---|---|---|---|---|---|
| Acc (%) | | 69.42 | 76.57 | 86.93 | 78.36 | 70.70 | 78.65 | 71.26 | 80.32 | 79.53 | 85.19 | 77.69 |
| K value | | 0.664 | 0.728 | 0.842 | 0.741 | 0.653 | 0.746 | 0.672 | 0.784 | 0.768 | 0.827 | 0.743 |
| F1 score | | 0.695 | 0.766 | 0.870 | 0.784 | 0.708 | 0.787 | 0.713 | 0.803 | 0.796 | 0.853 | 0.778 |
| Precision | V | 0.732 | 0.835 | 0.913 | 0.836 | 0.744 | 0.833 | 0.766 | 0.736 | 0.861 | 0.914 | 0.817 |
| | A | 0.638 | 0.714 | 0.823 | 0.766 | 0.668 | 0.802 | 0.674 | 0.813 | 0.739 | 0.798 | 0.743 |
| | C | 0.714 | 0.726 | 0.854 | 0.744 | 0.734 | 0.749 | 0.744 | 0.871 | 0.776 | 0.828 | 0.774 |
| | F | 0.695 | 0.788 | 0.888 | 0.790 | 0.685 | 0.764 | 0.669 | 0.794 | 0.814 | 0.868 | 0.775 |
| | Avg. | 0.695 | 0.766 | 0.869 | 0.784 | 0.708 | 0.787 | 0.713 | 0.803 | 0.797 | 0.852 | 0.777 |
| Recall | V | 0.746 | 0.774 | 0.846 | 0.766 | 0.727 | 0.774 | 0.768 | 0.822 | 0.773 | 0.887 | 0.788 |
| | A | 0.689 | 0.809 | 0.878 | 0.828 | 0.709 | 0.794 | 0.715 | 0.785 | 0.852 | 0.844 | 0.790 |
| | C | 0.641 | 0.756 | 0.853 | 0.784 | 0.676 | 0.825 | 0.662 | 0.754 | 0.798 | 0.896 | 0.765 |
| | F | 0.709 | 0.728 | 0.901 | 0.760 | 0.719 | 0.759 | 0.710 | 0.853 | 0.758 | 0.786 | 0.768 |
| | Avg. | 0.696 | 0.767 | 0.870 | 0.785 | 0.707 | 0.788 | 0.714 | 0.803 | 0.796 | 0.853 | 0.778 |

V = visual distraction; A = auditory distraction; C = cognitive distraction; F = focused driving.
| Model | Subject | Vis. T | Vis. C | Aud. T | Aud. C | Cog. T | Cog. C | Foc. T | Foc. C | Predictive Accuracy (%) |
|---|---|---|---|---|---|---|---|---|---|---|
| GRU-EEGNet | Sub1 | 65 | 52 | 150 | 72 | 100 | 69 | 608 | 486 | 73.56 |
| | Sub2 | 55 | 44 | 180 | 86 | 130 | 76 | 576 | 435 | 68.12 |
| | Sub3 | 60 | 47 | 180 | 91 | 110 | 81 | 648 | 486 | 70.64 |
| | Sub4 | 65 | 54 | 165 | 71 | 120 | 83 | 582 | 479 | 73.71 |
| | Sub5 | 60 | 51 | 150 | 69 | 130 | 79 | 617 | 495 | 72.52 |
| EEGNet | Sub1 | 60 | 49 | 165 | 76 | 130 | 79 | 584 | 425 | 66.99 |
| | Sub2 | 65 | 51 | 150 | 76 | 120 | 75 | 612 | 413 | 64.94 |
| | Sub3 | 50 | 41 | 195 | 94 | 110 | 73 | 638 | 456 | 66.87 |
| | Sub4 | 55 | 46 | 180 | 89 | 120 | 82 | 596 | 421 | 67.09 |
| | Sub5 | 60 | 47 | 180 | 85 | 110 | 71 | 629 | 437 | 65.37 |

Vis. = visual distraction; Aud. = auditory distraction; Cog. = cognitive distraction; Foc. = focused driving. T = total number of test segments for each state; C = number classified correctly.
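Reading the table, the predictive accuracy pools the four states as ΣC/ΣT: for GRU-EEGNet on Sub1, for instance, (52 + 72 + 69 + 486)/(65 + 150 + 100 + 608) = 679/923 ≈ 73.56%, which matches the last column; the other rows follow the same computation.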