Article

TMU-Net: A Transformer-Based Multimodal Framework with Uncertainty Quantification for Driver Fatigue Detection

School of Electronic Information Engineering, Xi’an Technological University, Xi’an 710021, China
* Author to whom correspondence should be addressed.
Sensors 2025, 25(17), 5364; https://doi.org/10.3390/s25175364
Submission received: 31 July 2025 / Revised: 22 August 2025 / Accepted: 28 August 2025 / Published: 29 August 2025
(This article belongs to the Special Issue AI and Smart Sensors for Intelligent Transportation Systems)

Abstract

Driver fatigue is a prevalent issue that frequently contributes to traffic accidents, prompting the development of automated fatigue detection methods based on various data sources, particularly reliable physiological signals. However, challenges in accuracy, robustness, and practicality persist, especially for cross-subject detection. Multimodal data fusion can improve the estimation of driver fatigue. In this work, we leverage the advantages of multimodal signals and propose TMU-Net, a novel Transformer-based multimodal network with uncertainty quantification for driver fatigue detection, which achieves precise fatigue assessment by integrating electroencephalogram (EEG) and electrooculogram (EOG) signals. The core innovation of TMU-Net lies in its unimodal feature extraction module, which combines causal convolution, ConvSparseAttention, and Transformer encoders to effectively capture spatiotemporal features, and a multimodal fusion module that employs cross-modal attention and uncertainty-weighted gating to dynamically integrate complementary information. By incorporating uncertainty quantification, TMU-Net significantly enhances robustness to noise and individual variability. Experimental validation on the SEED-VIG dataset demonstrates TMU-Net’s superior performance stability across 23 subjects in cross-subject testing, effectively leveraging the complementary strengths of EEG (2 Hz full-band and five-band features) and EOG signals for high-precision fatigue detection. Furthermore, attention heatmap visualization reveals the dynamic interaction mechanisms between EEG and EOG signals, confirming the physiological rationality of TMU-Net’s feature fusion strategy. Practical challenges and future research directions for fatigue detection methods are also discussed.
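The uncertainty-weighted gating described above can be illustrated with a minimal NumPy sketch: each modality branch (e.g. EEG and EOG) emits a feature vector plus a predicted log-variance, and the fused representation weights each branch by its inverse variance. The function name, the use of scalar log-variances, and the exact gating formula are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def uncertainty_weighted_fusion(feats, log_vars):
    """Fuse per-modality features, weighting each by its inverse predicted variance.

    feats:    list of (d,) feature vectors, e.g. [eeg_embedding, eog_embedding]
    log_vars: list of scalar predicted log-variances, one per modality
              (hypothetical outputs of each branch's uncertainty head)
    """
    precisions = np.exp(-np.asarray(log_vars, dtype=float))  # 1 / sigma_i^2
    weights = precisions / precisions.sum()                  # gate values, sum to 1
    fused = sum(w * f for w, f in zip(weights, feats))       # convex combination
    return fused, weights

# Example: the EOG branch reports higher uncertainty, so the EEG branch dominates.
eeg = np.ones(4)
eog = np.zeros(4)
fused, w = uncertainty_weighted_fusion([eeg, eog], [0.0, 2.0])
```

Under this scheme a noisy modality (large predicted variance) is softly down-weighted rather than discarded, which is one plausible reading of how uncertainty quantification improves robustness to noise and individual variability.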
Keywords: driver fatigue detection; multimodal fusion; electroencephalogram (EEG); electrooculogram (EOG); uncertainty quantification

Share and Cite

MDPI and ACS Style

Zhang, Y.; Xu, X.; Du, Y.; Zhang, N. TMU-Net: A Transformer-Based Multimodal Framework with Uncertainty Quantification for Driver Fatigue Detection. Sensors 2025, 25, 5364. https://doi.org/10.3390/s25175364


