Open Access Article
HMI Design of Intelligent Vehicles Based on Multimodal Experiments of Driver Emotions
by Tongyue Sun, Yongjia Li and Xihui Yang *
Department of Industrial Design, School of Mechano-Electronic Engineering, Xidian University, Xi’an 710071, China
* Author to whom correspondence should be addressed.
Multimodal Technol. Interact. 2026, 10(3), 33; https://doi.org/10.3390/mti10030033
Submission received: 27 January 2026 / Accepted: 18 March 2026 / Published: 21 March 2026
Abstract
Negative driving emotions are a significant factor compromising road safety. Current intelligent vehicle human-machine interaction (HMI) systems focus predominantly on functional implementation and lack the capability to perceive and adapt to the driver’s psychological state. To address this gap, this study investigates the intrinsic relationship between driving emotions and HMI through three multimodal experiments. Experiment One reveals the distribution patterns of drivers’ visual attentional scope under different emotional states. Experiment Two establishes a color preference model for HMI interfaces corresponding to specific emotions. Experiment Three quantitatively analyzes the impact of emotional variations on the perceptual efficiency of auditory warnings. Based on the experimental data, an interaction design principle matching “Emotion-Scene-Modality” is formulated, guiding the design of a data-driven, emotion-adaptive HMI prototype system. The system perceives the driver’s emotional state in real time via multimodal sensors and dynamically adjusts interface color themes, information layout, warning sound effects, and voice interaction style according to predefined interaction strategies. Usability testing demonstrates that, compared with a traditional static HMI, the emotion-adaptive system effectively mitigates the driver’s negative emotional load and delivers alerts that are more perceptible and less likely to cause irritation at critical moments. The findings offer a theoretical foundation and practical reference for constructing a safer and more comfortable next-generation intelligent vehicle cockpit interaction paradigm.
Share and Cite
MDPI and ACS Style
Sun, T.; Li, Y.; Yang, X.
HMI Design of Intelligent Vehicles Based on Multimodal Experiments of Driver Emotions. Multimodal Technol. Interact. 2026, 10, 33.
https://doi.org/10.3390/mti10030033