Abstract
Autonomous driving is anticipated to increase the safety, efficiency, and accessibility of passenger transportation. Passengers gain freedom in the use of travel time through the opportunity to conduct non-driving related tasks (NDRTs). However, factors such as trust and motion sickness pose challenges to the widespread adoption of this technology. Human–machine interfaces (HMIs) have shown potential for mitigating motion sickness and fostering trust calibration in autonomous vehicles (AVs), e.g., by visualizing upcoming or current maneuvers of the vehicle. The majority of research on such HMIs relies on the passengers’ attention, preventing uninterrupted NDRT execution and thus impeding the automation’s usefulness. In this paper, we present a visual HMI that provides AV passengers with information about current driving maneuvers through their peripheral fields of view. This method of information transmission is compared with conventional in-vehicle displays and LED strips regarding perceptibility and distraction. In a controlled laboratory setting, N = 34 participants experienced each HMI condition, indicating their perception of the maneuver visualizations via joystick input while either focusing on a fixation cross (to measure perceptibility) or solving math tasks (to measure distraction). The peripheral HMIs yielded better maneuver perception and lower distraction from a visual NDRT than the conventional displays. These results yield implications for the design of HMIs for motion sickness mitigation and trust calibration in AVs.