Open Access Article
Multimodal Technologies Interact. 2018, 2(4), 71; https://doi.org/10.3390/mti2040071

Catch My Drift: Elevating Situation Awareness for Highly Automated Driving with an Explanatory Windshield Display User Interface

Chair of Human-Machine Communication, TUM Department of Electrical and Computer Engineering, Technical University of Munich, Arcisstr. 21, 80333 München, Germany
* Author to whom correspondence should be addressed.
This paper is an extended version of our paper published in Adjunct Proceedings of the 17th IEEE International Symposium on Mixed and Augmented Reality (ISMAR), Munich, Germany, 16–20 October 2018.
Received: 6 August 2018 / Revised: 13 September 2018 / Accepted: 8 October 2018 / Published: 11 October 2018
(This article belongs to the Special Issue Automotive User Interfaces)

Abstract

Broad access to automated cars (ACs) that can reliably and unconditionally drive in all environments is still some years away. Urban areas pose a particular challenge to ACs, since even perfectly reliable systems may be forced to execute sudden reactive driving maneuvers in hard-to-predict hazardous situations. This may negatively surprise the driver, possibly causing discomfort, anxiety or loss of trust, which might be a risk for the acceptance of the technology in general. To counter this, we suggest an explanatory windshield display interface with augmented reality (AR) elements to support driver situation awareness (SA). It provides the driver with information about the car’s perceptive capabilities and driving decisions. We created a prototype in a human-centered approach and implemented the interface in a mixed-reality driving simulation. We conducted a user study to assess its influence on driver SA. We collected objective SA scores and self-ratings, both of which yielded a significant improvement with our interface in good (medium effect) and in bad (large effect) visibility conditions. We conclude that explanatory AR interfaces could be a viable measure against unwarranted driver discomfort and loss of trust in critical urban situations by elevating SA.
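The abstract reports the SA improvement as a medium effect under good visibility and a large effect under bad visibility, but does not state which statistical test or effect-size measure was used. As a rough, hypothetical sketch only, the following Python snippet shows how a paired comparison of per-participant SA scores with a Cohen's d effect size could be computed; the data, variable names, and choice of test are illustrative assumptions, not taken from the paper.

import numpy as np
from scipy import stats

# Illustrative within-subject SA scores (e.g., proportion of correct probe
# answers per participant) for a baseline and the explanatory AR interface.
# All values are made up for demonstration purposes.
sa_baseline = np.array([0.55, 0.60, 0.48, 0.62, 0.58, 0.50, 0.65, 0.57])
sa_ar = np.array([0.70, 0.72, 0.61, 0.75, 0.69, 0.66, 0.78, 0.71])

# Paired t-test, assuming a within-subject design.
t_stat, p_value = stats.ttest_rel(sa_ar, sa_baseline)

# Cohen's d for paired samples: mean of the differences divided by their SD.
diff = sa_ar - sa_baseline
cohens_d = diff.mean() / diff.std(ddof=1)

print(f"t = {t_stat:.2f}, p = {p_value:.4f}, Cohen's d = {cohens_d:.2f}")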
Keywords: autonomous driving; situation awareness; augmented reality; mixed reality; windshield display; head-up display; user interface; human-centered design; technology acceptance

This is an open access article distributed under the Creative Commons Attribution License (CC BY 4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Supplementary material


Cite This Article

MDPI and ACS Style

Lindemann, P.; Lee, T.-Y.; Rigoll, G. Catch My Drift: Elevating Situation Awareness for Highly Automated Driving with an Explanatory Windshield Display User Interface. Multimodal Technologies Interact. 2018, 2, 71.


Note that from the first issue of 2016, MDPI journals use article numbers instead of page numbers.


Multimodal Technologies Interact. EISSN 2414-4088, published by MDPI AG, Basel, Switzerland.