Is Users’ Trust during Automated Driving Different When Using an Ambient Light HMI, Compared to an Auditory HMI?
Abstract
1. Introduction
- Can an ambient peripheral light display (Light-band HMI) be used to improve drivers’ perceived safety and trust during L3 automated driving?
- How effective is a Light-band HMI for facilitating an effective transition of control between L3 automated driving and manual driving, when compared to an auditory HMI alert?
- What is the pattern of drivers’ eye movements during the takeover process for each type of HMI?
2. Methods
2.1. Participants
2.2. Equipment
2.3. Experimental Design
- When automation was available to be engaged, the Light-band pulsed with a blue light at 2 Hz until the driver turned the automation on.
- During automated driving, the Light-band displayed a solid blue light to indicate that the automation was operating normally.
- During takeover requests, the Light-band pulsed with a red light at 2 Hz until the driver resumed manual control.
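The Light-band signalling described above maps automation states to light colour and pulse frequency. A minimal sketch of that mapping follows; the state names, the `light_band_signal` function, and the `(colour, frequency)` tuple representation are our own illustration, not the study's actual implementation:

```python
from enum import Enum, auto

class AutomationState(Enum):
    AVAILABLE = auto()         # automation can be engaged by the driver
    ACTIVE = auto()            # automation is driving normally
    TAKEOVER_REQUEST = auto()  # driver must resume manual control

def light_band_signal(state: AutomationState) -> tuple:
    """Return (colour, pulse_frequency_hz) for the ambient Light-band.

    A pulse frequency of 0.0 denotes a solid (non-flashing) light.
    """
    if state is AutomationState.AVAILABLE:
        return ("blue", 2.0)   # blue pulsing at 2 Hz until automation is engaged
    if state is AutomationState.ACTIVE:
        return ("blue", 0.0)   # solid blue: automation operating normally
    if state is AutomationState.TAKEOVER_REQUEST:
        return ("red", 2.0)    # red pulsing at 2 Hz until manual control resumes
    raise ValueError(f"unknown state: {state}")
```

Keeping the mapping in a single pure function makes the signal design easy to tabulate and test, independently of the hardware driving the LEDs.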
2.3.1. Vehicle Cut-In Scenarios
2.4. Driver Disengagement Algorithm
- Was pulling the stalk to turn automation off.
- Had at least one hand on the steering wheel (determined via the capacitive steering wheel).
- Was looking ahead (i.e., On Road) for a sufficient length of time (see below for thresholds).
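The criteria above can be combined into a single hand-back check, sketched below. The function name, the argument names, and the assumption that all three criteria must hold simultaneously are ours; the study's actual glance-duration thresholds are given in the text:

```python
def should_disengage(stalk_pulled: bool,
                     hands_on_wheel: bool,
                     on_road_glance_duration_s: float,
                     glance_threshold_s: float) -> bool:
    """Decide whether automation should hand control back to the driver.

    Assumes (illustratively) that disengagement requires the driver to be
    pulling the stalk, have at least one hand on the capacitive steering
    wheel, and have looked On Road for at least the threshold duration.
    """
    return (stalk_pulled
            and hands_on_wheel
            and on_road_glance_duration_s >= glance_threshold_s)
```

For example, a driver pulling the stalk with a hand on the wheel but only a brief On Road glance would not yet trigger disengagement under this sketch.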
2.5. Procedure
2.6. Statistical Analyses
3. Results
3.1. Visual Attention during Automated Driving
3.2. Behaviour during the Takeover
3.3. Perception of Safety and Trust
4. Discussion
5. Conclusions
Author Contributions
Data Availability Statement
Acknowledgments
Conflicts of Interest
References
| | Males (N = 20; Mean (SD)) | Females (N = 21; Mean (SD)) |
|---|---|---|
| Age (years) | 44 (13) | 44 (13) |
| Years with license | 25 (13) | 24 (12) |
| Miles driven annually | 10,300 (5332) | 6642 (3350) |
| Question | Response Format | Baseline (Pre-Experiment) | Post Drive 1 | Post Drive 2 |
|---|---|---|---|---|
| I trust that the vehicle will drive safely, while I do the Arrows task | 5-point scale (Strongly disagree-Strongly agree) | x | | |
| I trusted that the vehicle would drive safely while I did the Arrows task | 5-point scale (Strongly disagree-Strongly agree) | | x | x |
| If your level of trust in the automated driving system changed since the start of the experiment, please explain why. | Free text | | x | x |
| I felt safe while doing the Arrows task during automated driving | 5-point scale (Strongly disagree-Strongly agree) | | x | x |
| In this drive, the Light-band/Auditory signal was... | 5-point scale for each Van der Laan Scale item | | x | x |
| How engaged were you with the Arrows task while automation was on? | 10-point (Not at all engaged-Highly engaged) | | | x |
| Apart from takeover requests, was there anything that interrupted your engagement in the Arrows task while automation was on? If so, please explain briefly. | Free text | | | x |
| Which warning system did you prefer? | Light-band/Auditory | | | x |
| | Console | Driver Lap | Driver Mirror | Instrument Cluster | Off Road | On Road | Passenger Mirror | Rear Mirror |
|---|---|---|---|---|---|---|---|---|
| Mann–Whitney U | 11,751 | 12,508 | 12,525 | 10,430 | 12,449 | 11,225 | 12,450 | 12,525 |
| Wilcoxon W | 23,076 | 26,536 | 23,850 | 24,458 | 26,477 | 22,550 | 23,775 | 23,850 |
| Z | −2.04 | −0.11 | 0.00 | −2.98 | −0.23 | −2.01 | −0.95 | 0.00 |
| p | 0.04 | 0.91 | 1.00 | <0.001 | 0.81 | 0.04 | 0.34 | 1.00 |
| Average Probability (Light-band HMI) | 10% | 0% | 0% | 50% | 5% | 35% | 0% | 0% |
| Average Probability (Auditory HMI) | 5% | 0% | 0% | 65% | 6% | 24% | 0% | 0% |
| Effect | HMI Type: F(df1, df2) | p | ηp² | Event Number: F(df1, df2) | p | ηp² | HMI Type × Event Number: F(df1, df2) | p | ηp² |
|---|---|---|---|---|---|---|---|---|---|
| Hands on wheel time | 3.443 (1,29) | 0.074 | 0.106 | 1.553 (4,29) | 0.19 | 0.051 | 1.265 (4,116) | 0.228 | 0.042 |
| Automation disengagement time | 0.364 (1,38) | 0.55 | 0.010 | 0.284 (4,38) | 0.88 | 0.007 | 0.302 (4,152) | 0.867 | 0.008 |
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
© 2023 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Gonçalves, R.C.; Louw, T.; Lee, Y.M.; Madigan, R.; Kuo, J.; Lenné, M.; Merat, N. Is Users’ Trust during Automated Driving Different When Using an Ambient Light HMI, Compared to an Auditory HMI? Information 2023, 14, 260. https://doi.org/10.3390/info14050260