Ghost on the Windshield: Employing a Virtual Human Character to Communicate Pedestrian Acknowledgement and Vehicle Intention
Abstract
1. Introduction
1.1. Background
1.2. Proposed eHMI Concept
1.3. Aim and Hypotheses
2. Method
2.1. Design
2.2. Stimuli
2.3. Apparatus
2.4. Procedure
2.5. Dependent Variable
2.6. Participants
2.7. Data Analysis
3. Results
3.1. Non-Yielding Vehicles
3.2. Cruising Vehicles
3.3. Yielding Vehicles
4. Discussion
4.1. Findings
4.2. Implications
5. Considerations
5.1. Trust and Acceptance
5.2. Limitations
5.3. Future Work
6. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
Correction Statement
Appendix A. Profile of the Proposed eHMI Concept According to Dey et al. [52]
- Target road user: Pedestrians.
- Vehicle type: Passenger cars.
- Modality of communication: Visual (anthropomorphic).
- Colors for visual eHMIs: N/A.
- Covered states: Non-yielding; cruising; yielding.
- Messages of communication in right-of-way negotiation: Situational awareness; intention.
- HMI placement: On the vehicle.
- Number of displays: 1.
- Number of messages: 3.
- Communication strategy: Clear unicast.
- Communication resolution: High.
- Multiple road user addressing capability: Single.
- Communication dependence on distance/time gap: No.
- Complexity to implement: C4 (uses technology that is not yet developed or not widely available on the market).
- Dependence on new vehicle design: No.
- Ability to communicate vehicle occupant state/shared control: No.
- Support for people with special needs: No.
- Evaluation of concept: Yes.
- Time of day: Unspecified.
- Number of simultaneous users per trial: 1.
- Number of simultaneous vehicles per trial: 1.
- Method of evaluation: Monitor-based laboratory experiment.
- Weather conditions: Unspecified.
- Road condition: Unspecified.
- Sample size: 30.
- Sample age: M = 33.1 years, SD = 11.9 years.
- Measures: Performance (accuracy).
Appendix B. Stills Taken from the 3D Animated Sequences
References
- Rasouli, A.; Kotseruba, I.; Tsotsos, J.K. Understanding pedestrian behavior in complex traffic scenes. IEEE Trans. Intell. Veh. 2017, 3, 61–70. [Google Scholar] [CrossRef]
- Markkula, G.; Madigan, R.; Nathanael, D.; Portouli, E.; Lee, Y.M.; Dietrich, A.; Billington, J.; Schieben, A.; Merat, N. Defining interactions: A conceptual framework for understanding interactive behaviour in human and automated road traffic. Theor. Issues Ergon. Sci. 2020, 21, 728–752. [Google Scholar] [CrossRef]
- Färber, B. Communication and communication problems between autonomous vehicles and human drivers. In Autonomous Driving; Springer: Berlin/Heidelberg, Germany, 2016; pp. 125–144. [Google Scholar]
- Sucha, M.; Dostal, D.; Risser, R. Pedestrian-driver communication and decision strategies at marked crossings. Accid. Anal. Prev. 2017, 102, 41–50. [Google Scholar] [CrossRef] [PubMed]
- Llorca, D.F. From driving automation systems to autonomous vehicles: Clarifying the terminology. arXiv 2021, arXiv:2103.10844. [Google Scholar]
- SAE International. Taxonomy and Definitions of Terms Related to Driving Automation Systems for On-Road Motor Vehicles. 2021. Available online: www.sae.org (accessed on 26 January 2022).
- Dey, D.; Terken, J. Pedestrian interaction with vehicles: Roles of explicit and implicit communication. In Proceedings of the 9th International Conference on Automotive User Interfaces and Interactive Vehicular Applications, Oldenburg, Germany, 24–27 September 2017; pp. 109–113. [Google Scholar]
- Moore, D.; Currano, R.; Strack, G.E.; Sirkin, D. The case for implicit external human-machine interfaces for autonomous vehicles. In Proceedings of the 11th International Conference on Automotive User Interfaces and Interactive Vehicular Applications, Utrecht, The Netherlands, 21–25 September 2019; pp. 295–307. [Google Scholar]
- Lee, Y.M.; Madigan, R.; Giles, O.; Garach-Morcillo, L.; Markkula, G.; Fox, C.; Camara, F.; Rothmueller, M.; Vendelbo-Larsen, S.A.; Rasmussen, P.H.; et al. Road users rarely use explicit communication when interacting in today’s traffic: Implications for automated vehicles. Cogn. Technol. Work. 2020, 23, 367–380. [Google Scholar] [CrossRef]
- Guéguen, N.; Eyssartier, C.; Meineri, S. A pedestrian’s smile and drivers’ behavior: When a smile increases careful driving. J. Saf. Res. 2016, 56, 83–88. [Google Scholar] [CrossRef]
- Guéguen, N.; Meineri, S.; Eyssartier, C. A pedestrian’s stare and drivers’ stopping behavior: A field experiment at the pedestrian crossing. Saf. Sci. 2015, 75, 87–89. [Google Scholar] [CrossRef]
- Ren, Z.; Jiang, X.; Wang, W. Analysis of the influence of pedestrians’ eye contact on drivers’ comfort boundary during the crossing conflict. Procedia Eng. 2016, 137, 399–406. [Google Scholar] [CrossRef]
- Nathanael, D.; Portouli, E.; Papakostopoulos, V.; Gkikas, K.; Amditis, A. Naturalistic observation of interactions between car drivers and pedestrians in high density urban settings. In Congress of the International Ergonomics Association; Springer: Berlin/Heidelberg, Germany, 2018; pp. 389–397. [Google Scholar]
- Dey, D.; Walker, F.; Martens, M.; Terken, J. Gaze patterns in pedestrian interaction with vehicles: Towards effective design of external human-machine interfaces for automated vehicles. In Proceedings of the 11th International Conference on Automotive User Interfaces and Interactive Vehicular Applications, Utrecht, The Netherlands, 21–25 September 2019; pp. 369–378. [Google Scholar]
- Eisma, Y.B.; Van Bergen, S.; Ter Brake, S.M.; Hensen, M.T.T.; Tempelaar, W.J.; De Winter, J.C.F. External Human–Machine Interfaces: The Effect of Display Location on Crossing Intentions and Eye Movements. Information 2020, 11, 13. [Google Scholar] [CrossRef]
- Uttley, J.; Lee, Y.M.; Madigan, R.; Merat, N. Road user interactions in a shared space setting: Priority and communication in a UK car park. Transp. Res. Part F Traffic Psychol. Behav. 2020, 72, 32–46. [Google Scholar] [CrossRef]
- de Winter, J.; Bazilinskyy, P.; Wesdorp, D.; de Vlam, V.; Hopmans, B.; Visscher, J.; Dodou, D. How do pedestrians distribute their visual attention when walking through a parking garage? An eye-tracking study. Ergonomics 2021, 64, 793–805. [Google Scholar] [CrossRef]
- Kong, X.; Das, S.; Zhang, Y.; Xiao, X. Lessons learned from pedestrian-driver communication and yielding patterns. Transp. Res. Part F Traffic Psychol. Behav. 2021, 79, 35–48. [Google Scholar] [CrossRef]
- Onkhar, V.; Bazilinskyy, P.; Dodou, D.; De Winter, J.C.F. The effect of drivers’ eye contact on pedestrians’ perceived safety. Transp. Res. Part F Traffic Psychol. Behav. 2022, 84, 194–210. [Google Scholar] [CrossRef]
- Lobjois, R.; Cavallo, V. Age-related differences in street-crossing decisions: The effects of vehicle speed and time constraints on gap selection in an estimation task. Accid. Anal. Prev. 2007, 39, 934–943. [Google Scholar] [CrossRef]
- Sun, R.; Zhuang, X.; Wu, C.; Zhao, G.; Zhang, K. The estimation of vehicle speed and stopping distance by pedestrians crossing streets in a naturalistic traffic environment. Transp. Res. Part F Traffic Psychol. Behav. 2015, 30, 97–106. [Google Scholar] [CrossRef]
- Papić, Z.; Jović, A.; Simeunović, M.; Saulić, N.; Lazarević, M. Underestimation tendencies of vehicle speed by pedestrians when crossing unmarked roadway. Accid. Anal. Prev. 2020, 143, 105586. [Google Scholar] [CrossRef] [PubMed]
- ISO/TR 23049:2018; Road Vehicles: Ergonomic aspects of external visual communication from automated vehicles to other road users. BSI: London, UK, 2018.
- Merat, N.; Louw, T.; Madigan, R.; Wilbrink, M.; Schieben, A. What externally presented information do VRUs require when interacting with fully Automated Road Transport Systems in shared space? Accid. Anal. Prev. 2018, 118, 244–252. [Google Scholar] [CrossRef] [PubMed]
- Rasouli, A.; Tsotsos, J.K. Autonomous vehicles that interact with pedestrians: A survey of theory and practice. IEEE Trans. Intell. Transp. Syst. 2019, 21, 900–918. [Google Scholar] [CrossRef]
- Rouchitsas, A.; Alm, H. External human–machine interfaces for autonomous vehicle-to-pedestrian communication: A review of empirical work. Front. Psychol. 2019, 10, 2757. [Google Scholar] [CrossRef]
- Schieben, A.; Wilbrink, M.; Kettwich, C.; Madigan, R.; Louw, T.; Merat, N. Designing the interaction of automated vehicles with other traffic participants: Design considerations based on human needs and expectations. Cogn. Technol. Work. 2019, 21, 69–85. [Google Scholar] [CrossRef]
- Carmona, J.; Guindel, C.; Garcia, F.; de la Escalera, A. eHMI: Review and Guidelines for Deployment on Autonomous Vehicles. Sensors 2021, 21, 2912. [Google Scholar] [CrossRef]
- Ezzati Amini, R.; Katrakazas, C.; Riener, A.; Antoniou, C. Interaction of automated driving systems with pedestrians: Challenges, current solutions, and recommendations for eHMIs. Transp. Rev. 2021, 41, 788–813. [Google Scholar] [CrossRef]
- Tabone, W.; de Winter, J.; Ackermann, C.; Bärgman, J.; Baumann, M.; Deb, S.; Stanton, N.A. Vulnerable road users and the coming wave of automated vehicles: Expert perspectives. Transp. Res. Interdiscip. Perspect. 2021, 9, 100293. [Google Scholar] [CrossRef]
- Böckle, M.P.; Brenden, A.P.; Klingegård, M.; Habibovic, A.; Bout, M. SAV2P: Exploring the impact of an interface for shared automated vehicles on pedestrians’ experience. In Proceedings of the 9th International Conference on Automotive User Interfaces and Interactive Vehicular Applications Adjunct, Oldenburg, Germany, 24–27 September 2017; pp. 136–140. [Google Scholar]
- Chang, C.M.; Toda, K.; Sakamoto, D.; Igarashi, T. Eyes on a Car: An Interface Design for Communication between an Autonomous Car and a Pedestrian. In Proceedings of the 9th International Conference on Automotive User Interfaces and Interactive Vehicular Applications, Oldenburg, Germany, 24–27 September 2017; pp. 65–73. [Google Scholar]
- Costa, G. Designing Framework for Human-Autonomous Vehicle Interaction. Master’s Thesis, Keio University Graduate School of Media Design, Minato, Japan, 2017. [Google Scholar]
- Deb, S.; Strawderman, L.J.; Carruth, D.W. Investigating pedestrian suggestions for external features on fully autonomous vehicles: A virtual reality experiment. Transp. Res. Part F Traffic Psychol. Behav. 2018, 59, 135–149. [Google Scholar] [CrossRef]
- Habibovic, A.; Lundgren, V.M.; Andersson, J.; Klingegård, M.; Lagström, T.; Sirkka, A.; Fagerlönn, J.; Edgren, C.; Fredriksson, R.; Krupenia, S.; et al. Communicating intent of automated vehicles to pedestrians. Front. Psychol. 2018, 9, 1336. [Google Scholar] [CrossRef] [PubMed]
- Hudson, C.R.; Deb, S.; Carruth, D.W.; McGinley, J.; Frey, D. Pedestrian perception of autonomous vehicles with external interacting features. In Proceedings of the International Conference on Applied Human Factors and Ergonomics, Orlando, FL, USA, 22–26 July 2018; Springer: Berlin/Heidelberg, Germany, 2018; pp. 33–39. [Google Scholar]
- Mahadevan, K.; Somanath, S.; Sharlin, E. Communicating awareness and intent in autonomous vehicle-pedestrian interaction. In Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems, Montreal, QC, Canada, 21–26 April 2018; pp. 1–12. [Google Scholar]
- Othersen, I.; Conti-Kufner, A.S.; Dietrich, A.; Maruhn, P.; Bengler, K. Designing for automated vehicle and pedestrian communication: Perspectives on eHMIs from older and younger persons. Proc. Hum. Factors Ergon. Soc. Eur. 2018, 4959, 135–148. [Google Scholar]
- Petzoldt, T.; Schleinitz, K.; Banse, R. Potential safety effects of a frontal brake light for motor vehicles. IET Intell. Transp. Syst. 2018, 12, 449–453. [Google Scholar] [CrossRef]
- Song, Y.E.; Lehsing, C.; Fuest, T.; Bengler, K. External HMIs and their effect on the interaction between pedestrians and automated vehicles. In Proceedings of the International Conference on Intelligent Human Systems Integration, Dubai, United Arab Emirates, 7–9 January 2018; Springer: Berlin/Heidelberg, Germany, 2018; pp. 13–18. [Google Scholar]
- de Clercq, K.; Dietrich, A.; Núñez Velasco, J.P.; De Winter, J.; Happee, R. External human-machine interfaces on automated vehicles: Effects on pedestrian crossing decisions. Hum. Factors 2019, 61, 1353–1370. [Google Scholar] [CrossRef]
- Holländer, K.; Colley, A.; Mai, C.; Häkkilä, J.; Alt, F.; Pfleging, B. Investigating the influence of external car displays on pedestrians’ crossing behavior in virtual reality. In Proceedings of the 21st International Conference on Human-Computer Interaction with Mobile Devices and Services, Taipei, Taiwan, 1–4 October 2019; pp. 1–11. [Google Scholar]
- Stadler, S.; Cornet, H.; Novaes Theoto, T.; Frenkler, F. A tool, not a toy: Using virtual reality to evaluate the communication between autonomous vehicles and pedestrians. In Augmented Reality and Virtual Reality; Springer: Berlin/Heidelberg, Germany, 2019; pp. 203–216. [Google Scholar]
- Ackermans, S.C.A.; Dey, D.D.; Ruijten, P.A.; Cuijpers, R.H.; Pfleging, B. The effects of explicit intention communication, conspicuous sensors, and pedestrian attitude in interactions with automated vehicles. In Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems, Honolulu, HI, USA, 25–30 April 2020; Association for Computing Machinery, Inc.: New York, NY, USA, 2020; p. 70. [Google Scholar]
- Faas, S.M.; Mathis, L.A.; Baumann, M. External HMI for self-driving vehicles: Which information shall be displayed? Transp. Res. Part F Traffic Psychol. Behav. 2020, 68, 171–186. [Google Scholar] [CrossRef]
- Singer, T.; Kobbert, J.; Zandi, B.; Khanh, T.Q. Displaying the driving state of automated vehicles to other road users: An international, virtual reality-based study as a first step for the harmonized regulations of novel signaling devices. IEEE Trans. Intell. Transp. Syst. 2020, 23, 2904–2918. [Google Scholar] [CrossRef]
- Lee, Y.M.; Madigan, R.; Uzondu, C.; Garcia, J.; Romano, R.; Markkula, G.; Merat, N. Learning to interpret novel eHMI: The effect of vehicle kinematics and eHMI familiarity on pedestrians’ crossing behavior. J. Saf. Res. 2022, 80, 270–280. [Google Scholar] [CrossRef] [PubMed]
- Wilbrink, M.; Lau, M.; Illgner, J.; Schieben, A.; Oehl, M. Impact of External Human–Machine Interface Communication Strategies of Automated Vehicles on Pedestrians’ Crossing Decisions and Behaviors in an Urban Environment. Sustainability 2021, 13, 8396. [Google Scholar] [CrossRef]
- Clamann, M.; Aubert, M.; Cummings, M.L. Evaluation of vehicle-to-pedestrian communication displays for autonomous vehicles. In Proceedings of the Transportation Research Board 96th Annual Meeting, Washington, DC, USA, 8–12 January 2017. No. 17-02119. [Google Scholar]
- Li, Y.; Dikmen, M.; Hussein, T.G.; Wang, Y.; Burns, C. To cross or not to cross: Urgency-based external warning displays on autonomous vehicles to improve pedestrian crossing safety. In Proceedings of the 10th International Conference on Automotive User Interfaces and Interactive Vehicular Applications, Toronto, ON, Canada, 23–25 September 2018; pp. 188–197. [Google Scholar]
- Hensch, A.C.; Neumann, I.; Beggiato, M.; Halama, J.; Krems, J.F. How should automated vehicles communicate?—Effects of a light-based communication approach in a Wizard-of-Oz study. In Proceedings of the International Conference on Applied Human Factors and Ergonomics, Washington, DC, USA, 24–28 July 2019; Springer: Berlin/Heidelberg, Germany, 2019; pp. 79–91. [Google Scholar]
- Dey, D.; Habibovic, A.; Löcken, A.; Wintersberger, P.; Pfleging, B.; Riener, A.; Martens, M.; Terken, J. Taming the eHMI jungle: A classification taxonomy to guide, compare, and assess the design principles of automated vehicles’ external human-machine interfaces. Transp. Res. Interdiscip. Perspect. 2020, 7, 100174. [Google Scholar] [CrossRef]
- Bevan, N.; Carter, J.; Harker, S. ISO 9241-11 revised: What have we learnt about usability since 1998? In Proceedings of the International Conference on Human-Computer Interaction, Bamberg, Germany, 14–18 September 2015; Springer: Berlin/Heidelberg, Germany, 2015; pp. 143–151. [Google Scholar]
- Fridman, L.; Mehler, B.; Xia, L.; Yang, Y.; Facusse, L.Y.; Reimer, B. To walk or not to walk: Crowdsourced assessment of external vehicle-to-pedestrian displays. arXiv 2017, arXiv:1707.02698. [Google Scholar]
- Ackermann, C.; Beggiato, M.; Schubert, S.; Krems, J.F. An experimental study to investigate design and assessment criteria: What is important for communication between pedestrians and automated vehicles? Appl. Ergon. 2019, 75, 272–282. [Google Scholar] [CrossRef] [PubMed]
- Bazilinskyy, P.; Dodou, D.; De Winter, J. Survey on eHMI concepts: The effect of text, color, and perspective. Transp. Res. Part F Traffic Psychol. Behav. 2019, 67, 175–194. [Google Scholar] [CrossRef]
- Eisma, Y.B.; Reiff, A.; Kooijman, L.; Dodou, D.; De Winter, J.C.F. External human-machine interfaces: Effects of message perspective. Transp. Res. Part F Traffic Psychol. Behav. 2021, 78, 30–41. [Google Scholar] [CrossRef]
- Zhang, J.; Vinkhuyzen, E.; Cefkin, M. Evaluation of an autonomous vehicle external communication system concept: A survey study. In Proceedings of the International Conference on Applied Human Factors and Ergonomics, Los Angeles, CA, USA, 17–21 July 2017; Springer: Berlin/Heidelberg, Germany, 2017; pp. 650–661. [Google Scholar]
- Alvarez, W.M.; de Miguel, M.Á.; García, F.; Olaverri-Monreal, C. Response of Vulnerable Road Users to Visual Information from Autonomous Vehicles in Shared Spaces. In Proceedings of the 2019 IEEE Intelligent Transportation Systems Conference (ITSC), Auckland, New Zealand, 27–30 October 2019; IEEE: Piscataway, NJ, USA, 2019; pp. 3714–3719. [Google Scholar]
- Chang, C.M. A Gender Study of Communication Interfaces between an Autonomous Car and a Pedestrian. In Proceedings of the 12th International Conference on Automotive User Interfaces and Interactive Vehicular Applications, Virtual Event, 21–22 September 2020; pp. 42–45. [Google Scholar]
- Mirnig, N.; Perterer, N.; Stollnberger, G.; Tscheligi, M. Three strategies for autonomous car-to-pedestrian communication: A survival guide. In Proceedings of the Companion of the 2017 ACM/IEEE International Conference on Human-Robot Interaction, Vienna, Austria, 6–9 March 2017; pp. 209–210. [Google Scholar]
- Wickens, C.D.; Gordon, S.E.; Liu, Y.; Lee, J. An Introduction to Human Factors Engineering; Pearson Prentice Hall: Upper Saddle River, NJ, USA, 2004; Volume 2. [Google Scholar]
- Schilbach, L.; Wohlschlaeger, A.M.; Kraemer, N.C.; Newen, A.; Shah, N.J.; Fink, G.R.; Vogeley, K. Being with virtual others: Neural correlates of social interaction. Neuropsychologia 2006, 44, 718–730. [Google Scholar] [CrossRef]
- Kuzmanovic, B.; Georgescu, A.L.; Eickhoff, S.B.; Shah, N.J.; Bente, G.; Fink, G.R.; Vogeley, K. Duration matters: Dissociating neural correlates of detection and evaluation of social gaze. Neuroimage 2009, 46, 1154–1163. [Google Scholar] [CrossRef] [PubMed]
- Schrammel, F.; Pannasch, S.; Graupner, S.T.; Mojzisch, A.; Velichkovsky, B.M. Virtual friend or threat? The effects of facial expression and gaze interaction on psychophysiological responses and emotional experience. Psychophysiology 2009, 46, 922–931. [Google Scholar] [CrossRef] [PubMed]
- Georgescu, A.L.; Kuzmanovic, B.; Schilbach, L.; Tepest, R.; Kulbida, R.; Bente, G.; Vogeley, K. Neural correlates of “social gaze” processing in high-functioning autism under systematic variation of gaze duration. NeuroImage Clin. 2013, 3, 340–351. [Google Scholar] [CrossRef] [PubMed]
- Parsons, T.D. Virtual reality for enhanced ecological validity and experimental control in the clinical, affective, and social neurosciences. Front. Hum. Neurosci. 2015, 9, 660. [Google Scholar] [CrossRef]
- Parsons, T.D.; Gaggioli, A.; Riva, G. Virtual reality for research in social neuroscience. Brain Sci. 2017, 7, 42. [Google Scholar] [CrossRef]
- Dobs, K.; Bülthoff, I.; Schultz, J. Use and usefulness of dynamic face stimuli for face perception studies–a review of behavioral findings and methodology. Front. Psychol. 2018, 9, 1355. [Google Scholar] [CrossRef] [PubMed]
- Georgescu, A.L.; Kuzmanovic, B.; Roth, D.; Bente, G.; Vogeley, K. The use of virtual characters to assess and train non-verbal communication in high-functioning autism. Front. Hum. Neurosci. 2014, 8, 807. [Google Scholar] [CrossRef] [PubMed]
- Biocca, F.; Harms, C.; Burgoon, J.K. Toward a more robust theory and measure of social presence: Review and suggested criteria. Presence Teleoperators Virtual Environ. 2003, 12, 456–480. [Google Scholar] [CrossRef]
- Picard, R.W. Affective Computing; MIT Press: Cambridge, MA, USA, 2000. [Google Scholar]
- Scherer, K.R.; Bänziger, T.; Roesch, E. (Eds.) A Blueprint for Affective Computing: A Sourcebook and Manual; Oxford University Press: Oxford, UK, 2010. [Google Scholar]
- Cassell, J.; Thorisson, K.R. The power of a nod and a glance: Envelope vs. emotional feedback in animated conversational agents. Appl. Artif. Intell. 1999, 13, 519–538. [Google Scholar] [CrossRef]
- Pütten, A.V.D.; Reipen, C.; Wiedmann, A.; Kopp, S.; Krämer, N.C. Comparing emotional vs. envelope feedback for ECAs. In Proceedings of the International Workshop on Intelligent Virtual Agents, Tokyo, Japan, 1–3 September 2008; Springer: Berlin/Heidelberg, Germany, 2008; pp. 550–551. [Google Scholar]
- Ochs, M.; Niewiadomski, R.; Pelachaud, C. How a virtual agent should smile? In Proceedings of the International Conference on Intelligent Virtual Agents, Philadelphia, PA, USA, 20–22 September 2010; Springer: Berlin/Heidelberg, Germany, 2010; pp. 427–440. [Google Scholar]
- Wang, N.; Gratch, J. Don’t just stare at me! In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, Atlanta, GA, USA, 10–15 April 2010; pp. 1241–1250. [Google Scholar]
- McDonnell, R.; Breidt, M.; Bülthoff, H.H. Render me real? Investigating the effect of render style on the perception of animated virtual humans. ACM Trans. Graph. (TOG) 2012, 31, 91. [Google Scholar] [CrossRef]
- Wong, J.W.E.; McGee, K. Frown more, talk more: Effects of facial expressions in establishing conversational rapport with virtual agents. In Proceedings of the International Conference on Intelligent Virtual Agents, Santa Cruz, CA, USA, 12–14 September 2012; Springer: Berlin/Heidelberg, Germany, 2012; pp. 419–425. [Google Scholar]
- Aljaroodi, H.M.; Adam, M.T.; Chiong, R.; Teubner, T. Avatars and embodied agents in experimental information systems research: A systematic review and conceptual framework. Australas. J. Inf. Syst. 2019, 23. [Google Scholar] [CrossRef]
- Furuya, H.; Kim, K.; Bruder, G.; Wisniewski, P.J.; Welch, G.F. Autonomous Vehicle Visual Embodiment for Pedestrian Interactions in Crossing Scenarios: Virtual Drivers in AVs for Pedestrian Crossing. In Proceedings of the Extended Abstracts of the 2021 CHI Conference on Human Factors in Computing Systems, Yokohama, Japan, 8–13 May 2021; pp. 1–7. [Google Scholar]
- Underwood, G. Visual attention and the transition from novice to advanced driver. Ergonomics 2007, 50, 1235–1249. [Google Scholar] [CrossRef] [PubMed]
- Tafaj, E.; Kübler, T.C.; Kasneci, G.; Rosenstiel, W.; Bogdan, M. Online classification of eye tracking data for automated analysis of traffic hazard perception. In Proceedings of the International Conference on Artificial Neural Networks, Sofia, Bulgaria, 10–13 September 2013; Springer: Berlin/Heidelberg, Germany, 2013; pp. 442–450. [Google Scholar]
- Kaya, N.; Girgis, J.; Hansma, B.; Donmez, B. Hey, watch where you’re going! An on-road study of driver scanning failures towards pedestrians and cyclists. Accid. Anal. Prev. 2021, 162, 106380. [Google Scholar] [CrossRef]
- Von Grünau, M.; Anston, C. The detection of gaze direction: A stare-in-the-crowd effect. Perception 1995, 24, 1297–1313. [Google Scholar] [CrossRef] [PubMed]
- Emery, N.J. The eyes have it: The neuroethology, function, and evolution of social gaze. Neurosci. Biobehav. Rev. 2000, 24, 581–604. [Google Scholar] [CrossRef]
- Senju, A.; Hasegawa, T. Direct gaze captures visuospatial attention. Vis. Cogn. 2005, 12, 127–144. [Google Scholar] [CrossRef]
- Senju, A.; Johnson, M.H. The eye contact effect: Mechanisms and development. Trends Cogn. Sci. 2009, 13, 127–134. [Google Scholar] [CrossRef] [PubMed]
- Conty, L.; George, N.; Hietanen, J.K. Watching eyes effects: When others meet the self. Conscious. Cogn. 2016, 45, 184–197. [Google Scholar] [CrossRef] [PubMed]
- Hamilton, A.F.D.C. Gazing at me: The importance of social meaning in understanding direct-gaze cues. Philos. Trans. R. Soc. B Biol. Sci. 2016, 371, 20150080. [Google Scholar] [CrossRef] [PubMed]
- Frith, C.D.; Frith, U. Interacting minds—A biological basis. Science 1999, 286, 1692–1695. [Google Scholar] [CrossRef]
- Gallagher, H.L.; Frith, C.D. Functional imaging of ‘theory of mind’. Trends Cogn. Sci. 2003, 7, 77–83. [Google Scholar] [CrossRef]
- Krumhuber, E.G.; Kappas, A.; Manstead, A.S. Effects of dynamic aspects of facial expressions: A review. Emot. Rev. 2013, 5, 41–46. [Google Scholar] [CrossRef]
- Horstmann, G. What do facial expressions convey: Feeling states, behavioral intentions, or actions requests? Emotion 2003, 3, 150. [Google Scholar] [CrossRef]
- Hess, U.; Adams, R.B., Jr.; Kleck, R.E. When Two Do the Same, It Might Not Mean the Same: The Perception of Emotional Expressions Shown by Men and Women. In Group Dynamics and Emotional Expression; Hess, U., Philippot, P., Eds.; Cambridge University Press: Cambridge, UK, 2007; pp. 33–50. [Google Scholar]
- Scherer, K.R.; Grandjean, D. Facial expressions allow inference of both emotions and their components. Cogn. Emot. 2008, 22, 789–801. [Google Scholar] [CrossRef]
- Ekman, P. Facial expressions of emotion: New findings, new questions. Psychol. Sci. 1992, 3, 34–38. [Google Scholar] [CrossRef]
- Berkowitz, L.; Harmon-Jones, E. Toward an understanding of the determinants of anger. Emotion 2004, 4, 107. [Google Scholar] [CrossRef] [PubMed]
- Schützwohl, A. Approach and avoidance during routine behavior and during surprise in a non-evaluative task: Surprise matters and so does the valence of the surprising event. Front. Psychol. 2018, 9, 826. [Google Scholar] [CrossRef]
- Reisenzein, R.; Horstmann, G.; Schützwohl, A. The cognitive-evolutionary model of surprise: A review of the evidence. Top. Cogn. Sci. 2019, 11, 50–74. [Google Scholar] [CrossRef]
- Nusseck, M.; Cunningham, D.W.; Wallraven, C.; Bülthoff, H.H. The contribution of different facial regions to the recognition of conversational expressions. J. Vis. 2008, 8, 1. [Google Scholar] [CrossRef] [PubMed]
- Cunningham, D.W.; Wallraven, C. Dynamic information for the recognition of conversational expressions. J. Vis. 2009, 9, 7. [Google Scholar] [CrossRef] [PubMed]
- Kaulard, K.; Cunningham, D.W.; Bülthoff, H.H.; Wallraven, C. The MPI facial expression database—A validated database of emotional and conversational facial expressions. PLoS ONE 2012, 7, e32321. [Google Scholar] [CrossRef]
- Kendon, A. Some uses of the head shake. Gesture 2002, 2, 147–182. [Google Scholar] [CrossRef]
- Guidetti, M. Yes or no? How young French children combine gestures and speech to agree and refuse. J. Child Lang. 2005, 32, 911–924. [Google Scholar] [CrossRef] [PubMed]
- Andonova, E.; Taylor, H.A. Nodding in dis/agreement: A tale of two cultures. Cogn. Process. 2012, 13, 79–82. [Google Scholar] [CrossRef] [PubMed]
- Fusaro, M.; Vallotton, C.D.; Harris, P.L. Beside the point: Mothers’ head nodding and shaking gestures during parent–child play. Infant Behav. Dev. 2014, 37, 235–247. [Google Scholar] [CrossRef] [PubMed]
- Osugi, T.; Kawahara, J.I. Effects of Head Nodding and Shaking Motions on Perceptions of Likeability and Approachability. Perception 2018, 47, 16–29. [Google Scholar] [CrossRef]
- Moretti, S.; Greco, A. Nodding and shaking of the head as simulated approach and avoidance responses. Acta Psychol. 2020, 203, 102988. [Google Scholar] [CrossRef]
- Semcon. The Smiling Car. 2016. Available online: https://semcon.com/uk/smilingcar/ (accessed on 21 April 2022).
- Becker, D.V.; Kenrick, D.T.; Neuberg, S.L.; Blackwell, K.C.; Smith, D.M. The confounded nature of angry men and happy women. J. Personal. Soc. Psychol. 2007, 92, 179. [Google Scholar] [CrossRef]
- Niedenthal, P.M.; Mermillod, M.; Maringer, M.; Hess, U. The Simulation of Smiles (SIMS) model: Embodied simulation and the meaning of facial expression. Behav. Brain Sci. 2010, 33, 417. [Google Scholar] [CrossRef] [PubMed]
- Barrett, L.F.; Adolphs, R.; Marsella, S.; Martinez, A.M.; Pollak, S.D. Emotional expressions reconsidered: Challenges to inferring emotion from human facial movements. Psychol. Sci. Public Interest 2019, 20, 1–68. [Google Scholar] [CrossRef] [PubMed]
- Weber, M.; Giacomin, J.; Malizia, A.; Skrypchuk, L.; Gkatzidou, V.; Mouzakitis, A. Investigation of the dependency of the drivers’ emotional experience on different road types and driving conditions. Transp. Res. Part F Traffic Psychol. Behav. 2019, 65, 107–120. [Google Scholar] [CrossRef]
- Popuşoi, S.A.; Havârneanu, G.M.; Havârneanu, C.E. “Get the f#∗k out of my way!” Exploring the cathartic effect of swear words in coping with driving anger. Transp. Res. Part F Traffic Psychol. Behav. 2018, 56, 215–226. [Google Scholar]
- Stephens, A.N.; Lennon, A.; Bihler, C.; Trawley, S. The measure for angry drivers (MAD). Transp. Res. Part F Traffic Psychol. Behav. 2019, 64, 472–484. [Google Scholar] [CrossRef]
- Deffenbacher, J.L.; Lynch, R.S.; Oetting, E.R.; Swaim, R.C. The Driving Anger Expression Inventory: A measure of how people express their anger on the road. Behav. Res. Ther. 2002, 40, 717–737. [Google Scholar] [CrossRef]
- Kim, H.; Somerville, L.H.; Johnstone, T.; Alexander, A.L.; Whalen, P.J. Inverse amygdala and medial prefrontal cortex responses to surprised faces. Neuroreport 2003, 14, 2317–2322. [Google Scholar] [CrossRef] [PubMed]
- Marsh, A.A.; Ambady, N.; Kleck, R.E. The effects of fear and anger facial expressions on approach-and avoidance-related behaviors. Emotion 2005, 5, 119. [Google Scholar] [CrossRef]
- Neta, M.; Davis, F.C.; Whalen, P.J. Valence resolution of ambiguous facial expressions using an emotional oddball task. Emotion 2011, 11, 1425. [Google Scholar] [CrossRef] [PubMed]
- Neta, M.; Whalen, P.J. The primacy of negative interpretations when resolving the valence of ambiguous facial expressions. Psychol. Sci. 2010, 21, 901–907. [Google Scholar] [CrossRef] [PubMed]
- Yamada, H.; Matsuda, T.; Watari, C.; Suenaga, T. Dimensions of visual information for categorizing facial expressions of emotion. Jpn. Psychol. Res. 1994, 35, 172–181. [Google Scholar] [CrossRef]
- Tottenham, N.; Tanaka, J.W.; Leon, A.C.; McCarry, T.; Nurse, M.; Hare, T.A.; Nelson, C. The NimStim set of facial expressions: Judgments from untrained research participants. Psychiatry Res. 2009, 168, 242–249. [Google Scholar] [CrossRef]
- Wu, S.; Sun, S.; Camilleri, J.A.; Eickhoff, S.B.; Yu, R. Better the devil you know than the devil you don’t: Neural processing of risk and ambiguity. NeuroImage 2021, 236, 118109. [Google Scholar] [CrossRef]
- Alter, A.L.; Oppenheimer, D.M.; Epley, N.; Eyre, R.N. Overcoming intuition: Metacognitive difficulty activates analytic reasoning. J. Exp. Psychol. Gen. 2007, 136, 569. [Google Scholar] [CrossRef] [PubMed]
- Naples, A.; Nguyen-Phuc, A.; Coffman, M.; Kresse, A.; Faja, S.; Bernier, R.; McPartland, J.C. A computer-generated animated face stimulus set for psychophysiological research. Behav. Res. Methods 2015, 47, 562–570. [Google Scholar] [CrossRef]
- Nelson, N.L.; Russell, J.A. Building emotion categories: Children use a process of elimination when they encounter novel expressions. J. Exp. Child Psychol. 2016, 151, 120–130. [Google Scholar] [CrossRef]
- Wiese, E.; Metta, G.; Wykowska, A. Robots as intentional agents: Using neuroscientific methods to make robots appear more social. Front. Psychol. 2017, 8, 1663. [Google Scholar] [CrossRef] [PubMed]
- Gamer, M.; Hecht, H. Are you looking at me? Measuring the cone of gaze. J. Exp. Psychol. Hum. Percept. Perform. 2007, 33, 705. [Google Scholar] [CrossRef]
- Onkhar, V.; Bazilinskyy, P.; Stapel, J.C.J.; Dodou, D.; Gavrila, D.; De Winter, J.C.F. Towards the detection of driver–pedestrian eye contact. Pervasive Mob. Comput. 2021, 76, 101455. [Google Scholar] [CrossRef]
- Kohler, C.G.; Turner, T.; Stolar, N.M.; Bilker, W.B.; Brensinger, C.M.; Gur, R.E.; Gur, R.C. Differences in facial expressions of four universal emotions. Psychiatry Res. 2004, 128, 235–244. [Google Scholar] [CrossRef]
- Ambadar, Z.; Cohn, J.F.; Reed, L.I. All smiles are not created equal: Morphology and timing of smiles perceived as amused, polite, and embarrassed/nervous. J. Nonverbal Behav. 2009, 33, 17–34. [Google Scholar] [CrossRef] [PubMed]
- Helwig, N.E.; Sohre, N.E.; Ruprecht, M.R.; Guy, S.J.; Lyford-Pike, S. Dynamic properties of successful smiles. PLoS ONE 2017, 12, e0179708. [Google Scholar] [CrossRef] [PubMed]
- N’Diaye, K.; Sander, D.; Vuilleumier, P. Self-relevance processing in the human amygdala: Gaze direction, facial expression, and emotion intensity. Emotion 2009, 9, 798. [Google Scholar] [CrossRef]
- Bantoft, C.; Summers, M.J.; Tranent, P.J.; Palmer, M.A.; Cooley, P.D.; Pedersen, S.J. Effect of standing or walking at a workstation on cognitive function: A randomized counterbalanced trial. Hum. Factors 2016, 58, 140–149. [Google Scholar] [CrossRef] [PubMed]
- Kang, S.H.; Lee, J.; Jin, S. Effect of standing desk use on cognitive performance and physical workload while engaged with high cognitive demand tasks. Appl. Ergon. 2021, 92, 103306. [Google Scholar] [CrossRef] [PubMed]
- Kaß, C.; Schoch, S.; Naujoks, F.; Hergeth, S.; Keinath, A.; Neukum, A. Standardized Test Procedure for External Human–Machine Interfaces of Automated Vehicles. Information 2020, 11, 173. [Google Scholar] [CrossRef]
- Field, A. Discovering Statistics Using IBM SPSS Statistics; Sage: Los Angeles, CA, USA, 2013. [Google Scholar]
- Smith, M.L.; Cottrell, G.W.; Gosselin, F.; Schyns, P.G. Transmitting and decoding facial expressions. Psychol. Sci. 2005, 16, 184–189. [Google Scholar] [CrossRef] [PubMed]
- Summers, R.J.; Meese, T.S. The influence of fixation points on contrast detection and discrimination of patches of grating: Masking and facilitation. Vis. Res. 2009, 49, 1894–1900. [Google Scholar] [CrossRef] [PubMed]
- Reisberg, D. Cognition: Exploring the Science of the Mind: Sixth International Student Edition; WW Norton & Company: New York, NY, USA, 2015. [Google Scholar]
- Richler, J.J.; Mack, M.L.; Gauthier, I.; Palmeri, T.J. Holistic processing of faces happens at a glance. Vis. Res. 2009, 49, 2856–2861. [Google Scholar] [CrossRef] [PubMed]
- Hershler, O.; Hochstein, S. At first sight: A high-level pop out effect for faces. Vis. Res. 2005, 45, 1707–1724. [Google Scholar] [CrossRef]
- Jing, P.; Xu, G.; Chen, Y.; Shi, Y.; Zhan, F. The determinants behind the acceptance of autonomous vehicles: A systematic review. Sustainability 2020, 12, 1719. [Google Scholar] [CrossRef]
- Tapiro, H.; Oron-Gilad, T.; Parmet, Y. Pedestrian distraction: The effects of road environment complexity and age on pedestrian’s visual attention and crossing behavior. J. Saf. Res. 2020, 72, 101–109. [Google Scholar] [CrossRef]
- Bainbridge, L. Ironies of automation. In Analysis, Design and Evaluation of Man–Machine Systems; Pergamon Press: Oxford, UK, 1983; pp. 129–135. [Google Scholar]
- Reason, J. Understanding adverse events: Human factors. BMJ Qual. Saf. 1995, 4, 80–89. [Google Scholar] [CrossRef]
- Colley, M.; Walch, M.; Rukzio, E. Unveiling the Lack of Scalability in Research on External Communication of Autonomous Vehicles. In Proceedings of the Extended Abstracts of the 2020 CHI Conference on Human Factors in Computing Systems, Honolulu, HI, USA, 25–30 April 2020; pp. 1–9. [Google Scholar]
- Faria, J.J.; Krause, S.; Krause, J. Collective behavior in road crossing pedestrians: The role of social information. Behav. Ecol. 2010, 21, 1236–1242. [Google Scholar] [CrossRef]
- Lanzer, M.; Baumann, M. Does crossing the road in a group influence pedestrians’ gaze behavior? Proc. Hum. Factors Ergon. Soc. Annu. Meet. 2020, 64, 1938–1942. [Google Scholar] [CrossRef]
- Wilbrink, M.; Nuttelmann, M.; Oehl, M. Scaling up Automated Vehicles’ eHMI Communication Designs to Interactions with Multiple Pedestrians–Putting eHMIs to the Test. In Proceedings of the 13th International Conference on Automotive User Interfaces and Interactive Vehicular Applications, Leeds, UK, 9–10 September 2021; pp. 119–122. [Google Scholar]
- Dey, D.; van Vastenhoven, A.; Cuijpers, R.H.; Martens, M.; Pfleging, B. Towards Scalable eHMIs: Designing for AV-VRU Communication Beyond One Pedestrian. In Proceedings of the 13th International Conference on Automotive User Interfaces and Interactive Vehicular Applications, Leeds, UK, 9–14 September 2021; pp. 274–286. [Google Scholar]
- Joisten, P.; Liu, Z.; Theobald, N.; Webler, A.; Abendroth, B. Communication of Automated Vehicles and Pedestrian Groups: An Intercultural Study on Pedestrians’ Street Crossing Decisions. In Proceedings of the Mensch und Computer 2021-Tagungsband, Ingolstadt, Germany, 5–8 September 2021. [Google Scholar]
- Mayer, R.C.; Davis, J.H.; Schoorman, F.D. An integrative model of organizational trust. Acad. Manag. Rev. 1995, 20, 709–734. [Google Scholar] [CrossRef]
- Hoff, K.A.; Bashir, M. Trust in automation: Integrating empirical evidence on factors that influence trust. Hum. Factors 2015, 57, 407–434. [Google Scholar] [CrossRef]
- Lee, J.D.; See, K.A. Trust in automation: Designing for appropriate reliance. Hum. Factors 2004, 46, 50–80. [Google Scholar] [CrossRef]
- Dixon, L. Autonowashing: The greenwashing of vehicle automation. Transp. Res. Interdiscip. Perspect. 2020, 5, 100113. [Google Scholar] [CrossRef]
- Faas, S.M.; Kraus, J.; Schoenhals, A.; Baumann, M. Calibrating Pedestrians’ Trust in Automated Vehicles: Does an Intent Display in an External HMI Support Trust Calibration and Safe Crossing Behavior? In Proceedings of the 2021 CHI Conference on Human Factors in Computing Systems, Yokohama, Japan, 8–13 May 2021; pp. 1–17. [Google Scholar]
- Choi, J.K.; Ji, Y.G. Investigating the importance of trust on adopting an autonomous vehicle. Int. J. Hum.-Comput. Interact. 2015, 31, 692–702. [Google Scholar] [CrossRef]
- Hengstler, M.; Enkel, E.; Duelli, S. Applied artificial intelligence and trust—The case of autonomous vehicles and medical assistance devices. Technol. Forecast. Soc. Change 2016, 105, 105–120. [Google Scholar] [CrossRef]
- Reig, S.; Norman, S.; Morales, C.G.; Das, S.; Steinfeld, A.; Forlizzi, J. A field study of pedestrians and autonomous vehicles. In Proceedings of the 10th International Conference on Automotive User Interfaces and Interactive Vehicular Applications, Toronto, ON, Canada, 23–25 September 2018; pp. 198–209. [Google Scholar]
- Oliveira, L.; Proctor, K.; Burns, C.G.; Birrell, S. Driving style: How should an automated vehicle behave? Information 2019, 10, 219. [Google Scholar] [CrossRef]
- Olaverri-Monreal, C. Promoting trust in self-driving vehicles. Nat. Electron. 2020, 3, 292–294. [Google Scholar] [CrossRef]
- Wang, Y.; Hespanhol, L.; Tomitsch, M. How Can Autonomous Vehicles Convey Emotions to Pedestrians? A Review of Emotionally Expressive Non-Humanoid Robots. Multimodal Technol. Interact. 2021, 5, 84. [Google Scholar] [CrossRef]
- Nowak, K.L.; Rauh, C. Choose your “buddy icon” carefully: The influence of avatar androgyny, anthropomorphism, and credibility in online interactions. Comput. Hum. Behav. 2008, 24, 1473–1493. [Google Scholar] [CrossRef]
- de Visser, E.J.; Krueger, F.; McKnight, P.; Scheid, S.; Smith, M.; Chalk, S.; Parasuraman, R. The world is not enough: Trust in cognitive agents. Proc. Hum. Factors Ergon. Soc. Annu. Meet. 2012, 56, 263–267. [Google Scholar] [CrossRef]
- Pak, R.; Fink, N.; Price, M.; Bass, B.; Sturre, L. Decision support aids with anthropomorphic characteristics influence trust and performance in younger and older adults. Ergonomics 2012, 55, 1059–1072. [Google Scholar] [CrossRef]
- Waytz, A.; Heafner, J.; Epley, N. The mind in the machine: Anthropomorphism increases trust in an autonomous vehicle. J. Exp. Soc. Psychol. 2014, 52, 113–117. [Google Scholar] [CrossRef]
- Kaleefathullah, A.A.; Merat, N.; Lee, Y.M.; Eisma, Y.B.; Madigan, R.; Garcia, J.; Winter, J.D. External Human–Machine Interfaces Can Be Misleading: An Examination of Trust Development and Misuse in a CAVE-Based Pedestrian Simulation Environment. Hum. Factors 2020, 64, 1070–1085. [Google Scholar] [CrossRef] [PubMed]
- Matthews, M.; Chowdhary, G.; Kieson, E. Intent communication between autonomous vehicles and pedestrians. arXiv 2017, arXiv:1708.07123. [Google Scholar]
- Schweitzer, M.E.; Hershey, J.C.; Bradlow, E.T. Promises and lies: Restoring violated trust. Organ. Behav. Hum. Decis. Processes 2006, 101, 1–19. [Google Scholar] [CrossRef]
- Holländer, K.; Wintersberger, P.; Butz, A. Overtrust in external cues of automated vehicles: An experimental investigation. In Proceedings of the 11th International Conference on Automotive User Interfaces and Interactive Vehicular Applications, Utrecht, The Netherlands, 21–25 September 2019; pp. 211–221. [Google Scholar]
- Gong, L. How social is social responses to computers? The function of the degree of anthropomorphism in computer representations. Comput. Hum. Behav. 2008, 24, 1494–1509. [Google Scholar] [CrossRef]
- Andrade, C. Internal, external, and ecological validity in research design, conduct, and evaluation. Indian J. Psychol. Med. 2018, 40, 498–499. [Google Scholar] [CrossRef] [PubMed]
- Feldstein, I.; Dietrich, A.; Milinkovic, S.; Bengler, K. A pedestrian simulator for urban crossing scenarios. IFAC-Pap. 2016, 49, 239–244. [Google Scholar] [CrossRef]
- Deb, S.; Carruth, D.W.; Sween, R.; Strawderman, L.; Garrison, T.M. Efficacy of virtual reality in pedestrian safety research. Appl. Ergon. 2017, 65, 449–460. [Google Scholar] [CrossRef]
- Hassin, R.R.; Aviezer, H.; Bentin, S. Inherently ambiguous: Facial expressions of emotions, in context. Emot. Rev. 2013, 5, 60–65. [Google Scholar] [CrossRef]
- Eisele, D.; Petzoldt, T. Effects of traffic context on eHMI icon comprehension. Transp. Res. Part F Traffic Psychol. Behav. 2022, 85, 1–12. [Google Scholar] [CrossRef]
- Cavallo, V.; Dommes, A.; Dang, N.T.; Vienne, F. A street-crossing simulator for studying and training pedestrians. Transp. Res. Part F Traffic Psychol. Behav. 2019, 61, 217–228. [Google Scholar] [CrossRef]
- Faas, S.M.; Mattes, S.; Kao, A.C.; Baumann, M. Efficient Paradigm to Measure Street-Crossing Onset Time of Pedestrians in Video-Based Interactions with Vehicles. Information 2020, 11, 360. [Google Scholar] [CrossRef]
- Vermersch, P. Describing the practice of introspection. J. Conscious. Stud. 2009, 16, 20–57. [Google Scholar]
- Cahour, B.; Salembier, P.; Zouinar, M. Analyzing lived experience of activity. Le Trav. Hum. 2016, 79, 259–284. [Google Scholar] [CrossRef]
- Utriainen, R.; Pöllänen, M. Prioritizing Safety or Traffic Flow? Qualitative Study on Highly Automated Vehicles’ Potential to Prevent Pedestrian Crashes with Two Different Ambitions. Sustainability 2020, 12, 3206. [Google Scholar] [CrossRef]
- Deb, S.; Carruth, D.W.; Fuad, M.; Stanley, L.M.; Frey, D. Comparison of Child and Adult Pedestrian Perspectives of External Features on Autonomous Vehicles Using Virtual Reality Experiment. In Proceedings of the International Conference on Applied Human Factors and Ergonomics, Washington, DC, USA, 24–28 July 2019; Springer: Berlin/Heidelberg, Germany, 2019; pp. 145–156. [Google Scholar]
- Tapiro, H.; Meir, A.; Parmet, Y.; Oron-Gilad, T. Visual search strategies of child-pedestrians in road crossing tasks. Proc. Hum. Factors Ergon. Soc. Eur. 2014, 119–130. [Google Scholar]
- Charisi, V.; Habibovic, A.; Andersson, J.; Li, J.; Evers, V. Children’s views on identification and intention communication of self-driving vehicles. In Proceedings of the 2017 Conference on Interaction Design and Children, Stanford, CA, USA, 27–30 June 2017; pp. 399–404. [Google Scholar]
- Klin, A.; Jones, W.; Schultz, R.; Volkmar, F.; Cohen, D. Visual fixation patterns during viewing of naturalistic social situations as predictors of social competence in individuals with autism. Arch. Gen. Psychiatry 2002, 59, 809–816. [Google Scholar] [CrossRef] [PubMed]
- Crehan, E.T.; Althoff, R.R. Me looking at you, looking at me: The stare-in-the-crowd effect and autism spectrum disorder. J. Psychiatr. Res. 2021, 140, 101–109. [Google Scholar] [CrossRef] [PubMed]
- Strauss, D.; Shavelle, R.; Anderson, T.W.; Baumeister, A. External causes of death among persons with developmental disability: The effect of residential placement. Am. J. Epidemiol. 1998, 147, 855–862. [Google Scholar] [CrossRef] [PubMed]
Mean accuracy (standard error) by facial expression and gaze direction. Expressions map to vehicle intention as follows — non-yielding: Angry, Surprised, Head Shake; cruising: Neutral, Cheek Puff; yielding: Smile, Nod.

| Gaze Direction | Angry | Surprised | Head Shake | Neutral | Cheek Puff | Smile | Nod |
|---|---|---|---|---|---|---|---|
| Direct | 0.962 (0.033) | 0.997 (0.002) | 0.997 (0.002) | 0.823 (0.066) | 0.868 (0.060) | 0.760 (0.075) | 0.963 (0.027) |
| Averted | 0.965 (0.033) | 0.992 (0.004) | 0.998 (0.002) | 0.822 (0.064) | 0.867 (0.061) | 0.730 (0.076) | 0.937 (0.034) |
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
© 2022 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Rouchitsas, A.; Alm, H. Ghost on the Windshield: Employing a Virtual Human Character to Communicate Pedestrian Acknowledgement and Vehicle Intention. Information 2022, 13, 420. https://doi.org/10.3390/info13090420