Interaction Design and the Automated City – Emerging Urban Interfaces, Prototyping Approaches and Design Methods

A special issue of Multimodal Technologies and Interaction (ISSN 2414-4088).

Deadline for manuscript submissions: closed (20 April 2023)

Special Issue Editors


Guest Editor
Design Lab, School of Architecture, Design and Planning, The University of Sydney, Sydney, NSW 2006, Australia
Interests: interaction design; human–computer interaction; prototyping; urban robots; smart cities; urban interfaces; media architecture

Guest Editor
Design Lab, School of Architecture, Design and Planning, The University of Sydney, Sydney, NSW 2006, Australia
Interests: interaction design; human–computer interaction; prototyping; urban robots; smart cities; urban interfaces; media architecture

Guest Editor
Magic Lab, Department of Industrial Engineering and Management, Ben Gurion University of the Negev, Be'er Sheva 84105, Israel
Interests: human–computer interaction; human–robot interaction; mobile computing; ubiquitous computing; drones

Guest Editor
Design Lab, School of Architecture, Design and Planning, The University of Sydney, Sydney, NSW 2006, Australia
Interests: culture; cities; communities; HCI; digital media

Guest Editor
Industrial Design Engineering, TU Delft, 2628 Delft, The Netherlands
Interests: interaction design; critical design research; responsible AI; service robots; automated driving systems; future cities

Guest Editor
Centre for Accident Research and Road Safety – Queensland (CARRS-Q), Queensland University of Technology, Brisbane, QLD 4000, Australia
Interests: automotive user interfaces; autonomous driving; intelligent transport systems; road safety; games; augmented reality; user experience research

Guest Editor
Adjunct Assistant Professor, Columbia GSAPP; Post-Doctoral Associate, The Future Automation Research Lab, Cornell Tech, New York City, NY 10044, USA
Interests: social sustainability; visualizations; simulations; community engagement; smart cities; urban HRI

Guest Editor
Ludwig Maximilian University of Munich (LMU), 80539 Munich, Germany
Interests: interaction design for artificial intelligence; flexible and organic user interfaces; media architecture; experience prototyping

Guest Editor
Australian Centre for Field Robotics, The University of Sydney, Sydney, NSW 2006, Australia
Interests: urban robots; autonomous vehicle design; human–computer interaction; computer perception; artificial intelligence

Special Issue Information

Dear Colleagues,

Ongoing advances in computing, sensing, and network technologies pave the way for the increasing introduction of autonomous cyber–physical systems in cities. These technologies, which include autonomous vehicles of various kinds such as driverless cars, ground robots, and drones, promise to transform urban mobility and automate services. However, public acceptance and trust are key to the successful uptake of autonomous vehicles. This requires interaction designers and HCI researchers to define new interfaces that allow people to interact with these nonhuman agents, considering not only aspects of usability and user experience but also the ethical and sociotechnical implications. For example, in the context of autonomous driving, researchers have begun to investigate the use of multimodal interfaces to enable communication between driverless cars and pedestrians. There is an increasing demand for new prototyping approaches to inform and evaluate near-future interfaces with both users and bystanders, as well as for design research methods to interrogate the roles and perspectives of nonhuman agents in cities.

The aim of this Special Issue is to advance the understanding of the interaction design space in the automated city, shedding light on the nature of interaction with urban robots and autonomous city infrastructures. We are looking for original research that contributes to the topic in the form of new concepts, theories, empirical results, methodologies, or the documentation of design artefacts. To draw a larger picture of this design space, we welcome research on utilitarian as well as more speculative application areas. Potential topics include but are not limited to:

  • Multimodal interfaces that support communication and interaction between autonomous vehicles and people (e.g., in shared spaces);
  • New robotic applications and interfaces for more desirable urban futures; 
  • Context-based interface prototyping approaches to inform, design and evaluate complex urban technologies;
  • Human-centred and explainable artificial intelligence for urban ground and aerial robots and autonomous city infrastructures;
  • Design methods, tools, and techniques for the creation of autonomous urban HCI/HRI systems;
  • Evaluation methods and tools for autonomous urban HCI/HRI systems;
  • Design artefacts that explore and capture the experiential qualities of interaction with urban robots;
  • Frameworks and theories relevant to the human-centred/more-than-human-centred design of the automated city (e.g., considering human and non-human actants);
  • Frameworks to inform urban transitions towards automated systems and to support the coexistence of multiple levels of automation;
  • Community engagement in the development of autonomous urban HCI/HRI systems for more equitable and inclusive cities (e.g., using AVs, AI, robotic applications and interfaces to reduce inequalities in cities);
  • Ethical dimensions and implications of the automated city.

Dr. Marius Hoggenmueller
Dr. Martin Tomitsch
Dr. Jessica R. Cauchard
Dr. Luke Hespanhol
Dr. Maria Luce Lupetti
Dr. Ronald Schroeter
Dr. Sharon Yavo-Ayalon
Dr. Alexander Wiethoff
Dr. Stewart Worrall
Guest Editors

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Multimodal Technologies and Interaction is an international peer-reviewed open access monthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 1600 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Published Papers (9 papers)


Research


27 pages, 7639 KiB  
Article
Virtual Urban Field Studies: Evaluating Urban Interaction Design Using Context-Based Interface Prototypes
by Robert Dongas, Kazjon Grace, Samuel Gillespie, Marius Hoggenmueller, Martin Tomitsch and Stewart Worrall
Multimodal Technol. Interact. 2023, 7(8), 82; https://doi.org/10.3390/mti7080082 - 18 Aug 2023
Viewed by 1524
Abstract
In this study, we propose the use of virtual urban field studies (VUFS) through context-based interface prototypes for evaluating the interaction design of auditory interfaces. Virtual field tests use mixed-reality technologies to combine the fidelity of real-world testing with the affordability and speed of testing in the lab. In this paper, we apply this concept to rapidly test sound designs for autonomous vehicle (AV)–pedestrian interaction with a high degree of realism and fidelity. We also propose the use of psychometrically validated measures of presence in validating the verisimilitude of VUFS. Using mixed qualitative and quantitative methods, we analysed users’ perceptions of presence in our VUFS prototype and the relationship to our prototype’s effectiveness. We also examined the use of higher-order ambisonic spatialised audio and its impact on presence. Our results provide insights into how VUFS can be designed to facilitate presence as well as design guidelines for how this can be leveraged.

27 pages, 132085 KiB  
Article
Sharing the Sidewalk: Observing Delivery Robot Interactions with Pedestrians during a Pilot in Pittsburgh, PA
by David Weinberg, Healy Dwyer, Sarah E. Fox and Nikolas Martelaro
Multimodal Technol. Interact. 2023, 7(5), 53; https://doi.org/10.3390/mti7050053 - 17 May 2023
Cited by 5 | Viewed by 2699
Abstract
Sidewalk delivery robots are being deployed as a form of last-mile delivery. While many such robots have been deployed on college campuses, fewer have been piloted on public sidewalks. Furthermore, there have been few observational studies of robots and their interactions with pedestrians. To better understand how sidewalk robots might integrate into public spaces, the City of Pittsburgh, Pennsylvania conducted a pilot of sidewalk delivery robots to understand possible uses and the challenges that could arise in interacting with people in the city. Our team conducted ethnographic observations and intercept interviews to understand how residents perceived and interacted with sidewalk delivery robots over the course of the public pilot. We found that people with limited knowledge about the robots crafted stories about their purpose and function. We observed the robots causing distractions and obstructions for different sidewalk users (including children and dogs), witnessed people helping immobilized robots, and learned about potential accessibility issues that the robots may pose. Based on our findings, we contribute a set of recommendations for future pilots, as well as questions to guide future design for robots in public spaces.

25 pages, 3830 KiB  
Article
Building Community Resiliency through Immersive Communal Extended Reality (CXR)
by Sharon Yavo-Ayalon, Swapna Joshi, Yuzhen (Adam) Zhang, Ruixiang (Albert) Han, Narges Mahyar and Wendy Ju
Multimodal Technol. Interact. 2023, 7(5), 43; https://doi.org/10.3390/mti7050043 - 26 Apr 2023
Cited by 2 | Viewed by 2884
Abstract
Situated and shared experiences can motivate community members to plan shared action, promoting community engagement. We deployed and evaluated a communal extended-reality (CXR) bus tour that depicts the possible impacts of flooding and climate change. This paper describes the results of seven community engagement sessions with a total of N = 74 members of the Roosevelt Island community. We conducted pre- and post-bus tour focus groups to understand how the tour affected these community members’ awareness and motivation to take action. We found that the unique qualities of immersive, situated, and geo-located virtual reality (VR) on a bus made climate change feel real, brought the consequences of climate change closer to home, and highlighted existing community resources to address the issue. Our results showed that the CXR experience helped to simulate a physical emergency state, which empowered the community to translate feelings of hopelessness into creative and actionable ideas. Our findings exemplify that geo-located VR on a bus can be a powerful tool to motivate innovations and collective action. Our work is a first-of-its-kind empirical contribution showing that CXR experiences can inspire action. It offers a proof-of-concept of a large-scale community engagement process featuring simulated communal experiences, leading to creative ideas for a bottom-up community resiliency plan.

32 pages, 3665 KiB  
Article
Evaluating Social Impact of Smart City Technologies and Services: Methods, Challenges, Future Directions
by Elise Hodson, Teija Vainio, Michel Nader Sayún, Martin Tomitsch, Ana Jones, Meri Jalonen, Ahmet Börütecene, Md Tanvir Hasan, Irina Paraschivoiu, Annika Wolff, Sharon Yavo-Ayalon, Sari Yli-Kauhaluoma and Gareth W. Young
Multimodal Technol. Interact. 2023, 7(3), 33; https://doi.org/10.3390/mti7030033 - 22 Mar 2023
Cited by 5 | Viewed by 3928
Abstract
This study examines motivations, definitions, methods and challenges of evaluating the social impacts of smart city technologies and services. It outlines concepts of social impact assessment and discusses how social impact has been included in smart city evaluation frameworks. Thematic analysis is used to investigate how social impact is addressed in eight smart city projects that prioritise human-centred design across a variety of contexts and development phases, from design research and prototyping to completed and speculative projects. These projects are notable for their emphasis on human, organisational and natural stakeholders; inclusion, participation and empowerment; new methods of citizen engagement; and relationships between sustainability and social impact. At the same time, there are gaps in the evaluation of social impact in both the smart city indexes and the eight projects. Based on our analysis, we contend that more coherent, consistent and analytical approaches are needed to build narratives of change and to comprehend impacts before, during and after smart city projects. We propose criteria for social impact evaluation in smart cities and identify new directions for research. This is of interest for smart city developers, researchers, funders and policymakers establishing protocols and frameworks for evaluation, particularly as smart city concepts and complex technologies evolve in the context of equitable and sustainable development.

23 pages, 25389 KiB  
Article
Simulating Wearable Urban Augmented Reality Experiences in VR: Lessons Learnt from Designing Two Future Urban Interfaces
by Tram Thi Minh Tran, Callum Parker, Marius Hoggenmüller, Luke Hespanhol and Martin Tomitsch
Multimodal Technol. Interact. 2023, 7(2), 21; https://doi.org/10.3390/mti7020021 - 16 Feb 2023
Cited by 3 | Viewed by 3394
Abstract
Augmented reality (AR) has the potential to fundamentally change how people engage with increasingly interactive urban environments. However, many challenges exist in designing and evaluating these new urban AR experiences, such as technical constraints and safety concerns associated with outdoor AR. We contribute to this domain by assessing the use of virtual reality (VR) for simulating wearable urban AR experiences, allowing participants to interact with future AR interfaces in a realistic, safe and controlled setting. This paper describes two wearable urban AR applications (pedestrian navigation and autonomous mobility) simulated in VR. Based on a thematic analysis of interview data collected across the two studies, we find that the VR simulation successfully elicited feedback on the functional benefits of AR concepts and the potential impact of urban contextual factors, such as safety concerns, attentional capacity, and social considerations. At the same time, we highlight the limitations of this approach in terms of assessing the AR interface’s visual quality and providing exhaustive contextual information. The paper concludes with recommendations for simulating wearable urban AR experiences in VR.

29 pages, 4538 KiB  
Article
Ranking Crossing Scenario Complexity for eHMIs Testing: A Virtual Reality Study
by Elena Fratini, Ruth Welsh and Pete Thomas
Multimodal Technol. Interact. 2023, 7(2), 16; https://doi.org/10.3390/mti7020016 - 2 Feb 2023
Viewed by 1625
Abstract
External human–machine interfaces (eHMIs) have the potential to benefit AV–pedestrian interactions. The majority of studies investigating eHMIs have used relatively simple traffic environments, i.e., a single pedestrian crossing in front of a single eHMI on a one-lane straight road. While this approach has proved to be efficient in providing an initial understanding of how pedestrians respond to eHMIs, it over-simplifies interactions which will be substantially more complex in real-life circumstances. A process is illustrated in a small-scale study (N = 10) to rank different crossing scenarios by level of complexity. Traffic scenarios were first developed for varying traffic density, visual complexity of the road scene, road geometry, weather and visibility conditions, and presence of distractions. These factors have been previously shown to increase difficulty and riskiness of the crossing task. The scenarios were then tested in a motion-based, virtual reality environment. Pedestrians’ perceived workload and objective crossing behaviour were measured as indirect indicators of the level of complexity of the crossing scenario. Sense of presence and simulator sickness were also recorded as a measure of the ecological validity of the virtual environment. The results indicated that some crossing scenarios were more taxing for pedestrians than others, such as those with road geometries where traffic approached from multiple directions. Further, the presence scores showed that the virtual environments were experienced as realistic. This paper concludes by proposing a “complex” environment to test eHMIs under more challenging crossing circumstances.

20 pages, 4401 KiB  
Article
Smiles and Angry Faces vs. Nods and Head Shakes: Facial Expressions at the Service of Autonomous Vehicles
by Alexandros Rouchitsas and Håkan Alm
Multimodal Technol. Interact. 2023, 7(2), 10; https://doi.org/10.3390/mti7020010 - 20 Jan 2023
Cited by 3 | Viewed by 2793
Abstract
When deciding whether to cross the street or not, pedestrians take into consideration information provided by both vehicle kinematics and the driver of an approaching vehicle. It will not be long, however, before drivers of autonomous vehicles (AVs) will be unable to communicate their intention to pedestrians, as they will be engaged in activities unrelated to driving. External human–machine interfaces (eHMIs) have been developed to fill the communication gap that will result by offering information to pedestrians about the situational awareness and intention of an AV. Several anthropomorphic eHMI concepts have employed facial expressions to communicate vehicle intention. The aim of the present study was to evaluate the efficiency of emotional (smile; angry expression) and conversational (nod; head shake) facial expressions in communicating vehicle intention (yielding; non-yielding). Participants completed a crossing intention task where they were tasked with deciding appropriately whether to cross the street or not. Emotional expressions communicated vehicle intention more efficiently than conversational expressions, as evidenced by the lower latency in the emotional expression condition compared to the conversational expression condition. The implications of our findings for the development of anthropomorphic eHMIs that employ facial expressions to communicate vehicle intention are discussed.

17 pages, 3070 KiB  
Article
Understanding Operator Influence in Automated Urban Shuttle Buses and Recommendations for Future Development
by Martina Schuß, Alice Rollwagen and Andreas Riener
Multimodal Technol. Interact. 2022, 6(12), 109; https://doi.org/10.3390/mti6120109 - 13 Dec 2022
Cited by 2 | Viewed by 1981
Abstract
The automation of our vehicles is an ever-present topic with great benefits for society, particularly in the area of public transport, and pilot projects of automated shuttle buses are already underway. However, these pilots do not show the full potential of such buses as a supplement to public transport, since the single-occupancy registration of the vehicles usually allows only slow speeds and also requires a substitute driver on board. In our study, we aim to (1) examine the status quo of user acceptance and (2) identify the roles of the operators and their tasks in automated urban shuttle buses. We conducted a mixed-method study including in-depth interviews, questionnaires, and in-the-field observations, visiting the two most widespread pilot projects on German streets. Our results uncover the multiple roles and tasks the human operators currently assume. Furthermore, we developed design approaches for a digital companion substituting the operator in the long run and evaluated these concepts. A remote operator or a hologram were the preferred solutions, and we propose further design requirements for such companions. This work helps to understand the individual roles that operators currently occupy and provides a good basis for concepts of technologies that will perform these tasks in the future.

Review


17 pages, 10093 KiB  
Review
The Value of Context-Based Interface Prototyping for the Autonomous Vehicle Domain: A Method Overview
by Lukas A. Flohr and Dieter P. Wallach
Multimodal Technol. Interact. 2023, 7(1), 4; https://doi.org/10.3390/mti7010004 - 30 Dec 2022
Cited by 4 | Viewed by 1936
Abstract
Before autonomous vehicles (AVs; SAE levels 4 and 5) become broadly available, acceptance challenges such as trust and safety concerns must be overcome. In the development of appropriate HMIs that will tackle these challenges, physical and social context play essential roles. Contextual factors thus need to be considered in early prototyping stages. Based on a qualitative semi-systematic literature review and knowledge from our research, this paper elaborates on the value of context-based interface prototyping in the AV domain. It provides a comprehensive overview and a discussion of applicable methods, including physical lab-based prototyping (mock-up, ride simulation with virtual and mixed reality, and immersive video), social context simulation (actors, enactment, items and props, and sound), wizard-of-oz, and experimental vehicles. Finally, the paper discusses factors affecting the impact of prototyping and derives recommendations for the application of prototyping methods in future AV studies.
