Article

A Survey to Understand Emotional Situations on the Road and What They Mean for Affective Automotive UIs

1
BMW Group Research, New Technologies, Innovations, 85748 Garching, Germany
2
LMU Munich, 80337 Munich, Germany
3
Bundeswehr University Munich, 85579 Neubiberg, Germany
*
Author to whom correspondence should be addressed.
Multimodal Technol. Interact. 2018, 2(4), 75; https://doi.org/10.3390/mti2040075
Submission received: 1 August 2018 / Revised: 8 October 2018 / Accepted: 18 October 2018 / Published: 25 October 2018
(This article belongs to the Special Issue Automotive User Interfaces)

Abstract

In this paper, we present the results of an online survey (N = 170) on emotional situations on the road. In particular, we asked potential early adopters to remember a situation where they felt either an intense positive or negative emotion while driving. Our research is motivated by imminent disruptions in the automotive sector due to automated driving and the accompanying switch to selling driving experiences over horsepower. This creates a need to focus on the driver’s emotion when designing in-car interfaces. As a result of our research, we present a set of propositions for affective car interfaces based on real-life experiences. With our work we aim to support the design of affective car interfaces and give designers a foundation to build upon. We find respondents often connect positive emotions with enjoying their independence, while negative experiences are associated mostly with traffic behavior. Participants who experienced negative situations wished for better information management and a higher degree of automation. Drivers with positive emotions generally wanted to experience the situation more genuinely, for example, by switching to a “back-to-basic” mode. We explore these statements and discuss recommendations for the design of affective interfaces in future cars.

1. Introduction

Driving a car today requires the driver to perform multiple simultaneous activities, such as steering, braking and accelerating, and monitoring the road. Drivers also perform many optional non-driving-related activities, such as hands-free calling, programming the destination, or adjusting the radio. The sheer number of tasks suggests that driving can be challenging due to increased mental or physical workload.
Consequently, much of the automotive research for manual driving tries to reduce the driver’s workload and distraction while making the ride enjoyable and even productive at the same time. One proposed idea is an in-vehicle system which understands the driver state, is able to model driver behavior [1], and adapts the in-vehicle systems to the current requirements [2]. The driver state includes all physical and functional characteristics of the driver, such as mental workload, fatigue, and level of distraction, but also emotions [2]. We expect that understanding the driver’s activities and state will be just as relevant for the upcoming age of highly automated driving as it is for manual driving, or even more so. In particular, when handing over control between car and driver (i.e., switching between different levels of automation [3]), a detailed understanding of the driver state is essential to appropriately support this handover [4].
Affective computing in the car, i.e., recognizing and reacting upon the driver’s emotional state, is a topic of interest: both negative [5,6,7] and exuberantly positive emotional states [8] have been identified to strongly impact driving performance and to promote unsafe driving. This topic was identified as an emerging research area more than a decade ago [5], but compared to other areas of emotion-related sciences or HCI, the automotive sector has rather neglected it in the past. Most research focuses on detecting emotions and/or reacting upon them. However, surprisingly little is known about the cause or stimulation of certain emotions while driving.
Since there is a need to understand where emotions come from and what they mean in order to build meaningful interactions, we carried out this survey of emotional experiences while driving among 170 drivers from Europe and North America. Jeon and Walker give an overview of relevant emotions while driving [9], which provided a valuable starting point. We focus on the reasons for specific emotions and ask how a system should be built to cope with this new kind of information and the resulting “driver-vehicle interaction loop” [10]. Based on these insights, we also propose user-centered characteristics for future cars that are grounded in our participants’ statements. Designers of affective automotive UIs can benefit from these insights as they give guidance towards desired features. We expect that our findings are not limited to one specific level of automation but can be applied across the whole transition from manual driving (SAE Level 0, [3]) to full automation (SAE Level 5).
Respondents are generally looking forward to the age of automation because it allows for a better use of their time. With the presented design recommendations, we show a definite desire for automation, which has recently been doubted in the public discussion [11]. Inspired by related work [12], we deliberately focus on the needs of a subset of drivers—potential early adopters—which allows us to make assumptions about what actual future users want, in contrast to a general public opinion.

2. Contribution Statement

The contributions of this paper can be summarized as follows: based on our online survey conducted in Europe and North America, we provide insights on (1) contextual triggers of emotions in the car and (2) drivers’ desires for future automotive HMIs, and (3) we generalize these findings into design recommendations for affective user interfaces in future cars.

3. Related Work

Research on automotive user interfaces has long focused on how to optimize user experience and minimize distracting effects for the driver (e.g., [13,14]). Approaches to this fundamental need for road safety [15] have been developed along with the technical possibilities in cars. Today, researchers of automotive human-machine interaction aim for a natural experience with multimodal input channels [16] and persuasive abilities [17]. Such systems can improve the safety of traffic participants by observing driving performance and then influencing the driving style [18] or regulating speeding [19] if necessary. Other systems monitor and react to the driver’s emotional state in order to keep them safe [20], as driving performance can also be impaired by negative emotions [7].

3.1. Detecting Emotions

Before a system can react to the driver’s emotions, it first needs to be able to recognize them. One technique is the analysis of facial action units in video streams [21]. This has been explored in the automotive context as early as 2005, when Hoch et al. combined acoustic and visual input to estimate driver emotions, and such approaches are still being steadily refined [22]. The best results are often achieved when information from different sources is fused, for example, physiological sensors and audio-visual data, which can outperform human observers [23]. The automotive environment provides a great proving ground for multi-sensor approaches, as users interact within a confined space and can be observed from all angles [24].

3.2. Affective Computing

When in-vehicle systems can detect and act upon a driver’s emotions, they enable a relatively new kind of interaction, incorporating the principles of affective computing [25]. The concept is based on findings that feelings can have an enormous effect on behavior. People are, for instance, harder to distract when they are in an energetic mood and less likely to engage in mentally demanding tasks when feeling content [26]. Nass et al. investigated this aspect in the car by introducing a voice-based assistant which can adapt its voice to the user’s emotion [20]. They report fewer accidents, better attention, and a higher willingness to communicate when the system’s voice mirrored the driver’s emotion.
Data from emotion recognition systems can also be used to describe the quality of interpersonal engagement [27], or it can be combined with location-based services to detect anomalies [28]. The need for understanding the context of emotions stems from the fact that neither the environment nor user behavior alone can give a system a clear view of what is happening [29], yet emotions have big effects on driving safety [30]. Our survey provides more insights on the causes of emotional situations on the road so that we can understand and handle them accordingly.

3.3. Interaction With Autonomous Vehicles

Much research has been done on understanding drivers’ needs regarding autonomous driving and potential ideas for interaction. While self-driving cars may revolutionize mobility, they can also trigger a perceived loss of control and eventually lead to less fun while driving [31]. Interaction designers also need to address challenges concerning a degradation of mode- [32] and situational awareness [33] and in-vehicle systems need to communicate to the driver how such cars make decisions [34].
Key focus areas for researchers in automated driving have been identified in the fields of privacy, automation, trust, and the transformation of cars into a place for mobile work and play [35,36]. This also includes research on how automated vehicles will change our lifestyle [37]. Previous work has tackled these issues, for example, by featuring system uncertainty in the user interface of autonomous cars to manage trust levels [38] or by involving the passive driver through gamification [39]. The change in spatial allocation within the car introduces new activities for its occupants [40] and will likely shift the focus of research in HCI towards interaction between passengers [41,42] and require a universal design language to support usability and accessibility for a broad range of users [43]. Here, new opportunities will arise for different user groups, e.g., increasing mobility for elderly passengers and people with special needs [44,45]. Other approaches make use of the connectedness of cars to design social experiences, like direct communication to nearby friends [46], or an inter-vehicle feedback system which allows drivers to give and receive ratings of their conduct [47]. Finally, this connection between vehicles is also a basic prerequisite for any coexistence of manually and autonomously driven vehicles on the same roads [48].

3.4. Driver Models

To provide the car with the required user data, we need a model containing the current driver state [49]. Driver models can take into account physiological data [50], awareness indicators like glance behavior [51], or emotional cues [52]. These models vary heavily, depending on their purpose and can also contain other features.
In the application area of automotive user interfaces, we design proactive safety features supporting the driver when non-normative driver states are detected [53]. These states can also depend on the vehicle’s surroundings, so a connection to the environment model, which builds the foundation for autonomous driving [54], could provide valuable context information. To our knowledge, there is no previous work on connecting the driver model with the environment model. However, feedback to this survey suggests that the surroundings can have an effect on the driver’s emotional state.

4. Online Survey

With our survey, we identify triggers of emotional experiences on the road today and what they mean for the requirements on future automotive interaction. Participants were assigned to one of two emotion groups (positive/negative) so that an equal balance of narratives of positive and negative situations would be achieved. The survey was available in German and English. (The survey text can be downloaded at http://drivingstudy.de/emotionsurvey.pdf).

4.1. Design and Method

When we investigate emotions we have to take into account that there are many different societal norms of expressing affect which can influence the information a person is willing to disclose in their narrative [55]. As we focus on private events, we also need to understand that people tend to feel less stressed when they talk about personal experiences in a setting which allows for privacy [56]. Providing anonymity to study participants furthermore improves the quality of answers and prevents social desirability biases [57].
With respect to these observations, we designed an online questionnaire which ensures anonymity and gives respondents leeway to reflect on their personal experiences without being biased (e.g., through examples). The study design was inspired by previous online survey approaches by Eiband et al. [58] and Simmons and Hoon [59], and the survey was tested in small groups and revised several times before going live. Our first iterations showed that it took participants too long to report both a positive and a negative experience. Thus, we decided to ask each participant about either a positive or a negative experience only. Participants were randomly assigned to one of these groups, while ensuring that the overall number of people in each group was equal. The survey comprised a short disclaimer followed by four sections. The first part provided multiple-choice questions; the remaining questions were free-text entries which allowed for extensive storytelling.
  • Demographic data such as age, region, and estimated annual mileage.
  • Recollection of a freely chosen emotional experience in the car, which the participant had experienced in the past as a driver while manually driving a car. Depending on the randomly assigned group, this emotion was either positive or negative.
  • Detailed questions on the situation, for instance place, circumstances, involved parties.
  • Questions on behavior: how could involved people or an intelligent future vehicle have behaved to improve the situation?
Thus, we first asked about prior positive or negative emotional experiences and then how a future vehicle could actively influence and improve the situation. For the design of the future vehicle, we deliberately decided not to ask the participants to imagine a specific level of driving automation. Instead, we asked them to imagine a modern vehicle which can actively respond to the needs of the driver, a vehicle that would, for example, perceive the outside world and converse with the driver. The focus on distinct levels of automation was intentionally avoided so as not to bias participants into thinking of autonomous vehicles only, as many user interface approaches are viable at all levels of automation.

4.2. Participants

We published the survey on university mailing lists and car-related online forums, targeting academics and car enthusiasts as potential early adopters within the key demographics of premium car manufacturers. The title was kept general, stating that we wanted to hear about experiences on the road, and participants were given the incentive of joining a lottery for three $20 gift certificates. 292 people visited the landing page of our survey and 170 participants completed it. 22 sets of answers were discarded because (a) the respondents could not remember a suitable experience or (b) the raters found the input to be inappropriate (provocative/off-topic/implausible). This left us with 73 respondents for the negative, and 75 for the positive questionnaire.
Respondents were recruited from Europe (60%) and North America (40%), 33% of them were female and 67% male, aged 16 to 70. The mean age of respondents was 26.9 ( S D = 9.1 ) . Driving experience was balanced among participants with 25% reporting an annual mileage of less than 5000 km, 23% between 5000 km and 10,000 km, 30% between 10,000 km and 20,000 km, and 22% were frequent drivers with more than 20,000 km. All data was retrieved and stored anonymously. Contact information for the lottery was instantly separated from the answers and deleted once the winners were announced.

4.3. Limitations

Self-reporting questionnaires are widely used to get a better understanding of users [60], but they are prone to inaccuracies due to recall bias, imprecise phrasing, or inaccurate self-perception [61,62]. We took measures to limit such influences by asking generalized questions that went through multiple iterations before we published the survey and by applying filters and hand-selection to eradicate invalid data.
We recruited our respondents through online mailing lists and car forums. For the selection of these distribution channels, we tried to target car enthusiasts and people who prefer high-end vehicles. While this sample may not represent a general population, it fits quite well with the target group of high-end car manufacturers. We also have to accept a certain bias through self-selection because of the online recruiting approach.
Compared to in situ interviews, surveys have the limitation that they are asynchronous with regard to the event under investigation. However, as the emotional experiences we aim to investigate do not happen very frequently and as we focus on real-world experiences (i.e., not on artificial lab situations), it is practically impossible to collect such findings through a lab or real-world driving study. As a consequence, findings from our study are mostly qualitative and descriptive. Furthermore, study settings without experimental manipulation are often prone to confounding factors influencing the findings. In this case, this is intentional, as we aim to find out about contextual reasons for emotions on the road, which can be seen as uncontrolled variables. With the goal of implementing emotion-aware systems in the car, this is a major first step to understand emotional experiences in the car. Obviously, future research is required to understand how people react in situations such as those we identified in our survey. As with all descriptive research, we have to accept that the findings we present are hardly repeatable and that they are the product of interpretation. We acknowledge the problem of researcher bias, which we addressed with a multi-rater setup in order to prevent the subjective emphasizing of certain ideas.

5. Inductive Coding and Inter-Rater Agreement

We clustered all answers into categories with two specific questions in mind: (1) Where do emotions come from while driving? (2) How can future in-vehicle systems improve the driving experience? Following the methodology of thematic analysis [63], this resulted in a classification system consisting of 6 categories and 47 subcategories from the 148 valid sets of answers. The classification was continually adjusted while processing the participants’ responses. With the final codebook, two raters grouped a random sample of 10% of the datasets into the given classifications and discussed their performance to improve common understanding.
Answers of both raters were transferred into a two-column table with binary values for every response and subcategory. Inter-rater agreement was calculated with Cohen’s κ [64] because we have nominal data and two raters [65]. This procedure was adopted from the coding analysis of previous survey papers [58,59]. The analysis shows a substantial strength of agreement of κ = 0.751 [66], which justified generalizing the coding to the full sample [67]. The remaining answers were then split evenly between the two raters for coding.
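The agreement computation described above can be sketched as follows. This is a minimal illustration of Cohen’s κ for two raters over binary subcategory codes; the rating data shown is hypothetical, not the paper’s actual coding table.

```python
# Minimal sketch of Cohen's kappa for two raters coding the same items
# with nominal (here: binary) labels. Illustrative data only.
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """kappa = (p_o - p_e) / (1 - p_e), with p_o the observed agreement
    and p_e the agreement expected by chance from the raters' marginals."""
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    # Observed agreement: fraction of items both raters coded identically.
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Chance agreement from each rater's marginal label frequencies.
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    p_e = sum((freq_a[k] / n) * (freq_b[k] / n)
              for k in freq_a.keys() | freq_b.keys())
    return (p_o - p_e) / (1 - p_e)

# Hypothetical codes for one subcategory over 20 responses
# (1 = subcategory assigned, 0 = not assigned):
a = [1, 0, 1, 1, 0, 0, 1, 0, 1, 1, 0, 0, 1, 1, 0, 1, 0, 0, 1, 0]
b = [1, 0, 1, 0, 0, 0, 1, 0, 1, 1, 0, 1, 1, 1, 0, 1, 0, 0, 1, 0]
print(round(cohens_kappa(a, b), 3))  # 18/20 raw agreement yields kappa = 0.8
```

In practice, a library implementation such as scikit-learn’s `cohen_kappa_score` would typically be used instead of a hand-rolled version.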

6. Results

The answers given by the respondents varied in length and detail. Negative stories were not only longer but often also more emotional. The aforementioned method of thematic analysis prepared the qualitative feedback for factor analysis, allowing us to determine correlations among observed events and involved parties in context and their effects on emotions. Thirteen items showed reasonable factorability, determined by a Kaiser-Meyer-Olkin measure of sampling adequacy of 0.74 (values above 0.6 are recommended [68]) and significant results (χ²(78) = 540.25, p < 0.0001) in Bartlett’s test of sphericity. Results of the factor analysis are mentioned where applicable.
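For readers unfamiliar with the sphericity check above, the following sketch shows how Bartlett’s test statistic is computed from a correlation matrix. The 3 × 3 matrix used here is hypothetical (the paper’s analysis covered 13 items, giving 78 degrees of freedom); a real analysis would use a statistics package such as `factor_analyzer` or SciPy.

```python
# Hedged sketch of Bartlett's test of sphericity, which checks whether a
# correlation matrix differs significantly from the identity matrix
# (a prerequisite for factor analysis). Pure stdlib; illustrative data only.
import math

def det(m):
    """Determinant via Gaussian elimination with partial pivoting."""
    m = [row[:] for row in m]
    n = len(m)
    d = 1.0
    for i in range(n):
        piv = max(range(i, n), key=lambda r: abs(m[r][i]))
        if piv != i:
            m[i], m[piv] = m[piv], m[i]
            d = -d
        d *= m[i][i]
        for r in range(i + 1, n):
            f = m[r][i] / m[i][i]
            for c in range(i, n):
                m[r][c] -= f * m[i][c]
    return d

def bartlett_sphericity(corr, n_obs):
    """Chi-square statistic and degrees of freedom for Bartlett's test:
    chi2 = -(n - 1 - (2p + 5) / 6) * ln|R|,  df = p(p - 1) / 2."""
    p = len(corr)
    chi2 = -(n_obs - 1 - (2 * p + 5) / 6) * math.log(det(corr))
    dof = p * (p - 1) // 2
    return chi2, dof

# Hypothetical correlation matrix of 3 coded items over 148 responses:
R = [[1.00, 0.74, 0.30],
     [0.74, 1.00, 0.25],
     [0.30, 0.25, 1.00]]
chi2, dof = bartlett_sphericity(R, 148)  # large chi2 -> factorable
```

A large χ² relative to its degrees of freedom (here, χ² ≈ 129 on 3 degrees of freedom) rejects the hypothesis that the items are uncorrelated, which is what licenses running a factor analysis at all.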

6.1. Triggers of Emotional Situations while Driving

Participants remember positive situations in connection with a “feeling of being free” (P134) and “good music” (P148), for example, “Driving on an empty highway at night, with loud music blasting from the speakers, open windows letting the cold night air in and just feeling completely free.” (P132). This recurring pattern was also expressed by P140 who said “I put on really loud music and sang along. Because it was so late, the roads were very empty. I felt very relieved and relaxed.”—and by P89: “I felt free, enjoyed being alone, and enjoyed the fact that I just do not have to do anything other than drive a car and listen to music while doing some racing.”
Empty roads are often connected with a tendency to “drive spiritedly” (P152, P144). Some drivers explicitly say they enjoyed “the sound of cars” (P144) or their “roaring exhaust” (P136) while doing so. Overall, 20 respondents identified the ride itself as a reason for positive emotions (Figure 1).
In contrast to blissful joyrides, most stories which start with reports of traffic violations end in real anger. 45 respondents described the driving quality of others as a reason for negative emotions, followed by traffic and (near) accidents (Figure 1). Negative situations seem to happen more often on highways, but also on any other road with potentially high traffic volume. When we filter for involved parties, other traffic participants score highest for negative situations (Figure 2), and the behavior with the most negative influence is traffic violations (Figure 3). Factor analysis shows one component with corresponding factor loadings for traffic violations (0.827) and the wish for better driving of others (0.803) with a high correlation (0.743).
However, negative experiences are not only induced during the ride: drivers also reported that their previous emotional state had an impact on the situation. Prominent examples are overwhelming sadness (P37, P57) and anxiety as a result of previous negative driving experiences (P15, P45). Rides with family members are also reported to have an effect on the emotional state, with moderate correlations of family members (0.588) and their mood (0.573) influencing the driver (0.444).

6.2. Users’ Suggestions for Affective Interfaces

After narrating their emotional experience, respondents had the chance to reflect on what involved parties could have done to improve the situation and how an intelligent vehicle or an in-car assistant should have behaved (Figure 4).
Drivers in negative situations generally wished for better driving and a more considerate, respectful attitude. Positive situations understandably did not generate much input on how to improve the situation, yet drivers expressed an affinity towards personal interaction. We found one component with corresponding factor loadings for positive emotions (0.671) and the “be quiet” system approach (0.666) with a weak correlation of 0.175.
Many drivers envision future cars to adapt the music according to their feelings, for example, by “automatic mood music starting to play” (P129), by choosing “the perfect next song” (P107), or by adjusting the volume and bass when the music fits the moment (P105). Others wish for a photo function, so they can “re-enjoy the scenery” after the ride (P117).
More safety-relevant features like fatigue detection (P157) or a navigator which calms down the driver in case of a detour (P23) were also mentioned. An approach to comfort the driver has been asked for several times, e.g., by P54: “First the vehicle has to calm me down because once someone loses their nerves they become more susceptible to make mistakes. This can be done with music, light, or the vehicle speaking back to me. Second the vehicle should help me get out of the situation itself and make sure I got back to the correct mood.”—another respondent wants the system to “recognize the negative emotions and ask the driver to stop, I do not think he should be deprived of control, maybe only on request” (P37). Of course the system could also monitor other passengers’ emotional states, e.g., kids on the backseat or the nervous occupant in the passenger seat—and adapt its capabilities accordingly. P165 envisioned an autonomous vehicle which can detect their mother and “replicate my mother’s driving style rather than my own to make her feel more comfortable in the car”.
Many respondents equated future cars with self-driving cars, from which they expect the ability to avoid dangerous situations, even if caused by the driver (e.g., P4, P12, P49, P57, P62, P66, P74). P4 said an interface for autonomous vehicles should in case of an incident “rationally analyze and explain the situation” to calm down the passengers. P62 predicted that in the future “a system can see one car moving much faster than the flow and force a gentle slowdown” to ensure road safety. This notion of imposed compliance to traffic rules has also been suggested by many respondents: P121 wrote “the situation as a whole would be better if vehicles prevented/discouraged breaking rules” and P71 stated that “ideally, a fully smart car would just have made the stop and checked for traffic on its own without relying on the driver to actually comply”.
Users want to use self-driving cars to make better use of their time. Interfaces can support them by e.g., tinting the windows so they “could not see anything at all [of the outsides] and could focus entirely on people or work” (P30), or by allowing them to rest while “the vehicle could stop and start by itself in traffic” (P65). Others want the car to take over so they can party with their friends (P91).
If we remember the many cases of positive situations involving joyrides, we can see the point of system functionalities designed to enable these experiences. P81 suggests “a future system would have been able to recognize that positive emotions had arisen through challenging driving passages and could have proposed similar/more challenging routes on its own”. Other drivers wish for a system which can warn them of close-by traffic so they can safely speed when the streets are empty and slow down when it is unsafe to do so (e.g., P151, P158). The functionality to “plan routes and stops according to scenic views” (P131) has also been asked for multiple times.
A group of respondents in favor of joyful driving (P135, P136, P139, P142, P143, P145, P147, P148, P155) share the opinion that a modern interface “would ruin this particular experience [of a joyride]” (P152) and that modern cars “are inherently less exciting as they take away some of the control from the driver” (P147). They wish for a “back-to-basics mode” (P135) which allows for “a manual override so that [they are] actually driving” (P142) and turns down all interactions to let “the passengers just enjoy it, at least for a while” (P148).
Another impulse comes from two respondents who felt positively reinforced after receiving a thank you gesture from another driver (P101, P124): they proposed a feature to “convey nice thoughts to other drivers” (P101), which could be achieved through external signals or direct communication. This could also be used to detect an inevitable collision and warn the passengers (P17). Inter-vehicle communication has been mentioned comparably often by drivers who already use additional information channels, e.g., info by co-driver / radio (correlation of 0.257 ). They suggest autonomous vehicles should communicate with each other to optimize the flow of traffic (P67) and with drivers of non-automated cars to clarify their intentions (P60). Drivers could use this channel in unclear situations to “ask for permission, which you cannot do with the blinking lights” (P53). Some also favored that drivers who violate traffic rules could be criticized through such a connection between cars (P43, P46, P29, P21).

7. Design Recommendations for Future Automotive User Interfaces

Based on the statements of 148 drivers, we identified a set of propositions for designers of future automotive interfaces with a focus on affective features and needs arising through the automation of the driving task. These 7 design recommendations are the main themes which emerged from a thematic analysis. Direct quotations are used to provide a means of member checking for the user sample [63].
Drivers Want the Interface to Respond Empathically and Focus On Their Well-Being Over 50% of the participants proposed affective functionalities, such as adaptive music (e.g., playlist, volume, equalizer), empathic voice interaction, encouragement in difficult situations, or ambient lighting. The system could also help by monitoring the kids on the backseat or by adapting the driving style if sensitivity or motion sickness is detected in passengers (e.g., P145).
Drivers Want the Navigation to Take Into Account the Emotional Aspects of the Route Respondents suggest that routes are chosen according to the expected effects on the driver and passengers [69]. When they are in an adventurous mood a challenging backcountry road would be better than when they want to arrive at their destination quickly (e.g., P81, P138). Drivers on a joyride would prefer empty roads, families on a Sunday trip would enjoy scenic mountain routes with picnic stops, and so forth (e.g., P112, P132).
Drivers Sometimes Want to Hand Over Responsibility Some respondents (40) state that they would like to switch to autonomous driving in certain situations, e.g., in bad traffic or when they want to have some private time. This also shows in the correlation of negative emotions and traffic violations (see Figure 3).
Drivers Want to Take Back Control for Fun Drivers occasionally want to turn off automation and even safety features to experience the pleasure of driving. Several users (e.g., P136, P144, P152) independently suggested a driving mode with very limited electronic support functionalities, which allows them to enjoy the ride more roughly in certain situations.
Drivers Want to Spend Their Time More Efficiently When not driving manually, future cars should enable their passengers to be undisturbed while enjoying music, working, or relaxing. Users want to focus on interacting with other passengers or even party inside the vehicle. The car could also save them time by running errands on its own (e.g., P30, P81, P142, P143).
Drivers Want Inter-Vehicle Communication Communication between vehicles is expected in modern traffic scenarios. Connected cars can thus communicate with each other and with the infrastructure to optimize the ride. Users seemingly also want to use this channel to thank other drivers (P101, P124), communicate intentions or ask for permission (P60, P53), or quickly distribute safety-relevant information (P17, P67).
Drivers Want the Car to Make Driving Safer Future cars should be able to intervene when the driver is about to cause an accident (e.g., P2, P7, P28, P34) but also take into account the driver’s experience (P64). Such a system could address past mistakes and teach the driver how to avoid them in the future (P21, P29, P43, P46). For this functionality, we need a long-term driver model to manage the driver’s preferences and abilities.

8. Conclusions and Future Work

The first part of our results describes actual experiences of drivers and the circumstances that led to an emotional situation. We learn, for example, that highways show a higher occurrence of negative emotions than other road types. We can take this finding as inspiration to build route recommendations based on the passengers’ emotional states [69]. Other testimonies about the situations themselves can be applied directly in automotive research (e.g., to define use cases for autonomous driving or in-vehicle assistants).
From discussions within the community, we know that context is seen as a promising first pointer for an estimation of driver emotions [70]. Previous work on driving contexts employed qualitative methods similar to ours but only looked at very specific states, such as angry driving [71], or at the effects of emotions on driving performance [7,72]. We fill this gap with a wider catchment area of generally positive and negative emotions. This new knowledge can be used in practical applications that analyze context data to estimate the driver’s state. First approaches toward the sensory realization of such systems have been presented recently [73,74]; a production system would, however, require deploying these sensors in a fleet of vehicles to collect sufficient data and the corresponding ground truth needed for adequate machine learning algorithms.
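Before such a sensor fleet and trained models exist, context alone can serve as the "first pointer" described above. The sketch below is a deliberately crude, rule-based estimator; the feature names, thresholds, and rules are our assumptions, loosely mirroring the survey trends (highways and dense traffic correlating with negative reports, passengers and scenery with positive ones), and stand in for what would eventually be a learned model.

```python
def estimate_valence(context: dict) -> str:
    """Very rough rule-based estimate of driver affect from context.

    Returns 'negative', 'neutral', or 'positive'. All rules are
    illustrative placeholders for a future learned classifier.
    """
    score = 0
    if context.get("road_type") == "highway":
        score -= 1            # highways showed more negative reports
    if context.get("traffic_density", 0.0) > 0.7:
        score -= 2            # dense traffic was a strong negative trigger
    if context.get("passengers", 0) > 0:
        score += 1            # friends aboard were mostly positive memories
    if context.get("scenic", False):
        score += 1            # nice surroundings correlated with positive affect
    if score > 0:
        return "positive"
    if score < 0:
        return "negative"
    return "neutral"
```

Such a context prior could later be fused with camera- or physiology-based signals [73,74] rather than used on its own.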
The second part consists of an ideation exercise, contributing general research directions for HCI researchers focusing on future HMI concepts. The reported statements are only a manageable excerpt that gives a glimpse of the feedback we received. Participants describe systems taking care of the passengers’ well-being, which could also be applied on school buses to check on children. They also remind us to consider how subjects feel about being monitored. Will autonomous cars that adapt their driving style to the passengers’ preferences be accepted by other traffic participants, or by other passengers? We also found ideas among the drivers’ responses that have already been evaluated before, such as an inter-vehicle feedback system [75]. These concepts could gain new momentum, as we show that users actively propose them. A next step could be assessing such prototypes with respect to user expectations, acceptance, and reactance [76,77].
Another direction of research we want to provoke is feasibility studies on how to actually incorporate the required technologies into the car. Affective interfaces depend on robust emotion detection and artificial intelligence, and are influenced by data protection legislation. Detection can be implemented with cameras, physiological sensors, or connected consumer devices such as smartwatches and phones. Autonomous driving needs vast amounts of input data and high computing capacity, and is already heavily regulated. How can we make these visions reality while combining societal goals like the eradication of fatal traffic accidents [78] with the wish to drive manually for fun? We are excited to see the future development of research on affective systems and hope to provide a groundwork for designers with the identified ideas for future automotive UIs.
Such a paradigm shift has happened before in the field of mobile phones, where some of the established assumptions on user needs (e.g., the need for long battery life) made way for newer needs and, in doing so, rendered established (and less flexible) manufacturers insignificant [79].
We present a set of seven design recommendations for future in-car user interfaces, which we hope designers will adopt as a foundation for creating the future of automotive HMI.

Author Contributions

Conceptualization: M.B., B.P. and F.A.; Investigation: M.B.; Methodology: M.B., B.P. and F.A.; Supervision: B.P. and F.A.; Visualization: M.B.; Writing—original draft: M.B.; Writing—review and editing: M.B., B.P. and F.A.

Funding

This work was funded, in part, by the Bavarian State Ministry of Education, Science and the Arts in the framework of the Center Digitization.Bavaria (ZD.B) and by the German Research Foundation (DFG), Grant No. AL 1899/2-1.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Islinger, T.; Köhler, T.; Wolff, C. Human Modeling in a Driver Analyzing Context: Challenge and Benefit. In Proceedings of the 3rd International Conference on Automotive User Interfaces and Interactive Vehicular Applications, Salzburg, Austria, 30 November–2 December 2011; ACM: New York, NY, USA, 2011; pp. 99–104. [Google Scholar] [CrossRef]
  2. Coughlin, J.F.; Reimer, B.; Mehler, B. Monitoring, Managing, and Motivating Driver Safety and Well-Being. IEEE Pervasive Comput. 2011, 10, 14–21. [Google Scholar] [CrossRef]
  3. SAE. Taxonomy and Definitions for Terms Related to On-Road Motor Vehicle Automated Driving Systems; SAE: Warrendale, PA, USA, 2018; p. 35. [Google Scholar] [CrossRef]
  4. Miller, D.; Sun, A.; Johns, M.; Ive, H.; Sirkin, D.; Aich, S.; Ju, W. Distraction Becomes Engagement in Automated Driving. Proc. Hum. Factors Ergon. Soc. Annu. Meet. 2015, 59, 1676–1680. [Google Scholar] [CrossRef]
  5. Eyben, F.; Wöllmer, M.; Poitschke, T.; Schuller, B.; Blaschke, C.; Färber, B.; Nguyen-Thien, N. Emotion on the Road: Necessity, Acceptance, and Feasibility of Affective Computing in the Car. Adv. Hum.-Comp. Interact. 2010, 2010, 5. [Google Scholar] [CrossRef]
  6. Fakhrhosseini, S.M.; Landry, S.; Tan, Y.Y.; Bhattarai, S.; Jeon, M. If You’re Angry, Turn the Music on: Music Can Mitigate Anger Effects on Driving Performance. In Proceedings of the 6th International Conference on Automotive User Interfaces and Interactive Vehicular Applications, Seattle, WA, USA, 17–19 September 2014; ACM: New York, NY, USA, 2014; pp. 18:1–18:7. [Google Scholar] [CrossRef]
  7. Jeon, M.; Yim, J.B.; Walker, B.N. An Angry Driver is Not the Same As a Fearful Driver: Effects of Specific Negative Emotions on Risk Perception, Driving Performance, and Workload. In Proceedings of the 3rd International Conference on Automotive User Interfaces and Interactive Vehicular Applications, Salzburg, Austria, 30 November–2 December 2011; ACM: New York, NY, USA, 2011; pp. 137–142. [Google Scholar] [CrossRef]
  8. Zimasa, T.; Jamson, S.; Henson, B. Are happy drivers safer drivers? Evidence from hazard response times and eye tracking data. Transp. Res. Part F Traffic Psychol. Behav. 2017, 46, 14–23. [Google Scholar] [CrossRef] [Green Version]
  9. Jeon, M.; Walker, B.N. What to detect? Analyzing Factor Structures of Affect in Driving Contexts for an Emotion Detection and Regulation System. Proc. Hum. Factors Ergon. Soc. Annu. Meet. 2011, 55, 1889–1893. [Google Scholar] [CrossRef]
  10. Riener, A.; Jeon, M.; Alvarez, I.; Frison, A.K. Driver in the Loop: Best Practices in Automotive Sensing and Feedback Mechanisms. In Automotive User Interfaces: Creating Interactive Experiences in the Car; Meixner, G., Müller, C., Eds.; Springer International Publishing: Cham, Switzerland, 2017; pp. 295–323. [Google Scholar] [CrossRef]
  11. Abraham, H.; Reimer, B.; Seppelt, B.; Fitzgerald, C.; Mehler, B.; Coughlin, J.F. Consumer Interest in Automation: Preliminary Observations Exploring a Year’s Change; Technical Report 2017-2; Massachusetts Institute of Technology AgeLab: Cambridge, MA, USA, 2017. [Google Scholar]
  12. Scott-Parker, B. Emotions, behaviour, and the adolescent driver: A literature review. Transp. Res. Part F Traffic Psychol. Behav. 2017, 50, 1–37. [Google Scholar] [CrossRef]
  13. Kun, A.L.; Paek, T.; Medenica, V.; Memarović, N.; Palinko, O. Glancing at Personal Navigation Devices Can Affect Driving: Experimental Results and Design Implications. In Proceedings of the 1st International Conference on Automotive User Interfaces and Interactive Vehicular Applications, Essen, Germany, 21–22 September 2009; ACM: New York, NY, USA, 2009; pp. 129–136. [Google Scholar] [CrossRef]
  14. Tchankue, P.; Wesson, J.; Vogts, D. The Impact of an Adaptive User Interface on Reducing Driver Distraction. In Proceedings of the 3rd International Conference on Automotive User Interfaces and Interactive Vehicular Applications, Salzburg, Austria, 30 November–2 December 2011; ACM: New York, NY, USA, 2011; pp. 87–94. [Google Scholar] [CrossRef]
  15. National Highway Traffic Safety Administration. Visual-Manual NHTSA Driver Distraction Guidelines for in-Vehicle Electronic Devices; National Highway Traffic Safety Administration (NHTSA), Department of Transportation (DOT): Washington, DC, USA, 2012.
  16. Roider, F.; Rümelin, S.; Pfleging, B.; Gross, T. The Effects of Situational Demands on Gaze, Speech and Gesture Input in the Vehicle. In Proceedings of the 9th ACM International Conference on Automotive User Interfaces and Vehicular Applications, Oldenburg, Germany, 24–27 September 2017; ACM: New York, NY, USA, 2017. [Google Scholar] [CrossRef]
  17. Meschtscherjakov, A.; Wilfinger, D.; Scherndl, T.; Tscheligi, M. Acceptance of Future Persuasive In-car Interfaces towards a More Economic Driving Behaviour. In Proceedings of the 1st International Conference on Automotive User Interfaces and Interactive Vehicular Applications, Essen, Germany, 21–22 September 2009; ACM: New York, NY, USA, 2009; pp. 81–88. [Google Scholar] [CrossRef]
  18. Knobel, M.; Hassenzahl, M.; Männlein, S.; Lamara, M.; Schumann, J.; Eckoldt, K.; Laschke, M.; Butz, A. Become a Member of the Last Gentlemen: Designing for Prosocial Driving. In Proceedings of the 6th International Conference on Designing Pleasurable Products and Interfaces, Newcastle upon Tyne, UK, 3–5 September 2013; ACM: New York, NY, USA, 2013; pp. 60–66. [Google Scholar] [CrossRef]
  19. Kumar, M.; Kim, T. Dynamic Speedometer: Dashboard Redesign to Discourage Drivers from Speeding. In Proceedings of the CHI ’05 Extended Abstracts on Human Factors in Computing Systems, Portland, OR, USA, 2–7 April 2005; ACM: New York, NY, USA, 2005; pp. 1573–1576. [Google Scholar] [CrossRef]
  20. Nass, C.; Jonsson, I.M.; Harris, H.; Reaves, B.; Endo, J.; Brave, S.; Takayama, L. Improving Automotive Safety by Pairing Driver Emotion and Car Voice Emotion. In Proceedings of the CHI ’05 Extended Abstracts on Human Factors in Computing Systems, Portland, OR, USA, 2–7 April 2005; ACM: New York, NY, USA, 2005; pp. 1973–1976. [Google Scholar] [CrossRef]
  21. Velusamy, S.; Kannan, H.; Anand, B.; Sharma, A.; Navathe, B. A method to infer emotions from facial Action Units. In Proceedings of the 2011 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), Prague, Czech Republic, 22–27 May 2011; IEEE: Prague, Czech Republic, 2011; pp. 2028–2031. [Google Scholar] [CrossRef]
  22. Hoch, S.; Althoff, F.; McGlaun, G.; Rigoll, G. Bimodal fusion of emotional data in an automotive environment. In Proceedings of the 2005 IEEE International Conference on Acoustics, Speech, and Signal Processing, Philadelphia, PA, USA, 18–23 March 2005; IEEE: Philadelphia, PA, USA, 2005; Volume 2, pp. ii/1085–ii/1088. [Google Scholar] [CrossRef]
  23. Melnicuk, V.; Birrell, S.; Crundall, E.; Jennings, P. Towards hybrid driver state monitoring: Review, future perspectives and the role of consumer electronics. In Proceedings of the 2016 IEEE Intelligent Vehicles Symposium (IV), Gothenburg, Sweden, 19–22 June 2016; IEEE: Gothenburg, Sweden, 2016; pp. 1392–1397. [Google Scholar] [CrossRef] [Green Version]
  24. Braun, M. Are Autonomous Vehicles The Sentient Robots We Were Promised? In Proceedings of the CHI ’18 Workshop: Interacting with Autonomous Vehicles: Learning from other Domains, Montreal, QC, Canada, 21–26 April 2018; ACM: New York, NY, USA, 2018. [Google Scholar]
  25. Pantic, M.; Sebe, N.; Cohn, J.F.; Huang, T. Affective Multimodal Human-computer Interaction. In Proceedings of the 13th Annual ACM International Conference on Multimedia, Singapore, 6–12 November 2005; ACM: New York, NY, USA, 2005; pp. 669–676. [Google Scholar] [CrossRef]
  26. Kushlev, K.; Cardoso, B.; Pielot, M. Too Tense for Candy Crush: Affect Influences User Engagement with Proactively Suggested Content. In Proceedings of the 19th International Conference on Human-Computer Interaction with Mobile Devices and Services, Singapore, 9–12 September 2017; ACM: New York, NY, USA, 2017; pp. 13:1–13:6. [Google Scholar] [CrossRef]
  27. Slovák, P.; Tennent, P.; Reeves, S.; Fitzpatrick, G. Exploring Skin Conductance Synchronisation in Everyday Interactions. In Proceedings of the 8th Nordic Conference on Human-Computer Interaction: Fun, Fast, Foundational, Helsinki, Finland, 26–30 October 2014; ACM: New York, NY, USA, 2014; pp. 511–520. [Google Scholar] [CrossRef]
  28. Riener, A.; Ferscha, A.; Aly, M. Heart on the Road: HRV Analysis for Monitoring a Driver’s Affective State. In Proceedings of the 1st International Conference on Automotive User Interfaces and Interactive Vehicular Applications, Essen, Germany, 21–22 September 2009; ACM: New York, NY, USA, 2009; pp. 99–106. [Google Scholar] [CrossRef]
  29. Schmidt, A.; Beigl, M.; Gellersen, H.W. There is more to context than location. Comput. Gr. 1999, 23, 893–901. [Google Scholar] [CrossRef] [Green Version]
  30. Wickens, C.M.; Wiesenthal, D.L.; Flora, D.B.; Flett, G.L. Understanding driver anger and aggression: Attributional theory in the driving environment. J. Exp. Psychol. Appl. 2011, 17, 354–370. [Google Scholar] [CrossRef] [PubMed]
  31. Rödel, C.; Stadler, S.; Meschtscherjakov, A.; Tscheligi, M. Towards Autonomous Cars: The Effect of Autonomy Levels on Acceptance and User Experience. In Proceedings of the 6th International Conference on Automotive User Interfaces and Interactive Vehicular Applications, Seattle, WA, USA, 17–19 September 2014; ACM: New York, NY, USA, 2014; pp. 11:1–11:8. [Google Scholar] [CrossRef]
  32. Sarter, N.B.; Woods, D.D. How in the World Did We Ever Get into That Mode? Mode Error and Awareness in Supervisory Control. Hum. Factors 1995, 37, 5–19. [Google Scholar] [CrossRef]
  33. Endsley, M.R.; Kiris, E.O. The Out-of-the-Loop Performance Problem and Level of Control in Automation. Hum. Factors 1995, 37, 381–394. [Google Scholar] [CrossRef]
  34. Koo, J.; Kwac, J.; Ju, W.; Steinert, M.; Leifer, L.; Nass, C. Why did my car just do that? Explaining semi-autonomous driving actions to improve driver understanding, trust, and performance. Int. J. Interact. Des. Manuf. 2015, 9, 269–275. [Google Scholar] [CrossRef]
  35. Kerschbaum, P.; Lorenz, L.; Hergeth, S.; Bengler, K. Designing the human-machine interface for highly automated cars—Challenges, exemplary concepts and studies. In Proceedings of the 2015 IEEE International Workshop on Advanced Robotics and its Social Impacts (ARSO), Lyon, France, 1–3 July 2015; IEEE: Lyon, France, 2015; pp. 1–6. [Google Scholar] [CrossRef]
  36. Kun, A.L.; Boll, S.; Schmidt, A. Shifting Gears: User Interfaces in the Age of Autonomous Driving. IEEE Pervasive Comput. 2016, 15, 32–38. [Google Scholar] [CrossRef]
  37. Das, S.; Sekar, A.; Chen, R.; Kim, H.C.; Wallington, T.J.; Williams, E. Impacts of Autonomous Vehicles on Consumers Time-Use Patterns. Challenges 2017, 8, 32. [Google Scholar] [CrossRef]
  38. Helldin, T.; Falkman, G.; Riveiro, M.; Davidsson, S. Presenting System Uncertainty in Automotive UIs for Supporting Trust Calibration in Autonomous Driving. In Proceedings of the 5th International Conference on Automotive User Interfaces and Interactive Vehicular Applications, Eindhoven, The Netherlands, 28–30 October 2013; ACM: New York, NY, USA, 2013; pp. 210–217. [Google Scholar] [CrossRef]
  39. Schroeter, R.; Steinberger, F. Pokémon DRIVE: Towards Increased Situational Awareness in Semi-automated Driving. In Proceedings of the 28th Australian Conference on Computer-Human Interaction, Launceston, Australia, 29 November–2 December 2016; ACM: New York, NY, USA, 2016; pp. 25–29. [Google Scholar] [CrossRef]
  40. Ive, H.P.; Sirkin, D.; Miller, D.; Li, J.; Ju, W. “Don’t Make Me Turn This Seat Around!”: Driver and Passenger Activities and Positions in Autonomous Cars. In Proceedings of the 7th International Conference on Automotive User Interfaces and Interactive Vehicular Applications, Nottingham, UK, 1–3 September 2015; ACM: New York, NY, USA, 2015; pp. 50–55. [Google Scholar] [CrossRef]
  41. Osswald, S.; Sundström, P.; Tscheligi, M. The Front Seat Passenger: How to Transfer Qualitative Findings into Design. Int. J. Veh. Technol. 2013. [Google Scholar] [CrossRef]
  42. Wilfinger, D.; Meschtscherjakov, A.; Murer, M.; Osswald, S.; Tscheligi, M. Are We There Yet? A Probing Study to Inform Design for the Rear Seat of Family Cars. In Proceedings of the Human-Computer Interaction—INTERACT 2011: 13th IFIP TC 13 International Conference, Lisbon, Portugal, 5–9 September 2011, Part II; Campos, P., Graham, N., Jorge, J., Nunes, N., Palanque, P., Winckler, M., Eds.; Springer: Berlin/Heidelberg, Germany, 2011; pp. 657–674. [Google Scholar] [CrossRef]
  43. Ferati, M.; Murano, P.; Anthony Giannoumis, G. Universal Design of User Interfaces in Self-driving Cars. In Advances in Design for Inclusion; Di Bucchianico, G., Kercher, P.F., Eds.; Springer International Publishing: Cham, Switzerland, 2018; pp. 220–228. [Google Scholar]
  44. Voinescu, A.; Morgan, P.L.; Alford, C.; Caleb-Solly, P. Investigating Older Adults’ Preferences for Functions Within a Human-Machine Interface Designed for Fully Autonomous Vehicles. In Human Aspects of IT for the Aged Population. Applications in Health, Assistance, and Entertainment; Zhou, J., Salvendy, G., Eds.; Springer International Publishing: Cham, Switzerland, 2018; pp. 445–462. [Google Scholar]
  45. Fagnant, D.J.; Kockelman, K. Preparing a nation for autonomous vehicles: Opportunities, barriers and policy recommendations. Transp. Res. Part A Policy Pract. 2015, 77, 167–181. [Google Scholar] [CrossRef]
  46. Knobel, M.; Hassenzahl, M.; Lamara, M.; Sattler, T.; Schumann, J.; Eckoldt, K.; Butz, A. Clique Trip: Feeling Related in Different Cars. In Proceedings of the Designing Interactive Systems Conference 2012, Newcastle Upon Tyne, UK, 11–15 June 2012; ACM: New York, NY, USA, 2012; pp. 29–37. [Google Scholar] [CrossRef]
  47. Wang, C.; Terken, J.; Hu, J.; Rauterberg, M. “Likes” and “Dislikes” on the Road: A Social Feedback System for Improving Driving Behavior. In Proceedings of the 8th International Conference on Automotive User Interfaces and Interactive Vehicular Applications, Ann Arbor, MI, USA, 24–26 October 2016; ACM: New York, NY, USA, 2016; pp. 43–50. [Google Scholar] [CrossRef]
  48. Baber, J.; Kolodko, J.; Noel, T.; Parent, M.; Vlacic, L. Cooperative autonomous driving: Intelligent vehicles sharing city roads. IEEE Robot. Autom. Mag. 2005, 12, 44–49. [Google Scholar] [CrossRef]
  49. Macadam, C.C. Understanding and Modeling the Human Driver. Veh. Syst. Dyn. 2003, 40, 101–134. [Google Scholar] [CrossRef]
  50. Yan, S.; Tran, C.C.; Wei, Y.; Habiyaremye, J.L. Driver’s Mental Workload Prediction Model Based on Physiological Indices. Int. J. Occup. Saf. Ergon. 2017, 1–9. [Google Scholar] [CrossRef] [PubMed]
  51. Taylor, T.; Pradhan, A.; Divekar, G.; Romoser, M.; Muttart, J.; Gomez, R.; Pollatsek, A.; Fisher, D. The view from the road: The contribution of on-road glance-monitoring technologies to understanding driver behavior. Accid. Anal. Prev. 2013, 58, 175–186. [Google Scholar] [CrossRef] [PubMed]
  52. Kamaruddin, N.; Wahab, A. Driver behavior analysis through speech emotion understanding. In Proceedings of the 2010 IEEE Intelligent Vehicles Symposium, San Diego, CA, USA, 21–24 June 2010; IEEE: San Diego, CA, USA, 2010; pp. 238–243. [Google Scholar] [CrossRef]
  53. Inagaki, T. Towards Monitoring and Modelling for Situation-Adaptive Driver Assist Systems. In Modelling Driver Behaviour in Automotive Environments: Critical Issues in Driver Interactions with Intelligent Transport Systems; Cacciabue, P.C., Ed.; Springer: London, UK, 2007; pp. 43–57. [Google Scholar] [CrossRef]
  54. Kuffner, W.; Würtenberger, M.; Wisselmann, D.; Gentner, H.; Rau, R. Hände Weg vom Steuer! ATZagenda 2012, 1, 62–65. [Google Scholar] [CrossRef]
  55. Keltner, D.; Haidt, J. Social Functions of Emotions at Four Levels of Analysis. Cogn. Emot. 1999, 13, 505–521. [Google Scholar] [CrossRef] [Green Version]
  56. Little, L.; Briggs, P. Private whispers/public eyes: Is receiving highly personal information in a public place stressful? Interact. Comput. 2009, 21, 316–322. [Google Scholar] [CrossRef]
  57. Marques, D.; Guerreiro, T.; Carriço, L. Measuring Snooping Behavior with Surveys: It’s How You Ask It. In Proceedings of the CHI ’14 Extended Abstracts on Human Factors in Computing Systems, Toronto, ON, Canada, 26 April–1 May 2014; ACM: New York, NY, USA, 2014; pp. 2479–2484. [Google Scholar] [CrossRef]
  58. Eiband, M.; Khamis, M.; von Zezschwitz, E.; Hussmann, H.; Alt, F. Understanding Shoulder Surfing in the Wild: Stories from Users and Observers. In Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems, Denver, CO, USA, 6–11 May 2017; ACM: New York, NY, USA, 2017; pp. 4254–4265. [Google Scholar] [CrossRef]
  59. Simmons, A.; Hoon, L. Agree to Disagree: On Labelling Helpful App Reviews. In Proceedings of the 28th Australian Conference on Computer-Human Interaction, Launceston, Australia, 29 November–2 December 2016; ACM: New York, NY, USA, 2016; pp. 416–420. [Google Scholar] [CrossRef]
  60. Adams, A.; Sasse, M.A. Users Are Not the Enemy. Commun. ACM 1999, 42, 40–46. [Google Scholar] [CrossRef]
  61. Paulhus, D.L.; Vazire, S. The self-report method. In Handbook of Research Methods in Personality Psychology; Robins, R.W., Fraley, R.C., Krueger, R.F., Eds.; Guilford Press: New York, NY, USA, 2007; pp. 224–239. [Google Scholar]
  62. Vazire, S.; Mehl, M.R. Knowing me, knowing you: The accuracy and unique predictive validity of self-ratings and other-ratings of daily behavior. J. Personal. Soc. Psychol. 2008, 95, 1202. [Google Scholar] [CrossRef] [PubMed]
  63. Braun, V.; Clarke, V. Using thematic analysis in psychology. Qual. Res. Psychol. 2006, 3, 77–101. [Google Scholar] [CrossRef] [Green Version]
  64. Cohen, J. A Coefficient of Agreement for Nominal Scales. Educ. Psychol. Meas. 1960, 20, 37–46. [Google Scholar] [CrossRef]
  65. Dewey, M.E. Coefficients of Agreement. Br. J. Psychiatry 1983, 143, 487–489. [Google Scholar] [CrossRef] [PubMed]
  66. Landis, J.R.; Koch, G.G. The Measurement of Observer Agreement for Categorical Data. Biometrics 1977, 33, 159–174. [Google Scholar] [CrossRef] [PubMed]
  67. Hallgren, K.A. Computing Inter-Rater Reliability for Observational Data: An Overview and Tutorial. Tutor. Quant. Methods Psychol. 2012, 8, 23–34. [Google Scholar] [CrossRef] [PubMed]
  68. Kaiser, H.F. A second generation little jiffy. Psychometrika 1970, 35, 401–415. [Google Scholar] [CrossRef]
  69. Pfleging, B.; Schneegass, S.; Meschtscherjakov, A.; Tscheligi, M. Experience Maps: Experience-Enhanced Routes for Car Navigation. In Proceedings of the 6th International Conference on Automotive User Interfaces and Interactive Vehicular Applications, Seattle, WA, USA, 17–19 September 2014; ACM: New York, NY, USA, 2014; pp. 1–6. [Google Scholar] [CrossRef]
  70. Bosch, E.; Oehl, M.; Jeon, M.; Alvarez, I.; Healey, J.; Ju, W.; Jallais, C. Emotional GaRage: A Workshop on In-Car Emotion Recognition and Regulation. In Proceedings of the 10th International Conference on Automotive User Interfaces and Interactive Vehicular Applications, Toronto, ON, Canada, 23–25 September 2018; ACM: New York, NY, USA, 2018; pp. 44–49. [Google Scholar] [CrossRef]
  71. Lennon, A.; Watson, B. “Teaching them a lesson?” A qualitative exploration of underlying motivations for driver aggression. Accid. Anal. Prev. 2011, 43, 2200–2208. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  72. Hu, T.Y.; Xie, X.; Li, J. Negative or positive? The effect of emotion and mood on risky driving. Transp. Res. Part F Traffic Psychol. Behav. 2013, 16, 29–40. [Google Scholar] [CrossRef]
  73. Weber, M. Automotive Emotions: A Human-centred Approach Towards The Measurement and Understanding of Drivers’ Emotions And Their Triggers. Ph.D. Thesis, Brunel University, London, UK, 2018. [Google Scholar]
  74. Vasey, E.; Ko, S.; Jeon, M. In-Vehicle Affect Detection System: Identification of Emotional Arousal by Monitoring the Driver and Driving Style. In Proceedings of the 10th International Conference on Automotive User Interfaces and Interactive Vehicular Applications, Toronto, ON, Canada, 23–25 September 2018; ACM: New York, NY, USA, 2018; pp. 243–247. [Google Scholar] [CrossRef]
  75. Wang, C.; Terken, J.; Yu, B.; Hu, J. Reducing Driving Violations by Receiving Feedback from Other Drivers. In Proceedings of the 7th International Conference on Automotive User Interfaces and Interactive Vehicular Applications, Nottingham, UK, 1–3 September 2015; ACM: New York, NY, USA, 2015; pp. 62–67. [Google Scholar] [CrossRef]
  76. Bader, R.; Siegmund, O.; Woerndl, W. A Study on User Acceptance of Proactive In-Vehicle Recommender Systems. In Proceedings of the 3rd International Conference on Automotive User Interfaces and Interactive Vehicular Applications, Salzburg, Austria, 30 November–2 December 2011; ACM: New York, NY, USA, 2011; pp. 47–54. [Google Scholar] [CrossRef]
  77. Lessne, G.; Venkatesan, M. Reactance Theory in Consumer Research: the Past, Present and Future. ACR Adv. Consum. Res. 1989, 16, 76–78. [Google Scholar]
  78. United Nations General Assembly. Resolution Adopted by the General Assembly on 25 September 2015: Transforming Our World: The 2030 Agenda for Sustainable Development; United Nations General Assembly: New York, NY, USA, 2015. [Google Scholar]
  79. Bouwman, H.; Carlsson, C.; Carlsson, J.; Nikou, S.; Sell, A.; Walden, P. How Nokia failed to nail the Smartphone market. In Proceedings of the 25th European Regional Conference of the International Telecommunications Society (ITS), Brussels, Belgium, 22–25 June 2014; International Telecommunications Society (ITS): Brussels, Belgium, 2014. [Google Scholar]
Figure 1. Negative experiences (red) were mostly triggered by traffic-related incidents, while positive situations (blue) are connected with nice surroundings, music, enjoyment of the ride and personal interaction.
Figure 2. The influence of other traffic participants tends to spark negative experiences, whereas friends as passengers are mostly recollected in positive situations.
Figure 3. Positive mood and conversations inside the car are connected to positive emotions, as well as affirmation of the driver’s behavior. Traffic violations on the other hand are directly related to disgruntlement.
Figure 4. Drivers in negative situations had many ideas for improvement, such as inter-vehicle communication, autonomous driving, improved information management and various media functionalities. Drivers in positive situations provided less input but with similar content, except for the wish that a system should be quiet in nice moments and offer a ‘back-to-basic’ option for electronic assistance settings.
