On Driver Behavior Recognition for Increased Safety: A Roadmap
Abstract
1. Introduction
- The incorrect estimation of the driver’s state, of the status of the ego-vehicle (also denoted as subject vehicle or Vehicle Under Test (VUT), i.e., the vehicle containing the sensors perceiving the environment around it) [13,14], and of the external environment may cause the ADAS to activate incorrectly or to make wrong decisions. Besides the immediate danger, wrong decisions reduce drivers’ confidence in the system.
- Many sensors are needed to realize such an ADAS. Sensors are prone to errors and require several processing layers to produce usable outputs; each layer introduces delays and may hide or corrupt data.
- Dependable systems that recognize emotions and human states are still a research challenge. They are usually built around algorithms requiring heterogeneous input data provided by different sensing technologies, which may introduce unexpected errors into the system.
- Effective communication between the ADAS and the driver is hard to achieve. Indeed, human distraction plays a critical role in car accidents [15] and can arise from both external and internal sources.
- adoption of tactful monitoring of psychological and physiological parameters (e.g., eye closure), able to significantly improve the detection of dangerous situations (e.g., distraction and drowsiness);
- improvement in detecting dangerous situations (e.g., drowsiness) with reasonable accuracy, based on driving performance measures (e.g., monitoring of “drift-and-jerk” steering and detection of lateral fluctuations of the vehicle); a minimal feature-extraction sketch follows this list;
- introduction of “secondary” feedback mechanisms, subsidiary to those originally provided in the vehicle, able to further enhance detection accuracy; an example is auditory feedback returned to the driver through a predefined, prerecorded human voice, which the human ear perceives as more acceptable than a synthetic one.
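To make the second point concrete, the following minimal sketch (our illustration, not the cited systems’ implementation) extracts two classic “drift-and-jerk” style indicators from a steering-angle trace: the steering reversal rate and the angular-jerk energy. The sampling rate, dead-band, and window length are placeholder values.

```python
import numpy as np

def steering_features(angle_deg, fs=50.0, window_s=60.0, deadband_dps=2.0):
    """Per-window drift-and-jerk indicators from a steering-wheel angle trace.

    angle_deg: 1-D array of steering angles (degrees) sampled at fs Hz.
    Returns one feature dict per non-overlapping window; the thresholds a
    drowsiness detector would apply to these features are application-specific.
    """
    vel = np.gradient(angle_deg) * fs   # angular velocity (deg/s)
    acc = np.gradient(vel) * fs         # angular acceleration (deg/s^2)
    win = int(window_s * fs)
    feats = []
    for start in range(0, len(angle_deg) - win + 1, win):
        v = vel[start:start + win]
        a = acc[start:start + win]
        active = v[np.abs(v) > deadband_dps]          # suppress sensor noise
        reversals = np.count_nonzero(np.diff(np.sign(active)))
        feats.append({
            "reversal_rate_per_min": reversals * 60.0 / window_s,
            "jerk_energy": float(np.mean(a ** 2)),    # jerky corrections
        })
    return feats
```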
1.1. Survey Methodology
- The state of the art of modern ADASs, considering articles and surveys within a five-year time frame. We limit ourselves here to highly cited articles that are still relevant today.
- Psychological frameworks for emotion/cognition recognition. We have reviewed both the classic frameworks (e.g., Tomkins [18], Russell [19], Ekman [20], etc.) and recent applications of these techniques in the automotive domain. We observe a lack of usage of such specialized frameworks in real-world ADAS technologies.
- Sensors and AI-based systems, proposed in recent literature, that we consider relevant to the implementation of the above frameworks in the automotive domain.
1.2. Article Structure
2. ADAS Using Driver Emotion Recognition
2.1. Emotions Recognition in the Automotive Field
2.1.1. Facial Expression and Emotion Recognition
2.1.2. Valence and Engagement
2.1.3. Further Factors Influencing Driving Behavior
2.1.4. Emotional Effects on Driving Behavior
2.1.5. Emotion Induction and Emotion Regulation Approaches
- ambient light, i.e., the use of blue lighting to leverage its calming effect on the level of arousal;
- visual notification, i.e., visual feedback about the current state of the driver;
- voice assistant, i.e., audio feedback about the driver’s status provided in natural language—and obtained with Natural Language Processing (NLP) tasks—with suggestions of regulation strategies;
- empathic assistant, i.e., a voice assistant enhanced with an empathic tone of voice mirroring the driver’s state and suggesting regulation activities; a toy strategy-selection sketch follows this list.
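A minimal sketch of how these four approaches could be arbitrated at runtime, assuming an upstream estimator provides valence in [-1, 1] and arousal in [0, 1]; all thresholds are hypothetical placeholders rather than validated values:

```python
from dataclasses import dataclass

@dataclass
class DriverAffect:
    valence: float  # -1 (negative) .. +1 (positive)
    arousal: float  #  0 (calm)     ..  1 (highly aroused)

def pick_regulation_strategy(state: DriverAffect) -> str:
    """Toy rule-based mapping from affective state to a regulation strategy."""
    if state.arousal > 0.8:
        return "ambient_light"       # blue lighting to lower arousal
    if state.valence < -0.3 and state.arousal > 0.5:
        return "empathic_assistant"  # mirror the state, suggest regulation
    if state.valence < -0.3:
        return "voice_assistant"     # NLP feedback with suggestions
    return "visual_notification"     # low-key feedback on the current state
```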
2.1.6. Emotion Recognition Technologies in the Vehicle
2.2. Human–Machine Interface (HMI)
2.2.1. Strategies to Build Human–Automation Cooperation: Teaming
2.3. AI Components in ADAS
- Mid-range ADASs (like lane change estimation, blind spot monitoring, emergency brake intervention, and DMSs) are currently deployed in many commercial vehicles. For these ADASs, research focuses on understanding whether AI/ML can provide superior accuracy with respect to rule-based systems or the human driver.
- High-range ADASs for (experimental) autonomous vehicles, like Tesla’s [87] or Waymo’s [88], require complex classification tasks to reconstruct a navigable representation of the environment. These tasks are mainly performed using DL components based on CNNs and constitute the emerging field of AI-based Autonomous Driving (AIAD); a minimal CNN sketch follows this list.
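For illustration only, the sketch below shows a deliberately tiny CNN image classifier of the kind such perception stacks build upon; we assume PyTorch, and production AIAD networks are of course far deeper and multi-task:

```python
import torch
import torch.nn as nn

class TinyPerceptionNet(nn.Module):
    """A minimal CNN: two conv/pool stages followed by a linear head."""
    def __init__(self, n_classes: int = 10):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.head = nn.Sequential(
            nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(32, n_classes)
        )

    def forward(self, x):  # x: (batch, 3, H, W) camera frames
        return self.head(self.features(x))

logits = TinyPerceptionNet()(torch.randn(1, 3, 64, 64))  # smoke test
```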
- Confidentiality, which is the property through which data are disclosed only as intended by the data owner. Personal data collected from people inside a vehicle’s cabin are the main challenge to be faced, as each occupant should be sure that his/her data are not disclosed to unauthorized parties, especially data in which personal features are recognizable (e.g., imaging data).
- Integrity, which is the property guaranteeing that critical assets are not altered against the owner’s wishes. In vehicular contexts, this is particularly relevant when sensor or Electronic Control Unit (ECU) data need to be stored with anti-tamper mechanisms (e.g., data to be used in case of accidents and disputes); an illustrative record-sealing sketch follows this list.
- Availability, which is the property according to which critical assets will be accessible when needed for authorized use.
- Accountability, which is the property according to which actions affecting critical assets can be traced to the actor or automated component responsible for the action.
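As a sketch of the integrity property in a vehicular setting, the snippet below seals a sensor/ECU log record with an HMAC-SHA256 tag so that later alteration is detectable. Key management, secure storage, and the record layout are all out of scope and assumed here:

```python
import hashlib
import hmac
import json

def seal_record(record: dict, key: bytes) -> dict:
    """Attach a keyed tag so tampering with the stored record is detectable."""
    payload = json.dumps(record, sort_keys=True).encode()
    tag = hmac.new(key, payload, hashlib.sha256).hexdigest()
    return {"payload": record, "tag": tag}

def verify_record(sealed: dict, key: bytes) -> bool:
    payload = json.dumps(sealed["payload"], sort_keys=True).encode()
    expected = hmac.new(key, payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, sealed["tag"])

# Example: a hypothetical pre-crash speed sample.
sealed = seal_record({"ecu": "ABS", "t": 1.25, "speed_kmh": 87.0}, key=b"demo-key")
assert verify_record(sealed, b"demo-key")
```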
2.4. Sensing Components for Human Emotion Recognition
2.4.1. Inertial Sensors
2.4.2. Camera Sensors
2.4.3. Sensor-Equipped Steering Wheel and Wearables
2.4.4. Issues with Sensing Technologies
- Eye movements and PERCLOS (PERCentage of eyelid CLOsure); a computation sketch follows this list.
- Tracking of gaze direction and head orientation, e.g., through the Electro-OculoGram (EOG).
- The driver’s level of attention, with his/her mental stress derived by considering both gaze direction and focus point.
- Pupil diameter, as it has been observed that, under the driver’s mental stress, activation of the sympathetic nervous system causes an increase in pupil diameter.
- Brain activity, especially through an EEG.
- ECG and HRV, which, as mentioned before, allow for monitoring of the tight connection between mental activity and the autonomic nervous system, which in turn influences human heart activity. In more detail, the balance between the sympathetic and parasympathetic nervous systems affects heart rate, which generally rises or falls with the predominance of the sympathetic or parasympathetic system, respectively.
- Facial muscle activity and facial tracking, whose parameters and directions can be monitored through a camera focused on the driver, aiming to detect his/her attention level and drowsiness degree and to alert the vehicle’s occupants (through on-board mechanisms, e.g., HMIs) when driving conditions are not appropriate (e.g., a weary driver usually nods or swings his/her head, yawns often, and blinks rapidly and constantly).
- Steering Wheel Movement (SWM), a widely recognized indicator of the driver’s drowsiness level, which can be obtained unobtrusively with a steering-angle sensor so as not to interfere with driving.
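As an illustration of the PERCLOS and ECG/HRV items above, the sketch below computes PERCLOS from per-frame eye-openness estimates and RMSSD, a standard short-term HRV statistic, from R-R intervals. Both inputs are assumed to come from upstream camera and ECG pipelines, and the 0.2 closure threshold is an illustrative placeholder (PERCLOS is classically defined at 80% eyelid closure):

```python
import numpy as np

def perclos(eye_openness, closed_threshold=0.2):
    """Fraction of frames in which the eye is 'mostly closed'.

    eye_openness: per-frame openness ratio in [0, 1] from an eye tracker;
    openness below 0.2 approximates the classical 80%-closure criterion.
    """
    eye_openness = np.asarray(eye_openness, dtype=float)
    return float(np.mean(eye_openness < closed_threshold))

def rmssd(rr_intervals_ms):
    """Root mean square of successive R-R differences (ms), a common
    HRV measure; it tends to drop when sympathetic activity dominates."""
    rr = np.asarray(rr_intervals_ms, dtype=float)
    return float(np.sqrt(np.mean(np.diff(rr) ** 2)))
```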
3. The NextPerception Approach
3.1. Statement and Vision
3.2. Development Process
3.3. Use Case on Driver Behavior Recognition
“Peter is driving in manual mode on an urban road. He is in a hurry, since he is going to pick up his daughter Martha at the kindergarten and he is late. While driving, he receives an important mail from his boss. Since he has to reply immediately, he takes his smartphone and starts composing a message, thus starting to drive erratically. The vehicle informs him that his behavior is not safe and invites him to focus on driving. Since he does not react properly, the vehicle establishes that Peter is no longer in the condition to drive. Automation offers the opportunity to take control and informs him that he can relax and reply to the email. Taking into account Peter’s rush, automation adapts the driving style to reach the kindergarten safely but on time.”
“Julie is driving on the highway in automated mode. She is arguing with her husband on the phone, and she is visibly upset. Since automation detects that she is going to approach the highway exit, it starts adjusting the interior lighting to calm Julie and, then, plays some calming background music. Since Julie is still upset, the vehicle concludes that she is not in the condition to take control safely and proposes that she take a rest and drink chamomile tea at the closest service station.”
- To develop robust, reliable, non-obtrusive methods to infer a combined cognitive/emotional state; to the best of our knowledge, in the automotive domain there is no system able to infer the combination of these factors, and examples of integrated measures are lacking [79].
- To develop interaction modalities, e.g., based on the combination of visual and vocal interaction, including shared control modalities to facilitate the transition of control and, in general, the interaction between the driver and the highly automated system.
- To develop an appropriate framework to induce and collect data about the driver state from both the cognitive and emotional sides. Indeed, driver cognitive states have been investigated more than emotional states [160]. To date, drivers’ emotions have always been investigated separately from drivers’ cognitive states because of the difficulty of distinguishing emotional and cognitive load effects [60,160]. Further research is thus needed to understand the whole driver state from both perspectives, starting from an appropriate experimental design paradigm to induce both conditions.
3.4. Improving Emotion Recognition for Vehicle Safety
3.5. Improving Assistance Using Accurate Driver Complex State Monitoring
- Vehicle information will be collected through the in-vehicle Controller Area Network (CAN) in order to recognize the driving pattern under impairment conditions, such as cognitive distraction.
- Driver data will be collected through a combination of unobtrusive sensors. For example, as introduced in Section 2.4.2, cameras inside the cockpit will be used to detect driver activity and visual distraction as well as emotional activation from facial expressions; thermal cameras will be used to detect drowsiness and arousal; other unobtrusive sensors (e.g., smartwatches and more general wearable devices) will be used to measure the driver’s engagement and other bio-physiological parameters. The combination of different sensors enables the detection of several parameters influencing the driver’s fitness to drive. We will refer to this combination as the Driver Complex State (DCS), given by the combination of (i) emotional state; (ii) visual distraction; (iii) cognitive distraction; (iv) arousal; and (v) fatigue/drowsiness. An illustrative fusion sketch follows this list.
- Finally, external data, including road conditions, weather conditions, lane occupation, and the actions of other vehicles on the road, will be considered.
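Purely as an illustration of how the DCS components could feed a single fitness-to-drive score (in the spirit of the F2DI listed among the abbreviations), the sketch below linearly fuses the five components, each assumed to arrive as a normalized impairment level in [0, 1]; the weights and the linear form are placeholders, not the project’s actual definition:

```python
# Hypothetical weights for the five DCS components; they must sum to 1.
DCS_WEIGHTS = {
    "emotional_activation": 0.15,
    "visual_distraction": 0.25,
    "cognitive_distraction": 0.25,
    "arousal_deviation": 0.15,
    "drowsiness": 0.20,
}

def fitness_to_drive(dcs: dict) -> float:
    """Returns 1.0 for a fully fit driver, approaching 0.0 when impaired.

    dcs maps each component name to a normalized impairment level in [0, 1].
    """
    impairment = sum(w * min(max(dcs[name], 0.0), 1.0)
                     for name, w in DCS_WEIGHTS.items())
    return 1.0 - impairment

print(fitness_to_drive({"emotional_activation": 0.1, "visual_distraction": 0.8,
                        "cognitive_distraction": 0.6, "arousal_deviation": 0.2,
                        "drowsiness": 0.1}))  # ~0.59
```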
3.6. Experimental Setup and Expected Results
4. Recommendations and Future Directions
5. Conclusions
Author Contributions
Funding
Conflicts of Interest
Abbreviations
| Abbreviation | Meaning |
| --- | --- |
| ABS | Anti-lock Braking System |
| ACC | Adaptive Cruise Control |
| ADAS | Advanced Driver-Assistance Systems |
| AI | Artificial Intelligence |
| AIAD | AI-based Autonomous Driving |
| AU | Action Unit |
| CAN | Controller Area Network |
| CNN | Convolutional Neural Network |
| CV | Computer Vision |
| DCS | Driver Complex State |
| DEAPS | Database for Emotion Analysis using Physiological Signals |
| DL | Deep Learning |
| DMS | Driver Monitoring System |
| DSS | Decision Support System |
| ECG | Electrocardiogram |
| ECU | Electronic Control Unit |
| EEG | Electroencephalogram |
| EOG | Electrooculogram |
| ESC | Electronic Stability Control |
| F2DI | Fitness-to-Drive Index |
| FACS | Facial Action Coding System |
| FCW | Forward Collision Warning |
| HCI | Human–Computer Interaction |
| HF | Human Factor |
| HMI | Human–Machine Interface |
| HRV | Heart Rate Variability |
| IMU | Inertial Measurement Unit |
| LDWS | Lane Departure Warning System |
| LIDAR | Laser Imaging Detection and Ranging |
| LSTM | Long Short-Term Memory |
| LWIR | LongWave InfraRed |
| MEMS | Micro Electromechanical System |
| ML | Machine Learning |
| NDO | Naturalistic Driving Observation |
| NLP | Natural Language Processing |
| OBD | On-Board Diagnostic |
| PERCLOS | PERCentage of eyelid CLOsure |
| RNN | Recurrent Neural Network |
| SAE | Society of Automotive Engineers |
| SOTIF | Safety Of The Intended Functionality |
| SWM | Steering Wheel Movement |
| ToF | Time of Flight |
| TSR | Traffic Sign Recognition |
| UC | Use Case |
| UCD | User-Centered Design |
| UI | User Interface |
| US | User Story |
| V2I | Vehicle-to-Infrastructure |
| V2V | Vehicle-to-Vehicle |
| V2X | Vehicle-to-Everything |
| VIWI | Volkswagen Infotainment Web Interface |
| XAI | EXplainable AI |
References
- Ziebinski, A.; Cupek, R.; Grzechca, D.; Chruszczyk, L. Review of Advanced Driver Assistance Systems (ADAS). AIP Conf. Proc. 2017, 1906, 120002. [Google Scholar] [CrossRef]
- Vollrath, M.; Schleicher, S.; Gelau, C. The Influence of Cruise Control and Adaptive Cruise Control on Driving Behaviour—A driving simulator study. Accid. Anal. Prev. 2011, 43, 1134–1139. [Google Scholar] [CrossRef]
- Satoh, M.; Shiraishi, S. Performance of Antilock Brakes with Simplified Control Technique. In SAE International Congress and Exposition; SAE International: Warrendale, PA, USA, 1983. [Google Scholar] [CrossRef]
- Centers for Disease Control and Prevention (CDC). Increasing Alcohol Ignition Interlock Use. Available online: https://www.cdc.gov/motorvehiclesafety/impaired_driving/ignition_interlock_states.html (accessed on 28 September 2020).
- Martinelli, N.S.; Seoane, R. Automotive Night Vision System. In Thermosense XXI. International Society for Optics and Photonics; SPIE: Bellingham, WA, USA, 1999; Volume 3700, pp. 343–346. [Google Scholar] [CrossRef]
- How Pre-Collision Systems Work. Available online: https://auto.howstuffworks.com/car-driving-safety/safety-regulatory-devices/pre-collision-systems.htm (accessed on 28 September 2020).
- Jacobé de Naurois, C.; Bourdin, C.; Stratulat, A.; Diaz, E.; Vercher, J.L. Detection and Prediction of Driver Drowsiness using Artificial Neural Network Models. Accid. Anal. Prev. 2019, 126, 95–104. [Google Scholar] [CrossRef] [PubMed]
- How Electronic Stability Control Works. Available online: https://auto.howstuffworks.com/car-driving-safety/safety-regulatory-devices/electronic-stability-control.htm (accessed on 28 September 2020).
- Wang, C.; Sun, Q.; Li, Z.; Zhang, H.; Fu, R. A Forward Collision Warning System based on Self-Learning Algorithm of Driver Characteristics. J. Intell. Fuzzy Syst. 2020, 38, 1519–1530. [Google Scholar] [CrossRef]
- Kortli, Y.; Marzougui, M.; Atri, M. Efficient Implementation of a Real-Time Lane Departure Warning System. In Proceedings of the 2016 International Image Processing, Applications and Systems (IPAS), Hammamet, Tunisia, 5–7 November 2016; pp. 1–6. [Google Scholar] [CrossRef]
- Luo, H.; Yang, Y.; Tong, B.; Wu, F.; Fan, B. Traffic Sign Recognition Using a Multi-Task Convolutional Neural Network. IEEE Trans. Intell. Transp. Syst. 2018, 19, 1100–1111. [Google Scholar] [CrossRef]
- Singh, S. Critical Reasons for Crashes Investigated in the National Motor Vehicle Crash Causation Survey. Technical Report. (US Department of Transportation—National Highway Traffic Safety Administration). Report No. DOT HS 812 115. 2015. Available online: https://crashstats.nhtsa.dot.gov/Api/Public/ViewPublication/812506 (accessed on 11 November 2020).
- Ego Vehicle—Coordinate Systems in Automated Driving Toolbox. Available online: https://www.mathworks.com/help/driving/ug/coordinate-systems.html (accessed on 12 November 2020).
- Ego Vehicle—The British Standards Institution (BSI). Available online: https://www.bsigroup.com/en-GB/CAV/cav-vocabulary/ego-vehicle/ (accessed on 12 November 2020).
- Regan, M.A.; Lee, J.D.; Young, K. Driver Distraction: Theory, Effects, and Mitigation; CRC Press: Boca Raton, FL, USA, 2008. [Google Scholar]
- Salman, H.; Li, J.; Razenshteyn, I.; Zhang, P.; Zhang, H.; Bubeck, S.; Yang, G. Provably Robust Deep Learning via Adversarially Trained Smoothed Classifiers. In Advances in Neural Information Processing Systems; Curran Associates, Inc.: Red Hook, NY, USA, 2019; pp. 11292–11303. Available online: http://papers.nips.cc/paper/9307-provably-robust-deep-learning-via-adversarially-trained-smoothed-classifiers.pdf (accessed on 12 November 2020).
- NextPerception—Next Generation Smart Perception Sensors and Distributed Intelligence for Proactive Human Monitoring in Health, Wellbeing, and Automotive Systems—Grant Agreement 876487. Available online: https://cordis.europa.eu/project/id/876487 (accessed on 1 October 2020).
- Tomkins, S. Affect Imagery Consciousness: The Complete Edition: Two Volumes; Springer Publishing Company: Berlin/Heidelberg, Germany, 2008. [Google Scholar]
- Russell, J.A. A Circumplex Model of Affect. J. Personal. Soc. Psychol. 1980, 39. [Google Scholar] [CrossRef]
- Ekman, P. Basic Emotions. In Handbook of Cognition and Emotion; John Wiley & Sons, Ltd.: Hoboken, NJ, USA, 2005; Chapter 3; pp. 45–60. [Google Scholar] [CrossRef]
- International Organization for Standardization (ISO). ISO 26262-1:2018—Road Vehicles—Functional Safety. Available online: https://www.iso.org/standard/68383.html (accessed on 24 September 2020).
- World Wide Web Consortium (W3C). Vehicle Information Access API. Available online: https://www.w3.org/2014/automotive/vehicle_spec.html (accessed on 24 September 2020).
- World Wide Web Consortium (W3C). Volkswagen Infotainment Web Interface (VIWI) Protocol. Available online: https://www.w3.org/Submission/2016/SUBM-viwi-protocol-20161213/ (accessed on 24 September 2020).
- Society of Automotive Engineers (SAE). SAE E/E Diagnostic Test Modes J1979_201702. Available online: https://www.sae.org/standards/content/j1979_201702/ (accessed on 24 September 2020).
- AbuAli, N.; Abou-zeid, H. Driver Behavior Modeling: Developments and Future Directions. Int. J. Veh. Technol. 2016, 2016. [Google Scholar] [CrossRef] [Green Version]
- Amparore, E.; Beccuti, M.; Botta, M.; Donatelli, S.; Tango, F. Adaptive Artificial Co-pilot as Enabler for Autonomous Vehicles and Intelligent Transportation Systems. In Proceedings of the 10th International Workshop on Agents in Traffic and Transportation (ATT 2018), Stockholm, Sweden, 13–19 July 2018. [Google Scholar]
- Rahman, M.; Chowdhury, M.; Xie, Y.; He, Y. Review of Microscopic Lane-Changing Models and Future Research Opportunities. IEEE Trans. Intell. Transp. Syst. 2013, 14, 1942–1956. [Google Scholar] [CrossRef]
- Doshi, A.; Trivedi, M.M. Tactical Driver Behavior Prediction and Intent Inference: A Review. In Proceedings of the 2011 14th International IEEE Conference on Intelligent Transportation Systems (ITSC), Washington, DC, USA, 5–7 October 2011; pp. 1892–1897. [Google Scholar] [CrossRef] [Green Version]
- Moridpour, S.; Sarvi, M.; Rose, G. Lane Changing Models: A Critical Review. Transp. Lett. 2010, 2, 157–173. [Google Scholar] [CrossRef]
- Wang, W.; Xi, J.; Chen, H. Modeling and Recognizing Driver Behavior Based on Driving Data: A Survey. Math. Probl. Eng. 2014, 2014. [Google Scholar] [CrossRef] [Green Version]
- Brown, K.; Driggs-Campbell, K.; Kochenderfer, M.J. Modeling and Prediction of Human Driver Behavior: A Survey. arXiv 2020, arXiv:2006.08832. [Google Scholar]
- Tomkins, S. Affect, Imagery, Consciousness; Springer Pub. Co: New York, NY, USA, 1962. [Google Scholar]
- Izard, C.E.; Libero, D.Z.; Putnam, P.; Haynes, O.M. Stability of Emotion Experiences and Their Relations to Traits of Personality. J. Personal. Soc. Psychol. 1993, 64, 847–860. [Google Scholar] [CrossRef]
- Plutchik, R. The Nature of Emotions: Human emotions have deep evolutionary roots, a fact that may explain their complexity and provide tools for clinical practice. Am. Sci. 2001, 89, 344–350. [Google Scholar] [CrossRef]
- Ekman, P. Facial Action Coding System; A Human Face: Salt Lake City, UT, USA, 2002. [Google Scholar]
- ImageNet Database. Available online: http://image-net.org/ (accessed on 24 September 2020).
- Pons, G.; Masip, D. Supervised Committee of Convolutional Neural Networks in Automated Facial Expression Analysis. IEEE Trans. Affect. Comput. 2018, 9, 343–350. [Google Scholar] [CrossRef]
- Ko, B. A Brief Review of Facial Emotion Recognition Based on Visual Information. Sensors 2018, 18, 401. [Google Scholar] [CrossRef] [PubMed]
- Kim, D.H.; Baddar, W.J.; Jang, J.; Ro, Y.M. Multi-Objective Based Spatio-Temporal Feature Representation Learning Robust to Expression Intensity Variations for Facial Expression Recognition. IEEE Trans. Affect. Comput. 2019, 10, 223–236. [Google Scholar] [CrossRef]
- Shao, J.; Qian, Y. Three Convolutional Neural Network Models for Facial Expression Recognition in the Wild. Neurocomputing 2019, 355, 82–92. [Google Scholar] [CrossRef]
- Dhall, A.; Goecke, R.; Gedeon, T.; Sebe, N. Emotion Recognition in the Wild. J. Multimodal User Interfaces 2016, 10, 95–97. [Google Scholar] [CrossRef] [Green Version]
- Savran, A.; Gur, R.; Verma, R. Automatic Detection of Emotion Valence on Faces Using Consumer Depth Cameras. In Proceedings of the 2013 IEEE International Conference on Computer Vision Workshops, Sydney, Australia, 1–8 December 2013; pp. 75–82. [Google Scholar] [CrossRef] [Green Version]
- Cai, H.; Lin, Y. Modeling of Operators’ Emotion and Task Performance in a Virtual Driving Environment. Int. J. Hum. Comput. Stud. 2011, 69, 571–586. [Google Scholar] [CrossRef]
- Yiend, J. The Effects of Emotion on Attention: A Review of Attentional Processing of Emotional Information. Cogn. Emot. 2010, 24, 3–47. [Google Scholar] [CrossRef]
- Ben Henia, W.M.; Lachiri, Z. Emotion Classification in Arousal-Valence Dimension Using Discrete Affective Keywords Tagging. In Proceedings of the 2017 International Conference on Engineering MIS (ICEMIS), Monastir, Tunisia, 8–10 May 2017; pp. 1–6. [Google Scholar] [CrossRef]
- Ceccacci, S.; Mengoni, M.; Generosi, A.; Giraldi, L.; Carbonara, G.; Castellano, A.; Montanari, R. A Preliminary Investigation Towards the Application of Facial Expression Analysis to Enable an Emotion-Aware Car Interface. In Universal Access in Human-Computer Interaction. Applications and Practice; Antona, M., Stephanidis, C., Eds.; Springer International Publishing: Cham, Switzerland, 2020; pp. 504–517. [Google Scholar] [CrossRef]
- Begg, D.; Langley, J. Changes in Risky Driving Behavior from Age 21 to 26 Years. J. Saf. Res. 2001, 32, 491–499. [Google Scholar] [CrossRef]
- Evans, L.; Wasielewski, P. Risky Driving Related to Driver and Vehicle Characteristics. Accid. Anal. Prev. 1983, 15, 121–136. [Google Scholar] [CrossRef]
- Job, R.F.S. The Road User: The Psychology of Road Safety. In Safe and Mobile: Introductory Studies in Traffic Safety; Clark, J., Ed.; Emu Press: Armidale, Australia, 1999. [Google Scholar]
- Winfred, A.J.; Dennis, D. Predicting Motor Vehicle Crash Involvement from a Personality Measure and a Driving Knowledge Test. J. Prev. Interv. Community 2001, 22, 35–42. [Google Scholar] [CrossRef]
- Iversen, H. Risk-taking Attitudes and Risky Driving Behaviour. Transp. Res. Part F Traffic Psychol. Behav. 2004, 7, 135–150. [Google Scholar] [CrossRef]
- Reason, J. Human Error; Cambridge University Press: Cambridge, UK, 1990. [Google Scholar] [CrossRef]
- Butters, J.; Mann, R.E.; Wickens, C.M.; Boase, P. Gender Differences and Demographic Influences in Perceived Concern for Driver Safety and Support for Impaired Driving Countermeasures. J. Saf. Res. 2012, 43, 405–411. [Google Scholar] [CrossRef]
- Oppenheim, I.; Oron-Gilad, T.; Parmet, Y.; Shinar, D. Can Traffic Violations Be Traced to Gender-Role, Sensation Seeking, Demographics and Driving Exposure? Transp. Res. Part F Traffic Psychol. Behav. 2016, 43, 387–395. [Google Scholar] [CrossRef]
- Factor, R.; Mahalel, D.; Yair, G. The Social Accident: A Theoretical Model and a Research Agenda for Studying the Influence of Social and Cultural Characteristics on Motor Vehicle Accidents. Accid. Anal. Prev. 2007, 39, 914–921. [Google Scholar] [CrossRef]
- Fernandes, R.; Soames Job, R.; Hatfield, J. A Challenge to the Assumed Generalizability of Prediction and Countermeasure for Risky Driving: Different Factors Predict Different Risky Driving Behaviors. J. Saf. Res. 2007, 38, 59–70. [Google Scholar] [CrossRef]
- Parnell, K.J.; Stanton, N.A.; Plant, K.L. Driver Distraction, 1st ed.; CRC Press: Boca Raton, FL, USA, 2018. [Google Scholar] [CrossRef]
- Eyben, F.; Wöllmer, M.; Poitschke, T.; Schuller, B.; Blaschke, C.; Färber, B.; Nguyen-Thien, N. Emotion on the Road: Necessity, Acceptance, and Feasibility of Affective Computing in the Car. Adv. Hum. Comp. Int. 2010, 2010. [Google Scholar] [CrossRef] [Green Version]
- Jeon, M.; Walker, B.N.; Yim, J.B. Effects of Specific Emotions on Subjective Judgment, Driving Performance, and Perceived Workload. Transp. Res. Part F Traffic Psychol. Behav. 2014, 24, 197–209. [Google Scholar] [CrossRef]
- Braun, M.; Weber, F.; Alt, F. Affective Automotive User Interfaces—Reviewing the State of Emotion Regulation in the Car. arXiv 2020, arXiv:2003.13731. [Google Scholar]
- Yerkes, R.M.; Dodson, J.D. The Relation of Strength of Stimulus to Rapidity of Habit-Formation. J. Comp. Neurol. Psychol. 1908, 18, 459–482. [Google Scholar] [CrossRef] [Green Version]
- Pêcher, C.; Lemercier, C.; Cellier, J.M. Emotions Drive Attention: Effects on Driver’s Behaviour. Saf. Sci. 2009, 47, 1254–1259. [Google Scholar] [CrossRef]
- Jeon, M. Chapter 17—Emotions in Driving. In Emotions and Affect in Human Factors and Human-Computer Interaction; Jeon, M., Ed.; Academic Press: San Diego, CA, USA, 2017; pp. 437–474. [Google Scholar] [CrossRef]
- Braun, M.; Weiser, S.; Pfleging, B.; Alt, F. A Comparison of Emotion Elicitation Methods for Affective Driving Studies. In Adjunct Proceedings of the 10th International Conference on Automotive User Interfaces and Interactive Vehicular Applications; Association for Computing Machinery: New York, NY, USA, 2018; pp. 77–81. [Google Scholar] [CrossRef]
- Koelstra, S.; Muhl, C.; Soleymani, M.; Lee, J.; Yazdani, A.; Ebrahimi, T.; Pun, T.; Nijholt, A.; Patras, I. DEAP: A Database for Emotion Analysis; Using Physiological Signals. IEEE Trans. Affect. Comput. 2012, 3, 18–31. [Google Scholar] [CrossRef] [Green Version]
- Braun, M.; Schubert, J.; Pfleging, B.; Alt, F. Improving Driver Emotions with Affective Strategies. Multimodal Technol. Interact. 2019, 3, 21. [Google Scholar] [CrossRef] [Green Version]
- Ceccacci, S.; Generosi, A.; Giraldi, L.; Mengoni, M. Tool to Make Shopping Experience Responsive to Customer Emotions. Int. J. Autom. Technol. 2018, 12, 319–326. [Google Scholar] [CrossRef]
- Generosi, A.; Altieri, A.; Ceccacci, S.; Foresi, G.; Talipu, A.; Turri, G.; Mengoni, M.; Giraldi, L. MoBeTrack: A Toolkit to Analyze User Experience of Mobile Apps in the Wild. In Proceedings of the 2019 IEEE International Conference on Consumer Electronics (ICCE), Las Vegas, NV, USA, 11–13 January 2019; pp. 1–2. [Google Scholar] [CrossRef]
- Generosi, A.; Ceccacci, S.; Mengoni, M. A Deep Learning-based System to Track and Analyze Customer Behavior in Retail Store. In Proceedings of the 2018 IEEE 8th International Conference on Consumer Electronics—Berlin (ICCE-Berlin), Berlin, Germany, 2–5 September 2018; pp. 1–6. [Google Scholar] [CrossRef]
- EmotioNet Database. Available online: http://cbcsl.ece.ohio-state.edu/enc-2020/ (accessed on 24 September 2020).
- Nasoz, F.; Lisetti, C.L.; Vasilakos, A.V. Affectively Intelligent and Adaptive Car Interfaces. Inf. Sci. 2010, 180, 3817–3836. [Google Scholar] [CrossRef]
- Katsis, C.D.; Katertsidis, N.; Ganiatsas, G.; Fotiadis, D.I. Toward Emotion Recognition in Car-Racing Drivers: A Biosignal Processing Approach. IEEE Trans. Syst. Man, Cybern. Part A Syst. Humans 2008, 38, 502–512. [Google Scholar] [CrossRef]
- Jones, C.M.; Jonsson, I.M. Performance Analysis of Acoustic Emotion Recognition for In-Car Conversational Interfaces. In Universal Access in Human-Computer Interaction. Ambient Interaction; Stephanidis, C., Ed.; Springer: Berlin/Heidelberg, Germany, 2007; pp. 411–420. [Google Scholar] [CrossRef] [Green Version]
- Masola, A.; Gabbi, C.; Castellano, A.; Capodieci, N.; Burgio, P. Graphic Interfaces in ADAS: From Requirements to Implementation. In Proceedings of the 6th EAI International Conference on Smart Objects and Technologies for Social Good; Association for Computing Machinery: New York, NY, USA, 2020; pp. 193–198. [Google Scholar] [CrossRef]
- Mehler, B.; Reimer, B.; Zec, M. Defining Workload in the Context of Driver State Detection and HMI Evaluation. In Proceedings of the 4th International Conference on Automotive User Interfaces and Interactive Vehicular Applications; Association for Computing Machinery: New York, NY, USA, 2012; pp. 187–191. [Google Scholar] [CrossRef]
- Manawadu, U.E.; Kamezaki, M.; Ishikawa, M.; Kawano, T.; Sugano, S. A Multimodal Human-Machine Interface Enabling Situation-Adaptive Control Inputs for Highly Automated Vehicles. In Proceedings of the 2017 IEEE Intelligent Vehicles Symposium (IV), IEEE, Los Angeles, CA, USA, 11–14 June 2017; pp. 1195–1200. [Google Scholar] [CrossRef]
- Mone, G. Sensing Emotions. Commun. ACM 2015, 58, 15–16. [Google Scholar] [CrossRef]
- Shu, J.; Chiu, M.; Hui, P. Emotion Sensing for Mobile Computing. IEEE Commun. Mag. 2019, 57, 84–90. [Google Scholar] [CrossRef]
- Du, N.; Pulver, E.; Robert, L.; Pradhan, A.; Yang, X.J.; Du, N.; Kim, J.; Zhou, F.; Tilbury, D.; Jessie, X. Evaluating Effects of Cognitive Load, Takeover Request Lead Time, and Traffic Density on Drivers’ Takeover Performance in Conditionally Automated Driving. In Proceedings of the 12th International Conference on Automotive User Interfaces and Interactive Vehicular Applications AutomotiveUI ’20, Washington, DC, USA, 21–22 September 2020; Association for Computing Machinery: New York, NY, USA, 2020; pp. 66–73. [Google Scholar] [CrossRef]
- Chan, M.; Singhal, A. Emotion Matters: Implications for Distracted Driving. Saf. Sci. 2015, 72, 302–309. [Google Scholar] [CrossRef]
- Christoffersen, K.; Woods, D.D. How to Make Automated Systems Team Players. In Advances in Human Performance and Cognitive Engineering Research; Emerald: Bingley, UK, 2002; Volume 2, pp. 1–12. [Google Scholar] [CrossRef]
- Allen, J.; Ferguson, G. Human-Machine Collaborative Planning. In Proceedings of the Third International NASA Workshop on Planning and Scheduling for Space, Houston, TX, USA, 27–29 October 2002; pp. 27–29. Available online: https://www.cs.rochester.edu/research/cisd/pubs/2002/allen-ferguson-nasa2002.pdf (accessed on 11 November 2020).
- Bradshaw, J.M.; Feltovich, P.J.; Jung, H.; Kulkarni, S.; Taysom, W.; Uszok, A. Dimensions of Adjustable Autonomy and Mixed-Initiative Interaction. In Agents and Computational Autonomy; Nickles, M., Rovatsos, M., Weiss, G., Eds.; Springer: Berlin/Heidelberg, Germany, 2004; pp. 17–39. [Google Scholar] [CrossRef]
- Bradshaw, J.M. Making Agents Acceptable to People. In Multi-Agent Systems and Applications III; Mařík, V., Pěchouček, M., Müller, J., Eds.; Springer: Berlin/Heidelberg, Germany, 2003; pp. 1–3. [Google Scholar] [CrossRef] [Green Version]
- Klein, G. The Power of Intuition; Crown: New York, NY, USA, 2004. [Google Scholar]
- Moujahid, A.; ElAraki Tantaoui, M.; Hina, M.D.; Soukane, A.; Ortalda, A.; ElKhadimi, A.; Ramdane-Cherif, A. Machine Learning Techniques in ADAS: A Review. In Proceedings of the 2018 International Conference on Advances in Computing and Communication Engineering (ICACCE), Paris, France, 22–23 June 2018; pp. 235–242. [Google Scholar] [CrossRef]
- Tesla Autopilot. Available online: https://www.tesla.com/autopilot (accessed on 1 October 2020).
- Dolgov, D. Google I/O Recap: Turning Self-Driving Cars from Science Fiction into Reality With the Help of AI. Technical Report, Waymo Team. 2018. Available online: https://medium.com/waymo/google-i-o-recap-turning-self-driving-cars-from-science-fiction-into-reality-with-the-help-of-ai-89dded40c63 (accessed on 11 November 2020).
- Uber Told Self-Drive Cars Unsafe Days before Accident. Available online: https://www.bbc.com/news/technology-46552604 (accessed on 1 October 2020).
- International Organization for Standardization (ISO). ISO/PAS 21448:2019—Road Vehicles—Safety of the Intended Functionality. Available online: https://www.iso.org/standard/70939.html (accessed on 25 September 2020).
- Falcini, F.; Lami, G.; Costanza, A.M. Deep Learning in Automotive Software. IEEE Softw. 2017, 34, 56–63. [Google Scholar] [CrossRef]
- Gharib, M.; Lollini, P.; Botta, M.; Amparore, E.; Donatelli, S.; Bondavalli, A. On the Safety of Automotive Systems Incorporating Machine Learning Based Components: A Position Paper. In Proceedings of the 2018 48th Annual IEEE/IFIP International Conference on Dependable Systems and Networks Workshops (DSN-W), Luxembourg, 25–28 June 2018; pp. 271–274. [Google Scholar] [CrossRef]
- Schumann, J.; Gupta, P.; Liu, Y. Application of Neural Networks in High Assurance Systems: A Survey. In Applications of Neural Networks in High Assurance Systems; Springer: Berlin/Heidelberg, Germany, 2010; pp. 1–19. [Google Scholar] [CrossRef]
- Guidotti, R.; Monreale, A.; Ruggieri, S.; Turini, F.; Giannotti, F.; Pedreschi, D. A Survey of Methods for Explaining Black Box Models. ACM Comput. Surv. 2018, 51. [Google Scholar] [CrossRef] [Green Version]
- Saif, I.; Ammanath, B. “Trustworthy AI” Is a Framework to Help Manage Unique Risk. Technical Report, MIT Technology Review. 2020. Available online: www.technologyreview.com/2020/03/25/950291/trustworthy-ai-is-a-framework-to-help-manage-unique-risk/ (accessed on 12 November 2020).
- Torres-Huitzil, C.; Girau, B. Fault and Error Tolerance in Neural Networks: A Review. IEEE Access 2017, 5, 17322–17341. [Google Scholar] [CrossRef]
- Dong, X.; Yu, Z.; Cao, W.; Shi, Y.; Ma, Q. A Survey on Ensemble Learning. Front. Comput. Sci. 2020, 14, 241–258. [Google Scholar] [CrossRef]
- Lu, S.; Yao, Y.; Shi, W. Collaborative Learning on the Edges: A Case Study on Connected Vehicles. In Proceedings of the 2nd {USENIX} Workshop on Hot Topics in Edge Computing (HotEdge 19), Renton, WA, USA, 9 July 2019; USENIX Association: Renton, WA, USA, 2019. Available online: https://www.usenix.org/conference/hotedge19/presentation/lu (accessed on 11 November 2020).
- Chen, Q.; Ma, X.; Tang, S.; Guo, J.; Yang, Q.; Fu, S. F-Cooper: Feature Based Cooperative Perception for Autonomous Vehicle Edge Computing System Using 3D Point Clouds. In Proceedings of the 4th ACM/IEEE Symposium on Edge Computing, Arlington, VA, USA, 7–9 November 2019; Association for Computing Machinery: New York, NY, USA, 2019; pp. 88–100. [Google Scholar] [CrossRef]
- Shi, Y.; Yang, K.; Jiang, T.; Zhang, J.; Letaief, K.B. Communication-Efficient Edge AI: Algorithms and Systems. IEEE Commun. Surv. Tutorials 2020. [Google Scholar] [CrossRef]
- Kairouz, P.; McMahan, H.B.; Avent, B.; Bellet, A.; Bennis, M.; Bhagoji, A.N.; Bonawitz, K.; Charles, Z.; Cormode, G.; Cummings, R.; et al. Advances and Open Problems in Federated Learning. arXiv 2019, arXiv:1912.04977. [Google Scholar]
- Belli, L.; Cirani, S.; Davoli, L.; Melegari, L.; Mónton, M.; Picone, M. An Open-Source Cloud Architecture for Big Stream IoT Applications. In Interoperability and Open-Source Solutions for the Internet of Things: International Workshop, FP7 OpenIoT Project, Held in Conjunction with SoftCOM 2014, Split, Croatia, 17–19 September 2014; Podnar Žarko, I., Pripužić, K., Serrano, M., Eds.; Invited Papers; Springer International Publishing: Berlin/Heidelberg, Germany, 2015; pp. 73–88. [Google Scholar] [CrossRef]
- Welch, K.C.; Harnett, C.; Lee, Y.C. A Review on Measuring Affect with Practical Sensors to Monitor Driver Behavior. Safety 2019, 5, 72. [Google Scholar] [CrossRef] [Green Version]
- Van Schagen, I.; Sagberg, F. The Potential Benefits of Naturalistic Driving for Road Safety Research: Theoretical and Empirical Considerations and Challenges for the Future. Procedia Soc. Behav. Sci. 2012, 48, 692–701. [Google Scholar] [CrossRef] [Green Version]
- What Is Naturalistic Driving. Available online: http://www.udrive.eu/index.php/about-udrive/what-is-naturalistic-driving (accessed on 5 October 2020).
- The 100 Car Naturalistic Driving Study. Available online: https://www.nhtsa.gov/sites/nhtsa.dot.gov/files/100carphase1report.pdf (accessed on 5 October 2020).
- Differences and Similarities in Driver INTERACTION with in-Vehicle Technologies. Available online: http://cordis.europa.eu/project/rcn/90097_en.html (accessed on 5 October 2020).
- Prologue. Available online: https://prologue.kfv.at/ (accessed on 5 October 2020).
- Naturalistic Driving Observations within ERSO. Available online: https://tinyurl.com/dacota-eu (accessed on 5 October 2020).
- 2-Wheeler Behaviour and Safety. Available online: https://cordis.europa.eu/project/id/218703 (accessed on 5 October 2020).
- Waheed, O.T.; Elfadel, I.A.M. Domain-Specific Architecture for IMU Array Data Fusion. In Proceedings of the 2019 IFIP/IEEE 27th International Conference on Very Large Scale Integration (VLSI-SoC), Cuzco, Peru, 6–9 October 2019; pp. 129–134. [Google Scholar] [CrossRef]
- Mezentsev, O.; Collin, J. Design and Performance of Wheel-mounted MEMS IMU for Vehicular Navigation. In Proceedings of the 2019 IEEE International Symposium on Inertial Sensors and Systems (INERTIAL), Naples, FL, USA, 1–4 April 2019; pp. 1–4. [Google Scholar] [CrossRef]
- Konieczka, A.; Michalowicz, E.; Piniarski, K. Infrared Thermal Camera-based System for Tram Drivers Warning About Hazardous Situations. In Proceedings of the 2018 Signal Processing: Algorithms, Architectures, Arrangements, and Applications (SPA), Poznan, Poland, 19–21 September 2018; pp. 250–254. [Google Scholar] [CrossRef]
- Kiashari, S.E.H.; Nahvi, A.; Bakhoda, H.; Homayounfard, A.; Tashakori, M. Evaluation of driver drowsiness using respiration analysis by thermal imaging on a driving simulator. Multimed. Tools Appl. 2020, 79, 17793–17815. [Google Scholar] [CrossRef]
- Cardone, D.; Perpetuini, D.; Filippini, C.; Spadolini, E.; Mancini, L.; Chiarelli, A.M.; Merla, A. Driver Stress State Evaluation by Means of Thermal Imaging: A Supervised Machine Learning Approach Based on ECG Signal. Appl. Sci. 2020, 10, 5673. [Google Scholar] [CrossRef]
- Kashevnik, A.; Kruglov, M.; Lashkov, I.; Teslya, N.; Mikhailova, P.; Ripachev, E.; Malutin, V.; Saveliev, N.; Ryabchikov, I. Human Psychophysiological Activity Estimation Based on Smartphone Camera and Wearable Electronics. Future Internet 2020, 12, 111. [Google Scholar] [CrossRef]
- Lashkov, I.; Kashevnik, A. Smartphone-Based Intelligent Driver Assistant: Context Model and Dangerous State Recognition Scheme. In Intelligent Systems and Applications; Bi, Y., Bhatia, R., Kapoor, S., Eds.; Springer International Publishing: Cham, Switzerland, 2020; Volume 1038, pp. 152–165. [Google Scholar] [CrossRef]
- Lindow, F.; Kashevnik, A. Driver Behavior Monitoring Based on Smartphone Sensor Data and Machine Learning Methods. In Proceedings of the 2019 25th Conference of Open Innovations Association (FRUCT), Helsinki, Finland, 5–8 November 2019; pp. 196–203. [Google Scholar] [CrossRef]
- Kashevnik, A.; Lashkov, I.; Ponomarev, A.; Teslya, N.; Gurtov, A. Cloud-Based Driver Monitoring System Using a Smartphone. IEEE Sens. J. 2020, 20, 6701–6715. [Google Scholar] [CrossRef]
- Weng, M.C.; Chen, C.T.; Kao, H.C. Remote Surveillance System for Driver Drowsiness in Real-Time using Low-Cost Embedded Platform. In Proceedings of the 2008 IEEE International Conference on Vehicular Electronics and Safety, Columbus, OH, USA, 22–24 September 2008; pp. 288–292. [Google Scholar] [CrossRef]
- Adib, F.; Mao, H.; Kabelac, Z.; Katabi, D.; Miller, R.C. Smart Homes That Monitor Breathing and Heart Rate. In Proceedings of the 33rd Annual ACM Conference on Human Factors in Computing Systems; Association for Computing Machinery: New York, NY, USA, 2015; pp. 837–846. [Google Scholar] [CrossRef]
- Oyini Mbouna, R.; Kong, S.G.; Chun, M.G. Visual Analysis of Eye State and Head Pose for Driver Alertness Monitoring. IEEE Trans. Intell. Transp. Syst. 2013, 14, 1462–1469. [Google Scholar] [CrossRef]
- Li, L.; Chen, Y.; Xin, L. Driver Fatigue Detection Based on Mouth Information. In Proceedings of the 2010 8th World Congress on Intelligent Control and Automation, Jinan, China, 7–9 July 2010; pp. 6058–6062. [Google Scholar] [CrossRef]
- Rongben, W.; Lie, G.; Bingliang, T.; Lisheng, J. Monitoring Mouth Movement for Driver Fatigue or Distraction with One Camera. In Proceedings of the 7th International IEEE Conference on Intelligent Transportation Systems (IEEE Cat. No.04TH8749), Washington, DC, USA, 3–6 October 2004; pp. 314–319. [Google Scholar] [CrossRef]
- Chen, L.W.; Chen, H.M. Driver Behavior Monitoring and Warning With Dangerous Driving Detection Based on the Internet of Vehicles. IEEE Trans. Intell. Transp. Syst. 2020, 1–10. [Google Scholar] [CrossRef]
- Smirnov, A.; Kashevnik, A.; Lashkov, I. Human-Smartphone Interaction for Dangerous Situation Detection and Recommendation Generation While Driving. In Speech and Computer; Ronzhin, A., Potapova, R., Németh, G., Eds.; Springer International Publishing: Cham, Switzerland, 2016; pp. 346–353. [Google Scholar] [CrossRef]
- Vicente, F.; Huang, Z.; Xiong, X.; De la Torre, F.; Zhang, W.; Levi, D. Driver Gaze Tracking and Eyes Off the Road Detection System. IEEE Trans. Intell. Transp. Syst. 2015, 16, 2014–2027. [Google Scholar] [CrossRef]
- Ohn-Bar, E.; Tawari, A.; Martin, S.; Trivedi, M.M. On Surveillance for Safety Critical Events: In-Vehicle Video Networks for Predictive Driver Assistance Systems. Comput. Vis. Image Underst. 2015, 134, 130–140. [Google Scholar] [CrossRef]
- Kashevnik, A.; Fedotov, A.; Lashkov, I. Dangerous Situation Prediction and Driving Statistics Accumulation Using Smartphone. In Proceedings of the 2018 International Conference on Intelligent Systems (IS), Funchal, Portugal, 25–27 September 2018; pp. 521–527. [Google Scholar] [CrossRef]
- Yang, L.; Dong, K.; Dmitruk, A.J.; Brighton, J.; Zhao, Y. A Dual-Cameras-Based Driver Gaze Mapping System With an Application on Non-Driving Activities Monitoring. IEEE Trans. Intell. Transp. Syst. 2019, 1–10. [Google Scholar] [CrossRef] [Green Version]
- Warren, J.; Lipkowitz, J.; Sokolov, V. Clusters of Driving Behavior From Observational Smartphone Data. IEEE Intell. Transp. Syst. Mag. 2019, 11, 171–180. [Google Scholar] [CrossRef]
- Park, S.; Jayaraman, S. Enhancing the Quality of Life through Wearable Technology. IEEE Eng. Med. Biol. Mag. 2003, 22, 41–48. [Google Scholar] [CrossRef]
- AlGhatrif, M.; Lindsay, J. A Brief Review: History to Understand Fundamentals of Electrocardiography. J. Community Hosp. Intern. Med. Perspect. 2012, 2. [Google Scholar] [CrossRef] [PubMed] [Green Version]
- Seitsonen, E.R.J.; Korhonen, I.K.J.; Van Gils, M.J.; Huiku, M.; Lotjonen, J.M.P.; Korttila, K.T.; Yli-Hankala, A.M. EEG Spectral Entropy, Heart Rate, Photoplethysmography and Motor Responses to Skin Incision during Sevoflurane Anaesthesia. Acta Anaesthesiol. Scand. 2005, 49, 284–292. [Google Scholar] [CrossRef] [PubMed]
- Li, M.; Narayanan, S. Robust ECG Biometrics by Fusing Temporal and Cepstral Information. In Proceedings of the 2010 20th International Conference on Pattern Recognition, Istanbul, Turkey, 23–26 August 2010; pp. 1326–1329. [Google Scholar] [CrossRef]
- Abo-Zahhad, M.; Ahmed, S.M.; Abbas, S.N. Biometric Authentication Based on PCG and ECG Signals: Present Status and Future Directions. Signal Image Video Process. 2014, 8, 739–751. [Google Scholar] [CrossRef]
- Samarin, N.; Sannella, D. A Key to Your Heart: Biometric Authentication Based on ECG Signals. arXiv 2019, arXiv:1906.09181. [Google Scholar]
- Lourenço, A.; Alves, A.P.; Carreiras, C.; Duarte, R.P.; Fred, A. CardioWheel: ECG Biometrics on the Steering Wheel. In Machine Learning and Knowledge Discovery in Databases; Bifet, A., May, M., Zadrozny, B., Gavalda, R., Pedreschi, D., Bonchi, F., Cardoso, J., Spiliopoulou, M., Eds.; Springer International Publishing: Cham, Switzerland, 2015; pp. 267–270. [Google Scholar] [CrossRef]
- Hansen, J.H.L.; Busso, C.; Zheng, Y.; Sathyanarayana, A. Driver Modeling for Detection and Assessment of Driver Distraction: Examples from the UTDrive Test Bed. IEEE Signal Process. Mag. 2017, 34, 130–142. [Google Scholar] [CrossRef]
- Cassani, R.; Falk, T.H.; Horai, A.; Gheorghe, L.A. Evaluating the Measurement of Driver Heart and Breathing Rates from a Sensor-Equipped Steering Wheel using Spectrotemporal Signal Processing. In Proceedings of the 2019 IEEE Intelligent Transportation Systems Conference (ITSC), Auckland, New Zealand, 27–30 October 2019; pp. 2843–2847. [Google Scholar] [CrossRef]
- Eilebrecht, B.; Wolter, S.; Lem, J.; Lindner, H.; Vogt, R.; Walter, M.; Leonhardt, S. The Relevance of HRV Parameters for Driver Workload Detection in Real World Driving. In Proceedings of the 2012 Computing in Cardiology, Krakow, Poland, 9–12 September 2012; pp. 409–412. [Google Scholar]
- Lanatà, A.; Valenza, G.; Greco, A.; Gentili, C.; Bartolozzi, R.; Bucchi, F.; Frendo, F.; Scilingo, E.P. How the Autonomic Nervous System and Driving Style Change With Incremental Stressing Conditions During Simulated Driving. IEEE Trans. Intell. Transp. Syst. 2015, 16, 1505–1517. [Google Scholar] [CrossRef]
- Vicente, J.; Laguna, P.; Bartra, A.; Bailón, R. Drowsiness Detection Using Heart Rate Variability. Med. Biol. Eng. Comput. 2016, 54, 927–937. [Google Scholar] [CrossRef]
- Muhlbacher-Karrer, S.; Mosa, A.H.; Faller, L.M.; Ali, M.; Hamid, R.; Zangl, H.; Kyamakya, K. A Driver State Detection System—Combining a Capacitive Hand Detection Sensor With Physiological Sensors. IEEE Trans. Instrum. Meas. 2017, 66, 624–636. [Google Scholar] [CrossRef]
- Muhlbacher-Karrer, S.; Faller, L.; Hamid, R.; Zangl, H. A Wireless Steering Wheel Gripping Sensor for Hands On/Off Detection. In Proceedings of the 2016 IEEE Sensors Applications Symposium (SAS), Catania, Italy, 20–22 April 2016; pp. 1–5. [Google Scholar] [CrossRef]
- Lozoya-Santos, J.d.J.; Sepúlveda-Arróniz, V.; Tudon-Martinez, J.C.; Ramirez-Mendoza, R.A. Survey on Biometry for Cognitive Automotive Systems. Cogn. Syst. Res. 2019, 55, 175–191. [Google Scholar] [CrossRef]
- Essers, S.; Lisseman, J.; Ruck, H. Steering Wheel for Active Driver State Detection. Auto Tech Rev. 2016, 5, 36–41. [Google Scholar] [CrossRef]
- Pinto, J.; Cardoso, J.; Lourenço, A.; Carreiras, C. Towards a Continuous Biometric System Based on ECG Signals Acquired on the Steering Wheel. Sensors 2017, 17, 2228. [Google Scholar] [CrossRef] [PubMed] [Green Version]
- Coughlin, J.F.; Reimer, B.; Mehler, B. Monitoring, Managing, and Motivating Driver Safety and Well-being. IEEE Pervasive Comput. 2011, 10, 14–21. [Google Scholar] [CrossRef]
- Bakker, J.; Pechenizkiy, M.; Sidorova, N. What’s Your Current Stress Level? Detection of Stress Patterns from GSR Sensor Data. In Proceedings of the 2011 IEEE 11th International Conference on Data Mining Workshops, Vancouver, BC, Canada, 11–14 December 2011; pp. 573–580. [Google Scholar] [CrossRef]
- Madrid, J.M.; Arce-Lopera, C.A.; Lasso, F. Biometric Interface for Driver’s Stress Detection and Awareness. In Adjunct Proceedings of the 10th International Conference on Automotive User Interfaces and Interactive Vehicular Applications; Association for Computing Machinery: New York, NY, USA, 2018; pp. 132–136. [Google Scholar] [CrossRef]
- Praveen, J. Biometrics wave poised to transform future driving. Biom. Technol. Today 2017, 2017, 5–8. [Google Scholar] [CrossRef]
- Embrace Seizure Characterization Clinical Trial. Available online: https://www.empatica.com/embrace-watch-epilepsy-monitor (accessed on 5 October 2020).
- Sahayadhas, A.; Sundaraj, K.; Murugappan, M. Detecting Driver Drowsiness Based on Sensors: A Review. Sensors 2012, 12, 16937–16953. [Google Scholar] [CrossRef] [PubMed] [Green Version]
- Jansen, R.J.; van der Kint, S.T.; Hermens, F. Does Agreement Mean Accuracy? Evaluating Glance Annotation in Naturalistic Driving Data. Behav. Res. Methods 2020. [Google Scholar] [CrossRef] [PubMed]
- Kawanaka, H.; Miyaji, M.; Bhuiyan, M.S.; Oguri, K. Identification of Cognitive Distraction Using Physiological Features for Adaptive Driving Safety Supporting System. Int. J. Veh. Technol. 2013, 2013. [Google Scholar] [CrossRef]
- NextPerception—Next Generation Smart Perception Sensors and Distributed Intelligence for Proactive Human Monitoring in Health, Wellbeing, and Automotive Systems. Available online: https://www.nextperception.eu/ (accessed on 1 October 2020).
- Electronic Components and Systems for European Leadership (ECSEL) Joint Undertaking. Available online: https://www.ecsel.eu/ (accessed on 4 October 2020).
- Miller, D.; Sun, A.; Johns, M.; Ive, H.; Sirkin, D.; Aich, S.; Ju, W. Distraction Becomes Engagement in Automated Driving. Proc. Hum. Factors Ergon. Soc. Annu. Meet. 2015, 59, 1676–1680. [Google Scholar] [CrossRef] [Green Version]
- Jeon, M. Chapter 1—Emotions and Affect in Human Factors and Human–Computer Interaction: Taxonomy, Theories, Approaches, and Methods. In Emotions and Affect in Human Factors and Human-Computer Interaction; Jeon, M., Ed.; Academic Press: San Diego, CA, USA, 2017; pp. 3–26. [Google Scholar] [CrossRef]
- Li, J.; Braun, M.; Butz, A.; Alt, F. Designing Emotion-Aware in-Car Interactions for Unlike Markets. In Proceedings of the 11th International Conference on Automotive User Interfaces and Interactive Vehicular Applications: Adjunct Proceedings; Association for Computing Machinery: New York, NY, USA, 2019; pp. 352–357. [Google Scholar] [CrossRef]
- Liggins II, M.; Hall, D.; Llinas, J. Handbook of Multisensor Data Fusion: Theory and Practice; CRC Press: Boca Raton, FL, USA, 2017; p. 870. [Google Scholar]
- SCANeR Studio Platform. Available online: https://www.avsimulation.com/scanerstudio/ (accessed on 4 October 2020).
- Lucey, P.; Cohn, J.F.; Kanade, T.; Saragih, J.; Ambadar, Z.; Matthews, I. The Extended Cohn-Kanade Dataset (CK+): A complete dataset for action unit and emotion-specified expression. In Proceedings of the 2010 IEEE Computer Society Conference on Computer Vision and Pattern Recognition—Workshops, San Francisco, CA, USA, 13–18 June 2010; pp. 94–101. [Google Scholar] [CrossRef] [Green Version]
- Barsoum, E.; Zhang, C.; Ferrer, C.C.; Zhang, Z. Training Deep Networks for Facial Expression Recognition with Crowd-Sourced Label Distribution. In Proceedings of the 18th ACM International Conference on Multimodal Interaction; Association for Computing Machinery: New York, NY, USA, 2016; pp. 279–283. [Google Scholar] [CrossRef] [Green Version]
- Mollahosseini, A.; Hasani, B.; Mahoor, M.H. AffectNet: A Database for Facial Expression, Valence, and Arousal Computing in the Wild. IEEE Trans. Affect. Comput. 2019, 10, 18–31. [Google Scholar] [CrossRef] [Green Version]
- Maryam Fakhrhosseini, S.; Jeon, M. Chapter 10—Affect/Emotion Induction Methods. In Emotions and Affect in Human Factors and Human-Computer Interaction; Jeon, M., Ed.; Academic Press: San Diego, CA, USA, 2017; pp. 235–253. [Google Scholar] [CrossRef]
- Wickens, C.D. Processing Resources and Attention. Mult. Task Perform. 1991, 1991, 3–34. Available online: https://apps.dtic.mil/dtic/tr/fulltext/u2/a102719.pdf (accessed on 10 November 2020).
- Boer, E.R. Car following from the driver’s perspective. Transp. Res. Part F Traffic Psychol. Behav. 1999, 2, 201–206. [Google Scholar] [CrossRef]
- Fleming, J.M.; Allison, C.K.; Yan, X.; Lot, R.; Stanton, N.A. Adaptive Driver Modelling in ADAS to Improve User Acceptance: A Study Using Naturalistic Data. Saf. Sci. 2019, 119, 76–83. [Google Scholar] [CrossRef] [Green Version]
- Ziebinski, A.; Cupek, R.; Erdogan, H.; Waechter, S. A Survey of ADAS Technologies for the Future Perspective of Sensor Fusion. In Computational Collective Intelligence; Nguyen, N.T., Iliadis, L., Manolopoulos, Y., Trawiński, B., Eds.; Springer International Publishing: Cham, Switzerland, 2016; pp. 135–146. [Google Scholar] [CrossRef]