Open Access Article

Deep Learning-Based Drivers Emotion Classification System in Time Series Data for Remote Applications

1. Department of Unmanned Vehicle Engineering, Sejong University, 209, Neungdong-ro, Gwangjin-gu, Seoul 05006, Korea
2. Division of Electronics and Electrical Engineering, Dongguk University, 30 Pildong-ro 1-gil, Jung-gu, Seoul 100-715, Korea
3. Department of Computer Science and Engineering, Kyungpook National University, Daegu 41566, Korea
4. College of Internet of Things Engineering, Hohai University, Changzhou 213022, China
5. Department of Software, Gachon University, 1342 Seongnamdaero, Sujeong-gu, Seongnam, Gyeonggi-do 13120, Korea
* Author to whom correspondence should be addressed.
Remote Sens. 2020, 12(3), 587; https://doi.org/10.3390/rs12030587
Received: 2 January 2020 / Revised: 6 February 2020 / Accepted: 6 February 2020 / Published: 10 February 2020
Aggressive driving is one of the major causes of traffic accidents worldwide. Real-time classification of normal and abnormal driving from time series data is therefore key to avoiding road accidents. Existing work on classifying driving behavior from time series data has limitations and causes discomfort for users, which need to be addressed. To deal with these issues, we propose a multimodal method to remotely detect driver aggressiveness. The proposed method is based on changes in the driver's gaze and facial emotions while driving, captured using near-infrared (NIR) camera sensors and an illuminator installed in the vehicle. Aggressive and normal driving time series data were collected on a driving-game simulator while participants played car-racing and truck-driving computer games, respectively. The Dlib library is used to extract the face and the left- and right-eye regions from the driver's images, from which changes in gaze are detected using a convolutional neural network (CNN). Similarly, CNN-based facial emotions are obtained from the lip, left-eye, and right-eye images extracted with Dlib. Finally, score-level fusion is applied to the scores obtained from the change in gaze and the facial emotions to classify aggressive and normal driving. The accuracy of the proposed method was measured in experiments on a self-constructed large-scale testing database; the classification accuracy for aggressive and normal driving based on the driver's change in gaze and facial emotions is high, and the performance is superior to that of previous methods.
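The final classification step combines the outputs of the two CNN branches by score-level fusion. The paper's exact fusion rule, weights, and decision threshold are not given in this abstract, so the sketch below assumes a simple weighted sum of per-window average scores; all function names, weights, and the threshold are illustrative assumptions.

```python
# Minimal sketch of score-level fusion for aggressive/normal driving
# classification. The weights and threshold below are assumptions, not the
# values used in the paper.

def window_average(frame_scores):
    """Average per-frame CNN scores over a time-series window."""
    return sum(frame_scores) / len(frame_scores)

def fuse_scores(gaze_scores, emotion_scores,
                w_gaze=0.5, w_emotion=0.5, threshold=0.5):
    """Weighted-sum score-level fusion of the two CNN branches.

    gaze_scores / emotion_scores: per-frame probabilities of aggressive
    driving from the change-in-gaze CNN and the facial-emotion CNN.
    """
    fused = (w_gaze * window_average(gaze_scores)
             + w_emotion * window_average(emotion_scores))
    return "aggressive" if fused >= threshold else "normal"

print(fuse_scores([0.9, 0.8, 0.95], [0.7, 0.85, 0.9]))  # aggressive
print(fuse_scores([0.1, 0.2, 0.15], [0.05, 0.1, 0.2]))  # normal
```

Equal weights treat both modalities as equally reliable; in practice the weights would be tuned on a validation split of the time series data.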
Keywords: emotions sensing; aggressive driving; normal driving; time series data; change in gaze; facial emotions; gaze tracking; deep learning
Graphical abstract


Naqvi, R.A.; Arsalan, M.; Rehman, A.; Rehman, A.U.; Loh, W.-K.; Paul, A. Deep Learning-Based Drivers Emotion Classification System in Time Series Data for Remote Applications. Remote Sens. 2020, 12, 587.
