Open Access Article

Multimodal Approach for Emotion Recognition Based on Simulated Flight Experiments

Instituto Universitário de Lisboa (ISCTE-IUL) and Instituto de Telecomunicações (IT-IUL), Av. das Forças Armadas, 1649-026 Lisbon, Portugal
Universidade Federal do Rio Grande do Norte (UFRN), Av. Sen. Salgado Filho, 3000, Candelária, Natal, RN 59064-741, Brazil
Author to whom correspondence should be addressed.
Sensors 2019, 19(24), 5516;
Received: 18 October 2019 / Revised: 8 December 2019 / Accepted: 9 December 2019 / Published: 13 December 2019
The present work addresses part of the gap regarding pilots’ emotions and their bio-reactions during flight procedures such as takeoff, climb, cruise, descent, initial approach, final approach and landing. A sensing architecture and a set of experiments were developed across several simulated flights (N_flights = 13) using the Microsoft Flight Simulator Steam Edition (FSX-SE). The experiments were carried out with eight users who were beginners on the flight simulator (N_pilots = 8). It is shown that it is possible to recognize the emotions of different pilots in flight by combining their present and previous emotions. Heart Rate (HR), Galvanic Skin Response (GSR) and Electroencephalography (EEG) were used to extract emotions, as were the intensities of the emotions detected from the pilot’s face. Five main emotions were considered: happy, sad, angry, surprised and scared. The emotion recognition is based on Artificial Neural Networks and Deep Learning techniques. The Root Mean Squared Error (RMSE) and Mean Absolute Error (MAE) were the main metrics used to measure the quality of the regression output models. Tests of the produced output models showed that the lowest recognition errors were reached when all data were considered or when the GSR datasets were omitted from model training. They also showed that the emotion surprised was the easiest to recognize, with a mean RMSE of 0.13 and a mean MAE of 0.01, while the emotion sad was the hardest, with a mean RMSE of 0.82 and a mean MAE of 0.08. When only the highest emotion intensities over time were considered, most matching accuracies fell between 55% and 100%.
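The RMSE and MAE metrics mentioned above can be stated compactly: RMSE is the square root of the mean squared difference between true and predicted values, and MAE is the mean absolute difference. The sketch below illustrates both on hypothetical emotion-intensity values; the numbers are invented for illustration and are not taken from the paper's data.

```python
import math

def rmse(y_true, y_pred):
    # Root Mean Squared Error: penalizes large deviations quadratically
    return math.sqrt(sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true))

def mae(y_true, y_pred):
    # Mean Absolute Error: average magnitude of the deviations
    return sum(abs(t - p) for t, p in zip(y_true, y_pred)) / len(y_true)

# Hypothetical per-sample emotion intensities in [0, 1] (not from the paper)
true_intensity = [0.2, 0.5, 0.9, 0.1]
pred_intensity = [0.25, 0.4, 0.8, 0.15]

print(rmse(true_intensity, pred_intensity))
print(mae(true_intensity, pred_intensity))
```

Because RMSE squares the errors before averaging, it weighs occasional large mistakes more heavily than MAE does, which is why the two metrics can rank models differently.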
Keywords: emotion recognition; physiological sensing; multimodal sensing; deep learning; flight simulation

César Cavalcanti Roza, V.; Adrian Postolache, O. Multimodal Approach for Emotion Recognition Based on Simulated Flight Experiments. Sensors 2019, 19, 5516.
