Editorial

Ubiquitous Technologies for Emotion Recognition

by Oresti Banos 1,*,†,‡, Luis A. Castro 2,‡ and Claudia Villalonga 1,‡
1 Department of Computer Architecture and Technology, CITIC, University of Granada, 18014 Granada, Spain
2 Department of Computing and Design, Sonora Institute of Technology (ITSON), Ciudad Obregon 85130, Mexico
* Author to whom correspondence should be addressed.
† Current address: C/Periodista Rafael Gómez Montero 2, 18071 Granada, Spain.
‡ These authors contributed equally to this work.
Appl. Sci. 2021, 11(15), 7019; https://doi.org/10.3390/app11157019
Submission received: 21 July 2021 / Accepted: 27 July 2021 / Published: 29 July 2021
(This article belongs to the Special Issue Ubiquitous Technologies for Emotion Recognition)

Abstract

Emotions play a very important role in how we think and behave. The emotions we feel every day can compel us to act and can influence the decisions and plans we make about our lives. Being able to measure, analyze, and better comprehend how or why our emotions change is thus highly relevant to understanding human behavior and its consequences. Despite the great efforts made in the past to study human emotions, it is only now, with the advent of wearable, mobile, and ubiquitous technologies, that we can aim at sensing and recognizing emotions continuously and in the wild. This Special Issue brings together the latest experiences, findings, and developments regarding the ubiquitous sensing, modeling, and recognition of human emotions.

There is increased interest in studying facial expressions and how they relate to emotions. Lately, some of this research has aimed at enhancing in-vehicle interactions for drivers, mainly focusing on road safety concerns such as driver drowsiness, distracted driving, and driver stress. In [1], the authors propose a novel contactless method for detecting stress in drivers, based on features extracted from thermal infrared imaging obtained in a controlled environment with 10 subjects. They compare their results with a stress index derived from electrocardiography (ECG) and use a binary classifier to discriminate between stressed and non-stressed drivers, achieving an AUC of 0.80, a sensitivity of 77%, and a specificity of 78%. These results are promising for identifying driver states that can lead to situations relevant to road safety. In addition to this work, the authors of [2] implement a system that monitors the driver’s facial expressions to assess health risks, such as severe pain episodes, while driving. Their approach is based on a line-segment feature analysis-convolutional recurrent neural network (LFA-CRNN), which achieves an accuracy of approximately 98.92%. This work aims at alerting the driver to health-related risks while driving.
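To make such an evaluation pipeline concrete, the following is a minimal sketch, not the method of [1]: it assumes a hypothetical matrix of thermal-imaging features and placeholder binary stress labels (as an ECG-derived stress index would provide) and reports the same metrics, AUC, sensitivity, and specificity, for a simple classifier; scikit-learn is used purely for illustration.

```python
# Minimal sketch of the evaluation described above, NOT the authors'
# actual method. X holds hypothetical thermal-imaging features; y holds
# placeholder binary stress labels (1 = stressed).
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import confusion_matrix, roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 8))       # placeholder thermal features
y = rng.integers(0, 2, size=100)    # placeholder stress labels

X_tr, X_te, y_tr, y_te = train_test_split(
    X, y, test_size=0.3, stratify=y, random_state=0)
clf = LogisticRegression().fit(X_tr, y_tr)

scores = clf.predict_proba(X_te)[:, 1]
preds = (scores >= 0.5).astype(int)
tn, fp, fn, tp = confusion_matrix(y_te, preds).ravel()

print(f"AUC:         {roc_auc_score(y_te, scores):.2f}")
print(f"Sensitivity: {tp / (tp + fn):.2f}")  # true positive rate
print(f"Specificity: {tn / (tn + fp):.2f}")  # true negative rate
```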
Also using facial expressions, the authors of [3] present a novel system for automatically detecting pain in babies through image analysis. They implement a classifier based on Support Vector Machines (SVM) using three texture descriptors for pain detection: Local Binary Patterns, Local Ternary Patterns, and Radon Barcodes. Using the Infant COPE database, they obtain an accuracy of around 95%. The approaches of [2,3] both use facial expressions to detect pain, although with different databases and features. Still, these works provide valuable results that can be applied in systems for monitoring safety and health.
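As an illustration of one of these texture pipelines, the sketch below computes Local Binary Pattern histograms and feeds them to an SVM. The face crops are synthetic placeholders, and the remaining descriptors of [3] (Local Ternary Patterns, Radon Barcodes) and the Infant COPE data are not reproduced.

```python
# Hedged sketch of an LBP-histogram + SVM pain classifier, loosely
# following the description of [3]; images and labels are synthetic.
import numpy as np
from skimage.feature import local_binary_pattern
from sklearn.svm import SVC

def lbp_histogram(gray_image, points=8, radius=1):
    """Uniform LBP codes summarized as a normalized histogram."""
    codes = local_binary_pattern(gray_image, points, radius, method="uniform")
    hist, _ = np.histogram(codes, bins=points + 2, range=(0, points + 2))
    return hist / hist.sum()

rng = np.random.default_rng(0)
images = rng.integers(0, 256, size=(40, 64, 64))  # placeholder face crops
labels = rng.integers(0, 2, size=40)              # 1 = pain, 0 = no pain

X = np.array([lbp_histogram(img) for img in images])
clf = SVC(kernel="rbf").fit(X, labels)
print("training accuracy:", clf.score(X, labels))
```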
Image-based emotion recognition is mostly based on the analysis of whole-face images. However, some approaches focus on core facial areas (eyes, nose, mouth, etc.), which normally encode most of the affective information. In [4], the authors apply multi-block deep learning techniques trained on core facial areas and demonstrate their application in an app for interview self-management. The same principle is used in [5], where the authors propose convolutional neural networks for the automatic recognition of micro-expressions in real time. Although the accuracy results are far from optimal, they appear quite promising, especially concerning the time taken to recognize an emotion (a few milliseconds).
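The multi-block idea can be sketched as a toy model; the code below is our illustration, not the architecture of [4] or [5]: separate convolutional blocks encode cropped core facial regions (here, eyes and mouth) and their features are fused for emotion classification. PyTorch and the region crop sizes are assumptions for the example.

```python
# Toy multi-block model: one small conv encoder per facial region,
# fused by concatenation. Region choices and sizes are assumptions.
import torch
import torch.nn as nn

class RegionBlock(nn.Module):
    """Small conv encoder for one facial region (e.g., eyes or mouth)."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(1, 8, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(8, 16, 3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )

    def forward(self, x):
        return self.net(x)

class MultiBlockNet(nn.Module):
    def __init__(self, num_emotions=7):
        super().__init__()
        self.eyes, self.mouth = RegionBlock(), RegionBlock()
        self.head = nn.Linear(16 * 2, num_emotions)

    def forward(self, eyes, mouth):
        # Encode each region separately, then fuse for classification.
        fused = torch.cat([self.eyes(eyes), self.mouth(mouth)], dim=1)
        return self.head(fused)

model = MultiBlockNet()
eyes = torch.randn(4, 1, 32, 64)   # batch of cropped eye regions
mouth = torch.randn(4, 1, 32, 64)  # batch of cropped mouth regions
print(model(eyes, mouth).shape)    # torch.Size([4, 7])
```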
Regular RGB cameras are most frequently used in facial expression-based emotion recognition. However, a growing body of research is devoted to using thermal infrared cameras for this very purpose. In [6], the authors review the application of thermal infrared imaging to determine affective responses ascribed to physiological modulations. Their review outlines the main advantages and challenges of thermal imaging-based affective computing, with a particular focus on its use in human–robot interaction applications.
According to the literature, the automatic recognition of emotions has largely been based on image or video data. Nonetheless, other types of data are also used to identify people’s affective states. For example, the authors of [7] analyze the audio generated during phone calls to a call center. They use speech emotion recognition to determine call urgency, giving greater priority to calls featuring emotions such as fear, anger, and sadness, and lower priority to calls featuring neutral speech and happiness. The results show a significant reduction in waiting time for calls estimated as more urgent, especially those presenting fear and anger.
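The prioritization logic can be illustrated with a simple priority queue. The emotion label of each call would come from a speech emotion recognizer, which is not implemented here, and the exact urgency ordering is our assumption based on the description above, not the scheme of [7].

```python
# Toy call queue: calls carrying fear or anger jump ahead of neutral
# or happy ones. The priority mapping is an assumption.
import heapq
import itertools

EMOTION_PRIORITY = {"fear": 0, "anger": 0, "sadness": 1,
                    "neutral": 2, "happiness": 2}  # lower = more urgent

counter = itertools.count()  # tie-breaker preserving arrival order
queue = []

def enqueue_call(call_id, emotion):
    heapq.heappush(queue, (EMOTION_PRIORITY[emotion], next(counter), call_id))

for call, emotion in [("c1", "neutral"), ("c2", "fear"),
                      ("c3", "happiness"), ("c4", "anger")]:
    enqueue_call(call, emotion)

while queue:
    _, _, call_id = heapq.heappop(queue)
    print("serving", call_id)  # c2, c4, c1, c3
```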
Electroencephalogram (EEG) signals are another data source used for emotion recognition, particularly considered for its effectiveness compared to body expressions and other physiological signals. In [8], the authors study the use of statistical techniques, namely logistic regression with Gaussian kernels and Laplacian priors, to automatically recognize a set of emotions. They also investigate the critical frequency bands for emotion recognition, concluding that the gamma and beta bands are superior for classifying emotions. The great volume of data generated by EEG approaches makes it particularly interesting to consider deep learning techniques. In [9], the authors use such techniques to help determine the attitude of consumers towards a product. They prototype their solution on a well-known emotion recognition dataset and show accuracy results comparable to those obtained via traditional machine learning approaches.
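As a rough sketch of this family of methods, not the exact algorithm of [8], the example below extracts beta- and gamma-band power features with Welch’s method and fits a logistic regression with an approximate Gaussian (RBF) kernel and an L1 penalty, the penalty corresponding to a Laplacian prior on the weights. The EEG signals, labels, and sampling rate are synthetic assumptions.

```python
# Sketch: band-power features + approximate Gaussian-kernel logistic
# regression with an L1 (Laplacian-prior) penalty. All data synthetic.
import numpy as np
from scipy.signal import welch
from sklearn.kernel_approximation import RBFSampler
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

FS = 256  # assumed sampling rate (Hz)

def band_powers(signal, bands=((13, 30), (30, 45))):  # beta, gamma
    """Mean power spectral density in each frequency band."""
    freqs, psd = welch(signal, fs=FS, nperseg=FS)
    return [psd[(freqs >= lo) & (freqs < hi)].mean() for lo, hi in bands]

rng = np.random.default_rng(0)
signals = rng.normal(size=(60, 4 * FS))  # 60 synthetic 4-second epochs
y = rng.integers(0, 2, size=60)          # placeholder emotion labels

X = np.array([band_powers(s) for s in signals])
model = make_pipeline(
    RBFSampler(gamma=1.0, random_state=0),  # Gaussian kernel approximation
    LogisticRegression(penalty="l1", solver="saga", max_iter=5000),
)
model.fit(X, y)
print("training accuracy:", model.score(X, y))
```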

Author Contributions

All authors contributed equally to the screening, review, and assessment of the submitted papers. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

Not applicable.

Acknowledgments

Thanks to all the authors who submitted their research for this Special Issue. The invaluable contribution of the international reviewers is gratefully acknowledged.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Cardone, D.; Perpetuini, D.; Filippini, C.; Spadolini, E.; Mancini, L.; Chiarelli, A.M.; Merla, A. Driver Stress State Evaluation by Means of Thermal Imaging: A Supervised Machine Learning Approach Based on ECG Signal. Appl. Sci. 2020, 10, 5673.
  2. Kim, C.M.; Hong, E.J.; Chung, K.; Park, R.C. Driver Facial Expression Analysis Using LFA-CRNN-Based Feature Extraction for Health-Risk Decisions. Appl. Sci. 2020, 10, 2956.
  3. Martínez, A.; Pujol, F.A.; Mora, H. Application of Texture Descriptors to Facial Emotion Recognition in Infants. Appl. Sci. 2020, 10, 1115.
  4. Shin, D.H.; Chung, K.; Park, R.C. Detection of Emotion Using Multi-Block Deep Learning in a Self-Management Interview App. Appl. Sci. 2019, 9, 4830.
  5. Belaiche, R.; Liu, Y.; Migniot, C.; Ginhac, D.; Yang, F. Cost-Effective CNNs for Real-Time Micro-Expression Recognition. Appl. Sci. 2020, 10, 4959.
  6. Filippini, C.; Perpetuini, D.; Cardone, D.; Chiarelli, A.M.; Merla, A. Thermal Infrared Imaging-Based Affective Computing and Its Application to Facilitate Human Robot Interaction: A Review. Appl. Sci. 2020, 10, 2924.
  7. Bojanić, M.; Delić, V.; Karpov, A. Call Redistribution for a Call Center Based on Speech Emotion Recognition. Appl. Sci. 2020, 10, 4653.
  8. Pan, C.; Shi, C.; Mu, H.; Li, J.; Gao, X. EEG-Based Emotion Recognition Using Logistic Regression with Gaussian Kernel and Laplacian Prior and Investigation of Critical Frequency Bands. Appl. Sci. 2020, 10, 1619.
  9. Aldayel, M.; Ykhlef, M.; Al-Nafjan, A. Deep Learning for EEG-Based Preference Classification in Neuromarketing. Appl. Sci. 2020, 10, 1525.
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
