Article

EEG-Based Emotion Recognition by Convolutional Neural Network with Multi-Scale Kernels

Department of Artificial Intelligence Convergence, Chonnam National University, 77 Yongbong-ro, Gwangju 500-757, Korea
* Author to whom correspondence should be addressed.
Academic Editor: Kyandoghere Kyamakya
Sensors 2021, 21(15), 5092; https://doi.org/10.3390/s21155092
Received: 4 June 2021 / Revised: 21 July 2021 / Accepted: 25 July 2021 / Published: 27 July 2021
(This article belongs to the Special Issue Sensor Based Multi-Modal Emotion Recognition)
Besides facial- or gesture-based emotion recognition, Electroencephalogram (EEG) data have been drawing attention because they are less susceptible to deceptive external expressions such as facial expressions or speech. EEG-based emotion recognition relies heavily on feature extraction and representation: feature categories must be derived from the raw signals, and representations must be chosen that capture the intrinsic properties of an individual signal or a group of signals. Moreover, the correlations and interactions among channels and frequency bands carry crucial information for predicting emotional states, yet they are commonly disregarded in conventional approaches. Therefore, our method exploits the correlations among the 32 channels and the frequency bands to enhance emotion prediction performance. The time-domain features were arranged into feature-homogeneous matrices, with their positions following the placement of the corresponding electrodes on the scalp. Based on this 3D representation of the EEG signals, the model must learn both the local and global patterns that describe the short- and long-range relations among EEG channels, together with the embedded features. To this end, we propose a 2D CNN whose convolutional layers with different kernel sizes are assembled into a convolution block, combining features distributed over small and large regions. Ten-fold cross-validation was conducted on the DEAP dataset to demonstrate the effectiveness of our approach. We achieved average accuracies of 98.27% and 98.36% for binary arousal and valence classification, respectively.
Keywords: emotion recognition; electroencephalogram; EEG signals; channel correlation; frequency band correlation; multiscale kernel
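For readers curious how a multi-scale convolution block of the kind described in the abstract can be structured, the sketch below is a minimal, illustrative PyTorch implementation. It is not the authors' code: the kernel sizes (3/5/7), the 9×9 scalp-grid size, the number of feature maps, and the branch widths are all assumptions made for the example.

```python
# Minimal sketch (not the paper's implementation) of a multi-scale kernel
# convolution block. Assumptions: PyTorch, a hypothetical 9x9 grid mapping of
# the 32 DEAP electrodes, and kernel sizes 3/5/7 chosen for illustration.
import torch
import torch.nn as nn


class MultiScaleConvBlock(nn.Module):
    """Parallel convolutions with different kernel sizes, concatenated along
    the channel axis, so patterns from small and large scalp regions are mixed."""

    def __init__(self, in_ch: int, out_ch: int, kernel_sizes=(3, 5, 7)):
        super().__init__()
        self.branches = nn.ModuleList([
            nn.Sequential(
                nn.Conv2d(in_ch, out_ch, k, padding=k // 2),  # keeps spatial size
                nn.BatchNorm2d(out_ch),
                nn.ReLU(inplace=True),
            )
            for k in kernel_sizes
        ])

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, feature_maps, grid_h, grid_w) -- EEG features placed on a
        # 2D grid mirroring the electrode layout on the scalp.
        return torch.cat([branch(x) for branch in self.branches], dim=1)


if __name__ == "__main__":
    # Example: 4 time-domain feature maps on an assumed 9x9 electrode grid.
    x = torch.randn(8, 4, 9, 9)
    block = MultiScaleConvBlock(in_ch=4, out_ch=16)
    print(block(x).shape)  # torch.Size([8, 48, 9, 9]) -- 3 branches x 16 maps
```

Concatenating the parallel branches, rather than summing them, preserves which receptive-field scale each feature came from and lets subsequent layers weight local and long-range channel relations separately.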
MDPI and ACS Style

Phan, T.-D.-T.; Kim, S.-H.; Yang, H.-J.; Lee, G.-S. EEG-Based Emotion Recognition by Convolutional Neural Network with Multi-Scale Kernels. Sensors 2021, 21, 5092. https://doi.org/10.3390/s21155092

AMA Style

Phan T-D-T, Kim S-H, Yang H-J, Lee G-S. EEG-Based Emotion Recognition by Convolutional Neural Network with Multi-Scale Kernels. Sensors. 2021; 21(15):5092. https://doi.org/10.3390/s21155092

Chicago/Turabian Style

Phan, Tran-Dac-Thinh, Soo-Hyung Kim, Hyung-Jeong Yang, and Guee-Sang Lee. 2021. "EEG-Based Emotion Recognition by Convolutional Neural Network with Multi-Scale Kernels" Sensors 21, no. 15: 5092. https://doi.org/10.3390/s21155092

Note that from the first issue of 2016, MDPI journals use article numbers instead of page numbers.
