A Customized ECA-CRNN Model for Emotion Recognition Based on EEG Signals
Abstract
1. Introduction
- A new network, the ECA-CRNN model, is proposed; it incorporates the Efficient Channel Attention (ECA) module, applied here for the first time to enhance emotion recognition from EEG signals;
- The ECA-CRNN model adapts this attention mechanism, originally developed for image recognition, to EEG-based emotion recognition, demonstrating the broad applicability of ECA and its convenience in strengthening the relationships between channels;
- The raw EEG signal is converted into a four-dimensional input comprising temporal, spatial and frequency information; the customized CNN extracts spatial and DE-based frequency features, while the GRU extracts temporal features, and this comprehensive feature extraction yields more accurate emotion recognition (see the sketch after this list);
- The ECA-CRNN model performed strongly on the DEAP dataset, achieving the lowest standard deviation and a comparatively high recognition accuracy among the methods we tested.
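The four-dimensional input mentioned in the third contribution can be illustrated with a short sketch. The following is a minimal, illustrative reconstruction rather than the authors' exact pipeline: it assumes four standard frequency bands (theta, alpha, beta, gamma), an 8 × 9 electrode-grid mapping for DEAP's 32 channels, and a 0.5 s sub-window; the paper's band limits, grid layout, and window length may differ. Differential entropy (DE) is computed per band under a Gaussian assumption as 0.5 · log(2πeσ²).

```python
# Illustrative sketch: build a 4D (segments x 8 x 9 x bands) DE tensor from one
# DEAP trial. Band limits, the grid mapping and the 0.5 s window are assumptions.
import numpy as np
from scipy.signal import butter, filtfilt

FS = 128                                    # DEAP preprocessed sampling rate (Hz)
BANDS = {"theta": (4, 8), "alpha": (8, 13),
         "beta": (13, 30), "gamma": (30, 45)}

def differential_entropy(x):
    """DE of a band-limited signal under a Gaussian assumption:
    0.5 * log(2 * pi * e * variance)."""
    return 0.5 * np.log(2 * np.pi * np.e * np.var(x))

def band_de(segment, low, high):
    """Band-pass every channel of a (channels, samples) segment and
    return one DE value per channel."""
    b, a = butter(4, [low / (FS / 2), high / (FS / 2)], btype="band")
    filtered = filtfilt(b, a, segment, axis=1)
    return np.array([differential_entropy(ch) for ch in filtered])

def to_4d(trial, grid_positions, seg_len=0.5):
    """Stack per-segment DE maps into (n_segments, 8, 9, n_bands).
    grid_positions: (row, col) on the 8 x 9 grid for each of the 32 channels."""
    step = int(seg_len * FS)
    n_segments = trial.shape[1] // step
    out = np.zeros((n_segments, 8, 9, len(BANDS)))
    for s in range(n_segments):
        seg = trial[:, s * step:(s + 1) * step]
        for b_idx, (low, high) in enumerate(BANDS.values()):
            de = band_de(seg, low, high)
            for ch, (r, c) in enumerate(grid_positions):
                out[s, r, c, b_idx] = de[ch]
    return out
```

Stacking the per-segment DE maps in this way yields the tensor that the CNN (spatial and frequency features) and the GRU (temporal features) described in Section 2 operate on.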
2. Construction of the ECA-CRNN Model
2.1. DEAP Dataset
2.2. Method of Feature Extraction
2.2.1. Temporal Feature
2.2.2. Frequency Feature
2.2.3. Spatial Feature
2.3. Efficient Channel Attention Module
2.4. Overall Structure of the ECA-CRNN Model
2.4.1. CNN’s Structure
2.4.2. GRU’s Structure
3. Experiment
3.1. Experimental Environment and Parameter Setting
3.2. Data Preprocessing
3.3. Experimental Processing and Data Comparison
4. Discussion
5. Conclusions
Author Contributions
Funding
Data Availability Statement
Conflicts of Interest
References
1. Lux, V.; van Ommen, C. The generational brain: Introduction. Theory Psychol. 2016, 26, 561–571.
2. Saarimaki, H.; Ejtehadian, L.F.; Glerean, E.; Jaaskelainen, I.P.; Vuilleumier, P.; Sams, M.; Nummenmaa, L. Distributed affective space represents multiple emotion categories across the human brain. Soc. Cogn. Affect. Neurosci. 2018, 13, 471–482.
3. Zhang, Y.; Cui, C.; Zhong, S.H. EEG-Based Emotion Recognition via Knowledge-Integrated Interpretable Method. Mathematics 2023, 11, 1424.
4. Chen, J.X.; Zhang, P.W.; Mao, Z.J.; Huang, Y.F.; Jiang, D.M.; Zhang, A.N. Accurate EEG-Based Emotion Recognition on Combined Features Using Deep Convolutional Neural Networks. IEEE Access 2019, 7, 44317–44328.
5. Wang, Y.; Huang, Z.; Mccane, B.; Neo, P. EmotioNet: A 3-D Convolutional Neural Network for EEG-based Emotion Recognition. In Proceedings of the 2018 International Joint Conference on Neural Networks (IJCNN), Rio de Janeiro, Brazil, 8–13 July 2018; pp. 1–7.
6. Yang, Y.; Wu, Q.; Fu, Y.; Chen, X. Continuous Convolutional Neural Network with 3D Input for EEG-Based Emotion Recognition. Neural Inf. Process. 2018, 11307, 433–443.
7. Li, J.P.; Zhang, Z.X.; He, H.G. Hierarchical Convolutional Neural Networks for EEG-Based Emotion Recognition. Cogn. Comput. 2018, 10, 368–380.
8. Zhang, Y.Q.; Chen, J.L.; Tan, J.H.; Chen, Y.X.; Chen, Y.Y.; Li, D.H.; Yang, L.; Su, J.; Huang, X.; Che, W.L. An Investigation of Deep Learning Models for EEG-Based Emotion Recognition. Front. Neurosci. 2020, 14, 622759.
9. Yang, Y.; Wu, Q.; Qiu, M.; Wang, Y.; Chen, X. Emotion Recognition from Multi-Channel EEG through Parallel Convolutional Recurrent Neural Network. In Proceedings of the 2018 International Joint Conference on Neural Networks (IJCNN), Rio de Janeiro, Brazil, 8–13 July 2018; pp. 1–7.
10. Iyer, A.; Das, S.S.; Teotia, R.; Maheshwari, S.; Sharma, R.R. CNN and LSTM based ensemble learning for human emotion recognition using EEG recordings. Multimed. Tools Appl. 2023, 82, 4883–4896.
11. Wilaiprasitporn, T.; Ditthapron, A.; Matchaparn, K.; Tongbuasirilai, T.; Banluesombatkul, N.; Chuangsuwanich, E. Affective EEG-Based Person Identification Using the Deep Learning Approach. IEEE Trans. Cogn. Dev. Syst. 2020, 12, 486–496.
12. Shen, F.Y.; Dai, G.J.; Lin, G.; Zhang, J.H.; Kong, W.Z.; Zeng, H. EEG-based emotion recognition using 4D convolutional recurrent neural network. Cogn. Neurodyn. 2020, 14, 815–828.
13. Kim, G.I.; Jang, B. Petroleum Price Prediction with CNN-LSTM and CNN-GRU Using Skip-Connection. Mathematics 2023, 11, 547.
14. Koelstra, S.; Muhl, C.; Soleymani, M.; Lee, J.S.; Yazdani, A.; Ebrahimi, T.; Pun, T.; Nijholt, A.; Patras, I. DEAP: A Database for Emotion Analysis Using Physiological Signals. IEEE Trans. Affect. Comput. 2012, 3, 18–31.
15. Khateeb, M.; Anwar, S.M.; Alnowami, M. Multi-Domain Feature Fusion for Emotion Classification Using DEAP Dataset. IEEE Access 2021, 9, 12134–12142.
16. Joshi, V.M.; Ghongade, R.B.; Joshi, A.M.; Kulkarni, R.V. Deep BiLSTM neural network model for emotion detection using cross-dataset approach. Biomed. Signal Process. Control 2022, 73, 103407.
17. Menezes, M.L.R.; Samara, A.; Galway, L.; Sant'Anna, A.; Verikas, A.; Alonso-Fernandez, F.; Wang, H.; Bond, R. Towards emotion recognition for virtual environments: An evaluation of EEG features on benchmark dataset. Pers. Ubiquitous Comput. 2017, 21, 1003–1013.
18. Choi, E.; Kim, D. Arousal, Valence and Liking Classification Model Based on Deep Belief Network and DEAP Dataset for Mental Healthcare Management. Basic Clin. Pharmacol. Toxicol. 2019, 124, 214–215.
19. Singh, U.; Shaw, R.; Patra, B.K. A data augmentation and channel selection technique for grading human emotions on DEAP dataset. Biomed. Signal Process. Control 2023, 79, 104060.
20. Lu, H.M.; Wan, M.; Sangaiah, A.K. Human Emotion Recognition Using an EEG Cloud Computing Platform. Mob. Netw. Appl. 2020, 25, 1023–1032.
21. Huang, D.M.; Chen, S.T.; Liu, C.; Zheng, L.; Tian, Z.H.; Jiang, D.Z. Differences first in asymmetric brain: A bi-hemisphere discrepancy convolutional neural network for EEG emotion recognition. Neurocomputing 2021, 448, 140–151.
22. De Witte, S.; Klooster, D.; Dedoncker, J.; Duprat, R.; Remue, J.; Baeken, C. Left prefrontal neuronavigated electrode localization in tDCS: 10-20 EEG system versus MRI-guided neuronavigation. Psychiatry Res. Neuroimaging 2018, 274, 1–6.
23. Xu, G.X.; Guo, W.H.; Wang, Y.J. Subject-independent EEG emotion recognition with hybrid spatio-temporal GRU-Conv architecture. Med. Biol. Eng. Comput. 2023, 61, 61–73.
24. Demir, F.; Sobahi, N.; Siuly, S.; Sengur, A. Exploring Deep Learning Features for Automatic Classification of Human Emotion Using EEG Rhythms. IEEE Sens. J. 2021, 21, 14923–14930.
25. Cui, D.; Xuan, H.Y.; Liu, J.; Gu, G.H.; Li, X.L. Emotion Recognition on EEG Signal Using ResNeXt Attention 2D-3D Convolution Neural Networks. Neural Process. Lett. 2022, 1–5.
26. Yin, Y.Q.; Zheng, X.W.; Hu, B.; Zhang, Y.; Cui, X.C. EEG emotion recognition using fusion model of graph convolutional neural networks and LSTM. Appl. Soft Comput. 2021, 100, 106954.
27. Zheng, W.L.; Lu, B.L. Investigating Critical Frequency Bands and Channels for EEG-Based Emotion Recognition with Deep Neural Networks. IEEE Trans. Auton. Ment. Dev. 2015, 7, 162–175.
28. Mehmood, R.M.; Du, R.Y.; Lee, H.J. Optimal Feature Selection and Deep Learning Ensembles Method for Emotion Recognition From Human Brain EEG Sensors. IEEE Access 2017, 5, 14797–14806.
29. Hwang, S.; Hong, K.; Son, G.; Byun, H. Learning CNN features from DE features for EEG-based emotion recognition. Pattern Anal. Appl. 2020, 23, 1323–1335.
30. Zheng, W.L.; Zhu, J.Y.; Lu, B.L. Identifying Stable Patterns over Time for Emotion Recognition from EEG. IEEE Trans. Affect. Comput. 2019, 10, 417–429.
31. Hu, J.; Shen, L.; Albanie, S.; Sun, G.; Wu, E.H. Squeeze-and-Excitation Networks. IEEE Trans. Pattern Anal. Mach. Intell. 2020, 42, 2011–2023.
32. Wang, Q.L.; Wu, B.G.; Zhu, P.F.; Li, P.H.; Zuo, W.M.; Hu, Q.H. ECA-Net: Efficient channel attention for deep convolutional neural networks. arXiv 2020, arXiv:1910.03151.
33. Zhao, Y.X.; Man, K.L.; Smith, J.; Siddique, K.; Guan, S.U. Improved two-stream model for human action recognition. EURASIP J. Image Video Process. 2020, 2020, 24.
34. Gonon, L.; Schwab, C. Deep ReLU neural networks overcome the curse of dimensionality for partial integrodifferential equations. Anal. Appl. 2023, 21, 1–47.
35. Yang, S.Y.; Hao, K.R.; Ding, Y.S.; Liu, J. Vehicle Driving Direction Control Based on Compressed Network. Int. J. Pattern Recognit. Artif. Intell. 2018, 32, 1850025.
36. Yang, W.W.; Jia, C.X.; Liu, R.F. Construction and Simulation of the Enterprise Financial Risk Diagnosis Model by Using Dropout and BN to Improve LSTM. Secur. Commun. Netw. 2022, 2022, 4767980.
37. Chen, J.X.; Jiang, D.M.; Zhang, N. A Hierarchical Bidirectional GRU Model With Attention for EEG-Based Emotion Classification. IEEE Access 2019, 7, 118530–118540.
38. Siuly, S.; Guo, Y.H.; Alcin, O.F.; Li, Y.; Wen, P.; Wang, H. Exploring deep residual network based features for automatic schizophrenia detection from EEG. Phys. Eng. Sci. Med. 2023, 46, 561–574.
39. Issa, S.; Peng, Q.; You, X. Emotion Classification Using EEG Brain Signals and the Broad Learning System. IEEE Trans. Syst. Man Cybern.-Syst. 2021, 51, 7382–7391.
40. Asghar, M.A.; Khan, M.J.; Rizwan, M.; Shorfuzzaman, M.; Mehmood, R.M. AI inspired EEG-based spatial feature selection method using multivariate empirical mode decomposition for emotion classification. Multimed. Syst. 2022, 28, 1275–1288.
Layer (Type) | Output Shape | Activation | Size of Filter and Pooling | Number of Filters |
---|---|---|---|---|
conv1 (Conv2D) | (None, 8, 9, 128) | relu | 3 × 3 | 128 |
batch_normalization (BatchNormalization) | (None, 8, 9, 128) | | | |
eca1 (ECA) | (None, 8, 9, 128) | | | |
conv2 (Conv2D) | (None, 8, 9, 256) | relu | 3 × 3 | 256 |
batch_normalization (BatchNormalization) | (None, 8, 9, 256) | | | |
eca2 (ECA) | (None, 8, 9, 256) | | | |
conv3 (Conv2D) | (None, 8, 9, 256) | relu | 3 × 3 | 256 |
batch_normalization (BatchNormalization) | (None, 8, 9, 256) | | | |
eca3 (ECA) | (None, 8, 9, 256) | | | |
conv4 (Conv2D) | (None, 8, 9, 128) | relu | 1 × 1 | 128 |
batch_normalization (BatchNormalization) | (None, 8, 9, 128) | | | |
eca4 (ECA) | (None, 8, 9, 128) | | | |
pool (MaxPooling2D) | (None, 4, 4, 128) | | 2 × 2 | |
fla1 (Flatten) | (None, 2048) | | | |
dense1 (Dense) | (None, 512) | selu | | |
alpha_dropout (AlphaDropout) | (None, 512) | | | |
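The table above can be read as the following minimal Keras sketch, assuming 'same' padding for the convolutions and a simple ECA block built from global average pooling, a 1D convolution across feature-map channels, and a sigmoid gate; the ECA kernel size k = 3, the dropout rate, the GRU width and the classification head are illustrative assumptions rather than the authors' exact settings.

```python
# Minimal sketch of the CNN branch in the table above plus a GRU head.
# Hyperparameters not listed in the table (ECA kernel size, dropout rate,
# GRU width, output head) are assumptions.
import tensorflow as tf
from tensorflow.keras import layers, models

def eca(x, k=3):
    """Efficient Channel Attention: reweight feature-map channels with a
    lightweight 1D convolution over the pooled channel descriptor."""
    c = x.shape[-1]
    w = layers.GlobalAveragePooling2D()(x)              # (batch, c)
    w = layers.Reshape((c, 1))(w)                       # channels as a sequence
    w = layers.Conv1D(1, k, padding="same", use_bias=False)(w)
    w = layers.Activation("sigmoid")(w)
    w = layers.Reshape((1, 1, c))(w)
    return layers.Multiply()([x, w])                    # channel-wise gating

def conv_block(x, filters, kernel):
    """Conv2D -> BatchNormalization -> ECA, as in the table."""
    x = layers.Conv2D(filters, kernel, padding="same", activation="relu")(x)
    x = layers.BatchNormalization()(x)
    return eca(x)

def build_cnn(input_shape=(8, 9, 4)):
    """Per-segment CNN: 8 x 9 DE map with 4 band channels -> 512-d vector."""
    inp = layers.Input(shape=input_shape)
    x = conv_block(inp, 128, 3)
    x = conv_block(x, 256, 3)
    x = conv_block(x, 256, 3)
    x = conv_block(x, 128, 1)
    x = layers.MaxPooling2D(2)(x)                       # (4, 4, 128)
    x = layers.Flatten()(x)                             # 2048
    x = layers.Dense(512, activation="selu")(x)
    x = layers.AlphaDropout(0.1)(x)                     # assumed rate
    return models.Model(inp, x)

def build_eca_crnn(n_segments=4, n_classes=2):
    """Apply the CNN to each segment, then a GRU over the segment axis."""
    inp = layers.Input(shape=(n_segments, 8, 9, 4))
    x = layers.TimeDistributed(build_cnn())(inp)        # (batch, segments, 512)
    x = layers.GRU(128)(x)                              # assumed width
    out = layers.Dense(n_classes, activation="softmax")(x)
    return models.Model(inp, out)
```

With a (segments, 8, 9, 4) input such as the one produced in the earlier sketch, build_eca_crnn() mirrors the Conv2D–BatchNormalization–ECA ordering and output shapes listed in the table, followed by the GRU over time segments described in Section 2.4.2.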
t (s) | Arousal Acc ± Std (%) | Valence Acc ± Std (%) |
---|---|---|
1 | 93.74 ± 2.02 | 93.51 ± 2.14 |
2 | 93.98 ± 1.93 | 93.65 ± 2.27 |
3 | 94.27 ± 1.51 | 93.82 ± 1.92 |
4 | 95.45 ± 1.28 | 95.12 ± 1.55 |
5 | 94.14 ± 1.84 | 93.86 ± 2.18 |
6 | 94.07 ± 1.29 | 93.75 ± 1.96 |
Epochs | Arousal Acc ± Std (%) | Valence Acc ± Std (%) |
---|---|---|
100 | 94.06 ± 2.09 | 93.58 ± 2.24 |
200 | 94.44 ± 1.86 | 94.12 ± 1.76 |
300 | 94.96 ± 1.54 | 94.71 ± 1.83 |
400 | 95.21 ± 1.90 | 94.87 ± 1.61 |
473 | 95.70 ± 1.16 | 95.33 ± 1.45 |
500 | 95.69 ± 1.22 | 95.21 ± 1.87 |
600 | 95.65 ± 2.13 | 95.24 ± 1.88 |
Methods | Arousal Acc ± Std (%) | Valence Acc ± Std (%) |
---|---|---|
EmotioNet [5] | 73.24 ± 3.13 | 72.16 ± 3.26 |
CCNN [6] | 90.15 ± 2.87 | 89.41 ± 2.75 |
PCRNN [9] | 90.98 ± 3.21 | 90.73 ± 2.94 |
Three-dimensional CNN [12] | 91.96 ± 2.84 | 90.95 ± 2.52 |
4D-CRNN [12] | 94.52 ± 3.59 | 94.18 ± 2.88 |
ECA-CRNN (ours) | 95.70 ± 1.16 | 95.33 ± 1.45 |
Models | Characteristics |
---|---|
EmotioNet [5] | Uses 3D convolution kernels to extract spatio-temporal information in the first half of the model and temporal features in the second half, but disregards frequency information
CCNN [6] | Builds a customized continuous CNN model, but focuses only on frequency and spatial information
HCNN [7] | Organizes differential entropy features across channels, but attends only to frequency and spatial information
PCRNN [9] | Adopts a parallel-branch structure in which a CNN and an LSTM extract distinct features from the raw EEG signals, but ignores the interaction between spatial and temporal information
Ensemble model [10] | Refines a three-branch model to capture more information, albeit at the cost of increased parameters and complexity
4D-CRNN [12] | Uses DE features to construct a four-dimensional representation covering frequency, spatial and temporal information, but ignores the internal relations between frequency bands
ECA-CRNN (ours) | Inserts ECA modules and BN layers while also using DE features and multidimensional information, strengthening the interaction between frequency bands, though the CNN component leaves room for further enhancement
Share and Cite
Song, Y.; Yin, Y.; Xu, P. A Customized ECA-CRNN Model for Emotion Recognition Based on EEG Signals. Electronics 2023, 12, 2900. https://doi.org/10.3390/electronics12132900