A Novel Method of Emotion Recognition from Multi-Band EEG Topology Maps Based on ERENet
Abstract
1. Introduction
- A novel emotion recognition framework based on multi-band EEG topology maps is proposed, fusing the frequency-domain features, spatial information, and frequency-band characteristics of multi-channel EEG signals. The multi-band EEG topology maps reflect both the relative positions of the EEG channels and the differences between frequency bands in the frequency-domain features (a construction sketch follows this list).
- A novel emotion recognition network, ERENet, is proposed. In ERENet, a multi-band discrete parallel processing module extracts spatial frequency-domain features from the topology map of each frequency band; a multi-band information exchange and reorganization module lets features from different bands interact, so that more representative combined features can be extracted; and a weighted classification module exploits the correlations among the combined features.
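As a concrete illustration of how such multi-band topology maps might be assembled, the Python sketch below places per-channel, per-band frequency-domain features (e.g., differential entropy or band power) onto a 2-D grid according to electrode position. The 9 × 9 grid size, the electrode-to-grid mapping, and the choice of four bands are illustrative assumptions, not values taken from the paper.

```python
import numpy as np

# Hypothetical 9 x 9 grid positions for a subset of the DEAP 32-channel
# 10-20 layout; the grid size and mapping are illustrative assumptions.
CHANNEL_GRID = {
    "Fp1": (0, 3), "Fp2": (0, 5), "F7": (2, 0), "F3": (2, 2),
    "Fz": (2, 4), "F4": (2, 6), "F8": (2, 8), "C3": (4, 2),
    "Cz": (4, 4), "C4": (4, 6), "P3": (6, 2), "Pz": (6, 4),
    "P4": (6, 6), "O1": (8, 3), "Oz": (8, 4), "O2": (8, 5),
}
BANDS = ["theta", "alpha", "beta", "gamma"]  # assumed frequency bands


def build_topology_maps(band_features, grid=CHANNEL_GRID, size=9):
    """Map per-channel, per-band features onto stacked 2-D topology maps.

    band_features: dict band -> dict channel -> scalar feature
                   (e.g., differential entropy of that band).
    Returns an array of shape (n_bands, size, size); grid cells with no
    electrode stay at zero.
    """
    maps = np.zeros((len(BANDS), size, size), dtype=np.float32)
    for b, band in enumerate(BANDS):
        for ch, value in band_features[band].items():
            if ch in grid:
                row, col = grid[ch]
                maps[b, row, col] = value
    return maps
```

Stacking one map per band yields an image-like tensor that preserves both the channel layout and the band identity, which is the input format the framework builds on.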
2. Materials and Methods
2.1. Dataset
2.2. Data Preprocessing
2.3. An Emotion Recognition Method from Multi-Band EEG Topology Maps Using ERENet
2.3.1. Multi-Band EEG Topology Maps
2.3.2. The Proposed ERENet Model
1. Multi-band Discrete Parallel Processing Module
2. Multi-band Information Exchange and Reorganization Module
3. Weighted Classification Module
3. Experiments and Results
3.1. Experimental Environment and Settings
3.2. Selection of Hyperparameters
3.3. Comparison between ERENet and Other Methods
4. Discussion
5. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Acknowledgments
Conflicts of Interest
References
- Wright, R.; Riedel, R.; Sechrest, L.; Lane, R.D.; Smith, R. Sex differences in emotion recognition ability: The mediating role of trait emotional awareness. Motiv. Emot. 2017, 42, 149–160. [Google Scholar] [CrossRef]
- Pattnaik, S.; Sabut, S.K.; Dash, M. DWT-based Feature Extraction and Classification for Motor Imaginary EEG Signals. In Proceedings of the 2016 International Conference on Systems in Medicine and Biology, Kharagpur, India, 4–7 January 2016. [Google Scholar]
- Hajjar, Y.; Hajjar, A.E.S.; Daya, B.; Chauvet, P. Determinant characteristics in EEG signal based on bursts amplitude segmentation for predicting pathological outcomes of a premature newborn. In Proceedings of the 2017 Sensors Networks Smart and Emerging Technologies, Beirut, Lebanon, 12–14 September 2017. [Google Scholar]
- Samara, A.; Menezes, M.; Galway, L. Feature Extraction for Emotion Recognition and Modelling Using Neurophysiological Data. In Proceedings of the 2016 15th International Conference on Ubiquitous Computing and Communications and 2016 International Symposium on Cyberspace and Security (IUCC-CSS), Granada, Spain, 14–16 December 2016. [Google Scholar]
- Yuen, C.T.; San, W.S.; Rizon, M.; Seong, T.C. Classification of human emotions from EEG signals using statistical features and neural network. Int. J. Integr. Eng. 2009, 1, 71–76. [Google Scholar]
- Zhang, G.; Yu, M.; Liu, Y.J.; Zhao, G.; Zhang, D.; Zheng, W. SparseDGCNN: Recognizing emotion from multichannel EEG signals. IEEE Trans. Affect. Comput. 2021. [Google Scholar] [CrossRef]
- An, Y.; Hu, S.; Duan, X.; Zhao, L.; Xie, C.; Zhao, Y. EEG emotion recognition based on 3D feature fusion and convolutional autoencoder. Front. Comput. Neurosci. 2021, 15, 743426. [Google Scholar] [CrossRef]
- Frantzidis, C.A.; Bratsas, C.; Papadelis, C.L.; Konstantinidis, E.; Pappas, C.; Bamidis, P.D. Toward Emotion Aware Computing: An Integrated Approach Using Multichannel Neurophysiological Recordings and Affective Visual Stimuli. IEEE Trans. Inf. Technol. Biomed. 2010, 14, 589–597. [Google Scholar] [CrossRef]
- Petrantoakis, P.C.; Hadjileontiadis, L.J. Emotion Recognition from EEG Using Higher Order Crossings. IEEE Trans. Inf. Technol. Biomed. 2010, 14, 186–197. [Google Scholar] [CrossRef]
- Wen, T.X. Research on Feature Extraction and Classification of EEG Signals. Ph.D. Thesis, Xiamen University, Xiamen, China, 2018. [Google Scholar]
- Chao, H.; Dong, L.; Liu, Y.; Lu, B. Emotion Recognition from Multiband EEG Signals Using CapsNet. Sensors 2019, 19, 2212. [Google Scholar] [CrossRef] [Green Version]
- Thammasan, N.; Moriyama, K.; Fukui, K.I.; Numao, M. Continuous Music-Emotion Recognition Based on Electroencephalogram. IEICE Trans. Inf. Syst. 2016, 99, 1234–1241. [Google Scholar] [CrossRef] [Green Version]
- Song, T.; Zheng, W.; Song, P.; Cui, Z. EEG Emotion Recognition Using Dynamical Graph Convolutional Neural Networks. IEEE Trans. Affect. Comput. 2020, 11, 532–541. [Google Scholar] [CrossRef] [Green Version]
- Liu, Y.; Shi, S.; Song, Y.; Gao, Q.; Li, Z.; Song, H.; Pang, S.; Li, D. EEG based Mental Workload Assessment by Power Spectral Density Feature. In Proceedings of the 2022 IEEE International Conference on Mechatronics and Automation, Guilin, China, 7–10 August 2022. [Google Scholar]
- Zhang, T.; Chen, W.; Li, M. AR based quadratic feature extraction in the VMD domain for the automated seizure detection of EEG using random forest classifier. Biomed. Signal Process. Control 2017, 31, 550–559. [Google Scholar] [CrossRef]
- Liu, Y.; Yu, M.; Zhao, G.; Song, J.; Ge, Y.; Shi, Y. Real-Time Movie-Induced Discrete Emotion Recognition from EEG Signals. IEEE Trans. Affect. Comput. 2018, 9, 550–562. [Google Scholar] [CrossRef]
- Fang, W.C.; Wang, K.Y.; Fahier, N.; Ho, Y.L.; Huang, Y.D. Development and validation of an EEG-based real-time emotion recognition system using edge AI computing platform with convolutional neural network system-on-chip design. IEEE J. Emerg. Sel. Top. Circuits Syst. 2019, 9, 645–657. [Google Scholar] [CrossRef]
- Song, T.; Zheng, W.; Lu, C.; Zong, Y.; Zhang, X.; Cui, Z. MPED: A multi-modal physiological emotion database for discrete emotion recognition. IEEE Access 2019, 7, 12177–12191. [Google Scholar] [CrossRef]
- Ullah, H.; Uzair, M.; Mahmood, A.; Ullah, M.; Khan, S.D.; Cheikh, F.A. Internal emotion classification using EEG signal with sparse discriminative ensemble. IEEE Access 2019, 7, 40144–40153. [Google Scholar] [CrossRef]
- Li, M.; Chen, W.; Zhang, T. A novel seizure diagnostic model based on kernel density estimation and least squares support vector machine. Biomed. Signal Process. Control 2018, 41, 233–241. [Google Scholar] [CrossRef]
- Chen, X.; Xu, X.; Liu, A.; McKeown, M.J.; Wang, Z.J. The Use of Multivariate EMD and CCA for Denoising Muscle Artifacts from Few-Channel EEG Recordings. IEEE Trans. Instrum. Meas. 2018, 67, 359–370. [Google Scholar] [CrossRef]
- Wang, Z.; Hope, R.M.; Wang, Z.; Ji, Q.; Gray, W.D. An EEG workload classifier for multiple subjects. In Proceedings of the 2011 Annual International Conference of the IEEE Engineering in Medicine and Biology Society, Boston, MA, USA, 30 August–3 September 2011. [Google Scholar]
- Zainuddin, A.Z.A.; Mansor, W.; Khuan, L.Y.; Mahmoodin, Z. Classification of EEG Signal from Capable Dyslexic and Normal Children Using KNN. Adv. Sci. Lett. 2018, 24, 1402–1405. [Google Scholar] [CrossRef]
- Li, M.; Xu, H.; Liu, X.; Lu, S. Emotion recognition from multichannel EEG signals using K-nearest neighbor classification. Technol. Health Care 2018, 26, 509–519. [Google Scholar] [CrossRef] [PubMed]
- Bhardwaj, A.; Gupta, A.; Jain, P.; Rani, A.; Yadav, J. Classification of human emotions from EEG signals using SVM and LDA Classifiers. In Proceedings of the 2015 2nd International Conference on Signal Processing and Integrated Networks (SPIN), Noida, India, 19–20 February 2015. [Google Scholar]
- Li, X.; Sun, X.; Qi, X.; Sun, X. Relevance vector machine Based EEG emotion recognition. In Proceedings of the 2016 Sixth International Conference on Instrumentation & Measurement, Computer, Communication and Control (IMCCC), Harbin, China, 21–23 July 2016. [Google Scholar]
- Tibdewal, M.N.; Tale, S.A. Multichannel detection of epilepsy using SVM classifier on EEG signal. In Proceedings of the International Conference on Computing Communication Control & Automation, Pune, India, 12–13 August 2016. [Google Scholar]
- Golmohammadi, M.; Hossein, H.; Silvia, L.; Obeid, L.; Picone, J. Automatic Analysis of EEGs Using Big Data and Hybrid Deep Learning Architectures. Front. Hum. Neurosci. 2017, 13, 76. [Google Scholar] [CrossRef] [PubMed] [Green Version]
- Li, J.; Li, S.; Pan, J.; Wang, F. Cross-subject EEG emotion recognition with self-organized graph neural network. Front. Comput. Neurosci. 2021, 15, 611653. [Google Scholar] [CrossRef]
- Mokatren, L.S.; Ansari, R.; Cetin, A.E.; Leow, A.D.; Ajilore, O.A.; Klumpp, H.; Vural, F.T.Y. EEG Classification by Factoring in Sensor Spatial Configuration. IEEE Access 2021, 9, 19053–19065. [Google Scholar] [CrossRef]
- Schirrmeister, R.T.; Springenberg, J.T.; Fiederer, L.D.J.; Glasstetter, M.; Eggensperger, K.; Tangermann, M.; Hutter, F.; Burgard, W.; Ball, T. Deep learning with convolutional neural networks for EEG decoding and visualization. Hum. Brain Mapp. 2017, 38, 5391–5420. [Google Scholar] [CrossRef] [PubMed] [Green Version]
- Tao, W.; Li, C.; Song, R.; Cheng, J.; Liu, Y.; Wan, F.; Chen, X. EEG-based Emotion Recognition via Channel-wise Attention and Self Attention. IEEE Trans. Affect. Comput. 2020. [Google Scholar] [CrossRef]
- Arjun; Rajpoot, A.S.; Panicker, M.R. Subject independent emotion recognition using EEG signals employing attention driven neural networks. Biomed. Signal Process. Control 2022, 75, 103547. [Google Scholar] [CrossRef]
- Bao, G.; Yang, K.; Tong, L.; Shu, J.; Zhang, R.; Wang, L.; Yan, B.; Zeng, Y. Linking Multi-Layer Dynamical GCN With Style-Based Recalibration CNN for EEG-Based Emotion Recognition. Front. Comput. Neurosci. 2022, 16, 834952. [Google Scholar] [CrossRef]
- Lawhern, V.J.; Solon, A.J.; Waytowich, N.R.; Gordon, S.M.; Hung, C.P.; Lance, B.J. EEGNet: A Compact Convolutional Network for EEG-based Brain-Computer Interfaces. J. Neural Eng. 2018, 15, 056013. [Google Scholar] [CrossRef] [Green Version]
- Zhang, D.; Yao, L.; Chen, K.; Monaghan, J. A Convolutional Recurrent Attention Model for Subject-Independent EEG Signal Analysis. IEEE Signal Process. Lett. 2019, 26, 715–719. [Google Scholar] [CrossRef]
- Jones, N.A.; Fox, N.A. Electroencephalogram asymmetry during emotionally evocative films and its relation to positive and negative affectivity. Brain Cogn. 1992, 20, 280–299. [Google Scholar] [CrossRef]
- Liu, Y.; Sourina, O.; Nguyen, M.K. Real-Time EEG-Based Emotion Recognition and Its Applications. Trans. Comput. Sci. XII 2011, 6670, 256–277. [Google Scholar]
- Jatupaiboon, N.; Pan-Ngum, S.; Israsena, P. Emotion classification using minimal EEG channels and frequency bands. In Proceedings of the International Joint Conference on Computer Science & Software Engineering, Khon Kaen, Thailand, 29–31 May 2013. [Google Scholar]
- Huang, D.; Guan, C.; Ang, K.K.; Zhang, H.; Pan, Y. Asymmetric Spatial Pattern for EEG-based emotion detection. In Proceedings of the International Joint Conference on Neural Networks, Brisbane, QLD, Australia, 10–15 June 2012. [Google Scholar]
- Koelstra, S.; Muhl, C.; Soleymani, M.; Lee, J.-S.; Yazdani, A.; Ebrahimi, T.; Pun, T.; Nijholt, A.; Patras, I. DEAP: A Database for Emotion Analysis Using Physiological Signals. IEEE Trans. Affect. Comput. 2012, 3, 18–31. [Google Scholar] [CrossRef] [Green Version]
- Alarcão, S.M. Reminiscence Therapy Improvement using Emotional Information. In Proceedings of the 2017 Seventh International Conference on Affective Computing and Intelligent Interaction (ACII), San Antonio, TX, USA, 23–26 October 2017. [Google Scholar]
- Frantzidis, C.A.; Lithari, C.D.; Vivas, A.B.; Papadelis, C.L.; Pappas, C.; Bamidis, P.D. Towards emotion aware computing: A study of arousal modulation with multichannel event-related potentials, delta oscillatory activity and skin conductivity responses. In Proceedings of the 8th IEEE International Conference on Bioinformatics & Bioengineering, Athens, Greece, 8–10 October 2008. [Google Scholar]
- Luo, L. Architectures of neuronal circuits. Science 2021, 373, eabg7285. [Google Scholar] [CrossRef] [PubMed]
- Axel, R. The molecular logic of smell. Sci. Am. 1995, 273, 154–159. [Google Scholar] [CrossRef] [PubMed]
- Vosshall, L.B.; Stocker, R.F. Molecular architecture of smell and taste in Drosophila. Annu. Rev. Neurosci. 2007, 30, 505–533. [Google Scholar] [CrossRef] [PubMed] [Green Version]
- Yin, Z.; Zhao, M.; Wang, Y.; Yang, J.; Zhang, J. Recognition of emotions using multimodal physiological signals and an ensemble deep learning model. Comput. Methods Programs Biomed. 2017, 140, 93–110. [Google Scholar] [CrossRef] [PubMed]
- Chao, H.; Zhi, H.; Dong, L.; Liu, Y. Recognition of Emotions Using Multichannel EEG Data and DBN-GC-Based Ensemble Deep Learning Framework. Comput. Intell. Neurosci. 2018, 2018, 9750904. [Google Scholar] [CrossRef]
- Howard, A.G.; Zhu, M.; Chen, B.; Kalenichenko, D.; Wang, W.; Weyand, T.; Andreetto, M.; Adam, H. MobileNets: Efficient Convolutional Neural Networks for Mobile Vision Applications. arXiv 2017, arXiv:1704.04861. [Google Scholar]
- Tubaishat, A.A.; Al-Obeidat, F.; Halim, Z.; Waqas, M.; Qayum, F. EmoPercept: EEG-based emotion classification through perceiver. Soft Comput. 2022, 26, 10563–10570. [Google Scholar]
- Zhang, Y.; Cai, H.; Nie, L.; Xu, P.; Zhao, S.; Guan, C. An end-to-end 3D convolutional neural network for decoding attentive mental state. Neural Netw. 2021, 144, 129–137. [Google Scholar] [CrossRef] [PubMed]
| Label | Threshold | Data Quantity (Instances) | Data Quantity (Total) |
|---|---|---|---|
| LV 1 | <5 | 84,708 | 149,760 |
| HV 2 | ≥5 | 65,052 | 149,760 |
| Label | Threshold | Data Quantity (Instances) | Data Quantity (Total) |
|---|---|---|---|
| LA 1 | <5 | 88,218 | 149,760 |
| HA 2 | ≥5 | 61,542 | 149,760 |
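The two tables above split the continuous valence and arousal self-assessment ratings into binary classes at a threshold of 5. A minimal sketch of that labeling step is shown below; the 1–9 rating scale and the variable names are assumptions for illustration.

```python
import numpy as np


def binarize_ratings(ratings, threshold=5.0):
    """Split continuous 1-9 ratings into low (<5) and high (>=5) classes.

    ratings: per-trial valence or arousal ratings.
    Returns an integer label array: 0 = low, 1 = high.
    """
    ratings = np.asarray(ratings, dtype=np.float32)
    return (ratings >= threshold).astype(np.int64)


# Example: valence ratings for three trials -> array([0, 1, 1])
labels = binarize_ratings([3.2, 7.5, 5.0])
```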
| Scheme | Module 1 Group Conv.: Number of Kernels | Module 1 Group Conv.: Kernel Size | Module 2 Group Conv.: Number of Kernels | Module 2 Group Conv.: Kernel Size | Neurons in Fully Connected Layers of Module 3 |
|---|---|---|---|---|---|
| 1 | 8 | 3 × 3 | 8 | 3 × 3 | 5 |
| 2 | 8 | 3 × 3 | 16 | 3 × 3 | 5 |
| 3 | 16 | 3 × 3 | 16 | 3 × 3 | 5 |
| 4 | 16 | 3 × 3 | 32 | 3 × 3 | 5 |
| Scheme | Total Parameters | Trainable Parameters | Training Time (s) |
|---|---|---|---|
| 1 | 8,576 | 8,384 | 700 |
| 2 | 16,320 | 16,000 | 800 |
| 3 | 32,224 | 31,584 | 1200 |
| 4 | 63,200 | 62,048 | 1400 |
| Scheme | Module 1 Group Conv.: Number of Kernels | Module 1 Group Conv.: Kernel Size | Module 2 Group Conv.: Number of Kernels | Module 2 Group Conv.: Kernel Size | Neurons in Fully Connected Layers of Module 3 |
|---|---|---|---|---|---|
| 1 | 8 | 3 × 3 | 16 | 3 × 3 | 5 |
| 2 | 8 | 4 × 4 | 16 | 4 × 4 | 5 |
| 3 | 8 | 5 × 5 | 16 | 5 × 5 | 5 |
| 4 | 8 | 6 × 6 | 16 | 6 × 6 | 5 |
| 5 | 8 | 7 × 7 | 16 | 7 × 7 | 5 |
| Scheme | Module 1 Group Conv.: Number of Kernels | Module 1 Group Conv.: Kernel Size | Module 2 Group Conv.: Number of Kernels | Module 2 Group Conv.: Kernel Size | Neurons in Fully Connected Layers of Module 3 |
|---|---|---|---|---|---|
| 1 | 8 | 5 × 5 | 16 | 5 × 5 | 0 |
| 2 | 8 | 5 × 5 | 16 | 5 × 5 | 5 |
| 3 | 8 | 5 × 5 | 16 | 5 × 5 | 10 |
| 4 | 8 | 5 × 5 | 16 | 5 × 5 | 20 |
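Using one of the configurations explored above (8 kernels of 5 × 5 in the group convolution of Module 1, 16 kernels of 5 × 5 in Module 2, and 5 neurons in the fully connected layer of Module 3), the following PyTorch sketch shows one plausible wiring of the three modules: per-band grouped convolution, a channel shuffle to exchange and reorganize features across bands, and a small weighted classification head. This is an illustrative reconstruction under those assumptions, not the authors' exact ERENet.

```python
import torch
import torch.nn as nn


def channel_shuffle(x, groups):
    """Interleave channels across groups so later grouped layers mix bands."""
    b, c, h, w = x.shape
    x = x.view(b, groups, c // groups, h, w).transpose(1, 2).contiguous()
    return x.view(b, c, h, w)


class ERENetSketch(nn.Module):
    """Illustrative reading of the three ERENet modules (assumed architecture)."""

    def __init__(self, n_bands=4, n_classes=2):
        super().__init__()
        self.n_bands = n_bands
        # Module 1: multi-band discrete parallel processing -- a grouped
        # convolution so each band's topology map is filtered independently.
        self.module1 = nn.Sequential(
            nn.Conv2d(n_bands, 8 * n_bands, kernel_size=5, padding=2, groups=n_bands),
            nn.BatchNorm2d(8 * n_bands),
            nn.ELU(),
        )
        # Module 2: after the shuffle exchanges features across bands, a second
        # grouped convolution extracts combined features.
        self.module2 = nn.Sequential(
            nn.Conv2d(8 * n_bands, 16 * n_bands, kernel_size=5, padding=2, groups=n_bands),
            nn.BatchNorm2d(16 * n_bands),
            nn.ELU(),
            nn.AdaptiveAvgPool2d(1),
        )
        # Module 3: weighted classification -- a small fully connected layer
        # weights the combined features before the final class scores.
        self.module3 = nn.Sequential(
            nn.Flatten(),
            nn.Linear(16 * n_bands, 5),
            nn.ELU(),
            nn.Linear(5, n_classes),
        )

    def forward(self, x):                      # x: (batch, n_bands, 9, 9)
        x = self.module1(x)
        x = channel_shuffle(x, self.n_bands)   # multi-band information exchange
        x = self.module2(x)
        return self.module3(x)                 # (batch, n_classes) logits


# Usage: a batch of 4-band 9 x 9 topology maps, binary valence classification.
model = ERENetSketch()
logits = model(torch.randn(32, 4, 9, 9))
```

The grouped convolutions keep each band's features separate in Module 1, while the shuffle in front of Module 2 is one simple way to realize the cross-band information exchange described in the contribution list.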
| Method | Intra-Subject Valence (%) | Intra-Subject Arousal (%) | Inter-Subject Valence (%) | Inter-Subject Arousal (%) | Mixed-Subject Valence (%) | Mixed-Subject Arousal (%) |
|---|---|---|---|---|---|---|
| SVM | 70.65 | 71.05 | 41.01 | 42.43 | 63.76 | 63.09 |
| 1D-CNN | 74.33 | 71.12 | 45.89 | 43.12 | 65.21 | 63.03 |
| 2D-CNN | 81.12 | 80.45 | 58.32 | 59.67 | 72.39 | 72.42 |
| EEGNet [35] | 85.67 | 85.03 | 64.04 | 61.85 | 78.56 | 78.71 |
| ShallowNet [31] | 87.18 | 86.39 | 63.12 | 61.31 | 79.11 | 78.32 |
| CapsNet [11] | 82.24 | 83.37 | 57.28 | 58.79 | 70.73 | 71.28 |
| EmoPercept [50] | 93.56 | 94.79 | 67.13 | 66.09 | 85.32 | 85.66 |
| 3D-CNN [51] | 97.89 | 96.56 | 74.32 | 71.42 | 92.12 | 91.79 |
| ERENet (Ours) | 95.38 | 95.09 | 68.32 | 67.97 | 88.47 | 87.52 |
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
© 2022 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).