Article

Interpretable Cross-Subject EEG-Based Emotion Recognition Using Channel-Wise Features

Computer Science and Engineering, Konkuk University, Seoul 05029, Korea
* Author to whom correspondence should be addressed.
This manuscript is an extended version of the conference paper Longbin, J.; Chang, J.; Kim, E. EEG-Based User Identification Using Channel-Wise Features. In Proceedings of the 5th Asian Conference, ACPR 2019, Auckland, New Zealand, 26–29 November 2019; pp. 750–762.
Sensors 2020, 20(23), 6719; https://doi.org/10.3390/s20236719
Received: 29 September 2020 / Revised: 17 November 2020 / Accepted: 23 November 2020 / Published: 24 November 2020
(This article belongs to the Section Wearables)
Electroencephalogram (EEG)-based emotion recognition is receiving significant attention in research on brain-computer interfaces (BCIs) and health care. To recognize cross-subject emotion from EEG data accurately, a technique is needed that can find an effective representation robust to the subject-specific variability inherent in EEG data collection. In this paper, a new method to predict cross-subject emotion using time-series analysis and spatial correlation is proposed. To represent the spatial connectivity between brain regions, a channel-wise feature is proposed that can effectively capture the correlation between all channels. The channel-wise feature is defined as a symmetric matrix whose elements are the Pearson correlation coefficients between pairs of channels, which complementarily handle subject-specific variability. The channel-wise features are then fed to a two-layer stacked long short-term memory (LSTM) network, which extracts temporal features and learns an emotion model. Extensive experiments on two publicly available datasets, the Dataset for Emotion Analysis using Physiological Signals (DEAP) and the SJTU (Shanghai Jiao Tong University) Emotion EEG Dataset (SEED), demonstrate the effectiveness of the combined use of channel-wise features and LSTM. The proposed method achieves state-of-the-art classification accuracies of 98.93% and 99.10% for the two-class classification of valence and arousal on DEAP, respectively, and an accuracy of 99.63% for three-class classification on SEED.
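The channel-wise feature described in the abstract can be sketched as follows: for each time window of a multichannel EEG recording, build a symmetric matrix whose (i, j) entry is the Pearson correlation between channels i and j, producing a sequence of matrices suitable as LSTM input. This is a minimal illustration assuming NumPy; the function name, window length, and step size are illustrative and may differ from the paper's exact preprocessing.

```python
import numpy as np

def channel_wise_features(eeg, win_len, step):
    """Slide a window over the EEG and compute, for each window, the
    symmetric matrix of Pearson correlations between all channel pairs.

    eeg: array of shape (n_channels, n_samples)
    Returns: array of shape (n_windows, n_channels, n_channels)
    """
    n_channels, n_samples = eeg.shape
    feats = []
    for start in range(0, n_samples - win_len + 1, step):
        window = eeg[:, start:start + win_len]
        # np.corrcoef on a (channels, samples) block yields the
        # (n_channels, n_channels) Pearson correlation matrix.
        feats.append(np.corrcoef(window))
    return np.stack(feats)

# Example with random data standing in for a 32-channel recording
rng = np.random.default_rng(0)
eeg = rng.standard_normal((32, 512))
feats = channel_wise_features(eeg, win_len=128, step=64)
print(feats.shape)  # (7, 32, 32): one symmetric matrix per window
```

Each matrix in the resulting sequence (optionally flattened, or with only its upper triangle kept, since the matrix is symmetric) would then be fed as one time step to the stacked LSTM described above.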
Keywords: EEG; cross-subject; emotion recognition; user independent model; channel-wise feature
MDPI and ACS Style

Jin, L.; Kim, E.Y. Interpretable Cross-Subject EEG-Based Emotion Recognition Using Channel-Wise Features. Sensors 2020, 20, 6719. https://doi.org/10.3390/s20236719

