Abstract
Emotions are complex phenomena arising from cooperative interactions among multiple brain regions. Electroencephalography (EEG) provides a non-invasive means of observing such neural activity; however, because EEG captures only electrode-level signals at the scalp, accurate classification of dimensional emotions must account for both local electrode activity and the global spatial distribution of activity across the scalp. Motivated by this, we propose a brain-inspired, EEG electrode-cluster-based framework for dimensional emotion classification. The model organizes EEG electrodes into nine clusters based on spatial and functional proximity and applies an EEG Deformer to each cluster to learn frequency characteristics, temporal dynamics, and local signal patterns. The features extracted from each cluster are then integrated using a bidirectional cross-attention (BCA) mechanism and a temporal convolutional network (TCN), effectively modeling long-term inter-cluster interactions and global signal dependencies. Finally, a multilayer perceptron (MLP) classifies valence and arousal levels. Experiments on three public EEG datasets demonstrate that the proposed model significantly outperforms existing EEG-based dimensional emotion recognition methods. Cluster-based learning, which reflects electrode proximity and signal distribution, captures structural patterns at the electrode-cluster level, while inter-cluster integration models global signal interactions, thereby enhancing the interpretability and physiological validity of EEG-based dimensional emotion analysis. This approach provides a scalable framework for future affective computing and brain–computer interface (BCI) applications.