Article
Decoding Self-Imagined Emotions from EEG Signals Using Machine Learning for Affective BCI Systems
Charoenporn Bouyam, Nannaphat Siribunyaphat, Bukhoree Sahoh, + 1 author
Research on self-imagined emotional imagery supports the development of practical affective brain–computer interface (BCI) systems. This study proposes a hybrid emotion induction approach that combines facial expression image cues with subsequent emotional imagery, covering six positive and six negative emotions organized into two- and four-class valence and arousal categories. Machine learning (ML) techniques were applied to decode these self-generated emotions from electroencephalogram (EEG) signals. Experiments were conducted to observe brain activity and to validate the proposed features and classification algorithms. The results showed that absolute beta power features, computed from the power spectral density (PSD) across EEG channels, consistently achieved the highest classification accuracy for all emotion categories with the K-nearest neighbors (KNN) algorithm, while alpha–beta ratio features also contributed. The nonlinear ML models proved highly effective: the KNN classifier performed best in detecting neutral states, while the artificial neural network (ANN) achieved balanced accuracy across emotional states. These findings support the use of the hybrid emotion induction paradigm and PSD-derived EEG features to develop reliable, subject-independent affective BCI systems. In future work, we will expand the datasets, employ advanced feature extraction and deep learning models, integrate multi-modal signals, and validate the proposed approaches across broader populations.
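As a rough illustration of the feature-and-classifier pipeline the abstract describes, the sketch below computes per-channel absolute beta power and an alpha–beta ratio from a Welch PSD estimate and feeds the resulting feature vectors to a KNN classifier. This is a minimal sketch, not the study's implementation: the sampling rate, epoch length, band limits, channel count, and `n_neighbors` are illustrative assumptions, and the EEG here is synthetic placeholder data rather than recorded signals.

```python
# Minimal sketch of a PSD band-power + KNN pipeline (assumptions noted below;
# these settings are NOT taken from the paper).
import numpy as np
from scipy.signal import welch
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import cross_val_score

FS = 250                                   # sampling rate in Hz (assumed)
N_EPOCHS, N_CHANNELS, N_SAMPLES = 120, 8, 4 * FS  # 4 s epochs (assumed)

rng = np.random.default_rng(0)
epochs = rng.standard_normal((N_EPOCHS, N_CHANNELS, N_SAMPLES))  # placeholder EEG
labels = rng.integers(0, 2, size=N_EPOCHS)                       # e.g., binary valence

def band_power(psd, freqs, lo, hi):
    """Approximate absolute band power by summing PSD bins over [lo, hi) Hz."""
    mask = (freqs >= lo) & (freqs < hi)
    df = freqs[1] - freqs[0]               # frequency resolution of the PSD
    return psd[..., mask].sum(axis=-1) * df

def extract_features(epoch):
    """Per-channel absolute beta power and alpha/beta ratio from a Welch PSD."""
    freqs, psd = welch(epoch, fs=FS, nperseg=FS)   # psd shape: (channels, freqs)
    alpha = band_power(psd, freqs, 8, 13)          # alpha band limits assumed
    beta = band_power(psd, freqs, 13, 30)          # beta band limits assumed
    return np.concatenate([beta, alpha / beta])    # (2 * channels,) feature vector

X = np.stack([extract_features(e) for e in epochs])
knn = KNeighborsClassifier(n_neighbors=5)          # k is an assumption
print("CV accuracy:", cross_val_score(knn, X, labels, cv=5).mean())
```

On synthetic noise the cross-validated accuracy hovers near chance, as expected; the point of the sketch is only the shape of the pipeline (epochs → Welch PSD → band-power features → KNN), which would be applied to real, labeled EEG epochs in practice.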
4 November 2025




