Article

Multi-Database EEG Integration for Subject-Independent Emotion Recognition in Brain–Computer Interface Systems

Jaydeep Panchal, Moon Inder Singh, Karmjit Singh Sandha and Mandeep Singh
1 Department of Electrical and Instrumentation Engineering, Thapar Institute of Engineering & Technology, Patiala 147004, India
2 Department of Electronics and Communication Engineering, Thapar Institute of Engineering & Technology, Patiala 147004, India
* Author to whom correspondence should be addressed.
Mathematics 2026, 14(3), 474; https://doi.org/10.3390/math14030474
Submission received: 18 November 2025 / Revised: 12 January 2026 / Accepted: 16 January 2026 / Published: 29 January 2026

Abstract

Affective computing has emerged as a pivotal field in human–computer interaction. Recognizing human emotions from electroencephalogram (EEG) signals can advance our understanding of cognition and support healthcare applications. This study introduces a novel subject-independent emotion recognition framework that integrates multiple EEG emotion databases (DEAP, MAHNOB HCI-Tagging, DREAMER, AMIGOS, and REFED) into a unified dataset. EEG segments were transformed into feature vectors capturing statistical, spectral, and entropy-based measures. Standardized pre-processing, analysis of variance (ANOVA) F-test feature selection, and six machine learning classifiers (Decision Tree, Discriminant Analysis, Support Vector Machine (SVM), K-Nearest Neighbor (K-NN), Naive Bayes, and Artificial Neural Network (ANN)) were applied to the extracted features. Experimental results demonstrate that SVM achieved the best performance for arousal classification (70.43%), while ANN achieved the highest accuracy for valence classification (68.07%), with both models generalizing well across subjects. These results highlight the feasibility of developing biomimetic brain–computer interface (BCI) systems for objective assessment of emotional intelligence and its cognitive underpinnings, enabling scalable applications in affective computing and adaptive human–machine interaction.
Keywords: cognition; brain computer interface; emotion; subject-independent classification; emotional intelligence; arousal; valence
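
The pipeline described in the abstract (per-segment statistical, spectral, and entropy-based features; standardization; ANOVA F-test feature selection; a classifier such as SVM evaluated across subjects) can be illustrated with a minimal sketch. The sketch below is not the authors' implementation: the sampling rate, frequency bands, feature set, number of selected features, synthetic data, and subject grouping are all illustrative assumptions.

# Minimal sketch (assumed details, not the paper's code): statistical/spectral/
# entropy features per EEG segment, standardization, ANOVA F-test selection,
# and an SVM evaluated subject-independently with grouped cross-validation.
import numpy as np
from scipy.signal import welch
from scipy.stats import skew, kurtosis
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.svm import SVC
from sklearn.model_selection import GroupKFold, cross_val_score

FS = 128  # assumed sampling rate (Hz)

def segment_features(seg):
    """Statistical, band-power, and entropy features for one EEG channel segment."""
    freqs, psd = welch(seg, fs=FS, nperseg=min(len(seg), 256))
    p = psd / psd.sum()                                  # normalized spectrum
    spectral_entropy = -np.sum(p * np.log2(p + 1e-12))   # entropy-based measure
    def band_power(lo, hi):
        m = (freqs >= lo) & (freqs < hi)
        return np.trapz(psd[m], freqs[m])
    return [
        seg.mean(), seg.std(), skew(seg), kurtosis(seg),  # statistical
        band_power(4, 8), band_power(8, 13),              # theta, alpha
        band_power(13, 30), band_power(30, 45),           # beta, gamma
        spectral_entropy,
    ]

# Synthetic stand-in data: 200 segments, 32 channels, 4 s at 128 Hz,
# binary arousal labels, and a subject id per segment (10 hypothetical subjects).
rng = np.random.default_rng(0)
segments = rng.standard_normal((200, 32, 4 * FS))
y = rng.integers(0, 2, size=200)
subjects = rng.integers(0, 10, size=200)

X = np.array([np.concatenate([segment_features(ch) for ch in s]) for s in segments])

clf = Pipeline([
    ("scale", StandardScaler()),
    ("select", SelectKBest(f_classif, k=64)),   # ANOVA F-test feature selection
    ("svm", SVC(kernel="rbf", C=1.0)),
])

# Subject-independent evaluation: folds never mix segments from the same subject.
scores = cross_val_score(clf, X, y, cv=GroupKFold(n_splits=5), groups=subjects)
print(f"subject-independent accuracy: {scores.mean():.3f} +/- {scores.std():.3f}")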

