Abstract
Background/Objectives: This systematic review examines how neural and emotional networks are integrated into EEG-based emotion recognition, bridging the gap between cognitive neuroscience and practical applications. Methods: Following PRISMA guidelines, 64 studies were reviewed that outline the latest developments in feature extraction and classification using deep learning models such as CNNs and RNNs. Results: The findings show that multimodal approaches are effective, especially combinations of EEG with other physiological signals, improving classification accuracy to above 90% in some studies. Key signal processing techniques, including spectral features, connectivity analysis, and frontal asymmetry detection, further enhanced recognition performance. Despite these advances, significant challenges remain in real-time EEG processing, where the trade-off between accuracy and computational efficiency limits practical implementation. The high computational cost of deep learning models hinders their use in real-world applications, indicating a need for the development and application of optimization techniques. Further obstacles to the generalizability of EEG-based emotion recognition systems include inconsistent emotion labeling, variation in experimental protocols, and the use of non-standardized datasets. Discussion: Addressing these challenges will require adaptive, real-time processing algorithms, integration of EEG with other inputs such as facial expressions and physiological sensors, and standardized protocols for emotion elicitation and classification. In addition, ethical issues concerning privacy, data security, and machine learning model bias must be addressed more explicitly before emotion research can be responsibly applied in areas such as healthcare, human–computer interaction, and marketing.
Conclusions: This review provides critical insight into and suggestions for further development in the field of EEG-based emotion recognition toward more robust, scalable, and ethical applications by consolidating current methodologies and identifying their key limitations.
1. Introduction
Emotions are arguably among the most important aspects of human life, closely linked to cognition and to the quality and richness of human experience [1,2,3]. The rapid development of affective computing technology has resulted in systems designed to measure, understand, and respond to human needs across several domains [4,5,6]. One of the principal subfields of affective computing is based on non-invasive EEG [7,8]. This subdomain is reviewed and discussed here in terms of human emotions, EEG channels and bands, applications, databases, and machine learning algorithms [9,10].
Advances in computer technology have enabled the creation of systems prepared to interpret, recognize, and express emotions. Examples include medical systems that diagnose and follow up on different emotional states, virtual reality applications that identify the user’s emotional state to adapt the environment or change the VR experience, and marketing applications that capture users’ reactions to different company products [11,12,13,14,15]. All these examples fall within the rubric of affective computing, a multidisciplinary area spanning computer science, psychology, and social science, whose purpose is to understand, measure, and respond to human emotions [16,17,18,19,20].
In artificial intelligence, recognizing bioelectric signals, especially electroencephalogram (EEG) signals, can reveal users’ emotional patterns. This is known as EEG-based emotion recognition [21,22,23]. The original intention of emotion recognition was to help people who could not perceive other people’s emotions well, including patients with autism, agnosia, and other neuropsychiatric diseases. More recently, however, research has mainly focused on real-world applications that help machines recognize people’s emotional states [24,25,26]. Nevertheless, advances in signal processing techniques and affective neuroscience have not been fully exploited in this decade-long line of research. This review aims to integrate and summarize recent progress in emotion recognition and to outline current challenges and recommendations for researchers interested in this topic [27,28].
This research aims to merge different complex knowledge bases, from affective neuroscience to signal processing. Furthermore, we hope the separation between researchers from different areas will gradually decrease after these vast and diverse studies are summarized. These communities work with the same EEG signal but with different expectations, and could benefit from each other’s designs. Our article describes the purpose and methods of each work and summarizes the significant conclusions of each study. It compares discrepancies among the main findings of different tasks and identifies future challenges for the research community. Specifically, we discuss the emotional feedback loops needed to integrate future real-world applications.
2. Literature Review
2.1. Neural Networks in Cognitive Neuroscience
The study and modeling of cognitive processes through a neurobiological approach are as old and diverse as the field of cognitive neuroscience itself [29,30,31]. Cognitive neuroscience has grown around integrating a pattern of specialized but segregated neural networks that process different categories of sensory input with abstract models of higher cognitive activity [32,33,34]. Moreover, while asymmetry in brain function was often beyond direct observation in external behavior or internal experience, the same dichotomy eventually appeared in psychological models of emotional and cognitive processes, notably with the identification in recent decades of a mentalizing network and a central executive large-scale network. Equally significant, although less researched, a growing body of evidence suggests that the core cognitive control role of the central executive network also extends to the processing of emotionally salient information, recruiting regions involved in emotional regulation for the top-down deployment of selective attention on emotional stimuli, particularly when resisting interfering emotional activity [35,36].
2.1.1. Fundamentals of Neural Networks
Artificial Neural Networks (ANNs) are computational models inspired by the structure and function of biological neural networks in the human brain. ANNs consist of interconnected layers of artificial neurons that process and transmit information through weighted connections. Typically, ANNs have an input layer, one or more hidden layers, and an output layer. The input layer receives raw data, the hidden layers transform this data through weighted computations and activation functions, and the output layer provides the final processed information, which can represent various outcomes, such as classification, regression, or decision-making tasks [37,38]. Each neuron in an ANN applies a mathematical function to its inputs, typically a weighted sum followed by an activation function. Standard activation functions include sigmoid, ReLU (Rectified Linear Unit), and tanh, which introduce non-linearity into the model, allowing it to learn complex patterns [39]. Training an ANN involves adjusting the weights of connections using optimization techniques such as gradient descent to minimize the difference between predicted and actual outputs. ANNs have broad applications, including pattern recognition, natural language processing, and speech recognition. In speech processing, ANNs can identify phonetic patterns even in noisy environments, enabling robust voice recognition systems [40,41]. One of the most used architectures is the multilayer perceptron (MLP), which consists of multiple layers of neurons and uses backpropagation for training [42]. Unlike biological neural networks, where neurons are specialized for different functions, such as sensory or motor neurons, the neurons in an ANN are generalized computational units. The output layer can take various forms depending on the application, ranging from a classification label to a numerical prediction [43].
Furthermore, real-time processing applications like robotics and autonomous systems can integrate ANNs with sensor input to enable adaptive decision-making based on incoming data [44,45].
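To make the forward pass described above concrete, the sketch below implements a one-hidden-layer network in NumPy: each layer computes a weighted sum followed by an activation function. This is a minimal illustration only, not a trainable implementation; the layer sizes and random weights are arbitrary choices for demonstration, and backpropagation is omitted.

```python
import numpy as np

def relu(x):
    # Rectified Linear Unit: zeroes out negative inputs, adding non-linearity
    return np.maximum(0.0, x)

def sigmoid(x):
    # Squashes inputs into (0, 1); common for binary classification outputs
    return 1.0 / (1.0 + np.exp(-x))

def mlp_forward(x, W1, b1, W2, b2):
    # Hidden layer: weighted sum -> ReLU; output layer: weighted sum -> sigmoid
    h = relu(x @ W1 + b1)
    return sigmoid(h @ W2 + b2)

rng = np.random.default_rng(0)
x = rng.standard_normal(4)                         # 4 input features
W1, b1 = rng.standard_normal((4, 8)), np.zeros(8)  # input -> hidden weights
W2, b2 = rng.standard_normal((8, 1)), np.zeros(1)  # hidden -> output weights
y = mlp_forward(x, W1, b1, W2, b2)                 # scalar output in (0, 1)
```

Training would consist of repeatedly comparing `y` to a target label and adjusting `W1`, `b1`, `W2`, and `b2` by gradient descent, as described in the text.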
2.1.2. Applications in Cognitive Neuroscience
Direct observation of the neural activity evoked by emotional stimuli and scenarios is often needed to understand the neural dynamics of emotion. Practical and precise techniques for measuring the brain’s emotional responses are therefore indispensable. With their high temporal resolution, simultaneously recorded electrophysiological signals, such as the electroencephalogram and magnetoencephalogram, capture significant information about neural processes at the millisecond level during different emotional states, which modern machine learning models use as features to recognize emotional states [46]. In cognitive neuroscience, EEG is routinely recorded while subjects watch video clips or pictures selected to elicit specific emotions [47]. EEG is also widely applied to record brain responses during face-to-face social-emotional communication [48]. With a more natural experimental setup that does not pressure the subjects, real emotions can be elicited effectively. Compared with facial expressions and speech data, which face practical disadvantages such as the need for a well-focused context and susceptibility to environmental noise, the availability of well-known, pre-processed, and widely acknowledged databases of objectively collected neural signals makes EEG seemingly more suitable for uncontrolled, natural environments [49].
2.2. Emotional Networks and EEG-Based Emotion Recognition
Motivated by the urgent need for real-time automatic emotion recognition in unobtrusive cognitive models of complex human behaviors, this systematic review first comprehensively surveys recent progress in emotion recognition by analyzing how it is used in cognitive neuroscience. We aim to extract key techniques and methods for developing the next generation of cognitive models of emotion recognition. We then identify a gap between cognitive neural models of dynamic emotional processes and static, rule-, plasticity-, or classifier-based computational models of emotion recognition [50]. Motivated by this gap, the review finally shows how real-world EEG-based affective computational emotion recognition has developed and may develop further, and outlines emotional and neural features unique to EEG data that can support and improve the next generation of cognitive neural models of human emotions and emotional disorders [51]. Neural emotional networks encode emotions through core brain structures such as the amygdala, thalamus, prefrontal cortex, and related regions. This functional network provides essential clues for neural models of emotion; that is, beyond neural plausibility, cognitive and real-world behavioral models should preferably capitalize on information measured from these emotional networks [52]. This capability is consistent with our previous work on cognitive neuro-economical models. Ideally, such models operate on wireless measurements of emotional or implicit feedback from these neural elements and should be non-invasive and cost-effective for real-world applications [53,54,55].
2.2.1. Understanding Emotional Networks
Understanding the process underlying the generation and processing of emotional states is a key challenge in emotional and social neuroscience. The Papez circuit, consisting primarily of the prefrontal cortex, cingulate cortex, subcortical regions, and hippocampus, was initially described as the “emotional brain” and was accepted as the most relevant pathway for understanding the generation of basic emotional states for several decades. However, as new data has been made available, these ideas have been revisited. Currently, this network is considered part of the default mode network, including the anterior cingulate cortex and the insula [56]. An extensive body of evidence demonstrates that multiple brain structures generate, recognize, and regulate emotional reactions. These structures include the prefrontal cortex and limbic system-related regions, including the amygdala, anterior cingulate cortex, insula, orbital frontal cortex, and ventral striatum [57]. These and additional areas not considered part of the default mode network are more strongly activated during the generation or perception of emotional states [58,59].
A significant part of the research describing the neurobiological aspects of emotion recognition has emerged from studying patients with neurological or psychiatric disorders. Patients with lesions in specific regions of the brain, particularly the prefrontal cortex and the limbic region, tend to present deficits in generating emotions [60]. Moreover, studies of subjects diagnosed with depression, anxiety, bipolar disorder, antisocial personality, and schizophrenia have reported significant impairments in their ability to produce, interpret, and/or regulate their own and others’ emotions during interpersonal communication, impairments that are correlated with aspects of social cognition [61]. These observations suggest that success in emotionally motivated behaviors depends on integrating numerous physical sensations, thoughts, and ongoing contextual and affective experiences and promptly producing the most appropriate responses. They also underscore the importance of engaging the right NRF in each moment of interpersonal interaction [62,63].
2.2.2. EEG-Based Emotion Recognition Techniques
EEG-based emotion recognition relies on analyzing electrical activity in the brain to identify emotional states. This process typically involves two main steps: feature extraction and emotion classification. Feature extraction identifies and isolates relevant EEG signal characteristics, while classification applies machine learning or statistical techniques to categorize emotions based on these extracted features. Feature extraction is a crucial step because it aims to derive discriminative features that represent the underlying emotional states. The following are several commonly used feature extraction techniques:
- Event-Related Potentials (ERPs): ERPs are transient, time-locked EEG responses to specific sensory, cognitive, or affective stimuli. They are obtained by averaging EEG responses across multiple presentations of the same event, allowing researchers to isolate neural activity associated with emotional processing. Key ERP components, such as P300, N200, and LPP (Late Positive Potential), have been linked to emotional processing and cognitive evaluation of stimuli. ERPs provide high temporal resolution, which is ideal for tracking dynamic emotional responses. However, their dependency on repeated trials and controlled experimental conditions can limit their real-world applicability [64,65,66].
- Spectral Features and Frequency Band Analysis: EEG signals are decomposed into different frequency bands—delta (0.5–4 Hz), theta (4–8 Hz), alpha (8–13 Hz), beta (13–30 Hz), and gamma (30–100 Hz)—each of which plays a role in cognitive and emotional processes. For instance, increased theta and delta power is often associated with emotional arousal, while alpha asymmetry between hemispheres is linked to emotional valence (e.g., greater left-hemisphere alpha activity is associated with positive emotions, while right-hemisphere dominance correlates with negative emotions). Spectral power analysis helps quantify these variations and is widely used in emotion recognition studies [67,68,69].
- Power Asymmetry Analysis: Emotional states are often associated with hemispheric asymmetry in EEG activity, particularly in the frontal cortex. The frontal alpha asymmetry (FAA) model suggests that greater left-frontal alpha activity is linked to approach-related positive emotions, whereas right-frontal alpha activity is associated with withdrawal-related negative emotions. Power asymmetry analysis can effectively distinguish between emotional states, making it a valuable feature for EEG-based emotion recognition [70].
- Time-Frequency Analysis: This method integrates temporal and spectral information, allowing researchers to track how EEG power distributions evolve. Wavelet Transform (WT) and Short-Time Fourier Transform (STFT) are commonly used for time-frequency analysis in emotion recognition. These methods provide insights into transient changes in EEG rhythms that correlate with emotional responses, enhancing the robustness of emotion classification models [71].
- Effective Connectivity Analysis: Beyond analyzing isolated EEG components, effective connectivity methods assess how different brain regions communicate during emotional processing. Techniques such as Granger causality analysis, phase-locking value (PLV), and dynamic causal modeling (DCM) help quantify the directional influence of neural activity between brain regions. Effective connectivity metrics are beneficial for understanding the neural circuits underlying emotional experiences and have been applied in advanced emotion recognition frameworks [72,73].
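Two of the feature types above, band power and frontal alpha asymmetry, can be sketched in a few lines of Python using SciPy’s Welch periodogram. This is a minimal illustration on a synthetic signal, not the method of any reviewed study: the sampling rate, the use of frontal channels F3/F4, and the sign convention for FAA (log right-alpha minus log left-alpha) are assumptions chosen for demonstration, and FAA conventions vary across the literature.

```python
import numpy as np
from scipy.signal import welch

FS = 256  # assumed sampling rate in Hz
BANDS = {"delta": (0.5, 4), "theta": (4, 8), "alpha": (8, 13),
         "beta": (13, 30), "gamma": (30, 100)}

def band_power(signal, band, fs=FS):
    # Welch power spectral density, then summed power within the band
    freqs, psd = welch(signal, fs=fs, nperseg=fs * 2)
    lo, hi = band
    mask = (freqs >= lo) & (freqs < hi)
    return psd[mask].sum() * (freqs[1] - freqs[0])

def frontal_alpha_asymmetry(left, right, fs=FS):
    # One common convention: log(right alpha power) - log(left alpha power),
    # e.g. hypothetical channels F4 vs. F3
    a_l = band_power(left, BANDS["alpha"], fs)
    a_r = band_power(right, BANDS["alpha"], fs)
    return np.log(a_r) - np.log(a_l)

# Synthetic 4 s epochs: a 10 Hz (alpha) component plus broadband noise,
# with stronger alpha on the simulated left channel
t = np.arange(0, 4, 1 / FS)
rng = np.random.default_rng(1)
f3 = 2.0 * np.sin(2 * np.pi * 10 * t) + 0.5 * rng.standard_normal(t.size)
f4 = 1.0 * np.sin(2 * np.pi * 10 * t) + 0.5 * rng.standard_normal(t.size)
faa = frontal_alpha_asymmetry(f3, f4)  # negative: more alpha on the left
```

In a classification pipeline, band powers and asymmetry scores like these would form the feature vector passed to a classifier, as described in the text.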
2.3. Integration of Neural Networks and Emotional Networks
Integrating Artificial Neural Networks (ANNs) and Emotional Networks in EEG-based emotion recognition offers a promising avenue for enhancing our understanding of the neural mechanisms underlying affective states. This integration bridges computational intelligence with insights from cognitive neuroscience, enabling more accurate and adaptive emotion recognition systems. By leveraging deep learning techniques such as Convolutional Neural Networks (CNNs) and Recurrent Neural Networks (RNNs), emotion recognition models can extract discriminative EEG features while incorporating the dynamic properties of emotional networks observed in the brain’s limbic and prefrontal systems [74,75,76,77,78,79].
This paper presents a systematic review that explores EEG-based emotion recognition from the novel perspective of integrating neural and emotional networks. Unlike previous studies that primarily focused on either machine learning techniques or neuroscience-driven approaches in isolation, our framework synthesizes both domains, emphasizing their complementary roles in emotion detection. Specifically, our analysis addresses:
- How neural networks enhance EEG feature extraction by utilizing deep learning models to capture complex temporal and spatial EEG dynamics related to emotions.
- How emotional networks contribute to understanding affective processing, particularly in the prefrontal cortex, amygdala, and anterior cingulate cortex, which play pivotal roles in emotional regulation and perception.
- How hybrid neural–emotional models can be applied to real-time, subject-adaptive emotion recognition, paving the way for personalized AI-driven affective computing.
2.4. Research Questions
The research questions below are structured to connect technological advancements, experimental design, emotion recognition methodologies, and real-world implementations. These questions aim to bridge interdisciplinary approaches while emphasizing the role of neural and emotional networks in enhancing our understanding and application of emotion recognition systems.
RQ1: [Data-Driven Techniques] How can machine learning, data augmentation, and signal processing methods enhance the extraction and classification of EEG features for emotion recognition across diverse datasets and real-world conditions? This question addresses the need for advanced computational methods that overcome variability in datasets, improving the scalability and accuracy of EEG-based models in dynamic and diverse environments.
RQ2: [Experimental Paradigms and Approaches] What experimental setups, protocols, and stimuli are most effective in eliciting robust, ecologically valid EEG responses for studying emotional and cognitive processes across diverse populations? This question highlights the importance of ecological validity by focusing on the experimental design. It explores how experimental protocols can simulate real-world emotional experiences.
RQ3: [Emotion Recognition Techniques] What are the strengths and limitations of current EEG-based emotion recognition methodologies, and how can multimodal integration improve their real-time performance and adaptability in practical applications? This question examines the methodologies and multimodal techniques that enhance the real-time applicability of EEG-based systems while addressing their inherent limitations, such as noise and variability.
RQ4: [Behavioral and Cognitive Insights] How do EEG patterns reflect individual differences in emotional and cognitive processing, and what insights can they provide into personality traits, behavioral responses, and neuropsychological conditions? This question delves into the relationship between neural activity and individual differences, emphasizing the potential for personalized emotion recognition systems.
RQ5: [Applications and Ethical Considerations] How can EEG-based emotion recognition systems be applied effectively in real-world contexts, such as healthcare, education, and marketing, while addressing technical challenges and ethical implications? The final question extends the discussion to practical implementations, exploring the potential for EEG-based systems to transform various industries and the ethical considerations that must be addressed in such applications.
These interrelated questions serve as a comprehensive guide to understanding the multifaceted challenges and opportunities in EEG-based emotion recognition. They align closely with the manuscript’s objectives, offering a structured approach to integrating theoretical insights with practical applications.
3. Materials and Methods
This systematic review explores the interface between cognitive neuroscience, affective neuroscience, and practical applications of EEG-based emotion recognition, following the PRISMA methodology (Supplementary Materials) [80]. It integrates studies on advances in the integration of neural and emotional networks, with special attention to EEG-based emotion detection methods and their applications in real-world contexts, including healthcare, human–computer interaction, and marketing. The review covers methodologies related to feature extraction, emotion classification, and multimodal approaches, underlining novel machine learning techniques and signal processing methods. Current trends are discussed, including challenges in adaptability, real-time processing, and scalability across different environments. The review examines the results of 64 studies in detail, underlining significant advancements and gaps, and thereby highlights the role of EEG-based emotion recognition in promoting interdisciplinary research and practical implementations.
3.1. Search Strategy
The query strings for each database/interface combination were designed to capture studies relevant to EEG-based emotion recognition, guided by the inclusion criteria and research objectives.
For PubMed, the query string was ((EEG OR “electroencephalogram”) AND (“emotion recognition” OR “affective neuroscience”) AND (“machine learning” OR “deep learning” OR CNN OR RNN)) AND (2000:2024 [Date-Publication]).
In Scopus, the search employed TITLE-ABS-KEY ((EEG OR “electroencephalogram”) AND (“emotion recognition” OR “affective neuroscience”) AND (“machine learning” OR “deep learning” OR CNN OR RNN)) AND PUBYEAR > 1999.
For Web of Science, the query was TS = (EEG OR “electroencephalogram”) AND TS = (“emotion recognition” OR “affective neuroscience”) AND TS = (“machine learning” OR “deep learning” OR CNN OR RNN) AND PY = (2000–2024).
Also, the Google Scholar search string used was “EEG emotion recognition” AND (“machine learning” OR “deep learning” OR CNN OR RNN) AND (“affective neuroscience”) after: 1999.
Finally, in PsycINFO via EBSCOhost, the query string was (DE “Electroencephalography” OR DE “EEG”) AND (DE “Emotion Recognition” OR DE “Affective Neuroscience”) AND (DE “Machine Learning” OR DE “Deep Learning”) AND (PY 2000–2024). These query strings incorporated Boolean operators and database-specific syntax to ensure comprehensive retrieval of relevant studies while filtering for publication years, methodologies, and focus areas.
The search has been carried out following a systematic PRISMA methodology to ensure comprehensiveness. A protocol detailing the objectives, eligibility criteria, information sources, and analysis methods was registered on Open Science Framework (https://osf.io/36qzj (accessed on 14 February 2025) | Registration DOI: 10.17605/OSF.IO/36QZJ) [81]. Database searches identified 298 records in the following databases: PubMed, Scopus, Web of Science, Google Scholar, and PsycINFO. After removing 73 duplicates, language restrictions excluded 18, and 28 records for publication dates before 2000 were removed; 15 more were for irrelevant titles, and thus, 164 remained to be screened. Of these, 24 records were excluded during screening as unrelated to the topic, and 21 were excluded as non-empirical works, that is, commentaries or opinion pieces. Full-text reports were sought for retrieval on 119 of these; 7 could not be accessed. Of the 112 reports assessed for eligibility, 11 were excluded due to insufficient methodological detail, 18 due to being conference abstracts, and 19 due to focusing on unrelated populations, such as animal studies. Finally, 64 studies were selected as eligible for inclusion in the final systematic review, providing a strong basis for the review (Figure 1).
Figure 1.
Flowchart of PRISMA methodology.
3.2. Inclusion and Exclusion Criteria
Inclusion and exclusion criteria were established to maintain focus on EEG-based emotion recognition within cognitive neuroscience and its real-world applications.
Studies were included if they met all of the following criteria:
- Presented empirical findings on EEG-based emotion recognition and its applications in healthcare, human–computer interaction, education, and marketing.
- Were published in peer-reviewed journal articles between 2000 and 2024.
- Utilized EEG signal processing, machine learning techniques, or emotion classification frameworks.
- Involved human participants and provided relevant neuroimaging data applicable to emotion recognition.
- Were published in English.
Studies were excluded if they met any of the following criteria:
- Were non-empirical works such as commentaries, reviews, opinion pieces, or theoretical articles without experimental data.
- Included conference abstracts, gray literature, or non-peer-reviewed publications.
- Were published in languages other than English.
- Were published before 2000.
- Focused on animal studies unless explicitly related to neural mechanisms applicable to human emotions.
- Lacked sufficient methodological detail or did not meet the rigor required for systematic analysis.
- Were not directly relevant to EEG-based methods, emotion recognition, or neuroimaging techniques within the scope of cognitive neuroscience.
These criteria ensured the inclusion of studies that adhered to methodological rigor while being directly applicable to the review’s objectives.
3.3. Risk of Bias Assessment
All studies selected for this review underwent a systematic assessment for bias with a tool appropriate to each study design (Table 1). The Cochrane Risk of Bias Tool was used for randomized controlled studies, while observational studies were assessed with the Newcastle–Ottawa Scale. Each study was evaluated across five domains: selection bias, performance bias, detection bias, attrition bias, and reporting bias (Figure 2). The selection of participants had a low risk of bias in studies that reported randomization techniques and appropriate inclusion/exclusion criteria, but the risk was unclear in a few studies that provided insufficient detail about randomization or recruitment methods. Regarding blinding, many EEG studies, especially those administering emotional stimuli, showed inconsistent practice: about 20% of the studies did not report, or reported inadequately, whether participants or researchers were blinded to the conditions, presenting a moderate risk of performance bias. Most studies showed robust EEG signal acquisition and processing techniques with appropriate validation for machine learning-based emotion classification; however, incomplete validation procedures or a lack of cross-validation in some studies gave them a higher risk of detection bias. Most studies had minimal participant dropout, but several did not report any methods for handling missing EEG data, introducing a moderate risk of attrition bias. In a few studies, selective outcome reporting was identified, with outcomes incompletely reported or replaced with exploratory findings, indicating a high risk of reporting bias. Of the 64 included studies, 60% were rated as low risk of bias, 25% as moderate risk, and 15% as high risk across the five domains.
Most studies that showed a high risk of bias lacked methodological detail in randomization, blinding, or data handling. These biases were considered during the synthesis of results by focusing on the findings of the low-risk studies and discussing critically any potential bias of higher-risk studies. This provides a more robust and valid conclusion from this systematic review.
Table 1.
Research articles of systematic analysis (N = 64).
Figure 2.
Risk of bias assessment across five domains.
4. Results
The results of this systematic review explore the integration of EEG-based emotion recognition methodologies, neural networks, and their applications across diverse fields. The 64 included studies reveal the main trends in EEG feature extraction, classification techniques, and emotion recognition performance. The review discusses the leading machine learning methods, the most used EEG channels, and frequency bands, and provides visualizations that help the reader understand the strengths and limitations of each technique regarding performance, scalability, and practical applicability in real-world applications such as healthcare, human–computer interaction, and marketing. The chart below (Figure 3) compares machine learning techniques in EEG-based emotion recognition studies. Techniques such as CNN, RNN, SVM, and deep learning are assessed across key dimensions: performance, scalability, cost, temporal resolution, and spatial resolution. The results highlight the strengths and trade-offs of each approach. For example, deep learning excels in performance and temporal resolution, while SVM stands out for its cost-effectiveness and scalability. CNN achieves a balance across most metrics, whereas RNN demonstrates competitive scalability and temporal performance.
Figure 3.
Comparison of EEG techniques based on research insights.
4.1. [RQ1]: How Can Machine Learning, Data Augmentation, and Signal Processing Methods Enhance the Extraction and Classification of EEG Features for Emotion Recognition Across Diverse Datasets and Real-World Conditions?
Understanding EEG-based emotion recognition requires integrating advanced signal processing, feature extraction, and machine learning techniques. This section provides a detailed analysis, structuring the findings into distinct subsections: signal processing methods, feature extraction techniques, classification models, real-world applications, and methodological considerations. The extended discussion critically analyzes state-of-the-art methods and their limitations, fostering deeper insight into emerging trends and potential research directions.
4.1.1. Signal Processing Techniques for EEG Preprocessing
Effective preprocessing to reduce noise and enhance signal quality is indispensable for emotion recognition. Independent component analysis (ICA) remains one of the gold-standard techniques for eliminating electrical, muscular, and ocular artifacts [109]. However, ICA has significant limitations, particularly its sensitivity to parameter selection and its computational cost. As an alternative, automatic artifact removal (AAR) has been introduced to eliminate eye-movement and blinking artifacts [128]. The reference electrode standardization technique (REST) increases emotional modulation effects at occipitotemporal electrodes and, when combined with band-pass filtering of 0.1–120 Hz, enhances signal clarity [128]. Normalization techniques, such as z-score standardization and min–max scaling, reduce inter-participant variability, allowing more robust generalization across datasets [132]. Other recent extensions include Kalman filtering for spike removal [140], cumulative general linear modeling to account for drift, and autoregressive modeling to correct serial correlations. These methods yield considerable improvements in EEG signal clarity, thereby lowering error rates in subsequent classification tasks. However, they increase the computational load and, as they are not yet standardized across research groups, still require validation.
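As an illustrative sketch of the filtering and normalization steps described above (not any specific study's pipeline), the following Python snippet applies a zero-phase Butterworth band-pass filter followed by per-channel z-score standardization to synthetic multichannel data; the sampling rate and band edges are assumptions chosen for the example.

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt

def bandpass(eeg, low, high, fs, order=4):
    """Zero-phase Butterworth band-pass filter over the last axis (samples)."""
    sos = butter(order, [low, high], btype="band", fs=fs, output="sos")
    return sosfiltfilt(sos, eeg, axis=-1)

def zscore(eeg):
    """Per-channel z-score standardization, reducing inter-participant variability."""
    return (eeg - eeg.mean(axis=-1, keepdims=True)) / eeg.std(axis=-1, keepdims=True)

fs = 256                                   # illustrative sampling rate, Hz
rng = np.random.default_rng(0)
raw = rng.standard_normal((32, fs * 10))   # 32 channels, 10 s of synthetic data
clean = zscore(bandpass(raw, 0.5, 45.0, fs))
```

Using the second-order-sections (`sos`) form keeps the filter numerically stable at low cutoff frequencies, and `sosfiltfilt` runs it forward and backward so no phase distortion is introduced into the temporal features extracted later.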
4.1.2. Feature Extraction: Temporal, Spectral, and Connectivity Analysis
Feature extraction determines how well emotional states can be discriminated from EEG data. Temporal features include the EEP, occurring 120–180 ms post-stimulus at the Fz, Cz, and Pz electrodes [114], and the feedback-related negativity (FRN), observed 200–400 ms post-feedback at frontocentral electrodes; both provide key information for emotion recognition. It is essential to mention the 10–20 system, a standardized electrode placement method for EEG recordings that ensures consistency across research and clinical applications. Electrodes are positioned relative to specific brain regions, facilitating the study of cognitive and affective processes. In this system, electrodes are labeled based on their location:
- Frontal (F), Central (C), Parietal (P), Temporal (T), and Occipital (O) regions.
- Electrodes along the midline are denoted with a “z” (e.g., Fz, Cz, Pz) to indicate their position along the sagittal plane.
Key midline electrodes include the following:
- Fz (Frontal Midline)—Located in the prefrontal cortex, associated with cognitive control, attention regulation, and decision-making.
- Cz (Central Midline)—Positioned at the vertex of the scalp, crucial for sensorimotor processing and movement-related potentials.
- Pz (Parietal Midline)—Situated over the parietal cortex, involved in spatial cognition, working memory, and sensory integration.
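The labeling rules above can be captured in a small, hypothetical helper; the `REGIONS` mapping and `electrode_region` function are illustrative conveniences, not part of any cited toolkit.

```python
# Hypothetical helper illustrating the 10-20 naming convention described above.
REGIONS = {"Fp": "prefrontal", "F": "frontal", "C": "central",
           "P": "parietal", "T": "temporal", "O": "occipital"}

def electrode_region(label):
    """Return (region, is_midline) for a 10-20 label such as 'Fz' or 'P3'."""
    prefix = "Fp" if label.startswith("Fp") else label[0]
    return REGIONS[prefix], label.endswith("z")
```

For example, `electrode_region("Fz")` identifies a frontal midline electrode, while `electrode_region("O1")` identifies a lateral occipital one.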
The accompanying EEG electrode placement map (Figure 4) visually represents the 10–20 system, highlighting key electrode locations and their functional relevance. The layout ensures optimal coverage of neural activity, supporting accurate signal acquisition for emotion recognition and cognitive neuroscience applications. This standardized electrode configuration plays a fundamental role in EEG-based emotion recognition, providing a structured framework for analyzing emotional, cognitive, and physiological responses in real-world and experimental settings.
Figure 4.
EEG electrode placement (10–20 system).
Spectral feature analysis reveals emotion-specific frequency band dynamics. Delta-band activity is associated with motivational and behavioral inhibition processes [143], whereas gamma-band activity corresponds to the processing of positive emotional imagery [144]. Theta oscillations (4–8 Hz) have been related to cognitive control and emotional regulation, particularly in frontal regions [145]. Connectivity analyses provide higher-order views of neural interactions. Techniques such as lagged phase synchronization [109], discriminative spatial network patterns [129], and the imaginary part of coherency [135] reduce volume conduction artifacts and provide deeper insights into network-wide emotional processing. While these approaches enhance feature extraction, they require high computational power, making real-time applications impractical without further optimization.
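A minimal sketch of spectral feature extraction for one channel, assuming Welch's method and illustrative band edges (the exact bands and PSD parameters vary across the reviewed studies):

```python
import numpy as np
from scipy.signal import welch

BANDS = {"delta": (1, 4), "theta": (4, 8), "alpha": (8, 13),
         "beta": (13, 30), "gamma": (30, 45)}   # illustrative band edges, Hz

def band_powers(signal, fs):
    """Mean Welch PSD within each canonical frequency band for one channel."""
    freqs, psd = welch(signal, fs=fs, nperseg=2 * fs)
    return {name: psd[(freqs >= lo) & (freqs < hi)].mean()
            for name, (lo, hi) in BANDS.items()}

fs = 128
t = np.arange(0, 10, 1 / fs)
# Synthetic channel dominated by a 6 Hz (theta-band) oscillation plus noise
sig = np.sin(2 * np.pi * 6 * t) + 0.1 * np.random.default_rng(1).standard_normal(t.size)
powers = band_powers(sig, fs)
```

On this synthetic channel, the theta band correctly dominates the resulting feature vector; in practice such band powers, computed per channel, form the spectral features fed to the classifiers discussed below.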
4.1.3. Classification Techniques: Traditional and Deep Learning Approaches
Both traditional machine learning and deep learning models have been applied to EEG-based emotion recognition. Traditional classifiers, such as L1-regularized logistic regression, are computationally efficient and generalize well [132]. Polynomial-kernel SVMs have also proven robust in emotion classification tasks [102]. Deep learning models have significantly improved classification performance: Shallow ConvNet reaches 99.65% accuracy in mixed-subject scenarios, while Deep ConvNet reaches 95.43% [128]. Transfer learning approaches that integrate intra- and inter-subject classification strategies have further improved performance on limited datasets. However, deep learning models require large amounts of labeled data, which remains challenging given the few publicly available EEG datasets. A seminal study [134] showed that deep learning models could predict conflict engagement with 95% accuracy, 33% above chance. That work identified key neurophysiological markers in the occipital cortex and superior frontal gyrus, demonstrating the potential of deep networks to decode emotion-related neural signatures. However, the black-box nature of deep learning models limits interpretability, which introduces serious risks in critical domains such as healthcare.
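The two traditional classifiers named above can be sketched on synthetic features as follows; the data, hyperparameters (`C`, `degree`, `coef0`), and train/test split are illustrative assumptions, not values from the cited studies.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

rng = np.random.default_rng(42)
# Synthetic "EEG features": labels depend linearly on the first three features
X = rng.standard_normal((200, 20))
y = (X[:, :3].sum(axis=1) > 0).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

# Sparse, computationally cheap baseline (L1 keeps only informative features)
l1_lr = LogisticRegression(penalty="l1", solver="liblinear", C=0.5).fit(X_tr, y_tr)
# Polynomial-kernel SVM, as used in several reviewed studies
poly_svm = SVC(kernel="poly", degree=3, coef0=1.0).fit(X_tr, y_tr)

lr_acc = l1_lr.score(X_te, y_te)
svm_acc = poly_svm.score(X_te, y_te)
```

The L1 penalty zeroes out uninformative dimensions, which is one reason such baselines remain attractive for high-dimensional EEG feature sets despite the accuracy edge of deep models.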
4.1.4. Real-World Applications and Wearable EEG Devices
Wearable EEG devices have revolutionized emotion recognition applications outside the laboratory. The Emotiv EPOC+ headset successfully measures six neural emotional parameters [113], and the Muse headband shows adequate reliability in ecological contexts [87]. Despite such progress, environmental noise and inter-subject variability remain formidable challenges [89]. The clinical applications of EEG-based neurofeedback training are promising: machine learning models have classified symptom severity in PTSD patients and predicted treatment outcomes for depression [118]. These studies suggest that EEG-based emotion recognition may have profound implications for diagnostics and intervention strategies in mental health. However, real-world implementation faces several challenges: movement artifacts contaminate signals, standardized protocols are lacking, and portable EEG systems have poor battery life. These barriers can only be overcome through further development of adaptive noise filtering techniques and more efficient feature extraction methods.
4.1.5. Methodological Considerations and Future Directions
The evaluation of an EEG-based emotion recognition system must not rest solely on classification accuracy; AUC, sensitivity, specificity, and kappa statistics further indicate model quality. Cross-validation remains essential to avoid overfitting [132], but variations in validation strategy across studies call for unified benchmarking. One study [86] showed that a 30 Hz low-pass filter combined with a 0.16 Hz high-pass filter could retain emotional signals while eliminating noise; these parameters may not suit all datasets, so adaptive filtering methods may be more appropriate. Another important factor contributing to generalizability is dataset diversity. A demographically balanced study of 11 participants (6 males, 5 females; 7 Asians and 4 Caucasians) underlined the importance of diverse training data for emotion recognition models [128]. Moreover, naturalistic paradigms such as emotional film clips and IAPS images increase ecological validity [97,135], further ensuring that emotion recognition systems perform well in the wild. Multimodal integration opens new perspectives for EEG-based emotion recognition: robustness could be further improved by combining EEG with fMRI [127], behavioral measures [87], and physiological markers such as heart rate variability. Future work should consider adaptive approaches that account for individual differences in neural responses, including variance mediated by creative self-efficacy and alexithymia-related differences in emotional processing [145].
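A hedged sketch of multi-metric evaluation under cross-validation, using scikit-learn's `cross_validate` with accuracy, ROC AUC, and Cohen's kappa on synthetic data (the fold count, classifier, and data are illustrative, not drawn from the reviewed studies):

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import cohen_kappa_score, make_scorer
from sklearn.model_selection import cross_validate

rng = np.random.default_rng(7)
# Synthetic features: the class depends on the first feature plus label noise
X = rng.standard_normal((150, 10))
y = (X[:, 0] + 0.5 * rng.standard_normal(150) > 0).astype(int)

# Report several complementary metrics under 5-fold cross-validation
scores = cross_validate(
    LogisticRegression(max_iter=1000), X, y, cv=5,
    scoring={"accuracy": "accuracy", "roc_auc": "roc_auc",
             "kappa": make_scorer(cohen_kappa_score)})
mean_auc = scores["test_roc_auc"].mean()
```

Reporting the per-fold spread of each metric, not just accuracy, exposes overfitting and chance-level agreement that a single held-out accuracy figure would hide.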
4.1.6. EEG Emotion Recognition Techniques and Performance
Figure 5 below gives an integrated visual view of the effectiveness and performance of different EEG-based emotion recognition methodologies from three critical standpoints: (a) signal preprocessing techniques, (b) classification models, and (c) feature extraction techniques. Effectiveness is shown in blue and performance is overlaid in red, giving an account of each methodology's trade-offs and strengths. The key insights from the radar chart are as follows:
Figure 5.
Radar chart of EEG emotion recognition techniques and performance.
- Preprocessing Techniques
  - Independent Component Analysis (ICA) and normalization exhibit the highest effectiveness and performance, making them crucial for artifact removal and data standardization.
  - Kalman filtering and REST demonstrate moderate effectiveness but perform worse due to computational complexity and real-time limitations.
- Classification Models
  - Deep learning models (Shallow ConvNet, Deep ConvNet) outperform traditional methods, achieving high accuracy and robustness.
  - SVM and logistic regression, while computationally efficient, show lower performance than deep learning approaches.
- Feature Extraction Methods
  - Temporal, spectral, and connectivity-based feature extraction techniques all contribute significantly to emotion recognition.
  - Spectral analysis demonstrates higher performance, likely due to its ability to capture relevant frequency-based neural patterns.
This analysis again shows a trade-off among accuracy, computational efficiency, and real-world applicability. While deep learning models and advanced preprocessing techniques have demonstrated high performance, their computational cost presents a challenge for real-time EEG emotion recognition systems. Further research should therefore optimize these techniques to ensure both high accuracy and feasibility in real-life applications.
In conclusion, machine learning, data augmentation, and signal processing techniques have significantly advanced EEG-based emotion recognition. Deep learning-based approaches achieve state-of-the-art classification performance, while new multimodal strategies promise auspicious improvements in real-world applicability. Remaining challenges include inter-subject variability and sensitivity to environmental noise. Refined preprocessing techniques, more diverse datasets, and the integration of complementary modalities will be necessary in future studies to further improve robustness and applicability. By continuously incorporating advances in computational models and tackling these methodological challenges, EEG-based emotion recognition could deliver real-life impact across application domains ranging from healthcare to human–computer interaction.
4.2. [RQ2] What Experimental Setups, Protocols, and Stimuli Are Most Effective in Eliciting Robust, Ecologically Valid EEG Responses for Studying Emotional and Cognitive Processes Across Diverse Populations?
Researchers have developed and refined various experimental setups, protocols, and stimuli to investigate emotional and cognitive processes using EEG across diverse populations. The most effective methodologies integrate standardized and naturalistic stimuli, multimodal approaches, rigorous experimental designs, and considerations for population-specific factors. The complexity of these experimental designs and their increasing relevance in applied settings necessitate a closer examination of each domain’s strengths, limitations, and advancements.
4.2.1. Stimuli for Inducing Robust Emotional and Cognitive EEG Responses
Standardized image databases, such as the IAPS, have commonly been used to present emotionally salient stimuli in EEG experiments, ensuring comparable responses across subject groups. These images are presented in blocks of positive, negative, or neutral content, eliciting reliable neural correlates of emotion, with cool-down periods included to minimize carry-over effects [135,139]. More recently, film clips have seen increasing use because of their greater ecological validity and dynamic nature, capturing more complex emotional responses than static images alone [97]. Researchers have found that point-light displays effectively assess emotion recognition using minimalist visual cues that isolate fundamental aspects of emotional processing [124,137]. Advances have also been made in creating more naturalistic social stimuli, with animated 3D avatars proving particularly useful for capturing emotion recognition and social cognition mechanisms [86]. Dynamically morphing emotional faces, generated by face-morphing algorithms, have further improved stimulus realism and validity, enabling a more nuanced exploration of emotion recognition and regulation [140]. Recent studies have moved to multimodal stimulation, combining visual and auditory stimuli to increase ecological validity. For instance, presenting IAPS images with emotionally congruent auditory stimulation has been shown to enhance EEG responses, indicating a deeper interaction between visual and auditory emotional processing [139]. Researchers investigating resting-state EEG paradigms have used slideshow presentations of natural versus urban environments to study their impact on emotional states, demonstrating that environmental context plays a crucial role in modulating neural responses [109].
The increasing adoption of multimodal approaches highlights the importance of integrating multiple sensory channels into experimental setups to capture a more comprehensive picture of emotional and cognitive processing.
4.2.2. Experimental Protocols and Paradigms
Refinements to experimental paradigms have significantly enhanced the robustness of EEG research on emotional and cognitive processes. The Affective Posner Task has been used to study emotion regulation patterns through attentional shifts, indicating that attentional biases toward emotionally salient cues modulate early- and late-stage EEG components [134]. Cognitive reappraisal tasks have provided key insights into how individuals actively regulate emotional reactivity, with findings indicating distinct neural activation patterns for successful versus unsuccessful emotion regulation [120]. The Go/NoGo Emotion Recognition Task has been employed to examine the rapid categorization of emotional stimuli; ERP trials show distinct activity patterns in response to positive and negative facial expressions at approximately 170 ms at occipitotemporal electrodes [132]. These findings suggest emotion processing may involve both early perceptual mechanisms and higher-order cognitive appraisals. Resting-state EEG has become a meaningful way to establish a baseline of neural activity associated with various emotional states. Many such studies, using pre- and post-stimulus resting-state EEG recordings, have investigated functional connectivity changes following an affective stimulus, which are differentially affected across populations [109]. The value of EEG for tracking stability and change in emotional processing over extended periods has also been shown in longitudinal studies. For example, neurofeedback training programs based on EEG markers of emotional regulation have reported significant changes in neural activity after weeks of training [82,142]. Longitudinal EEG studies have provided important information about the efficacy of therapeutic interventions and training regimens in modifying emotional and cognitive responses.
4.2.3. Technological Considerations in EEG Data Collection
Improvements in EEG technology have enabled better data capture under varying experimental conditions. High-resolution systems, such as 64-channel NeuroScan arrangements, have been employed for various neurophysiological recordings, giving excellent temporal and spatial resolution of brain activity [132]. This trend is further supported by the increasingly frequent use of portable EEG headsets, such as the Muse and Emotiv EPOC+, in real-world research settings due to their ease of use and improving signal quality, enabling data collection in more naturalistic conditions [87,113]. Integration with other physiological measures such as fMRI, ECG, and GSR has further strengthened the capability to record complete data on both affective and cognitive processes [95,118,127]. Indeed, co-registered EEG-fMRI has been shown to provide complementary insights into the interplay between neural dynamics and brain structure, especially regarding emotion regulation and decision-making processes [127]. Real-time neurofeedback applications have opened new avenues for adaptive interventions in emotional regulation. Studies using EEG-based video-game training report that participants can actively self-modulate their neural responses through continuous feedback, resulting in long-term improvements in emotional control and cognitive flexibility [88,89,140]. Such findings indicate the potential of EEG neurofeedback paradigms for therapeutic use, especially in populations requiring emotion regulation training. Preprocessing techniques such as realignment, spatial smoothing, and temporal filtering have enhanced the clarity of EEG signals, allowing high-quality data collection and analysis [140]. Sampling rates as high as 5000 Hz have enabled the capture of precise temporal patterns of neural activity associated with emotional processing.
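As a sketch of how such high-rate recordings are commonly reduced for analysis (an assumption about the workflow, not a protocol from the cited studies), `scipy.signal.decimate` downsamples while applying a built-in anti-aliasing low-pass filter, so low-frequency content of interest survives the rate reduction:

```python
import numpy as np
from scipy.signal import decimate

fs_raw, fs_target = 5000, 500       # acquisition rate from the text; illustrative target
t = np.arange(0, 2, 1 / fs_raw)
eeg = np.sin(2 * np.pi * 10 * t)    # 10 Hz alpha-range test tone

# decimate low-pass filters before downsampling, preserving the 10 Hz tone
# while cutting storage and compute tenfold
eeg_ds = decimate(eeg, q=fs_raw // fs_target, ftype="fir", zero_phase=True)
```

A FIR filter with `zero_phase=True` avoids introducing phase shifts that would distort ERP latencies in subsequent temporal-feature analysis.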
4.2.4. Population-Specific and Contextual Considerations
EEG investigations have diversified to include an increasingly broad range of populations, allowing complex interactions between emotion and cognition to be investigated across demographic and clinical groups. EEG studies of terminal cancer patients receiving palliative care indicate altered affective processing due to chronic illness [122]. Chronic stroke patients also show distinct EEG signatures of emotional dysregulation, indicating a need for individually tailored interventions in neurorehabilitation [130]. The examination of EEG responses in female genocide survivors has provided critical insights into trauma-related emotional processing and the potential for neurofeedback-based interventions [89]. Professional groups, such as managers versus non-managers, have also been compared, showing that occupational experiences shape cognitive and affective processing patterns [87]. Further, cultural influences have been widely investigated, with EEG studies including a variety of ethnic backgrounds to identify universal and culture-specific patterns of emotional processing [128].
4.2.5. Future Directions and Integrative Approaches
Future EEG research should extend the use of real-world settings to enhance ecological validity, with field studies assessing affective and cognitive processes in naturalistic environments. Refining personalized EEG paradigms to individual differences in emotional and cognitive traits will enable more sensitive inquiries into neural variability. Neurofeedback interventions using real-time adaptive EEG algorithms hold promise for clinical use, especially in mental health, where emotion regulation is among the critical therapeutic targets. Integrating machine learning techniques into EEG data analysis will further enhance this work, providing more accurate classification of emotional and cognitive states and enabling higher-order applications in brain–computer interfaces. While high-resolution recordings and multimodal integration remain key developing trends in EEG research, population-specific adaptations will largely dictate the future of affective and cognitive neuroscience. To provide a broad overview of these EEG technologies, we designed a heat map comparing the main methodologies on key metrics. Ecological validity is weighted most heavily, since the purpose of the assessment is to test how well each technology applies to real-life circumstances; cost efficiency, setup complexity, signal quality, and research adoption supplement the scoring and lend it more integrity. Figure 6 provides insights into the key aspects considered in the heatmap below:
Figure 6.
Heatmap of EEG technologies: ecological validity and additional metrics.
- Ecological Validity: Portable EEG and multimodal integration techniques demonstrate the highest ecological validity (score: 9), making them ideal for real-world applications. High-density EEG, while offering superior resolution, has lower ecological validity due to its lab-based constraints (score: 6).
- Cost Efficiency: Portable EEG has the highest cost efficiency (score: 9), making it more accessible for large-scale studies and real-world applications. EEG-fMRI has the lowest cost efficiency (score: 2) due to the high operational and maintenance costs.
- Setup Complexity: EEG-fMRI has the highest setup complexity (score: 10), requiring specialized facilities, whereas portable EEG has the lowest complexity (score: 3), allowing for easy deployment in field settings.
- Signal Quality: High-density EEG provides the highest signal quality (score: 10), offering superior temporal and spatial resolution. Portable EEG, while ecologically valid, has lower signal quality (score: 6).
- Research Adoption: Multimodal integration and EEG-fMRI show the highest adoption rates in research (scores: 9), highlighting their importance in neuroscience and cognitive studies.
This heatmap analysis reveals the trade-offs among ecological validity, cost, complexity, and signal quality in EEG research. Portable EEG and multimodal integration are the most flexible approaches, balancing usability and scientific rigor. High-density EEG and EEG-fMRI remain pivotal for high-resolution neural studies, though with reduced ecological applicability. These insights provide a valuable framework for selecting appropriate EEG technologies based on research objectives and practical constraints, and the resulting visualization can serve as a decision tool for researchers optimizing experimental settings for feasibility, accuracy, and ecological relevance in EEG studies of emotional and cognitive processes across diverse populations. In conclusion, the most effective EEG experimental paradigms combine standardized and naturalistic stimuli with rigorous task designs, multimodal integration, and population-specific considerations. The development of real-time analysis, wearables, and ecologically valid designs is shaping the future of EEG emotion and cognition research. Balancing methodological rigor against real-life relevance remains key to furthering our understanding of neural responses to emotional and cognitive processes across diverse populations.
4.3. [RQ3] What Are the Strengths and Limitations of Current EEG-Based Emotion Recognition Methodologies, and How Can Multimodal Integration Improve Their Real-Time Performance and Adaptability in Practical Applications?
Among various promising methodologies, EEG-based emotion recognition has emerged for its ability to capture rapid neural responses associated with emotional processes. However, several challenges restrict its applicability in real-world scenarios and call for practical approaches to multimodal integration that improve performance and adaptability. This section therefore critically analyzes the strengths and weaknesses of EEG-based emotion recognition methodologies with respect to signal processing, generalizability, and real-time application. It also discusses how multimodal approaches, especially combining EEG with other physiological and neuroimaging modalities, may help overcome these challenges and move toward more robust and practical emotion recognition systems.
4.3.1. Strengths of EEG-Based Emotion Recognition
An essential benefit of EEG-based emotion recognition is its high temporal resolution: even emotional responses occurring several milliseconds after stimulation can be captured. Numerous studies have shown that EEG can be sensitive to emotional effects as early as 20 ms after stimulus onset, supplying crucial insights into the onset of emotional processes [139]. Its non-invasiveness and capability to measure rapid neural dynamics make EEG particularly suitable for applications requiring real-time monitoring of emotional states, such as brain–computer interfaces and affective computing systems [132]. In addition, EEG has played a key role in studying event-related potentials, including the well-known N170 and P300 components, which have been widely used to differentiate between positive and negative emotional stimuli. With these well-defined neural markers, EEG-based emotion classification models become more reliable at detecting affective states across different experimental conditions with high accuracy [133].
Another strong point of EEG is that it is non-invasive and relatively inexpensive compared with other neuroimaging techniques, such as functional magnetic resonance imaging (fMRI). Relatively cheap and easy-to-operate devices have made EEG common in both laboratory and real-world settings, including wearable headsets such as the Emotiv EPOC+, which have already demonstrated effective emotion recognition in natural environments [113]. Furthermore, EEG offers fundamental information in the frequency domain, where discrete frequency bands are associated with different emotional states. Alpha-band activity has proven sensitive to emotional stimuli; asymmetry in frontal alpha power commonly signals emotional valence [92]. Broader frequency ranges have also been exploited: delta- and gamma-band activity has been linked to high-arousal emotional states, further expanding the range of identifiable affective states [135]. Despite these advantages, EEG-based emotion recognition has limitations, mainly when conducted outside controlled experimental settings. The next section addresses significant challenges encountered by EEG-based emotion recognition systems in terms of signal quality, spatial resolution, individual variability, and ecological validity.
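A minimal sketch of the frontal alpha asymmetry feature mentioned above, assuming Welch-based alpha power at hypothetical F3/F4 channels and synthetic signals; the sign convention (ln right minus ln left) follows common practice, but implementations vary:

```python
import numpy as np
from scipy.signal import welch

def alpha_power(signal, fs):
    """Mean Welch PSD in the 8-13 Hz alpha band."""
    freqs, psd = welch(signal, fs=fs, nperseg=2 * fs)
    return psd[(freqs >= 8) & (freqs <= 13)].mean()

def frontal_alpha_asymmetry(f3, f4, fs):
    """ln(right alpha) - ln(left alpha). Because alpha power varies inversely
    with cortical activity, positive values are often read as relatively
    greater left-frontal activation (approach / positive valence)."""
    return np.log(alpha_power(f4, fs)) - np.log(alpha_power(f3, fs))

fs = 256
t = np.arange(0, 8, 1 / fs)
rng = np.random.default_rng(3)
# Synthetic channels: stronger 10 Hz alpha on the left (F3) than the right (F4)
f3 = 2.0 * np.sin(2 * np.pi * 10 * t) + 0.1 * rng.standard_normal(t.size)
f4 = 0.5 * np.sin(2 * np.pi * 10 * t) + 0.1 * rng.standard_normal(t.size)
fai = frontal_alpha_asymmetry(f3, f4, fs)
```

Here the left channel carries more alpha power, so the asymmetry index comes out negative, which under the common reading corresponds to relatively greater right-frontal activation.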
4.3.2. Limitations and Challenges in EEG-Based Emotion Recognition
Signal quality is the most persistent problem in developing EEG-based emotion recognition systems. Raw EEG data is easily contaminated by muscle and eye-blink artifacts and degraded by noise. Real-time applications suffer particularly from these artifacts because aggressive noise filtering can also remove crucial emotional information. This increases the complexity of the EEG signal processing pipeline and may hinder the feasibility of real-time emotion recognition outside controlled environments [132]. In addition, EEG has inherently limited spatial resolution: it mainly records electrical activity at the cortical surface and provides limited access to deeper brain structures involved in critical emotional processing, such as the amygdala and hippocampus. This spatial limitation reduces EEG's ability to present a full picture of the neural mechanisms behind emotion and motivates integration with fMRI for high-spatial-resolution imaging [118].
Another limitation is the well-known problem of inter-subject variability, which seriously interferes with the generalization of emotion recognition models. Individual differences in neurophysiology, baseline EEG patterns, and emotional reactivity yield substantial variability in EEG signals, so a single model that performs well across different subjects is hard to find. Research has indicated that models trained on one group of participants fail to generalize well to new subjects; adaptive and personalized approaches are therefore necessary for EEG-based emotion recognition [128]. Furthermore, the success of EEG-based emotion recognition largely depends on the conditions under which data is collected. Many studies have used carefully controlled stimuli in laboratory settings, but applying these findings to real-world environments presents significant challenges due to environmental noise, distractions, and varying levels of user engagement during EEG acquisition. These inconsistencies reduce the reliability of emotion classification algorithms deployed outside the lab [113]. Such limitations have driven the investigation of multimodal approaches, often combining EEG with other physiological and behavioral measures, to enhance robustness and accuracy. The following section discusses how multimodal integration can offset the deficiencies of EEG and improve real-time performance in emotion recognition applications.
4.3.3. Enhancing EEG Emotion Recognition Through Multimodal Integration
Multimodal integration represents an emerging solution to these challenges, combining EEG-based affect detection with additional physiological and neuroimaging sources. One of the most feasible approaches integrates EEG with functional magnetic resonance imaging to compensate for EEG's poor spatial resolution by offering detailed anatomical visualization of affective brain activity. Studies have demonstrated that EEG-fMRI integration enables the simultaneous capture of high-temporal- and high-spatial-resolution data, facilitating a more comprehensive understanding of the neural correlates of emotion [118,142]. This approach has been particularly valuable in clinical applications, where precise localization of the brain regions involved in emotional processing can aid in developing targeted interventions for mood disorders.
Another popular multimodal approach combines EEG with other physiological signals such as electrocardiography (ECG) and galvanic skin response (GSR). These physiological markers provide additional information about autonomic nervous system activity, which plays a vital role in emotional arousal and regulation. Indeed, studies have shown that including heart rate variability measures alongside EEG improves the detection of stress and emotional arousal, resulting in better classification accuracy than EEG alone [95]. Similarly, incorporating facial expression analysis and video-based behavioral tracking has improved emotion recognition performance by adding observable affective cues to neural data [145].
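A simple, hypothetical sketch of feature-level fusion of EEG, HRV, and GSR features: each modality is standardized before concatenation so that differing units and scales do not let one modality dominate the classifier. The feature counts and value ranges below are invented for illustration.

```python
import numpy as np

def fuse_features(eeg_feats, hrv_feats, gsr_feats):
    """Feature-level fusion: z-score each modality, then concatenate.
    Normalizing per modality keeps one signal from dominating the classifier."""
    blocks = []
    for f in (eeg_feats, hrv_feats, gsr_feats):
        f = np.asarray(f, dtype=float)
        blocks.append((f - f.mean(axis=0)) / (f.std(axis=0) + 1e-12))
    return np.hstack(blocks)

rng = np.random.default_rng(5)
fused = fuse_features(rng.normal(0, 50, (64, 40)),    # EEG band powers
                      rng.normal(800, 60, (64, 6)),   # HRV statistics (ms)
                      rng.normal(2, 0.3, (64, 4)))    # GSR features (uS)
```

Decision-level fusion (training one classifier per modality and combining their outputs) is the common alternative; feature-level fusion as sketched here is the simpler baseline.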
Beyond multimodal signal integration, recent advances in machine learning play a vital role in improving the adaptability of EEG-based emotion recognition systems. Deep learning techniques such as CNNs and RNNs yield substantial performance gains, allowing some studies to achieve highly reliable, near real-time emotion recognition. Researchers have shown that CNNs are particularly effective at extracting hierarchical features from raw EEG signals, providing better generalization across subjects and experimental conditions [117,128]. In addition, transfer learning and domain adaptation help address inter-subject variability while requiring only limited re-training of the model for a new user [129].
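The basic building block that CNNs stack to extract hierarchical features from raw EEG can be sketched in plain NumPy; the kernel below is a hand-picked illustrative filter, not a learned one, and the signal is synthetic:

```python
import numpy as np

def conv1d_relu(signal, kernel):
    """One CNN-style building block: valid-mode cross-correlation
    (what CNN 'conv' layers compute) followed by a ReLU nonlinearity."""
    return np.maximum(np.convolve(signal, kernel[::-1], mode="valid"), 0.0)

# Toy raw EEG segment: a 10 Hz oscillation sampled at 128 Hz.
fs = 128
t = np.arange(0, 1, 1 / fs)
eeg = np.sin(2 * np.pi * 10 * t)

# A crude temporal-derivative kernel; real CNNs learn many such filters.
feature_map = conv1d_relu(eeg, np.array([-1.0, 0.0, 1.0]))
```

Deeper networks repeat this convolve-and-rectify step, letting later layers combine the earlier feature maps into progressively more abstract representations of the signal.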
4.3.4. Future Directions and Practical Applications
Integrating EEG-based emotion recognition with real-time neurofeedback systems holds significant clinical and practical potential. Neurofeedback is a training technique in which real-time visual or auditory feedback about brain activity is provided, which can improve emotional regulation and reduce the symptoms of psychiatric disorders such as PTSD and depression. Clinically meaningful reductions in symptom severity have been reported after EEG-based neurofeedback interventions, indicating the great potential of such systems for mental health applications [89,119]. Furthermore, wearable EEG devices have opened new avenues for continuous emotion monitoring in everyday settings, enabling applications in human–computer interaction, affective gaming, and personalized healthcare [113].
Although significant strides have been made in developing EEG-based emotion recognition technologies, many obstacles to broad applicability remain. Further progress will require standardizing experimental protocols, enhancing robustness against environmental variability, and elaborating ethical frameworks governing the responsible use of affective computing systems. Overcoming these challenges will demand interdisciplinary collaboration among neuroscience, artificial intelligence, and human–computer interaction research. Further developments in multimodal integration and adaptive learning algorithms may provide a broader pathway to more practical and scalable emotion recognition systems, bringing us closer to the real-world implementation of real-time, user-adaptive, and ethically responsible applications in EEG-based affective computing.
Figure 7 below compares three EEG-based emotion recognition approaches, contrasting unimodal and multimodal configurations on classification accuracy, real-time capability, and adaptability. EEG-only approaches show the lowest ratings across all criteria, revealing the limitations of a single modality in signal reliability, spatial resolution, and generalizability. Complementing EEG with fMRI, ECG, GSR, and video significantly enhances recognition accuracy and improves real-time adaptation, addressing major challenges in EEG-based emotion detection.
Figure 7.
Strengths, limitations, and multimodal integration in EEG-based emotion recognition.
EEG + fMRI offers the best classification performance (85%) thanks to the high spatial resolution provided by fMRI, though it lags in real-time performance because of its computational complexity and processing delay. EEG + ECG and EEG + GSR demonstrate better real-time efficiency and are suitable for wearable and clinical applications, particularly mental health monitoring and affective computing. EEG + Video offers a balanced approach, capturing both neural and behavioral emotional responses, and is therefore most relevant to human–computer interaction applications and AI-driven emotion recognition systems.
This comparison underlines how effectively multimodal integration compensates for the intrinsic weaknesses of EEG and points toward practical, real-time, and adaptive solutions for emotion recognition. Advances in machine learning, real-time data fusion, and personalized neuroadaptive systems are expected to further refine this methodology, increasing its robustness and applicability in natural environments.
4.4. [RQ4] How Do EEG Patterns Reflect Individual Differences in Emotional and Cognitive Processing, and What Insights Can They Provide into Personality Traits, Behavioral Responses, and Neuropsychological Conditions?
EEG patterns provide valuable insights into individual differences in emotional and cognitive processing. These neural markers reflect relationships with personality traits, behavioral responses, and neuropsychological conditions. To enhance analytical depth, this section examines how EEG markers contribute to understanding cognitive and emotional variability across individuals.
4.4.1. EEG and Emotional Processing in Neurodevelopmental and Clinical Populations
EEG studies have shown significant variability in emotional processing across neurodevelopmental and clinical groups. For example, individuals with Asperger syndrome exhibit weaker theta synchronization, indicating atypical emotional regulation [141]. Other EEG-based assessments have identified differences in emotional processing pre- and post-therapy, including improved affective regulation following music therapy in cancer patients [122]. EEG patterns have proven especially helpful in characterizing emotional impairments: research using single-trial N170 ERPs has shown that EEG features accurately classify positive and negative emotional stimuli [132]. The implication is that EEG is useful not only diagnostically but also in monitoring therapeutic outcomes and behavioral changes across different populations. Moreover, differences in EEG components reflect differential emotional reactivity and provide a neural basis for understanding why some individuals show heightened sensitivity to emotional stimuli while others show blunted responses.
Table 2 below compares EEG features observed across several neurodevelopmental and clinical conditions, including Asperger syndrome, ASD, schizophrenia, MDD, GAD, and PTSD. Key abnormalities include altered theta synchronization, increased signal variability, and power fluctuations across the frequency spectrum. These neural markers relate to cognitive and emotional dysregulation and provide important diagnostic and therapeutic insights. The table underlines EEG as an essential approach for studying the neural signatures associated with these conditions, assisting in early detection and personalized treatment strategies.
Table 2.
EEG Features in Neurodevelopmental and Clinical Conditions.
4.4.2. EEG and Personality Traits in Emotional Responses
EEG coherence and neural activation patterns reflect personality-driven emotional processing. High-EI individuals exhibit higher EEG coherence in brain areas related to social perception, while low-EI individuals show object-perception tendencies [93]. These differences manifest in real-time emotional interactions: the EEG signatures of emotional intelligence align with behavioral adaptability. Studies reveal a negative correlation between left amygdala activation and susceptibility to anger, underlining neural markers of emotional reactivity [144,145]. This implies that EEG can help predict emotional responses from personality, providing a quantified measure of a person's ability to regulate affective states. Moreover, highly empathetic individuals show higher cortical gamma activity in response to emotional stimuli [115], supporting the view that EEG markers can distinguish individuals according to their emotional sensitivity and social responsiveness. These findings have practical implications for tailoring interventions that enhance emotional intelligence and stress resilience.
Figure 8 below shows EEG variations in individuals with different levels of Emotional Intelligence by comparing Frontal Alpha Power and Gamma Activity across low, medium, and high EI groups. The results showed that individuals with higher EI exhibited increased frontal alpha power, reflecting better emotional regulation and cognitive flexibility. Also, gamma activity, reflecting enhanced neural processing and emotional sensitivity, was higher in individuals with greater emotional intelligence.
Figure 8.
EEG variations across different emotional intelligence levels.
These findings correspond with previous evidence that higher emotional intelligence parallels more efficient neural activity in affective brain areas. The progressively higher scores on both measures from the low- to high-EI groups point to the potential of EEG biomarkers for evaluating emotional intelligence and related cognitive–emotional functions. Such neural markers may provide essential insights for tailored interventions targeting emotional control and social cognition.
4.4.3. EEG and Cognitive Processing Efficiency
EEG patterns also reflect cognitive control, decision-making, and attentional regulation. Single-trial EEG data allow individualized predictions of cognitive efficiency and response speed [117]. P300 amplitudes correlate positively with attentional focus and short-term memory performance [125], indicating that these signals can act as biomarkers of cognitive workload and information-processing efficiency. EEG theta-band activity has been associated with higher executive control and better adaptive emotional functioning [100]. Further, levels of frontal-midline theta power serve as essential markers of whether cognitive control mechanisms are adequate for complex problem-solving and high-stakes decisions. EEG-based assessments could therefore find targeted applications in cognitive performance training: individuals with the weakest cognitive control would likely benefit most from neurofeedback interventions centered on attention optimization and executive function.
4.4.4. EEG and Emotional Regulation Strategies
EEG plays an established role in emotional self-regulation through biofeedback and therapeutic practice. Neurofeedback interventions using the EEG markers discussed above have been reported to be effective in enhancing emotional self-regulation, mood stability, and anxiety management [119]. Increased frontal activation is related to better emotion regulation [97], providing a neurophysiological basis for behavioral therapies focused on enhancing cognitive control over emotional reactions. EEG-based measures also predict treatment efficacy in various psychological interventions, further strengthening their clinical value [122]. The potential impact of regulating affective states through EEG-guided self-regulation could be far-reaching, ranging from improved stress resilience to reduced emotional dysregulation in psychiatric disorders. Figure 9 below depicts, in flowchart form, EEG-based emotional regulation interventions, describing their mechanisms and expected outcomes. The central concept of EEG-based emotional regulation is linked to four key intervention strategies:
Figure 9.
Flowchart of EEG-based emotional regulation interventions.
- Neurofeedback Training—This method trains individuals to regulate their EEG activity, enhancing self-control over emotional responses. By reinforcing desired brainwave patterns, neurofeedback helps improve emotional stability and resilience.
- Cognitive Behavioral Therapy (CBT)—EEG can be used alongside CBT to enhance cognitive control of emotions. Real-time EEG monitoring allows therapists to track neural correlations of emotional regulation and adapt treatment strategies accordingly.
- Mindfulness and Meditation—EEG-guided mindfulness practices help individuals develop greater self-awareness and relaxation. Increased alpha and theta activity, commonly observed during meditation, improves emotional regulation and reduces stress.
- Real-Time EEG Monitoring—This approach provides adaptive feedback during therapy sessions, helping clinicians personalize interventions based on a patient’s real-time emotional and cognitive state.
All these interventions lead to improved emotional regulation, with specific outcomes such as better control over emotional responses, enhanced coping strategies, reduced stress and anxiety, and personalized therapeutic approaches. This flowchart highlights the integrative role of EEG in developing targeted interventions for emotional and mental well-being.
4.4.5. EEG and Reward-Based Decision-Making
EEG-based measures provide neural markers of decision-making and reward sensitivity. FRN amplitudes are higher in individuals with high reward sensitivity but low risk-taking propensity [90], which implies that EEG components can outline certain cognitive biases within decision-making frameworks. P300 amplitudes reflect attention allocation and cognitive decision-making processes, as shown in studies [86,87], further illustrating how EEG illuminates the mechanisms of risk evaluation and reward processing. EEG-derived emotional positivity indexes sensitivity to affective primes [130], underlining the complex neural interaction of cognitive and affective influences on decision-making. This knowledge can inform more precise predictive models of consumer behavior, financial decision-making, and psychological treatment of maladaptive risk-taking attitudes.
Figure 10 below depicts the relation between EEG FRN amplitudes and decision-making accuracy, based on insights drawn from the EEG studies included in this systematic review. The FRN is an ERP component that reflects the cognitive processing of feedback and error detection and is hence essential for decision-making. The scatter plot indicates a positive correlation between FRN amplitudes and decision-making accuracy: the trend line shows that decision accuracy is higher when the FRN amplitude is more negative (i.e., larger in magnitude). This supports the view that stronger FRNs indicate greater sensitivity to feedback, which relates to better performance on decision-making tasks. Interindividual differences in FRN may reflect variations in cognitive flexibility, reward sensitivity, and adaptive learning strategies.
Figure 10.
Relationship between EEG FRN amplitudes and decision-making accuracy.
These findings indicate that EEG-based estimates of FRN amplitudes may help predict the efficiency of decision-making, the propensity for risk-taking behavior, and learning adaptability. This relationship has possible practical consequences for cognitive training, neurofeedback therapy, and understanding neurological disorders affecting executive functions and impulse control.
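The positive trend described above can be quantified with a Pearson correlation. The sketch below uses hypothetical FRN and accuracy values, invented purely to illustrate the computation; they are not data from the reviewed studies:

```python
import numpy as np

# Hypothetical illustration only (not study data): FRN amplitudes in µV
# (more negative = stronger FRN) and decision-making accuracy in %.
frn = np.array([-2.0, -3.5, -5.0, -6.5, -8.0, -9.5])
accuracy = np.array([62.0, 66.0, 71.0, 75.0, 80.0, 84.0])

# Pearson r between FRN magnitude and accuracy quantifies the trend line
# that a scatter plot such as Figure 10 would display.
r = float(np.corrcoef(np.abs(frn), accuracy)[0, 1])
```

In a real analysis, one point per participant (mean FRN amplitude, task accuracy) would feed the same computation, and the sign convention for "stronger FRN" should be stated explicitly.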
4.4.6. EEG and Neural Markers for Psychopathology
EEG studies offer diagnostic insights into psychiatric disorders by identifying patterns associated with mental health conditions. Increased EEG signal variability is a biomarker for schizophrenia and autism [123,133], reflecting the altered neural oscillatory activity characteristic of these conditions. Elevated LPP amplitudes indicate abnormal emotional regulation in depressive individuals [120], suggesting that EEG can function as a prognostic tool for mood disorders. Interference with the dmPFC via TMS disrupts emotion discrimination capabilities, as evidenced by EEG markers [120], underlining the involvement of neural circuits in cognitive-affective integration. Applications of EEG to mental health diagnostics also underline this modality's potential for early detection, intervention monitoring, and personalized treatment approaches.
Figure 11 below presents an EEG heatmap comparing power changes across five fundamental brain regions (frontal, parietal, temporal, occipital, and central) in healthy participants and clinical groups. Data were derived from the reviewed papers analyzing variations in EEG activity patterns related to emotional regulation, cognitive processes, and neurological disorders. The heatmap shows generally higher EEG power across all regions in healthy subjects, especially in the frontal and occipital lobes responsible for cognitive control, emotion regulation, and visual processing, whereas clinical populations showed substantially reduced EEG power. The most pronounced differences arose over the frontal and central regions, in good agreement with studies of mood disorders, neurodevelopmental conditions, and psychiatric illnesses. Lower frontal EEG activity in clinical populations has been interpreted to reflect deficits in executive functioning and emotional regulation, as observed in disorders such as Major Depressive Disorder and Generalized Anxiety Disorder.
Figure 11.
Comparative EEG heatmap: healthy vs. clinical populations.
These findings support EEG’s diagnostic and therapeutic value for differentiating between healthy and clinical populations. The EEG power differences could, therefore, provide biomarkers for neurophysiological dysfunctions and support the development of EEG-based interventions, neurofeedback training, and specific therapeutic approaches to improve affected individuals’ emotional and cognitive functions.
In conclusion, EEG points to the neural mechanisms underlying individual differences in emotional and cognitive processing. EEG microstates, mismatch negativity, and phase synchronization differentiate cognitive styles, offering refined insights into inter-individual variability. EEG neurofeedback reinforces cognitive and emotional interventions, supporting its potential to optimize mental and cognitive health. Multimodal research integrating EEG with neuroimaging and behavioral assessment now offers the possibility of a broader understanding of brain–behavior relationships. Future studies should pursue multimodal approaches, such as combining EEG with fMRI, behavioral analysis, and genetic data, to refine our understanding of individual cognitive and emotional variability. Extending the use of EEG from laboratory settings to real-world contexts may enhance the practical value of such findings for deriving more specific and effective interventions across diverse populations.
Figure 12 below gives a flowchart to represent the integration of EEG with neuroimaging and behavioral assessment techniques toward deriving comprehensive biomarkers of cognitive and emotional processing. This hierarchy illustrates how electrophysiological, neuroimaging, behavioral, and clinical evaluations relate to one another in giving meaning to brain function and mental health. The flowchart starts with the general topic of Multimodal Neuroscience Assessment, which splits into two major domains:
Figure 12.
Flowchart of multimodal neuroscience assessment integrating EEG with neuroimaging and behavioral techniques.
- Electrophysiological Techniques, including EEG (Electroencephalography) and MEG (Magnetoencephalography). EEG measures brainwave oscillations and real-time neural activity, providing high temporal resolution. At the same time, MEG detects magnetic fields produced by neural activity, offering superior spatial and temporal insights into cognitive function.
- Neuroimaging Techniques, encompassing fMRI (Functional Magnetic Resonance Imaging) and PET (Positron Emission Tomography). fMRI provides detailed spatial mapping of brain activity by measuring hemodynamic responses, whereas PET analyzes metabolic and neurotransmitter activity, offering insights into neurochemical processes.
Assessing behavioral, cognitive, clinical, and psychological factors contributes to understanding cognitive functions, emotional states, and neuropsychiatric conditions. Behavioral assessment monitors performance during cognitive and emotional tasks, while clinical evaluation diagnoses neuropsychiatric conditions based on neural and psychological markers. Together, these modalities generate biomarkers of cognitive and emotional processing that are important for understanding individual differences, diagnosing mental health disorders, and informing targeted interventions, including neurofeedback and cognitive training. A schema such as this underscores the emphasis on multimodal approaches in neuroscience research today, allowing a more precise, personalized analysis of brain function and behavior.
4.5. [RQ5] How Can EEG-Based Emotion Recognition Systems Be Applied Effectively in Real-World Contexts, Such as Healthcare, Education, and Marketing, While Addressing Technical Challenges and Ethical Implications?
EEG-based emotion recognition systems hold great promise for practical applications in various domains. In healthcare, they offer immense potential for diagnosing and treating affective disorders. Researchers [122] showed that such systems could be used to monitor, in real time, the emotional effects of therapeutic interventions, such as music therapy, on palliative care cancer patients. This real-time monitoring capability could enable healthcare professionals to optimize treatments based on patients' individual emotional responses. Furthermore, the study [145] suggested that self-regulation of amygdala activity by neurofeedback could have implications for early detection and monitoring of neuropsychiatric disorders. Researchers [94] further showed that EEG measures of cognitive control predicted daily coping with stress and emotional reactivity, helping identify individuals at risk for stress-related disorders.
EEG-based emotion recognition systems in education may optimize learning experiences by assessing students’ emotional engagement, as suggested by researchers [132], and adaptive feedback based on learners’ mental states, as indicated by the study [110]. The system could create personalized learning experiences that adapt to the individual student’s emotional and cognitive state, which could improve educational outcomes. Moreover, the system could help assess academic programs, as evidenced by researchers [99] who used EEG measures to determine changes in the emotional regulatory effects of an entrepreneurship program.
Marketing applications could also include using an EEG-based emotion recognition system to evaluate consumer responses to advertisements or products [95,139] and optimizing user experiences in digital interfaces through real-time adaptation [86]. However, using this technology in marketing raises ethical issues regarding privacy and manipulation that must be carefully considered.
Several technical challenges must be overcome to make using EEG-based emotion recognition systems effective in real-world contexts. One is the need for user-friendly and non-intrusive recording systems suitable for real-world environments [87,95]. Another is the necessity of more robust signal processing methods for noise and artifact handling in real-world conditions [83,140]. Models should be developed to consider individual differences and variability in emotional processing. Inter-subject variability can seriously affect emotion recognition accuracy in studies such as those [120,128,132]. Other requirements include real-time processing capability for smooth applications in natural environments [86,110].
Furthermore, using EEG-based emotion recognition systems inherently carries several ethical challenges. Privacy and data protection are of utmost concern with sensitive neural data, requiring serious protection of data and informed consent processes. The possibility of abuse and manipulation is also high, especially in marketing and persuasive scenarios, and clear ethics guidelines and regulations should be laid down to avoid potential abuses. Ensuring inclusivity and fairness in emotion recognition algorithms is another critical ethical consideration, as bias in these systems could lead to unequal treatment or discrimination [86,87].
To overcome these technical and ethical challenges, future research should be directed at enhancing EEG spatial resolution and signal quality [118], developing adaptive algorithms that account for individual differences [87], enhancing real-time processing capabilities [86,87], and exploring multimodal approaches that integrate EEG with other physiological and behavioral measures [95,118,145]. Furthermore, the responsible development and deployment of EEG-based emotion recognition systems requires clear ethical guidelines, regulations, and strong data protection. EEG-based emotion recognition systems have immense potential for practical healthcare, education, and marketing applications: they can assist in diagnosing and treating affective disorders, optimizing therapeutic interventions, personalizing learning experiences, evaluating educational programs, and providing insights into consumer responses. However, realizing this potential will require user-friendly, non-invasive recording systems, efficient signal processing algorithms, machine learning models that allow for individual variation, and real-time processing. Responsible development must also address privacy, the possibility of manipulative misuse, and inclusiveness and equity. By addressing these technical challenges through well-focused research and establishing guidelines and safeguards, EEG-based emotion recognition systems could be effectively applied in real-life contexts, improving outcomes and yielding valuable insights across many domains.
5. Discussion
5.1. Key Notes on Techniques Enhancing EEG-Based Emotion Recognition
Signal preprocessing cleans the EEG data for further analysis, improving the quality of extracted emotional features. Band-pass filtering restricts the EEG data to a particular range, for example 0.5–45 Hz, to remove noise and focus on relevant neural activity. Artifact detection and elimination techniques, such as the Reference Electrode Standardization Technique, address contaminants like eye blinks and muscle artifacts. Single-trial analysis captures trial-to-trial variations in emotional responses, preserving subtle features, such as the N170 event-related potential, that could otherwise be lost in averaged data.
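As a concrete illustration of the band-pass filtering step, the sketch below applies a zero-phase Butterworth filter to a synthetic one-channel signal. The 0.5–45 Hz range comes from the text; the filter order, sampling rate, and test signal are illustrative assumptions:

```python
import numpy as np
from scipy.signal import butter, filtfilt

def bandpass_filter(eeg, fs, low=0.5, high=45.0, order=4):
    """Zero-phase Butterworth band-pass filter over the 0.5-45 Hz range."""
    nyq = fs / 2.0
    b, a = butter(order, [low / nyq, high / nyq], btype="band")
    return filtfilt(b, a, eeg, axis=-1)  # filtfilt avoids phase distortion

# Synthetic signal: a 10 Hz "alpha" component plus 60 Hz line noise.
fs = 250
t = np.arange(0, 4, 1 / fs)
raw = np.sin(2 * np.pi * 10 * t) + 0.8 * np.sin(2 * np.pi * 60 * t)
clean = bandpass_filter(raw, fs)

# Inspect the spectrum: 60 Hz noise is attenuated, 10 Hz alpha survives.
freqs = np.fft.rfftfreq(raw.size, 1 / fs)
spec_raw = np.abs(np.fft.rfft(raw))
spec_clean = np.abs(np.fft.rfft(clean))
i10, i60 = np.argmin(np.abs(freqs - 10)), np.argmin(np.abs(freqs - 60))
```

In practice the same filter would be applied per channel before artifact removal; dedicated toolboxes such as MNE-Python wrap equivalent filtering routines.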
Feature extraction aims to identify meaningful EEG patterns and signatures corresponding to emotional states. Event-related spectral perturbation (ERSP) methods investigate EEG oscillations within specific time windows and frequency bands, such as alpha, beta, and gamma, to reveal spectral changes related to emotional processing. Spatial network patterns, such as Discriminative Spatial Network Patterns (DSNP), further extract features describing the connectivity between brain regions during emotional responses, thus providing insight into neural network dynamics.
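A minimal sketch of spectral feature extraction, computing band power in the conventional theta/alpha/beta/gamma ranges with Welch's method. The band boundaries used here are common conventions, not values taken from the reviewed studies:

```python
import numpy as np
from scipy.signal import welch

# Conventional band definitions (Hz); exact boundaries vary across studies.
BANDS = {"theta": (4, 8), "alpha": (8, 13), "beta": (13, 30), "gamma": (30, 45)}

def band_powers(eeg, fs):
    """Integrate the Welch power spectral density over each canonical band."""
    freqs, psd = welch(eeg, fs=fs, nperseg=2 * fs)
    return {name: float(psd[(freqs >= lo) & (freqs < hi)].sum())
            for name, (lo, hi) in BANDS.items()}

# A signal dominated by a 10 Hz oscillation should show peak alpha power.
fs = 250
t = np.arange(0, 8, 1 / fs)
features = band_powers(np.sin(2 * np.pi * 10 * t), fs)
```

The resulting per-band powers, computed per channel and per trial, form the kind of feature vector that the classifiers discussed below consume.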
Data augmentation techniques are employed to increase model robustness and generalizability. Synthetic data generation methods artificially expand training datasets, improving model performance and mitigating overfitting, especially when limited EEG data is available. Dynamic face morphing for emotional stimuli generation enhances dataset diversity by better reflecting real-world emotional variations.
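Synthetic data generation can be as simple as adding scaled noise jitter to existing epochs. The sketch below is one hedged illustration of this idea; the noise scale and epoch shape are arbitrary choices, not parameters from the reviewed studies:

```python
import numpy as np

def augment_with_noise(epochs, n_copies=2, noise_scale=0.1, seed=0):
    """Expand a (trials, channels, samples) array with noise-jittered copies,
    scaling the noise by each trial/channel's own standard deviation."""
    rng = np.random.default_rng(seed)
    std = epochs.std(axis=-1, keepdims=True)  # per-trial, per-channel scale
    copies = [epochs + rng.normal(scale=noise_scale, size=epochs.shape) * std
              for _ in range(n_copies)]
    return np.concatenate([epochs] + copies, axis=0)

# 10 trials x 2 channels x 500 samples of toy data; labels would be tiled
# alongside the epochs in the same order.
epochs = np.random.default_rng(42).normal(size=(10, 2, 500))
augmented = augment_with_noise(epochs)  # 10 originals + 20 jittered copies
```

More elaborate schemes (GAN-based synthesis, segment recombination, the dynamic face morphing mentioned above for stimuli) follow the same goal of enlarging and diversifying the training set.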
Machine learning algorithms have proven effective for EEG-based emotion recognition. L1-regularized logistic regression (L1LR) provides computationally efficient solutions with strong generalization performance on single-trial EEG features. SVMs, especially those using radial basis function kernels, achieve robust classification performance but usually impose a computational burden. Convolutional neural networks are well suited to uncovering complex patterns in the spatial and temporal structure of EEG signals, considerably enhancing classification performance. While LDA performs well with linear EEG features, it cannot capture nonlinear brain dynamics.
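The L1LR-versus-SVM comparison can be sketched on toy "band-power" features. The data below are synthetic with contrived separability; only the estimator choices mirror the text:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

# Toy features: class 1 has elevated power in the first feature column.
rng = np.random.default_rng(0)
X = rng.normal(size=(400, 4))
y = (rng.random(400) < 0.5).astype(int)
X[y == 1, 0] += 2.5

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0, stratify=y)

# L1-regularized logistic regression: fast, sparse, linear decision boundary.
l1lr = LogisticRegression(penalty="l1", solver="liblinear").fit(X_tr, y_tr)

# RBF-kernel SVM: nonlinear boundary, heavier to train on large datasets.
svm = SVC(kernel="rbf").fit(X_tr, y_tr)

acc_l1lr, acc_svm = l1lr.score(X_te, y_te), svm.score(X_te, y_te)
```

On real EEG features the gap between the two often depends on how nonlinear the class boundary is and how many trials are available per subject.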
Cross-validation methodologies, including k-fold validation, are used to avoid overfitting and ensure model performance generalizes reliably. In addition, several evaluation metrics—ROC-AUC, sensitivity, and specificity—offer a more holistic assessment of model performance than simple accuracy, enhancing the robustness of emotion recognition systems across various datasets and real-world conditions. These techniques, when combined, result in higher accuracy, robustness, and generalizability for EEG-based emotion recognition systems.
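Stratified k-fold cross-validation with ROC-AUC scoring, as described above, can be sketched as follows; the features are again synthetic and all parameter choices are illustrative:

```python
import numpy as np
from sklearn.model_selection import StratifiedKFold, cross_val_score
from sklearn.svm import SVC

# Synthetic two-class features: the first two columns carry the signal.
rng = np.random.default_rng(1)
X = rng.normal(size=(300, 6))
y = (rng.random(300) < 0.5).astype(int)
X[y == 1, :2] += 1.5

# Stratified folds preserve the class ratio in every train/test split;
# ROC-AUC gives a threshold-free view of classifier performance.
cv = StratifiedKFold(n_splits=5, shuffle=True, random_state=0)
auc_scores = cross_val_score(SVC(kernel="rbf"), X, y, cv=cv, scoring="roc_auc")
mean_auc = float(auc_scores.mean())
```

For subject-independent evaluation, a leave-one-subject-out split (grouping folds by participant) is the stricter analogue of the k-fold scheme shown here.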
Table 3 summarizes the most important approaches for an EEG-based emotion recognition system: signal preprocessing, feature extraction, data augmentation, machine learning methodology, and performance generalization. The preprocessing methods focus mainly on band-pass filtering and artifact elimination, securing high-quality EEG signals whose quality underpins the robustness of all subsequent analyses. Feature extraction methods using time-frequency and spatial network analysis then extract the significant neural signatures related to emotions.
Table 3.
Techniques enhancing EEG-based emotion recognition.
Adding to these approaches, other data augmentation techniques, like synthetic data generation and dynamic face morphing, have also been applied during training to enhance model generalizability and robustness. This is further enhanced with the integration of machine learning algorithms, such as L1-regularized logistic regression, support vector machines, and convolutional neural networks, to improve the performance of EEG-based emotion recognition systems. In addition, techniques of generalization of performance, such as cross-validation and multiple evaluation metrics, ensure reliability across diverse datasets and conditions. Together, these methods provide a comprehensive framework for mitigating technical challenges and improving the accuracy, robustness, and scalability of EEG-based emotion recognition systems for research studies and real-world applications.
Diverse emotional stimuli significantly contribute to evoking robust EEG responses, supporting both experimental precision and ecological validity. Short and long videos, facial expression tasks, and standardized image databases, such as the IAPS, provide well-controlled emotional elicitation. In contrast, naturalistic approaches, such as dynamic emotional faces and autobiographical memory retrieval, offer higher ecological validity, and emotional film clips enhance real-world relevance by simulating complex emotional experiences. Experimental protocols, such as Go/No-Go and oddball tasks, enable the investigation of response inhibition and attention-related EEG components, including emotion-specific ERPs such as the N170. Closed-loop neurofeedback designs add a dynamic element: brain responses are shaped in real time through positive or negative feedback loops. Longitudinal designs and multimodal approaches, including EEG with other physiological signals like ECG and GSR or neuroimaging techniques such as fMRI, provide a more thorough investigation of emotional and cognitive processes while allowing the observation of changes in these processes over time.
Considering population issues is essential in increasing the generalizability of the findings based on EEG. The studies that included various groups, from clinical populations to healthy participants, underlined the differences in emotional processing across different conditions. Further, recording EEG in real-world settings, such as urban green spaces or robotics laboratories, enhances ecological validity and practical applicability while balancing experimental control and naturalistic relevance. EEG can reach robust and generalizable insights into emotional and cognitive processes in diverse populations by incorporating diverse stimuli, innovative protocols, multimodal integration, and ecologically valid approaches.
Table 4 summarizes effective experimental setups, stimuli, and protocols for eliciting robust EEG responses in emotion studies. Emotional stimuli such as video material, facial expressions, and naturalistic paradigms support ecological validity, while protocols such as Go/No-Go tasks, neurofeedback, and longitudinal designs allow comprehensive investigation of emotional and cognitive processes. Multimodal approaches, combining EEG with other measures such as fMRI and GSR, offer richer understandings of emotional responses. Considering diverse populations and real-world settings increases confidence that results will generalize across contexts.
Table 4.
Effective experimental setups and stimuli for EEG-based emotion studies.
Table 5 presents an overall summary of the strengths and limitations of EEG-based emotion recognition methodologies. The high temporal resolution of EEG represents one of its most significant advantages, enabling it to capture fast emotional responses such as the N170 event-related potential. Its cost-effectiveness and portability make it practical and accessible for real-world applications, particularly compared with other neuroimaging techniques. Furthermore, EEG allows for investigations of specific frequency bands, such as alpha, beta, and gamma, that index the neural oscillations accompanying emotional processing. Applying machine learning methods to single-trial analysis and real-time neurofeedback extends EEG into a practical, dynamic approach to emotion recognition. The assessment of functional connectivity and the development of wearable EEG devices further contribute to the ecological validity of this approach, enabling studies even in naturalistic environments.
Table 5.
Strengths and limitations of EEG-based emotion recognition.
Despite these strengths, EEG-based methods have several limitations. The technique is highly susceptible to artifacts, such as muscle activity and eye blinks, and requires extensive preprocessing before the data are suitable for analysis. Poor spatial resolution limits EEG’s ability to localize deep-brain activity precisely, one of its most substantial disadvantages compared with fMRI. Inter-subject variability remains challenging because neural responses vary between individuals, complicating the generalization of emotion recognition models. Moreover, environmental noise in real-world settings decreases system accuracy. Addressing these limitations through multimodal integration, including fMRI, physiological measures such as heart rate and GSR, and behavioral data, enhances spatial resolution, robustness, and real-time adaptability. Combining EEG’s temporal precision with other data sources will yield more comprehensive and practical emotion recognition systems applicable across various domains.
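The frequency-band analyses discussed above (alpha, beta, gamma) can be illustrated with a short sketch based on Welch’s power spectral density estimate. The sampling rate, band edges, and synthetic signal here are assumptions chosen for demonstration, not parameters drawn from the reviewed studies:

```python
import numpy as np
from scipy.signal import welch

fs = 256                      # assumed sampling rate (Hz)
t = np.arange(0, 4, 1 / fs)   # 4 s of synthetic single-channel "EEG"
rng = np.random.default_rng(1)
x = (np.sin(2 * np.pi * 10 * t)           # alpha-range (10 Hz) component
     + 0.5 * np.sin(2 * np.pi * 22 * t)   # weaker beta-range (22 Hz) component
     + 0.3 * rng.normal(size=t.size))     # broadband noise

# Welch's method: averaged periodograms over 2 s segments.
freqs, psd = welch(x, fs=fs, nperseg=2 * fs)

def band_power(freqs, psd, lo, hi):
    """Approximate power in [lo, hi) Hz by summing PSD bins times bin width."""
    mask = (freqs >= lo) & (freqs < hi)
    return psd[mask].sum() * (freqs[1] - freqs[0])

bands = {"alpha": (8, 13), "beta": (13, 30), "gamma": (30, 45)}
powers = {name: band_power(freqs, psd, lo, hi) for name, (lo, hi) in bands.items()}
print(powers)
```

Such per-band powers, computed per channel and per trial, are a typical input to the classifiers discussed earlier; asymmetry indices (e.g., frontal alpha asymmetry) follow by comparing homologous left/right channels.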
Furthermore, Table 6 outlines how EEG patterns reflect individual emotional and cognitive processing differences. On one hand, EEG allows one to investigate variability in emotional responses, such as empathy, emotional regulation, and reactivity. On the other hand, cognitive processing captures attentional flexibility, decision-making strategies, and neural plasticity after interventions. Moreover, EEG patterns show links with personality traits, such as creative self-efficacy and emotional contagion, and atypical neural activity in neuropsychological conditions like autism and depression. Additionally, EEG can predict behavioral outcomes, such as reward-based decision-making and activity-related emotional states. Integrating EEG with other physiological or behavioral measures enhances understanding of individual differences, thus providing a comprehensive framework for analyzing emotional and cognitive processing.
Table 6.
EEG patterns and individual differences in emotional and cognitive processing.
Finally, Table 7 illustrates the key applications of EEG-based emotion recognition systems in the healthcare, education, and marketing sectors. In healthcare, EEG systems have shown promising results in monitoring mental health, optimizing therapy, and providing neurofeedback for emotion regulation. Educational applications focus on adaptive learning systems that respond to students’ emotional engagement and thus improve learning outcomes, while EEG measures also help assess program effectiveness. In marketing, EEG provides insights into consumer behavior and neuromarketing by tracking emotional responses to advertisements and media stimuli. The table further details critical technical challenges: signal noise, inter-subject variability, and the need for real-time processing capabilities. In addition, ethical issues such as privacy, informed consent, data security, and fairness must be addressed so that EEG-based systems can be deployed responsibly in real-world contexts.
Table 7.
Applications of EEG-based emotion recognition in real-world contexts.
The circular graph (Figure 13) emphasizes the relationships between neuroimaging techniques (EEG, MEG, fMRI, and PET), machine learning methods (CNN, RNN, and GAN), applications (mental health, human–computer interaction, marketing, and adaptive systems), and ethical challenges (privacy, bias, and inclusivity). The links show the interdisciplinary nature of EEG-based emotion recognition research. Interestingly, MEG shows links to applications such as mental health and human–computer interaction, as well as to ethical concerns like bias. EEG remains central, with extensive links to various applications and challenges, underlining its dominant role in the field.
Figure 13.
Circular graph showing relationships between neuroimaging techniques, machine learning methods, applications, and ethical challenges.
The heatmap in Figure 14 shows the distribution of neuroimaging modalities across key application domains, giving an overview of their usage trends. For instance, MEG usage is concentrated in Marketing, while fMRI and PET are strongly associated with Mental Health and Adaptive Systems, respectively, reflecting their relevance to healthcare and technological problems. EEG is notably well represented across a multitude of domains, particularly Human–Computer Interaction, where its cost-effectiveness and very high temporal resolution make it especially attractive.
Figure 14.
Neuroimaging modalities across application domains.
These findings underline the versatility of neuroimaging techniques and highlight domain-specific preferences, which point to further opportunities for integration. The radar chart (Figure 15) compares the strengths (in green) and limitations (in red) of EEG-based emotion recognition. Among the most important strengths are high temporal resolution, cost-effectiveness, and real-time neurofeedback. The main drawbacks are poor spatial resolution, susceptibility to artifacts, and reduced accuracy outside controlled settings, which limits applicability.
Figure 15.
Comparative radar chart illustrating the strengths and limitations of EEG for emotion recognition.
5.2. Future Directions and Emerging Trends
EEG emotion recognition is a promising technique with potential applications in various contexts. Nevertheless, the research field is young, with most papers published over the last decade. Here, we have reviewed previous studies, attempted to identify intervention points, presented the state of the art in EEG-aided emotion recognition, and proposed emerging trends. Around 90% of the research reviewed is based solely on the valence vs. arousal model, making this an area that requires urgent action. Journal papers far exceed clinical trials, with the latter focused almost exclusively on depression and a few cases of Parkinson’s disease or autism [146,147,148]. There is also no intervention aspect in attention- or stress-monitoring papers, although cognitive states are identified as necessary [149,150,151]. The preceding discussion suggests future directions, which often contrast with the state of the art or have come to light only recently. These include less common emotion models, larger datasets of patient-recorded EEG, multimodal datasets derived from sensor fusion, time-domain analysis, unsupervised learning, positive affect engagement, and behavior-adaptive norming. Immediate applications such as smartphones, smartwatches, cars, and office/emotional assistants reduce the barrier to real-world deployment [152,153,154]. Emotional and cognitive feedback contribute to the understanding of affective systems and enhance both machine and human performance. More recently, a few researchers have studied deep learning techniques that move beyond traditional approaches [155,156,157].
5.3. Advancements in EEG Technology
Various technical advancements in EEG technology have significantly impacted the field. One of the most salient developments is the production of headset and wearable electrode systems. Several manufacturers develop comfortable, high-quality, low-cost EEG sensor devices. EEG headsets are worn during training, cognitive task performance, and signal recording. They are usually designed with minimal direct skin contact, which is particularly important for user-friendly social research in cognitive science and for real-world applications [158,159,160]. EEG technology is advancing rapidly, bringing consumer-grade, mobile, wearable, and even invisible brain–computer interfaces within reach. Indeed, initial developments already point toward using everyday objects or environments as brain–computer interfaces, as demonstrated by developers of stimulation displays, headsets, actuators, and observers [161,162]. In this scenario, consumers realize they do not need sophisticated or expensive equipment to monitor emotional activity and potentially improve health, wellness, or social development. EEG technology also opens a new dimension in the study of interactive development, since individuals can share experiences while exchanging brain signals [163,164].
5.4. Multimodal Emotion Recognition
Research combining different modalities is also emerging to further improve emotion recognition performance. Multimodal fusion has been widely used in emotion recognition research as a window into affective states. Classic combinations include audio-visual, audio-haptic, and audio-physiological emotion recognition. In addition, dedicated databases need to be built to support multimodal fusion. With the growth of multimodal emotion recognition research, fusion strategies have matured considerably, and there is rising interest in combining different modalities for more accurate recognition of emotional states [165,166]. Modal fusion has become an important direction in multimodal human–computer interaction, and progress in multimodal emotion recognition is also an essential step in the transition toward autonomous products. Most commercial applications require multimodal fusion to obtain more complementary features and to further improve recognition performance and generalization ability. Promising research has shown that physiological signals can strongly indicate different mood states; in multimodal fusion, they contribute significantly to classifying negative and high-internal-arousal emotional states. However, there is still minimal research on audio-haptic emotion recognition (AHE), most of which targets valence and arousal. The main challenge for physiological-signal research is that the proposed multimodal fusion model should handle multimodal data in an end-to-end manner, without additional feature extraction and processing steps. A large dataset obtained through human–computer interaction is also needed to further explore the potential of physiological signals in multimodal fusion [166,167,168]. Due to low user acceptance and increased system complexity, few such studies have been conducted.
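A common feature-level (early) fusion strategy can be sketched as follows. The modalities, feature counts, and random arrays are hypothetical placeholders; a real system would substitute features extracted from actual EEG and GSR recordings:

```python
import numpy as np

rng = np.random.default_rng(2)
n_trials = 100

# Hypothetical per-trial feature vectors from two modalities with very
# different natural scales (e.g., EEG band powers vs. GSR summary stats).
eeg_feats = rng.normal(loc=5.0, scale=2.0, size=(n_trials, 8))
gsr_feats = rng.normal(loc=0.1, scale=0.02, size=(n_trials, 2))

def zscore(a):
    """Standardize each feature column so modalities share a common scale."""
    return (a - a.mean(axis=0)) / a.std(axis=0)

# Early fusion: normalize per modality, then concatenate into one vector
# that a single downstream classifier can consume.
fused = np.hstack([zscore(eeg_feats), zscore(gsr_feats)])
print(fused.shape)
```

Per-modality normalization matters here: without it, the larger-scale modality would dominate distance- or regularization-based classifiers. Decision-level (late) fusion, by contrast, would train one classifier per modality and combine their outputs.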
Most research focuses on recognizing negative and high-internal-arousal emotional states. Ensemble learning and joint model learning are promising methods, although behavioral labels may be difficult to obtain, whereas audio-modality problems can be alleviated. Crowd-sourced datasets are a particularly suitable resource and are likely to become widely used. In the future, multimodal emotional content and emotions will likely become important indicators of social interaction and human–computer interaction.
6. Conclusions
This systematic review underlines the transformative potential of EEG-based emotion recognition across the healthcare, human–machine interaction, education, and marketing fields, covering significant advances in signal processing techniques, deep learning models, and multimodal integration that raise accuracy and practical feasibility. It identifies CNNs and RNNs as effective for feature extraction and classification, with some models achieving classification accuracies of more than 90% when integrating EEG with other physiological signals. Despite these promising developments, several challenges remain critical: signal noise, inter-subject variability, and the lack of standardized protocols ultimately limit generalizability and reproducibility. High computational costs and real-time processing demands reduce the feasibility of everyday applications. The review thus emphasizes the need for optimized machine learning architectures, more efficient preprocessing techniques, and robust validation strategies to improve real-world applicability. Ethical issues, especially those related to data privacy, bias in training datasets, and the potential misuse of emotion recognition technologies, also need consideration. The trade-off between experimental control and real-world adaptability remains an open issue for ensuring responsible development. Future research should improve real-time processing capabilities, integrate EEG with multiple physiological measures, and refine deep learning approaches toward scalable and adaptive emotion recognition systems. Applications in mental health diagnostics, personalized education, and consumer behavior analysis show great transformative potential for EEG-based emotion recognition if these technical and ethical challenges can be overcome.
Only then will EEG-based emotion recognition become a robust, easy-to-access, and ethically sound method for perceiving and acting on human feelings within various environments.
Supplementary Materials
The following supporting information can be downloaded at: https://www.mdpi.com/article/10.3390/brainsci15030220/s1, PRISMA 2020 Checklist. Ref. [169] is cited in Supplementary Materials.
Author Contributions
Conceptualization, E.G. and C.H.; methodology, E.G.; software, C.H.; validation, E.G., C.H. and H.A.; formal analysis, E.G., C.H. and H.A.; investigation, E.G., A.A., C.H. and H.A.; resources, E.G., A.A., C.H. and H.A.; data curation, E.G., C.H. and H.A.; writing—original draft preparation, E.G., A.A., C.H. and H.A.; writing—review and editing, E.G., A.A., C.H. and H.A.; visualization, E.G., A.A., C.H. and H.A.; supervision, E.G., C.H. and H.A.; project administration, E.G., A.A., C.H. and H.A.; funding acquisition, E.G., A.A., C.H. and H.A. All authors have read and agreed to the published version of the manuscript.
Funding
This research was funded by the Research Council of the University of Patras, Greece.
Conflicts of Interest
The authors declare no conflicts of interest.
Abbreviations
| General Neuroscience and Cognitive Science | |
| EEG | Electroencephalography |
| MEG | Magnetoencephalography |
| ERP | Event-Related Potential |
| PFC | Prefrontal Cortex |
| DMN | Default Mode Network |
| IFG | Inferior Frontal Gyrus |
| TPJ | Temporo-Parietal Junction |
| ACC | Anterior Cingulate Cortex |
| VLPFC | Ventrolateral Prefrontal Cortex |
| DLPFC | Dorsolateral Prefrontal Cortex |
| HPC | Hippocampus |
| AMY | Amygdala |
| INS | Insula |
| EEG 10–20 Electrode Placement System | |
| Fp1, Fp2 | Frontal Pole (left/right) |
| F3, F4 | Frontal (left/right) |
| F7, F8 | Frontal Lateral (left/right) |
| Fz | Frontal Midline |
| C3, C4 | Central (left/right, sensorimotor regions) |
| Cz | Central Midline (vertex, motor processing) |
| P3, P4 | Parietal (left/right, spatial awareness) |
| Pz | Parietal Midline (sensory integration) |
| O1, O2 | Occipital (left/right, visual processing) |
| T3, T4, T5, T6 | Temporal (auditory and memory processing) |
| EEG-Based Emotion Recognition | |
| LPP | Late Positive Potential |
| FAA | Frontal Alpha Asymmetry |
| HRSD | Hamilton Rating Scale for Depression |
| BDI-II | Beck Depression Inventory-II |
| PANAS | Positive and Negative Affect Schedule |
| PCL-5 | PTSD Checklist for DSM-5 |
| PTGI | Post-Traumatic Growth Inventory |
| FRN | Feedback-Related Negativity |
| N170 | A Component of the ERP Related to Face Perception |
| P300 | A Positive Deflection in EEG Associated with Attention and Decision-Making |
| Machine Learning and Signal Processing | |
| CNN | Convolutional Neural Network |
| RNN | Recurrent Neural Network |
| ANN | Artificial Neural Network |
| MLP | Multilayer Perceptron |
| WT | Wavelet Transform |
| STFT | Short-Time Fourier Transform |
| PLV | Phase-Locking Value |
| DCM | Dynamic Causal Modeling |
| MVPA | Multivariate Pattern Analysis |
| LDA | Linear Discriminant Analysis |
| SVM | Support Vector Machine |
| Brain–Computer Interfaces (BCI) and Neurotechnology | |
| BCI | Brain–Computer Interface |
| HMI | Human–Machine Interaction |
| GSR | Galvanic Skin Response |
| RT-fMRI | Real-Time Functional MRI |
| NIRS | Near-Infrared Spectroscopy |
| EMG | Electromyography |
| Neurostimulation and Brain Modulation Techniques | |
| tDCS | Transcranial Direct Current Stimulation |
| cTBS | Continuous Theta Burst Stimulation |
| TMS | Transcranial Magnetic Stimulation |
| HD-tDCS | High-Definition Transcranial Direct Current Stimulation |
| rt-fMRI NF | Real-Time fMRI Neurofeedback |
| Emotion and Psychological Assessment Tools | |
| PANAS | Positive and Negative Affect Schedule |
| HRSD | Hamilton Rating Scale for Depression |
| BDI-II | Beck Depression Inventory-II |
| PCL-5 | PTSD Checklist for DSM-5 |
| PTGI | Post-Traumatic Growth Inventory |
| MASQ-AD | Mood and Anxiety Symptom Questionnaire-Anhedonic Depression |
| PSWQ | Penn State Worry Questionnaire |
| GSE | General Self-Efficacy Scale |
| Experimental Paradigms and Cognitive Tasks | |
| CRT | Cognitive Reappraisal Task |
| RME | Reading the Mind in the Eyes Test |
| RCPT | Realistic Complex Problem Task |
| GNG | Go/No-Go Task |
| SST | Stop Signal Task |
| Statistical and Data Processing Terms | |
| ANOVA | Analysis of Variance |
| ANCOVA | Analysis of Covariance |
| ERP | Event-Related Potential |
| sLORETA | Standardized Low-Resolution Brain Electromagnetic Tomography |
References
- Edelman, B.J.; Zhang, S.; Schalk, G.; Brunner, P.; Müller-Putz, G.; Guan, C.; He, B. Non-invasive Brain-Computer Interfaces: State of the Art and Trends. IEEE Rev. Biomed. Eng. 2024, 18, 26–49. [Google Scholar] [CrossRef]
- Samal, P.; Hashmi, M.F. Role of machine learning and deep learning techniques in EEG-based BCI emotion recognition system: A review. Artif. Intell. Rev. 2024, 57, 50. [Google Scholar] [CrossRef]
- Robu-Movilă, A. Neurourbanism: From effective to affective computing in urban planning. Rev. Şcolii Dr. Urban. 2025, 10, 85–94. [Google Scholar]
- Mai, N.D.; Long, N.M.H.; Chung, W.Y. 1D-CNN-based BCI system for detecting Emotional states using a Wireless and Wearable 8-channel Custom-designed EEG Headset. In Proceedings of the 2021 IEEE International Conference on Flexible and Printable Sensors and Systems (FLEPS), Manchester, UK, 20–23 June 2021; pp. 1–4. [Google Scholar] [CrossRef]
- Cai, H.; Pan, J.; Xiao, Q.; Jin, J.; Li, Y.; Xie, Q. Decoding Musical Neural Activity in Patients with Disorders of Consciousness Through Self-Supervised Contrastive Domain Generalization. IEEE Trans. Affect. Comput. 2024, 1–18. [Google Scholar] [CrossRef]
- Joshi, R.; Jadeja, M. The Synergy of Clinical Psychology and Affective Computing: Advancements in Emotion Recognition and Therapy. In Affective Computing for Social Good: Enhancing Well-Being, Empathy, and Equity; Springer Nature: Cham, Switzerland, 2024; pp. 21–45. [Google Scholar] [CrossRef]
- Essa, A.; Kotte, H. Brain signals analysis based deep learning methods: Recent advances in the study of non-invasive brain signals. arXiv 2021, arXiv:2201.04229. [Google Scholar]
- Gkintoni, E.; Dimakos, I.; Nikolaou, G. Cognitive Insights from Emotional Intelligence: A Systematic Review of EI Models in Educational Achievement. Emerg. Sci. J. 2025, 8, 262–297. [Google Scholar] [CrossRef]
- Karras, C.; Theodorakopoulos, L.; Karras, A.; Krimpas, G.A. Efficient Algorithms for Range Mode Queries in the Big Data Era. Information 2024, 15, 450. [Google Scholar] [CrossRef]
- Antal, A.; Luber, B.; Brem, A.K.; Bikson, M.; Brunoni, A.R.; Kadosh, R.C.; Dubljević, V.; Fecteau, S.; Ferreri, F.; Flöel, A.; et al. Non-invasive brain stimulation and neuroenhancement. Clin. Neurophysiol. Pract. 2022, 7, 146–165. [Google Scholar] [CrossRef]
- Halkiopoulos, C.; Gkintoni, E. Leveraging AI in e-learning: Personalized learning and adaptive assessment through cognitive neuropsychology—A systematic analysis. Electronics 2024, 13, 3762. [Google Scholar] [CrossRef]
- Afzal, S.; Khan, H.A.; Piran, M.J.; Lee, J.W. A comprehensive survey on affective computing; challenges, trends, applications, and future directions. IEEE Access 2024, 12, 96150–96168. [Google Scholar] [CrossRef]
- Tian, L.; Oviatt, S.; Muszynski, M.; Chamberlain, B.; Healey, J.; Sano, A. Applied Affective Computing; Association for Computing Machinery: New York, NY, USA, 2022. [Google Scholar] [CrossRef]
- García-Hernández, R.A.; Luna-García, H.; Celaya-Padilla, J.M.; García-Hernández, A.; Reveles-Gómez, L.C.; Flores-Chaires, L.A.; Delgado-Contreras, J.R.; Rondon, D.; Villalba-Condori, K.O. A Systematic Literature Review of Modalities, Trends, and Limitations in Emotion Recognition, Affective Computing, and Sentiment Analysis. Appl. Sci. 2024, 14, 7165. [Google Scholar] [CrossRef]
- Devillers, L.; Cowie, R. Ethical considerations on affective computing: An overview. Proc. IEEE 2023, 111, 1445–1458. [Google Scholar] [CrossRef]
- Gkintoni, E.; Dimakos, I.; Halkiopoulos, C.; Antonopoulou, H. Contribution of Neuroscience to Educational Praxis: A Systematic Review. Emerg. Sci. J. 2023, 7, 146–158. [Google Scholar] [CrossRef]
- Hasan, M.A.; Noor, N.F.M.; Rahman, S.S.B.A.; Rahman, M.M. The transition from intelligent to affective tutoring systems: A review and open issues. IEEE Access 2020, 8, 204612–204638. [Google Scholar] [CrossRef]
- Filippini, C.; Perpetuini, D.; Cardone, D.; Chiarelli, A.M.; Merla, A. Thermal infrared imaging-based affective computing and its application to facilitate human-robot interaction: A review. Appl. Sci. 2020, 10, 2924. [Google Scholar] [CrossRef]
- Kaklauskas, A.; Abraham, A.; Ubarte, I.; Kliukas, R.; Luksaite, V.; Binkyte-Veliene, A.; Vetloviene, I.; Kaklauskiene, L. A review of AI cloud and edge sensors, methods, and applications for the recognition of emotional, affective and physiological states. Sensors 2022, 22, 7824. [Google Scholar] [CrossRef]
- Huang, Y.; Fei, T.; Kwan, M.P.; Kang, Y.; Li, J.; Li, Y.; Li, X.; Bian, M. GIS-based emotional computing: A review of quantitative approaches to measure the emotion layer of human-environment relationships. ISPRS Int. J. Geo-Inf. 2020, 9, 551. [Google Scholar] [CrossRef]
- Li, X.; Zhang, Y.; Tiwari, P.; Song, D.; Hu, B.; Yang, M.; Zhao, Z.; Kumar, N.; Marttinen, P. EEG-based emotion recognition: A tutorial and review. ACM Comput. Surv. 2022, 55, 1–57. [Google Scholar] [CrossRef]
- Suhaimi, N.S.; Mountstephens, J.; Teo, J. EEG-based emotion recognition: A state-of-the-art review of current trends and opportunities. Comput. Intell. Neurosci. 2020, 2020, 8875426. [Google Scholar] [CrossRef] [PubMed]
- Houssein, E.H.; Hammad, A.; Ali, A.A. Human emotion recognition from EEG-based brain-computer interface using machine learning: A comprehensive review. Neural Comput. Appl. 2022, 34, 12527–12557. [Google Scholar] [CrossRef]
- Wang, F.; Wu, S.; Zhang, W.; Xu, Z.; Zhang, Y.; Wu, C.; Coleman, S. Emotion recognition with convolutional neural network and EEG-based EFDMs. Neuropsychologia 2020, 146, 107506. [Google Scholar] [CrossRef] [PubMed]
- Torres, E.P.; Torres, E.A.; Hernández-Álvarez, M.; Yoo, S.G. EEG-based BCI emotion recognition: A survey. Sensors 2020, 20, 5083. [Google Scholar] [CrossRef]
- Islam, M.R.; Moni, M.A.; Islam, M.M.; Rashed-Al-Mahfuz, M.; Islam, M.S.; Hasan, M.K.; Hossain, M.S.; Ahmad, M.; Uddin, S.; Azad, A.; et al. Emotion recognition from EEG signal focusing on deep learning and shallow learning techniques. IEEE Access 2021, 9, 94601–94624. [Google Scholar] [CrossRef]
- Wang, Y.; Zhang, B.; Di, L. Research Progress of EEG-Based Emotion Recognition: A Survey. ACM Comput. Surv. 2024, 56, 1–49. [Google Scholar] [CrossRef]
- Cui, X.; Wu, Y.; Wu, J.; You, Z.; Xiahou, J.; Ouyang, M. A review: Music-emotion recognition and analysis based on EEG signals. Front. Neuroinform. 2022, 16, 997282. [Google Scholar] [CrossRef]
- Sortwell, A.; Evgenia, G.; Zagarella, S.; Granacher, U.; Forte, P.; Ferraz, R.; Ramirez-Campillo, R.; Carter-Thuillier, B.; Konukman, F.; Nouri, A.; et al. Making neuroscience a priority in Initial Teacher Education curricula: A call for bridging the gap between research and future practices in the classroom. Neurosci. Res. Notes 2023, 6, 266.1–266.7. [Google Scholar] [CrossRef]
- Pinto, A.M.; Geenen, R.; Wager, T.D.; Lumley, M.A.; Häuser, W.; Kosek, E.; Ablin, J.N.; Amris, K.; Branco, J.; Buskila, D.; et al. Emotion regulation and the salience network: A hypothetical integrative model of fibromyalgia. Nat. Rev. Rheumatol. 2023, 19, 44–60. [Google Scholar] [CrossRef]
- Hu, N.; Long, Q.; Li, Q.; Hu, X.; Li, Y.; Zhang, S.; Chen, A.; Huo, R.; Liu, J.; Wang, X. The modulation of salience and central executive networks by acute stress in healthy males: An EEG microstates study. Int. J. Psychophysiol. 2021, 169, 63–70. [Google Scholar] [CrossRef] [PubMed]
- Nicholson, A.A.; Harricharan, S.; Densmore, M.; Neufeld, R.W.; Ros, T.; McKinnon, M.C.; Frewen, P.A.; Théberge, J.; Jetly, R.; Pedlar, D.; et al. Classifying heterogeneous presentations of PTSD via the default mode, central executive, and salience networks with machine learning. NeuroImage Clin. 2020, 27, 102262. [Google Scholar] [CrossRef]
- Zanella, F.; Monachesi, B.; Grecucci, A. Resting-state BOLD temporal variability in sensorimotor and salience networks underlies trait emotional intelligence and explains differences in emotion regulation strategies. Sci. Rep. 2022, 12, 15163. [Google Scholar] [CrossRef] [PubMed]
- Almdahl, I.S.; Martinussen, L.J.; Agartz, I.; Hugdahl, K.; Korsnes, M.S. Inhibition of emotions in healthy aging: Age-related differences in brain network connectivity. Brain Behav. 2021, 11, e02052. [Google Scholar] [CrossRef]
- Palomero-Gallagher, N.; Amunts, K. A short review on emotion processing: A lateralized network of neuronal networks. Brain Struct. Funct. 2022, 227, 673–684. [Google Scholar] [CrossRef] [PubMed]
- Dadario, N.B.; Sughrue, M.E. Should neurosurgeons try to preserve non-traditional brain networks? A systematic review of the neuroscientific evidence. J. Pers. Med. 2022, 12, 587. [Google Scholar] [CrossRef]
- Theodorakopoulos, L.; Karras, A.; Theodoropoulou, A.; Kampiotis, G. Benchmarking Big Data Systems: Performance and Decision-Making Implications in Emerging Technologies. Technologies 2024, 12, 217. [Google Scholar] [CrossRef]
- Fan, J.; Fang, L.; Wu, J.; Guo, Y.; Dai, Q. From brain science to artificial intelligence. Engineering 2020, 6, 4–13. [Google Scholar] [CrossRef]
- Parhi, K.K.; Unnikrishnan, N.K. Brain-inspired computing: Models and architectures. IEEE Open J. Circuits Syst. 2020, 1, 185–204. [Google Scholar] [CrossRef]
- Zeng, Y.; Zhao, D.; Zhao, F.; Shen, G.; Dong, Y.; Lu, E.; Zhang, Q.; Sun, Y.; Liang, Q.; Zhao, Y.; et al. Braincog: A spiking neural network based, brain-inspired cognitive intelligence engine for brain-inspired AI and brain simulation. Patterns 2023, 4, 100789. [Google Scholar] [CrossRef] [PubMed]
- Kanwisher, N.; Khosla, M.; Dobs, K. Using artificial neural networks to ask ‘why’ questions of minds and brains. Trends Neurosci. 2023, 46, 83–95. [Google Scholar] [CrossRef] [PubMed]
- Macpherson, T.; Churchland, A.; Sejnowski, T.; DiCarlo, J.; Kamitani, Y.; Takahashi, H.; Hikida, T. Natural and Artificial Intelligence: A brief introduction to the interplay between AI and neuroscience research. Neural Netw. 2021, 144, 603–613. [Google Scholar] [CrossRef] [PubMed]
- Grassini, S. EEG for the Study of Environmental Neuroscience. In Environmental Neuroscience; Springer: Cham, Switzerland, 2024. [Google Scholar] [CrossRef]
- Jiang, C.; Li, Y.; Tang, Y.; Guan, C. Enhancing EEG-based classification of depression patients using spatial information. IEEE Trans. Neural Syst. Rehabil. Eng. 2021, 29, 566–575. [Google Scholar] [CrossRef]
- Earl, E.H.; Goyal, M.; Mishra, S.; Kannan, B.; Mishra, A.; Chowdhury, N.; Mishra, P. EEG-based Functional Connectivity in Resting and Emotional States may Identify Major Depressive Disorder using Machine Learning. Clin. Neurophysiol. 2024, 136, 105017. [Google Scholar] [CrossRef] [PubMed]
- Ahne, E.; Rosselli, M. The Impact of a Single, Brief Mindfulness Intervention on Cognitive and Emotional Reactivity: An EEG Study. Mindfulness 2024, 15, 876–888. [Google Scholar] [CrossRef]
- Park, W.; Alsuradi, H.; Eid, M. EEG Correlates to Perceived Urgency Elicited by Vibration Stimulation of the Upper Body. Sci. Rep. 2024, 14, 1789. [Google Scholar] [CrossRef] [PubMed]
- Halkiopoulos, C.; Antonopoulou, H.; Gkintoni, E.; Aroutzidis, A. Neuromarketing as an Indicator of Cognitive Consumer Behavior in Decision-Making Process of Tourism destination—An Overview. In Transcending Borders in Tourism Through Innovation and Cultural Heritage; Springer: Cham, Switzerland, 2022; pp. 679–697. [Google Scholar] [CrossRef]
- Sivaranjani, S.; Haldo, J.; Rithick, R.; Santhosh, M.; Santhoshkumar, R.P. Analyzing EEG Patterns as Predictors of Physiological Responses. In Proceedings of the 2024 International Conference on Advances in Data Engineering and Intelligent Computing Systems (ADICS), Chennai, India, 18–19 April 2024; pp. 1–6. [Google Scholar] [CrossRef]
- Saganowski, S.; Perz, B.; Polak, A.G.; Kazienko, P. Emotion Recognition for Everyday Life using Physiological Signals from Wearables: A Systematic Literature Review. IEEE Trans. Affect. Comput. 2022, 14, 1876–1897. [Google Scholar] [CrossRef]
- Grandjean, D. Brain Networks of Emotional Prosody Processing. Emot. Rev. 2021, 13, 234–245. [Google Scholar] [CrossRef]
- Ghomroudi, P.A.; Scaltritti, M.; Grecucci, A. Decoding Reappraisal and Suppression from Neural Circuits: A Combined Supervised and Unsupervised Machine Learning Approach. Cogn. Affect. Behav. Neurosci. 2023, 23, 1095–1112. [Google Scholar] [CrossRef]
- Sayal, A.; Guedes, A.G.; Almeida, I.; Pereira, D.J.; Lima, C.; Panda, R.; Paiva, R.P.; Sousa, T.; Castelo-Branco, M.; Bernardino, I.; et al. Decoding Musical Valence and Arousal: Exploring the Neural Correlates of Music-Evoked Emotions and the Role of Expressivity Features. bioRxiv 2024. [Google Scholar] [CrossRef]
- Li, Q.; Liu, Y.; Shang, Y.; Zhang, Q.; Yan, F. Deep Sparse Autoencoder and Recursive Neural Network for EEG Emotion Recognition. Entropy 2022, 24, 1187. [Google Scholar] [CrossRef] [PubMed]
- Huang, Z.; Du, C.; Wang, Y.; Fu, K.; He, H. Graph-Enhanced Emotion Neural Decoding. IEEE Trans. Med. Imaging 2023, 42, 2262–2273. [Google Scholar] [CrossRef] [PubMed]
- Kamali, A.; Milosavljevic, S.; Gandhi, A.; Lano, K.R.; Shobeiri, P.; Sherbaf, F.G.; Sair, H.I.; Riascos, R.F.; Hasan, K.M. The Cortico-Limbo-Thalamo-Cortical Circuits: An Update to the Original Papez Circuit of the Human Limbic System. Brain Topogr. 2023, 36, 371–389. [Google Scholar] [CrossRef] [PubMed]
- Aggleton, J.P.; Nelson, A.J.; O’Mara, S.M. Time to Retire the Serial Papez Circuit: Implications for Space, Memory, and Attention. Neurosci. Biobehav. Rev. 2022, 140, 104813. [Google Scholar] [CrossRef] [PubMed]
- Kaushal, P.S.; Saran, B.; Bazaz, A.; Tiwari, H. A Brief Review of Limbic System Anatomy, Function, and Its Clinical Implication. Santosh Univ. J. Health Sci. 2024, 10, 26–32. [Google Scholar] [CrossRef]
- Ferreira, T.A., Jr.; Middlebrooks, E.H.; Tzu, W.H.; Neto, M.R.; Holanda, V.M. Postmortem Dissections of the Papez Circuit and Nonmotor Targets for Functional Neurosurgery. World Neurosurg. 2020, 144, e866–e875. [Google Scholar] [CrossRef]
- Lotze, M. Emotional Processing Impairments in Patients with Insula Lesions Following Stroke. NeuroImage 2024, 291, 120591. [Google Scholar] [CrossRef]
- Álvarez-Fernández, S.; Andrade-González, N.; Simal, P.; Matias-Guiu, J.A.; Gómez-Escalonilla, C.; Rodriguez-Jimenez, R.; Stiles, B.J.; Lahera, G. Emotional Processing in Patients with Single Brain Damage in the Right Hemisphere. BMC Psychol. 2023, 11, 8. [Google Scholar] [CrossRef] [PubMed]
- Gkintoni, E.; Halkiopoulos, C.; Antonopoulou, H. Neuroleadership an Asset in Educational Settings: An Overview. Emerg. Sci. J. 2022, 6, 893–904. [Google Scholar] [CrossRef]
- Barrash, J.; Bruss, J.; Anderson, S.W.; Kuceyeski, A.; Manzel, K.; Tranel, D.; Boes, A.D. Lesions in Different Prefrontal Sectors Are Associated with Different Types of Acquired Personality Disturbances. Cortex 2022, 147, 169–184. [Google Scholar] [CrossRef]
- Nawaz, R.; Cheah, K.H.; Nisar, H.; Yap, V.V. Comparison of Different Feature Extraction Methods for EEG-Based Emotion Recognition. Biocybern. Biomed. Eng. 2020, 40, 910–926. [Google Scholar] [CrossRef]
- Wang, J.; Wang, M. Review of the Emotional Feature Extraction and Classification Using EEG Signals. Cogn. Robot. 2021, 1, 29–40. [Google Scholar] [CrossRef]
- Phadikar, S.; Sinha, N.; Ghosh, R. A Survey on Feature Extraction Methods for EEG-Based Emotion Recognition. In Intelligent Techniques and Applications in Science and Technology; Springer International Publishing: Cham, Switzerland, 2020; pp. 31–45. [Google Scholar] [CrossRef]
- Patel, P.; Annavarapu, R.N. EEG-Based Human Emotion Recognition Using Entropy as a Feature Extraction Measure. Brain Inform. 2021, 8, 20. [Google Scholar] [CrossRef] [PubMed]
- Dadebayev, D.; Goh, W.W.; Tan, E.X. EEG-Based Emotion Recognition: Review of Commercial EEG Devices and Machine Learning Techniques. J. King Saud Univ.-Comput. Inf. Sci. 2022, 34, 4385–4401. [Google Scholar] [CrossRef]
- Bouwer, F.L.; Háden, G.P.; Honing, H. Probing Beat Perception with Event-Related Potentials (ERPs) in Human Adults, Newborns, and Nonhuman Primates. In Neurobiology of Interval Timing; Springer: Cham, Switzerland, 2024. [Google Scholar] [CrossRef]
- Kwon, Y.S.; Lee, J.; Lee, S. The Impact of Background Music on Film Audience’s Attentional Processes: Electroencephalography Alpha-Rhythm and Event-Related Potential Analyses. Front. Psychol. 2022, 13, 933497. [Google Scholar] [CrossRef] [PubMed]
- Hickey, P.; Barnett-Young, A.; Patel, A.D.; Race, E. Environmental Rhythms Orchestrate Neural Activity at Multiple Stages of Processing During Memory Encoding: Evidence from Event-Related Potentials. PLoS ONE 2020, 15, e0234668. [Google Scholar] [CrossRef] [PubMed]
- Henrich, K.; Scharinger, M. Predictive Processing in Poetic Language: Event-Related Potentials Data on Rhythmic Omissions in Metered Speech. Front. Psychol. 2022, 12, 782765. [Google Scholar] [CrossRef]
- Dousset, C.; Wyckmans, F.; Monseigne, T.; Fourdin, L.; Boulanger, R.; Sistiaga, S.; Ingels, A.; Kajosch, H.; Noël, X.; Kornreich, C.; et al. Sensori-Motor Neurofeedback Improves Inhibitory Control and Induces Neural Changes: A Placebo-Controlled, Double-Blind, Event-Related Potentials Study. Int. J. Clin. Health Psychol. 2024, 24, 100501. [Google Scholar] [CrossRef] [PubMed]
- Tan, C.; Šarlija, M.; Kasabov, N. NeuroSense: Short-Term Emotion Recognition and Understanding Based on Spiking Neural Network Modelling of Spatio-Temporal EEG Patterns. Neurocomputing 2021, 454, 278–290. [Google Scholar] [CrossRef]
- Šimić, G.; Tkalčić, M.; Vukić, V.; Mulc, D.; Španić, E.; Šagud, M.; Olucha-Bordonau, F.E.; Vukšić, M.; Hof, P.R. Understanding Emotions: Origins and Roles of the Amygdala. Biomolecules 2021, 11, 823. [Google Scholar] [CrossRef] [PubMed]
- Savchenko, A.V.; Savchenko, L.V.; Makarov, I. Classifying Emotions and Engagement in Online Learning Based on a Single Facial Expression Recognition Neural Network. IEEE Trans. Affect. Comput. 2022, 13, 2132–2143. [Google Scholar] [CrossRef]
- Pomazan, V.; Tvoroshenko, I.; Gorokhovatskyi, V. Development of an Application for Recognizing Emotions Using Convolutional Neural Networks. Int. J. Acad. Inf. Syst. Res. 2023, 7, 25–36. [Google Scholar]
- Alnuaim, A.A.; Zakariah, M.; Alhadlaq, A.; Shashidhar, C.; Hatamleh, W.A.; Tarazi, H.; Shukla, P.K.; Ratna, R. Human-Computer Interaction with Detection of Speaker Emotions Using Convolution Neural Networks. Comput. Intell. Neurosci. 2022, 2022, 7463091. [Google Scholar] [CrossRef]
- Shou, Y.; Meng, T.; Ai, W.; Yang, S.; Li, K. Conversational Emotion Recognition Studies Based on Graph Convolutional Neural Networks and a Dependent Syntactic Analysis. Neurocomputing 2022, 501, 629–639. [Google Scholar] [CrossRef]
- Haddaway, N.R.; Page, M.J.; Pritchard, C.C.; McGuinness, L.A. PRISMA2020: An R package and Shiny app for producing PRISMA 2020-compliant flow diagrams, with interactivity for optimised digital transparency and Open Synthesis. Campbell Syst. Rev. 2022, 18, e1230. [Google Scholar] [CrossRef]
- van den Akker, O.R.; Peters, G.-J.Y.; Bakker, C.J.; Carlsson, R.; Coles, N.A.; Corker, K.S.; Feldman, G.; Moreau, D.; Nordström, T.; Pickering, J.S.; et al. Increasing the transparency of systematic reviews: Presenting a generalized registration form. Syst. Rev. 2023, 12, 170. [Google Scholar] [CrossRef] [PubMed]
- Adams, S.; Penton-Voak, I.; Harmer, C.; Holmes, E.; Munafo, M. Effects of emotion recognition training on mood among individuals with high levels of depressive symptoms: Study protocol for a randomised controlled trial. Trials 2013, 14, 161. [Google Scholar] [CrossRef] [PubMed]
- Aellen, F.; Göktepe-Kavis, P.; Apostolopoulos, S.; Tzovara, A. Convolutional neural networks for decoding electroencephalography responses and visualizing trial by trial changes in discriminant features. J. Neurosci. Methods 2021, 364, 109367. [Google Scholar] [CrossRef] [PubMed]
- Albein-Urios, N.; Fernandez, L.; Hill, A.; Kirkovski, M.; Enticott, P. Prefrontal anodal High Definition-tDCS has limited effects on emotion regulation. Brain Stimul. 2022, 16, 17–19. [Google Scholar] [CrossRef] [PubMed]
- Allen, J.J.B.; Harmon-Jones, E.; Cavender, J.H. Manipulation of frontal EEG asymmetry through biofeedback alters self-reported emotional responses and facial EMG. Psychophysiology 2001, 38, 685–693. [Google Scholar] [CrossRef]
- Amaral, C.P.; Simões, M.; Castelo-Branco, M. Neural Signals Evoked by Stimuli of Increasing Social Scene Complexity Are Detectable at the Single-Trial Level and Right Lateralized. PLoS ONE 2015, 10, e0121970. [Google Scholar] [CrossRef] [PubMed]
- Balconi, M.; Vandelli, G.V.; Angioletti, L. Be Creative to Innovate! EEG Correlates of Group Decision-Making in Managers. Sustainability 2024, 16, 2175. [Google Scholar] [CrossRef]
- Bois, N.D.; Bigirimana, A.; Korik, A.; Kéthina, L.G.; Rutembesa, E.; Mutabaruka, J.; Mutesa, L.; Prasad, G.; Jansen, S.; Coyle, D. Electroencephalography and psychological assessment datasets to determine the efficacy of a low-cost, wearable neurotechnology intervention for reducing Post-Traumatic Stress Disorder symptom severity. Data Brief 2022, 42, 108066. [Google Scholar] [CrossRef]
- Bois, N.D.; Bigirimana, A.; Korik, A.; Kéthina, L.G.; Rutembesa, E.; Mutabaruka, J.; Mutesa, L.; Prasad, G.; Jansen, S.; Coyle, D. Neurofeedback with low-cost, wearable electroencephalography (EEG) reduces symptoms in chronic Post-Traumatic Stress Disorder. J. Affect. Disord. 2021, 295, 1319–1334. [Google Scholar] [CrossRef] [PubMed]
- Borboni, A.; Elamvazuthi, I.; Cusano, N. EEG-Based Empathic Safe Cobot. Machines 2022, 10, 603. [Google Scholar] [CrossRef]
- Brennan, S.; McLoughlin, D.; O’Connell, R.; Bogue, J.; O’Connor, S.; Mchugh, C.; Glennon, M. Anodal transcranial direct current stimulation of the left dorsolateral prefrontal cortex enhances emotion recognition in depressed patients and controls. J. Clin. Exp. Neuropsychol. 2017, 39, 384–395. [Google Scholar] [CrossRef] [PubMed]
- Cao, D.; Li, Y.; Niznikiewicz, M.; Tang, Y.; Wang, J. The theta burst transcranial magnetic stimulation over the right PFC affects electroencephalogram oscillation during emotional processing. Prog. Neuro-Psychopharmacol. Biol. Psychiatry 2017, 82, 21–30. [Google Scholar] [CrossRef] [PubMed]
- Ciorciari, J.; Pfeifer, J.; Gountas, J. An EEG Study on Emotional Intelligence and Advertising Message Effectiveness. Behav. Sci. 2019, 9, 88. [Google Scholar] [CrossRef]
- Compton, R.; Arnstein, D.; Freedman, G.; Dainer-Best, J.; Liss, A.; Robinson, M.D. Neural and behavioral measures of error-related cognitive control predict daily coping with stress. Emotion 2011, 11, 379–390. [Google Scholar] [CrossRef] [PubMed]
- Correa, J.A.M.; Abadi, M.K.; Sebe, N.; Patras, I. AMIGOS: A Dataset for Affect, Personality and Mood Research on Individuals and Groups. IEEE Trans. Affect. Comput. 2018, 12, 479–493. [Google Scholar] [CrossRef]
- Costa, N.M.D.; Bicho, E.; Dias, N. Does priming subjects, with not only resting state but also with mindfulness or/and guided imagery, affect self-regulation of SMR neurofeedback? Framework to improve brain self-regulation and support the rehabilitation of disorders such as depression, anxiety, stress and attention control. Front. Cell. Neurosci. 2019, 13. [Google Scholar] [CrossRef]
- Dennis, T.A.; Solomon, B. Frontal EEG and emotion regulation: Electrocortical activity in response to emotional film clips is associated with reduced mood induction and attention interference effects. Biol. Psychol. 2010, 85, 456–464. [Google Scholar] [CrossRef] [PubMed]
- Dolcos, S.; Hu, Y.; Williams, C.L.; Bogdan, P.C.; Hohl, K.; Berenbaum, H.; Dolcos, F. Cultivating Affective Resilience: Proof-of-Principle Evidence of Translational Benefits From a Novel Cognitive-Emotional Training Intervention. Front. Psychol. 2021, 12, 585536. [Google Scholar] [CrossRef]
- Egana-delSol, P.; Sun, X.; Sajda, P. Neurophysiological markers of emotion regulation predict efficacy of entrepreneurship education. Sci. Rep. 2023, 13, 7206. [Google Scholar] [CrossRef] [PubMed]
- Egner, T.; Gruzelier, J. Ecological validity of neurofeedback: Modulation of slow wave EEG enhances musical performance. NeuroReport 2003, 14, 1221–1224. [Google Scholar] [CrossRef] [PubMed]
- Eldeeb, S.M.; Susam, B.T.; Akçakaya, M.; Conner, C.M.; White, S.; Mazefsky, C. Trial by trial EEG based BCI for distress versus non distress classification in individuals with ASD. Sci. Rep. 2021, 11, 6000. [Google Scholar] [CrossRef] [PubMed]
- Feeser, M.; Prehn, K.; Kazzer, P.; Mungee, A.; Bajbouj, M. Transcranial Direct Current Stimulation Enhances Cognitive Control During Emotion Regulation. Brain Stimul. 2014, 7, 105–112. [Google Scholar] [CrossRef] [PubMed]
- Friedrich, E.V.; Sivanathan, A.; Lim, T.; Suttie, N.; Louchart, S.; Pillen, S.; Pineda, J. An Effective Neurofeedback Intervention to Improve Social Interactions in Children with Autism Spectrum Disorder. J. Autism Dev. Disord. 2015, 45, 4084–4100. [Google Scholar] [CrossRef]
- Gamond, L.; Cattaneo, Z. The dorsomedial prefrontal cortex plays a causal role in mediating in-group advantage in emotion recognition: A TMS study. Neuropsychologia 2016, 93, 312–317. [Google Scholar] [CrossRef] [PubMed]
- Gao, H.; Zhang, H.; Wang, L.; Zhang, C.; Feng, Z.; Li, Z.; Tong, L.; Yan, B.; Hu, G. Altered amygdala functional connectivity after real-time functional MRI emotion self-regulation training. NeuroReport 2023, 34, 537–545. [Google Scholar] [CrossRef]
- Gootjes, L.; Franken, I.; Strien, J.W. Cognitive Emotion Regulation in Yogic Meditative Practitioners Sustained Modulation of Electrical Brain Potentials. J. Psychophysiol. 2011, 25, 87–94. [Google Scholar] [CrossRef]
- Herwig, U.; Lutz, J.; Scherpiet, S.; Scheerer, H.; Kohlberg, J.; Opialla, S.; Preuss, A.; Steiger, V.R.; Sulzer, J.; Weidt, S.; et al. Training emotion regulation through real-time fMRI neurofeedback of amygdala activity. NeuroImage 2019, 184, 687–696. [Google Scholar] [CrossRef] [PubMed]
- Iacoviello, B.; Wu, G.; Alvarez, E.; Huryk, K.M.; Collins, K.; Murrough, J.; Iosifescu, D.; Charney, D. Cognitive-emotional training as an intervention for major depressive disorder. Depress. Anxiety 2014, 31, 699–706. [Google Scholar] [CrossRef]
- Imperatori, C.; Massullo, C.; Rossi, E.D.; Carbone, G.A.; Theodorou, A.; Scopelliti, M.; Romano, L.; Gatto, C.D.; Allegrini, G.; Carrus, G.; et al. Exposure to nature is associated with decreased functional connectivity within the distress network: A resting state EEG study. Front. Psychol. 2023, 14, 1171215. [Google Scholar] [CrossRef]
- Kim, H.; Chae, Y.; Kim, S.; Im, C. Development of a Computer-Aided Education System Inspired by Face-to-Face Learning by Incorporating EEG-Based Neurofeedback Into Online Video Lectures. IEEE Trans. Learn. Technol. 2023, 16, 78–91. [Google Scholar] [CrossRef]
- Klimm, N.; Ehlis, A.; Häußinger, F.; Plewnia, C. P163. Reduction of excitability in the left inferior frontal gyrus by cathodal tDCS facilitates emotion recognition. Clin. Neurophysiol. 2015, 126, e142. [Google Scholar] [CrossRef]
- Lai, C.; Lucarelli, G.; Pellicano, G.; Massaro, G.; Altavilla, D.; Aceto, P. Neural Correlate of the Impact of Dream Recall on Emotional Processing. Dreaming 2019, 29, 40–56. [Google Scholar] [CrossRef]
- Lin, W.; Chen, Q.; Jiang, M.; Tao, J.; Liu, Z.; Zhang, X.; Wu, L.; Xu, S.; Kang, Y.; Zeng, Q. Sitting or Walking? Analyzing the Neural Emotional Indicators of Urban Green Space Behavior with Mobile EEG. J. Urban Health 2020, 97, 191–203. [Google Scholar] [CrossRef] [PubMed]
- Liu, H.; Hsieh, M.; Hsu, Y.; Lai, W. Effects of affective arousal on choice behavior, reward prediction errors, and feedback-related negativities in human reward-based decision making. Front. Psychol. 2015, 6, 592. [Google Scholar] [CrossRef] [PubMed]
- Maffei, A.; Spironelli, C.; Angrilli, A. Affective and cortical EEG gamma responses to emotional movies in women with high vs low traits of empathy. Neuropsychologia 2019, 133, 107175. [Google Scholar] [CrossRef]
- Magdin, M.; Balogh, Z.; Reichel, J.; Francisti, J.; Koprda, Š.; György, M. Automatic detection and classification of emotional states in virtual reality and standard environments (LCD): Comparing valence and arousal of induced emotions. Virtual Real. 2021, 25, 1029–1041. [Google Scholar] [CrossRef]
- Müller, K.; Tangermann, M.; Dornhege, G.; Krauledat, M.; Curio, G.; Blankertz, B. Machine learning for real-time single-trial EEG-analysis: From brain–computer interfacing to mental state monitoring. J. Neurosci. Methods 2008, 167, 82–90. [Google Scholar] [CrossRef] [PubMed]
- Müller-Dahlhaus, F.; Bergmann, T. Network perturbation-based biomarkers of depression and treatment response. Cell Rep. Med. 2023, 4, 101086. [Google Scholar] [CrossRef]
- Park, W.; Cho, M.; Park, S. Effects of Electroencephalogram Biofeedback on Emotion Regulation and Brain Homeostasis of Late Adolescents in the COVID-19 Pandemic. J. Korean Acad. Nurs. 2022, 52, 36–51. [Google Scholar] [CrossRef] [PubMed]
- Parvaz, M.; Moeller, S.; Goldstein, R.Z.; Proudfit, G.H. Electrocortical evidence of increased post-reappraisal neural reactivity and its link to depressive symptoms. Soc. Cogn. Affect. Neurosci. 2015, 10, 78–84. [Google Scholar] [CrossRef] [PubMed]
- Peng, Y.; Huang, Y.; Chen, B.; He, M.; Jiang, L.; Li, Y.; Huang, X.; Pei, C.; Zhang, S.; Li, C.; et al. Electroencephalographic Network Topologies Predict Antidepressant Responses in Patients with Major Depressive Disorder. IEEE Trans. Neural Syst. Rehabil. Eng. 2022, 30, 2577–2588. [Google Scholar] [CrossRef]
- Ramírez, R.; Planas, J.; Escudé, N.; Mercadé, J.; Farriols, C. EEG-Based Analysis of the Emotional Effect of Music Therapy on Palliative Care Cancer Patients. Front. Psychol. 2018, 9, 254. [Google Scholar] [CrossRef]
- Ruiz, S.; Sitaram, R.; Lee, S.; Soekadar, S.; Birbaumer, N. Learning to self-regulate insula cortex modulates emotion recognition and neural connectivity in schizophrenia. Schizophr. Res. 2010, 117, 178. [Google Scholar] [CrossRef]
- Şahintürk, S.; Yıldırım, E. Effects of tDCS on emotion recognition and brain oscillations. J. Clin. Exp. Neuropsychol. 2024, 46, 504–521. [Google Scholar] [CrossRef] [PubMed]
- Sarkheil, P.; Zilverstand, A.; Kilian-Hütten, N.; Schneider, F.; Goebel, R.; Mathiak, K. fMRI feedback enhances emotion regulation as evidenced by a reduced amygdala response. Behav. Brain Res. 2015, 281, 326–332. [Google Scholar] [CrossRef] [PubMed]
- Schweizer, S.; Hampshire, A.; Dalgleish, T. Extending Brain-Training to the Affective Domain: Increasing Cognitive and Affective Executive Control through Emotional Working Memory Training. PLoS ONE 2011, 6, e24372. [Google Scholar] [CrossRef] [PubMed]
- Scult, M.; Fresco, D.; Gunning, F.; Liston, C.; Seeley, S.H.; García, E.; Mennin, D. Changes in Functional Connectivity Following Treatment with Emotion Regulation Therapy. Front. Behav. Neurosci. 2019, 13, 10. [Google Scholar] [CrossRef]
- Shang, B.; Duan, F.; Fu, R.; Gao, J.; Sik, H.; Meng, X.; Chang, C. EEG-based investigation of effects of mindfulness meditation training on state and trait by deep learning and traditional machine learning. Front. Hum. Neurosci. 2023, 17, 1033420. [Google Scholar] [CrossRef] [PubMed]
- Si, Y.; Li, F.; Duan, K.; Tao, Q.; Li, C.; Cao, Z.; Zhang, Y.; Biswal, B.; Li, P.; Yao, D.; et al. Predicting individual decision-making responses based on single-trial EEG. NeuroImage 2019, 206, 116333. [Google Scholar] [CrossRef] [PubMed]
- Sun, R.; Wong, W.; Wang, J.; Wang, X.; Tong, R. Functional brain networks assessed with surface electroencephalography for predicting motor recovery in a neural guided intervention for chronic stroke. Brain Commun. 2021, 3, fcab214. [Google Scholar] [CrossRef]
- Sun, Y.; Xu, Y.; Lv, J.; Liu, Y. Self- and Situation-Focused Reappraisal are not homogeneous: Evidence from behavioral and brain networks. Neuropsychologia 2022, 173, 108282. [Google Scholar] [CrossRef]
- Tian, Y.; Zhang, H.; Pang, Y.; Lin, J. Classification for Single-Trial N170 During Responding to Facial Picture with Emotion. Front. Comput. Neurosci. 2018, 12, 68. [Google Scholar] [CrossRef]
- Trenado, C.; González-Ramírez, A.; Lizárraga-Cortés, V.; Leal, N.P.; Manjarrez, E.; Ruge, D. The Potential of Trial-by-Trial Variabilities of Ongoing-EEG, Evoked Potentials, Event Related Potentials and fMRI as Diagnostic Markers for Neuropsychiatric Disorders. Front. Neurosci. 2019, 12, 850. [Google Scholar] [CrossRef]
- Vahid, A.; Mückschel, M.; Stober, S.; Stock, A.; Beste, C. Applying deep learning to single-trial EEG data provides evidence for complementary theories on action control. Commun. Biol. 2020, 3, 112. [Google Scholar] [CrossRef] [PubMed]
- Valencia, S.; Trujillo, N.; Trujillo, S.; Acosta, A.; Rodríguez, M.; Ugarriza, J.; López, J.D.; García, A.M.; Parra, M. Neurocognitive reorganization of emotional processing following a socio-cognitive intervention in Colombian ex-combatants. Soc. Neurosci. 2020, 15, 398–407. [Google Scholar] [CrossRef]
- Vieira, L.; Marques, D.; Melo, L.; Marques, R.C.; Monte-Silva, K.; Cantilino, A. Transcranial direct current stimulation effects on cognitive reappraisal: An unexpected result? Brain Stimul. 2020, 13, 650–652. [Google Scholar] [CrossRef]
- Vonck, S.; Swinnen, S.; Wenderoth, N.; Alaerts, K. Effects of Transcranial Direct Current Stimulation on the Recognition of Bodily Emotions from Point-Light Displays. Front. Hum. Neurosci. 2015, 9, 438. [Google Scholar] [CrossRef]
- Votinov, M.; Wagels, L.; Hoffstaedter, F.; Kellermann, T.; Goerlich, K.; Eickhoff, S.; Habel, U. Effects of exogenous testosterone application on network connectivity within emotion regulation systems. Sci. Rep. 2020, 10, 2352. [Google Scholar] [CrossRef] [PubMed]
- Wang, J.; Nicol, T.; Skoe, E.; Sams, M.; Kraus, N. Emotion Modulates Early Auditory Response to Speech. J. Cogn. Neurosci. 2009, 21, 2121–2128. [Google Scholar] [CrossRef] [PubMed]
- Watve, A.; Haugg, A.; Frei, N.; Koush, Y.; Willinger, D.; Bruehl, A.B.; Stämpfli, P.; Scharnowski, F.; Sladky, R. Facing emotions: Real-time fMRI-based neurofeedback using dynamic emotional faces to modulate amygdala activity. Front. Neurosci. 2024, 17, 1286665. [Google Scholar] [CrossRef] [PubMed]
- Yang, H.; Savostyanov, A.; Tsai, A.C.; Liou, M. Face recognition in Asperger syndrome: A study on EEG spectral power changes. Neurosci. Lett. 2011, 492, 84–88. [Google Scholar] [CrossRef] [PubMed]
- Young, K.D.; Misaki, M.; Harmer, C.; Victor, T.A.; Zotev, V.; Phillips, R.; Siegle, G.; Drevets, W.; Bodurka, J. Real-Time Functional Magnetic Resonance Imaging Amygdala Neurofeedback Changes Positive Information Processing in Major Depressive Disorder. Biol. Psychiatry 2017, 82, 578–586. [Google Scholar] [CrossRef] [PubMed]
- Zhang, W.; Luo, J. The transferable placebo effect from pain to emotion: Changes in behavior and EEG activity. Psychophysiology 2009, 46, 626–634. [Google Scholar] [CrossRef]
- Zhao, Z.; Yao, S.; Li, K.; Sindermann, C.; Zhou, F.; Zhao, W.; Li, J.; Lührs, M.; Goebel, R.; Kendrick, K.; et al. Real-Time Functional Connectivity-Informed Neurofeedback of Amygdala-Frontal Pathways Reduces Anxiety. Psychother. Psychosom. 2019, 88, 5–15. [Google Scholar] [CrossRef]
- Zotev, V.; Krueger, F.; Phillips, R.; Alvarez, R.; Simmons, W.K.; Bellgowan, P.; Drevets, W.; Bodurka, J.; Domschke, K. Self-Regulation of Amygdala Activation Using Real-Time fMRI Neurofeedback. PLoS ONE 2011, 6, e24522. [Google Scholar] [CrossRef]
- Pereira, D.R.; Teixeira-Santos, A.C.; Sampaio, A.; Pinheiro, A.P. Examining the Effects of Emotional Valence and Arousal on Source Memory: A Meta-Analysis of Behavioral Evidence. Emotion 2023, 23, 1740–1760. [Google Scholar] [CrossRef] [PubMed]
- Zsidó, A.N. The Effect of Emotional Arousal on Visual Attentional Performance: A Systematic Review. Psychol. Res. 2024, 88, 1–24. [Google Scholar] [CrossRef] [PubMed]
- Yik, M.; Mues, C.; Sze, I.N.; Kuppens, P.; Tuerlinckx, F.; De Roover, K.; Kwok, F.H.C.; Schwartz, S.H.; Abu-Hilal, M.; Adebayo, D.F.; et al. On the Relationship Between Valence and Arousal in Samples Across the Globe. Emotion 2023, 23, 332–346. [Google Scholar] [CrossRef]
- Teismann, H.; Kissler, J.; Berger, K. Investigating the Roles of Age, Sex, Depression, and Anxiety for Valence and Arousal Ratings of Words: A Population-Based Study. BMC Psychol. 2020, 8, 485. [Google Scholar] [CrossRef] [PubMed]
- Irrazabal, N.; Burin, D. Effects of Emotional Valence and Arousal on Comprehension and Assembly of Instructions. Trends Psychol. 2021, 29, 104–122. [Google Scholar] [CrossRef]
- Hofbauer, L.M.; Rodriguez, F.S. Emotional Valence Perception in Music and Subjective Arousal: Experimental Validation of Stimuli. Int. J. Psychol. 2023, 58, 465–475. [Google Scholar] [CrossRef]
- Castiblanco Jimenez, I.A.; Olivetti, E.C.; Vezzetti, E.; Moos, S.; Celeghin, A.; Marcolin, F. Effective Affective EEG-Based Indicators in Emotion-Evoking VR Environments: An Evidence from Machine Learning. Neural Comput. Appl. 2024, 36, 22245–22263. [Google Scholar] [CrossRef]
- Dhivyaa, C.R.; Nithya, K.; Anbukkarasi, S. Enhancing Parkinson’s Disease Detection and Diagnosis: A Survey of Integrative Approaches Across Diverse Modalities. IEEE Access 2024, 12, 158999–159024. [Google Scholar] [CrossRef]
- Jamil, M.; Aziz, M.Z.; Yu, X. Exploring the Potential of Pretrained CNNs and Time-Frequency Methods for Accurate Epileptic EEG Classification: A Comparative Study. Biomed. Phys. Eng. Express 2024, 10, 045023. [Google Scholar] [CrossRef]
- Nasseri, M.; Attia, T.P.; Joseph, B.; Gregg, N.M.; Nurse, E.S.; Viana, P.F.; Schulze-Bonhage, A.; Duempelmann, M.; Worrell, G.; Freestone, D.R.; et al. Non-Invasive Wearable Seizure Detection Using Long-Short-Term Memory Networks with Transfer Learning. J. Neural Eng. 2021, 18, 056017. [Google Scholar] [CrossRef]
- Dong, C.; Ye, T.; Long, X.; Aarts, R.M.; van Dijk, J.P.; Shang, C.; Liao, X.; Chen, W.; Lai, W.; Chen, L.; et al. A Two-Layer Ensemble Method for Detecting Epileptic Seizures Using a Self-Annotation Bracelet with Motor Sensors. IEEE Trans. Instrum. Meas. 2022, 71, 1–13. [Google Scholar] [CrossRef]
- Zheng, Q.; Venkitaraman, A.; Petravic, S.; Frossard, P. Knowledge Distillation with Graph Neural Networks for Epileptic Seizure Detection. In Joint European Conference on Machine Learning and Knowledge Discovery in Databases; Springer: Berlin/Heidelberg, Germany, 2023; pp. 547–563. [Google Scholar] [CrossRef]
- Fontanillo Lopez, C.A.; Li, G.; Zhang, D. Beyond Technologies of Electroencephalography-Based Brain-Computer Interfaces: A Systematic Review from Commercial and Ethical Aspects. Front. Neurosci. 2020, 14, 611130. [Google Scholar] [CrossRef] [PubMed]
- Paek, A.Y.; Brantley, J.A.; Evans, B.J.; Contreras-Vidal, J.L. Concerns in the Blurred Divisions Between Medical and Consumer Neurotechnology. IEEE Syst. J. 2020, 15, 3069–3080. [Google Scholar] [CrossRef]
- Jamil, N.; Belkacem, A.N.; Ouhbi, S.; Lakas, A. Noninvasive Electroencephalography Equipment for Assistive, Adaptive, and Rehabilitative Brain-Computer Interfaces: A Systematic Literature Review. Sensors 2021, 21, 4754. [Google Scholar] [CrossRef]
- Beyrouthy, T.; Mostafa, N.; Roshdy, A.; Karar, A.S.; Alkork, S. Review of EEG-Based Biometrics in 5G-IoT: Current Trends and Future Prospects. Appl. Sci. 2024, 14, 534. [Google Scholar] [CrossRef]
- Värbu, K.; Muhammad, N.; Muhammad, Y. Past, Present, and Future of EEG-Based BCI Applications. Sensors 2022, 22, 3331. [Google Scholar] [CrossRef]
- Flanagan, K.; Saikia, M.J. Consumer-Grade Electroencephalogram and Functional Near-Infrared Spectroscopy Neurofeedback Technologies for Mental Health and Wellbeing. Sensors 2023, 23, 8482. [Google Scholar] [CrossRef]
- Xue, Z.; Zhang, Y.; Li, H.; Chen, H.; Shen, S.; Du, H. Instrumentation, Measurement, and Signal Processing in Electroencephalography-Based Brain-Computer Interfaces: Situations and Prospects. IEEE Trans. Instrum. Meas. 2024, 73, 1–18. [Google Scholar] [CrossRef]
- Jiang, Y.; Li, W.; Hossain, M.S.; Chen, M.; Alelaiwi, A.; Al-Hammadi, M. A Snapshot Research and Implementation of Multimodal Information Fusion for Data-Driven Emotion Recognition. Inf. Fusion 2020, 53, 209–221. [Google Scholar] [CrossRef]
- Karani, R.; Desai, S. Review on Multimodal Fusion Techniques for Human Emotion Recognition. Int. J. Adv. Comput. Sci. Appl. 2022, 13, 359–366. [Google Scholar] [CrossRef]
- Liu, S.; Gao, P.; Li, Y.; Fu, W.; Ding, W. Multi-Modal Fusion Network with Complementarity and Importance for Emotion Recognition. Inf. Sci. 2023, 647, 61–75. [Google Scholar] [CrossRef]
- Geetha, A.V.; Mala, T.; Priyanka, D.; Uma, E. Multimodal Emotion Recognition with Deep Learning: Advancements, Challenges, and Future Directions. Inf. Fusion 2024, 112, 102218. [Google Scholar] [CrossRef]
- Page, M.J.; McKenzie, J.E.; Bossuyt, P.M.; Boutron, I.; Hoffmann, T.C.; Mulrow, C.D.; Shamseer, L.; Tetzlaff, J.M.; Akl, E.A.; Brennan, S.E.; et al. The PRISMA 2020 statement: An updated guideline for reporting systematic reviews. BMJ 2021, 372, n71. [Google Scholar] [CrossRef] [PubMed]
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
© 2025 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).