Search Results (11)

Search Parameters:
Keywords = self-induced emotion recognition

19 pages, 9383 KiB  
Article
Using the β/α Ratio to Enhance Odor-Induced EEG Emotion Recognition
by Jiayi Fang, Genfa Yu, Shengliang Liao, Songxing Zhang, Guangyong Zhu and Fengping Yi
Appl. Sci. 2025, 15(9), 4980; https://doi.org/10.3390/app15094980 - 30 Apr 2025
Cited by 1 | Viewed by 448
Abstract
Emotion recognition using an odor-induced electroencephalogram (EEG) has broad applications in human-computer interaction. However, existing studies often rely on subjective self-reporting to label emotions, lacking objective verification. While the β/α ratio has been identified as a potential objective indicator of arousal in EEG spectral analysis, its value in emotion recognition remains underexplored. This study ensured the authenticity of emotions through self-reporting and EEG spectral analysis of 50 adults after they inhaled sandalwood essential oil (SEO) or bergamot essential oil (BEO). Classification models were built using discriminant analysis (DA), support vector machine (SVM), and random forest (RF) algorithms to identify low- or high-arousal emotions. Notably, this study is the first to introduce the β/α ratio as a frequency-domain feature to enhance model performance. Both self-reporting and EEG spectral analysis indicated that SEO promotes relaxation, whereas BEO enhances attentiveness. In model testing, incorporating the β/α ratio improved the performance of all models, with the accuracy of DA, SVM, and RF increasing from 70%, 75%, and 85% to 75%, 80%, and 95%, respectively. This study validated the authenticity of emotions by combining subjective and objective methods and highlighted the importance of β/α in emotion recognition along the arousal dimension.
(This article belongs to the Section Biomedical Engineering)
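The β/α feature this study adds can be approximated from Welch power spectra. A minimal sketch follows; the band edges (alpha 8-13 Hz, beta 13-30 Hz) and the synthetic signal are common conventions and illustrative assumptions, not the paper's confirmed choices.

```python
import numpy as np
from scipy.signal import welch
from scipy.integrate import trapezoid

def band_power(x, fs, lo, hi):
    """Integrated Welch PSD of x over the [lo, hi] Hz band."""
    freqs, psd = welch(x, fs=fs, nperseg=2 * fs)
    band = (freqs >= lo) & (freqs <= hi)
    return trapezoid(psd[band], freqs[band])

def beta_alpha_ratio(x, fs):
    """Beta (13-30 Hz) power divided by alpha (8-13 Hz) power."""
    return band_power(x, fs, 13, 30) / band_power(x, fs, 8, 13)

# Synthetic check: an alpha-dominant signal should give a ratio well below 1.
fs = 256
t = np.arange(0, 10, 1 / fs)
rng = np.random.default_rng(0)
alpha_dominant = (np.sin(2 * np.pi * 10 * t)          # strong alpha at 10 Hz
                  + 0.2 * np.sin(2 * np.pi * 20 * t)  # weak beta at 20 Hz
                  + 0.05 * rng.standard_normal(t.size))
print(beta_alpha_ratio(alpha_dominant, fs))
```

In practice this scalar would be computed per channel and appended to the other frequency-domain features before training the DA/SVM/RF models.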

22 pages, 5697 KiB  
Article
Real-Time Sensor-Based and Self-Reported Emotional Perceptions of Urban Green-Blue Spaces: Exploring Gender Differences with FER and SAM
by Xuan Zhang, Haoying Han and Guoqiang Shen
Sensors 2025, 25(3), 748; https://doi.org/10.3390/s25030748 - 26 Jan 2025
Cited by 1 | Viewed by 1064
Abstract
Urban green-blue spaces (UGBS) are increasingly recognized for their benefits to physical and mental well-being. However, research on real-time gender-specific emotional responses to UGBS remains limited. To address this gap, a dual-method approach combining facial expression recognition (FER) and self-reported measures was developed to investigate gender differences in real-time emotional evaluations of UGBS. Using static images from Google Street View as stimuli, a self-reporting experiment involving 108 participants provided insights into subjective emotional experiences. Subsequently, a FER experiment using 360-degree video stimuli captured over two million data points, validating the feasibility and advantages of real-time emotion monitoring. The findings revealed distinct gender-specific emotional patterns: women experienced stronger pleasant emotions and preferred scenes evoking higher arousal, while men demonstrated sharper responses and rated scenes with peak valence emotions more favorably. Grass elicited relaxation and delight in women and arousal in men, whereas blue spaces induced calmness across genders, with men reporting greater relaxation as water content increased. The study underscores the potential of FER technology in assessing real-time emotional responses, providing actionable insights for inclusive urban planning. By integrating advanced tools and participatory design approaches, urban planners can develop strategies that enhance emotional well-being and create livable cities that support diverse user needs.
(This article belongs to the Collection Sensors for Globalized Healthy Living and Wellbeing)

20 pages, 1685 KiB  
Article
Meditation-Induced Self-Boundary Flexibility and Prosociality: A MEG and Behavioral Measures Study
by Yoav Schweitzer, Fynn-Mathis Trautwein, Yair Dor-Ziderman, Ohad Nave, Jonathan David, Stephen Fulder and Aviva Berkovich-Ohana
Brain Sci. 2024, 14(12), 1181; https://doi.org/10.3390/brainsci14121181 - 26 Nov 2024
Cited by 1 | Viewed by 1380
Abstract
Background: In the last decade, empirical studies on the beneficial effects of meditation on prosocial capacities have accumulated, but the underlying mechanisms remain unclear. Buddhist sources state that liberating oneself from a fixed view of the self by gaining access to its transitory and malleable nature leads to increased compassion and other prosocial traits. These claims, however, have not yet been empirically tested. Methods: The current study aims to fill this gap by first examining whether 44 long-term meditators differ from 53 controls in prosocial capacities at different levels of the socio-cognitive hierarchy, and second by examining whether these are associated with meditation-induced ‘selfless’ states, operationalized here as sense-of-boundary (SB) flexibility. We capitalize on our previous work on the neurophenomenology of mindfulness-induced SB dissolution, which yielded a neural index of SB flexibility for the meditators only, and examine its correlations with a battery of validated behavioral prosociality tasks. Results: Our findings reveal enhanced low-level prosocial processes in meditators, including enhanced emotion recognition and reduced outgroup bias. We show the stability of SB flexibility over a year, demonstrating consistent high-beta deactivation. The neural index of SB flexibility negatively correlates with recognizing negative emotions, suggesting a link to reduced social-threat perception. Conclusions: These results connect the neural correlates of SB flexibility to prosociality, supported by stable high-beta deactivations. We expect the results to raise awareness regarding the prosocial potential of flexing one’s self-boundaries through meditation.
(This article belongs to the Section Cognitive, Social and Affective Neuroscience)

16 pages, 8003 KiB  
Article
AffectiVR: A Database for Periocular Identification and Valence and Arousal Evaluation in Virtual Reality
by Chaelin Seok, Yeongje Park, Junho Baek, Hyeji Lim, Jong-hyuk Roh, Youngsam Kim, Soohyung Kim and Eui Chul Lee
Electronics 2024, 13(20), 4112; https://doi.org/10.3390/electronics13204112 - 18 Oct 2024
Cited by 3 | Viewed by 1128
Abstract
This study introduces AffectiVR, a dataset designed for periocular biometric authentication and emotion evaluation in virtual reality (VR) environments. To maximize immersion in VR environments, interactions must be seamless and natural, with unobtrusive authentication and emotion recognition technologies playing a crucial role. This study proposes a method for user authentication that utilizes periocular images captured by a camera attached to a VR headset. Existing datasets have lacked periocular images acquired in VR environments, limiting their practical application. To address this, periocular images were collected from 100 participants using the HTC Vive Pro and Pupil Labs infrared cameras in a VR environment. Participants also watched seven emotion-inducing videos, and emotional evaluations for each video were conducted. The final dataset comprises 1988 monocular videos and corresponding self-assessment manikin (SAM) evaluations for each experimental video. This study also presents a baseline evaluation of biometric authentication performance on the collected dataset. A deep learning model was used to analyze biometric authentication based on the periocular data collected in a VR environment, confirming the potential for implicit and continuous authentication. The high-resolution periocular images collected in this study provide valuable data not only for user authentication but also for emotion evaluation research. The dataset can be used to enhance user immersion in VR environments and as a foundational resource for advancing emotion recognition and authentication technologies in fields such as education, therapy, and entertainment. It offers new research opportunities for non-invasive continuous authentication and emotion recognition in VR environments and is expected to contribute significantly to the future development of related technologies.
(This article belongs to the Special Issue Biometric Recognition: Latest Advances and Prospects)
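The SAM evaluations mentioned above are typically 9-point valence/arousal ratings. A minimal sketch of binning arousal ratings into low/high labels follows; the midpoint handling is a common convention, not this dataset's documented labeling scheme.

```python
# Hypothetical helper: map a 9-point SAM arousal rating to a low/high label.
def sam_arousal_label(rating, midpoint=5):
    if not 1 <= rating <= 9:
        raise ValueError("SAM ratings lie on a 1-9 scale")
    if rating < midpoint:
        return "low"
    if rating > midpoint:
        return "high"
    return None  # ambiguous midpoint ratings are often discarded

print([sam_arousal_label(r) for r in (2, 5, 8)])  # ['low', None, 'high']
```

Such labels are what downstream classifiers would train against when pairing the periocular videos with the per-video emotion evaluations.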

33 pages, 18667 KiB  
Article
Multimodal Dataset Construction and Validation for Driving-Related Anger: A Wearable Physiological Conduction and Vehicle Driving Data Approach
by Lichen Sun, Hongze Yang and Bo Li
Electronics 2024, 13(19), 3904; https://doi.org/10.3390/electronics13193904 - 2 Oct 2024
Cited by 4 | Viewed by 1991 | Correction
Abstract
Anger impairs a driver’s control and risk assessment abilities, heightening traffic accident risks. Constructing a multimodal dataset during driving tasks is crucial for accurate anger recognition. This study developed a multimodal physiological-vehicle driving dataset (DPV-MFD) based on drivers’ self-reported anger during simulated driving tasks. In Experiment 1, responses from 624 participants to anger-inducing videos and driving scenarios were collected via questionnaires to select appropriate materials. In Experiments 2 and 3, multimodal dynamic data and self-reported SAM emotion ratings were collected during simulated and real-vehicle tasks, capturing physiological and vehicle responses in neutral and anger states. Spearman’s correlation coefficient analysis validated the DPV-MFD’s effectiveness and explored the relationships between the multimodal data and emotional dimensions. A CNN-LSTM deep learning network was used to assess the emotion recognition performance of the DPV-MFD across different time windows, and its applicability to real-world driving scenarios was validated. Compared to using EEG data alone, integrating multimodal data significantly improved anger recognition accuracy, with accuracy and F1 scores rising by 4.49% and 9.14%, respectively. Additionally, real-vehicle data closely matched simulated data, confirming the dataset’s effectiveness for real-world applications. This research is pivotal for advancing emotion-aware human–machine interaction and intelligent transportation systems.
(This article belongs to the Special Issue Recent Progress of Artificial Intelligence in Virtual Reality)

15 pages, 1322 KiB  
Article
Neural Activity Associated with Symptoms Change in Depressed Adolescents following Self-Processing Neurofeedback
by Natasha Ahrweiler, Carmen Santana-Gonzalez, Na Zhang, Grace Quandt, Nikki Ashtiani, Guanmin Liu, Maggie Engstrom, Erika Schultz, Ryan Liengswangwong, Jia Yuan Teoh, Katia Kozachok and Karina Quevedo
Brain Sci. 2022, 12(9), 1128; https://doi.org/10.3390/brainsci12091128 - 25 Aug 2022
Cited by 9 | Viewed by 2885
Abstract
Adolescent depression is prevalent, debilitating, and associated with chronic lifetime mental health disorders. Understanding the neurobiology of depression is critical to developing novel treatments. We tested a neurofeedback protocol targeting emotional regulation and self-processing circuitry and examined brain activity associated with reduced symptom severity, as measured through self-report questionnaires, four hours after neurofeedback. Depressed (n = 34) and healthy (n = 19) adolescents participated in (i) a brief neurofeedback task that involved simultaneously viewing their own happy face, recalling a positive autobiographical memory, and increasing amygdala-hippocampal activity; and (ii) a self- vs. other-face recognition task with happy, neutral, and sad facial expressions before and after the neurofeedback. In depressed youth, reduced depression after neurofeedback was associated with increased activity in self-referential and visual areas during neurofeedback, specifically the cuneus, precuneus, and parietal lobe. Reduced depression was also associated with increased activation of emotional regulation and cross-modal areas during the self-recognition task, including the cerebellum, middle temporal gyrus, superior temporal gyrus, and supramarginal gyrus. However, decreased rumination was linked to decreased precuneus, angular, and temporal gyri activity during neurofeedback. These results tentatively suggest that neurofeedback may induce short-term neurobiological changes in the self-referential and emotional regulation networks associated with reduced symptom severity among depressed adolescents.
(This article belongs to the Section Neuropsychology)

17 pages, 902 KiB  
Article
An Exploratory Study on the Acoustic Musical Properties to Decrease Self-Perceived Anxiety
by Emilia Parada-Cabaleiro, Anton Batliner and Markus Schedl
Int. J. Environ. Res. Public Health 2022, 19(2), 994; https://doi.org/10.3390/ijerph19020994 - 16 Jan 2022
Cited by 6 | Viewed by 3913
Abstract
Musical listening is broadly used as an inexpensive and safe method to reduce self-perceived anxiety. This strategy is based on the emotivist assumption that emotions are not only recognised in music but also induced by it. Yet, the acoustic properties of musical works capable of reducing anxiety are still under-researched. To fill this gap, we explore whether the acoustic parameters relevant in music emotion recognition are also suitable for identifying music with relaxing properties. As an anxiety indicator, the positive statements from the six-item Spielberger State-Trait Anxiety Inventory, a self-reported score ranging from 3 to 12, are used. A user study with 50 participants assessing the relaxing potential of four musical pieces was conducted; subsequently, the acoustic parameters were evaluated. Our study shows that when using classical Western music to reduce self-perceived anxiety, tonal music should be considered. It also indicates that harmonicity is a suitable indicator of relaxing music, while the role of scoring and dynamics in reducing non-pathological listener distress should be further investigated.

12 pages, 1304 KiB  
Article
Conscientiousness in Pilots Correlates with Electrodermal Stability: Study on Simulated Flights under Social Stress
by Antonio R. Hidalgo-Muñoz, Damien Mouratille, Radouane El-Yagoubi, Yves Rouillard, Nadine Matton and Mickaël Causse
Safety 2021, 7(2), 49; https://doi.org/10.3390/safety7020049 - 18 Jun 2021
Cited by 11 | Viewed by 6469
Abstract
For pilots, the capacity to cope with anxiety is crucial during a flight since they may be confronted with stressful situations. According to the Big Five Inventory, this capacity can be modulated by two important personality traits: conscientiousness and neuroticism. The former would be related to concentration skills and the latter to the attention bias towards anxiety-provoking stimuli. Given the current development of monitoring systems for detecting the users’ state, which can be incorporated into cockpits, it is desirable to estimate their robustness to inter-individual personality differences. Indeed, several emotion recognition methods are based on physiological responses that can be modulated by specific personality profiles. The personality traits of twenty pilots were assessed. Afterwards, they performed two consecutive simulated flights without and with induced social stress while electrodermal activity was measured. Their subjective anxiety was assessed before the second flight, prior to the stress-induced condition. The results showed that higher scores in neuroticism correlated positively with cognitive and somatic anxiety. Moreover, under social stress, higher scores in conscientiousness correlated positively with electrodermal stability, i.e., a lower number of skin conductance responses. These results on both self-reported and physiological responses are in favor of the integration of personality differences into pilots’ state monitoring.

23 pages, 3484 KiB  
Article
Comparing Neural Correlates of Human Emotions across Multiple Stimulus Presentation Paradigms
by Naveen Masood and Humera Farooq
Brain Sci. 2021, 11(6), 696; https://doi.org/10.3390/brainsci11060696 - 25 May 2021
Cited by 10 | Viewed by 3691
Abstract
Most electroencephalography (EEG)-based emotion recognition systems rely on a single stimulus to evoke emotions. These systems make use of videos, sounds, and images as stimuli. Few studies have addressed self-induced emotions. The question of whether different stimulus-presentation paradigms for the same emotion produce any subject- and stimulus-independent neural correlates remains unanswered. Furthermore, a large number of studies targeting EEG-based emotional state recognition rely on the same publicly available datasets. Since a major concern and contribution of this work is classifying emotions while subjects experience different stimulus-presentation paradigms, new experiments were required. This paper presents a novel experimental study that recorded EEG data for three human emotional states evoked with four different stimulus-presentation paradigms. Fear, neutral, and joy were considered as the three emotional states. Features were extracted from the recorded EEG data with the common spatial pattern (CSP) algorithm and classified through linear discriminant analysis (LDA). The emotion-evoking paradigms comprised emotional imagery, pictures, sounds, and audio–video movie clips. Experiments were conducted with twenty-five participants. Classification performance in the different paradigms was evaluated across spectral bands. With a few exceptions, all paradigms showed the best emotion recognition in the higher-frequency spectral ranges. Interestingly, joy was classified more accurately than fear. The average neural patterns for the fear vs. joy emotional states are presented as topographical maps based on the CSP spatial filters for averaged band-power changes in all four paradigms. With respect to spectral bands, beta and alpha oscillation responses produced the highest number of significant results across the paradigms under consideration. With respect to brain region, the frontal lobe produced the most significant results irrespective of paradigm and spectral band. The temporal site also played an effective role in generating statistically significant findings. To the best of our knowledge, no prior study has examined EEG emotion recognition across four different stimulus paradigms. This work contributes toward designing EEG-based systems for human emotion recognition that could work effectively in different real-time scenarios.
(This article belongs to the Section Behavioral Neuroscience)
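The CSP-plus-LDA pipeline the abstract describes can be sketched in a textbook form. Everything below (the covariance normalization, filter-pair count, and synthetic two-class data) is an illustrative assumption, not the authors' implementation.

```python
import numpy as np
from scipy.linalg import eigh
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

def csp_filters(class_a, class_b, n_pairs=2):
    """CSP filters from two lists of (channels x samples) trials.

    Solves the generalized eigenproblem Ca w = lambda (Ca + Cb) w and keeps
    the n_pairs most extreme eigenvectors from each end of the spectrum."""
    def mean_cov(trials):
        return np.mean([x @ x.T / np.trace(x @ x.T) for x in trials], axis=0)
    Ca, Cb = mean_cov(class_a), mean_cov(class_b)
    vals, vecs = eigh(Ca, Ca + Cb)  # eigenvalues returned in ascending order
    pick = np.r_[np.arange(n_pairs), np.arange(len(vals) - n_pairs, len(vals))]
    return vecs[:, pick].T          # shape: (2 * n_pairs, channels)

def log_var_features(trials, W):
    """Log of normalized variances of the CSP-projected trials."""
    feats = []
    for x in trials:
        v = (W @ x).var(axis=1)
        feats.append(np.log(v / v.sum()))
    return np.array(feats)

# Synthetic two-class data: each class has excess variance on one channel,
# which is exactly the structure CSP is designed to expose.
rng = np.random.default_rng(1)
def make_trials(n, boosted_channel, channels=8, samples=256):
    trials = []
    for _ in range(n):
        x = rng.standard_normal((channels, samples))
        x[boosted_channel] *= 4.0
        trials.append(x)
    return trials

a, b = make_trials(30, 0), make_trials(30, 1)
W = csp_filters(a, b)
X = log_var_features(a + b, W)
y = np.r_[np.zeros(30), np.ones(30)]
clf = LinearDiscriminantAnalysis().fit(X, y)
print(clf.score(X, y))  # near-perfect on this easy synthetic problem
```

On real EEG the same pipeline would be run per spectral band and evaluated with cross-validation rather than training-set accuracy.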

22 pages, 2567 KiB  
Article
Investigating EEG Patterns for Dual-Stimuli Induced Human Fear Emotional State
by Naveen Masood and Humera Farooq
Sensors 2019, 19(3), 522; https://doi.org/10.3390/s19030522 - 26 Jan 2019
Cited by 34 | Viewed by 6983
Abstract
Most electroencephalography (EEG)-based emotion recognition systems make use of videos and images as stimuli. Few have used sounds, and even fewer studies involve self-induced emotions. Furthermore, most studies rely on a single stimulus to evoke emotions. The question of whether different stimuli for the same emotion elicitation generate any subject-independent correlations remains unanswered. This paper introduces a dual-modality emotion elicitation paradigm to investigate whether emotions induced with different stimuli can be classified. A method based on the common spatial pattern (CSP) and linear discriminant analysis (LDA) is proposed to analyze human brain signals for fear emotions evoked with two different stimuli: self-induced emotional imagery and audio/video clips. The method extracts features with the CSP algorithm, and LDA performs classification. To investigate the associated EEG correlations, a spectral analysis was performed. To further improve performance, CSP was compared with other regularized techniques. Critical EEG channels are identified based on spatial filter weights. To the best of our knowledge, our work provides the first assessment of EEG correlations for self- versus video-induced emotions captured with a commercial-grade EEG device.
(This article belongs to the Section Biosensors)

22 pages, 25630 KiB  
Article
Investigating Patterns for Self-Induced Emotion Recognition from EEG Signals
by Ning Zhuang, Ying Zeng, Kai Yang, Chi Zhang, Li Tong and Bin Yan
Sensors 2018, 18(3), 841; https://doi.org/10.3390/s18030841 - 12 Mar 2018
Cited by 65 | Viewed by 7699
Abstract
Most current approaches to emotion recognition are based on neural signals elicited by affective materials such as images, sounds, and videos. However, the application of neural patterns to the recognition of self-induced emotions remains uninvestigated. In this study, we inferred the patterns and neural signatures of self-induced emotions from electroencephalogram (EEG) signals. The EEG signals of 30 participants were recorded while they watched 18 Chinese movie clips intended to elicit six discrete emotions: joy, neutrality, sadness, disgust, anger, and fear. After watching each movie clip, the participants were asked to self-induce emotions by recalling a specific scene from the movie. We analyzed the important features, electrode distribution, and average neural patterns of the different self-induced emotions. Results demonstrated that features related to the high-frequency rhythms of EEG signals from electrodes distributed in the bilateral temporal, prefrontal, and occipital lobes perform outstandingly in the discrimination of emotions. Moreover, the six discrete categories of self-induced emotion exhibit specific neural patterns and brain topography distributions. We achieved an average accuracy of 87.36% in discriminating positive from negative self-induced emotions and 54.52% in classifying emotions into six discrete categories. Our research will help promote the development of comprehensive endogenous emotion recognition methods.
(This article belongs to the Special Issue Advanced Physiological Sensing)
