Predicting Empathy and Other Mental States During VR Sessions Using Sensor Data and Machine Learning
Abstract
1. Introduction
- Self-report questionnaires: These involve participants answering questions about their thoughts, feelings, and behaviors related to empathy. While they provide valuable insights into subjective experiences, they rely on self-perception and may be influenced by biases such as social desirability or inaccurate self-assessment [6,11].
- Performance tasks: These tasks evaluate empathic abilities by having participants engage in simulations or scenarios that elicit perspective-taking or emotion inference. They offer ecologically valid measures but primarily capture cognitive empathy and may not reflect emotional empathy. Task design and individual cognitive differences can also influence results [12].
- Behavioral observation: This method involves recording empathic behaviors, such as facial expressions, vocal tone, or prosocial actions, in natural or controlled settings and using standardized coding schemes for quantifying these behaviors. While it provides objective data, it can be time-consuming and subject to reactivity biases due to the presence of observers. Additionally, it focuses on behavioral expressions rather than emotional empathy [13].
- Psychophysiological measures: These assess physiological responses, such as heart rate, skin conductance, or brain activity, during empathic experiences. Although they do not directly predict empathy, techniques like Functional Magnetic Resonance Imaging (fMRI) and electroencephalography (EEG) have been used to study the neural mechanisms of empathy, highlighting the roles of mirror neurons and of the brain regions responsible for emotional processing. However, these methods may be influenced by factors unrelated to empathy, inter-individual variability, and technical constraints [14,15]. Importantly, measures such as heart rate (PPG) and muscle activity (EMG) have been associated with affective arousal, while movement dynamics captured by an IMU can serve as indirect markers of cognitive engagement, thereby aligning these modalities with both the affective and cognitive components of empathy.
- Immersive and embodied experiences: VR creates a sense of presence, allowing users to embody different perspectives and experiences. This immersion can enhance empathy by enabling individuals to see and feel situations from another person’s viewpoint [22] or from a viewpoint otherwise unavailable to them [23]. For instance, one study introduced a method where participants, seated in a stationary chair and wearing a VR headset, used a joystick to move an avatar forward in a virtual environment, allowing them to experience a different point of view. This approach employed electrical muscle stimulation to synchronize leg muscle sensations with the gait cycle, offering a first-person perspective that created the illusion of walking without physical movement. The study demonstrated that this technique effectively induced an embodied experience, complete with the sensation of leg movement, for seated users immersed in VR [23].
- Perspective-taking and emotional engagement: VR can simulate lifelike scenarios that elicit emotional responses, encourage perspective-taking, and immerse users in the perspectives of others, helping them better understand different feelings and experiences [24]. For example, one study demonstrated how VR can enhance emotional well-being and foster empathy in the elderly by exposing them to emotionally meaningful content. This included storytelling, immersive environments, and virtual tours of nostalgic locations. Simulated social interactions with family or peers were also incorporated to encourage emotional engagement [25]. Another notable example is the use of death simulations in VR, where users explore the concept of death in a controlled environment. As a profound and universal theme, death offers unique opportunities for fostering empathy, emotional development, and self-reflection. In these simulations, participants transition from a virtual cardiac arrest to brain death, experiencing inner-body sensations while journeying through expansive cosmic landscapes. This immersive experience encourages deep contemplation of mortality and the sublime nature of existence, highlighting VR’s capability to engage with complex emotional and existential topics [26].
- Empathy training and perspective shifts: VR-based interventions are used in healthcare, education, and diversity training to challenge biases and enhance empathic abilities. By immersing individuals in realistic scenarios, VR fosters perspective-taking and allows participants to gain deeper insights into others’ emotions and experiences, making it a powerful tool for empathy training [17]. For example, VR has been shown to effectively cultivate empathy in sensitive areas such as racism, inequity, and climate change, particularly within healthcare, where participants engage in transformative experiences that improve empathy and understanding [27]. In another study, VR-based dementia training for home care workers improved their knowledge, attitudes, competence, and empathy, demonstrating the value of immersive training methods in enhancing care skills [28]. Additionally, a study proposed a video game framework using generative AI to create personalized interactive narratives, aiming to foster empathy and social inclusion. By combining VR’s immersive nature with personalized experiences, this research explored how AI-driven games can enhance wisdom, with empathy treated as one of its core elements [29].
- Ethical considerations: Although VR has the potential to enhance empathy, its use must be guided by ethical principles. As a tool capable of evoking intense emotional responses, VR experiences should be designed responsibly to respect the dignity and privacy of all individuals involved. It is also crucial to carefully manage the emotional impact on users to avoid potential negative effects on their well-being. Achieving a balance between creating immersive empathic experiences and ensuring the emotional safety of participants is essential for using VR effectively as a tool to foster empathy [27]. For instance, scenarios such as virtual rape highlight the need to consider how VR experiences may evoke real-world violations [30]. Another study emphasized that VR can unintentionally cause harm, including empathy fatigue or distress for users and providers of empathic acts, especially in contexts involving chronic or stigmatized illnesses. Thoughtful design approaches that balance emotional engagement with actionable support are vital to ensure VR enhances well-being without adverse effects. By integrating strategies to mitigate empathy fatigue and promote self-care among users and caregivers, VR technologies can better align with the goal of positively impacting health and well-being [31].
- Personalized interventions and early prevention: The model can assess individuals’ empathy levels across different contexts, such as healthcare, education, or counseling. This information could guide strategies to foster empathic skills in those who may benefit from additional support and help prevent potential challenges in settings where empathy is critical, such as patient care or interpersonal relationships.
- Selection and training: It can help select individuals for empathy-intensive roles and guide training programs by identifying areas for improvement.
- Research and understanding: The model can contribute to research on empathy, offering insights into factors influencing it and identifying patterns across populations.
- Entertainment and interactive media: By tailoring content to users’ empathy levels, creators can enhance emotional engagement in video games, narratives, and media recommendations.
- Personal growth and self-awareness: Individuals can gain self-awareness about their empathy strengths and weaknesses, fostering personal growth and encouraging the development of empathy.
2. Background and State of the Art
2.1. General Context of the Studies Using VR in Empathy Enhancement
2.2. Using Virtual Reality to Elicit Empathy
2.3. Measuring Empathy
2.4. Predicting Empathy and Other Psychological States with Machine Learning
3. Materials and Data Collection Process
3.1. Materials and Setup for Empathy Elicitation in VR
3.2. Participants and Recording Setup
4. Methodology
4.1. Preprocessing
4.2. Data Segmentation and Feature Engineering
5. Experimental Evaluation
5.1. Experimental Setup
5.1.1. Conducted Experiments
- Different segments, using 500 ms and 5 s window sizes: We set up five experiments, labeling the dataset in five different ways (a windowing-and-labeling sketch for these setups is given after this list):
  (a) Binary classification of empathic arousal: empathic parts of the video vs. the forest part, excluding the non-empathic roller coaster segment.
  (b) Binary classification of non-empathic arousal: the forest part vs. the roller coaster part of the video.
  (c) Binary classification of empathic vs. non-empathic arousal: only the empathic parts and the roller coaster, distinguishing physiological responses to empathic content from responses to a non-empathic arousal-inducing stimulus (the roller coaster).
  (d) Binary classification of general arousal: the forest (representing no arousal) vs. everything else (the empathic parts and the roller coaster, representing arousal).
  (e) Three-class classification of arousal type: the forest and the roller coaster as separate classes, with all empathic parts grouped into one class, distinguishing no arousal, empathic arousal, and non-empathic arousal without differentiating specific empathic emotions [53].
- State empathy as nominal classification, using 500 ms and 5 s window sizes: We aimed to predict state empathy, which reflects the temporary affective response elicited in specific situations [78]. This approach used participants’ averaged responses to the three state empathy questions for each video segment. Each empathic segment was treated separately and labeled with the corresponding state empathy responses, while the forest and roller coaster segments were labeled as zero. The goal was to predict participants’ state empathy levels during the session.
- State empathy as nominal classification, using the ‘one entire video segment’ window size: The target variable was again the average of participants’ responses to the state empathy questions asked after each segment of the video session. In this case, however, the window size corresponded to one entire video segment (each video consisting of six segments/parts). This window size was used consistently in all subsequent experiments.
- State empathy as ordinal classification, using the ‘one entire video segment’ window size: Since state empathy has ordered categories, ranging from not empathic to highly empathic, and undefined distances between levels, we applied ordinal classification.
- Trait empathy as regression, using the ‘one entire video segment’ window size: Trait empathy is a continuous variable derived from the QCAE’s continuous cognitive empathy attributes (perspective-taking and online simulation) and affective empathy attributes (emotion contagion and proximal responsivity). We focused on predicting the affective empathy component, which reflects the ability to feel and share another’s emotions; this aligns with the context of this study, as the participants were not required to perform actions that would test cognitive empathy, such as understanding the actors’ feelings.
- Trait empathy as classification, using the ‘one entire video segment’ window size: Equal-interval binning was applied to the target variable, trait empathy (affective points), dividing its range (5.00 to 8.75) into five equal intervals. The resulting bins were defined as follows: class 0: [5.00–5.75); class 1: [5.75–6.50); class 2: [6.50–7.25); class 3: [7.25–8.00); and class 4: [8.00–8.75]. A sketch of this binning, together with one possible decomposition for the ordinal classification above, follows this list.
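The sketch below illustrates how the windowing and labeling used in these experiments could be implemented. It is not the paper's code: the column names ("timestamp", "segment", the sensor columns), the segment labels ("forest", "roller_coaster", "empathic_*"), and the summary statistics are illustrative assumptions; only the window sizes (0.5 s and 5 s) and the class definitions of experiments (a)-(e) follow the text.

```python
# Minimal sketch of windowing and labeling for experiments (a)-(e).
# Assumptions (not from the paper): a pandas DataFrame `signals` with a "timestamp"
# column in seconds, raw sensor columns (e.g., "ppg", "emg"), and a "segment" column
# naming the video part ("forest", "roller_coaster", "empathic_1", ...).
import numpy as np
import pandas as pd

def make_windows(signals: pd.DataFrame, window_s: float, sensor_cols: list[str]) -> pd.DataFrame:
    """Slice the recording into fixed-length windows and compute simple statistical features."""
    rows = []
    t0, t1 = signals["timestamp"].min(), signals["timestamp"].max()
    for start in np.arange(t0, t1, window_s):
        win = signals[(signals["timestamp"] >= start) & (signals["timestamp"] < start + window_s)]
        if win.empty:
            continue
        feats = {f"{c}_{stat}": getattr(win[c], stat)()
                 for c in sensor_cols for stat in ("mean", "std", "min", "max")}
        feats["segment"] = win["segment"].mode().iloc[0]  # dominant video part in the window
        rows.append(feats)
    return pd.DataFrame(rows)

def label_experiment(windows: pd.DataFrame, experiment: str) -> pd.DataFrame:
    """Attach the class label used by experiments (a)-(e); returns labeled windows only."""
    seg = windows["segment"]
    empathic = ~seg.isin(["forest", "roller_coaster"])
    if experiment == "a":    # empathic vs. forest (roller coaster excluded)
        keep, label = empathic | (seg == "forest"), empathic.astype(int)
    elif experiment == "b":  # roller coaster vs. forest
        keep, label = seg.isin(["forest", "roller_coaster"]), (seg == "roller_coaster").astype(int)
    elif experiment == "c":  # empathic vs. roller coaster
        keep, label = empathic | (seg == "roller_coaster"), empathic.astype(int)
    elif experiment == "d":  # arousal (everything else) vs. forest
        keep, label = pd.Series(True, index=windows.index), (seg != "forest").astype(int)
    else:                    # "e": forest / empathic / roller coaster as three classes
        keep = pd.Series(True, index=windows.index)
        label = seg.map(lambda s: 0 if s == "forest" else (2 if s == "roller_coaster" else 1))
    out = windows[keep].copy()
    out["label"] = label[keep]
    return out
```

Under these assumptions, `make_windows(signals, 0.5, ["ppg", "emg"])` followed by `label_experiment(..., "d")` would yield the feature-and-label table for the general arousal experiment; the 5 s variant only changes `window_s`.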
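A second sketch covers the trait-empathy binning and one common way to realize ordinal classification, namely decomposing the ordered target into cumulative binary problems (Frank and Hall's approach). The bin edges follow the text; the cumulative decomposition itself and the use of a random forest as the base learner are assumptions, since the excerpt does not specify which ordinal method or model was used.

```python
# Minimal sketch: equal-interval binning of affective trait empathy (range 5.00-8.75,
# five bins of width 0.75) and ordinal classification via K-1 binary models P(y > k).
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def bin_trait_empathy(scores: np.ndarray) -> np.ndarray:
    """Map affective QCAE scores in [5.00, 8.75] to classes 0-4 with 0.75-wide bins."""
    edges = np.array([5.00, 5.75, 6.50, 7.25, 8.00, 8.75])
    # Right-open bins; clip so the maximum score 8.75 falls into the last class.
    return np.clip(np.digitize(scores, edges[1:], right=False), 0, 4)

class CumulativeOrdinalClassifier:
    """Ordinal classification via cumulative binary targets (Frank & Hall, 2001)."""
    def __init__(self, n_classes: int):
        self.n_classes = n_classes
        self.models = [RandomForestClassifier(n_estimators=200, random_state=0)
                       for _ in range(n_classes - 1)]

    def fit(self, X: np.ndarray, y: np.ndarray):
        for k, model in enumerate(self.models):
            model.fit(X, (y > k).astype(int))  # k-th model estimates P(y > k)
        return self

    def predict(self, X: np.ndarray) -> np.ndarray:
        # P(y=0) = 1 - P(y>0); P(y=k) = P(y>k-1) - P(y>k); P(y=K-1) = P(y>K-2).
        # Differences can be slightly negative if the binary models disagree; argmax still works.
        p_gt = np.column_stack([m.predict_proba(X)[:, 1] for m in self.models])
        probs = np.empty((X.shape[0], self.n_classes))
        probs[:, 0] = 1.0 - p_gt[:, 0]
        for k in range(1, self.n_classes - 1):
            probs[:, k] = p_gt[:, k - 1] - p_gt[:, k]
        probs[:, -1] = p_gt[:, -1]
        return probs.argmax(axis=1)
```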
5.1.2. Resampling Approaches
5.1.3. Machine Learning Algorithms
5.1.4. Cross-Validation Strategy
5.1.5. Evaluation Metrics
5.2. Predicting Different Segments
5.3. Predicting State Empathy
5.3.1. State Empathy Level as Nominal Classification, Using 500 ms and 5 s Window Sizes
5.3.2. State Empathy as Nominal Classification, Using the ‘One Entire Video Segment’ Window Size
5.3.3. State Empathy as Ordinal Classification
5.4. Predicting Trait Empathy
5.4.1. Trait Empathy as Regression
5.4.2. Trait Empathy as Classification
5.5. Statistical Tests
5.5.1. Narrative vs. Non-Narrative Versions
5.5.2. Gender of the Actor
5.5.3. Type of Emotion
- There was a significant difference between “Anger” and “Happiness” (mean difference = 0.8627, p-value < 0.05), indicating that participants were more empathetic towards the “Happiness” emotion.
- Similarly, there was a significant difference between “Anger” and “Sad” (mean difference = 0.3725, p-value < 0.05), showing greater empathy towards “Sad” emotions.
- There was also a significant difference between “Anxiety” and “Happiness” (mean difference = 0.6863, p-value < 0.05), where participants were more empathetic towards “Happiness”.
- There was a significant difference between “Happiness” and “Sad” (mean difference = −0.4902, p-value < 0.05), indicating higher empathy towards “Happiness”.
- However, there was no significant difference between “Anger” and “Anxiety” (mean difference = 0.1765, p-value = 0.4595), or between “Anxiety” and “Sad” (mean difference = 0.1961, p-value = 0.3639), indicating that empathy levels for these emotions were more similar. A sketch of such pairwise post-hoc comparisons is given after this list.
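The sketch below shows one way to obtain pairwise mean differences and p-values between emotion types, assuming a long-format table with one averaged state-empathy score per participant and emotion. The excerpt reports mean differences and p-values but does not name the exact post-hoc procedure, so Tukey's HSD from statsmodels is used here as one reasonable choice, not necessarily the test applied in the study.

```python
# Minimal sketch of pairwise post-hoc comparisons between emotion types.
# Assumed input: DataFrame with columns "participant", "emotion" ("Anger", "Anxiety",
# "Happiness", "Sad"), and "empathy" (mean state-empathy score per participant/emotion).
import pandas as pd
from statsmodels.stats.multicomp import pairwise_tukeyhsd

def emotion_posthoc(df: pd.DataFrame):
    """Run Tukey's HSD over emotion groups at alpha = 0.05."""
    result = pairwise_tukeyhsd(endog=df["empathy"], groups=df["emotion"], alpha=0.05)
    # result.summary() lists each emotion pair with its mean difference,
    # adjusted p-value, and whether the null hypothesis of equal means is rejected.
    return result

# Hypothetical usage:
# df = pd.DataFrame({"participant": [...], "emotion": [...], "empathy": [...]})
# print(emotion_posthoc(df).summary())
```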
6. Discussion and Concluding Remarks
6.1. Strengths and Limitations
6.2. Future Work
Supplementary Materials
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Acknowledgments
Conflicts of Interest
References
- Harari, H.; Shamay-Tsoory, S.G.; Ravid, M.; Levkovitz, Y. Double dissociation between cognitive and affective empathy in borderline personality disorder. Psychiatry Res. 2010, 175, 277–279.
- Kizhevska, E.; Ferreira-Brito, F.; Guerreiro, T.J.; Lustrek, M. Using virtual reality to elicit empathy: A narrative review. In Proceedings of the Workshop on Virtual Reality for Health and Wellbeing (VR4Health@MUM), Lisbon, Portugal, 27–30 November 2022; pp. 19–22.
- Reniers, R.L.; Corcoran, R.; Drake, R.; Shryane, N.M.; Völlm, B.A. The QCAE: A questionnaire of cognitive and affective empathy. J. Personal. Assess. 2011, 93, 84–95.
- Shamay-Tsoory, S.G.; Aharon-Peretz, J. Dissociable prefrontal networks for cognitive and affective theory of mind: A lesion study. Neuropsychologia 2007, 45, 3054–3067.
- Han, S. Understanding cultural differences in human behavior: A cultural neuroscience approach. Curr. Opin. Behav. Sci. 2015, 3, 68–72.
- Baron-Cohen, S.; Wheelwright, S. The empathy quotient: An investigation of adults with Asperger syndrome or high functioning autism, and normal sex differences. J. Autism Dev. Disord. 2004, 34, 163–175.
- Batson, C.D. Altruism in Humans; Oxford University Press: Oxford, UK, 2011.
- Ma-Kellams, C.; Lerner, J. Trust your gut or think carefully? Examining whether an intuitive, versus a systematic, mode of thought produces greater empathic accuracy. J. Personal. Soc. Psychol. 2016, 111, 674.
- Herrera, F.; Bailenson, J.; Weisz, E.; Ogle, E.; Zaki, J. Building long-term empathy: A large-scale comparison of traditional and virtual reality perspective-taking. PLoS ONE 2018, 13, e0204494.
- Lima, F.F.D.; Osório, F.D.L. Empathy: Assessment instruments and psychometric quality—A systematic literature review with a meta-analysis of the past ten years. Front. Psychol. 2021, 12, 781346.
- Davis, M.H. A multidimensional approach to individual differences in empathy. JSAS Cat. Sel. Doc. Psychol. 1980, 10, 85.
- Lamm, C.; Batson, C.D.; Decety, J. The neural substrate of human empathy: Effects of perspective-taking and cognitive appraisal. J. Cogn. Neurosci. 2007, 19, 42–58.
- Preston, S.D.; De Waal, F.B. Empathy: Its ultimate and proximate bases. Behav. Brain Sci. 2002, 25, 1–20.
- Singer, T.; Lamm, C. The social neuroscience of empathy. Ann. N. Y. Acad. Sci. 2009, 1156, 89–96.
- MacKenzie, J.E.; Klarkowski, M.; Horton, E.M.; Theobald, M.; Danby, S.; Kervin, L.; Barrie, L.; Amery, P.K.; Andradi, M.; Smith, S.S.; et al. Using psychophysiological data to facilitate reflective conversations with children about their player experiences. Proc. ACM Hum.-Comput. Interact. 2024, 8, 347.
- Banakou, D.; Hanumanthu, P.D.; Slater, M. Virtual embodiment of white people in a black virtual body leads to a sustained reduction in their implicit racial bias. Front. Hum. Neurosci. 2016, 10, 226766.
- Mado, M.; Herrera, F.; Nowak, K.; Bailenson, J. Effect of virtual reality perspective-taking on related and unrelated contexts. Cyberpsychol. Behav. Soc. Netw. 2021, 24, 839–845.
- Sanchez-Vives, M.V.; Slater, M. From presence to consciousness through virtual reality. Nat. Rev. Neurosci. 2005, 6, 332–339.
- Ahn, S.J.; Bostick, J.; Ogle, E.; Nowak, K.L.; McGillicuddy, K.T.; Bailenson, J.N. Experiencing nature: Embodying animals in immersive virtual environments increases inclusion of nature in self and involvement with nature. J. Comput.-Mediat. Commun. 2016, 21, 399–419.
- Barbot, B.; Kaufman, J.C. What makes immersive virtual reality the ultimate empathy machine? Discerning the underlying mechanisms of change. Comput. Hum. Behav. 2020, 111, 106431.
- Riva, G.; Waterworth, J.A.; Waterworth, E.L. The layers of presence: A bio-cultural approach to understanding presence in natural and mediated environments. Cyberpsychol. Behav. 2004, 7, 402–416.
- Um, J.; Jeon, E.; Kang, Y.; Kang, S.; Elsharkawy, A.; DelPreto, J.; Matusik, W.; Rus, D.; Kim, S. LegSense: Inducing walking sensation in seated VR by providing movement illusion via electrical muscle stimulation. In Proceedings of the Companion of the 2024 ACM International Joint Conference on Pervasive and Ubiquitous Computing (UbiComp ’24), Melbourne, Australia, 5–9 October 2024; pp. 797–802.
- Slater, M.; Antley, A.; Davison, A.; Swapp, D.; Guger, C.; Barker, C.; Sanchez-Vives, M.V. A virtual reprise of the Stanley Milgram obedience experiments. PLoS ONE 2006, 1, e39.
- Benoit, M.; Guerchouche, R.; Petit, P.D.; Chapoulie, E.; Manera, V.; Chaurasia, G.; Drettakis, G.; Robert, P. Is it possible to use highly realistic virtual reality in the elderly? A feasibility study with image-based rendering. Neuropsychiatr. Dis. Treat. 2015, 11, 557–563.
- Greuter, S.; Mulvany, G.T. Designing a virtual death VR experience. In Proceedings of the Companion of the 2024 ACM International Joint Conference on Pervasive and Ubiquitous Computing (UbiComp ’24), Melbourne, Australia, 5–9 October 2024; ACM: New York, NY, USA, 2024; pp. 328–332.
- Bertrand, P.; Guegan, J.; Robieux, L.; McCall, C.A.; Zenasni, F. Learning empathy through virtual reality: Multiple strategies for training empathy-related abilities using body ownership illusions in embodied virtual reality. Front. Robot. AI 2018, 5, 326671.
- Sung, H.C.; Su, H.F.; Lee, W.L.; Yamakawa, M.; Wang, H.M. Effects of a dementia virtual reality-based training with peer support for home care workers: A cluster randomized controlled trial. Int. J. Geriatr. Psychiatry 2022, 37.
- Tucek, T. Enhancing empathy through personalized AI-driven experiences and conversations with digital humans in video games. In Proceedings of the Companion Proceedings of the 2024 Annual Symposium on Computer-Human Interaction in Play (CHI PLAY Companion ’24), Tampere, Finland, 14–17 October 2024; ACM: New York, NY, USA, 2024; pp. 446–449.
- Strikwerda, L. Present and future instances of virtual rape in light of three categories of legal philosophical theories on rape. Philos. Technol. 2015, 28, 491–510.
- Nourriz, S.; Bezabih, A.; Smith, C.E. On the design risks of empathy fatigue. In Proceedings of the 3rd Empathy-Centric Design Workshop: Scrutinizing Empathy Beyond the Individual (EmpathiCH ’24), Honolulu, HI, USA, 11 May 2024; ACM: New York, NY, USA, 2024; pp. 34–39.
- Mehrabian, A.; Epstein, N. A measure of emotional empathy. J. Personal. 1972, 40, 525–543.
- O’Brien, E.; Konrath, S.H.; Gruhn, D.; Hagen, A.L. Empathic concern and perspective taking: Linear and quadratic effects of age across the adult life span. J. Gerontol. Ser. B Psychol. Sci. Soc. Sci. 2013, 68, 168–175.
- Zwack, J.; Schweitzer, J. If every fifth physician is affected by burnout, what about the other four? Resilience strategies of experienced physicians. Acad. Med. 2013, 88, 382–389.
- Neumann, M.; Edelhäuser, F.; Tauschel, D.; Fischer, M.R.; Wirtz, M.; Woopen, C.; Haramati, A.; Scheffer, C. Empathy decline and its reasons: A systematic review of studies with medical students and residents. Acad. Med. 2011, 86, 996–1009.
- Thomas, M.R.; Dyrbye, L.N.; Huntington, J.L.; Lawson, K.L.; Novotny, P.J.; Sloan, J.A.; Shanafelt, T.D. How do distress and well-being relate to medical student empathy? A multicenter study. J. Gen. Intern. Med. 2007, 22, 177–183.
- Stargatt, J.; Bhar, S.; Petrovich, T.; Bhowmik, J.; Sykes, D.; Burns, K. The effects of virtual reality-based education on empathy and understanding of the physical environment for dementia care workers in Australia: A controlled study. J. Alzheimer’s Dis. 2021, 84, 1247–1257.
- Wijma, E.M.; Veerbeek, M.A.; Prins, M.; Pot, A.M.; Willemse, B.M. A virtual reality intervention to improve the understanding and empathy for people with dementia in informal caregivers: Results of a pilot study. Aging Ment. Health 2018, 22, 1121–1129.
- Tay, J.; Qu, Y.; Lim, L.; Puthran, R.; Tan, C.; Rajendran, R.; Wei, K.; Xie, H.; Sim, K. Impact of a virtual reality intervention on stigma, empathy, and attitudes toward patients with psychotic disorders among mental health care professionals: Randomized controlled trial. JMIR Ment. Health 2025, 12, e66925.
- Lin, H.L.; Wang, Y.C.; Huang, M.L.; Yu, N.W.; Tang, I.; Hsu, Y.C.; Huang, Y.S. Can virtual reality technology be used for empathy education in medical students: A randomized case-control study. BMC Med. Educ. 2024, 24, 1254.
- Rehl, D.; Mangapora, M.; Love, M.; Love, C.; Shaw, K.; McCarthy, J.; Beverly, E.A. Feasibility of a cinematic-virtual reality training program about opioid use disorder for osteopathic medical students: A single-arm pre–post study. J. Osteopath. Med. 2024, 124, 509–516.
- Anderson, T.; Duffy, G.; Corry, D. Virtual reality education on myalgic encephalomyelitis for medical students and healthcare professionals: A pilot study. BMC Med. Educ. 2024, 24, 1018.
- Liu, J.; Mak, P.; Chan, K.; Cheung, D.; Cheung, K.; Fong, K.; Kor, P.; Lai, T.; Maximo, T. The effects of immersive virtual reality–assisted experiential learning on enhancing empathy in undergraduate health care students toward older adults with cognitive impairment: Multiple-methods study. JMIR Med. Educ. 2024, 10, e48566.
- Van Loon, A.; Bailenson, J.; Zaki, J.; Bostick, J.; Willer, R. Virtual reality perspective-taking increases cognitive empathy for specific others. PLoS ONE 2018, 13, e0202442.
- Ventura, S.; Cardenas, G.; Miragall, M.; Riva, G.; Baños, R. How does it feel to be a woman victim of sexual harassment? The effect of 360°-video-based virtual reality on empathy and related variables. Cyberpsychol. Behav. Soc. Netw. 2021, 24, 258–266.
- Bouchard, S.; Bernier, F.; Boivin, É.; Dumoulin, S.; Laforest, M.; Guitard, T.; Robillard, G.; Monthuy-Blanc, J.; Renaud, P. Empathy toward virtual humans depicting a known or unknown person expressing pain. Cyberpsychol. Behav. Soc. Netw. 2013, 16, 61–71.
- Fusaro, M.; Tieri, G.; Aglioti, S.M. Seeing pain and pleasure on self and others: Behavioral and psychophysiological reactivity in immersive virtual reality. J. Neurophysiol. 2016, 116, 2656–2662.
- Fusaro, M.; Tieri, G.; Aglioti, S.M. Influence of cognitive stance and physical perspective on subjective and autonomic reactivity to observed pain and pleasure: An immersive virtual reality study. Conscious. Cogn. 2019, 67, 86–97.
- Lee, J.-H.; Lee, S.E.; Kwon, Y.-S. Exploring empathic engagement in immersive media: An EEG study on mu rhythm suppression in VR. PLoS ONE 2024, 19, e0303553.
- Raposo, R.; Vairinhos, M.; Laska-Leśniewicz, A.; Sztobryn-Giercuszkiewicz, J. Increasing awareness and empathy among university students through immersive exercises—Testing of the virtual reality application: A pilot study. Med. Pracy. Work. Health Saf. 2024, 74, 187–197.
- Nelson, K.M.; Anggraini, E.; Schlüter, A. Virtual reality as a tool for environmental conservation and fundraising. PLoS ONE 2020, 15, e0223631.
- Tamantini, C.; Cordella, F.; Scotto di Luzio, F.; Lauretti, C.; Campagnola, B.; Santacaterina, F.; Bravi, M.; Bressi, F.; Draicchio, F.; Miccinilli, S.; et al. A fuzzy-logic approach for longitudinal assessment of patients’ psychophysiological state: An application to upper-limb orthopedic robot-aided rehabilitation. J. Neuroeng. Rehabil. 2024, 21, 202.
- Kizhevska, E.; Šparemblek, K.; Luštrek, M. Protocol of the study for predicting empathy during VR sessions using sensor data and machine learning. PLoS ONE 2024, 19, e0307385.
- Nan, J.; Herbert, M.S.; Purpura, S.; Henneken, A.N.; Ramanathan, D.; Mishra, J. Personalized machine learning-based prediction of wellbeing and empathy in healthcare professionals. Sensors 2024, 24, 2640.
- Kiesow, H.; Spreng, R.N.; Holmes, A.J.; Chakravarty, M.M.; Marquand, A.F.; Yeo, B.T.; Bzdok, D. Deep learning identifies partially overlapping subnetworks in the human social brain. Commun. Biol. 2021, 4, 65.
- Kaźmierczak, M.; Rybicka, M.; Syty, P. Genetic variations as predictors of dispositional and dyadic empathy—A couple study. Sci. Rep. 2024, 14, 27411.
- Abdel-Ghaffar, E.A.; Salama, M. The effect of stress on a personal identification system based on electroencephalographic signals. Sensors 2024, 24, 4167.
- Mercado-Diaz, L.R.; Veeranki, Y.R.; Large, E.W.; Posada-Quintero, H.F. Fractal analysis of electrodermal activity for emotion recognition: A novel approach using detrended fluctuation analysis and wavelet entropy. Sensors 2024, 24, 8130.
- Hwang, G.; Yoo, S.; Yoo, J. Emotion recognition using PPG signals of smartwatch on purpose of threat detection. Sensors 2025, 25, 18.
- Dar, M.N.; Akram, M.U.; Subhani, A.R.; Khawaja, S.G.; Reyes-Aldasoro, C.C.; Gul, S. Insights from EEG analysis of evoked memory recalls using deep learning for emotion charting. Sci. Rep. 2024, 14, 17080.
- Kołodziej, M.; Majkowski, A.; Jurczak, M. Acquisition and analysis of facial electromyographic signals for emotion recognition. Sensors 2024, 24, 4785.
- Irshad, M.T.; Li, F.; Nisar, M.A.; Huang, X.; Buss, M.; Kloep, L.; Peifer, C.; Kozusznik, B.; Pollak, A.; Pyszka, A.; et al. Wearable-based human flow experience recognition enhanced by transfer learning methods using emotion data. Comput. Biol. Med. 2023, 166, 107489.
- Sanipatín-Díaz, P.A.; Rosero-Montalvo, P.D.; Hernandez, W. Portable facial expression system based on EMG sensors and machine learning models. Sensors 2024, 24, 3350.
- Kujala, M.V.; Parkkonen, L.; Kujala, J. Empathy enhances decoding accuracy of human neurophysiological responses to emotional facial expressions of humans and dogs. Soc. Cogn. Affect. Neurosci. 2024, 19, nsae082.
- Hinkle, L.B.; Roudposhti, K.K.; Metsis, V. Physiological measurement for emotion recognition in virtual reality. In Proceedings of the 2019 2nd International Conference on Data Intelligence and Security (ICDIS), South Padre Island, TX, USA, 28–30 June 2019; IEEE: Piscataway, NJ, USA, 2019; pp. 136–143.
- Arslan, E.E.; Akşahin, M.F.; Yilmaz, M.; Ilgın, H.E. Towards emotionally intelligent virtual environments: Classifying emotions through a biosignal-based approach. Appl. Sci. 2024, 14, 8769.
- Mavridou, I.; Seiss, E.; Kostoulas, T.; Nduka, C.; Balaguer-Ballester, E. Towards an effective arousal detection system for virtual reality. In Proceedings of the Workshop on Human-Habitat for Health (H3), Boulder, CO, USA, 16 October 2018; ACM: New York, NY, USA, 2018; pp. 1–6.
- Bulagang, A.F.; Mountstephens, J.; Teo, J. Multiclass emotion prediction using heart rate and virtual reality stimuli. J. Big Data 2021, 8, 12.
- Olderbak, S.; Sassenrath, C.; Keller, J.; Wilhelm, O. An emotion-differentiated perspective on empathy with the emotion specific empathy questionnaire. Front. Psychol. 2014, 5, 653.
- Lawrence, E.J.; Shaw, P.; Baker, D.; Baron-Cohen, S.; David, A.S. Measuring empathy: Reliability and validity of the Empathy Quotient. Psychol. Med. 2004, 34, 911–920.
- Van Boxtel, A. Facial EMG as a tool for inferring affective states. Proc. Meas. Behav. 2010, 2, 104–108.
- Fridlund, A.J.; Schwartz, G.E.; Fowler, S.C. Pattern recognition of self-reported emotional state from multiple-site facial EMG activity during affective imagery. Psychophysiology 1984, 21, 622–637.
- Gnacek, M.; Podlesek, A.; Pelicon, A.; Štrumbelj, E.; Pogačnik, M. emteqpro—fully integrated biometric sensing array for non-invasive biomedical research in virtual reality. Front. Virtual Real. 2022, 3, 781218.
- Shen, L. On a scale of state empathy during message processing. West. J. Commun. 2010, 74, 504–524.
- Bradley, M.M.; Lang, P.J. Measuring emotion: The self-assessment manikin and the semantic differential. J. Behav. Ther. Exp. Psychiatry 1994, 25, 49–59.
- Russell, J.A.; Barrett, L.F. Core affect, prototypical emotional episodes, and other things called emotion: Dissecting the elephant. J. Personal. Soc. Psychol. 1999, 76, 805–819.
- Schutte, N.S.; Stilinović, E.J. Facilitating empathy through virtual reality. Motiv. Emot. 2017, 41, 708–712.
- Van der Graaff, J.; Meeus, W.; de Wied, M.; van Boxtel, A.; van Lier, P.A.; Koot, H.M. Motor, affective and cognitive empathy in adolescence: Interrelations between facial electromyography and self-reported trait and state measures. Cogn. Emot. 2016, 30, 745–761.
- Brown, G.W. Standard deviation, standard error: Which ‘standard’ should we use? Am. J. Dis. Child. 1982, 136, 937–941.
Video Type | Average Empathy Score
---|---
Happiness | 2.90
Sad | 2.41
Anxiety | 2.22
Anger | 2.04