Review

Real-Time Applications of Biophysiological Markers in Virtual-Reality Exposure Therapy: A Systematic Review

by Marie-Jeanne Fradette 1,2, Julie Azrak 1,2, Florence Cousineau 1,2, Marie Désilets 1 and Alexandre Dumais 1,2,3,*

1 Research Center of the Institut Universitaire en Santé Mentale de Montréal, 7331 Hochelaga, Montreal, QC H1N 3V2, Canada
2 Faculty of Medicine, Université de Montréal, 2900 Edouard Montpetit Blvd, Montreal, QC H3T 1J4, Canada
3 Institut National de Psychiatrie Légale Philippe-Pinel, 10905 Blvd Henri-Bourassa E, Montreal, QC H1C 1H1, Canada
* Author to whom correspondence should be addressed.
BioMedInformatics 2025, 5(3), 48; https://doi.org/10.3390/biomedinformatics5030048
Submission received: 17 June 2025 / Revised: 13 August 2025 / Accepted: 15 August 2025 / Published: 28 August 2025
(This article belongs to the Section Applied Biomedical Data Science)

Abstract

Virtual-reality exposure therapy (VRET) is an emerging treatment for psychiatric disorders that enables immersive and controlled exposure to anxiety-provoking stimuli. Recent developments integrate real-time physiological monitoring, including heart rate (HR), electrodermal activity (EDA), and electroencephalography (EEG), to dynamically tailor therapeutic interventions. This systematic review examines studies that combine VRET with physiological data to adapt virtual environments in real time. A comprehensive search of major databases identified fifteen studies meeting the inclusion criteria: all employed physiological monitoring and adaptive features, with ten using biofeedback to modulate exposure based on single or multimodal physiological measures. The remaining studies leveraged physiological signals to inform scenario selection or threat modulation using dynamic categorization algorithms and machine learning. Although findings currently show an overrepresentation of anxiety disorders, recent studies are increasingly involving more diverse clinical populations. Results suggest that adaptive VRET is technically feasible and offers promising personalization benefits; however, the limited number of studies, methodological variability, and small sample sizes constrain broader conclusions. Future research should prioritize rigorous experimental designs, standardized outcome measures, and greater diversity in clinical populations. Adaptive VRET represents a frontier in precision psychiatry, where real-time biosensing and immersive technologies converge to enhance individualized mental health care.

1. Introduction

Virtual-reality exposure therapy (VRET) has emerged as a potent digital intervention for psychiatric disorders, enabling patients to engage with fear-inducing stimuli within safe, immersive, and precisely controlled virtual environments [1,2,3,4]. Grounded in cognitive-behavioural principles, VRET facilitates gradual, customizable exposure that helps individuals identify triggers and develop adaptive coping strategies [5,6,7]. Over the past two decades, advancements in display technology and computational power have increased the accessibility and efficacy of VRET, positioning it as a viable and, in some cases, superior alternative to traditional in vivo exposure therapies for conditions such as social anxiety, specific phobias, and post-traumatic stress disorder (PTSD) [8,9,10,11,12,13].
Concurrently, physiological measures, such as heart rate (HR), electrodermal activity (EDA), and electroencephalography (EEG), have been increasingly incorporated into virtual reality (VR) protocols [14,15]. These biomarkers serve multiple roles: assessing baseline arousal, monitoring treatment response, and validating the emotional salience of virtual stimuli [16,17,18,19,20]. For instance, heightened psychophysiological reactivity to trauma-related VR content reliably distinguishes patients with PTSD from controls, underscoring their diagnostic utility [21,22]. Physiological metrics are also used pre- and post-intervention to evaluate changes in stress regulation and emotional reactivity and to infer immersive presence [14,16,23,24,25,26,27].
More recently, biofeedback has been integrated into VR environments [28]. The meta-analysis by Kothgassner et al. [28] suggests that VR-based biofeedback effectively reduces anxiety and heart rate, potentially enhancing patient agency and facilitating the generalization of self-regulation skills; however, its clinical superiority over traditional biofeedback remains uncertain due to a limited number of high-quality comparative studies [29]. While preliminary evidence is promising, particularly regarding advantages in user engagement and motivation compared to traditional therapy, further rigorous randomized controlled trials are needed to definitively establish whether VR biofeedback offers superior clinical outcomes [30,31].
Building on these advances, the integration of real-time physiological monitoring directly into VRET shows considerable potential [32]. Unlike biofeedback—which emphasizes conscious self-regulation—real-time adaptive VRET uses physiological data to autonomously modulate virtual stimuli, creating a closed-loop system that dynamically responds to the user’s emotional state without requiring active intervention [33,34]. Closed-loop systems could enable precise tailoring of exposure intensity, stimulus proximity, or calming features in response to detected stress or anxiety, thereby enhancing engagement, preventing emotional overload, and potentially expanding VRET accessibility in self-guided or remote formats [35,36,37]. Despite this promise, challenges remain in selecting valid physiological markers, designing responsive VR environments, and implementing reliable machine-learning algorithms for real-time affective state detection.
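To make the closed-loop principle concrete, the following minimal Python sketch illustrates one way a sampled, normalized arousal index could steer exposure intensity toward a target band; the thresholds, step size, and signal source are illustrative assumptions rather than parameters drawn from the reviewed systems.

```python
# Minimal sketch of a closed-loop adaptive exposure controller.
# All thresholds, step sizes, and signal sources are illustrative assumptions,
# not values reported in the reviewed studies.

from dataclasses import dataclass


@dataclass
class ExposureState:
    intensity: float = 0.3  # 0.0 (calm scene) .. 1.0 (maximum exposure)


def update_exposure(state: ExposureState, arousal: float,
                    low: float = 0.35, high: float = 0.65,
                    step: float = 0.05) -> ExposureState:
    """Nudge exposure intensity to keep arousal inside a target band."""
    if arousal > high:        # over-aroused: ease off the stimulus
        state.intensity = max(0.0, state.intensity - step)
    elif arousal < low:       # under-aroused: intensify the exposure
        state.intensity = min(1.0, state.intensity + step)
    # within the band: hold intensity so habituation can proceed
    return state


state = ExposureState()
for arousal_sample in [0.20, 0.40, 0.72, 0.80, 0.55]:  # simulated normalized arousal
    state = update_exposure(state, arousal_sample)
    print(f"arousal={arousal_sample:.2f} -> exposure intensity={state.intensity:.2f}")
```

In practice, the arousal index would be derived from preprocessed EDA, HR, or multimodal features, and the intensity value would map onto scene parameters such as stimulus proximity or density.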
This paper systematically reviews the emerging paradigm of real-time adaptive VRET, focusing on the incorporation of physiological data streams within closed-loop therapeutic designs. Specifically, this review aims to (1) identify the technologies and physiological markers employed, and (2) characterize the clinical contexts in which these adaptive approaches are applied.

2. Materials and Methods

2.1. Search Strategy

A detailed search of the databases PubMed, Medline, EMBASE, EBM Reviews—Cochrane Database of Systematic Reviews, and PsycINFO was completed in July 2025 by a research librarian (M.D.) at the Institut Universitaire en Santé Mentale de Montréal (IUSMM). The search strategy was developed by M.-J.F. in collaboration with M.D., who tailored it to each database. Keywords and subject heading terms were used to target papers addressing the intersection between VRET (virtual reality, virtual-reality exposure therapy, virtual environment, or computer simulation) and physiological parameters (biofeedback, heart rate, or arousal). The search strategies were also reviewed by a librarian peer using the PRESS checklist [38]. The search strategy and concept plan are provided in the Supplementary Materials. Searches were limited to English- and French-language sources, as every research team member was fluent in both.

2.2. Study Selection

Results from all searches were initially imported into Endnote, and duplicate records were removed. The references cited within relevant retrieved papers were also scanned for new sources. Records were then imported into Covidence [39], an online platform that allowed for independent eligibility assessment by a graduate student (M.-J.F.) and an undergraduate student (F.C.), initially based on titles and abstracts. Then, full-text assessments were performed by M.-J.F. and another graduate student (J.A.), and the remaining studies were included in this review. The inclusion of a study was based on (i) the use of a virtual-reality exposure therapy intervention, (ii) the measurement of physiological markers during the intervention, and (iii) real-time adaptation/changes in the intervention. To ensure consensus, discussions were held with the research team regarding the inclusion of studies. Studies were excluded if they evaluated VR interventions unrelated to subclinical or clinical psychiatric conditions, as defined by the DSM-5 [40] (e.g., gaming), or comprised posters, preprints, non-peer-reviewed reports, study protocols with no available data, non-accessible manuscripts, or papers from the media.

2.3. Data Extraction

Key information related to the study design, sample characteristics, and the outcome measured (real-time adaptation) was independently extracted by M.-J.F. and J.A. This systematic review was performed according to the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) guidelines [41].

2.4. Risk of Bias

The risk of bias in each included study was independently assessed by both reviewers using the revised Cochrane risk-of-bias tool for randomized trials [42] and the Risk Of Bias in Non-Randomized Studies of Interventions tool [43]. The reviewers were not blinded to the authors, journal, or publication.

3. Results

The search yielded 962 articles after removing duplicates. Out of the total records retrieved, fifteen studies [44,45,46,47,48,49,50,51,52,53,54,55,56,57,58] met the inclusion criteria and were included in the systematic review. The PRISMA flowchart [41] detailing each phase of information retrieval is presented in Figure 1. Studies (n = 15) were published between 2009 and 2025, with a marked increase in publications beginning in 2024. A summary of the data extraction is presented in Table 1. As shown in Table 1, the current evidence is constrained by small sample sizes, heterogeneous methodologies, and limited diversity of clinical populations, which together hinder direct comparisons across studies. The distribution of physiological measures across diagnoses is presented in Figure 2.
Biofeedback mechanisms were reported in ten studies [45,46,49,50,52,54,55,56,57,58]. Early work focused on biofeedback-enhanced VR for relaxation and controlled exposure to anxiety and stress [45,46,49]. Pallavicini et al. [49] and Gorini et al. [46] measured HR and EDA, while Gaggioli et al. [45] combined EEG, HR, HRV, respiration, and gesture recognition. Feedback was presented visually; for example, a virtual campfire diminished in intensity with reduced arousal [45,46,49]. All three randomized controlled trials included a mobile phone component for at-home use. Participants with generalized anxiety disorder (GAD) [46,49] or occupational stress [45] were randomized to VR and Mobile with Biofeedback (VRMB), VR and Mobile (VRM), or a waitlist. Both VRMB and VRM groups showed clinical improvements, with the VRMB group reporting greater reductions in anxiety symptoms and physiological arousal (EDA and HR). Gaggioli et al. [45] found that only the VRMB group had a significant reduction in trait anxiety, and the qualitative data suggested that the mobile app supported therapeutic generalization. Improvements in coping skills (use of emotional support, denial, positive attitude, being problem-focused, and religious coping) were observed across both treatment groups, though only VRMB participants showed significant gains in emotional support skills.
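The visual-metaphor biofeedback described above (e.g., a campfire that dims as arousal falls) amounts to a monotonic mapping from a smoothed physiological index to a scene parameter. The sketch below illustrates that idea under the assumption of an already-normalized arousal score; the smoothing constant and mapping are hypothetical and are not the implementations used in the cited trials.

```python
# Illustrative biofeedback mapping: a smoothed, normalized arousal index (0-1)
# drives the brightness of a virtual campfire, which dims as the user relaxes.
# The smoothing factor and the linear mapping are assumptions for demonstration.

def smooth(previous: float, sample: float, alpha: float = 0.2) -> float:
    """Exponential moving average to damp momentary sensor fluctuations."""
    return (1 - alpha) * previous + alpha * sample


def campfire_brightness(arousal: float) -> float:
    """Lower arousal -> dimmer fire, providing a visual reward for relaxing."""
    return max(0.0, min(1.0, arousal))


estimate = 0.8  # starting arousal estimate
for raw in [0.75, 0.70, 0.60, 0.50, 0.42]:  # simulated EDA/HR-derived samples
    estimate = smooth(estimate, raw)
    print(f"arousal~{estimate:.2f} -> fire brightness {campfire_brightness(estimate):.2f}")
```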
From 2024 onward, breathing- and HRV-based biofeedback dominated newer interventions [52,54,55,57,58] aimed at training slow, deep breathing in immersive environments. In forensic psychiatric inpatients, the DEEP game [52] used visual cues such as a breathing circle and growth of plants/corals to reinforce diaphragmatic breathing; half the participants showed physiological and self-reported benefits, though effects were heterogeneous and transfer to daily life was limited. Pratviel et al. [54] delivered HRV biofeedback via visual and haptic pacing cues in natural VR scenes, producing comparable gains in HRV, cardiac coherence, and stress reduction to a real-world setup, with no added VR advantage. Xu et al. [57] integrated HRV biofeedback into an interactive VR mindfulness scene, where fog and environmental elements shifted in real time; a single 5 min session reduced anxiety, increased mindfulness, and improved HRV compared to audio mindfulness or the control.
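Because several of these newer interventions gate visual pacing cues on HRV, a small worked example may help: RMSSD, a standard time-domain HRV index, is computed from successive inter-beat (RR) intervals and compared with a baseline segment. The computation below is standard, but the trigger rule and all numbers are illustrative assumptions, not the logic of any reviewed study.

```python
# RMSSD (root mean square of successive differences) from RR intervals in ms,
# a standard time-domain HRV index; the threshold rule is an illustrative assumption.

import math


def rmssd(rr_intervals_ms: list[float]) -> float:
    diffs = [b - a for a, b in zip(rr_intervals_ms, rr_intervals_ms[1:])]
    return math.sqrt(sum(d * d for d in diffs) / len(diffs))


baseline_rr = [820, 810, 835, 815, 830, 825]  # resting segment (simulated)
paced_rr = [800, 850, 790, 860, 795, 865]     # slow-breathing segment (simulated)

baseline = rmssd(baseline_rr)
current = rmssd(paced_rr)

# Reinforce the visual pacing cue only if HRV rises meaningfully above baseline.
if current > 1.2 * baseline:
    print(f"RMSSD {current:.1f} ms > baseline {baseline:.1f} ms: grow the breathing circle")
else:
    print(f"RMSSD {current:.1f} ms: keep pacing cue unchanged")
```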
Other studies integrated multimodal or mixed-signal biofeedback. Premkumar et al. [55] combined HR and frontal alpha asymmetry feedback into self-guided VRET for public-speaking anxiety, showing greater physiological arousal reduction across sessions than VRET alone, despite similar anxiety score improvements. Zeng et al. [58] developed a multimodal (visual, auditory, tactile, and olfactory) VR system guiding diaphragmatic breathing, sustained exhalation, and breath-holding in gamified “Flame Altar,” “Cloud City,” and “Underwater Diving” scenarios. Real-time respiratory data triggered scene changes, and EEG data indicated neural relaxation correlates alongside positive usability feedback.
Attention-based feedback using eye tracking was tested in two studies [50,56]. Bekele et al. [50] developed a VR task for individuals with autism spectrum disorder (ASD) in which gaze to selected facial features (eyes and mouth) gradually revealed an emotional face, improving emotion recognition and reducing performance gaps relative to controls. However, initial participant confusion and reliance on gaze alone for adaptation limited engagement gains. Wechsler et al. [56] created a VRET for children with arachnophobia, where sustained gaze triggered positive changes in virtual spiders (e.g., dancing and shrinking). Both phobic and non-phobic children reported more positive than negative affect, and those with spider phobia showed large fear reductions, supporting the approach’s clinical potential.
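The attention-based mechanism described here (sustained gaze unlocking a positive change in the stimulus) can be thought of as a dwell-time trigger. The sketch below illustrates that logic with hypothetical gaze samples and an assumed dwell threshold; it does not reproduce the eye-tracking implementation of either cited study.

```python
# Illustrative dwell-time trigger: sustained gaze on a target region accumulates
# dwell time and fires an event (e.g., the virtual spider starts dancing).
# Sampling rate, threshold, and gaze data are assumptions for demonstration.

SAMPLE_INTERVAL_S = 0.1   # 10 Hz gaze samples (assumed)
DWELL_THRESHOLD_S = 1.5   # continuous gaze needed to trigger the reward (assumed)


def process_gaze(samples_on_target: list[bool]) -> list[int]:
    """Return the sample indices at which the dwell threshold is reached."""
    triggers, dwell = [], 0.0
    for i, on_target in enumerate(samples_on_target):
        dwell = dwell + SAMPLE_INTERVAL_S if on_target else 0.0  # reset on look-away
        if dwell >= DWELL_THRESHOLD_S:
            triggers.append(i)
            dwell = 0.0  # re-arm the trigger after firing
    return triggers


# Simulated gaze trace: brief glance, look-away, then sustained fixation.
trace = [True] * 8 + [False] * 5 + [True] * 20
print("reward triggered at samples:", process_gaze(trace))
```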
The discrete categorization of anxiety or fear levels to adapt the VR environments in real time was used in five studies [44,47,48,51,53]. Bălan et al. [44] and Mevlevioğlu et al. [48] applied machine learning classifiers (e.g., SVM, CNN, and DNN) to multimodal physiological data (EDA, HR, and EEG) to predict fear or anxiety severity and tailor exposure sequences or therapist support, with reasonable accuracy despite inter-individual variability. Kasimova et al. [51] used HRV and CS-index metrics to stratify combatants with PTSD into response categories during VR exposure, enabling personalized therapy and biofeedback-guided self-regulation. Kritikos et al. [47] adapted arachnophobia VRET stimuli dynamically based on EDA, adjusting spider number, size, and movement to maintain targeted arousal states, resulting in participants sustaining desired emotional states twice as long as with static VR. Marques et al. [53] developed a modular VR framework integrating serious games with wearable biosensors (ECG, HR, respiration, EDA, and EEG) for both automatic and therapist-controlled real-time adjustment of exposure intensity. Two open-world scenarios, spider capture for arachnophobia and a missing-dog search for acrophobia, were designed with built-in respite periods to manage anxiety and collect baseline data. Preliminary evaluations confirmed the feasibility of synchronized biosignal capture, adaptive difficulty adjustment, and maintaining engagement through interactive narratives [44,47,48,51,53].
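As a generic sketch of this discrete-categorization approach (not a reproduction of any cited pipeline), the snippet below fits a scikit-learn support-vector classifier to synthetic multimodal features (mean EDA, mean HR, and a frontal alpha/beta power ratio) and predicts a fear-level class that could index an exposure tier; the feature set, class labels, and data are placeholders.

```python
# Illustrative fear-level classification on synthetic multimodal features.
# Feature definitions, labels, and data are placeholders, not values from the
# reviewed studies; scikit-learn's SVC stands in for the various classifiers cited.

import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Columns: mean EDA (uS), mean HR (bpm), frontal alpha/beta power ratio.
calm = rng.normal([2.0, 70.0, 1.4], [0.3, 5.0, 0.2], size=(40, 3))
moderate = rng.normal([4.0, 85.0, 1.0], [0.4, 6.0, 0.2], size=(40, 3))
high = rng.normal([6.5, 100.0, 0.7], [0.5, 7.0, 0.2], size=(40, 3))

X = np.vstack([calm, moderate, high])
y = np.array([0] * 40 + [1] * 40 + [2] * 40)  # 0 = low, 1 = moderate, 2 = high fear

model = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
model.fit(X, y)

# A new sample's predicted class would select the next exposure tier.
new_sample = np.array([[5.8, 97.0, 0.75]])
print("predicted fear level:", model.predict(new_sample)[0])
```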
The detailed risk of bias evaluations, according to the revised Cochrane risk-of-bias tool for randomized trials [42] and the Risk of Bias in Non-Randomized Studies of Interventions tool [43], are presented in Table 2 and Table 3, respectively. Disagreements were resolved through consensus. All randomized controlled trials were judged to be of overall moderate quality. Non-randomized clinical studies were deemed to have a serious risk of bias, mainly due to confounding factors.

4. Discussion

This systematic review examined fifteen studies exploring the use of physiological markers in VRET for subclinical or clinical psychiatric disorders, with a focus on real-time adaptation of virtual environments. Publications more than doubled after 2024, marking a shift toward more advanced visual displays, cue designs, technologies, and physiological measures. This review aimed to identify the technologies and markers used and the clinical contexts in which they are applied.

4.1. Technologies

The studies reviewed suggest two technological pathways through which physiological data are integrated into VRET: biofeedback-enhanced systems and real-time adaptive stimulus modulation. The former leverages physiological markers to train users in self-regulation, typically through visual metaphors that link internal states to environmental feedback (e.g., a calming fire) [45,46,49,50,52,54,55,56,57,58]. These systems position the user as an active agent, building interoceptive awareness and control, and can be extended beyond the clinical setting through mobile components for reinforcement [59]. In contrast, adaptive systems automate the modulation of virtual stimuli based on physiological reactivity, dynamically altering features such as intensity, frequency, and visual proximity to maintain users within a target arousal or emotional zone [44,47,48,51,53]. These systems reduce therapist input and adapt the therapy in the moment, making them particularly attractive for self-guided or scalable interventions [35,36,37].
First, this dichotomy has important implications for clinical design and therapeutic learning. Biofeedback-based systems can empower users by enhancing insight and a sense of control, which may strengthen transferable self-regulation skills [60,61,62]. In contrast, dynamically adaptive systems rely on implicit learning, which may facilitate exposure habituation in a more fluid and less cognitively demanding way [63,64]. However, this raises questions about user engagement and agency. Some users may benefit from systems that support conscious control and reflection, while those with high avoidance or cognitive difficulties may respond better to seamless, adaptive environments [65]. Tailoring the technological approach to the user’s cognitive style and clinical profile could improve outcomes. Future research should directly compare these models to determine their relative effectiveness and suitability across different patient populations.
Second, beyond technical safeguards, automated systems introduce complex questions of accountability [66,67]. If an algorithm misclassifies a user’s emotional state or triggers inappropriate exposure, who is responsible? These concerns are especially acute in mental health contexts, where poorly calibrated exposures may lead to iatrogenic trauma or premature disengagement from therapy. Marques et al. [53] acknowledged this by creating a manual override to allow the therapist to make informed adjustments, taking into consideration the patient’s verbal feedback and emotional state, preserving a balance between human and machine. However, current ethical and regulatory frameworks for VRET remain broadly underdeveloped, as these interventions occupy a hybrid space between medical devices and digital therapeutics [68]. There is a growing need for tailored ethical guidelines addressing issues such as informed consent for real-time physiological monitoring, data privacy, algorithmic bias, and user safety in partially or fully unsupervised settings.
Third, these technologies raise important questions about the evolving role of the therapist and the nature of the therapeutic alliance [69]. While adaptive systems may act as decision aids in therapist-guided settings, fully automated environments shift clinical control to algorithms, raising concerns about misclassification, overexposure, and diluted therapist accountability [70,71]. Moreover, this shift could alter the therapist’s function from relational guide to technical facilitator. If adaptive systems reduce opportunities for relational exchange, this could inadvertently weaken one of the most important active ingredients in therapy [70,71,72,73]. Nevertheless, these systems may also free up therapist time, allowing for greater efficiency in serving high-need patients [74,75]. Importantly, patient preferences and trust in these systems must be studied as some individuals may find fully automated settings alienating or disempowering. To navigate these changes, therapists may need new competencies that combine technological fluency with relational sensitivity. The use of monitoring dashboards illustrates the beginnings of co-therapy models that balance automation with human presence [45,48,51,53,57]. Future research could investigate how alliance quality and perceived empathy mediate outcomes in hybrid systems.
Finally, hybrid models that combine biofeedback and real-time adaptation may hold significant promise. Biofeedback could be utilized in the initial phases of therapy to enhance interoceptive awareness and develop self-regulation skills. Adaptive systems could be introduced in later stages to optimize exposure intensity without increasing cognitive load. This integration may offer the benefits of both user empowerment and personalized automation. Future research should move beyond comparisons of these models to examine their sequencing, cumulative effects, and alignment with individual preferences and clinical profiles. Mixed-methods designs that capture both quantitative outcomes and qualitative user experiences will be essential to understand not only what works but for whom, when, and why [76,77,78].

4.2. Biophysiological Markers

This review highlights considerable diversity in the physiological markers used across studies, including EDA, HR, HRV, EEG, eye tracking, respiration, skin temperature, and gesture recognition. This variety reflects the experimental richness of the field while also revealing persistent uncertainty regarding which signals are most clinically meaningful and best suited for real-time adaptation [14,79,80].
EEG provides nuanced insights into cognitive–affective dynamics and may be particularly valuable for detecting a broad range of emotional states across individuals and therapeutic contexts in VR, as highlighted in this review [44,45,48,81]. In the VR literature, it has been used to assess presence [82,83], classify and recognize emotions [84,85], objectively measure time perception [86], and monitor mental fatigue [87]. Despite its transdiagnostic potential and broad applicability, its use within adaptive VRET remains relatively limited (mainly to emotion recognition), and no included study used EEG on its own to adapt a dynamic environment. Potential barriers include longer setup times compared to other physiological sensors, variable signal quality, limited headset adjustability, short battery life, susceptibility to noise and artifacts, high cost, and the need for specialized expertise to operate the equipment and analyze the complex signals [88].
EDA, HRV, HR, and respiration are commonly employed, largely due to their non-invasiveness, affordability, and ease of integration into VR hardware [89,90]. Their widespread use reflects their utility in capturing general indices of arousal, stress, or relaxation, which can inform basic, real-time adaptations of the virtual environment, such as a bar graph display for the user or an object colour change [55,59]. However, their interpretive limitations constrain their ability to differentiate among qualitatively distinct emotional states [79]. As such, while they remain valuable for simpler adaptive mechanisms, achieving finer granularity in emotional state detection will likely require multimodal approaches or the inclusion of additional markers with higher specificity [20,50,89,90,91].
Skin temperature has been scarcely explored as a parameter in adaptive VRET [50], despite its established correlation with EDA [92]. Immersive environments can influence thermoregulatory centres in the brain and may be leveraged to modulate thermal perception [93]. For instance, avatar features (fire vs. ice hands) can alter perceived warmth and cause changes in skin temperature [94]. Moreover, skin temperature monitoring may also be used to identify needed resting times [95]. These findings suggest opportunities for integrating skin temperature feedback into adaptive VR, acknowledging its strong dependence on environmental conditions [93,94,95].
Eye tracking is an emerging parameter in VR with increasing applications in VRET [96,97,98]. Recent advances have enabled more accurate and seamless integration of gaze tracking, supporting applications such as gaze-based interaction [50,56]. In therapeutic contexts, eye tracking can provide real-time insights into attentional focus, avoidance behaviours, and emotional engagement, informing adaptive interventions [99]; however, technical limitations remain, including accuracy constraints, high computational demands, and susceptibility to calibration drift [96,100].
Generally, physiological signals in VR research exhibit inconsistencies in reliability and interpretability [101]. Signal quality can be compromised by individual differences, sensor placement, and environmental artifacts, resulting in fluctuations that may not reflect meaningful psychological states [14,79]. Additionally, no clear consensus exists regarding which physiological parameters should be prioritized, how data should be preprocessed and analyzed, or what thresholds should trigger adaptive responses within VR environments [102]. This lack of methodological standardization impedes cross-study comparisons and limits the generalizability of findings. Finally, physiological markers are inherently ambiguous: their interpretation is highly context-dependent and shaped by individual biological and emotional variability [103,104].
Despite ongoing methodological variability, a growing consensus suggests that multi-sensor approaches, which integrate complementary physiological signals, provide a more robust and accurate foundation for real-time assessment of user state in adaptive VRET [91,97,101].
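One simple way to operationalize such multi-sensor integration is feature-level fusion: features from each channel are normalized separately and concatenated into a single vector for downstream classification or thresholding. The sketch below shows only that step, with hypothetical channel names and synthetic values.

```python
# Feature-level fusion of hypothetical physiological channels into one vector.
# Channel names, features, and values are illustrative assumptions.

import numpy as np


def zscore(values: np.ndarray) -> np.ndarray:
    return (values - values.mean()) / (values.std() + 1e-8)


# Per-channel feature vectors collected over the same time window (synthetic).
features = {
    "eda_scl": np.array([2.1, 2.4, 2.2, 2.8]),        # skin conductance level (uS)
    "hr_mean": np.array([72.0, 75.0, 74.0, 81.0]),    # heart rate (bpm)
    "resp_rate": np.array([14.0, 13.5, 12.8, 15.2]),  # breaths per minute
}

# Normalize each channel separately, then concatenate into one fused vector
# that a downstream classifier or threshold rule can consume.
fused = np.concatenate([zscore(v) for v in features.values()])
print("fused feature vector:", np.round(fused, 2))
```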

4.3. Clinical Contexts

Current research on physiological–adaptive VRET has started to move away from its original focus on anxiety-related conditions, as shown by this review [44,45,46,47,48,49,50,51,52,53,54,55,56,57,58]. This narrow scope likely reflected VRET’s origins within cognitive-behavioural frameworks, where graded exposure is a central mechanism and virtual environments provide a practical extension [105]. While this foundation enabled important proof-of-concept work, it has limited the generalizability of findings to more complex or comorbid clinical populations [13,28,29].
Expanding the use of adaptive VRET into broader diagnostic categories presents both challenges and opportunities. Conditions such as PTSD, major depressive disorder, schizophrenia, and neurodevelopmental disorders involve diverse symptom profiles, cognitive impairments, and variable physiological responses [103]. In PTSD, physiological markers can be used to dynamically modulate the intensity of trauma-related cues to avoid retraumatization while maintaining engagement [8,10,22,51]. In depression, adaptive VR could support individualized activation strategies based on arousal or attentional states [106]. Similarly, individuals with schizophrenia might benefit from systems that adjust cognitive or emotional load during social or cognitive training, while those with autism spectrum disorder could engage in environments tailored for sensory sensitivity and social learning [50,107,108]. Future research should develop and evaluate interventions across a wider range of clinical and demographic profiles to ensure that real-time adaptive VRET fulfills its potential as a precision mental health tool.
Clinical implementation of adaptive VRET faces multi-level challenges spanning organizational, provider, technical, user, and systemic factors [109,110,111]. Organizational barriers include insufficient training time and the lack of dedicated treatment spaces. Provider-level barriers involve limited VR experience and limited evidence of added value, as many studies focus on single-session interventions [112]. Technical issues such as hardware/software malfunctions and discomfort while wearing headsets also hinder adoption. Patient-related factors, including cognitive impairment, distress, or cybersickness, further limit use. High costs, maintenance demands, and resistance to innovation are some of the systemic barriers that compound these challenges [109]. Addressing these multi-level barriers will be essential for the successful clinical integration of VR-based interventions.

4.4. Future Directions

Future research on real-time adaptive VRET should prioritize large, multi-site trials comparing biofeedback-enhanced, fully adaptive, and hybrid systems and use mixed methods to capture both outcomes and users’ experiences [76,77,78]. Multi-sensor approaches should be prioritized [91,97,101]. Samples must expand beyond anxiety-focused, homogeneous groups to include varied diagnoses (e.g., PTSD, depression, and schizophrenia) and recruit across diverse ages, genders, cultures, and socioeconomic contexts for equity and generalizability [13,28,29]. Standardization is needed in physiological marker selection, sensor placement, signal processing, and adaptive thresholds, alongside transparent reporting of hardware, algorithms, and safety protocols. Finally, research should address real-world implementation challenges to move adaptive VRET from proof of concept to scalable, ethically grounded clinical practice [109,110,111].

4.5. Strengths and Limitations

This systematic review offers a timely and comprehensive synthesis of how real-time neurophysiological markers are currently being integrated into VRET for mental health treatment. By employing a transparent and methodologically rigorous approach, it identifies key trends in technology use, adaptation strategies, and clinical application, contributing valuable insight into the emerging field of precision digital therapeutics. A notable strength is this review’s focus on real-time adaptation, a domain that remains underexplored yet holds substantial promise for personalized, scalable interventions.
However, several limitations must be acknowledged. First, this review was not based on a preregistered protocol. Second, the body of literature is still nascent, with most included studies being exploratory, characterized by small sample sizes, heterogeneous methodologies, and a high risk of bias. Third, the variability in physiological markers, virtual environments, and outcome measures limited cross-study comparability and precluded quantitative synthesis through meta-analysis. Furthermore, the predominance of positive findings and proof-of-concept designs raises the possibility of publication bias. These limitations underscore the need for larger-scale, methodologically rigorous studies that employ standardized protocols, encompass diverse populations, and evaluate interventions in ecologically valid settings.

5. Conclusions

This systematic review underscores the technical feasibility and conceptual promise of integrating real-time neurophysiological markers into VRET. While preliminary studies demonstrate innovative applications and encouraging outcomes, the field remains in an early, fragmented stage, characterized by limited methodological consistency and a narrow clinical focus. Real-time adaptive VRET has primarily targeted anxiety-related disorders, leaving substantial gaps in generalizability to more diverse and complex mental health populations. To advance the clinical relevance of these interventions, future research must prioritize robust experimental designs, standardized protocols, and inclusivity across diagnostic, demographic, and cultural variables. Ultimately, adaptive VRET represents a frontier in precision psychiatry, where the convergence of biosensing, immersive technology, and ethical clinical design may redefine therapeutic possibilities.

Supplementary Materials

The following supporting information can be downloaded at https://www.mdpi.com/article/10.3390/biomedinformatics5030048/s1, Table S1: Concepts; Table S2: Search Strategy.

Author Contributions

All authors have read and agreed to the published version of the manuscript. Conceptualization, M.-J.F. and A.D.; methodology, M.-J.F. and M.D.; validation, M.-J.F. and M.D.; formal analysis, M.-J.F. and J.A.; investigation, F.C., J.A. and M.-J.F.; data curation, J.A. and M.-J.F.; writing—original draft preparation, J.A. and M.-J.F.; writing—review and editing, M.-J.F.; supervision, A.D.

Funding

This research received no external funding.

Data Availability Statement

The complete search strategy presented in this study is included in the article/Supplementary Materials. Further inquiries can be directed to the corresponding author.

Conflicts of Interest

The authors declare no conflicts of interest.

References

  1. Jerdan, S.W.; Grindle, M.; van Woerden, H.C.; Kamel Boulos, M.N. Head-Mounted Virtual Reality and Mental Health: Critical Review of Current Research. JMIR Serious Games 2018, 6, e14. [Google Scholar] [CrossRef] [PubMed]
  2. Imel, Z.E.; Caperton, D.D.; Tanana, M.; Atkins, D.C. Technology-enhanced human interaction in psychotherapy. J. Couns. Psychol. 2017, 64, 385–393. [Google Scholar] [CrossRef]
  3. Abramowitz, J.S. The Practice of Exposure Therapy: Relevance of Cognitive-Behavioral Theory and Extinction Theory. Behav. Ther. 2013, 44, 548–558. [Google Scholar] [CrossRef]
  4. Beele, G.; Liesong, P.; Bojanowski, S.; Hildebrand, K.; Weingart, M.; Asbrand, J.; Correll, C.U.; Morina, N.; Uhlhaas, P.J. Virtual Reality Exposure Therapy for Reducing School Anxiety in Adolescents: Pilot Study. JMIR Ment. Health 2024, 11, e56235. [Google Scholar] [CrossRef]
  5. Hembree, E.A.; Rauch, S.A.M.; Foa, E.B. Beyond the manual: The insider’s guide to Prolonged Exposure therapy for PTSD. Cogn. Behav. Pract. 2003, 10, 22–30. [Google Scholar] [CrossRef]
  6. Trpkovici, M.; Makai, A.; Prémusz, V.; Ács, P. The possible application of virtual reality for managing anxiety in athletes. Front. Sports Act. Living 2025, 7, 149354. [Google Scholar] [CrossRef] [PubMed]
  7. Kuleli, D.; Tyson, P.; Davies, N.H.; Zeng, B. Examining the comparative effectiveness of virtual reality and in-vivo exposure therapy on social anxiety and specific phobia: A systematic review & meta-analysis. J. Behav. Cogn. Ther. 2025, 35, 100524. [Google Scholar] [CrossRef]
  8. Carl, E.; Stein, A.T.; Levihn-Coon, A.; Pogue, J.R.; Rothbaum, B.; Emmelkamp, P.; Asmundson, G.J.G.; Carlbring, P.; Powers, M.B. Virtual reality exposure therapy for anxiety and related disorders: A meta-analysis of randomized controlled trials. J. Anxiety Disord. 2019, 61, 27–36. [Google Scholar] [CrossRef]
  9. Deacon, B.J.; Farrell, N.R. Therapist Barriers to the Dissemination of Exposure Therapy. In Handbook of Treating Variants and Complications in Anxiety Disorders; Storch, E.A., McKay, D., Eds.; Springer: New York, NY, USA, 2013; pp. 363–373. [Google Scholar]
  10. Jonathan, N.T.; Bachri, M.R.; Wijaya, E.; Ramdhan, D.; Chowanda, A. The efficacy of virtual reality exposure therapy (VRET) with extra intervention for treating PTSD symptoms. Procedia Comput. Sci. 2023, 216, 252–259. [Google Scholar] [CrossRef]
  11. Seuling, P.D.; Czernin, N.S.; Schiele, M.A. Virtual Reality exposure therapy in the treatment of public speaking anxiety and social anxiety disorder. Neurosci. Appl. 2024, 3, 104074. [Google Scholar] [CrossRef] [PubMed]
  12. Hawajri, O.; Jennifer, L.; Suominen, S. Virtual Reality Exposure Therapy as a Treatment Method Against Anxiety Disorders and Depression-A Structured Literature Review. Issues Ment. Health Nurs. 2023, 44, 245–269. [Google Scholar] [CrossRef]
  13. Horigome, T.; Kurokawa, S.; Sawada, K.; Kudo, S.; Shiga, K.; Mimura, M.; Kishimoto, T. Virtual reality exposure therapy for social anxiety disorder: A systematic review and meta-analysis. Psychol. Med. 2020, 50, 2487–2497. [Google Scholar] [CrossRef]
  14. Halbig, A.; Latoschik, M.E. A Systematic Review of Physiological Measurements, Factors, Methods, and Applications in Virtual Reality. Front. Virtual Real. 2021, 2, 694567. [Google Scholar] [CrossRef]
  15. Martens, M.A.; Antley, A.; Freeman, D.; Slater, M.; Harrison, P.J.; Tunbridge, E.M. It feels real: Physiological responses to a stressful virtual reality environment and its impact on working memory. J. Psychopharmacol. 2019, 33, 1264–1273. [Google Scholar] [CrossRef]
  16. Wiederhold, B.K.; Jang, D.P.; Kim, S.I.; Wiederhold, M.D. Physiological Monitoring as an Objective Tool in Virtual Reality Therapy. CyberPsychol. Behav. 2002, 5, 77–82. [Google Scholar] [CrossRef]
  17. Hanshans, C.; Amler, T.; Zauner, J.; Bröll, L. Inducing and measuring acute stress in virtual reality: Evaluation of canonical physiological stress markers and measuring methods. J. Environ. Psychol. 2024, 94, 102107. [Google Scholar] [CrossRef]
  18. Diemer, J.; Mühlberger, A.; Pauli, P.; Zwanzger, P. Virtual reality exposure in anxiety disorders: Impact on psychophysiological reactivity. World J. Biol. Psychiatry 2014, 15, 427–442. [Google Scholar] [CrossRef]
  19. Rahman, M.A.; Brown, D.J.; Mahmud, M.; Harris, M.; Shopland, N.; Heym, N.; Sumich, A.; Turabee, Z.B.; Standen, B.; Downes, D.; et al. Enhancing biofeedback-driven self-guided virtual reality exposure therapy through arousal detection from multimodal data using machine learning. Brain Inform. 2023, 10, 14. [Google Scholar] [CrossRef]
  20. Šalkevicius, J.; Damaševičius, R.; Maskeliunas, R.; Laukienė, I. Anxiety Level Recognition for Virtual Reality Therapy System Using Physiological Signals. Electronics 2019, 8, 1039. [Google Scholar] [CrossRef]
  21. Norrholm, S.D.; Jovanovic, T.; Gerardi, M.; Breazeale, K.G.; Price, M.; Davis, M.; Duncan, E.; Ressler, K.J.; Bradley, B.; Rizzo, A.; et al. Baseline psychophysiological and cortisol reactivity as a predictor of PTSD treatment outcome in virtual reality exposure therapy. Behav. Res. Ther. 2016, 82, 28–37. [Google Scholar] [CrossRef]
  22. Volovik, M.G.; Belova, A.N.; Kuznetsov, A.N.; Polevaia, A.V.; Vorobyova, O.V.; Khalak, M.E. Use of Virtual Reality Techniques to Rehabilitate Military Veterans with Post-Traumatic Stress Disorder (Review). Sovrem Tekhnologii Med. 2023, 15, 74–85. [Google Scholar] [CrossRef]
  23. Mazgelytė, E.; Rekienė, V.; Dereškevičiūtė, E.; Petrėnas, T.; Songailienė, J.; Utkus, A.; Chomentauskas, G.; Karčiauskaitė, D. Effects of Virtual Reality-Based Relaxation Techniques on Psychological, Physiological, and Biochemical Stress Indicators. Healthcare 2021, 9, 1729. [Google Scholar] [CrossRef]
  24. Katz, A.C.; Norr, A.; Buck, M.; Benjamin, F.; Emily, E.-S.; Amanda, K.-W.; Patricia, Z.; Kimberlee, S.; Derek, J.; Holloway, K.; et al. Changes in physiological reactivity in response to the trauma memory during prolonged exposure and virtual reality exposure therapy for posttraumatic stress disorder. Psychol. Trauma Theory Res. Pract. Policy 2020, 12, 756–764. [Google Scholar] [CrossRef] [PubMed]
  25. Richesin, M.T.; Baldwin, D.R.; Wicks, L.A. Art making and virtual reality: A comparison study of physiological and psychological outcomes. Arts Psychother. 2021, 75, 101823. [Google Scholar] [CrossRef]
  26. Sakib, M.N.; Yadav, M.; Chaspari, T.; Behzadan, A.H. Coupling virtual reality and physiological markers to improve public speaking performance. In Proceedings of the 19th International Conference on Construction Applications of Virtual Reality (CONVR2019), Bangkok, Thailand, 13–15 November 2019; pp. 171–180. [Google Scholar]
  27. Chand, K.; Chandra, S.; Dutt, V. Raga Bhairavi in virtual reality reduces stress-related psychophysiological markers. Sci. Rep. 2024, 14, 24816. [Google Scholar] [CrossRef]
  28. Kothgassner, O.D.; Goreis, A.; Bauda, I.; Ziegenaus, A.; Glenk, L.M.; Felnhofer, A. Virtual reality biofeedback interventions for treating anxiety: A systematic review, meta-analysis and future perspective. Wien. Klin. Wochenschr. 2022, 134, 49–59. [Google Scholar] [CrossRef]
  29. Lüddecke, R.; Felnhofer, A. Virtual Reality Biofeedback in Health: A Scoping Review. Appl. Psychophysiol. Biofeedback 2022, 47, 1–15. [Google Scholar] [CrossRef] [PubMed]
  30. Chittaro, L.; Serafini, M.; Vulcano, Y. Virtual reality experiences for breathing and relaxation training: The effects of real vs. placebo biofeedback. Int. J. Hum.-Comput. Stud. 2024, 188, 103275. [Google Scholar] [CrossRef]
  31. Kerr, J.I.; Weibel, R.P.; Naegelin, M.; Ferrario, A.; Schinazi, V.R.; La Marca, R.; Hoelscher, C.; Nater, U.M.; von Wangenheim, F. The effectiveness and user experience of a biofeedback intervention program for stress management supported by virtual reality and mobile technology: A randomized controlled study. BMC Digital Health 2023, 1, 42. [Google Scholar] [CrossRef]
  32. Lindner, P. Better, Virtually: The Past, Present, and Future of Virtual Reality Cognitive Behavior Therapy. Int. J. Cogn. Ther. 2021, 14, 23–46. [Google Scholar] [CrossRef]
  33. Bălan, O.; Moldoveanu, A.; Leordeanu, M. A Machine Learning Approach to Automatic Phobia Therapy with Virtual Reality; Springer International Publishing: Cham, Switzerland, 2021; pp. 607–636. [Google Scholar]
  34. Petrescu, L.; Petrescu, C.; Mitruț, O.; Moise, G.; Moldoveanu, A.; Moldoveanu, F.; Leordeanu, M. Integrating Biosignals Measurement in Virtual Reality Environments for Anxiety Detection. Sensors 2020, 20, 7088. [Google Scholar] [CrossRef]
  35. Premkumar, P.; Heym, N.; Brown, D.J.; Battersby, S.; Sumich, A.; Huntington, B.; Daly, R.; Zysk, E. The Effectiveness of Self-Guided Virtual-Reality Exposure Therapy for Public-Speaking Anxiety. Front. Psychiatry 2021, 12, 694610. [Google Scholar] [CrossRef]
  36. Shin, B.; Oh, J.; Kim, B.-H.; Kim, H.E.; Kim, H.; Kim, S.; Kim, J.-J. Effectiveness of Self-Guided Virtual Reality–Based Cognitive Behavioral Therapy for Panic Disorder: Randomized Controlled Trial. JMIR Ment. Health 2021, 8, e30590. [Google Scholar] [CrossRef]
  37. Donker, T.; Cornelisz, I.; van Klaveren, C.; van Straten, A.; Carlbring, P.; Cuijpers, P.; van Gelder, J.-L. Effectiveness of Self-guided App-Based Virtual Reality Cognitive Behavior Therapy for Acrophobia: A Randomized Clinical Trial. JAMA Psychiatry 2019, 76, 682–690. [Google Scholar] [CrossRef]
  38. Peer Review of Electronic Search Strategies: 2015 Guideline Explanation and Elaboration (PRESS E&E); Canada’s Drug Agency: Ottawa, ON, Canada. 2015. Available online: https://www.cda-amc.ca/sites/default/files/pdf/CP0015_PRESS_Update_Report_2016.pdf (accessed on 17 June 2025).
  39. Covidence Systematic Review Software, Veritas Health Innovation: Melbourne, Australia. Available online: www.covidence.org (accessed on 16 June 2025).
  40. American Psychiatric Association. Diagnostic and Statistical Manual of Mental Disorders: DSM-5, 5th ed.; American Psychiatric Association: Washington, DC, USA, 2013. [Google Scholar]
  41. Page, M.J.; McKenzie, J.E.; Bossuyt, P.M.; Boutron, I.; Hoffmann, T.C.; Mulrow, C.D.; Shamseer, L.; Tetzlaff, J.M.; Akl, E.A.; Brennan, S.E.; et al. The PRISMA 2020 statement: An updated guideline for reporting systematic reviews. BMJ 2021, 372, n71. [Google Scholar] [CrossRef]
  42. Higgins, J.P.T.; Savović, J.; Page, M.J.; Elbers, R.G.; Sterne, J.A.C. Chapter 8: Assessing risk of bias in a randomized trial. In Cochrane Handbook for Systematic Reviews of Interventions Version 6.5; Higgins, J.P.T., Thomas, J., Chandler, J., Cumpston, M., Li, T., Page, M.J., Welch, V.A., Eds.; Cochrane: London, UK, 2024. Available online: https://www.cochrane.org/authors/handbooks-and-manuals/handbook (accessed on 16 June 2025).
  43. Sterne, J.A.; Hernán, M.A.; Reeves, B.C.; Savović, J.; Berkman, N.D.; Viswanathan, M.; Henry, D.; Altman, D.G.; Ansari, M.T.; Boutron, I.; et al. ROBINS-I: A tool for assessing risk of bias in non-randomised studies of interventions. BMJ 2016, 355, i4919. [Google Scholar] [CrossRef]
  44. Bălan, O.; Moise, G.; Moldoveanu, A.; Leordeanu, M.; Moldoveanu, F. An Investigation of Various Machine and Deep Learning Techniques Applied in Automatic Fear Level Detection and Acrophobia Virtual Therapy. Sensors 2020, 20, 496. [Google Scholar] [CrossRef]
  45. Gaggioli, A.; Pallavicini, F.; Morganti, L.; Serino, S.; Scaratti, C.; Briguglio, M.; Crifaci, G.; Vetrano, N.; Giulintano, A.; Bernava, G.; et al. Experiential Virtual Scenarios With Real-Time Monitoring (Interreality) for the Management of Psychological Stress: A Block Randomized Controlled Trial. J. Med. Internet Res. 2014, 16, e167. [Google Scholar] [CrossRef]
  46. Gorini, A.; Pallavicini, F.; Algeri, D.; Repetto, C.; Gaggioli, A.; Riva, G. Virtual reality in the treatment of generalized anxiety disorders. Stud. Health Technol. Inform. 2010, 154, 39–43. [Google Scholar]
  47. Kritikos, J.; Alevizopoulos, G.; Koutsouris, D. Personalized Virtual Reality Human-Computer Interaction for Psychiatric and Neurological Illnesses: A Dynamically Adaptive Virtual Reality Environment That Changes According to Real-Time Feedback From Electrophysiological Signal Responses. Front. Hum. Neurosci. 2021, 15, 596980. [Google Scholar] [CrossRef]
  48. Mevlevioğlu, D.; Tabirca, S.; Murphy, D. Real-Time Classification of Anxiety in Virtual Reality Therapy Using Biosensors and a Convolutional Neural Network. Biosensors 2024, 14, 131. [Google Scholar] [CrossRef]
  49. Pallavicini, F.; Algeri, D.; Repetto, C.; Gorini, A.; Riva, G. Biofeedback virtual reality and mobile phones in the treatment of generalized anxiety disorder (GAD): A phase-2 controlled clinical trial. J. Cyber Ther. Rehabil. 2009, 2, 315–327. [Google Scholar]
  50. Bekele, E.; Wade, J.; Bian, D.; Fan, J.; Swanson, A.; Warren, Z.; Sarkar, N. Multimodal adaptive social interaction in virtual environment (MASI-VR) for children with Autism spectrum disorders (ASD). In Proceedings of the 2016 IEEE Virtual Reality (VR), Greenville, SC, USA, 19–23 March 2016; pp. 121–130. [Google Scholar]
  51. Kasimova, L.N.; Kuznetsov, A.N.; Kropinova, I.I.; Kuznetsov, D.V.; Volovik, M.G.; Svyatogor, M.V.; Sychugov, E.M.; Borovskoy, G.Y.; Khalak, M.E. Comprehensive Assessment of Combatants’ Psychological and Psychophysiological State in Exposure Therapy of Post-Traumatic Stress Disorder Using Virtual Reality. Sovrem Tekhnologii Med. 2024, 16, 35–42. [Google Scholar] [CrossRef]
  52. Klein Haneveld, L.; Dekkers, T.; Bouman, Y.H.A.; Scholten, H.; Weerdmeester, J.; Kelders, S.M.; Kip, H. The Effect of the Virtual Reality-Based Biofeedback Intervention DEEP on Stress, Emotional Tension, and Anger in Forensic Psychiatric Inpatients: Mixed Methods Single-Case Experimental Design. JMIR Form. Res. 2025, 9, e65206. [Google Scholar] [CrossRef]
  53. Marques, B.; Moreira, D.; Neves, M.; Bras, S.; Fernandes, J.M.; Potel, M. Battle Against Your Fears: Virtual Reality Serious Games and Physiological Analysis for Phobia Treatment. IEEE Comput. Graph. Appl. 2025, 45, 67–75. [Google Scholar] [CrossRef]
  54. Pratviel, Y.; Bouny, P.; Deschodt-Arsac, V. Immersion in a relaxing virtual reality environment is associated with similar effects on stress and anxiety as heart rate variability biofeedback. Front. Virtual Real. 2024, 5, 1358981. [Google Scholar] [CrossRef]
  55. Premkumar, P.; Heym, N.; Myers, J.A.C.; Formby, P.; Battersby, S.; Sumich, A.L.; Brown, D.J. Augmenting self-guided virtual-reality exposure therapy for social anxiety with biofeedback: A randomised controlled trial. Front. Psychiatry 2024, 15, 1467141. [Google Scholar] [CrossRef]
  56. Wechsler, T.F.; Kocur, M.; Schumacher, S.; Rubenbauer, M.; Ruider, A.; Brockelmann, M.; Lankes, M.; Wolff, C.; Mühlberger, A. Looking fear in the eye: Gamified virtual reality exposure towards spiders for children using attention based feedback. Clin. Child. Psychol. Psychiatry 2024, 29, 1121–1136. [Google Scholar] [CrossRef] [PubMed]
  57. Xu, Q.; Gu, Y.; Hu, X. Brief Interactive Virtual Reality Mindfulness Training with Real-Time Biofeedback for Anxiety Reduction: A Pilot Study. Appl. Psychophysiol. Biofeedback 2025, 1–13. [Google Scholar] [CrossRef]
  58. Zeng, S.; Chen, L.; Lan, S. Research on the extension of respiratory interaction modalities in virtual reality technology and innovative methods for healing anxiety disorders. Sci. Rep. 2025, 15, 7936. [Google Scholar] [CrossRef] [PubMed]
  59. Planert, J.; Hildebrand, A.S.; Machulska, A.; Roesmann, K.; Neubert, M.; Pilgramm, S.; Pilgramm, J.; Klucken, T. Blended Mobile-Based Interventions With Integrated Virtual Reality Exposure Therapy for Anxiety Disorders: Thematic Analysis of Patient Perspectives. JMIR Hum. Factors 2025, 12, e60957. [Google Scholar] [CrossRef]
  60. Weerdmeester, J.; Van Rooij, M.M.; Granic, I. Visualization, self-efficacy, and locus of control in a virtual reality biofeedback video game for anxiety regulation. Cyberpsychol. Behav. Soc. Netw. 2022, 25, 360–368. [Google Scholar] [CrossRef]
  61. Kniffin, T.C.; Carlson, C.R.; Ellzey, A.; Eisenlohr-Moul, T.; Beck, K.B.; McDonald, R.; Jouriles, E.N. Using virtual reality to explore self-regulation in high-risk settings. Trauma Violence Abus. 2014, 15, 310–321. [Google Scholar] [CrossRef]
  62. Pinheiro, J.; de Almeida, R.S.; Marques, A. Emotional self-regulation, virtual reality and neurofeedback. Comput. Hum. Behav. Rep. 2021, 4, 100101. [Google Scholar] [CrossRef]
  63. Slater, M. Implicit learning through embodiment in immersive virtual reality. In Virtual, Augmented, and Mixed Realities in Education; Springer International Publishing: Cham, Switzerland, 2017; pp. 19–33. [Google Scholar]
  64. Bondesan, P.; Canal, A.; Fleury, S.; Boisadan, A.; Richir, S. Implicit learning of professional skills through immersive virtual reality: A media comparison study. In Proceedings of the 2025 IEEE Conference Virtual Reality and 3D User Interfaces (VR), Saint Malo, France, 8–12 March 2025; pp. 442–449. [Google Scholar]
  65. Houzangbe, S.; Christmann, O.; Gorisse, G.; Richir, S. Effects of Voluntary Heart Rate Control on User Engagement in Virtual Reality. In Proceedings of the 2019 IEEE Conference on Virtual Reality and 3D User Interfaces (VR), Osaka, Japan, 23–27 March 2019; pp. 982–983. [Google Scholar]
  66. Sukhija, A.; Goyal, H.; Sharma, B.; Angra, S. Accountability of Immersive Technologies in Healthcare: A Review. In Proceedings of the 2024 IEEE International Conference on Computing, Power and Communication Technologies (IC2PCT), Greater Noida, India, 9–10 February 2024; pp. 1898–1902. [Google Scholar]
  67. Schulteis, M.; Rothbaum, B.O. Ethical issues for the use of virtual reality in the psychological sciences. In Ethical Issues in Clinical Neuropsychology; Swets & Zeitlinger Publishers: Lisse, The Netherlands, 2002; pp. 243–277. [Google Scholar]
  68. Rudschies, C.; Schneider, I. Ethical, legal, and social implications (ELSI) of virtual agents and virtual reality in healthcare. Soc. Sci. Med. 2024, 340, 116483. [Google Scholar] [CrossRef] [PubMed]
  69. Hasan, S.; Alhaj, H.; Hassoulas, A. The Efficacy and Therapeutic Alliance of Augmented Reality Exposure Therapy in Treating Adults With Phobic Disorders: Systematic Review. JMIR Ment. Health 2023, 10, e51318. [Google Scholar] [CrossRef]
  70. Meyerbröker, K.; Morina, N.; Kerkhof, G.A.; Emmelkamp, P.M. Potential predictors of virtual reality exposure therapy for fear of flying: Anxiety sensitivity, self-efficacy and the therapeutic alliance. Cogn. Ther. Res. 2022, 46, 646–654. [Google Scholar] [CrossRef]
  71. Tremain, H.; McEnery, C.; Fletcher, K.; Murray, G. The therapeutic alliance in digital mental health interventions for serious mental illnesses: Narrative review. JMIR Ment. Health 2020, 7, e17204. [Google Scholar] [CrossRef]
  72. Meyerbröker, K.; Emmelkamp, P.M. Therapeutic processes in virtual reality exposure therapy: The role of cognitions and the therapeutic alliance. CyberTher. Rehabil. 2008, 1, 247–257. [Google Scholar]
  73. Varšová, K.; Juřík, V. Fostering therapeutic alliance and long-term benefits through virtual collaboration in VRET. In Proceedings of the International Psychological Applications Conference and Trends 2024, Porto, Portugal, 20–22 April 2024. [Google Scholar]
  74. Andersson, G.; Bergström, J.; Buhrman, M.; Carlbring, P.; Holländare, F.; Kaldo, V.; Nilsson-Ihrfelt, E.; Paxling, B.; Ström, L.; Waara, J. Development of a new approach to guided self-help via the Internet: The Swedish experience. J. Technol. Hum. Serv. 2008, 26, 161–181. [Google Scholar] [CrossRef]
  75. Marks, I.M.; Kenwright, M.; McDonough, M.; Whittaker, M.; Mataix-Cols, D. Saving clinicians’ time by delegating routine aspects of therapy to a computer: A randomized controlled trial in phobia/panic disorder. Psychol. Med. 2004, 34, 9–17. [Google Scholar] [CrossRef]
  76. Fagernäs, S.; Hamilton, W.; Espinoza, N.; Miloff, A.; Carlbring, P.; Lindner, P. What do users think about Virtual Reality relaxation applications? A mixed methods study of online user reviews using natural language processing. Internet Interv. 2021, 24, 100370. [Google Scholar] [CrossRef] [PubMed]
Figure 1. PRISMA flow diagram.
Figure 2. Distribution of physiological measures across diagnoses. EDA and HR are most frequently employed across disorders, while measures such as HRV, respiration, skin temperature, gesture recognition, and eye tracking are less common and concentrated in specific conditions.
Table 1. Characteristics of studies included in this review.
| First Author, Year, Country [Ref] | Study Type | Technology | Diagnosis | Group | Participants: n; % Female; Age M (SD) or Range (Years) | Physiological Measures | Real-Time Intervention |
|---|---|---|---|---|---|---|---|
| Bălan, O., 2020, Romania [44] | Exp. | SVM, kNN, LDA, RF, deep neural network models | Acrophobia | TG | n = 8; 75%; 22–50 | EDA, HR, EEG | Fear detection to adapt therapy to each patient. |
| Bekele, E., 2016, United States [50] | Exp. | Biopac (BioNomadix); Emotiv EPOC headset; Unity 3D | ASD | TG | n = 6; 0%; 15.77 (1.9) | Eye tracker, EEG, HR, skin temperature, respiration, EDA | Gaze scanning reveals the concealed full emotional expression. * |
| | | | | CG | n = 6; 0%; 15.20 (1.7) | | |
| Gaggioli, A., 2014, Italy [45] | RCT | NeuroVR2 | Work-related stress in nurses and teachers | TG | n = 40; 57.5%; 46.3 (7.7) | HR, EEG, HRV, respiration, gesture recognition | Monitoring dashboard for therapists; relaxation elements adjust. * |
| | | | | CG | n = 42; 71.4%; 42.9 (10.5) | | |
| | | | | WL | n = 39; 51.3%; 39.6 (9.7) | | |
| Gorini, A., 2010, Italy [46] | RCT | ESIEA VR and 3DVIA Virtools 4.1 | GAD | TG | n = 4; % nr; 18–50 | EDA, HR | Relaxation elements adjust. * |
| | | | | CG | n = 8; % nr | | |
| | | | | WL | n = 8; % nr | | |
| Kasimova, L., 2024, Russia [51] | Exp. | HTC Vive Focus 3; Callibri sensor | PTSD | TG | n = 31; 0%; 35.61 (9.13) | HRV | Monitoring dashboard to correct the transmitted stimulus. |
| | | | | CG | n = 38; 0%; 24.68 (5.71) | | |
| Klein Haneveld, L., 2025, The Netherlands [52] | Exp. | Oculus Rift | Aggression in forensic psychiatric inpatients | TG | n = 6; 0%; 33.4 (9.3) | Respiration | Adaptation of flora in underwater world. * |
| Kritikos, J., 2021, Greece [47] | RCT | Oculus Rift; Arduino; Unity 3D | Arachnophobia | TG | n = 18; 50%; 18–72 | EDA | Number, size, speed, direction, and bouncing of spiders. |
| Marques, B., 2025, Portugal [53] | Exp. | Unity 3D; Plux Biosignals | Arachnophobia and acrophobia | TG | n = 23; % nr; nr | HR, respiration | Monitoring dashboard to correct the transmitted stimulus. |
| Mevlevioğlu, D., 2024, Ireland [48] | Exp. | Unity 3D | Subclinical anxiety | TG | n = 29; 38%; 18–55 | EDA, HR, EEG | Anxiety-level classification; monitoring dashboard for therapists. |
| Pallavicini, F., 2009, Italy [49] | RCT | GeForce GTX Titan X graphics card; Intel Core i7-5820K processor | GAD | TG | n = 4; 75%; 41–51 | HR, EDA | Relaxation elements adjust. * |
| | | | | CG | n = 4; 75% | | |
| Pratviel, Y., 2024, France [54] | Exp. | Photoplethysmography sensor; Unity | Stress and anxiety | TG | n = 36; 25%; 19.2 (1.3) | HRV | Colour change representing cardiac coherence performance. * |
| Premkumar, P., 2024, United Kingdom [55] | RCT | Microsoft Band 2 wristband; Muse wireless EEG headband | SAD | TG | n = 38; 86.8%; 25.47 (9.72) | HR, EEG | Visual display at the back of the classroom. * |
| | | | | CG | n = 35; 62.9%; 30.44 (12.34) | | |
| Wechsler, T., 2024, Germany [56] | Exp. | Unreal Engine 4; HTC VIVE Pro Eye display and eye tracker | Arachnophobia | TG | n = 11; 18.2%; 10.27 (1.10) | Eye tracking | Make spiders dance or shrink. * |
| | | | | CG | n = 10; 80%; 9.40 (1.17) | | |
| Xu, Q., 2025, China [57] | RCT | Polar H10; Unity 3D; HTC VIVE headset | Subclinical anxiety | TG | n = 25; 64%; 19.52 (1.92) | HRV | Monitoring dashboard for therapist; visual transformation of environment. * |
| | | | | CG | n = 25; 72%; 19.76 (2.35) | | |
| | | | | WL | n = 25; 52%; 20.68 (2.73) | | |
| Zeng, S., 2025, China [58] | Exp. | Emotiv EPOC headset; Arduino; Unreal Engine; KY-037 sensor | Subclinical anxiety | TG | n = 24; 50%; nr | EEG, respiration | Visual transformation of environment. * |
| | | | | CG | n = 14; 50% | | |
*: Presence of biofeedback interventions. Exp.: experimental study; SVM: Support Vector Machine; kNN: k-Nearest Neighbours; LDA: Linear Discriminant Analysis; RF: Random Forest; RCT: randomized controlled trial; nr: not reported; TG: treatment group; CG: control group; WL: waitlist group; GAD: generalized anxiety disorder; SAD: social anxiety disorder; ASD: autism spectrum disorder; PTSD: post-traumatic stress disorder.
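To make the adaptation logic summarized in the "Real-Time Intervention" column more concrete, the Python sketch below shows one plausible closed-loop rule: windowed EDA and HR readings are z-scored against a pre-exposure baseline, averaged into an arousal index, and used to step the exposure intensity up or down. This sketch is not taken from any of the included studies; the function names, thresholds, and update rule are illustrative assumptions.

```python
from dataclasses import dataclass
from statistics import mean


@dataclass
class Baseline:
    """Resting-state statistics collected before exposure begins (hypothetical units)."""
    eda_mean: float  # skin conductance, microsiemens
    eda_sd: float
    hr_mean: float   # heart rate, beats per minute
    hr_sd: float


def arousal_index(eda_window: list[float], hr_window: list[float], base: Baseline) -> float:
    """Average the baseline z-scores of the current EDA and HR windows (higher = more aroused)."""
    eda_z = (mean(eda_window) - base.eda_mean) / max(base.eda_sd, 1e-6)
    hr_z = (mean(hr_window) - base.hr_mean) / max(base.hr_sd, 1e-6)
    return (eda_z + hr_z) / 2.0


def adapt_exposure(level: int, arousal: float,
                   low: float = 0.5, high: float = 2.0,
                   min_level: int = 1, max_level: int = 10) -> int:
    """Step the exposure level (e.g., number or speed of feared stimuli) down when the
    participant is over-aroused and up once arousal falls below the habituation band."""
    if arousal > high:
        level -= 1       # ease the stimulus
    elif arousal < low:
        level += 1       # intensify the stimulus
    return max(min_level, min(max_level, level))


if __name__ == "__main__":
    base = Baseline(eda_mean=2.0, eda_sd=0.4, hr_mean=72.0, hr_sd=5.0)
    level = 3
    # One simulated 10-second window of sensor readings (values are made up).
    eda_window = [2.9, 3.1, 3.0, 2.8]
    hr_window = [88.0, 90.0, 86.0, 89.0]
    idx = arousal_index(eda_window, hr_window, base)
    level = adapt_exposure(level, idx)
    print(f"arousal index = {idx:.2f}; next exposure level = {level}")
```

In practice, several of the included studies replace such a fixed-threshold rule with trained classifiers (e.g., the SVM, kNN, LDA, RF, and deep neural network models in Bălan [44]) or route the signal to a therapist-facing monitoring dashboard rather than adjusting the environment automatically.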
Table 2. Quality and methodology assessment of included RCTs.
[Per-domain risk-of-bias ratings, displayed as coloured icons in the published table, cover domains D1–D5 and an overall judgement for Gaggioli [45], Gorini [46], Kritikos [47], Pallavicini [49], Premkumar [55], and Xu [57]; ratings range from low to moderate risk.]
D: domain; D1: risk of bias arising from the randomization process; D2: risk of bias due to deviations from the intended interventions; D3: risk of bias due to missing outcome data; D4: risk of bias in the measurement of the outcome; D5: risk of bias in the selection of the reported result.
Table 3. Quality and methodology assessment of included experimental studies.
[Per-domain risk-of-bias ratings, displayed as coloured icons in the published table, cover domains D1–D7 and an overall judgement for Bălan [44], Bekele [50], Kasimova [59], Klein Haneveld [52], Marques [53], Mevlevioğlu [48], Pratviel [54], Wechsler [56], and Zeng [58]; ratings range from low to high risk.]
D: domain; D1: risk of bias due to confounding; D2: risk of bias in classification of interventions; D3: risk of bias in selection of participants into the study (or into the analysis); D4: risk of bias due to deviations from intended interventions; D5: risk of bias due to missing data; D6: risk of bias arising from measurement of the outcome; D7: risk of bias in selection of the reported result.
