Article

Integrating Physiologic Assessment into Virtual Reality-Based Pediatric Pain Intervention: A Feasibility Study

1 Department of Anesthesiology, Critical Care and Pain Medicine, Boston Children’s Hospital, Boston, MA 02445, USA
2 MIT.nano Immersion Lab, Massachusetts Institute of Technology, Cambridge, MA 02139, USA
3 Department of Psychiatry, Harvard Medical School, Boston, MA 02445, USA
* Author to whom correspondence should be addressed.
Virtual Worlds 2025, 4(4), 47; https://doi.org/10.3390/virtualworlds4040047
Submission received: 1 September 2025 / Revised: 16 October 2025 / Accepted: 17 October 2025 / Published: 22 October 2025

Abstract

This feasibility study explored the integration of physiological monitoring into a virtual reality (VR) intervention for pediatric pain management. The goal was to identify a feasible strategy for collecting physiologic data in the context of a VR intervention currently being developed for youth with chronic pain. We assessed the potential of cognitive load (CL)—derived from heart rate and pupillometry/eye-tracking data—as a marker of arousal and user engagement in a VR simulation designed to promote school functioning in youth with chronic pain. The HP Reverb G2 Omnicept headset and Polar H10 heart-rate sensor were used. The Child Presence Questionnaire (CPQ) assessed participants’ self-reported immersion and engagement. Data collection focused on the feasibility and utility of physiologic data in assessing arousal and on correlations with self-reported experience. Nine participants engaged in the simulation, with eight yielding complete data. The simulation and headset were well tolerated. The CPQ Transportation subscale showed a trend-level correlation with mean CL. Given the small sample and feasibility focus, individual-level results were also examined. Combining multiple physiologic markers into a construct like CL is intriguing, but data interpretability was limited. Pupillometry and related metrics show promise as feasible markers of engagement and arousal for VR-based intervention but require appropriate expertise to fully interpret. Integration of physiologic monitoring proved feasible, but further work is needed to standardize metrics and identify the most useful and user-friendly markers.

1. Introduction

Chronic pain, defined as pain that persists for more than three months, is a widespread issue among children and adolescents [1]. Although the prevalence of chronic pain varies by pain type and across studies, approximately one in five children and adolescents experience chronic pain [1]. Chronic pain increases the risk of negative mental health outcomes, school absences, and functional disability, decreasing the overall quality of life for affected children [2,3,4,5]. If left untreated, pediatric chronic pain may lead to substantial functional limitations into adulthood [6].
The use of modern digital health tools has far-reaching possibilities for chronic pain management within the pediatric and young adult populations [7,8]. In particular, extended reality (XR) tools, including augmented, mixed, and virtual reality (VR) applications, can increase patient engagement, reduce access barriers, and enhance patient care through digitally mediated mind–body interventions. Proposed mechanisms of the effects of XR for chronic pain include distraction, promotion of movement, and graded exposure to feared stimuli and situations to reduce fear of pain [9,10,11,12,13]. XR can provide real-time feedback to clinicians about patients’ motivation, engagement, learning, and recovery in physical rehabilitation. It has been shown to positively affect mood and anxiety symptoms in individuals with chronic pain, and there is some evidence that XR interventions can lead to reduced reliance on pain medications [14]. Given the current study’s focus on VR, we henceforth restrict our discussion to VR findings in the literature. The literature base of VR for pediatric chronic pain management is small but rapidly expanding. A review by Mallari and colleagues concluded that VR reduces chronic pain while patients are immersed in the virtual environment, but found less evidence for lasting post-treatment effects on pain intensity [15]. By contrast, VR has been well studied in the context of procedural pain management, where it has been found to be more effective for procedural anxieties in children than in adults, demonstrating that age and technology-related variables may influence the success of VR tools [16]. Overall, the emerging literature suggests that VR shows promising results for non-pharmacologic pain management for youth and adults. However, research efforts are currently limited by inconsistent study methodology and a lack of agreed-upon core outcomes and measurement tools.
Recent efforts have attempted to put forth consensus-based recommendations to guide studies of VR (and XR more broadly) in pain treatment through recommending core assessment domains and specific tools to use for these assessments [17,18]. Through this expert consensus process, physiologic markers have been deemed “important to consider” in the measurement of outcomes from VR interventions for pain. They are not yet recommended mandatory measures because of a lack of research on the most valid, reliable, and relevant approaches to monitoring physiologic responses.
Commonly used physiologic measurements in VR interventions include heart rate (HR), heart rate variability (HRV), pupillometry, and eye gaze tracking [19]. HR and HRV are typically measured by a photoplethysmogram (PPG) sensor that detects changes in blood volume within the microvascular bed of tissue using light signals that reflect onto the skin. PPG sensors are sometimes embedded within a VR headset hardware to detect the heart rate in beats per minute (BPM). They can also be tracked using an external sensor. Although the PPG sensors are prone to sources of noise and artifacts, they are a reliable, noninvasive method to measure changes in cardiac activity and body-based stress levels [20]. HRV is the interval between adjacent heartbeats and may be influenced by the sympathetic and parasympathetic nervous system by activities or stimuli that may increase or decrease arousal [21,22]. HRV, measured in root mean square of successive differences (RMSSD), can be indicative of the individual’s cardiac vagal tone within the context of stress management [21].
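As an illustration of the RMSSD metric described above, a minimal computation from a series of RR intervals might look like the sketch below. The values are hypothetical, and this is not part of the study's analysis pipeline.

```python
import numpy as np

def rmssd(rr_intervals_ms):
    """Root mean square of successive differences (RMSSD) between
    adjacent RR intervals, in ms; a standard time-domain HRV metric."""
    rr = np.asarray(rr_intervals_ms, dtype=float)
    diffs = np.diff(rr)  # successive differences between adjacent beats
    return float(np.sqrt(np.mean(diffs ** 2)))

# Hypothetical RR intervals (ms) from a chest-strap monitor:
print(round(rmssd([800, 810, 790, 805, 815]), 2))  # ≈ 14.36
```

Higher RMSSD generally reflects greater parasympathetic (vagal) influence on heart rate, which is why it is often reported in the stress-management context cited above.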
Pupillometry is the measure of the pupil’s dilation and position. The degree of pupil dilation may serve as an indicator of nociceptive input through the autonomic innervation of the iris muscle, thus offering the potential to be an objective measurement of pain intensity [23]. Eye tracking data, including saccades, blinking, and fixations, are also typically recorded in VR interventions. Saccades are quick, successive eye movements which can reveal how an individual’s attention is directed and prioritized. Between saccades, eyes typically remain relatively still to fixate on objects of interest but are never completely stationary [24]. The fixation on objects demonstrates the user’s attentional state and indicates where the user chooses to focus. While pupillometry and eye tracking (e.g., saccades, blinking) are relatively context-independent markers and thus may serve as a generalizable measure of an individual’s attentional state, eye gaze data is, on the other hand, determined largely by the content presented in VR interventions (i.e., it is situation specific) and is not, itself, a reliable measure of a user’s attentional state [20]. Thus, gaze tracking (on an X, Y, Z three-dimensional plane system, X-horizontal, Y-vertical, Z-front/depth) may be better suited for assessing scene-specific immersiveness and indicating vision patterns and scene analysis in the context of VR interventions.
Some VR software systems combine multiple physiologic indicators to yield a summary indicator of overall physiological arousal. Typically, this is referred to as cognitive load (CL). According to the cognitive load theory and cognitive theory of multimedia learning [25,26], cognitive load refers to the mental effort required by a learning task and includes at least two main conceptual components: intrinsic and extraneous load. Intrinsic CL depends on the complexity of the task and the learner’s level of expertise, while extraneous CL relates to how the task is designed and how information is presented [27]. Overly elevated extraneous cognitive load in immersive VR can detract from learning and engagement, but it is positively correlated with presence and engagement in immersive virtual environments [28,29]. Some researchers have sought to define cognitive load by categorizing subjects’ facial expressions while performing tasks, but this has been found to be unreliable [30,31,32]. Other studies have sought to measure cognitive load through acoustic voice features, similarly yielding ungeneralizable results [33,34]. More reliable and context-independent indicators of cognitive load include summative measures of some combination of blood pressure, heart rate, high-frequency heart rate variability, electrodermal activity, skin conductance, pupillometry, and/or rapid eye movement tracking (i.e., saccades) [20]. Although cognitive load may or may not relate directly to pain [35], we sought to explore its utility as an indicator of engagement and physiologic arousal while participating in a VR simulation presenting stressful situations [36].
The purpose of this study is to explore the feasibility of integrating physiologic monitoring into the ongoing study of a VR intervention “vReal School” (vRS) to promote school functioning in youth undergoing intensive interdisciplinary pain treatment (IIPT) for chronic pain [37]. The primary aim was to identify potential physiologic markers to incorporate in subsequent iterations of this VR simulation. To do so, we first explored participants’ response to physiologic monitoring as part of their VR experience, including identifying any barriers to physiologic data collection. An additional exploratory aim was to identify whether cognitive load is an accurate and useful physiological measure to monitor participants’ responses to stressful virtual school environments. Relatedly, we explored correlations between cognitive load (objective) and self-report (subjective) measures to determine whether these physiologic parameters were associated with participants’ subjective experiences of immersiveness and engagement with the simulation [38]. In the context of this small pilot study, we sought to examine both overall trends and individual level responses to physiologic monitoring. This feasibility study will inform the design of a subsequent clinical trial, the ultimate goal of which is to explore the utility of VR-based interventions to promote school functioning in patients undergoing intensive pain rehabilitation.

2. Materials and Methods

2.1. Participants

Following initial usability testing of the vRS with clinicians, youth were recruited from an intensive interdisciplinary chronic pain treatment (IIPT) program at Boston Children’s Hospital (BCH) from July to November 2022. Eligible youth were current patients in the Mayo Family Pediatric Pain Rehabilitation Center (PPRC), BCH’s IIPT day hospital program for youth with chronic pain conditions that have not responded to first-line outpatient treatments. After admission into the IIPT program, the research team approached interested participants to explain the study and assess eligibility. Eligibility criteria included (1) participant age between 11 and 18 years old, (2) current enrollment in middle or high school, and (3) willingness to participate for the entire duration of the study. Exclusionary criteria for the study reflected those of the IIPT program itself: (1) serious psychopathology (i.e., active suicidality, psychosis, eating disorder), (2) acute medical trauma (e.g., fractures), or (3) active, untreated disease or inflammatory conditions. The PPRC research coordinator met with all program participants and parents in person as part of their admission process to review any research studies for which they may have been eligible. Informed consent was obtained from all patients who met eligibility criteria and expressed interest in the study, including written informed consent from parents/guardians and documented assent from participants under age 18. The study’s potential risks (motion sickness, brief psychological distress, and privacy concerns) were explained during the consent process. Participants who completed the study received a $20 Amazon gift card for their participation.

2.2. Equipment

Prior to recruitment, a prototype version of the vReal-School intervention (Figure 1) was developed in collaboration with the engineering team at the MIT Nano Immersion Laboratory [39]. Data acquisition was performed using the HP Reverb G2 Omnicept Edition headset (HP, Palo Alto, CA, USA) with the HP Omnicept SDK (v1.14). The Unity engine (2021.3.10f1 LTS) served as the simulation environment and interfaced directly with the SDK via the HP Omnicept Unity plugin. In addition, the Glia SDK (v1.8.0) was employed for real-time physiological and cognitive metric integration within Unity [20]. The headset offers an immersive visual environment and integrated biometric sensing capabilities: it is equipped with sensors for real-time monitoring of pupillometry (e.g., pupil position and dilation) and eye tracking, enabling the collection of rich physiological data during the VR experience. To measure heart rate and heart rate variability, participants wore a Polar H10 heart rate monitor, designed as a user-friendly chest strap.
Within the simulation, participants completed a series of tasks organized into 12 distinct “events” (Table 1), each representing school-related scenarios. To enhance ecological validity and user immersion, participants engaged in the simulation while standing, enabling natural movement within the virtual space under clinical supervision. This approach was intended to create a more immersive experience, approximating real-world navigation and interaction in a school environment. The virtual events were synchronized with the physiological monitoring equipment software, enabling seamless tracking of participants’ stress responses within each VR event.
Participants encountered both physical (e.g., walking through hallways) and psychological challenges (e.g., turning in assignments, peer interactions) within the VR simulation. These scenarios were informed by earlier phases of the project, which included focus groups with individuals with lived experience, ensuring relevance and authenticity of the simulated experience [37].
The intervention consisted of a single 30 min exposure session to the VR simulation, conducted within the clinical IIPT setting. Each session was fully supervised by research and clinical staff to provide additional support as needed.

2.3. Measures

In addition to physiological markers collected through the VR headset and Polar H10 heart rate monitor, data collection also included structured participant questionnaires as well as open-ended responses, using an interview guide. Specifically, the following data was collected from participants:
  • Post-VR open-ended interview: We adapted a brief, open-ended interview developed by Griffin et al. to collect feedback from patients, parents, and clinicians on their VR experience [40]. The interview included questions such as “Tell me about what happened when you were in VR,” “Tell me about the parts that you expected,” “Tell me about the parts that you did NOT expect,” and “If you could change anything about this, how would you make it better?”.
  • Child Presence Questionnaire (CPQ): The CPQ gathered participants’ feedback on their level of engagement with the VR simulation, and how they perceived the VR experience [37,41,42]. Although no published psychometrics for the CPQ exist, the measure has been used in other studies of VR use in pediatric populations [41,42]. The three subscales on this measure included transportation (also often referred to as “presence”), realism, and immersion, key concepts in VR research [43]. The transportation subscale evaluates the perceived sensation of being physically relocated from the actual physical environment to a different, simulated space within the virtual experience. The realism subscale refers to the degree to which a virtual environment can replicate real-world sensory perceptions and interactions, with greater realism achieved through detailed visual representations and lifelike environmental cues. The immersion subscale assesses the extent to which a user becomes deeply engaged in the virtual environment and loses awareness of their real-life physical surroundings, experiencing the virtual space as if it were real. This sense of immersion is typically facilitated through a combination of high-quality visuals, audio, and precise user movements. The measure includes 12 items, such as “Did you feel like you were in control of the [VR experience]?” (See Table 2 for full measure). Responses were rated on a three-point scale (scored 0–2): “No”, “A little”, or “A lot.” The three subscales are summed for a total score indicating overall level of engagement. This measure has been recommended for the evaluation of pediatric VR interventions [43]. Cronbach’s alpha for the current sample was 0.84. (In the current sample, mean CPQ = 17, range 0–21; M immersion = 9.75, range 5–12; M realism = 3.25, range 0–6; M transportation = 4, range 2–5.)
  • Eye Tracking: Eye tracking and pupil diameter data were acquired at 120 Hz via the Tobii-based eye-tracking module integrated with the HP Omnicept Unity SDK. Each sample includes 3D gaze direction vectors for each eye, combined gaze, pupil diameter for each eye (mm), and validity flags and confidence values. Samples with a pupil dilation confidence value below 0.5 or values of −1 (indicating tracking loss) were marked invalid and excluded. Blinks and other missing intervals shorter than 100 ms were linearly interpolated, while longer gaps were retained as missing (null).
  • Cognitive Load (CL): Cognitive Load was recorded using the HP Omnicept SDK’s built-in metric, which combines pupil size, eye movement, and heart rate variability through a proprietary model. The SDK outputs this metric at approximately 1 Hz after an initial calibration period of 12–24 s. Samples with missing or invalid values were flagged by the SDK and excluded from analysis. Short gaps (<2 s) were linearly interpolated, while longer gaps were treated as missing. The resulting time series were then normalized to the [0–1] range per participant to allow for comparisons [44]. CL was computed based on an algorithm developed by the headset manufacturer (HP) [20]. The data distribution was centered around 0 by applying a z-score normalization, with 95% of the samples falling within the range of [−2, 2]. For our purposes, the z-score was transformed to a [0, 1] range using the following formula:
    Normalized_subjective_rating = (z_score_rating/4) + 0.5
    CL was measured on a scale from 0 to 1, where 0 indicates little to no CL, or that the HP Omnicept G2 VR headset was calibrating, and a score of 1 indicates highest CL.
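The preprocessing steps described above (validity filtering, interpolation of short gaps, and z-score rescaling of CL to [0, 1]) can be sketched as follows. This is an illustrative reconstruction under stated assumptions, not the study's actual code; in particular, clipping of z-scores that fall outside [−2, 2] is our assumption, as the text does not state how outliers are handled.

```python
import numpy as np

def clean_series(values, confidence, fs_hz, max_gap_ms):
    """Mark low-confidence or sentinel (-1) samples invalid, then linearly
    interpolate interior gaps shorter than max_gap_ms; longer gaps and
    gaps at the edges stay NaN (missing)."""
    v = np.asarray(values, dtype=float)
    c = np.asarray(confidence, dtype=float)
    v[(c < 0.5) | (v == -1)] = np.nan
    max_gap = int(round(max_gap_ms / 1000 * fs_hz))  # gap length in samples
    i, n = 0, len(v)
    while i < n:
        if np.isnan(v[i]):
            j = i
            while j < n and np.isnan(v[j]):
                j += 1
            if (j - i) <= max_gap and i > 0 and j < n:  # interior, short gap
                v[i:j] = np.interp(np.arange(i, j), [i - 1, j], [v[i - 1], v[j]])
            i = j
        else:
            i += 1
    return v

def normalize_cl(raw_cl):
    """z-score the CL samples per participant, then map to [0, 1] via
    z/4 + 0.5 as in the formula above (out-of-range clipping assumed)."""
    x = np.asarray(raw_cl, dtype=float)
    z = (x - x.mean()) / x.std()
    return np.clip(z / 4 + 0.5, 0.0, 1.0)
```

At the 120 Hz eye-tracking rate used here, a 100 ms gap corresponds to 12 samples; for the ~1 Hz CL stream the 2 s threshold corresponds to 2 samples.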

2.4. Data Analysis Plan

The inclusion of physiologic monitoring tools in this pilot testing was intended to assess and establish their feasibility and accessibility for use in future vRS trials, identify any hurdles to implementation, and generate preliminary feedback on levels of immersion and engagement in this type of intervention. Feasibility and acceptability were measured by any adverse events or difficulties tolerating the physiologic monitoring sensors, technical issues limiting successful data collection, and other feedback from the pilot participants and study team members or clinicians. We also sought to explore correlations between physiologic data (specifically cognitive load) collected in this pilot with participant self-reports of their experiences with the VR. Physiologic data were analyzed in raw form.
To examine the relationship between cognitive load and subjective report of perceived presence, a Spearman’s rank-order correlation analysis was conducted using normalized average CL scores. This test assessed the monotonic association between CL—a proxy for physiological engagement—and scores on the CPQ measure. A higher correlation coefficient indicates a stronger relationship between physiological indicators of engagement and self-reported presence in the virtual environment, supporting the validity of the physiologic monitoring measures.
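For reference, Spearman's ρ is simply the Pearson correlation computed on ranks, which is why it captures monotonic rather than strictly linear association. A minimal sketch (the input values are hypothetical, not the study's data):

```python
import numpy as np

def spearman_rho(x, y):
    """Spearman's rank-order correlation: Pearson correlation of the
    ranks, with tied values receiving their average rank."""
    def ranks(v):
        v = np.asarray(v, dtype=float)
        r = np.empty(len(v))
        r[np.argsort(v)] = np.arange(1, len(v) + 1)
        for val in np.unique(v):  # average ranks over ties
            mask = v == val
            r[mask] = r[mask].mean()
        return r
    rx, ry = ranks(x), ranks(y)
    rx, ry = rx - rx.mean(), ry - ry.mean()
    return float((rx * ry).sum() / np.sqrt((rx ** 2).sum() * (ry ** 2).sum()))

# Hypothetical per-participant mean CL and CPQ Transportation scores:
print(round(spearman_rho([0.41, 0.55, 0.38, 0.62, 0.47, 0.59, 0.50],
                         [3, 4, 2, 5, 3, 5, 4]), 2))
```

In practice a library routine such as `scipy.stats.spearmanr` would also return the two-tailed p-value reported in the Results.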

2.5. Use of AI

An AI-enhanced literature search tool supported by our hospital (Consensus) was used to aid in identifying appropriate literature for the literature review phase of study planning.

3. Results

This feasibility study found that integrating physiological monitoring into the VR simulation was generally well tolerated by youth participating in an intensive pain rehabilitation program. All participants (n = 9, mean age = 14.2, 67% female, 33% male, neuropathic pain = 3, widespread pain = 3, headache/migraine = 3) completed the school VR simulation without interruption, and no discomfort or adverse effects—such as cybersickness, nausea, or dizziness—were reported during or following the session. In addition to the lack of adverse events reported, post-session feedback indicated that the headset was comfortable to wear and did not interfere with participants’ ability to engage with VR. Participants similarly found the external heart rate monitor chest-band tolerable and did not report any discomfort with this device. Other aspects of the pilot VR setup, such as the tethering of the headset via computer cable, were noted to detract from the seamlessness of the VR experience, but no aspects of the physiologic monitoring were cited as barriers to VR engagement.
Data collection using the HP Omnicept G2 VR headset and the Polar H10 heart rate monitor was successful for the majority of participants. Technical difficulties resulted in the loss of data for one session (Participant 2), producing a failure rate of approximately 11%. Additionally, Participant 4 did not complete the post-session CPQ, resulting in incomplete self-report data for that session. Physiological data were recorded for the remaining participants, although some variability in data quality and completeness was observed. Overall, only 5 out of 8 participants whose data were recorded reached the end screen of the simulation.
The sampling intervals for externally monitored HR and HRV did not align precisely with transitions between key school VR events across all nine participants, making it difficult to link specific HRV responses to distinct school scenarios simulated in the VR. As a result, the analysis focused on broader associations between cognitive load—derived jointly from heart rate and pupillometry/eye-tracking data collected within the headset—and psychological measures rather than event-specific HRV responses. Nonparametric Spearman’s ρ correlation coefficients were calculated to examine the relationship between cognitive load and self-reported CPQ, revealing a trend-level correlation between cognitive load and the Transportation subdomain of the presence measure (ρ = 0.69, n = 7, two-tailed p = 0.08). Correlations between cognitive load and total CPQ score along with the other subscale scores were nonsignificant with the available sample size (n = 7). Due to insufficient data and a failure to meet the normality assumption, statistical analysis for cognitive load and CPQ trends was limited. Graphic visualization of individual participants’ cognitive load results was used to supplement the primary interpretation given that sample size restricted inferential analysis at the group level (Figure 2, Figure 3 and Figure 4).
In the context of the vRS simulation, “tasks” typically involve active engagement and problem-solving, which can increase cognitive load, whereas scene changes or “transitions” involve shifting from one environment to another, which might involve less cognitive engagement compared to tasks. Looking at mean results across all participants, we did not find a significant difference in mean CL scores between the tasks and the transition scenes. However, there is some variability in the CL within both tasks and transitions, indicating that not all tasks or transitions are equally demanding. Given the lack of group level significant findings, we explored individual results to inform subsequent studies. From the case studies depicted in Figure 2 and Figure 3, we can observe that for Participants 3 and 8, CL for tasks appears to be slightly higher than for scene changes. This is expected as tasks require more cognitive effort and attention.
Importantly, Participant 5 (Figure 4) deviated from the traditional course of the simulation by going to the wrong classroom first; see Event 9 (wrong). By comparing Participant 5’s CL with the other two case studies presented, we can see that Participant 5’s CL drops markedly when in the wrong classroom, then steadily increases as the participant works to return to the correct simulation course. It is important to note that the CL change from this transition to the subsequent task (E10) is notably steeper compared to the other two participants. This suggests that the mistake of going to the wrong classroom first increased Participant 5’s cognitive load, likely due to confusion, stress, or the participant’s need to reorient themselves. These individual case-level observations suggest that problem-solving tasks may be more cognitively demanding than transitions for some participants, and that deviations from the expected course, which demand more focus and engagement for reorientation, can lead to increased cognitive load. Despite our small sample size, the three case studies presented underscore the potential of CL as a measure of engagement in virtual reality environments.
Lastly, rich pupillometry and eye gaze data were collected for each participant, yet our analysis was limited by the small sample size and, more importantly, by the specialized expertise required to interpret these complex data, which was beyond the scope of our primary research team. Figure 5 presents eye gaze data across transitions and tasks for one case study from the current sample. Combined eye gaze was recorded in the left-hand coordinate system (X points right–left, Y points top–bottom, and Z points forward) and relative to the head pose. For this participant, eye gaze data showed a clear concentration of visual attention along the Z axis, which represents forward-facing gaze, while movement along the X (horizontal) and Y (vertical) axes was minimal. This pattern suggests that the participant maintained steady forward visual focus on the task-relevant content within the virtual environment, rather than shifting attention away from the central field. Research indicates that greater immersion is typically reflected in more focused gaze patterns, with visual attention directed toward central or task-relevant elements (reflected by higher values on the Z axis), whereas disengagement is associated with more dispersed and erratic eye movements (reflected by higher values on the X and Y axes) [45,46]. Building on previous research, the case study presented indicates that combined eye gaze—tracking where users look in a VR environment—may serve as an appropriate measure of immersion and engagement.
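One simple way to quantify the gaze pattern described above is the per-axis dispersion of the combined gaze vectors: low spread on X and Y alongside a dominant forward (Z) component would indicate steady central focus, whereas high X/Y spread would suggest more dispersed attention. This summary statistic is our illustrative choice, not the analysis performed in the study.

```python
import numpy as np

def gaze_dispersion(gaze_xyz):
    """Per-axis standard deviation of combined gaze direction vectors.
    Rows are samples; columns are X (right-left), Y (top-bottom), and
    Z (forward) in the headset's left-handed coordinate frame."""
    g = np.asarray(gaze_xyz, dtype=float)
    return {axis: float(g[:, i].std()) for i, axis in enumerate("XYZ")}

# Hypothetical samples of mostly forward-facing gaze:
samples = [[0.02, -0.01, 0.99], [0.01, 0.00, 1.00], [-0.01, 0.02, 0.99]]
print(gaze_dispersion(samples))
```

A threshold on the X/Y dispersion could then serve as a crude engagement flag, though any cutoff would need validation against self-report measures such as the CPQ.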

4. Discussion

4.1. Feasibility

In this feasibility study, the integration of physiological monitoring into a virtual reality intervention was well tolerated by youth undergoing intensive interdisciplinary pain treatment. Participants demonstrated willingness to engage with the VR content, the integrated monitoring equipment (e.g., pupillometry), and the external heart rate sensor, suggesting that the added layer of physiological data collection did not interfere with the immersive experience or perceived utility of the intervention. The absence of negative experiences supports the acceptability of embedding physiological monitoring into VR-based interventions for pediatric patients with chronic pain. This is consistent with recent studies of VR interventions in pediatric pain [40,47,48]. While data collection was largely successful, technical issues and variability in participant progression through the simulation highlight areas for improving reliability and task completion in future iterations. By refining both hardware integration and simulation design, consistent data capture and user engagement may be ensured, thus strengthening the ecological validity of VR and other XR tools for promoting school exposure with youth in IIPT settings.
Participant open-ended feedback on the VR experience revealed valuable insights for future intervention development. While the headset was described as comfortable and non-intrusive, several participants indicated a desire for greater freedom of movement within a defined perimeter, rather than relying on handheld controllers to navigate the virtual environment. Based on this feedback, it may be inferred that integrating physiological monitoring (e.g., pupillometry) directly into the VR headset enhanced both comfort and immersion by removing the need for additional wearable devices beyond the heart rate monitor. For pediatric chronic pain patients, who may experience sensitivity to equipment due to pain and other associated functional symptoms, this streamlined setup may reduce distraction and allow for greater focus on the virtual environment. A stronger sense of presence, as demonstrated in our sample, is particularly relevant for therapeutic goals related to school exposure in IIPT, as it indicates that participants were able to engage meaningfully without feeling burdened by the technology. Additionally, passive, continuous data collection enabled an uninterrupted flow of the virtual experience. This combination of comfort, immersion, and real-time monitoring makes integrated VR headsets a valuable tool for delivering not only engaging VR experiences, but also a rich source of objective information for the continuous improvement of VR interventions in IIPT settings.
These findings suggest that such integration is not only feasible but may also add value in future technological iterations in terms of personalized treatment feedback, real-time stress regulation support, or adaptive features based on physiologic responses [18,49,50,51]. The results provide a promising foundation for advancing future larger scale research aimed at optimizing VR interventions with physiological integrated components in pediatric pain treatment settings, where such interventions are used to further treatment goals of improved pain and functional outcomes. However, the evidence base is still developing, especially regarding the use of advanced physiological markers such as pupillometry and eye gaze. Similar to the present study, most studies to date are small-scale pilots or feasibility trials. There is a pressing need for larger, rigorously designed randomized controlled trials to establish long-term efficacy and to optimize protocols for physiological monitoring integration [17,18,50].
Another aspect that warrants attention is the rapid advancement of VR technology, which presents both significant opportunities and notable challenges for its implementation in IIPT settings and beyond. On one hand, fast-developing hardware and software—such as the headset-integrated physiological monitoring used in this study—have enhanced the immersive quality, interactivity, and research use of VR experiences, showing significant reductions in pain, fear of pain, pain avoidance, and functional limitations in pediatric chronic pain rehabilitation, with high patient engagement and satisfaction [18,40,52]. However, the rapid pace of technological development also introduces barriers. For example, clinicians and researchers in healthcare settings may struggle to adapt quickly enough, leading to challenges with integration into existing systems and workflows and exacerbating already-existent structural constraints to VR implementation and research [47,53]. For clinical researchers, rapid technological developments such as headset-integrated physiological monitoring may outpace the slow timelines inherent to human subject research in academic settings, thus limiting our ability to build timely, evidence-based support for the clinical use of cutting-edge VR. Despite the challenges posed by rapid technological advancement, proactive planning, training, and ethical oversight can support sustainable and effective VR integration into clinical settings such as IIPT programs.

4.2. Cognitive Load

While VR interventions show promise as a complementary therapy for pediatric chronic pain, with growing evidence for their potential to improve pain relief and increase function [54,55], most studies focus on self-reported outcomes and do not directly measure cognitive load using physiological markers like heart rate or pupillometry/eye gaze. Most studies to date have investigated CL in the context of learning outcomes within virtual environments compared to traditional methods [56,57,58,59]. The concept is not widely used in the context of chronic pain and thus was exploratory in this feasibility study. Cognitive load is broadly used as a framework for managing the complexity involved in designing VR learning experiences. More complex and detailed visual representations can enhance presence through higher representational fidelity (realism) but may also increase extraneous CL (the CL component that depends on the design of the learning task and on how information is presented to the learner), thus potentially reducing learning [28]. Although recent studies have begun to explore whether physiological measures such as HRV and pupillometry/eye gaze can objectively assess CL in VR [60,61,62,63], there is a notable gap in the literature regarding the use of HRV- and pupillometry-derived CL as a measure of presence and engagement in VR interventions for pediatric chronic pain. In addition, CL algorithms may be specific to headset manufacturers or companies, as was the case in the current study, leading to a lack of comparability in findings as well as challenges for clinical teams in fully understanding these metrics.
In the current study, a trend-level correlation between CL and the transportation subdomain of the CPQ suggests that, in this sample, greater CL during the VR experience may be somewhat associated with a stronger sense of being “transported” into the virtual school environment. However, given the lack of significant associations between CL and other CPQ subscales as well as the total measure, these results provide only partial support for the merits of CL as a physiologic marker of VR engagement. For youth undergoing intensive pain rehabilitation with the goal of school reintegration, this study offers modest support for the potential of VR to create immersive school experiences that meaningfully capture attention, promote presence, and offer school exposure in a way that would not otherwise be possible in a hospital setting [37]. Optimal CL may reflect active processing and deep involvement with the content [57], both of which are critical for therapeutic engagement. This relationship highlights the value of designing VR interventions that are mentally stimulating without being overly cognitively taxing, particularly for pediatric populations managing chronic pain. It also supports the use of physiological monitoring more generally, such as heart rate and pupillometry/eye gaze data (the underlying markers of CL), as a promising approach for evaluating user engagement [62,63], allowing interventions to be tailored for optimal immersion and therapeutic impact.
Although the CPQ total score did not significantly correlate with CL, likely due to small sample size, the general pattern of these two scores moving in the same direction (rho = 0.51, p = 0.24) raises useful questions for further research around the utility of combining self-reported experiences with measurable physiological responses in evaluating engagement and clinical outcomes in VR clinical research [64]. Presumably, the use of multiple methods such as physiologic feedback combined with self-reported experience allows for a richer interpretation of the concept of presence and a more comprehensive assessment of associated clinical outcomes.
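To make the multi-method comparison above concrete, the sketch below computes a Spearman rank correlation between self-reported CPQ totals and mean CL, the statistic reported in this section. The data values are hypothetical (not the study's data), and the pure-Python implementation assumes no tied scores.

```python
# Pure-Python Spearman rank correlation (assumes no tied values),
# illustrating how self-reported CPQ totals could be compared with
# mean cognitive load (CL). All data below are hypothetical.

def rank(values):
    """Return 1-based ranks of a sequence with no tied values."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    ranks = [0] * len(values)
    for r, i in enumerate(order, start=1):
        ranks[i] = r
    return ranks

def spearman_rho(x, y):
    """Spearman's rho via the rank-difference formula (no ties)."""
    n = len(x)
    rx, ry = rank(x), rank(y)
    d2 = sum((a - b) ** 2 for a, b in zip(rx, ry))
    return 1 - 6 * d2 / (n * (n ** 2 - 1))

# Hypothetical scores for eight participants
cpq_total = [42, 55, 38, 61, 47, 50, 58, 44]
mean_cl = [0.49, 0.58, 0.35, 0.55, 0.41, 0.52, 0.63, 0.46]

print(f"rho = {spearman_rho(cpq_total, mean_cl):.2f}")
```

In practice, tied scores are common with questionnaire data, so a library routine that handles ties (e.g., a statistics package's Spearman implementation) would be preferable to this minimal formula.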

4.3. Pupillometry and Eye Tracking

Another aspect that warrants attention is the use of pupillometry as a physiological marker in VR clinical interventions and research. Previous studies show that eye movements correlate with the immersiveness of virtual environments and could be used to reliably measure immersion in VR [65,66]. The combined eye gaze ray (which tracks where users look in a VR environment) can serve as an objective measure of immersion and engagement. Research shows that higher immersion is associated with more focused and less dispersed gaze patterns, often concentrated toward the center or on relevant objects, while disengagement leads to more scattered gaze behavior [46,47]. Eye-tracking data, including gaze deviation, can distinguish between states of boredom and engagement, with minimal gaze deviation indicating sustained attention and immersion [67].
In the current study, eye gaze was recorded as a three-dimensional (X, Y, Z) direction vector using the HP headset-integrated eye-tracking system, with the Z axis representing forward visual attention. In the case study presented, the participant demonstrated consistently higher values along the Z axis, suggesting that their gaze remained focused on the primary visual field rather than shifting laterally (X axis) or vertically (Y axis). This pattern may indicate a higher level of immersion, task engagement, and visual focus, which are key components of a successful VR experience in therapeutic settings. Given the lack of consistent findings at the group level, however, this interpretation is preliminary and requires further replication. Such directional gaze data offers potentially valuable, objective insight into how users interact with and attend to virtual environments. However, while these preliminary findings are promising, fully interpreting the meaning and significance of gaze direction and pupillometry indicators requires advanced analytical methods and collaboration with appropriate expertise. Techniques such as gaze clustering, gaze-ray casting, and real-time behavioral mapping, along with expertise in spatial data analysis and physiological signal interpretation, were beyond the scope and training of our current research team. As such, while the pupillometry/eye gaze data collected show potential as a proxy for user engagement, more in-depth analysis will be essential in future studies to draw stronger conclusions about attentional focus and immersion in pediatric chronic pain VR interventions.
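A minimal sketch of the gaze summary described above, assuming gaze samples arrive as normalized (X, Y, Z) direction vectors: it computes the share of samples in which the forward (Z) component dominates, and the lateral/vertical spread as a crude dispersion proxy. The sample values and the dominance criterion are illustrative, not the HP Omnicept API or the study's analysis pipeline.

```python
# Hypothetical gaze-direction samples (normalized x, y, z vectors);
# z is the "forward" axis as described in the text.
from statistics import pstdev

gaze_samples = [
    (0.05, -0.02, 0.998),
    (0.10, 0.04, 0.994),
    (-0.08, 0.01, 0.997),
    (0.30, -0.12, 0.946),
    (0.02, 0.03, 0.999),
]

# Share of samples where forward (Z) gaze dominates lateral/vertical gaze.
forward_share = sum(
    abs(z) > max(abs(x), abs(y)) for x, y, z in gaze_samples
) / len(gaze_samples)

# Lateral/vertical dispersion: a lower spread suggests more focused gaze.
x_spread = pstdev(x for x, _, _ in gaze_samples)
y_spread = pstdev(y for _, y, _ in gaze_samples)

print(f"forward share = {forward_share:.2f}, "
      f"x spread = {x_spread:.3f}, y spread = {y_spread:.3f}")
```

Simple summaries like these are only a starting point; the gaze clustering and gaze-ray casting techniques mentioned above require substantially more sophisticated spatial analysis.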
We used the HP Omnicept SDK’s built-in cognitive load metric, which combines pupil size, heart rate variability, and eye movement data through a proprietary model. Although vendor-specific measures can make it harder to compare findings across systems, we chose this headset because the metric has been validated by HP and is accessible in real time through Unity. This allowed us to assess feasibility and make comparisons within our own equipment, though not across different devices.

4.4. Limitations

Several limitations of this feasibility study should be considered when interpreting the findings. First, although cognitive load is a well-established concept in the field of learning [25,26] and is increasingly relevant in virtual reality research as a correlate of physiologic arousal [36], no previous studies, to the best of our knowledge, have specifically explored cognitive load in VR within pediatric chronic pain populations, particularly within IIPT programs. This gap limits the interpretability of our data but also presents a promising new direction for future research. In this context, while the feasibility design is appropriate for the study’s exploratory nature, it is important to note that the findings are based on a small, homogeneous sample recruited exclusively from a single tertiary-care pain program, which restricts the study’s external validity. Hence, results may not be extrapolated to broader pediatric populations or community settings. Future research should involve more diverse and representative samples from multiple settings to enhance the generalizability of the findings.
Second, while physiological data collection was largely successful, there were notable challenges from a methodological standpoint. The HP Omnicept G2 headset and integrated sensors failed to record data for one session, resulting in missing data for one participant. Furthermore, the heart rate (HR) and heart rate variability (HRV) sampling intervals did not align with the VR event markers, which limited efforts to analyze event-associated physiologic stress responses. These issues limited the granularity of the data and reduced the capacity for event-level data analysis, but these limitations are addressable for future studies (see Section 4.5).
Another important limitation to consider is the specialized knowledge required to analyze pupillometry/eye-tracking data. Pupillometry (which includes a host of measures such as saccades, fixational drifts, pursuit eye movements, and pupil dilation) has been steadily gaining traction in recent years as a promising tool for assessing immersion and engagement in virtual environments [68]. The broader field of VR research is increasingly moving in this direction, and applying these methods to pediatric pain populations is a logical and important next step, given their value in assessing clinical outcomes for VR-based interventions. However, meaningful analysis of pupillometry, including eye gaze data, especially in complex, interactive VR settings, requires highly specialized expertise in areas such as signal processing, neurophysiology, spatial data analysis, and human–computer interaction. In this study, although the HP headset-integrated system successfully captured detailed eye-tracking data, we were unable to conduct meaningful analyses of the full pupillometry data available due to the lack of team members with this specific skill set. This presents a broader accessibility barrier for clinical research teams hoping to integrate these advanced tools. As such, future study protocols and funding proposals should carefully consider the analytical demands of physiological monitoring in VR, including the need for multidisciplinary collaboration from the outset. Building teams that include data scientists, VR developers, cognitive scientists, and clinicians will be essential to fully realize the potential of pupillometry and eye-tracking in immersive pediatric chronic pain interventions.
Lastly, even though our study was designed to explore the feasibility of integrating physiological monitoring into VR rather than to assess clinical pain outcomes, it is important to note that excluding clinical measures (such as pain ratings, anxiety levels, or functional outcomes) limits our understanding of how this integration may affect patient care. Without these clinical measures, it cannot be fully assessed whether the physiological data captured in VR translates into meaningful improvements in pain management. Future studies should include comprehensive clinical assessments to provide a full picture of the impact of VR-based physiological integration in pediatric pain settings.

4.5. Next Iteration Plan

The next phase of this research will prioritize enhancing the precision and interpretability of integrated physiological monitoring in VR interventions. Together with a focus on multidisciplinary collaboration, future studies should incorporate timestamping, synchronized event markers, and epoch-based analyses to align physiologic data streams (e.g., heart rate, heart rate variability, and pupillometry) with discrete VR events. This approach will enable more fine-grained analyses of cognitive load and arousal in relation to specific simulation tasks.
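The epoch-based alignment proposed above can be sketched simply: timestamped physiologic samples are windowed around each VR event marker on a shared clock. The timestamps, window bounds, and event names below are hypothetical, intended only to show the data structure, not an actual study protocol.

```python
# Minimal epoch-based alignment sketch: average heart-rate samples in a
# window around each VR event marker. All values are hypothetical.

def epoch_mean(samples, event_time, pre=2.0, post=5.0):
    """Mean of samples falling in [event_time - pre, event_time + post]."""
    window = [v for t, v in samples if event_time - pre <= t <= event_time + post]
    return sum(window) / len(window) if window else None

# (timestamp_seconds, heart_rate_bpm) stream sharing a clock with VR events
hr_stream = [(0.0, 72), (1.0, 74), (2.0, 78), (3.0, 85),
             (4.0, 88), (5.0, 84), (6.0, 79), (7.0, 75)]

# Hypothetical event markers emitted by the simulation
vr_events = {"enter_hallway": 1.5, "enter_classroom": 5.5}

for name, t in vr_events.items():
    print(name, epoch_mean(hr_stream, t))
```

The essential design requirement is the shared, synchronized clock: without it (as in the present study), per-event epochs cannot be constructed regardless of how the windows are defined.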
To reduce the likelihood of technical failures in future study iterations, several procedural and technological improvements may be considered. Implementing standardized pre-session equipment checks can help ensure that all sensors—such as those for heart rate and pupillometry—are functioning correctly before the intervention begins. Data collection may be monitored after completion of each research session to troubleshoot software errors and consequently determine recruitment needs according to study sample goals. Alternatively, incorporating real-time data monitoring during sessions would allow issues to be detected and addressed as they occur, minimizing data loss. Additionally, comprehensive training for research staff on equipment setup, troubleshooting, software maintenance/updating, and response protocols will further support data collection reliability and robustness [48,69]. Collectively, these steps can enhance data quality and ensure that future studies are both technically rigorous and clinically practical.
As previously mentioned, while the current study focused intentionally on feasibility of physiologic marker integration rather than assessing clinical pain outcomes, subsequent iterations should also add validated clinical outcomes such as pain intensity (VAS/NRS), anxiety scores, and functional measures. These outcomes will help determine whether physiological monitoring measurements not only capture engagement and arousal but also correspond to clinical outcomes. By integrating synchronized physiological monitoring with clinical outcomes, future work will advance the development of standardized, user-friendly markers that can inform both real-time adaptation of VR content and the evaluation of treatment efficacy.

5. Conclusions

This feasibility study explored the integration of physiological monitoring, specifically cognitive load and its subcomponents (HRV and pupillometry/eye-tracking), into a VR-based intervention for pediatric pain management. While these metrics show promise as feasible markers of engagement and arousal for VR-based interventions, due to the limitations discussed, results should be interpreted with some caution and are limited to feasibility findings. Further research is needed to fully endorse and guide the use of cognitive load and pupillometry as physiologic monitoring parameters in pediatric chronic pain clinical research. As with the present study, much of the existing research consists of small-scale feasibility studies, highlighting the need for larger-scale randomized controlled trials to evaluate long-term efficacy and clinical outcomes and to establish refined, scalable protocols for the use of physiological monitoring in VR research for pediatric chronic pain. New research should incorporate additional sensitive physiological monitoring parameters (e.g., electrodermal activity, respiratory rate, body temperature, blood cortisol levels, and blood pressure), with the goal of capturing a more complete picture of engagement with VR interventions and associated clinical outcomes. With these future studies and the continued technological advancement of integrated physiologic monitoring within VR, physiologic monitoring measures can help enrich the analysis of patients’ clinical outcomes when using VR therapy and continue to grow the evidence base for the use of VR in clinical settings.

Author Contributions

As principal investigator, D.E.L. was involved in all stages of the project and supervised the on-site team (H.M. and S.R.M.). H.M. was involved in analyzing and visualizing data and writing the original draft. S.R.M. contributed to project administration, data analysis and visualization, original draft preparation, as well as the review and editing process. T.R. and B.A. led the methodology, software, and formal analysis efforts, and T.R. also contributed to editing of the final draft. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the Boston Children’s Hospital Department of Anesthesia, Critical Care and Pain Medicine Ignition and Trailblazer Awards to Deirdre Logan.

Institutional Review Board Statement

The study was conducted in accordance with the Declaration of Helsinki, and approved by the Institutional Review Board of Boston Children’s Hospital (IRB-P00038226, approved on 23 April 2021).

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study. This included written informed consent for parents/guardians and written assent for participants.

Data Availability Statement

The data presented in this study are available on request from the corresponding author due to technical limitations.

Acknowledgments

The authors acknowledge Shealyn O’Donnell, Kevin Zirko, and the staff and families of the BCH PPRC program for their assistance with and support of this project. Thanks also to Karina Khanna. During the preparation of this manuscript, the author(s) used Consensus, an AI-enhanced literature search tool supported by our hospital, to aid in identifying appropriate literature. The authors have reviewed and edited the output and take full responsibility for the content of this publication.

Conflicts of Interest

The authors declare no conflicts of interest. The funders had no role in the design of the study; in the collection, analyses, or interpretation of data; in the writing of the manuscript; or in the decision to publish the results.

References

  1. Chambers, C.T.; Dol, J.; Tutelman, P.R.; Langley, C.L.; Parker, J.A.; Cormier, B.T.; Macfarlane, G.J.; Jones, G.T.; Chapman, D.; Proudfoot, N.; et al. The Prevalence of Chronic Pain in Children and Adolescents: A Systematic Review Update and Meta-Analysis. Pain 2024, 165, 2215–2234. [Google Scholar] [CrossRef]
  2. Logan, D.E.; Gray, L.S.; Iversen, C.N.; Kim, S. School Self-Concept in Adolescents with Chronic Pain. J. Pediatr. Psychol. 2017, 42, 892–901. [Google Scholar] [CrossRef]
  3. Gold, J.I.; Yetwin, A.K.; Mahrer, N.E.; Carson, M.C.; Griffin, A.T.; Palmer, S.N.; Joseph, M.H. Pediatric Chronic Pain and Health-Related Quality of Life. J. Pediatr. Nurs. 2009, 24, 141–150. [Google Scholar] [CrossRef] [PubMed]
  4. Simons, L.E.; Sieberg, C.B.; Claar, R.L. Anxiety and Functional Disability in a Large Sample of Children and Adolescents with Chronic Pain. Pain Res. Manag. 2012, 17, 93–97. [Google Scholar] [CrossRef]
  5. Maes, M.; Van Den Noortgate, W.; Fustolo-Gunnink, S.F.; Rassart, J.; Luyckx, K.; Goossens, L. Loneliness in Children and Adolescents with Chronic Physical Conditions: A Meta-Analysis. J. Pediatr. Psychol. 2017, 42, 622–635. [Google Scholar] [CrossRef] [PubMed]
  6. Murray, C.B.; Groenewald, C.B.; de la Vega, R.; Palermo, T.M. Long-Term Impact of Adolescent Chronic Pain on Young Adult Educational, Vocational, and Social Outcomes. Pain 2020, 161, 439–445. [Google Scholar] [CrossRef]
  7. Mahrer, N.E.; Gold, J.I. The Use of Virtual Reality for Pain Control: A Review. Curr. Pain Headache Rep. 2009, 13, 100–109. [Google Scholar] [CrossRef] [PubMed]
  8. Malloy, K.M.; Milling, L.S. The Effectiveness of Virtual Reality Distraction for Pain Reduction: A Systematic Review. Clin. Psychol. Rev. 2010, 30, 1011–1018. [Google Scholar] [CrossRef]
  9. Gold, J.I.; Belmont, K.A.; Thomas, D.A. The Neurobiology of Virtual Reality Pain Attenuation. Cyberpsychol. Behav. 2007, 10, 536–544. [Google Scholar] [CrossRef]
  10. Gupta, A.; Scott, K.; Dukewich, M. Innovative Technology Using Virtual Reality in the Treatment of Pain: Does It Reduce Pain via Distraction, or Is There More to It? Pain Med. 2018, 19, 151–159. [Google Scholar] [CrossRef]
  11. Law, E.F.; Dahlquist, L.M.; Sil, S.; Weiss, K.E.; Herbert, L.J.; Wohlheiter, K.; Horn, S.B. Videogame Distraction Using Virtual Reality Technology for Children Experiencing Cold Pressor Pain: The Role of Cognitive Processing. J. Pediatr. Psychol. 2011, 36, 84–94. [Google Scholar] [CrossRef]
  12. Moseley, G.L.; Flor, H. Targeting Cortical Representations in the Treatment of Chronic Pain: A Review. Neurorehabil. Neural Repair 2012, 26, 646–652. [Google Scholar] [CrossRef]
  13. Parsons, T.D.; Trost, Z. Virtual Reality Graded Exposure Therapy as Treatment for Pain-Related Fear and Disability in Chronic Pain. In Virtual, Augmented Reality and Serious Games for Healthcare 1; Ma, M., Jain, L.C., Anderson, P., Eds.; Springer: Berlin/Heidelberg, Germany, 2014; pp. 523–546. ISBN 978-3-642-54816-1. [Google Scholar]
  14. Colloca, L.; Raghuraman, N.; Wang, Y.; Akintola, T.; Brawn-Cinani, B.; Colloca, G.; Kier, C.; Varshney, A.; Murthi, S. Virtual Reality: Physiological and Behavioral Mechanisms to Increase Individual Pain Tolerance Limits. Pain 2020, 161, 2010–2021. [Google Scholar] [CrossRef]
  15. Mallari, B.; Spaeth, E.K.; Goh, H.; Boyd, B.S. Virtual Reality as an Analgesic for Acute and Chronic Pain in Adults: A Systematic Review and Meta-Analysis. J. Pain Res. 2019, 12, 2053–2085. [Google Scholar] [CrossRef]
  16. Arthur, T.; Melendez-Torres, G.J.; Harris, D.; Robinson, S.; Wilson, M.; Vine, S. Extended Reality Interventions for Health and Procedural Anxiety: Panoramic Meta-Analysis Based on Overviews of Reviews. J. Med. Internet Res. 2025, 27, e58086. [Google Scholar] [CrossRef] [PubMed]
  17. Hess, C.W.; Logan, D.E.; Rosenbloom, B.N.; Mesaroli, G.; Simons, L.E.; Ouellette, C.; Nguyen, C.; Alam, F.; Stinson, J.N. Developing a Core Outcome Set for Pediatric and Adult Acute and Chronic Pain Extended Reality Trials: Delphi Consensus-Building Process. J. Med. Internet Res. 2025, 27, e58947. [Google Scholar] [CrossRef] [PubMed]
  18. Logan, D.E.; Simons, L.E.; Caruso, T.J.; Gold, J.I.; Greenleaf, W.; Griffin, A.; King, C.D.; Menendez, M.; Olbrecht, V.A.; Rodriguez, S.; et al. Leveraging Virtual Reality and Augmented Reality to Combat Chronic Pain in Youth: Position Paper from the Interdisciplinary Network on Virtual and Augmented Technologies for Pain Management. J. Med. Internet Res. 2021, 23, e25916. [Google Scholar] [CrossRef]
  19. Halbig, A.; Latoschik, M.E. A Systematic Review of Physiological Measurements, Factors, Methods, and Applications in Virtual Reality. Front. Virtual Real. 2021, 2, 694567. [Google Scholar] [CrossRef]
  20. Siegel, E.H.; Wei, J.; Gomez, A.; Olivera, A.; Sundaramoorthy, P.; Smathers, K.; Vankipuram, M.; Ghosh, S.; Horii, H.; Bailenson, J.; et al. HP Omnicept Cognitive Load Database (HPO-CLD)–Developing a Multimodal Inference Engine for Detecting Real-Time Mental Workload in VR. HP Labs 2021, 20, 21. [Google Scholar]
  21. Blum, J.; Rockstroh, C.; Göritz, A.S. Heart Rate Variability Biofeedback Based on Slow-Paced Breathing with Immersive Virtual Reality Nature Scenery. Front. Psychol. 2019, 10, 2172. [Google Scholar] [CrossRef] [PubMed]
  22. Deniaud, C.; Honnet, V.; Jeanne, B.; Mestre, D. An Investigation into Physiological Responses in Driving Simulators: An Objective Measurement of Presence. In Proceedings of the 2015 Science and Information Conference (SAI), London, UK, 28–30 July 2015; pp. 739–748. [Google Scholar] [CrossRef]
  23. Connelly, M.A.; Brown, J.T.; Kearns, G.L.; Anderson, R.A.; Peter, S.D.S.; Neville, K.A. Pupillometry: A Non-Invasive Technique for Pain Assessment in Paediatric Patients. Arch. Dis. Child. 2014, 99, 1125–1131. [Google Scholar] [CrossRef]
  24. Van der Stigchel, S.; Meeter, M.; Theeuwes, J. Eye Movement Trajectories and What They Tell Us. Neurosci. Biobehav. Rev. 2006, 30, 666–679. [Google Scholar] [CrossRef]
  25. Sweller, J.; Ayres, P.; Kalyuga, S. Cognitive Load Theory; Springer Nature: Dordrecht, The Netherlands, 2011. [Google Scholar] [CrossRef]
  26. Mayer, R.E. The Cambridge Handbook of Multimedia Learning, 2nd ed.; Cambridge University Press: Cambridge, UK, 2014; pp. 1–930. [Google Scholar] [CrossRef]
  27. Sweller, J. Element Interactivity and Intrinsic, Extraneous, and Germane Cognitive Load. Educ. Psychol. Rev. 2010, 22, 123–138. [Google Scholar] [CrossRef]
  28. Makransky, G.; Andreasen, N.K.; Baceviciute, S.; Mayer, R.E. Immersive Virtual Reality Increases Liking but Not Learning with a Science Simulation and Generative Learning Strategies Promote Learning in Immersive Virtual Reality. J. Educ. Psychol. 2021, 113, 719–735. [Google Scholar] [CrossRef]
  29. Makransky, G.; Petersen, G.B. The Cognitive Affective Model of Immersive Learning (CAMIL): A Theoretical Research-Based Model of Learning in Immersive Virtual Reality. Educ. Psychol. Rev. 2021, 33, 937–958. [Google Scholar] [CrossRef]
  30. Sezgin, T.M.; Robinson, P. Affective Video Data Collection Using an Automobile Simulator; Springer: Berlin/Heidelberg, Germany, 2007; pp. 770–771. [Google Scholar] [CrossRef]
  31. Cohen, I.; Sebe, N.; Garg, A.; Chen, L.S.; Huang, T.S. Facial Expression Recognition from Video Sequences: Temporal and Static Modeling. Comput. Vision. Image Underst. 2003, 91, 160–187. [Google Scholar] [CrossRef]
  32. Barrett, L.F.; Adolphs, R.; Marsella, S.; Martinez, A.M.; Pollak, S.D. Emotional Expressions Reconsidered: Challenges to Inferring Emotion from Human Facial Movements. Psychol. Sci. Public Interest 2019, 20, 1–68. [Google Scholar] [CrossRef] [PubMed]
  33. Su, J.; Luz, S. Predicting Cognitive Load Levels from Speech Data. In Smart Innovation, Systems and Technologies; Springer: Cham, Switzerland, 2016; Volume 48, pp. 255–263. [Google Scholar] [CrossRef]
  34. Boril, H.; Sadjadi, S.O.; Hansen, J.H.L. UTDrive: Emotion and Cognitive Load Classification for in-Vehicle Scenarios. In Proceedings of the 5th Biennial Workshop on DSP for In-Vehicle Systems, Kiel, Germany, 4–7 September 2011. [Google Scholar]
  35. Seminowicz, D.A.; Davis, K.D. Interactions of Pain Intensity and Cognitive Load: The Brain Stays on Task. Cereb. Cortex 2007, 17, 1412–1422. [Google Scholar] [CrossRef]
  36. Bong, C.L.; Fraser, K.; Oriot, D. Cognitive Load and Stress in Simulation. In Comprehensive Healthcare Simulation: Pediatrics; Springer: Berlin/Heidelberg, Germany, 2016; pp. 3–17. ISBN 978-3-319-24187-6. [Google Scholar]
  37. Logan, D.E.; Khanna, K.; Randall, E.; O’Donnell, S.; Reks, T.; McLennan, L. Centering Patient and Clinician Voices in Developing Tools to Address Pain Related School Impairment: A Phase I Study of a Virtual Reality School Simulation for Children and Adolescents with Chronic Pain. Children 2023, 10, 1644. [Google Scholar] [CrossRef] [PubMed]
  38. Chen, C.; Hu, X.; Fisher, J. What Is ‘Being There’? An Ontology of the Immersive Experience. Ann. Int. Commun. Assoc. 2024, 48, 391–414. [Google Scholar] [CrossRef]
  39. About the Immersion Lab. Available online: https://nanousers.mit.edu/immersion-lab (accessed on 27 August 2025).
  40. Griffin, A.; Wilson, L.; Feinstein, A.B.; Bortz, A.; Heirich, M.S.; Gilkerson, R.; Wagner, J.F.M.; Menendez, M.; Caruso, T.J.; Rodriguez, S.; et al. Virtual Reality in Pain Rehabilitation for Youth with Chronic Pain: Pilot Feasibility Study. JMIR Rehabil. Assist. Technol. 2020, 7, e22620. [Google Scholar] [CrossRef]
  41. Hundert, A.S.; Birnie, K.A.; Abla, O.; Positano, K.; Cassiani, C.; Lloyd, S.; Tiessen, P.H.; Lalloo, C.; Jibb, L.A.; Stinson, J. A Pilot Randomized Controlled Trial of Virtual Reality Distraction to Reduce Procedural Pain during Subcutaneous Port Access in Children and Adolescents with Cancer. Clin. J. Pain 2022, 38, 189–196. [Google Scholar] [CrossRef]
  42. Gold, J.I.; Kim, S.H.; Kant, A.J.; Joseph, M.H.; Rizzo, A. “Skip” Effectiveness of Virtual Reality for Pediatric Pain Distraction during IV Placement. CyberPsychology Behav. 2006, 9, 207–212. [Google Scholar] [CrossRef]
  43. Trost, Z.; France, C.; Anam, M.; Shum, C. Virtual Reality Approaches to Pain: Toward a State of the Science. Pain 2021, 162, 325. [Google Scholar] [CrossRef]
  44. HP Developers Accessing Sensor Data. Available online: https://developers.hp.com/omnicept/docs/unity/sensors (accessed on 7 October 2025).
  45. Lee, J.; Moon, N. Immersion Analysis Through Eye-Tracking and Audio in Virtual Reality. Comput. Mater. Contin. 2021, 69, 647–660. [Google Scholar] [CrossRef]
  46. King, C.; Lo, M.; Das, M.; Salvo, D. Assessment of Student Engagement in Virtual Reality Clinical Immersion Environments through Eye Tracking. In Proceedings of the 2024 ASEE PSW Conference, Las Vegas, NV, USA, 18–20 April 2024. [Google Scholar] [CrossRef]
  47. Jehl, N.M.; Hess, C.W.; Choate, E.S.; Nguyen, H.T.; Yang, Y.; Simons, L.E. Navigating Virtual Realities: Identifying Barriers and Facilitators to Implementing VR-Enhanced PT for Youth with Chronic Pain. J. Pediatr. Psychol. 2025, 50, 76–85. [Google Scholar] [CrossRef] [PubMed]
  48. Simons, L.E.; Hess, C.W.; Choate, E.S.; Van Orden, A.R.; Tremblay-McGaw, A.G.; Menendez, M.; Boothroyd, D.B.; Parvathinathan, G.; Griffin, A.; Caruso, T.J.; et al. Virtual Reality-Augmented Physiotherapy for Chronic Pain in Youth: Protocol for a Randomized Controlled Trial Enhanced with a Single-Case Experimental Design. JMIR Res. Protoc. 2022, 11, e40705. [Google Scholar] [CrossRef]
  49. Orgil, Z.; Karthic, A.; Bell, N.F.; Heisterberg, L.M.; Williams, S.E.; Ding, L.; Kashikar-Zuck, S.; King, C.D.; Olbrecht, V.A. Use of Biofeedback-Based Virtual Reality in Pediatric Perioperative and Postoperative Settings: Observational Study. JMIR Perioper. Med. 2024, 7, e48959. [Google Scholar] [CrossRef]
  50. Orgil, Z.; Karthic, A.; Bell, N.; Williams, S.E.; Ding, L.; Kashikar-Zuck, S.; King, C.D.; Olbrecht, V.A. Dataset Used to Refine a Treatment Protocol of a Biofeedback-Based Virtual Reality Intervention for Pain and Anxiety in Children and Adolescents Undergoing Surgery. Data Brief 2023, 49, 109331. [Google Scholar] [CrossRef] [PubMed]
  51. Recker, K.; Silliman, J.; Gifford, K.; Patel, P.; Santana, L.; Hildenbrand, A.K.; Palit, S.; Wasserman, R. Virtual Reality Respiratory Biofeedback in an Outpatient Pediatric Pain Rehabilitation Program: Mixed Methods Pilot Study. JMIR Rehabil. Assist. Technol. 2025, 12, e66352. [Google Scholar] [CrossRef] [PubMed]
  52. Won, A.S.; Bailey, J.; Bailenson, J.; Tataru, C.; Yoon, I.A.; Golianu, B. Immersive Virtual Reality for Pediatric Pain. Children 2017, 4, 52. [Google Scholar] [CrossRef]
  53. Elser, A.; Lange, M.; Kopkow, C.; Schäfer, A.G. Barriers and Facilitators to the Implementation of Virtual Reality Interventions for People with Chronic Pain: Scoping Review. JMIR XR Spat. Comput. 2024, 1, e53129. [Google Scholar] [CrossRef]
Figure 1. Screenshots from the vRS simulation in the following order: (A) school entrance; (B) desk in correct classroom; (C) school hallway with peers, lockers, and multiple classroom doors; (D) correct classroom with empty desks for students.
Figure 2. Cognitive Load by Event Type for Participant 3 (E5 to E11).
Figure 3. Cognitive Load by Event Type for Participant 8.
Figure 4. Cognitive Load by Event Type for Participant 5.
Figure 5. Combined Eye Gaze in Transitions vs. Tasks for Participant 8.
Table 1. Events in the vRS simulation.
Event Number | Event
Event 1 | Transition (Baseline to Tutorial)
Event 2 | Complete Task (Rotate)
Event 3 | Complete Task (Teleport)
Event 4 | Complete Task (Grab)
Event 5 * | Transition (Tutorial to Entrance)
Event 6 | Transition (Entrance to Hallway)
Event 7 | Complete Task (Find Locker 22)
Event 8 | Complete Task (Get Your Notebook)
Event 9 | Transition (Hallway to Classroom)
Event 10 | Complete Task (Find Your Seat)
Event 11 | Complete Task (Turn in Your Notebook)
Event 12 | Transition (Classroom to Finish)
* Events 5–12 are those subsequently examined to explore the feasibility of physiologic monitoring.
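The event structure above lends itself to a simple analysis pipeline: timestamped event markers segment the continuous Cognitive Load signal, and segments are then grouped into transitions versus tasks, as in the per-participant figures. The sketch below illustrates that segmentation under stated assumptions: the event windows and CL samples are hypothetical, and the Omnicept logging format and field names are not specified in this article, so the data layout here is purely illustrative.

```python
from statistics import mean

# Hypothetical event log: (event_number, start_s, end_s, kind), mirroring
# Table 1. Only Events 5-12 are analyzed, per the study's protocol; the
# timestamps are invented for illustration.
events = [
    (5, 0, 12, "transition"), (6, 12, 30, "transition"),
    (7, 30, 75, "task"), (8, 75, 110, "task"),
    (9, 110, 130, "transition"), (10, 130, 160, "task"),
    (11, 160, 190, "task"), (12, 190, 200, "transition"),
]

# Hypothetical Cognitive Load samples: (timestamp_s, cl_value in [0, 1]).
cl_samples = [(t, 0.4 + 0.002 * t) for t in range(0, 200)]

def mean_cl_by_kind(events, samples):
    """Pool CL samples falling inside each event window, grouped by kind,
    and return the mean CL per kind (transition vs. task)."""
    by_kind = {}
    for _, start, end, kind in events:
        vals = [cl for t, cl in samples if start <= t < end]
        by_kind.setdefault(kind, []).extend(vals)
    return {kind: mean(vals) for kind, vals in by_kind.items()}

print(mean_cl_by_kind(events, cl_samples))
```

The same windowing step would apply unchanged to the other streams (heart rate, pupillometry, eye gaze), which is what makes event-anchored comparisons like Figure 5 possible.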
Table 2. Child Presence Questionnaire.
Instructions: I am going to ask you some questions and I would like you to tell me how you felt while you were using the VR school experience.
Item # | Question
1 | Did you feel like you were in control of what happened in the VR?
2 | Did you feel like you were really there in the VR school experience?
3 | Were you interested in what you saw?
4 | Did the way things moved look real?
5 | Were you interested in what happened in the VR?
6 | Did you get used to being in the VR quickly?
7 | Did you feel like you were in a different place?
8 | Did the things you heard sound real?
9 | Did it seem like the scenes in the VR were real?
10 | Was the VR school experience fun?
11 | Did you feel like you were a student in the VR school setting?
12 | Did the things you saw look real?
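Scoring a questionnaire like the CPQ amounts to summing item ratings. A minimal sketch follows; note that the response scale (assumed 0–4 here) and the mapping of items to subscales such as Transportation are not specified above, so this computes only an illustrative total score.

```python
# Hypothetical CPQ responses keyed by item number (1-12, as in Table 2),
# on an assumed 0-4 Likert scale; the actual response format is not
# specified in this article.
responses = {i: 3 for i in range(1, 13)}

def cpq_total(responses):
    """Sum all 12 item ratings into a total presence score."""
    assert set(responses) == set(range(1, 13)), "expected items 1-12"
    return sum(responses.values())

print(cpq_total(responses))  # 36 for uniform ratings of 3
```

Subscale scores (e.g., Transportation, as correlated with mean CL in this study) would be computed the same way over the relevant subset of items once that mapping is available.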
Share and Cite

Marwah, H.; Moldovanu, S.R.; Reks, T.; Anthony, B.; Logan, D.E. Integrating Physiologic Assessment into Virtual Reality-Based Pediatric Pain Intervention: A Feasibility Study. Virtual Worlds 2025, 4, 47. https://doi.org/10.3390/virtualworlds4040047
