Article

Validating a Wearable VR Headset for Postural Sway: Comparison with Force Plate COP Across Standardized Sensorimotor Tests

1 Athlete Engineering Institute, Mississippi State University, Starkville, MS 39759, USA
2 Industrial and Systems Engineering, Mississippi State University, Mississippi State, MS 39762, USA
3 Athletics, Mississippi State University, Mississippi State, MS 39762, USA
4 Center for Advanced Vehicular Systems, Starkville, MS 39759, USA
5 Kinesiology, Mississippi State University, Mississippi State, MS 39762, USA
6 Exercise Science, Belmont University, Nashville, TN 37212, USA
7 Clinical and Translational Research Institute, Northeast Ohio Medical University, Rootstown, OH 44272, USA
8 School of Health Related Professions, University of Mississippi Medical Center, Jackson, MS 39216, USA
9 Office of Research and Economic Development, Mississippi State University, Starkville, MS 39759, USA
10 Office of the Provost, Texas Christian University, Fort Worth, TX 76109, USA
* Authors to whom correspondence should be addressed.
Electronics 2025, 14(21), 4156; https://doi.org/10.3390/electronics14214156
Submission received: 16 August 2025 / Revised: 13 October 2025 / Accepted: 22 October 2025 / Published: 23 October 2025

Abstract

This study seeks to determine the efficacy of a novel, virtual reality (VR)-based sensorimotor assessment tool, VIST Neuro-ID, in comparison to the gold-standard force plate. This was achieved by computing common postural sway metrics and comparing these metrics across population groups, including sex and age (50–60 vs. 61–75 years). Sensorimotor assessments were conducted within the VIST Neuro-ID VR software while participants stood on a force plate. A proxy for center-of-pressure measurement was developed using the six-degree-of-freedom data collected from the head-mounted display used with the VR system. Moderate-to-high Pearson’s correlations (r = 0.542–0.906) were found between VIST Neuro-ID and the force plate for all eight postural sway metrics computed. Both systems detected significant differences (p < 0.05) across age groups for all metrics, except for two-dimensional path length from the force plate. Several significant differences were found across sexes, including AP and resultant sway velocity from the force plate, and resultant and AP root-mean-square from the HTC Vive Pro Eye. This indicates potential for VR to collect vital postural sway metrics needed for assessing patient function, while also highlighting its potential to identify balance patterns related to aging.

1. Introduction

The World Health Organization (WHO) estimates that 684,000 people die from falls each year worldwide, with adults older than 60 years of age suffering the greatest number of fatal falls of any age population [1]. Older adults are at a greater risk of falling due to progressive decline in physical and cognitive function, as well as increased susceptibility to chronic disease [2]. Fall risk within this population is typically multifactorial, encompassing both modifiable and non-modifiable characteristics. Among the modifiable factors, one of the most impactful is engagement in structured exercise programs designed to improve balance, muscular strength, and cardiovascular endurance [3].
Postural assessment is a core component of evaluating musculoskeletal and neurological health, particularly within physiotherapy. It is commonly used to identify imbalances, neuromuscular control deficits, or potential injury risks. Proper postural control enables individuals to maintain their center of mass (COM) within their base of support with minimal sway, which is essential for both static and dynamic activities [4]. This control relies on the integration of multiple systems, including the visual, vestibular, and somatosensory systems, as well as appropriate motor strategies. Dysfunction in any of these systems can lead to postural instability, increasing the risk of injury and contributing to musculoskeletal disorders such as back pain, joint degeneration, or repetitive strain injuries [5].
Various tools and techniques are used to measure and analyze postural control. Traditional visual observation and outcome measures, such as the Berg Balance Scale [6], remain widely used due to simplicity and clinical utility. However, more advanced technologies such as force plates, motion capture systems, and wearable sensors allow for precise, quantitative measurements of postural sway and balance. These methods are especially useful in early detection of balance impairments, fall risk assessments, and rehabilitation planning, making them valuable in clinical and research settings to evaluate various populations under different conditions [7]. Studies in this context support that good postural control is not only essential for purposeful movement but also a strong indicator of overall functional health [8]. Consequently, postural assessment plays an integral part in clinical practice in physical therapy and rehabilitation.
Frequently used clinical outcome measures rely on subjective scoring and are at risk of inter-rater variability, indicating a potential need for additional objective measures for fall risk [9,10]. One type of sensor-based measure is the analysis of postural sway, which quantifies the natural, continuous movement of the body’s COM within the base of support while standing. Postural sway is commonly measured using force plates, which detect subtle shifts in the center of pressure (COP) during static activities to provide quantitative data on a person’s neuromuscular control [11]. While minor fluctuations in posture are normal, excessive sway may indicate impaired postural control and balance dysfunction. The sensitivity of force plates makes them a preferred tool for detecting minor impairments that may not be visually observable [12].
Postural sway is impacted by cognitive demands. Studies have demonstrated that increased cognitive loads, such as performing memory tasks, can amplify sway in concussed individuals when attempting static activities. These findings support the integration of postural and cognitive testing in concussion evaluations, offering a more comprehensive understanding of injury severity and recovery trajectory [13]. Additionally, in cognitive testing frameworks, especially among populations at risk of cognitive decline, measuring changes in sway during cognitive challenges can enhance assessments and help detect subtle deficits that may not be evident in standard testing alone.
To expand upon the previously used measures of balance, clinicians and practitioners alike have begun to incorporate various sensing technologies into the assessment of balance, including inertial sensors, video-based motion tracking systems, pressure and force sensing platforms, and laser sensing [14]. Assessments with said technologies quantify fall risk through measurements of postural sway and center of pressure, with the previous literature suggesting that derivatives of these metrics are sensitive enough to be differentiated as a method of screening for those at risk of falling [15].
Although force and pressure plates are common in laboratory assessments, wearable devices with comparable sensing abilities could help individuals monitor when and where falls are most likely to occur [16]. Wearable technologies are currently being used to detect falls and reduce the response time for medical assistance; however, future advancements could focus on assessing an individual’s risk of falling and implementing proactive measures to prevent falls [17].
The goal of this investigation is to validate the usage of positional virtual reality (VR)-based tracking against force plates during a battery of sensorimotor tests. This will enable us to understand the feasibility of using this type of technology to measure postural sway to assess cognitive and musculoskeletal health in older populations. Hence, the conducted study evaluates two main research questions (RQs) as it relates to technology validity and sensing efficacy:
RQ1: 
How well does a VR-based motion tracking system correlate with gold-standard force plates when measuring key postural sway metrics?
RQ2: 
How well does each system detect significant differences in postural sway metrics across age groups in elderly populations?

2. Materials and Methods

2.1. Sample Size

The primary aim of this study was to validate head-mounted display (HMD)-derived postural sway metrics against gold-standard force plate COP metrics using correlation analyses. Figure 1 provides a summary diagram of the overall approach and findings of the presented work. An a priori power analysis was conducted to determine the minimum sample size needed to detect a moderate-to-large correlation. Based on Fisher’s z transformation, detecting a correlation of r = 0.60 with α = 0.05 and 80% power requires approximately 19 participants. Our sample of N = 21 therefore exceeded this threshold, ensuring sufficient power (≥0.80) to detect correlations of r ≥ 0.60.
Sensitivity analyses for RQ2 indicated that with group sizes of 9 vs. 12 (sex) and 10 vs. 11 (age), the minimally detectable standardized effect size at 80% power (α = 0.05, two-sided) was approximately Cohen’s d ≈ 1.2. Thus, RQ2 analyses were sufficiently powered to detect only large effects, while smaller differences should be interpreted cautiously.
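The Fisher’s-z sample-size calculation described above can be sketched in a few lines (a minimal illustration using only the standard library; the unrounded value is ≈19.3, which the text reports as approximately 19 participants, and the function name is ours):

```python
import math
from statistics import NormalDist

def required_n(r, alpha=0.05, power=0.80):
    """Minimum sample size to detect correlation r with a two-sided test,
    using Fisher's z transformation, rounded up to a whole participant."""
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)   # ~1.96 for alpha = 0.05
    z_beta = NormalDist().inv_cdf(power)            # ~0.84 for 80% power
    fisher_z = math.atanh(r)                        # variance-stabilizing transform of r
    return math.ceil(((z_alpha + z_beta) / fisher_z) ** 2 + 3)
```

Larger target correlations require fewer participants, since the Fisher-transformed effect grows with r.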

2.2. Participants

Twenty-one (21) participants (9 males: age, 53–75 years; height, 163.5–188 cm; mass, 81.7–115 kg; shoe size, 7.5–12.5 (US); and 12 females: age, 53–75 years; height, 159–175.5 cm; mass, 52.1–81.4 kg; shoe size, 6.5–10 (US)) with no self-reported recent history of lower-extremity musculoskeletal injuries, as well as no self-reported visual, vestibular, or neurological disorders, were tested. The study was approved for human subject testing by the University’s Institutional Review Board (IRB protocol #23-215). Informed consent was obtained from all participants, which outlined the purpose, procedures, risks and discomforts, benefits, confidentiality, and rights of the participant for the study.

2.3. Participant Screening and Preparation

After obtaining informed consent, participants completed the Physical Activity Readiness Questionnaire for Everyone (PAR-Q+) to confirm eligibility for physical activity. The first page of the PAR-Q+ consisted of general health questions for the participant to answer with either “YES” or “NO”. If the participant answered “NO” to all the questions, the participant was cleared to sign the participant declaration and continue to the next phase of the study. If the participant answered “YES” to any of the questions, the participant was asked to complete the last three pages of the PAR-Q+, which provided follow-up questions regarding medical conditions, before signing the participant declaration.
After the participant completed the pre-assessment paperwork, the participant was asked for their clothing size. A lab assistant retrieved a set of clothes consisting of a compression shirt and compression shorts. Following this, the research team collected the participants’ height and weight using an eye-level physician scale with a height rod (Detecto, Webb City, MO, USA). The study location was limited to researchers with card access, and doors were closed during data collection to maintain participant privacy.

2.4. Instrumentation

2.4.1. VR Headset and VIST Neuro-ID Platform

The VR system included the HTC Vive Pro Eye headset (HTC Corporation, Taoyuan City, Taiwan) running the Virtual Immersive Sensorimotor Test for Neurological Impairment Detection (VIST Neuro-ID) software suite on an Alienware Area-51m laptop (Dell, Round Rock, TX, USA). The headset features dual 1440 × 1600 AMOLED displays (2880 × 1600 combined), a 90 Hz refresh rate, and integrated Tobii eye tracking at 120 Hz. It also includes onboard inertial measurement units and external base stations for six degrees of freedom (6-DoF) head tracking. This setup enabled high-fidelity gaze and motion data acquisition during VR-based tasks.

2.4.2. VIST Neuro-ID Test Battery

The VIST Neuro-ID platform, developed by the University of Mississippi Medical Center (UMMC), includes eight interactive tests: smooth pursuits, saccades, binocular convergence, peripheral vision, object discrimination, gaze stability, head–eye coordination, and cervical neuromotor control. These tests simulate traditional neurological evaluations using visual stimuli and track real-time gaze, head movement, and oculomotor behavior. Each test produces 18–26 synchronized data streams at 90 Hz, including fixation error, saccadic latency, vestibulo-ocular reflex stability, convergence distance, and head repositioning accuracy [18]. For the purposes of this study, the peripheral vision, gaze stability, head–eye coordination, and cervical neuromotor control tests were excluded from analysis because the tests required head and/or arm movement. The research team decided that movement artifacts from the head and dominant arm could impact the quality of the sway metrics obtained. For the remaining four tests, participants were instructed to remain still throughout the duration of the assessment.

2.4.3. Kistler Force Plate Hardware and Signal Conditioning

A Kistler Type 9260AA (Kistler Group, Novi, MI, USA) multicomponent force plate was used while the participants completed the VR tasks. The platform uses embedded piezoelectric quartz sensors to measure forces (Fx, Fy, Fz) and moments (Mx, My, Mz), from which center of pressure (COP) is calculated. COP reflects postural sway and load distribution based on the point of vertical ground force application. Analog charge signals were processed via a built-in charge amplifier and routed to a signal conditioning unit. A Kistler 5233A control unit (Kistler Group, Novi, MI, USA) was configured to amplify by 7.5 millivolts per Newton (mV/N) in the Fx and Fy planes, and 3.8 mV/N in the Fz plane. Force plate data were collected during the VIST Neuro-ID assessment for integrated posturography and neurocognitive analysis.

2.5. Experimental Setup and Calibration

2.5.1. Force Plate Setup

The Kistler force plate was prepared using the MotionMonitor xGen software v4.06e (Innovative Sports Training, Chicago, IL, USA) located on the data collection desktop. To begin, the force plate was placed in a marked location in the center of the testing area and powered on via its control unit. The force plate was cleared of any objects and calibrated. If the force plate was moved between participants, a physical alignment procedure was conducted, in which a rigid stylus embedded with reflective markers was used to apply pressure at three locations on the force plate to define its location in the physical space of the environment. Reflective markers during this process were tracked using a 12-camera Vicon Bonita optical motion capture system (Vicon, Oxford, UK).

2.5.2. HTC Vive Setup

The HTC Vive Pro Eye headset and controller were prepared within the SteamVR application located on the Alienware laptop. Using the “Room Setup” and “Standing Only” option, the headset was held by a lab assistant waist-high while standing on the force plate to calibrate the origin. Once the calibration was complete, the headset was placed on the force plate and recorded as 0 cm from the ground. After the Room Setup was complete, the headset, the controller, and the two base stations were checked to ensure that they were all connected and streaming connection data in SteamVR v1.27. The headset and controller were both sanitized with a disinfecting wipe, and the VIST Neuro-ID, located on the same laptop, was then turned on. Figure 2 presents an illustration of a typical participant setup in the capture volume.

2.6. Familiarization

Prior to beginning the study, the participant was led through a familiarization process with the VR headset and VIST Neuro-ID assessment, allowing them to go through three tests selected at random by a lab assistant. For this process, the participant removed their shoes and stepped onto the force plate. The participant was then familiarized with the headset and controller, and a lab assistant helped the participant put the headset on by adjusting the headset and headset speakers for optimal comfort. The lab assistant placed the controller in the participant’s right hand and guided the participant to the trigger button on the controller to be used throughout the study.
Once the participant was comfortable and prepared for the familiarization tests, a lab assistant created a new patient in the VIST Neuro-ID software. Next, a calibration procedure within the software was initiated for the headset to adjust to the participant’s vision and ensure that placement was adequate for reliable HMD position and orientation tracking. There was also a knob located on the outside of the headset that allowed the participant to further adjust the focus of the headset if needed. After calibration was complete, three of the eight tests were selected at random to familiarize the participant with the study. Once the three tests were finished, the participant was assisted in removing the headset and asked to complete the Pre-Assessment Simulator Sickness Questionnaire (SSQ) [19]. Force plate data were not recorded during the familiarization tests, since the purpose was to ensure that the participant did not experience any motion sickness during testing.

2.7. Experimental Procedures

To conduct this study, 2–3 lab assistants were needed to record and save all data for each individual test, while also spotting the participant as needed. One assistant ran the VIST Neuro-ID software while another monitored the force plate data. A third assistant could be used to spot and guide the participant throughout the study. Between tests, participants were asked to record symptoms within VIST Neuro-ID using four virtual dials, turned with the handheld VR controller, labeled “Headache”, “Nausea”, “Dizziness”, and “Fogginess”. If the participant increased the value on one of the symptom dials, a lab assistant would serve as a spotter for subsequent tests. If the participant reported increasing symptoms for two tests in a row, they would be asked to remove the headset and rest for several minutes before continuing. However, the researchers did not encounter consecutively increasing symptoms with the participants recruited.
The participant was then assisted in putting on the VR headset and holding the controller while standing on the force plate. The participant was informed that the study would be like the familiarization process; however, since data would be recorded and saved in between tests, the participant would have to wait until prompted by the lab assistant before clicking “Ready” to begin the next test. This was to ensure that all data were saved and that the systems were ready to record the next test. The headset was then calibrated again. Before prompting the participant to begin the test, all lab assistants communicated non-verbally to ensure that all software was ready to record and that the systems (including base stations) were calibrated. Once ready, a lab assistant cleared the participant to begin when ready, and the participant used the controller to click the “Ready” button to begin the countdown for the test. On “3” of the countdown, the lab assistant running the MotionMonitor xGen software simultaneously started recording the data for the force plate.
All lab assistants observed the test until the participant completed the test and stopped the recordings for the force plate once the symptom assessment appeared. The participant was asked to record any simulation sickness experienced from that test and then wait to be cleared for the next test. As before, all lab assistants communicated once their systems were ready to record the next test before the participant was prompted to begin the next test. This procedure was repeated for all eight VIST tests to be conducted within the study, which were conducted in the same order. Several tests included subcomponents, in which the participant had already been instructed to wait for data recording to complete before moving on to the next subcomponent. If the lab assistant running VIST Neuro-ID saw any indicators that the participant misunderstood the directions for a test (e.g., followed an object with their head instead of just eyes during smooth pursuits), then the research team would pause recording, ensure that the participant understood test instructions, and re-do the recording.
Table 1 includes a summary of the tests that were conducted, including a description of the test procedures. Once all eight tests were complete, the participant was assisted in removing the headset and returned to the changing area to change back into their personal clothes. The participant was then asked to complete the Post-SSQ, which reiterated the questions from the Pre-SSQ.

2.8. Data Preprocessing

Regarding the data preprocessing steps in Section 2.8.1 and Section 2.8.2, generative artificial intelligence techniques were used to support the development of Python scripts that implemented data cleaning, alignment, visualization, and statistical analyses. All scripts and results produced were reviewed by a member of the research team with previous data-processing experience, and individual trial results were visualized for manual inspection to ensure proper implementation of Python 3 processing and analysis.

2.8.1. Data Transformation, Filtering, and Alignment

Force plate COP data were exported from the MotionMonitor xGen in .txt file format at 1000 Hz in medio-lateral (ML) and anterior–posterior (AP) directions in the X- and Y-planar directions, respectively, as defined relative to the system’s world origin definition. The VIST Neuro-ID system automatically generated files in .csv format for each trial upon completion of the respective test. Only the positional and rotational data of the HMD collected during the VIST Neuro-ID tests were used for analysis. If a participant had to re-do a trial due to an error, only the most recent version of the trial was used for analysis. All data cleaning, transformation, filtering, alignment, and trimming procedures were completed using a suite of Python scripts developed in a conda environment (Python 3.12.4; conda 24.5.0).
The HMD originally reported ML movement in the X-plane and AP movement in the Z-plane, while the Y-plane captured vertical movement. To reliably compare metrics derived from both systems, a proxy for center of pressure was developed using the 6-DoF HMD data. HMD head position data were expressed in a head-stabilized frame by rotating the world X/Z-planes by negative yaw (HMD rotation in the Y-plane) to minimize mixing of AP and ML data that could have been induced by subtle head rotation of the participant during data collection. While the current model of the proxy does not mitigate cervical flexion/extension in the vertical plane, contamination from this movement was minimized by the selected tasks, in which participants were instructed to face straight ahead in a quiet stance. Computation of these COP proxies is expressed in Equations (1) and (2), where ψ is yaw in radians, measured from HMD rotation in the Y-plane.
HMD_ML = cos(ψ)·x − sin(ψ)·z,  (1)
HMD_AP = sin(ψ)·x + cos(ψ)·z,  (2)
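Equations (1) and (2) amount to a planar rotation of the world-frame head position by the negative yaw angle. A minimal sketch (function and argument names are ours, not from the VIST Neuro-ID codebase):

```python
import numpy as np

def hmd_cop_proxy(x, z, yaw):
    """Rotate world-frame HMD X/Z position by -yaw (Equations (1) and (2))
    to express sway in a head-stabilized ML/AP frame; yaw is in radians."""
    x, z, yaw = (np.asarray(a, dtype=float) for a in (x, z, yaw))
    hmd_ml = np.cos(yaw) * x - np.sin(yaw) * z   # Eq. (1)
    hmd_ap = np.sin(yaw) * x + np.cos(yaw) * z   # Eq. (2)
    return hmd_ml, hmd_ap
```

With zero yaw the proxy reduces to the raw X/Z position; a 90° yaw maps world Z onto ML and world X onto AP, as expected for a head turned a quarter-circle.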
HMD data were cleaned to remove errant data recording artifacts, which, on rare occasion, included a single row of data set to “0” or the timestamp column being reset to 1 ms. This was likely due to a minor issue in the software’s approach to recording timestamps but did not have a meaningful effect on the data once addressed. Rows where all values defaulted to 0 were removed, and when a non-monotonic decrease was detected (i.e., the timestamp value jumped backward), the last valid timestamp was added as an offset to that sample and all subsequent samples.
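The two cleaning rules above (drop all-zero rows, repair timestamp resets) can be sketched as follows (a simplified illustration; names are ours):

```python
import numpy as np

def clean_hmd_rows(rows, timestamps):
    """Drop rows where every value defaulted to 0, then repair timestamp
    resets by carrying the last valid timestamp forward as an offset."""
    rows = np.asarray(rows, dtype=float)
    t = np.asarray(timestamps, dtype=float)
    keep = ~np.all(rows == 0, axis=1)        # remove errant all-zero rows
    t = t[keep]
    fixed, offset = [], 0.0
    for v in t:
        adj = v + offset
        if fixed and adj <= fixed[-1]:       # non-monotonic reset (e.g., back to 1 ms)
            offset = fixed[-1]               # carry last valid timestamp forward
            adj = v + offset
        fixed.append(adj)
    return rows[keep], np.array(fixed)
```

For example, a timestamp column of [1, 2, 3, 1, 2] is repaired to the monotonic [1, 2, 3, 4, 5].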
Both systems originally recorded AP/ML movement in meters, but these measurements were converted to millimeters for analysis, to better represent the subtle movements detected when quantifying postural sway. To smooth the data and minimize potential noise introduced by cervical flexion, a zero-phase Butterworth 4th-order low-pass filter at 5 Hz was applied to the AP/ML COP data derived from the VR HMD, and a zero-phase Butterworth 4th-order low-pass filter at 10 Hz was applied to the raw force plate COP data. Trial segmentation of the HMD data was conducted for smooth pursuits (VIST 1a, 1b) and saccades (VIST 2a, 2b, 2c) by detecting large jumps in the timestamp data. This was a known and consistent behavior in how VIST Neuro-ID stores data, due to the system not collecting any data while instructions for subtests are being given, despite the timestamp continuing to increase.
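The zero-phase filtering step can be sketched with SciPy as follows (a minimal illustration; whether "4th-order" refers to the design order before or after the forward-backward pass is not stated in the text, so the design order is assumed here):

```python
import numpy as np
from scipy.signal import butter, filtfilt

def lowpass_zero_phase(signal, cutoff_hz, fs_hz, order=4):
    """Zero-phase Butterworth low-pass: filtfilt runs the filter forward
    and backward, cancelling phase lag (at the cost of doubled rolloff)."""
    b, a = butter(order, cutoff_hz / (fs_hz / 2.0), btype="low")
    return filtfilt(b, a, signal)

# e.g., 5 Hz cutoff for 90 Hz HMD data; 10 Hz cutoff for 1000 Hz force plate data
```

Zero-phase filtering matters here because a one-pass filter would delay the COP trace relative to the raw signal and bias the later cross-system alignment.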
Because there was no direct hardware-based pulse sent to both systems for synchronization, and timing of the start and end of recordings relied on coordination by the research team, a robust cross-correlation methodology was developed to improve the quality of the time alignment and comparisons made between the two systems. For each of the seven trials conducted (Tests 1a, 1b, 2a, 2b, 2c, 3, and 5), a window of data the same length as the VR trial was swept across ±2 s of the final Kistler timestamp. Each window was scored by the correlation between VR and Kistler resultant sway velocity (the resultant of the ML/AP directions, differentiated once). To mitigate any additional noise that could impact alignment, the first and last 0.5 s of each window were ignored, and spikes in the data were detected and mitigated through a median absolute deviation (MAD) rule on the first difference. Thresholds were set to 6 × MAD, with ±0.10 s of padding before detecting the next potential spike. Additionally, the signals were winsorized using Z-scores (median/MAD) clipped at ±3 before computing Pearson’s correlation. Finally, after the window was selected, a sweep of 0.75 s of relative lag was used to choose the lag that maximized the score. The lag was applied by shifting values, padding with not-a-number (NaN) values. This final window selection essentially trimmed the force plate data to a time length comparable to the VIST Neuro-ID tests. A utility plotting tool was developed so that the research team could visually inspect COP data from both systems in the ML and AP directions and verify that data alignment worked correctly. Figure 3 provides an example of this visual inspection for one participant trial, while Figure 4 presents a Cartesian view of the overall COP traces for visual comparison across systems.
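The core window-selection idea can be sketched as follows (greatly simplified: the MAD despiking, winsorizing, edge trimming, and sub-window lag sweep are omitted, and names are illustrative):

```python
import numpy as np

def best_window(vr_velocity, fp_velocity):
    """Slide a VR-length window across the force-plate velocity trace and
    return the start index and Pearson r of the best-correlated window."""
    n = len(vr_velocity)
    best_start, best_r = 0, -2.0
    for start in range(len(fp_velocity) - n + 1):
        r = np.corrcoef(vr_velocity, fp_velocity[start:start + n])[0, 1]
        if r > best_r:
            best_start, best_r = start, r
    return best_start, best_r
```

When the VR trace is a genuine sub-segment of the force-plate trace, the correlation peaks sharply at the true offset, which is what makes resultant sway velocity a usable synchronization signal in the absence of a hardware pulse.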
Prior to metric computation and comparison for statistical analysis, 0.20 s was initially removed from both ends of the dataset to remove data spikes that result from filtering and trimming data segments. In cases where spikes lasted longer, the first and last 0.8 s were scanned for large derivative spikes, similar to the MAD approach previously mentioned, and trimmed inward until 0.15 s of a stable signal was observed. Finally, to provide adequate comparison across systems, both time series datasets were resampled to a common grid at 100 Hz using linear interpolation of values. All parameters used for data filtering and trimming were fixed a priori for this analysis. These were selected with the intention of reducing brief start and stop artifacts, while still preserving integrity of the detected sway movement.
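Resampling both series onto the common 100 Hz grid by linear interpolation can be sketched as follows (a minimal illustration; names are ours):

```python
import numpy as np

def resample_common_grid(t, values, fs_out=100.0):
    """Linearly interpolate a time series onto a uniform grid at fs_out Hz
    spanning the original recording, so both systems share one time base."""
    t = np.asarray(t, dtype=float)
    values = np.asarray(values, dtype=float)
    t_grid = np.arange(t[0], t[-1], 1.0 / fs_out)  # uniform 10 ms spacing
    return t_grid, np.interp(t_grid, t, values)
```

Applying this to both the 90 Hz HMD trace and the 1000 Hz force-plate trace yields sample-aligned series of equal length, which the metric comparisons below require.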

2.8.2. Postural Sway Metric Computation

In total, eight metrics were computed per system and per trial segment. These metrics were determined based on what is commonly reported in the literature related to postural sway, particularly as they relate to aging [20]. For metrics derived from COP positional data, values are mean-centered within the respective segment. Table 2 summarizes these metrics and their respective formulas that were computed in the Python 3 processing script.
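The eight metrics can be computed from mean-centered ML/AP traces as sketched below (an illustration, not the study's script; the 95% ellipse uses the chi-square convention with critical value 5.991, which is one of several conventions in the literature, so treat that constant as an assumption):

```python
import numpy as np

def sway_metrics(ml, ap, fs=100.0):
    """Eight common postural sway metrics from ML/AP COP traces (mm),
    mean-centered within the segment as described in the text."""
    ml = np.asarray(ml, dtype=float)
    ap = np.asarray(ap, dtype=float)
    ml = ml - ml.mean()
    ap = ap - ap.mean()
    duration = len(ml) / fs                          # segment length in seconds
    step = np.hypot(np.diff(ml), np.diff(ap))        # per-sample 2D step lengths
    return {
        "rms_ml": np.sqrt(np.mean(ml ** 2)),
        "rms_ap": np.sqrt(np.mean(ap ** 2)),
        "rms_resultant": np.sqrt(np.mean(ml ** 2 + ap ** 2)),
        "path_length_2d": step.sum(),
        "velocity_ml": np.abs(np.diff(ml)).sum() / duration,
        "velocity_ap": np.abs(np.diff(ap)).sum() / duration,
        "velocity_resultant": step.sum() / duration,
        "ellipse_area_95": 5.991 * np.pi * np.sqrt(np.linalg.det(np.cov(ml, ap))),
    }
```

A unit circle traced once gives the expected sanity checks: resultant RMS of 1, ML RMS of √0.5, and a 2D path length of about 2π.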

2.9. Statistical Analysis

For addressing cross-system correlation in RQ1, Pearson’s r and Spearman’s ρ correlation values were computed for each of the eight postural sway metrics across all 147 trials (21 participants × 7 trials) to determine linear association. Line plots demonstrating these correlations for each metric can be found in Appendix A.1. Regarding RQ2, participants were split into age groups of 50–60 years old (N = 10) and 61–75 years old (N = 11), as well as sex groups of male (N = 9) vs. female (N = 12). Appendix A.2 and Appendix A.3 include bar plots illustrating these group comparisons. Trial-level metrics were merged based on these groupings, and a two-sided Welch’s t-test was applied, which does not assume equal variances or sample sizes. All statistical analyses were implemented within the Python 3 scripts that were developed. All tests were conducted with α = 0.05.
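The per-metric analysis pipeline can be sketched with SciPy (an illustration; the function name and return structure are ours):

```python
import numpy as np
from scipy import stats

def analyze_metric(vr_values, fp_values, group_a, group_b, alpha=0.05):
    """RQ1: cross-system Pearson r and Spearman rho for one sway metric;
    RQ2: two-sided Welch's t-test (equal_var=False) between two groups."""
    r, _ = stats.pearsonr(vr_values, fp_values)
    rho, _ = stats.spearmanr(vr_values, fp_values)
    t_stat, p = stats.ttest_ind(group_a, group_b, equal_var=False)
    return {"pearson_r": r, "spearman_rho": rho,
            "welch_t": t_stat, "welch_p": p, "significant": p < alpha}
```

Passing `equal_var=False` to `scipy.stats.ttest_ind` is what selects Welch's test rather than Student's, matching the unequal group sizes described above.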

3. Results

For each of the postural sway metrics computed using the Kistler force plate and VIST Neuro-ID, Pearson correlation values were computed to determine the linear relationship between the two systems’ metric computations. Table 3 outlines these values accordingly. Of the metrics compared, 2D path length showed very high correlation; RMS ML, RMS AP, RMS resultant, and 95% ellipse area showed high correlation; and ML, AP, and resultant sway velocity showed moderate correlation [23]. Despite most metric correlations decreasing slightly when examining Spearman’s ρ, the overall correlation strength classifications did not change, indicating consistent evidence of association between the two systems.
T-statistics and p-values were calculated using two-sided Welch’s t-tests to determine significant differences across sex and age groups in the postural sway metrics from both systems. The Kistler force plate COP metrics detected significant differences across age groups for all metrics except path length. For the HTC Vive Pro Eye, the COP-derived metrics were significantly different across age groups for all postural sway metrics. Regarding differences in postural sway across sexes, the Kistler force plate found significant differences only in AP and resultant sway velocity. In contrast, the HTC Vive Pro Eye produced significant differences only in the RMS AP and resultant metrics. A summary of the Welch’s t-test results, including their respective groupings, can be found in Table 4 and Table 5.

4. Discussion

4.1. System Correlation (RQ1)

The correlation analysis between the HTC Vive Pro Eye and the force plate across all trials demonstrated strong correlations for most spatial sway metrics, with Pearson’s r correlation values ranging from 0.852 to 0.906 for RMS displacement (ML, AP, and resultant) and path length. These values indicate a high degree of linear association, suggesting that the HTC Vive Pro Eye can capture postural sway characteristics in a manner comparable to the gold-standard force plate. This is consistent with previous validation studies showing strong correlations between VR headsets and force plate COP measures [24,25]. Notably, path length achieved the highest correlation (r = 0.906), reinforcing findings that cumulative displacement is a robust measure less affected by sampling differences between systems [26]. The results also align with Rosiak et al.’s systematic review, which identified multiple studies demonstrating high agreement between immersive VR-based tracking and posturography platforms, particularly for displacement-based metrics [24].
In contrast, agreement for velocity-based metrics, such as sway velocity in the ML and AP directions, was more modest, with Pearson’s r values between 0.542 and 0.600. The reduced agreement in these measures may be attributable to differences in sampling rates, noise filtering, or the way the Vive system estimates head position from headset-mounted sensors compared to the force plate’s direct COP calculations. Similar limitations were observed in Wittstein et al., where VR-based sensory organization testing yielded weaker correlations for velocity-derived parameters compared to displacement measures [25]. Despite these differences, the resultant velocity and sway velocity metrics still demonstrated moderate correlations, suggesting potential utility for gross velocity estimates, albeit with caution for applications requiring high temporal precision.
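This sampling-rate sensitivity is easy to demonstrate with a toy example: computing the cumulative path of the same noisy trace at 1000 Hz and after naive decimation toward the HMD's 90 Hz rate yields very different totals, whereas displacement statistics such as RMS are far less affected. The trace below is illustrative, not study data:

```python
import numpy as np

rng = np.random.default_rng(2)
T = 20.0                      # trial duration (s)
fs_fp, fs_hmd = 1000, 90      # force plate and HMD sampling rates (Hz)

# Toy 1D sway trace: slow sinusoidal drift plus measurement noise, at 1000 Hz
t = np.arange(0, T, 1 / fs_fp)
x = 5.0 * np.sin(2 * np.pi * 0.3 * t) + rng.normal(0.0, 0.05, t.size)  # mm

def path_length(sig):
    """Cumulative 1D path length: sum of absolute sample-to-sample steps."""
    return np.sum(np.abs(np.diff(sig)))

# Naive decimation toward the HMD rate (every 11th sample, ~90.9 Hz)
x_low = x[:: fs_fp // fs_hmd]

# The 1000 Hz trace accumulates many more noise-driven steps, inflating its
# path length (and hence mean velocity = path / T) relative to the slower rate.
print(path_length(x), path_length(x_low))
```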
The ellipse area metric also showed a high correlation (r = 0.863), supporting the Vive’s capability to capture overall sway magnitude. This finding is consistent with Craig et al., who demonstrated that VR systems could reliably detect changes in postural sway magnitude across different stance and visual conditions when benchmarked against a motion capture system and COP data [27]. Collectively, these results indicate that while the HTC Vive Pro Eye can serve as a viable tool for balance assessment, particularly for spatial sway metrics, care should be taken when interpreting velocity-based measures. Future work should explore calibration procedures or signal-processing approaches to enhance temporal metric accuracy, potentially improving agreement with the gold-standard force plate.

4.2. Population-Specific Comparison (RQ2)

The comparison between age groups revealed significant differences across all sway metrics for both measurement systems, except for 2D path length recorded via the force plate. Progressive decade-by-decade declines in postural stability, reflected in COP amplitude and velocity, have been reported previously [28,29], and the combination of behavioral and physiological changes accompanying the transition between age categories likely contributes to the present findings. Prior work examining postural sway during bilateral quiet standing under eyes-open and eyes-closed conditions suggests that reduced stability without visual input reflects greater reliance on vision and may indicate deficits in the vestibular or somatosensory systems [30]. This aligns with evidence that age-related declines in balance may be driven primarily by deterioration in these systems. Additionally, the increasing prevalence of sarcopenia in older adults suggests a musculoskeletal contribution to reduced postural stability [31].
Plane-specific analysis indicated significant group differences in path and velocity measures across the ML, AP, and resultant planes, with the largest differences occurring in the ML direction. Lateral stability has been identified as a key target for fall-prevention interventions [32], although other studies suggest that composite sway measures may be preferable when foot placement on the force plate is uncontrolled [7]. Further modeling of the relationship between directional sway metrics and fall risk has failed to identify significant independent effects [33]. Similarly, direction-specific metrics have shown limited sensitivity in single-leg-stance tests, which place greater emphasis on individual planes of balance [34]. In this context, the observed group differences in resultant measures may hold more practical relevance than the directional metrics alone.
In contrast to the clear age-related effects, sex differences in the present study were limited. With force plate assessment, males and females differed significantly in AP and resultant velocities, whereas HTC Vive Pro Eye headset measurements identified differences in resultant and AP sway RMS. Prior studies in adults have reported higher COP path lengths in males, attributed to greater moments of body inertia and differences in soleus muscle architecture [35,36]. The limited sex differences in the present findings may reflect a stronger influence of age than sex in determining sway outcomes. However, the lack of body-size normalization in the current analysis, particularly for height, represents a potential confounding factor [37].

5. Conclusions

Through observation and assessment of the metrics computed from HMD-derived COP, in comparison to gold-standard force plate-derived COP, the positional and rotational data collected during the VIST Neuro-ID assessment show promise for providing patient insights beyond the gaze measures used for neurocognitive function. The significant differences found with aging indicate potential for HMD 6-DoF data to serve as a secondary measure for assessing the severity of neurological and musculoskeletal conditions, alongside the eye tracking data already being collected. This further attests to the benefits of the multi-modal sensing included with the HTC Vive Pro Eye VR system.

5.1. Future Work

Future work could include investigating the development of machine learning algorithms with the data collected during VIST Neuro-ID test completion to detect meaningful and possibly predictive patterns indicative of early-onset fall risk or cognitive decline. This combination of motion and ocular data streams could lead to deeper insights into aging and neurocognitive function. Additionally, these types of metrics could be compared against other portable technologies such as wearables (e.g., pressure-sensing socks [38] or insoles [39]) and markerless motion capture (e.g., Theia3D [40]). The VR system provides a robust dataset with the potential for better accuracy than wearable systems, but it is currently tethered to a laptop or desktop computer. Markerless motion capture systems provide a larger amount of data while working within the capture volume defined by an array of cameras, and generally come at a higher cost than VR- and wearable-based systems. Wearable systems provide the freedom to collect data in many different environments but can be the most prone to data accuracy limitations. Further investigation is warranted to better inform practitioners of the trade-offs between these technologies regarding accuracy, cost, and ease of use.

5.2. Limitations

Given challenges with the development of the VIST Neuro-ID software, no readily accessible process existed for establishing high-precision synchronization between the VR and force plate data collection systems. Consequently, complex Python utility scripts had to be developed to compile a robust, aligned dataset that the research team could trust. Further investigation into data synchronization techniques is warranted. Additionally, our COP proxy for the HMD did not account for cervical flexion of the neck. We attempted to mitigate this by excluding tests where exaggerated neck movement occurred and applying a low-pass filter to reduce noise from subtle neck motion, but cervical flexion and roll were not explicitly removed.
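The low-pass filtering step mentioned above could be implemented along these lines; the Butterworth design, 5 Hz cutoff, and filter order are illustrative assumptions, not the exact parameters used in the study:

```python
import numpy as np
from scipy.signal import butter, filtfilt

def lowpass(signal, fs, cutoff_hz=5.0, order=4):
    """Zero-phase Butterworth low-pass filter (forward-backward pass)."""
    b, a = butter(order, cutoff_hz, btype="low", fs=fs)
    return filtfilt(b, a, signal)  # filtfilt avoids phase lag

# Example: suppress high-frequency jitter in a 90 Hz HMD position trace
fs = 90
t = np.arange(0, 10, 1 / fs)
base = np.sin(2 * np.pi * 0.4 * t)               # slow sway component
trace = base + 0.1 * np.sin(2 * np.pi * 20 * t)  # plus 20 Hz jitter
smooth = lowpass(trace, fs)
```

Because `filtfilt` runs the filter forward and backward, the smoothed trace stays time-aligned with the raw one, which matters when the HMD and force plate signals are compared sample by sample.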
For participant screening, we did not administer a validated cognitive screening outcome measure; therefore, undiagnosed mild cognitive impairment may be present in our sample and could influence gait and task performance. Future work will include a brief validated screen (e.g., Mini-Cog or Montreal Cognitive Assessment) [41,42] to confirm typical cognitive function and improve internal validity.

Author Contributions

Conceptualization, H.C., R.F.B.; methodology, H.C., R.F.B., D.S.; software, D.S.; validation, D.S.; formal analysis, D.S., E.W., J.L.W.; investigation, K.M., M.M., R.B.; resources, D.S., H.D., J.C.R., H.C.; data curation, D.S., K.M., M.M., R.B., E.W., H.D.; writing—original draft preparation, D.S., K.M., M.M., R.B.; writing—review and editing, D.S., H.D., H.C., J.C.R., J.L.W.; visualization, D.S.; supervision, D.S., H.C.; project administration, D.S., H.C., R.F.B., J.L.W.; funding acquisition, H.C., R.F.B. All authors have read and agreed to the published version of the manuscript.

Funding

Harish Chander is partially supported by the National Institute of General Medical Sciences of the National Institutes of Health under Award Number 5U54GM115428. The content is solely the responsibility of the authors and does not necessarily represent the official views of the National Institutes of Health. Project: Mississippi Center for Clinical and Translational Research, Grant # 5U54GM115428-06.

Data Availability Statement

The datasets presented in this article are not readily available because of limitations indicated by the sponsor. Requests to access the datasets should be directed to the corresponding author.

Acknowledgments

The research team acknowledges the support of the University of Mississippi Medical Center for providing the VIST Neuro-ID solution for testing and data collection. The authors would like to extend their acknowledgements to the Mississippi Center for Clinical and Translational Research (MCCTR) for the support to advance this research. During the implementation of data processing and analysis, the authors used ChatGPT 4o/5 for the purposes of supporting the data analysis and Python 3 development process. The authors have reviewed and edited the output and take full responsibility for the content of this publication.

Conflicts of Interest

Jennifer C. Reneker is an intellectual property holder with two patents pending, which are related to the VIST Neuro-ID technology. No other authors have financial conflicts of interest to disclose.

Abbreviations

The following abbreviations are used in this manuscript:
VR: Virtual reality
VIST Neuro-ID: Virtual Immersive Sensorimotor Test for Neurological Impairment Detection
ML: Medio-lateral
AP: Anterior–posterior
COP: Center of pressure
HTC: High Tech Computer Corporation
HMD: Head-mounted display

Appendix A

The following plots present individual trials as well as overall group comparisons for all metrics related to RQ1 and RQ2.

Appendix A.1. RQ1 Correlation Plots

This appendix presents scatterplots illustrating the relationships between postural sway metrics computed from the Kistler force plate and the VIVE (HMD) system. Each point represents a single trial (N = 147). The red solid line indicates the best-fit regression line describing the relationship between systems, while the black dashed line represents the line of equality (y = x). Colored points correspond to individual participants.
[Scatterplots for each postural sway metric appear here.]

Appendix A.2. RQ2 Bar Plots for Age Comparisons

[Bar plots comparing age groups for each metric appear here.]

Appendix A.3. RQ2 Bar Plots for Sex Comparisons

[Bar plots comparing sexes for each metric appear here.]

References

  1. World Health Organization. Falls [Fact Sheet]. Available online: https://www.who.int/news-room/fact-sheets/detail/falls (accessed on 14 August 2025).
  2. Lo, P.Y.; Su, B.L.; You, Y.L.; Yen, C.W.; Wang, S.T.; Guo, L.Y. Measuring the Reliability of Postural Sway Measurements for a Static Standing Task: The Effect of Age. Front. Physiol. 2022, 13, 850707. [Google Scholar] [CrossRef] [PubMed]
  3. Rubenstein, L.Z. Falls in older people: Epidemiology, risk factors and strategies for prevention. Age Ageing 2006, 35 (Suppl. S2), ii37–ii41. [Google Scholar] [CrossRef] [PubMed]
  4. Shumway-Cook, A.; Woollacott, M.H. Motor Control: Translating Research into Clinical Practice, 5th ed.; Wolters Kluwer: Waltham, MA, USA, 2017. [Google Scholar]
  5. Horak, F.B. Postural orientation and equilibrium: What do we need to know about neural control of balance to prevent falls? Age Ageing 2006, 35 (Suppl. S2), ii7–ii11. [Google Scholar] [CrossRef] [PubMed]
  6. La Porta, F.; Caselli, S.; Susassi, S.; Cavallini, P.; Tennant, A.; Franceschini, M. Is the berg balance scale an internally valid and reliable measure of balance across different etiologies in neurorehabilitation? A revisited rasch analysis study. Arch. Phys. Med. Rehabil. 2012, 93, 1209–1216. [Google Scholar] [CrossRef]
  7. Prieto, T.E.; Myklebust, J.B.; Hoffmann, R.G.; Lovett, E.G.; Myklebust, B.M. Measures of postural steadiness: Differences between healthy young and elderly adults. IEEE Trans. Biomed. Eng. 1996, 43, 956–966. [Google Scholar] [CrossRef]
  8. Santos, M.J.; Kanekar, N.; Aruin, A.S. The role of anticipatory postural adjustments in compensatory control of posture. Natl. Libr. Med. 2010, 20, 398–405. [Google Scholar] [CrossRef]
  9. Mikó, I.; Szerb, I.; Szerb, A.; Poor, G. Effectiveness of balance training programme in reducing the frequency of falling in established osteoporotic women: A randomized controlled trial. Clin. Rehabil. 2017, 31, 217–224. [Google Scholar] [CrossRef]
  10. Horak, F.B.; Wrisley, D.M.; Frank, J. The Balance Evaluation Systems Test (BESTest) to Differentiate Balance Deficits. 2009. Available online: www.ptjournal.org (accessed on 2 August 2025).
  11. Palm, H.-G.; Johannes, S.; Gerhard, A. The role and interaction of visual and auditory afferents in postural stability. Natl. Libr. Med. 2009, 30, 328–333. [Google Scholar] [CrossRef]
  12. Cho, K.; Lee, K.; Lee, B.; Lee, H.; Lee, W. Relationship between Postural Sway and Dynamic Balance in Stroke Patients. Natl. Libr. Med. 2014, 26, 1989–1992. [Google Scholar] [CrossRef]
  13. Quatman-Yates, C.C.; Lee, A.; Hugentobler, J.A.; Kurowski, B.G.; Myer, G.D.; Riley, M.A. Test-retest consistency of a postural sway assessment protocol for adolescent athletes measured with a force plate. Natl. Libr. Med. 2013, 8, 741. Available online: https://pubmed.ncbi.nlm.nih.gov/24377060/ (accessed on 2 August 2025).
  14. Sun, R.; Sosnoff, J.J. Novel Sensing Technology in Fall Risk Assessment in Older Adults: A Systematic Review; BioMed Central Ltd.: London, UK, 2018. [Google Scholar] [CrossRef]
  15. Quijoux, F.; Vienne-Jumeau, A.; Bertin-Hugault, F.; Zawieja, P.; Lefevre, M.; Vidal, P.P.; Ricard, D. Center of pressure displacement characteristics differentiate fall risk in older people: A systematic review with meta-analysis. Ageing Res. Rev. 2020, 62, 101117. [Google Scholar] [CrossRef]
  16. Chander, H.; Burch, R.F.; Talegaonkar, P.; Saucier, D.; Luczak, T.; Ball, J.E.; Turner, A.; Kodithuwakku Arachchige, S.N.K.; Carroll, W.; Smith, B.K.; et al. Wearable stretch sensors for human movement monitoring and fall detection in ergonomics. Int. J. Environ. Res. Public Health 2020, 17, 3554. [Google Scholar] [CrossRef] [PubMed]
  17. Li, Y.; Liu, P.; Fang, Y.; Wu, X.; Xie, Y.; Xu, Z.; Ren, H.; Jing, F. A Decade of Progress in Wearable Sensors for Fall Detection (2015–2024): A Network-Based Visualization Review. Sensors 2025, 25, 2205. [Google Scholar] [CrossRef] [PubMed]
  18. Reneker, J.C.; Pruett, W.A.; Babl, R.; Brown, M.; Daniels, J.; Pannell, W.C.; Shirley, H.L. Developmental methods and results for a novel virtual reality concussion detection system. Virtual Real 2025, 29, 72. [Google Scholar] [CrossRef]
  19. Kennedy, R.S.; Lane, N.E.; Berbaum, K.S.; Lilienthal, M.G. Simulator Sickness Questionnaire: An Enhanced Method for Quantifying Simulator Sickness. Int. J. Aviat. Psychol. 1993, 3, 203–220. [Google Scholar] [CrossRef]
  20. Quijoux, F.; Nicolaï, A.; Chairi, I.; Bargiotas, I.; Ricard, D.; Yelnik, A.; Oudre, L.; Bertin-Hugault, F.; Vidal, P.-P.; Vayatis, N.; et al. A review of center of pressure (COP) variables to quantify standing balance in elderly people: Algorithms and open-access code*. Am. Physiol. Soc. 2021, 9, e15067. [Google Scholar] [CrossRef]
  21. Ruhe, A.; Fejer, R.; Walker, B. The test-retest reliability of centre of pressure measures in bipedal static task conditions—A systematic review of the literature. Gait Posture 2010, 32, 436–445. [Google Scholar] [CrossRef]
  22. Qiu, H.; Xiong, S. Center-of-pressure based postural sway measures: Reliability and ability to distinguish between age, fear of falling and fall history. Int. J. Ind. Ergon. 2015, 47, 37–44. [Google Scholar] [CrossRef]
  23. Mukaka, M.M. A guide to appropriate use of Correlation coefficient in medical research. Malawi Med. J. 2012, 24, 69–71. Available online: https://pmc.ncbi.nlm.nih.gov/articles/PMC3576830/ (accessed on 14 August 2025).
  24. Rosiak, O.; Puzio, A.; Kaminska, D.; Zwolinski, G.; Jozefowicz-Korczynska, M. Virtual Reality-A Supplement to Posturography or a Novel Balance Assessment Tool? Sensors 2022, 22, 7904. [Google Scholar] [CrossRef]
  25. Wittstein, M.W.; Crider, A.; Mastrocola, S.; Gonzalez, M.G. Use of virtual reality to assess dynamic posturography and sensory organization: Instrument validation study. JMIR Serious Games 2020, 8, e19580. [Google Scholar] [CrossRef]
  26. Sylcott, B.; Lin, C.C.; Williams, K.; Hinderaker, M. Investigating the use of virtual reality headsets for postural control assessment: Instrument validation study. JMIR Rehabil. Assist. Technol. 2021, 8, e24950. [Google Scholar] [CrossRef] [PubMed]
  27. Craig, C.M.; Stafford, J.; Egorova, A.; McCabe, C.; Matthews, M. Can We Use the Oculus Quest VR Headset and Controllers to Reliably Assess Balance Stability? Diagnostics 2022, 12, 1409. [Google Scholar] [CrossRef] [PubMed]
  28. Choy, N.L.; Brauer, S.; Nitz, J. Changes in Postural Stability in Women Aged 20 to 80 Years. J. Gerontol. Ser. A Biol. Sci. Med. Sci. 2003, 58, M525–M530. [Google Scholar] [CrossRef] [PubMed]
  29. Riis, J.; Eika, F.; Blomkvist, A.W.; Rahbek, M.T.; Eikhof, K.D.; Hansen, M.D.; Søndergaard, M.; Ryg, J.; Andersen, S.; Jorgensen, M.G. Lifespan data on postural balance in multiple standing positions. Gait Posture 2020, 76, 68–73. [Google Scholar] [CrossRef]
  30. Hill, M.W.; Duncan, M.J.; Price, M.J. The emergence of age-related deterioration in dynamic, but not quiet standing balance abilities among healthy middle-aged adults. Exp. Gerontol. 2020, 140, 111076. [Google Scholar] [CrossRef]
  31. Santilli, V.; Bernetti, A.; Mangone, M.; Paoloni, M. Clinical definition of sarcopenia. Clin. Cases Miner. Bone Metab. 2014, 11, 177. [Google Scholar] [CrossRef]
  32. Maki, B.E.; Holliday, P.J.; Topper, A.K. A Prospective Study of Postural Balance and Risk of Falling in An Ambulatory and Independent Elderly Population. J. Gerontol. 1994, 49, M72–M84. [Google Scholar] [CrossRef]
  33. Mertes, G.; Baldewijns, G.; Dingenen, P.J.; Croonenborghs, T.; Vanrumste, B. Automatic fall risk estimation using the nintendo Wii Balance Board. In HEALTHINF 2015—8th International Conference on Health Informatics, Proceedings; Part of 8th International Joint Conference on Biomedical Engineering Systems and Technologies, BIOSTEC 2015; SciTePress: Setúbal, Portugal, 2015; pp. 75–81. [Google Scholar] [CrossRef]
  34. Kozinc, Ž.; Löfler, S.; Hofer, C.; Carraro, U.; Šarabon, N. Diagnostic Balance Tests for Assessing Risk of Falls and Distinguishing Older Adult Fallers and Non-Fallers: A Systematic Review with Meta-Analysis. Diagnostics 2020, 10, 667. [Google Scholar] [CrossRef]
  35. Kitabayashi, T.; Demura, S.; Yamaji, S.; Nakada, M.; Noda, M.; Imaoka, K. Gender Differences and Relationships between Physical Parameters on Evaluating the Center of Foot Pressure in Static Standing Posture. Equilib. Res. 2002, 61, 16–27. [Google Scholar] [CrossRef]
  36. Farenc, I.; Rougier, P.; Berger, L. The influence of gender and body characteristics on upright stance. Ann. Hum. Biol. 2003, 30, 279–294. [Google Scholar] [CrossRef]
  37. Bryant, E.C.; Trew, M.E.; Bruce, A.M.; Kuisma, R.M.E.; Smith, A.W. Gender differences in balance performance at the time of retirement. Clin. Biomech. 2005, 20, 330–335. [Google Scholar] [CrossRef] [PubMed]
  38. D’Addio, G.; Iuppariello, L.; Pagano, G.; Biancardi, A.; Lanzillo, B.; Pappone, N.; Cesarelli, M. New posturographic assessment by means of novel e-textile and wireless socks device. In Proceedings of the 2016 IEEE International Symposium on Medical Measurements and Applications, MeMeA 2016—Proceedings, Benevento, Italy, 15–18 May 2016. [Google Scholar] [CrossRef]
  39. Reinfelder, S.; Durlak, F.; Barth, J.; Klucken, J.; Eskofier, B.M. Wearable static posturography solution using a novel pressure sensor sole. In Proceedings of the 2014 36th Annual International Conference of the IEEE Engineering in Medicine and Biology Society, EMBC 2014, Chicago, IL, USA, 6 November 2014; pp. 2973–2976. [Google Scholar] [CrossRef]
  40. Federolf, P.; Kühne, M.; Schiel, K.; Reimeir, E.; Debertin, D.; Calisti, M.; Mohr, M. Validation of markerless (Theia3DTM) against marker-based (ViconTM) motion capture data of postural control movements analyzed through principal component analysis. J. Biomech. 2025, 189, 112831. [Google Scholar] [CrossRef] [PubMed]
  41. Nasreddine, Z.S.; Phillips, N.A.; Bédirian, V.; Charbonneau, S.; Whitehead, V.; Collin, I.; Cummings, J.L.; Chertkow, H. The Montreal Cognitive Assessment, MoCA: A brief screening tool for mild cognitive impairment. J. Am. Geriatr. Soc. 2005, 53, 695–699. [Google Scholar] [CrossRef] [PubMed]
  42. The Mini-Cog: A Cognitive ‘Vital Signs’ Measure for Dementia Screening in Multi-Lingual Elderly. Available online: https://psycnet.apa.org/record/2000-12514-005 (accessed on 1 October 2025).
Figure 1. Summary block diagram illustrating the comprehensive workflow of the present work to evaluate HMD VR-based COP measurements.
Figure 2. Experimental setup schematic. Side view (left) and front view (right) showing participant standing barefoot on Kistler 9260AA force plate while wearing HTC Vive Pro Eye HMD. Force plate recorded COP at 1000 Hz, and HMD pose data were collected at 90 Hz.
Figure 3. Sample visual of COP data collected from the Kistler force plate and HTC Vive Pro Eye systems. Researchers visually inspected key peaks to verify successful data alignment.
Figure 4. Sample visual of COP data collected from Kistler force plate and HTC Vive Pro Eye systems in Cartesian plane to illustrate overall top-down perspective of COP traces. The blue dot indicates the initial sample of the trial, and the orange x indicates the final sample of the trial.
Table 1. Summary of VIST Neuro-ID tests as described in previous work [18]. Tests that are included in data processing and analysis are indicated with an asterisk (*). Further details on the nature of these tests are described by Reneker et al. [18].
VIST Neuro-ID Tests | Part ID | Test Description
Smooth Pursuits | 1a * | An object begins left of midline, moving to the right and back in a sinusoidal pattern at three different frequencies, moving at a constant velocity.
 | 1b * | A central object is displayed for fixation, and a moving object comes in from one side and moves in the opposite direction, increasing velocity in a step-ramp pattern.
Saccades | 2a * | A blue object appears in the center of view, and, at random, a peripheral object appears to the left, right, above, or below the object. The participant must look in the direction of the object that appears.
 | 2b * | Similar to (2a), except the object is yellow, and the participant must look in the opposite direction of where the object appears.
 | 2c * | Functions as a combination of (2a) and (2b), where the participant must respond to a blue or yellow object according to the previous subcomponents.
Convergence | 3 * | An object starts in the center of view, appearing to be distant, while slowly moving towards the participant's eyes at a constant speed. This is repeated three times.
Peripheral Vision | 4 | A center object is displayed along with three objects along each side, all unique shapes and colors. The central object changes to match one of the other objects until a selection is made with the controller. The participant is instructed to keep their focus on the center throughout the test.
Object Discrimination | 5 * | Two T-shaped objects with different trunk lengths are displayed side-by-side, and the trunk lengths are then quickly concealed. Participants are instructed to keep their focus on the object with the longer trunk.
Gaze Stability | 6a | A 3D object is displayed in the center, and the participant is instructed to shake their head left and right while focusing on the object to the sound of a metronome at 180 beats per minute for ten repetitions.
 | 6b | Similar to (6a), except the participant shakes their head up and down.
Head–Eye Coordination | 7 | An object appears in a random order in four corners of the virtual environment, disappearing and re-appearing randomly in different corners. The participant is instructed to follow the object with both their eyes and head.
Cervical Neuromotor Control | 8 | A target is displayed with the head in midline, and a circle is drawn around the center as the participant keeps their focus on the center of the target. Once the circle completes, the screen turns dark and the participant is instructed to turn their head right, left, or into extension until they see a blue object. When returning to midline, the participant presses the button on the controller to reveal the target and repeat the process. This process is completed three times in each direction.
Table 2. COP metric computation summary table following procedures used in Python development, where x represents COP movement in ML direction, y represents COP movement in AP direction, and T represents total trial duration in seconds.
Metric Name | Computation/Formula
Root-mean-square (RMS) ML [7] | Root-mean-square of the mean-centered ML time series
RMS AP [7] | Root-mean-square of the mean-centered AP time series
RMS resultant [7] | Root-mean-square of the mean-centered 2D resultant vector
Sway velocity ML [21] | Σ|Δx| / T
Sway velocity AP [21] | Σ|Δy| / T
2D path length [7,22] | Σ√(Δx² + Δy²)
Resultant velocity [21,22] | 2D path length divided by total duration T
95% ellipse area (mm²) [7] | A95 = π × χ²(0.95, 2) × √(det Σ), where Σ is the covariance matrix of [ML, AP]
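Following the formulas in Table 2, the full metric set can be sketched in Python. This is an illustrative implementation under the stated definitions, not the authors' processing code; it assumes COP series in millimeters and centers them internally:

```python
import numpy as np
from scipy.stats import chi2

def sway_metrics(x, y, T):
    """Compute Table 2 postural sway metrics from COP series x (ML) and
    y (AP), in mm, over a trial of duration T seconds. Series are
    mean-centered internally. Illustrative sketch, not the study's code."""
    x = np.asarray(x, dtype=float) - np.mean(x)
    y = np.asarray(y, dtype=float) - np.mean(y)
    path_2d = np.sum(np.sqrt(np.diff(x) ** 2 + np.diff(y) ** 2))
    # 95% confidence ellipse area from the covariance of [ML, AP]
    cov = np.cov(np.vstack([x, y]))
    area_95 = np.pi * chi2.ppf(0.95, df=2) * np.sqrt(np.linalg.det(cov))
    return {
        "rms_ml": np.sqrt(np.mean(x ** 2)),
        "rms_ap": np.sqrt(np.mean(y ** 2)),
        "rms_resultant": np.sqrt(np.mean(x ** 2 + y ** 2)),
        "vel_ml": np.sum(np.abs(np.diff(x))) / T,     # Σ|Δx| / T
        "vel_ap": np.sum(np.abs(np.diff(y))) / T,     # Σ|Δy| / T
        "path_2d": path_2d,                           # Σ√(Δx² + Δy²)
        "vel_resultant": path_2d / T,
        "ellipse_area_95": area_95,
    }
```

As a sanity check, a unit circle traced once per second gives a resultant RMS of 1 mm and a 2D path length close to 2π mm.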
Table 3. Summary of correlation results for all postural sway metrics computed across VIST Neuro-ID and force plate data.
Postural Sway Metric | Pearson's r Correlation (Linear) | Spearman's ρ Correlation (Rank-Based) | Strength
RMS ML | 0.852 | 0.824 | High
RMS AP | 0.883 | 0.879 | High
RMS Resultant | 0.869 | 0.857 | High
Sway Velocity ML | 0.559 | 0.523 | Moderate
Sway Velocity AP | 0.542 | 0.521 | Moderate
2D Path Length | 0.906 | 0.920 | Very high
Resultant Velocity | 0.600 | 0.554 | Moderate
95% Ellipse Area | 0.863 | 0.855 | High
Table 4. Welch’s t-tests comparing sex groups (male vs. female) for each metric and system. Two-sided tests; α = 0.05. Symbols denote significance: p < 0.05 (*), p < 0.01 (**).
Comparison | System | Metric | t | p | Sig.
Sex (M vs. F) | Kistler | RMS ML | 0.027 | 0.979 |
 | Kistler | RMS AP | −1.912 | 0.058 |
 | Kistler | RMS Resultant | −1.697 | 0.092 |
 | Kistler | Sway Velocity ML | 0.992 | 0.323 |
 | Kistler | Sway Velocity AP | −3.495 | 0.001 | **
 | Kistler | 2D Path Length | −1.115 | 0.267 |
 | Kistler | Resultant Velocity | −2.652 | 0.009 | **
 | Kistler | 95% Ellipse Area | −0.392 | 0.696 |
 | HTC Vive | RMS ML | 0.558 | 0.578 |
 | HTC Vive | RMS AP | −2.022 | 0.046 | *
 | HTC Vive | RMS Resultant | −2.206 | 0.029 | *
 | HTC Vive | Sway Velocity ML | 0.293 | 0.770 |
 | HTC Vive | Sway Velocity AP | −0.704 | 0.483 |
 | HTC Vive | 2D Path Length | −0.104 | 0.918 |
 | HTC Vive | Resultant Velocity | −0.547 | 0.585 |
 | HTC Vive | 95% Ellipse Area | −0.815 | 0.417 |
Table 5. Welch’s t-tests comparing age groups (50–60 vs. 61–75 years) for each metric and system. Two-sided tests; α = 0.05. Symbols denote significance: p < 0.05 (*), p < 0.01 (**), and p < 0.001 (***).
Comparison | System | Metric | t | p | Sig.
Age Ranges (50–60 years old vs. 61–75 years old) | Kistler | RMS ML | −6.988 | <0.001 | ***
 | Kistler | RMS AP | −3.310 | 0.001 | **
 | Kistler | RMS Resultant | −3.694 | <0.001 | ***
 | Kistler | Sway Velocity ML | −4.316 | <0.001 | ***
 | Kistler | Sway Velocity AP | −2.788 | 0.006 | **
 | Kistler | 2D Path Length | −1.542 | 0.125 |
 | Kistler | Resultant Velocity | −3.616 | <0.001 | ***
 | Kistler | 95% Ellipse Area | −5.061 | <0.001 | ***
 | HTC Vive | RMS ML | −7.551 | <0.001 | ***
 | HTC Vive | RMS AP | −3.490 | 0.001 | **
 | HTC Vive | RMS Resultant | −3.465 | 0.001 | **
 | HTC Vive | Sway Velocity ML | −5.816 | <0.001 | ***
 | HTC Vive | Sway Velocity AP | −5.384 | <0.001 | ***
 | HTC Vive | 2D Path Length | −2.619 | 0.010 | *
 | HTC Vive | Resultant Velocity | −6.132 | <0.001 | ***
 | HTC Vive | 95% Ellipse Area | −5.300 | <0.001 | ***
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.

Share and Cite

MDPI and ACS Style

Saucier, D.; McDonald, K.; Mydlo, M.; Barber, R.; Wall, E.; Derby, H.; Reneker, J.C.; Chander, H.; Burch, R.F.; Weinstein, J.L. Validating a Wearable VR Headset for Postural Sway: Comparison with Force Plate COP Across Standardized Sensorimotor Tests. Electronics 2025, 14, 4156. https://doi.org/10.3390/electronics14214156


