Review

Smartphone-Based Gait Analysis with OpenCap: A Narrative Review

by Serena Cerfoglio 1,2,*, Jorge Lopes Storniolo 3, Edilson Fernando de Borba 4, Paolo Cavallari 3,5, Manuela Galli 1, Paolo Capodaglio 6,7,† and Veronica Cimolin 1,2,†
1 Department of Electronics, Information and Bioengineering, Politecnico di Milano, 20133 Milan, Italy
2 IRCCS Istituto Auxologico Italiano, San Giuseppe Hospital, 28824 Piancavallo, Italy
3 Laboratorio Sperimentale di Fisiopatologia Neuromotoria, IRCCS Istituto Auxologico Italiano, 20821 Meda, Italy
4 Department of Physical Education, Federal University of Paraná, Curitiba 81530-000, Brazil
5 Human Physiology Section of the Department of Pathophysiology and Transplantation, University of Milan, 20133 Milan, Italy
6 UOC Musculoskeletal and Metabolic Rehabilitation, IRCCS Istituto Auxologico Italiano, 20133 Milan, Italy
7 Department of Biomedical, Surgical and Dental Sciences, University of Milan, 20133 Milan, Italy
* Author to whom correspondence should be addressed.
† These authors contributed equally to this work.
Biomechanics 2025, 5(4), 88; https://doi.org/10.3390/biomechanics5040088
Submission received: 1 September 2025 / Revised: 30 October 2025 / Accepted: 31 October 2025 / Published: 3 November 2025
(This article belongs to the Section Gait and Posture Biomechanics)

Abstract

Background: Gait analysis plays a key role in detecting and monitoring neurological, musculoskeletal, and orthopedic impairments. While marker-based motion capture (MoCap) systems are the gold standard, their cost and complexity limit routine use. Recent advances in computer vision have enabled markerless smartphone-based approaches. OpenCap, an open-source platform for 3D motion analysis, offers a potentially accessible alternative. This review summarizes current evidence on its accuracy, limitations, and clinical applicability in gait assessment. Methods: A search was performed in major scientific databases to identify studies published from OpenCap’s release in 2023 to June 2025. Articles were included if they applied OpenCap to human gait and reported quantitative biomechanical outcomes. Both validation and applied studies were considered, and findings were synthesized qualitatively. Results: Nine studies were included. Validation research showed that OpenCap achieved generally acceptable accuracy for joint kinematics (RMSE 4–6°) in healthy gait, while increased errors were reported for pathological gait patterns. Applied studies confirmed feasibility in different clinical conditions, though trial-to-trial variability remained higher than with MoCap, and test–retest reliability was moderate, with minimal detectable changes often exceeding 5°, limiting sensitivity to subtle clinical differences. Conclusions: OpenCap is a promising, low-cost tool for gait screening, remote monitoring, and tele-rehabilitation. Its strengths lie in accessibility and feasibility outside laboratory settings, but limitations in multiplanar accuracy, pathological gait assessment, and kinetic estimation currently preclude its replacement of MoCap in advanced clinical applications. Further research should refine algorithms and standardize protocols to improve robustness and clinical utility.

1. Introduction

Walking is a key motor component of daily physical activity, and its integrity is closely linked to an individual’s functional independence, quality of life, and overall health status [1,2]. As the act of moving the body forward on foot, walking mechanically represents the most common gait pattern in human locomotion.
This gait depends on the complex interplay between the central nervous system’s control and the musculoskeletal system. Deviations from standard gait patterns are among the earliest and most sensitive signs of various medical conditions, including neurodegenerative disorders, musculoskeletal impairments, orthopaedic injuries, and age-related functional decline [3,4,5,6,7]. Therefore, gait assessment is essential for the early detection of impairments, enabling timely intervention [8]. It also serves as a reliable indicator for monitoring disease progression and the effectiveness of therapeutic strategies, ultimately supporting the preservation of mobility and independence [9,10].
In clinical and research settings, marker-based optoelectronic motion capture (MoCap) systems are considered the gold standard for gait analysis [11,12,13]. These complex, laboratory-based systems utilize multiple high-speed infrared cameras to track the position of reflective markers placed on specific anatomical landmarks, thereby enabling the reconstruction of joint movements in three-dimensional space. When coupled with force platforms and electromyography, MoCap systems enable the analysis of kinetic parameters and muscle activation patterns, allowing for the estimation of joint forces, muscle moments, and segmental power using inverse dynamics models [14,15]. Despite their high accuracy and reliability, these methods are limited by high costs, the need for specialized equipment and facilities, and the time required to adequately position cameras and reflective markers, in addition to the marker labelling and modelling process [16,17,18,19].
The growing demand for accessible tools for movement analysis has driven the development of alternative technologies for use beyond specialized laboratories [20,21]. Inertial measurement units (IMUs), which combine accelerometers, gyroscopes, and magnetometers, offer a portable and cost-effective solution for capturing linear and angular motion, making them particularly well-suited for gait analysis in real-world and ambulatory settings [22,23,24,25]. Likewise, consumer-grade RGB-depth cameras enable markerless, three-dimensional motion tracking, providing an affordable and user-friendly option for assessing movement in both research and clinical environments [26,27].
Recent advancements in computer vision and machine learning, particularly those driven by deep learning-based human pose estimation [28], have led to the development of a new generation of markerless motion capture systems [29,30,31]. Approaches such as OpenPose [32] and HRNet [33] use convolutional neural networks trained on large annotated datasets to extract 2D anatomical key points from standard RGB video recorded with smartphones, which can then be extended to 3D pose estimation through multi-view setups or additional reconstruction algorithms [34,35,36,37,38]. These systems enable real-time pose estimation without the need for markers or specialized hardware, facilitating non-invasive movement analysis across diverse environments. Among these emerging approaches, OpenCap [39] represents a significant advancement in this field.
OpenCap is an open-source platform designed to estimate 3D human movement dynamics in terms of both kinematics and kinetics, using only iOS smartphone video recordings. The platform includes several core modules (i.e., multi-view video capture, 2D pose estimation, 3D triangulation, inverse kinematics, and inverse dynamics), all accessible through a user-friendly web-based interface with automated back-end processing [39,40,41]. In practice, users record synchronized videos of a motor task, such as walking, using two or more iOS devices. Calibration of the cameras is achieved via a printed checkerboard, and video data is uploaded to the cloud-based processing server. OpenCap then employs state-of-the-art deep learning models to extract 2D anatomical key points frame by frame, which are subsequently triangulated into 3D landmarks [39,42].
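To make the triangulation step concrete, the following sketch (written with OpenCV and NumPy, not OpenCap’s own code) reconstructs a single 3D landmark from the pixel coordinates of one key point seen by two calibrated cameras; the intrinsic matrix, camera poses, and pixel values are hypothetical placeholders.

```python
# Illustrative sketch of multi-view triangulation (not OpenCap's implementation).
import numpy as np
import cv2

# Assumed shared intrinsics for both phones (placeholder focal length / principal point).
K = np.array([[800.0, 0.0, 320.0],
              [0.0, 800.0, 240.0],
              [0.0, 0.0, 1.0]])

# Camera 1 at the world origin; camera 2 translated 0.5 m along the x-axis.
P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = K @ np.hstack([np.eye(3), np.array([[-0.5], [0.0], [0.0]])])

# Pixel coordinates of the same anatomical key point in both views (2 x N arrays).
pts1 = np.array([[320.0], [240.0]])
pts2 = np.array([[120.0], [240.0]])

# Direct linear transform triangulation; the result is homogeneous (4 x N).
X_h = cv2.triangulatePoints(P1, P2, pts1, pts2)
landmark = (X_h[:3] / X_h[3]).ravel()
print("Reconstructed 3D landmark (m):", landmark)  # approximately [0, 0, 2]
```

In the actual platform this step is applied to every detected key point in every frame, yielding the 3D landmark trajectories that feed the downstream musculoskeletal modeling.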
The biomechanical analysis in OpenCap is based on a subject-specific musculoskeletal model derived from the OpenSim framework [43]. This model simulates skeletal joints and muscle dynamics to estimate joint angles, moments, and internal loads. The generic model is personalized through scaling based on the reconstructed 3D landmarks to improve anatomical accuracy. Inverse kinematics is then applied to fit joint angles to the observed movements, followed by inverse dynamics to calculate joint moments and ground reaction forces (GRFs) [44,45,46]. By integrating physics-based simulation, OpenCap provides detailed kinetic data typically obtained only with marker-based systems and force plates, enabling comprehensive biomechanical assessment outside of traditional laboratory environments [47].
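As a simplified illustration of the inverse dynamics principle, the quasi-static sketch below computes a net sagittal-plane ankle moment as the cross product of the lever arm from the joint centre to the centre of pressure with the GRF. This is not OpenCap’s implementation, which relies on full OpenSim musculoskeletal models and dynamic simulation; all numerical values are hypothetical.

```python
# Quasi-static, single-joint sketch of the inverse dynamics idea (segment inertia neglected).
import numpy as np

grf = np.array([20.0, 750.0])          # ground reaction force in N: [antero-posterior, vertical]
ankle_centre = np.array([0.10, 0.08])  # ankle joint centre in the lab frame (m), hypothetical
cop = np.array([0.22, 0.00])           # centre of pressure on the floor (m), hypothetical

r = cop - ankle_centre                 # lever arm from the joint centre to the CoP
# 2D cross product r x F gives the net moment about the medio-lateral axis.
ankle_moment = r[0] * grf[1] - r[1] * grf[0]
print(f"Quasi-static ankle moment: {ankle_moment:.1f} N·m")
```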
OpenCap’s potential spans a wide range of clinical and research applications. It offers new possibilities for longitudinal monitoring in patients with gait impairments [48], sport assessment [49,50,51,52,53], and large-scale data collection [39]. Nonetheless, several questions remain about its generalizability to highly pathological gaits, the impact of soft tissue artefacts in video-based capture, and its performance in non-standard environments [54]. Furthermore, its adoption in clinical practice depends on the validation of its outputs for specific diagnostic and therapeutic decisions, as well as the development of normative datasets for benchmarking. Moreover, the accuracy of its measurements must be rigorously compared against that of gold-standard marker-based motion capture systems to determine its validity and suitability for clinical and research use [55,56].
This narrative review aims to provide an overview of the current state of the art regarding the use of OpenCap for gait analysis. By analyzing the available literature, this review seeks to clarify the capabilities, limitations, and potential applications of the OpenCap platform, offering insights into its readiness for research and clinical integration in gait assessment.

2. Materials and Methods

The literature search was performed across several major electronic databases, including PubMed, Scopus, IEEE Xplore, and Google Scholar, covering publications from the release of OpenCap in 2023 up to June 2025. The search strategy combined relevant keywords using Boolean operators, using the following structure in title/abstract fields: (“OpenCap”) AND (“gait” OR “walking” OR “gait analysis”). Synonym expansion was not formally implemented, but potential variations were considered during manual screening.
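For illustration only, the snippet below shows how the reported title/abstract query could be submitted to PubMed through the NCBI E-utilities interface; it is not the authors’ retrieval script, and the date limits and result cap are assumptions based on the stated search window.

```python
# Illustrative PubMed query via NCBI E-utilities (not the authors' actual workflow).
import requests

query = ('("OpenCap"[Title/Abstract]) AND ("gait"[Title/Abstract] OR '
         '"walking"[Title/Abstract] OR "gait analysis"[Title/Abstract])')

resp = requests.get(
    "https://eutils.ncbi.nlm.nih.gov/entrez/eutils/esearch.fcgi",
    params={
        "db": "pubmed",
        "term": query,
        "mindate": "2023", "maxdate": "2025/06", "datetype": "pdat",  # assumed search window
        "retmax": 200,
        "retmode": "json",
    },
    timeout=30,
)
ids = resp.json()["esearchresult"]["idlist"]
print(f"{len(ids)} PubMed records retrieved")
```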
The document type was restricted to peer-reviewed full-length research articles written in English. Records not meeting these criteria (e.g., preprints, conference abstracts, theses, case studies, reviews, or non-peer-reviewed works) were not considered eligible. In addition to electronic database searches, manual screening of the reference lists of the retrieved documents was performed to identify further pertinent publications not captured by the automated search.
Studies were considered for eligibility if they employed OpenCap to assess gait in human participants and reported quantitative biomechanical outcomes. Both validation studies—benchmarking OpenCap against gold standard marker-based MoCap systems—and applied studies where OpenCap was a central methodological tool for motor assessment were considered. Studies were excluded if gait was not the primary outcome, if the main task differed from walking (e.g., running), or if MoCap was not used as the gold standard reference for validation.
Given the narrative nature of this review and the recent introduction of OpenCap, no formal meta-analysis was performed [57,58]. Instead, a qualitative synthesis was conducted, focusing on methodological approaches, technical capabilities of OpenCap, and its agreement with gold standard MoCap reference systems in the context of gait analysis.

3. Results

The database search yielded a total of 62 records, with no additional items identified through manual screening of reference lists. After the removal of 14 duplicates, 48 records remained for screening and eligibility assessment. Of these, 20 were excluded because the document type did not meet the inclusion criteria, and the full text of one work was not available. The remaining 27 articles were assessed at title, abstract, and full-text level, leading to the exclusion of 18 documents. Ultimately, 9 articles were included in the qualitative synthesis of this work. A PRISMA-like flow diagram summarizing the study selection process is presented in Figure 1.
To facilitate comparison across studies, Table 1 presents a structured summary of the research articles that investigated the application of OpenCap for gait analysis. Specifically, it reports the source, year, and country of each study, details regarding participant characteristics (i.e., sample size, age, sex, and condition), information on the validation set-up when available (i.e., MoCap set-up and marker placement), information on the configuration of OpenCap (i.e., type of devices and their placement), the type of gait task performed, and the purpose of the study—whether system validation, characterization of gait, or both. Although some studies also investigated tasks beyond gait, only their results related to gait were considered in this review, in line with its primary focus.
The included studies were examined in detail to identify both methodological consistencies and differences. The comparison focused on key aspects such as study aims, data collection procedures, participant characteristics and experimental protocols, the gait parameters analyzed, statistical approaches employed, and the key findings.

3.1. Study Objectives

Several studies aimed to validate OpenCap by comparing its accuracy with gold standard MoCap systems across different healthy and clinical populations, as well as across different movement conditions. The initial foundational study by Uhlrich et al. [39] aimed to present and validate OpenCap itself, by demonstrating its feasibility in estimating both 3D joint kinematics and kinetics from smartphone videos. Similarly, Horsak et al. [59] examined lower-limb kinematics in healthy individuals, also simulating pathological gait patterns, while Svetek et al. [62] assessed joint angles during walking and other functional tasks. Wang et al. [65] focused on sagittal joint kinematics in individuals with knee osteoarthritis and healthy controls, assessed during bi-directional walking. Similarly, Martiš et al. [64] performed a validation study in healthy adults while walking in both directions relative to the cameras.
Peng et al. [61] instead aimed to estimate lower-limb joint contact forces and GRFs through a smartphone-video-based musculoskeletal multibody dynamics workflow, using OpenCap for marker estimation but applying an inverse dynamics approach rather than a forward dynamics one. Conversely, the remaining studies focused on exploring OpenCap’s applicability in gait assessment rather than on direct validation, as they did not include a comparison of its output with that of a MoCap system. In particular, Min et al. [48] tested the system’s feasibility in patients with neurological conditions such as stroke, Parkinson’s disease, and cerebral palsy. Additionally, Horsak et al. conducted two complementary investigations: one focused on test–retest reliability and sensitivity to clothing-related variability across different sessions [63], while a secondary analysis of the same dataset as Horsak et al. [59] examined intra-session repeatability by analyzing inter-trial variability [60].

3.2. Set-Up, Data Collection and Processing

OpenCap was deployed using a standardized dual-camera setup involving iOS devices (e.g., iPhones or iPads). Devices were typically positioned at oblique angles (30° to 45°) relative to the subject’s direction of movement to facilitate optimal triangulation of 2D key points into 3D space. Camera placement varied from 1.5 to 3 m from the subject, with a common height of around 1.3 to 1.5 m on tripods. Most systems recorded at 60 Hz with a resolution of 720 × 1280 pixels, with only Wang et al. [65] recording at 120 Hz. Calibration was performed using a checkerboard of known dimensions (either 210 × 175 mm or 720 × 540 mm), placed visibly in the camera field according to OpenCap’s protocol proposed by Uhlrich et al. [39]. This process was consistently applied in all the reviewed studies.
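As an illustrative sketch of the checkerboard-based calibration step (not OpenCap’s internal routine), the code below locates the board corners in a single frame and solves for that camera’s pose relative to the board with OpenCV; the corner grid, square size, intrinsic matrix, and file name are hypothetical.

```python
# Hedged sketch of checkerboard-based extrinsic calibration for one camera.
import numpy as np
import cv2

pattern = (11, 8)   # inner-corner grid of the printed checkerboard (assumed)
square = 0.060      # square edge length in metres (assumed)

# 3D board coordinates of the inner corners, lying on the Z = 0 plane of the board.
obj_pts = np.zeros((pattern[0] * pattern[1], 3), np.float32)
obj_pts[:, :2] = np.mgrid[0:pattern[0], 0:pattern[1]].T.reshape(-1, 2) * square

frame = cv2.imread("calibration_frame_cam1.png")   # hypothetical file name
gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
found, corners = cv2.findChessboardCorners(gray, pattern)

if found:
    # Refine corner locations to sub-pixel accuracy.
    corners = cv2.cornerSubPix(
        gray, corners, (11, 11), (-1, -1),
        (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 30, 1e-3))
    # Placeholder intrinsics; in practice these depend on the device model.
    K = np.array([[1400.0, 0.0, 640.0], [0.0, 1400.0, 360.0], [0.0, 0.0, 1.0]])
    ok, rvec, tvec = cv2.solvePnP(obj_pts, corners, K, None)
    print("Camera pose w.r.t. board:", rvec.ravel(), tvec.ravel())
```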
According to its original design, OpenCap relies on a web-based interface to synchronize and manage data collection from multiple devices; the videos are then processed in the cloud to extract 2D key points, triangulate them in 3D, and estimate a full set of anatomical marker trajectories via Long Short-Term Memory (LSTM) networks. Subsequent inverse kinematics and muscle-driven dynamic simulations in OpenSim allow the estimation of joint-level kinematics and kinetics. In the reviewed studies, video-based pose estimation relied on algorithms such as HRNet and OpenPose, followed by this LSTM-based marker prediction and biomechanical modeling in OpenSim for joint-level kinematics.
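The marker-augmentation idea can be sketched conceptually as follows. The network below uses hypothetical dimensions and random inputs and is not OpenCap’s trained model; it only illustrates how an LSTM can map a time series of sparse video key points to a denser set of anatomical marker positions.

```python
# Conceptual sketch of LSTM-based marker augmentation (hypothetical dimensions).
import torch
import torch.nn as nn

class MarkerAugmenter(nn.Module):
    def __init__(self, n_keypoints=20, n_markers=43, hidden=128):
        super().__init__()
        self.lstm = nn.LSTM(input_size=n_keypoints * 3, hidden_size=hidden,
                            num_layers=2, batch_first=True)
        self.head = nn.Linear(hidden, n_markers * 3)

    def forward(self, keypoints):                # (batch, time, n_keypoints * 3)
        hidden_states, _ = self.lstm(keypoints)
        return self.head(hidden_states)          # (batch, time, n_markers * 3)

# One synthetic 2 s capture at 60 Hz for a single subject.
dummy = torch.randn(1, 120, 20 * 3)
markers = MarkerAugmenter()(dummy)
print(markers.shape)   # torch.Size([1, 120, 129])
```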
In validation studies, OpenCap outputs were compared with those of marker-based MoCap systems, mainly relying on multi-camera VICON setups (i.e., 8–11 cameras), to assess agreement in joint kinematics (Figure 2). Some studies also integrated force plates to capture ground reaction forces for benchmarking musculoskeletal inverse dynamics [39,61]. In contrast, studies by Min et al. [48] and Horsak et al. [63] utilized OpenCap exclusively, without concurrent MoCap.

3.3. Participants and Experimental Protocol

The reviewed studies involved different groups of participants and experimental protocols. First, Uhlrich et al. [39] recruited 10 healthy adults, who were instructed to walk at a self-selected speed under two conditions: the first corresponding to their physiological gait, and the second to a trunk-sway gait aimed at reducing the knee adduction moment. Walking trials were conducted along a flat laboratory walkway, with each participant completing multiple trials per condition, following standardized instructions to maintain consistent step length and cadence. Most of the subsequent studies were likewise conducted on healthy adults.
For instance, Horsak et al. [59] recruited 21 healthy volunteers and instructed them to perform both physiological gait and three simulated pathological patterns—crouch, circumduction, and equinus—under the supervision of a physiotherapist. The order of gait patterns was randomized. For each gait pattern, five left and five right foot contacts on force plates were recorded simultaneously by both OpenCap and MoCap. Similarly, Horsak et al. [63] involved 20 healthy adults in a test–retest design using only OpenCap, with two experimental sessions spaced approximately 26 days apart. Participants were asked to walk at a self-selected comfortable speed along an 8 m walkway. During the first session, tasks were performed twice, under two counterbalanced clothing conditions: regular street clothing and minimal clothing (barefoot in both). The second session included only the minimal clothing condition. Svetek et al. [62] included 20 collegiate ice hockey players in a protocol that included various dynamic tasks. Although the study investigated walking, running, squats, and jump-related movements, only the gait-related components were considered in the present review. Finally, Martiš et al. [64] recruited 10 participants to walk in both directions (i.e., towards and away from the cameras) along a 3 m walkway, with five to seven trials per direction. The camera setup followed the Timed Up and Go (TUG) test configuration, though the TUG itself was not performed.
Peng et al. [61] similarly involved a sample of healthy young adults who performed walking trials at self-selected comfortable speeds along a walkway with embedded force platforms. Multiple trials per condition were recorded, with each trial requiring consistent foot placement on the force plates for successful data capture. In addition to healthy participants, the remaining studies addressed clinical populations.
In particular, Wang et al. [65] conducted walking trials with a group of 30 healthy controls, as well as with 53 patients with radiographically confirmed knee osteoarthritis, in both directions—toward and away from the camera system. Similarly, Min et al. [48] addressed a neurological cohort by recruiting 10 patients with conditions such as stroke, Parkinson’s disease, and cerebral palsy, alongside 10 matched controls. Participants walked at a self-selected speed along a 4 m flat path, after achieving a calibrated static posture and connecting to the camera system. Each participant completed three walking trials, with the best performance selected for analysis.

3.4. Estimated Gait Parameters

The reviewed studies estimated a wide range of gait parameters, ranging from basic spatio-temporal parameters to joint-level kinematics. The selection of parameters and their relative focus varied depending on the study objectives, target populations, and experimental conditions.
Uhlrich et al. [39] focused on estimating lower-limb joint kinematics of the hip and knee, and the associated joint moments during gait, as well as spatio-temporal parameters such as step length and cadence. Spatio-temporal parameters were also reported as outcome measures in the study by Martiš et al. [64]. In particular, they reported step length, step width, and gait speed, both to compare the estimates of OpenCap and MoCap and to compare walking performance in the two testing directions (i.e., toward and away from the camera). Min et al. [48] and Wang et al. [65] reported spatio-temporal parameters primarily for descriptive purposes, rather than as primary outcome measures.
Joint-level kinematics represented the primary outcome of the gait analysis, serving both for system validation and gait characterization purposes. These typically included angular trajectories and peak angles (i.e., range of motion, ROM) at the hip, knee, ankle, and pelvis. All the angular waveforms were reported as a percentage of the gait cycle duration to enable cross-trial and cross-system comparisons. However, the specific joint planes analyzed varied among studies. For instance, Horsak et al. [59] conducted the most comprehensive lower-limb joint analysis, evaluating nine joint angles across three anatomical planes under different gait conditions; the same dataset was subsequently used to assess intra-session repeatability of the same joint-level kinematics, focusing on the inter-trial variability of angular waveforms within a single gait analysis session [60]. Similarly, Svetek et al. [62] reported peak flexion-extension angles for the hip, knee, and ankle, and additionally assessed hip abduction–adduction angles.
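The time-normalization step mentioned above can be sketched as a simple resampling of each joint-angle trajectory onto 101 points spanning 0–100% of the gait cycle; the waveform below is synthetic and the helper function is illustrative only.

```python
# Minimal sketch of time-normalizing one gait cycle to 0-100% (101 samples).
import numpy as np

def normalize_to_gait_cycle(angle, n_points=101):
    """Resample one gait cycle of a joint-angle trace onto 0..100% of the cycle."""
    original = np.linspace(0.0, 100.0, num=len(angle))
    target = np.linspace(0.0, 100.0, num=n_points)
    return np.interp(target, original, angle)

# Hypothetical knee-flexion trace captured at 60 Hz over a 1.1 s gait cycle.
t = np.linspace(0.0, 1.1, 66)
knee_flexion = 30 * np.sin(np.pi * t / 1.1) ** 2   # synthetic waveform (deg)
knee_norm = normalize_to_gait_cycle(knee_flexion)
print(knee_norm.shape)   # (101,) -> one value per percent of the gait cycle
```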
Regarding kinetic parameters, besides Uhlrich et al. [39], who estimated joint moments using force plate data, Min et al. [48] attempted to reconstruct joint kinetics, relying on a force-free inverse dynamics approach due to the absence of reference GRFs. In addition, Peng et al. [61] estimated not only external kinetic parameters such as GRFs but also internal kinetic quantities, specifically lower-limb joint contact forces. These joint contact forces represent the internal load transmitted between articular surfaces. They are considered part of gait kinetics, complementing joint moments and external reaction forces to provide a more complete understanding of lower-limb joint loading during walking [66,67].

3.5. Statistical Analysis Methods

Across the reviewed studies, statistical approaches varied depending on whether studies aimed to validate OpenCap against a gold standard system or to explore reliability, group differences, or the effects of different testing conditions.
In studies involving validation against a MoCap reference system, statistical analysis primarily focused on evaluating inter-system agreement. The most commonly used metric was the root mean square error (RMSE), calculated for joint kinematics across different walking conditions, trials, and participant groups. RMSE served as a primary indicator of the magnitude of discrepancy between the two systems, with values below 5° often interpreted as clinically acceptable [59,61,62,64,65]. In addition, Svetek et al. [62] complemented the RMSE analysis with Bland–Altman plots, including 95% limits of agreement (LoA), to visually and quantitatively assess the agreement between systems across various joints and tasks. The plots were aggregated across both left and right sides to enhance robustness.
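For clarity, the snippet below computes the two agreement metrics named above, waveform RMSE and Bland–Altman bias with 95% limits of agreement, on synthetic data; the error magnitudes are placeholders and do not reproduce any study’s results.

```python
# Illustrative computation of RMSE and Bland-Altman limits of agreement (synthetic data).
import numpy as np

rng = np.random.default_rng(0)

# RMSE between time-normalized waveforms from the two systems (deg).
mocap_wave = 30 * np.sin(np.linspace(0, np.pi, 101)) ** 2
opencap_wave = mocap_wave + rng.normal(0, 4, size=101)      # ~4 deg simulated error
rmse = np.sqrt(np.mean((opencap_wave - mocap_wave) ** 2))

# Bland-Altman on per-trial peak angles from both systems (deg).
peaks_mocap = rng.normal(60, 5, size=20)
peaks_opencap = peaks_mocap + rng.normal(1, 2, size=20)
diff = peaks_opencap - peaks_mocap
bias = diff.mean()
loa = (bias - 1.96 * diff.std(ddof=1), bias + 1.96 * diff.std(ddof=1))

print(f"RMSE = {rmse:.1f} deg, bias = {bias:.1f} deg, 95% LoA = ({loa[0]:.1f}, {loa[1]:.1f}) deg")
```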
Uhlrich et al. [39] similarly employed RMSE and mean absolute error (MAE) metrics to quantify discrepancies in kinematic and kinetic variables between OpenCap and MoCap, together with the results of the visual inspection of waveform overlays to support agreement interpretation. Intraclass correlation coefficients (ICCs) were also applied to evaluate consistency in joint angle measurements between systems [65] and interpreted using standard benchmarks, categorizing agreement from poor to excellent [68]. Statistical parametric mapping was employed by Horsak et al. [59] and Martiš et al. [64] to compare time-series kinematic waveforms. Depending on the normality of data, either paired t-tests or Wilcoxon signed-rank tests were used within the Statistical Parametric Mapping (SPM) framework, with Bonferroni correction applied to control for multiple comparisons.
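As a simplified stand-in for the SPM waveform comparison described above (the cited studies used the SPM framework itself, commonly implemented in dedicated packages such as spm1d), the sketch below runs pointwise paired t-tests along the normalized gait cycle with Bonferroni correction; all data are synthetic.

```python
# Simplified pointwise analogue of an SPM paired comparison along the gait cycle.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
n_subjects, n_points = 20, 101

mocap = rng.normal(30, 5, size=(n_subjects, n_points))
opencap = mocap + rng.normal(2, 3, size=(n_subjects, n_points))   # simulated offset

t_vals, p_vals = stats.ttest_rel(opencap, mocap, axis=0)
alpha_corrected = 0.05 / n_points                 # Bonferroni across cycle points
significant = p_vals < alpha_corrected
print(f"{significant.sum()} of {n_points} cycle points differ after correction")
```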
Repeated measures ANOVA was used in multiple studies to investigate the influence of factors such as walking condition [59], group and walking direction [64,65], and to test for interactions among these variables. Post hoc analyses with Bonferroni correction were performed when significant main effects were identified. Finally, Martiš et al. [64] also included Spearman correlation analyses to examine the association between spatio-temporal parameters and kinematics in terms of ROM derived from the two systems.
In studies where OpenCap was used to characterize gait without comparison to a marker-based system, statistical analyses focused on repeatability, variability, and group-level differences. Horsak et al. [63] assessed inter-session consistency and the impact of clothing using root mean square deviation (RMSD), standard error of measurement (SEM), and minimal detectable change (MDC). Waveform comparisons were conducted using statistical parametric mapping, with corrections for multiple comparisons. Reliability metrics included ICCs, Pearson correlations, and Bland–Altman plots, and outlier trials were excluded using a z-score threshold. Min et al. [48] applied Functional Data Analysis (FDA) to compare joint trajectories across the gait cycle between healthy individuals and patients with neurological disorders. Bootstrap resampling (1000 iterations) was used to compute 95% confidence intervals, and non-overlapping regions identified significant group differences. This approach enabled sensitive detection of gait alterations, even with a small sample size. Horsak et al. [60] further explored intra-session repeatability by applying the Gait Standard Deviation (GaitSD), which summarizes variability across all degrees of freedom, and the Gait Variable Standard Deviation (GVSD), which quantifies variability for each joint. Statistical comparisons included paired t-tests or Wilcoxon signed-rank tests with Sidak correction, a Friedman test for differences across gait patterns when data were non-parametric, and visualization through boxplots with outlier detection.
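The reliability indices mentioned above follow standard relations; the short example below computes SEM and MDC95 from an assumed ICC and between-subject SD (values chosen for illustration, not drawn from the reviewed studies), yielding magnitudes comparable to the roughly 2° SEM and 6° MDC reported later in this review.

```python
# Worked sketch of SEM and MDC95 using the standard relations
# SEM = SD * sqrt(1 - ICC) and MDC95 = 1.96 * sqrt(2) * SEM (assumed inputs).
import numpy as np

icc = 0.80          # assumed test-retest ICC for a joint-angle ROM
sd_between = 4.5    # assumed between-subject SD of that ROM (deg)

sem = sd_between * np.sqrt(1.0 - icc)
mdc95 = 1.96 * np.sqrt(2.0) * sem
print(f"SEM = {sem:.1f} deg, MDC95 = {mdc95:.1f} deg")
```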

3.6. Findings and Data Availability

Across validation studies, emerging evidence suggests that OpenCap holds promise as a markerless motion capture solution offering a more accessible and cost-effective approach to gait analysis.
The foundational study by Uhlrich et al. [39] demonstrated OpenCap’s ability to estimate lower-limb kinematics, kinetics, and spatio-temporal parameters. Reported mean absolute errors for joint angles during walking were around 4.5°, thus falling within clinically acceptable thresholds [11,69]. Errors for joint moments were below 1.5% bodyweight × height, while GRF errors remained under 7% of bodyweight. Spatio-temporal parameters also showed strong agreement, with step length errors of about 1.2 cm and cadence errors under 3 steps/min, resulting in gait speed deviations within 5% of the reference system. Moreover, OpenCap was able to detect controlled gait modifications, such as trunk sway used to reduce medial knee loading, thus highlighting its sensitivity to clinically relevant gait adaptations. Overall, this work positioned OpenCap as a methodological benchmark for subsequent validation studies. Building on this foundation, later research has focused more specifically on joint kinematics to quantify OpenCap’s accuracy across broader populations and movement patterns.
When focusing on joint kinematics, validation studies have generally reported moderate to good agreement between OpenCap and MoCap systems, with errors in terms of RMSE typically in the range of 4–6° during physiological gait. Although the overall agreement is encouraging, findings show some variability across studies and parameters, as displayed in the radar plots in Figure 3 and Figure 4. Most parameters clustered between 3° and 6°, generally close to or below the clinical threshold, indicating overall robust performance. However, certain variables, such as hip flexion/extension and pelvis tilt, showed higher errors, with Martiš et al. [64] reporting the largest deviations (>6–9°). In contrast, Peng et al. [61] systematically reported lower errors across all joints, pointing to potential methodological or sample-related differences. Overall, these findings suggest that while most kinematic variables can be reconstructed with clinically acceptable accuracy, some remain more error-prone and require careful interpretation.
However, when applied to pathological gait, OpenCap showed reduced accuracy, with errors increasing as movement patterns deviated from the physiological range, as displayed in Figure 5. All conditions, actual and simulated, exhibited higher errors than in physiological gait, frequently exceeding the 5° clinical threshold. The largest deviations were observed in crouch gait, particularly for knee flexion/extension and ankle inversion/eversion. Circumduction gait also showed elevated errors across multiple joints, while equinus presented intermediate values. Knee OA displayed the lowest errors among the pathological groups, but these remained above the physiological reference. This reduced performance is likely related to the pose estimation algorithms having been trained predominantly on healthy gait datasets, limiting their ability to generalize to atypical or pathological movement strategies [39,70].
With respect to anatomical planes in healthy patterns (Figure 6a), the frontal plane exhibited the lowest errors in absolute terms, with pelvic list and hip abduction/adduction showing median RMSE values below the 5° clinical threshold, although ankle inversion/eversion was a consistent outlier. The sagittal plane showed slightly higher errors overall, but with estimates relatively stable across parameters, with hip flexion/extension displaying the highest variability. The transverse plane showed moderate performance overall, since pelvic rotation was estimated with relatively low errors, whereas hip internal–external rotation was less consistent and more frequently exceeded the clinical threshold. From a clinical perspective, sagittal variables can be considered the most robust, because their larger range of motion makes them less affected by absolute errors, and they represent the parameters most commonly used in gait assessment. This overall tendency was confirmed in pathological gait, although with systematically higher errors across all planes (Figure 6b).
Camera configuration and participant orientation also emerged as critical factors for estimating joint kinematics with OpenCap. Both Wang et al. [65] and Martiš et al. [64] reported that accuracy was higher when participants walked toward the camera (i.e., WTC condition) than when walking away (i.e., WAC condition), likely due to reduced occlusion and better key-point visibility in front-facing postures (Figure 7). Conversely, spatio-temporal parameters remained robust under different orientations [64]. Moreover, environmental constraints, as noted by Svetek et al. [62] on an instrumented treadmill, may also compromise camera positioning and degrade tracking quality, especially at the hip joint.
Beyond validation, other studies have explored specific applications and methodological aspects of OpenCap. Min et al. [48] directly applied the system to characterize gait in healthy individuals and in patients with neurological disorders, such as post-stroke, Parkinson’s disease and cerebral palsy, without a comparative MoCap reference. Overall, in healthy participants, joint angles were within the ranges expected from the literature, providing a reference framework against which pathological deviations could be identified. Patients, by contrast, displayed greater variability and reduced control across multiple joints, with alterations most evident in pelvic tilt, hip rotation, and ankle dorsiflexion. Hip flexion during the swing phase was also reduced by up to 10° compared to the typical range of 35–40°, contributing to the asymmetric gait patterns often observed in neurological conditions. Together, these results show that OpenCap can detect clinically relevant kinematic abnormalities and reliably identify pathological gait patterns, although caution is warranted when interpreting absolute kinematic values, given the previously discussed errors when dealing with pathological motion patterns.
In terms of kinetics, Min et al. [48] also noted that while OpenCap can provide reliable kinematics, the absence of direct GRF measurements limited the precision of joint force estimation compared to traditional force-plate-based gait analysis systems. Peng et al. [61] addressed this limitation by integrating an inverse dynamics workflow with a foot–ground contact model.
This approach allowed them to estimate joint contact forces and GRFs, achieving excellent correlations (ρ > 0.9) for vertical and antero-posterior GRFs and strong correlations (around 0.88) for lower-limb joint contact forces, with RMSE below 0.5 body weight (BW). This methodological difference suggests that such a workflow could complement OpenCap’s native muscle-driven simulations, providing an alternative way to recover kinetic variables in scenarios where direct GRF data are unavailable.
Finally, Horsak et al. [60,63] focused on methodological aspects of the system, investigating the influence of clothing and measurement repeatability. Regarding clothing, they reported a standard error of measurement of about 2° and an MDC of around 6°, with slightly higher errors for trunk and hip angles in the sagittal plane. Moreover, wearing regular streetwear was reported to increase variability. With respect to repeatability, in a complementary secondary analysis of the same dataset as their 2023 study, Horsak et al. [60] examined the intra-session repeatability of OpenCap by analyzing trial-to-trial variability. Applying GaitSD and GVSD metrics, they found significantly higher inter-trial variability for markerless data, with increases ranging from 6.6% to 22%, depending on the gait pattern, compared to marker-based systems. The grand mean variability increase across all patterns was approximately 14%, with pelvis tilt and hip flexion contributing most to this difference. These results underline the need to average multiple gait cycles, ideally more than ten per side, to obtain reliable measures with OpenCap, especially in complex or pathological gait conditions.

4. Discussion and Conclusions

This narrative review aimed to provide a comprehensive summary of the current research on and performance of OpenCap, a markerless smartphone-based motion capture system, for gait assessment in both healthy and clinical populations. In particular, this work aimed to summarize existing evidence on the potential of such a system in gait analysis by identifying both its strengths and current limitations. Given the narrative nature of this review, no formal quality assessment of the included studies was performed, as such an appraisal is typical of systematic review frameworks [57]. Moreover, a limitation of this narrative review is the restricted time window of the available evidence, as OpenCap is a relatively new technology and, to date, only a limited number of peer-reviewed studies have been published on its validation and clinical application. As a result, the conclusions drawn here necessarily reflect a still-evolving research landscape, and future studies will be essential to expand, refine, and confirm these early findings.
Overall, the findings support OpenCap as a promising, low-cost, and accessible alternative to traditional marker-based motion capture systems, with clear potential to facilitate gait screenings and functional assessments outside specialized laboratories. Technically, OpenCap demonstrated acceptable accuracy in estimating joint kinematics, with errors typically between 4 and 6° during healthy physiological gait [59,62,65], a level consistent with the 5° threshold often regarded as clinically acceptable [11,69], thus supporting OpenCap’s application in research and preliminary clinical evaluation where MoCap is not feasible. This agreement is particularly relevant in clinical contexts, where gait-related parameters are often used as functional markers of mobility, frailty, and fall risk and can inform both diagnostic screening and the evaluation of rehabilitation outcomes [71]. However, caution is warranted when interpreting OpenCap-based metrics in pathological populations, as studies consistently reported larger errors compared with physiological gait. This reduced performance likely stems from bias in OpenCap’s pose-estimation algorithms, such as OpenPose and HRNet, which were mainly trained on healthy, physiological patterns and thus lack the robustness to generalize to atypical, pathological gait patterns [32,39,70]. From a clinical perspective, although OpenCap may provide sufficiently accurate information for preliminary screening and monitoring, its reduced reliability in pathological gait currently limits its suitability for diagnostic purposes and for detecting subtle rehabilitation-related changes.
OpenCap’s performance is also influenced by technical factors, such as camera setup and participant orientation [64,65]. Furthermore, environmental factors such as occlusions between the subject and the cameras (e.g., treadmill bars), space constraints, or poor lighting, may further impact OpenCap’s accuracy [72].
Moreover, a current gap in the validation of OpenCap is the lack of evidence on its ability to estimate center of mass (CoM) motion, which is a crucial parameter for evaluating balance, dynamic stability, and overall gait function [73,74]. The absence of validated CoM data limits the system’s applicability in contexts where a comprehensive assessment of postural control and whole-body dynamics is required. Furthermore, no studies to date have specifically evaluated OpenCap in pediatric or geriatric populations, despite the high clinical relevance of gait analysis in these groups.
Despite its limitations, OpenCap shows promise for increasing the accessibility of gait analysis in clinical settings. Its features make it suitable for large-scale screening, outpatient assessments, remote monitoring, and tele-rehabilitation [75]. At the same time, current evidence indicates that measurement variability and minimal detectable changes remain higher than desirable for certain joint variables, which may limit sensitivity to subtle longitudinal adaptations. For these reasons, while OpenCap is an attractive tool for preliminary evaluations and functional gait screening, it cannot yet replace gold standard MoCap systems for complex clinical assessments, especially where high accuracy and complementary kinetic data are required [76].
Future work should aim to address these limitations by refining the machine learning models with more diverse and pathological gait data, thereby reducing training bias and improving robustness to abnormal movement patterns. Errors observed in pathological gait likely reflect domain bias in current 2D pose-estimation models, which have been trained mainly on healthy samples. Fine-tuning or retraining with pathological datasets could enable algorithms to adapt more effectively to atypical gait. Multi-view training approaches, more resilient to occlusions, may further enhance landmark tracking in less controlled environments [77]. Finally, creating and openly sharing annotated pathological gait datasets will be essential to improve model generalization, facilitate benchmarking across studies, and accelerate the clinical translation of markerless systems such as OpenCap. Beyond improvements in training data and 3D landmark tracking algorithms, the integration of portable force-sensing technologies, such as instrumented insoles, could complement video-based kinematics with direct kinetic information, enabling more comprehensive biomechanical assessments. From a clinical standpoint, such multimodal approaches could expand the applicability of OpenCap beyond basic kinematic screening, enabling more comprehensive assessments of joint loading, balance, and dynamic stability, which are critical for rehabilitation planning, fall-prevention strategies, and long-term patient monitoring [78].
In conclusion, OpenCap represents a significant step toward expanding access to quantitative gait analysis, with the potential to extend biomechanical assessments beyond the laboratory and into clinical and community settings. Nevertheless, the current evidence remains limited, and findings should be considered preliminary until validated in larger and more diverse cohorts. Further refinement will be required before OpenCap can replace traditional motion capture in clinical decision-making.

Author Contributions

Conceptualization, S.C. and V.C.; formal analysis, S.C. and J.L.S.; investigation, S.C. and E.F.d.B.; data curation, S.C.; writing—original draft preparation, S.C., J.L.S. and E.F.d.B.; writing—review and editing, P.C. (Paolo Capodaglio), P.C. (Paolo Cavallari), M.G. and V.C.; supervision, V.C., P.C. (Paolo Capodaglio) and P.C. (Paolo Cavallari). All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Conflicts of Interest

The authors declare no conflicts of interest.

Abbreviations

The following abbreviations are used in this manuscript:
MoCap: Motion Capture
GRFs: Ground Reaction Forces
Flex/ext: Flexion/extension
Abd/add: Abduction/adduction
Int/ext: Internal/external
Rot: Rotation
In/ev: Inversion/eversion

References

  1. Cimolin, V.; Galli, M. Summary measures for clinical gait analysis: A literature review. Gait Posture 2014, 39, 1005–1010. [Google Scholar] [CrossRef]
  2. Ungvari, Z.; Fazekas-Pongor, V.; Csiszar, A.; Kunutsor, S.K. The multifaceted benefits of walking for healthy aging: From Blue Zones to molecular mechanisms. GeroScience 2023, 45, 3211–3239. [Google Scholar] [CrossRef] [PubMed]
  3. Moon, Y.; Sung, J.; An, R.; Hernandez, M.E.; Sosnoff, J.J. Gait variability in people with neurological disorders: A systematic review and meta-analysis. Hum. Mov. Sci. 2016, 47, 197–208. [Google Scholar] [CrossRef] [PubMed]
  4. Snijders, A.H.; Leunissen, I.; Bakker, M.; Overeem, S.; Helmich, R.C.; Bloem, B.R.; Toni, I. Gait-related cerebral alterations in patients with Parkinson’s disease with freezing of gait. Brain 2011, 134, 59–72. [Google Scholar] [CrossRef] [PubMed]
  5. Chen, G.; Patten, C.; Kothari, D.H.; Zajac, F.E. Gait differences between individuals with post-stroke hemiparesis and non-disabled controls at matched speeds. Gait Posture 2005, 22, 51–56. [Google Scholar] [CrossRef]
  6. Rezaei, A.; Bhat, S.G.; Cheng, C.-H.; Pignolo, R.J.; Lu, L.; Kaufman, K.R. Age-related changes in gait, balance, and strength parameters: A cross-sectional study. PLoS ONE 2024, 19, e0310764. [Google Scholar] [CrossRef]
  7. Cruz-Jimenez, M. Normal Changes in Gait and Mobility Problems in the Elderly. Phys. Med. Rehabil. Clin. N. Am. 2017, 28, 713–725. [Google Scholar] [CrossRef]
  8. Selves, C.; Stoquart, G.; Lejeune, T. Gait rehabilitation after stroke: Review of the evidence of predictors, clinical outcomes and timing for interventions. Acta Neurol. Belg. 2020, 120, 783–790. [Google Scholar] [CrossRef]
  9. Khalid, S.; Malik, A.N.; Siddiqi, F.A.; Rathore, F.A. Overview of gait rehabilitation in stroke. J. Pak. Med. Assoc. 2023, 73, 1142–1145. [Google Scholar] [CrossRef]
  10. Cimolin, V.; Vismara, L.; Galli, M.; Grugni, G.; Cau, N.; Capodaglio, P. Gait strategy in genetically obese patients: A 7-year follow up. Res. Dev. Disabil. 2014, 35, 1501–1506. [Google Scholar] [CrossRef]
  11. McGinley, J.L.; Baker, R.; Wolfe, R.; Morris, M.E. The reliability of three-dimensional kinematic gait measurements: A systematic review. Gait Posture 2009, 29, 360–369. [Google Scholar] [CrossRef] [PubMed]
  12. Bonato, P.; Feipel, V.; Corniani, G.; Arin-Bal, G.; Leardini, A. Position paper on how technology for human motion analysis and relevant clinical applications have evolved over the past decades: Striking a balance between accuracy and convenience. Gait Posture 2024, 113, 191–203. [Google Scholar] [CrossRef] [PubMed]
  13. Topley, M.; Richards, J.G. A comparison of currently available optoelectronic motion capture systems. J. Biomech. 2020, 106, 109820. [Google Scholar] [CrossRef] [PubMed]
  14. Cappozzo, A.; Catani, F.; Della Croce, U.; Leardini, A. Position and orientation in space of bones during movement: Anatomical frame definition and determination. Clin. Biomech. 1995, 10, 171–178. [Google Scholar] [CrossRef]
  15. Richards, J.G. The measurement of human motion: A comparison of commercially available systems. Hum. Mov. Sci. 1999, 18, 589–602. [Google Scholar] [CrossRef]
  16. van der Kruk, E.; Reijne, M.M. Accuracy of human motion capture systems for sport applications; state-of-the-art review. Eur. J. Sport Sci. 2018, 18, 806–819. [Google Scholar] [CrossRef]
  17. Muro-de-la-Herran, A.; Garcia-Zapirain, B.; Mendez-Zorrilla, A. Gait analysis methods: An overview of wearable and non-wearable systems, highlighting clinical applications. Sensors 2014, 14, 3362–3394. [Google Scholar] [CrossRef]
  18. Lopes, T.J.A.; Ferrari, D.; Ioannidis, J.; Simic, M.; Mícolis de Azevedo, F.; Pappas, E. Reliability and Validity of Frontal Plane Kinematics of the Trunk and Lower Extremity Measured With 2-Dimensional Cameras During Athletic Tasks: A Systematic Review With Meta-analysis. J. Orthop. Sports Phys. Ther. 2018, 48, 812–822. [Google Scholar] [CrossRef]
  19. Reinking, M.F.; Dugan, L.; Ripple, N.; Schleper, K.; Scholz, H.; Spadino, J.; Stahl, C.; McPoil, T.G. Reliability of two-dimensional video-based running gait analysis. Int. J. Sports Phys. Ther. 2018, 13, 453–461. [Google Scholar] [CrossRef]
  20. Milosevic, B.; Leardini, A.; Farella, E. Kinect and wearable inertial sensors for motor rehabilitation programs at home: State of the art and an experimental comparison. Biomed. Eng. Online 2020, 19, 25. [Google Scholar] [CrossRef]
  21. Colyer, S.L.; Evans, M.; Cosker, D.P.; Salo, A.I.T. A Review of the Evolution of Vision-Based Motion Analysis and the Integration of Advanced Computer Vision Methods Towards Developing a Markerless System. Sport. Med. Open 2018, 4, 24. [Google Scholar] [CrossRef] [PubMed]
  22. Gu, C.; Lin, W.; He, X.; Zhang, L.; Zhang, M. IMU-based motion capture system for rehabilitation applications: A systematic review. Biomim. Intell. Robot. 2023, 3, 100097. [Google Scholar] [CrossRef]
  23. Horak, F.; King, L.; Mancini, M. Role of body-worn movement monitor technology for balance and gait rehabilitation. Phys. Ther. 2015, 95, 461–470. [Google Scholar] [CrossRef] [PubMed]
  24. Komaris, D.-S.; Tarfali, G.; O’Flynn, B.; Tedesco, S. Unsupervised IMU-based evaluation of at-home exercise programmes: A feasibility study. BMC Sports Sci. Med. Rehabil. 2022, 14, 28. [Google Scholar] [CrossRef]
  25. Motta, F.; Varrecchia, T.; Chini, G.; Ranavolo, A.; Galli, M. The Use of Wearable Systems for Assessing Work-Related Risks Related to the Musculoskeletal System—A Systematic Review. Int. J. Environ. Res. Public Health 2024, 21, 1567. [Google Scholar] [CrossRef]
  26. Clark, R.A.; Mentiplay, B.F.; Hough, E.; Pua, Y.H. Three-dimensional cameras and skeleton pose tracking for physical function assessment: A review of uses, validity, current developments and Kinect alternatives. Gait Posture 2019, 68, 193–200. [Google Scholar] [CrossRef]
  27. Mousavi Hondori, H.; Khademi, M. A Review on Technical and Clinical Impact of Microsoft Kinect on Physical Therapy and Rehabilitation. J. Med. Eng. 2014, 2014, 846514. [Google Scholar] [CrossRef]
  28. Desmarais, Y.; Mottet, D.; Slangen, P.; Montesinos, P. A review of 3D human pose estimation algorithms for markerless motion capture. Comput. Vis. Image Underst. 2021, 212. [Google Scholar]
  29. Lam, W.W.T.; Tang, Y.M.; Fong, K.N.K. A systematic review of the applications of markerless motion capture (MMC) technology for clinical measurement in rehabilitation. J. Neuroeng. Rehabil. 2023, 20, 57. [Google Scholar] [CrossRef]
  30. Scataglini, S.; Abts, E.; Van Bocxlaer, C.; den Bussche, M.; Meletani, S.; Truijen, S. Accuracy, Validity, and Reliability of Markerless Camera-Based 3D Motion Capture Systems versus Marker-Based 3D Motion Capture Systems in Gait Analysis: A Systematic Review and Meta-Analysis. Sensors 2024, 24, 3686. [Google Scholar] [CrossRef]
  31. Molteni, L.E.; Andreoni, G. Comparing the Accuracy of Markerless Motion Analysis and Optoelectronic System for Measuring Gait Kinematics of Lower Limb. Bioengineering 2025, 12, 424. [Google Scholar] [CrossRef]
  32. Cao, Z.; Hidalgo, G.; Simon, T.; Wei, S.-E.; Sheikh, Y. OpenPose: Realtime Multi-Person 2D Pose Estimation Using Part Affinity Fields. IEEE Trans. Pattern Anal. Mach. Intell. 2021, 43, 172–186. [Google Scholar] [CrossRef]
  33. Rui, L.; Gao, Y.; Ren, H. EDite-HRNet: Enhanced Dynamic Lightweight High-Resolution Network for Human Pose Estimation. IEEE Access 2023, 11, 95948–95957. [Google Scholar] [CrossRef]
  34. Neupane, R.B.; Li, K.; Boka, T.F. A survey on deep 3D human pose estimation. Artif. Intell. Rev. 2024, 58, 24. [Google Scholar] [CrossRef]
  35. Zheng, C.; Wu, W.; Yang, T.; Zhu, S.; Chen, C.; Liu, R.; Shen, J.; Kehtarnavaz, N.; Shah, M. Deep Learning-Based Human Pose Estimation: A Survey. ACM Comput. Surv. 2023, 56, 1–37. [Google Scholar] [CrossRef]
  36. Mathis, A.; Mamidanna, P.; Cury, K.M.; Abe, T.; Murthy, V.N.; Mathis, M.W.; Bethge, M. DeepLabCut: Markerless pose estimation of user-defined body parts with deep learning. Nat. Neurosci. 2018, 21, 1281–1289. [Google Scholar] [CrossRef]
  37. Toshev, A.; Szegedy, C. DeepPose: Human pose estimation via deep neural networks. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Columbus, OH, USA, 24–27 June 2014; pp. 1653–1660. [Google Scholar] [CrossRef]
  38. Kanko, R.M.; Laende, E.K.; Strutzenberger, G.; Brown, M.; Selbie, W.S.; DePaul, V.; Scott, S.H.; Deluzio, K.J. Assessment of spatiotemporal gait parameters using a deep learning algorithm-based markerless motion capture system. J. Biomech. 2021, 122, 110414. [Google Scholar] [CrossRef]
  39. Uhlrich, S.D.; Falisse, A.; Kidziński, Ł.; Muccini, J.; Ko, M.; Chaudhari, A.S.; Hicks, J.L.; Delp, S.L. OpenCap: Human movement dynamics from smartphone videos. PLoS Comput. Biol. 2023, 19, e1011462. [Google Scholar] [CrossRef]
  40. Gozlan, Y.; Falisse, A.; Uhlrich, S.; Gatti, A.; Black, M.; Hicks, J.; Delp, S.; Chaudhari, A. OpenCapBench: A Benchmark to Bridge Pose Estimation and Biomechanics. In Proceedings of the 2025 IEEE/CVF Winter Conference on Applications of Computer Vision (WACV), Tucson, AZ, USA, 26 February–6 March 2025. [Google Scholar]
  41. Falisse, A.; Uhlrich, S.; Delp, S. OpenCap: Movement Biomechanics from Smartphone Videos. In International Conference on NeuroRehabilitation; Springer Nature: Cham, Switzerland, 2024; pp. 519–522. ISBN 978-3-031-77583-3. [Google Scholar]
  42. Falisse, A.; Uhlrich, S.; Chaudhari, A.; Delp, S. Marker Data Enhancement for Markerless Motion Capture. IEEE Trans. Biomed. Eng. 2025, 72, 2013–2022. [Google Scholar] [CrossRef]
  43. Delp, S.L.; Anderson, F.C.; Arnold, A.S.; Loan, P.; Habib, A.; John, C.T.; Guendelman, E.; Thelen, D.G. OpenSim: Open-source software to create and analyze dynamic simulations of movement. IEEE Trans. Biomed. Eng. 2007, 54, 1940–1950. [Google Scholar] [CrossRef]
  44. Rajagopal, A.; Dembia, C.L.; DeMers, M.S.; Delp, D.D.; Hicks, J.L.; Delp, S.L. Full-Body Musculoskeletal Model for Muscle-Driven Simulation of Human Gait. IEEE Trans. Biomed. Eng. 2016, 63, 2068–2079. [Google Scholar] [CrossRef]
  45. Seth, A.; Sherman, M.; Reinbolt, J.A.; Delp, S.L. OpenSim: A musculoskeletal modeling and simulation framework for in silico investigations and exchange. Procedia IUTAM 2011, 2, 212–232. [Google Scholar] [CrossRef]
  46. Abdullah, M.; Hulleck, A.A.; Katmah, R.; Khalaf, K.; El-Rich, M. Multibody dynamics-based musculoskeletal modeling for gait analysis: A systematic review. J. Neuroeng. Rehabil. 2024, 21, 178. [Google Scholar] [CrossRef]
  47. Kakavand, R.; Ahmadi, R.; Parsaei, A.; Brent Edwards, W.; Komeili, A. Comparison of kinematics and kinetics between OpenCap and a marker-based motion capture system in cycling. Comput. Biol. Med. 2025, 192, 110295. [Google Scholar] [CrossRef] [PubMed]
  48. Min, Y.S.; Jung, T.-D.; Lee, Y.S.; Kwon, Y.; Kim, H.J.; Kim, H.C.; Lee, J.C.; Park, E. Biomechanical Gait Analysis Using a Smartphone-Based Motion Capture System (OpenCap) in Patients with Neurological Disorders. Bioengineering 2024, 11, 911. [Google Scholar] [CrossRef] [PubMed]
  49. Turner, J.A.; Chaaban, C.R.; Padua, D.A. Validation of OpenCap: A low-cost markerless motion capture system for lower-extremity kinematics during return-to-sport tasks. J. Biomech. 2024, 171, 112200. [Google Scholar] [CrossRef] [PubMed]
  50. Cronin, N.J.; Walker, J.; Tucker, C.B.; Nicholson, G.; Cooke, M.; Merlino, S.; Bissas, A. Feasibility of OpenPose markerless motion analysis in a real athletics competition. Front. Sport. Act. Living 2023, 5, 1298003. [Google Scholar] [CrossRef]
  51. Verheul, J.; Robinson, M.A.; Burton, S. Jumping towards field-based ground reaction force estimation and assessment with OpenCap. J. Biomech. 2024, 166, 112044. [Google Scholar] [CrossRef]
  52. de Borba, E.F.; da Silva, E.S.; de Alves, L.L.; Neto, A.R.D.S.; Inda, A.R.; Ibrahim, B.M.; Ribas, L.R.; Correale, L.; Peyré-Tartaruga, L.A.; Tartaruga, M.P. Fatigue-Related Changes in Running Technique and Mechanical Variables After a Maximal Incremental Test in Recreational Runners. J. Appl. Biomech. 2024, 40, 424–431. [Google Scholar] [CrossRef]
  53. Bertozzi, F.; Brunetti, C.; Maver, P.; Palombi, M.; Santini, M.; Galli, M.; Tarabini, M. Concurrent validity of IMU and phone-based markerless systems for lower-limb kinematics during cognitively-challenging landing tasks. J. Biomech. 2025, 191, 112883. [Google Scholar] [CrossRef]
  54. Cheng, X.; Jiao, Y.; Meiring, R.M.; Sheng, B.; Zhang, Y. Reliability and validity of current computer vision based motion capture systems in gait analysis: A systematic review. Gait Posture 2025, 120, 150–160. [Google Scholar] [CrossRef]
  55. Drazan, J.F.; Phillips, W.T.; Seethapathi, N.; Hullfish, T.J.; Baxter, J.R. Moving outside the lab: Markerless motion capture accurately quantifies sagittal plane kinematics during the vertical jump. J. Biomech. 2021, 125, 110547. [Google Scholar] [CrossRef]
  56. Van Hooren, B.; Pecasse, N.; Meijer, K.; Essers, J.M.N. The accuracy of markerless motion capture combined with computer vision techniques for measuring running kinematics. Scand. J. Med. Sci. Sports 2023, 33, 966–978. [Google Scholar] [CrossRef]
  57. Ferrari, R. Writing narrative style literature reviews. Med. Writ. 2015, 24, 230–235. [Google Scholar] [CrossRef]
  58. Sukhera, J. Narrative Reviews: Flexible, Rigorous, and Practical. J. Grad. Med. Educ. 2022, 14, 414–417. [Google Scholar] [CrossRef] [PubMed]
  59. Horsak, B.; Eichmann, A.; Lauer, K.; Prock, K.; Krondorfer, P.; Siragy, T.; Dumphart, B. Concurrent validity of smartphone-based markerless motion capturing to quantify lower-limb joint kinematics in healthy and pathological gait. J. Biomech. 2023, 159, 111801. [Google Scholar] [CrossRef] [PubMed]
  60. Horsak, B.; Prock, K.; Krondorfer, P.; Siragy, T.; Simonlehner, M.; Dumphart, B. Inter-trial variability is higher in 3D markerless compared to marker-based motion capture: Implications for data post-processing and analysis. J. Biomech. 2024, 166, 112049. [Google Scholar] [CrossRef]
  61. Peng, Y.; Wang, W.; Wang, L.; Zhou, H.; Chen, Z.; Zhang, Q.; Li, G. Smartphone videos-driven musculoskeletal multibody dynamics modelling workflow to estimate the lower limb joint contact forces and ground reaction forces. Med. Biol. Eng. Comput. 2024, 62, 3841–3853. [Google Scholar] [CrossRef]
  62. Svetek, A.; Morgan, K.; Burland, J.; Glaviano, N.R. Validation of OpenCap on lower extremity kinematics during functional tasks. J. Biomech. 2025, 183, 112602. [Google Scholar] [CrossRef]
  63. Horsak, B.; Kainz, H.; Dumphart, B. Repeatability and minimal detectable change including clothing effects for smartphone-based 3D markerless motion capture. J. Biomech. 2024, 175, 112281. [Google Scholar] [CrossRef]
  64. Martiš, P.; Košutzká, Z.; Kranzl, A. A Step Forward Understanding Directional Limitations in Markerless Smartphone-Based Gait Analysis: A Pilot Study. Sensors 2024, 24, 3091. [Google Scholar] [CrossRef]
  65. Wang, J.; Xu, W.; Wu, Z.; Zhang, H.; Wang, B.; Zhou, Z.; Wang, C.; Li, K.; Nie, Y. Evaluation of a smartphone-based markerless system to measure lower-limb kinematics in patients with knee osteoarthritis. J. Biomech. 2025, 181, 112529. [Google Scholar] [CrossRef] [PubMed]
  66. Karimi, M.T.; Tahmasebi, R.; Sharifmoradi, K.; Abarghuei, M.A.F. Investigation of joint contact forces during walking in the subjects with toe in gait due to increasing in femoral head anteversion angle. Proc. Inst. Mech. Eng. Part H J. Eng. Med. 2024, 238, 755–763. [Google Scholar] [CrossRef] [PubMed]
  67. Liew, B.X.W.; Rügamer, D.; Mei, Q.; Altai, Z.; Zhu, X.; Zhai, X.; Cortes, N. Smooth and accurate predictions of joint contact force time-series in gait using over parameterised deep neural networks. Front. Bioeng. Biotechnol. 2023, 11, 1208711. [Google Scholar] [CrossRef] [PubMed]
  68. Koo, T.K.; Li, M.Y. A Guideline of Selecting and Reporting Intraclass Correlation Coefficients for Reliability Research. J. Chiropr. Med. 2016, 15, 155–163. [Google Scholar] [CrossRef]
  69. Wilken, J.M.; Rodriguez, K.M.; Brawner, M.; Darter, B.J. Reliability and Minimal Detectible Change values for gait kinematics and kinetics in healthy adults. Gait Posture 2012, 35, 301–307. [Google Scholar] [CrossRef]
  70. Lin, T.-Y.; Maire, M.; Belongie, S.; Hays, J.; Perona, P.; Ramanan, D.; Dollár, P.; Zitnick, C. Microsoft COCO: Common Objects in Context. In European Conference on Computer Vision; Springer International: Cham, Switzerland, 2014. [Google Scholar]
  71. Greene, B.R.; McManus, K.; Redmond, S.J.; Caulfield, B.; Quinn, C.C. Digital assessment of falls risk, frailty, and mobility impairment using wearable sensors. Npj Digit. Med. 2019, 2, 125. [Google Scholar] [CrossRef]
  72. Mündermann, L.; Corazza, S.; Andriacchi, T.P. The evolution of methods for the capture of human movement leading to markerless motion capture for biomechanical applications. J. Neuroeng. Rehabil. 2006, 3, 6. [Google Scholar] [CrossRef]
  73. Saibene, F.; Minetti, A.E. Biomechanical and physiological aspects of legged locomotion in humans. Eur. J. Appl. Physiol. 2003, 88, 297–316. [Google Scholar] [CrossRef]
  74. Tesio, L.; Rota, V. The Motion of Body Center of Mass During Walking: A Review Oriented to Clinical Applications. Front. Neurol. 2019, 10, 999. [Google Scholar] [CrossRef]
  75. Biswas, N.; Chakrabarti, S.; Jones, L.D.; Ashili, S. Smart wearables addressing gait disorders: A review. Mater. Today Commun. 2023, 35, 106250. [Google Scholar] [CrossRef]
  76. Wren, T.A.L.; Tucker, C.A.; Rethlefsen, S.A.; Gorton, G.E., 3rd; Õunpuu, S. Clinical efficacy of instrumented gait analysis: Systematic review 2020 update. Gait Posture 2020, 80, 274–279. [Google Scholar] [CrossRef]
  77. Roggio, F.; Trovato, B.; Sortino, M.; Musumeci, G. A comprehensive analysis of the machine learning pose estimation models used in human movement and posture analyses: A narrative review. Heliyon 2024, 10, e39977. [Google Scholar] [CrossRef]
  78. Osness, E.; Isley, S.; Bertrand, J.; Dennett, L.; Bates, J.; Van Decker, N.; Stanhope, A.; Omkar, A.; Dolgoy, N.; Ezeugwu, V.E.; et al. Markerless Motion Capture Parameters Associated with Fall Risk or Frailty: A Scoping Review. Sensors 2025, 25, 5741. [Google Scholar] [CrossRef]
Figure 1. Process of study selection.
Figure 2. Representative setup of OpenCap (black phones) and MoCap (blue cameras) for validation studies. Walking trials were performed along a 4 m path either toward the camera (WTC, red line) or away from the camera (WAC, green line).
Figure 3. Radar plot of RMSE values (°) for pelvis kinematics in healthy gait. The pelvic parameters (i.e., tilt, list, and rotation) are reported across three studies [39,59,64], allowing direct comparison. The red dashed line represents the 5° clinical threshold.
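For readers who wish to reproduce comparisons of this kind, the quantity plotted in the radar plots (Figures 3, 4 and 5) is the root-mean-square error between time-normalized OpenCap and marker-based joint-angle curves. The following is a minimal sketch in Python/NumPy, assuming curves already time-normalized to the gait cycle; the function name and the synthetic example data are illustrative and are not taken from the reviewed studies.

```python
import numpy as np

def rmse_deg(opencap_angle: np.ndarray, mocap_angle: np.ndarray) -> float:
    """Root-mean-square error (degrees) between two time-normalized
    joint-angle curves, e.g., 0-100% of the gait cycle at 101 samples."""
    opencap_angle = np.asarray(opencap_angle, dtype=float)
    mocap_angle = np.asarray(mocap_angle, dtype=float)
    if opencap_angle.shape != mocap_angle.shape:
        raise ValueError("Curves must be time-normalized to the same length.")
    return float(np.sqrt(np.mean((opencap_angle - mocap_angle) ** 2)))

# Illustrative example: synthetic pelvic-tilt curves over one gait cycle.
t = np.linspace(0.0, 1.0, 101)
mocap_curve = 10.0 + 2.0 * np.sin(2.0 * np.pi * t)              # reference (deg)
opencap_curve = mocap_curve + np.random.normal(0.0, 3.0, t.size)  # noisy estimate (deg)

error = rmse_deg(opencap_curve, mocap_curve)
print(f"RMSE = {error:.1f} deg; exceeds 5 deg threshold: {error > 5.0}")
```

Computing the error on time-normalized curves, as in the reviewed validation studies, keeps the comparison independent of gait speed and trial duration.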
Figure 4. Radar plot of RMSE values (°) for hip, knee, and ankle kinematics in healthy gait. Only the parameters common across the available datasets are represented [39,59,61,64,65]. Values from Svetek et al. [62] are not displayed, as they refer only to peak joint angles and are not directly comparable with full gait cycle RMSEs from the other studies. The red dashed line represents the 5° clinical threshold.
Figure 5. Radar plot of RMSE values (°) for joint kinematics in pathological gait patterns [59,65]. Only the parameters common across the available datasets are represented. The red dashed line represents the 5° clinical threshold.
Figure 6. Error distributions in terms of RMSE (°) in healthy (a) and pathological (b) gait, divided per movement plane. The top panel summarizes RMSE values pooled across the sagittal (blue), frontal (green), and transverse (orange) planes, while the bottom panels detail the distributions for each variable within its respective plane. Boxes indicate the interquartile range, horizontal lines mark the median, and whiskers extend to 1.5× the interquartile range. The dashed red line denotes the 5° clinical threshold, above which reconstruction errors may be considered non-negligible. The number of data points (n) shown below each label corresponds to the total count of RMSE values aggregated across studies for that variable or plane. Healthy datasets included data from [39,59,61,64,65], whereas pathological datasets were derived from [59,65].
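The pooled summaries in Figure 6 (median, interquartile range, whiskers at 1.5× the interquartile range, and the 5° reference line) can be regenerated from any collection of RMSE values with standard plotting tools. A minimal sketch using matplotlib is given below; the RMSE values are invented placeholders standing in for the values extracted from the reviewed studies, and the variable names are illustrative.

```python
import matplotlib.pyplot as plt

# Hypothetical RMSE values (deg) pooled per movement plane; the real figure
# aggregates values extracted from the reviewed validation studies.
rmse_by_plane = {
    "Sagittal": [3.1, 4.2, 4.8, 5.5, 3.9, 4.4],
    "Frontal": [2.5, 3.0, 4.1, 3.6, 2.9],
    "Transverse": [5.2, 6.8, 7.4, 6.1, 5.9],
}

data = list(rmse_by_plane.values())

fig, ax = plt.subplots(figsize=(5, 4))
ax.boxplot(data, whis=1.5)                      # boxes = IQR, whiskers = 1.5x IQR
ax.set_xticks(range(1, len(data) + 1), labels=list(rmse_by_plane.keys()))
ax.axhline(5.0, color="red", linestyle="--", label="5° clinical threshold")
ax.set_ylabel("RMSE (°)")
ax.legend()
plt.tight_layout()
plt.show()
```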
Figure 7. Error distributions in terms of RMSE (°) for the parameters common to the two studies analyzing the WTC and WAC conditions [64,65]. Data are reported for the healthy participants of both studies and for the patients with knee OA. The red dashed line represents the 5° clinical threshold.
Table 1. Summary of the main details of the reviewed studies.
Uhrich et al., 2023, USA [39]
- Participants: 10 healthy adults, 27.7 ± 3.8 yrs, M: 4/F: 6
- Reference system: 8-camera MoCap (Motion Analysis Corp., Santa Rosa, CA, USA) at 100 Hz + 3 force plates (Bertec Corp., Columbus, OH, USA) at 2000 Hz
- Markers' placement: 31 retro-reflective markers, custom placement
- OpenCap devices: 2 iOS smartphones (iPhone 12 Pro)
- OpenCap placement: 45° angle relative to walking direction, 3 m from the center of the walking path, 1.5 m off the ground, tripod-mounted
- Functional tasks and patterns: gait analysis; physiological gait and gait with trunk sway modification
- Parameters: spatio-temporal parameters; lower-limb joint kinematics and kinetics (hip, knee, ankle)
- Finality of the study: Validation

Horsak et al., 2023 and 2024, Austria [59,60]
- Participants: 21 healthy individuals, 30.2 ± 8.5 yrs, M: 9/F: 12
- Reference system: 16-camera MoCap (Vicon, Oxford, UK) at 120 Hz + 3 force plates at 1200 Hz
- Markers' placement: 57 retro-reflective markers, extended Cleveland Clinic set (with medial/lateral markers) + Plug-in-Gait
- OpenCap devices: 2 iOS smartphones (iPhone 11 and 12 Pro)
- OpenCap placement: 35° off center of the walking path, 1.5 m from ground, tripod-mounted, ~5° incline
- Functional tasks and patterns: gait analysis along a 10 m walkway; physiological gait, simulated crouch, circumduction, and equinus gait
- Parameters: lower-limb joint kinematics (pelvis, hip, knee, ankle, subtalar)
- Finality of the study: Validation and Characterization

Peng et al., 2024, China [61]
- Participants: 12 healthy adults, 21.7 ± 1.4 yrs, M: 5/F: 7
- Reference system: 11-camera MoCap (Vicon, Oxford, UK) at 150 Hz + 2 force plates at 1200 Hz
- Markers' placement: custom lower-limb and trunk marker set (Plug-in-Gait + additional foot markers)
- OpenCap devices: 2 iOS smartphones (iPhone 12 Pro)
- OpenCap placement: 45° relative to walking direction, 2–3 m distance, 1.3 m height, tripod-mounted
- Functional tasks and patterns: gait analysis along a flat 10 m walkway, physiological walking
- Parameters: lower-limb joint kinematics (hip, knee, ankle) and kinetics (ground reaction forces, joint contact forces)
- Finality of the study: Validation

Svetek et al., 2024, USA [62]
- Participants: 20 athletes (ice hockey), 21.35 ± 1.3 yrs, M: 2/F: 18
- Reference system: 10-camera MoCap (Vicon, Oxford, UK) at 240 Hz
- Markers' placement: 37 retro-reflective markers, custom placement
- OpenCap devices: 2 iOS devices (iPad Air)
- OpenCap placement: 45° off center of the walking path, tripod-mounted
- Functional tasks and patterns: gait analysis on a treadmill; healthy gait and other functional tasks
- Parameters: peak joint angles in the sagittal and frontal planes (hip, knee)
- Finality of the study: Validation

Min et al., 2024, South Korea [48]
- Participants: 20 participants (10 neurological patients: stroke, Parkinson's, cerebral palsy; 10 healthy controls), age and sex breakdown N/A
- Reference system and markers' placement: N/A
- OpenCap devices: 2 iOS smartphones (model N/A)
- OpenCap placement: 30–45° angle relative to walking direction, tripod-mounted
- Functional tasks and patterns: gait analysis along a 4 m walkway; physiological gait and gait in neurological impairments (i.e., stroke, Parkinson's, and cerebral palsy)
- Parameters: joint kinematics and kinetics (pelvis, hip, knee, ankle)
- Finality of the study: Characterization

Horsak et al., 2024, Austria [63]
- Participants: 19 healthy adults, 35 ± 11 yrs, M: 12/F: 7
- Reference system and markers' placement: N/A
- OpenCap devices: 2 iOS devices (12 mini and 13 Pro)
- OpenCap placement: 35° off center of the walking path, 1.5 m from ground, tripod-mounted
- Functional tasks and patterns: gait analysis along an 8 m walkway, physiological gait under different clothing conditions
- Parameters: lower-limb (hip, knee, ankle), pelvic, and trunk kinematics
- Finality of the study: Characterization

Martiš et al., 2024, Austria [64]
- Participants: 10 healthy adults, 29.7 ± 8.6 yrs, M: 6/F: 4
- Reference system: 17-camera MoCap (Vicon, Oxford, UK) at 150 Hz
- Markers' placement: 49 markers, modified Cleveland Clinic and Plug-in-Gait sets
- OpenCap devices: 2 iOS devices (12 and 14)
- OpenCap placement: 30° off center of the walking path, 1.5 m from ground, tripod-mounted
- Functional tasks and patterns: TUG test walking along a 3 m walking path, toward and away from the camera
- Parameters: spatio-temporal parameters and joint kinematics (pelvis, hip, knee, ankle), foot lift-off and landing angles
- Finality of the study: Validation

Wang et al., 2025, China [65]
- Participants: 83 individuals: 53 patients with knee osteoarthritis (64.5 ± 6.5 yrs, M: 42/F: 11) and 30 healthy individuals (55.2 ± 3.3 yrs, M: 25/F: 5)
- Reference system: 10-camera MoCap (Vicon, Oxford, UK) at 150 Hz + 2 force plates at 1200 Hz
- Markers' placement: 34 retro-reflective markers, custom placement
- OpenCap devices: 2 iPhones (12 Pro)
- OpenCap placement: 35° off center of the walking path, 1.3 m from ground, tripod-mounted
- Functional tasks and patterns: gait analysis on a 7 m walkway; pathological and healthy gait toward and away from the camera
- Parameters: lower-limb joint kinematics (pelvis, hip, knee, ankle)
- Finality of the study: Validation and Characterization

Note. N/A: not available.
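As a practical aside, keeping the information in Table 1 as structured records rather than free text simplifies pooled comparisons such as those shown in Figures 3 through 7. The sketch below shows one possible record layout in Python; the class and field names are hypothetical (not a standard schema), and the example entry merely restates the first study of the table.

```python
from dataclasses import dataclass, field

@dataclass
class ReviewedStudy:
    """One entry of Table 1 as a structured record (illustrative schema)."""
    source: str                  # e.g., "Uhrich et al., 2023, USA [39]"
    participants: str
    reference_system: str        # "N/A" when no marker-based comparison was made
    opencap_devices: str
    camera_placement: str
    tasks: str
    parameters: str
    finality: str                # "Validation" and/or "Characterization"
    rmse_deg: dict[str, float] = field(default_factory=dict)  # per-parameter RMSE, if reported

uhrich_2023 = ReviewedStudy(
    source="Uhrich et al., 2023, USA [39]",
    participants="10 healthy adults, 27.7 ± 3.8 yrs, M: 4/F: 6",
    reference_system="8-camera MoCap at 100 Hz + 3 force plates at 2000 Hz",
    opencap_devices="2 iOS smartphones (iPhone 12 Pro)",
    camera_placement="45° to walking direction, 3 m away, 1.5 m off the ground",
    tasks="Gait analysis; physiological gait and gait with trunk sway modification",
    parameters="Spatio-temporal parameters; lower-limb kinematics and kinetics",
    finality="Validation",
)
```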