A Systematic Review of the Accuracy, Validity, and Reliability of Markerless Versus Marker Camera-Based 3D Motion Capture for Industrial Ergonomic Risk Analysis
Highlights
- Markerless camera-based motion capture systems (MCBSs) offer substantial accuracy and reliability for ergonomic risk assessment in industrial settings.
- MCBSs provide a feasible, scalable alternative to traditional observational ergonomic methods.
- MCBSs show strong potential for practical, real-world application and automation.
- MCBSs support Industry 5.0 goals in occupational risk prevention.
Abstract
1. Introduction
2. Materials and Methods
2.1. Data Sources and Search Strategy
2.2. Eligibility Criteria
2.3. Study Selection
2.4. Data Extraction
2.5. Synthesis and Grading of Evidence
3. Results
3.1. Study Selection
3.2. Study Characteristics
3.3. Evidence Table
3.4. Measurement Evidence Table
3.5. Risk of Bias in Included Studies
3.6. Results of Individual Studies
3.7. Summary of Synthesis and Bias
3.8. Reporting Bias
3.9. Certainty of Evidence
4. Discussion
5. Conclusions
Author Contributions
Funding
Data Availability Statement
Conflicts of Interest
Abbreviations
References
- Pavlovic-Veselinovic, S.; Hedge, A.; Veselinovic, M. An ergonomic expert system for risk assessment of work-related musculo-skeletal disorders. Int. J. Ind. Ergon. 2016, 53, 130–139. [Google Scholar] [CrossRef]
- Janela, D.; Areias, A.C.; Moulder, R.G.; Molinos, M.; Bento, V.; Yanamadala, V.; Correia, F.D.; Costa, F. Recovering Work Productivity in a Population with Chronic Musculoskeletal Pain: Unveiling the Value and Cost-Savings of a Digital Care Program. J. Occup. Environ. Med. 2024, 66, e493–e499. [Google Scholar] [CrossRef]
- Doğrul, Z.; Mazican, N.; Turk, M. The Prevalence of Work-Related Musculoskeletal Disorders (WRMSDs) and Related Factors among Occupational Disease Clinic Patients. Int. Arch. Public Health Community Med. 2019, 3, 030. [Google Scholar] [CrossRef]
- Rahman, M.H.; Hasan, M.R.; Chowdhury, N.I.; Syed, M.A.B.; Farah, M.U. Predictive Health Analysis in Industry 5.0: A Scientometric and Systematic Review of Motion Capture in Construction. Digit. Eng. 2024, 1, 100002. [Google Scholar] [CrossRef]
- McAtamney, L.; Nigel Corlett, E. RULA: A survey method for the investigation of work-related upper limb disorders. Appl. Ergon. 1993, 24, 91–99. [Google Scholar] [CrossRef] [PubMed]
- Hignett, S.; McAtamney, L. Rapid entire body assessment (REBA). Appl. Ergon. 2000, 31, 201–205. [Google Scholar] [CrossRef] [PubMed]
- Scataglini, S.; Abts, E.; Van Bocxlaer, C.; Van den Bussche, M.; Meletani, S.; Truijen, S. Accuracy, Validity, and Reliability of Markerless Camera-Based 3D Motion Capture Systems versus Marker-Based 3D Motion Capture Systems in Gait Analysis: A Systematic Review and Meta-Analysis. Sensors 2024, 24, 3686. [Google Scholar] [CrossRef]
- Unger, T.; Moslehian, A.S.; Peiffer, J.D.; Ullrich, J.; Gassert, R.; Lambercy, O.; Cotton, R.J.; Awai Easthope, C. Differentiable Biomechanics for Markerless Motion Capture in Upper Limb Stroke Rehabilitation: A Comparison with Optical Motion Capture. arXiv 2024, arXiv:2411.14992. [Google Scholar] [CrossRef]
- Nakano, N.; Sakura, T.; Ueda, K.; Omura, L.; Kimura, A.; Iino, Y.; Fukashiro, S.; Yoshioka, S. Evaluation of 3D Markerless Motion Capture Accuracy Using OpenPose with Multiple Video Cameras. Front. Sports Act. Living 2020, 2, 50. [Google Scholar] [CrossRef] [PubMed]
- Avogaro, A.; Cunico, F.; Rosenhahn, B.; Setti, F. Markerless human pose estimation for biomedical applications: A survey. Front. Comput. Sci. 2023, 5, 1153160. [Google Scholar] [CrossRef]
- Meletani, S.; Scataglini, S.; Mandolini, M.; Scalise, L.; Truijen, S. Experimental Comparison between 4D Stereophotogrammetry and Inertial Measurement Unit Systems for Gait Spatiotemporal Parameters and Joint Kinematics. Sensors 2024, 24, 4669. [Google Scholar] [CrossRef] [PubMed]
- Otto, M.; Lampen, E.; Auris, F.; Gaisbauer, F.; Rukzio, E. Applicability Evaluation of Kinect for EAWS Ergonomic Assessments. Procedia CIRP 2019, 81, 781–784. [Google Scholar] [CrossRef]
- Bortolini, M.; Gamberi, M.; Pilati, F.; Regattieri, A. Automatic assessment of the ergonomic risk for manual manufacturing and assembly activities through optical motion capture technology. Procedia CIRP 2018, 72, 81–86. [Google Scholar] [CrossRef]
- Li, X.; Han, S.; Gül, M.; Al-Hussein, M.; El-Rich, M. 3D Visualization-Based Ergonomic Risk Assessment and Work Modification Framework and Its Validation for a Lifting Task. J. Constr. Eng. Manag. 2018, 144, 04017093. [Google Scholar] [CrossRef]
- Eldar, R.; Fisher-Gewirtzman, D. E-worker postural comfort in the third-workplace: An ergonomic design assessment. Work 2020, 66, 519–538. [Google Scholar] [CrossRef]
- Mehrizi, R.; Peng, X.; Xu, X.; Zhang, S.; Metaxas, D.; Li, K. A computer vision based method for 3D posture estimation of symmetrical lifting. J. Biomech. 2018, 69, 40–46. [Google Scholar] [CrossRef]
- Brunner, O.; Mertens, A.; Nitsch, V.; Brandl, C. Accuracy of a markerless motion capture system for postural ergonomic risk assessment in occupational practice. Int. J. Occup. Saf. Ergon. 2022, 28, 1865–1873. [Google Scholar] [CrossRef]
- Portney, L.G.; Watkins, M.P. Foundations of Clinical Research: Applications to Practice; Pearson/Prentice Hall: Hoboken, NJ, USA, 2015. [Google Scholar]
- Hauenstein, J.D.; Huebner, A.; Wagle, J.P.; Cobian, E.R.; Cummings, J.; Hills, C.; McGinty, M.; Merritt, M.; Rosengarten, S.; Skinner, K.; et al. Reliability of Markerless Motion Capture Systems for Assessing Movement Screenings. Orthop. J. Sports Med. 2024, 12, 23259671241234339. [Google Scholar] [CrossRef]
- Wang, H.; Xie, Z.; Lu, L.; Li, L.; Xu, X. A computer-vision method to estimate joint angles and L5/S1 moments during lifting tasks through a single camera. J. Biomech. 2021, 129, 110860. [Google Scholar] [CrossRef]
- Rybnikár, F.; Kačerová, I.; Horejsi, P.; Michal, Š. Ergonomics Evaluation Using Motion Capture Technology—Literature Review. Appl. Sci. 2022, 13, 162. [Google Scholar] [CrossRef]
- Page, M.J.; McKenzie, J.E.; Bossuyt, P.M.; Boutron, I.; Hoffmann, T.C.; Mulrow, C.D.; Shamseer, L.; Tetzlaff, J.M.; Akl, E.A.; Brennan, S.E.; et al. The PRISMA 2020 statement: An updated guideline for reporting systematic reviews. BMJ 2021, 372, n71. [Google Scholar] [CrossRef]
- Khan, M.U.; Scataglini, S.; Khafaga, N. A Systematic Review of the Accuracy, Validity, and Reliability of Marker-Less Versus Marker-Based 3D Motion Capture for Industrial Ergonomic Risk Analysis; Centre for Reviews and Dissemination, University of York: York, UK, 2025. [Google Scholar]
- Clarivate Analytics. EndNote (Version X20) [Software]; Clarivate Analytics: North Sydney, NSW, Australia, 2021. [Google Scholar]
- Ouzzani, M.; Hammady, H.; Fedorowicz, Z.; Elmagarmid, A. Rayyan—A web and mobile app for systematic reviews. Syst. Rev. 2016, 5, 210. [Google Scholar] [CrossRef]
- Terwee, C.B.; Prinsen, C.A.C.; Chiarotto, A.; Westerman, M.J.; Patrick, D.L.; Alonso, J.; Bouter, L.M.; de Vet, H.C.W.; Mokkink, L.B. COSMIN methodology for evaluating the content validity of patient-reported outcome measures: A Delphi study. Qual. Life Res. 2018, 27, 1159–1170. [Google Scholar] [CrossRef]
- Mokkink, L.B.; Boers, M.; van der Vleuten, C.P.M.; Bouter, L.M.; Alonso, J.; Patrick, D.L.; de Vet, H.C.W.; Terwee, C.B. COSMIN Risk of Bias tool to assess the quality of studies on reliability or measurement error of outcome measurement instruments: A Delphi study. BMC Med. Res. Methodol. 2020, 20, 293. [Google Scholar] [CrossRef]
- Burgers, J.S.; van Everdingen, J.J. Evidence-based guideline development in the Netherlands: The EBRO platform. Ned. Tijdschr. Geneeskd. 2004, 148, 2057–2059. [Google Scholar] [PubMed]
- Abobakr, A.; Nahavandi, D.; Hossny, M.; Iskander, J.; Attia, M.; Nahavandi, S.; Smets, M. RGB-D ergonomic assessment system of adopted working postures. Appl. Ergon. 2019, 80, 75–88. [Google Scholar] [CrossRef] [PubMed]
- Boldo, M.; De Marchi, M.; Martini, E.; Aldegheri, S.; Quaglia, D.; Fummi, F.; Bombieri, N. Real-time multi-camera 3D human pose estimation at the edge for industrial applications. Expert Syst. Appl. 2024, 252, 124089. [Google Scholar] [CrossRef]
- Fan, C.; Mei, Q.; Li, X. 3D pose estimation dataset and deep learning-based ergonomic risk assessment in construction. Autom. Constr. 2024, 164, 105452. [Google Scholar] [CrossRef]
- Jiang, J.; Skalli, W.; Siadat, A.; Gajny, L. Société de Biomécanique young investigator award 2023: Estimation of intersegmental load at L5-S1 during lifting/lowering tasks using force plate free markerless motion capture. J. Biomech. 2024, 177, 112422. [Google Scholar] [CrossRef]
- Liu, P.; Chang, C. Simple method integrating OpenPose and RGB-D camera for identifying 3D body landmark locations in various postures. Int. J. Ind. Ergon. 2022, 91, 103354. [Google Scholar] [CrossRef]
- Manghisi, V.M.; Uva, A.E.; Fiorentino, M.; Bevilacqua, V.; Trotta, G.F.; Monno, G. Real time RULA assessment using Kinect v2 sensor. Appl. Ergon. 2017, 65, 481–491. [Google Scholar] [CrossRef]
- Mehrizi, R.; Peng, X.; Tang, Z.; Xu, X.; Metaxas, D.; Li, K. Toward Marker-free 3D Pose Estimation in Lifting: A Deep Multi-view Solution. In Proceedings of the 2018 13th IEEE International Conference on Automatic Face & Gesture Recognition (FG 2018), Xi’an, China, 15–19 May 2018; pp. 485–491. [Google Scholar]
- Patrizi, A.; Pennestri, E.; Valentini, P. Comparison between low-cost marker-less and high-end marker-based motion capture systems for the computer-aided assessment of working ergonomics. Ergonomics 2016, 59, 155–162. [Google Scholar] [CrossRef]
- Plantard, P.; Muller, A.; Pontonnier, C.; Dumont, G.; Shum, H.; Multon, F. Inverse dynamics based on occlusion-resistant Kinect data: Is it usable for ergonomics? Int. J. Ind. Ergon. 2017, 61, 71–80. [Google Scholar] [CrossRef]
- Plantard, P.; Shum, H.P.; Le Pierres, A.S.; Multon, F. Validation of an ergonomic assessment method using Kinect data in real workplace conditions. Appl. Ergon. 2017, 65, 562–569. [Google Scholar] [CrossRef] [PubMed]
- Seo, J.; Alwasel, A.; Lee, S.; Abdel-Rahman, E.; Haas, C. A comparative study of in-field motion capture approaches for body kinematics measurement in construction. Robotica 2019, 37, 928–946. [Google Scholar] [CrossRef]
- Van Crombrugge, I.; Sels, S.; Ribbens, B.; Steenackers, G.; Penne, R.; Vanlanduit, S. Accuracy Assessment of Joint Angles Estimated from 2D and 3D Camera Measurements. Sensors 2022, 22, 1729. [Google Scholar] [CrossRef]
- Wong, C.; Zhang, Z.; McKeague, S.; Yang, G. Multi-person vision-based head detector for markerless human motion capture. In Proceedings of the 2013 IEEE International Conference on Body Sensor Networks, Cambridge, MA, USA, 6–9 May 2013. [Google Scholar]
- Bonakdar, A.; Riahi, N.; Shakourisalim, M.; Miller, L.; Tavakoli, M.; Rouhani, H.; Golabchi, A. Validation of markerless vision-based motion capture for ergonomics risk assessment. Int. J. Ind. Ergon. 2025, 107, 103734. [Google Scholar] [CrossRef]
- Ojelade, A.; Rajabi, M.S.; Kim, S.; Nussbaum, M.A. A data-driven approach to classifying manual material handling tasks using markerless motion capture and recurrent neural networks. Int. J. Ind. Ergon. 2025, 107, 103755. [Google Scholar] [CrossRef]
- Colyer, S.L.; Evans, M.; Cosker, D.P.; Salo, A.I.T. A Review of the Evolution of Vision-Based Motion Analysis and the Integration of Advanced Computer Vision Methods Towards Developing a Markerless System. Sports Med. Open 2018, 4, 24. [Google Scholar] [CrossRef] [PubMed]
Database: PubMed | Search Strategy |
---|---|
P | (“workplace” OR worker* OR employee* OR industry OR manufacturing OR labor) |
I | (“motion capture” OR “markerless” OR “marker-based” OR “motion analysis” OR kinematics OR biomechanics OR “3D analysis” OR “human movement” OR “body movement”) |
C | (“assessment” OR “risk assessment” OR “task analysis” OR “manual handling” OR RULA OR REBA OR OWAS OR “postural analysis” OR “ergonomic tools”) |
O | (“accuracy” OR “validity” OR “reliability” OR “evaluation” OR “comparison” OR reproducibility OR “measurement properties” OR “Reproducibility of Results” [MeSH]) |
Filters | Full text; English; humans; adults 19+ years; 1 January 2005 to 31 May 2025 |
Database: Web of Science | Search Strategy |
---|---|
P | TS = (“workplace” OR worker* OR employee* OR industry OR manufacturing OR labor OR job) |
I | TS = ((“motion capture” OR “markerless” OR “marker-based” OR “motion analysis” OR kinematics OR biomechanics OR “3D analysis” OR “human movement” OR “body movement”)) |
C | TS = (“assessment” OR “risk assessment” OR “task analysis” OR “manual handling” OR RULA OR REBA OR OWAS OR “postural analysis” OR “ergonomic tools”) |
O | TS = (“accuracy” OR “validity” OR “reliability” OR “comparison” OR reproducibility OR “measurement properties”) |
Database: PubMed | Search Strategy |
---|---|
P | (Industry OR Work) |
I | (Markerless OR Markerbase) |
C | (Ergonomic OR Biomech) |
O | (“Accuracy” OR “Reliability” OR “Validity”) |
Filters | Date range: 2005–2025 |
Database: PubMed | Search Strategy |
---|---|
P | “Ergonomic” OR “Human Factors” OR “Biomechanical Risk” OR “Workplace Ergonomics” OR “Occupational Health” OR “Postural Risk” OR “Work-Related Musculoskeletal Disorders” OR “Ergonomic Assessment” OR “Physical Workload” OR “Occupational Biomechanics” OR “Postural Analysis” OR “Task Analysis” |
I | “Marker-Based” OR “Markerless” OR “Optoelectronic System” OR “Optical Motion Capture” OR “Inertial Motion Capture” OR “Wearable Sensors” OR “Stereophotogrammetry” OR “Motion Tracking” OR “Kinematic Analysis” OR “Human Motion Analysis” OR “Computer Vision” OR “Pose Estimation” OR “3D Motion Analysis” |
C | “Industry” OR “Work” OR “Occupational Setting” OR “Manufacturing” OR “Manual Labor” OR “Digital Human Model” OR “Workplace Safety” OR “Office Ergonomics” OR “Industrial Ergonomics” OR “Manual Handling” OR “Workplace Risk Assessment” |
O | “Accuracy” OR “Validity” OR “Reliability” OR “Evaluation” OR “Performance Analysis” |
Database: PEDro | Search Strategy |
---|---|
Abstract and Title | “motion capture” markerless “marker-based” “motion analysis” biomechanics kinematics “3D analysis” “human movement” “body movement” workplace worker employee industry manufacturing labor job assessment “risk assessment” “task analysis” “manual handling” RULA REBA OWAS “postural analysis” “ergonomic tools” accuracy validity reliability evaluation comparison reproducibility “measurement properties” |
Subdiscipline | Not Specified |
Topic | Not Specified |
Method | Clinical Trials |
Published Since | 1 January 2005 |
Language | English Only |
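Each database-specific strategy above follows the same pattern: the terms within each PICO element are combined with OR, and the four resulting blocks are combined with AND. As a rough illustration of this assembly (not the authors' actual tooling, and with abbreviated term lists rather than the full published strategy):

```python
# Illustrative sketch: combining PICO term blocks into one boolean query.
# Term lists below are shortened examples, not the full search strategy.

def or_block(terms):
    """Join search terms with OR and wrap the block in parentheses."""
    return "(" + " OR ".join(terms) + ")"

population = ['"workplace"', "worker*", "industry"]
intervention = ['"motion capture"', '"markerless"', '"marker-based"']
comparison = ['"risk assessment"', "RULA", "REBA"]
outcome = ['"accuracy"', '"validity"', '"reliability"']

# P AND I AND C AND O: each element is an OR block; blocks are ANDed.
query = " AND ".join(
    or_block(block)
    for block in (population, intervention, comparison, outcome)
)
print(query)
```

Field tags then adapt the same string to each interface (e.g., `TS=` wrapping each block for a Web of Science topic search).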
Category | Inclusion Criteria | Exclusion Criteria |
---|---|---|
Population | Studies involving working-age adults (18–65 years) performing occupational tasks in workplace settings, including industrial, healthcare, and service sectors. | Studies on non-occupational populations (e.g., athletes, children, and elderly individuals outside work contexts). Studies in recreational or non-workplace settings. |
Intervention | Use of marker or markerless camera-based 3D motion capture for ergonomic risk assessment. | Studies utilizing 3D motion capture for non-ergonomic purposes (e.g., general biomechanics and sports performance). |
Comparison | Not obligatory for inclusion; however, studies incorporating any form of comparison (e.g., 3D motion capture vs. traditional ergonomic assessments such as REBA and RULA) were considered. | |
Outcomes | Accuracy, validity, and reliability of 3D motion capture in ergonomic risk evaluation. | Studies not reporting accuracy, validity, or reliability in relation to ergonomic risk assessment. |
Study Design | Quantitative studies, including observational studies, validation studies, randomized controlled trials, cohort studies, and cross-sectional studies. | Qualitative studies, case reports, systematic reviews, or meta-analyses. |
Language | Only studies published in English. | Studies published in languages other than English. |
Study (Author, Year) | Study Design | Setting | Technology Used | Comparison Method | Outcome Measures | Key Findings | Population |
---|---|---|---|---|---|---|---|
Abobakr et al., 2019 [29] | Validation study | Automotive manufacturing industry (field-based, ergonomic tasks) | Novel MCBS RGB-D sensor-based ergonomic risk assessment (CNN deep learning from depth images) | Kinect skeleton datasets (via a synthetic data generation pipeline) | Joint-angle accuracy; RULA grand-score accuracy | Joint-angle error ±3.19° (±1.57°); RULA prediction accuracy 89% (κ = 0.71), indicating substantial agreement | Not explicitly reported
Boldo et al., 2024 [30] | Validation study | Industrial manufacturing environment (multi-camera human–machine interaction) | MCBS (distributed multi-camera 3D human pose estimation deployed at the edge, CNN-based) | MBS (infrared optical motion capture system, multi-camera setup) | Pose estimation accuracy; robustness to occlusion and multi-person tracking | High accuracy and robustness compared to MBSs; effective management of occlusion issues through multi-camera redundancy | Not explicitly reported
Bonakdar et al., 2025 [42] | Experimental validation | Lab experiment (University of Alberta Lab) | Markerless OMC (PoseChecker, RGB camera) + OpenSim | Marker-based OMC (Vicon), IMUs (Xsens), force plates, expert REBA | Joint-angle/JRF accuracy, REBA agreement | Back-angle r ≈ 0.95 vs. MBS-OMC/IMU; L5-S1 JRF r ≈ 0.91; REBA 87% match to MBS-OMC | 8 healthy adults (4F/4M, ~25y) |
Bortolini et al., 2018 [13] | Quasi-experimental study | Industrial workplace | MCBS (Multi-Kinect V2 depth camera network) | EAWS (observational) | EAWS risk index, task postures | MAS system using MCBS MoCap auto-computed EAWS sections accurately; time-saving and reliable for ergonomic evaluation | Automotive assembly operators |
Brunner et al., 2022 [17] | Validation study | Lab (static postures) | MCBS (Kinect V2) | MBS (Vicon Bonita) | Joint-angle accuracy | Axial trunk rotation error ~14°; Kinect V2 showed issues with occlusions; risk of underestimation in posture evaluation | Human-sized mannequin |
Eldar and Fisher-Gewirtzman, 2020 [15] | Experimental study | Simulated “third workplace” | MBS (VICON) | RULA + subjective feedback | Comfort, RULA, kinematic posture | Posture affected by workstation setup; MBS MoCap useful in design of ergonomic settings | 3 seated e-workers |
Fan et al., 2024 [31] | Validation study | Construction industry (field-based, construction tasks) | MCBS CV-based 3D human pose estimation (ConstructionPose3D dataset) | Non-CV-based 3D human pose estimation (solely on MuCo-3DHP) | Accuracy of pose estimation; REBA and RULA ergonomic scores | 35% improvement in accuracy using construction-specific dataset compared to generic datasets; enhanced ergonomic assessment accuracy | 7 construction workers (5 males, 2 females) |
Jiang et al., 2024 [32] | Validation study | Lab (simulated lifting task) | MCBS (deep learning multi-cam) | MBS (Vicon) + Force Plate | L5-S1 load estimation (kinetics) | MCBS system estimated lumbar loads with high accuracy; force diff. ~14 N, moment diff. ~9 Nm | 12 adults (avg. age: 24.2 years) |
Li et al., 2018 [14] | Validation study | Lab (construction lifting task) | Marker-based 3D visualization (MBS) | Traditional observation (joint-angle comparison, risk rating comparison for body segments) + REBA/RULA | Accuracy, validity, risk ratings | 3D visualization was comparable to motion capture and better than observation; useful for workstation redesign | 3 healthy adults |
Liu et al., 2022 [33] | Validation study | Lab (postural tasks) | MCBS (OpenPose + RGB-D) | MBS (OptiTrack Prime 13 motion capture system) | Accuracy (3D landmarks) | Average tracking error: stereoscopic 6.96–12.47 cm; ToF 5.57–9.31 cm; sufficient for postural ergonomic assessment; OpenPose with RGB-D offers a low-cost 3D solution | 30 healthy adults (avg. age: 23.4 years)
Manghisi et al., 2017 [34] | Validation study | General industrial scenarios (real-time posture monitoring) | MCBS (Kinect V2) | Traditional visual assessment; expert rater assessment | RULA grand-score accuracy | Agreement proportion vs. MoCap = 0.97 (κ = 0.87), expert rater = 0.96 (κ = 0.84), indicating near-perfect agreement | Not explicitly reported |
Mehrizi et al., 2018 [16] | Validation study | Lab (symmetrical lifting tasks) | MCBS (computer vision) | MBS (Optical; Motion Analysis Corp., Santa Rosa, CA; 100 Hz, 45-marker setup) | Accuracy (joint angles), validity | Joint angles differed <3° from MBS; accurate enough for ergonomic risk estimation in lifting scenarios | 12 healthy males (47.5 ± 11.3 years) |
Ojelade et al., 2025 [43] | Experimental validation + ML classification | Lab experiment (Virginia Tech lab) | Azure Kinect markerless MoCap + RNNs (Bi-LSTM, GRU, BGRU) | RNN models and feature sets compared | Accuracy, precision, recall, and F1 for task/hand/lift classification | Best ≈ 93% accuracy; TOP-80 > TOP-60; hand ≈ 97% acc., lift origin ≈ 80–84%; GRU+TOP-80 efficient | 36 healthy adults (14 F), right-handed |
Otto et al., 2019 [12] | Evaluation study | Lab (automotive setting) | MCBS (Kinect V2) | Expert evaluation (EAWS) | Validity, feasibility | 9 out of 11 EAWS postures accurately captured; sitting and standing performed well; lying poses not reliably tracked | 3 adult volunteers |
Patrizi et al., 2016 [36] | Validation study | Real working tasks | MCBS (Kinect V1) | MBS (BTS) | Posture score agreement | MCBS Kinect system provided scores close to BTS system; supports Kinect as cost-effective alternative | 3 adult participants |
Plantard et al., 2016 [38] | Observational study | Lab + real workplace | MCBS (Kinect V2) | Human experts + Vicon (MBS) | Validity (RULA), accuracy | Kinect data accurately predicted RULA scores; robust in cluttered industrial settings | 12 volunteers + 7 workers |
Plantard et al., 2017 [37] | Validation study | Simulated lab work tasks | MCBS (Kinect V2) | MBS (Vicon) | Accuracy, reliability, validity | Kinect showed high correlation with MBS; feasible for ergonomic risk assessment in occluded environments | 12 adults (30.1 ± 7.0 years) |
Seo et al., 2019 [39] | Comparative study | Construction tasks | MCBS (RGB-D, Stereo, Multi-cam) | MBS (Optotrak) | Joint-angle error, feasibility | Vision-based systems had 5–10° error; angular sensor ~3°; suitable for field posture screening with minor trade-offs | |
Van Crombrugge et al., 2022 [40] | Validation study | Assembly lab simulation | MCBS (RealSense + Detectron2) | MBS (VICON) | Joint angles, REBA | 3D skeletons from triangulated 2D joint detection estimated key joint angles with moderate error | 1 adult male |
Wong et al., 2013 [41] | Validation study | Operating theatre simulation | Hybrid (MCBS vision-based head detection + wearable IMU) | MBS (BTS SMART-D) | Accuracy (pose estimation), tracking fidelity | Combining visual head tracking and inertial data achieved accurate motion capture under occlusion | Surgical team (simulated)
Study | Accuracy (Metrics) | Validity (Comparison Basis) | Reliability | Parameters Assessed |
---|---|---|---|---|
Abobakr et al. (2019) [29] | High precision from AI-based system; RMSE values within clinical acceptability | Compared estimated joint trajectories against labelled datasets | Not evaluated | Whole-body joint angles, AI vision system performance |
Boldo et al. (2024) [30] | High agreement reported; 3D postural estimates aligned with ergonomic models | Compared visual system output with expert ergonomic review | Reliability discussed qualitatively; no formal metrics | Multi-joint angles, postural scoring via AI |
Bonakdar et al. (2025) [42] | Back-angle r ≈ 0.95 vs. MB-OMC/IMU (RMSE 6.5–9.9°); L5-S1 JRF r ≈ 0.91 (nRMSE ~9%); REBA 87% match to MB-OMC | Compared with marker-based OMC, IMUs, force plates, and expert visual REBA | High consistency across trials; statistical agreement with MB-OMC | Joint angles, joint reaction forces (L5-S1, hip, knee, elbow), REBA scores |
Bortolini et al. (2018) [13] | Postural scores from MAS matched expert EAWS sections; semi-automated scoring evaluated qualitatively | Postural scoring compared with manually scored EAWS | Reliability discussed, not statistically assessed | Assembly line posture, EAWS components |
Brunner et al. (2022) [17] | Mean deviation: 14.04° in axial rotation, maximum deviations in self-occluded positions | Compared MCBS Kinect v2 against MBS Vicon Bonita | Not evaluated | Static upper body poses; trunk rotation accuracy |
Eldar (2020) [15] | Posture and comfort compared across work configurations; scoring deviations not numerically detailed | RULA scoring and subjective feedback used to assess workstation ergonomics | Not evaluated | Neck, spine, and shoulder postures during tablet use |
Fan et al. (2024) [31] | Large dataset used; model accuracy tested with human pose database; joint detection error metrics reported | High model alignment with reference poses in construction simulation | Reliability discussed qualitatively; no formal repeatability testing | Joint center detection, construction task simulation |
Jiang et al. (2024) [32] | RMSE between estimated and reference L5-S1 loads calculated using biplanar radiography and biomechanical modeling | Good agreement between estimated and reference spinal loads; qualitative and quantitative validation | Not formally tested; mentions consistency over trials | L5-S1 spinal load (force and moment components) |
Li et al. (2018) [14] | Numeric accuracy reported: overall correlation for vertical angles r = 0.80 (up to 0.94 for hand, lower arm, and upper/lower leg); trunk flexion (S3) avg. diff. = 10.24°, avg. error = 11%; 14/20 angles < ±14% error | Workstation assessment validity supported by ergonomic expert consensus; no statistical evaluation | Not evaluated | 41 joint angles; lifting task; explicit difference/error equations reported for comparisons
Liu et al. (2022) [33] | 3D landmark error typically ~3–12 cm depending on landmark and posture; largest errors at the hips in occluded postures (e.g., ~20–35 cm for mid-hip/hips in some sitting/stooping conditions) | Compared landmark detection using OpenPose vs. marker-based motion tracking and inertial sensors | Not evaluated | Whole-body 3D body landmarks during static postures
Manghisi et al. (2017) [34] | Acceptable deviation from reference; REBA/EAWS estimates generated in near-real time | AI-driven model tested against expert EAWS scorings | No formal metrics, but repeatability discussed | Joint angles, ergonomic scoring (REBA, EAWS) |
Mehrizi et al. (2018) [16] | Joint-angle error <3° compared to gold standard; strong agreement in sagittal plane | Angle accuracy validated against optical MBS | Qualitative repeatability noted; no formal metrics | Symmetrical lifting task: joint angles, body segment tracking |
Ojelade et al. (2025) [43] | Task classification = 93% agreement; hand classification ≈ 96–97%; lift origin ≈ 80–84% (F1-scores similar) | Compared performance across RNN models (Bi-LSTM, GRU, BGRU) and feature sets (TOP-60, TOP-80) | Stable across cross-validation folds; low variance between runs | Task type classification, hand configuration classification, lift origin classification
Otto et al. (2019) [12] | Not reported numerically; visual posture accuracy confirmed for 9/11 EAWS movements | Ergonomic classification visually compared with EAWS expert assessments | Not tested; performance inconsistencies noted in crouched/lying positions | Postural assessment (EAWS), standing/sitting/crouching |
Patrizi et al. (2016) [36] | Joint-angle deviations from reference system (BTS); correlation reported between Kinect and BTS scores | NIOSH lifting metrics and RULA compared; strong qualitative match | Qualitative repeatability noted; no formal metrics | Trunk and arm angles, NIOSH risk multipliers |
Plantard et al. (2016) [38] | RMSE in RULA score: 0.22–0.68; correlation of shoulder angles: r = 0.68–0.98 across conditions | Compared RULA scores computed from Kinect with expert assessments; κ = 0.46–0.66; 73–74% agreement in real work | Not formally assessed; limitations noted in occluded conditions | RULA scores, shoulder and elbow joint angles |
Plantard et al. (2017) [37] | Joint-angle cross-correlation with Vicon: r = 0.65–0.99; RMSE for joint torque values (nRMSE: 10.6–29.8%) | Joint torques and angles compared with Vicon; acceptable RMSE and correlation; residual force validation under 3.5% | Mean Kinect joint reliability: 0.70–0.91, depending on occlusion level | Shoulder/elbow joint angles, joint torques, residual forces |
Seo et al. (2019) [39] | Joint-angle error: 5–10° for vision systems, 3° for encoder-based systems | Accuracy validated against MBS (Optotrak) data | Not evaluated | Construction task joint tracking (shoulder, trunk) |
Van Crombrugge et al. (2022) [40] | RMS joint-angle error: 12° using combined triangulation; R2 values: 0.43–0.89 for key REBA angles | Joint-angle comparison with VICON system; REBA angle alignment visualized | Not evaluated | Shoulder, elbow, and trunk joint angles relevant to REBA |
Wong et al. (2013) [41] | Numeric accuracy reported (pixels) for head detection, not joint-angle RMSE: mean (±SD) error range ≈ 7.9–85.3 px across single-/multi-person conditions | Compared head pose tracking with BTS system in simulated surgery | Not evaluated | Head tracking under occlusion; IMU + visual detection |
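The tables above report agreement chiefly through three metrics: RMSE (absolute error between paired measurements), Pearson's r (linear correlation between continuous outputs such as joint angles), and Cohen's κ (chance-corrected agreement between categorical scores such as RULA/REBA grand scores). A minimal, self-contained sketch of these computations, using synthetic data rather than values from any included study:

```python
import math

def rmse(a, b):
    """Root-mean-square error between two paired series."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)) / len(a))

def pearson_r(a, b):
    """Pearson correlation coefficient between two paired series."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    sa = math.sqrt(sum((x - ma) ** 2 for x in a))
    sb = math.sqrt(sum((y - mb) ** 2 for y in b))
    return cov / (sa * sb)

def cohens_kappa(r1, r2):
    """Cohen's kappa for two raters over categorical scores."""
    cats = sorted(set(r1) | set(r2))
    n = len(r1)
    p_obs = sum(x == y for x, y in zip(r1, r2)) / n
    p_exp = sum((r1.count(c) / n) * (r2.count(c) / n) for c in cats)
    return (p_obs - p_exp) / (1 - p_exp)

# Synthetic example: markerless vs. marker-based trunk flexion angles (deg).
mcbs = [10.2, 24.8, 41.0, 59.5, 88.7]
mbs = [11.0, 25.0, 40.0, 61.0, 90.0]
print(f"RMSE = {rmse(mcbs, mbs):.2f} deg, r = {pearson_r(mcbs, mbs):.3f}")

# Synthetic RULA grand scores from the two systems.
rula_mcbs = [3, 4, 4, 5, 3, 6, 4, 5]
rula_mbs = [3, 4, 5, 5, 3, 6, 4, 4]
print(f"kappa = {cohens_kappa(rula_mcbs, rula_mbs):.2f}")
```

Under the Landis–Koch convention used by several included studies, κ of 0.61–0.80 indicates substantial agreement and κ > 0.80 near-perfect agreement.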
Study | Reliability | Validity | Accuracy | Conclusion | Rating (V = very good, A = adequate, D = doubtful, I = inadequate; - = not assessed)
---|---|---|---|---|---|
Abobakr et al. (2019) [29] | - | V | V | AI-based assessment with strong accuracy validation | V |
Boldo et al. (2024) [30] | A | V | A | High validity and accuracy; significant reliability concerns | V |
Bonakdar et al. (2025) [42] | A | A | A | Consistently high accuracy, validity, and reliability | A
Bortolini et al. (2018) [13] | A | V | A | Strong validity but lacks formal test–retest reliability | A |
Brunner et al. (2022) [17] | D | A | A | High validity but significant trunk rotation errors | D |
Eldar (2020) [15] | D | V | I | Some reliability issues; accuracy not reported | A |
Fan et al. (2024) [31] | A | V | A | Large dataset validation but lacks test–retest reliability | A |
Jiang et al. (2024) [32] | A | V | V | Strong validity and accuracy; minor reliability concerns | V |
Li et al. (2018) [14] | D | D | A | Lacked reliability testing but accuracy is adequate | A |
Liu et al. (2022) [33] | - | V | A | High validity, accuracy is acceptable | A |
Manghisi et al. (2017) [34] | A | V | A | Strong validity with minor inter-rater concerns | A |
Mehrizi et al. (2018) [16] | A | V | V | High validity and accuracy, minimal bias risk | V |
Ojelade et al. (2025) [43] | A | V | V | High accuracy and reliability, moderate validity | V
Otto et al. (2019) [12] | I | A | I | Limited reliability; validity is acceptable | D |
Patrizi et al. (2016) [36] | - | A | A | Minor validity concerns, overall acceptable | A |
Plantard et al. (2016) [38] | D | A | A | Adequate validity but lacks robust reliability testing | A |
Plantard et al. (2017) [37] | A | A | V | Some reliability concerns but strong accuracy | A |
Seo et al. (2019) [39] | - | V | A | Strong validity, self-occlusion challenges exist | A |
Van Crombrugge et al. (2022) [40] | - | V | A | High validity but lacks dynamic testing | A |
Wong et al. (2013) [41] | D | A | A | Adequate validity; lacks strong reliability testing | A |
Measurement Property | No. of Studies | EBRO Level of Evidence | Summary Justification | Certainty Level |
---|---|---|---|---|
Accuracy | 18 | A2 | Consistent findings from multiple high-quality cohort/validation studies with RMSE < 5°, high correlations (r > 0.85), and joint-angle errors under 10° in most conditions | High |
Validity | 17 | B | Moderate to strong agreement with expert-rated RULA/REBA/EAWS scores; κ up to 0.71; studies well-aligned with ergonomic constructs | Moderate |
Reliability | 6 | C | Limited statistical evaluation (ICCs in a few studies); other studies described repeatability qualitatively or not at all; inconsistent or missing reporting | Low |
Study | Accuracy Reported (1 = yes, 0 = no) | Validity Reported (1 = yes, 0 = no) | Reliability Reported (1 = yes, 0 = no)
---|---|---|---|
Abobakr et al., 2019 [29] | 1 | 1 | 0 |
Boldo et al., 2024 [30] | 1 | 1 | 1 |
Bonakdar et al., 2025 [42] | 1 | 1 | 1 |
Bortolini et al., 2018 [13] | 1 | 1 | 0 |
Brunner et al., 2022 [17] | 1 | 1 | 0 |
Eldar and Fisher-Gewirtzman, 2020 [15] | 0 | 1 | 0 |
Fan et al., 2024 [31] | 1 | 1 | 1 |
Jiang et al., 2024 [32] | 1 | 1 | 0 |
Li et al., 2018 [14] | 1 | 0 | 0 |
Liu et al., 2022 [33] | 1 | 1 | 0 |
Manghisi et al., 2017 [34] | 1 | 1 | 1 |
Mehrizi et al., 2018 [16] | 1 | 1 | 1 |
Ojelade et al., 2025 [43] | 1 | 1 | 1 |
Otto et al., 2019 [12] | 0 | 1 | 0 |
Patrizi et al., 2016 [36] | 1 | 1 | 1 |
Plantard et al., 2016 [38] | 1 | 1 | 0 |
Plantard et al., 2017 [37] | 1 | 1 | 1 |
Seo et al., 2019 [39] | 1 | 1 | 0 |
Van Crombrugge et al., 2022 [40] | 1 | 1 | 0 |
Wong et al., 2013 [41] | 1 | 1 | 0 |
Total | 18 | 19 | 8 |
© 2025 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Scataglini, S.; Fontinovo, E.; Khafaga, N.; Khan, M.; Faizan Khan, M.; Truijen, S. A Systematic Review of the Accuracy, Validity, and Reliability of Markerless Versus Marker Camera-Based 3D Motion Capture for Industrial Ergonomic Risk Analysis. Sensors 2025, 25, 5513. https://doi.org/10.3390/s25175513