Article

An Accessible AI-Assisted Rehabilitation System for Guided Upper Limb Therapy

by
Kevin Hou
1,
Md Mahafuzur Rahaman Khan
2,* and
Mohammad H. Rahman
2
1
Brookfield Central High School, Brookfield, WI 53005, USA
2
Department of Mechanical Engineering, University of Wisconsin-Milwaukee, Milwaukee, WI 53211, USA
*
Author to whom correspondence should be addressed.
Sensors 2025, 25(19), 6239; https://doi.org/10.3390/s25196239
Submission received: 23 July 2025 / Revised: 20 September 2025 / Accepted: 2 October 2025 / Published: 8 October 2025
(This article belongs to the Section Biomedical Sensors)

Abstract

Conventional upper limb rehabilitation methods often encounter significant obstacles, including high costs, limited accessibility, and reduced patient adherence. Emerging technological solutions, such as telerehabilitation, virtual reality (VR), and wearable sensor-based systems, address some of these challenges but still face issues concerning supervision quality, affordability, and usability. To overcome these limitations, this study presents an innovative and cost-effective rehabilitation system based on advanced computer vision techniques and artificial intelligence (AI). Developed using Python (3.11.5), the proposed system utilizes a standard webcam in conjunction with robust pose estimation algorithms to provide real-time analysis of patient movements during guided upper limb exercises. Instructional exercise videos featuring an NAO robot facilitate patient engagement and consistency in practice. The system generates instant quantitative feedback on movement precision, repetition accuracy, and exercise phase completion. The core advantages of the proposed approach include minimal equipment requirements, affordability, ease of setup, and enhanced interactive guidance compared to traditional telerehabilitation methods. By reducing the complexity and expense associated with many VR and wearable-sensor solutions, while acknowledging that some lower-cost and haptic-enabled VR options exist, this single-webcam approach aims to broaden access to guided home rehabilitation without specialized hardware.

1. Introduction

Musculoskeletal disorders significantly contribute to the global burden of disability, affecting approximately 1.7 billion individuals worldwide and representing the leading cause of long-term impairment [1]. Disorders involving the upper limb are particularly debilitating, as they severely restrict essential daily activities. The causes behind upper limb impairments are numerous and include repetitive strain injuries due to occupational overuse, neurological impairments from conditions like stroke, acute traumas such as fractures, and degenerative diseases like osteoarthritis [1,2,3,4]. Among these, stroke is notably prevalent, often leaving survivors with persistent motor deficits in the upper extremities [3]. Beyond stroke, upper limb motor impairments also arise in cerebral palsy [5], Parkinson’s disease, and multiple sclerosis, where structured, repetitive practice can improve range of motion, strength, and task ability [6,7,8]. Including these populations clarifies the general need for accessible, feedback-rich home exercise. Across these populations, structured rehabilitation improves upper limb outcomes: systematic reviews on Parkinson’s disease show benefits for hand dexterity and function, reviews on multiple sclerosis support effects of upper limb training, and CP guidelines synthesize effective motor treatments [6,9,10].
Despite the high incidence and substantial impact of upper limb impairments, considerable barriers remain regarding accessibility, affordability, and patient adherence to rehabilitation programs. In low-resource settings, geographical barriers, inadequate availability of qualified therapists, and prohibitive treatment costs severely restrict access to rehabilitation services. Alarmingly, it has been estimated that 90% of individuals in low-income countries requiring rehabilitation do not receive appropriate care [11]. Even where rehabilitation is available, long-term patient engagement remains problematic. Rates of non-adherence to prescribed home exercise programs are notably high, ranging from 50% to 70% among individuals with musculoskeletal conditions [12]. Consequently, these significant challenges markedly diminish the overall efficacy of conventional rehabilitation strategies.
Technological interventions, including virtual reality (VR), wearable sensors, and telerehabilitation via video conferencing, have been explored to address these barriers, but each approach has critical shortcomings. VR platforms can deliver immersive practice with rich feedback, and some systems incorporate haptic/force cues; entry-level VR has also reduced financial barriers [13]. In parallel, low-cost VR and exergaming options (e.g., consumer HMD/serious-game systems and Nintendo® Wii-based programs in CP) have lowered entry barriers relative to clinic-grade rigs [5,14]. Nevertheless, these typically require specialized hardware and setup, which can limit deployment in low-resource homes. Wearable sensors (e.g., IMUs) are highly portable and, unlike cameras, remain independent of ambient illumination and line-of-sight occlusion; however, they require users to don and doff additional devices [15,16]. Our goal is a minimal-equipment path that complements these approaches when sensors or head-mounted displays are not practical [17]. More broadly, wearable solutions such as inertial measurement units and sensor-integrated gloves provide accurate motion tracking but require users to manage additional hardware, potentially reducing usability and adherence [18]. Likewise, conventional telerehabilitation through video calls expands accessibility but fails to deliver the precise, real-time feedback necessary for adequate exercise supervision [19]. Hence, a clear need persists for rehabilitation solutions that combine technological accessibility, minimal equipment requirements, and interactive, real-time feedback.
This work’s novelty is the integration of three elements into one accessible, single-webcam platform: (i) per-session, individualized range of motion calibration used to qualify repetitions; (ii) socially assistive guidance via short, standardized NAO demonstrations; and (iii) real-time quantitative feedback, including a within-session accuracy analysis that summarizes the average repetition and reports the RMSE relative to a didactic target [20,21]. We then evaluate the system by validating the angle accuracy against a reference setup and by assessing the within-session performance change and user feedback in home-style trials.

2. Background Study

Markerless computer vision has emerged as a highly effective approach to rehabilitation monitoring, eliminating the need for wearable sensors or markers [22]. Pose estimation technologies, notably OpenPose [23] and MediaPipe [24], facilitate the real-time detection and tracking of skeletal landmarks using standard webcams. These technologies provide clinicians with accurate joint angle measurements and range of motion analyses, offering a cost-effective and accessible alternative to traditional motion capture systems. Previously prevalent hardware-based solutions, such as Microsoft’s Kinect, have gradually been replaced by simpler, webcam-based alternatives tailored to home-based rehabilitation scenarios [22]. Validation studies support the efficacy of these markerless approaches, demonstrating accuracy comparable to that of traditional motion-capture systems for assessing various rehabilitation exercises, including squats, and gait analyses [25,26].
Despite these advantages, single-camera pose estimation presents certain limitations, particularly concerning depth ambiguity and occlusion. A single 2D perspective limits accurate tracking of movements toward or away from the camera, potentially reducing measurement precision [27]. Occlusion—occurring when limbs overlap—further complicates accurate joint detection [25]. To address these challenges, advancements such as 3D pose estimation methods involving multi-camera configurations or depth estimation algorithms, such as MediaPipe’s BlazePose, have been proposed [24]. Additionally, careful patient positioning and exercise orientation can mitigate occlusion issues, enhancing the reliability of single-camera setups. For example, fixed frontal or lateral camera placements have been successfully utilized in stroke rehabilitation to maximize limb visibility and tracking accuracy [28].
Beyond motion tracking alone, rehabilitation effectiveness is also significantly enhanced by visual guidance provided by humanoid robots and virtual avatars. Socially assistive robots, exemplified by the NAO robot from SoftBank Robotics, contribute positively to patient motivation and compliance through engaging physical demonstrations combined with verbal encouragement [29,30]. Research by Raso et al. demonstrated improved smoothness and temporal control of shoulder rehabilitation exercises when guided by a NAO robot compared to traditional approaches [30]. Similarly, virtual avatars used in digital coaching platforms have effectively instructed patients, demonstrating comparable clinical outcomes and improved compliance relative to conventional methods [31].
Integrating these technologies, recent home-based telerehabilitation systems have leveraged real-time feedback via markerless pose estimation. Adolf et al. introduced an OpenPose-based system that effectively tracked joint movements during home exercises, albeit with occasional accuracy reductions in certain positions such as floor-based exercises [25]. Cóias et al. developed a webcam-assisted virtual coaching system specifically targeting stroke rehabilitation, successfully providing real-time audiovisual feedback to correct excessive trunk compensatory movements [28]. These systems have been validated for efficacy and patient acceptability, particularly critical during increased remote rehabilitation adoption driven by the COVID-19 pandemic [31].
Nevertheless, current systems continue to rely heavily on specialized hardware or exhibit limited real-time responsiveness, particularly when deployed on low-powered devices, thereby affecting accessibility and usability [22,27,28]. As summarized in Table 1, a range of prior work on webcam-based systems, humanoid robots, and wearables has highlighted specific limitations such as occlusion and setup complexity. Additionally, purely virtual coaching solutions often lack the physical interaction and motivational presence provided by humanoid robots. Addressing these gaps, the proposed system integrates real-time pose estimation using MediaPipe, demonstrative guidance from the NAO robot, and immediate, automated corrective feedback. This innovative approach provides an affordable, user-friendly, and scalable rehabilitation solution suitable for widespread deployment in home settings, aiming to significantly enhance patient compliance and overall rehabilitation outcomes.

3. Methodology

3.1. Exercise Selection and NAO Robot Demonstration

Four exercises targeting the shoulder and elbow joints were selected based on clinical evidence, relevance, and documented effectiveness in enhancing functional recovery of upper limb impairments. The selected exercises were as follows:
Shoulder Abduction and Adduction: These exercises strengthen the deltoid and supraspinatus muscles, improving shoulder stability, and are fundamental for restoring the ability to perform lateral reaching. They are especially important for patients recovering from conditions such as rotator cuff injuries, adhesive capsulitis, and hemiparesis following stroke [35,36]. Additionally, these movements enhance scapulohumeral rhythm, reduce muscular imbalance, and prevent joint dysfunction [37].
Shoulder Flexion and Extension: Crucial for activities such as forward-reaching, overhead lifting, and pushing/pulling movements required in daily life; incorporation of these exercises is critical following adhesive capsulitis, tendonitis, and fractures involving the proximal humerus, all of which lead to limitations in independence. Regular execution significantly improves range of motion (ROM), reduces pain, and accelerates functional rehabilitation [36,38].
Elbow Flexion and Extension: Essential for enhancing joint mobility and strength, these exercises help manage conditions such as post-fracture stiffness, tendinopathies, and stroke-related impairments, aiding in reductions in stiffness and spasticity and improving daily function [38,39].
Shoulder External and Internal Rotation: These movements play a vital role in rotator cuff rehabilitation, shoulder joint stability, and the management of conditions like adhesive capsulitis. Properly structured rotational exercises effectively reduce shoulder pain and enhance functional ROM [36,37].
To facilitate patient engagement, each exercise was visually demonstrated by the humanoid NAO robot (SoftBank Robotics), shown in Figure 1. Demonstration motions were developed using Choregraphe’s timeline editor, precisely defining the keyframes for initial, intermediate, and final exercise positions. Intermediate positions were automatically interpolated, resulting in smooth, natural motions. While the demonstrations were informed by exercises documented in the clinical literature, they were not directly supervised or formally validated by physiotherapists, which we recognize as a limitation of this study. Robot demonstrations were captured as short video clips (10–15 s) and integrated within the rehabilitation software, providing clear visual instructions that patients could easily follow [30,40].

3.2. Pose Estimation and Joint Angle Calculations

The system utilizes MediaPipe Pose (landmark map shown in Figure 2) to detect critical anatomical landmarks in real time from the webcam input, a framework known for high accuracy in human pose tracking [24]. Joint angles were computed using vector mathematics based on the extracted landmark coordinates, with each equation described in order below. All variables are defined explicitly to ensure reproducibility.
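For readers who wish to reproduce the landmark extraction step, the following minimal sketch shows one way 2D arm landmarks can be obtained from a webcam frame with MediaPipe Pose. The helper name get_arm_landmarks and the confidence thresholds are illustrative assumptions, not the paper’s released code.

```python
# Minimal sketch: extracting 2D arm landmark coordinates with MediaPipe Pose.
# get_arm_landmarks and the 0.5 confidence thresholds are assumptions.
import cv2
import mediapipe as mp

mp_pose = mp.solutions.pose

def get_arm_landmarks(frame_bgr, pose, side="LEFT"):
    """Return (shoulder, elbow, wrist) pixel coordinates, or None if undetected."""
    h, w = frame_bgr.shape[:2]
    results = pose.process(cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2RGB))
    if not results.pose_landmarks:
        return None
    lm = results.pose_landmarks.landmark
    ids = [getattr(mp_pose.PoseLandmark, f"{side}_{name}")
           for name in ("SHOULDER", "ELBOW", "WRIST")]
    # MediaPipe returns normalized [0, 1] coordinates; scale to pixels.
    return [(lm[i].x * w, lm[i].y * h) for i in ids]

cap = cv2.VideoCapture(0)
with mp_pose.Pose(min_detection_confidence=0.5,
                  min_tracking_confidence=0.5) as pose:
    ok, frame = cap.read()
    if ok:
        print(get_arm_landmarks(frame, pose))
cap.release()
```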

3.2.1. General Joint Angle Calculation

Given three anatomical landmarks represented by points $A(x_a, y_a)$, $B(x_b, y_b)$, and $C(x_c, y_c)$, the vectors $\vec{V}_1$ and $\vec{V}_2$ are defined as

$$\vec{V}_1 = A - B = (x_a - x_b,\; y_a - y_b),$$
$$\vec{V}_2 = C - B = (x_c - x_b,\; y_c - y_b).$$

The dot product of the two vectors is

$$\vec{V}_1 \cdot \vec{V}_2 = (x_a - x_b)(x_c - x_b) + (y_a - y_b)(y_c - y_b),$$

and their magnitudes are

$$\|\vec{V}_1\| = \sqrt{(x_a - x_b)^2 + (y_a - y_b)^2}, \qquad \|\vec{V}_2\| = \sqrt{(x_c - x_b)^2 + (y_c - y_b)^2}.$$

The angle $\theta$ at joint $B$ then follows from the dot product:

$$\theta = \cos^{-1}\!\left(\frac{\vec{V}_1 \cdot \vec{V}_2}{\|\vec{V}_1\|\,\|\vec{V}_2\|}\right).$$
This general approach was used for shoulder abduction/adduction, shoulder flexion/extension, and elbow flexion/extension.
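A minimal Python sketch of this three-point calculation follows, with NumPy assumed as the numerical backend; the clipping guard is an implementation detail added here for numerical safety, not something stated in the paper.

```python
# Minimal sketch of the three-point joint angle calculation described above.
import numpy as np

def joint_angle(a, b, c):
    """Angle at vertex b (degrees) formed by points a-b-c, each an (x, y) pair."""
    v1 = np.asarray(a, dtype=float) - np.asarray(b, dtype=float)
    v2 = np.asarray(c, dtype=float) - np.asarray(b, dtype=float)
    cos_theta = np.dot(v1, v2) / (np.linalg.norm(v1) * np.linalg.norm(v2))
    # Clip to guard against floating-point values slightly outside [-1, 1].
    return float(np.degrees(np.arccos(np.clip(cos_theta, -1.0, 1.0))))

# Example: shoulder at (0, 0), elbow at (1, 0), wrist at (1, 1) -> 90 deg at the elbow.
print(joint_angle((0, 0), (1, 0), (1, 1)))
```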

3.2.2. Shoulder Medial/Lateral Rotation

Direct measurement of shoulder medial/lateral rotation is difficult with a 2D camera setup. Instead, the forearm was used as an indicator, with a correction for perspective distortion. The forearm vector is defined as

$$\vec{v}_{\text{forearm}} = (w_x - e_x,\; w_y - e_y),$$

where $(e_x, e_y)$ and $(w_x, w_y)$ are the elbow and wrist coordinates, respectively. The forearm length is

$$\|\vec{v}_{\text{forearm}}\| = \sqrt{(w_x - e_x)^2 + (w_y - e_y)^2}.$$

Because elbow flexion alters the perceived forearm length in 2D, a correction factor was applied:

$$L_{\text{corrected}} = \|\vec{v}_{\text{forearm}}\| \cdot \sin(\theta_{\text{elbow}}),$$

where $\theta_{\text{elbow}}$ is the elbow joint angle, defined as

$$\theta_{\text{elbow}} = \cos^{-1}\!\left(\frac{\vec{v}_{\text{upperarm}} \cdot \vec{v}_{\text{forearm}}}{\|\vec{v}_{\text{upperarm}}\|\,\|\vec{v}_{\text{forearm}}\|}\right),$$

and $\vec{v}_{\text{upperarm}} = (e_x - s_x,\; e_y - s_y)$, with $(s_x, s_y)$ as the shoulder coordinates. Finally, the medial/lateral rotation angle is estimated as

$$\theta_{\text{rotation}} = \sin^{-1}\!\left(\frac{L_{\text{corrected}}}{L_{\max}}\right),$$

where $L_{\max}$ is the maximum calibrated forearm length during full extension.
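Combining the equations above, a sketch of the rotation estimate could read as follows; the function signature and the clipping guards are our assumptions.

```python
# Minimal sketch of the perspective-corrected rotation estimate (equations above).
import numpy as np

def rotation_angle(shoulder, elbow, wrist, l_max):
    """Estimate shoulder medial/lateral rotation (degrees) from 2D landmarks.
    l_max is the calibrated forearm length at full extension."""
    s, e, w = (np.asarray(p, dtype=float) for p in (shoulder, elbow, wrist))
    v_forearm = w - e
    v_upperarm = e - s
    cos_elbow = np.dot(v_upperarm, v_forearm) / (
        np.linalg.norm(v_upperarm) * np.linalg.norm(v_forearm))
    theta_elbow = np.arccos(np.clip(cos_elbow, -1.0, 1.0))
    # Correct the apparent forearm length for elbow flexion.
    l_corrected = np.linalg.norm(v_forearm) * np.sin(theta_elbow)
    ratio = np.clip(l_corrected / l_max, -1.0, 1.0)
    return float(np.degrees(np.arcsin(ratio)))
```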

3.3. System Workflow

Upon initiation, the system presents an intuitive start menu, designed with Tkinter, that enables the patient to select their target limb (left or right arm) and the specific joint (elbow or shoulder) intended for rehabilitation. Following the selection, the system activates either an integrated or external webcam to track the patient’s movements. Concurrently, an instructional video demonstrating the exercise, performed by the NAO robot, is displayed alongside a real-time webcam feed. Clear and informative overlays—including exercise instructions, joint angle measurements, and helpful visual diagrams rendered via OpenCV—provide immediate, actionable feedback, see Figure 3.
The rehabilitation session begins with a calibration phase, wherein the patient is instructed to perform one repetition of the demonstrated exercise at their maximum achievable range. Using the MediaPipe BlazePose pose estimation model, the system records and evaluates joint angles during this initial repetition. If the performed repetition does not reach the full ideal range of motion, defined using clinical sources [41,42], the system detects the discrepancy and dynamically adjusts the subsequent exercise parameters, ensuring personalized and realistic therapeutic goals that align with the patient’s current capabilities.
Throughout the session, the system continuously captures video frames from the webcam and processes them using the BlazePose model to identify anatomical landmarks accurately. These 2D landmark coordinates enable the precise computation of joint angles through trigonometric and vector calculations. Real-time joint angle measurements are then compared against established ideal ranges, determined during the calibration phase, allowing the system to accurately assess the quality and phase (such as “up” or “down” and “full extension” or “full flexion”) of each executed movement.
Patients perform the exercises guided by the NAO robot demonstration, and clear graphical feedback is displayed on the screen. Instructions and corrective feedback are dynamically overlaid onto the live webcam feed, facilitating immediate corrections and ensuring adherence to the desired movement patterns. Upon successful completion of the required range of motion for each phase, repetitions are automatically counted. Incomplete or incorrect attempts are disregarded, though future developments could record these failed reps to track individual progress throughout multiple sessions. After achieving a set of ten repetitions, the system seamlessly transitions to the subsequent exercise. This calibration and feedback loop is systematically repeated for each prescribed exercise targeting the chosen joint. The rehabilitation session concludes automatically upon the completion of all assigned exercises or manually upon the patient’s request. This adaptive, responsive design allows the system to continuously tailor the therapeutic experience to the patient’s evolving needs, ensuring personalized, effective, and engaging rehabilitation.
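As a concrete illustration of this loop, the sketch below shows one way the calibrated range could gate phase transitions and repetition counting. The function name, the closure-based state, and the 10° tolerance are assumptions for illustration, not the released implementation.

```python
# Illustrative sketch (not the paper's code) of phase detection and repetition
# counting against a calibrated range; the 10-degree tolerance is an assumption.
def make_rep_counter(calibrated_min, calibrated_max, tolerance_deg=10.0):
    state = {"phase": "down", "reps": 0}

    def update(angle):
        if state["phase"] == "down" and angle >= calibrated_max - tolerance_deg:
            state["phase"] = "up"      # reached the calibrated peak
        elif state["phase"] == "up" and angle <= calibrated_min + tolerance_deg:
            state["phase"] = "down"    # returned to baseline: one full rep
            state["reps"] += 1
        return state["phase"], state["reps"]

    return update

counter = make_rep_counter(calibrated_min=10.0, calibrated_max=85.0)
for a in [12, 40, 80, 83, 60, 30, 15]:   # simulated angle stream
    phase, reps = counter(a)
print(phase, reps)   # -> down 1
```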

4. Experimental Setup

4.1. Participants and Informed Consent

Thirteen volunteers participated in total, all of whom were healthy adults (≥18 years old) with no history of upper limb injuries, neurological disorders, or musculoskeletal impairments, recruited through networking within the local community. Inclusion criteria required full cognitive ability to understand and follow instructions. Individuals with current or past shoulder/elbow pathology, surgical history, or ongoing rehabilitation were excluded. Participant demographics are summarized in Table 2.
Prior to participation, all individuals received a detailed explanation of the study objectives and procedures and any potential risks. Written informed consent was obtained from each participant. This study involved minimal risk, and no identifiable personal data were collected.

4.2. Experiment Overview

The first experiment, an angle calculation validation test (shown in Figure 4), was conducted in a quiet, well-lit room with ample space for safe and comfortable upper limb movements. The room contained a single plain-colored flat wall. Clearly visible physical angle markers (30°, 45°, 60°, 90°, 120°, 135°, 150°, and 180°) were drawn on large sheets of paper and pinned to the wall as visual, measurable guides for correct limb placement.
The rehabilitation setup consisted of a laptop with an ordinary webcam running the rehabilitation program, placed on a stable table in front of the participant at waist height, approximately 1 m away. This configuration ensured maximum angle coverage and unobstructed visibility of participant movement.
Five volunteer participants aged 18–65, recruited through local networking, all with no history of upper limb injury or mobility impairment, undertook the trials. This sample was chosen to ensure that the prescribed poses could be executed precisely in line with the physical markers on the wall, eliminating mobility limitations as a confounding factor.
The participants performed the series of arm exercises, placing their limb precisely in line with each of the prescribed angle markers on the wall. During each trial, the rehabilitation software computed the joint angle in real time using its internal angle calculation functions, and both the visually confirmed marker angle and the system-determined angle were recorded for comparison. The recorded system angles for each marker were averaged across participants.
This controlled test measured the program’s joint angle detection accuracy by comparing its software-calculated angles against conventional physical angle markers. This validation step was undertaken to establish the reliability and accuracy of the program’s real-time feedback for clinical and home-based use, where occlusion is more likely, especially in home settings.
For the next experiment (an example is shown in Figure 5), the 13 healthy participants, aged 18–65 and recruited through networking within the local community, performed all exercises using the system. The trials were conducted inside the participants’ homes to emulate the home-based application of the program. The objective was to determine whether the system feedback improved a patient’s exercise accuracy. To this end, the detected cycles were split into a first half and a second half; the average range of motion (ROM) was computed for each half; this was repeated across all participants to obtain a grand average per half; the average repetitions were modeled as sinusoidal functions; the Root Mean Square Error (RMSE) of each model against the ideal trajectory was calculated; and the errors of the two halves were compared to quantify accuracy improvements. All participants also completed a post-survey evaluating various aspects of the system based on personal experience.

4.3. Data Splitting and Cycle Segmentation

The raw angle data are collected and stored in an array by the program for processing after the session ends.
Data Acquisition: The angle data are recorded at uniform time intervals (every 0.5 s).
Cycle Definition: Each cycle represents a complete movement from a minimum (baseline) to a maximum (peak) and back to a minimum.
Trough Detection: Local minima (troughs) are detected in the data using an algorithm (by applying the find_peaks function on the negative of the data). Each cycle is defined from one trough to the next.
For each cycle,

$$\text{ROM} = \text{Peak} - \text{Baseline},$$

where the baseline is the angle at the trough, and the peak is the maximum angle within that cycle.
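A minimal sketch of this trough-based segmentation, using SciPy’s find_peaks as described above; the synthetic test trace is illustrative only.

```python
# Minimal sketch of trough-based cycle segmentation and per-cycle ROM,
# following the steps above (sampling period 0.5 s as stated).
import numpy as np
from scipy.signal import find_peaks

def segment_cycles(angles):
    """Return a list of (baseline, peak, rom) tuples, one per detected cycle."""
    angles = np.asarray(angles, dtype=float)
    troughs, _ = find_peaks(-angles)          # local minima of the angle trace
    cycles = []
    for start, end in zip(troughs[:-1], troughs[1:]):
        segment = angles[start:end + 1]
        baseline = segment.min()
        peak = segment.max()
        cycles.append((baseline, peak, peak - baseline))  # ROM = peak - baseline
    return cycles

# A synthetic trace sampled every 0.5 s with troughs at t = 4 s and t = 8 s.
t = np.arange(0, 12.5, 0.5)
trace = 10 + 80 * np.abs(np.sin(np.pi * t / 4))
print(segment_cycles(trace))   # -> approximately [(10.0, 90.0, 80.0)]
```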

4.4. Splitting into Two Halves and Averaging

The angle data are then split into two halves, and each half is averaged.
Cycle Grouping: Once cycles are identified using the find_peaks function from SciPy (1.13.0), they are split into two groups: the first half and the second half of the session.
Averaging: For each half, compute the average baseline and the average ROM:

$$\overline{\text{Baseline}} = \frac{1}{n}\sum_{i=1}^{n} \text{Baseline}_i, \qquad \overline{\text{ROM}} = \frac{1}{n}\sum_{i=1}^{n} \text{ROM}_i.$$
This process is repeated for all participants’ data, and the resulting averages are averaged for each half.
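The half-session averaging can be sketched as follows, reusing the segment_cycles helper from the previous sketch; the even split at the midpoint is an assumption where the paper does not specify tie-breaking for odd cycle counts.

```python
# Minimal sketch of splitting detected cycles into session halves and averaging.
import numpy as np

def half_session_averages(cycles):
    """cycles: list of (baseline, peak, rom). Returns per-half mean baseline/ROM."""
    mid = len(cycles) // 2                     # assumed midpoint split
    halves = (cycles[:mid], cycles[mid:])
    out = []
    for half in halves:
        baselines = [c[0] for c in half]
        roms = [c[2] for c in half]
        out.append((float(np.mean(baselines)), float(np.mean(roms))))
    return out   # [(baseline_first, rom_first), (baseline_second, rom_second)]
```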

4.5. Modeling the Average Rep

A normalized time axis is defined from t = 0 to t = 1 for one complete cycle, where t = 0 corresponds to the start of the repetition and t = 1 to its completion. Each individual repetition, regardless of its absolute duration, is rescaled to this unit interval so that intermediate points ( 0 < t < 1 ) represent proportional progress through the movement. This normalization allows repetitions of different lengths to be compared on a common temporal basis and enables averaging across participants without being affected by execution speed.
The average movement is modeled using a sinusoidal function:

$$\text{Model}(t) = \overline{\text{Baseline}} + \overline{\text{ROM}} \cdot \sin(\pi t).$$

This function assumes that
  • at $t = 0$ and $t = 1$, the angle is at the baseline (neutral);
  • at $t = 0.5$, the movement reaches the peak.

4.6. Ideal Trajectory and RMSE Calculation

The ideal movement is defined as

$$\text{Ideal}(t) = \text{IdealRange} \cdot \sin(\pi t).$$

For example, if the ideal ROM is 180°, then

$$\text{Ideal}(t) = 180 \cdot \sin(\pi t).$$

The RMSE between the modeled average rep and the ideal function is calculated as

$$\text{RMSE} = \sqrt{\frac{1}{N}\sum_{i=1}^{N} \left(\text{Model}(t_i) - \text{Ideal}(t_i)\right)^2},$$

where $t_i$ are sample points on the normalized time axis.
This process is repeated across both halves of each exercise.
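Putting the model, the ideal trajectory, and the RMSE together, a compact sketch follows; the sample count of 101 points on the normalized axis is an arbitrary choice, not specified in the paper.

```python
# Minimal sketch of the sinusoidal model, the ideal trajectory, and their RMSE
# on the normalized time axis described above.
import numpy as np

def rmse_vs_ideal(baseline_mean, rom_mean, ideal_range, n_samples=101):
    t = np.linspace(0.0, 1.0, n_samples)            # normalized time axis
    model = baseline_mean + rom_mean * np.sin(np.pi * t)
    ideal = ideal_range * np.sin(np.pi * t)
    return float(np.sqrt(np.mean((model - ideal) ** 2)))

# Example: an averaged rep with a 10 deg baseline offset and 150 deg ROM,
# scored against a 180 deg ideal flexion/extension trajectory.
print(rmse_vs_ideal(baseline_mean=10.0, rom_mean=150.0, ideal_range=180.0))
```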

5. Results and Discussion

The results of this study demonstrate that the proposed computer-vision-based rehabilitation system effectively computes the patients’ joint angles and enhances their movement accuracy across a variety of upper limb exercises. The quantitative assessment using the Root Mean Square Error (RMSE) serves as an objective measure of improvement, and the observed trends align with the system’s intended function of guiding and refining patients’ exercise performance. The demographic data of the participants are shown in Table 2.

5.1. Angle Calculation Validation

The following tables present the results of the first experiment, the angle calculation validation test. Each trial reports the angle detected by the system for the corresponding physical marker angle, averaged across the five participants (IDs HP1–HP5).
Validation results indicate that the accuracy of the system’s joint angle detection varied with the movement being tested. Among the four movements, the smallest RMSE was for shoulder abduction/adduction (Table 3) at 1.04°, and the largest was for shoulder flexion/extension (Table 4) at 8.13°. These results suggest that the system performs best when tracking joint movements in the coronal plane, where joints are less occluded and angle changes are more apparent to a front-facing camera.
Elbow flexion/extension (Table 5) produced an RMSE of 8.05°, which, although smaller than that for shoulder flexion/extension, still indicates detectable deviation between the actual and detected angles. This may be due to the reduced visibility of the joint articulation and the more dynamic angle changes characteristic of elbow motion, which challenge 2D detection accuracy in some frames.
Shoulder medial/lateral rotation validation (Table 6) yielded an RMSE of 6.44°, indicating moderate accuracy. Given that this movement often involves subtle internal rotation of the upper arm and potential occlusion of elbow or wrist landmarks, the system’s ability to keep the error under 7° in this exercise indicates usably reliable rotational measurements.
Across every task, the estimated errors were well within acceptable limits for non-clinical rehabilitation environments, particularly when the system is used to track trends or identify large changes over time rather than for diagnosis. Reporting the RMSE at decimal precision keeps the evaluation faithful to the true detection fidelity, even though angles are rounded to the nearest integer on the system interface for user simplicity.
Overall, the quantitative verification demonstrates that the system offers sufficient accuracy for guided rehabilitation across a variety of upper limb movements, with particularly strong performance for motions with minimal occlusion and clear camera visibility.

5.2. Movement Accuracy Improvements

The following graphs compare the modeled average of the detected angles across the 13 participants in the second experiment with the ideal trajectory, separated into the first and second halves of each session to show improvement.
There was a consistent reduction in the RMSE across all four targeted exercises when comparing the first and second halves of the repetitions, as shown in Table 7. This reduction indicates that participants, by interacting with the system, became increasingly accurate in their movement patterns within a single session. Shoulder abduction/adduction (Figure 6) showed a 29.3 percent decrease in the RMSE, from 17.08° to 12.08°. Shoulder flexion/extension (Figure 7) showed a slightly larger decrease of 32.5 percent, from 6.80° to 4.59°. Elbow flexion/extension (Figure 8) showed the largest improvement, a 42.3 percent decrease (from 5.98° to 3.45°), suggesting that the system handles simpler hinge-like joints especially well. Shoulder medial/lateral rotation (Figure 9) also saw a notable RMSE decrease of 39.9 percent, from 3.53° to 2.12°. The differing levels of improvement between exercises likely correspond to differences in joint complexity and movement visibility within the camera’s field of view.

5.3. Interpretation and Clinical Relevance

These results demonstrate that real-time feedback supplied by the computer vision AI system can significantly improve the movement accuracy, achieving the primary research objective. Notably, even more complicated movements, such as shoulder rotation, showed noteworthy error reductions.
This result has significant implications for clinical and home-based rehabilitation. A reduction in the RMSE reflects more accurate control of joint movements, which is essential for effective motor recovery. By encouraging patients toward proper performance, the system can help mitigate compensatory maneuvers, minimize injury risk, and promote regular compliance with rehabilitation programs.
The participants’ responses, shown in Table 8, also support the system’s effectiveness and acceptability. Overall satisfaction was high, with most areas of the system receiving high Likert-scale scores. The NAO robot video presentation received slightly lower (though still moderate) scores compared to other areas. This suggests that while the system’s functional guidance is good, its use of the NAO robot as a visual exemplar was less effective than originally intended and might be improved by refining the animations or by exploring alternative forms of demonstration.
Notably, the scarcity of low ratings (scores of 1 or 2) suggests that the system met or exceeded user expectations for most participants. The occasional lower ratings likely reflect personal preference rather than systemic problems and could be addressed by adding more customization features such as color schemes, audio instructions, or font adjustments.

6. Conclusions

The ability of the system to provide measurable improvements in movement accuracy without requiring expensive equipment, dedicated sensors, or constant professional oversight highlights its potential to expand access to effective rehabilitation. Its affordability, accessibility, and efficacy position it as a suitable option for supplementing home-based rehabilitation, particularly for patients facing economic barriers to traditional therapy.
Nevertheless, important limitations remain. Webcam-based systems are inherently susceptible to occlusions, especially in uncontrolled home environments, which may reduce tracking accuracy. Future work should address this challenge by exploring hybrid solutions that combine vision-based tracking with wearable or depth-sensing technologies. In particular, inertial measurement units (IMUs) have been shown to enhance motion tracking accuracy in rehabilitation contexts [43]. Kinect-based methods for investigating joint movement connections have also demonstrated promising results [44,45,46] and could be adapted to improve accuracy and robustness in our framework. However, such additions would remain optional, as the extra hardware can be unaffordable for some users.
Further developments will also include expanding the system to support more complex compound movements, implementing bilateral execution of motor tasks (e.g., shoulder flexion) to better reflect real-life functionality, and integrating robotic assistance when required for impaired users. Moreover, while incomplete or incorrect repetitions are excluded from sessions currently, future versions of the system could log such attempts as valuable indicators of user progress, enabling longitudinal tracking and adaptive feedback during rehabilitation. Additionally, features such as personalized exercise plan generation and visual customization can further improve flexibility and adaptability across diverse patient populations.

Author Contributions

Conceptualization: K.H. and M.H.R.; data curation: K.H. and M.M.R.K.; formal analysis: K.H. and M.M.R.K.; investigation: K.H. and M.M.R.K.; methodology: K.H. and M.M.R.K.; software: K.H.; supervision: M.H.R.; validation: K.H., M.M.R.K., and M.H.R.; visualization: K.H. and M.M.R.K.; writing—original draft: K.H. and M.M.R.K.; writing—review and editing: K.H., M.M.R.K., and M.H.R. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

This study involved minimal risk and included only healthy adult volunteers performing non-interventional, non-clinical activities. According to U.S. federal regulations [45 CFR 46.104(d)(2)], research that involves only the collection and analysis of data obtained through non-invasive observation of behavior is exempt from IRB review. No identifiable data were collected, and all participants gave their written informed consent in accordance with the Declaration of Helsinki.

Informed Consent Statement

This study involved minimal risk and was conducted with healthy adult volunteers who provided written informed consent prior to participation. No identifiable personal data were collected, and no clinical procedures or interventions were performed.

Data Availability Statement

The data that support the findings of this study are available from the corresponding author [Kevin Hou] upon reasonable request.

Acknowledgments

The authors acknowledge the use of ChatGPT 4.5 for idea exploration and language improvement.

Conflicts of Interest

The authors declare no conflicts of interest.

References

1. World Health Organization. Musculoskeletal Conditions (Fact Sheet). 2022. Available online: https://www.who.int/news-room/fact-sheets/detail/musculoskeletal-conditions (accessed on 10 June 2025).
2. Lacerda, E.M.; Nacul, L.C.; Augusto, L.G.d.S.; Olinto, M.T.A.; Rocha, D.C.; Wanderley, D.C. Prevalence and associations of symptoms of upper extremities, repetitive strain injuries (RSI) and “RSI-like” condition: A cross-sectional study of bank workers in Northeast Brazil. BMC Public Health 2005, 5, 107.
3. Hayward, K.S.; Kramer, S.F.; Dalhousie, E.J.; Shearer, J.; Carroll, T.; Lynch, E.; Janssen, H.; Kho, M.H.; Turkstra, L.S.; Senserrick, T.; et al. Timing and dose of upper limb motor intervention after stroke: A systematic review. Stroke 2021, 52, 3706–3717.
4. Khan, M.M.R.; Sunny, M.S.H.; Ahmed, T.; Shahria, M.T.; Modi, P.P.; Zarif, M.I.I.; De Caro, J.D.S.; Ahamed, S.I.; Ahmed, H.U.; Ketchum, E.M.M.; et al. Development of a robot-assisted telerehabilitation system with integrated IIoT and digital twin. IEEE Access 2023, 11, 70174–70189.
5. Suglia, V.; Maselli, A.; Ranieri, G.; Di Capua, R.; Lancia, S.; Fiore, G.; Lanzilotti, R.; Bevilacqua, V. A Serious Game for the Assessment of Visuomotor Adaptation Capabilities during Locomotion Tasks Employing an Embodied Avatar in Virtual Reality. Sensors 2023, 23, 5017.
6. Faccioli, S.; Pagliano, E.; Ferrari, A.; Maghini, C.; Siani, M.F.; Sgherri, G.; Cappetta, G.; Borelli, G.; Farella, G.M.; Foscan, M.; et al. Evidence-based management and motor rehabilitation of cerebral palsy: An overview to develop shared recommendations. Front. Neurol. 2023, 14, 1171224.
7. Semrau, J.A.; Perlmutter, J.S.; Thoroughman, K.A. Visuomotor adaptation in Parkinson’s disease: Effects of perturbation type and medication state. J. Neurophysiol. 2014, 111, 2675–2687.
8. Nguemeni, C.; Nakchbandi, L.; Homola, G.; Zeller, D. Impaired consolidation of visuomotor adaptation in patients with multiple sclerosis. Eur. J. Neurol. 2021, 28, 884–892.
9. Proud, E.L.; Miller, K.J.; Morris, M.E.; McGinley, J.L.; Blennerhassett, J.M. Effects of Upper Limb Exercise or Training on Hand Dexterity and Function in People with Parkinson Disease: A Systematic Review and Meta-analysis. Arch. Phys. Med. Rehabil. 2024, 105, 1375–1387.
10. Lang, C.E.; Edwards, D.F.; Birkenmeier, R.L. Upper Limb Rehabilitation in People With Multiple Sclerosis: A Systematic Review. Neurorehabilit. Neural Repair 2016, 30, 773–793.
11. World Health Organization; United Nations Children’s Fund. Global Report on Assistive Technology; World Health Organization: Geneva, Switzerland, 2022.
12. Naqvi, A.A.; Hassali, M.A.; Naqvi, S.; Zehra, T.; Ahmad, N.; Zafar, S.; Islam, M.; Iqbal, Q. Development and validation of the General Rehabilitation Adherence Scale (GRAS) in patients attending physical therapy clinics for musculoskeletal disorders. BMC Musculoskelet. Disord. 2020, 21, 65.
13. Bortone, I.; Leonardis, D.; Mastronicola, N.; Crecchi, A.; Bonfiglio, L.; Procopio, C.; Solazzi, M.; Frisoli, A. Wearable haptics and immersive virtual reality rehabilitation training in children with neuromotor impairments. IEEE Trans. Neural Syst. Rehabil. Eng. 2018, 26, 1469–1478.
14. Chesser, B.T.; Blythe, S.A.; Ridge, L.D.; Tomaszewski, R.E.; Kinne, B.L. Effectiveness of the Wii for Pediatric Rehabilitation in Individuals with Cerebral Palsy: A Systematic Review. Phys. Ther. Rev. 2020, 25, 106–117.
15. Camardella, C.; Chiaradia, D.; Bortone, I.; Frisoli, A.; Leonardis, D. Introducing wearable haptics for rendering velocity feedback in VR serious games for neuro-rehabilitation of children. Front. Virtual Real. 2023, 3, 1019302.
16. Patel, S.; Park, H.; Bonato, P.; Chan, L.; Rodgers, M. A review of wearable sensors and systems with application in rehabilitation. J. NeuroEngineering Rehabil. 2012, 9, 21.
17. Paladugu, P.; Garcia, A.; Sporn, K.A.; Berman, J.; Ahmed, I.; Abhishek, K.; Carey, J.; Vaccaro, A.R. Virtual reality-enhanced rehabilitation for improving musculoskeletal function and recovery after trauma. J. Orthop. Surg. Res. 2025, 20, 404.
18. Chandrasekaran, R.; Katthula, V.; Moustakas, E. Patterns of use and key predictors for the use of wearable health care devices by US adults: Insights from a national survey. J. Med. Internet Res. 2020, 22, e22443.
19. Cottrell, M.A.; Russell, T.G. Telehealth for musculoskeletal physiotherapy: Challenges and opportunities. Musculoskelet. Sci. Pract. 2020, 48, 102193.
20. Nakano, N.; Sakura, T.; Ueda, K.; Omura, L.; Kimura, A.; Iino, Y.; Fukashiro, S.; Yoshioka, S. Evaluation of 3D markerless motion capture accuracy using OpenPose with multiple video cameras. Front. Sport. Act. Living 2020, 2, 50.
21. Pulido, J.C.; Pegalajar, M.; Gargallo, P.; Carrasco, L. Humanoid robot-based system for the physical rehabilitation of children with special needs: From a conceptual design to a test with users. In Proceedings of the XX International Conference on Human Computer Interaction (Interacción 2019), Donostia, Spain, 25–28 June 2019; pp. 1–8.
22. Lam, W.W.T.; Tang, Y.M.; Fong, K.N.K. A systematic review of the applications of markerless motion capture (MMC) technology for clinical measurement in rehabilitation. J. NeuroEngineering Rehabil. 2023, 20, 57.
23. Cao, Z.; Simon, T.; Wei, S.E.; Sheikh, Y. Realtime Multi-Person 2D Pose Estimation Using Part Affinity Fields. In Proceedings of the 2017 IEEE Conference on Computer Vision and Pattern Recognition (CVPR), Honolulu, HI, USA, 21–26 July 2017.
24. Lugaresi, C.; Tang, J.; Nash, H.; McClanahan, C.; Uboweja, E.; Hays, M.; Zhang, F.; Chang, C.L.; Yong, M.G.; Lee, J.; et al. MediaPipe: A Framework for Building Perception Pipelines. In Proceedings of the 2019 IEEE/CVF International Conference on Computer Vision Workshops (CVPRW), Long Beach, CA, USA, 16–17 June 2019.
25. Adolf, J.; Dolezal, J.; Kutilek, P.; Hejda, J.; Lhotska, L. Single Camera-Based Remote Physical Therapy: Verification on a Large Video Dataset. Appl. Sci. 2022, 12, 799.
26. Ota, M.; Tateuchi, H.; Hashiguchi, T.; Kato, T.; Ogino, Y.; Yamagata, M.; Ichihashi, N. Verification of reliability and validity of motion analysis systems during bilateral squat using human pose tracking algorithm. Gait Posture 2020, 80, 62–67.
27. Boudreault-Morales, G.E.; Marquez-Chin, C.; Liu, X.; He, K.; Zhou, S.; Zariffa, J. The effect of depth data and upper limb impairment on lightweight monocular RGB human pose estimation models. BioMedical Eng. OnLine 2025, 24, 12.
28. Cóias, A.R.; Lee, M.H.; Bernardino, A. A low-cost virtual coach for 2D video-based compensation assessment of upper extremity rehabilitation exercises. J. NeuroEngineering Rehabil. 2022, 19, 83.
29. Assad-Uz-Zaman, M.; Islam, M.R.; Rahman, M.H.; Wang, Y.C.; McGonigle, E. Kinect Controlled NAO Robot for Telerehabilitation. J. Intell. Syst. 2021, 30, 224–239.
30. Raso, A.; Pulcinelli, M.; Schena, E.; Puglisi, A.; Pioggia, G.; Carnevale, A.; Longo, U.G. A pilot study for assessing NAO humanoid robot assistance in shoulder rehabilitation. J. Exp. Orthop. 2024, 12, e70122.
31. Bettger, J.P.; Green, C.L.; Holmes, D.N.; Chokshi, A.; Mather, R.C., III; Hoch, B.T.; De Leon, A.J.; Aluisio, F.; Seyler, T.M.; Del Gaizo, D.J.; et al. Effects of Virtual Exercise Rehabilitation In-Home Therapy Compared with Traditional Care After Total Knee Arthroplasty: VERITAS, a Randomized Controlled Trial. J. Bone Jt. Surgery. Am. Vol. 2020, 102, 101–109.
32. Osawa, K.; You, Y.; Sun, Y.; Wang, T.Q.; Zhang, S.; Shimodozono, M.; Tanaka, E. Telerehabilitation System Based on OpenPose and 3D Reconstruction With Monocular Camera. J. Robot. Mechatronics 2023, 35, 586–600.
33. Yu, C.Y.; Haeberle, R.A.; Starfinger, A.; Steele, Z.J.; Gulfidan, A.H.; Thomas, M.A.; Stieghahn, J.; Winter, J.A.; Ukwatta, S.N.; Sykes, E.A. Feasibility of 3D Body Tracking From Monocular 2D Video Feeds in Musculoskeletal Telerehabilitation. Sensors 2024, 24, 206.
34. Komaris, D.S.; Tarfali, G.; O’Flynn, B.; Tedesco, S. Unsupervised IMU-Based Evaluation of At-Home Exercise Programmes: A Feasibility Study. BMC Sport. Sci. Med. Rehabil. 2022, 14, 28.
35. Fahy, K.; Galvin, R.; Lewis, J.; McCreesh, K. Exercise as effective as surgery in improving quality of life, disability, and pain for large to massive rotator cuff tears: A systematic review & meta-analysis. Musculoskelet. Sci. Pract. 2022, 61, 102597.
36. Mertens, M.G.; Meert, L.; Struyf, F.; Schwank, A.; Meeus, M. Exercise Therapy Is Effective for Improvement in Range of Motion, Function, and Pain in Patients with Frozen Shoulder: A Systematic Review and Meta-analysis. Arch. Phys. Med. Rehabil. 2022, 103, 998–1012.e14.
37. Vila-Dieguez, O.; Heindel, M.D.; Awokuse, D.; Kulig, K.; Michener, L.A. Exercise for rotator cuff tendinopathy: Proposed mechanisms of recovery. Shoulder Elb. 2023, 15, 233–249.
38. Abd El-Fatah Ismael, M.; Labieb, M.M.; Khalil, S.S.; Hashem, E.M. Effect of Rehabilitation Exercises on Shoulder Function after Proximal Humeral Fracture Surgery. Egypt. J. Health Care 2023, 14, 537–550.
39. Tenberg, S.; Mueller, S.; Vogt, L.; Roth, C.; Happ, K.; Scherer, M.; Behringer, M.; Niederer, D. Comparative Effectiveness of Upper Limb Exercise Interventions in Individuals With Stroke: A Network Meta-Analysis. Stroke 2023, 54, 1839–1853.
40. Özden, F.; Sarı, Z.; Karaman, Ö.N.; Aydoğmuş, H. The effect of video exercise-based telerehabilitation on clinical outcomes, expectation, satisfaction, and motivation in patients with chronic low back pain. Ir. J. Med. Sci. 2022, 191, 1229–1239.
41. Family Practice Notebook. Shoulder Range of Motion. Available online: https://fpnotebook.com/Ortho/Exam/ShldrRngOfMtn.htm (accessed on 11 September 2025).
42. Hampton, L.; O’Reilly, N. Range of Motion Normative Values. 2021. Available online: https://www.physio-pedia.com/Range_of_Motion_Normative_Values (accessed on 11 September 2025).
43. Palazzo, L.; Suglia, V.; Grieco, S.; Buongiorno, D.; Brunetti, A.; Carnimeo, L.; Amitrano, F.; Coccia, A.; Pagano, G.; D’Addio, G.; et al. A Deep Learning-Based Framework Oriented to Pathological Gait Recognition with Inertial Sensors. Sensors 2025, 25, 260.
44. Troisi Lopez, E.; Liparoti, M.; Minino, R.; Romano, A.; Polverino, A.; Carotenuto, A.; Tafuri, D.; Sorrentino, G.; Sorrentino, P. Kinematic network of joint motion provides insight on gait coordination: An observational study on Parkinson’s disease. Heliyon 2024, 10, e35751.
45. Romano, A.; Liparoti, M.; Minino, R.; Polverino, A.; Cipriano, L.; Carotenuto, A.; Tafuri, D.; Sorrentino, G.; Troisi Lopez, E. The effect of dopaminergic treatment on whole body kinematics explored through network theory. Sci. Rep. 2024, 14, 1913.
46. Troisi Lopez, E.; Sorrentino, P.; Liparoti, M.; Minino, R.; Polverino, A.; Romano, A.; Carotenuto, A.; Amico, E.; Sorrentino, G. The kinectome: A comprehensive kinematic map of human motion in health and disease. Ann. N. Y. Acad. Sci. 2022, 1516, 247–259.
Figure 1. The virtual NAO robot inside Choregraphe where the selected joint can be manipulated to a specified degree.
Figure 2. All 33 landmarks tracked by the MediaPipe Pose model.
Figure 3. System workflow mapping out the decision process of the system.
Figure 4. Experimental setup of angle calculation validation testing.
Figure 5. Experimental setup of system effectiveness and participant feedback testing.
Figure 6. Comparison of shoulder abduction/adduction trajectories. The graphs model the average shoulder range of motion during the first and second halves of the exercise repetitions, averaged across the 13 participants. The solid blue line represents the average participant trajectory, while the dashed red line shows the ideal trajectory from 0° to 90°; comparing the two graphs gives a visual representation of participant accuracy improvements throughout the exercise.
Figure 7. Comparison of shoulder flexion/extension trajectories. The graphs model the average shoulder range of motion during the first and second halves of the exercise repetitions, averaged across the 13 participants. The solid blue line represents the average participant trajectory, while the dashed red line shows the ideal trajectory from 0° to 180°; comparing the two graphs gives a visual representation of participant accuracy improvements throughout the exercise.
Figure 8. Comparison of elbow flexion/extension trajectories. The graphs model the average elbow range of motion during the first and second halves of the exercise repetitions, averaged across the 13 participants. The solid blue line represents the average participant trajectory, while the dashed red line shows the ideal trajectory from 30° to 150°; comparing the two graphs gives a visual representation of participant accuracy improvements throughout the exercise.
Figure 9. Comparison of shoulder medial/lateral rotation trajectories. The graphs model the average shoulder range of motion during the first and second halves of the exercise repetitions, averaged across the 13 participants. The solid blue line represents the average participant trajectory, while the dashed red line shows the ideal trajectory from 0° to 80° (medial rotation) and 0° to 90° (lateral rotation); comparing the two graphs gives a visual representation of participant accuracy improvements throughout the exercise.
Table 1. Comparison of representative telerehabilitation systems based on sensing modality, guidance, feedback, and key outcomes.

| Study (Year) | Modality/Sensing | Guidance Modality | Automated Feedback | Setting and Population (N) | Key Outcome(s) |
| --- | --- | --- | --- | --- | --- |
| Adolf et al. [25] (2022) | Single RGB webcam; OpenPose 2D HPE | N/A (tracking subcomponent) | Keypoint confidence; posture effects | Home-exercise video dataset (>2000 unique exercises) | Demonstrated robustness of webcam HPE for home exercises; side and floor positions reduced accuracy; occlusion and lower-body joints most affected. |
| Cóias et al. [28] (2022) | Laptop webcam; markerless HPE | Virtual coach (on-screen) | RT audiovisual cues; compensation (trunk) detection | Stroke rehab; system evaluation on videos (15 post-stroke) and usability (7 volunteers) | Corrects excessive trunk compensation during reaching; low-cost, home-oriented workflow reported as usable. |
| Nakano et al. [20] (2020) | Multi-camera RGB (five cams) and OpenPose 2D → 3D DLT | N/A (measurement subcomponent) | 3D joint positions; accuracy vs. optical mocap | Lab; two healthy adults; walking/jump/throw tasks | ∼80% of joint-position MAEs <30 mm; errors driven by 2D tracking failures; shows path to accurate markerless 3D with consumer cameras. |
| Ota et al. [26] (2020) | Single camera and OpenPose vs. Vicon | N/A (validation subcomponent) | Joint angles (trunk/hip/knee/ankle) during squats | Lab; 20 healthy adults | Reported reliability of OpenPose angles vs. Vicon for bilateral squat; supports feasibility of low-cost video for clinic assessments. |
| Osawa et al. [32] (2023) | Monocular RGB; OpenPose and 3D reconstruction | On-screen guidance | DTW-based motion similarity scoring; RT feedback | Home-oriented design (whole-body exercises) | Low-cost, single-camera telerehab with 3D pose and similarity evaluation; designed for intuitive remote coaching without wearables. |
| Clemente et al. [33] (2024) | MediaPipe Pose (3D from monocular 2D) | N/A (measurement subcomponent) | ROM estimation across eight physio exercises; accuracy vs. ground truth | Lab; musculoskeletal physio tasks | ROM MAPE ∼15–25% with high correlations; highlights strengths (e.g., shoulder abduction) and limits (occlusion/depth). |
| Assad-Uz-Zaman et al. [29] (2021) | Kinect V2 depth and NAO | NAO humanoid | NAO mirrors therapist’s joint angles (IK) | Lab; feasibility (demo with adult operator) | Demonstrated therapist-driven, remote NAO instruction for upper limb rehab movements; shows social-robot guidance feasibility. |
| Raso et al. [30] (2024) | Vision-guided NAO | NAO humanoid (scripted demos) | Temporal pacing; ROM and smoothness indices | Pilot: 10 healthy and 2 with shoulder pathology | NAO guidance improved temporal control and smoothness during shoulder tasks; supports robot-coach engagement benefits. |
| Komaris et al. [34] (2022) | Single IMU (wearable) | Video demo; no robot | Unsupervised at-home performance metrics (smoothness, intensity, regularity) | 30 healthy adults; lab and at-home week | IMU features sensitive to at-home execution quality; showed need for feedback pacing; objective adherence tracking without video. |
Table 2. Demographic data of the participants.

| Participant ID | Occupation | Age Range | Gender | Dominant Hand | Race |
| --- | --- | --- | --- | --- | --- |
| HP1 | E | 64+ | M | R | AS |
| HP2 | U | 50–64 | M | R | AS |
| HP3 | E | 46–50 | F | R | AS |
| HP4 | R | 50–64 | F | L | AS |
| HP5 | E | 50–64 | M | R | AS |
| HP6 | E | 50–64 | M | R | AS |
| HP7 | U | 18–24 | M | L | HI |
| HP8 | U | 18–24 | F | R | AS |
| HP9 | E | 46–50 | F | R | AS |
| HP10 | U | 18–24 | M | R | NHW |
| HP11 | U | 18–24 | M | R | AS |
| HP12 | E | 25–34 | M | R | AS |
| HP13 | R | 35–40 | M | R | AS |

Abbreviations: Participant ID: HP = Healthy Participant; Occupation: E = Employed, U = Unemployed, R = Retired; Gender: M = Male, F = Female; Dominant Hand: R = Right Arm, L = Left Arm; Race: AS = Asian, AA = African American, HI = Hispanic, NHW = Non-Hispanic White.
Table 3. Shoulder abduction/adduction validation testing results.

| Trial | Known (°) | Detected (°) | Error (°) | Squared Error |
| --- | --- | --- | --- | --- |
| 1 | 30 | 30.3 | +0.3 | 0.09 |
| 2 | 45 | 43.5 | −1.5 | 2.25 |
| 3 | 60 | 59.8 | −0.2 | 0.04 |
| 4 | 90 | 88.6 | −1.4 | 1.96 |

RMSE across all trials: 1.04°
Table 4. Shoulder flexion/extension validation testing results.

| Trial | Known (°) | Detected (°) | Error (°) | Squared Error |
| --- | --- | --- | --- | --- |
| 1 | 30 | 27.0 | −3.0 | 9.00 |
| 2 | 45 | 43.1 | −1.9 | 3.61 |
| 3 | 60 | 57.6 | −2.4 | 5.76 |
| 4 | 90 | 90.8 | +0.8 | 0.64 |
| 5 | 120 | 132.8 | +12.8 | 163.84 |
| 6 | 135 | 151.3 | +16.3 | 265.69 |
| 7 | 150 | 158.5 | +8.5 | 72.25 |
| 8 | 180 | 177.1 | −2.9 | 8.41 |

RMSE across all trials: 8.13°
Table 5. Elbow flexion/extension validation testing results.

| Trial | Known (°) | Detected (°) | Error (°) | Squared Error |
| --- | --- | --- | --- | --- |
| 1 | 30 | 19.3 | −10.7 | 114.49 |
| 2 | 45 | 33.1 | −11.9 | 141.61 |
| 3 | 60 | 49.5 | −10.5 | 110.25 |
| 4 | 90 | 92.6 | +2.6 | 6.76 |
| 5 | 120 | 123.9 | +3.9 | 15.21 |
| 6 | 135 | 143.0 | +8.0 | 64.00 |
| 7 | 150 | 148.8 | −1.2 | 1.44 |

RMSE across all trials: 8.05°
Table 6. Shoulder medial/lateral rotation validation testing results.

| Trial | Known (°) | Detected (°) | Error (°) | Squared Error |
| --- | --- | --- | --- | --- |
| 1 | 30 | 38.1 | +8.1 | 65.61 |
| 2 | 45 | 53.5 | +8.5 | 72.25 |
| 3 | 60 | 65.3 | +5.3 | 28.09 |
| 4 | 90 | 89.8 | −0.2 | 0.04 |

RMSE across all trials: 6.44°
Table 7. RMSE comparisons between the first and second half.

| Exercise | RMSE First Half | RMSE Second Half | RMSE % Decrease |
| --- | --- | --- | --- |
| Shoulder Abduction/Adduction | 17.08° | 12.08° | 29.3% |
| Shoulder Flexion/Extension | 6.80° | 4.59° | 32.5% |
| Elbow Flexion/Extension | 5.98° | 3.45° | 42.3% |
| Shoulder Medial/Lateral Rotation | 3.53° | 2.12° | 39.9% |
Table 8. Post-experiment participant survey ratings for system attributes (Likert scale: 1 = lowest, 5 = highest). Values represent the number of participant responses for each rating.

| Attribute | 5 | 4 | 3 | 2 | 1 | Mean Score |
| --- | --- | --- | --- | --- | --- | --- |
| Effectiveness | 9 | 2 | 1 | 1 | 0 | 4.46 |
| Functionality | 9 | 2 | 0 | 1 | 1 | 4.38 |
| Flexibility | 9 | 3 | 1 | 0 | 0 | 4.62 |
| Safety | 10 | 1 | 2 | 0 | 0 | 4.62 |
| Comfortability | 10 | 0 | 2 | 1 | 0 | 4.54 |
| Clarity | 9 | 4 | 0 | 0 | 0 | 4.69 |
| Exercise Selection | 9 | 3 | 1 | 0 | 0 | 4.62 |
| Video Demonstration | 9 | 3 | 0 | 1 | 0 | 4.54 |
| Ease of Use | 10 | 2 | 0 | 0 | 1 | 4.54 |

