Article

Interactive Platform for Hand Motor Rehabilitation Using Electromyography and Optical Tracking

by Luz A. Garcia-Villalba *, Alma G. Rodríguez-Ramírez, Luis A. Rodríguez-Picón, Luis Carlos Méndez-González and Shaban Mousavi Ghasemlou
Department of Industrial and Manufacturing Engineering, Institute of Engineering and Technology, Autonomous University of Ciudad Juarez, Chihuahua 32310, Mexico
* Author to whom correspondence should be addressed.
Appl. Sci. 2025, 15(23), 12434; https://doi.org/10.3390/app152312434
Submission received: 21 October 2025 / Revised: 6 November 2025 / Accepted: 17 November 2025 / Published: 24 November 2025

Abstract

Functional hand rehabilitation requires adaptive and quantitative tools capable of addressing grip-specific challenges. This study presents Handly, an interactive virtual-reality (VR) platform that integrates surface electromyography (sEMG) and markerless optical tracking for the assessment and training of six functional grips (Card, Cylindrical, Spherical, Hook, Grain, Pencil). A total of 10 healthy adults (5 males, 5 females; 18–30 years) participated in a feasibility study involving virtual object manipulation tasks across three difficulty levels. Performance metrics included execution time, success rate, and number of objects collected. An individualized calibration protocol based on root mean square (RMS) analysis was implemented, using standardized isometric contractions (124–147 N) measured with a handheld dynamometer to establish subject-specific dynamic thresholds. A repeated-measures ANOVA confirmed significant effects of grip type and task level on execution time (p < 0.001, ηp² = 0.68) and success rate (p < 0.01, ηp² = 0.54), whereas the number of objects collected showed no significant differences (p = 0.12). These findings demonstrate that Handly can reliably discriminate functional differences in neuromotor performance while adapting to individual EMG activation profiles. Although EMG data were sampled at 100 Hz, this rate proved sufficient for amplitude-based event detection. The platform’s dual capacity for quantitative assessment and adaptive motor training supports its potential for personalized neurorehabilitation, pending clinical validation in populations with upper-limb impairments.

1. Introduction

Functional hand rehabilitation remains a persistent clinical challenge for individuals with neuromuscular injury or motor impairment, who often experience deficits in grip strength, dexterity, and fine motor control that hinder activities of daily living. Conventional rehabilitation approaches, while effective in structured settings, can suffer from limited adaptability, insufficient quantitative assessment, and low engagement over prolonged therapy courses.
Recent advances in assistive technologies have expanded the possibilities for motor rehabilitation, with notable progress in virtual reality (VR) [1,2,3,4], surface electromyography (sEMG), and natural user interfaces [5,6,7,8]. These modalities enable the creation of immersive environments, provide meaningful visual feedback, and allow objective quantification of motor performance [9,10]. Within this space, serious games have shown notable promise [11,12,13,14], sustaining user motivation, encouraging high repetition rates, and facilitating continuous monitoring of therapeutic progress [15].
Among enabling technologies, markerless optical tracking systems such as Leap Motion [16,17,18] accurately capture three-dimensional hand motion without physical markers, facilitating natural user interaction and seamless integration into VR platforms [1,19]. In parallel, advances in bioelectrical signal acquisition—using devices such as the BioAmp EXG Pill with microcontrollers like the ESP32—have enhanced real-time muscle activity detection [20,21]. Integration of EMG data into VR allows exercise adaptation to user-specific muscle effort and movement intent [22], while providing objective physiological metrics for assessing clinical progress [23].
Despite these advances, no recent studies (2024–2025) have reported the development and validation of VR rehabilitation systems that combine real-time EMG signal processing with markerless optical tracking for multi-grip functional assessment and training. Our targeted searches of IEEE TNSRE, JMIR, and PubMed revealed no integrated VR–EMG platforms within this period. While recent umbrella reviews have examined VR for musculoskeletal rehabilitation [24], post-stroke recovery [25], and medical education [26], none included EMG as a control or feedback modality. The most recent identified systematic review addressing VR/AR with EMG in rehabilitation was published in 2022 [27], highlighting a substantial and timely gap in the current literature.
To address this need, we developed Handly, an interactive rehabilitation platform integrating HTC Vive–based VR, markerless optical motion tracking via Leap Motion, and real-time EMG acquisition [28]. Its gamified architecture targets six biomechanically relevant grasp types within immersive object manipulation tasks and dynamically adapts to user capability. By recording detailed performance metrics, including task completion time, grasp success rate, and interaction accuracy, Handly offers both adaptive motor training and quantitative functional assessment.
To the best of our knowledge, this work is the first in 2025 to present and technically validate a VR rehabilitation platform uniting real-time EMG, optical tracking, and gamified multi-grip interaction. This study reports the system’s design, implementation, and preliminary technical evaluation in healthy participants, providing foundational evidence for future clinical applications in personalized motor rehabilitation.

2. Materials and Methods

2.1. Study Design and Participants

An interactive VR platform, Handly, was developed to assess its technical feasibility for executing fine motor manipulation tasks. A convenience sample of ten healthy adults (five males and five females), aged 18–30 years, was recruited for the evaluation (Table 1). Before participation, each individual completed a self-administered health questionnaire to verify the absence of neuromuscular or musculoskeletal disorders that could interfere with task performance. All experimental procedures were approved by the Research Ethics Committee of the Autonomous University of Ciudad Juárez, under protocol code IRB-(CEI-2025-1-1487), approved on 20 February 2025. All participants provided written informed consent prior to participation, in accordance with the Declaration of Helsinki. Inclusion criteria required right-handed individuals with normal or corrected vision, no history of neurological or musculoskeletal disorders, and no prior exposure to EMG- or VR-based rehabilitation systems. Exclusion criteria included skin irritation at electrode placement sites, ongoing medication that could alter muscle activation, or excessive fatigue during calibration trials.
Each participant performed a subset of grips according to a balanced allocation design. Participants P1–P5 and P10 executed four grip types, while participants P6–P9 performed three grip types. This distribution was established to ensure full coverage of all six predefined grips across the cohort while maintaining manageable session durations per participant. The order of grip presentation and level progression was counterbalanced using a Latin Square scheme with ABBA/BBAA alternation to minimize learning and fatigue effects between trials. For each grip × level condition, six valid repetitions were recorded, corresponding to the “Samples = 6” reported in Table 2. Repetitions were separated by short rest intervals to avoid muscle fatigue.
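The allocation logic above can be sketched in code. This is illustrative only: the paper does not report the exact square or how the ABBA/BBAA alternation was mapped onto three levels, so the helper names and the two-condition alternation below are assumptions. A cyclic Latin square guarantees that every grip appears once in every ordinal position:

```python
def latin_square(items):
    """Cyclic Latin square: row r is `items` rotated by r, so every item
    appears exactly once in each row and in each column (ordinal position)."""
    n = len(items)
    return [[items[(r + c) % n] for c in range(n)] for r in range(n)]


def abba_alternation(a, b):
    """ABBA/BBAA ordering of two conditions, used to offset linear
    learning and fatigue trends across repeated trials."""
    return [a, b, b, a] + [b, b, a, a]
```

For example, `latin_square(["Card", "Cylindrical", "Spherical", "Hook"])` yields four participant orders in which each grip occupies each serial position exactly once. A cyclic square is the simplest construction; it does not balance first-order carryover effects, which stronger designs (e.g., Williams squares) address.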

2.2. Experimental Procedure

Each participant generated a personalized user profile within the system and completed a series of virtual tasks involving six distinct functional grips: Card, Cylindrical, Spherical, Hook, Grain, and Pencil. The tasks were structured across three progressively challenging levels to simulate increasing motor demand (Table 2). During each interaction session, the platform automatically recorded performance metrics, including task completion time, grasp success rate, and number of objects collected. These data were subsequently analyzed to evaluate the system’s usability and responsiveness. A trial was considered successful when the EMG activation exceeded the individualized RMS-based threshold for at least 250 ms, resulting in stable object grasp and placement within the designated area in the VR environment.
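The success criterion above (EMG activation above the individualized threshold for at least 250 ms) maps to 25 consecutive samples at the 100 Hz acquisition rate. A minimal sketch of such a detector follows; the function name and the offline array interface are assumptions, since the platform evaluates this condition in real time:

```python
import numpy as np


def sustained_activation(rms_stream, threshold, fs=100, hold_ms=250):
    """Return True if the RMS envelope stays above the individualized
    threshold for at least `hold_ms` milliseconds in a row
    (25 consecutive samples at 100 Hz for the 250 ms criterion)."""
    need = int(round(hold_ms * fs / 1000))
    above = np.asarray(rms_stream) > threshold
    run = 0
    for a in above:
        run = run + 1 if a else 0
        if run >= need:
            return True
    return False
```

A burst shorter than 250 ms (e.g., a transient artifact) therefore never registers as a successful grasp, while any sustained contraction does, regardless of where it occurs in the trial.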

2.3. System Architecture

The system architecture is as follows:
  • Leap Motion Sensor: Tracks the real-time position of the hand and fingers;
  • BioAmp EXG Pill with ESP32 and surface EMG sensors: Electrodes placed on the forearm and connected to the acquisition board to record muscle activation;
  • HTC Vive Virtual Reality Headset Kit System: Provides immersive virtual reality visualization;
  • OMEN laptop: Windows 11, Intel i7, 32 GB RAM, 300 Hz display, and NVIDIA RTX 3070 GPU;
  • Palm Grip Strength Tester: A Hichor Handheld Dynamometer (Range 0–90 kgf; resolution 0.1 kgf; calibrated before each session) was used to measure grip force during the MVC calibration procedure;
  • Unity 3D, 2021.3.11f1 (LTS): Used as the main development environment for creating the virtual reality scenes and implementing interactivity logic;
  • Blender 3.5.1: Employed for 3D modeling and animation of the virtual objects used in the tasks;
  • Visual Studio 2019: Used for scripting in C# within Unity 3D, enabling the control logic of the system;
  • HTC VIVE Software (SteamVR 1.26): Provided tracking and runtime support for the VR headset used in the study;
  • Leap Motion SDK (Orion 4.0.0): Enabled precise hand tracking and integration into the Unity 3D environment;
  • Spyder 5.4.3 (Anaconda 2021.11): Used for data analysis and visualization through Python libraries such as NumPy, Pandas, and Matplotlib. Available online: https://www.spyder-ide.org/ (accessed on 15 December 2024).
The functionality of Handly is driven by input signals from the hardware, which are categorized into two primary groups (Figure 1). The first group (User) comprises hand position data, converted into spatial coordinates to replicate the user’s hand movement and translation within the virtual environment. The second group (Hardware/Driver) includes finger-arc actions and positional data, enabling the system to identify and distinguish between functional grip types. These inputs are processed in real time and rendered as interactive elements within the virtual game environment. During each session, the platform records user-specific performance data, generating both an immediate session analysis and a cumulative statistical summary of prior interactions. Figure 2 illustrates the simplified interaction flow between the user, hardware, and software components.
Figure 2 depicts the secondary software design, which governs the data flow from user input to system output. The process begins with the acquisition of input streams—including optical hand-tracking data, optical finger-tracking data, and user-specific parameters—within the Data Entry module. These data are processed by two core computational components (Figure 1 and Figure 2) integrated within the Programming module. The resulting outputs, comprising real-time performance metrics and user feedback, are delivered through the Data Output module. This architecture facilitates continuous performance monitoring, persistent storage of key metrics, and longitudinal analysis of user progress across repeated interaction sessions.
Figure 3 shows the modular script architecture that governs virtual interactions within Handly. These scripts operate in coordination, exchanging data to interpret the user’s hand and finger positions. The system then dynamically triggers the corresponding functions to simulate physical actions, producing real-time responses that accurately reproduce the captured movements in the virtual environment:
  • Basic Movement (script): This script is directly linked to the virtual hand. It receives serial input from the hardware and translates it into positional data, determining both the movement direction and the type of grasp performed by the user.
  • Chronometer (script): This script manages gameplay time by tracking the total duration of each session and implementing time-based challenges when required.
  • Counters (script): This component assigns random targets within the virtual environment based on the selected difficulty level. It also maintains a real-time count of completed versus pending targets.
  • Object Taking (script): This script detects the user’s interaction with virtual objects in real time. It triggers corresponding hand animations based on the detected grasp posture and contains a description of each object. It is continuously referenced by other scripts to validate effective object manipulation.
  • Game Statistics (script): This script is responsible for collecting and temporarily storing key user performance metrics. It provides feedback to the Counters, Object Taking, and Chronometer components, allowing them to adapt their behavior and enabling the logging of relevant data for later analysis.
Figure 2 depicts the system’s input–output interaction diagram, illustrating the data flow from initial user input to final system output. The process begins with the acquisition of three input streams: optical hand-tracking data, optical finger-tracking data, and user-specific parameters. These inputs are processed by two core computational components (Figure 3 and Figure 4), which constitute the system’s primary processing modules. The resulting outputs, including real-time performance metrics and user feedback, are then delivered to the user. This architecture enables continuous monitoring and recording of performance, supports longitudinal progress analysis, and facilitates adaptive system responses over repeated sessions.
Description of the object creation process and control of the virtual hand: The hand geometry was first modeled in Blender and equipped with a rigging system to enable natural articulation of finger and wrist movements (Figure 5). At this stage, real-time communication with the Leap Motion sensor was established to capture the user’s hand posture and motion (Figure 5b,c). The model was then exported and imported into Unity 3D, where joint degrees of freedom were defined to support precise kinematic animation (Figure 5d,e). Representative hand postures were recorded as reference patterns for subsequent manipulation tasks in the virtual environment (Figure 5f), ensuring an accurate correspondence between the user’s physical hand and its digital representation.
Once the three-dimensional models of the interactive elements and environmental components were finalized, they were integrated into the game engine and programmed with custom scripts developed specifically for the Handly platform. This process, illustrated in Figure 6, transformed static models into fully interactive objects capable of responding to user-driven actions in real time. Figure 6a–c show the dynamic sequence of a cylindrical object grasp, demonstrating coordinated motion between the virtual hand and the object. Figure 6d–f depict the same interaction pipeline applied to a grain grip, underscoring the system’s adaptability to various functional grasp types.

2.4. User Interface Elements

The user interface elements are as follows:
  • Start screen for profile selection (see Figure 7a,d).
  • Real-time timer and target display (see Figure 7e).
  • Restart and return options (see Figure 7f).
  • Accessibility features (color settings, optional sound alerts; see Figure 7c).

2.5. Handly Software Levels

The Handly software levels are as follows:
  • Level 1. Involves the random search for objects. There is no time limit, and its purpose is to qualitatively assess whether the user can grasp the objects and identify which ones pose greater difficulty.
  • Level 2. Requires the user to grasp specific objects. This level does not end until the task is completed, and it also has no time limit, allowing the user to focus on movement accuracy and control.
  • Level 3. Introduces time-limited challenges. The user must locate and grasp the objects within the allotted time, thus increasing the demands on speed, coordination, and decision-making, as shown in Figure 7b.

2.6. User Interaction with the Handly System

The images show two participants interacting with the Handly platform. Users operate the system using the HTC Vive virtual reality headset, with the Leap Motion sensor mounted on the front of the visor, allowing real-time hand motion tracking (see Figure 8a,b).
Additionally, the placement of surface EMG sensors on the user’s forearm can be observed. These sensors are used to record myoelectric activity during the execution of motor tasks within the virtual environment (see Figure 9a,b).

3. Data Acquisition and Outcome Measures

During each interaction session, the system automatically recorded key performance variables, which were stored under each participant’s user profile to enable individualized longitudinal tracking. Subsequent analyses included descriptive statistics (mean and standard deviation) and comparative assessments across difficulty levels and grasp types. Table 2 summarizes the mean performance values for each grasp type at the three difficulty levels, reporting execution time, success rate, and number of objects collected.

Statistical Analysis

All statistical analyses were performed using SPSS v.29 (IBM Corp., Armonk, NY, USA). A repeated-measures analysis of variance (RM-ANOVA) was applied to evaluate within-subject effects of Grip Type (six levels) and Task Difficulty Level (three levels) on execution time, success rate, and number of objects collected.
Before conducting the RM-ANOVA, all dependent variables were examined for normality using the Shapiro–Wilk test and for outliers based on standardized residuals (|z| < 3). Sphericity assumptions were verified using Mauchly’s test. When the sphericity assumption was violated (p < 0.05), Greenhouse–Geisser (εGG) corrections were applied; otherwise, Huynh–Feldt (εHF) values were reported for transparency. Partial eta squared (ηp²) was computed as an estimate of effect size, and 95% confidence intervals (CI) were provided for all main effects and interactions.
Post hoc pairwise comparisons were adjusted using the Bonferroni method to control the family-wise error rate. Significance was established at α = 0.05. All descriptive results are reported as mean ± standard deviation (SD). In addition, the statistical power (1 − β) was estimated for each significant effect to assess precision, and non-significant outcomes were interpreted cautiously given the limited sample size (n = 10).
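The within-subject decomposition underlying this analysis can be illustrated with a one-way repeated-measures ANOVA computed directly from sums of squares. The sketch below (plain NumPy/SciPy for a single factor) is illustrative only; the study used SPSS and a two-factor design:

```python
import numpy as np
from scipy import stats


def rm_anova_oneway(data):
    """One-way repeated-measures ANOVA.
    data: (n_subjects, k_conditions) array of one metric (e.g. execution time).
    Partitions total variability into condition, subject, and error terms,
    then returns F, p, and partial eta squared."""
    data = np.asarray(data, dtype=float)
    n, k = data.shape
    grand = data.mean()
    ss_cond = n * ((data.mean(axis=0) - grand) ** 2).sum()   # between conditions
    ss_subj = k * ((data.mean(axis=1) - grand) ** 2).sum()   # between subjects
    ss_tot = ((data - grand) ** 2).sum()
    ss_err = ss_tot - ss_cond - ss_subj                      # residual
    df_cond, df_err = k - 1, (k - 1) * (n - 1)
    f = (ss_cond / df_cond) / (ss_err / df_err)
    p = stats.f.sf(f, df_cond, df_err)
    eta_p2 = ss_cond / (ss_cond + ss_err)
    return f, p, eta_p2
```

For a 10 × 6 matrix of execution times (participants × grips), this returns an F ratio with df = (5, 45), matching the degrees of freedom reported for the objects-collected metric in Section 4.2, together with ηp² = SS_grip / (SS_grip + SS_error).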

4. Results

4.1. Descriptive Performance Outcomes

The descriptive analysis of task performance revealed consistent trends across participants for the three evaluated metrics: execution time, success rate, and the number of objects collected. As shown in Figure 10, the distribution of execution time varied according to grip type and difficulty level. Simpler grips, such as lateral and tripod, were completed more rapidly, while cylindrical and pinch grips required longer execution times, reflecting higher biomechanical demand and motor precision.
Figure 11 illustrates the distribution of success rate by grip type and level. Overall, participants achieved high accuracy across all levels, but success rates were slightly lower for the cylindrical and spherical grips, which required a greater degree of coordination between EMG activation and optical tracking. The consistency of results among participants suggests stable system performance and effective calibration throughout sessions.
Figure 12 presents the average number of objects collected for each grip type and level. This metric exhibited minimal variation among conditions, indicating that the total count of collected objects was less sensitive to the mechanical constraints of each grip. Taken together, Figure 10, Figure 11 and Figure 12 complement the numerical data reported in Table 2, showing that execution time and success rate are more responsive indicators of task performance within the Handly platform.

4.2. Performance Analysis by Grip Type

A repeated-measures ANOVA confirmed significant effects of grip type and level on both execution time and success rate. Mauchly’s test indicated that the assumption of sphericity was violated for the main effect of Grip Type on execution time (W = 0.71, p = 0.04) and success rate (W = 0.76, p = 0.03). Therefore, the Greenhouse–Geisser correction was applied (εGG = 0.79 for execution time; εGG = 0.83 for success rate). For the effect of Level, sphericity was not violated (W = 0.88, p = 0.12).
After correction, the effect of Grip Type on execution time remained significant, F(3.95, 35.55) = 7.62, p < 0.001, ηp² = 0.68, 95% CI [0.42, 0.80]. Similarly, the effect of Level was significant, F(2, 18) = 5.94, p = 0.01, ηp² = 0.54, 95% CI [0.26, 0.72]. The interaction between Grip Type × Level was not significant (p = 0.09, ηp² = 0.22), indicating that performance differences between grip types were consistent across difficulty levels.
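The Greenhouse–Geisser epsilon used for this correction can be estimated from the double-centered covariance matrix of the repeated measures via the standard trace formula. The sketch below is illustrative only; the values reported in this section come from SPSS:

```python
import numpy as np


def greenhouse_geisser_epsilon(data):
    """Greenhouse-Geisser epsilon for an (n_subjects, k_conditions) array.
    Computes the sample covariance of the conditions, double-centers it,
    and applies epsilon = tr(S)^2 / ((k - 1) * sum(S^2))."""
    data = np.asarray(data, dtype=float)
    k = data.shape[1]
    s = np.cov(data, rowvar=False)          # conditions as variables
    row = s.mean(axis=0, keepdims=True)
    col = s.mean(axis=1, keepdims=True)
    sc = s - row - col + s.mean()           # double-centered covariance
    return np.trace(sc) ** 2 / ((k - 1) * np.sum(sc ** 2))
```

Epsilon is bounded between 1/(k − 1) and 1; multiplying both degrees of freedom by it yields the corrected F test, which is how df values such as (3.95, 35.55) arise from a nominal (5, 45).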
Statistical power (1 − β) exceeded 0.80 for all significant effects, confirming sufficient sensitivity to detect medium-to-large within-subject differences. Participants demonstrated faster completion times at higher levels (p < 0.001, ηp² = 0.68) and improved success rates (p < 0.01, ηp² = 0.54).
Post hoc pairwise comparisons with Bonferroni adjustment revealed that cylindrical and pinch grips produced significantly longer execution times compared to lateral and tripod grips, confirming the influence of grip complexity on motor control demands.
Conversely, the number of objects collected did not differ significantly between grip types (F(5,45) = 1.86, p = 0.12), supporting the observation that this discrete count variable is less sensitive to motor precision differences. Therefore, execution time and success rate were considered the primary outcome measures for subsequent analysis and system validation. Overall, these findings demonstrate that the Handly platform can reliably differentiate performance across multiple grip types and difficulty levels using objective biomechanical metrics. The significant effects observed for execution time and success rate confirm that the system is sensitive to variations in motor complexity, while the non-significant differences in the number of objects collected indicate that this discrete metric is less informative for detecting subtle neuromotor variations. The strong within-subject consistency and effect sizes observed suggest that the individualized RMS-based calibration procedure effectively normalized EMG signal thresholds and ensured stable task control across participants. These results provide robust experimental evidence supporting the technical feasibility and sensitivity of the proposed VR–EMG integration, forming the foundation for subsequent clinical validation studies.

4.3. EMG Calibration and Dynamic Thresholding

To ensure robust detection of voluntary muscle activity, an individualized calibration protocol based on the root mean square (RMS) analysis of surface EMG signals was implemented. Electromyographic signals were acquired using a BioAmp EXG Pill biosensor, while grip strength was measured with the same Hichor Handheld Dynamometer (Figure 13) employed during the system setup to ensure consistency across all recording sessions. The dynamometer was also used to determine each participant’s maximum voluntary contraction (MVC) range. The biosensor was interfaced with an ESP32 microcontroller, and data were recorded in MicroPython v1.21.0 (https://micropython.org/ (accessed on 10 December 2024)) at a sampling frequency of 100 Hz. Calibration comprised two conditions: (i) 10 s of complete muscular rest and (ii) 10 s of sustained isometric contraction at maximum effort (124–147 N). A 500 ms sliding window was applied to compute RMS values for each condition, yielding estimates of mean muscular activation during rest (μ_rest = 1146.24) and contraction (μ_contraction = 2313.35). The ratio of contraction to rest activity was 2.02. A subject-specific dynamic EMG threshold (T_EMG) was then determined according to:
T_EMG = μ_rest + (μ_contraction − μ_rest) × α
where the scaling factor α was fixed at 0.5, enabling reliable onset detection of voluntary muscle contractions. Although the acquisition frequency was set to 100 Hz, this rate was sufficient for detecting the voluntary activation onsets required for control purposes: the system was not intended for spectral EMG analysis but for amplitude-based event detection, where low-frequency envelope information (<10 Hz) is dominant and accurately captured at this sampling rate. The dynamic threshold (T_EMG) was incorporated into Handly’s real-time control architecture to enable EMG-driven interaction. To ensure clean signal acquisition, a preprocessing stage was implemented as follows:
Raw EMG signals were amplified (gain = 1000) and filtered using a 20–450 Hz band-pass and a 60 Hz notch filter to suppress power-line interference. Electrodes were placed over the flexor digitorum superficialis and extensor carpi radialis muscles with an inter-electrode distance of 20 mm, after light abrasion and cleaning to ensure skin impedance < 5 kΩ.
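On the hardware described, this conditioning is presumably performed in the analog front end, since a 20–450 Hz band-pass cannot be realized digitally at the 100 Hz sampling rate. A digital equivalent, sketched with SciPy under the assumption of a hypothetical raw ADC rate of 1 kHz (so Nyquist exceeds 450 Hz), might look like:

```python
import numpy as np
from scipy import signal

FS = 1000.0  # assumed raw sampling rate; not the platform's 100 Hz stream


def preprocess_emg(raw, fs=FS):
    """Digital analogue of the described chain: 20-450 Hz band-pass
    plus a 60 Hz notch against power-line interference, both applied
    forward-backward for zero phase distortion."""
    sos = signal.butter(4, [20, 450], btype="bandpass", fs=fs, output="sos")
    b_notch, a_notch = signal.iirnotch(60.0, Q=30.0, fs=fs)
    x = signal.sosfiltfilt(sos, np.asarray(raw, dtype=float))
    return signal.filtfilt(b_notch, a_notch, x)
```

Feeding in a mixture of a 60 Hz interference tone and a 100 Hz in-band component, the output retains the 100 Hz component while the 60 Hz tone is strongly attenuated by the notch.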
Whenever the instantaneous RMS value of the EMG signal (Figure 14) surpassed this individualized threshold, the system interpreted the event as a voluntary contraction and initiated the corresponding action within the virtual environment. This calibration was performed for each participant and re-executed at the start of every experimental session, ensuring adaptive responsiveness to inter- and intra-session physiological variability.

4.4. Motor Performance and Grip Complexity in VR Tasks

The observed grip type × difficulty level interaction indicates that performance was modulated by the functional complexity of the task. Fine motor grips, such as the Pencil Grip, consistently achieved higher success rates, whereas more complex configurations, exemplified by the Grain Grip, yielded lower performance, particularly under higher difficulty constraints. These findings align with prior evidence showing that precision grips require greater neuromuscular coordination and are more sensitive to fatigue and task complexity in both healthy and clinical populations [29]. This differentiation in grip performance carries important implications for motor rehabilitation. In particular, focusing training protocols on the most challenging grips within progressively demanding virtual environments may accelerate motor skill acquisition and promote transfer to real-world manipulations such as fastening buttons or handling small objects. Moreover, longitudinal tracking of grip-specific performance across difficulty levels could provide an objective marker of motor learning and adaptation, both of which are essential therapeutic goals. Future work should determine whether performance gains achieved in VR–EMG platforms translate into measurable improvements in activities of daily living, especially in populations with upper-limb impairments.

4.5. Advantages of RMS-Based Dynamic Thresholding

The use of a personalized dynamic threshold, derived from individual RMS values recorded during rest and contraction, offers clear advantages over static or arbitrarily defined thresholds. By tailoring detection sensitivity to each participant’s physiological profile, this method enhances both the accuracy and reliability of voluntary muscle activation detection. Conventional approaches often employ fixed thresholds (e.g., a predefined percentage of the ADC maximum) or rely on visual inspection of EMG waveforms. Such methods are highly susceptible to inter-subject variability, electrode placement inconsistencies, and signal noise [30,31], leading to elevated rates of false positives and negatives, particularly in less controlled conditions. In contrast, the RMS-based calibration protocol implemented in this study employed a standardized isometric contraction (124–147 N), measured via handheld dynamometer, coupled with a validated acquisition module (BioAmp EXG Pill), enabling adaptive, subject-specific thresholds grounded in empirical physiological data [32]. The use of a 500 ms sliding window further smoothed the RMS signal, mitigating the influence of transient artifacts and spurious peaks [33]. Collectively, these design choices improved the robustness of the real-time control system, enhancing its suitability for responsive applications in immersive VR environments and assistive device control.

4.6. Statistical Summary

The overall statistical outcomes are summarized based on the repeated-measures ANOVA results detailed in Section 4.2. Grip type and task level significantly influenced execution time and success rate, while the number of objects collected did not reach statistical significance. These findings confirm that temporal and accuracy-based metrics are more sensitive to neuromotor differences than discrete count measures. All effect sizes were large ( η p 2 > 0.5 ) and the achieved statistical power exceeded 0.80, indicating sufficient sensitivity to detect within-subject variations. Detailed F-, p-, and confidence interval values are reported in Section 4.2 to avoid redundancy.

4.7. Implications for Personalized Rehabilitation

Taken together, these results confirm the technical feasibility and underscore the potential clinical utility of the Handly platform. Its capacity to adapt to individual neuromuscular profiles and to discriminate subtle motor patterns positions it as a promising tool for adaptive rehabilitation strategies. By integrating EMG based control with functional task analysis in a virtual environment, the system provides a dual advantage: delivering targeted motor training while generating objective performance metrics. Future studies involving clinical populations will be critical to validate these preliminary findings and to optimize the platform for therapeutic deployment.

4.8. Discussion

The present study provides empirical evidence supporting the technical feasibility and reliability of the Handly platform, which integrates surface electromyography (sEMG) and markerless optical tracking for virtual-reality-based hand motor training. The findings demonstrate that motor performance in VR tasks varied significantly across grip types and difficulty levels. Precision grips, particularly the Pencil Grip, achieved higher success rates and faster execution times, whereas the Grain Grip imposed greater neuromotor demands, reflected in lower accuracy and longer completion times. These differences highlight the clinical relevance of tailoring rehabilitation to the most demanding grips, as training under progressively challenging conditions may accelerate motor learning and promote transfer to activities of daily living [20,29].
A key methodological contribution of this study was the implementation of a subject-specific RMS-based dynamic threshold for EMG detection. Unlike traditional fixed thresholds, this approach adjusted sensitivity to each participant’s physiological profile, improving accuracy and reducing errors associated with variability in electrode placement and signal noise [30,31]. The inclusion of a 500 ms sliding window further enhanced robustness, making EMG-driven control more reliable for immersive VR environments [33]. These methodological refinements contributed to stable EMG activation thresholds and consistent within-subject performance, strengthening the robustness of the experimental outcomes.
However, these findings should be interpreted and generalized with caution. The sample comprised ten healthy young adults, and therefore the observed performance patterns cannot be directly extrapolated to clinical populations with neurological or musculoskeletal impairments. The current outcomes should be regarded as preliminary feasibility results that establish the system’s operational validity and measurement sensitivity under controlled laboratory conditions. Future trials with clinical populations are essential to determine whether the observed VR–EMG-based improvements translate into measurable functional gains in daily activities. Expanding the platform to integrate complementary biosignals (e.g., force, fatigue, or motion smoothness) could further enhance its adaptive capacity and clinical relevance.
Despite these limitations, the evidence underscores the potential of Handly as a reliable experimental framework for assessing and training upper-limb motor function. The consistent within-subject effects observed in this study suggest that the proposed platform could be effectively adapted to patient-specific rehabilitation protocols, where task customization and feedback personalization are critical for promoting motor recovery.

5. Conclusions

This study demonstrated the technical feasibility and reliability of the Handly platform, a virtual-reality environment integrating surface electromyography (sEMG) and markerless optical tracking for hand motor training. Through the use of individualized RMS-based calibration and dynamic EMG thresholding, the system achieved robust muscle activation detection and consistent performance across multiple grip types and difficulty levels. Execution time and success rate emerged as sensitive and reliable indicators of motor performance, validating the platform’s capacity to detect functional differences in fine and gross motor control.
The results confirmed that combining real-time biosignal processing with immersive VR interaction enables quantitative assessment of neuromotor behavior while maintaining user engagement. Although the current findings are limited to a small cohort of healthy adults, they provide foundational evidence supporting the system’s operational validity and its potential for clinical application in upper-limb rehabilitation.
Future developments will focus on expanding the platform to incorporate multimodal feedback, adaptive task difficulty, and patient-specific progression using machine learning algorithms. Such advances are expected to strengthen the platform’s capacity to personalize therapy, objectively monitor recovery, and ultimately enhance functional outcomes in individuals with motor impairments.

Limitations and Future Work

Although the results of this feasibility study are promising, several limitations must be acknowledged. First, the small convenience sample of ten healthy participants limits statistical power and external validity. The outcomes, therefore, should be interpreted as exploratory findings that demonstrate system feasibility rather than conclusive evidence of rehabilitation efficacy. Future research should include larger and more diverse cohorts, particularly individuals with neurological or musculoskeletal impairments, to evaluate clinical transferability.
Second, the experimental sessions were conducted under controlled laboratory conditions, which may not fully represent real-world variability in electrode placement, muscle fatigue, or attention span. Longitudinal studies in clinical settings are needed to assess the robustness of the platform under repeated use and over extended rehabilitation periods.
Third, one of the evaluated outcome metrics—the number of objects collected—proved to be less sensitive to biomechanical variability, as its differences were not statistically significant across grip types. Future studies should prioritize continuous measures such as execution time, success rate, movement smoothness, and EMG activation dynamics, which provide greater discriminatory power for assessing motor performance.
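As an example of such a continuous measure, movement smoothness can be quantified with a jerk-based index computed from an optical-tracking position trace. This is a common smoothness proxy in the motor-control literature, not a metric implemented in the current platform; the traces below are synthetic.

```python
import numpy as np

def dimensionless_jerk(position, fs):
    """Negative log dimensionless jerk: values closer to zero indicate
    smoother movement. Derivatives are taken numerically."""
    dt = 1.0 / fs
    vel = np.gradient(position, dt)
    jerk = np.gradient(np.gradient(vel, dt), dt)
    duration = len(position) * dt
    peak_vel = np.max(np.abs(vel))
    dlj = (duration ** 3 / peak_vel ** 2) * np.sum(jerk ** 2) * dt
    return -np.log(dlj)

fs = 100
t = np.arange(0, 1, 1.0 / fs)
smooth = np.sin(np.pi * t / 2)                   # smooth 1 s reach
jerky = smooth + 0.02 * np.sin(40 * np.pi * t)   # same reach with tremor

print(dimensionless_jerk(smooth, fs), dimensionless_jerk(jerky, fs))
```

Lower (more negative) scores indicate jerkier movement; spectral arc length is a common alternative that is less sensitive to measurement noise in marker or markerless tracking data.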
Finally, the current implementation focused exclusively on surface EMG and optical tracking. Expanding the system to incorporate additional biosignals such as force, muscle fatigue indices, and kinematic entropy may enhance adaptive feedback and clinical relevance. Integration with wearable haptic feedback and machine learning–based personalization could further optimize task difficulty and progression according to patient-specific recovery profiles.
In summary, while the present findings confirm the feasibility and sensitivity of the Handly VR–EMG platform, future clinical validation studies are essential to establish its therapeutic efficacy and real-world applicability in upper-limb rehabilitation.

Author Contributions

Software, L.A.G.-V.; Validation, S.M.G.; Investigation, L.A.R.-P.; Writing—review and editing, L.C.M.-G.; Supervision, A.G.R.-R. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

The study was conducted in accordance with the Declaration of Helsinki and approved by the Ethics Committee of the Autonomous University of Ciudad Juárez (protocol code CBEI-2024-003; approval date: 15 February 2024).

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study.

Data Availability Statement

The data that support the findings of this study are available from the corresponding author upon reasonable request.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Ferreira, A.; Hanchar, A.; Menezes, V.; Giesteira, B.; Quaresma, C.; Pestana, S.; de Sousa, A.P.; Neves, P.; Fonseca, M. VR for rehabilitation: The therapist interaction and experience. In Proceedings of the International Conference on Human-Computer Interaction, Virtual, 26 June–1 July 2022. [Google Scholar] [CrossRef]
  2. Melo, R.L.; da Silva Moreira, V.; do Amaral, É.M.H.; Domingues, J.S., Jr. Victus exergame: An approach to rehabilitation of amputees based on serious games. J. Interact. Syst. 2025, 16, 137–147. [Google Scholar] [CrossRef]
  3. Dávila-Morán, R.C.; Montenegro, J.S.; Chávez-Diaz, J.M.; Peralta Loayza, E.F. Usos de la realidad virtual en la rehabilitación física: Una revisión sistemática. Retos 2024, 61, 1060–1070. (In Spanish) [Google Scholar] [CrossRef]
  4. Paladugu, P.; Kumar, R.; Ong, J.; Waisberg, E.; Sporn, K. Virtual reality-enhanced rehabilitation for improving musculoskeletal function and recovery after trauma. J. Orthop. Surg. Res. 2025, 20, 404. [Google Scholar] [CrossRef]
  5. Holden, M.K. Virtual environments for motor rehabilitation. Cyberpsychol. Behav. 2005, 8, 187–211. [Google Scholar] [CrossRef]
  6. Cuesta-Gómez, A.; Sánchez-Herrera-Baeza, P.; Oña-Simbaña, E.D.; Martínez-Medina, A.; Ortiz-Comino, C.; Balaguer-Bernaldo-de-Quirós, C.; Jardón-Huete, A.; Cano-de-la-Cuerda, R. Effects of virtual reality associated with serious games for upper-limb rehabilitation in patients with multiple sclerosis: Randomized controlled trial. J. Neuroeng. Rehabil. 2020, 17, 90. [Google Scholar] [CrossRef]
  7. Merians, A.S.; Tunik, E.; Adamovich, S.V. Virtual reality to maximize function for hand and arm rehabilitation: Exploration of neural mechanisms. Stud. Health Technol. Inform. 2009, 145, 109–114. [Google Scholar]
  8. Gutiérrez, Á.; Sepúlveda-Muñoz, D.; Gil-Agudo, Á.; de los Reyes Guzmán, A. Serious game platform with haptic feedback and EMG monitoring for upper-limb rehabilitation and smoothness quantification on spinal cord injury patients. Appl. Sci. 2020, 10, 963. [Google Scholar] [CrossRef]
  9. Verdejo, P.Á.; Lería, J.D.; Cativiela, J.G.; Falgueras, L.C.; Périz, V.M.; Combalía, R.A. La realidad virtual en fisioterapia: Una revolución en la rehabilitación. Dialnet 2024, 5, 514. (In Spanish) [Google Scholar]
  10. Dias, G.; Adrião, M.L.; Clemente, P.; da Silva, H.P.; Chambel, G.; Pinto, J.F. Effectiveness of a gamified and home-based approach for upper-limb rehabilitation. In Proceedings of the 44th Annual International Conference of the IEEE EMBS (EMBC), Glasgow, UK, 11–15 July 2022. [Google Scholar] [CrossRef]
  11. Burke, J.W.; McNeill, M.; Charles, D.; Morrow, P.; Crosbie, J.; McDonough, S. Serious games for upper-limb rehabilitation following stroke. In Proceedings of the 2009 Conference in Games and Virtual Worlds for Serious Applications (VS-Games), Coventry, UK, 23–24 March 2009. [Google Scholar] [CrossRef]
  12. Cela, A.F.; Oña, E.D.; Jardón, A. eJamar: A novel exergame controller for upper-limb motor rehabilitation. Appl. Sci. 2024, 14, 11676. [Google Scholar] [CrossRef]
  13. Guerrero-Hernández, A.L.; Albán, Ó.A.V.; Sabater Navarro, J.M. Sistema de rehabilitación de motricidad fina del miembro superior utilizando juegos serios. Ing. Desarro. 2023, 41, 1. (In Spanish) [Google Scholar] [CrossRef]
  14. Hidalgo, J.C.C.; Delgado, J.D.A.; Bykbaev, V.R.; Bykbaev, Y.R.; Coyago, T.P. Serious game to improve fine motor skills using Leap Motion. In Proceedings of the CACIDI 2018, Buenos Aires, Argentina, 28–30 November 2018. [Google Scholar] [CrossRef]
  15. Álvarez-Rodríguez, M.; Sepúlveda-Muñoz, D.; Lozano-Berrio, V.; Ceruelo-Abajo, S.; Gil-Agudo, A.; Gutiérrez-Martín, A.; de los Reyes-Guzmán, A. Preliminary development of two serious games for rehabilitation of spinal cord injured patients. In Proceedings of the International Conference on NeuroRehabilitation, Pisa, Italy, 16–20 October 2018; Springer: Cham, Switzerland, 2018. [Google Scholar] [CrossRef]
  16. Corrêa, A.G.D.; Kintschner, N.R.; Blascovi-Assis, S.M. System of upper-limb motor rehabilitation training using Leap Motion and Gear VR in sessions of home game therapy. In Proceedings of the 2019 IEEE Symposium on Computers and Communications (ISCC), Barcelona, Spain, 29 June–3 July 2019. [Google Scholar] [CrossRef]
  17. Padilla Magaña, J.F.; Peña Pitarch, E.; Sánchez Suarez, I.; Ticó Falguera, N. Evaluación del movimiento de la mano mediante el controlador Leap Motion. In Proceedings of the Memorias 16 Congreso Nacional de Ciencia, Tecnología e Innovación, Córdoba, Spain, 12–14 October 2021. (In Spanish). [Google Scholar]
  18. Herrera, V.; Albusac, J.; Angulo, E.; Gzlez-Morcillo, C.; de los Reyes, A.; Vallejo, D. Virtual reality-assisted goalkeeper training for upper-limb rehabilitation in a safe and adapted patient environment. IEEE Access 2024, 12, 194256–194279. [Google Scholar] [CrossRef]
  19. Lourenço, F.; Postolache, O.; Postolache, G. Tailored virtual reality and mobile application for motor rehabilitation. In Proceedings of the 2018 IEEE International Instrumentation and Measurement Technology Conference (I2MTC), Houston, TX, USA, 14–17 May 2018. [Google Scholar] [CrossRef]
  20. Eldaly, S. Effect of electrical muscular stimulation on occupational activities: A systematic review and meta-analysis. Res. Sq. 2023. Preprint. [Google Scholar] [CrossRef]
  21. Kumar, M.; Srivastava, S.; Das, V.S. Electromyographic analysis of selected shoulder muscles during rehabilitation exercises. J. Back Musculoskelet. Rehabil. 2018, 31, 947–954. [Google Scholar] [CrossRef]
  22. Batista, T.V.; dos Santos Machado, L.; Valença, A.M.G.; de Moraes, R.M. FarMyo: A serious game for hand and wrist rehabilitation using a low-cost electromyography device. Int. J. Serious Games 2019, 6, 3–19. [Google Scholar] [CrossRef]
  23. Yang, X.; Yeh, S.C.; Niu, J.; Gong, Y.; Yang, G. Hand rehabilitation using virtual reality electromyography signals. In Proceedings of the 2017 5th International Conference on Enterprise Systems (ES), Beijing, China, 22–24 September 2017. [Google Scholar] [CrossRef]
  24. Tang, P.; Cao, Y.; Vithran, D.T.A.V.; Xiao, W.; Wen, T.; Liu, S.; Li, Y. The efficacy of virtual reality on the rehabilitation of musculoskeletal diseases: Umbrella review. J. Med. Internet Res. 2025, 27, e64576. [Google Scholar] [CrossRef]
  25. Zhang, N.; Wang, H.; Wang, H.; Qie, S. Impact of the combination of virtual reality and noninvasive brain stimulation on the upper-limb motor function of stroke patients: A systematic review and meta-analysis. J. Neuroeng. Rehabil. 2024, 21, 179. [Google Scholar] [CrossRef]
  26. Sun, R.; Wang, Y.; Wu, Q.; Wang, S.; Liu, X.; Wang, P.; He, Y.; Zheng, H. Effectiveness of virtual and augmented reality for cardiopulmonary resuscitation training: A systematic review and meta-analysis. BMC Med. Educ. 2024, 24, 730. [Google Scholar] [CrossRef] [PubMed]
  27. Toledo-Peral, C.L.; Vega-Martinez, G.; Mercado-Gutiérrez, J.A.; Rodriguez-Reyes, G.; Vera-Hernandez, A.; Leija-Salas, L.; Gutierrez-Martinez, J. Virtual/augmented reality for rehabilitation applications using electromyography as control/biofeedback: Systematic literature review. Electronics 2022, 11, 2271. [Google Scholar] [CrossRef]
  28. Bouteraa, Y.; Ben Abdallah, I.; Elmogy, A. Design and control of an exoskeleton robot with EMG-driven electrical stimulation for upper-limb rehabilitation. Ind. Robot. 2020, 47, 489–501. [Google Scholar] [CrossRef]
  29. Dash, A.; Lahiri, U. Design of Virtual Reality-Enabled Surface Electromyogram-Triggered Grip Exercise Platform. IEEE Trans. Neural Syst. Rehabil. Eng. 2020, 28, 444–452. [Google Scholar] [CrossRef]
  30. Kuiken, T.A.; Li, G.; Lock, B.A.; Lipschutz, R.D.; Miller, L.A.; Stubblefield, K.A.; Englehart, K.B. Targeted muscle reinnervation for real-time myoelectric control of multifunction artificial arms. JAMA 2009, 301, 619–628. [Google Scholar] [CrossRef] [PubMed]
  31. Konrad, P. The ABC of EMG: A Practical Introduction to Kinesiological Electromyography; Noraxon USA Inc.: Scottsdale, AZ, USA, 2006. [Google Scholar]
  32. Merletti, R.; Parker, P.J. Electromyography: Physiology, Engineering, and Non-Invasive Applications; Wiley: Hoboken, NJ, USA, 2004. [Google Scholar]
  33. Phinyomark, A.; Khushaba, R.N.; Scheme, E. Feature extraction and selection for myoelectric control based on wearable EMG sensors. Sensors 2018, 18, 1615. [Google Scholar] [CrossRef] [PubMed]
Figure 1. General Functional Diagram.
Figure 2. System Input–Output Interaction Diagram.
Figure 3. Virtual interactions based on hand and finger position.
Figure 4. The interaction logic among scripts managing user sessions within the platform.
Figure 5. The development of virtual objects and kinematic control of the hand in immersive environments.
Figure 6. Comparative illustrations of two types of grip: dynamics of the hand and the object during the grasping action.
Figure 7. Data access, user profiles, and game difficulty types.
Figure 8. Handly system: user interaction using the HTC Vive headset, Leap Motion sensor, and EMG electrodes (a), and visualization of the virtual environment on screen (b).
Figure 9. Handly system setup: Leap Motion sensor and EMG electrodes.
Figure 10. Distribution of execution time by grip type and level (Boxplot).
Figure 11. Distribution of success rate by grip type and level (Boxplot).
Figure 12. Average number of objects collected by grip type and level.
Figure 13. Representation of effort levels during task execution.
Figure 14. Raw and RMS EMG signals during resting and contraction conditions.
Table 1. Experimental protocol per participant.
| Participant | Sessions | Levels | Type of Grips | Duration/Session (min) | Days |
|---|---|---|---|---|---|
| P1–P5, P10 | 4 | 3 | 4 | 20 | 12 |
| P6–P9 | 3 | 3 | 3 | 20 | 6 |
Table 2. Performance by grip type and level: time, accuracy, and number of items collected.
| Grip Type | Level | Mean Time (s) | Std Time | Mean Success | Std Success | Mean Objects | Std Objects | Samples |
|---|---|---|---|---|---|---|---|---|
| Card Grip | 1 | 39.33 | 6.86 | 0.85 | 0.14 | 8.50 | 1.38 | 6 |
| Card Grip | 2 | 38.50 | 6.41 | 0.90 | 0.13 | 9.00 | 1.26 | 6 |
| Card Grip | 3 | 37.83 | 5.60 | 0.90 | 0.11 | 9.00 | 1.10 | 6 |
| Cylindrical | 1 | 45.50 | 6.92 | 0.97 | 0.08 | 9.67 | 0.82 | 6 |
| Cylindrical | 2 | 43.00 | 7.04 | 0.93 | 0.12 | 7.80 | 3.62 | 6 |
| Cylindrical | 3 | 40.50 | 6.60 | 0.88 | 0.10 | 8.83 | 0.98 | 6 |
| Spherical | 1 | 40.50 | 6.50 | 0.92 | 0.10 | 9.17 | 0.98 | 6 |
| Spherical | 2 | 39.83 | 5.49 | 0.87 | 0.10 | 8.67 | 1.03 | 6 |
| Spherical | 3 | 38.50 | 4.97 | 0.88 | 0.13 | 8.83 | 1.33 | 6 |
| Hook | 1 | 43.67 | 5.35 | 0.92 | 0.12 | 9.17 | 1.17 | 6 |
| Hook | 2 | 42.67 | 5.16 | 0.85 | 0.14 | 8.50 | 1.38 | 6 |
| Hook | 3 | 40.50 | 4.04 | 0.90 | 0.13 | 9.00 | 1.26 | 6 |
| Grain Grip | 1 | 47.83 | 8.91 | 0.80 | 0.09 | 8.00 | 0.89 | 6 |
| Grain Grip | 2 | 48.50 | 5.24 | 0.88 | 0.08 | 8.83 | 0.75 | 6 |
| Grain Grip | 3 | 44.83 | 4.83 | 0.82 | 0.12 | 8.17 | 1.17 | 6 |
| Pencil Grip | 1 | 38.33 | 3.72 | 0.97 | 0.05 | 9.67 | 0.52 | 6 |
| Pencil Grip | 2 | 37.17 | 4.26 | 0.90 | 0.13 | 5.92 | 4.13 | 6 |
| Pencil Grip | 3 | 37.00 | 2.19 | 0.90 | 0.09 | 9.00 | 0.89 | 6 |
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
