Article

A Resource-Efficient Method for Real-Time Flexion–Extension Angle Estimation with an Under-Sensorized Finger Exoskeleton

Department of Industrial Engineering, University of Florence, 50139 Florence, Italy
*
Author to whom correspondence should be addressed.
Appl. Sci. 2026, 16(3), 1575; https://doi.org/10.3390/app16031575
Submission received: 16 January 2026 / Revised: 30 January 2026 / Accepted: 2 February 2026 / Published: 4 February 2026
(This article belongs to the Special Issue Latest Advances and Prospects of Human-Robot Interaction (HRI))

Abstract

Hand exoskeletons are used in rehabilitation together with serious games to enhance patient experience and, possibly, therapy outcomes. To achieve good engagement, a realistic virtual representation of hand motion is needed; however, the relationship between exoskeleton joint motion and anatomical finger kinematics is rarely obtained using low-cost procedures. This work introduces a mechanical redesign and modeling pipeline that utilizes temporary sensors to identify the exoskeleton–finger mapping, enabling qualitatively realistic virtual hand motion driven solely by the existing on-board sensor. A recently developed hand exoskeleton prototype was redesigned to host two temporary rotary encoders aligned with the MetaCarpoPhalangeal (MCP) and Proximal InterPhalangeal (PIP) joints, in addition to the actuation encoder. Healthy subjects wore the modified device and performed full flexion–extension cycles. Encoder trajectories were processed; then each cycle was approximated by a third-order polynomial in the normalized actuation angle, and a group-level model was obtained by averaging coefficients across valid cycles. Finally, the encoder-based reconstructions of MCP and PIP motion were evaluated against measurements from a gold-standard optical motion capture system. Results indicate that the proposed polynomial model enables joint-angle estimation with sufficient accuracy for interactive rehabilitation scenarios, supporting its use to drive smooth virtual hand motion from the on-board exoskeleton encoder alone.

1. Introduction

Impairments of hand motor function are a common consequence of neurological injuries, particularly stroke [1]. In addition to conventional rehabilitation supervised by physical therapists, wearable robotics, such as Hand Exoskeleton Systems (HESs), can further enhance rehabilitation training [2]. It is well established that, when individuals are properly motivated, the expression of latent motor ability is facilitated [3]. Accordingly, integrating a serious game with a HES provides an interactive human–robot rehabilitation setting that can become more engaging and potentially more effective, as reported in the literature [4,5,6,7]. In this context, improving the reliability and usability of human motion monitoring is fundamental for effective human–robot interaction in rehabilitation.
A key requirement for such interactive rehabilitation is the availability of reliable hand motion tracking to drive a virtual hand in real time. In particular, tracking solutions should balance accuracy with a low setup burden, as long calibration procedures and intrusive instrumentation can hinder repeatable clinical-like interaction. Sensors integrated into the HES, such as inertial sensors (IMUs) and other wearable sensors (e.g., magnetic encoders, flex sensors, and EMG), can be used to track the state of the device in real time and, if appropriately mapped onto a model of the body part under rehabilitation, to produce a natural movement of the model in the game. From a human–robot interaction (HRI) perspective, reducing the number of sensors at run time reduces intrusiveness and environmental constraints, improving usability, shortening setup time, and supporting repeatable interaction in clinical-like scenarios. Alternative hand tracking solutions have been explored in the literature, including vision-based systems and wearable sensing devices, each with its own advantages and limitations. In the following section, these approaches are discussed and the rationale for the method adopted in this study is outlined.
As will be discussed in detail in the following sections, each of the finger mechanisms of the hand exoskeleton considered in this work is a 1-degree-of-freedom (DoF) rigid kinematic chain equipped with a single built-in encoder measuring the exoskeleton motion. The encoder-measured motion coordinate is denoted as q_18(t), i.e., the only actuated (independent) coordinate among the 18 variables describing the device kinematics. While q_18 is sufficient to describe the device actuation, interactive virtual hand control typically requires joint-level kinematics, e.g., MetaCarpoPhalangeal (MCP) and Proximal InterPhalangeal (PIP) flexion angles. To enable such a representation while keeping the exoskeleton unmodified during operation, we propose a two-stage approach (Figure 1). During an initial model-fitting phase, we temporarily redesign the exoskeleton with 3D-printed add-on parts to mount two additional encoders that provide reference measurements of finger motion. The acquired encoder signals are filtered to attenuate measurement noise and obtain smoother trajectories; the resulting dataset is then used to identify polynomial mapping functions that relate the built-in encoder angle q_18(t) to the corresponding MCP and PIP joint angles. After the model-fitting stage, the add-on sensors can be removed and the original exoskeleton can be used unmodified: the virtual hand motion is reconstructed in real time using only the on-board encoder q_18(t) and the identified polynomial mappings, without requiring external motion capture systems or other intrusive equipment for deployment.
In this work, we present the proposed model-fitting strategy and its integration in a Unity-based serious game to control a virtual hand (Figure 1). Resource efficiency is achieved by limiting sensing and infrastructure requirements during testing and routine operation: MCP and PIP flexion–extension angles are estimated in real time from a single on-board encoder, avoiding additional wearable sensors and external tracking systems during rehabilitation sessions; additional sensing is required only during the model-fitting phase, using low-cost temporary encoders mounted through reversible 3D-printed add-on parts, without permanently modifying the device worn by the patient. The approach is further evaluated to investigate whether the encoder-based sensing strategy, relying on a single on-board encoder at run time, can provide sufficiently accurate and repeatable MCP and PIP kinematics for interactive rehabilitation-oriented applications. When needed, external motion capture (MoCap) is used only for experimental validation and is not required for system operation.
The main contributions of this paper are as follows: (i) a reversible and resource-efficient 3D-printed redesign enabling temporary sensorization while preserving the original device for deployment; (ii) a polynomial regression-based mapping from the single built-in encoder signal q_18(t) to MCP and PIP joint kinematics, obtained from filtered data collected during the model-fitting stage; (iii) the integration of the proposed reconstruction pipeline in a Unity serious game for real-time virtual-hand control; and (iv) an experimental evaluation including repeatability analysis and motion capture-based validation.
The structure of this paper is as follows: Section 1 reviews motion-tracking methods for hand kinematics and positions the proposed approach within existing solutions; Section 2 describes the exoskeleton’s redesign, the sensing architecture, and the motion reconstruction pipeline; Section 3 reports the experimental results obtained with the proposed system; and finally, Section 4 discusses the findings and concludes this paper.

Related Work

The following discussion outlines the different tracking systems, their pros and cons, and the rationale behind the choice of the current method to improve finger motion tracking with the HES developed at the Department of Industrial Engineering of the University of Florence (DIEF-HES) (as detailed later in Section 2.1).
Tracking systems can be classified as non-visual, visual-based, or a combination of both [8,9,10,11,12] (Figure 2).
Visual tracking systems always include cameras, which can be either stereo cameras or 3D depth cameras (key system parameters typically include resolution, field of view, frame rate, and working distance). In the former case, two images are used to reconstruct the subject of interest, whereas in the latter, a camera is combined with one or more sensors, such as infrared. In both cases, both budget-friendly and expensive solutions are available, and an initial calibration is required for data acquisition, determining the quality of the capture (e.g., intrinsic/extrinsic calibration and/or hand model initialization). While the choice of the camera is a crucial factor in visual tracking systems, the setup of the tracked object is also a key aspect. One approach involves placing trackers in strategic positions of the object [13], in which case marker occlusions or marker swapping may occur [14] (often quantified via tracking loss rate under occlusions and reprojection/pose consistency). Alternatively, colored gloves can be used [15]. A markerless measurement system is also available, offering a less intrusive experience for the user but at the cost of being less accurate (performance is commonly reported in terms of positional/joint angle error and end-to-end latency/jitter in real-time applications) [16,17].
The other main tracking method is non-visual tracking, which includes inertial-based, magnetic-based, or other sensor-based systems. Inertial Measurement Units (IMUs) [18,19,20] use accelerometers, gyroscopes, and/or magnetometers to acquire data on inertial motion and the 3D orientation of joined individual segments. Unlike visual systems, IMUs do not suffer from the line-of-sight problem, though fluctuations in offsets and measurement noise can lead to integration drift. Magnetic sensors are also widely used for tracking movements in virtual reality (VR) due to their small size, although they may suffer from latency and jitter [21]. In all cases, the sensors can be integrated into devices worn by the subject. While this solution provides higher measurement accuracy, it also requires taking up part of the limited space surrounding the hand, making it a more intrusive solution [22,23,24].
Finally, hybrid solutions [25,26,27] typically use cameras, such as Dynamic Vision Sensors (DVSs) or 3D depth cameras, to capture hand movements, combined with ElectroMyoGraphic (EMG) signal measurements to correlate muscle activity with the tracked motion.
Within this landscape, our work focuses on minimal, encoder-based sensing and on identifying a subject-specific mapping from device actuation to finger joint kinematics, avoiding external vision systems during deployment and thus eliminating line-of-sight constraints and reducing sensitivity to occlusions. The design choice of using encoders is motivated by (i) the limited space available in hand-coupled mechanisms, for which compact magnetic encoders (8 mm diameter, 3 mm height) are particularly suitable and (ii) the presence of an on-board encoder of the same type in the baseline device. Therefore, to ensure measurement consistency, the two temporary encoders added for the model-fitting phase were selected to match the built-in one, enabling the use of three homogeneous sensors.

2. Materials and Methods

This section describes the materials and methods used in this study. The device adopted for the experiments is first presented, including its mechanical structure, operating principle, and integration with the other components of the system. The relevant kinematic model and variables used throughout this work are also introduced. The following subsections illustrate the redesign of the device, the measurement setup, and the signal processing pipeline. Finally, the validation procedures adopted to assess the performance of the proposed approach are outlined.

2.1. Baseline Hand Exoskeleton

This study was conducted using the 2022 model of the DIEF-HES, which represents the most recent stable and fully operational version available; a complete hardware and software redesign is currently under development and testing and was therefore not used. The system comprises the DIEF-HES, the Remote Actuation System (RAS), which houses the actuation and control components, and a monitor that displays the serious game used during the rehabilitation process (Figure 3). The DIEF-HES, worn integral to the back of the hand and secured to the intermediate phalanges of each finger via Velcro straps, is connected to the RAS through Bowden cables (Semerfil Worldwires s.r.l., Bari, Italy) for actuation and through electrical cables for communication with the on-board sensors (encoders (RM08 Linear Miniature Rotary Magnetic Encoder, RLS, Komenda, Slovenia) and load cells (FSSM-500N, Forsentek Co., Shenzhen, China)). The serious game used alongside the DIEF-HES was specifically designed to work together with the system to engage the user during rehabilitation while guiding them through the exercises prescribed by physical therapists. The game simulates the user's hand moving and interacting with digital objects in a virtual environment. Specifically, the DIEF-HES's encoders measure the Finger Mechanism (FM) motion and drive the flexion–extension movements of the digital fingers. The embedded load cells measure the force applied by the user with each finger while playing the game; the hand exoskeleton is hence force-controlled to follow the force references coming from the virtual reality (e.g., the force reference is zero when no interaction with objects is detected, or it assumes values proportional to the stiffness of the objects and the indentation between the bounding boxes of the object and the fingers). From this perspective, accurate tracking of the finger kinematics is mandatory, since the interaction with virtual objects heavily relies on it.
As better described in later sections, the DIEF-HES did not have a sufficiently accurate finger motion tracking system, which motivated the work described in this paper.
Regarding the mechanical design of the FM, it consists of five links and a ground frame (Figure 4). The frame is rigidly coupled to the hand housing via a magnetic interface and a pin-hole coupling, making it integral to the patient's hand. The FM has one DoF; consequently, the mechanism configuration is fully determined by a single generalized coordinate, this being the rear crank rotation q_18, which is the actuated link. Actuation is provided through a Bowden cable transmission: the system is connected, on one side, to a pulley rigidly attached to the rear crank and, on the actuator side, to a pulley mounted on an electric motor. Two sensors are integrated in the system: (i) a magnetic encoder, placed at joint Z to measure q_18, which uniquely determines the full mechanism configuration at any time, and (ii) a load cell mounted integral to the connecting rod to continuously measure the axial force in that link. While the back of the hand is integral to the frame, the finger's intermediate phalanx is secured to the thimble via a Velcro strap. The thimble is the only FM component whose position is not directly known during the exercise, as it is connected to the mechanism through a passive pivot–slider coupling (joint E). This interface constrains the finger–thimble interaction to occur only along the direction normal to the phalanx, which is desirable to minimize shear stresses on the skin.

2.2. Exoskeleton Redesign and Sensor Integration

The redesign focused on a single FM of the DIEF-HES, specifically the index finger FM, to validate the proposed method and assess its performance before extending the approach to the full device. The FM motion can be tracked in real time through the on-board encoder, which measures the rear crank angle q_18; however, this measurement alone does not provide the thimble rotation (and thus the finger phalanx motion). The objective of the redesign is therefore to estimate the finger phalanx kinematics as a function of the mechanism motion. To this end, the rotational displacements of the MCP and PIP joints must be recorded together with q_18. Accordingly, two additional temporary encoders were introduced: one to measure the MCP rotation and one to measure the thimble rotation, which is coupled to the intermediate phalanx and can thus be used to measure the PIP rotation.
The mechanical redesign can be summarized in two main interventions. First, to embed a magnetic encoder in the thimble, the latter and the connected link were redesigned while leveraging the existing FM architecture. Second, a dedicated temporary external assembly, referred to as the Phalanx–Metacarpal Module (PMM), was developed to accommodate the second temporary magnetic encoder for MCP joint angle measurement. Figure 5 illustrates the mechanical redesign adopted to enable temporary sensorization. For clarity, Figure 6 further provides a schematic representation of the resulting closed-chain coupling between the assemblies and the finger, explicitly showing how the measured device variables relate to the finger joint kinematics. In this schematic, the angles θ and φ are introduced only to illustrate the kinematic coupling, as formalized in Equations (1) and (2).
$$\mathrm{MCP}(t) = \theta_{\mathrm{enc}}(t) - \theta_{\mathrm{enc}}(t_0), \qquad \mathrm{PIP}(t) = \varphi_{\mathrm{enc}}(t) - \varphi_{\mathrm{enc}}(t_0) \tag{1}$$
$$\left[\mathrm{MCP}(t),\ \mathrm{PIP}(t)\right] = h\left(q_{18}(t)\right). \tag{2}$$
After offset removal at the open-hand reference, these quantities are directly mapped to the anatomical joint angles MCP and PIP, which are used consistently throughout the rest of this paper for clarity and for presenting the experimental results.
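Equation (1) amounts to subtracting the open-hand reference sample from each raw encoder reading. A minimal Python sketch, using hypothetical readings in degrees (not data from the study) and assuming the first sample is acquired at the open-hand reference t0:

```python
import numpy as np

def remove_open_hand_offset(theta_enc, phi_enc):
    """Map raw encoder readings to anatomical joint angles (Equation (1)).

    theta_enc, phi_enc: arrays of raw MCP-side and thimble encoder angles
    in degrees; index 0 is assumed to be the open-hand reference sample t0.
    """
    mcp = theta_enc - theta_enc[0]  # MCP(t) = theta_enc(t) - theta_enc(t0)
    pip = phi_enc - phi_enc[0]      # PIP(t) = phi_enc(t) - phi_enc(t0)
    return mcp, pip

# Hypothetical raw readings (degrees), for illustration only
theta = np.array([12.0, 20.0, 35.0])
phi = np.array([5.0, 18.0, 40.0])
mcp, pip = remove_open_hand_offset(theta, phi)  # mcp -> [0, 8, 23]
```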
Since the two new assemblies were devised solely as test cases, encumbrance and weight were considered but not treated as strict design constraints, as the assemblies were not intended for final deployment in robot-assisted therapy with patients.
Both the redesigned thimble and PMM were designed for 3D printing in Acrylonitrile Butadiene Styrene (ABS).
The sensors selected for this study are two miniature rotary magnetic encoders, identical to the one already embedded in the device. These sensors offer several advantages, mainly a compact body only 8 mm in diameter, an accuracy of ±0.3°, high-speed operation up to 30,000 rpm, and a non-contact, frictionless design, which make them suitable for the proposed integration.
Finally, structural static analyses of the new components were performed using the Finite Element Method (FEM). Material properties of ABS, including yield stress (31 MPa) and Young’s modulus (1.5 GPa), were considered in SolidWorks v2020 simulations. Previous activities of the research group adopted a nominal force of 15 N; for this work, a force of 20 N was chosen to ensure the components’ reliability. The load was applied on joint E, while hinge constraints were used to replicate the interactions with connected components. In addition, the force direction identified in the latest HES study by the authors [6] was considered in relation to the global reference frame and the corresponding value of q 18 .
It is noteworthy that, compared to the original aluminum design, the redesigned ABS CE-link was thickened to ensure its structural integrity. In conclusion, the most stressed component of the system can be considered well dimensioned: under the worst operating conditions, it showed a maximum stress of 6.23·10^6 N/m² and a maximum displacement of 9.11·10^−1 mm, i.e., a stress far from the yield condition (a safety factor of about 5) and an acceptable maximum displacement, confirming the new assembly design.
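As a quick arithmetic check (not part of the original analysis), the stated safety factor follows directly from the ABS yield stress and the peak simulated stress:

```python
yield_stress = 31e6   # Pa, ABS yield stress used in the FEM setup
max_stress = 6.23e6   # Pa, maximum stress under worst operating conditions
safety_factor = yield_stress / max_stress  # ~4.98, i.e., roughly 5
```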

2.3. Measurement and Signal Processing

Three miniature non-contact rotary magnetic encoders (RLS RM08 Linear Miniature Rotary Magnetic Encoder) were chosen for this application. The encoder mounted at joint Z measures the angle q_18, whereas the two temporary encoders, introduced for this measurement phase, measure the thimble rotation (i.e., the PIP angle) and the MCP angle; the latter encoder is housed in the new PMM (Figure 7). The data acquisition setup comprises an Arduino Mega microcontroller board, a breadboard for circuitry, and an ESP32.
Five healthy subjects were enrolled in this preliminary study after providing informed consent and were asked to wear the redesigned FM and the PMM. Data collection was performed as follows. First, the exoskeleton was set in the open configuration, with both the thimble and the PMM in a horizontal position. The initial configuration of the FM, thimble, and PMM was used as the starting reference for data evaluation, corresponding to the q_18 open position. Second, the user's hand, and consequently the connected FM, was closed at a constant speed until the rear crank reached the closed end-stop. The procedure was repeated three additional times to minimize the influence of external noise or software-related errors. It is worth noting that the movement performed by the volunteers corresponds to the gesture intended to be displayed in the serious game during rehabilitation exercises.
The recorded data (stored as CSV files) were imported into MATLAB R2023b for processing (Figure 8). This processing pipeline was selected to attenuate tremor and transmission-related fluctuations while preserving the low-frequency kinematic trends that are relevant for real-time interaction in the serious game. Prior to inclusion in the dataset, the signals were filtered according to the following pipeline: (i) basic pre-processing, including offset removal; no timestamp correction or resampling was required because all encoders shared the same acquisition frequency (identical sensor model and configuration); (ii) low-pass Butterworth filtering (4th order, zero-phase using filtfilt to avoid phase lag, and 5 Hz cut-off) [28,29,30] to attenuate high-frequency components attributable to physiological tremor and mechanical friction in the exoskeleton; (iii) additional low-pass smoothing using a Savitzky–Golay filter (3rd-order polynomial) [31,32,33] to further reduce residual high-frequency fluctuations while preserving the overall signal shape.
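The three filtering steps can be sketched in Python with SciPy; the Savitzky–Golay window length below is an assumption, since only the polynomial order is reported:

```python
import numpy as np
from scipy.signal import butter, filtfilt, savgol_filter

def filter_encoder_signal(raw, fs, cutoff_hz=5.0, sg_window=21, sg_order=3):
    """Sketch of the processing pipeline: (i) offset removal, (ii) zero-phase
    4th-order Butterworth low-pass at 5 Hz, (iii) Savitzky-Golay smoothing.

    fs is the shared encoder sampling rate (Hz); sg_window (odd) is an
    assumed value, not taken from the paper.
    """
    x = raw - raw[0]                           # (i) offset removal at the open-hand reference
    b, a = butter(4, cutoff_hz / (fs / 2.0))   # (ii) 4th-order low-pass design
    x = filtfilt(b, a, x)                      # zero-phase filtering (no phase lag)
    return savgol_filter(x, sg_window, sg_order)  # (iii) shape-preserving smoothing
```

As a sanity check, a slow sinusoid corrupted by a 30 Hz component and a constant offset is recovered almost exactly by this pipeline.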
For each recording, flexion–extension cycles were extracted from the filtered encoder trajectories. A cycle was defined as a movement starting from a relaxed posture, reaching a clear flexion peak, and returning to the initial range of motion. Cycles with incomplete motion (reduced range), artifacts (loss of contact and signal saturation), or interruptions were discarded. From each recording, one representative flexion–extension cycle (close–open) was selected from the central portion of the trial and time-normalized to [0, 1]. Therefore, five total cycles were retained (n ∈ {1, …, 5}). Each subject-specific cycle was used to fit a third-order polynomial expressing the measured MCP and PIP angles as functions of the FM control angle q_18. A third-order polynomial was chosen as the lowest-order model capable of capturing the smooth but asymmetric curvature of the flexion–extension profile while avoiding the oscillations and overfitting typically introduced by higher-order fits.
$$\mathrm{MCP}_n(q_{18}) = a_0^n + a_1^n q_{18} + a_2^n q_{18}^2 + a_3^n q_{18}^3, \quad n \in \{1, \ldots, 5\} \tag{3}$$
$$\mathrm{PIP}_n(q_{18}) = b_0^n + b_1^n q_{18} + b_2^n q_{18}^2 + b_3^n q_{18}^3, \quad n \in \{1, \ldots, 5\} \tag{4}$$
where a^n = [a_0^n, a_1^n, a_2^n, a_3^n]^T and b^n = [b_0^n, b_1^n, b_2^n, b_3^n]^T are the polynomial coefficient vectors. For each subject, the approximation quality was quantified by computing the Root-Mean-Square Error (RMSE) between the measured angles and the corresponding polynomial reconstructions. Group-level models were then obtained by averaging the coefficients across cycles:
$$\bar{a} = \frac{1}{5}\sum_{n=1}^{5} a^n, \qquad \bar{b} = \frac{1}{5}\sum_{n=1}^{5} b^n \tag{5}$$
The resulting polynomials MCP(q_18) and PIP(q_18) describe a representative flexion–extension pattern used to drive the joint motion of the virtual finger in the serious game.
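A compact sketch of the per-cycle fitting and coefficient averaging in Equations (3)–(5), using synthetic cycles whose 70/15/−5 coefficients are made up for illustration in place of the recorded trajectories:

```python
import numpy as np
from numpy.polynomial import polynomial as P

def fit_cycle(q18_norm, joint_angle, order=3):
    """Fit one flexion-extension cycle with a 3rd-order polynomial in the
    normalized actuation angle; returns [c0, c1, c2, c3] (ascending powers)."""
    return P.polyfit(q18_norm, joint_angle, order)

def group_model(per_cycle_coeffs):
    """Average the coefficients across the valid cycles (Equation (5))."""
    return np.mean(per_cycle_coeffs, axis=0)

# Synthetic MCP-like cycles over the normalized actuation angle (degrees)
q = np.linspace(0.0, 1.0, 50)
cycles = [70 * q + 15 * q**2 - 5 * q**3
          + np.random.default_rng(n).normal(0.0, 0.5, q.size)
          for n in range(5)]
coeffs = np.array([fit_cycle(q, c) for c in cycles])  # shape (5, 4)
a_bar = group_model(coeffs)        # group-level model
mcp_hat = P.polyval(q, a_bar)      # real-time reconstruction from q18 alone
```

At run time only the last line is needed: the stored averaged coefficients are evaluated at the current on-board encoder reading.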

2.4. Validation with Motion Capture

The MoCap system used for this phase was an OptiTrack motion capture system. Twelve infrared cameras were employed: eleven for tracking purposes and one configured to record a black-and-white reference video. To capture the angles of interest, the experimental setup included the hand wearing the FM, with reflective markers placed at strategic locations. Four markers were attached to the exoskeleton to track the rotation of q_18 and the thimble (hence, the PIP joint), while three markers were placed on the finger to estimate the MCP and, again, the PIP joint rotations (Figure 9).
Motive v3.1 (OptiTrack's software) provides several predefined marker sets for describing body motion; however, for this application a custom marker set was created.
Five subjects were asked to wear the glove with the FM and perform the same movements executed during the previous encoder-based recordings. After the acquisitions, the recordings were refined with Motive in the dedicated data-editing section. The resulting data were then exported as CSV files and processed in MATLAB.

3. Results

Figure 10 shows the group-level polynomial mappings relating the normalized actuation angle q_18,norm ∈ [0, 1] to the corresponding joint kinematics. Reference MCP and PIP trajectories were obtained from the filtered measurements of the permanent on-board encoder and the two temporarily mounted encoders, as described in Section 2.3. In contrast, the mapping functions were evaluated using the raw encoder signal (after offset removal), in accordance with the intended usage in the Unity serious game. The top two panels of Figure 10 show the raw data taken from the five participants. For each subject, the time interval corresponding to the finger closing movement with the best signal quality was selected and used to compute the final polynomial fit by averaging the single polynomial coefficients across subjects (Table 1). For completeness, the InterQuartile Range (IQR) (25–75%) is reported in all panels.
The volunteers for this preliminary dataset were a mixed group of female and male subjects aged 20–30 years; the starting point was set to 0 for each group of samples. For these reasons, data variability tends to increase as the motion progresses away from the starting point. Reconstruction accuracy was quantified by comparing the estimated angles θ̂ with the reference trajectories θ_ref (encoder measurements). Residuals were defined as e(q_18) = θ_ref(q_18) − θ̂(q_18) and are summarized in the bottom panels through their median and IQR. Table 1 reports the error metrics, including RMSE and Mean Absolute Error (MAE). Overall, MCP reconstruction achieved an RMSE of 8.1° and an MAE of 5.4°, whereas PIP reconstruction resulted in an RMSE of 7.8° and an MAE of 5.6°. Given the preliminary nature of this study and the intended use of the mapping to drive a virtual finger in a Unity-based serious game, the main requirement is a stable and visually plausible motion rendering rather than clinical-grade goniometric accuracy. Accordingly, the observed error levels are considered acceptable for the target real-time application.
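For reference, RMSE and MAE summarize the residuals e(q_18) as follows (hypothetical angle samples, not the study data):

```python
import numpy as np

def error_metrics(theta_ref, theta_hat):
    """RMSE and MAE of the residuals e = theta_ref - theta_hat (degrees)."""
    e = theta_ref - theta_hat
    return float(np.sqrt(np.mean(e**2))), float(np.mean(np.abs(e)))

ref = np.array([0.0, 10.0, 25.0, 40.0])  # hypothetical reference angles
est = np.array([1.0, 8.0, 27.0, 44.0])   # hypothetical reconstructions
rmse, mae = error_metrics(ref, est)      # rmse = 2.5, mae = 2.25
```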
Repeatability was assessed by analyzing the variability of the reference trajectories and the consistency of the reconstructed angles across repetitions and subjects. The residual spread is minimal at q_18 = 0 and increases toward the closed-hand configuration, which is consistent with the alignment procedure at the starting point and with residual sources of variability, such as micro-movements of the hand and occasional stick–slip of the mechanism, despite filtering.
MoCap recordings, used to validate the learned kinematic relationships θ(q_18), were collected in separate sessions; five volunteers were asked to perform repetitions of closing–opening of the hand. The collected data were then mapped in relation to the normalized actuation angle q_18,norm ∈ [0, 1] to allow a comparison with θ(q_18) (Figure 11). Specifically, encoder and MoCap recordings were subject to event-based alignment; hence, participants were asked to hold the open and closed positions longer than the rest of the movement, so that the open and closed configurations could be used as q_18 = 0 and q_18 = 1, respectively. MoCap was used as an independent plausibility check of the reconstructed kinematics (range) rather than as a synchronized point-wise ground truth. Agreement was quantified using MAE and IQR coverage, which are robust to marker noise and inter-subject variability.
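The event-based normalization described above can be sketched as follows; the plateau index ranges are assumptions standing in for the detected open- and closed-hold phases:

```python
import numpy as np

def normalize_q18(q18, open_hold, closed_hold):
    """Map the raw actuation angle onto [0, 1] using the mean angle over the
    held open and closed plateaus (event-based alignment sketch).

    open_hold, closed_hold: (start, stop) sample index pairs during which
    the subject held the open and closed configurations.
    """
    q_open = np.mean(q18[slice(*open_hold)])
    q_closed = np.mean(q18[slice(*closed_hold)])
    return (q18 - q_open) / (q_closed - q_open)

# Hypothetical trial: 10-sample open hold, a closing ramp, 10-sample closed hold
q18 = np.concatenate([np.full(10, 5.0),
                      np.linspace(5.0, 45.0, 30),
                      np.full(10, 45.0)])
q18_norm = normalize_q18(q18, (0, 10), (40, 50))  # 0 at open, 1 at closed
```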
Finally, the Unity serious game was updated with the MCP(q_18) and PIP(q_18) functions; these results support the use of the group-level polynomials as a compact representation for real-time reconstruction.

4. Discussion and Conclusions

This work proposed a practical, two-stage workflow to reconstruct finger joint kinematics from the single on-board encoder signal of a 1-DoF FM of the DIEF-HES. During the first phase, the permanent on-board encoder and two temporary encoders were used to obtain reference MCP and PIP flexion–extension trajectories and identify third-order polynomial mappings θ(q_18) (with q_18 being the variable that describes the 1-DoF FM kinematics). In the operational phase, the temporary sensors can be removed and the virtual finger motion can be rendered in real time using only the permanent encoder q_18(t), thus reducing instrumentation, setup time, and overall user burden in rehabilitation-oriented HRI scenarios.
Experimental results on five healthy subjects showed that the learned mappings provide a stable and visually plausible reconstruction over repeated opening–closing cycles. Against the encoder-based reference trajectories, the approach achieved an RMSE of 8.1° (MCP) and 7.8° (PIP), with an MAE of 5.4° (MCP) and 5.6° (PIP). Residual trends indicated that errors are not uniformly distributed along q_18,norm, with larger deviations appearing away from the starting posture, consistent with increased variability during the motion. An additional validation was performed using independent MoCap recordings as a plausibility check of the reconstructed kinematic ranges. In this setting, agreement remained satisfactory, with a MoCap-based MAE of 3.3° (MCP) and 4.3° (PIP) and IQR coverage of 60% and 77.5%, respectively, supporting the consistency of the reconstructed motion within typical inter-subject dispersion.
Overall, the proposed method enables joint angle estimation suitable for interactive rehabilitation applications, where the primary requirement is robust, repeatable, and realistic motion rendering (e.g., in a Unity serious game with average errors around 9°) [34,35,36,37], rather than clinical-grade goniometric accuracy (with average errors below 5°) [38,39]. Future work will (i) extend the evaluation to a larger group and to target users with motor impairments and (ii) generalize the approach to additional degrees of freedom and more complex grasp patterns, enabling richer HRI behaviors while preserving minimal sensing at run time.

Author Contributions

Conceptualization, A.D.N. and N.S.; methodology, A.D.N. and G.L.; software, A.D.N. and M.G.; validation, A.D.N.; formal analysis, A.D.N. and G.L.; investigation, A.D.N. and M.G.; data curation, A.D.N. and M.G.; supervision, A.R. and B.A.; writing—original draft preparation, A.D.N. and N.S. All authors have read and agreed to the published version of the manuscript.

Funding

This work was supported by the Italian Ministry of University and Research, under the complementary actions to the NRRP ‘Fit4MedRob–Fit for Medical Robotics’ Grant (# PNC0000007).

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

The original contributions presented in this study are included in the article. Further inquiries can be directed to the corresponding author.

Conflicts of Interest

The authors declare no conflicts of interest.

References

  1. Raghavan, P. The nature of hand motor impairment after stroke and its treatment. Curr. Treat. Options Cardiovasc. Med. 2007, 9, 221–228. [Google Scholar] [CrossRef]
  2. Du Plessis, T.; Djouani, K.; Oosthuizen, C. A review of active hand exoskeletons for rehabilitation and assistance. Robotics 2021, 10, 40. [Google Scholar] [CrossRef]
  3. Taub, E.; Miller, N.E.; Novack, T.A.; Cook, E.W.; Fleming, W.C.; Nepomuceno, C.S.; Connell, J.S.; Crago, J. Technique to improve chronic motor deficit after stroke. Arch. Phys. Med. Rehabil. 1993, 74, 347–354. [Google Scholar] [PubMed]
  4. Nizamis, K.; Athanasiou, A.; Almpani, S.; Dimitrousis, C.; Astaras, A. Converging robotic technologies in targeted neural rehabilitation: A review of emerging solutions and challenges. Sensors 2021, 21, 2084. [Google Scholar] [CrossRef] [PubMed]
  5. Tosto-Mancuso, J.; Tabacof, L.; Herrera, J.E.; Breyman, E.; Dewil, S.; Cortes, M.; Correa-Esnard, L.; Kellner, C.P.; Dangayach, N.; Putrino, D. Gamified neurorehabilitation strategies for post-stroke motor recovery: Challenges and advantages. Curr. Neurol. Neurosci. Rep. 2022, 22, 183–195. [Google Scholar] [CrossRef]
  6. Bartalucci, L.; Secciani, N.; Brogi, C.; Topini, A.; Della Valle, A.; Ridolfi, A.; Allotta, B. An original mechatronic design of a kinaesthetic hand exoskeleton for virtual reality-based applications. Mechatronics 2023, 90, 102947. [Google Scholar] [CrossRef]
  7. Topini, A.; Sansom, W.; Secciani, N.; Bartalucci, L.; Ridolfi, A.; Allotta, B. Variable admittance control of a hand exoskeleton for virtual reality-based rehabilitation tasks. Front. Neurorobot. 2022, 15, 789743. [Google Scholar] [CrossRef]
  8. Zhou, H.; Hu, H. Human motion tracking for rehabilitation—A survey. Biomed. Signal Process. Control 2008, 3, 1–18. [Google Scholar] [CrossRef]
  9. Buckingham, G. Hand tracking for immersive virtual reality: Opportunities and challenges. Front. Virtual Real. 2021, 2, 728461. [Google Scholar] [CrossRef]
  10. Ahmad, A.; Migniot, C.; Dipanda, A. Hand pose estimation and tracking in real and virtual interaction: A review. Image Vis. Comput. 2019, 89, 35–49. [Google Scholar] [CrossRef]
  11. Bouteraa, Y.; Abdallah, I.B.; Elmogy, A.M. Training of hand rehabilitation using low cost exoskeleton and vision-based game interface. J. Intell. Robot. Syst. 2019, 96, 31–47. [Google Scholar] [CrossRef]
  12. Wang, Q.; Markopoulos, P.; Yu, B.; Chen, W.; Timmermans, A. Interactive wearable systems for upper body rehabilitation: A systematic review. J. Neuroeng. Rehabil. 2017, 14, 20. [Google Scholar] [CrossRef] [PubMed]
  13. Aristidou, A. Hand tracking with physiological constraints. Vis. Comput. 2018, 34, 213–228. [Google Scholar] [CrossRef]
  14. Hamer, H.; Schindler, K.; Koller-Meier, E.; Van Gool, L. Tracking a hand manipulating an object. In Proceedings of the 2009 IEEE 12th International Conference on Computer Vision, Kyoto, Japan, 29 September–2 October 2009; IEEE: Piscataway, NJ, USA, 2009; pp. 1475–1482. [Google Scholar]
  15. Wang, R.Y.; Popović, J. Real-time hand-tracking with a color glove. ACM Trans. Graph. (TOG) 2009, 28, 63. [Google Scholar] [CrossRef]
  16. Metcalf, C.D.; Robinson, R.; Malpass, A.J.; Bogle, T.P.; Dell, T.A.; Harris, C.; Demain, S.H. Markerless motion capture and measurement of hand kinematics: Validation and application to home-based upper limb rehabilitation. IEEE Trans. Biomed. Eng. 2013, 60, 2184–2192. [Google Scholar] [CrossRef]
  17. Malik, S. Real-Time Hand Tracking and Finger Tracking for Interaction CSC2503F Project Report; Tech. Rep; Department of Computer Science, University of Toronto: Toronto, ON, Canada, 2003. [Google Scholar]
  18. O’Reilly, M.; Caulfield, B.; Ward, T.; Johnston, W.; Doherty, C. Wearable inertial sensor systems for lower limb exercise detection and evaluation: A systematic review. Sport. Med. 2018, 48, 1221–1246. [Google Scholar] [CrossRef]
  19. Madgwick, S.O.; Harrison, A.J.; Vaidyanathan, R. Estimation of IMU and MARG orientation using a gradient descent algorithm. In Proceedings of the 2011 IEEE International Conference on Rehabilitation Robotics, Zurich, Switzerland, 29 June–1 July 2011; IEEE: Piscataway, NJ, USA, 2011; pp. 1–7. [Google Scholar]
  20. Burns, A.; Greene, B.R.; McGrath, M.J.; O’Shea, T.J.; Kuris, B.; Ayer, S.M.; Stroiescu, F.; Cionca, V. SHIMMER™—A wireless sensor platform for noninvasive biomedical research. IEEE Sensors J. 2010, 10, 1527–1534. [Google Scholar] [CrossRef]
  21. Lenz, J.E. A review of magnetic sensors. Proc. IEEE 2002, 78, 973–989. [Google Scholar] [CrossRef]
  22. Luo, Z.; Lim, C.K.; Yang, W.; Tee, K.Y.; Li, K.; Gu, C.; Nguen, K.D.; Chen, I.M.; Yeo, S.H. An interactive therapy system for arm and hand rehabilitation. In Proceedings of the 2010 IEEE Conference on Robotics, Automation and Mechatronics, Singapore, 28–30 June 2010; IEEE: Piscataway, NJ, USA, 2010; pp. 9–14. [Google Scholar]
  23. Casas, R.; Martin, K.; Sandison, M.; Lum, P.S. A tracking device for a wearable high-DOF passive hand exoskeleton. In Proceedings of the 2021 43rd Annual International Conference of the IEEE Engineering in Medicine & Biology Society (EMBC), Guadalajara, Mexico, 26–30 July 2021; IEEE: Piscataway, NJ, USA, 2021; pp. 6643–6646. [Google Scholar]
  24. Di Natale, A.; Bartalucci, L.; Secciani, N.; Liverani, G.; Ridolfi, A.; Allotta, B. Robot-Assisted Rehabilitation: Mechatronic Redesign of a Finger Exoskeleton to Improve Its Motion Tracking Capabilities. In Proceedings of the International Conference of IFToMM ITALY, Salerno, Italy, 6–8 May 2024; Springer: Berlin/Heidelberg, Germany, 2024; pp. 51–58. [Google Scholar]
  25. Ceolini, E.; Frenkel, C.; Shrestha, S.B.; Taverni, G.; Khacef, L.; Payvand, M.; Donati, E. Hand-gesture recognition based on EMG and event-based camera sensor fusion: A benchmark in neuromorphic computing. Front. Neurosci. 2020, 14, 637. [Google Scholar] [CrossRef]
  26. Kainz, O.; Jakab, F. Approach to hand tracking and gesture recognition based on depth-sensing cameras and EMG monitoring. Acta Inform. Pragensia 2014, 3, 104–112. [Google Scholar] [CrossRef]
  27. Arkenbout, E.A.; De Winter, J.C.; Breedveld, P. Robust hand motion tracking through data fusion of 5DT data glove and nimble VR Kinect camera measurements. Sensors 2015, 15, 31644–31671. [Google Scholar] [CrossRef]
  28. Roda-Sales, A.; Jarque-Bou, N.J.; Bayarri-Porcar, V.; Gracia-Ibáñez, V.; Sancho-Bru, J.L.; Vergara, M. Electromyography and kinematics data of the hand in activities of daily living with special interest for ergonomics. Sci. Data 2023, 10, 814. [Google Scholar] [CrossRef] [PubMed]
  29. Casile, A.; Fregna, G.; Boarini, V.; Paoluzzi, C.; Manfredini, F.; Lamberti, N.; Baroni, A.; Straudi, S. Quantitative comparison of hand kinematics measured with a markerless commercial head-mounted display and a marker-based motion capture system in stroke survivors. Sensors 2023, 23, 7906. [Google Scholar] [CrossRef] [PubMed]
  30. Roda-Sales, A.; Sancho-Bru, J.L.; Vergara, M. Hand kinematics requirements in feeding and cooking tasks: Ergonomic implications. Heliyon 2025, 11, e43417. [Google Scholar] [CrossRef]
  31. Della Santina, C.; Bianchi, M.; Averta, G.; Ciotti, S.; Arapi, V.; Fani, S.; Battaglia, E.; Catalano, M.G.; Santello, M.; Bicchi, A. Postural hand synergies during environmental constraint exploitation. Front. Neurorobot. 2017, 11, 41. [Google Scholar] [CrossRef]
  32. Acharya, R.; Challita, E.J.; Ilton, M.; Saad Bhamla, M. The ultrafast snap of a finger is mediated by skin friction. J. R. Soc. Interface 2021, 18, 20210672. [Google Scholar] [CrossRef]
  33. Mender, M.J.; Nason-Tomaszewski, S.R.; Temmar, H.; Costello, J.T.; Wallace, D.M.; Willsey, M.S.; Kumar, N.G.; Kung, T.A.; Patil, P.; Chestek, C.A. The impact of task context on predicting finger movements in a brain-machine interface. eLife 2023, 12, e82598. [Google Scholar] [CrossRef]
  34. Abdlkarim, D.; Di Luca, M.; Aves, P.; Maaroufi, M.; Yeo, S.H.; Miall, R.C.; Holland, P.; Galea, J.M. A methodological framework to assess the accuracy of virtual reality hand-tracking systems: A case study with the Meta Quest 2. Behav. Res. Methods 2024, 56, 1052–1063. [Google Scholar] [CrossRef]
  35. Prahm, C.; Bressler, M.; Gohlke, T.; Hönning, A.; Harhaus-Wähner, L.; Daigeler, A.; Kolbenschlag, J. Immersive virtual reality for functional hand and finger rehabilitation: Results from a randomized controlled trial in 150 patients after traumatic hand injuries. npj Digit. Med. 2025, 8, 792. [Google Scholar] [CrossRef]
  36. Bertolasi, J.; Garcia-Hernandez, N.V.; Memeo, M.; Guarischi, M.; Gori, M. Evaluation of HoloLens 2 for Hand Tracking and Kinematic Features Assessment. Virtual Worlds 2025, 4, 31. [Google Scholar] [CrossRef]
  37. Maggioni, V.; Azevedo-Coste, C.; Durand, S.; Bailly, F. Optimisation and comparison of markerless and marker-based motion capture methods for hand and finger movement analysis. Sensors 2025, 25, 1079. [Google Scholar] [CrossRef]
  38. Kuvijitsuwan, P.; Klaphajone, J.; Singjai, P.; Kumpika, T.; Thawinchai, N.; Angkurawaranon, C.; Aramrat, C.; Utarachon, K. Validity and reliability of a finger training tool for assessing metacarpal phalangeal joint ranges of motion in asymptomatic participants. Sci. Rep. 2024, 14, 20113. [Google Scholar] [CrossRef]
  39. Ellis, B.; Bruton, A. A study to compare the reliability of composite finger flexion with goniometry for measurement of range of motion in the hand. Clin. Rehabil. 2002, 16, 562–570. [Google Scholar] [CrossRef]
Figure 1. Schematic of the proposed workflow. (a) Baseline Finger Module (FM) of the exoskeleton in its original configuration, equipped with a single on-board (permanent) encoder; (b) temporary instrumentation with two additional encoders to measure the finger/phalange motion alongside the FM encoder; (c) identification of polynomial mapping functions relating the FM encoder signal to the phalange kinematics; and (d) update of the Unity-based serious game to enable hand motion estimation using only the permanent on-board encoder. Orange circle: permanent sensor; blue dashed circle: temporary sensor.
Figure 2. Summary of motion-tracking methods, highlighting the main advantages and limitations of visual (left) and non-visual (right) tracking systems.
Figure 3. Rehabilitation exercise setup. The DIEF-HES (bottom right) is connected to the RAS (center), while the serious game interface is displayed on the monitor (top).
Figure 4. Finger Module (FM) of the DIEF-HES. (a) The DIEF-HES worn on the hand, mounted on a glove and coupled to the intermediate phalanx of the fingers via thimbles. (b) Mechanical architecture of the FM with joints and links labeled; the independent generalized coordinate (rear crank rotation) q_18 is highlighted in blue, and the interfaces in contact with the hand are highlighted in orange.
Figure 5. Comparison between the original and the modified FM design. (Left): Original design with a single magnetic encoder embedded in the rear crank. (Right): Redesigned version, including redesigned components (marked with *) and the added new PMM assembly, to incorporate the two new temporary encoders (in orange).
Figure 6. Schematic of the closed-chain coupling between the finger and the exoskeleton mechanism used for MCP/PIP angle estimation. Matching colors indicate elements that are kinematically coupled (i.e., with no relative motion) and therefore share the same rotation. (a) Reference (open-hand) configuration used for offset removal/zeroing; anatomical joints (MCP, PIP, and DIP) and the relevant mechanism joints (e.g., Z and E) are indicated. (b) Example configuration at time t > 0, showing the measured actuation and encoder angles q_18(t), θ_enc(t), and φ_enc(t) (blue), together with the corresponding finger joint angles of interest.
Figure 7. Hand postures and experimental setup used for joint angle measurements. (a) Open-hand configuration and (b) closed-hand configuration adopted during data acquisition. (c) Measurement setup and instrumentation used to record the joint angles q_18, MCP, and PIP. The MCP and thimble sensors are highlighted in light blue to indicate temporary instrumentation, whereas q_18 is highlighted in orange to denote the permanent sensor.
Figure 8. Signal processing pipeline for the encoders’ measurement filtering phase. Two filters are used to obtain the final result: the low-pass Butterworth filter followed by the Savitzky–Golay filter.
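The two-stage smoothing of Figure 8 can be reproduced with SciPy; a sketch assuming a sampling rate fs in Hz, where the filter order, cutoff frequency, and Savitzky–Golay window/order are placeholder choices rather than the paper's tuned values:

```python
import numpy as np
from scipy.signal import butter, filtfilt, savgol_filter

def smooth_encoder(raw, fs, cutoff_hz=5.0, sg_window=21, sg_poly=3):
    """Low-pass Butterworth filtering (zero-phase) followed by Savitzky-Golay smoothing."""
    # 4th-order Butterworth low-pass, cutoff normalized to the Nyquist frequency
    b, a = butter(4, cutoff_hz / (fs / 2.0), btype="low")
    lowpassed = filtfilt(b, a, raw)  # forward-backward filtering avoids phase lag
    # Savitzky-Golay pass preserves the shape of the flexion-extension peaks
    return savgol_filter(lowpassed, window_length=sg_window, polyorder=sg_poly)
```

Zero-phase filtering matters here because any group delay in the encoder signal would shift the identified θ(q_18) mapping.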
Figure 9. (a) The motion capture setup with the used markers; (b,c) show the two open and closed configurations during the MoCap measurement phase.
Figure 10. Mapping between the normalized input q_18,norm (0 = open, 1 = closed) and the MCP and PIP joint angles. Top panels show the experimental samples (grey circles), the IQR (shaded, 25–75%), and the group polynomial fit (thick black line). Bottom panels report the residuals (error, ref − est) versus q_18,norm, with the median (black line) and the IQR (shaded).
Figure 11. MoCap joint angle clouds for MCP (left) and PIP (right) as a function of the normalized input q_18,norm. Grey dots denote individual samples, and the shaded band shows the IQR (25–75%). The solid black line is the median MoCap trajectory, while the dashed black line is the polynomial fit θ(q_18).
Table 1. Polynomial mapping coefficients and performance metrics for MCP and PIP angle reconstruction. Encoder-based accuracy is reported in terms of RMSE and MAE, while MoCap validation is summarized using MAE and IQR coverage.

Function     [a0, a1, a2, a3]         RMSE_ref   MAE_ref   MAE_MoCap   IQR_MoCap
MCP(q_18)    [0, 41.3, 49.7, 50]      8.1°       5.4°      3.3°        60%
PIP(q_18)    [0, 40.4, 112.4, 105.5]  7.8°       5.6°      4.3°        77.5%

Di Natale, A.; Gelli, M.; Liverani, G.; Ridolfi, A.; Allotta, B.; Secciani, N. A Resource-Efficient Method for Real-Time Flexion–Extension Angle Estimation with an Under-Sensorized Finger Exoskeleton. Appl. Sci. 2026, 16, 1575. https://doi.org/10.3390/app16031575

