Article

mmPhysio: Millimetre-Wave Radar for Precise Hop Assessment

by José A. Paredes ¹, Felipe Parralejo ², Teodoro Aguilera ² and Fernando J. Álvarez ²,*

¹ School of Computing, Engineering, and the Built Environment, University of Roehampton, Roehampton Lane, London SW15 5PU, UK
² Sensory Systems Research Group, Universidad de Extremadura, Av. de Elvas, 06006 Badajoz, Spain
* Author to whom correspondence should be addressed.
Sensors 2025, 25(18), 5751; https://doi.org/10.3390/s25185751
Submission received: 23 August 2025 / Revised: 11 September 2025 / Accepted: 12 September 2025 / Published: 15 September 2025
(This article belongs to the Special Issue Radar Remote Sensing and Applications—2nd Edition)

Abstract

Motion tests for physiotherapy purposes are a cornerstone of the rehabilitation process. To date, clinicians have largely measured and tracked movements manually, which is subject to inaccuracies and increases the time spent on these assessments. This paper studies the reliability of a millimetre wave (mmWave) radar to perform motion tracking for accurate hop tests. Once the variables of interest are extracted and the system is set up, the results demonstrate that this system’s accuracy allows its use in clinical environments, facilitating the task of tracking motion and extracting the distance covered by the subject when hopping. The radar outputs are compared against a well-known marker-based optical system, showing high agreement and validating the radar’s effectiveness, with a difference of less than 8 cm in the single hop test, 10 cm in the triple hop test, and 21 cm in the crossover hop test for 75 % of all measurements. Hence, this approach offers a contactless, efficient, and precise alternative for physiotherapy motion assessment.

1. Introduction

The accurate assessment of human movement is an essential task for clinical diagnostics, rehabilitation, and sports science [1]. In recent years, the search for cost-effective, scalable systems capable of tracking human activity with high precision has increased [2].
Advanced motion capture systems—such as OptiTrack [3], Vicon [4], Nokov [5], and Motion Analysis [6]—offer exceptional spatial and temporal resolution through infrared cameras and reflective markers. However, these systems are expensive, require specialised environments, and demand extensive calibration and technical expertise, limiting their accessibility in everyday clinical settings.
In fact, despite the availability of these advanced technologies, physiotherapists and clinicians often rely on manual measurements, such as stopwatch timing, measuring tapes for distances, or basic video recordings to evaluate movement patterns. This introduces subjectivity and variability, especially in dynamic tasks like hopping, which are sensitive to subtle biomechanical changes.
The gap between high-end laboratory systems and practical clinical tools highlights the need for affordable, easy-to-use, and non-intrusive solutions that can deliver reliable data without compromising patient privacy or requiring extensive setup.
In this context, millimetre wave (mmWave) radar emerges as a promising alternative. Operating in the 60–64 GHz or 77–81 GHz frequency range, these radars can capture accurate point cloud data of moving subjects without the need for markers or cameras. Their advantages include (1) accuracy, being able to detect even sub-millimetre movements; (2) privacy-preserving sensing, as they do not capture identifiable visual data; (3) robustness to lighting conditions and some occlusions (low-density materials, such as fabrics, cardboard, etc.) unlike optical systems; (4) low cost; (5) compact size, light weight, and strong resistance to interference; (6) real-time tracking of parameters such as 3D coordinates and velocity. For a complete text on mmWave radar fundamentals and features, the reader is referred to [7].
Following these advantages, this paper introduces mmPhysio, a novel system leveraging mmWave radar for precise hop assessment—a task commonly used in physiotherapy to evaluate lower limb function. Our work specifically targets the quantitative assessment of dynamic, high-impact movements such as hopping, which present unique challenges in terms of accuracy and temporal resolution. We compare the radar-derived metrics with those obtained from a marker-based camera system, demonstrating that mmPhysio achieves comparable accuracy while offering significant benefits in terms of cost, ease of use, and privacy. This study aims to validate mmWave radar for detailed hop analysis in a clinical context, providing a practical and objective alternative to both manual and high-end laboratory assessments. Our findings suggest that mmPhysio can bridge the gap between advanced motion capture and manual assessment, offering a new pathway towards accessible and objective physiotherapy diagnostics.

2. Related Work

2.1. Physical Activity Monitoring

The use of sensors for limb movement monitoring in the rehabilitation process of different types of injuries is a crucial technological tool for clinicians and scientists when making decisions regarding the degree of patients’ recovery. In the literature, a great variety of technologies can be found that allow the estimation of biomechanical parameters by means of specific physical tests. The study carried out in this work focuses on those systems that perform tests in motion by means of the hop test [8].
An evaluation of the technologies employed for this purpose reveals a substantial number of studies using inertial sensors, such as gyroscopes and accelerometers. One of these works is carried out in [9], which describes the development of a portable system composed of three inertial measurement unit (IMU) devices for the three-dimensional angular measurement of joints during horizontal jumping tests. The angular measurements are validated against a camera-based system, and their applicability in clinical research settings is evaluated. The estimated angles have a root mean square (RMS) error of less than 2.3° for both joints, and correlation coefficients greater than 0.92 when compared to the camera-based system. Another clear example of this kind of system is [10], in which gyroscopes, accelerometers, and magnetometers are attached to the limbs of a group of participants to assess patterns of imbalance and fatigue when performing an up–down hop test. The results indicate that these patterns can be reliably monitored, thus offering increased measurement capacity and auditability in comparison to conventional systems. This is of particular significance in the case of older adults, for whom such capabilities are particularly important.
Some authors implement systems that use various sensors to perform the eight hop test. The test involves performing a circuit in the shape of an eight while jumping on one leg. An example is the work presented in [11], where a group of people of different ages and genders perform it. The study aims to identify gender and specific pathologies in the participants’ lower limbs by evaluating the acquired data with neural networks and machine learning. The results show a 60 % success rate in identifying gender and a 50 % success rate in detecting pathologies using the k-Nearest Neighbour method, improving to almost 100 % when machine learning techniques are employed. A similar example is the work carried out in [12], in which the IMU’s outputs and machine learning techniques are combined to measure the vertical reaction force when participants land after a jump. The mean squared error obtained in the estimates ranges from 4.6 % to 10.2 %.
Also, it is worth mentioning the work presented in [13], in which the strengthening of the Achilles tendon is studied using the eight hop test with reflective markers, a system of 12 motion-capture cameras, and a strength platform on a group of 10 healthy people, with satisfactory results. However, other authors use inertial sensors and hopping tests to monitor knee injuries. As an example, the work presented in [14] employs two IMUs per leg to compare flying and landing times and distances achieved in a triple single-leg hop test with those obtained using a camera system. The tests involve both healthy participants and patients evaluated with the Knee injury and Osteoarthritis Outcome Score. The results show differences in jump times of less than 6 ms and distance estimation errors of less than 0.8 % compared to those measured with the camera system. In addition, in [15], a stopwatch, an accelerometer, and an isokinetic dynamometer are used to study the recovery of anterior cruciate ligament injuries in a group of 50 participants.
A study is conducted in [16] with 30 athletes suffering from anterior cruciate ligament injury using the commercial Loadsol® 3.6.0 system [17]. Here, the reaction force during the landing phase of the jump is evaluated by performing tests at different data acquisition frequencies (from 100 to 1920 Hz). This method has been proven to be reliable for evaluating the kinetics of the jump and monitoring the recovery of the athletes. For a more detailed study, an extensive review of this kind of system can be found in [18].
To conclude, some works rely on affordable 3D cameras to provide accurate motion tracking. In this field, there are systems such as Humantrak, developed in [19], where a 3D infrared camera (Azure Kinect or Orbbec Femto Bolt) is used to track the movement of people, offering a sampling rate of 30 FPS in a minimum space of 4 × 2 m². The system accuracy is compared against a Vicon system composed of 12 cameras and markers on a set of 18 people (5 women and 13 men) in various tests: treadmill walking, treadmill running, drop vertical jumps, single-leg countermovement jump, bilateral countermovement jump, and single-leg squat. The results of these tests show a high degree of similarity between the two tracking systems, with intraclass correlation coefficients ranging from 0.80 to 0.93 for the measurement of lower limb angles. There are also other systems with similar characteristics such as the one developed in [20], where a Kinect v2 infrared camera is used to capture the angles of pelvis, hip, and knee joints during gait. Similarly, its performance is compared with a Vicon infrared camera system with markers, with good agreement between the results provided by the two systems.

2.2. Physical Activity Monitoring Through mmWave Radar

Although the above-mentioned works have proven to be effective in monitoring patient recovery, their use becomes problematic when facing different scenarios. For example, the use of inertial sensors can introduce drift-related errors when estimating the distance covered by users, as these sensors are not capable of directly measuring distance and need to integrate the acceleration [21]. In addition, the use of cameras carries privacy issues and also limits the accuracy of the system to the maximum frame rate of the device, so fast movements can be undersampled, missing important information in-between frames. This is especially important when the user is performing a jump or a fast horizontal movement, such as those needed for the application presented in this work. A reliable alternative to these devices might be a mmWave radar, because of its high frame rate and distance measuring accuracy, and its lack of privacy issues. It can also operate under challenging environmental conditions such as in poor lighting and through fog and smoke [22].
MmWave radars are capable of performing distance measurements up to sub-millimetre precision. For instance, the authors of [23] designed the mmWrite system to perform passive handwriting tracking of characters as small as 1 × 1 cm². This system was developed using a 60 GHz device and achieved a median tracking error of 2.8 mm. In addition to this, higher precisions were demonstrated using a wideband frequency modulated continuous wave (FMCW) radar to measure distances with up to ±4.5 µm of precision over a 5.2 m range using phase evaluation techniques [24].
Leveraging their accuracy, mmWave radars can be used to obtain a point cloud of moving objects in their field of view. Thanks to the use of electromagnetic waves, this point cloud can be updated so fast that it can be used for people tracking systems such as the one developed in [25]. Its authors implemented a sensor fusion approach combining multiple radars to improve tracking accuracy and handle occlusions. Their system achieved up to 99 % accuracy for single-person scenarios and 84 % accuracy with multiple subjects. Another example is found in work [26], where a real-time people tracking system was developed using sparse point-cloud data, achieving 91.62 % accuracy in identifying multiple subjects while operating at 15 frames per second on edge computing devices. Other approaches have focused on drone–human interaction, such as the work in [27], which addressed this challenge by developing a radar-based collision prevention system. It achieved over 99 % accuracy in distinguishing between people and drones, enabling safe autonomous drone operation in human-populated indoor environments.
The radar point cloud has also been used to develop more complex systems such as mm-Pose, a real-time human skeletal posture estimation using convolutional neural networks introduced in [28]. This system is capable of detecting more than 15 distinct skeletal joints with average localisation errors of 3.2 cm in depth and 2.7 cm in elevation, outperforming previous radio frequency-based approaches by approximately 24 % and 32 % respectively. Based on the frame rate capabilities of mmWave radar point clouds, the authors in [29] extended their application to traffic monitoring. They developed a segmentation technique for multimodal traffic analysis using single-frame data, making it especially effective for scenarios involving high-speed vehicles. This method successfully distinguished between pedestrians and vehicles, achieving intersection-over-union metrics above 50 % for both pedestrian and car detection.
In terms of healthcare capabilities, recent studies have made use of mmWave radar technology to perform gait and behaviour analysis. The work presented in [30] showed an unobtrusive gait analysis system leveraging radar micro-Doppler signatures. This system can distinguish between normal, pathological and assisted walks with a 93 % accuracy. Moreover, it proved particularly effective at detecting gait asymmetries and abnormalities in patients with diagnosed disorders. In addition to this, a multiple patient behaviour detection system was proposed in [31], employing deep convolutional neural networks. This system is capable of detecting behaviours such as falling, seizures, and distress signals of multiple patients simultaneously with accuracies ranging from 66 % to 88 % in two-patient scenarios. Both approaches leverage the advantages of radar sensing like privacy preservation, contact-free operation, and the ability to measure velocity and micro-movements directly. Another example can be found in [32], where the timed up and go test (used to measure a person’s dynamic balance) was analysed in a mmWave radar system. Here, the results showed the radar’s superior accuracy in distance-based measurements, with a 3.48 % error rate and a correlation of 0.9996 . This surpasses manual timing, which has a 4.26 % error rate and a correlation of 0.9960 .
Finally, it is worth mentioning the work presented in [33]. Here, the authors developed a system to extract gait parameters using a mmWave radar, employing a motion capture system as the reference device. The results showed that the radar-based system could estimate step time with an error of 4 ms and step length with an error of 2 mm in the radial dimension, demonstrating its potential for accurate gait analysis in clinical settings. A similar related work can be found in [34], where a study about the feasibility of using mmWave radar-based deep learning models to estimate gait smoothness was presented. The results indicated that these systems can robustly estimate walking speeds and directions. Both works highlight the potential of mmWave radar technology in clinical gait analysis applications. Our work differentiates itself by focusing on the specific application of hop tests, and it can be extended to general high-impact movements, an area not extensively covered in existing literature.

3. Methodology

This section outlines the methodology employed in this study to achieve the research objectives. The approach is structured into several key phases, each designed to systematically address the research questions. To achieve this, we first define the experimental tests based on physiotherapy and rehabilitation tasks, which are then used to evaluate the performance of the proposed methods. The tests are designed to be representative of real-world scenarios in those tasks, ensuring that the results are applicable and relevant. In this case, a series of hop tests are carried out, where the subjects are asked to perform a series of hops on one leg, with the goal of assessing the horizontal distance covered during the hops. The tests are designed to be simple yet effective in evaluating the subjects’ performance in a controlled environment:
  • Single Hop Test: The objective is to determine the maximum distance that can be covered with a single leg jump while maintaining balance and ensuring a firm landing. The distance is measured from the starting point to the heel of the landing leg.
  • Triple Hop Test: The objective is to execute three consecutive hops on a single leg while maintaining balance and ensuring a firm landing. The distance travelled is measured from the starting point to the great toe of the landing leg.
  • Crossover Hop Test: The objective is to execute three consecutive hops on a single leg, each with a maximum distance covered, while maintaining balance and ensuring a firm landing. Each hop involves a lateral movement across a midline, thereby incorporating side-to-side movement. The distance measured is from the starting line to the heel of the landing leg.
Figure 1 illustrates the three tests. The primary objective for all of them is to achieve a difference of less than 10% in hop distance between the injured limb and the healthy limb. These situations are common in rehabilitation scenarios, where patients often have to perform similar tasks to regain strength and balance after an injury.

3.1. Experimental Setup

Based on the tests defined in the previous section, we designed an experimental setup to collect data from subjects performing the hop tests. As shown in Figure 2, the setup includes the following components.
  • An RGB camera is used to capture the positions of visual markers placed on the subjects’ legs. The camera is positioned at a height of 1.5 m and a distance of 2 m from the subjects, ensuring a clear view of the markers during the tests.
  • A mmWave radar is used to capture the motion of the subjects during the hop tests. The radar is positioned at a height of 0.5 m and a distance of 1.5 m from the subjects, similar to the camera setup.
A lateral view is chosen for both the camera and radar to ensure that the horizontal distance covered during the hops can be accurately measured. This perspective allows for a clear view of the subjects’ movements, minimizing occlusions and ensuring that both systems can effectively capture the necessary data. The lateral positioning is particularly important for hop tests, as it provides an unobstructed view of the take-off and landing phases, which are critical for accurate distance measurement. Additionally, this arrangement keeps the visual markers detectable in the images, avoiding long camera-to-marker distances that could lead to loss of tracking and degraded accuracy.

3.1.1. Visual Marker System

For a reliable and accurate ground truth, we use a well-known visual marker system based on Aruco markers [35], which are widely used in motion capture applications.
As observed in Figure 2, a visual marker is placed on the radar, which provides the radar pose (rotation and translation [ R | t ] ) in the camera frame:
$[R|t] = \begin{bmatrix} r_{11} & r_{12} & r_{13} & t_x \\ r_{21} & r_{22} & r_{23} & t_y \\ r_{31} & r_{32} & r_{33} & t_z \\ 0 & 0 & 0 & 1 \end{bmatrix}$
The camera is calibrated using a chequerboard pattern [36], allowing for accurate pose estimation of the markers in the camera frame.
Additionally, each subject is also equipped with another visual marker to track their 3D position $\mathbf{x} = [x_1, x_2, x_3, 1]^\top$ in the camera frame. Now, to align the subject’s position with the radar, we need to compute the transformation from the camera frame to the radar frame. This is achieved by applying the following transformation:
$\mathbf{x}' = [R|t]^{-1}\,\mathbf{x}.$
Now, all the subject’s positions are expressed in the radar frame, allowing us to compare the positions captured by the radar and the camera.
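As an illustration, this camera-to-radar change of frame can be sketched in a few lines of NumPy. The rotation, translation, and marker position below are placeholder values for illustration only, not calibration results from this study:

```python
import numpy as np

# Hypothetical radar pose [R|t] in the camera frame; identity rotation and a
# made-up translation are used purely for illustration.
R = np.eye(3)
t = np.array([0.2, -0.5, 1.5])          # metres (assumed)

T = np.eye(4)                           # 4x4 homogeneous transform [R|t]
T[:3, :3] = R
T[:3, 3] = t

# Subject marker position in the camera frame, homogeneous coordinates.
x_cam = np.array([0.8, 1.1, 3.0, 1.0])

# Express the position in the radar frame: x' = [R|t]^(-1) x
x_radar = np.linalg.inv(T) @ x_cam
```

Since the rotation block is orthonormal, the inverse can also be formed in closed form as $[R^\top \mid -R^\top t]$, avoiding a numerical matrix inversion.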

3.1.2. Radar System

A mmWave radar operates by emitting high-frequency chirp signals in the GHz range, specifically designed to detect and track moving objects. Modern mmWave radars employ multiple-input multiple-output (MIMO) configurations, using several transmit (TX) and receive (RX) antennas to capture information from target objects, including their range, velocity, azimuth, and elevation angles.
The fundamental principle behind range detection relies on the intermediate frequency $f_{IF}$ between the transmitted and received signals. The range $r$ to a detected object can be determined through
$r = \frac{c\,T}{2B}\,f_{IF}$
where $c$ is the speed of light, $T$ represents the chirp duration, and $B$ denotes the frequency bandwidth. This relationship follows from the linear frequency modulation characteristics of FMCW radar systems [7].
For velocity measurements, the radar captures the Doppler shift generated by consecutive transmitted chirps with a fixed time interval T c . The phase difference Δ φ between consecutive receptions from a moving target enables velocity calculation:
$v = \frac{\lambda\,\Delta\varphi}{4\pi T_c}$
where λ is the carrier wavelength.
Finally, angular information (azimuth and elevation) is obtained through spatial processing of the signals received across the MIMO array. With RX antennas separated by a distance $L$, the angle of arrival (AoA) $\theta$ can be computed from the phase difference $\Delta\hat{\varphi}$ between adjacent receivers:
$\theta = \sin^{-1}\!\left(\frac{\lambda\,\Delta\hat{\varphi}}{2\pi L}\right)$
This equation applies to both azimuth and elevation measurements, depending on the antenna array configuration.
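To make these three relations concrete, the following sketch evaluates them numerically. All parameter values (bandwidth, chirp timing, carrier frequency, phase differences) are illustrative assumptions typical of a 60 GHz FMCW device, not the configuration used in this work:

```python
import numpy as np

c = 3e8                  # speed of light (m/s)
B = 4e9                  # chirp bandwidth (Hz), illustrative
T = 50e-6                # chirp duration (s), illustrative
Tc = 60e-6               # chirp repetition interval (s), illustrative
lam = c / 60e9           # carrier wavelength at an assumed 60 GHz
L = lam / 2              # RX antenna spacing (half wavelength)

# Range from the intermediate frequency: r = c*T*f_IF / (2*B)
f_if = 1e6               # beat frequency (Hz), assumed
r = c * T * f_if / (2 * B)

# Radial velocity from the chirp-to-chirp phase difference
dphi = np.pi / 4
v = lam * dphi / (4 * np.pi * Tc)

# Angle of arrival from the phase difference between adjacent receivers
dphi_hat = np.pi / 6
theta = np.arcsin(lam * dphi_hat / (2 * np.pi * L))
```

With half-wavelength spacing the AoA expression reduces to $\sin^{-1}(\Delta\hat{\varphi}/\pi)$, which is why that spacing gives an unambiguous field of view of ±90°.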
The signal processing chain to compute all these parameters takes place on the radar chip:
  • Data Cube Formation: The radar captures raw ADC samples for each chirp and each TX-RX pair, forming a 3D data cube with dimensions corresponding to fast time (ADC samples per chirp), slow time (chirps per frame), and the spatial dimension (number of virtual antennas, $RX \times TX$):
    $\mathbf{S} = s(\text{fast time}, \text{slow time}, \text{antennas})$
  • Range calculation: For each TX-RX pair, windowing and a fast Fourier transform (FFT) are applied over the ADC samples (fast time) to create the range spectrum of the scene:
    $\mathbf{H}_1 = \mathrm{FFT}\{\mathbf{S}\}$
  • Static Clutter Removal: Next, an FFT is applied over the slow-time dimension (chirps) to extract the velocity information. Then, after averaging across chirps, a static clutter removal step is applied per range bin and antenna in order to eliminate static objects, removing points with velocity $v \approx 0$ from the point cloud and further processing:
    $\mathbf{V} = \mathrm{FFT}\{\mathbf{H}_1\}$
  • Range-Azimuth Estimation: The range-azimuth heatmap is generated by applying the Capon beamformer method [37] based on the steering vectors generated using azimuth-only transceiver pairs:
    $\mathbf{H}_2 = \mathcal{C}\{\mathbf{H}_1\}$
  • Object detection: Performed through a two-pass constant false alarm rate (CFAR) algorithm applied to the created range-azimuth heatmap $\mathbf{H}_2$. The first-pass CFAR returns a series of detections in the range domain per angle bin, which are confirmed by a second-pass CFAR-CASO (cell-averaging smallest-of) or a local peak search in the angle domain.
  • Elevation Estimation: For each detected point, a new Capon beamforming is applied. The strongest peak in the elevation spectrum is selected as the elevation angle of the detected point.
  • Doppler Estimation: The velocity of each detected point is extracted using Capon beamforming once again, this time over consecutive chirps in the radar cube $\mathbf{S}$.
For more details on the signal processing chain, please refer to [38].
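A minimal NumPy sketch of the first stages of such a chain (range FFT, Doppler FFT, and zero-Doppler clutter removal, following the standard FMCW convention of a range FFT over the within-chirp samples and a Doppler FFT across chirps) is shown below on a synthetic data cube. The cube dimensions, window choice, and target parameters are assumptions for illustration only:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative cube: 256 fast-time samples, 64 chirps, 12 virtual antennas.
n_samples, n_chirps, n_ant = 256, 64, 12

# Synthetic moving target: normalised range frequency 0.1, Doppler 0.05.
n = np.arange(n_samples)
k = np.arange(n_chirps)
target = np.exp(2j * np.pi * (0.1 * n[:, None] + 0.05 * k[None, :]))
cube = target[:, :, None] + 0.1 * rng.standard_normal((n_samples, n_chirps, n_ant))

# Range FFT over fast time, with a Hann window to control leakage.
win = np.hanning(n_samples)[:, None, None]
H1 = np.fft.fft(cube * win, axis=0)

# Doppler FFT over slow time (chirps).
V = np.fft.fft(H1, axis=1)

# Static clutter removal: zero out the zero-Doppler bin (v ≈ 0).
V[:, 0, :] = 0
```

The subsequent steps (Capon beamforming, two-pass CFAR, elevation and Doppler refinement) are omitted here, as they depend on the antenna geometry of the specific device.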
Hence, for each subject, the radar provides a cluster of points:
$\mathbf{y}_i = [x_1, x_2, x_3],$
whose coordinates $x_1, x_2, x_3$ are expressed in the radar frame. The cluster is then processed to extract the centroid of the points, which represents the position of the subject in the radar frame:
$\bar{\mathbf{y}} = \frac{1}{N}\sum_{i=1}^{N}\mathbf{y}_i,$
where $N$ is the number of points in the cluster and $\mathbf{y}_i$ are the coordinates of each point in the cluster.
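The centroid extraction is a simple per-axis mean over the cluster. A minimal sketch with a made-up three-point cluster:

```python
import numpy as np

# Hypothetical radar cluster: N points (x1, x2, x3) in metres, made up
# for illustration.
cluster = np.array([
    [1.95, 0.98, 3.02],
    [2.05, 1.02, 2.98],
    [2.00, 1.00, 3.00],
])

# Centroid = per-axis mean: the subject's position estimate in the radar frame.
centroid = cluster.mean(axis=0)       # → array([2., 1., 3.])
```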
As depicted in Figure 2, the detected point lies around 1 m above where the visual markers are placed. Nonetheless, this does not affect the results, as the important aspect is the relative distance between the subjects’ positions, which is maintained across the two systems, as we see in the results section (Section 4).
Regarding the experimental configuration, the radar is set up to operate with the parameters shown in Table 1, which are optimised for people tracking applications.

4. Results, Analysis, and Discussion

This section presents the results obtained from the experimental setup described in the previous Section 3. They are analysed to determine the effectiveness of the mmWave radar in capturing the motion of subjects during the hop tests and to assess the accuracy of the distance measurements compared to the ground truth obtained from the visual marker system.
A representation of the whole experiment is shown in Figure 3. Here, the radar point cloud is represented as a collection of small yellow dots, which are reduced to their centroid (green dot). The red dot represents the position of the marker captured by the camera. The proximity of the green dot to the red dot indicates that the radar is able to estimate the position of the subject with reasonable accuracy. Additionally, the radar point cloud provides a more comprehensive representation of the subject’s position, capturing multiple points that contribute to the overall centroid calculation. This is particularly useful in dynamic scenarios where the subject may be moving rapidly, as it allows for a more robust estimation compared to relying on a single point measurement, as demonstrated in Figure 3c, where the visual marker cannot be detected due to motion blur but the radar still provides a reliable position estimate.
As mentioned in Section 3, although these tests take different measuring points, the important aspect for a subject under recovery lies in the difference between the injured and healthy legs. Therefore, the results are presented as the difference in distance covered by the subject’s legs during the hop tests compared to a reference point. In this respect, the results are presented in Figure 4. These graphs illustrate the distance measured separately in each dimension ($x_1$: width, $x_2$: height, and $x_3$: depth) for each of the three hop tests. These curves represent the distance covered from a reference point $x_{i,0}$ to the landing point, following Equations (2) and (11):
$\Delta x_i = x_i - x_{i,0} \quad \text{for } i = 1, 2, 3;$
where the different $x_i$ correspond to the three dimensions in the radar frame: width, height, and depth, respectively. The reference point can be arbitrarily chosen, but in this case, it consists of the average value over the first few frames, both for the camera and the radar:
$x_{i,0} = \frac{1}{N}\sum_{k=1}^{N} x_i[k]$
where $N$ is the number of frames considered for the average ($N = 10$ in this case) and $x_i[k]$ denotes the value of $x_i$ in frame $k$.
This strategy allows the analysis to focus on the comparison between both technologies, rather than on the absolute values. As can be observed, the mmWave radar is able to track the subject, measuring similar distances to the camera system and remaining consistent across the three dimensions. Figure 4a–c show the results for the single, triple, and crossover hop tests, respectively. It is worth noting that there is a gap in the camera measurements. This is due to the fact that the image becomes too blurry for the marker to be detected. As the radar was approximately orthogonally aligned with the measuring line, for all tests, there is no distance covered in height and depth, except for the crossover test, which involves lateral movement by definition, as explained in the previous section. In addition, the measured horizontal distances clearly describe the tests’ trajectories. The curves for depth ($\Delta x_3$) in the crossover test show somewhat more misalignment, which is expected as the crossover hop involves lateral movement across the midline, making it more challenging for the radar to maintain a consistent trajectory. However, the initial and final positions are still well captured, indicating that the radar is able to track the subject’s motion effectively.
To continue with the analysis, let us now consider the total distance covered in the three dimensions:
$\Delta r = \sqrt{\sum_{i=1}^{3} \Delta x_i^2}.$
This calculation absorbs any misalignment between the devices and the scene, avoiding the need for complicated alignment methods, which are always subject to extra sources of error. Figure 5 shows the different curves based on Equation (14). As can be seen, the camera and the mmWave radar outputs once again show a high level of agreement.
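The per-axis displacements and the total distance can be computed together in a few lines. The trajectory below is synthetic (a 1.5 m hop along the width axis after a short stationary period), not data from the study:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic per-frame positions (frames x 3 axes), in metres: the subject
# stands still for the first 10 frames, then covers 1.5 m along x1.
frames = np.zeros((100, 3))
frames[10:, 0] = np.linspace(0.0, 1.5, 90)
frames += 0.002 * rng.standard_normal(frames.shape)   # measurement noise

# Reference point: the average of the first N frames.
N = 10
x0 = frames[:N].mean(axis=0)

# Per-axis displacement and total distance covered.
dx = frames - x0                      # Δx_i = x_i − x_{i,0}
dr = np.sqrt((dx ** 2).sum(axis=1))   # Δr = sqrt(Σ Δx_i²)
```

Referencing every axis to its own baseline, and then combining the axes into a single Euclidean distance, is what makes the comparison insensitive to residual misalignment between the camera and radar frames.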
Finally, these results can be quantified. First, Table 2 shows a summary for the total distance covered in each test, calculated as Δ r f Δ r 0 , where f and 0 indicate final and initial positions, respectively.
Furthermore, although the camera and radar measurements were not synchronised, all the frames were acquired in parallel on the same computer system, saving timestamps for each of them. Therefore, it is possible to calculate the difference between the two devices by interpolating each curve in Figure 5 so that both have the same number of samples over a given time period. This allows a histogram and a cumulative distribution function (CDF) to be represented in Figure 6, where we can observe the difference between the positions extracted from the camera and the radar:
$\Delta d = \Delta r_c - \Delta r_r,$
for each of the frames.
Analysing those figures, we can observe that 75 % of all measurements show a difference of less than 8 cm for the single hop test, 10 cm for the triple hop test, and 21 cm for the crossover hop test, between the camera and radar estimates, highlighting the strong agreement and reliability of the radar-based approach for hop-test assessment. The larger discrepancies are primarily observed during periods when the subject is stabilising after landing, as the radar is highly sensitive to even small limb movements, which can temporarily affect the position estimate. Once the subject reaches a stable position, the difference between the two systems remains consistently low, further supporting the accuracy of the radar measurements throughout the tests.
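This comparison step (resampling both unsynchronised curves onto a common time base and reading off the 75th percentile of the absolute differences) can be sketched as follows. The timestamps, curves, and noise levels below are synthetic stand-ins, not the study's measurements:

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic, unsynchronised timestamps (s) and total-distance curves (m).
t_cam = np.sort(rng.uniform(0, 5, 150))
t_rad = np.sort(rng.uniform(0, 5, 400))
dr_cam = 0.4 * t_cam + 0.02 * rng.standard_normal(t_cam.size)
dr_rad = 0.4 * t_rad + 0.02 * rng.standard_normal(t_rad.size)

# Resample both curves onto a common time base inside the overlap.
t = np.linspace(0.1, 4.9, 200)
cam = np.interp(t, t_cam, dr_cam)
rad = np.interp(t, t_rad, dr_rad)

# Per-sample difference Δd = Δr_c − Δr_r and the 75th-percentile figure.
dd = cam - rad
p75 = np.percentile(np.abs(dd), 75)
```

Linear interpolation onto a shared time base avoids any hardware synchronisation requirement, at the cost of slight smoothing of the faster-sampled signal.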

5. Conclusions

This paper presented mmPhysio, a mmWave radar-based system for precise hop assessment focused on physiotherapy. The laboratory was set up as a clinic-like environment, equipped with a camera for marker-based optical tracking and a radar for markerless detection. The experimental results demonstrated that the proposed system achieves high accuracy in capturing hop dynamics, offering the required precision for these tests.
In the figures for the three hop tests analysed, 75% of all measurements showed a difference of less than 8 cm for the single hop test, 10 cm for the triple hop test, and 21 cm for the crossover hop test. Most importantly, the largest discrepancies arose during the short stabilisation periods that follow each landing, while the radar output was still settling.
Hence, this work demonstrated the reliability of a mmWave radar system for hop test assessment: it can compare the performance of a subject's two legs and determine whether the distances covered differ by more than 10%, the threshold used to judge whether the injured leg has recovered.
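The 10% criterion mentioned above is commonly expressed as a limb symmetry index (LSI), the ratio of injured-leg to healthy-leg hop distance. A minimal sketch of that decision rule (our own helper functions, not part of mmPhysio):

```python
def limb_symmetry_index(d_injured, d_healthy):
    """Limb symmetry index: injured-leg hop distance as a fraction
    of the healthy-leg distance (1.0 means perfect symmetry)."""
    return d_injured / d_healthy

def passes_10pct_criterion(d_injured, d_healthy):
    """True when the two hop distances differ by less than 10%,
    i.e. the LSI is above 0.90."""
    return limb_symmetry_index(d_injured, d_healthy) > 0.90
```

For example, an injured-leg hop of 1.80 m against a healthy-leg hop of 1.87 m gives an LSI of about 0.96 and would pass the criterion.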
Future work will focus on expanding the system’s capabilities to other rehabilitation exercises and integrating advanced machine learning techniques for automated movement analysis and anomaly detection. In addition, to enhance the system’s robustness, we plan to conduct extensive testing in diverse clinical environments and with a broader demographic of participants. This will help to validate the system’s effectiveness across different settings and populations, ensuring its applicability in real-world physiotherapy scenarios.

Author Contributions

Conceptualisation, J.A.P.; methodology, J.A.P.; validation, J.A.P., F.P., T.A. and F.J.Á.; resources, J.A.P. and F.J.Á.; writing—original draft preparation, J.A.P., F.P. and T.A.; writing—review and editing, J.A.P., F.P., T.A. and F.J.Á.; funding acquisition, J.A.P. and F.J.Á. All authors have read and agreed to the published version of the manuscript.

Funding

This work has been partially supported by MCIN/AEI/10.13039/501100011033 and FEDER—A way to make Europe under Project PID2021-122642OB-C42, and by the regional Government of Extremadura and FEDER under Project GR24102.

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study.

Data Availability Statement

The data presented in this study are available on request from the corresponding author. The data are not publicly available due to policy reasons.

Conflicts of Interest

The authors declare no conflicts of interest.

Figure 1. Illustration of the hop tests analysed in the study: (a) single leg, (b) triple, and (c) crossover. The key aspect from a physiotherapy point of view is that the difference in hop distance between the healthy limb and the injured limb should be less than 10%.
Figure 2. Experimental setup for the hop tests. The setup includes a camera for capturing visual markers’ positions and a mmWave radar. The camera is positioned at a height of 1.5 m and a distance of 2 m from the subjects, while the radar is positioned at a height of 0.5 m and a distance of 1.5 m from the subjects. The visual markers are used to track the subjects’ positions, ensuring accurate ground truth data for comparison.
Figure 3. Example of radar point clouds (top two rows) and marker coordinates extracted from the scenes depicted in the bottom row: (a) subject at the starting position, (b) subject initiating the hop, (c) subject in mid-air, and (d) subject just after landing. The radar point cloud (small yellow dots) is reduced to its centroid (green dot). The position estimate is close to the marker (red dot) captured by the camera in all three dimensions. As can be observed in (d), the visual marker cannot be tracked in the air due to motion blur.
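The point-cloud-to-centroid reduction described in the Figure 3 caption can be sketched as follows. This is a minimal illustration under our own assumptions (the cloud is an (N, 3) array of x, y, z detections per frame), not the authors' exact pipeline:

```python
import numpy as np

def cloud_centroid(points):
    """Reduce a radar point cloud to a single position estimate.

    points: (N, 3) array of (x, y, z) detections for one frame.
    Returns the mean of the detections, or None for an empty frame
    (e.g. when the subject is momentarily outside the field of view).
    """
    points = np.asarray(points, dtype=float)
    if points.size == 0:
        return None
    return points.mean(axis=0)
```

The centroid is robust to the frame-to-frame scatter of individual detections, which is why it serves as the green position estimate in Figure 3.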
Figure 4. Comparison between the locations extracted from the camera (dark dots) and the mmWave radar (light crosses), for (a) single hop, (b) triple hop, and (c) crossover hop tests. Although the radar was not specifically aligned with the scene, most of the motion takes place in the width dimension, except for the crossover test, which involves lateral movement. The gaps in the camera curves are due to blurry images that do not allow marker detection.
Figure 5. Curves representing the total distance covered Δr for each test: (a) single, (b) triple, and (c) crossover hops. This variable removes the need for complex scene alignment. Again, both devices' outputs remain close.
Figure 6. (a) Histogram of the difference between the estimates from the camera and the radar—Equation (15). (b) CDF for the same results. A total of 75 % of all measurements have a difference of less than 21 cm (dashed red line) in the worst-case scenario. The higher differences come from stabilisation periods, where the radar measurement is reaching its final value.
Table 1. Radar parameters used in the study. These parameters are optimised for people tracking applications, ensuring accurate detection and tracking of subjects during the hop tests.
Parameter            Value
Start Frequency      60.75 GHz
Bandwidth            3.23 GHz
Frame Period         55 ms
Range Resolution     8.43 cm
Max Range            8 m
Velocity Resolution  0.1 m/s
Max Velocity         4.62 m/s
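As a sanity check on Table 1 (our own calculation, not from the paper): FMCW range resolution follows Δr = c / (2·B_eff), where B_eff is the portion of the swept bandwidth that is actually sampled. The tabulated 8.43 cm therefore corresponds to an effective bandwidth of roughly 1.78 GHz, i.e. only part of the 3.23 GHz sweep contributes to the resolution:

```python
C = 299_792_458.0  # speed of light, m/s

def range_resolution(b_eff_hz):
    """FMCW range resolution: delta_r = c / (2 * B_eff)."""
    return C / (2.0 * b_eff_hz)

def effective_bandwidth(delta_r_m):
    """Invert the relation to recover the sampled bandwidth
    implied by a given range resolution."""
    return C / (2.0 * delta_r_m)
```

For instance, effective_bandwidth(0.0843) evaluates to about 1.78 GHz, consistent with only a fraction of each chirp being sampled by the ADC.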
Table 2. Total distance covered in each test, taken as Δr_f − Δr_0 from Figure 5.
Hop        Camera ¹   mmWave Radar ¹
Single     0.738      0.704
Triple     1.870      1.743
Crossover  1.865      1.828
¹ All measurements are expressed in metres.
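Using the Table 2 values, the camera-radar discrepancy in total covered distance can also be expressed as a relative error (a post-hoc check of our own, taking the camera as ground truth):

```python
# Total distances from Table 2, in metres: (camera, radar).
TOTALS = {
    "single":    (0.738, 0.704),
    "triple":    (1.870, 1.743),
    "crossover": (1.865, 1.828),
}

def relative_error(camera_m, radar_m):
    """Absolute camera-radar discrepancy as a fraction of the
    camera (ground-truth) total distance."""
    return abs(camera_m - radar_m) / camera_m

errors = {hop: relative_error(c, r) for hop, (c, r) in TOTALS.items()}
```

This gives roughly 4.6% (single), 6.8% (triple), and 2.0% (crossover), all comfortably below the 10% clinical threshold discussed in the conclusions.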

