3.2. Comparison among the Sensors
After the first evaluation of the sensors and the characterization of the POF curvature sensor, the sensors used in this work were compared using the experimental protocol described in Section 2.5. In this case, the camera-based system was compared with both the IMU and the POF curvature sensor. The comparison was made with respect to the correlation coefficient and RMSE (as also performed in the previous sections).
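As a rough illustration of how these two metrics compare a pair of synchronized angle signals, the following sketch computes the Pearson correlation coefficient and RMSE with NumPy. The synthetic flexion/extension cycle and the offset/noise model are invented for demonstration only, not taken from the experimental data:

```python
import numpy as np

def compare_signals(angles_a, angles_b):
    """Pearson correlation coefficient and RMSE between two
    synchronized angle time series (same length, in degrees)."""
    a = np.asarray(angles_a, dtype=float)
    b = np.asarray(angles_b, dtype=float)
    r = np.corrcoef(a, b)[0, 1]            # Pearson correlation
    rmse = np.sqrt(np.mean((a - b) ** 2))  # root-mean-square error
    return r, rmse

# Synthetic example: a noisy, slightly offset copy of a 0-145 degree cycle
t = np.linspace(0, 2 * np.pi, 500)
ref = 72.5 * (1 - np.cos(t))
meas = ref + 5.0 + np.random.default_rng(0).normal(0.0, 1.0, t.size)
r, rmse = compare_signals(ref, meas)
print(f"r = {r:.3f}, RMSE = {rmse:.2f} deg")
```

Note that a signal can track the reference almost perfectly in shape (r close to 1) while still carrying a systematic offset that dominates the RMSE, which is why both metrics are reported.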
Figure 11 shows the results obtained for all sensors in the different planes, i.e., sagittal, transverse, and frontal, for subject M1. The results presented in Figure 11 show a good correlation between the errors of the POF curvature sensor and the IMU, especially in the sagittal and frontal planes. Although we used the same number of cycles to compare the sensors, the period of each movement is different, because each subject was allowed to perform the movements at a comfortable, self-selected velocity.
Furthermore, the range of movement in each plane is different, i.e., the movement in the sagittal plane occurs in a range of about 0–145°, whereas the one in the transverse plane reaches angles lower than 130°. Similarly, the angles in the frontal plane can be as high as 145° (as in the sagittal plane). From the tests, the mean deviation between the POF curvature sensor and the IMUs was about 6.5% in the sagittal plane. However, this deviation increased to about 10% in the transverse and frontal planes. The reason for this increase can be related to the POF positioning during the tests, since it is a critical factor in the angle assessment using this technology. In addition, it can also be related to the increase of the errors of the IMUs when the test was performed in planes other than the sagittal one, as reported in Vargas-Valencia et al. [33]. Regarding the camera-based system, the results in the sagittal plane show an overestimation of the angle when compared to the IMU and the POF curvature sensor. In this case, the angles estimated by the markerless camera system had a maximum value of about 160°, which is higher than the elbow range of motion [39]. In contrast, the camera-based system underestimates the angles in the frontal plane when compared with the other two systems for angle assessment.
As mentioned above, the errors of the markerless camera system for angle assessment are related to issues such as frame errors, the use of multiple image streams and, especially, self-occlusions. To further evaluate the errors obtained with the camera-based system, Table 3 presents the correlation coefficient and RMSE between the markerless system and the IMUs for each of the 11 participants in all three planes tested, whereas Table 4 presents the correlation and RMSE between the markerless system and the POF curvature sensor. Table 3 and Table 4 show a correlation coefficient higher than 0.9 in all analyzed cases, which indicates a high correlation between the responses of the sensors. In addition, the standard deviation of the correlation coefficient was below 0.01 in all analyzed cases. Thus, it is possible to verify not only a high correlation between the data of the camera-based system and those of the wearable sensors, but also that the results present promising evidence of the repeatability of such systems. The means of the correlation coefficients between the camera-based system and the IMUs were 0.990, 0.984 and 0.979 in the sagittal, transverse, and frontal planes, respectively. It is noteworthy that higher correlations were obtained between the camera-based system and the IMUs than between the markerless system and the POF curvature sensor. The means of the correlation coefficients for the latter comparison were 0.978, 0.964 and 0.975 for the sagittal, transverse, and frontal planes, respectively.
Even though the proposed camera-based system presented a high correlation with the wearable sensors in all scenarios, the errors of this system are generally high. As can be observed in Figure 11, there are deviations in the angle estimation of the camera-based system when compared with the wearable sensors; considering all the performed tests, these errors can be as high as 15° in the worst case. In addition, the mean errors are about 10° when compared with the wearable sensors. It is noteworthy that these errors are lower than the ones reported in the literature [40], mainly due to the use of two cameras to reduce the errors related to occlusions. However, errors of about 10° are still not sufficient where a reliable system for movement analysis is concerned. Nevertheless, the high correlations obtained in all tests in the comparison with the wearable sensors (see Table 3 and Table 4) indicate that the proposed markerless camera-based system can be a feasible solution for angle estimation if a post-processing technique for the correction of the angular errors is applied, as also discussed in Schmitz et al. [40].
3.3. Technique for Angle Correction in Markerless Camera-Based Systems
The primary assumption behind the proposed compensation technique for angle errors in markerless camera-based systems is that the errors mainly occur due to occlusions, or to errors in the computer vision algorithm when tracking the anatomical points used to calculate the parameters shown in Figure 4. If these parameters are incorrectly estimated, errors in the angle assessment will occur. Thus, it is possible to assume that these angular errors are correlated with the anthropometric measurements of each participant. To verify this assumption and develop the compensation technique for the markerless system, each participant performed 3 flexion/extension cycles only in the sagittal plane (see Section 2.5), and the angles estimated with the markerless camera-based system were compared with the ones measured by the POF curvature sensor. We used the POF curvature sensor for the development of the compensation technique, since it had already been evaluated against the potentiometer, presenting low errors in that characterization. However, we must emphasize that other sensor systems can be used as a reference for the proposed compensation technique, including IMUs, marker-based camera systems and goniometers. The technique proposed here is based on the premise that the errors are mainly related to the detection of the parameters of Figure 4 (due to self-occlusions, numerical errors in the computer vision algorithm, among other reasons). Therefore, the errors can be correlated with (and then compensated by) the actual values of the anthropometric measurements used in the angle estimation, which can be measured on each subject or estimated from the subject's height [41].
For the first characterization of the technique, the flexion/extension cycles of five subjects (M1, M3, M5, F2 and F5) are analyzed, and a polynomial regression between the angles estimated by the camera-based system and the POF curvature sensor is performed for each of the five subjects, where each equation has the form shown in Equation (7):

θ_POF = a·θ_cam³ + b·θ_cam² + c·θ_cam + d,      (7)

where θ_cam is the angle estimated by the camera and θ_POF is the angle measured by the POF curvature sensor. In addition, a, b, c and d are polynomial regression coefficients obtained experimentally through the regression between the angular responses of both sensor systems, i.e., the markerless camera-based system and the POF sensor, using the least squares method. The coefficient d is the offset of the sensor response (in °). Therefore, if the sensor responses are normalized at the beginning of the test, the offset will be null. For this reason, the coefficient d is not employed in the analysis of correlation between the coefficients of the angular error correction and the anthropometric measurements of the subjects.
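The third-order least-squares regression of Equation (7) can be sketched with NumPy's `polyfit` as follows. The synthetic angle samples and the distortion model for the camera signal are illustrative assumptions, not the experimental data:

```python
import numpy as np

# Hypothetical synchronized angle samples (degrees) from one flexion cycle:
# theta_cam from the markerless camera system, theta_pof from the POF sensor.
rng = np.random.default_rng(1)
theta_pof = np.linspace(0.0, 145.0, 300)
theta_cam = 1.08 * theta_pof + 4e-4 * theta_pof**2 + rng.normal(0.0, 0.5, 300)

# Third-order least-squares fit (Equation (7)):
#   theta_pof ~= a*theta_cam^3 + b*theta_cam^2 + c*theta_cam + d
a, b, c, d = np.polyfit(theta_cam, theta_pof, 3)

# With both responses normalized to zero at the start of the test,
# the offset d is expected to be close to null.
theta_corrected = np.polyval([a, b, c, d], theta_cam)
rmse = np.sqrt(np.mean((theta_corrected - theta_pof) ** 2))
print(f"a={a:.3e}, b={b:.3e}, c={c:.3f}, d={d:.3f}, RMSE={rmse:.2f} deg")
```

The fit residual (RMSE of the corrected angle against the reference) gives a quick check that the chosen polynomial order is adequate for the cycle being calibrated.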
Figure 12 shows the regression between the angle measured by the camera-based system and the POF curvature sensor for the third flexion cycle (shown as an example) of subject F5. The results show a high correlation (0.998) between the responses using a third-order polynomial regression. In fact, such a high correlation occurs for all the cycles of the five subjects analyzed, where the correlation coefficient was higher than 0.9 in all cases. Hence, the assumption of correlation between the errors of both sensor systems holds (based on the analyses performed). The next step is then to correlate the polynomial coefficients (a, b and c) with the anthropometric measurements of each participant.
As discussed in Section 2.1, the parameters used in the angle estimation by the camera-based system are the anthropometric distances detected through computer vision algorithms. Thus, errors in the detection of these points will lead to errors in the angle estimation, and such errors can be related to those anthropometric distances. However, these parameters are intrinsic to each subject and can be easily measured. In addition, it is possible to use the height of each subject (as shown in Table 1) in conjunction with anthropometric data for males and females to correlate the arm length with the subject's height.
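When the arm segments are not measured directly, they can be approximated from body height using segment-length/stature ratios. The ratios below are of the kind found in standard anthropometric tables (e.g., Winter's) and are used here only as an illustrative assumption; the exact values employed in a given study should come from the table actually referenced:

```python
# Illustrative segment-length/stature ratios (treat as assumptions;
# standard anthropometric tables give values of this kind).
UPPER_ARM_RATIO = 0.186   # upper arm length / body height
FOREARM_RATIO = 0.146     # forearm length / body height

def estimate_arm_segments(height_m):
    """Estimate upper-arm and forearm lengths (m) from body height (m)."""
    return UPPER_ARM_RATIO * height_m, FOREARM_RATIO * height_m

upper, fore = estimate_arm_segments(1.70)
print(f"upper arm ~ {upper:.3f} m, forearm ~ {fore:.3f} m")
```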
Figure 13 shows the correlation of the polynomial regression coefficients (a, b and c) with the subjects’ arm lengths (D).
The results, as well as the equations presented in Figure 13, indicate the feasibility of using the anthropometric measurements of each subject in equations for angular error correction in camera-based systems. The correlation coefficient is higher than 0.9 for all analyzed coefficients, indicating the possibility of using the proposed compensation technique for angle correction. Then, by substituting the equations shown in Figure 13 into Equation (7), it is possible to obtain a corrected angle, as depicted in Figure 14 for three flexion/extension cycles of subject F1. In addition, the uncompensated response, i.e., the response of the camera-based system without applying the equations for angle correction, is also presented for comparison purposes. The RMSE for the compensated response is also presented in order to verify the accuracy enhancement provided by the proposed technique. Compared to the uncompensated responses, where the RMSE was 15.04°, 9.25° and 10.23° for cycles 1, 2, and 3, respectively, the proposed angular error compensation was able to reduce the errors substantially in all three cycles. To further verify the performance of the proposed technique, the aforementioned compensation equations were applied to the responses in the sagittal plane for all subjects. The comparison between the RMSEs for the cases with and without the compensation technique is presented in Table 5 for each subject in all three flexion/extension cycles analyzed, where the mean and standard deviation of the three cycles are presented for each participant.
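Putting the pieces together, the correction step can be sketched as below. The sketch assumes (i) that each polynomial coefficient of Equation (7) is given by a linear function of the arm length D, as suggested by the fits of Figure 13, and (ii) a null offset d after normalization. The slope/intercept values are invented placeholders, not the fitted ones from the study:

```python
import numpy as np

# Invented placeholder coefficients for the linear fits of Figure 13:
# each polynomial coefficient is modeled as coeff(D) = slope * D + intercept,
# with D the subject's arm length in metres. Real values come from the
# calibration regression against the reference sensor.
COEFF_FITS = {
    "a": (1.2e-4, -5.0e-5),
    "b": (-3.0e-2, 1.0e-2),
    "c": (0.50, 0.85),
}

def corrected_angle(theta_cam, arm_length):
    """Apply the anthropometry-based compensation (Equation (7), d = 0)."""
    a = COEFF_FITS["a"][0] * arm_length + COEFF_FITS["a"][1]
    b = COEFF_FITS["b"][0] * arm_length + COEFF_FITS["b"][1]
    c = COEFF_FITS["c"][0] * arm_length + COEFF_FITS["c"][1]
    theta_cam = np.asarray(theta_cam, dtype=float)
    return a * theta_cam**3 + b * theta_cam**2 + c * theta_cam

angles = corrected_angle([0.0, 45.0, 90.0, 160.0], arm_length=0.32)
print(np.round(angles, 2))
```

Once the three linear fits are obtained in the calibration stage, correcting a new subject's data only requires their arm length (measured or estimated from height), with no further contact with the reference sensor.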
The results presented in Table 5 show the feasibility of the proposed technique, where the RMSE was reduced for all 11 subjects analyzed. The highest reduction occurred for subject F1, in which the RMSE dropped from 11.52° to 3.52° after applying the correction equations. The mean of the RMSEs for the compensated responses is about 4.90°, whereas the uncompensated one is 10.42°, which means a two-fold reduction of the RMSE when the proposed compensation is applied. It is also worth mentioning that the lowest RMSE reduction for the compensated case occurred for subject M3, where the RMSE was reduced by 2.11°. However, one should note that the RMSE of the uncompensated response of this subject was already low (6.90°) when compared to those of the other subjects, and even when compared to the errors reported in the literature for similar systems [40].
The proposed technique for angular error correction in markerless camera-based systems is a feasible and straightforward option to enhance the angular accuracy of such systems. There is a calibration step in which the response of the camera-based system has to be compared with that of a reference sensor system, e.g., wearable or marker-based camera systems. Then, the errors obtained with the markerless camera-based system are correlated with the subject's anthropometric parameters (arm length in this case) in order to obtain an equation that relates the angle correction to the parameters of each subject. Therefore, an important caveat should be mentioned: the calibration routine must be performed with respect to a reliable reference, and the movements should be performed in a single plane, i.e., sagittal, frontal, or transverse plane movements. In addition, the calibration has to be performed over the same range at which the angle analysis will be performed, i.e., if an angular interval of 0° to 160° will be analyzed, the calibration has to be made over this same angular range (0° to 160°). By following these steps, it is possible to obtain accurate single-plane angle measurements with a markerless camera-based system. Therefore, the main limitation of this approach is the necessity of a calibration stage, prior to the application of the proposed sensor system, over the same range and planes of movement envisaged in the intended application. However, it is worth noting that the proposed approach can be extended to movement analysis of different degrees of freedom by adjusting the calibration stage accordingly and correlating the errors with the anthropometric parameters of each subject.