
Sensors 2018, 18(7), 2216; https://doi.org/10.3390/s18072216

Article
3D Analysis of Upper Limbs Motion during Rehabilitation Exercises Using the Kinect™ Sensor: Development, Laboratory Validation and Clinical Application
1 Laboratory of Anatomy, Biomechanics and Organogenesis (LABO), Université Libre de Bruxelles, 1050 Brussels, Belgium
2 Department of Electronics and Informatics—ETRO, Vrije Universiteit Brussel, 1050 Brussels, Belgium
3 International Medical Equipment Collaborative (IMEC), Kapeldreef 75, B-3001 Leuven, Belgium
4 Department of Applied Mathematics, Peter the Great St. Petersburg Polytechnic University (SPbPU), 195251 St. Petersburg, Russia
5 Institute of Computer Science and Mathematics, Slovak University of Technology, 81237 Bratislava, Slovakia
* Author to whom correspondence should be addressed.
Received: 29 May 2018 / Accepted: 6 July 2018 / Published: 10 July 2018

Abstract: Optoelectronic devices are the gold standard for 3D evaluation in clinics, but due to the complexity of such hardware and its limited accessibility for patients, affordable, transportable, and easy-to-use systems must be developed if 3D evaluation is to be widely used in daily clinical practice. The Kinect™ sensor has several advantages over optoelectronic devices, such as its price and transportability. However, it also has limitations: the (in)accuracy of skeleton detection and tracking, and the limited number of available points, which make 3D evaluation impossible. To overcome these limitations, a novel method was developed to perform 3D evaluation of the upper limbs. This system is coupled with rehabilitation exercises, allowing functional evaluation during physical rehabilitation. To validate this new approach, a two-step method was used. The first step was a laboratory validation in which the results obtained with the Kinect™ were compared with those obtained with an optoelectronic device; 40 healthy young adults participated in this part. The second step was to determine the clinical relevance of this kind of measurement. Results of the healthy subjects were compared with those of a group of 22 elderly adults and a group of 10 chronic stroke patients to determine whether different patterns could be observed. The new methodology and the different validation steps are presented in this paper.
Keywords:
Kinect; validation; assessment; functional evaluation; shoulder; markerless system

1. Introduction

Since the release of the first version of the Kinect™ sensor for the Xbox 360 (Kinect) at the end of 2010, researchers and clinicians have recognized the potential of this device. Many studies have been conducted to validate it as a markerless system (MLS) for various uses (e.g., motion analysis, posture analysis, feedback during rehabilitation exercises) [1,2,3,4,5,6,7].
Interesting results have been reported in terms of accuracy (compared to gold-standard marker-based systems (MBSs)) and especially in terms of precision (reproducibility has been found to be higher for the Kinect than for an MBS using a Plug-in Gait (PiG) model such as the Vicon™ protocol) [3].
3D motion analysis using an MBS is considered the gold standard for clinical motion analysis, even though several issues have been raised and discussed in the literature. The accessibility of MBSs is limited by their cost, so only specialized centers can afford them. Furthermore, marker placement, which is time-consuming and a potential source of error [8], and skin displacement during motion are two recognized problems in the MBS field [9].
Several studies have investigated the use of the Kinect as an MLS for upper limb evaluation to assess the reachable workspace in healthy subjects [10], in patients with facioscapulohumeral muscular dystrophy [11], and in patients with Duchenne muscular dystrophy [12]. Another study compared simple planar motions (shoulder abduction and elbow flexion) and found good correlations with MBS results [4]. Since control of movement speed is important in various neurological conditions, others have investigated the ability of the Kinect to detect arm movement speed in healthy subjects, obtaining good results after applying filtering algorithms [13].
A new generation of Kinect, the Kinect for Xbox One (Kinect One), was released in 2014. Research comparing the two generations for object detection found better results for the second generation, especially as the distance between the object and the camera increased [14]. The use of the Kinect One to assess upper limb mobility and function has also been extensively studied. Based on the analysis of 1670 measurements, authors found that 3D evaluation of shoulder ranges of motion was significantly more precise, with narrower limits of agreement, than the measurements of trained observers (clinicians) [15]. Another study compared the results of the Kinect One, an MBS, and goniometry for range of motion (ROM) and motion smoothness. The Kinect One showed very good agreement of ROM measurement with the 3D motion analysis (r > 0.9) compared with goniometry, as well as good correlation and agreement for motion quality parameters [16].
The measurements performed in these studies were not taken at the same time, making comparisons of upper limb evaluation difficult. Other studies compared 3D full-body kinematic analysis, mainly during gait, balance, and adaptive postural control. Authors found that the accuracy of Kinect One landmark movements was moderate to excellent and depended on the movement dimension, landmark location, and the task performed [17]. Another study showed that gait analysis using multiple Kinect One sensors can provide an accurate analysis of human gait compared to an MBS [18]. In both studies, the authors successfully recorded motion simultaneously, showing that the infrared signals from the MBS do not introduce excessive noise and do not influence the Kinect results.
Despite these promising results, some issues still need to be solved in order to fully use this MLS in daily clinical practice and for unsupervised remote data collection by patients at home.
Due to the information provided by the Kinect SDK (i.e., a simple skeleton model composed of 20 points for the old Kinect and 25 points for the Kinect One), it is not possible to directly obtain three-dimensional joint orientations. Another issue in motion analysis is the variety of conventions used (e.g., Euler sequences, orientation vector position), which makes comparison and interpretation of results difficult [19].
Some solutions have been proposed to increase the quality of the results, such as fusing the data from the Kinect and accelerometers [20], modifying the placement of the sensor according to the type of measurements [21,22], fusing the data from multiple Kinect sensors [18,23], or developing new algorithms for skeleton detection based on raw data [24].
The aim of this study was to present an advanced MLS model (PiG, as in the Vicon™ system) [25] and a new method for motion analysis based on joint trajectories, complementary to joint angles, during rehabilitation exercises using a single Kinect camera. Previous studies have shown that the Kinect sensor can be used to follow patients' evolution during rehabilitation exercises [26,27]. This kind of evaluation, performed during rehabilitation, has many advantages: (i) it takes place in the patient's natural environment (it is known that patients do not exhibit the same performance when wearing only underwear in a gait laboratory); (ii) when patients are immersed in the games, they are less focused on the motion and on pain, and can reach larger amplitudes than when asked to perform one particular motion; (iii) it saves time; and (iv) it is financially beneficial (the devices are affordable, and since the evaluation is done within the therapy session, there is no dual pricing) [28].
The different steps of the development of this method, the laboratory validation (i.e., comparison with a gold-standard optoelectronic device), and the clinical validation are presented in this paper. The results of this new method are easy to interpret and could therefore be used in clinics and at home to assess patient status and monitor follow-up.

2. Methods

Each frame of the MLS motion data was collected from the original hardware and was available as the 3D coordinates of a crude approximation of the main human joints (Microsoft Kinect SDK). By piecewise linear connection of these joints, one can build a stick-based model (i.e., adjacent points are linked by a line representing a human segment) for visualization and motion analysis. The major drawback of this approach is that it does not allow anatomically correct descriptions of joint angular motion according to current clinical conventions [19]. An algorithm extending the crude model provided by the Kinect, comprising several steps, was previously developed and validated [25].
Each link size can be corrected based on the assumption that the raw stick-based model supplies proper line orientation. Starting from the native thorax stick model, one can substitute the spatial location of extremity joints, and therefore segment size, by processing each link sequentially from the root (e.g., thoracic segment) to the end joint (shoulder, then elbow, and finally wrist joint).
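This sequential link-length correction can be sketched as follows (a minimal NumPy illustration with hypothetical joint names and segment lengths, not the authors' exact implementation):

```python
import numpy as np

def correct_link_lengths(joints, chain, lengths):
    """Rescale each link of a kinematic chain to a fixed length, keeping
    the raw Kinect line orientation and processing links sequentially from
    the root (thorax) to the end joint (shoulder, elbow, wrist).

    joints  : dict joint name -> raw 3D position, np.array of shape (3,)
    chain   : joint names, root first, e.g. ["thorax", "shoulder", ...]
    lengths : desired segment lengths, one per link
    """
    corrected = {chain[0]: joints[chain[0]].copy()}
    for (parent, child), seg_len in zip(zip(chain, chain[1:]), lengths):
        direction = joints[child] - joints[parent]
        direction /= np.linalg.norm(direction)   # keep the raw orientation
        # substitute the child location at the prescribed segment length
        corrected[child] = corrected[parent] + seg_len * direction
    return corrected

# toy example: a noisy elbow position is moved onto a 0.30 m upper arm
raw = {"thorax": np.array([0.0, 0.0, 0.0]),
       "shoulder": np.array([0.2, 0.0, 0.0]),
       "elbow": np.array([0.2, -0.35, 0.0]),
       "wrist": np.array([0.2, -0.35, 0.28])}
fixed = correct_link_lengths(raw, ["thorax", "shoulder", "elbow", "wrist"],
                             [0.2, 0.30, 0.25])
```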
In total, 19 local coordinate systems (LCSs), following International Society of Biomechanics (ISB) recommendations [29] for axis orientation, were located in the origins indicated by numbers 1–19 in Figure 1. Then, 33 LCSs origin motions relative to parent LCSs (Table 1) were created.
Each of the 33 trajectory plots was processed to assess different properties of the shape created by the hodograph. Similar to Duarte et al., nine parameters (1–9 in Table 2) were estimated directly from the point trajectories [30]. This method was originally developed for balance analysis, and these parameters have been validated to assess dynamic balance during serious games (SG) rehabilitation exercises [31]. We then extended this 2D analysis into 3D to obtain more information about the reaching area of the participants.
All trajectory points were also processed as a point cloud to assess the origin and orientation of the principal and supplementary axes in the parent LCS. The main principal axis corresponded to the maximum eigenvalue. The second axis was defined as perpendicular to the plane formed by the first axis and the radius vector of the principal axes' origin. The last axis was the right-handed perpendicular to the first two. The size of each axis was defined by the distance between the minimum and maximum points of the trajectory cloud projected onto that axis.
Then, 32 additional trajectory shape parameters (numbered 10 to 41) were evaluated (Table 2). These 41 parameters were computed using the following equations.
Trajectory data were defined in the LCS by a sequence of $N$ points sampled at frequency $f$ (e.g., $f = 30\ \mathrm{s}^{-1}$):

$$p_i = [p_{ix}, p_{iy}, p_{iz}], \qquad i = 1, \ldots, N.$$

The instantaneous absolute velocity of a point ($\|\cdot\|$ denotes the Euclidean norm) and the total velocity $(1 \times N)$ matrix are given by

$$v_i = \|[v_{ix}, v_{iy}, v_{iz}]\|,$$

$$V = [v_1, \ldots, v_N].$$

The current length of the trajectory (travel) corresponds to $L_N$:

$$L_i = \sum_{k=1}^{i-1} \|p_{k+1} - p_k\|, \qquad i = 2, \ldots, N.$$
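As a concrete illustration, these first quantities can be computed with a few lines of NumPy (a sketch; estimating the speeds by finite differences at the sampling frequency f is an assumption, since the text does not state how the velocities are obtained):

```python
import numpy as np

def trajectory_basics(P, f=30.0):
    """P: (N, 3) array of trajectory points in the local coordinate system.
    Returns instantaneous speeds v_i (Euclidean norm of the velocity) and
    the cumulative travel L_i along the trajectory."""
    dP = np.diff(P, axis=0)                # p_{k+1} - p_k
    step = np.linalg.norm(dP, axis=1)      # ||p_{k+1} - p_k||
    v = step * f                           # finite-difference speed, m/s
    L = np.cumsum(step)                    # L_i, i = 2..N
    return v, L

P = np.array([[0.0, 0.0, 0.0], [0.1, 0.0, 0.0], [0.1, 0.2, 0.0]])
v, L = trajectory_basics(P)
# total travel L_N = 0.1 + 0.2 = 0.3
```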
The scalar motion parameters for the hodograph velocity are obtained with

$$V_{Mean} = \frac{1}{N} \sum_{k=1}^{N} v_k,$$

$$V_{std} = \left( \frac{1}{N-1} \sum_{k=1}^{N} (v_k - V_{Mean})^2 \right)^{1/2},$$

$$V_{Max} = \max(V).$$
The total area of the trajectory reached by each joint is given by Equation (8), where $S_i$, $i = 1, \ldots, N-1$, is the area of the triangle defined by the three points $[o, p_i, p_{i+1}]$:

$$\Delta_N = \sum_{i=1}^{N-1} S_i.$$

The angle between the two rays $[o, p_i]$ and $[o, p_{i+1}]$ is obtained by Equation (9), where $h_i = 2 S_i / \|p_i\|$:

$$\alpha_i = \arcsin\!\left( \frac{h_i}{\|p_{i+1}\|} \right).$$

From there, the total angular travel (in degrees) is obtained with

$$\Phi_N = \frac{180}{\pi} \sum_{i=1}^{N-1} \alpha_i.$$
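Equations (8)–(10) can be sketched in NumPy as follows (assuming the triangles are built from the origin $o$ of the parent LCS, as the definitions imply):

```python
import numpy as np

def angular_travel(P):
    """Total swept area (Delta_N) and total angular travel (Phi_N, in
    degrees) of a trajectory P of shape (N, 3), seen from the origin o
    of the parent local coordinate system."""
    # triangle areas S_i for [o, p_i, p_{i+1}] via the cross product
    S = 0.5 * np.linalg.norm(np.cross(P[:-1], P[1:]), axis=1)
    h = 2.0 * S / np.linalg.norm(P[:-1], axis=1)   # h_i = 2 S_i / ||p_i||
    ratio = h / np.linalg.norm(P[1:], axis=1)
    alpha = np.arcsin(np.clip(ratio, -1.0, 1.0))   # alpha_i
    return S.sum(), np.degrees(alpha).sum()

# quarter turn on a unit circle: Phi_N should be 90 degrees
t = np.linspace(0.0, np.pi / 2, 50)
P = np.stack([np.cos(t), np.sin(t), np.zeros_like(t)], axis=1)
area, phi = angular_travel(P)
```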
The instantaneous angular velocity and the total angular velocity $(1 \times (N-1))$ matrix are given by Equations (11) and (12), respectively:

$$\omega_i = \frac{180}{\pi} f \alpha_i,$$

$$\Omega = [\omega_1, \ldots, \omega_{N-1}].$$

From Equation (12), the mean, standard deviation, and maximum angular velocity ($\Omega_{Mean}$, $\Omega_{std}$, $\Omega_{Max}$) are easily obtained.
The mean (central) point of the cloud is given by

$$p_{Mean} = \frac{1}{N} \sum_{k=1}^{N} p_k,$$

and the centered point coordinates are

$$q_i = p_i - p_{Mean}.$$

The instantaneous inertia matrix is obtained with Equation (15), and the total cloud inertia matrix with Equation (16):

$$I_i = E\,(q_i^T q_i) - q_i q_i^T, \qquad E = \mathrm{diag}(1),$$

$$I = \sum_{i=1}^{N} I_i.$$
Then, using singular value decomposition, the main (first) principal axis is obtained:

$$G_1 = [G_{1x}, G_{1y}, G_{1z}]^T, \qquad \|G_1\| = 1.$$

Two additional orthogonal axes can be obtained using Equations (18) and (19), where $c$ is a normalized (unit) vector with projections $c_x$, $c_y$, $c_z$:

$$c = \frac{c}{\|c\|}, \qquad \tilde{c} = \begin{bmatrix} 0 & -c_z & c_y \\ c_z & 0 & -c_x \\ -c_y & c_x & 0 \end{bmatrix},$$

$$G_2 = \tilde{p}_{Mean} G_1 / \|\tilde{p}_{Mean} G_1\|,$$

$$G_3 = \tilde{G}_1 G_2.$$

Here $\tilde{c}$, $\tilde{p}$, and $\tilde{G}$ are the skew-symmetric matrix representations used to express the vector cross product in matrix form.
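The principal-axis construction can be sketched as follows. Note one simplification in this sketch: the eigenvectors of the inertia matrix defined above coincide with those of the point-cloud scatter matrix $C = \sum_i q_i q_i^T$ (their eigenvalues are related by $\lambda_I = \mathrm{tr}(C) - \lambda_C$), so the code diagonalizes $C$ directly; the toy data and its dimensions are hypothetical:

```python
import numpy as np

def principal_axes(P):
    """Orthonormal principal axes G = [G1, G2, G3] of a trajectory cloud
    P of shape (N, 3). G1 is the direction of largest spread; G2 is
    perpendicular to G1 and to the mean-point radius vector; G3 is the
    right-handed third axis."""
    q = P - P.mean(axis=0)                  # centered points q_i
    C = q.T @ q                             # (3, 3) scatter matrix
    eigvals, eigvecs = np.linalg.eigh(C)    # eigenvalues in ascending order
    G1 = eigvecs[:, -1]                     # largest eigenvalue -> main axis
    G2 = np.cross(P.mean(axis=0), G1)       # matrix form of p~_Mean G1
    G2 /= np.linalg.norm(G2)
    G3 = np.cross(G1, G2)                   # matrix form of G~_1 G2
    return np.column_stack([G1, G2, G3])

# elongated cloud along x, offset along z so that p_Mean is nonzero
rng = np.random.default_rng(0)
P = rng.normal(scale=[1.0, 0.1, 0.1], size=(500, 3)) + np.array([0.0, 0.0, 2.0])
G = principal_axes(P)
```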
A $(3 \times 3)$ orientation matrix is obtained with Equation (20), and an $(N \times 3)$ matrix of the projections of the $q_i$ on its axes with Equation (21):

$$G = [G_1, G_2, G_3],$$

$$Q_G = [q_1, \ldots, q_N]^T G.$$

The minimal and maximal points on the orthogonal axes (size $(1 \times 3)$) are

$$Q_{Min} = \min_{i}(Q_{G,ij}), \qquad Q_{Max} = \max_{i}(Q_{G,ij}), \qquad i = 1, \ldots, N, \; j = 1, 2, 3 \; (X, Y, Z).$$

For each axis $i = 1, 2, 3$, the two end point positions (parameters 19–30 for $i = 1, 2$) are

$$B_{1i} = p_{Mean} + Q_{Min,i}\, G_i^T,$$

$$B_{2i} = p_{Mean} + Q_{Max,i}\, G_i^T.$$

The angle of view (parameters 12 and 13 for $i = 1, 2$) is

$$\Phi_{Bi} = \frac{180}{\pi} \arccos\!\left( \frac{B_{1i} \cdot B_{2i}}{\|B_{1i}\|\,\|B_{2i}\|} \right).$$

The size of each axis (parameters 14 and 15 for $i = 1, 2$) is

$$L_{Bi} = \|B_{1i} - B_{2i}\|.$$

The surface area of the rhomboid defined by the end points of the first and second axes (parameter 10) is given by Equation (29):

$$S_{mid} = \frac{L_{B1} L_{B2}}{2}.$$

The volume of the diamond defined by the six end points (parameter 11) is finally obtained as

$$Vol = \frac{S_{mid} L_{B3}}{3}.$$
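The end-point, area, and volume computations above can be sketched as follows (a toy axis-aligned cloud; the identity is used as the orientation matrix for clarity):

```python
import numpy as np

def reach_shape(P, G):
    """Axis sizes L_Bi, rhomboid area S_mid, and diamond volume Vol of a
    trajectory cloud P of shape (N, 3), given its orientation matrix
    G = [G1, G2, G3] (axes as columns)."""
    p_mean = P.mean(axis=0)
    QG = (P - p_mean) @ G                       # projections on the axes
    Qmin, Qmax = QG.min(axis=0), QG.max(axis=0)
    B1 = p_mean + Qmin[:, None] * G.T           # end points B_1i (rows)
    B2 = p_mean + Qmax[:, None] * G.T           # end points B_2i (rows)
    LB = np.linalg.norm(B1 - B2, axis=1)        # axis sizes L_Bi
    S_mid = LB[0] * LB[1] / 2.0                 # rhomboid area
    vol = S_mid * LB[2] / 3.0                   # diamond volume
    return LB, S_mid, vol

# points spanning 2, 1, and 0.5 units along x, y, and z
P = np.array([[-1, 0, 0], [1, 0, 0], [0, -0.5, 0],
              [0, 0.5, 0], [0, 0, -0.25], [0, 0, 0.25]], dtype=float)
LB, S_mid, vol = reach_shape(P, np.eye(3))
```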
Parameters 31–37 (Table 2) are obtained by fitting a sphere to the points. Parameters 38–41 are obtained using the Delaunay triangulation and convex hull functions in MATLAB.
Examples of the different scores related to the reaching area are presented in Figure 2.
Using this method, up to 697 variables can be processed (41 parameters × 17 joints). An intuitive and easy-to-interpret visualization tool must therefore be developed to present the results and to allow comparison between patients and controls or longitudinal patient follow-up.
This score can be based on single-joint or multiple-joint analysis.
For single joints, the total number of analyzed joints is $N_\phi$. For each joint $i$, the angular values are $\phi_i^k = [\phi_{ix}^k, \phi_{iy}^k, \phi_{iz}^k]$, $i = 1, \ldots, N_\phi$, $k = c, pt$, where the index $k$ corresponds to control ($c$) or patient ($pt$) data. The dimensionless joint angle scores can be defined as $s_{ij}^{\phi} = \phi_{ij}^{c} / \phi_{ij}^{pt}$, $i = 1, \ldots, N_\phi$, $j = x, y, z$, and the joint mean weighted score as $S_i^{\phi} = (w_{ix}^{\phi} s_{ix}^{\phi} + w_{iy}^{\phi} s_{iy}^{\phi} + w_{iz}^{\phi} s_{iz}^{\phi}) / (w_{ix}^{\phi} + w_{iy}^{\phi} + w_{iz}^{\phi})$, $i = 1, \ldots, N_\phi$, where the $w_{ij}^{\phi}$ are weight factors. The total body joint mean weighted score, with joint weights $W_i^{\phi}$ over selected joints $i \in I_\phi$, can finally be derived as $C_\phi = \sum_{i \in I_\phi} W_i^{\phi} S_i^{\phi} / \sum_{i \in I_\phi} W_i^{\phi}$.
This approach can be extended to multiple-joint data analysis based on trajectories. In the presented model for upper limb assessment, the total number of trajectories ($N_J$) is 17, and the total number of parameters for each trajectory ($N_{pj}$) is 41. Parameter values are denoted $p_{ij}^k$, $i = 1, \ldots, N_{pj}$, $j = 1, \ldots, N_J$, $k = c, pt$, for a selected parent joint $j$, parameter number $i$, and control ($c$) or patient ($pt$) data. A dimensionless parameter score can then be defined as $s_{ij}^{p} = p_{ij}^{c} / p_{ij}^{pt}$, $i = 1, \ldots, N_{pj}$, $j = 1, \ldots, N_J$, and the trajectory $j$ mean weighted score, with parameter weights $W_{ij}^{p}$ over selected parameters $i \in I_p$, as $S_j^{p} = \sum_{i \in I_p} W_{ij}^{p} s_{ij}^{p} / \sum_{i \in I_p} W_{ij}^{p}$, $j = 1, \ldots, N_J$. The total body trajectory-based mean weighted score, with trajectory weights $W_j^{p}$ over selected trajectories $j \in J_p$, can finally be derived as $C_p = \sum_{j \in J_p} W_j^{p} S_j^{p} / \sum_{j \in J_p} W_j^{p}$.
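The two-level weighted scoring can be sketched as follows (all weights and parameter values below are hypothetical):

```python
import numpy as np

def trajectory_score(p_control, p_patient, w):
    """Trajectory mean weighted score S_j^p from dimensionless parameter
    scores s_ij^p = p_ij^c / p_ij^pt and weights w."""
    s = np.asarray(p_control, float) / np.asarray(p_patient, float)
    w = np.asarray(w, float)
    return float((w * s).sum() / w.sum())

def total_score(S, W):
    """Total body score C_p from trajectory scores S_j^p and weights W_j^p."""
    S, W = np.asarray(S, float), np.asarray(W, float)
    return float((W * S).sum() / W.sum())

# toy trajectory: three parameters (e.g., length, angle, velocity), with
# the patient reaching twice the control values in each of them
S_traj = trajectory_score([0.30, 90.0, 0.50],   # control values p_ij^c
                          [0.60, 180.0, 1.00],  # patient values p_ij^pt
                          [1.0, 1.0, 2.0])      # weights W_ij^p

# total body score from two trajectory scores with equal weights W_j^p
C_p = total_score([S_traj, 1.0], [1.0, 1.0])
```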
An example of this visualization tool is presented in Figure 3.

3. Laboratory Setting Validation

3.1. Participants

Forty healthy adults (24 ± 6 years old; height 172 ± 8 cm; weight 68 ± 10 kg; BMI 23 ± 3 kg/m²; 18 women) were recruited to participate in this study. The study was approved by the Ethical Committee of the Erasme Hospital (EudraCT/CCB: B406201215142), and written informed consent was obtained from all subjects prior to their participation.

3.2. Material

The MLS sensor (Kinect) was placed on a tripod 1.5 m above the floor. Subjects stood 2.5 m from the camera; this distance was found to provide optimal results in a previous study [3]. Subjects were in underwear to allow reliable placement of the markers for the MBS analysis taking place simultaneously. Experiments took place in a motion analysis laboratory.
Prior to motion analysis, the subjects were asked to stand still in anatomical position facing the MLS camera. Subjects were then asked to maintain three different poses (3 s for each of the poses; see Figure 4) before recording the motion in order to calibrate the MLS data processing pipeline.
MBS data were simultaneously collected with a state-of-the-art stereophotogrammetric system (Vicon, 8 MXT40s cameras, Vicon Nexus software, frequency: 90 Hz) that tracks the spatial trajectories of reflective markers placed on the subjects. A modified Plug-in Gait (PiG) model was adopted: in addition to the usual PiG markers, markers were placed on the medial epicondyles of the humerus and femur. The 34 markers were positioned by the same observer throughout the study. The image frame rate of the MLS was 30 fps. MLS data were collected with a laptop (Sony Vaio SVF15323CXB, 1.6 GHz Intel Core i5-4200U, 6 GB DDR3L SDRAM, 750 GB (5400 rpm) SATA hard drive).

3.3. The Serious Games

Participants played one mini-game especially developed for physical rehabilitation: Wipe Out [32] (Figure 5). The player has to clean a screen covered with mud using a tissue controlled by mediolateral and inferior–superior displacements of the wrist relative to the trunk. Participants were asked to play three repetitions of the game. Motions were simultaneously recorded with the MBS and the MLS.

3.4. Data Processing and Statistics

The different scores and parameters described above were computed for the two devices. The mean of the three game repetitions was used for statistical analysis. Normality of the data was checked graphically (histogram, boxplot, and Q–Q plot) and with the Shapiro–Wilk test. Mean values and standard deviations were calculated. Discrepancies between the MBS and the different versions of the MLS were tested using Pearson's correlation coefficient (R). The reproducibility coefficient (RCP = 1.96 × STD) and the coefficient of variation (CV = STD/Mean) were expressed as percentages. Statistics and data processing were done with MATLAB and Statistics Toolbox Release 2016a (The MathWorks, Inc., Natick, MA, USA).
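For illustration, these agreement statistics can be computed as follows (a sketch with made-up paired measurements; taking the STD from the between-device differences and the Mean from the MBS values is an assumption, since the text does not fully specify the normalization):

```python
import numpy as np

def agreement_stats(mbs, mls):
    """Between-device agreement: Pearson's R, reproducibility coefficient
    RCP = 1.96 x STD, and coefficient of variation CV = STD / Mean, both
    expressed as percentages."""
    mbs, mls = np.asarray(mbs, float), np.asarray(mls, float)
    R = float(np.corrcoef(mbs, mls)[0, 1])          # Pearson's R
    diff = mls - mbs                                # between-device differences
    std, mean = diff.std(ddof=1), mbs.mean()
    rcp = 1.96 * std / mean * 100.0                 # RCP, %
    cv = std / mean * 100.0                         # CV, %
    return R, rcp, cv

# made-up paired trajectory-length measurements (m) from the two devices
mbs = [0.30, 0.35, 0.40, 0.33, 0.38]
mls = [0.31, 0.34, 0.41, 0.32, 0.39]
R, rcp, cv = agreement_stats(mbs, mls)
```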

3.5. Results of the Laboratory Validation

Due to space restrictions, only selected results are presented and discussed.
For upper limb analysis, up to 328 parameters can be obtained (4 joints × 2 sides × 41 parameters).
Results of the seven selected parameters for the relative displacements of elbow relative to shoulder (“shoulder”, points 4 and 5 in Figure 1) and wrist relative to shoulder (“wrist”, points 16 and 17 in Figure 1) for right and left sides are presented in Table 3.
All parameters except the angular velocity (mean R = 0.51 for the four joints) showed good correlation between the MLS and MBS results. The best results in terms of correlation, RCP, and CV were obtained for the velocity (expressed in m/s).
For both shoulders and wrists, better results were obtained for the total length of the trajectory, the total angle, and the mean velocity.
Although good correlations were found for the parameters related to the reaching area (i.e., volume, sphere, and surface), their RCP and CV were lower.
The next step of the study was to determine if those parameters are sensitive enough to discriminate healthy subjects from patients.

4. Validation in Clinical Environment

4.1. Participants

Three groups of subjects and patients were tested in order to evaluate the clinical relevance of the newly developed evaluation method and scores:
  • Adults: Sixteen healthy young adults (results of the laboratory validation were used).
  • Elderly: Seventeen patients (79 ± 5 years old) hospitalized in a geriatric department were included. This study was approved by the local ethical committee of Erasme Hospital (EudraCT: B406201628246), and informed consent was obtained from the patients prior to their participation.
  • Stroke: Ten patients with chronic stroke (73 ± 8 years old) participated. This study was approved by the ethical committee of Erasme Hospital (EudraCT: B406201526116), and informed consent was obtained from the patients prior to their participation.

4.2. Material

The same methodology as in the laboratory validation was used. Patients were in underwear so that clothing did not interfere with skeleton detection. Experiments were carried out in the hospital rooms and, in contrast to the laboratory protocol, the patients were not equipped with reflective markers. This situation is more natural than an evaluation performed in the gait lab and closer to daily clinical and rehabilitation practice.

4.3. Data Processing and Statistics

Each participant played three repetitions of the game. The different scores and parameters described above were computed for the three groups, and the mean of the three repetitions was used for statistical analysis. Normality of the data was checked using the Shapiro–Wilk test. Mean values and standard deviations were calculated. One-way analysis of variance (ANOVA) was used to compare the groups, and post-hoc analysis was done using the Bonferroni procedure.

4.4. Results of the Validation in the Clinical Environment

Mean results of the three groups and the statistics are presented in Table 4. The same parameters as in the laboratory validation are presented.
Concerning the shoulders, no statistically significant difference was found for the length, but highly significant differences were found for both the total angle and the velocity between young adults, elderly individuals, and stroke patients. The only parameter based on the relative motion of the elbow with respect to the shoulder that could differentiate the three groups was the volume of the sphere.
Concerning the wrists, statistically significant differences were found for the length and the velocity between young adults, elderly individuals, and stroke patients. For the volume, significant differences were found only between young adults and elderly individuals. The surface and the total angle presented statistically significant differences between the three groups.

5. Discussion

3D evaluation of the upper limbs remains a complex task in clinics due to non-cyclic motions, multiple degrees of freedom, and the different conventions used to present or process the results [19,29].
The availability of the Kinect sensors, coupled with the development and use of serious games in physical rehabilitation [33], offers a new perspective for long-term evaluation and follow-up during rehabilitation. Microsoft stopped manufacturing the Kinect in 2015 and the Kinect One in 2017. Therefore, other 3D cameras (e.g., Orbbec Astra Pro™, Asus Xtion™ sensors) or other affordable devices (e.g., multiple RGBD cameras [34]) could be used instead of the Kinect.
It is indeed possible to track and analyze motions performed by patients during serious games exercises [35]. However, some problems must still be solved in order to extract relevant information and provide feedback for both patients and clinicians. Compared to the most common motion analysis in clinics (gait analysis), the data collected during serious games rehabilitation exercises are usually longer (mini-games last approximately one minute, whereas gait analysis focuses on only a few steps), non-cyclic (gait cycles are normalized by step), and involve free motions (patients need to perform a task, but they can use different strategies (e.g., shoulder or elbow)). Therefore, it is not possible to average and normalize the motions performed by the patients, and analyzing only the ranges of motion is too restrictive to summarize one-to-two-minute exercises.
Two solutions are possible to obtain relevant information from the rehabilitation exercises:
The first is to analyze the performance of the patients within the games [36]: time required to finish, number of successes, failures, precision, etc. Although those parameters are clinically relevant, they are only an indirect indicator of patient status. Direct indicators (i.e., biomechanical and functional analysis) should be obtained by analyzing the motions performed by the patients and extracting clinically relevant information.
The second solution, presented in this paper, is to analyze the trajectories performed by the patients and extract relevant information about speed, total displacement, and reaching area.
First, the results obtained with the MLS were compared with the gold-standard MBS. Good agreement was found between the MLS and MBS for the different studied parameters, especially for the speed-related parameters (m/s) and the reaching area (Table 3). Gross and fine motor control are complex tasks involving many different components of the central and peripheral nervous systems [37]. Important natural alterations occur during a lifetime: a slow maturation of all components during childhood to acquire gross and later fine motor control [38]; then, a physiological decline of motor functions is observed starting around 60 years of age [39]. Speed of motion is among the most clinically relevant information in aging [40] and in various neurological diseases (e.g., stroke [41]). However, speed is not the only clinically important aspect [42]: without gross and fine motor control, patients cannot independently perform activities of daily living [43]. Therefore, this analysis must be coupled with an accuracy assessment [36].
Reaching area and other related parameters informing about the autonomy of patients are popular in rehabilitation and occupational therapy, since this is a good indicator of the independence of the patients [44]. The variables about the volume reached or the area swept in the average plane of motion were highly correlated between both devices (Table 3).
During the second part of this study, we tested the system with elderly and stroke patients to determine whether the scores could differentiate the three groups. Concerning aging, we observed a decrease in velocity and in the reaching area, which is coherent with the physiology of aging [39]. Concerning the comparison between elderly and stroke patients, the results must be interpreted carefully because of the small sample size. This part of the study was a proof of concept to evaluate the feasibility of such techniques. Larger studies are needed to determine whether different motor patterns can be found in various pathologies.
One of the possible issues related to this method is that up to 328 parameters can be obtained for upper limbs analysis. It is thus not possible for clinicians or patients to analyze all those parameters. Two problems must be solved before the system can be used in clinics: data reduction/selection and data visualization.
Two methods can be used to determine the more relevant parameters.
The first is based on expert’s (i.e., clinician’s) opinion and expertise. According to the pathology, they select what they think is the most appropriate and relevant [45].
The second approach is to use statistical methods. Principal component analysis can be used to select the most discriminant parameters for each population (if the sample size is large enough). Clustering or other machine learning methods can be used to determine the most relevant parameters to detect differences between healthy subjects and patients [46].
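A minimal PCA sketch of this kind of unsupervised parameter selection (toy data with hypothetical parameters, not the study's dataset):

```python
import numpy as np

def pca_loadings(X):
    """PCA on a (subjects x parameters) matrix X. Returns the explained
    variance ratios and the component loadings; parameters with the
    largest absolute loadings on the first components are candidates to
    keep in a reduced parameter set."""
    Xc = (X - X.mean(axis=0)) / X.std(axis=0)      # standardize parameters
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    explained = s**2 / (s**2).sum()                # variance ratio per component
    return explained, Vt                           # rows of Vt = loadings

# toy data: the first two parameters are strongly redundant, the third is
# pure noise, so the first component should load mostly on the first two
rng = np.random.default_rng(1)
base = rng.normal(size=50)
X = np.column_stack([base,
                     2.0 * base + 0.1 * rng.normal(size=50),
                     rng.normal(size=50)])
explained, Vt = pca_loadings(X)
```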
Both methods have advantages and disadvantages. The advantage of expert-based (supervised) selection is that the clinicians, the final users of this solution, choose parameters that they understand and find meaningful. Its weak point is that clinicians probably miss plenty of relevant information among the many new parameters they are unaware of. The unsupervised method is the opposite: all the data are analyzed without prior clinical assumptions, so the selected parameters will be relevant from a statistical point of view but may be difficult for clinicians to interpret and/or understand. This gap between the clinic and the development of new methods and technology is becoming increasingly important, and special attention must be paid to it in order to continue developing useful technologies [47]. A mixed approach combining clinically oriented selection of the data (experts' opinion) and machine learning methods should be encouraged in order to obtain solutions that can be used in daily clinics.
Due to financial constraints and the lack of access to clinicians, the time in front of the patient during consultation is continuously decreasing [48]. In this particular context, rapid and easy-to-interpret visualization tools must be developed. An alternative visualization of the scoring (compared to Table 3 and Table 4) is presented in Figure 3.
Selected parameters (n = 17) for visualization are grouped into angular (Ang, n = 6), volumetric (Vol, n = 5), and length (Len, n = 6) characteristics. The reference value of the score is 100%, indicated by the yellow circle. Score values in the range [±100%] are presented inside each sector, and the radius of each sector is proportional to the score value. Mean scores are presented for each group, and the total score over the 17 parameters is depicted in the yellow circle in the center. The main representative results for the angle, length, and volume parameters are plotted in a star diagram (Figure 3).
In this example, results for both limbs are compared with reference values of healthy subjects and expressed in percentage.
In the case of asymmetric pathologies (e.g., hemiplegia), results of the affected limb can be compared with the healthy one [49].
Future work will focus on the selection of the best parameters and including other relevant parameters such as the smoothness of the motion using normalized jerk in order to assess the quality of the exercises [50].

6. Conclusions

Quantified 3D functional evaluation of the upper limb is still a challenge for clinicians. This evaluation is particularly important to guide treatment and rehabilitation, guarantee optimal results, and thus improve patient autonomy.
From a technological point of view, the major advantage of this new method is the straightforward frame-by-frame computation of 34 additional points from the raw skeleton captured by the MLS, allowing full 3D data to be evaluated and visualized in real time. Point trajectory analysis is usually used to convert marker tracks into six-degrees-of-freedom (DoF) link motion when at least three markers related to a link are available [51]. This is a well-defined way of representing motion kinematics, but it requires specific knowledge about orientation and translation representation in global and/or local coordinate systems. This knowledge (ISB conventions, biomechanical background) has been incorporated into the model to enrich the MLS data.
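The classical three-marker approach referenced above [51] recovers the 6-DoF pose of a link by a least-squares rigid fit. A standard way to do this is the SVD-based Kabsch algorithm, sketched below; this illustrates the general technique, not the authors' own implementation.

```python
import numpy as np

def rigid_transform(ref_markers, cur_markers):
    """Least-squares rigid (6-DoF) pose of a link from >= 3 corresponding
    markers, via the SVD-based Kabsch algorithm.

    ref_markers, cur_markers: (N, 3) arrays of corresponding points.
    Returns rotation R (3x3) and translation t (3,) such that
    cur ~= ref @ R.T + t.
    """
    ref_c = ref_markers - ref_markers.mean(axis=0)
    cur_c = cur_markers - cur_markers.mean(axis=0)
    H = ref_c.T @ cur_c                      # cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))   # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cur_markers.mean(axis=0) - R @ ref_markers.mean(axis=0)
    return R, t
```

Given marker trajectories over time, applying this frame by frame yields the link's orientation and translation history in the chosen coordinate system.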
In addition to the optimization algorithm, several parameters were computed from the trajectories performed by the patients. Further studies are needed to select the parameters that are most relevant for functional evaluation and long-term follow-up during rehabilitation. Results of the analysis are presented in an intuitive, easy-to-understand form for both patients and clinicians thanks to the user-friendly visualization interface.
From the rehabilitation point of view, the innovative approach presented in this paper of combining rehabilitation exercises with functional evaluation offers many advantages: it saves time and money; patients are immersed in the game and can therefore perform more repetitions, as well as more natural movements, because they do not have the impression of being evaluated; evaluation is automatic and objective and does not require the presence of a clinician; measurements can be made frequently in the patient's ecological environment; and patients can directly visualize the evolution of their results from session to session.
The proposed new scoring system for functional assessment coupled with rehabilitation exercises has been validated. Results of this kind of evaluation could therefore be used to monitor patients and to perform long-term follow-up during rehabilitation thanks to the visualization interface. These tools can be useful for both patients and clinicians.

Author Contributions

Conceptualization, B.B.; Data curation, B.B.; Formal analysis, B.B. and V.S.; Investigation, B.B.; Project administration, S.V.S.J. and B.J.; Resources, S.V.S.J.; Software, V.S. and L.O.; Supervision, S.V.S.J. and B.J.; Visualization, V.S.; Writing—original draft, B.B.; Writing—review & editing, S.V.S.J. and B.J.

Funding

This study is a part of the ICT4Rehab and RehabGoesHome projects (www.ict4rehab.org). These projects were funded by Innoviris (Brussels Capital Region).

Conflicts of Interest

The authors declare no conflict of interest.

Abbreviations

The following abbreviations are used in this manuscript:
MLS: Markerless system
MBS: Marker-based system
LCS: Local coordinate system
RCP: Reproducibility coefficient
CV: Coefficient of variation

References

  1. Clark, R.; Pua, Y.; Fortin, K.; Ritchie, C.; Webster, K.; Denehy, L.; Bryant, A. Validity of the Microsoft Kinect for assessment of postural control. Gait Posture 2012, 36, 372–377. [Google Scholar] [CrossRef] [PubMed]
  2. Clark, R.; Pua, Y.; Bryant, A.; Hunt, M. Validity of the Microsoft Kinect for providing lateral trunk lean feedback during gait retraining. Gait Posture 2013, 38, 1064–1066. [Google Scholar] [CrossRef] [PubMed]
  3. Bonnechère, B.; Jansen, B.; Salvia, P.; Bouzahouene, H.; Omelina, L.; Sholukha, V.; Cornelis, J.; Rooze, M.; Van Sint Jan, S. Determination of the precision and accuracy of morphological measurements using the Kinect™ sensor: Comparison with standard stereophotogrammetry. Ergonomics 2014, 57, 622–631. [Google Scholar] [CrossRef] [PubMed]
  4. Bonnechère, B.; Jansen, B.; Salvia, P.; Bouzahouene, H.; Omelina, L.; Moiseev, F.; Sholukha, V.; Cornelis, J.; Rooze, M.; Van Sint Jan, S. Validity and reliability of the Kinect within functional assessment activities: Comparison with standard stereophotogrammetry. Gait Posture 2014, 39, 593–598. [Google Scholar] [CrossRef] [PubMed]
  5. Pfister, A.; West, A.; Bronner, S.; Noah, J. Comparative abilities of Microsoft Kinect and Vicon 3D motion capture for gait analysis. J. Med. Eng. Technol. 2014, 38, 274–280. [Google Scholar] [CrossRef] [PubMed]
  6. Gray, A.; Willis, B.; Skubic, M.; Huo, Z.; Razu, S.; Sherman, S.; Guess, T.M.; Jahandar, A.; Gulbrandsen, T.; Miller, S.; Siesener, N. Development and Validation of a Portable and Inexpensive Tool to Measure the Drop Vertical Jump Using the Microsoft Kinect V2. Sports Health 2017, 9, 537–544. [Google Scholar] [CrossRef] [PubMed]
  7. Kim, D.; Kim, D.; Kwak, K. Classification of K-Pop Dance Movements Based on Skeleton Information Obtained by a Kinect Sensor. Sensors 2017, 17, 1261. [Google Scholar] [CrossRef] [PubMed]
  8. Della Croce, U.; Leardini, A.; Chiari, L.; Cappozzo, A. Human movement analysis using stereophotogrammetry. Part 4: Assessment of anatomical landmark misplacement and its effects on joint kinematics. Gait Posture 2005, 21, 226–237. [Google Scholar] [CrossRef] [PubMed]
  9. Leardini, A.; Chiari, L.; Della Croce, U.; Cappozzo, A. Human movement analysis using stereophotogrammetry. Part 3. Soft tissue artifact assessment and compensation. Gait Posture 2005, 21, 212–225. [Google Scholar] [CrossRef] [PubMed]
  10. Kurillo, G.; Chen, A.; Bajcsy, R.; Han, J. Evaluation of upper extremity reachable workspace using Kinect camera. Technol. Health Care 2013, 21, 641–656. [Google Scholar] [PubMed]
  11. Han, J.; Kurillo, G.; Abreash, R.; de Bie, E.; Nicorici, A.; Bajcsy, R. Reachable Workspace in Facioscapulohumeral muscular dystrophy (FSHD) by Kinect. Technol. Health Care 2015, 51, 168–175. [Google Scholar] [CrossRef] [PubMed]
  12. Han, J.; de Bie, E.; Nicorici, A.; Abreash, R.; Anthonisen, C.; Bajcsy, R.; Kurillo, G.; Mcdonald, C. Reachable workspace and performance of upper limb (PUL) in duchenne muscular dystrophy. Muscle Nerve 2016, 53, 545–554. [Google Scholar] [CrossRef] [PubMed]
  13. Elgendi, M.; Picon, F.; Magnenat-Thalmann, N.; Abbott, D. Arm movement speed assessment via a Kinect camera: A preliminary study in healthy subjects. Biomed. Eng. Online 2014, 13, 88. [Google Scholar] [CrossRef] [PubMed]
  14. Pagliari, D.; Pinto, L. Calibration of Kinect for Xbox One and Comparison between the Two Generations of Microsoft Sensors. Sensors 2015, 15, 27569–27589. [Google Scholar] [CrossRef] [PubMed][Green Version]
  15. Wilson, J.; Khan-Perez, J.; Marley, D.; Buttress, S.; Walton, M.; Li, B.; Roy, B. Can shoulder range of movement be measured accurately using the Microsoft Kinect sensor plus Medical Interactive Recovery Assistant (MIRA) software? J. Shoulder Elb. Surg. 2017, 26, e382–e389. [Google Scholar] [CrossRef] [PubMed]
  16. Zulkarnain, R.; Kim, G.; Adikrishna, A.; Hong, H.; Kim, Y.; Jean, I. Digital data acquisition of shoulder range of motion and arm motion smoothness using Kinect v2. J. Shoulder Elb. Surg. 2017, 26, 895–901. [Google Scholar] [CrossRef] [PubMed]
  17. Otte, K.; Kayser, B.; Mansow-Model, S.; Verrel, J.; Paul, F.; Brandt, A.; Schmitz-Hübsch, T. Accuracy and Reliability of the Kinect Version 2 for Clinical Measurement of Motor Function. PLoS ONE 2016, 11, e0166532. [Google Scholar] [CrossRef] [PubMed]
  18. Müller, B.; Ilg, W.; Giese, M.; Ludolph, N. Validation of enhanced kinect sensor based motion capturing for gait assessment. PLoS ONE 2017, 12, e0175813. [Google Scholar] [CrossRef] [PubMed]
  19. MacWilliams, B.; Davis, R. Addressing some misperceptions of the joint coordinate system. J. Biomech. Eng. 2013, 135, 54506. [Google Scholar] [CrossRef] [PubMed]
  20. Atrsaei, A.; Salarieh, H.; Alasty, A. Human Arm Motion Tracking by Orientation-Based Fusion of Inertial Sensors and Kinect Using Unscented Kalman Filter. J. Biomech. Eng. 2016, 138, 091005. [Google Scholar] [CrossRef] [PubMed]
  21. Seo, N.; Fathi, M.; Hur, P.; Crocher, V. Modifying Kinect placement to improve upper limb joint angle measurement accuracy. J. Hand Ther. 2016, 29, 465–473. [Google Scholar] [CrossRef] [PubMed]
  22. Xu, X.; Robertson, M.; Cehn, K.; Lin, J.; McGorry, R. Using the Microsoft Kinect™ to assess 3-D shoulder kinematics during computer use. Appl. Ergon. 2017, 65, 418–423. [Google Scholar] [CrossRef] [PubMed]
  23. Liao, Y.; Sun, Y.; Li, G.; Kong, J.; Jiang, G.; Cai, H.; Ju, Z.; Yu, H.; Liu, H. Simultaneous Calibration: A Joint Optimization Approach for Multiple Kinect and External Cameras. Sensors 2017, 17, 1491. [Google Scholar] [CrossRef] [PubMed]
  24. Shen, W.; Deng, K.; Bai, X.; Leyvand, T.; Guo, B.; Tu, Z. Exemplar-based human action pose correction. IEEE Trans. Cybern. 2014, 44, 1053–1066. [Google Scholar] [CrossRef] [PubMed]
  25. Sholukha, V.; Bonnechère, B.; Salvia, P.; Moiseev, F.; Rooze, M.; Van Sint Jan, S. Model-based approach for human kinematics reconstruction from markerless and marker-based motion analysis systems. J. Biomech. 2013, 46, 2363–2371. [Google Scholar] [CrossRef] [PubMed]
  26. Brandao, A.; Dias, D.; Castellano, G.; Parizotto, N.; Trevelin, L. RehabGesture: An Alternative Tool for Measuring Human Movement. Telemed. J. E Health 2016, 22, 584–589. [Google Scholar] [CrossRef] [PubMed]
  27. Ding, W.; Zheng, Y.; Su, Y.; Li, X. Kinect-based virtual rehabilitation and evaluation system for upper limb disorders: A case study. J. Back Musculoskelet. Rehabil. 2018, 1–11. [Google Scholar] [CrossRef] [PubMed]
  28. Bonnechère, B.; Jansen, B.; Omelina, L.; Sholukha, V.; Van Sint Jan, S. Patients’ follow-up using biomechanical analysis of rehabilitation exercises. Int. J. Serious Games 2017, 4, 3–13. [Google Scholar] [CrossRef]
  29. Wu, G.; van der Helm, F.; Veeger, H.; Makhsous, M.; Roy, P.V.; Anglin, C.; Nagels, J.; Karduna, A.; McQuade, K.; Wang, X.; et al. ISB recommendation on definitions of joint coordinate systems of various joints for the reporting of human joint motion—Part II: Shoulder, elbow, wrist and hand. J. Biomech. 2005, 38, 981–992. [Google Scholar] [CrossRef] [PubMed]
  30. Duarte, M.; Freitas, S. Revision of posturography based on force plate for balance evaluation. Rev. Bras. Fisioter. 2010, 14, 183–192. [Google Scholar] [CrossRef] [PubMed]
  31. Bonnechère, B.; Jansen, B.; Omelina, L.; Sholukha, V.; Van Sint Jan, S. Validation of the Balance Board for Clinical Evaluation of Balance During Serious Gaming Rehabilitation Exercises. Int. J. Rehabil. Res. 2016, 22, 709–717. [Google Scholar] [CrossRef] [PubMed]
  32. Omelina, L.; Jansen, B.; Bonnechère, B.; Van Sint Jan, S.; Cornelis, J. Serious games for physical rehabilitation: Designing highly configurable and adaptable games. In Proceedings of the 9th International Conference on Disability, Virtual Reality & Associated Technologies (ICDVRAT), Laval, France, 10–12 September 2012. [Google Scholar]
  33. Bonnechère, B.; Jansen, B.; Omelina, L.; Van Sint Jan, S. The use of commercial video games in rehabilitation: A systematic review. Int. J. Rehabil. Res. 2016, 39, 277–290. [Google Scholar] [CrossRef] [PubMed]
  34. Liu, Z.; Huang, J.; Han, J.; Bu, S.; LV, J. Human Motion Tracking by Multiple RGBD Cameras. IEEE Trans. Circuits Syst. Video Technol. 2017, 27, 2014–2027. [Google Scholar] [CrossRef]
  35. Van Diest, M.; Stegenga, J.; Wörtche, H.; Postema, K.; Verkerke, G.; Lamoth, C. Suitability of Kinect for measuring whole body movement patterns during exergaming. J. Biomech. 2014, 47, 2925–2932. [Google Scholar] [CrossRef] [PubMed]
  36. Bonnechère, B.; Sholukha, V.; Omelina, L.; Vooren, M.V.; Jansen, B.; Van Sint Jan, S. Suitability of functional evaluation embedded in serious game rehabilitation exercises to assess motor development across lifespan. Gait Posture 2017, 57, 35–39. [Google Scholar] [CrossRef] [PubMed]
  37. Rosenbaum, D. Human Motor Control; Elsevier: New York, NY, USA, 2009. [Google Scholar]
  38. Teulier, C.; Lee, D.; Ulrich, B. Early gait development in human infants: Plasticity and clinical applications. Dev. Psychobiol. 2015, 57, 447–458. [Google Scholar] [CrossRef] [PubMed]
  39. Trewartha, K.M.; Garcia, A.; Wolpert, D.; Flanagan, J. Fast but fleeting: Adaptive motor learning processes associated with aging and cognitive decline. J. Neurosci. 2014, 34, 13411–13421. [Google Scholar] [CrossRef] [PubMed]
  40. Bherer, L.; Kramer, A.; Peterson, M.; Colcombe, S.; Erickson, K.; Becic, E. Transfer effects in task-set cost and dual-task cost after dual-task training in older and younger adults: Further evidence for cognitive plasticity in attentional control in late adulthood. Exp. Aging Res. 2008, 34, 188–219. [Google Scholar] [CrossRef] [PubMed]
  41. Morone, G.; Paolucci, S.; Iosa, M. In What Daily Activities Do Patients Achieve Independence after Stroke? J. Stroke Cerebrovasc. Dis. 2015, 24, 1931–1937. [Google Scholar] [CrossRef] [PubMed]
  42. Michel, E.; Molitor, S.; Schneider, W. Differential changes in the development of motor coordination and executive functions in children with motor coordination impairments. Child Neuropsychol. 2018, 24, 20–45. [Google Scholar] [CrossRef] [PubMed]
  43. Lipskaya-Velikovsky, L.; Zeilig, G.; Weingarden, H.; Rozental-Iluz, C.; Rand, D. Executive functioning and daily living of individuals with chronic stroke: Measurement and implications. Int. J. Rehabil. Res. 2018, 41, 122–127. [Google Scholar] [CrossRef] [PubMed]
  44. Maitra, K.; Philips, K.; Rice, M. Grasping naturally versus grasping with a reacher in people without disability: Motor control and muscle activation differences. Am. J. Occup. Ther. 2010, 64, 95–104. [Google Scholar] [CrossRef] [PubMed]
  45. Warren, J.; Mwanza, J.; Tanna, A.; Budenz, D. A Statistical Model to Analyze Clinician Expert Consensus on Glaucoma Progression using Spatially Correlated Visual Field Data. Transl. Vis. Sci. Technol. 2016, 5, 14. [Google Scholar] [CrossRef] [PubMed][Green Version]
  46. Yates, E.; Harvey, L.Y.H. Machine learning “red dot”: Open-source, cloud, deep convolutional neural networks in chest radiograph binary normality classification. Clin. Radiol. 2018. [Google Scholar] [CrossRef] [PubMed]
  47. Goldman, M. Education in Medicine: Moving the Boundaries to Foster Interdisciplinarity. Front. Med. 2016, 3, 15–16. [Google Scholar] [CrossRef] [PubMed]
  48. Lemon, T.; Smith, R. Consultation content not consultation length improves patient satisfaction. J. Fam. Med. Prim. Care 2014, 3, 333–339. [Google Scholar]
  49. Schlenstedt, C.; Arnold, M.; Mancini, M.; Deuschl, G.; Weisser, B. The effect of unilateral balance training on postural control of the contralateral limb. J. Sports Sci. 2017, 35, 2265–2271. [Google Scholar] [CrossRef] [PubMed]
  50. Buma, F.; van Kordelaar, J.; Raemaekers, M.; van Wegen, E.; Ramsey, N.; Kwakkel, G. Brain activation is related to smoothness of upper limb movements after stroke. Exp. Brain Res. 2016, 234, 2077–2089. [Google Scholar] [CrossRef] [PubMed][Green Version]
  51. Cappozzo, A.; Della Croce, U.; Leardini, A.; Chiari, L. Human movement analysis using stereophotogrammetry. Part 1: theoretical background. Gait Posture 2005, 21, 186–196. [Google Scholar] [PubMed]
Figure 1. Joint center estimation from the Kinect (red circle), reconstructed Plug-in Gait (PiG)-like data (34 transparent circles), and 19 local coordinate system origins (indicated by numbers).
Figure 2. Example of the visualization of results obtained from the rehabilitation game. Visualization is performed here using LHPFusionBox for a limited set of parameters (i.e., volumetric parameters for wrist and elbow by point trajectory triangulation). The reachable volume is clearly visible, but no direct quantification (i.e., score) is available.
Figure 3. Example of scoring visualization for Right and Left upper limbs from selected motion scoring for one trial of a stroke patient. The scoring was obtained from the 17 parameters defined above. Parameters are grouped by angular (Ang, in red), length (Len, in blue), and volumetric (Vol, in green) properties. Yellow contour corresponds to 100% (healthy group comparison). Parameter sign values are explained in Table 2. Scores for each group and total scores are depicted near the sector of the group and in the center, respectively.
Figure 4. The three calibration poses: (A) “T-pose”; (B) “Wide pose”; (C) “Upright pose”.
Figure 5. (A) Screenshot of the specially developed serious game used in the study; (B) Illustration of the motion required to control the game.
Table 1. Relative coordinate system topology for upper limb assessment. Origin segments (child local coordinate systems) are expressed relative to their parent local coordinate systems; 0 corresponds to the global coordinate system. Point numbers are presented in Figure 1.
| Motion | 1 | 4 | 5 | 14 | 15 | 16 | 17 | 18 | 19 | 20 | 21 | 22 | 23 | 24 | 25 | 26 | 27 |
|--------|---|---|---|----|----|----|----|----|----|----|----|----|----|----|----|----|----|
| Child  | 1 | 4 | 5 | 14 | 15 | 4  | 5  | 16 | 17 | 18 | 19 | 16 | 17 | 18 | 19 | 18 | 19 |
| Parent | 0 | 2 | 2 | 2  | 2  | 14 | 15 | 14 | 15 | 14 | 15 | 4  | 5  | 4  | 5  | 16 | 17 |
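The parent-child topology in Table 1 implies that each child pose, expressed in its parent LCS, is composed with the parent's pose to obtain the global pose (parent index 0). A minimal sketch of this composition step is shown below; the function and variable names are illustrative, not taken from the paper.

```python
import numpy as np

def to_global(parent_R, parent_t, child_R_local, child_t_local):
    """Compose a child LCS pose, expressed in its parent LCS, into the
    global frame: x_global = parent_R @ (child pose) + parent_t.
    Applied recursively down the Table 1 topology, this yields global
    poses for all segments."""
    R = parent_R @ child_R_local
    t = parent_R @ child_t_local + parent_t
    return R, t
```

Walking the tree from the root (index 0) and applying this at each edge converts all relative poses to global ones.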
Table 2. List of parameters evaluated for trajectory analysis.
| Parameter | Unit | Value | Equation |
|-----------|------|-------|----------|
| 1 | m | Total length of the trajectory | 4 |
| 2 | deg | Total angle of the trajectory (hodograph) | 10 |
| 3–5 | m/s | Mean, std, and max of the hodograph velocity | 5–7 |
| 6–8 | deg/s | Mean, std, and max of the hodograph angular velocity | 12 |
| 9 | deg/s | Mean hodograph angular velocity from parameter 3 and mean radius | From 5 |
| 10 | cm² | Square of the cross-sectional rhomboid, defined by the first and second axes | 29 |
| 11 | cm³ | Volume of two pyramids (diamond) constructed from the three axes' end points | 30 |
| 12, 13 | deg | Angles of view of the two main axes from the parent local coordinate system (LCS) origin | 27 |
| 14, 15 | mm | Size of the two main axes | 28 |
| 16–18 | mm | Position of the principal axes' origin in the parent LCS | |
| 19–30 | mm | Position of the end points of the first two axes in the parent LCS | 25, 26 |
| 31 | deg | Radius of the cloud fitted by a sphere | Point fitting by sphere |
| 32 | mm | Distance between the LCS origin and the sphere centre | Point fitting by sphere |
| 33 | mm | Mean residual of the sphere fitting | Point fitting by sphere |
| 34 | mm | Std residual of the sphere fitting | Point fitting by sphere |
| 35–37 | | Position in the LCS of the fitted sphere centre | Point fitting by sphere |
| 38, 39 | mm², mm³ | Triangulated surface area and conic volume (with vertex at the LCS origin) | Delaunay triangulation and convex hull functions |
| 40, 41 | sr | Solid angle (steradians) of the sphere and triangulated surface | Delaunay triangulation and convex hull functions |
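Several parameters in Table 2 (31–37) rely on fitting a sphere to the cloud of trajectory points. A standard algebraic least-squares sphere fit is sketched below; the authors' exact fitting routine is not specified in the text, so this is an illustrative implementation of the named technique.

```python
import numpy as np

def fit_sphere(points):
    """Algebraic least-squares sphere fit to a point cloud.

    Uses |x|^2 = 2 c.x + (r^2 - |c|^2), which is linear in the unknowns
    (center c, scalar k = r^2 - |c|^2). Returns center, radius, and the
    mean and std of the radial residuals (cf. parameters 33-34).
    """
    pts = np.asarray(points, dtype=float)
    A = np.column_stack([2.0 * pts, np.ones(len(pts))])
    b = np.sum(pts ** 2, axis=1)
    w, *_ = np.linalg.lstsq(A, b, rcond=None)
    center = w[:3]
    radius = np.sqrt(w[3] + center @ center)
    residuals = np.linalg.norm(pts - center, axis=1) - radius
    return center, radius, residuals.mean(), residuals.std()
```

The linear formulation avoids iterative optimization; for noisy clinical data it can also serve as the initial guess for a geometric (orthogonal-distance) refinement.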
Table 3. Comparison between the optoelectronic (marker-based system, MBS) and the Kinect (markerless system, MLS) systems. R is the Pearson coefficient of correlation, RCP is the reproducibility coefficient expressed in percent, and CV is the coefficient of variation.
| Joint | Variable | R | RCP (%) | CV |
|-------|----------|---|---------|----|
| Right Shoulder | Length (mm) | 0.71 * | 45 | 32 |
| | Angle (deg) | 0.56 * | 32 | 41 |
| | Velocity (m/s) | 0.96 * | 31 | 18 |
| | Angular velocity (deg/s) | 0.50 | 71 | 47 |
| | Volume (mm³) | 0.73 * | 65 | 45 |
| | Sphere (cm³) | 0.98 * | 63 | 40 |
| | Surface (mm²) | 0.83 * | 52 | 53 |
| Left Shoulder | Length (mm) | 0.72 * | 35 | 38 |
| | Angle (deg) | 0.58 * | 46 | 41 |
| | Velocity (m/s) | 0.94 * | 31 | 19 |
| | Angular velocity (deg/s) | 0.56 | 67 | 44 |
| | Volume (mm³) | 0.64 * | 54 | 55 |
| | Sphere (cm³) | 0.96 * | 55 | 38 |
| | Surface (mm²) | 0.98 * | 60 | 51 |
| Right Wrist | Length (mm) | 0.71 * | 35 | 38 |
| | Angle (deg) | 0.88 * | 21 | 26 |
| | Velocity (m/s) | 0.95 * | 33 | 16 |
| | Angular velocity (deg/s) | 0.51 | 58 | 75 |
| | Volume (mm³) | 0.79 * | 57 | 40 |
| | Sphere (cm³) | 0.97 * | 66 | 56 |
| | Surface (mm²) | 0.98 * | 53 | 48 |
| Left Wrist | Length (mm) | 0.68 * | 39 | 34 |
| | Angle (deg) | 0.92 * | 16 | 24 |
| | Velocity (m/s) | 0.89 * | 28 | 15 |
| | Angular velocity (deg/s) | 0.47 | 57 | 46 |
| | Volume (mm³) | 0.72 * | 41 | 49 |
| | Sphere (cm³) | 0.88 * | 55 | 45 |
| | Surface (mm²) | 0.95 * | 47 | 43 |
* Statistically significant correlation (p < 0.05).
Table 4. Mean (std) results of the studied variables for the three groups. p-Values are the results of the ANOVA.
| Joint | Variable | Adults | Elderly | Stroke | p-Value |
|-------|----------|--------|---------|--------|---------|
| Right Shoulder | Length (mm) | 3.81 × 10⁷ (3.7 × 10⁷) | 3.64 × 10⁷ (9.1 × 10⁶) | 5.71 × 10⁷ (1.89 × 10⁷) | 0.21 |
| | Angle (deg) | 2.95 × 10⁴ (1.3 × 10³) | 1.12 × 10⁴ (7.2 × 10³) α | 1.11 (5.2 × 10³) β | <0.001 |
| | Velocity (m/s) | 0.21 (0.09) | 0.12 (0.06) α | 0.10 (0.4) β | <0.001 |
| | Angular velocity (deg/s) | 315 (283) | 403 (775) | 329 (221) | 0.71 |
| | Volume (mm³) | 6.21 × 10¹¹ (2.1 × 10¹¹) | 8.12 × 10¹¹ (1.6 × 10¹¹) | 6.94 × 10¹¹ (1.53 × 10¹¹) | 0.12 |
| | Sphere (cm³) | 3.52 × 10¹¹ (2.1 × 10¹¹) | 7.68 × 10¹¹ (2.4 × 10¹¹) α | 4.85 × 10¹¹ (1.4 × 10¹¹) β,γ | 0.04 |
| | Surface (mm²) | 6.25 × 10¹¹ (2.5 × 10¹¹) | 2.31 × 10¹² (8.9 × 10¹¹) | 3.62 × 10¹¹ (3.1 × 10¹¹) | 0.07 |
| Left Shoulder | Length (mm) | 3.88 × 10⁷ (1.6 × 10⁶) | 2.96 × 10⁷ (3.4 × 10⁷) | 4.38 × 10⁷ (1.0 × 10⁸) | 0.64 |
| | Angle (deg) | 2.72 × 10⁴ (1.2 × 10³) | 1.23 × 10⁴ (7.2 × 10³) α | 1.2 × 10⁴ (5.1 × 10³) β | <0.001 |
| | Velocity (m/s) | 0.19 (0.06) | 0.13 (0.06) α | 0.10 (0.04) β,γ | <0.001 |
| | Angular velocity (deg/s) | 271 (251) | 344 (230) | 345 (317) | 0.61 |
| | Volume (mm³) | 6.13 × 10¹¹ (4.6 × 10¹¹) | 1.17 × 10¹² (1.1 × 10¹²) | 7.71 × 10¹¹ (1.3 × 10¹¹) | 0.13 |
| | Sphere (cm³) | 3.81 × 10¹¹ (8.4 × 10¹⁰) | 1.18 × 10¹² (2.3 × 10¹²) α | 4.18 × 10¹¹ (1.30 × 10¹¹) β,γ | 0.03 |
| | Surface (mm²) | 6.27 × 10¹¹ (1.4 × 10¹¹) | 9.8 × 10¹¹ (1.9 × 10¹¹) | 3.36 × 10¹¹ (5.8 × 10¹⁰) | 0.06 |
| Right Wrist | Length (mm) | 3.77 × 10⁷ (3.1 × 10⁷) | 5.58 × 10⁷ (5.2 × 10⁷) α | 5.9 × 10⁷ (7.5 × 10⁷) β | 0.04 |
| | Angle (deg) | 3.13 × 10⁷ (3.2 × 10⁷) | 3.89 × 10⁷ (8.43 × 10⁶) | 7.91 × 10⁷ (1.2 × 10⁷) β | 0.03 |
| | Velocity (m/s) | 0.23 (0.09) | 0.13 (0.07) α | 0.10 (0.04) β | <0.001 |
| | Angular velocity (deg/s) | 280 (226) | 351 (242) | 323 (311) | 0.58 |
| | Volume (mm³) | 7.01 × 10¹¹ (1.2 × 10¹¹) | 1.12 × 10¹² (1.5 × 10¹¹) α | 5.67 × 10¹¹ (8.44 × 10¹¹) | 0.04 |
| | Sphere (cm³) | 5.81 × 10¹¹ (7.1 × 10¹⁰) | 8.41 × 10¹¹ (9.0 × 10¹⁰) | 6.11 × 10¹¹ (5.5 × 10¹⁰) | 0.21 |
| | Surface (mm²) | 5.92 × 10¹¹ (1.2 × 10¹⁰) | 1.33 × 10¹² (2.8 × 10¹¹) α | 2.71 × 10¹¹ (6.4 × 10¹¹) β,γ | 0.03 |
| Left Wrist | Length (mm) | 3.69 × 10⁷ (3 × 10⁷) | 5.57 × 10⁷ (4.2 × 10⁷) α | 5.29 × 10⁷ (4.3 × 10⁷) β | 0.04 |
| | Angle (deg) | 3.11 × 10⁷ (3.3 × 10⁷) | 3.64 × 10⁷ (6.1 × 10⁷) α | 6.33 × 10⁷ (7.6 × 10⁷) β,γ | 0.03 |
| | Velocity (m/s) | 0.31 (0.14) | 0.12 (0.07) α | 0.11 (0.05) β | <0.001 |
| | Angular velocity (deg/s) | 281 (246) | 384 (314) | 294 (245) | 0.22 |
| | Volume (mm³) | 6.84 × 10¹¹ (1.1 × 10¹¹) | 1.52 × 10¹² (2.65 × 10¹¹) α | 5.91 × 10¹¹ (9.1 × 10¹¹) | 0.03 |
| | Sphere (cm³) | 5.89 × 10¹¹ (6.4 × 10¹¹) | 5.44 × 10¹¹ (6.1 × 10¹¹) | 4.12 × 10¹¹ (5.4 × 10¹¹) | 0.42 |
| | Surface (mm²) | 4.61 × 10¹¹ (9.8 × 10¹⁰) | 1.41 × 10¹² (2.6 × 10¹¹) α | 3.42 × 10¹¹ (2.6 × 10¹¹) β,γ | 0.02 |
α Statistically significant difference between Adults and Elderly after Bonferroni correction. β Statistically significant difference between Adults and Stroke after Bonferroni correction. γ Statistically significant difference between Elderly and Stroke after Bonferroni correction.

© 2018 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).