Article

Mathematical Analysis and Motion Capture System Utilization Method for Standardization Evaluation of Tracking Objectivity of 6-DOF Arm Structure for Rehabilitation Training Exercise Therapy Robot

1 Department of Biomedical Engineering, College of Health Science, Gachon University, 191 Hambak-Moero, Yeonsu-gu, Incheon 21936, Republic of Korea
2 Medical Devices R&D Center, Gachon University Gil Medical Center, 21, 774 Beon-gil, Namdong-daero, Namdong-gu, Incheon 21565, Republic of Korea
3 Department of Biomedical Engineering, College of Medicine, Gachon University, 38-13, 3 Beon-gil, Dokjom-ro 3, Namdong-gu, Incheon 21565, Republic of Korea
4 Department of Health Sciences and Technology, Gachon Advanced Institute for Health Sciences and Technology (GAIHST), Gachon University, 38-13, 3 Beon-gil, Dokjom-ro, Namdong-gu, Incheon 21565, Republic of Korea
* Author to whom correspondence should be addressed.
† These authors contributed equally to this work.
Diagnostics 2022, 12(12), 3179; https://doi.org/10.3390/diagnostics12123179
Submission received: 8 September 2022 / Revised: 12 December 2022 / Accepted: 12 December 2022 / Published: 15 December 2022
(This article belongs to the Special Issue The Use of Motion Analysis for Diagnostics)

Abstract

A treatment method for suppressing shoulder pain by reducing the secretion of neurotransmitters in the brain is being studied in compliance with domestic and international standards. A robot is being developed to assist physical therapists in shoulder rehabilitation exercise treatment. A rehabilitation therapy robot enables patients to perform rehabilitation exercises repeatedly. However, the biomechanical movement (or motion) of the shoulder joint must be modeled accurately for a shoulder rehabilitation robot to be used efficiently. Furthermore, treating patients safely by accurately evaluating biomechanical movements in compliance with domestic and international standards is a major task. Therefore, an in-depth analysis of shoulder movement is essential for understanding the mechanism of robot-assisted shoulder rehabilitation. This paper proposes a method for analyzing shoulder movements. The rotation angle and range of motion (ROM) of the shoulder joint are measured by attaching markers to the body and analyzing the inverse kinematics. The first motion is abduction and adduction, and the second is external and internal rotation. The position of each marker is transmitted to application software through infrared cameras. For the analysis using the inverse kinematics solution, five males and five females without any disability participated in the motion capture experiment, and abduction and adduction were repeated 10 times. As a result, the ROM of abduction and adduction was 148° in males and 138.7° in females, and the ROM of external and internal rotation was 111.2° in males and 106° in females. Because the motion capture system allows the center coordinates of each joint to be tracked reliably, the inverse kinematics can be calculated accurately. Additionally, the mathematical inverse kinematics equations will be utilized in a follow-up study to design an upper-limb rehabilitation robot. The proposed method is expected to contribute to the definition of domestic and international standards for rehabilitation robots and motion capture for objective evaluation.

1. Introduction

Motor nerves transmit signals from the brain to the muscles to induce movement of the shoulder and arm. In particular, when shoulder pain occurs, the muscle can be relaxed by physically stimulating it to relieve the pain. Therefore, research is being conducted in compliance with domestic and international standards (IEC 80601-2-78:2019 and SC43) to suppress shoulder pain by reducing neurotransmitter secretion in the brain. Shoulder pain is a common complication that can be caused by adhesive capsulitis and by hemiplegia induced by a stroke [1]. In particular, adhesive capsulitis causes shoulder pain due to thickening of the joint capsule and adhesion of tendons or ligaments [1]. Adhesive capsulitis also causes additional complications due to rotator cuff tears. Therefore, shoulder pain can be reduced through stretching and through passive and active joint exercise treatment [1].
Shoulder pain in hemiplegia and adhesive capsulitis requires nonsurgical treatment and shoulder rehabilitation (SR). Rehabilitation exercises have conventionally been delivered through manual therapy by physical therapists. However, owing to the development of biomedical engineering technology, research and development of medical robots for rehabilitation treatment continues through the convergence of physical therapy and engineering [2,3,4,5,6,7]. The advantage of a rehabilitation robot is that therapists can train patients such that they can repeatedly perform rehabilitation exercises [8,9]. The safety requirements of robots for rehabilitation exercise therapy are extremely important, as specified in the international standard IEC 80601-2-78:2019. A representative requirement of the international standardization of the safety of robots for rehabilitation exercise therapy is that when a hemiplegic or speech-impaired person is trained in a robot system to receive SR, communication between the therapist and the patient must be established [8]. However, it is difficult for a patient with such a disability to convey a clear message to the therapist, especially in an emergency, and these problems can lead to medical accidents. It is therefore necessary to establish domestic and international standardization of computer interfaces through which patients and therapists can communicate. Consequently, an intelligent rehabilitation treatment robot is needed that can deliver a message in an emergency and monitor the patient's condition.
In addition, an SR robot enables repetitive exercise training through an automated system, reduces the fatigue of therapists who must otherwise perform extensive manual work, and can guide SR exercise training more accurately [9,10]. However, it is important to accurately implement the biomechanical movement (or motion) of the shoulder joint to enhance the efficiency of a shoulder rehabilitation robot. Accurate movement of the SR robot can ensure patient safety and prevent accidents [9,10]. Therefore, an in-depth analysis of shoulder movement is essential for understanding the mechanism of SR robots. Various studies on the mechanism of shoulder movement have been conducted [11,12,13,14,15,16].
Wu et al., from the International Society of Biomechanics, proposed a shoulder model based on the definition of the shoulder joint coordinate system (JCS). In particular, the proposed method presented a standardization of the JCS for the shoulder, elbow, wrist, and hand [14], thereby contributing to smooth communication between researchers and clinicians regarding kinematics. However, this approach has the limitation that the standard position of the joint is not constant across repeated experiments [14]. Jackson et al. analyzed shoulder kinematics by attaching markers to the skin to fix the standard joint position. In particular, their method, which uses a chain model and a Kalman filter, reconstructs the shoulder kinematics by tracking the trajectories of the markers. Therefore, the burden of reconstructing a mathematical model for determining the range of motion (ROM) is reduced [15]. Zhang et al. proposed a kinematic model using a Vicon motion capture system and markers. In particular, they investigated the shoulder elevation and depression phases and the movement coupling relationship between the displacement of the glenohumeral (GH) joint center with respect to the thoracic coordinate system and the elevation of the humerus. As a result, a new design model for an upper extremity rehabilitation robot consistent with the actual structure of the human body was developed [16].
Similar to previous studies, this study proposes a method for analyzing shoulder movements. The rotation angle and ROM of the shoulder joint were measured by attaching a marker to the body and analyzing the inverse kinematics. In particular, a rigid body was designated through a marker to accurately determine the internal center point of the joint. For the experiment, subjects of this study (five males and five females) without any functional disability in the body participated in the motion capture test. Based on the information, which was obtained by tracking the position of the marker, the ROM of each joint was analyzed using inverse kinematics. Consequently, motion analysis using inverse kinematics will be applied to the mechanism of rehabilitation robots. In addition, ROM information of a normal subject can be used as a database for utilizing an SR robot for rehabilitation exercises.

2. Analysis of Motion Capture

In the process of using the robot system for rehabilitation-based training treatment, patients receiving treatment for shoulder pain disease with hemiplegia or speech impairment can communicate with the therapist using a computer, as shown in Figure 1a [8].
Quadriplegic, deaf, blind, and speech-impaired patients cannot express themselves accurately to therapists during exercise training programs for rehabilitation treatment [17]. Therefore, if an emergency occurs during training and treatment using treatment instruments, the therapist may not recognize the patient's condition and a medical accident can occur. A brain-computer interface (BCI) is a technology for interaction between the brain and a computer [18]. This technology refers to a control technology that enables a computer to grasp the thoughts intended by humans and move objects accordingly [19]. In other words, a BCI detects brain waves so that computers can perform cognition, learning, and reasoning similarly to the human brain [20]. Therefore, BCI technology is expected to be highly useful for quadriplegic, hearing-impaired, visually impaired, and speech-impaired patients who need rehabilitation exercise. BCI technology uses a camera to capture the movement of the patient and accurately reads an EEG from the patient. It then analyzes the data obtained from the camera and the EEG diagnosis to identify the patient's movement pattern. Therefore, it is possible to predict the treatment outcome by understanding the patient's requirements and condition.
It is desirable to use a robotic system for which such BCI standardization (SC43) has been established. The most important aspect of robot motion is matching the movement of the robot arm to the movement of the patient's shoulder. Therefore, an objective evaluation of how well the robot arm matches the patient's shoulder movement is important, and domestic or international standardization of this evaluation method is highly important [8]. In considering the movement of the robot arm and the patient's shoulder to establish such standardization, it is important to study the construction of a motion capture-based monitoring system for objective evaluation and a mathematical algorithm for verifying the objective evaluation. In this way, it is possible to provide safe rehabilitation robot therapy (IEC 80601-2-78:2019) to patients.
Figure 1b shows the setup environment for the motion capture experiment. The overall movement, such as position data of the arm, was tracked through motion capture, and the value of the end effector was obtained. In this study, the wrist was designated as an end effector and utilized as input data to interpret the inverse kinematics. Accordingly, the position and direction vectors of the wrist were tracked in real time through the motion capture system.
The subjects wore elastic suits so that the markers could be attached close to the skin. The markers were coated with a material that reflects infrared light, and the position data of the markers were transmitted to the application software (Motive) using infrared cameras (Flex13, OptiTrack). Consequently, the position and direction vectors of the markers were extracted in real time based on the absolute coordinate system in the software. In this study, the position data of the markers were analyzed by tracking two rehabilitation motions. The first motion is abduction and adduction, and the second is external and internal rotation.
Figure 2 shows the locations of the markers attached to the elastic suit. As shown in Figure 2a, markers were attached to the clavicle, shoulder, elbow, and wrist. The joints of the arm are located internally and contribute to the rotation of the bones. Therefore, the markers were attached so that their center positions coincided with the internal centers of the joints. While attaching the markers to designate the subjects' joint center points, accuracy was increased by placing the markers with the help of an on-site physical therapist.
Figure 2b shows the locations of the marker attachments and the central coordinates of the bone structure of the right arm. The sternoclavicular (SC) joint protrudes because the muscular membrane and skin covering the joint are thinner than in other areas of the body. Therefore, one marker was attached without calculating a central coordinate. Three markers were attached to the shoulder to designate the glenohumeral (GH) joint as the central coordinate. Two markers each were attached to the elbow and wrist, and the humeroulnar (HU) joint and distal radioulnar (DRU) joint were designated as their center coordinates.
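For illustration, the sketch below shows one simple way to estimate a joint center from a marker cluster, namely as the centroid of the markers surrounding it. This is only an assumption with hypothetical coordinates: the paper does not specify the exact joint-center computation, and in the experiment the marker placement itself was guided by a physical therapist.

```python
import numpy as np

# Hypothetical 3D positions (mm) of the three shoulder markers exported from
# the motion capture software; the numbers are illustrative only.
shoulder_markers = np.array([
    [120.0, 350.0, 1420.0],
    [135.0, 362.0, 1408.0],
    [128.0, 341.0, 1399.0],
])

# One simple estimate of the glenohumeral (GH) joint center: the centroid of
# the marker cluster (an assumption for illustration, not the authors' method).
gh_center = shoulder_markers.mean(axis=0)
print("Estimated GH joint center (mm):", gh_center)
```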

3. Mechanism and Mathematical Analysis

3.1. Forward Kinematics

Before interpreting an inverse kinematics solution, forward kinematics was analyzed and defined as a homogeneous transformation matrix [21]. Figure 3 shows the forward kinematics modeling of the right arm that is expressed based on the rotation joint.
Figure 3a shows the rotation joints contributing to the movement of the arm at each central joint position. In particular, points O, S, E, and EE (indicated by the blue dashed circles) are the center points of the joint coordinate systems and represent the center coordinates of joint rotation designated through motion capture. Point O (SC joint) comprises a two-axis rotation joint that involves the vertical and horizontal rotation of the clavicle; it is designated as the base point of the kinematic model. Point S (GH joint) is composed of a three-axis rotation joint that involves roll, pitch, and yaw rotation. Point E (HU and HR joints) is composed of a uniaxial rotation joint that involves flexion and extension of the arm. Finally, point EE (DRU joint) is designated as the end effector of the forward kinematics. In the following kinematic analysis, the central coordinates of the clavicle, shoulder, elbow, and wrist are expressed as points O, S, E, and EE, respectively.
Figure 3b shows the forward kinematics model of the shoulder with the moving coordinate systems. At each joint, the X_i, Y_i, and Z_i (i = 0 to 6) axes of the moving coordinate system are mapped to the joint angle θ_i. The link and rotation parameters of the forward kinematics are shown in Table 1 and were determined from the Denavit–Hartenberg formulation [22,23]. In particular, θ_i is the rotation of a joint and directly concerns the rehabilitation exercise; therefore, measuring θ_i and the ROM is an important goal of this study.
The link offsets and lengths (e.g., of the humerus or radius) differ between subjects. Therefore, the links are calculated with the distance formula between two points in three-dimensional space and substituted into the inverse kinematics as constant values. Equation (1) gives the length of a link (d_i or l_i) from the three-dimensional positions (X_n, Y_n, Z_n) (n = natural number) of its end points. A MATLAB tool was used to reflect the links, which change in real time, in the forward and inverse kinematics.
l_i = d_i = \sqrt{(X_i - X_{i-1})^2 + (Y_i - Y_{i-1})^2 + (Z_i - Z_{i-1})^2}   (1)
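As an illustration of Equation (1), the following minimal sketch computes subject-specific link lengths from joint-center positions. The coordinates and the NumPy-based helper are assumptions for demonstration, not values or code from the experiment.

```python
import numpy as np

def link_length(p_proximal: np.ndarray, p_distal: np.ndarray) -> float:
    """Euclidean distance between two joint-center positions (Equation (1))."""
    return float(np.linalg.norm(p_distal - p_proximal))

# Hypothetical joint-center positions (mm) for one captured frame.
p_shoulder = np.array([150.0, 360.0, 1400.0])   # point S (GH joint)
p_elbow    = np.array([160.0, 365.0, 1105.0])   # point E (HU joint)
p_wrist    = np.array([165.0, 370.0,  860.0])   # point EE (DRU joint)

d5 = link_length(p_shoulder, p_elbow)   # upper-arm link between S and E
l6 = link_length(p_elbow, p_wrist)      # forearm link between E and EE
print(f"d5 = {d5:.1f} mm, l6 = {l6:.1f} mm")
```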
T_0^1 = \begin{bmatrix} C_1 & 0 & S_1 & 0 \\ S_1 & 0 & -C_1 & 0 \\ 0 & 1 & 0 & 0 \\ 0 & 0 & 0 & 1 \end{bmatrix}, \quad T_1^2 = \begin{bmatrix} C_2 & 0 & S_2 & l_2 C_2 \\ S_2 & 0 & -C_2 & l_2 S_2 \\ 0 & 1 & 0 & 0 \\ 0 & 0 & 0 & 1 \end{bmatrix}, \quad T_2^3 = \begin{bmatrix} C_3 & 0 & S_3 & 0 \\ S_3 & 0 & -C_3 & 0 \\ 0 & 1 & 0 & 0 \\ 0 & 0 & 0 & 1 \end{bmatrix},
T_3^4 = \begin{bmatrix} C_4 & 0 & S_4 & 0 \\ S_4 & 0 & -C_4 & 0 \\ 0 & 1 & 0 & 0 \\ 0 & 0 & 0 & 1 \end{bmatrix}, \quad T_4^5 = \begin{bmatrix} C_5 & 0 & S_5 & 0 \\ S_5 & 0 & -C_5 & 0 \\ 0 & 1 & 0 & d_5 \\ 0 & 0 & 0 & 1 \end{bmatrix}, \quad T_5^6 = \begin{bmatrix} C_6 & -S_6 & 0 & l_6 C_6 \\ S_6 & C_6 & 0 & l_6 S_6 \\ 0 & 0 & 1 & 0 \\ 0 & 0 & 0 & 1 \end{bmatrix}   (2)
Based on the information in Table 1, a homogeneous transformation matrix of each rotation joint is shown in Equation (2). Among the components of the matrix, the 3 × 3 matrix (row: 1 to 3, column: 1 to 3) represents the rotation matrix, and the 3 × 1 matrix (row: 1 to 3, column: 4) represents the position vector.
T_0^6 = T_0^1 \, T_1^2 \, T_2^3 \, T_3^4 \, T_4^5 \, T_5^6 = \begin{bmatrix} R_{11} & R_{12} & R_{13} & P_x \\ R_{21} & R_{22} & R_{23} & P_y \\ R_{31} & R_{32} & R_{33} & P_z \\ 0 & 0 & 0 & 1 \end{bmatrix}   (3)
T_0^5 = T_0^1 \, T_1^2 \, T_2^3 \, T_3^4 \, T_4^5 = \begin{bmatrix} r_{11} & r_{12} & r_{13} & X_e \\ r_{21} & r_{22} & r_{23} & Y_e \\ r_{31} & r_{32} & r_{33} & Z_e \\ 0 & 0 & 0 & 1 \end{bmatrix}   (4)
Equation (3) represents the multiplication of the matrices from point O to point EE. The direction vectors are expressed as R_{ij} (i, j = 1 to 3) and the position vector as P_i (i = x, y, z). Equation (4) represents the multiplication of the matrices from point O to point E. Similarly, the direction vectors are expressed as r_{ij} (i, j = 1 to 3) and the position vector as I_e (I = X, Y, Z).
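To make the forward chain concrete, the sketch below builds the homogeneous transforms from the Table 1 parameters and multiplies them as in Equation (3). It uses the standard Denavit–Hartenberg form with the link-twist signs taken as +π/2, which is an assumption of this sketch, and hypothetical link lengths; it is not the authors' MATLAB implementation.

```python
import numpy as np

def dh_transform(theta: float, d: float, l: float, alpha: float) -> np.ndarray:
    """Standard Denavit-Hartenberg homogeneous transform for one joint."""
    ct, st = np.cos(theta), np.sin(theta)
    ca, sa = np.cos(alpha), np.sin(alpha)
    return np.array([
        [ct, -st * ca,  st * sa, l * ct],
        [st,  ct * ca, -ct * sa, l * st],
        [0.0,      sa,       ca,      d],
        [0.0,     0.0,      0.0,    1.0],
    ])

def forward_kinematics(thetas, l2: float, d5: float, l6: float) -> np.ndarray:
    """Chain T_0^6 = T_0^1 ... T_5^6 using the Table 1 parameters."""
    params = [
        (thetas[0], 0.0, 0.0, np.pi / 2),   # clavicle, horizontal rotation
        (thetas[1], 0.0, l2,  np.pi / 2),   # clavicle, vertical rotation
        (thetas[2], 0.0, 0.0, np.pi / 2),   # shoulder roll
        (thetas[3], 0.0, 0.0, np.pi / 2),   # shoulder pitch
        (thetas[4], d5,  0.0, np.pi / 2),   # shoulder yaw
        (thetas[5], 0.0, l6,  0.0),         # elbow flexion/extension
    ]
    T = np.eye(4)
    for theta, d, l, alpha in params:
        T = T @ dh_transform(theta, d, l, alpha)
    return T

# Example: zero pose with hypothetical link lengths (mm).
T06 = forward_kinematics([0.0] * 6, l2=150.0, d5=300.0, l6=250.0)
print(np.round(T06, 3))
```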

3.2. Inverse Kinematics

3.2.1. Position Vector Analysis

The end effector is defined as a homogeneous transformation matrix through motion capture. Subsequently, the position vector of the elbow is calculated utilizing the end effector data. Figure 4 shows the position and direction vector of each point. As shown in Equation (5), the position vector of point E ( X e , Y e , Z e ) is calculated through the x-axis direction vector of the end effector and link l 6 .
EE = \begin{bmatrix} P_x \\ P_y \\ P_z \end{bmatrix}, \quad E = EE - l_6 R \begin{bmatrix} 1 \\ 0 \\ 0 \end{bmatrix} = \begin{bmatrix} P_x - l_6 R_{11} \\ P_y - l_6 R_{21} \\ P_z - l_6 R_{31} \end{bmatrix}   (5)
In particular, R_{ij} (i, j = 1 to 3) represents the rotation matrix of the end effector. Therefore, the x-axis direction vector is obtained by multiplying the R matrix by the vector [1 0 0]^T, and this direction is scaled by the link length l_6. Consequently, the position vector of point E (X_e, Y_e, Z_e) is calculated by the subtraction shown in Equation (5).
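The step from the end-effector pose to the elbow position in Equation (5) can be written in a few lines. The pose matrix and the link length below are hypothetical placeholders, not measured values.

```python
import numpy as np

# Hypothetical end-effector pose (4x4) taken from one motion capture frame.
T06 = np.array([
    [1.0, 0.0, 0.0, 165.0],
    [0.0, 1.0, 0.0, 370.0],
    [0.0, 0.0, 1.0, 860.0],
    [0.0, 0.0, 0.0,   1.0],
])
l6 = 250.0  # forearm link length (mm), obtained per subject via Equation (1)

R, p_ee = T06[:3, :3], T06[:3, 3]
# Equation (5): step back along the end-effector x-axis by l6 to reach point E.
p_elbow = p_ee - l6 * (R @ np.array([1.0, 0.0, 0.0]))
print("Elbow position (mm):", p_elbow)
```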
\vec{ES} = \vec{OS} - \vec{OE} = (X_c - X_e,\; Y_c - Y_e,\; Z_c - Z_e)   (6)
\vec{R_z} = (R_{13},\; R_{23},\; R_{33})   (7)
\vec{ES} \cdot \vec{R_z} = |\vec{ES}|\,|\vec{R_z}| \cos\frac{\pi}{2} = 0   (8)
\vec{ES} \cdot \vec{EO} = |\vec{ES}|\,|\vec{EO}| \cos\theta_0 = (X_c - X_e,\; Y_c - Y_e,\; Z_c - Z_e) \cdot (-X_e,\; -Y_e,\; -Z_e)   (9)
R_{13} X_c + R_{23} Y_c + R_{33} Z_c = \alpha, \quad \alpha = R_{13} X_e + R_{23} Y_e + R_{33} Z_e   (10)
X_e X_c + Y_e Y_c + Z_e Z_c = \beta, \quad \beta = X_e^2 + Y_e^2 + Z_e^2 - |\vec{ES}|\,|\vec{EO}| \cos\theta_0   (11)
\cos\theta_0 = \frac{d_5^2 + (X_e^2 + Y_e^2 + Z_e^2) - l_2^2}{2\, d_5 \sqrt{X_e^2 + Y_e^2 + Z_e^2}}   (12)
In Equation (6), the vector ES is calculated by subtracting the vector OE from the vector OS. In Equation (7), the vector R_z is defined as the z-axis direction vector of the end effector. Equations (8) and (9) show the dot products of vector ES with vectors R_z and EO. As shown in Equation (8), vectors ES and R_z are always perpendicular, so their dot product is zero. Equation (9) expresses the dot product with EO in two equivalent forms. Equations (8) and (9) can then be rearranged into Equations (10) and (11). In particular, α and β are constants obtained from the position and direction vectors of points EE and E. Finally, cos θ_0 is the interior angle between ES and EO in the triangle OSE and is obtained from the law of cosines, as shown in Equation (12).
\Big(R_{23} - \frac{Y_e}{X_e} R_{13}\Big) Y_c + \Big(R_{33} - \frac{Z_e}{X_e} R_{13}\Big) Z_c = \alpha - \frac{R_{13}}{X_e}\beta \;\;\Rightarrow\;\; p_1 Y_c + q_1 Z_c = r_1   (13)
\Big(R_{13} - \frac{X_e}{Y_e} R_{23}\Big) X_c + \Big(R_{33} - \frac{Z_e}{Y_e} R_{23}\Big) Z_c = \alpha - \frac{R_{23}}{Y_e}\beta \;\;\Rightarrow\;\; p_2 X_c + q_2 Z_c = r_2   (14)
Equations (10) and (11) are combined as simultaneous equations and rearranged into Equations (13) and (14). The coefficients of X_c, Y_c, and Z_c and the right-hand sides are constants determined by Equations (5)–(12). Therefore, p_1, q_1, and r_1 denote the coefficients and right-hand side of Equation (13), and p_2, q_2, and r_2 those of Equation (14).
X_c^2 + Y_c^2 + Z_c^2 = l_2^2   (15)
\Big(\frac{q_1^2}{p_1^2} + \frac{q_2^2}{p_2^2} + 1\Big) Z_c^2 - 2\Big(\frac{q_1 r_1}{p_1^2} + \frac{q_2 r_2}{p_2^2}\Big) Z_c + \Big(\frac{r_1^2}{p_1^2} + \frac{r_2^2}{p_2^2}\Big) = l_2^2, \quad (Z_c > 0)   (16)
Equation (15) is the equation of a sphere centered at point O. The distance between points S and O is the radius of the sphere and is equal to the link length l_2. Therefore, by substituting Equations (13) and (14) into Equation (15), Equation (16) is obtained as a quadratic equation in Z_c.
Figure 5 shows the geometric relationship between Equations (10), (11), (15), and (16) in three-dimensional coordinate space. The quadratic equation in Z_c can be interpreted geometrically: Equations (10) and (11) each represent a three-dimensional plane, the two planes intersect in a line, and this line passes through the sphere, yielding two intersection points. Consequently, these two intersection points are the candidate solutions of Equation (16) for the z-axis position of point S.
Two solutions are obtained in Equation (16). According to the joint structure of the upper limb, one solution is selected by considering the normal biomechanical movement. Figure 6 shows the biomechanical relationship between the shoulder and the acromioclavicular joint. In Figure 6a, the head of the humerus is covered by the glenohumeral joint and the subacromial bursa. The head of the humerus relaxes or contracts through the supraspinatus and becomes the axis of shoulder rotation. Simultaneously, with the rotation of the shoulder, the clavicle rotates through the sternal end that becomes the axis of rotation. Therefore, the rotary direction of the shoulder and clavicle are the same, as shown in the normal state in Figure 6b. In contrast, the rotation of the shoulder and clavicle are in opposite directions in the abnormal state shown in Figure 6b. Therefore, the movement of the shoulder has the potential to create friction between the humeral head and the acromion.
The two solutions of Equation (16) are candidate values of Z_C, the z-coordinate of point S. According to the biomechanical analysis, two conditions can be added to the calculation of Equation (16). Assume the two Z_C values are expressed as Z_C1 and Z_C2 with Z_C2 > Z_C1. If Z_C1 is selected as the solution, the center coordinate of the shoulder is always located below the horizontal line; the clavicle then has a downward oblique angle, which is the abnormal state shown in Figure 6b. In contrast, if Z_C2 is selected, point S is located above the horizontal line, so the clavicle maintains an upward oblique angle, which is the normal state shown in Figure 6a. As a result, selecting Z_C2 is ensured by adding the condition Z_C2 > 0 > Z_C1.
Based on Equations (13) and (14), the position values X_C and Y_C are calculated using the selected Z_C. The head of the humerus is attached to the acromion and fixed by the pectoralis major, supraspinatus, and infraspinatus. Therefore, when determining X_C, the condition X_C > 0 is ensured with respect to point O (sternoclavicular joint). As a result, the conditions Z_C2 > 0 > Z_C1 and X_C > 0 can be added.
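A minimal sketch of this root-selection step is shown below: the quadratic of Equation (16) is solved numerically and only a root satisfying Z_C > 0 is kept. The plane and sphere coefficients are hypothetical placeholders chosen only so that the example runs.

```python
import numpy as np

def solve_zc(p1, q1, r1, p2, q2, r2, l2):
    """Solve the quadratic in Z_C from Equation (16) and keep a root that
    satisfies the biomechanical condition Z_C > 0 (X_C > 0 is checked later)."""
    A = (q1 ** 2) / (p1 ** 2) + (q2 ** 2) / (p2 ** 2) + 1.0
    B = -2.0 * ((q1 * r1) / (p1 ** 2) + (q2 * r2) / (p2 ** 2))
    C = (r1 ** 2) / (p1 ** 2) + (r2 ** 2) / (p2 ** 2) - l2 ** 2
    roots = np.roots([A, B, C])                  # two candidate Z_C values
    real_roots = roots[np.isreal(roots)].real
    candidates = real_roots[real_roots > 0]
    if candidates.size == 0:
        raise ValueError("No root satisfies Z_C > 0; check the input pose.")
    return float(candidates.max())

# Hypothetical coefficients (from Equations (13) and (14)) for illustration only.
zc = solve_zc(p1=0.8, q1=0.3, r1=40.0, p2=0.7, q2=0.4, r2=35.0, l2=150.0)
print(f"Selected Z_C = {zc:.1f} mm")
```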
Figure 7 shows the simulation results (RoboAnalyzer) when the conditions Z_C > 0 and X_C > 0 are violated. The position and direction vectors of the end effector are input, and the angles of the rotation joints are calculated. In abduction, negative Z_C and X_C values cause shoulder dislocation, as shown in Figure 7A. Similarly, if Z_C is negative during external rotation, shoulder dislocation occurs, as shown in Figure 7B. In summary, the position vectors of points EE, E, and S are calculated by adding the appropriate conditions, and the angles of the rotation joints are then obtained from these position vectors.

3.2.2. Joint Angle Analysis

The joint rotation angles are analyzed to calculate the ROM of each rehabilitation motion. In particular, the inverse kinematics solution of the 6-degree-of-freedom (DOF) structure is obtained by first solving for the position vectors of points E and S [19]. This study used the Mathematica tool (Wolfram Research) to solve the complex trigonometric equations. In this section, cos θ_n and sin θ_n are abbreviated as C_n and S_n (n = positive integer).
Equations (17) and (18) show the calculation process for the joint angle θ_1. In Equation (17), X_C and Y_C are the position coordinates of point S. Because point S lies on a sphere of radius l_2 centered at point O, X_C and Y_C can be expressed in terms of l_2, C_1, C_2, and S_1. Therefore, θ_1 is calculated by dividing the two position coordinates; atan2 is used so that the sign (quadrant) of the angle is preserved.
X_C = l_2 C_2 C_1, \quad Y_C = l_2 C_2 S_1   (17)
\theta_1 = \operatorname{atan2}(Y_C,\; X_C)   (18)
Equations (19)–(22) show the calculation process for the joint angle θ_2. Because the left and right sides of Equation (19) are the same homogeneous transformation matrix, corresponding elements of the two sides are equal. Therefore, Equations (20) and (21) are derived by comparing the (row 1, column 4) and (row 2, column 4) elements of the homogeneous transformation matrices. As a result, θ_2 is calculated by dividing l_2 S_2 by l_2 C_2 using atan2, as in Equation (22). The remaining joint angles are solved in the same way, by comparing corresponding elements on both sides of a homogeneous transformation matrix equation.
(T_0^1)^{-1} \cdot T_0^2 = T_1^2   (19)
C_1 X_C + S_1 Y_C = l_2 C_2   (20)
Z_C = l_2 S_2   (21)
\theta_2 = \operatorname{atan2}(Z_C,\; C_1 X_C + S_1 Y_C)   (22)
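As a small illustration of Equations (17)–(22), the snippet below recovers the clavicle angles θ_1 and θ_2 from a hypothetical shoulder-center position using atan2. The coordinates are placeholders, not measured data.

```python
import numpy as np

# Hypothetical position of point S (shoulder center) relative to point O, in mm.
xc, yc, zc = 120.0, 60.0, 55.0
l2 = float(np.sqrt(xc**2 + yc**2 + zc**2))   # clavicle link length, Equation (15)

theta1 = np.arctan2(yc, xc)                                         # Equation (18)
theta2 = np.arctan2(zc, np.cos(theta1) * xc + np.sin(theta1) * yc)  # Equation (22)

print(f"theta1 = {np.degrees(theta1):.1f} deg, theta2 = {np.degrees(theta2):.1f} deg")
```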
In Equation (23), the (row 1, column 4) and (row 2, column 4) element values of both sides are compared. Equations (24) and (25) express the left-hand elements and are abbreviated as a and b. Subsequently, a and b are multiplied by S_3 and C_3, respectively, to derive Equation (26). Similarly, the (row 1, column 1) and (row 2, column 1) element values of both sides are compared; Equations (27) and (28) express the left-hand elements and are abbreviated as c and d. After multiplying c and d by S_3 and C_3, respectively, Equation (29) is obtained. As a result, Equations (26) and (29) are obtained by comparing both sides element by element, and their ratio is used to derive θ_3 in Equation (30).
(T_0^2)^{-1} \cdot T_0^6 = T_2^6   (23)
a = C_1 C_2 P_x + C_2 S_1 P_y - S_2 P_z - l_2   (24)
b = S_1 P_x + C_1 P_y   (25)
l_6 C_6 S_5 = a S_3 + b C_3   (26)
c = C_1 C_2 R_{11} + C_2 S_1 R_{21} - S_2 R_{31}   (27)
d = S_1 R_{11} + C_1 R_{21}   (28)
C_6 S_5 = c S_3 + d C_3   (29)
\theta_3 = \operatorname{atan2}(b - l_6 d,\; a - l_6 c)   (30)
In Equation (31), the (row 1, column 3) and (row 2, column 3) element values of both sides are compared. The left- and right-hand elements are abbreviated as P and Q, as shown in Equations (32) and (33). As a result, θ_4 is calculated by dividing Q by P, as in Equation (34).
(T_0^3)^{-1} \cdot T_0^6 = T_3^6   (31)
P = (C_1 C_2 C_3 - S_1 C_3) R_{13} + (C_2 C_3 S_1 + C_1 S_3) R_{23} - C_3 S_2 R_{33} = C_4 S_5   (32)
Q = C_2 S_2 R_{13} - S_1 S_2 R_{23} - C_2 R_{33} = S_4 S_5   (33)
\theta_4 = \operatorname{atan2}(Q,\; P)   (34)
In the left-hand matrix of Equation (35), the (row 1, column 1), (row 1, column 2), and (row 1, column 3) elements are denoted by α, β, and γ, respectively, and the (row 2, column 1), (row 2, column 2), and (row 2, column 3) elements by a, b, and c, respectively. As a result, Equations (36) and (37) are divided to calculate θ_5 in Equation (38).
(T_0^4)^{-1} \cdot T_0^6 = T_4^6   (35)
\alpha R_{11} + \beta R_{21} + \gamma R_{31} = S_5 C_6   (36)
a R_{11} + b R_{21} + c R_{31} = C_5 C_6   (37)
\theta_5 = \operatorname{atan2}(\alpha R_{11} + \beta R_{21} + \gamma R_{31},\; a R_{11} + b R_{21} + c R_{31})   (38)
Finally, θ_6 is obtained by comparing the (row 1, column 1) and (row 2, column 1) element values on the left and right sides of Equation (39). In the matrix (T_0^5)^{-1}, the (row 1, column 1), (row 1, column 2), and (row 1, column 3) elements are denoted by U_1, U_2, and U_3, respectively, and the (row 2, column 1), (row 2, column 2), and (row 2, column 3) elements by V_1, V_2, and V_3, respectively. Consequently, Equations (40) and (41) are divided to calculate θ_6 in Equation (42).
(T_0^5)^{-1} \cdot T_0^6 = T_5^6   (39)
U_1 R_{11} + U_2 R_{21} + U_3 R_{31} = C_6   (40)
V_1 R_{11} + V_2 R_{21} + V_3 R_{31} = S_6   (41)
\theta_6 = \operatorname{atan2}(V_1 R_{11} + V_2 R_{21} + V_3 R_{31},\; U_1 R_{11} + U_2 R_{21} + U_3 R_{31})   (42)
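Because the closed-form expressions above are lengthy, a numerical cross-check can be useful. The sketch below recovers all six joint angles from a target end-effector pose by least-squares fitting against the Denavit–Hartenberg forward chain. It is an alternative verification under the sign convention assumed earlier, with hypothetical link lengths; it is not the closed-form method of Equations (17)–(42).

```python
import numpy as np
from scipy.optimize import least_squares

def dh(theta, d, l, alpha):
    ct, st, ca, sa = np.cos(theta), np.sin(theta), np.cos(alpha), np.sin(alpha)
    return np.array([[ct, -st * ca,  st * sa, l * ct],
                     [st,  ct * ca, -ct * sa, l * st],
                     [0.0,      sa,       ca,      d],
                     [0.0,     0.0,      0.0,    1.0]])

def fk(thetas, l2=150.0, d5=300.0, l6=250.0):
    """Forward chain T_0^6 built from the Table 1 parameters (signs assumed)."""
    params = [(thetas[0], 0.0, 0.0, np.pi / 2), (thetas[1], 0.0, l2, np.pi / 2),
              (thetas[2], 0.0, 0.0, np.pi / 2), (thetas[3], 0.0, 0.0, np.pi / 2),
              (thetas[4], d5, 0.0, np.pi / 2), (thetas[5], 0.0, l6, 0.0)]
    T = np.eye(4)
    for p in params:
        T = T @ dh(*p)
    return T

# Target pose generated from known angles, then recovered numerically.
true_thetas = np.radians([20.0, 15.0, 30.0, 60.0, 25.0, 10.0])
T_target = fk(true_thetas)

def residual(thetas):
    # Position and rotation error between the candidate pose and the target.
    return (fk(thetas) - T_target)[:3, :].ravel()

sol = least_squares(residual, x0=true_thetas + 0.1)  # start near the answer
print("Recovered angles (deg):", np.round(np.degrees(sol.x), 1))
```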

4. Experiment Results and Discussion

4.1. Abduction and Adduction

Prior to the analysis, five male and five female subjects were randomly recruited and participated in the motion capture experiment. The subjects did not have any disability. The abduction and adduction motion was repeated 10 times. Figure 8 shows the joint rotation angles, ROM, and simulation results for abduction and adduction.
Figure 8a shows the joint rotation pattern of a subject who performed the abduction and adduction. While each subject performed the exercise 10 times, a similar pattern of the joint angles from θ_1 to θ_6 appeared. In particular, the shoulder joint (θ_4) showed the largest variation. Simultaneously, the clavicle joint (θ_2) rotates in the same direction as θ_4. All subjects had different ROMs, and the quantitative ROM information is listed in Table 2.
Table 2 shows the ROM of males (M) and females (F) in abduction and adduction. The average ROM for the horizontal angle of the clavicle (θ_1) was 28.9° and 18.3° for males and females, respectively, and the ROM for the vertical angle of the clavicle (θ_2) was 17.6° and 11.5°, respectively. Therefore, the average ROM of both θ_1 and θ_2 was higher for males than for females. The roll (θ_3), pitch (θ_4), and yaw (θ_5) of the shoulder joint contribute to the shoulder rotation. The average ROM of roll (θ_3) was 46.1° and 31.9° for males and females, respectively, and yaw (θ_5) was 69.3° and 44.8°, respectively, indicating that the ROM of males was higher than that of females. The ROM of pitch (θ_4) was 130.4° and 127.2°, respectively. Therefore, the θ_3, θ_4, and θ_5 values for males were higher than those for females. The elbow joint (θ_6) was 20.7° and 26.2° for males and females, respectively. As a result, males had a higher average ROM in the clavicle and shoulder than females, whereas females had a higher average ROM in the elbow. The standard deviation (σ) was also calculated for the ROM of each rotation angle over the 10 subjects. The standard deviations of θ_2 and θ_4, which contribute significantly to abduction and adduction, were 4.8 and 10.0, respectively.
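For readers who wish to reproduce this kind of summary, the sketch below computes a per-trial ROM from a joint-angle time series and then the group mean, SD, and SEM. The time series is synthetic, while the subject ROM values are the male θ_4 entries of Table 2; the use of the population form of the standard deviation is an assumption that appears consistent with the tabulated values.

```python
import numpy as np

# Synthetic theta_4 time series (degrees) over ten abduction/adduction cycles;
# real angle traces would come from the inverse kinematics pipeline.
rng = np.random.default_rng(0)
t = np.linspace(0.0, 10.0, 2000)
theta4 = 65.0 - 65.0 * np.cos(2 * np.pi * t) + rng.normal(0.0, 1.0, t.size)

rom_theta4 = theta4.max() - theta4.min()   # range of motion for this trial
print(f"theta4 ROM: {rom_theta4:.1f} deg")

# Group statistics over subjects (male theta_4 ROM values taken from Table 2).
subject_roms = np.array([127.3, 133.7, 130.4, 134.3, 126.4])
mean_rom = subject_roms.mean()
sd = subject_roms.std(ddof=0)              # population SD (assumed convention)
sem = sd / np.sqrt(subject_roms.size)      # standard error of the mean
print(f"mean = {mean_rom:.1f}, SD = {sd:.1f}, SEM = {sem:.1f}")
```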
Figure 8b graphically shows the average ROM of the 10 subjects obtained from the analysis in Table 2, rounded to the nearest whole degree. The angles θ_1 and θ_2, which contribute to the movement of the clavicle, were 24° and 15°, respectively. Moreover, the angles θ_3, θ_4, and θ_5, which are involved in shoulder movement, were 39°, 129°, and 57°, respectively. The elbow movement (θ_6) was 23° in abduction and adduction. Figure 8c shows the simulation of abduction and adduction based on the average ROM. The standing posture was set as the initial position, which is reflected in the initial angle values of the parameters in Figure 8c. Consequently, the robot simulation shows the trajectory of the six-axis joints from the initial point to the end point of each rotation angle.
Figure 8d shows the joint-centered trajectory graph of the 6-axis arm structure based on the authors' motion capture experiment. In the figure, the clavicle maintains a relatively constant position, whereas the shoulder, elbow, and wrist show repetitive movements. Based on the figure, one difference between the simulation and the human movement in Figure 8c,d can be observed. The robot simulation applies a fixed angle, so the repetitive ROM is formed along a single caption line. However, in the human movement based on the motion capture data, the ROM was obtained through repetitive motion along various caption lines, as shown in Figure 8d. The caption line of abduction and adduction changes within 45 degrees while the movement maintains a constant ROM.

4.2. External and Internal Rotation

Figure 9 shows the joint rotation angle, ROM, and simulation results from external rotation and internal rotation.
Figure 9a shows the joint rotation pattern of a subject who performed the external and internal rotation. As with abduction and adduction, each subject performed the exercise 10 times, and a similar pattern of the joint angles from θ_1 to θ_6 appeared. In particular, the shoulder joint (θ_5) showed the largest variation. Furthermore, all joints except the elbow joint (θ_6) showed relatively small movement. As for abduction and adduction, Table 3 summarizes the ROM of the ten subjects in the external and internal rotation experiment.
The average ROM for the horizontal angle of the clavicle (θ_1) was 4.9° and 2.9° for males and females, respectively, and the ROM for the vertical rotation (θ_2) was 3.2° and 3.4°, respectively. Therefore, the θ_1 value for males was higher than that for females, whereas the θ_2 value for females was slightly higher than that for males. The average ROM of roll (θ_3) was 8.0° and 7.9° for males and females, respectively, and pitch (θ_4) was 8.5° and 7.5°, respectively, indicating that both ROMs for males were slightly higher than for females. In particular, the average ROM of yaw (θ_5) was 111.1° and 106.0°, respectively; therefore, the θ_5 value for males was considerably higher than that for females. The ROM of the elbow (θ_6) was almost the same in males and females at 23.4° and 23.6°, respectively. The standard deviation of the ROM (σ) was also calculated; the standard deviation of θ_5, which contributes significantly to the external and internal rotation, was 18.2.
Figure 9b graphically shows the average ROM of the 10 subjects obtained from the analysis in Table 3, rounded to the nearest whole degree. The angles θ_1 and θ_2, which contribute to the movement of the clavicle, were 4° and 3°, respectively. Moreover, the angles θ_3, θ_4, and θ_5, which are involved in shoulder movement, were 8°, 8°, and 109°, respectively. The elbow movement (θ_6) was 24° in external and internal rotation. Figure 9c shows the simulation of the external and internal rotation based on the average ROM. The standing posture was set as the initial position, which is reflected in the initial angle values of the parameters in Figure 9c. Consequently, the robot simulation shows the trajectory of the six-axis joints from the initial point to the end point of each rotation angle. Figure 9d shows the joint-centered trajectory graph of the 6-axis arm structure. In the figure, the clavicle and shoulder maintain relatively constant positions, whereas the wrist shows repetitive movement about the axis of the radius. As a result, both the simulation and the subject maintain a constant caption line while repeating the movement.

4.3. Discussion

This study measured the joint angles and ROM of the 6-DOF structure for two SR motions. The joint angle (θ_4) most significantly involved in abduction and adduction was 130.4° for males and 127.2° for females. The ROM of abduction and adduction, calculated by adding θ_2 and θ_4, was therefore 148° for males and 138.7° for females. The joint angle (θ_5) most significantly involved in external and internal rotation was 111.10° for males and 105.96° for females. The ROM of external and internal rotation, given by θ_5, was therefore 111.1° for males and 106° for females. In conclusion, the average ROM of the ten subjects was 143.4° for abduction/adduction and 108.6° for external/internal rotation. In abduction and adduction, males showed a considerably higher ROM than females, whereas the elbow angle (θ_6) of females was higher than that of males. Therefore, it is judged that females use the elbow more than males when tracking the motion trajectory in abduction and adduction.
Unlike external and internal rotation, abduction and adduction involves θ_2 and θ_4 as the angles centrally responsible for the ROM of the shoulder. Besides θ_2 and θ_4, the rotation angles θ_3, θ_5, and θ_6 also stand out. θ_3 represents the left and right rotation of the shoulder; we think this is likely because the shoulder rotation works together with the scapula during the rotation process. θ_5 represents the rotation of the radius or ulna: in the initial posture, the direction vector of the palm faces the center of the body, but as the ROM increases, it rotates outward and moves away from the center of the body. θ_6 represents extension and flexion of the elbow. In the abduction and adduction exercise captured in this study, the exercise standard was 10 circular movements over a 180-degree range of motion. Therefore, when the wrist moves along a half-moon-shaped orbit and the ROM of the shoulder is limited, the rotation is judged to follow the half-moon-shaped trajectory by flexion of the elbow.
Ropars et al. classified shoulder hyperlaxity using a motion capture system and a physical therapy goniometer. In the process of measuring standard data for the general population, the average ROM of shoulder abduction and adduction was 129.9° ± 7.4°, and the average external and internal rotational ROM of the shoulder was 94.3° ± 14.1° [24]. To analyze the scapulohumeral rhythm, Bagg and Forrest measured the movement of the scapula and humerus in abduction and adduction; the average ROM was 104.3° and the maximum movable range was 111.8° [25]. Barnes et al. analyzed the ROM of shoulder movement using linear regression analysis, studying age, gender, and dominance as comparative factors. Abduction and adduction was 180.1° ± 18.2° in males and 187.6° ± 16.1° in females, and external and internal rotation was 101.2° ± 11.6° in males and 104.9° ± 12.0° in females [26]. Rigoni et al. validated an IMU for measuring shoulder range of motion in healthy adults. Each movement was assessed with a goniometer and with the IMU by two testers independently, and agreement was assessed with intra-class correlation coefficients (ICC) and Bland-Altman 95% limits of agreement (LOA). The ROM of abduction and adduction was measured as 151.4° and 152.2°, respectively, and internal and external rotation as 141.1° and 142.3°, with intra-class correlation (>0.90) [27].
The last point to consider is the accuracy of the motion capture (OptiTrack) system. For motion capture, the accuracy of the sensors' response when an object moves is very important. Therefore, the accuracy of the proposed method can be assessed by comparison with [28,29,30,31,32,33]. The method of this study and the methods of [28,29,30,31,32,33] observe different objects and use different numbers of sensors. However, because all of them use motion capture, an average accuracy value for the motion capture function can be presented. The average accuracy values are recorded in Table 4, and it can be seen that the accuracy of this study improved by more than 10% compared with [28,29,30,31,32,33].

5. Conclusions

Based on the results of this study, the kinematics solution for the ROM of the 6-DOF structure could be determined through the standardized motions of the SR exercise, starting with the clavicle as the base point. In particular, based on the end-effector information, we first tried to solve the homogeneous transformation matrices of Equations (2) and (3) directly. However, because the constant values of the parameters (the shoulder position) could not be solved, we changed the direction of the study to obtain the shoulder position first and then calculate the inverse kinematics formula. As a result, our approach differs from the commonly known 6-DOF inverse kinematics solution that combines the 3-DOF of the wrist with the remaining 3-DOF joint angles. Based on the solved 6-DOF inverse kinematics, future research and development of a 6-axis rehabilitation robot will be conducted. In a follow-up study, we will consider collecting the end-effector data through force and torque sensors instead of a motion capture system. If the end-effector data are collected, the rehabilitation robot can follow the trajectory of the patient's motion through the kinematics solution. At the same time, the robot can measure the maximum ROM of the patient, and it is envisaged that the patient will be able to perform stretching or passive or active-assisted exercises within the designated ROM.
Upper limb joints lie structurally deeper than the skin, muscle, and cartilage tissue. Existing research methods have limitations in objective evaluation because they do not select and analyze the central coordinates of the joints. However, because the inverse kinematics can be calculated automatically by determining the joint centers through motion capture, we consider it extremely advantageous to interpret the center coordinates of the joints appropriately, and the study results are correspondingly more accurate. Additionally, in order to reduce the standard deviation of the ROM and increase the accuracy of the experimental data, additional experiments should be conducted with an increased subject sample size.
If the cause of the differences in ROM for the same rehabilitation motion is identified, it is expected to make a great contribution to the analysis of rehabilitation exercise and human body mechanics. We consider two reasons why deviations occur even after repeating the same motion 10 times and averaging the ROM. The first is the degree of flexibility of the ROM and the muscle mass, which depend on each subject's body shape. The second is judged to be a relative error in the center position according to the attachment position of the markers, even for nominally identical motions.
The movement of the SR robot must be the same as the human rehabilitation motion. Therefore, the proposed mathematical analysis method is readily applicable as an analysis method for the objective evaluation of the movement of the rehabilitation robot. In conclusion, this study shows that a person will be able to exercise efficiently by wearing a rehabilitation robot based on the suggested kinematics model. Additionally, this study facilitated the determination of the ROM of the rehabilitation robot by considering the ROM of normal subjects. Using the proposed model, it is possible to increase the accuracy of the trajectory of the rehabilitation robot and contribute to improved safety. Comprehensively, utilizing the mathematical inverse kinematics equations derived in this study, we will fabricate an upper-limb rehabilitation robot by designing its mechanical structure and motors in a follow-up study. In addition, because the rehabilitation exercise training-guided robot is linked to brain-related diseases, it contributes to the definition of domestic and international standardization of rehabilitation robots, affording universal training methods, accurate results, and objective evaluation for the safe treatment of patients.

Author Contributions

Design and experiment: J.S., analysis and fabrication: K.Y., and guidance and supervisor: K.G.K. All authors have read and agreed to the published version of the manuscript.

Funding

This research was supported by the GRRC program of Gyeonggi province (No. GRRC-Gachon2020 (B01)) and supported by Gachon University (GCU-202205980001).

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

The data presented in this study are available upon request from the corresponding author. The data are not publicly available because of privacy and ethical restrictions.

Acknowledgments

Jaehwang Seol and Kicheol Yoon equally contributed to this work. Jaehwang Seol and Kicheol Yoon are the co-first (lead) authors.

Conflicts of Interest

The authors declare no conflict of interest.

References

1. Cho, C.H.; Bae, K.C.; Kim, D.H. Treatment strategy for frozen shoulder. Clin. Orthop. Surg. 2019, 11, 249–257.
2. Cho, K.H.; Song, W.K. Robot-assisted reach training for improving upper extremity function of chronic stroke. Tohoku J. Exp. Med. 2015, 237, 149–155.
3. Kang, B.; Lee, H.; In, H.; Jeong, U.; Chung, J.; Cho, K.J. Development of a polymer-based tendon-driven wearable robotic hand. In Proceedings of the IEEE International Conference on Robotics and Automation (ICRA) 2016, Stockholm, Sweden, 16–21 May 2016.
4. Woo, H.; Lee, J.; Kong, K. Gait assist method by wearable robot for incomplete paraplegic patients. J. Korea Robot. Soc. 2017, 12, 144–151.
5. Lee, K.S.; Park, J.H.; Beom, J.; Park, H.S. Design and evaluation of passive shoulder joint tracking module for upper-limb rehabilitation robots. Front. Neurorobot. 2018, 12, 1–14.
6. Cho, K.H.; Song, W.K. Robot-assisted reach training with an active assistant protocol for long-term upper extremity impairment poststroke: A randomized controlled trial. Arch. Phys. Med. Rehabil. 2019, 100, 213–219.
7. Yang, Q.; Pan, X.; Guo, Y.; Qu, H. Design of rehabilitation medical product system for elderly apartment based on intelligent endowment. ASP Trans. Internet Things 2022, 2, 1–9.
8. Wolpaw, J.R. Brain-computer interface technology: A review of the first international meeting. IEEE Trans. Rehabil. Eng. 2000, 8, 164–173.
9. Yang, H.E.; Kyeong, S.; Lee, S.H.; Lee, W.J.; Ha, S.W.; Kim, S.M. Structural and functional improvements due to robot-assisted gait training in the stroke-injured brain. Neurosci. Lett. 2017, 637, 114–119.
10. Mehrholz, J.; Thomas, S.; Kugler, J.; Pohl, M.; Elsner, B. Electromechanical-assisted training for walking after stroke. Cochrane Database Syst. Rev. 2020, 10, CD006185.
11. Carpinella, I.; Lencioni, T.; Bowman, T. Effects of robot therapy on upper body kinematics and arm function in persons post stroke: A pilot randomized controlled trial. J. Neuroeng. Rehabil. 2020, 17, 1–19.
12. Pereira, S.; Silva, C.C.; Ferreira, S. Anticipatory postural adjustments during sitting reach movement in post-stroke subjects. J. Electromyogr. Kinesiol. 2014, 24, 165–171.
13. Kim, J. A Study on the Therapeutic Effect of the Upper Limb Rehabilitation Robot "Camillo" and Improvement of Clinical Basis in Stroke Patients with Hemiplegia. Master's Thesis, Dongguk University, Seoul, Republic of Korea, 2021.
14. Wu, G.; van der Helm, F.C.T.; Veeger, H.E.J.; Makhsous, M.; Van Roy, P.; Anglin, C.; Nagels, J.; Karduna, A.R.; McQuade, K.; Wang, X.; et al. ISB recommendation on definitions of joint coordinate systems of various joints for the reporting of human joint motion, Part II: Shoulder, elbow, wrist and hand. J. Biomech. 2005, 38, 981–992.
15. Jackson, M.; Michaud, B.; Tétreault, P.; Begon, M. Improvements in measuring shoulder joint kinematics. J. Biomech. 2012, 45, 2180–2183.
16. Zhang, C.; Dong, M.; Li, J.; Cao, Q. A modified kinematic model of shoulder complex based on Vicon motion capturing system: Generalized GH joint with floating centre. Sensors 2020, 20, 3713.
17. Liu, Y.; Huang, S.; Huang, Y. Motor imagery EEG classification for patients with amyotrophic lateral sclerosis using fractal dimension and Fisher's criterion-based channel selection. Sensors 2017, 17, 1557.
18. Graimann, B.; Allison, B.; Pfurtscheller, G. Brain-Computer Interfaces: Revolutionizing Human-Computer Interaction; Springer: Berlin/Heidelberg, Germany, 2010.
19. Nijholt, A.; Tan, D. Brain-computer interfacing for intelligent systems. IEEE Intell. Syst. 2008, 23, 72–79.
20. Sajda, P.; Müller, K.-R.; Shenoy, K.V. Brain-computer interfaces. IEEE Signal Process. Mag. 2008, 25, 16–28.
21. Asif, S.; Webb, P. Kinematics analysis of 6-DoF articulated robot with spherical wrist. Math. Probl. Eng. 2021, 2021, 1–11.
22. Denavit, J.; Hartenberg, R.S. A kinematic notation for lower-pair mechanisms based on matrices. ASME J. Appl. Mech. 1955, 22, 215–221.
23. Yoon, K.C.; Cho, S.M.; Kim, K.G. Coupling effect suppressed compact surgical robot with 7-axis multi-joint using wire-driven method. Mathematics 2022, 10, 1698.
24. Ropars, M.; Cretual, A.; Thomazeau, H.; Kaila, R.; Bonan, I. Volumetric definition of shoulder range of motion and its correlation with clinical signs of shoulder hyperlaxity: A motion capture study. J. Shoulder Elb. Surg. 2015, 24, 310–316.
25. Bagg, S.D.; Forrest, W.J. A biomechanical analysis of scapular rotation during arm abduction in the scapular plane. Am. J. Phys. Med. Rehabil. 1988, 67, 238–245.
26. Barnes, C.J.; Van Steyn, S.J.; Fischer, R.A. The effects of age, sex, and shoulder dominance on range of motion of the shoulder. J. Shoulder Elb. Surg. 2001, 10, 242–246.
27. Rigoni, M.; Gill, S.; Babazadeh, S.; Elsewaisy, O.; Gillies, H.; Nguyen, N.; Pathirana, P.N.; Page, R. Assessment of shoulder range of motion using a wireless inertial motion capture device: A validation study. Sensors 2019, 19, 1781.
28. Chatzitofis, A.; Zarpalas, D.; Kollias, S.; Daras, P. DeepMoCap: Deep optical motion capture using multiple depth sensors and retro-reflectors. Sensors 2019, 19, 282.
29. Cao, Z.; Simon, T.; Wei, S.E.; Sheikh, Y. Realtime multi-person 2D pose estimation using part affinity fields. In Proceedings of the 2017 IEEE Conference on Computer Vision and Pattern Recognition (CVPR), Honolulu, HI, USA, 21–26 July 2017; pp. 7291–7299.
30. Qiu, S.; Hao, Z.; Wang, Z.; Liu, L.; Liu, J.; Zhao, H.; Fortino, G. Sensor combination selection strategy for kayak cycle phase segmentation based on body sensor networks. IEEE Internet Things J. 2022, 9, 1–12.
31. Walha, R.; Lebel, K.; Gaudreault, N.; Dagenais, P.; Cereatti, A.; Croce, U.D. The accuracy and precision of gait spatio-temporal parameters extracted from an instrumented sock during treadmill and overground walking in healthy subjects and patients with a foot impairment secondary to psoriatic arthritis. Sensors 2021, 21, 6179.
32. Kim, Y.; Baek, S.; Bae, B.C. Motion capture of the human body using multiple depth sensors. ETRI J. 2017, 39, 181–190.
33. Steinebach, T.; Grosse, E.H.; Glock, C.H.; Wakula, J.; Lunin, A. Accuracy evaluation of two markerless motion capture systems for measurement of upper extremities: Kinect V2 and Captiv. Hum. Factors Ergon. Manuf. Serv. Ind. 2020, 30, 291–302.
Figure 1. Configuration of a motion capture system for standardized rehabilitation exercise therapy. (a) Definition of the brain–computer interface (BCI). (b) Experimental environment setup for the motion capture and tracking markers.
Figure 2. Photograph of the motion capture. (a) Abduction/adduction and external/internal rotation was performed to obtain the position and direction data of the markers. (b) The markers were attached to the skin to coincide with the central coordinate of the joint.
Figure 3. Shoulder complex modeling. (a) Mechanism of the shoulder complex model with rotation joints. (b) Forward kinematics modeling of the shoulder complex with a relative position coordinate system.
Figure 4. Position vector and direction vector of points E and EE.
Figure 5. Two shoulder position vectors that are expressed through three-dimensional space.
Figure 6. The biomechanical relationship between the shoulder and the acromioclavicular joint. (a) Anatomical structure of the shoulder joint and the acromioclavicular joint. (b) Normal or abnormal correlation of the inclination of the clavicle and shoulder rotation.
Figure 7. Simulation of shoulder movement based on the position vector of the end effector. (A) Math condition violation in abduction ( Z C < 0 and X C < 0 ). (B) Math condition violation in external rotation ( Z C < 0 ).
Figure 8. Joint rotation angle with ROM and simulation by analysis of the inverse kinematics. (a) Realized rotation degree variation. (b) Average ROM of males and females in abduction and adduction. (c) Simulation results of abduction and adduction. (d) Joint-centered trajectory graph in a 6-axis arm structure.
Figure 9. Joint rotation angle with ROM and simulation. (a) Realized rotation degree variation in external and internal rotation. (b) Average ROM for males and females. (c) Simulation results of external and internal rotation. (d) Joint-centered trajectory graph in a 6-axis arm structure.
Table 1. Denavit–Hartenberg table.

Joint   Link Angle θ_i (rad)   Link Offset d_i (mm)   Link Length l_i (mm)   Link Twist a_i (rad)
1       θ_1                    0                      0                      π/2
2       θ_2                    0                      l_2                    π/2
3       θ_3                    0                      0                      π/2
4       θ_4                    0                      0                      π/2
5       θ_5                    d_5                    0                      π/2
6       θ_6                    0                      l_6                    0
Table 2. Data collection for ROM (°) for males and females (abduction and adduction).

        Height (mm)   θ_1    θ_2    θ_3    θ_4     θ_5    θ_6
M.avg   177           28.9   17.6   46.1   130.4   69.3   20.7
M1      170           33.1   18.1   44.5   127.3   79.9   26.4
M2      174           30.8   15.9   45.3   133.7   72.0   18.3
M3      177           24.8   25.9   43.9   130.4   69.8   24.0
M4      190           29.1   14.9   33.8   134.3   54.8   10.5
M5      172           26.8   13.4   63.1   126.4   70.1   24.5
SD      7.1           2.9    4.4    9.5    3.2     8.1    5.8
SEM     3.2           1.3    2.0    4.2    1.4     3.6    2.6
F.avg   160           18.3   11.5   31.9   127.2   44.8   26.2
F1      161           21.5   11.3   26.0   132.9   48.2   24.9
F2      159           14.4   12.4   25.8   116.1   23.5   33.7
F3      165           13.5   6.6    33.1   108.2   63.6   28.0
F4      160           21.7   14.8   45.2   146.4   42.5   14.5
F5      154           20.3   12.5   29.2   132.5   46.3   29.9
SD      3.5           3.6    2.7    7.2    13.5    12.9   6.5
SEM     1.6           1.6    1.2    3.2    6.0     5.7    2.9
T.avg   -             23.6   14.6   39.0   128.8   57.1   23.5
SEM     -             2.0    1.5    3.5    3.1     5.2    2.1

SD (σ): standard deviation. SEM: standard error of the mean. T.avg: total average.
Table 3. Data collection for the ROM (°) for males and females (external and internal rotation).

        Height (mm)   θ_1   θ_2   θ_3    θ_4    θ_5     θ_6
M.avg   177           4.9   3.2   8.0    8.5    111.1   23.4
M1      170           2.6   3.4   8.5    9.4    109.9   18.8
M2      174           2.8   2.9   3.5    3.9    97.5    22.1
M3      177           2.0   2.2   5.4    9.3    99.5    20.6
M4      190           9.0   4.4   11.5   10.2   125.9   32.1
M5      172           8.0   3.3   11.2   9.5    122.7   23.6
SD      7.1           3.0   0.7   3.2    2.3    11.6    4.6
SEM     3.2           1.3   0.3   1.4    1.0    5.2     2.1
F.avg   160           2.9   3.4   7.9    7.5    106.0   23.6
F1      161           6.6   6.9   12.3   8.8    138.5   42.9
F2      159           1.3   1.0   5.6    5.1    79.7    22.3
F3      165           1.2   1.8   5.7    4.4    89.9    19.2
F4      160           3.4   4.5   7.5    8.4    122.4   19.5
F5      154           2.0   3.0   8.4    10.9   90.3    13.9
SD      7.1           2.0   2.1   2.4    2.4    22.4    10.0
SEM     3.2           0.9   0.9   1.1    1.1    10.0    4.5
T.avg   -             3.9   3.3   8.0    8.0    107.6   23.5
SEM     -             0.9   0.5   0.9    0.8    5.7     2.5

SD (σ): standard deviation. SEM: standard error of the mean. T.avg: total average.
Table 4. Comparison of the accuracy of the proposed system and others.

Reference    Average Accuracy [%]   Motion Capture Method
This work    97.6                   OptiTrack
[28]         94.8                   Optical motion capture (DeepMoCap)
[29]         93.6                   Multi-person pose estimation
[30]         95.9                   OptiTrack
[31]         95.0                   IMU sensor (Mobility Lab system)
[32]         85.3                   Multiple Kinect sensors
[33]         70.0                   Kinect V2 and Captiv sensors