Article

Sensor-Based Motion Tracking System Evaluation for RULA in Assembly Task

by Wenny Franciska Senjaya 1,2, Bernardo Nugroho Yahya 1,* and Seok-Lyong Lee 1,*
1 Department of Industrial and Management Engineering, Hankuk University of Foreign Studies, Yongin 17035, Republic of Korea
2 Faculty of Information Technology, Maranatha Christian University, Bandung 40164, Indonesia
* Authors to whom correspondence should be addressed.
Sensors 2022, 22(22), 8898; https://doi.org/10.3390/s22228898
Submission received: 12 October 2022 / Revised: 7 November 2022 / Accepted: 11 November 2022 / Published: 17 November 2022
(This article belongs to the Section Intelligent Sensors)

Abstract:
Industries need a mechanism to monitor workers’ safety and to prevent Work-related Musculoskeletal Disorders (WMSDs). The development of ergonomics assessment tools helps industry evaluate workplace design and worker posture. Many studies have proposed automated ergonomics assessment methods to replace manual assessment; however, these methods focused on calculating body angles and still assessed the wrist section manually. This study aims to (a) propose a wrist kinematics measurement based on unobtrusive sensors, (b) detect potential WMSDs related to wrist posture, and (c) compare the wrist postures of subjects performing assembly tasks, to achieve a comprehensive and personalized ergonomic assessment. The wrist posture measurement is combined with the body posture measurement to provide a comprehensive ergonomics assessment based on RULA. To evaluate our method, data were collected from subjects who performed an assembly process. We compared the risk scores assessed by the ergonomist with the risk scores generated by our method. All body segments achieved more than an 80% similarity score, and the similarity scores for wrist position and wrist twist improved by 6.8% and 0.3%, respectively. A hypothesis analysis was conducted to evaluate the differences across subjects. The results indicate that every subject performs tasks differently and has different potential risks regarding wrist posture.

1. Introduction

Work-related Musculoskeletal Disorders (WMSDs) are the major outcome of occupational injuries in the industry [1]. The Korean Ministry of Employment and Labor reported 9440 cases of WMSDs in 2019; these case rates increased by 40.6% from the previous year [2]. WMSDs can decrease workers’ productivity and increase workers’ compensation costs as WMSDs are the main cause of absenteeism, lost working days, and temporary or permanent disability [3,4]. The dominant factors that lead to WMSDs in manual assembly tasks are awkward working postures, repetitive movements, and workloads [5,6,7]. Repetitive movements are a significant risk factor for upper limb WMSDs, especially for the arm and wrist. Some injuries and disorders affect the wrist, such as carpal tunnel syndrome, wrist tendonitis, and lateral epicondylitis [8].
Many studies have proposed methods to help industries evaluate their work environment. There is a trend toward measuring technical ergonomics with sensors, such as wearable inertial sensors [9,10]. Much ergonomic assessment research relies on a vision-based approach utilizing either images [11,12] or motion sensors such as Kinect [13], focusing on high-level activities (e.g., lifting, sitting, kneeling, hammering, and drilling) without considering Therbligs activities. The most active body part in the assembly process is the hand (i.e., arm, wrist, and fingers). Therbligs motions (e.g., grasp, assemble, inspect, etc.) can be used to describe the low-level activity performed by the hand, and these motions are related to wrist kinematics [14]. For example, when a worker grasps something, the wrist might be extended or twisted. Existing works have addressed ergonomic assessment with low-level or Therbligs activities through two different approaches: manual assessment and biosensor-based assessment. The former mainly utilizes body skeleton data to determine the position scores of some body parts automatically; however, the wrist twist and position scores were determined manually by the ergonomist [15,16]. Manual assessment is time-consuming [12] and is mostly based on the ergonomist’s assumptions. The latter uses biosensors (e.g., Shimmer wearable sensors, Delsys TrignoTM, electrodes) to measure the wrist kinematics [17,18]; however, sensors attached to the worker’s body may cause discomfort and affect postural behavior [13]. Work using unobtrusive sensors was addressed in [19], with two limitations in the wrist kinematics measurement: a weak joint-point selection and a missing adjustment score. The weak joint-point selection mostly produced near-perpendicular angles and thus inflated the risk score. In addition, that work did not cover the adjustment score, which is important for measuring the risk of wrist posture.
Accurately measuring the risk posture is essential because the measurement results can be used as analysis material for the evaluation of the workplace setup and workers’ training to reduce the risk in order to prevent WMSDs and maintain safety at the workplace. In addition, posture risk assessment can also be used for advanced applications, such as virtual training and monitoring systems. Many industrial projects adopt virtual training to increase productivity during training and solve the problem of the lack of training instructions. It is essential that a training session accurately simulates an actual working environment and any challenges that a worker might experience in the workplace. Monitoring systems in the industry can help workers to raise awareness of occupational health and safety by providing training on safety and risk assessments for high-risk jobs (e.g., hazard inspection at a construction site) [20].
This study focuses on developing a method to measure wrist kinematics based on unobtrusive sensors. Wrist kinematics occurs as the result of complex 3D movements between individual carpal bones, and between the carpal bones and the bones of the forearm [21]. To evaluate the overall posture risk in assembly tasks, we perform RULA and synchronize two unobtrusive sensors: a body tracking sensor and a hand tracking sensor. Both sensors are unobtrusive; therefore, they do not interfere with or change the postural behavior of workers. Recent studies have proven that multimodal sensor fusion can achieve high-performance generalization and tackle problems that are too complex to solve with single sensor values [22]. In addition, we not only measure a certain posture from an activity but also evaluate the transition movement. Therefore, the proposed method can evaluate and detect potential postures that cause WMSDs in workers’ movements more precisely. Unlike previous works that measured the potential risk based on the frequency of the RULA grand score [19], this study attempts to elaborate the adjustment of wrist kinematics (i.e., flexion–extension and radial–ulnar deviation) and forearm rotation (i.e., pronation–supination) to enhance the ergonomics assessment method. The main contributions of our work are summarized as follows:
  • We propose a geometric-based wrist kinematics measurement for radial–ulnar deviation and wrist twist measurement based on the flexion–extension angle and forearm pronation–supination from the hand tracking data sensor.
  • We present a comprehensive ergonomics assessment (i.e., RULA) using a derived wrist posture measure, along with a body posture measure, in the assembly process automatically.
  • We present an extensive experiment to show a personalized ergonomic assessment using multimodal unobtrusive sensors (i.e., body tracking and hand tracking sensors).
The remaining sections of this work are structured as follows: the second section presents a summary of the related works in the fields of ergonomics assessment and wrist kinematics. The third section describes our proposed method. The fourth section describes the experimental settings and evaluation results. The fifth section discusses the findings from the experimental results. The last section concludes this study and the potential future works from this study.

2. Related Work

2.1. Ergonomics Assessment

Ergonomics assessment is a method for evaluating the body posture of workers while they accomplish a task. This assessment helps evaluate workplace design, reduce the risk of WMSDs, and improve worker productivity [23]. Most previous studies have used RULA and REBA to assess ergonomics; both methods use joint angle measurements to determine the risk level of body posture. The joint angle is measured for every body part, such as the upper arm, lower arm, trunk, wrist, etc. One of the most challenging parts to measure is the wrist. Some previous studies had an ergonomist manually set the wrist score based on an image or video [13,15,16]. A possible reason is that they only utilized data from one source to measure the body joint angles and lacked the data needed to measure the wrist. Other previous studies determined the wrist score automatically; however, they did not explain the wrist joint angle measurement in detail.
The previous studies usually used OpenPose to detect poses from images [13,15,16] or motion capture technology to record body movements [24,25,26]. Some previous studies addressed motion capture and motion tracking using diverse types of sensors, such as an accelerometer shirt [27], inertial sensors, a smartphone, or a camera. Some of the previous studies used recorded sensor data to perform pose-by-pose reconstructions. In [28], the author proposed reconstructing a virtual character’s full-body movement method for different movement behaviors (e.g., walking, running, etc.) by using a single IMU. The authors in [29] proposed a Gaussian Process Latent Variable Model (GPLVM) to visualize high dimensional data. GPLVM was used by [30] to represent pose data with latent coordinate points and used a Hidden Markov Model (HMM) to segment the phase. The phase at each frame is automatically determined based on the HMM classifier. A Multilayer Perceptron (MLP) model has been applied to define the mapping between the training sensor data and the latent coordinates points [30]. The author in [31] presented the iNEMO inertial measurement unit to estimate the orientation of human body segments. iNEMO consists of various Micro Electro-Mechanical Systems (MEMS) that can estimate the orientation of the human body segment [31].
This study uses data from multimodal unobtrusive sensors to provide an ergonomics assessment based on RULA. Furthermore, previous studies have only evaluated selected postures, while this study evaluates the whole posture across the sequence of activities that occur while assembling a drawer. Table 1 lists the related works that developed ergonomics assessment methods; they typically used OpenPose to detect poses from images [15,16,32] or motion capture technology to record body movements [24,25].

2.2. Wrist Kinematics

Some previous studies focused on the wrist assessment for a specific activity, such as laparoscopic surgery [36,37] and activity during mouse use [38]. The authors in [36] measured the flexion–extension angles of the surgeon’s wrist using wireless electro-goniometer sensors and CAPTIV version L7000 software. The sensors were placed on the back of the right and left hands and forearms [36]. Ref. [37] utilized five sensors for monitoring the forces exerted by the distal phalanges of the surgeon’s thumb; index, middle, and ring fingers; and palm. All data were examined with custom-made software [37]. Ref. [38] used an infrared three-dimensional (3D) motion analysis system (Optotrak Certus, Northern Digital, Ontario, Canada) to measure the posture of the upper extremities; moreover, six-degrees-of-freedom force–torque sensors (ATI, Apex, NC, USA) were attached to each arm support plate and the mouse pad. This study evaluated six support conditions (i.e., forearm support, flat palm support, raised palm support, forearm + flat palm support, forearm + raised palm support, and no support) and the joint angles were calculated from the Euler angles; these are defined by the rotation matrices that describe the orientation of the distal segment relative to the proximal segment concerning the anatomical position, and the vertical position was used to determine the joint angles [38].
Some previous studies estimated wrist kinematics by collecting hand motion data using a biosensor. The author in [18] proposed a novel regression scheme for supervised domain adaptation (SDA), wherein the domain shift effects were effectively reduced to enhance the inter-subject performances of CNN in the wrist kinematics estimation. This study recorded sEMG signals using the Delsys TrignoTM system. The electrodes were placed over five primary wrist muscles in the right forearm: Flexor Carpi Radialis (FCR), Flexor Carpi Ulnaris (FCU), Extensor Carpi Radialis Longus (ECRL), Extensor Carpi Radialis Brevis (ECRB), and Extensor Carpi Ulnaris (ECU) [18]. Another previous work proposed a Convolutional Neural Network (CNN)-Long Short-Term Memory (LSTM) hybrid model to fully explore the temporal–spatial information from surface electromyography (sEMG). The wrist movements were simultaneously recorded with Shimmer wearable sensors, which were attached to the back of the testing hand [17].

3. Method

Figure 1 shows our ergonomics assessment workflow, which consists of three sections: data preparation, automated RULA, and evaluation. Data preparation has three phases: data collection, data fusion, and data processing. In the automated RULA section, the process starts with the expert labeling the data with a RULA score. The expert assesses the worker’s posture by visualizing the data using 3D visualization tools and matches it with the video recorded during data collection. We use the expert-labeled data as ground truth. The next phase is the proposed ergonomics assessment method that evaluates the body posture and wrist kinematics (i.e., flexion–extension, radial–ulnar deviation, and pronation–supination) to determine the RULA score. The last section is the evaluation, wherein we measure the similarity of the RULA scores that were calculated by the expert and automatically calculated by the system. In addition, to evaluate the heterogeneity between subjects, we perform hypothesis testing to justify the personalized measurement.

3.1. Data Preparation

The data were collected from 12 subjects; eleven subjects were right-handed and one was left-handed. All subjects included in the data collection signed the “Research Subject Confidentiality Agreement” form. We collected the data using a body tracking sensor and a hand tracking sensor. The first sensor is the Orbbec camera, which has a 30 Hz sampling rate and is used to record body movements. The second sensor, the Leap Motion, records hand motions with a 64 Hz sampling rate. Since the two sensors have different sampling rates, we had to bring them to a common rate before fusing the data. We used Piecewise Aggregate Approximation (PAA) to reduce the dimensionality of the time series to a 20 Hz rate.
PAA uses three steps to reduce dimensionality. First, it divides the original time series into $N$ equal-length segments. Second, it computes the mean of each segment. Third, it replaces each segment with its mean. Let $XT = \{XT_1, XT_2, \ldots, XT_k, \ldots, XT_m\}$ be a multivariate time series, where $m$ denotes the number of variables ($1 \le k \le m$) and each $XT_k = \{xt_1, xt_2, \ldots, xt_i, \ldots, xt_n\}$ is a univariate time series of length $n$; $xt_i$ ($1 \le i \le n$) denotes the value at time $i$. For example, PAA approximates the univariate time series $XT_1$ of length $n$ by $\overline{XT}_1 = \{\overline{xt}_1, \overline{xt}_2, \ldots, \overline{xt}_i, \ldots, \overline{xt}_N\}$, where $N$ is the number of segments [39]. Equation (1) shows the formula for calculating each $\overline{xt}_i$:

$$\overline{xt}_i = \frac{N}{n} \sum_{j = \frac{n}{N}(i-1)+1}^{\frac{n}{N} i} xt_j \tag{1}$$
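Equation (1) can be implemented in a few lines. The following is a minimal sketch assuming the series length divides evenly into the segments (the authors' pipeline may handle remainders differently, e.g., by interpolation); the function name is ours:

```python
def paa(series, n_segments):
    """Piecewise Aggregate Approximation (Equation (1)): split the series
    into n_segments equal-length windows and replace each with its mean."""
    n = len(series)
    if n % n_segments != 0:
        raise ValueError("series length must divide evenly into segments")
    seg = n // n_segments
    return [sum(series[i * seg:(i + 1) * seg]) / seg for i in range(n_segments)]

# e.g., reducing a 64-sample window of hand-tracking data to 20 values:
# paa(window, 20)
```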
Following Ref. [39], we resampled the data to a 20 Hz sampling rate for the next step. During data collection, we also label the data with three labels that represent the activity levels: high-level, middle-level, and low-level activities. The data are labeled after the dimensionality reduction is complete. Furthermore, because RULA requires calculating the joint angles for six body sections, we cannot skip the calculation for any section; therefore, we remove all rows with null values.

3.2. Labeling by Expert

We validate our automated RULA score calculation against an expert’s evaluation. The expert here is a specialist who understands the whole assembly process, followed the data collection process, and has knowledge of ergonomics assessment and risk posture, with a background in industrial engineering and computer science. The expert-labeled data include the RULA score and risk level. Since labeling the postures of all 12 subjects would take a long time, the expert only evaluates and labels the posture for every tenth row of data (per 0.5 s). The posture evaluation is based on 3D visualizations and recorded videos. The 3D visualization is provided through two types of projection: a front-right view (XZY-projection) and a back-left view (ZXY-projection). The total number of expert-labeled postures is 4729. The expert assesses the left upper arm, right upper arm, left lower arm, right lower arm, right wrist, left wrist, neck, trunk, and legs for every pose. We developed a visualization tool to make labeling easier; it displays a 3D visualization of the skeleton with the XZY-projection or ZXY-projection. Figure 2 describes the annotation process that the expert uses.

3.3. Automated RULA

3.3.1. RULA Score Calculation

The RULA score is determined based on the body joint angle at each region. Six body regions need to be evaluated: the upper arm, lower arm, wrist, neck, trunk, and legs. There are several aspects that must be considered in the evaluation of each part of the body. For example, the right upper arm evaluation involves the right upper arm position, right shoulder raise, and right upper arm abduction [40]. The first step in determining the upper arm position score is the calculation of the joint angles between the right elbow, right shoulder, and right hip. Afterward, we can determine the upper arm position score according to the RULA worksheet threshold.
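As an illustration of the threshold step, a minimal sketch of the upper-arm position scoring is given below. The cut-offs follow the standard RULA worksheet (negative angles denote extension behind the trunk); the function name and adjustment flags are our own, and only the flexion/extension range plus the two adjustments mentioned above are covered:

```python
def upper_arm_score(flexion_deg, shoulder_raised=False, abducted=False):
    """Map an upper-arm flexion/extension angle (degrees) to the RULA
    upper-arm position score, per the standard RULA worksheet thresholds."""
    if -20 <= flexion_deg <= 20:
        score = 1                      # near-neutral position
    elif flexion_deg < -20 or flexion_deg <= 45:
        score = 2                      # extension > 20° or flexion 20–45°
    elif flexion_deg <= 90:
        score = 3                      # flexion 45–90°
    else:
        score = 4                      # flexion > 90°
    # Worksheet adjustment scores
    if shoulder_raised:
        score += 1
    if abducted:
        score += 1
    return score
```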
RULA divides the body assessment into two sections: section A and section B. Section A consists of the arms and wrists, while section B consists of the neck, trunk, and legs [40]. Each section generates an intermediate RULA score, A and B, respectively. Afterward, the RULA grand score is calculated by combining both intermediate scores [16]. In this study, intermediate RULA score A combines the angle calculations from multiple sensors: the joint angles for the upper arm and lower arm positions are calculated from the body tracking sensor data, while the wrist position is calculated from the hand tracking sensor data.
We calculated the joint angle from the left and right sides. Furthermore, we calculated a Max score that is defined by the highest score between the left and right position. To determine intermediate RULA score B, we need the score from the neck position, trunk position, and legs. The joint angle of the neck and trunk position are calculated using the data from the body tracking sensor, while the legs score is directly defined (i.e., 2) for all postures given that all of the subjects perform all of the tasks in a standing position, and the legs and feet are not supported. After scores A and B are calculated, we can calculate the grand score (score C). There are three kinds of grand score: the left grand score, which is calculated based on score A from the left side and score B; the right grand score, which is calculated based on score A from the right side and score B; and the general grand score, which is calculated based on the Max score and score B. Figure 3 illustrates the RULA scoring mechanism. The RULA grand score is categorized into four risk levels: negligible, low, medium, and high. Table 2 presents the risk levels with the actions that should be taken for each risk level.
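The scoring mechanism above can be sketched as follows. Here `table_c` stands in for RULA Table C and must be supplied by the caller (the tiny dictionary in the test is illustrative, not the real table), and the risk-level cut-offs follow the standard RULA action levels; the function names are ours:

```python
def rula_grand_scores(a_left, a_right, b, table_c):
    """Compute the three grand scores described above: left, right, and
    general (using the Max score between the two sides). table_c maps
    (score A, score B) pairs to a grand score, per RULA Table C."""
    a_max = max(a_left, a_right)
    return {
        "left": table_c[(a_left, b)],
        "right": table_c[(a_right, b)],
        "general": table_c[(a_max, b)],
    }

def risk_level(grand_score):
    """Map a grand score to the four risk levels of Table 2."""
    if grand_score <= 2:
        return "negligible"
    if grand_score <= 4:
        return "low"
    if grand_score <= 6:
        return "medium"
    return "high"
```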

3.3.2. Wrist Kinematics Measurement

This section discusses wrist kinematics measurement. The wrist bone structure consists of eight carpal bones that connect proximally to the forearm’s radius and connect distally to the hand’s five metacarpals. The flexion–extension and radial–ulnar deviation of the wrist are its global motions. The wrist is able to perform those movements due to articulation at the radiocarpal joint, midcarpal joint, the carpo-metacarpal joint, and between individual carpal bones. The forearm pronation–supination motion is possible due to the distal radioulnar joint [21].
RULA involves the evaluation of wrist flexion–extension, wrist radial–ulnar deviation, and forearm pronation–supination to assess the wrist section. This study assesses the wrist section with the data collected by the finger tracking sensor. Figure 4 shows the finger joints from the finger tracking sensor. Each hand has 25 joints, and each joint is represented by a 3D coordinate (x, y, z). The finger joints are used to calculate the wrist position and wrist twist scores. As shown in Figure 4, we label every finger joint: ‘L’ for the left hand and ‘R’ for the right hand. A list of the finger joints is provided in Table A1.
Table 3 shows the finger joints that are involved in wrist kinematics measurement. Four wrist kinematics need to be measured: wrist flexion–extension, wrist radial–ulnar deviation, forearm pronation–supination, and the range of the forearm when pronating and supinating.
  1. Wrist flexion–extension
The calculation of the wrist’s flexion–extension angle involves two 3D points from the middle finger. Figure 5 illustrates the point position from the right hand, where R13 and R14 represent the middle finger proximal phalanges endpoint and middle finger metacarpals endpoint, respectively.
In this case, $a(x_1, y_1, z_1)$ is the first point and $b(x_2, y_2, z_2)$ is the second point. Equation (2) shows the formula used to calculate the wrist position angle [41]:

$$\cos\theta = \frac{a \cdot b}{\|a\|\,\|b\|} \tag{2}$$
  2. Wrist radial–ulnar deviation
We also assess the wrist radial–ulnar deviation (wrist bending) to adjust the RULA score for the wrist. We assume that if the wrist is bent 15° or more, the wrist score increases by 1. The joints involved in assessing wrist bending are (R05, R04, R24) for the right hand and (L05, L04, L24) for the left hand. These three joint points represent the start point of the index finger’s distal phalanx, the metacarpal endpoint of the thumb, and the metacarpal endpoint of the pinky finger. The assessment of the wrist’s radial–ulnar deviation angle is illustrated in Figure 6.
The deviation angle is the angle between the two vectors formed by the three joint points, where each joint is a 3D coordinate (x, y, z). In this case, $a(x_1, y_1, z_1)$, $b(x_2, y_2, z_2)$, and $c(x_3, y_3, z_3)$ represent the first, second, and third joint positions, respectively. Equation (3) shows the formula used to calculate the joint angle:

$$\cos\theta = \frac{\overrightarrow{ab} \cdot \overrightarrow{bc}}{\|\overrightarrow{ab}\|\,\|\overrightarrow{bc}\|} \tag{3}$$
  3. Forearm pronation–supination
The next position to be assessed is pronation–supination (wrist twist). Pronation is the rotation of the hand from the handshake position to a palm-down position; in contrast, supination is the rotation of the hand from the handshake position to a palm-up position [40]. Figure 7 illustrates the wrist twist in mid-range (a) and at or near the end of range (b). To evaluate the wrist twist, we calculate the angle between two normal vectors. $IPR_{nv}$, $CPR_{nv}$, $IPL_{nv}$, and $CPL_{nv}$ represent the normal vectors of the initial position of the right hand, the current position of the right hand, the initial position of the left hand, and the current position of the left hand, respectively.
$NV()$ is a function used to find the normal vector of the coronal plane. Suppose $a(x_1, y_1, z_1)$, $b(x_2, y_2, z_2)$, and $c(x_3, y_3, z_3)$ represent three joint positions; then the normal vector is defined by Equation (4) [41]:

$$NV = (b - a) \times (c - a) \tag{4}$$
Joints L13, L04, and L24 are involved in finding the normal vector for the left hand, while the normal vector for the right hand is defined by joints R13, R04, and R24. Figure 8 illustrates the wrist twist (pronation–supination) angle calculation. The angle between two planes is calculated to evaluate the pronation–supination position; it is the angle between the two normal vectors of the planes given by the initial position ($IPR_{nv}$ and $IPL_{nv}$) and the current position ($CPR_{nv}$ and $CPL_{nv}$).
To adjust the wrist twist posture assessment, Figure 9 illustrates the evaluation of the wrist twist posture in the flexion–extension range. This assessment also uses the angle between the two normal vectors produced from the initial position plane and the current position plane. Equations (5) and (6) define the formulas used to calculate the wrist twist angle for the right and left hand, respectively.
$$\text{Right hand:}\quad \cos\theta = \frac{IPR_{nv} \cdot CPR_{nv}}{\|IPR_{nv}\|\,\|CPR_{nv}\|} \tag{5}$$

$$\text{Left hand:}\quad \cos\theta = \frac{IPL_{nv} \cdot CPL_{nv}}{\|IPL_{nv}\|\,\|CPL_{nv}\|} \tag{6}$$
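Equations (4)–(6) reduce to two elementary operations: a cross product for the plane normal and an arccosine for the angle between two vectors. A minimal pure-Python sketch (function names are ours; the dot/norm ratio is clamped to [−1, 1] to guard against floating-point drift):

```python
import math

def angle_between(u, v):
    """Angle in degrees between two 3D vectors, cos θ = u·v / (|u||v|)
    (the form shared by Equations (5) and (6))."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(a * a for a in v))
    return math.degrees(math.acos(max(-1.0, min(1.0, dot / (nu * nv)))))

def normal_vector(a, b, c):
    """Equation (4): NV = (b - a) × (c - a) for three joint positions."""
    u = [b[i] - a[i] for i in range(3)]
    v = [c[i] - a[i] for i in range(3)]
    return [u[1] * v[2] - u[2] * v[1],
            u[2] * v[0] - u[0] * v[2],
            u[0] * v[1] - u[1] * v[0]]

# Wrist twist: angle between the palm-plane normals of the initial and
# current hand positions (e.g., joints R13, R04, R24 for the right hand).
initial_nv = normal_vector((0, 0, 0), (1, 0, 0), (0, 1, 0))  # palm flat
current_nv = normal_vector((0, 0, 0), (1, 0, 0), (0, 0, 1))  # rotated 90°
twist = angle_between(initial_nv, current_nv)
```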

3.3.3. Body Posture Measurement

The measurement method for body posture (i.e., upper arm, lower arm, neck, trunk, and legs) is discussed in this section. Figure 10 shows the body joints from the body tracking sensor. There are 16 body joints, and each joint is represented by a 3D coordinate (x, y, z). A list of the body joints is provided in Table A2. The body joints are used to calculate the scores for the upper arm, lower arm, neck, and trunk.
Table 4 shows the body joints involved in the angle calculations. With $a(x_1, y_1, z_1)$, $b(x_2, y_2, z_2)$, and $c(x_3, y_3, z_3)$ representing the first, second, and third joint positions, respectively, the joint angle is calculated with Equation (2). Since the subjects work in a standing position without any support during the assembly process, the legs score is set to 2. Furthermore, we do not consider the neck twist, muscle, or force scores.
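As a sketch, the joint angle at the middle joint (e.g., at the right shoulder, between the right elbow and right hip for the upper-arm position) can be computed from the two vectors the joint forms with its neighbors; the function name and the clamping of the cosine are ours:

```python
import math

def joint_angle(a, b, c):
    """Angle in degrees at joint b, formed by the vectors b→a and b→c,
    where a, b, c are 3D joint coordinates."""
    u = [a[i] - b[i] for i in range(3)]
    v = [c[i] - b[i] for i in range(3)]
    dot = sum(x * y for x, y in zip(u, v))
    norm = math.sqrt(sum(x * x for x in u)) * math.sqrt(sum(x * x for x in v))
    return math.degrees(math.acos(max(-1.0, min(1.0, dot / norm))))

# e.g., elbow straight above and hip straight to the side of the shoulder
# gives a 90° angle at the shoulder joint.
```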

3.4. Evaluation

To evaluate the proposed method, this study compares the scores from expert observation with those calculated by our method; moreover, we measure the similarity of the RULA score for each high-level activity. Let $ES = \{es_1, es_2, \ldots, es_n\}$ be the list of scores from expert observation and $SS = \{ss_1, ss_2, \ldots, ss_n\}$ be the list of scores from the system calculation. Equation (7) defines the similarity $numSim(es, ss)$ between two numbers $es$ and $ss$, and Equation (8) gives the mean similarity $tsim(ES, SS)$. The similarity measure lies in the interval [0, 1], where 1 indicates maximum similarity [42].

$$numSim(es, ss) = 1 - \frac{|es - ss|}{es + ss} \tag{7}$$

$$tsim(ES, SS) = \frac{1}{n} \sum_{i=1}^{n} numSim(es_i, ss_i) \tag{8}$$
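Equations (7) and (8) translate directly into code; a minimal sketch assuming positive scores (function names are ours):

```python
def num_sim(es, ss):
    """Equation (7): similarity in [0, 1] between an expert score es and
    a system score ss; 1 means the scores are identical."""
    return 1 - abs(es - ss) / (es + ss)

def t_sim(expert_scores, system_scores):
    """Equation (8): mean similarity over all evaluated postures."""
    pairs = list(zip(expert_scores, system_scores))
    return sum(num_sim(e, s) for e, s in pairs) / len(pairs)
```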
Furthermore, a hypothesis analysis was also conducted to evaluate and compare the differences in wrist posture across the subjects. This analysis was performed using a one-way ANOVA and t-test.
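In practice such tests are run with standard statistical tooling (e.g., scipy.stats.f_oneway and a t-test); to make the computation concrete, here is a minimal pure-Python sketch of the one-way ANOVA F statistic over, say, per-subject wrist-angle samples (function name is ours):

```python
def one_way_anova_f(groups):
    """F statistic of a one-way ANOVA: between-group mean square over
    within-group mean square, with k - 1 and n - k degrees of freedom."""
    k = len(groups)                          # number of groups (subjects)
    n = sum(len(g) for g in groups)          # total observations
    grand_mean = sum(sum(g) for g in groups) / n
    means = [sum(g) / len(g) for g in groups]
    ss_between = sum(len(g) * (m - grand_mean) ** 2
                     for g, m in zip(groups, means))
    ss_within = sum((x - m) ** 2
                    for g, m in zip(groups, means) for x in g)
    return (ss_between / (k - 1)) / (ss_within / (n - k))
```

A large F relative to the critical value of the F(k−1, n−k) distribution indicates that at least one subject's wrist posture distribution differs from the others.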

4. Experiment and Results

4.1. Laboratory Setup

The data were collected from 12 subjects. Every subject had to follow the assembly steps given by the instructor. Figure 11 shows the laboratory setup during data collection. We used two sensors to record the assembly activity. The Orbbec Astra Pro camera, placed in front of the subjects, collected body movement data during assembly. The second sensor, the Leap Motion, collected finger movements during the assembly process; it was located on the table, and we ensured that the sensor could cover the entire assembly area. A monitor showed the skeleton movements from the Orbbec and Leap Motion sensors. The same laboratory setup was used for training sessions, in which the subjects practiced the assembly process before data collection.
Figure 12 shows the product we assembled (a) and the panel positioning used during data collection (b). The product measures 30 × 28 × 31 cm and consists of main panels, side panels, wooden pins, and a drawer. The height of the table is 72 cm. We collected the data from 12 subjects with heights ranging from 155 to 178 cm; the average height was 170 cm. Each subject assembled the product three times, and we chose the best performance based on the number of null values during the assembly process.
The process for drawer assembly is divided into five main steps: assemble side panel, assemble main panel, integrate panel, prepare the workspace, and slide the mid-panel. Figure 13 shows the list of assembly activities with three levels of activity. The low level is represented by the Therblig activity. Therblig is a system for assessing the motions required to complete a task, and it is able to examine the smallest of motions. There are 18 motions that can be used to describe the task [43,44]. This study only uses the six Therbligs that are the most relevant to our assembly task (i.e., grasp, transport loaded, assemble, inspect, position, and release load). The middle-level label aims to bridge the low-level activities and high-level activities.
Figure 14 shows an example of one of the activities in the assembly process in our data. The “assemble the main panel” activity consists of three steps. The first step is to pick the panel. The “pick panel” activity consists of two activities: “Grasp” and “Transport Loaded”. The second step is the “preparation” activity, which involves inspecting the panel position; therefore, the panel is ready to be installed on the main panel. The last step is to assemble the main panel.

4.2. Evaluation Results

This section conveys two different subsections to show the results of our experiments through two lenses: similarity measurement and personalized measurement. The similarity measurement takes place based on the respective body parts. The personalized measurement considers the statistical hypothesis in order to justify that the proposed measurement could be utilized as a personalized measure.

4.2.1. Similarity Measurement

The proposed method is evaluated by measuring the similarities between the scores calculated by the expert and the scores calculated by the system. The evaluation is divided by high-level activity (i.e., assemble side panel, assemble main panel, integrate panel, prepare the workspace, and slide the mid-panel) and is based on the mechanism in Figure 3. We divided the evaluation results into five tables (Table 5, Table 6, Table 7, Table 8 and Table 9).
In the first table (Table 5), we present the similarities between the section A (i.e., upper arm, lower arm, wrist, and wrist twist) scores for the left and right sides. The similarity score for the left side is higher than that for the right side. A possible reason is that the right hand is more active than the left; the left hand is therefore more likely to stay in the same position.
The second table (Table 6) shows the similarities for the section A scores when the higher of the right and left scores (Max score) is used for the upper arm, lower arm, and wrist. For comparison with the previous study, we measured the similarity between the wrist position and wrist twist scores calculated by the previous method [19] and those produced by the current method; the results can be seen in the third table (Table 7). Note that the previous study did not perform a side-by-side score comparison and only considered the higher score (Max score) of the two sides. Moreover, it did not address the adjustment scores (i.e., radial–ulnar deviation and range of pronation–supination) and instead relied on the ergonomist's evaluation. Based on the results, the current study enhances the performance of the ergonomics assessment, improving the similarity scores for wrist position and wrist twist by 6.8% and 0.3%, respectively. The small gain for wrist twist is expected because the wrist twist score can only take the values 1 or 2. Table 1 lists previous studies related to ergonomics assessment; most of them focused only on body assessments, with the wrist score assessed manually by an expert (ergonomist), and some did not explain their wrist assessment at all. Therefore, we could only compare our proposed method with one previous study [19]. We were also unable to evaluate our method on another dataset because no public dataset provides RULA scores.
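To make the score comparison concrete, the agreement between expert and system labels can be expressed as a frame-level agreement ratio. The paper does not publish its exact similarity formula, so the function and sample scores below are an illustrative sketch, not the study's implementation:

```python
def similarity(expert_scores, system_scores):
    """Fraction of frames where the system's RULA sub-score matches the expert label."""
    assert len(expert_scores) == len(system_scores)
    matches = sum(e == s for e, s in zip(expert_scores, system_scores))
    return matches / len(expert_scores)

# Hypothetical frame-level wrist-position scores for one activity
expert = [2, 2, 3, 1, 2, 3, 3, 2]
system = [2, 2, 3, 1, 2, 2, 3, 2]
print(round(similarity(expert, system), 3))  # 0.875 (7 of 8 frames agree)
```

Under this reading, a Table 5 entry such as 0.940 means the system's sub-score matched the expert label on 94% of the frames for that activity.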
The fourth table (Table 8) presents the evaluation for section B (i.e., neck, trunk, and legs). Because all subjects performed all tasks in a standing position, the legs score is fixed (i.e., 2) for all postures; therefore, we only present the similarity results for the neck and trunk. The "integrate panel" activity has the lowest similarity, possibly because it is a difficult task and each subject performs it differently. Compared to our previous study, the current study achieves an 82% average similarity score for the neck, whereas our previous method achieved only 77.2%; this means that changing the torso joint point to the waist joint point improves the ergonomic assessment performance.
The last table (Table 9) presents the similarities in the RULA grand score. In the experiment, we calculated three kinds of RULA grand scores. First, the Right RULA grand score is calculated from the section B score and the right-side score of section A. Second, the Left RULA grand score is calculated from the section B score and the left-side score of section A. Third, because a higher RULA score means a higher risk level, we calculated the General RULA grand score from the section B score and the Max score of section A using the mechanism in Figure 3. Based on the results, the similarity for the general grand score is higher than that for the right and left sides.
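As a sketch of the General grand score idea, the following combines the worse (Max) side of section A with the section B score through the grand-score lookup of the published RULA worksheet. The table values below are the standard RULA Table C; the function names and example inputs are ours, not the paper's code:

```python
# Standard RULA Table C (grand-score lookup).
# Rows: score A (wrist/arm), 1..8+; columns: score B (neck/trunk/legs), 1..7+.
TABLE_C = [
    [1, 2, 3, 3, 4, 5, 5],
    [2, 2, 3, 4, 4, 5, 5],
    [3, 3, 3, 4, 4, 5, 6],
    [3, 3, 3, 4, 5, 6, 6],
    [4, 4, 4, 5, 6, 7, 7],
    [4, 4, 5, 6, 6, 7, 7],
    [5, 5, 6, 6, 7, 7, 7],
    [5, 5, 6, 7, 7, 7, 7],
]

def grand_score(score_a, score_b):
    # Scores beyond the table bounds are clamped, as on the RULA worksheet
    return TABLE_C[min(score_a, 8) - 1][min(score_b, 7) - 1]

def general_grand_score(left_a, right_a, score_b):
    # "General" score: take the worse (Max) side for section A, per the Max-score scheme
    return grand_score(max(left_a, right_a), score_b)

print(general_grand_score(left_a=3, right_a=5, score_b=4))  # 5
```

Because the General score always feeds the higher-risk side into the lookup, it can never be lower than the Right or Left grand score for the same posture.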

4.2.2. Personalized Measurement

This section evaluates the significant differences among the RULA scores of all subjects with a one-way ANOVA and t-tests. The null hypothesis is H0: µ1 = µ2 = µ3 = … = µs, where s is the number of subjects, with α = 0.05. The p-value of the ANOVA is 1.1078 × 10⁻⁴¹, so H0 is rejected; at least one subject differs from the others. For a more in-depth analysis, we performed a t-test on every pair of subjects.
Table 10 and Table 11 present the t-test results for subjects 1–12. Most of the results are significant, meaning that subjects differ in how they perform the tasks; even though every subject followed the instructions and had a training session, each performed the tasks in their own way. The underlined numbers mark the non-significant results, i.e., pairs of subjects who performed the assembly tasks similarly.
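For illustration, the F statistic behind such a one-way ANOVA is the ratio of between-subject to within-subject variance. A minimal stdlib sketch follows; the per-subject score lists are hypothetical, not the study's data:

```python
from statistics import mean

def anova_f(groups):
    """One-way ANOVA F statistic: between-group vs. within-group mean squares."""
    k = len(groups)                              # number of groups (subjects)
    n = sum(len(g) for g in groups)              # total number of observations
    grand = mean(x for g in groups for x in g)   # grand mean over all scores
    ss_between = sum(len(g) * (mean(g) - grand) ** 2 for g in groups)
    ss_within = sum((x - mean(g)) ** 2 for g in groups for x in g)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

# Hypothetical per-frame RULA grand scores for three subjects
subjects = [
    [3, 3, 4, 3, 4, 3],   # S1: mostly low-risk postures
    [5, 6, 5, 5, 6, 6],   # S2: noticeably higher-risk postures
    [3, 4, 3, 4, 3, 4],   # S3: similar to S1
]
print(round(anova_f(subjects), 2))  # 30.19 — far above the critical F, so H0 would be rejected
```

A large F (compared against the F distribution with k−1 and n−k degrees of freedom) yields a tiny p-value, as with the 1.1078 × 10⁻⁴¹ reported above; the pairwise t-tests then locate which subject pairs differ.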

5. Discussion

In this section, we discuss the results of the evaluation methods based on the high-level activities. Figure 15 shows the distribution of the RULA grand scores from each high-level activity. Each activity has a different number of samples. Table 12 indicates the number of samples from the 12 subjects, wherein the RULA grand scores were generated using this system.
Based on the similarity measurement results, the overall similarity scores exceed 80%. The high similarity scores for wrist position (i.e., flexion–extension and radial–ulnar deviation) and wrist twist (i.e., pronation–supination and the adjustment posture) indicate that our proposed method can measure wrist kinematics and detect potential risk factors; moreover, it improves the similarity scores for wrist position and wrist twist by 6.8% and 0.3% on average, respectively.
Regarding the high-level activities, "Assemble side panel" has the highest average similarity score. This activity consists of simple tasks, such as attaching the wooden pin to the side panel, and Figure 15 shows that it has the fewest high-risk RULA scores. In contrast, the "Integrate panel" activity has the lowest average score. One possible reason is that this activity involves a difficult task; Figure 15 shows that it has the highest number of high-risk RULA scores. Moreover, the Orbbec sensor cannot properly capture the joint data because its view is blocked by the drawer when the subject attaches a side panel to the main panel, which particularly affects the arm-position data (see Figure 16). A RULA score is determined from the scores of the six body regions; each region's score is determined from its joint angle, which is calculated from the positions of the body joints, each represented as a 3D coordinate (x, y, z) detected and recorded by the Orbbec sensor. Therefore, if the Orbbec sensor cannot properly capture the joint positions, the evaluation results are affected. Differences in how workers install the panels also matter: sometimes the sensor's view of the left hand is blocked, and sometimes the right. The Orbbec sensor generally tracks body motion well, and because it is unobtrusive, workers can perform their tasks without sensors attached to their bodies; however, where occlusion is likely, it may be better to use more than one sensor and place one on the opposite side.
The statistical hypothesis testing showed a significant difference among most of the subjects (refer to Table 10 and Table 11). Possible contributing factors are handedness and work duration. In terms of handedness, some subjects reported being left-handed. For example, subject 5 and subject 6 had different dominant hands (subject 5 was left-handed, while subject 6 was right-handed) and showed a significant difference in the hypothesis test, even though they have similar heights (170 and 168 cm) and almost the same work duration (4.33 and 4.28 min). In terms of work duration, subjects 9 and 10 showed a significant difference: although both have similar heights (170 and 166 cm, respectively) and both are right-handed, subject 9 had the fastest working time (3.63 min), while subject 10 had the longest (6.48 min). This indicates that significant differences can arise from handedness and work duration.
Some pairs showed an insignificant difference. For example, subjects 5 and 11 showed no significant difference, which means there was no difference in their risk; a possible reason is that both are left-handed, which can indicate similar work patterns. A similar work duration can also lead to an insignificant difference; for example, subjects 4 and 5 have similar work durations (4.32 and 4.33 min, respectively).
The risk assessment method in this study can help industries evaluate workplace design to make workers work more efficiently. Furthermore, the method can also provide evaluation methods and suggestions to workers related to their posture in order to increase health awareness and prevent injuries.
In our experiment, the calculation of the RULA score is separate from the data collection. We first collected data from the 12 subjects and then combined all the data; during collection, the joints are detected and recorded in real time. We also cleaned the data before calculating the RULA scores. As explained in Section 3.3.1, the calculation starts with the joint angle for every body section, which determines each section's score. The process then continues with the intermediate scores (the RULA scores of sections A and B), followed by the RULA grand score, after which all calculation results are saved.
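A minimal sketch of the first step of this pipeline: computing a joint angle from three 3D joint positions and mapping it to a RULA lower-arm score using the (60°, 100°) rule from Table 4. The coordinates and function names below are illustrative assumptions, not the paper's code:

```python
import math

def joint_angle(a, b, c):
    """Angle at joint b (in degrees) formed by the 3D points a-b-c."""
    v1 = [a[i] - b[i] for i in range(3)]
    v2 = [c[i] - b[i] for i in range(3)]
    dot = sum(x * y for x, y in zip(v1, v2))
    n1 = math.sqrt(sum(x * x for x in v1))
    n2 = math.sqrt(sum(x * x for x in v2))
    return math.degrees(math.acos(dot / (n1 * n2)))

def lower_arm_score(angle):
    # Table 4: +1 for (60°, 100°), +2 otherwise (0°-60° or >100°)
    return 1 if 60 <= angle <= 100 else 2

# Illustrative wrist-elbow-shoulder coordinates (meters) for one frame
wrist, elbow, shoulder = (0.45, 1.10, 0.10), (0.25, 1.25, 0.05), (0.20, 1.50, 0.00)
angle = joint_angle(wrist, elbow, shoulder)
print(round(angle, 1), lower_arm_score(angle))  # 139.0 2  (arm extended past 100°)
```

The same angle-then-lookup pattern repeats for every body region, after which the intermediate section scores and the grand score are derived.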
The total number of expert-labeled postures is 4729. After measuring the similarity between the RULA scores calculated by the expert and those calculated by the system, we generated scores for all poses from the 12 subjects, 54,321 poses in total. Our risk assessment method requires around 0.043156 s to calculate the RULA grand score for one pose, which indicates that it can be used in an ergonomics assessment system to evaluate poses in real time. The details of the execution time can be seen in Table A3.
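One simple way to check such a real-time claim is to average the wall-clock scoring time per pose and compare it to the sensor's frame period. The scoring function below is a trivial stand-in for the full pipeline, so the numbers are only illustrative:

```python
import time

def per_pose_latency(score_fn, poses):
    """Average wall-clock time (seconds) to score one pose."""
    start = time.perf_counter()
    for pose in poses:
        score_fn(pose)
    return (time.perf_counter() - start) / len(poses)

# Hypothetical stand-in: 1000 poses of 16 body joints each
poses = [[(0.1 * i, 1.0, 0.2)] * 16 for i in range(1000)]
latency = per_pose_latency(lambda p: sum(x for joint in p for x in joint), poses)
print(latency < 1 / 30)  # does scoring keep up with a 30 fps sensor stream?
```

At the paper's reported 0.043 s per pose, the method sustains roughly 23 poses per second, which is sufficient for continuous monitoring even if slightly below a 30 fps camera rate.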
Our method focuses on the development of ergonomics assessments for monitoring the safety of workers and preventing WMSDs; however, other domains that use ergonomics assessments, such as elderly health monitoring, worker efficiency management, and medical training, can also benefit from the proposed method.
A limitation of this study is that the evaluation of the method only used laboratory data. The muscle and force scores were not considered because the sensors only collected data on the joint positions. A comparison with the existing method is limited, given that no previous studies adopted a combination of body angle calculations and wrist posture assessments. Furthermore, due to the different devices used to collect the data, there is a dissimilarity in data features from the previous studies.

6. Conclusions

This study proposed a geometric-based wrist kinematics measurement to detect potential risk factors related to wrist position and wrist twist. Our study provides an ergonomics assessment that combines wrist kinematics measurements with body posture measurements during an assembly process based on multimodal unobtrusive sensors (i.e., body-tracking and hand-tracking sensors). An extensive experiment and evaluation are provided to evaluate our proposed method. The evaluation results show that the current study improves the performance of the ergonomics assessment for wrist position and wrist twist. In addition, the personalized measurement results show that each subject is unique: even with a training session before data collection, every subject still differed from the others. This shows that humans perform the same task in their own way and face different potential risks.
In future research, we plan to evaluate our method on data from a real-world industrial environment, analyze the sensitivity of the RULA score to slight joint-angle differences between similar postures, and compare it with other ergonomics assessment tools. In addition, we plan to build a warning system that alerts a worker who is in a high-risk posture. Such a system could be used for training assessment, especially for new workers.

Author Contributions

Conceptualization, W.F.S., B.N.Y. and S.-L.L.; methodology, W.F.S. and B.N.Y.; software, W.F.S.; validation, W.F.S. and B.N.Y.; formal analysis, W.F.S.; investigation, W.F.S. and B.N.Y.; resources, B.N.Y. and S.-L.L.; data curation, W.F.S., B.N.Y. and S.-L.L.; writing—original draft preparation, W.F.S.; writing—review and editing, B.N.Y. and S.-L.L.; visualization, W.F.S.; supervision, B.N.Y. and S.-L.L.; project administration, B.N.Y. and S.-L.L.; funding acquisition, B.N.Y. and S.-L.L. All authors have read and agreed to the published version of the manuscript.

Funding

This research was supported by the Ministry of Trade, Industry, and Energy (MOTIE) and the Korea Institute for Advancement of Technology (KIAT) through the International Cooperative R&D program (Project No. P0022316), and by the Ministry of Science and ICT of the Republic of Korea and the National Research Foundation of Korea (NRF-2021R1F1A1060244 and NRF-2021R1F1A1047577).

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

All subjects included in the data collection signed the “Research Subject Confidentiality Agreement” form.

Data Availability Statement

Data are available upon request from the corresponding author due to ethical restrictions.

Conflicts of Interest

The authors declare no conflict of interest.

Appendix A

Table A1. List of finger joints.

Joint Label | Joint Name
L01–L04 | Left thumb joint
L05–L09 | Left index finger joint
L10–L14 | Left middle finger joint
L15–L19 | Left ring finger joint
L20–L24 | Left pinky finger joint
L25 | Left palm joint
R01–R04 | Right thumb joint
R05–R09 | Right index finger joint
R10–R14 | Right middle finger joint
R15–R19 | Right ring finger joint
R20–R24 | Right pinky finger joint
R25 | Right palm joint
Table A2. List of body joints.

Joint Label | Joint Name
01 | Head
02 | Neck
03 | Torso
04 | Waist
05 | Right hip
06 | Left hip
07 | Right shoulder
08 | Right elbow
09 | Right wrist
10 | Left shoulder
11 | Left elbow
12 | Left wrist
13 | Right knee
14 | Right ankle
15 | Left knee
16 | Left ankle
Table A3. The execution time required to determine the RULA score.

Body Section | All Data (Seconds) | 1 Pose (Seconds)
Upper arm | 378.3397958 | 0.00696489
Lower arm | 430.0206666 | 0.007916288
Wrist | 461.6637599 | 0.008498808
Neck | 322.8410137 | 0.005943208
Trunk | 297.4668169 | 0.005476092
Section A (Left) | 70.83219051 | 0.001303956
Section A (Right) | 63.70874691 | 0.00117282
Section A (Max) | 76.78008223 | 0.001413451
Section B | 64.08647466 | 0.001179773
Section C (Left) | 56.95984936 | 0.001048579
Section C (Right) | 51.78826308 | 0.000953375
Section C (Max) | 69.78941703 | 0.001284759
Total | 2344.277077 | 0.043156

References

1. Humadi, A.; Nazarahari, M.; Ahmad, R.; Rouhani, H. In-field instrumented ergonomic risk assessment: Inertial measurement units versus Kinect V2. Int. J. Ind. Ergon. 2021, 84, 103147.
2. Kee, D. Development and evaluation of the novel postural loading on the entire body assessment. Ergonomics 2021, 64, 1555–1568.
3. Li, L.; Xu, X. A deep learning-based RULA method for working posture assessment. Proc. Hum. Factors Ergon. Soc. Annu. Meet. 2019, 63, 1090–1094.
4. Nath, N.D.; Chaspari, T.; Behzadan, A.H. Automated ergonomic risk monitoring using body-mounted sensors and machine learning. Adv. Eng. Inform. 2018, 38, 514–526.
5. Merikh-Nejadasl, A.; El Makrini, I.; Van De Perre, G.; Verstraten, T.; Vanderborght, B. A generic algorithm for computing optimal ergonomic postures during working in an industrial environment. Int. J. Ind. Ergon. 2021, 84, 103145.
6. Wang, J.; Han, S.; Li, X. 3D fuzzy ergonomic analysis for rapid workplace design and modification in construction. Autom. Constr. 2020, 123, 103521.
7. Donisi, L.; Cesarelli, G.; Coccia, A.; Panigazzi, M.; Capodaglio, E.; D'Addio, G. Work-Related Risk Assessment According to the Revised NIOSH Lifting Equation: A Preliminary Study Using a Wearable Inertial Sensor and Machine Learning. Sensors 2021, 21, 2593.
8. Colim, A.; Faria, C.; Braga, A.C.; Sousa, N.; Rocha, L.; Carneiro, P.; Costa, N.; Arezes, P. Towards an Ergonomic Assessment Framework for Industrial Assembly Workstations—A Case Study. Appl. Sci. 2020, 10, 3048.
9. Huang, C.; Kim, W.; Zhang, Y.; Xiong, S. Development and Validation of a Wearable Inertial Sensors-Based Automated System for Assessing Work-Related Musculoskeletal Disorders in the Workspace. Int. J. Environ. Res. Public Health 2020, 17, 6050.
10. Norasi, H.; Tetteh, E.; Money, S.R.; Davila, V.J.; Meltzer, A.J.; Morrow, M.M.; Fortune, E.; Mendes, B.C.; Hallbeck, M.S. Intraoperative posture and workload assessment in vascular surgery. Appl. Ergon. 2020, 92, 103344.
11. Abobakr, A.; Nahavandi, D.; Hossny, M.; Iskander, J.; Attia, M.; Nahavandi, S.; Smets, M. RGB-D ergonomic assessment system of adopted working postures. Appl. Ergon. 2019, 80, 75–88.
12. Chatzis, T.; Konstantinidis, D.; Dimitropoulos, K. Automatic Ergonomic Risk Assessment Using a Variational Deep Network Architecture. Sensors 2022, 22, 6051.
13. Plantard, P.; Shum, H.P.; Le Pierres, A.-S.; Multon, F. Validation of an ergonomic assessment method using Kinect data in real workplace conditions. Appl. Ergon. 2017, 65, 562–569.
14. Ito, H.; Nakamura, S. Rapid prototyping for series of tasks in atypical environment: Robotic system with reliable program-based and flexible learning-based approaches. ROBOMECH J. 2022, 9, 1–14.
15. Li, L.; Martin, T.; Xu, X. A novel vision-based real-time method for evaluating postural risk factors associated with musculoskeletal disorders. Appl. Ergon. 2020, 87, 103138.
16. Massiris Fernández, M.; Fernández, J.; Bajo, J.M.; Delrieux, C.A. Ergonomic risk assessment based on computer vision and machine learning. Comput. Ind. Eng. 2020, 149, 106816.
17. Bao, T.; Zaidi, S.A.R.; Xie, S.; Yang, P.; Zhang, Z.-Q. A CNN-LSTM Hybrid Model for Wrist Kinematics Estimation Using Surface Electromyography. IEEE Trans. Instrum. Meas. 2020, 70, 1–9.
18. Bao, T.; Zaidi, S.A.R.; Xie, S.; Yang, P.; Zhang, Z.-Q. Inter-Subject Domain Adaptation for CNN-Based Wrist Kinematics Estimation Using sEMG. IEEE Trans. Neural Syst. Rehabil. Eng. 2021, 29, 1068–1078.
19. Senjaya, W.F.; Prathama, F.; Setiawan, F.; Prabono, A.G.; Yahya, B.N.; Lee, S.L. Automated RULA for a sequence of activities based on sensor data. In Proceedings of the 2020 Fall Conference of the Korean Society of Industrial Engineering, Seoul, Republic of Korea, 23–25 November 2020.
20. Xie, B.; Liu, H.; Alghofaili, R.; Zhang, Y.; Jiang, Y.; Lobo, F.D.; Li, C.; Li, W.; Huang, H.; Akdere, M.; et al. A Review on Virtual Reality Skill Training Applications. Front. Virtual Real. 2021, 2, 645153.
21. Charles, S.K. It's All in the Wrist: A Quantitative Characterization of Human Wrist Control; Massachusetts Institute of Technology: Cambridge, MA, USA, 2008.
22. Nweke, H.F.; Teh, Y.W.; Mujtaba, G.; Alo, U.R.; Al-Garadi, M.A. Multi-sensor fusion based on multiple classifier systems for human activity identification. Hum.-Centric Comput. Inf. Sci. 2019, 9, 1–44.
23. Das, B. Improved work organization to increase the productivity in manual brick manufacturing unit of West Bengal, India. Int. J. Ind. Ergon. 2021, 81, 103040.
24. Bortolini, M.; Gamberi, M.; Pilati, F.; Regattieri, A. Automatic assessment of the ergonomic risk for manual manufacturing and assembly activities through optical motion capture technology. Procedia CIRP 2018, 72, 81–86.
25. Nahavandi, D.; Hossny, M. Skeleton-free RULA ergonomic assessment using Kinect sensors. Intell. Decis. Technol. 2017, 11, 275–284.
26. Tamantini, C.; Cordella, F.; Lauretti, C.; Zollo, L. The WGD—A Dataset of Assembly Line Working Gestures for Ergonomic Analysis and Work-Related Injuries Prevention. Sensors 2021, 21, 7600.
27. Slyper, R.; Hodgins, J.K. Action Capture with Accelerometers. In Proceedings of the 2008 ACM SIGGRAPH/Eurographics Symposium on Computer Animation, Dublin, Ireland, 7–9 July 2008; pp. 193–199.
28. Mousas, C. Full-Body Locomotion Reconstruction of Virtual Characters Using a Single Inertial Measurement Unit. Sensors 2017, 17, 2589.
29. Lawrence, N.D. Gaussian Process Latent Variable Models for Visualisation of High Dimensional Data. In Proceedings of the 16th International Conference on Neural Information Processing Systems, Vancouver, BC, Canada, 8–13 December 2003.
30. Eom, H.; Choi, B.; Noh, J. Data-Driven Reconstruction of Human Locomotion Using a Single Smartphone. Comput. Graph. Forum 2014, 33, 11–19.
31. Brigante, C.M.N.; Abbate, N.; Basile, A.; Faulisi, A.C.; Sessa, S. Towards Miniaturization of a MEMS-Based Wearable Motion Capture System. IEEE Trans. Ind. Electron. 2011, 58, 3234–3241.
32. Doniyorbek, K.; Jung, K. Development of a Semi-Automatic Rapid Entire Body Assessment System using the Open Pose and a Single Working Image. In 2019 Fall Conference of the Korean Society of Industrial Engineering; Springer: New York, NY, USA, 2020; pp. 1503–1517.
33. Manghisi, V.M.; Uva, A.E.; Fiorentino, M.; Bevilacqua, V.; Trotta, G.F.; Monno, G. Real time RULA assessment using Kinect v2 sensor. Appl. Ergon. 2017, 65, 481–491.
34. Mehrizi, R.; Peng, X.; Xu, X.; Zhang, S.; Metaxas, D.; Li, K. A computer vision based method for 3D posture estimation of symmetrical lifting. J. Biomech. 2018, 69, 40–46.
35. Li, X.; Han, S.; Gül, M.; Al-Hussein, M. Automated post-3D visualization ergonomic analysis system for rapid workplace design in modular construction. Autom. Constr. 2018, 98, 160–174.
36. Bartnicka, J.; Zietkiewicz, A.A.; Kowalski, G.J. An ergonomics study on wrist posture when using laparoscopic tools in four techniques in minimally invasive surgery. Int. J. Occup. Saf. Ergon. 2018, 24, 438–449.
37. Sánchez-Margallo, J.A.; González, A.G.; Moruno, L.G.; Gómez-Blanco, J.C.; Pagador, J.B.; Sánchez-Margallo, F.M. Comparative Study of the Use of Different Sizes of an Ergonomic Instrument Handle for Laparoscopic Surgery. Appl. Sci. 2020, 10, 1526.
38. Onyebeke, L.C.; Young, J.G.; Trudeau, M.B.; Dennerlein, J.T. Effects of forearm and palm supports on the upper extremity during computer mouse use. Appl. Ergon. 2014, 45, 564–570.
39. Ma, C.; Weng, X.; Shan, Z. Early Classification of Multivariate Time Series Based on Piecewise Aggregate Approximation. Lect. Notes Comput. Sci. 2017, 10594, 81–88.
40. Middlesworth, M. A Step-by-Step Guide Rapid Upper Limb Assessment (RULA). Ergonomics Plus. 2012. Available online: www.ergo-plus.com (accessed on 19 July 2021).
41. Gellert, W.; Hellwich, M.; Kästner, H.; Küstner, H. (Eds.) VNR Concise Encyclopedia of Mathematics, 2nd ed.; Springer Science & Business Media: New York, NY, USA, 2012.
42. Cassisi, C.; Montalto, P.; Aliotta, M.; Cannata, A.; Pulvirenti, A. Similarity Measures and Dimensionality Reduction Techniques for Time Series Data Mining. In Advances in Data Mining Knowledge Discovery and Applications; IntechOpen: London, UK, 2012; pp. 71–96.
43. Ferguson, D. Therbligs: The Keys to Simplifying Work. The Gilbreth Network. 2000. Available online: http://web.mit.edu/allanmc/www/Therblgs.pdf (accessed on 28 December 2020).
44. Oyekan, J.; Hutabarat, W.; Turner, C.; Arnoult, C.; Tiwari, A. Using Therbligs to embed intelligence in workpieces for digital assistive assembly. J. Ambient Intell. Humaniz. Comput. 2019, 11, 2489–2503.
Figure 1. Workflow of Ergonomics Assessment.
Figure 2. Annotation used by the expert.
Figure 3. RULA Scoring Mechanism.
Figure 4. Finger joints.
Figure 5. Calculate wrist twist position (flexion–extension).
Figure 6. Wrist bending position (radial–ulnar deviation).
Figure 7. Wrist twist assessment: (a) wrist is twisted in mid-range (pronation–supination); (b) wrist is at or near the end of range.
Figure 8. Wrist twist (pronation–supination) angle calculation.
Figure 9. Wrist twist position in the range.
Figure 10. Body joints.
Figure 11. Laboratory Setup.
Figure 12. (a) Product; (b) Panel positioning for assembly.
Figure 13. List of Assembly Activity.
Figure 14. The example of activities to assemble the main panel.
Figure 15. RULA score distribution.
Figure 16. Installation of the side panel to the main panel.
Table 1. List of Related Works.

Ref. | Data | Body Joint | Finger Joint | Assessment Tools | Wrist Score
[15] | Image | 17 | - | RULA | Set manually
[16] | Image and video | 25 | - | RULA | Set manually
[3] | Image | 17 | - | RULA | Set manually
[11] | Image | - | - | RULA | Not available
[33] | Skeleton | 25 | - | RULA | Set manually
[13] | Skeleton | 12 | - | RULA | Set manually
[34] | Image | 26 | - | RULA | Not available
[32] | Image | 17 | - | REBA | Set manually
[35] | 3D Model | 22 | - | RULA and REBA | Not available
[10] | Survey and wearable sensors | 4 | - | RULA | Not available
[24] | Skeleton | 25 | - | EAWS | Not available
[9] | Wearable inertial sensors | 17 | - | RULA and REBA | Not available
Ours | Body-tracking sensor and hand-tracking sensor | 16 | 50 | RULA | Calculated by system
Table 2. RULA Risk level.

Score | Risk Level | Action to Be Taken
1–2 | Negligible | Acceptable posture if it is not repeated for a longer period
3–4 | Low | Further investigation and change may be needed in future
5–6 | Medium | The investigation and change are required soon
7 | High | The investigation and change are required immediately
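The score-to-risk mapping in Table 2 is straightforward to encode; a minimal sketch follows (the function name is ours, not from the paper):

```python
def rula_risk_level(score):
    """Map a RULA grand score (1-7) to the risk level in Table 2."""
    if score <= 2:
        return "Negligible"   # acceptable if not held for long periods
    if score <= 4:
        return "Low"          # further investigation, change may be needed
    if score <= 6:
        return "Medium"       # investigation and change required soon
    return "High"             # investigation and change required immediately

print(rula_risk_level(6))  # Medium
```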
Table 3. RULA wrist posture scoring criteria.

Wrist Posture | Wrist Kinematics | Side: Formula (Joints) | Score
Wrist: wrist position | flexion–extension | Right: R13, R14; Left: L13, L14 | +1 (0°); +2 (15° up, 15° down); +3 (>15° up, >15° down)
Wrist: wrist is bent from midline | radial–ulnar deviation | Right: R05, R04, R24; Left: L05, L04, L24 | +1 (15° left, 15° right)
Wrist Twist: wrist is twisted in mid-range | pronation–supination | Right: IPRnv, CPRnv; Left: IPLnv, CPLnv | +1 (75°, 105°)
Wrist Twist: wrist is at or near the end of the range | range of pronation–supination | Right: IPRnv, CPRnv; Left: IPLnv, CPLnv | +2 (105°, 165°)
Table 4. RULA body posture score criteria.

Body Regions | Posture | Side: Formula (Joints) | Score
Upper Arm | Upper arm position | Right: 08, 07, 05; Left: 11, 10, 06 | +1 (−20°, 20°); +2 (−∞, −20°); +2 (20°, 45°); +3 (45°, 90°); +4 (90°, ∞)
Upper Arm | Shoulder is raised | Right: 04, 02, 07; Left: 04, 19, 02 | +1 (90°, ∞)
Upper Arm | Upper arm is abducted | Right: 02, 07, 08; Left: 02, 10, 11 | +1 (20°, ∞)
Lower Arm | Lower arm position | Right: 09, 08, 07; Left: 12, 11, 10 | +1 (60°, 100°); +2 (0°, 60°); +2 (100°, ∞)
Lower Arm | Arm is working across the midline | Right: 07, 03, 09; Left: 10, 03, 12 | +1 (90°, ∞)
Lower Arm | Arm is out to the side of the body | Right: 08, 07, 05; Left: 11, 10, 06 | +1 (30°, ∞)
Neck | Neck position | 01, 02, 04 | +1 (0°, 10°); +2 (10°, 20°); +3 (20°, ∞); +4 (−∞, 0°)
Neck | Neck is side bending | 90 − (10, 02, 01) | +1 (20°, ∞)
Trunk | Trunk position | 180 − (01, 04, [0,0,1]) | +1 (0°); +2 (0°, 20°); +3 (20°, 60°); +4 (60°, ∞)
Trunk | Trunk is twisted | 04, 06, NV (02, 05, 06) | +1 (20°, ∞) to left and right
Trunk | Trunk is side bending | Right: 02, 04, 05; Left: 02, 04, 06 | +1 (20°, ∞)
Table 5. Similarity measurement results for section A (right and left).

High-Level Activity | Upper Arm (R) | Upper Arm (L) | Lower Arm (R) | Lower Arm (L) | Wrist Position (R) | Wrist Position (L) | Wrist Twist (R) | Wrist Twist (L) | AVG
Assemble side panel | 0.891 | 0.902 | 0.914 | 0.953 | 0.94 | 0.947 | 0.846 | 0.916 | 0.914
Assemble main panel | 0.87 | 0.873 | 0.853 | 0.861 | 0.941 | 0.942 | 0.971 | 0.837 | 0.894
Prepare the workspace | 0.882 | 0.914 | 0.917 | 0.882 | 0.934 | 0.939 | 0.837 | 0.89 | 0.899
Integrate panel | 0.831 | 0.873 | 0.839 | 0.822 | 0.954 | 0.947 | 0.821 | 0.895 | 0.873
Slide the mid-panel | 0.865 | 0.909 | 0.876 | 0.856 | 0.944 | 0.951 | 0.833 | 0.907 | 0.893
AVG | 0.87 | 0.89 | 0.88 | 0.87 | 0.94 | 0.95 | 0.86 | 0.89 |
Table 6. Similarity measurement results for section A (Max score).

High-Level Activity | Upper Arm | Lower Arm | Wrist Position | Wrist Twist | AVG
Assemble side panel | 0.926 | 0.865 | 0.979 | 0.961 | 0.933
Assemble main panel | 0.897 | 0.845 | 0.971 | 0.968 | 0.920
Prepare the workspace | 0.881 | 0.832 | 0.963 | 0.949 | 0.906
Integrate panel | 0.899 | 0.824 | 0.981 | 0.966 | 0.918
Slide the mid-panel | 0.861 | 0.849 | 0.979 | 0.968 | 0.914
AVG | 0.893 | 0.843 | 0.975 | 0.962 |
Table 7. Previous study similarity measurement results for wrist section.

High-Level Activity | Wrist Position (Previous Study [19]) | Wrist Twist (Previous Study [19]) | Wrist Position (This Study) | Wrist Twist (This Study)
Assemble side panel | 0.979 | 0.958 | 0.979 | 0.961
Assemble main panel | 0.915 | 0.966 | 0.971 | 0.968
Prepare the workspace | 0.869 | 0.948 | 0.963 | 0.949
Integrate panel | 0.863 | 0.956 | 0.981 | 0.966
Slide the mid-panel | 0.908 | 0.968 | 0.979 | 0.968
AVG | 0.907 | 0.959 | 0.975 | 0.962
Table 8. Similarity measurement results for section B.

High-Level Activity | Neck | Trunk | AVG
Assemble side panel | 0.839 | 0.828 | 0.834
Assemble main panel | 0.819 | 0.805 | 0.812
Prepare the workspace | 0.806 | 0.835 | 0.821
Integrate panel | 0.81 | 0.808 | 0.809
Slide the mid-panel | 0.836 | 0.83 | 0.833
AVG | 0.82 | 0.82 |
Table 9. Similarity measurement results for grand RULA score.

High-Level Activity | Grand Score (Right) | Grand Score (Left) | Grand Score (General)
Assemble side panel | 0.88 | 0.884 | 0.899
Assemble main panel | 0.878 | 0.881 | 0.898
Prepare the workspace | 0.852 | 0.843 | 0.861
Integrate panel | 0.89 | 0.882 | 0.908
Slide the mid-panel | 0.869 | 0.858 | 0.882
AVG | 0.87 | 0.87 | 0.89
Table 10. t-test results for subjects 1–6.

 | S1 | S2 | S3 | S4 | S5 | S6
S1 | 1.0 | 0.74 × 10⁻³ | 1.41 × 10⁻¹⁵ | 2.53 × 10⁻⁷ | 9.77 × 10⁻¹³ | 1.0
S2 | 0.74 × 10⁻³ | 1.0 | 2.32 × 10⁻⁷ | 6.54 × 10⁻² | 0.18 × 10⁻³ | 0.74 × 10⁻³
S3 | 1.41 × 10⁻¹⁵ | 2.32 × 10⁻⁷ | 1.0 | 0.27 × 10⁻³ | 2.48 × 10⁻² | 1.41 × 10⁻¹⁵
S4 | 2.53 × 10⁻⁷ | 6.54 × 10⁻² | 0.27 × 10⁻³ | 1.0 | 6.32 × 10⁻² | 2.53 × 10⁻⁷
S5 | 9.77 × 10⁻¹³ | 0.18 × 10⁻³ | 2.48 × 10⁻² | 6.32 × 10⁻² | 1.0 | 9.77 × 10⁻¹³
S6 | 1.0 | 0.74 × 10⁻³ | 1.41 × 10⁻¹⁵ | 2.53 × 10⁻⁷ | 9.77 × 10⁻¹³ | 1.0
S7 | 8.53 × 10⁻¹ | 0.22 × 10⁻³ | 1.04 × 10⁻¹⁶ | 2.76 × 10⁻⁸ | 4.06 × 10⁻¹⁴ | 8.53 × 10⁻¹
S8 | 4.36 × 10⁻³ | 5.11 × 10⁻¹ | 5.05 × 10⁻⁹ | 0.99 × 10⁻² | 5.55 × 10⁻⁶ | 0.44 × 10⁻²
S9 | 6.30 × 10⁻⁸ | 4.57 × 10⁻² | 0.24 × 10⁻³ | 9.24 × 10⁻¹ | 6.87 × 10⁻² | 6.30 × 10⁻⁸
S10 | 9.07 × 10⁻¹ | 0.32 × 10⁻³ | 7.12 × 10⁻¹⁸ | 2.59 × 10⁻⁸ | 1.25 × 10⁻¹⁴ | 9.07 × 10⁻¹
S11 | 9.79 × 10⁻⁸ | 5.49 × 10⁻² | 0.19 × 10⁻³ | 9.48 × 10⁻¹ | 6.90 × 10⁻² | 9.79 × 10⁻⁸
S12 | 1.01 × 10⁻¹ | 8.66 × 10⁻² | 8.12 × 10⁻¹¹ | 0.43 × 10⁻³ | 5.85 × 10⁻⁸ | 1.01 × 10⁻¹
Table 11. t-test results (p-values) for subjects 7–12.

| | S7 | S8 | S9 | S10 | S11 | S12 |
|---|---|---|---|---|---|---|
| S1 | 8.53 × 10^−1 | 0.44 × 10^−2 | 6.30 × 10^−8 | 9.07 × 10^−1 | 9.79 × 10^−8 | 1.01 × 10^−1 |
| S2 | 0.22 × 10^−3 | 5.11 × 10^−1 | 0.46 × 10^−1 | 0.32 × 10^−3 | 5.49 × 10^−2 | 0.87 × 10^−1 |
| S3 | 1.04 × 10^−16 | 5.05 × 10^−9 | 0.24 × 10^−3 | 7.12 × 10^−18 | 0.19 × 10^−3 | 8.12 × 10^−11 |
| S4 | 2.76 × 10^−8 | 0.99 × 10^−2 | 9.24 × 10^−1 | 2.59 × 10^−8 | 9.48 × 10^−1 | 0.43 × 10^−3 |
| S5 | 4.06 × 10^−14 | 5.55 × 10^−6 | 0.69 × 10^−1 | 1.25 × 10^−14 | 0.69 × 10^−1 | 5.85 × 10^−8 |
| S6 | 8.53 × 10^−1 | 0.436 × 10^−2 | 6.30 × 10^−8 | 9.07 × 10^−1 | 9.79 × 10^−8 | 1.01 × 10^−1 |
| S7 | 1.0 | 0.150 × 10^−2 | 5.49 × 10^−9 | 7.45 × 10^−1 | 1.48 × 10^−8 | 0.59 × 10^−1 |
| S8 | 0.15 × 10^−2 | 1.0 | 0.58 × 10^−2 | 0.25 × 10^−2 | 0.84 × 10^−2 | 2.56 × 10^−1 |
| S9 | 5.49 × 10^−9 | 0.58 × 10^−2 | 1.0 | 5.06 × 10^−9 | 9.78 × 10^−1 | 0.19 × 10^−3 |
| S10 | 7.45 × 10^−1 | 0.25 × 10^−2 | 5.06 × 10^−9 | 1.0 | 1.07 × 10^−8 | 0.92 × 10^−1 |
| S11 | 1.48 × 10^−8 | 0.84 × 10^−2 | 9.78 × 10^−1 | 1.06 × 10^−8 | 1.0 | 0.31 × 10^−3 |
| S12 | 5.87 × 10^−2 | 2.56 × 10^−1 | 0.19 × 10^−3 | 0.91 × 10^−1 | 0.31 × 10^−3 | 1.0 |
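Tables 10 and 11 report pairwise p-values across the twelve subjects; small values (e.g. below 0.05) indicate that two subjects' score distributions differ significantly, while the diagonal is 1.0 by construction. As a rough, stdlib-only sketch of how such a matrix can be produced — the paper does not specify its test implementation, so this assumes Welch's two-sample t-test with a normal approximation to the t-distribution (acceptable given the hundreds of samples per subject in Table 12):

```python
import math
from statistics import mean, variance

def welch_p(a, b):
    """Approximate two-sided p-value of Welch's t-test, using the
    standard normal in place of the t-distribution (large samples)."""
    se = math.sqrt(variance(a) / len(a) + variance(b) / len(b))
    t = (mean(a) - mean(b)) / se
    # two-sided p via the normal survival function: erfc(|t| / sqrt(2))
    return math.erfc(abs(t) / math.sqrt(2))

def pairwise_matrix(groups):
    """p-value matrix over a dict of subject label -> list of scores;
    the diagonal is fixed at 1.0, as in Tables 10 and 11."""
    keys = list(groups)
    return {i: {j: (1.0 if i == j else welch_p(groups[i], groups[j]))
                for j in keys} for i in keys}
```

Note the matrix is symmetric up to rounding, which matches the mirrored off-diagonal entries in the published tables.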
Table 12. Number of samples.

| High-Level Activity | Number of Samples |
|---|---|
| Assemble side panel | 458 |
| Assemble main panel | 1936 |
| Integrate panel | 464 |
| Prepare the workspace | 1257 |
| Slide the mid-panel | 614 |