Article

Research on Identifying Robot Collision Points in Human–Robot Collaboration Based on Force Method Principle Solving

1 College of Mechanical Engineering, North China University of Science and Technology, Tangshan 063210, China
2 Hebei Province Research Institute of Industrial Robot Industry Technology, Tangshan 063210, China
* Author to whom correspondence should be addressed.
Actuators 2023, 12(8), 320; https://doi.org/10.3390/act12080320
Submission received: 16 July 2023 / Revised: 3 August 2023 / Accepted: 5 August 2023 / Published: 9 August 2023

Abstract: After years of rigid, conventional production processes, the traditional manufacturing industry has been moving toward flexible and intelligent manufacturing. After more than half a century of development, robotics has penetrated all aspects of human production and life, bringing significant changes to human society. Human–machine cooperative operation has therefore become a research hotspot, and how to realize an integrated human–machine safety system has attracted wide attention. Human–machine integration means that humans and robots work in the same natural space, coordinating closely and interacting naturally. To realize true human–robot integration during cooperative operation, two capabilities are key: reliably distinguishing intentional interaction from accidental collision, and identifying the collision joint and collision point when an accidental collision occurs. In this paper, we propose a method that identifies the collision joint by detecting real-time current changes in each joint of the robot and then solves for the collision point location on that joint through the principle of virtual displacement and the force method, taking the data from a force sensor installed at the robot base as the known condition. The results show that the proposed approach of identifying the collision joint from joint-current changes and then solving the collision point location with the collision detection algorithm is correct and reliable.

1. Introduction

With the development of basic theories and key technologies for human–machine collaboration and human–machine integration, research focusing on the scientific issues of “human–machine–environment multimodal perception and interaction” is at a critical stage. The construction of a human–machine and machine–machine interaction system can promote the progress of robotics and theory while also meeting the needs of contemporary intelligent development [1,2,3]. Establishing collision detection models, deriving force control algorithms during robot motion, and establishing robot collision recognition and fast response mechanisms are the keys to ensuring safety during human–machine collaboration [4,5,6,7,8].
Common collision detection methods include installing optoelectronic, force-controlled, and joint torque sensors on specific parts of the robot [9,10,11,12] or detecting the relative position between robot and human through vision systems, imaging systems, ranging systems, and bounding-box methods [13,14,15], ensuring the safety of human–robot collaboration systems through pre-collision safety mechanisms. However, some of these systems are complex in composition, require many types of sensors, have poor backward compatibility, have long fusion response times, and incur high system costs, making them unsuitable for large-scale adoption. Bharadwaja et al. [16] proposed a neural network-based path planning control system for service robots; compared with the traditional probabilistic roadmap (PRM) algorithm, the robot's obstacle distance prediction accuracy improved by about 36%. In addition, a vision system can model the surrounding environment in 3D, and a genetic algorithm can generate the optimal choice of forward route for the robot. A. Lomakin et al. [17] proposed a collision detection method for rigid robots that considers both collision and external force and provides an evaluable expression for estimating the calculation error caused by numerical error and parameter uncertainty, with applications in research on human–robot collaboration and robot safety. T. Lindner et al. [18] proposed a Reinforcement Learning (RL)-based collision avoidance strategy that controls the robot's movement through trial-and-error interaction with the environment and achieves obstacle avoidance by combining the Deep Deterministic Policy Gradient (DDPG) algorithm with Hindsight Experience Replay (HER). Alchan Yun et al. [19] designed a strain gauge-based cylindrical three-axis sensor that detects external force information when attached to the robot; experiments verified that measuring force information with this sensor structure can improve the efficiency of collision detection. In the literature [20,21], colored spheres are installed on the human shoulders, head, limbs, etc.; an IP camera captures data to estimate human pose and position during the HRC task, and the DLT method estimates the 3D positions of the colored spheres on the human body, avoiding collisions through image processing techniques. S. A. Gadsden et al. [22] proposed the smooth variable structure filter (SVSF), which offers stronger accuracy and robustness in collision recognition and fault diagnosis for human–machine interaction systems with modeling uncertainty and has been applied in the field of aerospace brakes.
Compared with vision sensors, multi-sensor fusion, and energy-based methods for collision detection, using force sensors for collision recognition has the advantages of direct measurement, simple algorithms, ease of decoupling, and real-time response, and it is an important detection approach in the field of human–machine integration. In this paper, we propose a collision detection algorithm based on the readings of a six-axis force/torque sensor installed at the robot base after static/dynamic force compensation. Once the collision joint is identified from real-time feedback of each joint's current, the force method principle and the virtual displacement principle are applied to locate the collision point. Finally, the force compensation algorithm and collision detection algorithm are verified experimentally on the experimental platform.

2. Model Building

To perform the collision detection task, a six-axis force/torque sensor, the KWR200X, was mounted at the base of the robot. It is a large-range multi-dimensional force sensor with a built-in high-precision embedded data acquisition system, which can measure and transmit the orthogonal forces $F(F_x, F_y, F_z)$ and orthogonal moments $M(M_x, M_y, M_z)$ in three directions in real time. It is made of high-strength alloy steel with strong bending resistance. The sensor is bolted to the table, and the robot is bolted to the sensor for better monitoring of experimental data. We define the robot base coordinate system as $O_0X_0Y_0Z_0$, where the $Z_0$ direction is opposite to gravity, i.e., vertically upward. We define the six-axis force/torque sensor coordinate system as $O_sX_sY_sZ_s$; it can be obtained by rotating the robot base coordinate system around the $X_0$ axis by the angle $\theta$. The attitude transformation matrix is shown in Equation (1), and the position relationship is shown in Figure 1.
$$R_s^0 = \begin{bmatrix} 1 & 0 & 0 \\ 0 & \cos\theta & -\sin\theta \\ 0 & \sin\theta & \cos\theta \end{bmatrix} \tag{1}$$
When the robot works normally, each linkage is a statically determinate structure fixed at both ends. When a robot linkage collides with the external environment, with the detected external force defined as $F_i$, the originally determinate structure becomes statically indeterminate to the third degree. The six-axis force/torque sensor installed at the robot base can measure the orthogonal forces and orthogonal moments in three directions, $F(F_x, F_y, F_z)$ and $M(M_x, M_y, M_z)$, respectively.
Therefore, the cosine of the magnitude and direction of the combined force is
$$F_i = \sqrt{F_x^2 + F_y^2 + F_z^2} \tag{2}$$
$$\alpha = \cos\langle F_i, i\rangle = \frac{F_x}{\left|F_i\right|},\quad \beta = \cos\langle F_i, j\rangle = \frac{F_y}{\left|F_i\right|},\quad \gamma = \cos\langle F_i, k\rangle = \frac{F_z}{\left|F_i\right|} \tag{3}$$
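As a quick numerical check of the resultant magnitude and direction cosines above, consider the following sketch (pure Python; the sample force components are illustrative, not values from the paper's experiments):

```python
import math

def resultant_and_direction_cosines(fx, fy, fz):
    """Resultant force magnitude (Eq. 2) and its direction cosines (Eq. 3)."""
    f = math.sqrt(fx**2 + fy**2 + fz**2)
    return f, (fx / f, fy / f, fz / f)

# Illustrative sensor reading in newtons (made-up values).
f, (ca, cb, cg) = resultant_and_direction_cosines(3.0, 4.0, 12.0)
print(f)                      # 13.0
print(ca**2 + cb**2 + cg**2)  # ~1.0 (direction cosines have unit norm)
```

The unit-norm property of the direction cosines provides a convenient sanity check on any measured force vector.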
The location information of the collision point can be expressed as
$$M = P \times F \tag{4}$$
$P(P_x, P_y, P_z)$ in Equation (4) is the position vector of the collision point relative to the six-axis force/torque sensor coordinate system. From the force and moment vectors in Equation (4) alone, the collision point coordinates cannot be uniquely determined, and other methods are needed to complete the collision point identification task.
As shown in Figure 1, the directions of the three axes of the six-axis force/torque sensor coordinate system are known. When a collision occurs, the robot linkage coordinate system must be re-established from the original sensor coordinate system. We define the robot linkage coordinate system as $O_nX_nY_nZ_n$; it is obtained from the sensor coordinate system by rotating first around $X_s$ by the angle $\alpha$, then around $Y_s$ by $\beta$, and finally around $Z_s$ by $\gamma$. Its attitude transformation matrix is shown in Equation (5).
$$R_n^s(\alpha,\beta,\gamma) = R_{Z_s}(\gamma)\,R_{Y_s}(\beta)\,R_{X_s}(\alpha) = \begin{bmatrix} c\gamma\, c\beta & c\gamma\, s\beta\, s\alpha - s\gamma\, c\alpha & c\gamma\, s\beta\, c\alpha + s\gamma\, s\alpha \\ s\gamma\, c\beta & s\gamma\, s\beta\, s\alpha + c\gamma\, c\alpha & s\gamma\, s\beta\, c\alpha - c\gamma\, s\alpha \\ -s\beta & c\beta\, s\alpha & c\beta\, c\alpha \end{bmatrix} \tag{5}$$
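The composition of the three elementary rotations in Equation (5) can be sketched in pure Python (a minimal illustration; the angle values below are hypothetical):

```python
import math

def rot_x(a):
    c, s = math.cos(a), math.sin(a)
    return [[1, 0, 0], [0, c, -s], [0, s, c]]

def rot_y(b):
    c, s = math.cos(b), math.sin(b)
    return [[c, 0, s], [0, 1, 0], [-s, 0, c]]

def rot_z(g):
    c, s = math.cos(g), math.sin(g)
    return [[c, -s, 0], [s, c, 0], [0, 0, 1]]

def matmul(a, b):
    return [[sum(a[i][k] * b[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def r_n_s(alpha, beta, gamma):
    """Attitude matrix of Eq. (5): rotate about Xs, then Ys, then Zs."""
    return matmul(rot_z(gamma), matmul(rot_y(beta), rot_x(alpha)))

R = r_n_s(0.3, 0.5, 0.7)
# The (3,1) entry of the composed matrix is -sin(beta), as in Eq. (5).
print(abs(R[2][0] + math.sin(0.5)) < 1e-12)  # True
```

Checking individual entries (and the orthonormality of the product) against the closed-form matrix catches sign errors in the rotation order early.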
Here, $s\alpha$ is shorthand for $\sin\alpha$ and $c\alpha$ for $\cos\alpha$. The tilt angle of the collaborative robot base mounting and the zero offset of the six-axis force/torque sensor introduce a discrepancy between the force measured by the sensor and the actual magnitude of the external collision force. In the literature [23], this compensation method is verified, and the base mounting inclination angle $\theta$ can be found there. The coordinates at each joint are known from the factory, and combining them with the positions of the coordinate systems defined above, the angles $\alpha$, $\beta$, and $\gamma$ can be found. According to the Cartesian coordinate transformation rule, the directional force components measured by the six-axis force/torque sensor can be expressed in the linkage coordinate system as
$$\begin{bmatrix} F_x^n \\ F_y^n \\ F_z^n \end{bmatrix} = \left[R_n^s(\alpha,\beta,\gamma)\right]^{T} \begin{bmatrix} F_x \\ F_y \\ F_z \end{bmatrix} \tag{6}$$
The $X_n$ direction is perpendicular to axis $i$ and to the plane containing linkage $i$; the $Y_n$ direction is parallel to linkage $i$ and points toward axis $i+1$; and the $Z_n$ direction is parallel to axis $i$ and lies in the plane containing linkage $i$.
As shown in Figure 2, robot linkage $i$ is subjected to the external force $F_i$ after the collision. The force component along $F_x^n$ acts in the same manner as the component along $F_z^n$; when analyzing the force on linkage $i$, the effect of $F_z^n$ is used as an example to derive the result, and the two contributions are then summed algebraically.
In the figure, point $C_i$ is the position of the collision point where force $F_i$ strikes linkage $i$; point $D$ is the intersection of axis $i$ with the extension line of axis $i+1$; $a$ is the distance from the axis center positions $A_i$ and $B_i$ to point $D$; $\alpha$ is the angle between the two extension lines; $\beta$ is the angle between a line in the plane containing $F_z^n$ and $F_x^n$ and the extension line of axis $i+1$; and $\gamma$ is an angle variable used in the force method solution of the statically indeterminate structure.

3. Force Compensation Algorithm

When the robot is at rest, the six-axis force/torque sensor mounted at the base outputs different values in each direction depending on posture, owing to the gravity of the robot body. Therefore, to identify the collision point at rest, gravity must be compensated so that the sensor reads zero when the robot is stationary. Then, when the sensor values change irregularly, it can be concluded that an accidental collision has occurred between the robot and the external environment.
To establish the position and transformation relationships between adjacent links of the collaborative robot with the D-H parameter method, the parameters and variables of each link are defined first. $\alpha_{i-1}$ is the angle of rotation from $Z_{i-1}$ to $Z_i$ around the $X_{i-1}$ axis; $a_{i-1}$ is the distance from $Z_{i-1}$ to $Z_i$ along the $X_{i-1}$ axis; $d_i$ is the distance from $X_{i-1}$ to $X_i$ along the $Z_i$ axis; and $\theta_i$ is the angle of rotation from $X_{i-1}$ to $X_i$ around the $Z_i$ axis. Figure 3 shows the D-H coordinate space of two adjacent joints.
In the robot linkage coordinate system, coordinate system $i-1$ can be obtained from coordinate system $i$ by four homogeneous transformations; therefore, the homogeneous transformation matrix between two adjacent linkage coordinate systems $i-1$ and $i$ is
$$T_i^{i-1} = Rot(X,\alpha_{i-1})\,Trans(X,a_{i-1})\,Trans(Z,d_i)\,Rot(Z,\theta_i) = \begin{bmatrix} c\theta_i & -s\theta_i & 0 & a_{i-1} \\ s\theta_i\, c\alpha_{i-1} & c\theta_i\, c\alpha_{i-1} & -s\alpha_{i-1} & -d_i\, s\alpha_{i-1} \\ s\theta_i\, s\alpha_{i-1} & c\theta_i\, s\alpha_{i-1} & c\alpha_{i-1} & d_i\, c\alpha_{i-1} \\ 0 & 0 & 0 & 1 \end{bmatrix} \tag{7}$$
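Equation (7) translates directly into a small helper function (a Python sketch; the parameter values used below are placeholders, not the robot's actual link parameters):

```python
import math

def dh_transform(alpha_prev, a_prev, d, theta):
    """Homogeneous transform T_i^{i-1} of Eq. (7), modified D-H convention."""
    ct, st = math.cos(theta), math.sin(theta)
    ca, sa = math.cos(alpha_prev), math.sin(alpha_prev)
    return [
        [ct,      -st,      0.0,  a_prev],
        [st * ca,  ct * ca, -sa,  -d * sa],
        [st * sa,  ct * sa,  ca,   d * ca],
        [0.0,      0.0,      0.0,  1.0],
    ]

# With alpha_prev = 0, the last column places the link origin at
# x = a_prev and z = d in frame i-1 (illustrative parameter values).
T = dh_transform(0.0, 0.3, 0.1, 0.0)
print(T[0][3], T[2][3])
```

Chaining such matrices with matrix multiplication gives the cumulative transform of Equation (8).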
By substituting the parameters of each robot linkage into Equation (7), the homogeneous transformation matrix of the specified linkage can be obtained. The transformation matrix of coordinate system $i$ with respect to the base coordinate system can then be expressed as
$$T_i^0 = T_1^0\, T_2^1\, T_3^2 \cdots T_{i-1}^{i-2}\, T_i^{i-1} = \begin{bmatrix} n_x & o_x & a_x & p_x \\ n_y & o_y & a_y & p_y \\ n_z & o_z & a_z & p_z \\ 0 & 0 & 0 & 1 \end{bmatrix} \tag{8}$$
where $R = \begin{bmatrix} n_x & o_x & a_x \\ n_y & o_y & a_y \\ n_z & o_z & a_z \end{bmatrix}$ denotes the rotation part of the transformation and $p = \begin{bmatrix} p_x & p_y & p_z \end{bmatrix}^T$ denotes its position part.
We define the position vector of the center of mass c i of any linkage i with respect to the joint coordinate system i as
$$c_i = \begin{bmatrix} c_{ix} & c_{iy} & c_{iz} \end{bmatrix}^T \tag{9}$$
The transformation matrix of the connecting rod coordinate system i with respect to the base coordinate system can be represented by R i 0 . The connecting rod center-of-mass coordinate system c i has the same direction as the coordinate system i , and its position relationship is shown in Figure 1. If the gravity vector of a robot linkage on its center-of-mass coordinate system is
$$G_i = \begin{bmatrix} 0 & 0 & -G_i \end{bmatrix}^T \tag{10}$$
then the homogeneous transformation matrix of the linkage's center-of-mass coordinate system $c_i$ with respect to the base coordinate system is
$$T_{c_i}^0 = T_i^0\, T_{c_i}^i = \begin{bmatrix} R_{c_i}^0 & p_{c_i}^0 \\ 0 & 1 \end{bmatrix} \tag{11}$$
where $p_{c_i}^0 = \begin{bmatrix} p_{cx} & p_{cy} & p_{cz} \end{bmatrix}^T$.
Then, when the robot is in the static state, its own gravity vector, when transformed from the linkage coordinate system to the base coordinate system, is expressed by f 0 as
$$f^0 = \begin{bmatrix} 0 & 0 & -\sum_{i=1}^{n} G_i \end{bmatrix}^T \tag{12}$$
Its moment vector, when converted from the linkage coordinate system to the base coordinate system, is expressed in m 0 as
$$m^0 = \begin{bmatrix} -\sum_{i=1}^{n} G_i\, p_{cy,i} & \sum_{i=1}^{n} G_i\, p_{cx,i} & 0 \end{bmatrix}^T \tag{13}$$
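Under the sign convention that gravity acts along $-Z_0$, the static gravity wrench above can be sketched as follows (pure Python; the link weight and centroid position used are illustrative values, not the robot's):

```python
def static_gravity_wrench(links):
    """links: list of (G_i, p_cx, p_cy, p_cz) -- link weight [N] and centroid
    position in the base frame [m]. Returns the gravity force and moment
    vectors (f0, m0) expressed in the base frame."""
    fz = -sum(G for G, _, _, _ in links)
    mx = -sum(G * py for G, _, py, _ in links)  # (p x f)_x with f = (0, 0, -G)
    my = sum(G * px for G, px, _, _ in links)   # (p x f)_y
    return (0.0, 0.0, fz), (mx, my, 0.0)

# One 10 N link with centroid at (0.2, 0.1, 0.5) m -- made-up numbers.
f0, m0 = static_gravity_wrench([(10.0, 0.2, 0.1, 0.5)])
print(f0)  # (0.0, 0.0, -10.0)
print(m0)  # (-1.0, 2.0, 0.0)
```

Note that the z-coordinate of each centroid drops out of the moment, since gravity is parallel to the z axis.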
Thus, the transformation matrices of the force and moment vectors in the force sensor coordinate system and the base coordinate system can be obtained as
$$\begin{bmatrix} f^s \\ m^s \end{bmatrix} = \begin{bmatrix} R_0^s & 0 \\ \left[P_{0ORG}^s\right]_{\times} R_0^s & R_0^s \end{bmatrix} \begin{bmatrix} f^0 \\ m^0 \end{bmatrix} \tag{14}$$
where $R_0^s = \begin{bmatrix} 1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 1 \end{bmatrix}$ and $\left[P_{0ORG}^s\right]_{\times} = \begin{bmatrix} 0 & -p_z & p_y \\ p_z & 0 & -p_x \\ -p_y & p_x & 0 \end{bmatrix} = \begin{bmatrix} 0 & -h & 0 \\ h & 0 & 0 \\ 0 & 0 & 0 \end{bmatrix}$, with $h$ the offset between the two frame origins along the $Z$ axis.
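The wrench transformation above is the standard force/moment frame change; a minimal Python sketch (with $R_0^s = I$, as in the text, and a hypothetical offset $h$) is:

```python
def cross(a, b):
    """Cross product of two 3-vectors."""
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def transform_wrench(f0, m0, p):
    """Frame change of Eq. (14) with R_0^s = I:
    f^s = f^0,  m^s = p x f^0 + m^0."""
    fs = f0
    pxf = cross(p, f0)
    ms = tuple(pxf[i] + m0[i] for i in range(3))
    return fs, ms

# Frame origins offset by h = 2 m along Z (illustrative value).
fs, ms = transform_wrench((1.0, 0.0, 0.0), (0.0, 0.0, 0.0), (0.0, 0.0, 2.0))
print(fs)  # (1.0, 0.0, 0.0)
print(ms)  # (0.0, 2.0, 0.0)
```

The force is unchanged by a pure translation; only the moment picks up the lever-arm term $p \times f$.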
The robot’s speed and acceleration change over time during normal operation, so gravity compensation alone cannot meet the requirements. After the dynamic force compensation, the sensor reading will be zero, so that if a collision occurs, the collision point can be identified and calculated using the compensated data.
The Newton–Euler method can build a rigid robot dynamics model using only the velocity, angular velocity, and rotational inertia of each robot linkage. The velocity and acceleration of each linkage are obtained by an outward recursion starting from the robot base, and the generalized forces of interaction between adjacent linkages are then obtained by an inward recursion from the outermost linkage.
For a single operator arm linkage i of the robot, the forces are shown in Figure 4.
We define the mass of each robot arm linkage as $m_i\ (i=1,2,\dots,6)$ and the position vector of the center of mass $c_i$ of robot joint $i$ with respect to joint coordinate system $i$ as $r_i\ (i=1,2,\dots,6)$. We define the displacement of robot joint $i$ as $\theta_i$, its velocity as $\dot{\theta}_i$, and its acceleration as $\ddot{\theta}_i$. The outward recursion of Equation (15) mainly solves for the acceleration of the center of mass of each robot joint.
$$\begin{aligned} \omega_{i+1}^{i+1} &= R_i^{i+1}\,\omega_i^i + \dot{\theta}_{i+1}\, Z_{i+1}^{i+1} \\ \dot{\omega}_{i+1}^{i+1} &= R_i^{i+1}\,\dot{\omega}_i^i + R_i^{i+1}\,\omega_i^i \times \dot{\theta}_{i+1}\, Z_{i+1}^{i+1} + \ddot{\theta}_{i+1}\, Z_{i+1}^{i+1} \\ \dot{v}_{i+1}^{i+1} &= R_i^{i+1}\left[ \dot{\omega}_i^i \times P_{i+1}^i + \omega_i^i \times \left( \omega_i^i \times P_{i+1}^i \right) + \dot{v}_i^i \right] \\ \dot{v}_{c_i}^i &= \dot{\omega}_i^i \times r_i + \omega_i^i \times \left( \omega_i^i \times r_i \right) + \dot{v}_i^i \end{aligned} \tag{15}$$
where $\omega_i$, $\dot{\omega}_i$, $\dot{v}_i$, and $\dot{v}_{c_i}$ are the angular velocity, angular acceleration, linear acceleration, and center-of-mass linear acceleration of robot linkage $i$, respectively; $\omega_0 = \dot{\omega}_0 = v_0 = \dot{v}_0 = \begin{bmatrix} 0 & 0 & 0 \end{bmatrix}^T$; and $Z_i\ (i=1,2,\dots,6) = \begin{bmatrix} 0 & 0 & 1 \end{bmatrix}^T$ is the joint axis unit vector.
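A single outward-recursion step of Equation (15) for the angular quantities can be sketched as follows (pure Python; $R_i^{i+1}$ is taken as the identity for simplicity, and the joint rates are illustrative):

```python
def cross(a, b):
    """Cross product of two 3-vectors."""
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def outward_step(omega, omegadot, thetadot, thetaddot, z=(0.0, 0.0, 1.0)):
    """First two lines of Eq. (15) for a revolute joint, with R_i^{i+1} = I:
    propagate angular velocity and angular acceleration to the next link."""
    zdot = tuple(thetadot * z[i] for i in range(3))
    omega_next = tuple(omega[i] + zdot[i] for i in range(3))
    coriolis = cross(omega, zdot)  # omega x (thetadot * z) coupling term
    omegadot_next = tuple(omegadot[i] + coriolis[i] + thetaddot * z[i]
                          for i in range(3))
    return omega_next, omegadot_next

# Starting from rest, the first joint contributes only about its own Z axis.
w, wd = outward_step((0.0, 0.0, 0.0), (0.0, 0.0, 0.0), 0.5, 0.1)
print(w)   # (0.0, 0.0, 0.5)
print(wd)  # (0.0, 0.0, 0.1)
```

Iterating this step joint by joint (with the actual inter-link rotation matrices) yields the link accelerations needed for the inward force recursion.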
Accordingly, Newton’s equation and Euler’s equation of motion can be established:
$$F_i^{i-1} = F_i^{i+1} + m_i\, \dot{v}_{c_i} - G_i^{c_i} \tag{16}$$
$$M_i^{i-1} = M_i^{i+1} + r_{i+1,c_i} \times F_i^{i+1} + r_{i,c_i} \times F_i^{i-1} + I_i\, \dot{\omega}_i + \omega_i \times \left( I_i\, \omega_i \right) \tag{17}$$
where $F_i^{i-1}$ and $M_i^{i-1}$ denote the force and moment exerted by linkage $i-1$ on linkage $i$, respectively; $F_i^{i+1}$ and $M_i^{i+1}$ denote the force and moment exerted by linkage $i+1$ on linkage $i$, respectively; $r_{i,c_i}$ and $r_{i+1,c_i}$ denote the position vectors from the origins of joint coordinate systems $i$ and $i+1$ to the center of mass $c_i$; $I_i$ denotes the inertia tensor of linkage $i$ about its center of mass $c_i$; and $\omega_i \times I_i\omega_i$ is the gyroscopic (Coriolis-type) torque term.
For the n-degree-of-freedom robot with no load at its end, the recursive formulas of Equations (15)–(17) yield $F_1^0$ and $M_1^0$, which are then substituted into the static gravity compensation Equations (12)–(14). The gravity compensation values required for the six-axis force/torque sensor at the base while the robot is in motion, namely $f^s$ and $m^s$, can thus be solved for.
When the robot operates normally, the six-axis force/torque sensor installed at the base reads the force and moment information $f_D$ and $M_D$. The force and moment obtained after gravity compensation, $F$ and $M$, are then
$$F = f_D - f^s, \qquad M = M_D - m^s \tag{18}$$
Thus, when the robot is in normal operation, the six-axis force/torque sensor mounted at the base reads zero after dynamic force compensation. When the sensor value exceeds a set threshold, a collision between the robot and the external environment is considered to have occurred. Based on the six-axis force/torque sensor and the working principle of the robot itself, the spatial coordinates of the axes in front of and behind the collision linkage at the time of collision are
$$P_1(x_1, y_1, z_1), \quad P_2(x_2, y_2, z_2) \tag{19}$$

4. Solving the Collision Point Location with the Force Method Principle

When the joint where the robot collision occurs is known, the forces at robot joint axes $i$ and $i+1$ caused by the collision are as shown in Figure 5. We define point $C_i$ as the location of the collision point where the force $F_i$ strikes robot linkage $i$, $l$ as the length of robot linkage $i$, and $l'$ as the distance from axis $i+1$ to the cross section containing the collision point; $l$ and $l'$ are the variables in the force method principle.
When applying the force method to solve for the collision point position, the $B_i$ end of linkage $i$ where the collision occurs is first released, and the original statically indeterminate structure becomes a basic determinate system. In this system, in addition to the action of $F_z^n$, the vertical force $F_{1z}$, the horizontal force $F_{2z}$, and the moment $M_{1z}$, which is in the same direction as axis $i-1$ and perpendicular to axis $i$, act at the $A_{i+1}$ end, as shown in Figure 6.
The robot linkage is a linear elastic structure, so displacement is proportional to force. $\Delta_{1F}$ denotes the displacement of the $B_i$ end along the direction of $F_{1z}$ under the force $F_z^n$, and $\delta_{11}$, $\delta_{12}$, and $\delta_{13}$ denote the displacements of the $B_i$ end of linkage $i$ along the direction of $F_{1z}$ when $F_{1z}$, $F_{2z}$, and $M_{1z}$, respectively, act individually on the linkage as unit forces. Thus, the total displacement of the $B_i$ end of linkage $i$ along the $F_{1z}$ direction is
$$\Delta_1 = \delta_{11}F_{1z} + \delta_{12}F_{2z} + \delta_{13}M_{1z} + \Delta_{1F} \tag{20}$$
Since the $B_i$ end of linkage $i$ is a fixed constraint, the displacement at $B_i$ along the $F_{1z}$ direction must be zero, and the deformation compatibility equation becomes
$$\Delta_1 = \delta_{11}F_{1z} + \delta_{12}F_{2z} + \delta_{13}M_{1z} + \Delta_{1F} = 0 \tag{21}$$
Following the same method, two further deformation compatibility equations are obtained by setting the displacement of point $B_i$ along the $F_{2z}$ direction and the rotation in the $M_{1z}$ direction to zero. The resulting system of linear equations is
$$\begin{aligned} \delta_{11}F_{1z} + \delta_{12}F_{2z} + \delta_{13}M_{1z} + \Delta_{1F} &= 0 \\ \delta_{21}F_{1z} + \delta_{22}F_{2z} + \delta_{23}M_{1z} + \Delta_{2F} &= 0 \\ \delta_{31}F_{1z} + \delta_{32}F_{2z} + \delta_{33}M_{1z} + \Delta_{3F} &= 0 \end{aligned} \tag{22}$$
The nine coefficients $\delta_{ij}\ (i,j=1,2,3)$ and three constant terms $\Delta_{iF}\ (i=1,2,3)$ in Equation (22) are illustrated in Figure 7, where (a)–(d) show the displacement diagrams of the $B_i$ end of the linkage along the $F_{1z}$, $F_{2z}$, and $M_{1z}$ directions when $F_{1z}$, $F_{2z}$, $M_{1z}$, and $F_z^n$, respectively, are unit forces.
When considering the effect of the collision on the displacement of the robot linkage, the contributions of the shear force and the axial force are much smaller than that of the bending moment and can be neglected. As shown in Figure 6, when only the external force $F_z^n$ acts on the basic determinate system containing linkage $i$, the bending moment along the linkage is
$$M = \begin{cases} 0, & 0 \le x \le l' \\ -F_z^n\,(x - l'), & l' \le x \le l \end{cases} \tag{23}$$
where $x$ is measured from the released end $B_i$.
If the bending moment that increases the curvature of the linkage is taken as positive, then the moment equation for a unit force applied at point $B_i$ along the direction of $F_{1z}$ is
$$\bar{M}_1 = x \tag{24}$$
For a unit force applied at point $B_i$ along the $F_{2z}$ direction, the moment equation is
$$\bar{M}_2 = 0 \tag{25}$$
For a unit moment applied at point $B_i$ in the direction of $M_{1z}$, the moment equation is
$$\bar{M}_3 = 1 \tag{26}$$
According to the reciprocal displacement theorem, $\delta_{ij} = \delta_{ji}\ (i,j = 1,2,3)$. Evaluating the unit-load integrals therefore gives
$$\Delta_{1F} = \int_l \frac{M\,\bar{M}_1}{EI}\,ds = -\frac{F_z^n\,(l-l')^3}{3EI} \tag{27}$$
$$\Delta_{2F} = \int_l \frac{M\,\bar{M}_2}{EI}\,ds = 0 \tag{28}$$
$$\Delta_{3F} = \int_l \frac{M\,\bar{M}_3}{EI}\,ds = -\frac{F_z^n\,(l-l')^2}{2EI} \tag{29}$$
$$\delta_{11} = \int_l \frac{\bar{M}_1\,\bar{M}_1}{EI}\,ds = \frac{l^3}{3EI} \tag{30}$$
$$\delta_{22} = \int_l \frac{\bar{M}_2\,\bar{M}_2}{EI}\,ds = 0 \tag{31}$$
$$\delta_{33} = \int_l \frac{\bar{M}_3\,\bar{M}_3}{EI}\,ds = \frac{l}{EI} \tag{32}$$
$$\delta_{12} = \delta_{21} = \int_l \frac{\bar{M}_1\,\bar{M}_2}{EI}\,ds = 0 \tag{33}$$
$$\delta_{13} = \delta_{31} = \int_l \frac{\bar{M}_1\,\bar{M}_3}{EI}\,ds = \frac{l^2}{2EI} \tag{34}$$
$$\delta_{23} = \delta_{32} = \int_l \frac{\bar{M}_2\,\bar{M}_3}{EI}\,ds = 0 \tag{35}$$
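The closed-form results $\delta_{11} = l^3/(3EI)$ and $\delta_{13} = l^2/(2EI)$ can be checked numerically by evaluating the unit-load integrals with a midpoint rule (a verification sketch; the $l$ and $EI$ values below are arbitrary):

```python
def unit_load_integral(m1, m2, l, ei, n=10000):
    """Midpoint-rule evaluation of (1/EI) * integral of m1(x)*m2(x) on [0, l]."""
    h = l / n
    return sum(m1((i + 0.5) * h) * m2((i + 0.5) * h) for i in range(n)) * h / ei

l, ei = 1.2, 2.0
d11 = unit_load_integral(lambda x: x, lambda x: x, l, ei)    # ~ l^3 / (3 EI)
d13 = unit_load_integral(lambda x: x, lambda x: 1.0, l, ei)  # ~ l^2 / (2 EI)
print(abs(d11 - l**3 / (3 * ei)) < 1e-6)  # True
print(abs(d13 - l**2 / (2 * ei)) < 1e-6)  # True
```

The same routine can evaluate the load terms $\Delta_{iF}$ by passing in the piecewise bending moment $M(x)$ for the actual collision position.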
Solving the system of Equation (22) for the three redundant force-moments (e.g., by Cramer's rule) gives
$$F_{1z} = \frac{D_1}{D} \tag{36}$$
$$F_{2z} = \frac{D_2}{D} \tag{37}$$
$$M_{1z} = \frac{D_3}{D} \tag{38}$$
where $D = \det\begin{bmatrix} \delta_{11} & \delta_{12} & \delta_{13} \\ \delta_{21} & \delta_{22} & \delta_{23} \\ \delta_{31} & \delta_{32} & \delta_{33} \end{bmatrix}$ and $D_k$ is the determinant obtained by replacing the $k$-th column of $D$ with the constant vector $\begin{bmatrix} -\Delta_{1F} & -\Delta_{2F} & -\Delta_{3F} \end{bmatrix}^T$.
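Solving the three-equation system (22) by Cramer's rule can be sketched in pure Python (the $\delta$ and right-hand-side values below are made-up, nonsingular placeholders for illustration, not the beam coefficients themselves):

```python
def det3(m):
    """Determinant of a 3x3 matrix given as nested lists."""
    return (m[0][0] * (m[1][1] * m[2][2] - m[1][2] * m[2][1])
            - m[0][1] * (m[1][0] * m[2][2] - m[1][2] * m[2][0])
            + m[0][2] * (m[1][0] * m[2][1] - m[1][1] * m[2][0]))

def solve_cramer(delta, rhs):
    """Solve delta @ x = rhs for x = (F1z, F2z, M1z) via Cramer's rule."""
    d = det3(delta)
    out = []
    for k in range(3):
        mk = [row[:] for row in delta]      # copy the coefficient matrix
        for i in range(3):
            mk[i][k] = rhs[i]               # replace column k with the rhs
        out.append(det3(mk) / d)
    return tuple(out)

# Illustrative flexibility coefficients; rhs holds the -Delta_iF terms.
delta = [[2.0, 0.0, 1.0], [0.0, 3.0, 0.0], [1.0, 0.0, 2.0]]
rhs = [5.0, 6.0, 4.0]
print(solve_cramer(delta, rhs))  # (2.0, 2.0, 1.0)
```

For the actual beam coefficients, the zero entries decouple the shear unknown, so in practice only a reduced system needs to be solved.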
When solving for the effect of $F_x^n$ on the robot linkage, the method is exactly the same as the calculation for $F_z^n$. When $F_z^n$ and $F_x^n$ act on the linkage simultaneously in the linkage coordinate system, the two sets of forces lie in two different planes after the constraint is released. The force-moments generated under $F_z^n$ are $F_{1z}$, $F_{2z}$, and $M_{1z}$, and those generated under $F_x^n$ are $F_{1x}$, $F_{2x}$, and $M_{1x}$; the two sets are summed algebraically as $F_1$, $F_2$, and $M_1$. Therefore, when a collision occurs, the effect of the external force on robot joint axis $i$ is as shown in Figure 5. In practical applications, once the robot system's software and hardware are installed and the program is running, the collision joint is first identified from the change in the output joint currents. Combining the force and moment values output during normal operation with the parameter information of the robot body, Equations (23)–(35) are used to solve for the algorithm parameters. In Equations (36)–(38), since the force and moment values at the time of collision are known, $l'$ is the only remaining unknown in the algorithm, so $l'$ and hence the collision point location $P$ can be found. This completes the theoretical derivation of the force compensation algorithm and the force method solution of the collision point location.

5. Experimental Verification and Analysis of Results

5.1. Identification of Collision Joints for Experimental Verification

During human–robot cooperative operation, the robot system information can be read by observing the current changes of each robot joint, from which it can be judged whether the robot has suffered an accidental collision. To read the current data of each joint in real time, a script can be written into the robot system, with the robot communicating with the host computer over a network protocol. The main program is divided into four parts: serial communication between the robot and the host computer, assignment of values for each robot joint, loading of real-time data for transfer to the host computer, and transmission of the data collected in the RET function to the host computer for display. Part of the main program is shown below.
-- Read the raw 16-bit register value for each joint current
Joint_AO_1 = get_modbus_io_status("Joint_AO_1")
Joint_AO_2 = get_modbus_io_status("Joint_AO_2")
Joint_AO_3 = get_modbus_io_status("Joint_AO_3")
Joint_AO_4 = get_modbus_io_status("Joint_AO_4")
Joint_AO_5 = get_modbus_io_status("Joint_AO_5")
Joint_AO_6 = get_modbus_io_status("Joint_AO_6")
-- Raw values above 30000 encode negative currents
-- (16-bit value offset by 65535)
if (Joint_AO_1 > 30000) then
    Joint_current_1 = Joint_AO_1 - 65535
else
    Joint_current_1 = Joint_AO_1
end
if (Joint_AO_2 > 30000) then
    Joint_current_2 = Joint_AO_2 - 65535
else
    Joint_current_2 = Joint_AO_2
end
if (Joint_AO_3 > 30000) then
    Joint_current_3 = Joint_AO_3 - 65535
else
    Joint_current_3 = Joint_AO_3
end
if (Joint_AO_4 > 30000) then
    Joint_current_4 = Joint_AO_4 - 65535
else
    Joint_current_4 = Joint_AO_4
end
if (Joint_AO_5 > 30000) then
    Joint_current_5 = Joint_AO_5 - 65535
else
    Joint_current_5 = Joint_AO_5
end
if (Joint_AO_6 > 30000) then
    Joint_current_6 = Joint_AO_6 - 65535
else
    Joint_current_6 = Joint_AO_6
end
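On the host-computer side, the same register-to-current conversion performed by the robot script can be mirrored in Python (a sketch; the 30000 threshold and 65535 offset follow the script above):

```python
def raw_to_signed_current(raw):
    """Convert a 16-bit register value to a signed joint current reading:
    values above 30000 are treated as negative (offset by 65535),
    mirroring the conversion in the robot-side script."""
    return raw - 65535 if raw > 30000 else raw

print(raw_to_signed_current(1500))   # 1500
print(raw_to_signed_current(65435))  # -100
```

Keeping the conversion identical on both ends ensures that the host-side collision monitor sees the same signed currents the robot controller uses.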
Before running the program, the current values of each joint during normal robot operation must first be known. From the real-time current value output for each joint, it can be determined whether an accidental collision has occurred. When one occurs, the algorithm parameters are solved from the parameter information of the robot body and the force sensor output; then, from the force sensor value at the time of collision, the variable $l'$ in the algorithm is solved. At this point, the location of the collision point can be expressed. The algorithm block diagram is shown in Figure 8.
As shown in Figure 9a, volunteers simulated collisions by hand, applying external forces to robot joints 1 to 6 in turn. Meanwhile, the computer collected the current at each robot joint in the six action states and output it as numerical values, with one group of current readings output every 500 ms. Figure 9b shows the line chart corresponding to the output of the hand-simulated collision experiment in Figure 9a; its horizontal axis represents the groups of current readings collected every 500 ms, and its vertical axis represents the real-time current of each robot joint for each group.

5.2. Force Method Principle for Collision Point and Error Analysis

The force method experiment used the AUBO i5 series collaborative robot body parameters as input conditions, and two external forces of 60 N and 100 N were applied in various directions, as measured by a tension tester. The calculated values were obtained by substitution into the formulas. The robot linkage material was the common carbon fiber T300, with elastic modulus $E$. The robot arm cross section is circular, with moment of inertia $I_z$. The numerical values of each force method parameter are shown in Table 1.
Substituting the parameter values in Table 1 into Equations (27)–(35), the numerical solutions of the forces generated in the robot linkage by the collision actions $F_z^n$ and $F_x^n$ can be obtained. To verify the correctness and accuracy of the calculation procedure, an error analysis of the experimental results was performed.
The absolute error for the actual external contact force is calculated by the formula
$$\delta_F = \Delta F_s - \Delta F_c \tag{39}$$
where Δ F s is the experimental value of the external force and Δ F c is the calculated value of the external force.
The relative error of the actual external contact force is calculated by the formula
$$\delta_F\% = \frac{\left|\delta_F\right|}{F_e} \times 100\% \tag{40}$$
where $\delta_F$ is the absolute error of the external contact force and $F_e$ is the magnitude of the applied external force vector. The experimental results obtained by applying external contact forces of different magnitudes at different positions on the robot are shown in Table 2.
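The absolute and relative error formulas above amount to the following (pure Python; the sample experimental and calculated values are illustrative, not entries from Table 2):

```python
def force_errors(f_measured, f_calculated, f_applied):
    """Absolute error (measured - calculated) and relative error in percent,
    normalized by the magnitude of the applied external force."""
    abs_err = f_measured - f_calculated
    rel_err = abs(abs_err) / f_applied * 100.0
    return abs_err, rel_err

# Illustrative values: 60 N applied, 57.4 N recovered by the algorithm.
abs_err, rel_err = force_errors(60.0, 57.4, 60.0)
print(round(abs_err, 3))  # 2.6
print(round(rel_err, 3))  # 4.333
```

Normalizing by the applied force magnitude makes errors comparable across the 60 N and 100 N test cases.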
As shown in Figure 10, when a collision occurs and the collision point position is solved by the force method principle and the virtual displacement principle, the maximum relative error of the external force at the collision point is 4.333%, and the accuracy is higher the closer the collision point is to the midpoint of the colliding robot linkage, thus meeting the accuracy requirements. Therefore, when the force accuracy requirement is met, the variable $l'$ in the force method can be found, and the coordinates of the collision point $P$ can also be found.

6. Conclusions

In this paper, we propose a method that establishes communication between the robot and the host computer to identify the joint at which an accidental collision occurs during robot operation by monitoring real-time current changes in each joint axis. A force compensation algorithm is derived from the data transmitted by the six-dimensional force sensor installed at the robot base. The force method principle and the virtual displacement principle are then introduced to solve for the specific position of the collision point on the identified collision joint. Using an AUBO collaborative robot as the experimental platform, the method was verified experimentally, experimental data were recorded, and conclusions were drawn through comparative analysis. The results show that the algorithm locks onto the collision joint from real-time current changes in the robot body, that the collision point position can be solved from the force method and virtual displacement principles, and that the force error remains within a reasonable range, meeting the accuracy requirement of the robot collision point identification algorithm.

Author Contributions

Conceptualization, Z.W. and B.Z.; methodology, B.Z.; software, B.Z.; validation, Z.W., B.Z. and Y.Y.; formal analysis, B.Z. and Z.W.; investigation, B.Z.; resources, Z.L.; data curation, Z.L.; writing—original draft preparation, B.Z.; writing—review and editing, B.Z. and Z.W.; visualization, Z.L.; supervision, Z.W. and Z.L.; project administration, Z.W. and Z.L.; funding acquisition, Z.W. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the National Natural Science Foundation of China (No. 51505124), the Science and Technology Project of Hebei Education Department (ZD2020151), and the Tangshan Science and Technology Innovation Team Training Plan Project (21130208D).

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study.

Data Availability Statement

The data presented in this study are available on request from the corresponding author.

Conflicts of Interest

The authors declare no conflict of interest.

Figure 1. Relative position of the six-axis force/torque sensor’s mounting position and coordinate system to the base coordinate system.
Figure 2. Collision force schematic of robot linkage i with external force F i .
Figure 3. “D-H” coordinate space of two adjacent joint links.
Figure 4. Robot linkage force diagram.
Figure 5. Collision force schematic of robot linkage i subjected to external force F i .
Figure 6. Schematic diagram of robot linkage i after the constraint is released under the action of F_zn.
Figure 7. Displacement along different directions when different forces are applied to the robot linkage. (a) Displacement of the connecting rod B_i along the F_1z, F_2z, and M_1z directions when the unit force is F_1z. (b) Displacement of the connecting rod B_i along the F_1z, F_2z, and M_1z directions when the unit force is F_2z. (c) Displacement of the connecting rod B_i along the F_1z, F_2z, and M_1z directions when the unit force is M_1z. (d) Displacement of the connecting rod B_i along the F_1z, F_2z, and M_1z directions under the external force F_n.
Figure 8. Diagram of the collision detection algorithm.
Figure 9. Human–robot interaction collision detection experiment. (a) Volunteer-simulated collision experiment. (b) Joint 1–Joint 6 variations in current data.
Figure 10. Experimental results. (a) Force absolute error. (b) Force relative error.
Table 1. Values of each parameter of the force method principle.

| Projects | Location Information l | Split Force F_zn/N | δ11 | δ22 | δ33 | δ12 = δ21 | δ13 = δ31 | δ23 = δ32 | Δ1F | Δ2F | Δ3F |
|---|---|---|---|---|---|---|---|---|---|---|---|
| 1 | 0.1 | 60 | 0.0031 | 0 | 0.1616 | 0 | 0.0194 | 0 | −0.0079 | 0 | −0.1178 |
| 2 | 0.1 | 100 | 0.0031 | 0 | 0.1616 | 0 | 0.0194 | 0 | −0.0131 | 0 | −0.1964 |
| 3 | 0.11 | 60 | 0.0031 | 0 | 0.1616 | 0 | 0.0194 | 0 | −0.0097 | 0 | −0.1324 |
| 4 | 0.11 | 100 | 0.0031 | 0 | 0.1616 | 0 | 0.0194 | 0 | −0.0162 | 0 | −0.2206 |
| 5 | 0.12 | 60 | 0.0031 | 0 | 0.1616 | 0 | 0.0194 | 0 | −0.0116 | 0 | −0.1454 |
| 6 | 0.12 | 100 | 0.0031 | 0 | 0.1616 | 0 | 0.0194 | 0 | −0.0194 | 0 | −0.2424 |
| 7 | 0.13 | 60 | 0.0031 | 0 | 0.1616 | 0 | 0.0194 | 0 | −0.0136 | 0 | −0.1564 |
| 8 | 0.13 | 100 | 0.0031 | 0 | 0.1616 | 0 | 0.0194 | 0 | −0.0226 | 0 | −0.2607 |
| 9 | 0.14 | 60 | 0.0031 | 0 | 0.1616 | 0 | 0.0194 | 0 | −0.0154 | 0 | −0.1649 |
| 10 | 0.14 | 100 | 0.0031 | 0 | 0.1616 | 0 | 0.0194 | 0 | −0.0256 | 0 | −0.2749 |
Table 2. Experimental results for different external collision forces.

| Projects | Location Information l | External Force/N | Calculated Force/N | Simulated Force/N | Absolute Error of Force/N | Relative Error of Force/% |
|---|---|---|---|---|---|---|
| 1 | 0.1 | 60 | 32.400 | 35 | 2.600 | 4.333 |
| 2 | 0.1 | 100 | 54.356 | 58.33 | 3.974 | 3.974 |
| 3 | 0.11 | 60 | 32.136 | 32.5 | 0.364 | 0.607 |
| 4 | 0.11 | 100 | 53.350 | 54.167 | 0.817 | 0.817 |
| 5 | 0.12 | 60 | 30.400 | 30 | 0.400 | 0.667 |
| 6 | 0.12 | 100 | 50.324 | 50 | 0.324 | 0.324 |
| 7 | 0.13 | 60 | 26.851 | 27.5 | 0.649 | 1.082 |
| 8 | 0.13 | 100 | 45.118 | 45.833 | 0.715 | 0.715 |
| 9 | 0.14 | 60 | 22.810 | 25 | 2.19 | 3.65 |
| 10 | 0.14 | 100 | 38.400 | 41.667 | 3.267 | 3.267 |

Wang, Z.; Zhu, B.; Yang, Y.; Li, Z. Research on Identifying Robot Collision Points in Human–Robot Collaboration Based on Force Method Principle Solving. Actuators 2023, 12, 320. https://doi.org/10.3390/act12080320
