Article

Collision Detection of a HEXA Parallel Robot Based on Dynamic Model and a Multi-Dual Depth Camera System

Xuan-Bach Hoang, Phu-Cuong Pham and Yong-Lin Kuo
1 Graduate Institute of Automation and Control, National Taiwan University of Science and Technology, Taipei 106, Taiwan
2 Center of Automation and Control, National Taiwan University of Science and Technology, Taipei 106, Taiwan
* Author to whom correspondence should be addressed.
Sensors 2022, 22(15), 5923; https://doi.org/10.3390/s22155923
Submission received: 21 June 2022 / Revised: 28 July 2022 / Accepted: 5 August 2022 / Published: 8 August 2022
(This article belongs to the Special Issue State-of-Art in Sensors for Robotic Applications)

Abstract: This paper introduces a collision detection method for a Hexa parallel robot and obstacles based on dynamic modeling and a computer vision system. The treatment of collisions is divided into three phases applied to the Hexa robot: collision detection, collision isolation, and collision identification. First, the configuration and the kinematic and dynamic characteristics of the Hexa parallel robot along its movement trajectories are analyzed to extract the knowledge required by the method. Next, a virtual force sensor is presented that generates the collision detection signal by combining the solution of the inverse dynamics with a low-pass filter. Then, a vision system consisting of dual depth cameras is designed for obstacle isolation and for determining the contact point location on the end-effector, an arm, or a rod of the Hexa robot. Finally, a recursive Newton-Euler algorithm is applied to compute the contact forces for collision cases on the real Hexa robot. In the experiments, the identified forces are compared with force sensor measurements to evaluate the performance of the proposed collision detection method.

1. Introduction

A six-degree-of-freedom parallel structure called the Hexa parallel robot was introduced by Pierrot et al. in [1,2]. Despite its complicated kinematic structure and limited workspace, the Hexa robot has remarkable advantages over serial robots: high accuracy, high stiffness, high speed, and high carrying capability. However, working with this type of robot implies solving not only the inverse/forward kinematics problems but also the dynamics problems, both of which are complex and computationally demanding. The inverse kinematics problem is tractable by geometrical methods [3,4], but the forward kinematics problem involves a system of nonlinear equations that usually has no closed-form solution; therefore, artificial neural networks (ANNs) have been proposed for finding a unique solution to the forward kinematics problem [5]. There have been few studies on the dynamic modeling of an experimental Hexa parallel robot configuration [6,7]. An in-depth study of the Hexa robot dynamics is needed to take full advantage of its strengths for trajectory control and fault detection.
Studying the collisions of a robot manipulator is a critical issue due to the diversity of application scenarios. An approach for accurate, real-time monitoring of serial-manipulator collisions is presented in [8], which uses proprioception and IMU sensing for velocity and acceleration estimation. Another observer method that adapts to an erroneous dynamics model of serial manipulators is introduced in [9], where the observer significantly improves the collision detection sensitivity. Sensorless collision detection approaches based on different methods are presented in [10,11,12]. Accelerometers have been attached along the surface of a robot arm to detect and localize contact events [13]. Tactile sensors mounted on the robot manipulator have been applied to provide a tactile perception of the robot [14,15,16]. In [17,18], a virtual force sensor has been proposed for collision detection and external force estimation. The authors of [19] used neural networks for collision detection and classification. Most of the above methods are extremely difficult to apply to parallel manipulators. However, some papers combine the dynamics equations with a first-order filter structure to build an observer whose output, called the residual, serves as the collision identification signal [20]. The residual remains near zero (up to noise and disturbances) during free motion and exceeds a suitable threshold in response to a collision. This approach is suitable not only for robot manipulators; several studies have also applied it to humanoid robots [21,22].
Recently, vision-based control systems have been widely utilized in robotics to provide information about target objects or the robot's pose (position and orientation). In addition, many vision-based techniques for the isolation phase have been presented. The authors of [23,24] analyzed the problem of sensor-based collision detection and isolation for a robot manipulator using a set of images taken from several stationary cameras. A system combining deep learning and stereo vision for object detection, classification, and distance calculation [24] can be applied effectively not only to autonomous robot navigation but also to collision isolation. For moving object detection, a method of estimating the motion of objects and extracting the shapes of detected objects from stereo video cameras has been proposed in [25]. Another system used dual depth camera information and visual odometry to detect moving objects, with an adaptive thresholding method adopted to enhance performance [26]. An algorithm based on background subtraction and edge detection has been applied to detect and segment objects in video frames [27]. In the collision isolation phase, moving objects in the workspace should be detected by the multi-camera system, exploiting the advantages of image processing techniques. One of these is the You-Only-Look-Once (YOLO) algorithm, first presented by Redmon et al. in [28]; the contact point location and the Hexa robot pose in this paper are determined with an improved version of this algorithm.
After collision detection and collision isolation, a suitable method for identifying the contact force must be determined. In [29], data from a 6D wrist force/torque sensor are combined with a probabilistic approach to estimate link contacts based on geometric considerations and compliant motions. In [12,30], time series models and fuzzy identification were used to determine the contact location. Once the contact point is detected, estimating the exchanged force is the next stage of a complete identification phase. Several dynamics algorithms can be applied to external force estimation, such as the recursive Newton-Euler algorithm (RNEA) for the inverse dynamics [31,32], the articulated body algorithm (ABA) for the forward dynamics, and the composite-rigid-body algorithm (CRBA) for calculating the joint-space inertia matrix (JSIM) [33,34]. A recursive Newton-Euler method for the inverse dynamics is suitable for analyzing external forces acting on the Hexa parallel robot. Based on the above discussion, this paper addresses the collision problem through the following primary contributions:
  • A dynamic model of the moving Hexa parallel robot is implemented and analyzed for the collision detection problem.
  • The obstacles and the Hexa robot are isolated by the proposed geometrical algorithm, and the set of real-time images taken from the multi-camera system is used to locate the contact point.
  • Finally, the Cartesian external force at the contact point is estimated by the recursive Newton-Euler algorithm for the Hexa parallel robot and compared with force sensor measurements for different collision point cases.
The rest of this paper is organized as follows. We present the configuration, the kinematics and dynamics problems of the Hexa robot in Section 2. In Section 3, the proposed collision detection, isolation, and identification methods applied to the Hexa robot are presented. We show block diagrams to illustrate the entire Hexa robot system and introduce equipment used in this study in Section 4. Section 5 includes the experimental scheme of study cases where the proposed methods are validated based on the given trajectories. Finally, we conclude the paper in Section 6.

2. Modeling of the Hexa Parallel Robot

2.1. Configurations of the Hexa Robot

The model of the Hexa parallel robot with six serial kinematic chains is shown in Figure 1. Each kinematic chain is a revolute-universal-spherical linkage between the base platform and the active plate. The arms, driven by six motors at the revolute joints, connect to the rods at the universal joints, and the rods are attached to the active plate at the spherical joints. Two main coordinate frames are used for the Hexa robot. The first, fixed coordinate frame {O} = {OXYZ} is located at the center of the base platform with the Z-axis perpendicular to the base platform, pointing upwards. The X-axis points from the origin O to the midpoint of the edge A_1A_2, and the Y-axis is determined by the right-hand rule. The second, rotating coordinate frame {P} = {Pxyz} is located at the center of the active plate with the z-axis perpendicular to the surface of the active plate. The x-axis points from the end-effector to the midpoint of the edge C_1C_2, and the y-axis is determined in the same way as the Y-axis. For the i-th kinematic chain (i = 1, …, 6), the point A_i is the center of the revolute joint of the i-th arm, the point B_i is the center of the universal joint connecting the i-th rod with the i-th arm, and the point C_i is the center of the spherical joint linking the i-th rod to the active plate. The lengths of the i-th arm and the i-th rod are l_a and l_r, respectively. The angle ζ_i is the rotation angle of the revolute joint of the i-th arm.
The point P representing the end-effector and the Euler angles (φ, θ, ψ) of the active plate with respect to the fixed frame {O} are given. The three angles (φ, θ, ψ) are defined in compliance with the intrinsic Z-Y-Z rotation convention of Euler angles. A coordinate expressed in the active frame {P} can be transformed to the fixed frame {O} by the rotation matrix R_P^O:
$$ R_P^O = \begin{bmatrix} \cos\phi\cos\theta\cos\psi - \sin\phi\sin\psi & -\cos\phi\cos\theta\sin\psi - \sin\phi\cos\psi & \cos\phi\sin\theta \\ \sin\phi\cos\theta\cos\psi + \cos\phi\sin\psi & -\sin\phi\cos\theta\sin\psi + \cos\phi\cos\psi & \sin\phi\sin\theta \\ -\sin\theta\cos\psi & \sin\theta\sin\psi & \cos\theta \end{bmatrix} \tag{1} $$
The position and orientation of the active plate can be determined by the three Euler angles and the position of point P:
$$ r_P^O = [\,x_P \;\; y_P \;\; z_P\,]^T \tag{2} $$
The angular velocity ω_P of the active plate can be expressed as:
$$ \omega_P = \begin{bmatrix} \dot\psi\cos\phi\sin\theta - \dot\theta\sin\phi \\ \dot\psi\sin\phi\sin\theta + \dot\theta\cos\phi \\ \dot\psi\cos\theta + \dot\phi \end{bmatrix} \tag{3} $$
At the point O on the base platform, three frames {O_i} = {O_i X_i Y_i Z_i} (i = 1, 3, 5) are built with the Z_i-axis coinciding with the Z-axis. The X_i-axis points to the center of the edge A_i A_{i+1}, and the Y_i-axis is determined by the right-hand rule. Simultaneously, at the point P on the active plate, three frames {P_i} = {P_i x_i y_i z_i} (i = 1, 3, 5) are constructed in the same manner. The rotation matrices of frame {O_i} relative to frame {O} and of frame {P_i} relative to frame {P} can be expressed as:
$$ R_{O_i}^{O} = R_{P_i}^{P} = \begin{bmatrix} \cos\alpha_i & -\sin\alpha_i & 0 \\ \sin\alpha_i & \cos\alpha_i & 0 \\ 0 & 0 & 1 \end{bmatrix} \tag{4} $$
where
$$ [\,\alpha_1\;\alpha_2\;\alpha_3\;\alpha_4\;\alpha_5\;\alpha_6\,]^T = \left[\,0\;\;0\;\;\tfrac{2\pi}{3}\;\;\tfrac{2\pi}{3}\;\;\tfrac{4\pi}{3}\;\;\tfrac{4\pi}{3}\,\right]^T \tag{5} $$
At the point A_i on the base platform, a frame {A_{1i}} = {A_{1i} X_{1i} Y_{1i} Z_{1i}} (i = 1, …, 6) is built with the Z_{1i}-axis along the rotation axis. The Y_{1i}-axis is perpendicular to the base platform, pointing upwards, and the X_{1i}-axis is determined by the right-hand rule. The rotation matrix of frame {A_{1i}} relative to frame {O_i} is:
$$ R_{A_{1i}}^{O_i} = \begin{bmatrix} 1 & 0 & 0 \\ 0 & 0 & -1 \\ 0 & 1 & 0 \end{bmatrix} \tag{6} $$
At the point A_i, a frame {A_{2i}} = {A_{2i} X_{2i} Y_{2i} Z_{2i}} (i = 1, …, 6) attached to the i-th arm is established with the Z_{2i}-axis coinciding with the Z_{1i}-axis. The X_{2i}-axis coincides with the axial line of the i-th arm, and the Y_{2i}-axis is determined by the right-hand rule. The rotation matrix of frame {A_{2i}} relative to frame {A_{1i}} is:
$$ R_{A_{2i}}^{A_{1i}} = \begin{bmatrix} \cos\zeta_i & -\sin\zeta_i & 0 \\ \sin\zeta_i & \cos\zeta_i & 0 \\ 0 & 0 & 1 \end{bmatrix} \tag{7} $$
where ζ_i is the rotation angle of the revolute joint of the arm.
At the point B_i, a frame {B_{1i}} = {B_{1i} x_{1i} y_{1i} z_{1i}} (i = 1, …, 6) attached to the i-th arm and a frame {B_{2i}} = {B_{2i} x_{2i} y_{2i} z_{2i}} (i = 1, …, 6) attached to the i-th rod are built, with the x_{1i}-axis pointing from A_i to B_i and the x_{2i}-axis pointing from B_i to C_i. The z_{1i}-axis and the z_{2i}-axis coincide with the two rotating axes of the universal joint, respectively, and the y_{1i}-axis and the y_{2i}-axis are determined by the right-hand rule. The rotation matrix of frame {B_{1i}} relative to frame {A_{2i}} is:
$$ R_{B_{1i}}^{A_{2i}} = \begin{bmatrix} 1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 1 \end{bmatrix} \tag{8} $$
The rotation matrix of the frame {B_{2i}} relative to the frame {B_{1i}} can be calculated by transformations. Initially, the points A_i, B_i, and C_i lie on the same line; the frame {B_{2i}} is then obtained by rotating the frame {B_{1i}} around the two rotating axes of the universal joint by the two rotation angles β_i and γ_i, respectively. The rotation matrix of frame {B_{2i}} relative to frame {B_{1i}} is:
$$ R_{B_{2i}}^{B_{1i}} = \begin{bmatrix} -\cos\beta_i\cos\gamma_i & \cos\beta_i\sin\gamma_i & \sin\beta_i \\ -\sin\beta_i\cos\gamma_i & \sin\beta_i\sin\gamma_i & -\cos\beta_i \\ -\sin\gamma_i & -\cos\gamma_i & 0 \end{bmatrix} \tag{9} $$
Finally, the rotation matrices of the frame {A_{2i}} relative to the frame {O} and of the frame {B_{2i}} relative to the frame {O} are:
$$ R_{A_{2i}}^{O} = R_{O_i}^{O} R_{A_{1i}}^{O_i} R_{A_{2i}}^{A_{1i}} = \begin{bmatrix} \cos\alpha_i\cos\zeta_i & -\cos\alpha_i\sin\zeta_i & \sin\alpha_i \\ \sin\alpha_i\cos\zeta_i & -\sin\alpha_i\sin\zeta_i & -\cos\alpha_i \\ \sin\zeta_i & \cos\zeta_i & 0 \end{bmatrix} \tag{10} $$
$$ R_{B_{2i}}^{O} = R_{O_i}^{O} R_{A_{1i}}^{O_i} R_{A_{2i}}^{A_{1i}} R_{B_{1i}}^{A_{2i}} R_{B_{2i}}^{B_{1i}} = \begin{bmatrix} R_{11} & R_{12} & R_{13} \\ R_{21} & R_{22} & R_{23} \\ R_{31} & R_{32} & R_{33} \end{bmatrix} \tag{11} $$
where
$$ \begin{aligned} \beta_i &= -\zeta_i + \arctan\!\left(\frac{R_{31}}{R_{11}\cos\alpha_i + R_{21}\sin\alpha_i}\right) && \text{with } -\tfrac{\pi}{2} < \beta_i + \zeta_i < \tfrac{\pi}{2} \\ \gamma_i &= \arcsin\left(R_{21}\cos\alpha_i - R_{11}\sin\alpha_i\right) && \text{with } -\tfrac{\pi}{2} < \gamma_i < \tfrac{\pi}{2} \end{aligned} \tag{12} $$
and
$$ \begin{aligned} R_{11} &= -\sin\alpha_i\sin\gamma_i - \cos\alpha_i\cos(\beta_i+\zeta_i)\cos\gamma_i \\ R_{12} &= -\sin\alpha_i\cos\gamma_i + \cos\alpha_i\cos(\beta_i+\zeta_i)\sin\gamma_i \\ R_{13} &= \cos\alpha_i\sin(\beta_i+\zeta_i) \\ R_{21} &= \cos\alpha_i\sin\gamma_i - \sin\alpha_i\cos(\beta_i+\zeta_i)\cos\gamma_i \\ R_{22} &= \cos\alpha_i\cos\gamma_i + \sin\alpha_i\cos(\beta_i+\zeta_i)\sin\gamma_i \\ R_{23} &= \sin\alpha_i\sin(\beta_i+\zeta_i) \\ R_{31} &= -\sin(\beta_i+\zeta_i)\cos\gamma_i \\ R_{32} &= \sin(\beta_i+\zeta_i)\sin\gamma_i \\ R_{33} &= -\cos(\beta_i+\zeta_i) \end{aligned} \tag{13} $$

2.2. Kinematics

2.2.1. Inverse Kinematics

In the inverse kinematics, the position and Euler angles of the active plate are given, and the rotation angles of the motors need to be determined. A kinematic chain of the Hexa robot is shown in Figure 2. The coordinates of point A_i in frame {O} and of point C_i in frame {P} are given. The vector r_{C_i}^O can be expressed as follows:
$$ r_{C_i}^O = r_{A_i}^O + \overrightarrow{A_iB_i} + \overrightarrow{B_iC_i} = r_P^O + \overrightarrow{PC_i} \tag{14} $$
The kinematics equations can be expressed as follows:
$$ l_r^2 = l_a^2 + (x_{A_i} - x_{C_i})^2 + (y_{A_i} - y_{C_i})^2 + (z_{A_i} - z_{C_i})^2 + \left[\,2 l_a \cos\alpha_i (x_{A_i} - x_{C_i}) - 2 l_a \sin\alpha_i (y_{A_i} - y_{C_i})\,\right]\cos\zeta_i + 2 l_a (z_{A_i} - z_{C_i})\sin\zeta_i \tag{15} $$
where
$$ r_{C_i}^O = [\,x_{C_i}\;\;y_{C_i}\;\;z_{C_i}\,]^T, \qquad r_{A_i}^O = [\,x_{A_i}\;\;y_{A_i}\;\;z_{A_i}\,]^T \tag{16} $$
$$ \begin{aligned} a_i &= l_r^2 - l_a^2 - (x_{A_i} - x_{C_i})^2 - (y_{A_i} - y_{C_i})^2 - (z_{A_i} - z_{C_i})^2 \\ b_i &= 2 l_a \cos\alpha_i (x_{A_i} - x_{C_i}) - 2 l_a \sin\alpha_i (y_{A_i} - y_{C_i}) \\ c_i &= 2 l_a (z_{A_i} - z_{C_i}) \end{aligned} \qquad (i = 1,\ldots,6) \tag{17} $$
The rotation angle ζ_i can then be calculated by the following expression:
$$ \zeta_i = \arctan\frac{a_i c_i^2 - b_i\sqrt{c_i^2\left(b_i^2 + c_i^2 - a_i^2\right)}}{c_i\left(a_i b_i + \sqrt{c_i^2\left(b_i^2 + c_i^2 - a_i^2\right)}\right)} \tag{18} $$
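To make Equations (15)–(18) concrete, the following Python sketch evaluates the six motor angles for given joint points. The link lengths are placeholder values, and the use of atan2 on the sine/cosine pair is an equivalent, numerically safer branch of Equation (18), not the paper's exact implementation.

```python
import numpy as np

L_A, L_R = 0.12, 0.46   # arm length l_a and rod length l_r (placeholder values)
ALPHA = np.array([0, 0, 2*np.pi/3, 2*np.pi/3, 4*np.pi/3, 4*np.pi/3])  # Eq. (5)

def inverse_kinematics(r_A, r_C):
    """Motor angles zeta_i from Equations (15)-(18).

    r_A, r_C: (6, 3) arrays of the joint points A_i and C_i, both already
    expressed in the fixed frame {O}.
    """
    d = r_A - r_C                                   # (x_Ai - x_Ci, ...)
    a = L_R**2 - L_A**2 - np.sum(d**2, axis=1)      # a_i, Eq. (17)
    b = 2*L_A*np.cos(ALPHA)*d[:, 0] - 2*L_A*np.sin(ALPHA)*d[:, 1]   # b_i
    c = 2*L_A*d[:, 2]                               # c_i
    # Solve a_i = b_i cos(zeta_i) + c_i sin(zeta_i); the atan2 form below is
    # the same branch as Equation (18) but avoids quadrant ambiguity.
    root = np.sqrt(b**2 + c**2 - a**2)              # real iff the pose is reachable
    sin_z = (a*c - b*root) / (b**2 + c**2)
    cos_z = (a*b + c*root) / (b**2 + c**2)
    return np.arctan2(sin_z, cos_z)
```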

2.2.2. Forward Kinematics

The procedure of neural network training is presented in Figure 3. In the forward kinematics problem, the rotation angles of the arms are given, and the pose of the end-effector is determined by solving the kinematics equations. These equations are nonlinear, so finding their solutions analytically is highly complicated. For this reason, artificial neural networks (ANNs) are employed to calculate an approximate pose of the end-effector. First, a data set consisting of potential poses of the active plate is generated randomly. This data set is then filtered to reject singular points and points outside the workspace; the filtered set is called the target data in the ANN training. From the target data, the inverse kinematics problem is solved to create the data set of the angles ζ_i of the six arms, which is called the input data. The trained weights and biases are then used to evaluate the forward kinematics, as sketched below.
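As one possible realization of this training procedure, the sketch below uses scikit-learn's MLPRegressor; the network size, the pose sampling ranges, and the inverse_kinematics_from_pose wrapper (assumed to build r_A and r_C from a pose via Equation (14) and call the inverse kinematics above) are all assumptions, since the paper does not specify the network topology or library.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
# Target data: random end-effector poses [x, y, z, phi, theta, psi]
# (sampling ranges are placeholders; singular and out-of-workspace poses
# are assumed to have been filtered out already).
poses = rng.uniform(low=[-100, -100, 500, -5, -5, -5],
                    high=[100, 100, 600, 5, 5, 5], size=(5000, 6))

# Input data: the six arm angles from the inverse kinematics of each pose.
angles = np.array([inverse_kinematics_from_pose(p) for p in poses])

net = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=2000)
net.fit(angles, poses)                 # learn the zeta -> pose mapping

pose_hat = net.predict(angles[:1])     # approximate forward kinematics
```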

2.3. Dynamics

The dynamics equations can be obtained by several methods, such as the Newton-Euler Formulation, Lagrange equations of the first type, and the Virtual Work Principle. In this study, the Lagrange equations of the first type are applied to derive dynamics equations. Those equations are arranged into two sets and expressed as follows:
$$ \frac{d}{dt}\left(\frac{\partial L}{\partial \dot q_j}\right) - \frac{\partial L}{\partial q_j} = \tau_j + \sum_{i=1}^{6}\lambda_i\frac{\partial f_i}{\partial q_j} \qquad (j = 1,\ldots,6) \tag{19} $$
$$ \frac{d}{dt}\left(\frac{\partial L}{\partial \dot q_j}\right) - \frac{\partial L}{\partial q_j} = Q_j + \sum_{i=1}^{6}\lambda_i\frac{\partial f_i}{\partial q_j} \qquad (j = 7,\ldots,12) \tag{20} $$
where q = [ζ_1 ζ_2 ζ_3 ζ_4 ζ_5 ζ_6 x_P y_P z_P φ θ ψ]^T denotes the generalized coordinates; L denotes the Lagrangian function of the whole system, i.e., the sum of the Lagrangian functions of the active plate, the arms, and the rods; τ_j = K_t I_j is the actuator torque, where K_t is the motor torque constant and I_j is the motor armature current; Q_j is the generalized force; λ_i (i = 1, …, 6) is the i-th Lagrange multiplier; f_i = b_i cos ζ_i + c_i sin ζ_i − a_i is the i-th constraint equation; and a_i, b_i, and c_i are given in Equation (17).

2.3.1. Inverse Dynamics

The twelve motion variables q_j and the six generalized forces τ_j contributed by the actuators are given; solving the inverse dynamics problem means finding the six generalized external forces.
The first six-equation set, Equation (19), contains the six Lagrange multipliers λ_i (i = 1, …, 6) as the unknowns and can be written in the form:
$$ \sum_{i=1}^{6}\lambda_i\frac{\partial f_i}{\partial q_j} = \frac{d}{dt}\left(\frac{\partial L}{\partial \dot q_j}\right) - \frac{\partial L}{\partial q_j} - \tau_j \qquad (j = 1,\ldots,6) \tag{21} $$
The six generalized external forces Q_j (j = 7, …, 12) can then be calculated by solving the second six-equation set, Equation (20):
$$ Q_j = \frac{d}{dt}\left(\frac{\partial L}{\partial \dot q_j}\right) - \frac{\partial L}{\partial q_j} - \sum_{i=1}^{6}\lambda_i\frac{\partial f_i}{\partial q_j} \qquad (j = 7,\ldots,12) \tag{22} $$
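Since Equation (21) is linear in the six multipliers, it reduces to a 6 × 6 linear solve at each sampling instant. A minimal sketch, assuming the partial derivatives and the left-hand-side terms are supplied by a dynamics model evaluated elsewhere:

```python
import numpy as np

def lagrange_multipliers(df_dq, rhs):
    """Solve Equation (21) for the multipliers lambda_1..lambda_6.

    df_dq[j, i] holds df_i/dq_j for the actuated coordinates j = 1..6, and
    rhs[j] holds d/dt(dL/dq_dot_j) - dL/dq_j - tau_j, both evaluated at the
    current state.
    """
    return np.linalg.solve(df_dq, rhs)
```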

2.3.2. Forward Dynamics

The six generalized external forces Q_j and the six generalized forces τ_j contributed by the actuators are given; solving the forward dynamics problem means finding the twelve motion variables q_j.
Analyzing the problem, we have eighteen equations: the twelve dynamics equations in Equations (19) and (20) and the six kinematics equations in Equation (15). There are eighteen unknowns: the twelve motion variables q_j and the six Lagrange multipliers λ_i (i = 1, …, 6). This system is a differential-algebraic equation (DAE) system containing both differential and algebraic equations. The strategy for solving it is to reduce the differential order, which increases the number of variables and equations until the system becomes purely algebraic. MATLAB provides functions for solving DAEs, but solving the forward dynamics problem remains extremely complicated.

3. Collision Detection, Isolation and Identification of the Hexa Robot

3.1. Collision Detection

When a robot manipulator moves freely in its environment, an obstacle may enter its workspace at any time; if the obstacle touches the robot, a collision has occurred. Collision detection consists of recognizing whether such a collision has occurred. An observer computed from the dynamics equations can detect the occurrence of a collision: a suitable threshold is selected according to the required accuracy, and a collision is declared if the observer signal exceeds this threshold.
The generalized momentum of the Hexa robot is defined as:
$$ p_j = \frac{\partial L}{\partial \dot q_j} \qquad (j = 1,\ldots,12) \tag{23} $$
According to the second set of dynamics equations, Equation (20), the generalized force can be written in the form:
$$ Q_j = \dot p_j - \frac{\partial L}{\partial q_j} - \sum_{i=1}^{6}\lambda_i\frac{\partial f_i}{\partial q_j} \qquad (j = 7,\ldots,12) \tag{24} $$
The collision identification signal, or simply the residual r, is:
$$ \dot r = -K r + K Q \tag{25} $$
Substituting Q j from (24) into (25), we obtain:
$$ \dot r_j = -K_j r_j + K_j\left(\dot p_j - \frac{\partial L}{\partial q_j} - \sum_{i=1}^{6}\lambda_i\frac{\partial f_i}{\partial q_j}\right) \qquad (j = 7,\ldots,12) \tag{26} $$
The Lagrange multipliers λ_i (i = 1, …, 6) can be computed using the first set of dynamics equations, Equation (19). When a collision occurs, at least one element r_j of the vector r rises with a time constant 1/K_j, and the detection is performed as soon as |r_j| > r_low, where r_low is the chosen threshold.
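In discrete time, Equations (24)–(26) amount to a first-order filter update per sample. The following sketch assumes the momentum derivative, the Lagrangian gradient, and the constraint Jacobian are evaluated elsewhere; the 2 N default threshold matches the value used in Section 5.

```python
import numpy as np

def update_residual(r, p_dot, dL_dq, lam, df_dq, K, dt):
    """One Euler step of the residual dynamics, Equation (26).

    r, p_dot, dL_dq: 6-vectors for the coordinates q_7..q_12; lam: the six
    multipliers; df_dq[i, j]: df_i/dq_j for those coordinates; K: diagonal
    filter gains (1/K_j is the residual time constant); dt: sampling period.
    """
    Q_hat = p_dot - dL_dq - df_dq.T @ lam    # Equation (24)
    return r + dt * (-K * r + K * Q_hat)     # Equations (25) and (26)

def collision_detected(r, r_low=2.0):
    """Detection test |r_j| > r_low from Section 3.1."""
    return np.any(np.abs(r) > r_low)
```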

3.2. Collision Isolation

After the residual indicates that a collision has occurred on the Hexa robot, it is necessary to identify the collision location. Collision isolation aims at localizing the contact point. In this paper, a vision system consisting of dual depth cameras is applied to detect the contact point when a collision occurs. The photos from the first camera at the time of the collision are analyzed to identify whether the contact point is on the active plate or the rods; otherwise, the contact point is detected by processing the image captured by the second camera. Based on this multi-camera vision system, we propose the flowchart of the collision isolation procedure in Figure 4 and the mathematical processing for localizing the contact point in Figure 5.
In this study, every external object is assumed to enter the workspace from the outside and to partially occlude the Hexa robot. Because the whole Hexa parallel manipulator cannot be observed in the images captured by the first dual depth camera alone, this camera is responsible for isolating external objects that contact the active plate and the rods. In contrast, external obstacles contacting the arms are identified by the second dual depth camera. In both the arm and rod cases, the viewing direction is chosen so that the obstacle occludes the contacted part.
  • Case 1: Contact points are determined by the first camera.
Firstly, a bounding box around the active plate is recognized using YOLOv3. Because the color of the rods differs from the background, the rods can be isolated from the image; to this end, the RGB image is converted to the YCbCr color space. Then, the Canny edge detector and the Hough transform are applied to recognize straight-line profiles around the rods, from which the rods are identified in the image. The considered image and the background image are converted to grayscale, and a binary image is created by subtracting the grayscale background from the grayscale considered image. After combining the image subtraction with depth conditions on the objects in the workspace, the external object is identified by removing the Hexa robot from the binary image.
To find the contact point on the active plate, all points belonging to the external object are checked to see whether any of them touch the bounding box around the active plate. The image depth is then considered at these points: the difference between their depth and that of the bounding box must be less than a chosen threshold. A representative point is obtained by averaging all intersections of the external object with the bounding box. A line is drawn through this point and the end-effector; finally, the contact point is located at the intersection of this line with the hexagonal border of the active plate.
To find the contact point on a rod, every possible pair of points belonging to the external object is checked to see whether the midpoint of the pair lies on the rod. The image depth is then considered at these points: the difference between the depth of the pair and the depth of the midpoint on the rod must be less than the chosen threshold. All such midpoints are probable contact points, and the contact point location is obtained by averaging all the midpoint locations. A code sketch of this Case 1 pipeline follows.
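A compressed OpenCV sketch of the Case 1 isolation is given below. The subtraction threshold, the depth tolerance, and the robot mask (which in the paper comes from YOLOv3 plus the Canny/Hough rod detection) are placeholder assumptions.

```python
import cv2
import numpy as np

def isolate_external_object(frame, background, depth, depth_range, robot_mask):
    """Binary mask of the external object (background subtraction + depth).

    frame/background: BGR images; depth: per-pixel depth image (mm, uint16);
    depth_range: (near, far) workspace limits; robot_mask: mask of the
    detected active plate and rods, assumed given.
    """
    gray_f = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    gray_b = cv2.cvtColor(background, cv2.COLOR_BGR2GRAY)
    _, binary = cv2.threshold(cv2.absdiff(gray_f, gray_b), 30, 255,
                              cv2.THRESH_BINARY)
    in_workspace = cv2.inRange(depth, depth_range[0], depth_range[1])
    binary = cv2.bitwise_and(binary, in_workspace)       # keep workspace only
    return cv2.bitwise_and(binary, cv2.bitwise_not(robot_mask))

def contact_point_on_plate(obj_mask, box_mask, depth, box_depth, tol=20):
    """Average the object pixels that touch the plate's bounding box and lie
    within tol (mm) of the box depth; returns the representative pixel.
    Assumes at least one touching pixel passes the depth test."""
    ys, xs = np.nonzero(cv2.bitwise_and(obj_mask, box_mask))
    keep = np.abs(depth[ys, xs].astype(int) - box_depth) < tol
    return np.array([xs[keep].mean(), ys[keep].mean()])
```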
  • Case 2: Contact points are determined by the second camera.
Based on the image depth, only objects in a limited workspace are considered in the image captured from the second camera. Because the color of the arms is blue, the arms can be isolated from the image: an appropriate threshold on the HSV color scale is chosen to extract the blue region. Then, the Canny edge detector and the Hough transform are applied to recognize straight-line profiles along the arms, from which the directions of the arms are identified. Two fixed points are chosen at the motor locations where two arms are connected with two motors. According to the fixed points and the arm directions, all points belonging to the blue region along an arm direction are checked to see whether any of them intersect the non-blue region. The intersections are probable contact point locations, and the correct one is detected with the depth conditions.
After the RGB image is converted to the YCbCr color space, the rods can be isolated from the image. The Canny edge detector and the Hough transform are applied to recognize straight-line profiles along the rods, from which the rod directions are calculated. Two points are computed at the connector locations where two arms are connected with two rods. According to the computed points and the rod directions, the points of the white region along a rod direction are checked for intersections with non-white regions. The intersections are probable contact point locations, and the correct one is determined from the depth conditions and the above junctions. The color segmentation and line extraction steps are sketched below.
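The following sketch covers the color segmentation and line extraction used with the second camera. The HSV bounds are typical values for blue, and the Canny/Hough parameters are assumptions, not the thresholds used in the paper.

```python
import cv2
import numpy as np

def arm_lines(frame_bgr):
    """Segment the blue arms and recover their straight-line directions."""
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    blue = cv2.inRange(hsv, (100, 80, 50), (130, 255, 255))  # blue region
    edges = cv2.Canny(blue, 50, 150)
    lines = cv2.HoughLinesP(edges, rho=1, theta=np.pi / 180, threshold=40,
                            minLineLength=60, maxLineGap=10)
    return blue, lines   # lines: (x1, y1, x2, y2) segments along the arms

# Walking from the fixed motor pixel along a detected line, the first blue
# pixel adjacent to a non-blue pixel is a candidate contact point; the depth
# test described above then selects the correct candidate.
```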

3.3. Collision Identification

After the collision occurs and the contact point location is known, the contact force must be determined. This study applies a recursive Newton-Euler algorithm (RNEA) to calculate the contact force instead of using additional devices. The inputs for this algorithm are the current intensities measured by the current sensors and the pose of the end-effector obtained from image processing.
Three cases correspond to the three possibilities that a collision occurs at an arm, at a rod, or at the active plate. The RNEA is developed separately for each case of the Hexa robot.
$$ \begin{aligned} F_{a_i} &= m_a \dot v_{a_i}, & M_{a_i} &= I_{a_i}\dot\omega_{a_i} + \omega_{a_i}\times(I_{a_i}\omega_{a_i}) \\ F_{r_i} &= m_r \dot v_{r_i}, & M_{r_i} &= I_{r_i}\dot\omega_{r_i} + \omega_{r_i}\times(I_{r_i}\omega_{r_i}) \\ F_p &= m_p \dot v_p, & M_p &= I_p\dot\omega_p + \omega_p\times(I_p\omega_p) \end{aligned} \tag{27} $$
where F_{a_i} and M_{a_i} are the inertia force and moment acting on the i-th arm; m_a is the mass of an arm; v_{a_i}, ω_{a_i}, and I_{a_i} are the linear velocity of the center of mass (CoM), the angular velocity, and the tensor of mass moment of inertia of the i-th arm; F_{r_i} and M_{r_i} are the inertia force and moment acting on the i-th rod; m_r is the mass of a rod; v_{r_i}, ω_{r_i}, and I_{r_i} are the CoM linear velocity, the angular velocity, and the inertia tensor of the i-th rod; F_p and M_p are the inertia force and moment acting on the active plate; m_p is the mass of the active plate; and v_p, ω_p, and I_p are the CoM linear velocity, the angular velocity, and the inertia tensor of the active plate.
  • Case 1: A collision occurs on the active plate.
First step: According to the six kinematic chains, calculate f_{B_i} and f_{C_i} (i = 1, …, 6).
$$ \overrightarrow{A_iB_i} \times f_{B_i} = n_{A_i} - n_{B_i} - \tfrac{1}{2}\,\overrightarrow{A_iB_i} \times F_{a_i} - M_{a_i} \tag{28} $$
$$ f_{B_i} - f_{C_i} = F_{r_i} \tag{29} $$
$$ \overrightarrow{B_iC_i} \times f_{C_i} = n_{B_i} - n_{C_i} - \tfrac{1}{2}\,\overrightarrow{B_iC_i} \times F_{r_i} - M_{r_i} \tag{30} $$
where f_{A_i} and n_{A_i} are the reaction force and moment exerted on the i-th arm (n_{A_i} can be computed from the i-th motor torque); f_{B_i} and n_{B_i} = 0 are the force and moment exerted on the rod by the arm; and f_{C_i} and n_{C_i} = 0 are the force and moment exerted on the active plate by the i-th rod.
Second step: Calculate the external force F_{ep}.
$$ F_{ep} = -\sum_{i=1}^{6} f_{C_i} + F_p \tag{31} $$
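Numerically, Equations (28)–(30) are linear in the unknown joint forces of each chain, so they can be stacked and solved directly before applying Equation (31). The sketch below is a minimal numerical realization, assuming every vector is already expressed in frame {O}; the least-squares solve is a convenient way to handle the nine stacked scalar equations, which generically determine the six unknowns.

```python
import numpy as np

def skew(v):
    """Matrix form of the cross product: skew(a) @ b == np.cross(a, b)."""
    return np.array([[0, -v[2], v[1]], [v[2], 0, -v[0]], [-v[1], v[0], 0]])

def chain_joint_forces(AB, BC, n_A, n_B, n_C, F_a, M_a, F_r, M_r):
    """Solve Equations (28)-(30) of one chain for f_B and f_C."""
    A = np.zeros((9, 6))
    b = np.zeros(9)
    A[0:3, 0:3] = skew(AB)                    # Eq. (28): AB x f_B = ...
    b[0:3] = n_A - n_B - 0.5*np.cross(AB, F_a) - M_a
    A[3:6, 0:3], A[3:6, 3:6] = np.eye(3), -np.eye(3)   # Eq. (29)
    b[3:6] = F_r
    A[6:9, 3:6] = skew(BC)                    # Eq. (30): BC x f_C = ...
    b[6:9] = n_B - n_C - 0.5*np.cross(BC, F_r) - M_r
    x = np.linalg.lstsq(A, b, rcond=None)[0]
    return x[:3], x[3:]                       # f_B, f_C

def plate_external_force(chains, F_p):
    """Equation (31): F_ep = -sum_i f_Ci + F_p over the six chains."""
    return F_p - sum(chain_joint_forces(**ch)[1] for ch in chains)
```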
  • Case 2: A collision occurs at an arm.
First step: Assume the collision occurred on the first kinematic chain. According to the five remaining chains, calculate f_{B_i} and f_{C_i} (i = 2, …, 6) from Equations (28)–(30).
Second step: Calculate f_{C_1} based on the force balance equation at the active plate.
$$ f_{C_1} = -\sum_{i=2}^{6} f_{C_i} + F_p \tag{32} $$
Third step: Calculate the external force F_{e_1} on the first arm.
$$ \overrightarrow{A_1D} \times F_{e_1} = n_{B_1} - n_{A_1} + \overrightarrow{A_1B_1} \times f_{B_1} + \tfrac{1}{2}\,\overrightarrow{A_1B_1} \times F_{a_1} + M_{a_1} \tag{33} $$
where D is the contact point, which is determined in the collision isolation phase.
  • Case 3: A collision occurs at a rod.
First step: Assume the collision occurred in the first kinematic chain. According to the five remaining chains, calculate f_{B_i} and f_{C_i} (i = 2, …, 6) from Equations (28)–(30).
Second step: Calculate f_{C_1} based on the force balance equation at the active plate according to Equation (32).
Third step: Calculate the external force F_{e_1} on the first rod.
$$ \overrightarrow{A_1B_1} \times f_{B_1} = n_{A_1} - n_{B_1} - \tfrac{1}{2}\,\overrightarrow{A_1B_1} \times F_{a_1} - M_{a_1} \tag{34} $$
$$ f_{B_1} + F_{e_1} = f_{C_1} + F_{r_1} \tag{35} $$
$$ \overrightarrow{B_1D} \times F_{e_1} = n_{C_1} - n_{B_1} + \overrightarrow{B_1C_1} \times f_{C_1} + \tfrac{1}{2}\,\overrightarrow{B_1C_1} \times F_{r_1} + M_{r_1} \tag{36} $$
where D is the contact point, which is determined in the collision isolation phase.

4. Experimental Setup

Figure 6 is the block diagram of the Hexa robot system. The computer and control software are responsible for controlling the robot via RS232 and for solving the collision problems. The robot controller is an Arduino Uno R3 microcontroller board, and six servo motors drive the arms. A current sensor module measures the currents and sends the signals to an Arduino Mega 2560. The vision system, including dual stereo cameras, monitors the Hexa robot, while the computer and software perform the image processing. The first camera is mounted under the base platform, and the second camera is fixed away from the workspace of the Hexa robot. To verify the accuracy of the algorithm, a force/torque sensor is attached to the active plate to measure forces and torques.
Figure 7 is the block diagram of the collision problem solving process. In the first step, the end-effector pose [x_P y_P z_P φ θ ψ]^T is obtained from the processed images captured by the first camera; the six rotation angles [ζ_1 ζ_2 ζ_3 ζ_4 ζ_5 ζ_6]^T are computed by solving the inverse kinematics problem, and the actuator torques τ are computed from the currents [I_1 I_2 I_3 I_4 I_5 I_6]^T measured by the current sensors. In the second step, the Lagrange multipliers λ_i (i = 1, …, 6) are calculated from the first set of dynamics equations. In the third step, the residual r = [r_1 r_2 r_3 r_4 r_5 r_6]^T is computed from the end-effector pose and the Lagrange multipliers; the residual indicates whether a robot collision has occurred. In the fourth step, the contact point location r_D is determined from the images captured by the two stereo cameras. In the final step, after collision isolation, the recursive Newton-Euler algorithm is used to identify the external force F_e. The overall system thus acts as a virtual sensor whose inputs are the measured currents and the end-effector pose and whose outputs are the external contact force and the contact point position; further research will probably cover exploiting those outputs and regenerating the trajectory. The five steps are summarized in the sketch below.
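The loop below mirrors Figure 7 in code form. It reuses lagrange_multipliers, update_residual, and collision_detected from the earlier sketches, while estimate_pose, isolate_contact, rnea_external_force, the model object, and the torque constant K_T are hypothetical placeholders for the components described above.

```python
import numpy as np

K_T = 0.05   # assumed motor torque constant (N*m/A), placeholder value

def collision_pipeline(camera1, camera2, current_sensors, model, dt):
    """One possible realization of the five-step process of Figure 7."""
    r = np.zeros(6)                                    # residual state
    while True:
        pose = estimate_pose(camera1.read())           # step 1: vision pose
        tau = K_T * current_sensors.read()             # torques from currents
        lam = lagrange_multipliers(*model.first_set(pose, tau))      # step 2
        # model.residual_terms is assumed to return (p_dot, dL_dq, lam, df_dq)
        r = update_residual(r, *model.residual_terms(pose, lam),
                            K=model.K, dt=dt)          # step 3: residual
        if collision_detected(r):
            r_D = isolate_contact(camera1, camera2)    # step 4: isolation
            F_e = rnea_external_force(model, pose, tau, r_D)   # step 5: RNEA
            return r_D, F_e
```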
Figure 8 shows the Hexa robot system used in this study. Six arms are driven by six Servo motors fixed on the base platform. The first stereo camera is mounted under the base platform to monitor the workspace from above. The second stereo camera is fixed on an immovable bar to observe from the outside. The force/torque sensor is mounted on the active plate to evaluate the external force in the experiment.

5. Experimental Results

5.1. Case 1: Collision at the Active Plate

5.1.1. Collision Detection at the Active Plate

In this experiment, the Hexa robot is controlled to move the active plate along a circular trajectory in the workspace, and a collision occurs due to an external force acting on the active plate. While the active plate moves along the circular trajectory, the first camera mounted under the base platform captures images, which are processed to compute the pose of the end-effector, shown in Figure 9. The analog voltage signals from the six current sensors are continuously sent to the computer via RS232; the measured current intensities are shown in Figure 10. The equations of the circular trajectory are:
$$ \begin{cases} x = 100\cos(0.4\pi t) \ \text{(mm)} \\ y = 100\sin(0.4\pi t) \ \text{(mm)} \\ z = 550 \ \text{(mm)} \\ \alpha = 6 \ \text{(deg)} \\ \beta = 1 - 30\cos(0.4\pi t) \ \text{(deg)} \\ \gamma = 1 + 30\sin(0.4\pi t) \ \text{(deg)} \end{cases} \tag{37} $$
The three angles (α, β, γ) are defined in compliance with the Z-Y-X rotation convention of Tait-Bryan angles and can be converted to the three angles (φ, θ, ψ). The rotation matrix is expressed as follows:
$$ R = \begin{bmatrix} \cos\alpha\cos\beta & \cos\alpha\sin\beta\sin\gamma - \sin\alpha\cos\gamma & \cos\alpha\sin\beta\cos\gamma + \sin\alpha\sin\gamma \\ \sin\alpha\cos\beta & \sin\alpha\sin\beta\sin\gamma + \cos\alpha\cos\gamma & \sin\alpha\sin\beta\cos\gamma - \cos\alpha\sin\gamma \\ -\sin\beta & \cos\beta\sin\gamma & \cos\beta\cos\gamma \end{bmatrix} \tag{38} $$
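The conversion from (α, β, γ) to (φ, θ, ψ) can be delegated to a library. The sketch below uses SciPy, assuming Equation (38) corresponds to the rotation sequence Rz(α)Ry(β)Rx(γ) and Equation (1) to the intrinsic Z-Y-Z Euler form.

```python
from scipy.spatial.transform import Rotation as R

def taitbryan_to_euler(alpha, beta, gamma):
    """Convert the Tait-Bryan angles (deg) of Eq. (38) to the Euler angles
    (phi, theta, psi) of Eq. (1)."""
    rot = R.from_euler("ZYX", [alpha, beta, gamma], degrees=True)  # Eq. (38)
    phi, theta, psi = rot.as_euler("ZYZ", degrees=True)            # Eq. (1)
    return phi, theta, psi

print(taitbryan_to_euler(6.0, 1.0 - 30.0, 1.0))  # trajectory (37) at t = 0
```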
According to Equation (26), the residuals are computed to detect a collision, as shown in Figure 11. Because the model error is less than 2 (N), a suitable threshold r_low is chosen equal to 2 (N). When a collision occurs, the residual r_j (j = 7, 8, 9) rises with a time constant 1/K_j, and detection is performed as soon as |r_j| > r_low. According to Figure 11, a collision occurred at 3.4 s because the residual in the Z-axis is greater than r_low = 2 (N). The external force reached the maximum value at 3.8 s during the collision. When contact is lost, |r_j| decreases quickly until |r_j| < r_low.

5.1.2. Collision Isolation at the Active Plate

The image captured when the collision started is analyzed to find the contact point location. According to the flowchart of the collision isolation procedure shown in Figure 4, a bounding box around the active plate is recognized using YOLOv3, as shown in Figure 12a. The RGB image is converted to the YCbCr color space, and the Canny edge detector and the Hough transform are applied to recognize straight-line profiles around the rods; these straight lines are shown in Figure 12b, and the identified rods in Figure 12c. The considered image and the background image are converted to grayscale, and a binary image is created by subtracting the grayscale background from the grayscale considered image. Because image subtraction alone cannot completely eliminate objects outside the workspace, the image depth is also used, so that only objects in the workspace remain, as shown in Figure 12d. Since the Hexa robot, including the active plate and the rods, has been determined, the external object is identified by removing the Hexa robot from the binary image; the result is shown in Figure 12e. According to the flowchart of the mathematical processing shown in Figure 5, 180 points satisfying the conditions are determined; all of them are intersections between the external object and the bounding box around the active plate, as shown in Figure 12f. A representative pixel is obtained by averaging these 180 pixels, and a line is drawn through this pixel and the end-effector. According to the marker's location, the hexagonal profile of the active plate can be calculated, and the contact point is the intersection between the line and the hexagonal profile, shown in Figure 12g. Following this strategy, the contact point location in the considered image is determined as [358.87 618.66]^T (pixels); according to the camera model, the contact point coordinate is identified as [10.65 48.46 578.45]^T (mm) when the collision occurs at the active plate.
The collision occurred on the active plate, and the external force reached its maximum value at 3.8 s. Similarly, 155 points are intersections between the external object and the bounding box around the active plate. The contact point location in the considered image is determined as [406.37 580.09]^T (pixels), and the contact point coordinate is identified as [15.96 71.62 579.72]^T (mm) when the external force reaches its maximum value.

5.1.3. Collision Identification at the Active Plate

According to Equation (27), the inertia forces acting on the arms, the rods, and the active plate are calculated. According to Equations (28)–(31), the external force acting on the active plate can then be calculated, as shown in Figure 13.
Based on Figure 13, the external forces computed by the Lagrange equations and by the RNEA are close to each other and agree acceptably with the force sensor measurements. The collision started on the active plate at 3.4 s, when the external force was about [0.67 0.69 3.13]^T (N); the force then reached its maximum value of [5.36 2.27 11.036]^T (N) at 3.8 s.
The root mean square error (RMSE) is applied to evaluate the accuracy of RNEA, expressed as follows:
$$ \mathrm{RMSE} = \sqrt{\frac{1}{n}\sum_{i=1}^{n} e_i^2} \tag{39} $$
where e_i (i = 1, 2, …, 50) is the error between the force computed by the RNEA and the force sensor measurement, and n = 50 is the number of samples.
The RMSE is equal to 0.819 (N) in the X-axis, 0.4762 (N) in the Y-axis, and 1.2823 (N) in the Z-axis.
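Equation (39) reduces to one line of NumPy; with the RNEA forces and the sensor forces stored as (50, 3) arrays, the per-axis values quoted above would be computed as:

```python
import numpy as np

def rmse(f_rnea, f_sensor):
    """Per-axis RMSE of Equation (39) over the n = 50 samples."""
    e = np.asarray(f_rnea) - np.asarray(f_sensor)
    return np.sqrt(np.mean(e**2, axis=0))   # -> (RMSE_X, RMSE_Y, RMSE_Z)
```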

5.2. Case 2: Collision at an Arm

5.2.1. Collision Detection at an Arm

In this experiment, while the active plate moved along a helix trajectory, a collision occurred due to an external force acting on an arm. The pose of the end-effector is shown in Figure 14, and the measured current intensities are shown in Figure 15. The equations of the helix trajectory are:
$$ \begin{cases} x = 100\cos(0.4\pi t) \ \text{(mm)} \\ y = 100\sin(0.4\pi t) \ \text{(mm)} \\ z = 550 + 5t \ \text{(mm)} \\ \alpha = 6 \ \text{(deg)} \\ \beta = 1 - 25\cos(0.4\pi t) \ \text{(deg)} \\ \gamma = 1 + 25\sin(0.4\pi t) \ \text{(deg)} \end{cases} \tag{40} $$
According to Figure 16, a collision occurred at 5.2 s because the residual in the Z-axis is greater than r_low = 2 (N). The external force reached the maximum value at 5.6 s during the collision.

5.2.2. Collision Isolation at an Arm

When the collision started, no contact point was detected in the image captured from the first camera, so the image captured from the second camera at 5.2 s is analyzed to find the contact point location, shown in Figure 17a. According to the flowchart of the collision isolation procedure shown in Figure 4, only objects in the limited workspace are considered in the image captured from the second camera, as shown in Figure 17b. Because the color of the arms is blue, the arms can be isolated from the image: an appropriate threshold on the HSV color scale is chosen to extract the blue region, shown in Figure 17c. Then, the Canny edge detector and the Hough transform are applied to recognize straight-line profiles along the arms; these straight lines are shown in Figure 17d. According to the flowchart of the mathematical processing shown in Figure 5, a fixed pixel is chosen as the motor location where an arm is connected with that motor. According to the fixed pixel and the direction of the arm, all pixels belonging to the blue region along the arm direction are checked to see whether any of them intersect the non-blue regions, as shown in Figure 17e. The intersections are potential contact point locations, and two of them should be considered, shown in Figure 17f. The first intersection coordinate is [145 811]^T (pixels), with a depth of 440 (mm); it is not the contact point, because a point in the non-blue region with coordinate [165 818]^T (pixels) has a depth of 379 (mm), and the difference between the two points' depths is larger than the chosen depth threshold. The second intersection coordinate is [262 850]^T (pixels), with a depth of 419 (mm); it is the contact point, because a pixel in the non-blue region with coordinate [242 845]^T (pixels) has a depth of 408 (mm), and the difference between the two points' depths is smaller than the chosen depth threshold. The contact point coordinate is identified as [113.25 263.67 111.3]^T (mm), shown in Figure 17g.
At 5.6 s, the external force on the arm reached its maximum value. The intersection coordinate is [232 885]^T (pixels), with a depth of 409 (mm), and the contact point coordinate is identified as [121.99 278.95 99.66]^T (mm).

5.2.3. Collision Identification at an Arm

According to Equation (27), the inertia forces acting on the arms, the rods, and the active plate are calculated. According to Equations (32) and (33), the external force acting on the arm can then be calculated, as shown in Figure 18.
Based on Figure 18, the collision started on the arm at 5.2 s, when the external force was about [1.976 2.557 2.254]^T (N); the force then reached its maximum value of [4.941 7.38 5.785]^T (N) at 5.6 s.
According to Equation (39), the RMSE is applied to evaluate the accuracy of the RNEA: the RMSE is equal to 1.1193 (N) in the X-axis, 1.4974 (N) in the Y-axis, and 1.1056 (N) in the Z-axis.

5.3. Case 3: Collision at a Rod

5.3.1. Collision Detection at a Rod

In this experiment, while the active plate moved along an ellipse trajectory, a collision occurred due to an external force acting on a rod. The pose of the end-effector is shown in Figure 19, and the measured current intensities are shown in Figure 20. The equations of the ellipse trajectory are:
$$ \begin{cases} x = 80\cos(0.4\pi t) \ \text{(mm)} \\ y = 110\sin(0.4\pi t) \ \text{(mm)} \\ z = 550 \ \text{(mm)} \\ \alpha = 6 \ \text{(deg)} \\ \beta = 1 - 30\cos(0.4\pi t) \ \text{(deg)} \\ \gamma = 1 + 30\sin(0.4\pi t) \ \text{(deg)} \end{cases} \tag{41} $$
According to Figure 21, a collision occurred at 4.4 s because the residual in the X-axis is greater than r_low = 2 (N). The external force reached the maximum value at 4.6 s during the collision.

5.3.2. Collision Isolation at a Rod

Using the same method as in Case 1, the collision isolation results are shown in Figure 22a–e. According to the flowchart of the mathematical processing shown in Figure 5, 46 midpoints satisfying the conditions are determined, as shown in Figure 22f. A representative pixel is obtained by averaging these 46 pixels, and the contact point location is shown in Figure 22g. Following the strategy mentioned above, the contact point location in the considered image is determined as [382.8 698.83]^T (pixels); according to the camera model, the contact point coordinate is identified as [122.93 144.08 411.59]^T (mm) when the collision occurs at the rod.
At 4.6 s, the external force on the rod reached its maximum value. Similarly, 447 pixels are potential contact points on the rod. The contact point location in the considered image is determined as [399.56 702.97]^T (pixels), and the contact point coordinate is identified as [125.49 137.89 406.5207]^T (mm) when the external force reaches its maximum value.

5.3.3. Collision Identification at a Rod

According to Equation (27), the inertia forces acting on the arms, the rods, and the active plate are calculated. According to Equations (34)–(36), the external force acting on the rod can then be calculated, as shown in Figure 23.
Based on Figure 23, the collision started on the rod at 4.4 s, when the external force was about [2.487 0.925 1.121]^T (N); the force then reached its maximum value of [8.055 2.631 3.837]^T (N) at 4.6 s.
According to Equation (39), the RMSE is applied to evaluate the accuracy of the RNEA: the RMSE is equal to 1.2268 (N) in the X-axis, 1.1874 (N) in the Y-axis, and 1.2627 (N) in the Z-axis.

6. Conclusions

This study has presented a strategy for solving collision problems for the Hexa parallel robot. The proposed virtual force observer is created by combining the solution of the inverse dynamics problem with a low-pass filter. While the Hexa robot moves the active plate along the designed trajectory, the observer monitors the robot and indicates the collision time for the three main parts: the active plate, the rods, and the arms. The multi-dual depth camera system performs effectively in monitoring the workspace and the external obstacle, and the contact point location in each collision case is obtained by the proposed flowcharts for the collision phases. The recursive Newton-Euler algorithm is applied to analyze the dynamics free-body diagrams of the Hexa robot; with the determined contact point location, the magnitude and direction of the external force are finally identified. The comparison with the experimental force sensor measurements shows the potential of the proposed approach for solving the collision problem in the Hexa robot structure.

Author Contributions

Conceptualization, X.-B.H. and Y.-L.K.; methodology, X.-B.H. and P.-C.P.; software, X.-B.H.; validation, P.-C.P. and Y.-L.K.; formal analysis, X.-B.H. and P.-C.P.; investigation, X.-B.H.; resources, Y.-L.K.; data curation, X.-B.H.; writing—original draft preparation, X.-B.H.; writing—review and editing, P.-C.P. and Y.-L.K.; visualization, X.-B.H.; supervision, Y.-L.K.; project administration, Y.-L.K.; funding acquisition, Y.-L.K. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the Ministry of Science and Technology, Taiwan, grant number MOST 110-2221-E-011-124.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

Data are available on request due to privacy restrictions.

Acknowledgments

This work was supported in part by the Ministry of Science and Technology, Taiwan, under Grant MOST 110-2221-E-011-124.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Merlet, J.P.; Gosselin, C. Parallel Mechanisms and Robots. In Springer Handbook of Robotics; Springer: Berlin/Heidelberg, Germany, 2008; pp. 269–285.
  2. Pierrot, F.; Dauchez, P.; Fournier, A. HEXA: A Fast Six-DOF Fully Parallel Robot. In Proceedings of the Fifth International Conference on Advanced Robotics 'Robots in Unstructured Environments', Pisa, Italy, 19–22 June 1991; Volume 2, pp. 1158–1163.
  3. Huynh, B.-P.; Wu, C.-W.; Kuo, Y.-L. Force/Position Hybrid Control for a Hexa Robot Using Gradient Descent Iterative Learning Control Algorithm. IEEE Access 2019, 7, 72329–72342.
  4. Filho, S.C.T.; Cabral, E.L.L. Kinematics and Workspace Analysis of a Parallel Architecture Robot: The Hexa. ABCM Symp. Ser. Mechatron. 2005, 2, 158–165.
  5. Dehghani, M.; Ahmadi, M.; Khayatian, A.; Eghtesad, M.; Farid, M. Neural Network Solution for Forward Kinematics Problem of HEXA Parallel Robot. In Proceedings of the 2008 American Control Conference, Seattle, WA, USA, 11–13 June 2008; pp. 4214–4219.
  6. Filho, S.C.T.; Cabral, E.L.L. Dynamics and Jacobian Analysis of a Parallel Architecture Robot: The Hexa. ABCM Symp. Ser. Mechatron. 2005, 2, 166–173.
  7. Wang, X.; Tian, Y. Inverse Dynamics of Hexa Parallel Robot Based on the Lagrangian Equations of First Type. In Proceedings of the 2010 International Conference on Mechanic Automation and Control Engineering, Wuhan, China, 26–28 June 2010; pp. 3712–3716.
  8. Birjandi, S.A.B.; Kuhn, J.; Haddadin, S. Observer-Extended Direct Method for Collision Monitoring in Robot Manipulators Using Proprioception and IMU Sensing. IEEE Robot. Autom. Lett. 2020, 5, 954–961.
  9. Birjandi, S.A.B.; Haddadin, S. Model-Adaptive High-Speed Collision Detection for Serial-Chain Robot Manipulators. IEEE Robot. Autom. Lett. 2020, 5, 6544–6551.
  10. Qiu, Z.; Ozawa, R. A Sensorless Collision Detection Approach Based on Virtual Contact Points. In Proceedings of the IEEE International Conference on Robotics and Automation, Brisbane, QLD, Australia, 21–25 May 2018; pp. 7639–7645.
  11. Tian, Y.; Chen, Z.; Jia, T.; Wang, A.; Li, L. Sensorless Collision Detection and Contact Force Estimation for Collaborative Robots Based on Torque Observer. In Proceedings of the 2016 IEEE International Conference on Robotics and Biomimetics (ROBIO), Qingdao, China, 3–7 December 2016; pp. 946–951.
  12. Yao, Y.; Shen, Y.; Lu, Y.; Zhuang, C. Sensorless Collision Detection Method for Robots with Uncertain Dynamics Based on Fuzzy Logics. In Proceedings of the 2020 IEEE International Conference on Mechatronics and Automation (ICMA 2020), Beijing, China, 13–16 October 2020; pp. 413–418.
  13. Mcmahan, W.; Romano, J.M.; Kuchenbecker, K.J. Using Accelerometers to Localize Tactile Contact Events on a Robot Arm. In Proceedings of the Workshop on Advances in Tactile Sensing and Touch-Based Human-Robot Interaction, ACM/IEEE International Conference on Human-Robot Interaction, 2012.
  14. Fritzsche, M.; Saenz, J.; Penzlin, F. A Large Scale Tactile Sensor for Safe Mobile Robot Manipulation. In Proceedings of the 2016 11th ACM/IEEE International Conference on Human-Robot Interaction (HRI), Christchurch, New Zealand, 7–10 March 2016; pp. 427–428.
  15. Fritzsche, M.; Elkmann, N.; Schulenburg, E. Tactile Sensing: A Key Technology for Safe Physical Human Robot Interaction. In Proceedings of the 6th International Conference on Human-Robot Interaction (HRI '11), Lausanne, Switzerland, 6–9 March 2011; p. 139.
  16. Cannata, G.; Maggiali, M.; Metta, G.; Sandini, G. An Embedded Artificial Skin for Humanoid Robots. In Proceedings of the IEEE International Conference on Multisensor Fusion and Integration for Intelligent Systems, Seoul, Korea, 20–22 August 2008; pp. 434–438.
  17. Yen, S.H.; Tang, P.C.; Lin, Y.C.; Lin, C.Y. Development of a Virtual Force Sensor for a Low-Cost Collaborative Robot and Applications to Safety Control. Sensors 2019, 19, 2603.
  18. Magrini, E.; Flacco, F.; de Luca, A. Estimation of Contact Forces using a Virtual Force Sensor. In Proceedings of the IEEE International Conference on Intelligent Robots and Systems, Chicago, IL, USA, 14–18 September 2014; pp. 2126–2133.
  19. Popov, D.; Klimchik, A.; Mavridis, N. Collision Detection, Localization & Classification for Industrial Robots with Joint Torque Sensors. In Proceedings of the 2017 26th IEEE International Symposium on Robot and Human Interactive Communication (RO-MAN), Lisbon, Portugal, 28 August–1 September 2017; pp. 838–843.
  20. de Luca, A.; Albu-Schaffer, A.; Haddadin, S.; Hirzinger, G. Collision Detection and Safe Reaction with the DLR-III Lightweight Manipulator Arm. In Proceedings of the 2006 IEEE/RSJ International Conference on Intelligent Robots and Systems, Beijing, China, 9–15 October 2006; pp. 1623–1630.
  21. Flacco, F.; Paolillo, A.; Kheddar, A. Residual-Based Contacts Estimation for Humanoid Robots. In Proceedings of the 2016 IEEE-RAS 16th International Conference on Humanoid Robots (Humanoids), Cancun, Mexico, 15–17 November 2016; pp. 409–415.
  22. Vorndamme, J.; Schappler, M.; Haddadin, S. Collision Detection, Isolation and Identification for Humanoids. In Proceedings of the 2017 IEEE International Conference on Robotics and Automation (ICRA), Singapore, 29 May–3 June 2017; pp. 4754–4761.
  23. Ebert, D.M.; Henrich, D.D. Safe Human-Robot-Cooperation: Image-Based Collision Detection for Industrial Robots. In Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems, Lausanne, Switzerland, 30 September–4 October 2002; Volume 2, pp. 1826–1831.
  24. Rahul; Nair, B.B. Camera-Based Object Detection, Identification and Distance Estimation. In Proceedings of the 2018 2nd International Conference on Micro-Electronics and Telecommunication Engineering (ICMETE), Ghaziabad, India, 20–21 September 2018; pp. 203–205.
  25. Mito, Y.; Morimoto, M.; Fujii, K. An Object Detection and Extraction Method Using Stereo Camera. In Proceedings of the 2006 World Automation Congress, Budapest, Hungary, 24–26 July 2006; pp. 1–6.
  26. Lin, S.F.; Huang, S.H. Moving Object Detection from a Moving Stereo Camera via Depth Information and Visual Odometry. In Proceedings of the 2018 IEEE International Conference on Applied System Invention (ICASI), Chiba, Japan, 13–17 April 2018; pp. 437–440.
  27. Mohan, A.S.; Resmi, R. Video Image Processing for Moving Object Detection and Segmentation Using Background Subtraction. In Proceedings of the 2014 First International Conference on Computational Systems and Communications (ICCSC), Trivandrum, India, 17–18 December 2014; pp. 288–292.
  28. Redmon, J.; Divvala, S.; Girshick, R.; Farhadi, A. You Only Look Once: Unified, Real-Time Object Detection. In Proceedings of the 2016 IEEE Conference on Computer Vision and Pattern Recognition (CVPR), Las Vegas, NV, USA, 27–30 June 2016; pp. 779–788.
  29. Petrovskaya, A.; Park, J.; Khatib, O. Probabilistic Estimation of Whole Body Contacts for Multi-Contact Robot Control. In Proceedings of the 2007 IEEE International Conference on Robotics and Automation, Rome, Italy, 10–14 April 2007; pp. 568–573.
  30. Dimeas, F.; Avendaño-Valencia, L.D.; Aspragathos, N. Human–Robot Collision Detection and Identification Based on Fuzzy and Time Series Modelling. Robotica 2015, 33, 1886–1898.
  31. Khalil, W. Dynamic Modeling of Robots Using Recursive Newton-Euler Techniques. In Proceedings of the 7th International Conference on Informatics in Control, Automation and Robotics (ICINCO 2010), Funchal, Madeira, Portugal, 15–18 June 2010.
  32. Bianco, C.G.L. Evaluation of Generalized Force Derivatives by Means of a Recursive Newton–Euler Approach. IEEE Trans. Robot. 2009, 25, 954–959.
  33. Featherstone, R.; Orin, D. Robot Dynamics: Equations and Algorithms. In Proceedings of the 2000 IEEE International Conference on Robotics and Automation (ICRA), San Francisco, CA, USA, 24–28 April 2000; Volume 1, pp. 826–834.
  34. Featherstone, R.; Orin, D.E. Dynamics. In Engineering Science, 6th ed.; Routledge: Berlin/Heidelberg, Germany, 2015; Volume 4, pp. 63–70.
Figure 1. The model of the Hexa robot: (a) in 3D view; (b) in top view.
Figure 2. A kinematics chain of the Hexa robot.
Figure 3. The procedure of neural network training [3].
Figure 4. The flowchart of the collision isolation procedure: (a) for the first camera; (b) for the second camera.
Figure 5. The flowchart of the mathematics processing: (a) for the first camera; (b) for the second camera.
Figure 6. The block diagram of the Hexa robot system.
Figure 7. The block diagram of the collision problem solving process.
Figure 8. The Hexa robot system.
Figure 9. The pose of the end-effector in Case 1: (a) the position of the end-effector; (b) the orientation of the active plate.
Figure 10. Current signals measured from the current sensors in Case 1: (a) the 1st, 2nd, and 3rd sensors; (b) the 4th, 5th, and 6th sensors.
Figure 11. Comparison of the residuals with the force sensor in Case 1.
Figure 12. The collision isolation procedure in Case 1: (a) the bounding box around the active plate; (b) straight lines around the rods; (c) the recognized rods; (d) objects in the workspace; (e) the external object; (f) intersections of the external object and the bounding box; (g) the contact point.
Figure 13. Residuals, the forces computed by RNEA, and the force sensor in Case 1.
Figure 14. The pose of the end-effector in Case 2: (a) the position of the end-effector; (b) the orientation of the active plate.
Figure 15. Current signals measured from the current sensors in Case 2: (a) the 1st, 2nd, and 3rd sensors; (b) the 4th, 5th, and 6th sensors.
Figure 16. Comparison of the residuals with the force sensor in Case 2.
Figure 17. The collision isolation procedure in Case 2: (a) the captured image; (b) the limited workspace; (c) the blue region; (d) straight lines along the arms; (e) the direction of the arm; (f) the intersections of the blue region and the non-blue region in the arm direction; (g) the contact point.
Figure 18. Residuals, the forces computed by RNEA, and the force sensor in Case 2.
Figure 19. The pose of the end-effector in Case 3: (a) the position of the end-effector; (b) the orientation of the active plate.
Figure 20. Current signals measured from the current sensors in Case 3: (a) the 1st, 2nd, and 3rd sensors; (b) the 4th, 5th, and 6th sensors.
Figure 21. Comparison of the residuals with the force sensor in Case 3.
Figure 22. The collision isolation procedure in Case 3: (a) the bounding box around the active plate; (b) straight lines around the rods; (c) the recognized rods; (d) objects in the workspace; (e) the external object; (f) potential contact points; (g) the contact point.
Figure 23. Residuals, the forces computed by RNEA, and the force sensors in Case 3.