1. Introduction
Mechanical robots have become crucial in modern welding because manual welding yields low production rates and cannot sustain high-volume production profitably [1]. Robotic welding offers several advantages: it improves efficiency, weld quality, flexibility, and workspace utilization, and it reduces labor costs as well as unit cost [2].
However, most welding robots still operate in the “teach and playback” mode, and their adaptability is insufficient when the welding object or other conditions change [3]. Welding is an empirical process influenced by numerous factors, such as pre-machining errors, the fitting of work pieces, and in-process defects, all of which can cause variation in the welding seam. Welding robots in teach and playback mode cannot compensate for such variation and typically produce weldments with many defects and poor penetration [1].
There are generally three stages in robotic welding: (i) preparation—calibration, robot programming, weld parameter and work-piece setting; (ii) welding—seam tracking and real-time adjustment of weld parameters; (iii) analysis—weld quality inspection [4]. The seam tracking operation is essential for extracting weld seam characteristics, which are fed into the controller of the welding robot to guide its motion along the welding seam path. Seam tracking based on laser vision sensing offers non-contact operation, fast speed, and high precision, which are key to realizing welding automation and intelligence [5,6].
To fulfill the required welding accuracy in robotic welding, a seam tracking algorithm that enables the robot to plan its path along the actual welding line is necessary. Therefore, many studies have been conducted on automatic seam tracking using sensors such as tactile, touch, probe, and vision sensors [7,8], laser sensors [9,10], arc sensors [11,12], electromagnetic sensors [13,14], and ultrasonic sensors [15,16]. Sensors play a very important role in robotic seam tracking; their chief tasks are detecting the weld start and end points, detecting the weld edges, and measuring the joint width.
A basic laser sensor consists of three parts: a laser diode, a CCD camera, and a filter. The laser diode projects a stripe or dot onto the work piece, and the CCD camera, fixed at an angle to the laser, properly captures the projection of the laser on the work piece [17]. A welding seam tracking system based on laser vision combines laser measurement with computer vision technology. It acquires rich information, makes the welding seam characteristics conspicuous, and has strong anti-interference ability [18,19], which makes it suitable for real-time tracking systems. A mathematical model that transforms the pixel coordinates of the laser feature points into the three-dimensional coordinates of the welding feature points, based on the mechanical structure of the sensor, was proposed in [20].
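The underlying triangulation geometry can be sketched as follows. The function name, the single-point model, and all numerical parameters (baseline, focal length, pixel pitch, laser angle) are illustrative assumptions, not the calibration model of [20]:

```python
import math

def pixel_to_sensor_frame(u, v, *, baseline=50.0, focal=8.0,
                          pixel_pitch=0.005, laser_angle_deg=0.0):
    """Map a pixel (u, v) on the imaged laser stripe to a 3D point
    (X, Y, Z), in mm, in the camera/sensor frame.

    Geometry (hypothetical): the laser emitter sits at offset
    `baseline` along X from the camera center and projects at
    `laser_angle_deg` from the optical axis, so a point on the laser
    ray satisfies X = baseline - Z * tan(angle); the pinhole camera
    gives x_img = focal * X / Z. Solving the two equations yields Z.
    """
    x_img = u * pixel_pitch          # image coordinates in mm
    y_img = v * pixel_pitch
    tan_a = math.tan(math.radians(laser_angle_deg))
    Z = focal * baseline / (x_img + focal * tan_a)
    X = x_img * Z / focal
    Y = y_img * Z / focal
    return X, Y, Z
```

With the default values, a stripe pixel at u = 200 (x_img = 1 mm) maps to Z = 400 mm with X equal to the 50 mm baseline, i.e., the recovered point lies on the laser ray, as expected.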
Chen et al. [21] proposed a feature point positioning method that needs only two profile scans and can effectively calculate the initial position of the weld. Chang et al. [22] filtered, differentiated, and convolved the weld profile data, and located the feature points by finding the local maxima. Wang et al. [23] established welding seam profile detection and feature point extraction algorithms based on a NURBS-snake and a visual attention model, and verified their effectiveness. Matsui et al. [24] introduced an adaptive welding robot system controlled by a laser sensor for single-pass welding of thin plates with gap variation.
For flexible welding processes, Ciszak et al. [25] developed a low-cost shape identification system for programming industrial robots for two-dimensional welding. The system detected geometric shapes drawn by humans and approximated them, so that the robot could weld the same profiles on a two-dimensional plane. Such approaches remain time-consuming, since many welding robot applications are programmed by teach and playback and must be reprogrammed for each new task. Hairol et al. [26] suggested an alternative approach that can automatically recognize and locate the butt-welding position at the starting, middle, auxiliary, and end points under three joint conditions, namely (i) straight, (ii) saw tooth, and (iii) curved, without any prior knowledge of the shapes involved. Since an automatic welding process may experience various disturbances, Li et al. [27] proposed a robust seam identification method based on cross-modal perception to precisely identify and automatically track the welding seam.
Wojciechowski et al. [28] proposed a method for the automatic robotic assembly of two or more parts placed on a pallet without fixing instrumentation or positioning, supporting the assembly process with data from optical 3D scanners. They developed the sequence of operations from scanning to the placement of the parts in the installation position by an industrial robot. Suszynski et al. [29] presented the concept of using an industrial robot equipped with a triangulation scanner in the assembly process, in order to minimize the number of clamps holding the units in a particular position in space, based on a proposed multistep processing algorithm.
These efforts have brought about many improvements in locating the feature points of the target weldment. However, positioning accuracy is still limited by factors such as changes in the welding type (especially for complex welding seams) or surface defects of the weldment.
Under these circumstances, we introduce a novel four-step seam tracking technique. First, a laser sensor scans the groove of the weldment to collect profile data. Second, the data are processed by a filtering algorithm to smooth the noise. Third, a second derivative algorithm initially locates the feature points, and linear fitting then locates them accurately. Finally, according to the results of the sensor pose calibration, the three-dimensional coordinates in the base coordinate system of the welding robot are calculated from the two-dimensional coordinates of the image feature points, and path planning is completed, targeting both the straight and curved Y-shaped grooves. The proposed seam tracking technique is tested and verified through experimental investigation.
Our proposed four-step seam tracking technique utilizes edge detection and curvature recognition based on laser scan data. The offset of the welding robot’s motion with respect to the welding seam is measured by a laser sensor, and a differential point searching method finds the feature points of the welding seam cross-section. Compared with other seam tracking algorithms, we demonstrate improved welding accuracy for complex welding seams through theoretical proof, simulation, and experiments.
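The filtering and feature-point steps can be sketched for a single scanned profile as follows. The moving-average filter, the window sizes, and the corner-refinement details are illustrative assumptions, not the exact algorithms of this work:

```python
import numpy as np

def smooth(z, window=5):
    """Step 2: moving-average filter to suppress sensor noise in a
    scanned profile z (1D array of heights)."""
    kernel = np.ones(window) / window
    return np.convolve(z, kernel, mode="same")

def initial_feature_points(x, z, k=2):
    """Step 3a: coarse corner location via the second derivative of
    the profile -- groove corners appear as extrema of |z''|."""
    d2 = np.gradient(np.gradient(z, x), x)
    idx = np.argsort(np.abs(d2))[::-1]       # largest |z''| first
    return sorted(int(i) for i in idx[:k])

def refine_feature_point(x, z, i, half=8):
    """Step 3b: precise corner location by fitting straight lines to
    the segments on either side of the coarse index i and
    intersecting them."""
    a1, b1 = np.polyfit(x[i - half:i], z[i - half:i], 1)
    a2, b2 = np.polyfit(x[i + 1:i + 1 + half], z[i + 1:i + 1 + half], 1)
    xi = (b2 - b1) / (a1 - a2)               # line intersection
    return xi, a1 * xi + b1
```

On a synthetic V-shaped profile, the second derivative singles out the sample nearest the groove corner, and the line intersection then recovers the corner position with sub-sample accuracy.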
This paper is organized as follows:
Section 2 presents the seam tracking system composition;
Section 3 introduces the seam tracking methodology with four steps;
Section 4 shows the results of the experimental investigation based on the proposed seam tracking technique;
Section 5 gives the conclusion and perspective.
4. Experimental Procedures
An experimental demonstration of the proposed four-step seam tracking method was carried out to guide the movement of the welding torch under actual testing conditions.
Figure 9 shows the prototype of the whole experimental system, which mainly includes an ABB IRB 1410 welding robot, an IRC5 controller, an LS-100CN laser sensor, an Ehave CM350 welding power supply, an RS-485 communication module, and an industrial computer.
In this paper, two typical weldments made of A304 stainless steel are selected as the welding objects. The physical prototypes of the two typical welding grooves are illustrated in Figure 10, and the groove parameters of the straight and curved weldments are listed in Table 5.
When scanning the welding groove, the laser sensor is set to trigger mode, and the welding robot moves continuously to capture the overall shape characteristics of the welding seam. The process of scanning the two typical welding grooves with the laser sensor is shown in Figure 11.
Before the experiment, we mark the starting and ending points of the welding path on the weldment, and a section of the motion trajectory is then taught for the straight and curved grooves, respectively, in “teach” mode, as shown in Figure 10. The red points are the teaching points, i.e., the positions of the end point of the robotic welding torch. Multiple teaching points are connected to form a welding trajectory, and the pose data of the teaching trajectory in the welding torch coordinate system are recorded simultaneously and used as a reference for calculating the experimental deviation.
During the experiment, taking the straight groove as an example, the end-effector of the robot, i.e., the welding torch, is first moved along the teaching trajectory. When it reaches reference point L1, as shown in Figure 10a, the laser sensor is turned on to scan the welding groove and collect data. At the same time, the current tool coordinate system of the welding robot is switched to the end coordinate system; the position and posture data of the end coordinate system are obtained in real time through the API interface of the welding robot, with a sampling period consistent with that of the laser sensor.
The welding robot continues to move. When the end of the welding torch reaches reference point L2, as shown in Figure 10a, the laser sensor is turned off, the data transmission of the API interface is stopped, and the data collection is complete. The center point of the welding torch is calculated from the feature points of the groove, and the trajectory reference point is calculated from the position and posture data of the end coordinate system obtained through the API interface. Through the calibration matrix of the laser sensor (Formula (7)), the position data of the welding torch center point are transformed into the welding robot end coordinate system, and then, through the calibration matrix of the welding torch, into the welding torch coordinate system.
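This chain of frame transformations can be sketched with homogeneous matrices. The function names and the numerical translations in the example are placeholders, not the actual calibration results of Formula (7):

```python
import numpy as np

def to_homogeneous(R, t):
    """Assemble a 4x4 homogeneous transform from a 3x3 rotation R
    and a 3-vector translation t."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

def sensor_point_to_torch_frame(p_sensor, T_end_sensor, T_torch_end):
    """Map a 3D point from the laser-sensor frame into the welding
    torch frame: first sensor -> robot end frame (laser sensor
    calibration matrix), then end -> torch frame (welding torch
    calibration matrix)."""
    p = np.append(np.asarray(p_sensor, dtype=float), 1.0)
    return (T_torch_end @ T_end_sensor @ p)[:3]
```

For example, with identity rotations, a hypothetical sensor offset of (10, 0, 5) mm from the robot end, and a torch offset of (0, 0, −30) mm, the sensor point (1, 2, 3) maps to (11, 2, −22) in the torch frame.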
After the above process, the groove data collected by the laser sensor are transformed into the center point data of the robotic welding torch, and the end coordinate system data collected through the API interface are transformed into the trajectory reference point data. The experimental results for the two different welding grooves, straight and curved, with both initial positioning and precise positioning using the proposed seam tracking method, are compared in Figure 12.
The accuracy of the feature point positioning method is evaluated by comparing the deviation between the calculated welding center point and the actual welding torch end point. The average deviation d (mm) represents the average difference between each welding center point and the end point of the welding torch, and the deviation degree p (%) indicates the size of that deviation relative to the entire groove in the corresponding direction. They can be written as:

dx = (1/n) Σ_{i=1}^{n} |xtcp(i) − xt(i)|,  dz = (1/n) Σ_{i=1}^{n} |ztcp(i) − zt(i)|

where dx and dz are the average deviations in the X and Z directions, respectively; xtcp(i) and ztcp(i) are the coordinates of the welding center point; xt(i) and zt(i) are the coordinates of the trajectory reference point; and n is the number of points.

px = (dx / l) × 100%,  pz = (dz / h) × 100%

where px and pz are the deviation degrees in the X and Z directions, respectively; l is the total length of the groove, and h is the depth of the groove.
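Under the assumption that the per-point deviations are averaged as absolute differences, the two metrics can be computed as follows (array shapes and the example numbers are illustrative):

```python
import numpy as np

def average_deviation(tcp, ref):
    """dx, dz (mm): mean absolute difference between the calculated
    welding center points `tcp` and the trajectory reference points
    `ref`, both (n, 2) arrays of (x, z) coordinates."""
    d = np.abs(np.asarray(tcp, dtype=float)
               - np.asarray(ref, dtype=float)).mean(axis=0)
    return float(d[0]), float(d[1])

def deviation_degree(dx, dz, l, h):
    """px, pz (%): deviation relative to the groove length l
    (X direction) and the groove depth h (Z direction)."""
    return 100.0 * dx / l, 100.0 * dz / h
```

For two sample points deviating by 0.2 mm in X and 0.1 mm in Z on a 100 mm long, 10 mm deep groove, this gives dx = 0.2 mm, dz = 0.1 mm, px = 0.2%, and pz = 1.0%.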
The comparative results of the different feature point positioning methods are listed in Table 6. As can be seen from the figures and the table, the average deviations dx (mm) of the two welding seams, straight and curved, in the X direction are relatively large when only initial positioning is carried out. After precise positioning, the average deviations are reduced to 0.387 mm and 0.429 mm, respectively, corresponding to significant decreases of 38.38% and 41.71%.
It is worth noting that the average deviations in both the X and Z directions of the two welding seams, straight and curved, after precise positioning are no more than 0.5 mm; this threshold, defined by Kovacevic et al. [42], fulfills the minimum accuracy requirements of robotic welding. Therefore, the proposed four-step seam tracking method is feasible and effective, and provides a reference for future seam tracking research.