Open Access
Symmetry 2019, 11(11), 1385; https://doi.org/10.3390/sym11111385
Article
A Vision-Based Robotic Laser Welding System for Insulated Mugs with Fuzzy Seam Tracking Control
^{1} Zhejiang Key Lab of Robotics and Intelligent Manufacturing Equipment Technology, Ningbo Institute of Material Technology and Engineering, Chinese Academy of Sciences (CAS), Ningbo 315201, China
^{2} Zhejiang Haers Vacuum Containers Co., Ltd., Yongkang 321300, China
^{*} Author to whom correspondence should be addressed.
Received: 23 October 2019 / Accepted: 5 November 2019 / Published: 8 November 2019
Abstract
The symmetrical insulated mug is composed of two layers. The two ends of the two layers form the mouth and bottom seams of the insulated mug. The weld quality of the two seams is very important for maintaining the vacuum between the two layers, which is vital for the heat-insulating property of the mug. Due to the narrow seams, laser welding is used. Since laser welding places strict demands on the relative position of the seam and the laser torch, a vision-based seam tracking system is designed. Before welding is started, the vision sensor scans the seam and feature sample points are collected. A reconstruction algorithm is proposed to form the image containing the seam. Then, a least square fitting (LSF) method combined with the random sample consensus (RANSAC) method is proposed to detect the smooth seam from the sample points. In the welding process, a seam tracking system with a fuzzy logic control method is presented to keep the torch precisely on the seam. Finally, full experiments are conducted in the welding factory of the insulated mugs to verify the effectiveness of the proposed system and method.
Keywords:
welding robot; laser welding; vision sensor; insulated mug; RANSAC; seam tracking; fuzzy control

1. Introduction
Stainless steel insulated mugs are currently popular due to their heat-insulating properties. Such mugs are usually composed of two layers, i.e., an inner layer and an outer layer, between which a vacuum is kept. One end joint of the two layers forms the mouth seam of the insulated mug. The other end joint of the two layers forms the bottom seam. The two seams need to be welded with high quality to keep the insulation performance. At present, the mouth and bottom seams of insulated mugs are almost all welded manually, which is increasingly unable to meet the market demand for high-quality insulated mugs.
To improve the weld efficiency and quality, new welding technology suited to the insulated mug needs to be adopted. Recently, laser welding has been increasingly used for welding thin plates. It is an efficient and precise welding method that uses a high-energy-density laser beam as the heat source, and it has many merits such as a small heat-affected area, small thermal deformation, deep penetration, no need for welding fillers, high welding speed, and so on [1,2,3]. It is very suitable for narrow seams in thin plates, such as the mouth and bottom seams of the insulated mug. Since the speed of laser welding is very high, the torch cannot be moved manually; it should be installed on a robotic system. Meanwhile, to guarantee the weld quality, laser welding demands that the position of the torch relative to the seam be sufficiently precise. Due to the shape and position differences of the insulated mugs, a seam tracking system needs to be designed to keep the torch precisely on the seam during the welding process.
Welding robots have been widely adopted in industry and have greatly increased weld efficiency and quality consistency [4,5,6]. To make welding robots adapt to the position differences of the seams from the robot teaching stage, seam tracking systems have been added to the robots. Several kinds of sensors have been used in robotic seam tracking systems, such as through-the-arc sensors [7], inductive sensors [8], ultrasonic sensors [9] and vision sensors [10,11,12,13,14,15]. Among these sensors, vision sensors have recently attracted the attention of researchers in the robotics field due to their non-contact measurement, rich information and high precision. Some researchers use only a monocular camera as the vision sensor to detect the position of the seam. In [12,13], the initial position of the weld seam was located with a template matching algorithm using a monocular camera. Chen et al. [14] used a monocular camera to detect the narrow seam in container plates and designed a robust visual servo control method for seam tracking. Other researchers combine a monocular camera with structured laser light as the vision sensor to detect the weld seam. Unlike the former kind of vision sensor, this kind can obtain three-dimensional information of the seam based on the triangulation principle. Fang et al. [15,16] used a vision sensor with structured laser light to detect fillet seams and designed a self-tuning fuzzy logic seam tracking control system. Xu et al. [17] proposed a seam tracking system based on a vision sensor with structured laser light to improve the welding quality of robotic gas metal arc welding. In [18], a seam tracking system based on a vision sensor was designed for welding a thin plate, in which a decoupled visual measurement method was presented for the horizontal and vertical directions.
In this paper, a robotic welding system is designed for welding the mouth and bottom seams of the stainless steel insulated mug. To further increase the weld efficiency and quality, laser welding is adopted. Before welding is started, a vision sensor with structured laser light scans the seam and feature sample points are obtained. To obtain a smooth seam trajectory, the random sample consensus (RANSAC) algorithm is used to eliminate the outlier points and the LSF method is applied to the remaining inner points. When welding is started, a seam tracking control system based on fuzzy logic is proposed to keep the torch precisely on the seam. Experiments on the insulated mugs verify the effectiveness of the proposed method.
The main contributions of this paper are as follows. Firstly, a vision sensor based on laser structured light is adopted for the robotic welding of insulated mugs to greatly improve the welding quality. Secondly, a novel image reconstruction algorithm is designed to form the seam image from the sampled points. Thirdly, RANSAC and LSF algorithms are combined to extract the smooth seam from the image.
The rest of this paper is organized as follows. Section 2 describes the configuration of the robotic welding system for the insulated mugs. In Section 3, the visual seam extraction based on LSF and RANSAC is given. The detailed presentation of the proposed fuzzy seam tracking control system is given in Section 4. Section 5 illustrates the experimental results and Section 6 concludes this paper.
2. The Configuration of the Robotic Laser Welding System
The configuration of the robotic welding system for the insulated mugs is shown in Figure 1. It is mainly composed of the following seven parts: a welding robot, a vision sensor, a cross slider, fixtures, a conveyor, a laser welding power source and a control system. The welding robot has six degrees of freedom. In the welding process, the robot holds the welding torch and tracks the seams of the insulated mugs. To make the seam tracking sufficiently accurate, a vision sensor with laser structured light is used. It scans the mouth and bottom seams of the insulated mugs and sends the seam information to the seam tracking controller. The vision sensor is moved by a cross slider. In the welding process, the insulated mugs are held by the fixtures to keep their positions stable. The fixtures are placed on the conveyor. Since the mouth and bottom seams of the insulated mugs are very narrow, a laser welding power source is used to increase the welding efficiency and quality. The control system realizes the logical and process control of the above-mentioned parts of the robotic welding system.
3. Weld Seam Detection Based on Vision Sensor
The mouth and bottom seams of the insulated mugs are welded to maintain the vacuum between the inner and outer layers. Since the seams are very narrow and the demand for weld quality is high, laser welding is adopted. This welding technology requires the position of the torch to be precisely controlled. Due to the high speed of laser welding, the position of the torch cannot be controlled manually. In this paper, an automatic robotic welding system with vision-based seam tracking is proposed. Before welding is started, a cross slider moves the vision sensor with laser structured light across the seams of the mugs. Then, welding is started and the extracted seam position points are used to guide the torch precisely along the seams during the welding process. In the following sections, the principle of the vision measurement for the seams of the insulated mugs and the main procedures of the feature extraction are described.
3.1. The Reconstruction of the Image from the Sampled Points
As shown in Figure 1, the robotic welding for each insulated mug consists of two stages. Firstly, the vision measurement of the weld seam is completed. Then, the robot guides the laser torch along the seam to finish the welding. At the first stage, the structured-light vision sensor, driven by the cross slider, scans the seam from right to left. The detailed scanning process is shown in Figure 2. It can be seen that at each sampling instant, only two points can be obtained from the intersection of the laser plane and the seam. For designing the seam tracking controller, an image containing the whole seam needs to be reconstructed from the images scanned at each instant. The reconstructed image can be described as
$$\begin{array}{c}\hfill {I}_{r}={I}_{1}+{I}_{2}+\cdots +{I}_{n}\end{array}$$
where ${I}_{r}$ is the reconstructed image and ${I}_{i},(i=1,2,3,\dots ,n)$ is the image scanned at time ${t}_{i}$.
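The summation above can be read as a pixel-wise superposition of the per-scan feature images. The following is a minimal illustrative sketch, assuming each scanned frame is a binary feature image of the same size (the frame contents here are synthetic, not real scans):

```python
import numpy as np

def reconstruct_seam_image(frames):
    """Combine the per-scan feature images I_1..I_n into one image I_r.

    Each frame is a binary (0/1) array holding the seam points detected
    at one scan instant; the sum superimposes all of them.
    """
    stacked = np.stack(frames, axis=0)
    # Clip so that overlapping detections stay binary.
    return np.clip(stacked.sum(axis=0), 0, 1)

# Toy example: three 4x4 frames, each with one feature pixel set.
frames = [np.zeros((4, 4), dtype=np.uint8) for _ in range(3)]
frames[0][1, 0] = 1
frames[1][2, 1] = 1
frames[2][2, 2] = 1
I_r = reconstruct_seam_image(frames)
```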
3.2. The Computation of the Coordinates of the Seam Points Based on the Vision Sensor
Since the depth information of a feature point cannot be obtained with a single camera, a vision sensor with a camera and a laser-based light source is used in this paper, as shown in Figure 3. The computation of the three-dimensional coordinates of the seam points is based on the principle of triangulation.
For the convenience of analysis, two coordinate frames are established. The camera frame C is established at the optical center of the camera. Its x-axis and y-axis are the same as those of the image plane, and its z-axis coincides with the optical axis of the camera. The robot frame R is established at the base center of the robot according to the definition of the robot manufacturer.
Suppose the intersection point of the laser plane emitted by the laser emitter and the seam of the insulated mug is ${P}_{i}$. From the pinhole model of the camera, the following equation can be given
$$\begin{array}{c}\hfill {z}_{ci}\left[\begin{array}{c}{u}_{i}\\ {v}_{i}\\ 1\end{array}\right]=M\left[\begin{array}{c}{x}_{ci}\\ {y}_{ci}\\ {z}_{ci}\end{array}\right]=\left[\begin{array}{ccc}{k}_{x}& 0& {u}_{0}\\ 0& {k}_{y}& {v}_{0}\\ 0& 0& 1\end{array}\right]\left[\begin{array}{c}{x}_{ci}\\ {y}_{ci}\\ {z}_{ci}\end{array}\right]\end{array}$$
where $({x}_{ci},{y}_{ci},{z}_{ci})$ are the coordinates of the point ${P}_{i}$ in the camera frame, $({u}_{i},{v}_{i})$ are the corresponding coordinates of the point in the image, M is the intrinsic matrix of the camera, ${k}_{x}$ and ${k}_{y}$ are the magnification coefficients from the image plane to the image coordinates along the x-axis and y-axis, respectively, and $({u}_{0},{v}_{0})$ are the coordinates of the principal point of the camera. These intrinsic parameters can be obtained using the camera calibration method [19].
The seam point ${P}_{i}$ is on the laser plane emitted by the laser emitter, as shown in Figure 3. Therefore, the following equation can be given
$$\begin{array}{c}\hfill a{x}_{ci}+b{y}_{ci}+c{z}_{ci}+1=0\end{array}$$
where a, b and c are the parameters of the laser plane, which can be obtained using the laser plane calibration method [20,21].
Given the coordinates of the point ${P}_{i}$ in the image, its threedimensional coordinates in the camera frame can be computed from (2) and (3) as follows.
$$\begin{array}{c}\hfill \left\{\begin{array}{l}{x}_{ci}={\displaystyle \frac{{z}_{ci}({u}_{i}-{u}_{0})}{{k}_{x}}}\hfill \\ {y}_{ci}={\displaystyle \frac{{z}_{ci}({v}_{i}-{v}_{0})}{{k}_{y}}}\hfill \\ {z}_{ci}={\displaystyle \frac{-{k}_{x}{k}_{y}}{a{k}_{y}({u}_{i}-{u}_{0})+b{k}_{x}({v}_{i}-{v}_{0})+c{k}_{x}{k}_{y}}}\hfill \end{array}\right.\end{array}$$
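As a sanity check of the triangulation step (intersecting the back-projected camera ray with the laser plane), it can be sketched as follows. The intrinsic and laser-plane parameter values here are made-up placeholders for illustration, not the calibrated values used in the paper:

```python
import numpy as np

# Hypothetical intrinsics and laser-plane parameters (placeholders).
k_x, k_y = 800.0, 800.0      # magnification coefficients (pixels)
u0, v0 = 320.0, 240.0        # principal point
a, b, c = 0.1, -0.05, -0.01  # laser plane: a*x + b*y + c*z + 1 = 0

def pixel_to_camera(u, v):
    """Recover (x_c, y_c, z_c) of a seam point from its pixel (u, v)
    by intersecting the camera ray with the laser plane."""
    z = -k_x * k_y / (a * k_y * (u - u0) + b * k_x * (v - v0) + c * k_x * k_y)
    x = z * (u - u0) / k_x
    y = z * (v - v0) / k_y
    return np.array([x, y, z])

# Pixel (240, 240) is the projection of the plane point (-5, 0, 50)
# under the placeholder parameters above.
p = pixel_to_camera(240.0, 240.0)
```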
Since the seam tracking control is realized in the robot frame, the seam points in the camera frame as given by (4) need to be transformed to the robot frame as follows.
$$\begin{array}{c}\hfill \left[\begin{array}{c}{x}_{ri}\\ {y}_{ri}\\ {z}_{ri}\end{array}\right]={T}_{cr}\left[\begin{array}{c}{x}_{ci}\\ {y}_{ci}\\ {z}_{ci}\end{array}\right]\end{array}$$
where $({x}_{ri},{y}_{ri},{z}_{ri})$ are the coordinates of the seam point ${P}_{i}$ in the robot frame, and ${T}_{cr}$ is the transformation matrix from the camera frame to the robot frame, which can be calibrated using the ICP method [22].
3.3. Circle Seam Detection Based on LSF and Visual Points
Due to the light reflection of the stainless steel of the mugs and the motion disturbance of the cross slider, the captured seam position points may contain a lot of noise. If the seam data were fed directly to the seam tracking system, the torch might tremble abruptly and the welding quality could not be guaranteed. Since the seams to be welded are the mouth and bottom of the mug, the seam shape is a circle in 3D space. To detect the smooth circle seam in 3D space, this paper first extracts the circle seam with the LSF method using the seam position data from the vision sensor. Then, the RANSAC algorithm is used to eliminate the outlier points, and the precise circle seam is detected from the remaining inner points using LSF again.
After the vision sensor with laser structured light scans the mouth and bottom seams of the insulated mug, discrete visual points are obtained. To detect the circle seam in 3D space from the visual points, the LSF method is applied. A circle in 3D space can be considered as the intersection of a plane and a sphere. As shown in Figure 4, ${p}_{l}$ is the plane fitted from the visual seam points, ${s}_{h}$ is the fitted sphere, and ${s}_{m}$ is the extracted mouth or bottom seam. Therefore, to detect the mouth and bottom seams, the plane and the sphere should first be fitted from the visual seam points. The plane in 3D space can be described by
$$\begin{array}{c}\hfill ax+by+cz=1\end{array}$$
where a, b and c are the parameters of the plane.
We can assume that the visual points lie on the same plane. If there are n points extracted from the vision sensor, the following equation can be given
$$\begin{array}{c}\hfill A{X}_{p}=\left[\begin{array}{ccc}{x}_{1}& {y}_{1}& {z}_{1}\\ {x}_{2}& {y}_{2}& {z}_{2}\\ \vdots & \vdots & \vdots \\ {x}_{n}& {y}_{n}& {z}_{n}\end{array}\right]{X}_{p}=B=\left[\begin{array}{c}1\\ 1\\ \vdots \\ 1\end{array}\right]\end{array}$$
where $({x}_{i},{y}_{i},{z}_{i})(i=1,2,3,\cdots ,n)$ are the visual points and ${X}_{p}={[a,b,c]}^{T}$ are the parameters of the plane. According to the LSF method, ${X}_{p}$ can be computed by
$$\begin{array}{c}\hfill {X}_{p}={\left({A}^{T}A\right)}^{-1}{A}^{T}B\end{array}$$
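The plane fit amounts to one linear least-squares solve. A minimal sketch, using synthetic points generated on an assumed plane rather than real sensor data:

```python
import numpy as np

def fit_plane(points):
    """Least-squares fit of the plane a*x + b*y + c*z = 1.

    points: (n, 3) array of visual seam points; returns [a, b, c].
    """
    A = np.asarray(points, dtype=float)
    B = np.ones(len(A))
    # X_p = (A^T A)^{-1} A^T B, computed stably via lstsq.
    X_p, *_ = np.linalg.lstsq(A, B, rcond=None)
    return X_p

# Synthetic check: points generated on 0.2x + 0.1y + 0.5z = 1.
gx, gy = np.meshgrid(np.arange(5.0), np.arange(5.0))
x, y = gx.ravel(), gy.ravel()
pts = np.column_stack([x, y, (1 - 0.2 * x - 0.1 * y) / 0.5])
a_fit, b_fit, c_fit = fit_plane(pts)
```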
At the same time, the visual points lie on the same sphere. The sphere in 3D space is described by
$$\begin{array}{c}\hfill {(x-{x}_{0})}^{2}+{(y-{y}_{0})}^{2}+{(z-{z}_{0})}^{2}={r}^{2}\end{array}$$
where $({x}_{0},{y}_{0},{z}_{0})$ is the center of the sphere and r is the radius of the sphere.
Based on the indirect adjustment model [23], the following model can be given
$$\begin{array}{c}\hfill V=H{X}_{s}-L\end{array}$$
where
$$\begin{array}{c}\hfill V=\left[\begin{array}{c}{v}_{1}\\ {v}_{2}\\ \vdots \\ {v}_{n}\end{array}\right],\phantom{\rule{1.em}{0ex}}H=\left[\begin{array}{cccc}2{x}_{1}& 2{y}_{1}& 2{z}_{1}& -1\\ 2{x}_{2}& 2{y}_{2}& 2{z}_{2}& -1\\ \vdots & \vdots & \vdots & \vdots \\ 2{x}_{n}& 2{y}_{n}& 2{z}_{n}& -1\end{array}\right]\end{array}$$
$$\begin{array}{c}\hfill {X}_{s}=\left[\begin{array}{c}{x}_{0}\\ {y}_{0}\\ {z}_{0}\\ {x}_{0}^{2}+{y}_{0}^{2}+{z}_{0}^{2}-{r}^{2}\end{array}\right],\phantom{\rule{1.em}{0ex}}L=\left[\begin{array}{c}{x}_{1}^{2}+{y}_{1}^{2}+{z}_{1}^{2}\\ {x}_{2}^{2}+{y}_{2}^{2}+{z}_{2}^{2}\\ \vdots \\ {x}_{n}^{2}+{y}_{n}^{2}+{z}_{n}^{2}\end{array}\right]\end{array}$$
From (10), we can get
$$\begin{array}{c}\hfill {X}_{s}={\left({H}^{T}PH\right)}^{-1}{H}^{T}PL\end{array}$$
where P is the identity matrix of order n. After ${X}_{s}$ is determined, the center $({x}_{0},{y}_{0},{z}_{0})$ and radius r of the sphere can be computed.
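Like the plane fit, the sphere fit is linear in the unknown vector. A sketch of the indirect adjustment model, taking P as the identity so the solve reduces to ordinary least squares, with synthetic points on an assumed sphere:

```python
import numpy as np

def fit_sphere(points):
    """Linear LSF of a sphere via the indirect adjustment model
    V = H*X_s - L.  Returns (center, radius)."""
    P = np.asarray(points, dtype=float)
    H = np.column_stack([2 * P, -np.ones(len(P))])  # rows [2x, 2y, 2z, -1]
    L = (P ** 2).sum(axis=1)                        # x^2 + y^2 + z^2
    X_s, *_ = np.linalg.lstsq(H, L, rcond=None)     # (H^T H)^{-1} H^T L
    center = X_s[:3]
    # Last entry of X_s is x0^2 + y0^2 + z0^2 - r^2.
    radius = np.sqrt(center @ center - X_s[3])
    return center, radius

# Synthetic check: 50 points on a sphere centred at (1, 2, 3), r = 5.
rng = np.random.default_rng(0)
d = rng.normal(size=(50, 3))
d /= np.linalg.norm(d, axis=1, keepdims=True)
center, radius = fit_sphere(np.array([1.0, 2.0, 3.0]) + 5.0 * d)
```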
3.4. Precision Improvement for the Detected Circle Seams
Due to the light reflection of the stainless steel of the mugs or the motion disturbance of the cross slider, the captured visual points do not all lie on the real circle seams. Some points may be far away from the circle seams; we name these outlier points. The other points are very close to the circle seams; we name these inner points. If all the captured visual points are used to detect the circle seams, the precision cannot be guaranteed. However, if the outlier points are first eliminated and the remaining inner points are used to detect the circle seams, the precision can be greatly improved. In this paper, RANSAC is used to eliminate the outlier points, and then the circle seams are detected from the remaining inner points. The flow of the circle seam detection algorithm based on LSF and RANSAC is shown in Algorithm 1.
Algorithm 1 Circle seam detection based on LSF and RANSAC 
Input: Sample points ${P}_{i}(i=1,2,3\dots n)$ captured from the vision sensor 
Output: Plane parameters $(a,b,c)$ and sphere parameters $({x}_{0},{y}_{0},{z}_{0},r)$ 

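The steps of Algorithm 1 are not reproduced above. A plausible sketch of the LSF-plus-RANSAC loop, shown here for the sphere fit only, with parameter names (n_s, i_tmax, d_t) borrowed from Section 5 but toy values, could be:

```python
import numpy as np

def fit_sphere_ls(P):
    # Linear sphere LSF: rows [2x, 2y, 2z, -1] @ X_s = x^2 + y^2 + z^2.
    H = np.column_stack([2 * P, -np.ones(len(P))])
    L = (P ** 2).sum(axis=1)
    X, *_ = np.linalg.lstsq(H, L, rcond=None)
    c = X[:3]
    return c, np.sqrt(max(c @ c - X[3], 0.0))

def ransac_sphere(points, n_s=8, i_tmax=30, d_t=0.5, seed=0):
    """RANSAC loop: fit on a random minimal subset, keep the largest
    consensus set, then refit on its inner points with LSF."""
    P = np.asarray(points, dtype=float)
    rng = np.random.default_rng(seed)
    best = np.zeros(len(P), dtype=bool)
    for _ in range(i_tmax):
        idx = rng.choice(len(P), n_s, replace=False)
        c, r = fit_sphere_ls(P[idx])
        # Inner points lie within d_t of the candidate sphere surface.
        inliers = np.abs(np.linalg.norm(P - c, axis=1) - r) < d_t
        if inliers.sum() > best.sum():
            best = inliers
    return fit_sphere_ls(P[best]), best

# Toy data: 60 exact points on a sphere of radius 10, plus 3 outliers.
data_rng = np.random.default_rng(1)
d = data_rng.normal(size=(60, 3))
d /= np.linalg.norm(d, axis=1, keepdims=True)
pts = np.vstack([10.0 * d,
                 [[40.0, 40.0, 40.0], [-35.0, 20.0, 10.0], [5.0, 50.0, -30.0]]])
(c_fit, r_fit), inlier_mask = ransac_sphere(pts)
```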
The results of the circle seam detection are shown in Figure 5. Figure 5a shows that the reconstructed image contains several outlier points, such as ${p}_{1}$ to ${p}_{9}$. After the processing of the RANSAC algorithm, the outlier points are removed from the image, as shown in Figure 5b. The red circle line in Figure 5b is the final detected seam of the mug. It can be seen that the detected seam is smooth and accurate.
4. Fuzzy Seam Tracking Control
Laser welding needs the position of the torch to be precisely controlled during the welding process. Since the traditional PID controller has disadvantages in control accuracy and dynamic response speed, it cannot adapt to seam tracking in high-speed laser welding. Some researchers have designed intelligent controllers for aerodynamic or urban traffic systems [24,25]. In this paper, a fuzzy logic control method is adopted for seam tracking, since it can absorb the expertise of the welding workers in adjusting the torch position through its rule base. The proposed fuzzy seam tracking control diagram of the laser welding for the insulated mugs is shown in Figure 6. It mainly consists of four parts, i.e., the reference seam position, the torch position feedback, the fuzzy controller and the robot joint controller. The reference seam position is computed based on the vision sensor and is used as the reference signal of the closed loop control. The torch position is used as the feedback signal of the closed loop control; it is computed from the robot joint encoders and the kinematics. The fuzzy controller computes the adjustment values of the robot end effector based on the errors and error changes between the reference and feedback signals. The robot joint controller controls the six joints based on the position of the end effector and the robot inverse kinematics. In the following parts, the main components of the fuzzy seam tracking control are described.
4.1. The Reference Seam Position
The robotic laser welding for insulated mugs is composed of two stages, i.e., the stage of computing the reference seam position and the welding stage. At the first stage, the seam is scanned by the laser-based vision sensor and the image features of the seam are extracted. The processed seam position from the vision sensor is used as the reference signal of the closed loop control. At the second stage, the robot guides the torch to the mug seam and welding is started. During the welding process, the torch is kept precisely on the seam by means of seam tracking control.
Since the laser welding speed is high and the seam is short, the visual detection of the seam and the laser welding cannot be executed simultaneously. Therefore, a switch, denoted as ${s}_{w}$ in Figure 6, is used to separate the two stages. At the first stage, the switch is closed and the reference seam position is computed. Then, at the second stage, the switch is opened and the visual detection is stopped. The closed loop seam tracking control is kept on during the second stage.
4.2. Membership Functions
The inputs of the fuzzy controller are the error and the error change between the reference seam position and the feedback position of the robot end effector as follows.
$$\begin{array}{cc}\hfill e\left(t\right)& ={p}_{r}-{p}_{f}\hfill \end{array}$$
$$\begin{array}{cc}\hfill de\left(t\right)& =e\left(t\right)-e(t-1)\hfill \end{array}$$
where $e\left(t\right)$ and $de\left(t\right)$ are the error and the error change at the sample time t, respectively, and ${p}_{r}$ and ${p}_{f}$ are the reference seam position and the feedback position of the robot end effector, respectively.
Figure 7 gives the membership functions for the inputs and the output of the fuzzy controller. Seven fuzzy sets denoted as NB, NM, NS, ZE, PS, PM, PB are defined for the input e and the output u. The meanings of the seven fuzzy sets are as follows: NB is negative big, NM is negative middle, NS is negative small, ZE is zero, PS is positive small, PM is positive middle, PB is positive big. Five fuzzy sets denoted as NB, NS, ZE, PS, PB are defined for the input $de$.
Input and output scaling factors are used to make the universes of discourse for the inputs and output lie in the range [−1,1].
4.3. Rule Base
Compared to a traditional controller such as the PID controller, the fuzzy controller can reflect the intelligence of the welding workers to a great extent. Through the rule base of the fuzzy controller, the welding experience of the workers can be recorded as control rules. In this paper, the rule base is established from extensive discussions with the welding workers, the authors' understanding of the robotic laser welding system, and trial and error. The rule base of the fuzzy controller is shown in Table 1.
Each rule in the rule base has the IF-THEN form. Thus, the kth rule can be formulated as
$$\begin{array}{c}\hfill \mathrm{Rule}\phantom{\rule{4.pt}{0ex}}k:\phantom{\rule{4.pt}{0ex}}\mathrm{if}\phantom{\rule{4.pt}{0ex}}e\phantom{\rule{4.pt}{0ex}}\mathrm{is}\phantom{\rule{4.pt}{0ex}}{E}_{F}^{k}\phantom{\rule{4.pt}{0ex}}\mathrm{and}\phantom{\rule{4.pt}{0ex}}de\phantom{\rule{4.pt}{0ex}}\mathrm{is}\phantom{\rule{4.pt}{0ex}}D{E}_{F}^{k},\phantom{\rule{4.pt}{0ex}}\mathrm{then}\phantom{\rule{4.pt}{0ex}}u\phantom{\rule{4.pt}{0ex}}\mathrm{is}\phantom{\rule{4.pt}{0ex}}{U}_{F}^{k}.\end{array}$$
where ${E}_{F}^{k}$, $D{E}_{F}^{k}$ and ${U}_{F}^{k}$ are the fuzzy sets for e, $de$ and u, respectively.
In establishing the rule base, two fundamental principles are to be followed.
(1) When the input error of the controller is large enough, the output of the controller should quickly remove the error. For example, if e is NB and $de$ is NB, then u is PB.
(2) When the input error of the controller is quickly reducing, the output of the controller should reduce accordingly to avoid much overshoot. For example, if e is NB and $de$ is PB, then u is ZE.
4.4. Defuzzification
The center of gravity method, widely used in the fuzzy control field, is adopted for defuzzification:
$$\begin{array}{c}\hfill u=\frac{{\sum }_{k=1}^{35}{c}_{k}{\mu }_{k}(e,de)}{{\sum }_{k=1}^{35}{\mu }_{k}(e,de)}\end{array}$$
where u is the crisp output of the fuzzy controller, ${c}_{k}$ is the center of the membership function of the output fuzzy set of the kth rule, and ${\mu }_{k}(e,de)$ is the firing degree of the kth rule, computed with the Max-Min inference method.
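A toy Mamdani step combining Min inference with the center-of-gravity formula above can be sketched as follows. The membership shapes and centers here are illustrative assumptions, and only the two rules quoted in Section 4.3 are included; the paper's full 35-rule base in Table 1 is not reproduced:

```python
def tri(x, a, b, c):
    """Triangular membership function peaking at b over [a, c]."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

# Illustrative set centers on the normalized universe [-1, 1]; the
# paper's actual membership centers are not reproduced here.
E_C = {"NB": -1.0, "NM": -2 / 3, "NS": -1 / 3, "ZE": 0.0,
       "PS": 1 / 3, "PM": 2 / 3, "PB": 1.0}
DE_C = {"NB": -1.0, "NS": -0.5, "ZE": 0.0, "PS": 0.5, "PB": 1.0}
U_C = dict(E_C)  # the output u uses the same seven sets as e

def member(centers, name, x, width):
    c = centers[name]
    return tri(x, c - width, c, c + width)

def fuzzy_output(e, de, rules):
    """Min inference plus center-of-gravity defuzzification over the
    rule output centers: u = sum(c_k * mu_k) / sum(mu_k)."""
    num = den = 0.0
    for e_set, de_set, u_set in rules:
        mu_k = min(member(E_C, e_set, e, 1 / 3),
                   member(DE_C, de_set, de, 0.5))
        num += U_C[u_set] * mu_k
        den += mu_k
    return num / den if den > 0 else 0.0

# The two rules quoted in the text; the full base holds all 35 rules.
rules = [("NB", "NB", "PB"),   # large error, still growing: act hard
         ("NB", "PB", "ZE")]   # large error, shrinking fast: hold off
```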
4.5. Output Verification
The deviation of the torch from the seam must lie in a small range, because the fixtures hold the mugs in place and the shape differences among the mugs are limited. Therefore, to guarantee the reliability of the seam tracking control system, the output of the controller at each sample time and the total output during the welding process for each mug must be confined to specific ranges. As an example, the output of the controller in the x direction is given as follows.
$$\begin{array}{c}\hfill \left\{\begin{array}{ll}{u}_{o}=u,& \mathrm{if}\phantom{\rule{4.pt}{0ex}}{s}_{p}<{s}_{l}\phantom{\rule{4.pt}{0ex}}\mathrm{and}\phantom{\rule{4.pt}{0ex}}|u|<{u}_{l}\\ {u}_{o}=0,& \mathrm{if}\phantom{\rule{4.pt}{0ex}}{s}_{p}\ge {s}_{l}\\ {u}_{o}=-{u}_{l},& \mathrm{if}\phantom{\rule{4.pt}{0ex}}{s}_{p}<{s}_{l}\phantom{\rule{4.pt}{0ex}}\mathrm{and}\phantom{\rule{4.pt}{0ex}}u\le -{u}_{l}\\ {u}_{o}={u}_{l},& \mathrm{if}\phantom{\rule{4.pt}{0ex}}{s}_{p}<{s}_{l}\phantom{\rule{4.pt}{0ex}}\mathrm{and}\phantom{\rule{4.pt}{0ex}}u\ge {u}_{l}\end{array}\right.\end{array}$$
where ${u}_{o}$ is the output of the controller in the x direction at each sample time, ${s}_{p}$ is the accumulated output of the controller for welding each mug, and ${u}_{l}$ and ${s}_{l}$ are the output threshold values for each sample time and the whole welding process, respectively.
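This output verification reduces to a per-sample clamp plus a cumulative cut-off. A minimal sketch, using the threshold values reported in Section 5 (u_l = 3 mm, s_l = 100 mm) as defaults:

```python
def verify_output(u, s_p, u_l=3.0, s_l=100.0):
    """Clamp the controller output: the per-sample correction is
    limited to [-u_l, u_l], and the output is disabled once the
    accumulated correction s_p reaches s_l."""
    if s_p >= s_l:
        return 0.0          # total travel budget exhausted
    if u >= u_l:
        return u_l          # saturate at the upper bound
    if u <= -u_l:
        return -u_l         # saturate at the lower bound
    return u                # within bounds: pass through
```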
5. Experiments and Results
To test the effectiveness of the proposed robotic laser welding system for insulated mugs, experiments were conducted. The experimental setup is shown in Figure 8. The industrial robot was a STÄUBLI TX90, with six degrees of freedom, a payload of 20 kg, a reach of 1000 mm, and a repeatability of 0.03 mm. A Cognex DS1050 was used as the three-dimensional vision sensor. An nLIGHT QLCW1200 was used as the laser welding power source. A programmable logic controller (PLC), Mitsubishi FX3U80MT/ESA, was used as the main controller.
The parameters used in the visual feature extraction for the seam were as follows: ${i}_{tmax}=30$, ${n}_{s}=8$, ${d}_{t}=35$ mm. The image processing took about 20 ms. Therefore, the parameters of the control system were set as follows: the sample time was set to 50 ms, ${u}_{l}=3$ mm, ${s}_{l}=100$ mm.
The seam tracking results are shown in Figure 9, which demonstrates the seam tracking errors in the x, y and z directions, respectively. The largest tracking error in the x direction was 0.07 mm, and the mean error was 0.04 mm. The largest tracking error in the y direction was 0.08 mm, and the mean error was 0.03 mm. The largest tracking error in the z direction was 0.04 mm, and the mean error was 0.01 mm. These results show that high seam tracking precision was achieved using the proposed vision measurement and seam tracking control methods. To better demonstrate the performance of the proposed controller, a traditional PID controller was used in a comparative experiment. The seam tracking errors with the traditional PID controller are shown in Figure 10. It can be seen clearly that the performance of the proposed fuzzy controller is better than that of the traditional PID controller. The welded seams of the insulated mugs are shown in Figure 11. Figure 11a,b show the welded mouth seams of the insulated mugs, and Figure 11c,d show the welded bottom seams. It can be seen that high weld quality can be achieved using the proposed robotic laser welding system and the methods described above.
6. Conclusions
A robotic laser welding system is presented for stainless steel insulated mugs. Since the mouth and bottom seams of the insulated mugs are very narrow, the position of the laser torch needs to be controlled accurately to guarantee the weld quality. Therefore, a vision sensor with laser structured light is used to detect the seam features, which are used to guide the laser torch. After the seam feature points are obtained using the vision sensor, RANSAC is used to eliminate the outlier points and keep the inner points. Then, the LSF method is applied to the remaining inner points to obtain the smooth circle seam. In the welding process, a vision-based fuzzy seam tracking control system is designed to keep the laser torch precisely positioned. Experiments were conducted in the manufacturing factory of the insulated mugs to verify the effectiveness of the proposed robotic laser welding system and methods. The new system and methods can effectively improve the weld efficiency and quality for the insulated mugs.
The disadvantage of the current robotic welding system is that the parameters of the fuzzy controller rely on the authors' understanding of the system and are tuned by trial and error. In the future, data-driven methods may be adopted to determine the parameters of the fuzzy controller. Moreover, self-tuning methods can be designed for the fuzzy controller to further improve its performance.
Author Contributions
Proposing the main idea and writing—original draft preparation, Z.F.; preparing the experimental platform, W.W. (Wenwu Weng); designing the mechanical robotic system, W.W. (Weijun Wang); writing—review and editing, C.Z. and G.Y.
Funding
This work was supported by the National Key R & D Program of China (Grant No. 2017YFB1300400), the National-Zhejiang Joint Natural Science Foundation of China (Grant No. U1509202), the Key R & D Program of Zhejiang Province (Grant No. 2018C01086), and the Equipment Advanced Research Fund of China (Grant No. 6140923010102).
Acknowledgments
The authors would like to acknowledge the support from the Zhejiang Haers Vacuum Containers Co., Ltd. and Suzhou Quick Laser Technology Co., Ltd. for laser welding experiments and data validation.
Conflicts of Interest
The authors declare no conflict of interest.
References
 Saheed, B.A.; Irina, L.; Asmaa, K.; Alexey, S. Effect of laser welding process parameters and filler metals on the weldability and the mechanical properties of AA7020 aluminium alloy. J. Mater. Process. Technol. 2018, 2, 33.
 Sandeep, S.; Sachin, M. Research developments in laser welding—A review. Int. J. Innov. Res. Sci. Technol. 2017, 3, 60–64.
 Chavan, R.; Ghatage, D.A.; Bhosale, K.K. Review paper on laser welding machine. Int. J. Appl. Sci. Eng. 2017, 5, 815–819.
 Tanveer, M.; Mohd, A.W.; Faizan, A. Application of robotics in welding. Int. J. Emerg. Manag. 2018, 7, 30–36.
 Chen, X.Z.; Chen, S.B. The autonomous detection and guiding of start welding position for arc welding robot. Ind. Robot 2010, 37, 70–78.
 Zhang, Y.M.; Kovacevic, R.; Li, L. Adaptive control of full penetration gas tungsten arc welding. IEEE Trans. Control Syst. Technol. 1996, 4, 394–403.
 Bingul, Z.; Cook, G.E.; Strauss, A. Application of fuzzy logic to spatial thermal control in fusion welding. IEEE Trans. Ind. Appl. 2000, 36, 1523–1530.
 Bae, K.Y.; Park, J.H. A study on development of inductive sensor for automatic weld seam tracking. J. Mater. Process. Technol. 2006, 176, 111–116.
 Mahajan, A.; Figueroa, F. Intelligent seam tracking using ultrasonic sensors for robotic welding. Robotica 1997, 15, 275–281.
 Baek, D.; Moon, H.S.; Park, S.H. Development of an automatic orbital welding system with robust weaving width control and a seam-tracking function for narrow grooves. Int. J. Adv. Manuf. Technol. 2017, 93, 767–777.
 Du, R.Q.; Xu, Y.L.; Zhou, Z.; Shu, J.; Chen, S.B. Strong noise image processing for vision-based seam tracking in robotic gas metal arc welding. Int. J. Adv. Manuf. Technol. 2019, 101, 2135–2149.
 Zhu, Z.Y.; Lin, T.; Piao, Y.J.; Chen, S.B. Recognition of the initial position of weld based on the image pattern match technology for welding robot. Int. J. Adv. Manuf. Technol. 2005, 26, 784–788.
 Chen, X.Z.; Chen, S.B.; Lin, T.; Lei, Y.C. Practical method to locate the initial weld position using visual technology. Int. J. Adv. Manuf. Technol. 2006, 30, 663–668.
 Chen, H.Y.; Liu, K.; Xing, G.H.; Dong, Y.; Sun, H.X.; Lin, W. A robust visual servo control system for narrow seam double head welding robot. Int. J. Adv. Manuf. Technol. 2014, 71, 1849–1860.
 Fang, Z.J.; Xu, D.; Tan, M. A vision-based self-tuning fuzzy controller for fillet weld seam tracking. IEEE/ASME Trans. Mechatron. 2011, 16, 540–550.
 Kiddee, P.; Fang, Z.J.; Tan, M. A real-time and robust feature detection method using hierarchical strategy and modified Kalman filter for thick plate seam tracking. Int. J. Autom. Control 2017, 11, 428–446.
 Xu, Y.L.; Lv, N.; Fang, G.; Du, S.F.; Zhao, W.J.; Ye, Z.; Chen, S.B. Welding seam tracking in robotic gas metal arc welding. J. Mater. Process. Technol. 2017, 248, 18–30.
 Fang, Z.J.; Xu, D.; Tan, M. Visual seam tracking system for butt weld of thin plate. Int. J. Adv. Manuf. Technol. 2010, 49, 519–526.
 Wang, W.; Fang, Z.J. Improving 2D camera calibration by LO-RANSAC. Int. J. Inf. Electr. Eng. 2017, 7, 93–98.
 Xu, G.; Hao, Z.B.; Li, X.T.; Su, J.; Liu, H.P.; Zhang, X.Y. Calibration method of laser plane equation for vision measurement adopting objective function of uniform horizontal height of feature points. Opt. Rev. 2016, 23, 33–39.
 Kiddee, P.; Fang, Z.J.; Tan, M. A practical and intuitive calibration technique for cross-line structured light. OPTIK 2016, 127, 9582–9602.
 He, Y.; Liang, B.; Yang, J.; Li, S.Z.; He, J. An iterative closest points algorithm for registration of 3D laser scanner point clouds with geometric features. Sensors 2017, 17, 1862.
 Shin, H.H.; Cakmak, S.; Brion, O.; Villeneuve, P.; Turner, M.C.; Goldberg, M.; Jerrett, M.; Chen, H.; Crouse, D.; Peters, P.; et al. Indirect adjustment for multiple missing variables applicable to environmental epidemiology. Environ. Res. 2014, 134, 482–487.
 Roman, R.C.; Precup, R.E.; David, R.C. Second order intelligent proportional-integral fuzzy control of twin rotor aerodynamic systems. Procedia Comput. Sci. 2018, 139, 372–380.
 Zhang, H.B.; Liu, X.M.; Ji, H.H.; Hou, Z.S. Multi-agent-based data-driven distributed adaptive cooperative control in urban traffic signal timing. Energies 2019, 12, 1402.
Figure 4. Illustration of circle seam detection as the intersection of a plane and a sphere.
Figure 5. Results of the circle seam detection: (a) seam image with outlier points; (b) detected smooth seam.
Figure 11. The welded seams of the insulated mugs using the proposed method: (a,b) the mouth seams; (c,d) the bottom seams.
de/e  NB  NM  NS  ZE  PS  PM  PB
NB    PB  PB  PB  PM  PS  PS  ZE
NS    PB  PM  PM  PS  PS  ZE  NS
ZE    PM  PM  PS  ZE  NS  NM  NM
PS    PS  ZE  ZE  NS  NM  NB  NB
PB    ZE  NS  NS  NM  NB  NB  NB
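The rule table above maps the seam-tracking error e and its change de to a correction command through linguistic labels (NB, NM, NS, ZE, PS, PM, PB). A minimal sketch of such a rule-base lookup is given below; it is illustrative only, with hypothetical crisp label centers, and uses a nearest-label mapping instead of the membership functions and defuzzification a full fuzzy controller would employ. Only the rule rows shown in the table are included.

```python
# Sketch of a fuzzy rule-table lookup for seam-tracking correction.
# Labels: NB/NM/NS = negative big/medium/small, ZE = zero,
# PS/PM/PB = positive small/medium/big.

# Rule base from the table above: RULES[de_label][e_label] -> output label.
RULES = {
    "NB": {"NB": "PB", "NM": "PB", "NS": "PB", "ZE": "PM", "PS": "PS", "PM": "PS", "PB": "ZE"},
    "NS": {"NB": "PB", "NM": "PM", "NS": "PM", "ZE": "PS", "PS": "PS", "PM": "ZE", "PB": "NS"},
    "ZE": {"NB": "PM", "NM": "PM", "NS": "PS", "ZE": "ZE", "PS": "NS", "PM": "NM", "PB": "NM"},
    "PS": {"NB": "PS", "NM": "ZE", "NS": "ZE", "ZE": "NS", "PS": "NM", "PM": "NB", "PB": "NB"},
    "PB": {"NB": "ZE", "NM": "NS", "NS": "NS", "ZE": "NM", "PS": "NB", "PM": "NB", "PB": "NB"},
}

# Hypothetical crisp centers (mm) assigned to each label for illustration.
CENTERS = {"NB": -3.0, "NM": -2.0, "NS": -1.0, "ZE": 0.0, "PS": 1.0, "PM": 2.0, "PB": 3.0}

def nearest_label(x, labels):
    """Map a crisp value to the linguistic label with the closest center."""
    return min(labels, key=lambda lb: abs(CENTERS[lb] - x))

def fuzzy_correction(e, de):
    """Look up the torch correction for seam error e and error change de."""
    de_label = nearest_label(de, RULES.keys())
    e_label = nearest_label(e, RULES["ZE"].keys())
    return CENTERS[RULES[de_label][e_label]]
```

For example, a zero error with zero change yields a zero correction, while a large negative error with a large negative change yields a large positive correction that steers the torch back toward the seam.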
© 2019 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).