Sensors 2018, 18(1), 129; https://doi.org/10.3390/s18010129
2. Configuration of the Experiment Platform
3. The Synchronous Acquisition Method to Capture Various Kinds of Images
4. Elimination of the Interferences from Arc Light and Spatters by Fusing Adjacent Images
5. Processing Method When the Directional Light Source Is Enabled
6. Processing Method When the Structured Light Source is Enabled
7. Information Fusion Method for Directional and Structured Light Images
- Denoising. When L1 or L2 is enabled, Equation (2) is used to eliminate the arc light and spatters in the images; when L3 is enabled, Equation (1) is used instead.
- Directional light image processing. First, the processing method proposed in Section 5 is used to calculate the curve of the largest gradient point numbers, as shown in Figure 13. Then, all of the valid peaks in Figure 13 are found using the thresholding and non-maximum suppression method illustrated in Section 5. Denote the set containing all of the valid peaks by A(p). For each valid peak point pi in A(p), record its confidence interval [pi − li, pi + ri], within which the values of the curve in Figure 13 are not less than 50% of the peak value.
- Structured light image processing. As described in Section 6, the second derivative values of the laser stripe curve are calculated, and the candidate points are found using the thresholding method. Denote the set containing all of the candidate points by B(q), and record the second derivative value dq,j at each point qj.
- Information fusion. The actual positions of the borders are expected to lie in the confidence intervals of the set A(p) and to belong to the set B(q). For each element pi in A(p), detect whether any element of B(q) is located in the confidence interval [pi − li, pi + ri]. If such elements exist, the one with the largest second derivative value dq,j is most likely to be the actual position of the border; record it as sk. If not, simply ignore pi. After these processing steps, a new candidate point set C(s) containing all of the sk is obtained.
- Non-maximum suppression. The non-maximum suppression algorithm is applied to the set C(s) to eliminate elements that are close to each other. For the cases studied in this paper, the distance threshold of the non-maximum suppression process is set to 50 pixels. The final detection result is the set C(s) after non-maximum suppression.
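The fusion and suppression steps above can be sketched in Python as follows. This is an illustrative sketch, not the authors' implementation: the function names, the peak/candidate inputs, and the greedy 1-D non-maximum suppression strategy are assumptions for demonstration purposes.

```python
import numpy as np

def fuse_candidates(peaks, intervals, cand_pts, cand_d2):
    """For each directional-light peak p_i with confidence interval
    [p_i - l_i, p_i + r_i], pick the structured-light candidate q_j
    inside the interval with the largest second derivative d_{q,j};
    peaks without any candidate in their interval are ignored."""
    fused, scores = [], []
    for p, (lo, hi) in zip(peaks, intervals):
        in_window = [(q, d) for q, d in zip(cand_pts, cand_d2) if lo <= q <= hi]
        if in_window:
            q_best, d_best = max(in_window, key=lambda t: t[1])
            fused.append(q_best)
            scores.append(d_best)
    return fused, scores

def non_max_suppression(points, scores, min_dist=50):
    """Greedy 1-D non-maximum suppression: keep the highest-scoring
    point first, then drop any point closer than min_dist (pixels)
    to one already kept."""
    order = np.argsort(scores)[::-1]
    kept = []
    for i in order:
        if all(abs(points[i] - p) >= min_dist for p in kept):
            kept.append(points[i])
    return sorted(kept)
```

With hypothetical sample values, two directional-light peaks at 100 and 300 pixels would each be matched to the strongest structured-light candidate inside their interval, and the final suppression pass keeps both since they are more than 50 pixels apart:

```python
fused, scores = fuse_candidates(
    peaks=[100, 300],
    intervals=[(90, 110), (290, 310)],
    cand_pts=[95, 105, 305, 400],
    cand_d2=[1.0, 2.0, 3.0, 5.0],
)
borders = non_max_suppression(fused, scores, min_dist=50)  # [105, 305]
```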
8. Experiments and Discussions
Conflicts of Interest
| Image Processing Methods | Detailed Research Works |
|---|---|
| Image pre-processing methods for denoising | |
| Laser stripe pattern extraction methods | |
| Welding joint feature extraction and profiling methods | |
© 2018 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).