A Deep Learning-Based Machine Vision System for Online Monitoring and Quality Evaluation During Multi-Layer Multi-Pass Welding
Abstract
1. Introduction
- To integrate a line scanner and an infrared sensor into the welding system for the acquisition and visualization of weld bead geometry and thermal data during multi-layer multi-pass (MLMP) welding.
- To develop a robust algorithm for detecting groove profiles, modelling weld cross-sections, and reconstructing the three-dimensional welded surface to monitor welding progress.
- To develop a processing method that converts depth data into input data for a deep learning model used in surface quality evaluation.
- To develop an image processing algorithm based on infrared images for burn-through defect inspection during welding.
2. Experimental Setup
3. Methodologies
3.1. Groove Geometry Detection
Algorithm 1: Proposed pseudo-code for groove geometry extraction.
Input: Line scanner profile
Output: Groove geometry
1. Get the height and x values from the raw line scanner profile
2. Compute the first derivative of the line scanner profile
3. Determine the base material regions on the left and right sides based on the slope values
4. Determine the groove geometry
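The four steps above can be sketched in NumPy as follows, applied to a synthetic V-groove profile. This is a minimal illustration: the slope threshold, the helper name, and the derived width/depth quantities are assumptions for the sketch, not values taken from the paper.

```python
import numpy as np

def extract_groove(x, z, slope_thresh=0.05):
    """Steps 1-4 of Algorithm 1 on a single profile (x, z in mm)."""
    dz = np.gradient(z, x)                 # Step 2: first derivative dz/dx
    flat = np.abs(dz) < slope_thresh       # Step 3: near-flat base material
    idx = np.where(~flat)[0]               # samples lying on the groove walls
    if idx.size == 0:
        return None                        # no groove in this profile
    left, right = idx[0], idx[-1]          # Step 4: groove extent
    return {
        "left_edge_x": x[left],
        "right_edge_x": x[right],
        "width": x[right] - x[left],
        "depth": z[:left].mean() - z[left:right + 1].min(),
    }

# Synthetic V-groove, sampled at 3200 points like the LJ-X 8080 profile
x = np.linspace(-20.0, 20.0, 3200)
z = np.where(np.abs(x) < 10.0, np.abs(x) - 10.0, 0.0)
groove = extract_groove(x, z)              # width ~ 20 mm, depth ~ 10 mm
```

Flat base material gives near-zero slope while the groove walls give a large |dz/dx|, so a single slope threshold separates the two regions.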
3.2. Cross-Section and Welded Surface Modelling for Multi-Bead Monitoring
3.3. Surface Defect Inspection Using the Line Scanner
3.4. Weld Pool Temperature Monitoring and Burn-Through Detection Using an IR Camera
Algorithm 2: Proposed pseudo-code for burn-through detection.
Input: Temperature data
Output: Burn-through or normal status
1. Find the coordinates of the maximum temperature
2. Define a region of interest (ROI) centered at that coordinate
3. Apply a threshold to the ROI to create a binary image
4. Find contours in the binary image
5. Use the contour hierarchy to identify inner contours
6. If inner contours exist:
7.     Measure the minimum vertical distance from the inner contour to the top edge
8.     If the distance < threshold distance:
9.         Return: Burn-through
10.    Else:
11.        Return: Normal
12. Else:
13.    Return: Normal
4. Results and Discussion
4.1. Multi-Pass Weld Bead Modelling Results
4.2. Surface Defect Inspection Model Training Results
4.3. Burn-Through Defect Inspection Testing Results
4.4. Integrated System for Weld Bead Monitoring and Surface Defect Detection
5. Conclusions
- By integrating a line scanner and an infrared sensor, this study enabled detailed acquisition and visualization of weld profiles and thermal data during the MLMP welding process.
- Robust algorithms were developed for groove profile detection, weld cross-section modelling, and three-dimensional surface reconstruction, allowing precise tracking of welding progress and quantification of distortion.
- Normal maps generated from line-scanner depth data were used as input to deep learning models for evaluating weld surface quality. This method improved the accuracy of detecting defects such as porosity and lack of fusion, even in the presence of material surface colour variations.
- An image processing algorithm was developed and applied to infrared camera data, enabling effective real-time detection of burn-through defects during welding and allowing timely intervention to prevent critical weld failures.
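The normal-map preprocessing mentioned above can be sketched as follows. This is a minimal illustration assuming gradient-based normal estimation and an 8-bit RGB encoding; the paper's exact pipeline may differ.

```python
import numpy as np

def depth_to_normal_map(depth, scale=1.0):
    """Encode per-pixel surface normals of a depth map as an 8-bit RGB image."""
    depth = depth.astype(np.float64)
    dz_dy, dz_dx = np.gradient(depth)
    # Unnormalised normal of the surface z = depth(x, y): (-dz/dx, -dz/dy, 1)
    n = np.dstack((-dz_dx * scale, -dz_dy * scale, np.ones_like(depth)))
    n /= np.linalg.norm(n, axis=2, keepdims=True)   # unit normals
    # Map each component from [-1, 1] into [0, 255]
    return np.round((n + 1.0) * 0.5 * 255.0).astype(np.uint8)

# A perfectly flat region encodes to the uniform colour (128, 128, 255),
# regardless of the material's surface colour
flat = depth_to_normal_map(np.zeros((8, 8)))
```

Because the encoding depends only on local geometry, surface colour variations that confuse luminance-based inputs leave the normal map unchanged.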
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
References
| Name | Device | Specification |
|---|---|---|
| Welding machine | Welding head: AMI Model M15; Power supply: AMI Model 227 (Arc Machines, Inc., Pacoima, CA, USA) | AVC stroke: 44 mm; Wire feed speed: 0–5080 mm/min; Travel speed: 2.54–508 mm/min |
| Line scanner | Keyence LJ-X 8080 (KOREA KEYENCE Co., Ltd., Seoul, Republic of Korea) | Profile data count: 3200 points |
| IR camera | FLIR A700 (Teledyne FLIR LLC, Seoul, Republic of Korea) | Resolution: 640 × 480 pixels |
| Parameter | Value |
|---|---|
| Material | SA 106-Gr.B, SUS 304 |
| Tube diameter | 220 mm |
| Thickness | 20 mm |
| Groove angle | 60 degrees |
| Argument | Value | Description |
|---|---|---|
| epochs | 1000 | Total number of training epochs |
| batch | 16 | Batch size: the number of training examples used in one iteration |
| imgsz | 640 | Target image size for training |
| workers | 8 | Number of worker threads for data loading |
| seed | 0 | Random seed, ensuring reproducibility across runs with the same configuration |
| lr0 | 0.01 | Initial learning rate |
| lrf | 0.01 | Final learning rate, expressed as a fraction of lr0 |
| box | 7.5 | Weight of the box loss component in the total loss function |
| cls | 0.5 | Weight of the classification loss component in the total loss function |
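The configuration above corresponds to a training call in the Ultralytics API along the following lines. This is a sketch: the dataset YAML name is a hypothetical placeholder, and the weights file assumes the YOLOv10-N variant.

```python
from ultralytics import YOLO

# Load YOLOv10-N weights and train with the hyperparameters listed above
model = YOLO("yolov10n.pt")
results = model.train(
    data="weld_defects.yaml",  # hypothetical dataset configuration file
    epochs=1000,
    batch=16,
    imgsz=640,
    workers=8,
    seed=0,     # fixed seed for reproducibility
    lr0=0.01,   # initial learning rate
    lrf=0.01,   # final learning rate as a fraction of lr0
    box=7.5,    # box-loss weight
    cls=0.5,    # classification-loss weight
)
```

Training the same configuration on both luminance and normal-map datasets allows the direct mAP comparison reported in Section 4.2.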
| Model | Dataset | mAP50 | mAP50-95 |
|---|---|---|---|
| YOLOv10-N | Luminance data | 0.825 | 0.445 |
| YOLOv10-N | Normal map data | 0.854 | 0.491 |
| YOLOv10-S | Luminance data | 0.854 | 0.528 |
| YOLOv10-S | Normal map data | 0.882 | 0.534 |
© 2025 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Truong, V.D.; Wang, Y.; Won, C.; Yoon, J. A Deep Learning-Based Machine Vision System for Online Monitoring and Quality Evaluation During Multi-Layer Multi-Pass Welding. Sensors 2025, 25, 4997. https://doi.org/10.3390/s25164997