Article

Micro-Gap Weld Seam Contrast Enhancement via Phase Contrast Imaging

1 School of Transportation and Logistics Engineering, Wuhan University of Technology, Wuhan 430063, China
2 Engineering Research Center for Logistics Technology and Equipments, Wuhan University of Technology, Wuhan 430063, China
* Author to whom correspondence should be addressed.
Materials 2025, 18(6), 1281; https://doi.org/10.3390/ma18061281
Submission received: 11 February 2025 / Revised: 10 March 2025 / Accepted: 12 March 2025 / Published: 14 March 2025

Abstract:
The precision and stability of seam position detection are critical for single-square-groove weld seams formed using two thin metal plates. However, traditional methods, such as structured laser light imaging, struggle with narrow seams that lack misalignment and have high reflectivity, while non-structured light approaches are prone to welding light interference and speckle noise. To overcome these challenges, we propose a versatile optical design that leverages differential illumination to generate differential phase contrast (DPC) images. By processing images captured under differential illumination, the DPC method notably enhances seam edge contrast and suppresses welding light noise, improving the detection robustness and reliability. This approach provides a promising solution for high-precision weld seam detection in challenging environments.

Graphical Abstract

1. Introduction

Welding quality and productivity have significantly improved with the adoption of automatic seam detection and tracking systems. As shown in Figure 1, various types of seam detection sensors are available, including ultrasonic sensors [1], arc sensors [2], magneto-optical sensors [3], tactile sensors [4], and vision sensors [5,6,7,8,9,10,11]. Each method has its unique advantages for specific welding applications. Among these, vision sensors are widely preferred due to their high speed, precision, non-contact measurement capability, and ability to provide three-dimensional information about the weld seam.
Vision-based weld seam detection methods can be broadly classified based on the type of illumination. The first category employs structured light, typically using laser lines generated by a cylindrical lens [11,12,13,14,15,16,17,18,19,20,21,22,23,24,25,26]. The second category utilizes wide-field illumination, usually produced by a collimated beam from a condenser or convex lens [27,28,29]. Based on the light collection approach, these methods can be further divided into dark-field imaging [28,29] and bright-field imaging [27]. Due to the highly reflective surface of metal weldments, dark-field imaging is the preferred choice for most vision-based weld seam detection systems. Besides the above classifications, advancements in weld seam tracking systems have also been driven by the development of vision sensors. Optical coherence tomography (OCT), known for its three-dimensional imaging capability, has been applied in welding processes. Its coaxial imaging configuration allows the imaging light source and the welding light source to share the same optical path, enabling real-time capture of the weld profile within the welding pool [30,31].
Structured laser light-based weld seam detection is particularly effective for various weld groove types [32], including lap grooves [23], V-shaped grooves [11,14,17,33,34,35], U-shaped grooves [13], and tee-shaped grooves [15,16], as the laser line projection significantly bends at the seam location. However, for tight square grooves with seam widths smaller than 0.1 mm, the laser line often exhibits minimal shape deformation at the seam location, leading to unstable or failed detection [24,36].
To capture more surface features of the weld seam, non-structured illumination, such as wide-field illumination using light-emitting diodes (LEDs), has been employed for detecting tight square grooves. Although seam features are observable in the acquired images, their contrast is highly sensitive to factors like the surface texture, reflectivity, and flatness of the weldment. Poor seam contrast complicates image processing algorithms, making it challenging to achieve accurate and stable seam position extraction. To address these limitations, significant progress has been made in developing novel denoising optical sensors [27,28,29,37]. For example, combined structured and non-structured illumination demonstrated improved detection stability for tight square grooves [29]. Additionally, the seam image contrast was greatly enhanced by processing image series acquired under periodic wide-field illumination [37]. However, as multiple images are required to generate a single high-contrast image, the acquisition speed should be increased to meet the demands of high-speed welding. Beyond sensor optimization, researchers have explored advanced image processing methods to enhance the applicability of non-structured illumination for weld seam detection. For instance, an innovative Attention-Enhanced Feature Fusion Network (AFFNet) has been used to identify weld locations at the pixel level through enhanced feature detection [38]. The Kalman filter has been employed to predict and optimize seam detection results for tight square grooves under LED illumination [28]. To further improve seam position detection under challenging conditions, probabilistic and deep learning-based methods have been explored. A particle filter was employed to estimate the seam position by incorporating both current and past observations, enhancing robustness against strong light noise [39]. 
Although the particle filter improved the detection robustness under welding light, it did not enhance the detection accuracy because the seam edge contrast was compromised under welding light illumination. A generative adversarial network (GAN) was used to remove weld fume noise from seam images, enabling clearer seam detection [40]. To mitigate welding light noise, a deep learning model was developed for key point detection in fillet welding [41]. Furthermore, leveraging its strong noise resistance, a spatial–temporal deep learning model enabled direct seam tracking using molten pool image sequences [42].
As can be seen from the above analysis, tremendous progress has been made in improving the seam tracking accuracy under welding light noise. However, few studies have directly focused on improving the weld seam edge contrast using non-structured illumination methods from an optical design perspective. Building upon our previous research [37], we propose an improved optical design that incorporates the DPC imaging technique for seam detection in this study. DPC imaging has been demonstrated to enhance edge contrast in printed circuit board (PCB) inspection [43]. It has also gained widespread adoption in modern optical microscopy, where it increases the contrast of fine structures in biological specimens [44,45,46,47,48].
Unlike conventional imaging, which captures only the amplitude of light, the DPC method computationally recovers phase differences from two intensity images, allowing for the detection of features that are otherwise difficult to observe. Inspired by this approach, we designed an optical system that converts phase variations caused by the surface and edges of the weldment into intensity variations by calculating the difference between two intensity images.
To achieve real-time DPC imaging, we employ a high-speed LED controller and high-speed camera. A comparative analysis with existing seam detection methods demonstrates that the DPC image offers competitive results in terms of edge intensity and seam contrast. Additionally, since the DPC image is generated by subtracting one image from another, the welding light noise is inherently reduced, further enhancing detection robustness.
This research introduces several key contributions to vision-guided automatic welding processes. Firstly, the proposed DPC imaging method offers a novel optical design that notably improves the seam edge detection accuracy and robustness of existing seam tracking techniques. By capturing phase variations at the seam edge, the method enhances edge contrast, enabling more precise detection of the seam position and width, which is critical for high-precision laser welding applications. Moreover, the DPC method naturally reduces welding light noise, further enhancing detection robustness even in challenging, noisy conditions. The ability to capture phase variations, a feature that is not achievable with traditional seam tracking methods, forms the foundation for the enhanced seam contrast, making the proposed technique more effective under high-precision welding conditions. Additionally, the DPC imaging technique can be easily integrated into the existing vision sensors used in high-precision laser welding systems, such as robotic-arm laser welding machines. This integration has significant potential to improve the capability of current sensors to detect narrow-gap weld seams with higher accuracy, making it well suited for real-world industrial applications.
To the best of our knowledge, this is the first application of DPC imaging for real-time weld seam detection. With increased seam contrast and effective noise reduction, the seam extraction algorithm becomes simpler and more reliable.

2. Methods

2.1. Experimental Setup and Principle

A three-dimensional model and a photograph of our seam detection system for tight square grooves are shown in Figure 2a,b. The system primarily consists of the following components: a high-speed camera (FLIR: ORX-10GS, Wilsonville, OR, USA), camera lens (Navitar: NMV-50, Rochester, NY, USA), optical filter (Thorlabs: FBH470-10, Newton, MA, USA), blue LED (Thorlabs: M470L5, Newton, USA), high-speed LED controller (Thorlabs: LEDD1B, Newton, USA), condenser lens (Thorlabs: ACL2520U-A, Newton, USA), and analog output card.
The two LEDs emit light with the same spectrum and power, centered at 470 nm. The optical filter, also centered at 470 nm, effectively blocks most ambient and welding light. The condenser lens is used to collimate the LED light. The two high-speed LED controllers switch the LEDs on and off alternately, enabling sequential illumination of the workpiece. The LED driver can modulate the light at a maximum frequency of 5000 Hz. Synchronization between the LED driver and the camera is achieved via control signals from the analog output card, ensuring image acquisition when only one LED is active. The camera operates at a maximum frame rate of ~1000 Hz when the region of interest (ROI) is 200 pixels × 1000 pixels. The working distance is approximately 220 mm, yielding a spatial pixel resolution of 12 µm × 12 µm. Figure 2c illustrates the synchronization control signals for the LED and the camera.
The proposed method leverages a modern high-speed camera and high-speed LED driver, allowing the capture of right and left illumination images within ~2 ms, thereby minimizing motion artifacts during real-time weld seam detection. Figure 2d presents the concept and core idea of the proposed method, which enhances seam contrast through DPC imaging while removing spark noise from the acquired images. Spark noise is directly eliminated by calculating the difference between two images captured almost simultaneously. The mechanism for seam contrast enhancement is explained in detail in subsequent sections.
Schematics of our setup are shown in Figure 3a,b. The angle between the optical axes of the two LEDs and the camera lens was designed to be larger than the light collection angle of the camera lens, ensuring that the camera collects diffused rather than reflected light. The system follows the principle of dark-field illumination to reduce specular reflections from the metal surface of the workpiece. As shown in Figure 3a, the acquisition of DPC images differs from traditional weld seam detection methods because the DPC image is obtained as the difference between images captured with alternating left and right side illumination. Comparing these images taken under different lighting conditions enables the extraction of phase information, which traditional seam detection methods cannot capture, thereby enhancing the details and contrast of the weld seam image. In biological specimens, phase changes in illumination light arise from variations in thickness and refractive indices. In our case, the phase changes result from height variations at the seam edge. Therefore, the proposed DPC imaging approach improves image contrast by converting height variations at the micro-seam edge into intensity variations displayed in the DPC image, making it particularly suitable for seam contrast enhancement in metal workpieces. According to the principle detailed in [45,48], the reflectance differential phase contrast can be calculated from two images under different illuminations using Equation (1):
$I_{DPC} = 2 \cdot \dfrac{I_{Right} - I_{Left}}{I_{Right} + I_{Left}},$  (1)
where $I_{Right}$ and $I_{Left}$ represent images acquired under right-side and left-side illumination, respectively [48]. The term $I_{Right} - I_{Left}$ is the difference image, while $(I_{Right} + I_{Left})/2$ is the average image. The ratio $(I_{Right} - I_{Left})/(I_{Right} + I_{Left})$ in the equation represents the intensity difference between the two illumination directions, scaled by the sum of the intensities. This scaling helps suppress spatially uniform background signals and, through normalization, enhances contrast by emphasizing directional variations caused by surface topography, phase gradients, or subtle structural features. This process highlights finer details in the specimen, making it possible to observe the intricate structure with much greater clarity. These details are often difficult to detect using conventional imaging methods, thus providing a more precise representation of the specimen’s surface.
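The normalized DPC of Equation (1) can be sketched in a few lines of NumPy (an illustrative implementation, not the authors' code; the small `eps` guard against division by zero in dark pixels is our addition):

```python
import numpy as np

def dpc_normalized(i_right: np.ndarray, i_left: np.ndarray, eps: float = 1e-6) -> np.ndarray:
    """Reflectance DPC per Equation (1): difference image scaled by the average.

    i_right, i_left: intensity images under right-/left-side illumination.
    eps: small constant (our addition) to avoid division by zero in dark pixels.
    """
    r = i_right.astype(np.float64)
    l = i_left.astype(np.float64)
    return 2.0 * (r - l) / (r + l + eps)
```

For a pixel with $I_{Right} = 4$ and $I_{Left} = 2$, the DPC value is $2 \cdot (4 - 2)/(4 + 2) \approx 0.667$; a pixel lit equally from both sides maps to zero, which is what suppresses the uniform background.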
The reason why the DPC is defined as the ratio of the difference image to the average image is that the absorption information needs to be normalized [45,48]. In the context of imaging metal workpieces, light absorption is typically uniform and negligible; this contrasts sharply with biological specimens, where absorption properties vary significantly across the specimen. In biological imaging, normalization of the absorption is essential to accurately capture phase shifts caused by structural variations. However, for metal workpieces, the minimal absorption of light makes the normalization process unnecessary. As a result, in the proposed method, the DPC is calculated simply as the difference between two images taken at consecutive time points under the illumination of different LEDs. This approach effectively isolates the phase shift information without requiring absorption normalization, thereby streamlining the process for materials with uniform light absorption characteristics. Thus, Equation (1) is simplified to Equation (2):
$I_{DPC}^{weld\_seam} = I_{Right} - I_{Left}$  (2)
Due to the high reflectivity of the metal surface, changes in light intensity caused by different illumination angles impact the imaging results. Therefore, weighting factors are introduced into Equation (2) to adjust for the differences in light intensity caused by varying illumination angles, ensuring that the results accurately reflect the phase changes due to the specimen’s structure. The differential phase contrast with the weighting factor is shown in Equation (3):
$I_{DPC}^{weld\_seam} = w_1 \cdot I_{Right} - w_2 \cdot I_{Left},$  (3)
where $w_1$ and $w_2$ are weighting factors that can be determined from the inverse ratio of the average grayscale values of the two images. This approach helps mitigate the impact of intensity differences between the light sources, improving the accuracy of the phase information calculations.
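As a sketch of Equation (3), the weights can be derived from the inverse ratio of the two images' mean grayscale values, so the dimmer image receives the larger weight (the exact normalization the authors use is not specified; the choice $w_1 + w_2 = 1$ below is one plausible interpretation):

```python
import numpy as np

def dpc_weighted(i_right: np.ndarray, i_left: np.ndarray) -> np.ndarray:
    """Weighted difference image of Equation (3).

    w1/w2 = mean(I_Left)/mean(I_Right), i.e. the inverse ratio of the
    average grayscale values, normalized here so that w1 + w2 = 1
    (an assumption; the paper does not fix the normalization).
    """
    r = i_right.astype(np.float64)
    l = i_left.astype(np.float64)
    total = r.mean() + l.mean()
    w1, w2 = l.mean() / total, r.mean() / total  # dimmer image -> larger weight
    return w1 * r - w2 * l
```

With this choice, $w_1 \cdot \mathrm{mean}(I_{Right}) = w_2 \cdot \mathrm{mean}(I_{Left})$, so a uniform brightness imbalance between the two LEDs cancels out of the difference image, leaving only the directional (phase) variations.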

2.2. Comparison with Existing Seam Detection Methods

Figure 3b–e compare the proposed method with current seam detection methods based on non-structured light. Figure 3c shows a typical welding seam detection system that utilizes dark-field illumination, where both LEDs are switched on during acquisition. Figure 3d illustrates a welding seam detection system employing bright-field lighting, where the optical axes of the light source and the camera are symmetrical with respect to the normal vector of the metal surface. Specular glare may occur when light from the LED is directly reflected into the camera. Bright-field illumination is generally not used in weld seam detection due to the high reflectivity of the metal surface. Additionally, there is a defocus issue in the method shown in Figure 3d because the focal plane of the camera lens is not parallel to the metal surface. To improve the stability of seam detection by capturing multiple seam features, both non-structured and structured illumination are used in the method shown in Figure 3e. As shown in the comparison, the key difference in the proposed method, depicted in Figure 3a,b, is that the phase difference of the light is also used in seam feature extraction. By converting the phase variation of the object into intensity variation in the image, the proposed method enhances the seam contrast and visualizes seam features that could not be obtained by means of traditional methods.

2.3. Image Processing

To demonstrate the improvement in image contrast and the versatility of the proposed method, the contrast of the seam edges acquired using different methods was compared under the same image processing conditions. Since localization of the seam position is the most important aspect of seam detection, the image contrast was evaluated by comparing the normalized gradient of the seam edges. The location of the highest gradient in the image usually indicates the seam position. For comparison, the gradient of the seam image was calculated using the Sobel operator, which is a widely used method.
We developed a software tool using LabVIEW 2023 Q1, integrated with MATLAB R2023b code, to process the images. The first step in the image processing pipeline involves applying a 2 × 2 median filter to both the left and right images. This filter reduces noise and smooths the images by replacing each pixel with the median of its neighboring pixels. Specifically, for each pixel IRight(n,m) and ILeft(n,m), the filtered values IRight,filtered(n,m) and ILeft,filtered(n,m) are computed as the median of a 2 × 2 window anchored at the pixel, which helps preserve edge information while removing minor noise. This process is shown in Equations (4) and (5) below:
$I_{Right,filtered}(n,m) = \mathrm{Median}\big(I_{Right}(n,m),\, I_{Right}(n+1,m),\, I_{Right}(n,m+1),\, I_{Right}(n+1,m+1)\big),$  (4)
$I_{Left,filtered}(n,m) = \mathrm{Median}\big(I_{Left}(n,m),\, I_{Left}(n+1,m),\, I_{Left}(n,m+1),\, I_{Left}(n+1,m+1)\big),$  (5)
where n and m are the pixel coordinates in the vertical and horizontal directions, and Median means taking the median value of the pixels in the 2 × 2 neighborhood.
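The 2 × 2 median filter of Equations (4) and (5) vectorizes cleanly in NumPy. A sketch (illustrative, not the authors' LabVIEW/MATLAB code; note that the median of four values is the mean of the two middle ones, and the output loses one row and one column at the bottom/right edges):

```python
import numpy as np

def median_filter_2x2(img: np.ndarray) -> np.ndarray:
    """2x2 median filter per Equations (4)-(5).

    Each output pixel is the median of the pixel and its right, lower,
    and lower-right neighbours; the output is one row and one column
    smaller than the input.
    """
    img = img.astype(np.float64)
    windows = np.stack([img[:-1, :-1],   # I(n, m)
                        img[1:, :-1],    # I(n+1, m)
                        img[:-1, 1:],    # I(n, m+1)
                        img[1:, 1:]])    # I(n+1, m+1)
    return np.median(windows, axis=0)
```

Because the median of four samples averages the two central values, a single outlier pixel (e.g. a hot pixel of value 100 among {1, 2, 3}) is replaced by 2.5, while a genuine step edge shared by two of the four pixels survives.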
Following the filtering step, a phase image is computed by evaluating the weighted difference between the filtered left and right images. The phase difference D(n,m) at each pixel is determined by means of a linear combination of the right and left images, weighted by the respective factors w1 and w2. This is shown in Equation (6):
$D(n,m) = w_1 \cdot I_{Right,filtered}(n,m) - w_2 \cdot I_{Left,filtered}(n,m)$  (6)
The next step involves computing the gradient of the phase image D(n,m) using the Sobel operator. The Sobel operator applies two convolutional kernels (Sx and Sy). The Sobel operator is applied in both the horizontal and vertical directions to estimate the image gradient components Gx(n,m) and Gy(n,m), which are then combined to produce the gradient magnitude G(n,m). The horizontal (Gx) and vertical (Gy) derivatives of the Sobel operator are expressed in Equations (7) and (8) [49,50]:
$S_x = \begin{bmatrix} -1 & 0 & 1 \\ -2 & 0 & 2 \\ -1 & 0 & 1 \end{bmatrix}, \quad S_y = \begin{bmatrix} -1 & -2 & -1 \\ 0 & 0 & 0 \\ 1 & 2 & 1 \end{bmatrix},$  (7)
$G_x(n,m) = D(n,m) * S_x, \quad G_y(n,m) = D(n,m) * S_y,$  (8)
where ∗ stands for the convolution operation.
The final gradient magnitude can be calculated using Equation (9):
$G(n,m) = \sqrt{G_x(n,m)^2 + G_y(n,m)^2}$  (9)
The final step is to normalize the gradient image by mapping the gradient values to the range [0, 1], facilitating comparisons of different images. The normalization is given in Equation (10):
$G_{normalized}(n,m) = \dfrac{G(n,m) - G_{\min}}{G_{\max} - G_{\min}},$  (10)
where Gmin is the minimum value of the gradient image, and Gmax is the maximum value of the gradient image.
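Equations (7)–(10) amount to a standard Sobel gradient magnitude followed by min–max normalization. A pure-NumPy sketch (the paper's LabVIEW/MATLAB implementation is not published; zero padding at the borders is our choice, and correlation is used in place of convolution because the kernel flip only changes the sign, which the magnitude discards):

```python
import numpy as np

SX = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], dtype=np.float64)  # Eq. (7)
SY = SX.T  # [[-1, -2, -1], [0, 0, 0], [1, 2, 1]]

def _filter3x3(d: np.ndarray, k: np.ndarray) -> np.ndarray:
    """Same-size 3x3 correlation with zero padding at the image borders."""
    p = np.pad(d, 1)
    out = np.zeros_like(d, dtype=np.float64)
    for i in range(3):
        for j in range(3):
            out += k[i, j] * p[i:i + d.shape[0], j:j + d.shape[1]]
    return out

def seam_gradient(d: np.ndarray) -> np.ndarray:
    """Normalized Sobel gradient magnitude of the phase image D, Eqs. (8)-(10)."""
    gx = _filter3x3(d.astype(np.float64), SX)  # horizontal derivative
    gy = _filter3x3(d.astype(np.float64), SY)  # vertical derivative
    g = np.hypot(gx, gy)                       # gradient magnitude, Eq. (9)
    return (g - g.min()) / (g.max() - g.min() + 1e-12)  # min-max scaling, Eq. (10)
```

On a synthetic step edge, the normalized gradient peaks at the edge columns and vanishes in the flat regions, which is exactly the cue used to localize the seam position.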

3. Experiment Results

To evaluate the flexibility and performance of the proposed seam detection system, it was mounted on a laser welding robot for the detection of weld seams of the tight square groove type. Figure 4a shows an actual photograph of the welding robot (KUKA 6-Axis Robot, Augsburg, Germany) equipped with the proposed seam tracking system. The working distance of the detection system was designed to be approximately 220 mm. The camera’s ROI was set to 200 pixels × 1000 pixels, corresponding to a field of view (FOV) of approximately 2.4 mm × 12 mm. The camera’s exposure time was set to 250 µs. The LEDs and the camera were synchronized using the control signal described in Figure 2c. The modulation frequency of the LEDs was set to approximately 500 Hz, while the image acquisition frequency was 1000 Hz, allowing a pair of DPC images to be acquired at 500 Hz. This acquisition speed was sufficient for real-time seam detection. The welding parameters are listed in Table 1. Before welding, the robot was programmed to keep the weld seam within the camera’s field of view. During welding, the detected seam position was continuously used to adjust the welding torch, ensuring precise alignment of the welding laser with the weld seam center. With a 50 mm focal length, an aperture of approximately f/4.8, and a working distance of around 220 mm, the estimated depth of field of our setup was approximately 5 mm. To minimize specimen distortion during the welding process, the fixture shown in Figure 4 was carefully designed. Since the depth of field was sufficient to maintain sharp images even with minor specimen distortion, we did not adjust the camera focus during welding. Additionally, to minimize the impact of welding light on the camera, the laser power and welding speed were carefully adjusted to control the light intensity.
An argon shielding gas flow was used to stabilize the process, and the camera’s field of view was positioned 5 cm away from the welding pool to reduce direct exposure. The specimen preparation, including the base metal thickness, seam width (also called the root opening), and groove angle, followed the standards established by the American Welding Society. Detection experiments were conducted on workpieces with varying seam widths and surface flatness, simulating real-world welding scenarios. The system’s anti-noise capability was also demonstrated when welding light was present in the acquired seam images.
The proposed DPC imaging method captures images under left-side and right-side illumination separately. The combination of these two single-side images can effectively simulate images acquired under double-side illumination. This means that both single-side and double-side illumination images can be obtained through DPC imaging. Single-side and double-side dark-field illumination are common illumination schemes in current weld seam detection systems. Without loss of generality, the DPC images are compared with both single-side and double-side illumination in the following sections to demonstrate the capability of the proposed method in enhancing seam features.

3.1. Detection of Seams with Different Widths and Surface Flatness

Workpieces with a single-square groove were prepared using two thin aluminum plates, each with a thickness of 2 mm. Two groups of workpieces with varying seam widths, textures, and surface flatness were used to comprehensively evaluate the flexibility of the proposed detection method. One group had a seam width of approximately 0.2 mm, while the other had a seam width of 0.06 mm. The second group’s surface texture and flatness differed from those of the first group.
The detection results for the first group are shown in Figure 5. Images captured under single-side illumination from the left and right sides are presented in Figure 5a and Figure 5b, respectively. Figure 5c is the combined image generated by summing the pixel values from Figure 5a,b. Specifically, for each pixel value Ia(n,m) and Ib(n,m) in Figure 5a,b, the corresponding pixel value Ic(n,m) in Figure 5c was obtained as Ic(n,m) = Ia(n,m) + Ib(n,m). Due to the high linearity of the camera sensor, this combined image is equivalent to an image captured under traditional double-side illumination, as shown in Figure 3b.
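The stated equivalence rests only on sensor linearity: summing the two single-side frames pixel by pixel emulates one exposure under double-side illumination. A minimal NumPy sketch (illustrative; casting to float before summing, so two 8-bit frames cannot wrap around, is our addition):

```python
import numpy as np

def simulate_double_side(i_a: np.ndarray, i_b: np.ndarray) -> np.ndarray:
    """I_c(n,m) = I_a(n,m) + I_b(n,m): emulated double-side illumination image.

    Cast to float first so the sum of two 8-bit frames cannot overflow.
    """
    return i_a.astype(np.float64) + i_b.astype(np.float64)
```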
Figure 5d shows the DPC image generated by subtracting Figure 5b from Figure 5a using Equation (6). To visualize the seam edge contrast, the gradients within the squared regions in Figure 5a–d were calculated and are shown in Figure 5e–h based on the image processing method described in Section 2.3.
The gradient distribution on the left and right sides of the seam in Figure 5e,f is uneven: Figure 5e has a higher gradient on the left edge, while Figure 5f shows a higher gradient on the right edge. In contrast, Figure 5g,h display more balanced gradients for both seam edges. This demonstrates that double-side dark-field illumination provides better seam contrast compared to single-side illumination, which aligns with common expectations. Additionally, background noise between the seam edges is more evident in Figure 5f,g compared to Figure 5h.
To quantitatively analyze the gradient, the mean gradient values between the dashed lines in Figure 5e–h were normalized by applying Equation (10) and are plotted in Figure 5i. A zoomed-in view of the edge area is shown in Figure 5j. As indicated by the arrows in Figure 5i, edge detection based on peak gradient values failed for the left-side (green) and double-side (black) illumination images. Although the right-side illumination (blue) and differential illumination (red) successfully located the seam edge using the peak gradient, the gradient at the opposing edge under right-side illumination was much lower than that achieved with differential illumination, as shown in Figure 5j.
In conclusion, the DPC image provided the highest seam contrast in terms of the seam edge gradient and demonstrated superior performance for seam edge detection compared to the other three illumination modes.
For the second demonstration, a workpiece with relatively non-uniform flatness was intentionally selected to showcase the flexibility of the proposed method. The detection results are shown in Figure 6. In practice, it is challenging to ensure that all workpieces have uniform flatness. Specular reflection can still occur in seam images, even with a detection system designed based on the dark-field imaging principle.
The images acquired under single-side (left or right) illumination are shown in Figure 6a,b. Figure 6c presents the sum of Figure 6a,b. Figure 6d shows the DPC image generated by subtracting Figure 6b from Figure 6a, as defined by Equation (6). Due to the presence of specular reflection, the seam edges in Figure 6b are visually indistinguishable.
The gradient of the highlighted square areas in Figure 6a–d is shown in Figure 6e–h, calculated using the image processing method described in Section 2.3. Under single-side illumination, only one seam edge is visible, as illustrated in Figure 6e,f. Although both edges can be distinguished with double-sided illumination in Figure 6g, the left edge appears visually weaker compared to the edges in Figure 6h.
To quantitatively assess the gradient, the mean gradient values between the dashed lines in Figure 6e–h were normalized by applying Equation (10) and are plotted in Figure 6i. A zoomed-in view of the edge areas is shown in Figure 6j. All four imaging modes accurately located the seam edges based on the peak gradient values. However, the gradient values of the seam edges under differential illumination (red line) were the highest, demonstrating superior performance.

3.2. Anti-Noise Experiment

To evaluate the feasibility of the proposed method, a detection experiment was conducted under welding light noise. As shown in Figure 7a–c, welding light filled the area between the seam edges, resulting in pixel saturation. The gradient images of the square areas in Figure 7a–d were processed using the same procedure described in the previous section and are presented in Figure 7e–h. In Figure 7e–g, the boundaries of the welding light noise dominate, leading to false edge detections. However, the true edges of the weld seam can be clearly distinguished in Figure 7h.
To analyze the gradient distribution more quantitatively, the mean gradient values between the dashed lines in Figure 7e–h were normalized by applying Equation (10) and are plotted in Figure 7i, with a zoomed-in view shown in Figure 7j. As shown in Figure 7j, the true edges from the DPC seam image can be identified by the peak gradient values. Additionally, the normalized gradient values for the real edges in the DPC image are even higher than the false edges detected with either single-side or double-side illumination.
Since the DPC image is generated by subtracting one image from another, the welding light noise between the seam edges is effectively removed. This straightforward and efficient noise removal method is a key advantage of using the DPC approach in seam detection, enhancing its competitiveness.

3.3. Evaluation of Seam Detection Accuracy

In addition to comparing the seam contrast under different types of illumination, we evaluated the measurement accuracy of the proposed method. To verify the seam edge detection accuracy, the weld seam width at 30 different locations on two specimens was measured before welding using a commercial measuring apparatus (TESA-VISIO 300, Aarau, Switzerland). These measurement results were then compared with the results obtained using the proposed setup, as shown in Table 2. The mean absolute errors were less than 0.017 mm, and the maximum error was less than 0.036 mm. The standard deviation of the absolute error was less than 0.015 mm. The observed mean absolute error of 0.017 mm is likely due to the resolution of our optical setup, where each camera pixel corresponds to 0.012 mm. Overall, the proposed method demonstrated good accuracy in seam width detection under the tested conditions.

4. Discussion and Conclusions

Herein, we proposed a novel weld seam detection method using the DPC imaging technique for single-square groove weld seams. The DPC technique converts phase variations into intensity variations by calculating the difference between two images acquired under different illumination, yielding seam features with enhanced contrast.
By leveraging modern high-speed cameras and high-speed LED drivers, the proposed method captures DPC images at a high speed, avoiding motion artifacts. The existing vision sensors for weld seam detection primarily rely on light intensity information, discarding the valuable phase information. By retrieving the phase information, the proposed method provides unique and enhanced seam contrast, improving the seam edge gradients.
The effectiveness of the proposed method was demonstrated through side-by-side comparisons with single- and double-side illumination under varying seam sizes and surface flatness conditions. The DPC image consistently exhibited higher contrast than the images produced by conventional illumination techniques. Furthermore, whereas traditional vision-based seam detection methods struggle to locate seam positions in the presence of welding light noise, the DPC method removes this noise by taking the difference between two consecutively acquired images under differential illumination. With its ability to enhance seam contrast and eliminate spark noise, the method enables more accurate and robust seam position detection. The detection accuracy was verified against a commercial measurement apparatus, yielding a mean absolute error of approximately 0.017 mm. As part of future work, we plan to incorporate X-ray analysis to further evaluate the weld quality and provide a more comprehensive assessment of the proposed method's performance.
To deal with distortion during the welding process, we will also explore a focus-adjustable camera lens that automatically corrects the focus and ensures sharp images. To achieve a high frame rate and avoid motion artifacts, the FOV in the proposed method was limited to 2.4 mm × 12 mm; future improvements could employ cameras with higher throughput to expand the FOV. Currently, at least two images are required to reconstruct the DPC image. Further gains in image contrast and robustness could be achieved by integrating multiple LEDs at varying illumination angles to expand the spatial frequency coverage, strengthening the phase contrast and enabling more reliable seam detection under diverse welding conditions. To avoid motion artifacts and further increase the welding speed, a high-power LED light source can be employed, and the exposure time can be reduced to minimize the time lag between the two consecutive captures.
The system was designed under the dark-field imaging principle to accommodate the high reflectivity of aluminum workpieces. However, the DPC-based method can be seamlessly adapted to bright-field seam detection systems for non-metallic or less reflective workpieces. Additionally, it can be extended to other seam detection techniques that utilize structured light sources. Therefore, the proposed imaging technique has strong potential for integration into the commercial vision sensors used in automated welding systems, such as robotic-arm welding machines for laser welding. Enhancing the capability of existing vision sensors to detect narrow-gap weld seams could significantly improve welding accuracy and reliability in industrial applications.
In conclusion, the proposed method leverages the DPC technique to computationally recover phase information, providing unique and improved seam contrast. By directly removing welding light noise through differential image processing, this approach offers a powerful and innovative solution for accurate and robust weld seam detection.

Author Contributions

Y.Y. (Yanfang Yang): Conceptualization, Methodology, Investigation, Formal Analysis, Software, Writing—Original Draft; Y.Y. (Yonglu Yang): Data Curation, Visualization, Writing—Review and Editing; W.S.: Conceptualization, Investigation, Funding Acquisition. All authors have read and agreed to the published version of the manuscript.

Funding

This work was supported by the National Natural Science Foundation of China (Grant No. 62475202) and the Hubei Provincial Science and Technology Service Talent Program (Grant No. 2024DJC094).

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

The original contributions presented in this study are included in the article. Further inquiries can be directed to the corresponding author.

Acknowledgments

The authors are very grateful to the anonymous reviewers for their valuable comments and suggestions, which helped improve this paper.

Conflicts of Interest

The authors declare no conflicts of interest.

Nomenclature

The following nomenclature is used in this manuscript:
DPC	Differential phase contrast
OCT	Optical coherence tomography
LED	Light-emitting diode
GAN	Generative adversarial network
AFFNet	Attention-Enhanced Feature Fusion Network
PCB	Printed circuit board
ROI	Region of interest
FOV	Field of view

References

  1. Gao, Y.; Xiao, J.; Xiong, G.; Zhang, H. Research on the deviation sensing of V-groove weld seam based on a novel two channel acoustic sensor. Int. J. Adv. Manuf. Technol. 2022, 119, 5821–5837. [Google Scholar] [CrossRef]
  2. Lin, J.; Jia, A.; Huang, W.; Wen, Z.; Hong, B.; Hong, Y. Weld seam tracking method of root pass welding with variable gap based on magnetically controlled arc sensor. Int. J. Adv. Manuf. Technol. 2023, 126, 5227–5243. [Google Scholar] [CrossRef]
  3. Li, Y.; Gao, X.; Chen, Y.; Zhou, X.; Zhang, Y.; You, D. Modeling for tracking micro gap weld based on magneto-optical sensing and Kalman filtering. IEEE Sens. J. 2020, 21, 11598–11614. [Google Scholar] [CrossRef]
  4. Upadhyay, S.O. Robotic Tactile Sensor Probe used for Contour plotting of Weld Joint in Seam Tracking System. Int. Res. J. Eng. Technol. 2023, 10, 962–982. [Google Scholar]
  5. Xia, L.; Zhou, J.; Xue, R.; Li, X.; Liu, H. Real-time seam tracking during narrow gap GMAW process based on the wide dynamic vision sensing method. J. Manuf. Process. 2023, 101, 820–834. [Google Scholar] [CrossRef]
  6. Liu, C.; Wang, H.; Huang, Y.; Rong, Y.; Meng, J.; Li, G.; Zhang, G. Welding seam recognition and tracking for a novel mobile welding robot based on multi-layer sensing strategy. Meas. Sci. Technol. 2022, 33, 055109. [Google Scholar] [CrossRef]
  7. Sharma, A.; Chaturvedi, R.; Sharma, K.; Binhowimal, S.A.; Giri, J.; Sathish, T. Enhancing weld quality of novel robotic-arm arc welding: Vision-based monitoring, real-time control seam tracking. Ain. Shams. Eng. J. 2024, 15, 103109. [Google Scholar] [CrossRef]
  8. Ge, J.; Deng, Z.; Li, Z.; Liu, T.; Zhuo, R.; Chen, X. Adaptive parameter optimization approach for robotic grinding of weld seam based on laser vision sensor. Robot. Comput. Integr. Manuf. 2023, 82, 102540. [Google Scholar] [CrossRef]
  9. Shi, Y.; Wang, Z.; Liang, Z.; Li, B.; Lin, Z. A welding seam tracking algorithm adaptive to variable groove type: An interactive segmentation passive vision method. Opt. Laser Technol. 2025, 181, 111861. [Google Scholar] [CrossRef]
  10. Boldrin, D.M.; Tosatti, L.M.; Previtali, B.; Demir, A.G. Seam tracking and gap bridging during robotic laser beam welding via grayscale imaging and wobbling. Robot. Comput. Integr. Manuf. 2024, 89, 102774. [Google Scholar] [CrossRef]
  11. Ma, Y.; Fan, J.; Yang, H.; Yang, L.; Ji, Z.; Jing, F.; Tan, M. A fast and robust seam tracking method for spatial circular weld based on laser visual sensor. IEEE Trans. Instrum. Meas. 2021, 70, 1–11. [Google Scholar] [CrossRef]
  12. Nguyen, Q.C.; Hua, H.Q.; Pham, P.T. Development of a vision system integrated with industrial robots for online weld seam tracking. J. Manuf. Process. 2024, 119, 414–424. [Google Scholar] [CrossRef]
  13. Wang, W.; Shi, Y.; Li, C.; Gu, Y. Feature information extraction method for narrow gap U-type groove based on laser vision. J. Manuf. Process. 2023, 104, 257–272. [Google Scholar] [CrossRef]
  14. Li, D.; Wang, M.; Wang, S.; Zhao, H. Research and development of weld tracking system based on laser vision. Meas. Control 2022, 55, 1124–1133. [Google Scholar] [CrossRef]
  15. Cibicik, A.; Njaastad, E.B.; Tingelstad, L.; Egeland, O. Robotic weld groove scanning for large tubular T-joints using a line laser sensor. Int. J. Adv. Manuf. Technol. 2022, 120, 4525–4538. [Google Scholar] [CrossRef]
  16. He, Y.; Cai, R.; Dai, F.; Yu, Z.; Deng, Y.; Deng, J.; Wang, Z.; Ma, G.; Zhong, W. A unified framework based on semantic segmentation for extraction of weld seam profiles with typical joints. J. Manuf. Process. 2024, 131, 2275–2287. [Google Scholar] [CrossRef]
  17. Liu, H.; Tian, Y.; Li, L.; Lu, Y.; Xi, F. One-shot, integrated positioning for welding initial points via co-mapping of cross and parallel stripes. Robot. Comput. Integr. Manuf. 2023, 84, 102602. [Google Scholar] [CrossRef]
  18. Wang, T.; Meng, K.; Cui, L.; Li, X. Weld tracking technology for all-position welding of pipes based on laser vision. Opt. Lasers Eng. 2025, 188, 108912. [Google Scholar] [CrossRef]
  19. Xu, F.; Hou, Z.; Xiao, R.; Xu, Y.; Wang, Q.; Zhang, H. A novel welding path generation method for robotic multi-layer multi-pass welding based on weld seam feature point. Measurement 2023, 216, 112910. [Google Scholar] [CrossRef]
  20. Johan, N.F.; Mohd Shah, H.N.; Sulaiman, M.; Naji, O.A.; Arshad, M.A. Weld seam feature point extraction using laser and vision sensor. Int. J. Adv. Manuf. Technol. 2023, 127, 5155–5170. [Google Scholar] [CrossRef]
  21. Chen, S.; Liu, J.; Chen, B.; Suo, X. Universal fillet weld joint recognition and positioning for robot welding using structured light. Robot. Comput. Integr. Manuf. 2022, 74, 102279. [Google Scholar] [CrossRef]
  22. Wang, J.; Zhang, X.; Liu, J.; Shi, Y.; Huang, Y. Line-structured light fillet weld positioning method to overcome weld instability due to high specular reflection. Machines 2022, 11, 38. [Google Scholar] [CrossRef]
  23. Ma, Y.; Fan, J.; Yang, H.; Wang, H.; Xing, S.; Jing, F.; Tan, M. An efficient and robust complex weld seam feature point extraction method for seam tracking and posture adjustment. IEEE Trans. Ind. Inform. 2023, 19, 10704–10715. [Google Scholar] [CrossRef]
  24. Shao, W.; Huang, Y. A Robust Weld Seam Tracking Method by Using Intensity-Modulated Structured Light. IEEE Sens. J. 2021, 21, 5368–5376. [Google Scholar] [CrossRef]
  25. Xiao, R.; Xu, Y.; Xu, F.; Hou, Z.; Zhang, H.; Chen, S. LSFP-tracker: An autonomous laser stripe feature point extraction algorithm based on siamese network for robotic welding seam tracking. IEEE Trans. Ind. Electron. 2023, 71, 1037–1048. [Google Scholar] [CrossRef]
  26. Xu, F.; He, L.; Hou, Z.; Xiao, R.; Zuo, T.; Li, J.; Xu, Y.; Zhang, H. An automatic feature point extraction method based on laser vision for robotic multi-layer multi-pass weld seam tracking. Int. J. Adv. Manuf. Technol. 2024, 131, 5941–5960. [Google Scholar] [CrossRef]
  27. Wang, P.J.; Shao, W.J.; Gong, S.H.; Jia, P.J.; Li, G. High-precision measurement of weld seam based on narrow depth of field lens in laser welding. Sci. Technol. Weld. Join. 2016, 21, 267–274. [Google Scholar] [CrossRef]
  28. Nilsen, M.; Sikström, F.; Christiansson, A.-K.; Ancona, A. Robust vision-based joint tracking for laser welding of curved closed-square-butt joints. Int. J. Adv. Manuf. Technol. 2019, 101, 1967–1978. [Google Scholar] [CrossRef]
  29. Nilsen, M.; Sikström, F.; Christiansson, A.-K. Adaptive control of the filler wire rate during laser beam welding of squared butt joints with varying gap width. Int. J. Adv. Manuf. Technol. 2019, 102, 3667–3676. [Google Scholar] [CrossRef]
  30. Mattulat, T. Understanding the coaxial optical coherence tomography signal during the laser welding of hidden T-joints. J. Laser Appl. 2024, 36, 012003. [Google Scholar] [CrossRef]
  31. Jiang, Z.; Jiang, Z.; Ding, H. Optical coherence tomography for laser welding applications. Optik 2022, 257, 168784. [Google Scholar] [CrossRef]
  32. Xu, F.; Xu, Y.; Zhang, H.; Chen, S. Application of sensing technology in intelligent robotic arc welding: A review. J. Manuf. Process. 2022, 79, 854–880. [Google Scholar] [CrossRef]
  33. Xiao, R.; Cao, Q.; Chen, S. A novel laser stripe key point tracker based on self-supervised learning and improved KCF for robotic welding seam tracking. J. Manuf. Process. 2024, 127, 660–670. [Google Scholar] [CrossRef]
  34. Tan, Z.; Zhao, B.; Ji, Y.; Xu, X.; Kong, Z.; Liu, T.; Luo, M. A welding seam positioning method based on polarization 3D reconstruction and linear structured light imaging. Opt. Laser Technol. 2022, 151, 108046. [Google Scholar] [CrossRef]
  35. Zhang, G.; Huang, J.; Wu, Y.; Yang, G.; Di, S.; Yuan, H.; Cao, X.; Shin, K. A novel 3D complex welding seam tracking method in symmetrical robotic MAG welding process using a laser vision sensing. Symmetry 2023, 15, 1093. [Google Scholar] [CrossRef]
  36. Gao, X.; Chen, Y.; You, D.; Xiao, Z.; Chen, X. Detection of micro gap weld joint by using magneto-optical imaging and Kalman filtering compensated with RBF neural network. Mech. Syst. Signal Process. 2017, 84, 570–583. [Google Scholar] [CrossRef]
  37. Shao, W.; Rong, Y.; Huang, Y. Image contrast enhancement and denoising in micro-gap weld seam detection by periodic wide-field illumination. J. Manuf. Process. 2022, 75, 792–801. [Google Scholar] [CrossRef]
  38. He, Z.; Pei, Z.; Li, E.; Zhou, E.; Huang, Z.; Xing, Z.; Li, B. An image segmentation-based localization method for detecting weld seams. Adv. Eng. Softw. 2024, 194, 103662. [Google Scholar] [CrossRef]
  39. Shao, W.; Liu, X.; Wu, Z. A robust weld seam detection method based on particle filter for laser welding by using a passive vision sensor. Int. J. Adv. Manuf. Technol. 2019, 104, 2971–2980. [Google Scholar] [CrossRef]
  40. Singh, A.; Kalaichelvi, V.; Dsouza, A.; Karthikeyan, R. GAN-Based image dehazing for intelligent weld shape classification and tracing using deep learning. Appl. Sci. 2022, 12, 6860. [Google Scholar] [CrossRef]
  41. Mobaraki, M.; Ahani, S.; Gonzalez, R.; Yi, K.M.; Van Heusden, K.; Dumont, G.A. Vision-based seam tracking for GMAW fillet welding based on keypoint detection deep learning model. J. Manuf. Process. 2024, 117, 315–328. [Google Scholar] [CrossRef]
  42. Hong, Y.; Jiang, Y.; Yang, M.; Chang, B.; Dong, D.U. Intelligent seam tracking in foils joining based on spatial–temporal deep learning from molten pool serial images. Robot. Comput. Integr. Manuf. 2025, 91, 102840. [Google Scholar] [CrossRef]
  43. Hamilton, D.K.; Sheppard, C.J.R. Differential phase contrast in scanning optical microscopy. J. Microsc. 1984, 133, 27–39. [Google Scholar] [CrossRef]
  44. Amos, W.B.; Reichelt, S.; Cattermole, D.M.; Laufer, J. Re-evaluation of differential phase contrast (DPC) in a scanning laser microscope using a split detector as an alternative to differential interference contrast (DIC) optics. J. Microsc. 2003, 210, 166–175. [Google Scholar] [CrossRef] [PubMed]
  45. Mehta, S.B.; Sheppard, C.J.R. Quantitative phase-gradient imaging at high resolution with asymmetric illumination-based differential phase contrast. Opt. Lett. 2009, 34, 1924–1926. [Google Scholar] [CrossRef]
  46. Tian, L.; Waller, L. Quantitative differential phase contrast imaging in an LED array microscope. Opt. Express. 2015, 23, 11394–11403. [Google Scholar] [CrossRef] [PubMed]
  47. Bonati, C.; Laforest, T.; Kunzi, M.; Moser, C. Phase sensitivity in differential phase contrast microscopy: Limits and strategies to improve it. Opt. Express 2020, 28, 33767–33783. [Google Scholar] [CrossRef]
  48. Song, W.; Matlock, A.; Fu, S.; Qin, X.; Feng, H.; Gabel, C.V.; Tian, L.; Yi, J. LED array reflectance microscopy for scattering-based multi-contrast imaging. Opt. Lett. 2020, 45, 1647–1650. [Google Scholar] [CrossRef]
  49. Sobel, I. An Isotropic 3×3 Image Gradient Operator. Presentation at the Stanford Artificial Intelligence Project, 1968. Available online: https://www.researchgate.net/publication/239398674_An_Isotropic_3x3_Image_Gradient_Operator (accessed on 11 March 2025).
  50. Gonzalez, R.C.; Woods, R.E. Digital Image Processing, 4th ed.; Pearson Education: New York, NY, USA, 2018. [Google Scholar]
Figure 1. Representative state-of-the-art welding seam tracking technologies: (a) structure of a two-channel acoustic sensor for V-groove weld seam tracking [1]; (b) 3D model of a magnetically controlled arc sensor for detecting variable-gap seams [2]; (c) schematic of a magneto-optical imaging sensor for micro-gap weld seam tracking [3]; (d) real photograph of a laser triangulation sensor for spatial circular weld seam tracking [11].
Figure 2. The experimental setup: (a) a three-dimensional model of the system, where red circle represents the welding point, and green circle represents the camera’s field of view; (b) an actual photograph of the system; (c) the control signal for the LEDs and camera, where U1 represents the control voltage for the camera, U2 represents the control voltage for the left LED, and U3 represents the control voltage for the right LED; (d) the concept of the proposed method.
Figure 3. Schematics of the proposed DPC imaging method and comparisons with typical seam detection methods: (a,b) the proposed seam detection method based on DPC imaging; (c) a seam detection method with double-side illumination in dark-field mode; (d) a seam detection method with single-side illumination in bright-field mode; (e) a seam detection method combining structured light illumination and dark-field illumination.
Figure 4. The experimental setup: (a) an actual photograph of the welding system; (b) a zoomed-in view of panel a, highlighting the workpiece.
Figure 5. Detection results for a weld seam with a width of 0.2 mm: (a) a seam image acquired under left LED illumination; (b) a seam image acquired under right LED illumination; (c) the sum of the left and right images (equivalent to illumination from both sides); (d) the DPC image calculated as the difference between the images in (a,b) using Equation (6); (eh) gradient images of the square areas in panels (ad); (i) profiles of the average gradient intensity between the dashed lines in panels (eh), where Gnorm is the normalized gradient; (j) a zoomed-in view of (i).
Figure 6. Detection results for a weld seam with a width of 0.06 mm (seam surface with non-uniform flatness): (a) a seam image acquired under left LED illumination; (b) a seam image acquired under right LED illumination; (c) the sum of the left and right images (equivalent to illumination from both sides); (d) the DPC image calculated as the difference between the images in (a,b) using Equation (6); (e–h) gradient images of the square areas in panels (a–d); (i) profiles of the average gradient intensity between the dashed lines in panels (e–h), where Gnorm is the normalized gradient; (j) a zoomed-in view of (i).
Figure 7. Anti-noise experiment: (a) a seam image acquired under left LED illumination; (b) a seam image acquired under right LED illumination; (c) the sum of the left and right images (equivalent to illumination from both sides); (d) the DPC image calculated as the difference between the images in (a,b) using Equation (6); (eh) gradient images of the square areas in panels (ad); (i) profiles of the average gradient intensity between the dashed lines in panels (eh), where Gnorm is the normalized gradient; (j) a zoomed-in view of panel (i).
Table 1. Welding parameters.
Item	Description
Weld groove type	Tight single-square groove (seam width: ~0.06 mm and ~0.2 mm; groove angle: 0°)
Seam type	Straight line
Material of the workpiece	Aluminum plate (type 6061; thickness: 2 mm)
Laser power	1.5 kW
Welding speed	1 m/min
Table 2. Measurement accuracy evaluation.
Item	Mean Absolute Error [mm]	Maximum Error [mm]	Standard Deviation [mm]
Specimen with 0.2 mm seam width	0.016	0.036	0.015
Specimen with 0.06 mm seam width	0.017	0.036	0.014
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
