1. Introduction
The intelligentization of robotic welding represents a key trend in the development of advanced manufacturing technologies, with 3D vision-based groove detection playing a crucial role in enabling autonomous welding. However, in the welding of medium and thick plates, interferences such as arc radiation, spatter, and environmental vibration are common and introduce non-uniform noise into the acquired 3D point cloud data [1,2,3]. To handle such complex working conditions, existing visual detection algorithms typically require dedicated denoising modules for each interference type, whose threshold parameters are often determined through empirical experience or repeated experimentation; this severely restricts their generalizability and engineering efficiency [4,5,6,7]. Consequently, it is necessary to establish a quantitative evaluation method for interference data, which can provide a reliable basis for the adaptive adjustment of algorithm parameters and thereby significantly enhance the robustness and intelligent capabilities of detection systems in practical engineering scenarios. Accurate groove identification is essential for downstream welding tasks such as seam tracking, torch/path planning, and quality assurance; grooves misidentified under interference can directly lead to trajectory deviation and weld defects.
Currently, there is no established method for evaluating the degree of interference in welding groove detection data. Research in this area typically focuses on assessing the regional dispersion or stability of 3D detection data [8,9,10,11,12]. The selection of evaluation data types and the method used to describe dispersion significantly influence the effectiveness of these evaluation methods in welding groove detection [13,14,15,16]. In terms of data type selection, current approaches often rely on the Z-coordinate height of the detection data, using height differences to assess interference levels [17,18,19,20,21,22,23]. These methods are highly sensitive to Z-axis height values, which makes it difficult to distinguish inclination features from actual noise, leading to potential misinterpretations. Furthermore, they adapt poorly to specific welding groove features such as deep gaps and rounded corners. Additionally, when the dispersion of height is large but the overall height is relatively small, these methods often fail to identify data instability, resulting in missed detections. Beyond welding, robust damage/anomaly detection in complex environments has also been investigated in structural engineering, where measurement noise and incomplete observations are common; such studies further motivate noise-resilient descriptors for 3D data analysis [24]. Although some approaches use Principal Component Analysis (PCA) to select the most prominent projection plane, calculating height relative to a reference plane to enhance feature significance, the aforementioned issues remain unaddressed [25,26].
Additionally, there is a lack of research focused on describing the dispersion of welding groove 3D detection data. In engineering applications, the quantitative description of this dispersion is typically based on the experience of engineers or on morphological indicators. However, engineering experience tends to be highly context-specific, which limits its general applicability, while morphological indicators, derived from idealized mathematical models, are often impractical in real-world scenarios [27,28,29]. Moreover, the choice of which data feature to use for measuring dispersion remains unclear, with various attempts involving metrics such as variance and covariance [30,31,32,33]. In response, the present study proposes using the eigenvalues of the covariance matrix as a means of quantifying the dispersion of welding groove 3D detection data. This approach offers improved geometric invariance, captures the dispersion of the data more accurately, and maintains greater stability across varied application scenarios.
In summary, existing studies relevant to this paper mainly fall into three categories: (i) point-cloud noise assessment and quality metrics that use statistics or angular similarity to quantify stability; (ii) point-cloud denoising and robust surface/normal estimation under outliers and non-uniform noise; and (iii) welding groove perception pipelines that require threshold selection for segmentation, recognition, and feature extraction. While these lines of work provide useful tools, they do not directly address the engineering need of interference grading for groove detection data that can separate noise from genuine groove geometries and provide a rotation-stable criterion for parameter tuning.
To address these challenges, this paper proposes a method for evaluating the degree of interference in welding groove 3D detection data based on the angle of reconstructed triangular patches and develops an accompanying detection and reconstruction visualization system. This method innovatively calculates the eigenvalues of the covariance matrix of the angles between the triangular patches and the horizontal plane, establishing a dispersion quantification model with rotational invariance. This model effectively overcomes the sensitivity of traditional height-difference indicators to Z-directional noise. The proposed system features a five-layer processing architecture: the data acquisition layer facilitates 3D detection, the evaluation layer performs interference grading and visualization, the feature processing layer supports the integration of modular algorithms, the evaluation feedback layer provides visual decision support, and the reconstruction layer generates multidimensional models for further analysis. Experimental results demonstrate that the proposed method can identify more regions with high dispersion compared to traditional height-difference-based evaluation methods and exhibits superior adaptability to common welding groove features, such as deep gaps and rounded corners. From the perspective of symmetry, the proposed evaluation model can be regarded as a symmetry-preserving and invariant representation of local geometric dispersion in noisy groove point clouds.
The main contributions of this work are: (1) an eigenvalue-based interference quantification descriptor is proposed, built from reconstructed triangular patch angles, providing a translation-invariant and Z-rotation-stable dispersion measure for noisy groove point clouds; (2) a sliding-window evaluation algorithm is designed, and its computational complexity is analyzed, enabling reproducible deployment in streaming robotic welding scenarios; (3) a five-layer detection and reconstruction visualization system is developed, integrating modular recognition/extraction/anomaly removal algorithms and supporting interference-aware parameter tuning; and (4) the method is validated on real industrial datasets, demonstrating improved identification of severely interfered regions and better adaptability to representative groove geometries (deep gaps, rounded corners).
2. Proposed Methodology: Eigenvalue-Based Interference Evaluation
2.1. Triangular Patch Reconstruction and Angle Computation
The 3D detection data of welding grooves is primarily affected by localized pulse noise. To quantify the degree of this interference, a measure of dispersion is defined. The calculation process for this dispersion is outlined as follows: First, a square window is selected with a data point at its center, and the angles between the reconstructed triangular patches of the point cloud and the horizontal plane within the window are computed. These angles form a data set. Next, the covariance matrix eigenvalues of the data set are determined, and the L2-norm of the vector composed of these eigenvalues is calculated. This L2-norm represents the degree of dispersion at the given location.
Specifically, the 3D scanning results of welding groove detection are represented as a mesh point cloud of dimensions m × n. For each data point, a k × k square window centered on that point is selected sequentially, so that the number of data points within each window is k². The point cloud within each window is used to assess the degree of dispersion at the center of the window. This process is illustrated in Figure 1.
The window size k controls the spatial support used to estimate local dispersion. A small k makes the descriptor more sensitive to localized pulse noise but also increases variance and may overreact to isolated outliers; a large k improves statistical stability by aggregating more reconstructed patches but may smooth over small-scale defects and increase computational cost. In practice, k should be chosen according to the expected spatial scale of interference and the point spacing of the mesh scan. An odd k is recommended so that the evaluated point lies at the window center, and k should be selected such that the window covers at least one characteristic width of the smallest groove feature that should be preserved (e.g., rounded corners or deep gaps).
For each point cloud within a window, triangular reconstructions are made between each point and its adjacent points: the left and upper neighbors, as well as the right and lower neighbors. The number of triangular patches reconstructed within a window is therefore 2(k − 1)². Each point and its top-left and bottom-right neighbors are reconstructed into two triangular patches, and the entire point cloud within the window is thus reconstructed into a continuous 3D surface that is complete and without gaps. This process is shown in Figure 2.
The angles between all reconstructed triangular patches and the horizontal plane are then computed. To calculate each angle, the normal vector of the triangular patch is required. Let the spatial coordinates of the three vertices of a reconstructed triangular patch be P1 = (x1, y1, z1), P2 = (x2, y2, z2), and P3 = (x3, y3, z3). The normal vector n of the triangular patch is given by Equation (1):

n = (P2 − P1) × (P3 − P1)        (1)

The normal vector of the horizontal plane is n0 = (0, 0, 1), so the angle θ between the triangular patch and the horizontal plane can be expressed as Equation (2):

θ = arccos( |n · n0| / (‖n‖ ‖n0‖) )        (2)

The triangular patch normal is computed using the standard cross-product formulation, and the angle to a reference plane is obtained via the dot-product definition (Equations (1) and (2)); these are widely used in geometric processing and surface normal estimation in point clouds [15,26].
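As a concrete check of Equations (1) and (2), the normal and angle computation can be sketched in a few lines of NumPy; the function name and interface here are illustrative, not part of the described system.

```python
import numpy as np

def patch_angle(p1, p2, p3):
    """Angle (radians) between the triangle (p1, p2, p3) and the horizontal plane.

    Implements Equations (1) and (2): the patch normal is the cross product of
    two edge vectors, and the angle is measured against the plane normal (0, 0, 1).
    """
    p1, p2, p3 = (np.asarray(p, dtype=float) for p in (p1, p2, p3))
    n = np.cross(p2 - p1, p3 - p1)               # Equation (1): patch normal
    n0 = np.array([0.0, 0.0, 1.0])               # normal of the horizontal plane
    cos_t = abs(n @ n0) / (np.linalg.norm(n) * np.linalg.norm(n0))
    return np.arccos(np.clip(cos_t, 0.0, 1.0))   # Equation (2), clipped for safety
```

Taking the absolute value of the dot product keeps the result in [0°, 90°], so the angle does not depend on the triangle's vertex ordering.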
2.2. Covariance, Eigenvalues, and Dispersion Definition
To describe the positions of these angles, the position of each reconstructed patch is defined as the centroid of the triangular patch in the X and Y directions; this position is paired with the patch's angle to the horizontal plane. After calculating all the angles within a window, they are collected into a 2(k − 1)²-dimensional vector, which, along with the X and Y positions, forms 2(k − 1)² three-dimensional vectors. These vectors vᵢ are represented as Equation (3):

vᵢ = (xᵢ, yᵢ, θᵢ)ᵀ        (3)

where xᵢ and yᵢ are the X and Y coordinates of the centroid of the corresponding triangular patch, and θᵢ is the angle between the patch and the horizontal plane. The covariance matrix of these vectors is then computed, and its eigenvalues along the directions of its eigenvectors are obtained. These eigenvalues represent the variance along the most significant directions of change of the three-dimensional vectors composed of the X position, Y position, and angle; they quantify the degree of dispersion within the window and hence the interference level at this location. Since the data are three-dimensional, the covariance matrix is of size 3 × 3, with three eigenvectors and three eigenvalues, denoted λ₁, λ₂, and λ₃. To directly quantify the degree of dispersion, the L2-norm of λ = (λ₁, λ₂, λ₃), ‖λ‖₂, is calculated as Equation (4):

D = ‖λ‖₂ = √(λ₁² + λ₂² + λ₃²)        (4)

where D is the degree of dispersion. Using the covariance matrix and its eigenvalues to summarize second-order dispersion is consistent with PCA-style statistical descriptions and covariance-based variability measures [12,30]. Based on the above process, the interference degree at a location is determined by the stability of the angles of all reconstructed patches along the most significant directions of change. A decrease in stability along certain directions leads to an increase in overall interference; when stability is low in all directions, the overall interference degree is high. Following this procedure, the interference degree can be calculated for all data locations. The sliding-window interference detection algorithm based on point cloud reconstruction angles is detailed in Algorithm 1.
For each center point, a k × k neighborhood contains k² samples and yields 2(k − 1)² reconstructed triangular patches (each interior grid cell contributes two patches). The per-patch operations (normal computation, angle evaluation, centroid computation) are O(1), so forming the feature set is O(k²) per window. The covariance matrix is 3 × 3 (Equation (3)), and its eigen-decomposition has constant cost. Therefore, for an m × n mesh, the overall time complexity is O(mnk²) and the memory complexity is O(mn) for a straightforward implementation. In practice, the computation can be accelerated by streaming evaluation (line by line) and by reusing intermediate neighborhood buffers.
| Algorithm 1. Sliding window interference detection algorithm based on point cloud reconstruction angle |
| Input: welding groove 3D detection data P (an m × n mesh point cloud); sliding window size k |
| Output: interference degree D of the detection data |
| 1: Initialize D |
| 2: Begin |
| 3: For each data point p in the m × n mesh do |
| 4:   Set a square window of size k × k centered at p |
| 5:   Reconstruct the point cloud within the window into triangular patches, obtaining 2(k − 1)² patches Tᵢ |
| 6:   For i from 1 to 2(k − 1)² do |
| 7:     Calculate the normal vector nᵢ of Tᵢ |
| 8:     Calculate the angle θᵢ between Tᵢ and the horizontal plane |
| 9:     Calculate the centroid coordinates (xᵢ, yᵢ) of Tᵢ |
| 10:    Construct the three-dimensional vector vᵢ from xᵢ, yᵢ, and θᵢ |
| 11:  End for |
| 12:  Obtain the 2(k − 1)² three-dimensional vectors |
| 13:  Compute the eigenvalues λ₁, λ₂, λ₃ of their covariance matrix |
| 14:  Compute D(p) as the L2-norm of (λ₁, λ₂, λ₃) |
| 15: End for |
| 16: Return D |
| 17: End |
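Under the conventions described above (a regular mesh window with two triangular patches per grid cell), the per-window computation of Algorithm 1 can be sketched as follows. This is a minimal NumPy illustration assuming a k × k window of XYZ points, not the authors' implementation.

```python
import numpy as np

def window_dispersion(window):
    """Dispersion D for one k x k mesh window (Equations (3)-(4)).

    `window` is a (k, k, 3) array of XYZ points on a regular mesh. Each grid
    cell is split into two triangles; for each triangle, the (x, y) centroid
    and the angle to the horizontal plane form a 3-D feature vector.
    """
    k = window.shape[0]
    n0 = np.array([0.0, 0.0, 1.0])
    feats = []
    for i in range(k - 1):
        for j in range(k - 1):
            a, b = window[i, j], window[i, j + 1]
            c, d = window[i + 1, j], window[i + 1, j + 1]
            for tri in ((a, b, c), (d, c, b)):          # two patches per cell
                n = np.cross(tri[1] - tri[0], tri[2] - tri[0])
                cos_t = abs(n @ n0) / np.linalg.norm(n)
                theta = np.arccos(np.clip(cos_t, 0.0, 1.0))
                cx = (tri[0][0] + tri[1][0] + tri[2][0]) / 3.0
                cy = (tri[0][1] + tri[1][1] + tri[2][1]) / 3.0
                feats.append((cx, cy, theta))           # Equation (3)
    cov = np.cov(np.array(feats).T)                     # 3 x 3 covariance matrix
    lam = np.linalg.eigvalsh(cov)                       # eigenvalues lambda_1..3
    return float(np.linalg.norm(lam))                   # Equation (4): ||lambda||_2
```

Because the sum of squared eigenvalues equals the squared Frobenius norm of the covariance matrix, adding angle variability to an otherwise identical window can only increase D, which matches the intended behavior of the descriptor.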
To facilitate data visualization, the degree of dispersion at each location is mapped to RGB values. The welding groove detection method used in this paper is 3D scanning, with data in the form of a mesh point cloud. Therefore, for each data point, the degree of dispersion is assigned based on the closest reconstructed triangular patch centroid. The visualization process is shown in
Figure 3.
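The color mapping itself can be sketched as a simple linear scaling; the blue-to-red convention below is an assumption for illustration, since the text only specifies that each point's dispersion is mapped to an RGB value.

```python
def dispersion_to_rgb(d, d_min, d_max):
    """Map a dispersion value linearly onto a blue-to-red RGB triple.

    Assumption: low dispersion -> blue (0, 0, 255), high dispersion -> red
    (255, 0, 0); the paper does not fix a particular color convention.
    """
    t = 0.0 if d_max == d_min else (d - d_min) / (d_max - d_min)
    t = min(max(t, 0.0), 1.0)  # clamp to [0, 1] against outliers
    return (int(round(255 * t)), 0, int(round(255 * (1 - t))))
```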
This method is invariant to translation along all three axes and to rotation about the Z-axis. It computes the angles between the reconstructed patches and the horizontal plane and uses the covariance matrix eigenvalues to quantify dispersion. Translation does not affect the relative values of the data coordinates, which ensures translation invariance. Since the data in this paper come from 3D scanning of welding grooves, interference manifests as distortion along the Z-direction, and the detection plane is parallel to the horizontal plane; the method therefore uses the angle relative to the horizontal plane to assess dispersion. This choice means the method is not invariant to rotations about the X or Y axes: tilting the data relative to the vertical direction will alter the computed interference degree. Rotation about the Z-axis, by contrast, does not affect the dispersion relative to the horizontal plane, so the method is invariant to such rotations. From a symmetry perspective, the proposed dispersion descriptor does not depend on the explicit orientation of the local coordinate frame within the horizontal plane: rotations about the Z-axis are in-plane coordinate changes that alter individual patch orientations but preserve the second-order statistical distribution captured by the covariance eigenvalues. Similar invariance principles have been adopted in prior point cloud and surface analysis methods, where rotation-insensitive statistics provide stable, coordinate-independent geometric descriptors. This property ensures that the method is unaffected by the scanning direction of the equipment during detection. Meanwhile, to mitigate the limitation under tilt, a robust PCA/RANSAC fit can be employed within each neighborhood to estimate a local reference plane; the patch angles can then be calculated relative to this reference plane, enhancing robustness under X/Y tilt conditions.
From the perspective of symmetry, the proposed interference evaluation can be interpreted as constructing a symmetry-preserving descriptor of local geometric dispersion. Specifically, by encoding the point cloud neighborhood through reconstructed patch angles and summarizing their distribution using the eigenvalues of a covariance matrix, the resulting dispersion measure exhibits invariance to translation and remains stable under rotations about the Z-axis. Such geometric invariance reflects an underlying symmetry with respect to coordinate transformations, which contributes to the robustness and generalizability of the proposed interference quantification method.
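The Z-rotation stability of the angle component can be verified numerically: rotating a patch about the Z-axis leaves its angle to the horizontal plane unchanged, since the Z-component of the normal is preserved. The snippet below is a self-contained check, not part of the described system.

```python
import numpy as np

def angle_to_horizontal(p1, p2, p3):
    # Angle between the triangle's plane and the horizontal plane (Equations (1)-(2)).
    n = np.cross(p2 - p1, p3 - p1)
    c = abs(n[2]) / np.linalg.norm(n)        # n . (0, 0, 1) = n_z
    return np.arccos(np.clip(c, 0.0, 1.0))

def rot_z(phi):
    # Rotation matrix about the Z-axis by angle phi.
    c, s = np.cos(phi), np.sin(phi)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

# A tilted patch, rotated in-plane: the angle to the horizontal plane is unchanged.
tri = [np.array([0.0, 0.0, 0.2]), np.array([1.0, 0.0, 0.7]), np.array([0.0, 1.0, 0.1])]
R = rot_z(0.83)
before = angle_to_horizontal(*tri)
after = angle_to_horizontal(*(R @ p for p in tri))
```

Since the covariance eigenvalues are invariant under a similarity transform of the (x, y) block, the same stability carries over to the full dispersion measure D.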
3. Welding Groove Detection and Reconstruction Visualization System
The constructed five-layer system is based on the weld groove detection data evaluation algorithm proposed in the previous section. The system architecture is divided into five processing layers: data acquisition, evaluation, feature processing, evaluation feedback, and reconstruction. The system framework is shown in
Figure 4.
In the data acquisition layer, the system utilizes 3D scanning technology to perform high-precision detection of welding grooves. The acquired data is presented in the form of 3D mesh point clouds, providing essential data support for subsequent processing.
In the evaluation layer, the system employs the interference degree evaluation algorithm mentioned in the previous section to analyze the acquired data for interference. The primary task of this layer is to classify the data based on interference levels and present the evaluation results through visualization. The accuracy of the evaluation directly impacts the selection of data processing thresholds, which in turn determines the precision and adaptability of subsequent algorithms.
The feature processing layer supports the integration of various modular algorithms, allowing the system to flexibly accommodate different welding groove detection requirements. In this layer, the system identifies and extracts various features of the welding groove, including surface, edge, and angle features. Additionally, the system can identify the start and end of the groove and assess anomalies in the extraction results. The processing outcomes of this layer provide accurate feature information for use in the subsequent reconstruction and visualization processes.
In the evaluation feedback layer, the system performs further analysis of the extracted feature information and provides a visual decision-support model. Through this visual feedback, the system offers an intuitive display of the welding groove’s quality, the accuracy of feature extraction, and the distribution of anomalies, aiding users in making informed decisions and adjustments.
Finally, the reconstruction layer generates multidimensional models based on the data and evaluation feedback from the previous layers. These models include point cloud reconstruction, feature reconstruction, and visual representations of the evaluation results. Together, these reconstruction models provide comprehensive and accurate information about the welding groove, supporting further analysis and applications.
This five-layer architecture not only ensures flexibility and compatibility but also enables efficient and precise execution of welding groove detection, evaluation, and reconstruction tasks.
In robotic welding, the line-structured light sensor produces mesh point clouds in a streaming manner along the weld direction. The proposed five-layer architecture is organized as a pipeline: a newly scanned line/frame enters the evaluation layer for on-the-fly interference grading; the resulting interference map guides the selection/parameterization of modular algorithms in the feature-processing layer; and the evaluation-feedback and reconstruction layers update the visualization outputs asynchronously. Because the core interference descriptor relies on local neighborhoods and a covariance eigen-analysis, the per-frame computation can be executed incrementally with bounded buffering, enabling low-latency feedback suitable for online adjustment in robotic welding.
4. Experiment and Discussion
4.1. Experimental Setup and Functionality Test of the Welding Groove Detection and Reconstruction Visualization System
The experimental detection system consists of a six-degree-of-freedom welding robot and a line-structured light vision sensor. The system captures 3D point clouds of the workpiece and the scene by scanning along the weld direction using the line-structured light sensor. To validate the functionality of the proposed weld groove detection and reconstruction visualization system, real original detection data affected by noise interference are used for testing. The original detection data, presented as point clouds, are subject to various types of interference. The experimental platform and data acquisition process are shown in
Figure 5.
Initially, the line-structured light vision sensor is driven by the robotic arm to scan the scene and capture original data. The original data contains various types of interference. For the acquired data, the proposed method is first used to assess the degree of interference. Based on this evaluation, the threshold for selecting data processing algorithms is determined, enabling the identification and extraction of weld groove features. Subsequently, the features extracted by the algorithm are categorized into surface features, edge features, and angle features, and anomalies within the features are removed. Finally, the system outputs the point cloud reconstruction model, the evaluation result visualization model, and the feature reconstruction model.
The weld groove recognition algorithms tested in the experiment include the data-driven Region-based Convolutional Neural Network (R-CNN)-based recognition method and the OTSU image segmentation-based recognition algorithm. The feature extraction algorithms tested include methods based on depth changes, second-order derivative changes in depth, and the average slope sliding window algorithm. The anomaly removal algorithms tested include the Random Sample Consensus (RANSAC)-based edge continuity evaluation algorithm and the X-Y, Y-Z bidirectional projection-based anomaly removal algorithm. These methods represent the mainstream approaches in the current research. Due to the modular design and the limitation of input and output data formats, the proposed system is highly compatible with these methods. The system’s input and the outputs of the methods are summarized in
Table 1.
4.2. Overall Accuracy Test of the Evaluation Method
Identifying severely interfered regions is particularly important because these regions dominate threshold selection and determine whether subsequent recognition/extraction modules fail or remain stable. To validate the accuracy of the proposed interference degree evaluation algorithm, a widely used height-difference-based evaluation algorithm is used for comparison. Height- and depth-variation-based indicators (often combined with sliding-window statistics) are commonly adopted in welding perception pipelines and point-cloud stability evaluation as practical baselines for interference-related analysis [17,21,23]. Unlike the proposed method based on the angle of reconstructed triangular patches, the height-difference-based method measures the stability of the detection data from the Z-coordinate variations in the point cloud. Since it only requires comparing the original detection data, its computational load is smaller than that of the proposed method. To ensure consistency in representing the degree of interference between the methods, the height-difference-based algorithm also adopts a sliding window and the squared average of the regional covariance matrix eigenvalues to quantify the interference degree, with the same RGB value range used to represent interference levels. A set of weld groove detection data obtained from an actual industrial environment is used to test both methods. These data were acquired with the experimental system described in Section 4.1. Due to external environmental factors, the data exhibit significant interference, with large areas of regional pulse noise. The two evaluation methods are applied, and the original detection data, along with the results of both methods, are presented in Table 2.
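For reference, the height-difference baseline can be sketched with the same covariance-eigenvalue machinery applied directly to the raw window points; the exact baseline formulation used in the comparison is an assumption here, intended only to illustrate the contrast with the angle-based descriptor.

```python
import numpy as np

def height_dispersion(window):
    """Baseline height-difference dispersion for one k x k window.

    Mirrors the comparison method described in the text: the stability of the
    Z-coordinates is summarized via covariance eigenvalues, here computed over
    the raw (x, y, z) points instead of patch angles. Illustrative only.
    """
    pts = window.reshape(-1, 3).T            # 3 x k^2 variable-by-sample matrix
    lam = np.linalg.eigvalsh(np.cov(pts))    # covariance eigenvalues
    return float(np.linalg.norm(lam))        # L2-norm as interference degree
```

Because this baseline operates on Z-values directly, a tall-but-smooth feature (e.g., a deep gap) inflates its output, whereas the angle-based descriptor maps the same geometry to a narrow range of patch angles.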
From the comparison of the evaluation results in the table, it can be observed that although both methods identify areas of significant interference at both ends of the data, their marking levels differ. The proposed method, based on the angle of reconstructed triangular patches, identifies more areas with severe interference. By observing the X-Y projection of the original data, the regions with severe interference on both sides are widely distributed. These regions are not only marked as red by the height-difference-based evaluation method but also closely correspond to the areas marked as red by the proposed method. To more intuitively represent the differences and effects of the two methods, the difference in interference degree calculated by both methods is linearly computed and mapped to RGB values. The visualization results are shown in
Figure 6.
The results shown in the figure indicate that the proposed method based on the angle of reconstructed triangular patches identifies more areas of interference. To further compare the accuracy of the additionally marked areas, a disagreement value d is defined as the difference between the evaluation results of the two methods. A disagreement value of 0 indicates that both methods produce the same result, d > 0 indicates that the proposed method identifies a higher interference degree at that location, and d < 0 indicates that the comparison method identifies a higher interference degree. A splitting value s is then defined: the portions of the disagreement values exceeding s represent the disagreement areas selected by each method. These disagreement areas A⁺ and A⁻ are defined as Equations (5) and (6):

A⁺ = { d | d > s · d_max }        (5)

A⁻ = { d | d < s · d_min }        (6)

where d_max and d_min are the maximum and minimum values of d. In this study, the splitting value s is set to 60%; areas within A⁺ are considered regions uniquely marked by the proposed method, while areas within A⁻ are considered regions uniquely marked by the height-difference-based evaluation method. The data within these disagreement regions are calculated separately and marked on the original detection data, as shown in Figure 7.
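A minimal sketch of the disagreement-region selection follows, reflecting Equations (5) and (6) as reconstructed above; the thresholding form (comparing d against s · d_max and s · d_min) is an assumption for illustration.

```python
import numpy as np

def disagreement_areas(d_proposed, d_baseline, s=0.6):
    """Split the per-point disagreement into regions uniquely marked by each method.

    With disagreement e = d_proposed - d_baseline, points where e exceeds
    s * max(e) form the proposed-method region A+, and points where e falls
    below s * min(e) form the baseline region A-. Illustrative sketch only.
    """
    e = np.asarray(d_proposed, dtype=float) - np.asarray(d_baseline, dtype=float)
    plus = e > s * e.max() if e.max() > 0 else np.zeros(e.shape, dtype=bool)
    minus = e < s * e.min() if e.min() < 0 else np.zeros(e.shape, dtype=bool)
    return plus, minus
```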
The results show that the proposed method marks more regions, and these regions correspond to areas of the original data that are severely affected by interference. To quantify the improvement, the regions identified by the height-difference-based method as severely interfered are computed. Since the noise and non-noise regions in the severely interfered data are clearly distinguishable, the mapped RGB values show a high, nearly linear correlation with the degree of interference; the regions within the top third of the RGB range are therefore defined as interference regions. The proportion of severely interfered locations identified by the height-difference method is compared with the proportion of such locations within the disagreement region uniquely marked by the proposed method; the ratio of the latter to the former reflects the performance advantage of the proposed method. Multiple sets of interference-affected data are evaluated, and the average performance improvement is found to be 30.99%.
Upon analysis, it is found that the results are due to the significant influence of Z-coordinate variations on the height-difference-based evaluation method. When the Z-coordinate fluctuates significantly but its magnitude is relatively small, the instability degree does not become apparent in the overall height-difference calculation, although large fluctuations in the Z-coordinate are a major indicator of severe interference. One characteristic of large-scale pulse noise in 3D data is significant Z-coordinate variation, which limits the accuracy of traditional methods. The proposed method, based on the angle of reconstructed triangular patches, calculates the angle between the reconstructed triangular patch and the horizontal plane. Therefore, Z-coordinate variations are partially mapped to angle changes ranging from 0° to 90°, making the method less sensitive to height differences. Experimental results show that, compared to the traditional height-difference-based evaluation method, the proposed method is less influenced by Z-direction height differences and is more effective at identifying regions with high dispersion.
In addition, the influence of the window size k is evaluated on the same industrial datasets. As k increases, the dispersion map becomes smoother and less sensitive to isolated pulse noise, while too large a k may blur small groove details and increase runtime. An intermediate window size is observed to provide the best trade-off between (i) stable identification of severely interfered regions and (ii) preservation of groove-specific geometries (deep gaps/rounded corners). In practice, k should be chosen in proportion to the expected spatial scale of interference and the point spacing (e.g., covering at least one minimal groove feature width).
4.3. Adaptability Test of the Evaluation Method for Complex Features
Complex features in the detection data can affect the accuracy of the evaluation algorithm. These complex features generally consist of planes and curved surfaces with abrupt changes, such as deep gaps and rounded corners. While these features are not caused by interference, traditional height-difference-based evaluation methods may incorrectly interpret them as interference. This section tests the performance of both the height-difference-based evaluation method and the proposed method using deep gap and rounded corner features as examples. The selected detection data come from the proposed weld groove detection system, and no external interference is present. A set of weld groove detection data placed on a T-shaped platform is shown in
Figure 8.
The gaps on the T-shaped platform exhibit abrupt height variations in the detection data. The regions on either side of the groove are flat features with large height differences. Both evaluation methods are applied to assess the interference degree of this data. The evaluation results are mapped to RGB values, and the visualized results are shown in
Figure 9.
The results in the figure show that the height-difference-based evaluation method calculates a high interference degree at the deep gap locations, which contradicts the actual data. In contrast, the proposed reconstructed surface angle-based evaluation method calculates a level of interference at the deep gap locations similar to that at other locations, which aligns with the real scenario. The reason for this discrepancy lies in how the two methods assess the degree of interference. The height-difference-based method calculates the stability of the Z-directional values within the window, so large height changes result in high volatility in the calculations. However, the proposed method calculates instability based on the angle of the reconstructed patches. Since the angle of the reconstructed patch at these locations is consistent, it does not affect the proposed method. Another set of detection data containing rounded corner features is shown in
Figure 10.
Similar to the abrupt height changes of the deep gap feature, the curved surfaces of the rounded corner feature exhibit significant height variations, which a height-difference-based measure is prone to misinterpret as interference. Both methods are applied to evaluate the interference degree of these data. The evaluation results are mapped to RGB values, and the visualized results are shown in
Figure 11.
The results in the figure show that the height-difference-based evaluation method assigns an inaccurately high interference degree to the rounded corner locations, whereas the proposed method reports a much less significant degree of interference. This difference again stems from the height-difference-based method's sensitivity to height changes when evaluating Z-directional stability. Unlike planes, the reconstructed patch angles of curved surfaces are not constant; however, the range of angle variation is small and the change is gradual, so the proposed method is only mildly affected. Abrupt height variations and curved surfaces are common features in groove welding, and the experimental results above demonstrate that the proposed reconstructed patch angle-based evaluation method adapts to these features more reliably.
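As a minimal illustration of this contrast, the following sketch compares the two scores on a synthetic, noise-free 1-D profile containing a deep gap. This is a hypothetical simplification of the paper's methods: segment angles stand in for the triangular-patch angles, and the window statistic is a plain standard deviation in both cases.

```python
import numpy as np

def height_difference_score(z: np.ndarray, k: int = 5) -> np.ndarray:
    """Interference score: standard deviation of Z inside a sliding window."""
    half = k // 2
    return np.array([z[max(i - half, 0): i + half + 1].std()
                     for i in range(len(z))])

def patch_angle_score(z: np.ndarray, k: int = 5, dx: float = 1.0) -> np.ndarray:
    """Interference score: dispersion of local surface-to-horizontal angles.

    A 1-D stand-in for the paper's triangular-patch angles: each segment
    between neighbouring points gets an angle relative to the horizontal.
    """
    angles = np.arctan2(np.diff(z), dx)
    half = k // 2
    return np.array([angles[max(i - half, 0): i + half + 1].std()
                     for i in range(len(angles))])

def make_gap_profile(depth: float, n: int = 100) -> np.ndarray:
    """Flat plate with a clean, noise-free deep gap of the given depth."""
    z = np.zeros(n)
    z[45:55] = -depth
    return z

for depth in (10.0, 100.0):
    z = make_gap_profile(depth)
    print(f"depth={depth:6.1f}  height score max={height_difference_score(z).max():7.2f}  "
          f"angle score max={patch_angle_score(z).max():5.2f}")
```

In this toy model, deepening the gap scales the height-based score proportionally, while the angle-based score is bounded by the arctangent and barely changes, consistent with the behaviour discussed above.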
Overall, the proposed descriptor is most beneficial in scenarios with non-uniform pulse noise where height-based cues become unreliable, and it remains stable under different scanning directions due to Z-rotation stability. Its limitations mainly arise under strong global tilting (X/Y rotation) and severe missing data, which motivates local reference-plane estimation and occlusion-aware strategies as future work.
4.4. Discussion on Computational Efficiency and Noise Characteristics
Although the experimental evaluation in this work is conducted on real industrial groove detection datasets rather than synthetic benchmarks, the selected test cases represent mainstream and challenging scenarios commonly encountered in robotic welding applications. These datasets include complex groove geometries, non-uniform pulse noise, and real sensing artifacts, which together form a representative evaluation setting for practical interference assessment.
The computational cost grows approximately quadratically with the window size at each point, while the eigenvalue decomposition of the covariance matrix remains a constant cost because the matrix is small and of fixed size. In the experiments, it is observed that once the window size exceeds a certain threshold, the stability improvement tends to saturate while the latency continues to grow markedly. The window size can therefore serve as the key hyperparameter for meeting target frame-rate and latency constraints, combined with line-by-line cache reuse and incremental updates to achieve low-latency online feedback in robotic welding scenes.
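The line-by-line cache reuse mentioned above can be sketched as follows. This is an illustrative simplification, not the system's implementation: window sums over a grid of angle values stand in for the per-window covariance statistics, and the function names are hypothetical. Prefix sums reduce the naive O(k²) work per window position to a constant-time update per shift.

```python
import numpy as np

def window_sums_naive(a: np.ndarray, k: int) -> np.ndarray:
    """Sum over every k x k window: O(k^2) work per window position."""
    n, m = a.shape
    out = np.empty((n - k + 1, m - k + 1))
    for i in range(n - k + 1):
        for j in range(m - k + 1):
            out[i, j] = a[i:i + k, j:j + k].sum()
    return out

def window_sums_incremental(a: np.ndarray, k: int) -> np.ndarray:
    """Same result via cached prefix sums: constant work per window shift."""
    # Collapse columns first with a running 1-D window sum...
    cs = np.cumsum(a, axis=1)
    rows = np.hstack([cs[:, k - 1:k], cs[:, k:] - cs[:, :-k]])
    # ...then collapse rows the same way, reusing the cached row sums.
    cs = np.cumsum(rows, axis=0)
    return np.vstack([cs[k - 1:k, :], cs[k:, :] - cs[:-k, :]])

rng = np.random.default_rng(0)
angles = rng.standard_normal((200, 320))  # e.g. a grid of patch angles
print(np.allclose(window_sums_naive(angles, 7),
                  window_sums_incremental(angles, 7)))  # -> True
```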
At genuine groove edges, patch angles typically change in a structured, spatially coherent manner, producing an anisotropic covariance whose variance concentrates along specific directions. In contrast, high-frequency pulse noise introduces irregular, spatially incoherent angle fluctuations that increase dispersion more uniformly across directions and yield larger eigenvalue norms. The experimental results implicitly confirm this behavior: the eigenvalue-based dispersion descriptor is less sensitive to high-frequency pulse noise than height-difference-based measures, reduces false alarms on sharp but clean edges, and maintains stable interference evaluation under severe noise without additional filtering stages.
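The eigenvalue contrast between structured edges and incoherent noise can be reproduced with a small numerical sketch. This is a hypothetical toy model: 2-D samples stand in for the per-window angle variations, with a structured population varying mainly along one direction and a noisy population varying in both.

```python
import numpy as np

rng = np.random.default_rng(42)

# Structured edge: angle variation concentrated along one direction.
edge = np.column_stack([rng.normal(0.0, 1.0, 200),
                        rng.normal(0.0, 0.05, 200)])
# Incoherent pulse noise: comparable variation in every direction.
noise = rng.normal(0.0, 0.7, size=(200, 2))

def eig_profile(samples: np.ndarray) -> np.ndarray:
    """Descending eigenvalues of the sample covariance matrix."""
    return np.sort(np.linalg.eigvalsh(np.cov(samples.T)))[::-1]

e_edge, e_noise = eig_profile(edge), eig_profile(noise)
print(f"edge : eigenvalues={e_edge},  anisotropy={e_edge[0] / e_edge[1]:.1f}")
print(f"noise: eigenvalues={e_noise}, anisotropy={e_noise[0] / e_noise[1]:.1f}")
```

The structured samples yield one dominant eigenvalue (high anisotropy), while the noisy samples yield two comparable eigenvalues with a larger minor eigenvalue, which is the signature the descriptor exploits.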
5. Conclusions
In this paper, a new method for evaluating the degree of interference in welding groove 3D detection data based on the angle of reconstructed triangular patches is proposed, and an accompanying detection and reconstruction visualization system is developed, as outlined below:
(1) A novel sliding window interference detection algorithm based on point cloud reconstruction angle is introduced. This method is based on the angle between the reconstructed triangular patches and the horizontal plane and utilizes a sliding window and covariance matrix eigenvalues to calculate the regional data dispersion degree as a measure of interference. Experimental results demonstrate that, compared to the mainstream height-difference-based method, this approach is less susceptible to extreme noise in the Z-direction, with an accuracy improvement of approximately 30.99%. Additionally, it shows better adaptability to common welding groove features such as deep gaps and rounded corners.
(2) A five-layer detection and reconstruction visualization system is developed, capable of outputting point cloud reconstruction models, evaluation result visualization models, and feature reconstruction models. Experimental results indicate that, due to its modular design and the restriction of input and output data formats, the system is highly compatible with different recognition, extraction, and evaluation methods.
(3) The experimental system has been constructed, and validation experiments have been completed. The results confirm that the proposed method offers higher accuracy and adaptability compared to traditional methods. The experimental results demonstrate that this approach is more suitable for practical industrial welding groove detection.
Notably, the symmetry-inspired invariance of the proposed descriptor (translation invariance and rotational invariance about the Z-axis) makes it suitable for stable interference quantification under varying scanning configurations. Although the proposed descriptor is invariant to translations and to rotations about the Z-axis (Section 2), it assumes the detection plane is parallel to the horizontal plane; rotations that tilt the scan therefore alter the patch-to-horizontal angles and consequently the inferred interference degree. In addition, partial occlusions or missing returns (e.g., due to specular surfaces, spatter, or line-of-sight constraints) may leave the local triangular reconstruction within the sliding window incomplete, reducing the reliability of the angle set and covariance statistics. Nevertheless, because the descriptor aggregates second-order statistics over multiple patches, moderate occlusions tend to have limited impact; severe occlusions can be mitigated by incorporating a completion or confidence-weighting strategy (e.g., occlusion-aware point cloud completion) in future work [34].
Author Contributions
Conceptualization, B.Z., H.L. and Z.W.; methodology, Y.Z. and Z.W.; software, H.H. and Z.W.; validation, Z.W., S.Q. and J.M.; formal analysis, H.H. and Z.W.; investigation, B.Z. and Z.W.; resources, B.Z. and Z.W.; data curation, B.Z. and Z.W.; writing—original draft preparation, B.Z. and Z.W.; writing—review and editing, H.L. and Y.Z. All authors have read and agreed to the published version of the manuscript.
Funding
This paper is supported by the Key Research and Development Project of the Hubei Science and Technology Plan [Grant No. 2024BAB055], the National Natural Science Foundation of China [Grant No. 52275505], Wuhan Power Battery Low-Carbon Recycling Industry Innovation Joint Laboratory [Grant No. 2025020802040290], Integrated Intelligent Control Technology and Demonstration Application of Welding Robots in Prefabricated Steel Structures [Grant No. 2025EIA077], and the China Scholarship Council [Grant No. 202306950003].
Data Availability Statement
The data presented in this study are available on request from the corresponding author.
Conflicts of Interest
The authors declare no conflicts of interest.
References
- Xiao, R.; Xu, Y.; Hou, Z.; Chen, C.; Chen, S. An adaptive feature extraction algorithm for multiple typical seam tracking based on vision sensor in robotic arc welding. Sens. Actuators A Phys. 2019, 297, 111533. [Google Scholar] [CrossRef]
- Xiao, R.; Xu, Y.; Hou, Z.; Chen, C.; Chen, S. A feature extraction algorithm based on improved snake model for multi-pass seam tracking in robotic arc welding. J. Manuf. Process. 2021, 72, 48–60. [Google Scholar] [CrossRef]
- Yang, G.; Wang, Y.; Zhou, N. Detection of weld groove edge based on multilayer convolution neural network. Measurement 2021, 186, 110129. [Google Scholar] [CrossRef]
- Zhang, D.; Lu, X.; Qin, H.; He, Y. Pointfilter: Point cloud filtering via encoder-decoder modeling. IEEE Trans. Vis. Comput. Graph. 2020, 27, 2015–2027. [Google Scholar] [CrossRef]
- Li, Y.; Wang, J.; Li, B.; Sun, W.; Li, Y. An adaptive filtering algorithm of multilevel resolution point cloud. Surv. Rev. 2021, 53, 300–311. [Google Scholar] [CrossRef]
- Leal, E.; Sanchez-Torres, G.; Branch, J.W. Sparse regularization-based approach for point cloud denoising and sharp features enhancement. Sensors 2020, 20, 3206. [Google Scholar] [CrossRef] [PubMed]
- Zou, B.; Qiu, H.; Lu, Y. Point cloud reduction and denoising based on optimized downsampling and bilateral filtering. IEEE Access 2020, 8, 136316–136326. [Google Scholar] [CrossRef]
- Alexiou, E.; Zhou, X.; Viola, I.; Cesar, P. PointPCA: Point cloud objective quality assessment using PCA-based descriptors. EURASIP J. Image Video Process. 2024, 2024, 20. [Google Scholar] [CrossRef]
- Alexiou, E.; Ebrahimi, T. Point cloud quality assessment metric based on angular similarity. In Proceedings of the 2018 IEEE International Conference on Multimedia and Expo (ICME), San Diego, CA, USA, 23–27 July 2018. [Google Scholar]
- Meynet, G.; Digne, J.; Lavoué, G. PC-MSDM: A quality metric for 3D point clouds. In Proceedings of the 2019 Eleventh International Conference on Quality of Multimedia Experience (QoMEX), Berlin, Germany, 5–7 June 2019. [Google Scholar]
- Meynet, G.; Nehmé, Y.; Digne, J.; Lavoué, G. PCQM: A full-reference quality metric for colored 3D point clouds. In Proceedings of the 2020 Twelfth International Conference on Quality of Multimedia Experience (QoMEX), Athlone, Ireland, 26–28 May 2020. [Google Scholar]
- Casin, P. A generalization of principal component analysis to k sets of variables. Comput. Stat. Data Anal. 2001, 35, 417–428. [Google Scholar] [CrossRef]
- Liu, C.; Wang, H.; Huang, Y.; Rong, Y.; Meng, J.; Li, G.; Zhang, G. Welding seam recognition and tracking for a novel mobile welding robot based on multi-layer sensing strategy. Meas. Sci. Technol. 2022, 33, 055109. [Google Scholar] [CrossRef]
- Yang, L.; Liu, Y.; Peng, J.; Liang, Z. A novel system for off-line 3D seam extraction and path planning based on point cloud segmentation for arc welding robot. Robot. Comput.-Integr. Manuf. 2020, 64, 101929. [Google Scholar] [CrossRef]
- Mitra, N.J.; Nguyen, A. Estimating surface normals in noisy point cloud data. In Proceedings of the Nineteenth Annual Symposium on Computational Geometry, San Diego, CA, USA, 8–10 June 2003. [Google Scholar]
- Nurunnabi, A.; Belton, D.; West, G. Diagnostic-robust statistical analysis for local surface fitting in 3D point cloud data. ISPRS Ann. Photogramm. Remote Sens. Spat. Inf. Sci. 2012, I-3, 269–274. [Google Scholar] [CrossRef]
- Yang, L.; Liu, Y.; Peng, J. Advances techniques of the structured light sensing in intelligent welding robots: A review. Int. J. Adv. Manuf. Technol. 2020, 110, 1027–1046. [Google Scholar] [CrossRef]
- Xu, S.; Shi, B.; Wang, C.; Xing, F. Novel high-performance automatic removal method of interference points for point cloud data in coal mine roadway environment. Int. J. Remote Sens. 2023, 44, 1433–1459. [Google Scholar]
- Hu, Y.; Wu, Q.; Wang, L.; Jiang, H. Multiview point clouds denoising based on interference elimination. J. Electron. Imaging 2018, 27, 023009. [Google Scholar] [CrossRef]
- Ding, S.; Chen, X.; Ai, C.; Wang, J.; Yang, H. A noise-reduction algorithm for raw 3D point cloud data of asphalt pavement surface texture. Sci. Rep. 2024, 14, 16633. [Google Scholar] [CrossRef]
- Zeng, J.; Chang, B.; Du, D.; Wang, L.; Chang, S.; Peng, G.; Wang, W. A weld position recognition method based on directional and structured light information fusion in multi-layer/multi-pass welding. Sensors 2018, 18, 129. [Google Scholar] [CrossRef]
- Ding, Y.; Huang, W.; Kovacevic, R. An on-line shape-matching weld seam tracking system. Robot. Comput.-Integr. Manuf. 2016, 42, 103–112. [Google Scholar] [CrossRef]
- Lü, X.; Gu, D.; Wang, Y.; Qu, Y.; Qin, C.; Huang, F. Feature extraction of welding seam image based on laser vision. IEEE Sens. J. 2018, 18, 4715–4724. [Google Scholar] [CrossRef]
- Zhang, H.; Yang, X.; Luo, Y.; Chen, F.; Lu, N.; Liu, Y.; Deng, Y. A novel hybrid algorithm for damage detection in bridge foundations under complex underwater environments using ROV capture pictures. Eng. Struct. 2026, 352, 122131. [Google Scholar] [CrossRef]
- Cheng, D.; Zhao, D.; Zhang, J.; Wei, C.; Tian, D. PCA-based denoising algorithm for outdoor LiDAR point cloud data. Sensors 2021, 21, 3703. [Google Scholar] [CrossRef]
- Nurunnabi, A.; West, G.; Belton, D. Outlier detection and robust normal-curvature estimation in mobile laser scanning 3D point cloud data. Pattern Recognit. 2015, 48, 1404–1419. [Google Scholar] [CrossRef]
- Bray, D.; Gilmour, S.; Guild, F.; Taylor, A. The effects of particle morphology on the analysis of discrete particle dispersion using Delaunay tessellation. Compos. Part A Appl. Sci. Manuf. 2013, 54, 37–45. [Google Scholar] [CrossRef]
- Rayat, C.S. Measures of dispersion. Stat. Methods Med. Res. 2018, 27, 47–60. [Google Scholar]
- Gries, S.T. What do (most of) our dispersion measures measure (most)? dispersion? J. Second. Lang. Stud. 2022, 5, 171–205. [Google Scholar] [CrossRef]
- Singh, S.K.; Banerjee, B.P.; Lato, M.J.; Sammut, C.; Raval, S. Automated rock mass discontinuity set characterisation using amplitude and phase decomposition of point cloud data. Int. J. Rock Mech. Min. Sci. 2022, 152, 105072. [Google Scholar] [CrossRef]
- Huang, H.; Yuan, S.; Wen, C.; Hao, Y.; Fang, Y. Noisy few-shot 3D point cloud scene segmentation. In Proceedings of the 2024 IEEE International Conference on Robotics and Automation (ICRA), Yokohama, Japan, 13–17 May 2024. [Google Scholar]
- Dong, Y.; Xu, B.; Liao, T.; Yin, C.; Tan, Z. Application of local-feature-based 3D point cloud stitching method of low-overlap point cloud to aero-engine blade measurement. IEEE Trans. Instrum. Meas. 2023; in press. [Google Scholar]
- Zhai, R.; Li, X.; Wang, Z.; Guo, S.; Hou, S.; Hou, Y.; Gao, F.; Song, J. Point cloud classification model based on a dual-input deep network framework. IEEE Access 2020, 8, 55991–55999. [Google Scholar] [CrossRef]
- Sangaiah, A.K.; Anandakrishnan, J.; Kumar, S.; Bian, G.B.; AlQahtani, S.A.; Draheim, D. Point-KAN: Leveraging Trustworthy AI for Reliable 3D Point Cloud Completion with Kolmogorov Arnold Networks for 6G-IoT Applications. IEEE Internet Things J. 2025. [Google Scholar] [CrossRef]
Figure 1.
Process for evaluating the regional dispersion of detection data based on a sliding window.
Figure 2.
Triangular patch reconstruction process for point cloud data within a window to assess dispersion.
Figure 3.
Visualization process of the calculated dispersion degree of detection data.
Figure 4.
Framework of the weld groove detection and reconstruction visualization system.
Figure 5.
Experimental platform and data acquisition process.
Figure 6.
Difference in interference degree calculated by both methods. In the figure, areas in light blue indicate regions marked as severely interfered with by the height-difference-based evaluation method only, while areas in light yellow indicate regions marked as severely interfered with by the proposed method. Areas in light green indicate regions where both methods’ evaluation results are similar, with a difference of 0. Furthermore, since the same RGB value range is used, linear calculation and comparison are feasible.
Figure 7.
Visualization of regions with large differences in interference degree quantification results between the two methods. The light yellow area indicates regions marked by the proposed method only, while the light blue area indicates regions marked by the height-difference-based evaluation method only.
Figure 8.
Weld groove detection data with a deep gap feature; (a,b) are the 3D view and X-Y projection of the detection data, respectively. The T-shaped platform groove is marked in the X-Y projection image.
Figure 9.
Evaluation results of weld groove detection data with a deep gap feature: (a) results from the height-difference-based evaluation method; (b) results from the proposed reconstructed surface angle-based evaluation method. All results are mapped to the same RGB value range; red indicates areas with a high degree of interference calculated by the method.
Figure 10.
Weld groove detection data with a rounded corner feature; (a,b) are the 3D view and X-Y projection of the detection data, respectively.
Figure 11.
Evaluation results of weld groove detection data with a rounded corner feature: (a) results from the height-difference-based evaluation method, (b) results from the proposed reconstructed surface angle-based evaluation method. All results are mapped to the same RGB value range; red indicates areas with a high degree of interference calculated by the method.
Table 1.
Methods used in the experiment and the input/output data formats.
Table 1.
Methods used in the experiment and the input/output data formats.
| Method | Position in System | Function | Input | Output |
|---|---|---|---|---|
| Data-driven R-CNN-based recognition method | Feature Processing Layer | Weld groove recognition | Data labeled based on the degree of interference calculated by the evaluation method | Weld groove region |
| OTSU image segmentation-based recognition algorithm | Feature Processing Layer | Weld groove recognition | OTSU segmentation threshold, based on the degree of interference determined by the evaluation method | Weld groove region |
| Depth change-based method | Feature Processing Layer | Weld groove feature extraction | Depth change screening threshold, based on the degree of interference determined by the evaluation method | Groove edge features |
| Second-order derivative of the depth change-based method | Feature Processing Layer | Weld groove feature extraction | Second-order derivative of depth change screening threshold, based on the degree of interference determined by the evaluation method | Groove edge features |
| Average slope sliding window algorithm | Feature Processing Layer | Weld groove feature extraction | Sliding window size, based on the degree of interference determined by the evaluation method | Groove edge features |
| RANSAC-based edge continuity evaluation algorithm | Evaluation Feedback Layer | Removal of anomalies from extracted features | Groove edge features | Groove edge features after anomaly removal |
| X-Y, Y-Z bidirectional projection-based anomaly removal algorithm | Evaluation Feedback Layer | Removal of anomalies from extracted features | Groove edge features | Groove edge features after anomaly removal |
Table 2.
Test results of the evaluation methods: RGB values are used to represent the degree of interference, with red indicating high interference and blue indicating low interference.