Article

Detection and Imaging of Corrosion Defects in Steel Structures Based on Ultrasonic Digital Image Processing

1 National Key Laboratory of Precision Welding and Joining of Materials and Structures, Harbin Institute of Technology, Harbin 150001, China
2 PipeChina Engineering Quality Supervision and Inspection Company, Beijing 100013, China
* Author to whom correspondence should be addressed.
Metals 2024, 14(4), 390; https://doi.org/10.3390/met14040390
Submission received: 30 January 2024 / Revised: 27 February 2024 / Accepted: 12 March 2024 / Published: 26 March 2024
(This article belongs to the Special Issue Corrosion Protection for Metallic Materials)

Abstract

Corrosion is one of the critical factors leading to the failure of steel structures. Ultrasonic C-scans are widely used to identify corrosion damage. Limited by the range of C-scans, multiple C-scans are usually required to cover the whole component. Thus, stitching multiple C-scans into a panoramic image of the area under detection is necessary for interpreting non-destructive testing (NDT) data. In this paper, an image mosaic method for ultrasonic C-scan based on scale invariant feature transform (SIFT) is proposed. Firstly, to improve the success rate of registration, the difference in the probe starting position in two scans is used to filter the matching pairs of feature points obtained by SIFT. Secondly, dynamic programming methods are used to search for the optimal seam path. Finally, the pixels in the overlapping area are fused by fade-in and fade-out fusion along the seam line. The improved method has a higher success rate of registration and lower image distortion than the conventional method in the mosaic of ultrasonic C-scan images. Experimental results show that the proposed method can stitch multiple C-scan images of a testing block containing artificial defects into a panorama image effectively.

1. Introduction

Corrosion is a common kind of defect in steel structures and is one of the main direct or indirect causes of structural failure [1,2]. A common case is the corrosion of buried water pipelines. Water pipeline corrosion directly harms the pipe itself, contributing to longitudinal stress, metal loss, pipe wall perforation, and reductions in service life, as well as cathodic or coating disbonding and hydrogen embrittlement [3]. Another common case is corrosion in structural steel due to exposure to moisture and oxygen. The annual cost of corrosion in steel structures, especially at coastal sites, can be tremendous [4]. There are different types of corrosion in steel, such as general, galvanic, bimetallic, embedded, atmospheric, and cut-edge corrosion. Correctly determining the spatial distribution of steel corrosion is critical for estimating the remaining service life of deteriorating steel structures.
Currently, non-destructive techniques such as ultrasonic inspection [5,6], radiography [7], thermography [8], and optical inspection [9] have been employed to monitor and identify corrosion damage in steel structures. In the detection of general corrosion, which causes large-area thickness loss in metal, phased array ultrasonic testing (PAUT) has many advantages, including increased flaw detection ability, increased inspection speed, and reliable results. PAUT has been widely used for the in-service detection and characterization of corrosion in pipes, tanks, vessels, and other critical assets [10,11,12]. Due to their larger footprint, PAUT probes can cover a larger surface at higher speed, leading to a significant reduction in inspection time and enhanced resolution.
The observation of corrosion defects by PAUT is divided into two parts: the outline and the depth. The outline of corrosion can be observed from the color change in the C-scan image, and the depth of corrosion can be determined from the arrival time of the ultrasonic signal. An ultrasonic C-scan projects the ultrasonic data onto a plan view of the component under test to create an image; this is the most commonly used view in phased array corrosion mapping. However, in the detection of large-area general corrosion by PAUT, limited by the range of a single C-scan, multiple C-scans are usually required to cover the whole component. Stitching multiple partially overlapped C-scan images into a panorama image brings great convenience to interpreting NDT data.
Image mosaicking is an effective means of constructing a single seamless image by aligning multiple partially overlapped images [13]. Image mosaicking is divided into two steps: image registration and image fusion. Image registration is the process of aligning images obtained at different times, from different viewpoints, or by different sensors [14]. After image registration, the original images are placed in the same coordinate system. In the image fusion stage, the images are merged to form a panorama image. At present, point-feature-based registration is the most widely used method in the mosaicking of NDT images; the representative scale invariant feature transform (SIFT) algorithm was proposed by David G. Lowe in 1999 [15] and refined in 2004 [16]. The features extracted by SIFT are invariant to image scale and rotation and have been shown to provide robust matching across a substantial range of affine distortion, change in 3D viewpoint, addition of noise, and change in illumination.
The SIFT algorithm has been used in the registration and mosaic of optical images [17], synthetic aperture radar (SAR) images [18], microscopic images [19], remote sensing images [20,21], and other fields. And SIFT is also commonly used to stitch NDT images. Huang et al. [22] proposed an improved SIFT image mosaic algorithm to stitch the optical images of printed circuit board bare plates. The accuracy and efficiency of image mosaics were effectively improved by designing a block strategy and using the similarity criterion. Zhao et al. [23] used the improved SIFT algorithm to stitch multiple videoscope images of the inner wall of the pipeline. The videoscope image of the inner surface of the pipeline was completely reconstructed by this method.
However, because of the low image resolution, random defect distribution, and image error caused by probe jitter and coupling conditions in ultrasonic NDT images, it is difficult for conventional image mosaic algorithms to stitch ultrasonic NDT images. Consequently, it is necessary to introduce other methods to improve the robustness of the SIFT algorithm. Qiu et al. [24] combined features from the accelerated segment test (FAST) operator and SIFT descriptor to detect and describe image features. The fast library for approximate nearest neighbors (FLANN) algorithm with adaptive threshold and the improved random sample consensus (RANSAC) algorithm were used to remove the mismatching points. The speed and accuracy of feature matching were obviously improved. Teng et al. [25] proposed a SIFT-based technique that was modality-invariant and still retained the strengths of local techniques. Histogram weighting strategies were introduced to improve the accuracy of descriptor matching. The results showed that the method could not only improve multimodal registration accuracy but also had the potential to improve the performance of all SIFT-based applications. Hossein-Nejad et al. [26] developed SIFT using adaptive RANSAC. The threshold value of RANSAC exploited the variance of the distances in addition to the mean value. Simulation results confirmed the superiority of the method in terms of the correct matching rate. Hossein-Nejad et al. [27] also proposed a method called clustered redundant key point elimination (CRKEM) to remove the redundancy of key points extracted by SIFT. The experimental results confirmed the superiority of the proposed method in image mosaicing as well as in image registration and matching.
In this paper, a novel image mosaic method based on SIFT is proposed to stitch the ultrasonic C-scan images acquired by PAUT. The matching of feature points is first filtered by the difference in probe starting position between two scans and then further filtered by RANSAC to improve the success rate of image registration. Also, the dynamic programming method and the fade-in and fade-out image fusion methods are introduced to reduce the distortion in the fused image. Multiple C-scan images of a testing block containing artificial defects are stitched into a high-quality panorama image by this method.

2. Principles of SIFT Image Registration

SIFT features are local image features that remain unchanged under rotation, scaling, and illumination changes, and remain somewhat stable under viewpoint changes, affine transformation, and noise. The SIFT algorithm is implemented in two phases. The first phase is the generation of SIFT features, namely extracting feature points invariant to scaling and rotation from the images to be matched. The second phase is the matching of SIFT feature points.
The SIFT algorithm is implemented in the following four steps: detect extrema in scale space, refine the positions of feature points, calculate the descriptive information of feature points, and generate local feature descriptors. First, the original image is filtered several consecutive times with a Gaussian filter to establish the first scale group. Then, the image size is halved and the same Gaussian filtering is applied to form the second scale group. This process is repeated until the image size is smaller than a given threshold. Next, adjacent Gaussian images in every scale group are subtracted to obtain the difference-of-Gaussians (DOG) images. The generation of DOG images is shown in Figure 1. Then, the local extrema of these DOG images are extracted to obtain the feature points of the images in the scale-space domain.
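The scale-group construction described above can be illustrated with a short NumPy/SciPy sketch; the octave count, scales per octave, and base sigma below are illustrative choices, not parameters from this paper:

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def build_dog_pyramid(image, n_octaves=3, scales_per_octave=4, sigma0=1.6):
    """Sketch of scale-space construction: Gaussian-filter each octave
    several consecutive times, subtract adjacent blurred images to form
    the difference-of-Gaussians (DOG) images, then halve the image size."""
    dog_pyramid = []
    octave_base = image.astype(np.float64)
    k = 2.0 ** (1.0 / scales_per_octave)
    for _ in range(n_octaves):
        # consecutive Gaussian blurs within one scale group
        blurred = [gaussian_filter(octave_base, sigma0 * k ** i)
                   for i in range(scales_per_octave + 1)]
        # differences of adjacent Gaussian images -> DOG images
        dog_pyramid.append([blurred[i + 1] - blurred[i]
                            for i in range(scales_per_octave)])
        # halve the image size for the next scale group
        octave_base = octave_base[::2, ::2]
    return dog_pyramid
```

Extrema would then be searched across the three adjacent DOG images of each scale group.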
The principal direction assignment of key points is based on the local gradient direction of the image. Using the gradient characteristics and direction distribution of pixels in the neighborhood of key points, the gradient modulus and direction can be obtained as follows:
m(x, y) = \sqrt{[I(x+1, y) - I(x-1, y)]^2 + [I(x, y+1) - I(x, y-1)]^2}
\theta(x, y) = \arctan\frac{I(x, y+1) - I(x, y-1)}{I(x+1, y) - I(x-1, y)}
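The two formulas above amount to finite differences on the Gaussian-smoothed image; a minimal NumPy sketch (the function name and row-major `img[y, x]` indexing are our own conventions):

```python
import numpy as np

def gradient_mag_ori(img, x, y):
    """Gradient magnitude and orientation at pixel (x, y) from central
    differences of a smoothed image, per the two formulas above."""
    dx = float(img[y, x + 1]) - float(img[y, x - 1])
    dy = float(img[y + 1, x]) - float(img[y - 1, x])
    m = np.sqrt(dx ** 2 + dy ** 2)   # gradient magnitude
    theta = np.arctan2(dy, dx)       # orientation in radians
    return m, theta
```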
A key point descriptor is created by first computing the gradient magnitude and orientation at each image sample point in a region around the key point location. A 4 × 4 array of 8-bin histograms is created, as shown in Figure 2. The 16 × 16 pixels in the neighborhood are divided into 16 sub-regions of 4 × 4 pixels, and the 128-dimensional SIFT feature descriptor is obtained by calculating the eight gradient directions of each sub-region. Then, the Best-Bin-First algorithm is used to search for the nearest neighbor and the next-nearest neighbor of a key point. If the searched feature point is E, and P and P′ are the two feature points with the shortest and second-shortest Euclidean distances, the distance ratio Z of EP to EP′ is calculated. If the Z value is smaller than a certain threshold, the feature point pair is matched. The RANSAC algorithm is used to extract the final matching pairs for calculating the transformation matrix, and the images are registered by the transformation matrix.
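The nearest/next-nearest distance-ratio test described above can be sketched in NumPy on two descriptor arrays; the threshold 0.8 is a commonly cited choice, not a parameter stated in this paper:

```python
import numpy as np

def ratio_test_match(desc_a, desc_b, ratio=0.8):
    """Distance-ratio matching: a pair (i, j) is kept only when the
    nearest neighbor is clearly better than the second-nearest,
    i.e., Z = d1/d2 is below the threshold."""
    matches = []
    for i, d in enumerate(desc_a):
        dists = np.linalg.norm(desc_b - d, axis=1)  # Euclidean distances
        j1, j2 = np.argsort(dists)[:2]              # nearest, 2nd nearest
        if dists[j1] < ratio * dists[j2]:           # ratio test passes
            matches.append((i, j1))
    return matches
```

In practice this brute-force search is replaced by a KD-tree (Best-Bin-First) lookup, and the surviving pairs are then passed to RANSAC.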

3. Implementation of Image Mosaic Based on the SIFT Algorithm

The error rate of feature matching in the registration of ultrasonic C-scan images is high for the conventional SIFT algorithm. Moreover, there is an obvious deviation in the fused image, which complicates the interpretation of NDT data. In this paper, the difference in probe starting position between two scans is used to filter the matching pairs of feature points obtained by SIFT. Then the matching pairs are further filtered by the RANSAC algorithm to obtain the correct transformation matrix. The dynamic programming method is used to search for an optimal seam line, and the pixels of the two images are fused by fade-in and fade-out fusion along the seam line. The flow chart of the image mosaic method for ultrasonic C-scan images is shown in Figure 3.

3.1. Acquisition of C-Scan Images

The instrument used in this paper is shown in Figure 4. The test instrument is a Multi 2000 Pocket ultrasonic phased array produced by M2M in France, connected to an E6A2-CW3E rotary encoder produced by OMRON Corporation in Japan to acquire C-scan images. According to the sound velocity, attenuation, and thickness of the testing block, the following parameters are chosen: The central frequency of the probe is 5 MHz, and 16 of the 64 elements are excited simultaneously for detection. The focusing law is 0-degree beam steering, and the system gain is 57 dB. The encoder step value is 0.5 mm, and the mechanical scanning range is 105 mm. The highest signal peak exceeding the gate threshold is selected for C-scan imaging. The width of the C-scan image is 24.8 mm under the above conditions. Experimental results show that images of higher quality can be obtained with these parameters.
In order to imitate the irregular shape of an actual corrosion defect, a testing block containing an artificial defect is made. Metal loss with different depths and contours is made using a milling machine and a milling cutter of 15 mm in diameter. The depths of the irregular stepped area are 1 mm, 2 mm, and 5 mm, respectively. The outline of the defect is randomly selected in the milling process. The testing block containing artificial defects is shown in Figure 5.
The testing block is divided into four parts to be scanned, and the C-scan images for each scan are shown in Figure 6, respectively. The C-scan image generated by PAUT takes the direction of encoder stepping as the horizontal axis and the direction of probe electronic scanning as the vertical axis. In the C-scan image, a pixel region of fixed length and width represents the signal peak of a single A-scan. Hence, the aspect ratio of a directly output C-scan image is not consistent with the actual proportions of the component. If the images were compressed to the actual proportions before stitching, the compression process would affect the matching of feature points. Therefore, the C-scan images used for stitching are output directly; after the image mosaic is completed, the panorama image is compressed to the actual proportions.

3.2. Image Registration

Because the resolution of the ultrasonic C-scan image is low and the texture features are obscure, the matching pairs of features calculated by the conventional SIFT algorithm include a large number of mismatching pairs. To solve this problem, the difference in probe starting position between two C-scans is used to screen the matching pairs. The matching pairs are further screened by RANSAC, and the final matching pairs are used to calculate the transformation matrix. The principle of matching pair filtering is shown in Figure 7.
Set the starting points of the probe in the first and second C-scans to ( x a , y a ) and ( x b , y b ) , respectively. There are a total of m matching pairs calculated by the SIFT algorithm, and the corresponding feature point coordinate sets are as follows:
loc_a = \{(x_{a1}, y_{a1}), (x_{a2}, y_{a2}), \ldots, (x_{ai}, y_{ai}), \ldots, (x_{am}, y_{am})\}
loc_b = \{(x_{b1}, y_{b1}), (x_{b2}, y_{b2}), \ldots, (x_{bi}, y_{bi}), \ldots, (x_{bm}, y_{bm})\}
The formula for matching pair filtering is as follows:
1 - k \le \frac{x_{bi} - x_{ai}}{x_a - x_b} \le 1 + k
1 - k \le \frac{y_{bi} - y_{ai}}{y_a - y_b} \le 1 + k
where k is the error coefficient allowed in the screening of matching pairs.
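A minimal sketch of this screening step, assuming the probe starting-position offsets are nonzero in both axes (the function and variable names are hypothetical, not from the paper):

```python
def filter_by_probe_offset(loc_a, loc_b, start_a, start_b, k=0.2):
    """Keep a matched pair only when its displacement agrees, within the
    error coefficient k, with the known difference of probe starting
    positions between the two scans (offsets assumed nonzero)."""
    dx = start_a[0] - start_b[0]
    dy = start_a[1] - start_b[1]
    kept = []
    for (xa, ya), (xb, yb) in zip(loc_a, loc_b):
        rx = (xb - xa) / dx
        ry = (yb - ya) / dy
        if 1 - k <= rx <= 1 + k and 1 - k <= ry <= 1 + k:
            kept.append(((xa, ya), (xb, yb)))
    return kept
```

The surviving pairs are then passed to RANSAC to estimate the transformation matrix.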
Taking the stitching of the first and second C-scan images as an example of image mosaic, the feature point matching pairs of the conventional SIFT algorithm and the improved registration method are respectively shown in Figure 8. The same numbers in the images represent the feature point matching pairs between the two original images. In Figure 8a, only 3 of the 16 pairs are matched correctly by the conventional SIFT algorithm. In Figure 8b, 11 of the 16 pairs are matched correctly by the improved image registration method.
The transformation matrices calculated by the matching pairs shown in Figure 8a,b are respectively used to stitch the original images directly. The fused images are shown in Figure 9. By the conventional SIFT algorithm, the transformation matrix is miscalculated due to the excessive error rate of matching pairs, leading to the failure of the image mosaic shown in Figure 9a. The transformation matrix is calculated correctly by the improved registration method, and the fused image is shown in Figure 9b. However, the fused image shown in Figure 9b has obvious differences on both sides of the seam line, and the pixel transition in the overlapping area of two images is not smooth.

3.3. Image Fusion

If two C-scan images are stitched directly, the difference on both sides of the seam line is obvious in the fused image. For this reason, this paper introduces an optimal seam line searching algorithm and fade-in and fade-out fusion to eliminate the difference in the fused image.

3.3.1. Improved Optimal Seam Searching

To make the seam line ideal for segmenting the overlapping area in the fused image, the differences in intensity and geometrical structure around the seam line should be minimal. For this reason, Reference [28] put forward the following criterion value for searching for the optimal seam line:
E(x, y) = E_{color}^2(x, y) + E_{geometry}(x, y)
where E c o l o r ( x , y ) represents the value difference between overlapping pixels in two images, and E g e o m e t r y ( x , y ) represents the structural difference between overlapping pixels in two images. To emphasize the difference between the four edge pixels in the evaluation criterion of structure similarity, the gradient operator templates shown below are used to calculate E g e o m e t r y ( x , y ) as follows:
S_x = \begin{pmatrix} -2 & 0 & 2 \\ -1 & 0 & 1 \\ -2 & 0 & 2 \end{pmatrix} \quad \text{and} \quad S_y = \begin{pmatrix} -2 & -1 & -2 \\ 0 & 0 & 0 \\ 2 & 1 & 2 \end{pmatrix}
Set the two original images to f 1 and f 2 , and E g e o m e t r y ( x , y ) can be obtained from the following formula:
E_{geometry}(x, y) = Diff(f_1(x, y), f_2(x, y))
D i f f is obtained by calculating the product of the gradient differences between two images in the x and y directions.
The optimal seam line searching algorithm is divided into three steps:
  • Initialization. Each pixel in the first column is regarded as the starting point of a seam line. The intensity value of the seam line is initialized to the criterion value of the pixel in the first column.
  • Expansion. The seam lines are extended column by column until the last column is reached. For each seam line, the sum of its accumulated criterion value and the criterion value of each of the three candidate pixels in the next column is calculated, and the direction with the minimum sum is chosen as the expansion direction.
  • Selecting the optimal seam line. The seam line with the minimum criterion value is selected as the optimal seam line.
The schematic diagram of searching for an optimal seam line is shown in Figure 10.
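The three steps above can be sketched as a dynamic program over the criterion-value map E; this is a minimal illustration (a real implementation would also carry the E_color and E_geometry computation described earlier):

```python
import numpy as np

def optimal_seam(E):
    """Dynamic-programming seam search: initialize in the first column,
    expand column by column to one of three neighboring rows, then pick
    the seam with the minimum accumulated criterion value."""
    rows, cols = E.shape
    cost = E.astype(np.float64).copy()
    back = np.zeros((rows, cols), dtype=int)
    for c in range(1, cols):                    # expansion step
        for r in range(rows):
            lo, hi = max(r - 1, 0), min(r + 1, rows - 1)
            prev = cost[lo:hi + 1, c - 1]
            best = int(np.argmin(prev)) + lo    # cheapest of 3 neighbors
            cost[r, c] += cost[best, c - 1]
            back[r, c] = best
    # select the optimal seam line (minimum total criterion value)
    r = int(np.argmin(cost[:, -1]))
    seam = [r]
    for c in range(cols - 1, 0, -1):            # backtrack the path
        r = back[r, c]
        seam.append(r)
    return seam[::-1]                           # one row index per column
```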
According to the above method, the path of the optimal seam line and the fused image are shown in Figure 11.

3.3.2. Fade-in and Fade-out Fusion

After two C-scan images are stitched according to the optimal seam line, the difference on both sides of the seam line has improved, but the transition of pixel value is still not smooth. To improve the quality of the fused image, fade-in and fade-out image fusion is employed along the optimal seam line. Fade-in and fade-out image fusion is shown in Figure 12.
Fade-in and fade-out image fusion assigns different weights to the pixels in the overlapping area of two images, which can transform the pixels smoothly and preserve the information of the original images as much as possible. Fade-in and fade-out image fusion can be expressed by the following formula:
F(x, y) = \begin{cases} f_1(x, y), & (x, y) \in f_1 \\ \omega_1(x, y) f_1(x, y) + \omega_2(x, y) f_2(x, y), & (x, y) \in f_1 \cap f_2 \\ f_2(x, y), & (x, y) \in f_2 \end{cases}
In the formula, f 1 x , y and f 2 x , y are the upper and lower images to be fused, respectively; ω 1 x , y and ω 2 x , y are the distribution weights of pixel values of the upper and lower images, respectively.
The distribution weights should be selected according to the image’s characteristics. To avoid excessive smoothing of the fusion region, new distribution weights are adopted, as shown in Figure 13b. Set the width of the overlapping area of two images to R, and select the area with the width of R/3 along the seam line as the fusion region. The sum of ω 1 x , y and ω 2 x , y is constantly equal to 1, as ω 1 x , y decreases from 1 to 0 and ω 2 x , y increases from 0 to 1 in the form of a cubic function.
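A minimal sketch of this weighting scheme follows. Since the paper only states that the weights vary "in the form of a cubic function", the specific cubic w1 = 1 − t³ below is an assumption, as is the helper naming:

```python
import numpy as np

def cubic_weights(width):
    """Across a fusion band of `width` pixels, w1 falls from 1 to 0 as a
    cubic function and w2 = 1 - w1, so the weights always sum to 1."""
    t = np.linspace(0.0, 1.0, width)
    w1 = 1.0 - t ** 3          # assumed cubic falloff
    w2 = 1.0 - w1
    return w1, w2

def blend_band(band_a, band_b):
    """Fade-in/fade-out fusion of two overlapping bands along the seam."""
    w1, w2 = cubic_weights(band_a.shape[1])
    return band_a * w1[np.newaxis, :] + band_b * w2[np.newaxis, :]
```

At the left edge of the band the output equals the first image, at the right edge the second, with a smooth transition in between.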
The fused images obtained by different fusion methods are shown in Figure 14. The fused image, which is stitched directly, is shown in Figure 14a, and the fused image stitched by the improved fusion method is shown in Figure 14b. The results show that pixel values in the fusion region are transformed smoothly by the improved fusion method.
According to the above method, four C-scan images are stitched into a panorama image. Because the proportion of the C-scan images used for stitching is different from the actual proportion, the panorama image is compressed according to the actual proportion after stitching, as shown in Figure 15a. It is hard to quantitatively evaluate the fused image through visual observation. Therefore, we flip the photo of the defect horizontally and overlap it with the fused image, which is shown in Figure 15b. By calculating the area deviation of each region between the fused image and photo, we confirm that the fused image corresponds well to the photo of the testing block.

4. Evaluation of Image Mosaic Results

According to the above method, the four C-scan images are stitched three times to obtain a panorama image. The result of image registration can be evaluated by the number and accuracy of feature point matching pairs. The evaluation criteria for image fusion are divided into subjective evaluation and objective evaluation [29]. In this paper, peak signal-to-noise ratio (PSNR) [30] and structure similarity (SSIM) [31] in the objective evaluation index are used to evaluate the results of image fusion.

4.1. Evaluation of the Image Registration Results

Image registration is required for each of the three stitching operations. The number and accuracy of matching pairs directly affect the success rate of image registration. For the three stitching processes, the number and accuracy of matching pairs of the conventional SIFT algorithm and the improved method are shown in Table 1. After screening the matching pairs according to the probe starting position, the final number of matching pairs may be smaller, but the accuracy is greatly improved. Because there is only a relationship of rotation and translation between the C-scan images, the transformation matrix can be calculated by a rigid-body transformation model, which requires at least two feature point matching pairs. Therefore, the accuracy of matching pairs has a greater impact on the success rate of image registration than their number.

4.2. Evaluation of Image Fusion Results

PSNR is the most widely used objective image evaluation index. It represents the ratio of the energy of the peak signal to the average energy of the noise in the image. PSNR can be obtained by using the following formula:
PSNR = 10 \log_{10} \frac{(2^n - 1)^2}{MSE}
In the formula, n is the number of bits per pixel, which is generally taken as 8; MSE is the mean square error of the current image I 1 and the reference image I 2 . The MSE formula for the image of “ H × W ” size is as follows:
MSE = \frac{1}{H \times W} \sum_{x=1}^{H} \sum_{y=1}^{W} [I_1(x, y) - I_2(x, y)]^2
The PSNR value is inversely related to the distortion of the fused image: the higher the PSNR value, the less the image distortion.
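The PSNR and MSE formulas above translate directly into NumPy (n = 8 bits by default, as stated):

```python
import numpy as np

def psnr(img1, img2, bits=8):
    """PSNR between the current image and the reference image, using
    MSE over all pixels and a peak value of (2^n - 1)."""
    mse = np.mean((img1.astype(np.float64) - img2.astype(np.float64)) ** 2)
    if mse == 0:
        return float("inf")          # identical images: no distortion
    peak_sq = (2 ** bits - 1) ** 2
    return 10.0 * np.log10(peak_sq / mse)
```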
Considering that PSNR is an error-sensitive evaluation method, the results of PSNR and human vision are different. To avoid the obvious difference between objective evaluation results and visual evaluation results, this paper also introduces SSIM to evaluate image fusion.
SSIM is an index to measure the similarity of two images. It mainly considers three key features: brightness, contrast, and structure. SSIM is more consistent with the intuitive perception of human eyes. The SSIM index can be calculated using the following formula:
SSIM(x, y) = L(x, y) \times C(x, y) \times S(x, y)
In the formula, L x , y represents the brightness difference in two images, C x , y represents the contrast difference in two images, and S x , y represents the structural similarity difference in two images. The formulas for calculating L x , y , C x , y , and S x , y are as follows:
L(x, y) = \frac{2\mu_x \mu_y + C_1}{\mu_x^2 + \mu_y^2 + C_1}
C(x, y) = \frac{2\sigma_x \sigma_y + C_2}{\sigma_x^2 + \sigma_y^2 + C_2}
S(x, y) = \frac{\sigma_{xy} + C_3}{\sigma_x \sigma_y + C_3}
In the formulas, μx and μy refer to the average brightnesses of the two images; σx and σy refer to their standard deviations; σxy is the covariance of the two images; and C1, C2, and C3 are constants. C1 is often taken as (K1L)², C2 as (K2L)², and C3 as 0.5C2. The higher the SSIM value, the less the image distortion.
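A single-window sketch of the SSIM computation above, with C1 = (K1·L)², C2 = (K2·L)², and C3 = 0.5·C2; the K1 and K2 values below are common defaults, not parameters given in this paper, and practical SSIM is usually computed over local windows and averaged:

```python
import numpy as np

def ssim(x, y, L=255.0, K1=0.01, K2=0.03):
    """Global (single-window) SSIM as the product of the brightness,
    contrast, and structure terms defined above."""
    C1, C2 = (K1 * L) ** 2, (K2 * L) ** 2
    C3 = 0.5 * C2
    mx, my = x.mean(), y.mean()
    sx, sy = x.std(), y.std()
    sxy = ((x - mx) * (y - my)).mean()           # covariance
    lum = (2 * mx * my + C1) / (mx ** 2 + my ** 2 + C1)
    con = (2 * sx * sy + C2) / (sx ** 2 + sy ** 2 + C2)
    stru = (sxy + C3) / (sx * sy + C3)
    return lum * con * stru
```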
Both PSNR and SSIM are full reference evaluation indexes, so it is necessary to select an ideal image as the reference image. This paper uses the C-scan image covering the fusion region obtained with the same parameters as the reference image. The fusion region in the fused image is compared with the same region in the reference image to calculate the PSNR and SSIM. The PSNR and SSIM of the images fused directly and fused by the improved method are shown in Figure 16. The results show that the image distortion caused by the improved fusion method is lower.

5. Conclusions

In this paper, an image mosaic method for ultrasonic C-scan based on the SIFT algorithm is proposed. Four C-scan images of the testing block containing artificial defects are stitched into a qualified panorama image by this method. From the current study, the following conclusions can be drawn:
  • To improve the success rate of image registration, the feature point matching pairs obtained by the SIFT algorithm are screened by the difference in probe starting position between two C-scans. Then the matching pairs are further screened by RANSAC to obtain the final matching pairs used to calculate the transformation matrix. The success rate of feature point matching is greatly improved.
  • After registration, a dynamic programming method is used to search for the optimal seam line. The pixels on both sides of the seam line are fused by the fade-in and fade-out image fusion. In this way, the image error on both sides of the seam line is eliminated, and the pixels in the fusion region are transformed smoothly.
  • The panorama image is obtained by stitching four C-scan images three times. The experimental results show that the improved image mosaic method has higher accuracy of feature point matching and smaller image distortion, which is significantly better than the conventional SIFT algorithm.
The image mosaic method is suitable for the complete imaging of corrosion defects in plate-like structures. It is important to note that the use of the image mosaic method may have some technical limitations. One case where the method may not be applicable is when testing on a curved surface. In this case, the process of ultrasonic imaging may change the local features of C-scan images, which brings difficulty to image registration. The above-mentioned limitation will be addressed in future work by the authors.

Author Contributions

Conceptualization, D.C. and H.L.; methodology, D.C., Z.X. and H.L.; software, Z.X.; validation, Z.X.; formal analysis, Z.X. and H.L.; investigation, Z.X.; resources, D.C.; data curation, Z.X. and H.L.; writing—original draft preparation, Z.X.; writing—review and editing, D.C., Z.X. and H.L.; visualization, Z.X.; supervision, D.C.; project administration, D.C.; funding acquisition, D.C. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the National Natural Science Foundation of China (52375328) and the CGN-HIT Advanced Nuclear and New Energy Research Institute (CGN-HIT202310).

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

Data are contained within the article.

Acknowledgments

The authors are grateful to Qingsheng Li, Qiang Guo, Weigang Su and Tao Jia for providing the testing block and for helpful discussions on topics related to this work.

Conflicts of Interest

Author Haichun Liu was employed by the company PipeChina Engineering Quality Supervision and Inspection Company. The remaining authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.

Nomenclature

DOG: Difference of Gaussians
FAST: Features from accelerated segment test
FLANN: Fast library for approximate nearest neighbors
LBP: Local binary patterns
MSE: Mean square error
NDT: Non-destructive testing
PAUT: Phased array ultrasonic testing
PSNR: Peak signal-to-noise ratio
RANSAC: Random sample consensus
SAR: Synthetic aperture radar
SIFT: Scale-invariant feature transform
SSIM: Structural similarity index measure
SURF: Speeded-up robust features
E_color: Value difference of pixels
E_geometry: Structure difference of pixels
S_x: Operator template to calculate the gradient in the x direction
S_y: Operator template to calculate the gradient in the y direction
f_1(x, y) & f_2(x, y): Original images
loc_a & loc_b: Coordinate sets of feature points
μ_x & μ_y: Average brightness of an image
σ_x & σ_y: Standard deviation of an image
σ_xy: Covariance of two images
ω_1(x, y) & ω_2(x, y): Distribution weights of the pixel values
k: Error coefficient allowed in the screening of matching pairs
C(x, y): Contrast difference between images
E(x, y): Criterion value for searching for the optimal seam line
I(x, y): Gaussian-smoothed image
L(x, y): Brightness difference between images
S(x, y): Structural similarity difference between images
m(x, y): Gradient magnitude
θ(x, y): Gradient orientation

References

1. Abbas, A.; Adesina, A.Y.; Suleiman, R.K. Influence of organic acids and related organic compounds on corrosion behavior of stainless steel—A critical review. Metals 2023, 13, 1479.
2. Basdeki, M.; Apostolopoulos, C. Mechanical behavior evaluation of Tempcore and HYBRID reinforcing steel bars via a proposed fatigue damage index in long terms. Metals 2021, 11, 834.
3. Hussein Farh, H.M.; Ben Seghier, M.E.A.; Taiwo, R.; Zayed, T. Analysis and ranking of corrosion causes for water pipelines: A critical review. npj Clean Water 2023, 6, 65.
4. Di Sarno, L.; Majidian, A.; Karagiannakis, G. The effect of atmospheric corrosion on steel structures: A state-of-the-art and case-study. Buildings 2021, 11, 571.
5. Leila, S.; Grey, F.; Kast, T.Q.N.; Rackel, S.N. Corrosion protection of steel elements in façade systems—A review. J. Build. Eng. 2020, 32, 101759.
6. Sharma, S.; Mukherjee, A. Ultrasonic guided waves for monitoring corrosion in submerged plates. Struct. Control Health Monit. 2015, 22, 19–35.
7. McCrea, A.; Chamberlain, D.; Navon, R. Automated inspection and restoration of steel bridges—A critical review of methods and enabling technologies. Autom. Constr. 2002, 11, 351–373.
8. Doshvarpassand, S.; Wu, C.; Wang, X. An overview of corrosion defect characterization using active infrared thermography. Infrared Phys. Technol. 2019, 96, 366–389.
9. Jahanshahi, M.R.; Kelly, J.S.; Masri, S.F.; Sukhatme, G.S. A survey and evaluation of promising approaches for automatic image-based defect detection of bridge structures. Struct. Infrastruct. Eng. 2009, 5, 455–486.
10. Tai, J.L.; Grzejda, R.; Sultan, M.T.H.; Łukaszewicz, A.; Shahar, F.S.; Tarasiuk, W.; Rychlik, A. Experimental investigation on the corrosion detectability of A36 low carbon steel by the method of phased array corrosion mapping. Materials 2023, 16, 5297.
11. Laurent, L.B.; Grégoire, B.; Pascal, D. Corrosion detection and measurement improvement using advanced ultrasonic tools. In Proceedings of the 19th World Conference on Non-Destructive Testing, Munich, Germany, 13–17 June 2016.
12. To, T.T.; Dang, T.N. A new approach to corrosion mapping of fuel tank from collected images using phased array technology. In Proceedings of the 2019 International Conference on System Science and Engineering (ICSSE), Dong Hoi City, Vietnam, 20–21 July 2019.
13. Bose, A.L.; Caspar, K.L.; Adamu, M.Z.; Abid, Y. The current state on usage of image mosaic algorithms. Sci. Afr. 2022, 18, e01419.
14. Bhat, A.S.; Shivaprakash, A.V.; Prasad, N.S.; Nagaraj, C. Template matching technique for panoramic image stitching. In Proceedings of the 2013 7th Asia Modelling Symposium, Hong Kong, China, 23–25 July 2013.
15. Lowe, D.G. Object recognition from local scale-invariant features. In Proceedings of the Seventh IEEE International Conference on Computer Vision, Kerkyra, Greece, 20–27 September 1999; Volume 2, pp. 1150–1157.
16. Lowe, D.G. Distinctive image features from scale-invariant keypoints. Int. J. Comput. Vis. 2004, 60, 91–110.
17. Zhao, J.; Zhang, X.; Gao, C.; Qiu, X.; Tian, Y.; Zhu, Y.; Cao, W. Rapid mosaicking of unmanned aerial vehicle (UAV) images for crop growth monitoring using the SIFT algorithm. Remote Sens. 2019, 11, 1226.
18. Zhang, W. Combination of SIFT and Canny edge detection for registration between SAR and optical images. IEEE Geosci. Remote Sens. Lett. 2022, 19, 4007205.
19. Chen, X.; Liu, X.; Zhao, H.; Zhang, J.; Lu, W. Novel method for automatic three-dimensional stitching of microscopic images of MEMS microstructure. Chin. J. Mech. Eng. 2013, 49, 85–91.
20. Manandhar, P.; Jalil, A.; AlHashmi, K.; Marpu, P. Automatic generation of seamless mosaics using invariant features. Remote Sens. 2021, 13, 3094.
21. Paul, S.; Udaysankar, D.; Naidu, Y.; Reddy, Y. An efficient SIFT-based matching algorithm for optical remote sensing images. Remote Sens. Lett. 2022, 13, 1069–1079.
22. Huang, Z.; Chen, W.; Li, Z.; Xie, W. An improved SIFT algorithm for PCB defect detection. Radio Eng. 2023, 53, 1479–1486.
23. Zhao, Q.; Zhang, L.; Qian, Q. Reconstruction and inspection of the inner wall damage of industrial pipelines. Sci. Technol. Eng. 2021, 21, 10796–10805.
24. Qiu, H.; Peng, S. Adaptive threshold based SIFT image registration algorithm. In Proceedings of the 2nd International Conference on Optics and Image Processing (ICOIP), Dalian, China, 20–22 May 2022.
25. Teng, S.; Hossain, M.T.; Lu, G. Multimodal image registration technique based on improved local feature descriptors. J. Electron. Imaging 2015, 24, 013013.
26. Hossein-Nejad, Z.; Nasri, M. An adaptive image registration method based on SIFT features and RANSAC transform. Comput. Electr. Eng. 2017, 62, 524–537.
27. Hossein-Nejad, Z.; Nasri, M. Clustered redundant keypoint elimination method for image mosaicing using a new Gaussian-weighted blending algorithm. Vis. Comput. 2021, 38, 1991–2007.
28. Fang, X.; Pan, Z.; Xu, D. An improved algorithm for image mosaics. J. Comput. Aided Des. Comput. Graph. 2003, 15, 1362–1365, 1457–1458.
29. Wu, Q.; Li, H.; Meng, F.; Ngan, K.N. A perceptually weighted rank correlation indicator for objective image quality assessment. IEEE Trans. Image Process. 2018, 27, 2499–2513.
30. Huynh-Thu, Q.; Ghanbari, M. Scope of validity of PSNR in image/video quality assessment. Electron. Lett. 2008, 44, 800–801.
31. Wang, Z.; Bovik, A.C.; Sheikh, H.R.; Simoncelli, E.P. Image quality assessment: From error visibility to structural similarity. IEEE Trans. Image Process. 2004, 13, 600–612.
Figure 1. Gaussian pyramid and the generation of difference-of-Gaussian (DoG) images.
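Figure 1 illustrates the first stage of SIFT: each octave of the Gaussian pyramid is built by smoothing the image with progressively larger σ, and DoG images are obtained by subtracting adjacent levels. A minimal pure-NumPy sketch, assuming a separable Gaussian blur and a scale step of √2 between levels (illustrative choices, not the paper's implementation):

```python
import numpy as np

def gaussian_kernel(sigma):
    """1D normalized Gaussian sampled out to three standard deviations."""
    radius = int(3 * sigma)
    x = np.arange(-radius, radius + 1)
    k = np.exp(-x**2 / (2 * sigma**2))
    return k / k.sum()

def gaussian_blur(img, sigma):
    """Separable blur: filter rows, then columns ('same' keeps the size)."""
    k = gaussian_kernel(sigma)
    tmp = np.apply_along_axis(lambda r: np.convolve(r, k, mode="same"), 1, img)
    return np.apply_along_axis(lambda c: np.convolve(c, k, mode="same"), 0, tmp)

def dog_octave(img, sigma0=1.6, levels=5):
    """One octave: `levels` Gaussian images and `levels - 1` DoG images."""
    k = 2 ** 0.5  # scale step between adjacent levels
    gaussians = [gaussian_blur(img, sigma0 * k**i) for i in range(levels)]
    dogs = [g2 - g1 for g1, g2 in zip(gaussians, gaussians[1:])]
    return gaussians, dogs
```

Each DoG image approximates a scale-normalized Laplacian at one scale; keypoints are later taken as local extrema across adjacent DoG levels.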
Figure 2. Computing the gradient of each image sample point in a region around the keypoint location and the creation of an orientation histogram.
Figure 3. Flow chart of the image mosaic method for ultrasonic C-scan images.
Figure 4. Photo of the instrument used in this paper.
Figure 5. Photo of the testing block and the sectional view in the thickness direction.
Figure 6. C-scan images to be stitched. (a) C-scan image A; (b) C-scan image B; (c) C-scan image C; (d) C-scan image D.
Figure 7. Screening of the matching pairs.
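Figure 7 corresponds to the screening step described in the abstract: because the probe starting positions of the two scans are known, the expected offset between matched feature points is known a priori, and SIFT matches whose displacement deviates from that offset by more than the allowed error k can be rejected before estimating the transform. A hedged sketch of that filter (the Euclidean tolerance check is my reading of the method; loc_a and loc_b follow the nomenclature above):

```python
import numpy as np

def screen_matches(loc_a, loc_b, expected_offset, k):
    """Keep only matched pairs whose displacement loc_b - loc_a lies
    within the allowed error k of the expected probe-position offset."""
    loc_a = np.asarray(loc_a, dtype=float)
    loc_b = np.asarray(loc_b, dtype=float)
    displacement = loc_b - loc_a
    # Distance between the observed displacement and the expected offset.
    error = np.linalg.norm(displacement - np.asarray(expected_offset, dtype=float), axis=1)
    keep = error <= k
    return loc_a[keep], loc_b[keep]
```

Rejecting geometrically implausible pairs this way raises the matching accuracy reported in Table 1, since the homography is then estimated only from consistent correspondences.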
Figure 8. Results of feature point matching. (a) Matching pairs obtained by the conventional SIFT registration algorithm; (b) matching pairs obtained by the improved registration method.
Figure 9. Images stitched by different registration algorithms. (a) Fused image obtained by the conventional SIFT registration algorithm; (b) fused image obtained by the improved registration method.
Figure 10. Schematic diagram of searching for the optimal seam line.
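Figure 10 depicts the dynamic-programming search over the overlap region: each pixel extends the path from the cheapest of its three upper neighbours, accumulating the criterion E(x, y), and backtracking from the minimum in the last row yields the optimal seam. A minimal sketch, assuming E is a precomputed cost map such as E_color + E_geometry:

```python
import numpy as np

def optimal_seam(E):
    """E: 2D cost map over the overlap region, e.g. E = E_color + E_geometry.
    Returns one column index per row: the minimum-cost vertical seam."""
    h, w = E.shape
    cost = E.astype(float).copy()
    for y in range(1, h):
        for x in range(w):
            lo, hi = max(0, x - 1), min(w, x + 2)
            cost[y, x] += cost[y - 1, lo:hi].min()  # cheapest of the 3 upper neighbours
    # Backtrack from the minimum accumulated cost in the last row.
    seam = [int(np.argmin(cost[-1]))]
    for y in range(h - 2, -1, -1):
        x = seam[-1]
        lo = max(0, x - 1)
        seam.append(lo + int(np.argmin(cost[y, lo:min(w, x + 2)])))
    seam.reverse()
    return seam
```

Because the recurrence only ever looks one row up, the search is O(h·w) and the returned seam is guaranteed to be 8-connected, which keeps the subsequent blending band contiguous.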
Figure 11. The fused image improved by optimal seam searching. (a) The optimal seam line. (b) The fused image.
Figure 12. Fade-in and fade-out image fusion based on optimal seam searching.
Figure 13. The weights of fade-in and fade-out image fusion. (a) The weights of conventional fade-in and fade-out fusion. (b) The weights used in this paper.
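Figures 12 and 13 describe the fusion step: instead of ramping the weights across the whole overlap as in conventional fade-in and fade-out fusion, ω_1(x, y) and ω_2(x, y) vary linearly only inside a narrow band centred on the seam, with ω_1 + ω_2 = 1 everywhere. A sketch of that row-wise blend for aligned grayscale images (the band half-width is an assumed parameter, not a value from the paper):

```python
import numpy as np

def fuse_along_seam(f1, f2, seam, half_width=3):
    """Blend two aligned grayscale images f1, f2 row by row.
    Left of the transition band the result is f1, right of it f2;
    inside the band omega1 fades linearly from 1 to 0, omega2 = 1 - omega1."""
    out = np.zeros_like(f1, dtype=float)
    w = f1.shape[1]
    x = np.arange(w)
    for y, sx in enumerate(seam):
        omega1 = np.clip((sx + half_width - x) / (2 * half_width), 0.0, 1.0)
        out[y] = omega1 * f1[y] + (1.0 - omega1) * f2[y]
    return out
```

Confining the ramp to a band around the seam preserves defect amplitudes away from the junction while still hiding the intensity step at the seam itself.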
Figure 14. Comparison of two fusion methods. (a) Direct fusion. (b) The improved fusion method.
Figure 15. The panoramic image obtained by stitching. (a) The fused image; (b) the photo of the testing block flipped horizontally and overlaid on the fused image.
Figure 16. Evaluation of image fusion results. (a) PSNR. (b) SSIM.
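Figure 16 reports the two quality metrics used to evaluate the fused images: PSNR, computed from the mean squared error, and SSIM, built from the brightness, contrast and structure comparisons L, C and S via μ_x, μ_y, σ_x, σ_y and σ_xy [30,31]. For reference, a single-window sketch (global image statistics, rather than the locally windowed SSIM of Wang et al.; the stabilizing constants follow the common 0.01/0.03 convention):

```python
import numpy as np

def psnr(x, y, peak=255.0):
    """Peak signal-to-noise ratio in dB; infinite for identical images."""
    mse = np.mean((x.astype(float) - y.astype(float)) ** 2)
    return float("inf") if mse == 0 else 10.0 * np.log10(peak**2 / mse)

def ssim_global(x, y, peak=255.0):
    """SSIM from global statistics: mu (brightness), sigma (contrast),
    sigma_xy (structure)."""
    c1, c2 = (0.01 * peak) ** 2, (0.03 * peak) ** 2
    mu_x, mu_y = x.mean(), y.mean()
    var_x, var_y = x.var(), y.var()
    cov_xy = np.mean((x - mu_x) * (y - mu_y))  # sigma_xy
    return ((2 * mu_x * mu_y + c1) * (2 * cov_xy + c2)) / (
        (mu_x**2 + mu_y**2 + c1) * (var_x + var_y + c2))
```

PSNR is sensitive to any intensity shift, while SSIM is designed to track perceived structural change, which is why the paper reports both for the fused overlap regions.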
Table 1. Evaluation of the results of image registration.

| Stitching | Conventional SIFT: Matching Pairs | Conventional SIFT: Matching Accuracy | Improved Algorithm: Matching Pairs | Improved Algorithm: Matching Accuracy |
|---|---|---|---|---|
| First stitching | 16 | 18.75% | 16 | 68.75% |
| Second stitching | 16 | 31.25% | 16 | 62.50% |
| Third stitching | 16 | 12.50% | 11 | 72.73% |

Chi, D.; Xu, Z.; Liu, H. Detection and Imaging of Corrosion Defects in Steel Structures Based on Ultrasonic Digital Image Processing. Metals 2024, 14, 390. https://doi.org/10.3390/met14040390
