Article

A Novel High Recognition Rate Defect Inspection Method for Carbon Fiber Plain-Woven Prepreg Based on Image Texture Feature Compression

Key Laboratory for Precision and Non-Traditional Machining Technology of Ministry of Education, School of Mechanical Engineering, Dalian University of Technology, Dalian 116024, China
*
Author to whom correspondence should be addressed.
Polymers 2022, 14(9), 1855; https://doi.org/10.3390/polym14091855
Submission received: 11 April 2022 / Revised: 28 April 2022 / Accepted: 29 April 2022 / Published: 30 April 2022
(This article belongs to the Section Polymer Fibers)

Abstract

Carbon fiber plain-woven prepreg is one of the basic materials in the field of composite material design and manufacturing, in which defect identification is an important and easily neglected part of testing. Here, a novel high recognition rate inspection method for carbon fiber plain-woven prepreg is proposed for inspecting bubble and wrinkle defects based on image texture feature compression. The proposed method divides the image into non-overlapping block lattices as texture primitives and compresses them into a binary feature matrix. Texture features are extracted using a gray level co-occurrence matrix. The defect types are further defined according to these texture features by k-means clustering. The performance is evaluated against several existing computer vision and machine learning methods based on fiber recognition. An overall recognition rate of 0.944 is achieved, which is competitive with the state of the art.

1. Introduction

Carbon fiber reinforced plastics are widely used in aerospace due to their good fatigue and corrosion resistance coupled with high strength-to-weight and stiffness-to-weight ratios [1]. The predominant material when it comes to composites in structural aircraft components is pre-impregnated carbon fibers [2]. The preparation process of composites means that it is easy to produce bubble and wrinkle defects, and timely inspection of these defects can help improve both the performance of composites and their subsequent service performance.
The methods for examining wrinkle and bubble defects in carbon fiber prepreg differ from the common non-destructive testing methods for composite materials and are closer to surface texture inspection methods. Unidirectional prepreg and woven prepreg are the most popular raw materials for fabricating composite materials. Unidirectional prepreg is the most commonly utilized thanks to automated manufacturing processes such as automated tape laying and automated fiber placement. With improvements in the degree of automation, defect inspection technologies for unidirectional prepreg have gained popularity as well; these include ultrasonic testing [3], radiographic testing [4], and thermal imaging testing [5]. The structure of woven prepreg is more complicated than that of unidirectional prepreg, and quality inspection of its preparation process is correspondingly more difficult. At present, defect inspection for carbon fiber woven prepreg relies heavily on manual inspection, which suffers from low detection efficiency and dependence on the experience of the inspector. Moreover, owing to the complex structure of carbon fiber woven prepreg, when inspection technologies developed for unidirectional prepreg are applied to woven prepreg, their recognition rates cannot meet the demands of the task. It is therefore particularly important to propose a new defect inspection method with a high recognition rate for carbon fiber woven prepreg, in particular to allow online inspection during future automated manufacturing processes.
From a woven fabric texture feature point of view, there are typically four classes of methods [6,7]: structural analysis (SA) [8,9], spectral methods [10,11,12], model-based methods [13,14,15], and statistics-based classification approaches [16,17,18]. Among these, SA assumes that the surface texture is generated by following a placement rule and that defect-free regions and defects are composed of overlapped and non-overlapped texture primitives, respectively. Therefore, SA is one of the most suitable methods for defect inspection of woven prepreg; its core texture primitives are defined differently in the literature. Bodnarova et al. [10] introduced the definition of texture primitives as texture blobs surrounded by rectangular regions forming an overlapping binarized grid. Ng tested the performance of the valley-emphasis method on common defect detection applications [19]. Jia and Liang [20] proposed a texture blob location-based method which does not directly assume conformance to a rigid grid, instead inferring the placement rule dynamically. As stated in [6,7], the texture pattern is required to be regular; thus, the locations of defects can be identified through structural analysis.
From the perspective of training strategy, woven prepreg inspection methods are divided into two classes based on either supervised or unsupervised training. Because woven prepregs may exhibit unpredictable defect forms in different production processes, manufacturing inspection methods differ from traditional supervised machine learning methods; unsupervised examples include image decomposition (ID) [21], motif-based (MB) methods [22], Bollinger bands (BB) [23], regular bands (RB) [24], Elo rating (ER) methods [25], and wavelet-pre-processed golden image subtraction (WGIS) [26]. The ID method inspects defects by integrating image decomposition, which makes it possible to remove repeated texture primitives completely and segment defects directly [27,28,29]. Another special method, MB [22], is rarely mentioned in the literature on plain-woven fabric inspection; its distinguishing characteristic is the preprocessing step of lattice segmentation. The lattice segmentation method [30] divides a plain-woven fabric image into non-overlapping block lattices which consist of the same textures. The block lattice is similar to the texture primitives in SA, and the lattice segmentation method derives a unified placement rule for images of the same woven fabric patterns, which share roughly the same texture across segmented block lattices. The two categories (defect-free and defect) generalize the categorization of motif-based and non-motif-based methods.
The vision testing and lattice segmentation methods are exploited here to develop a novel plain-woven fabric image analysis method based on texture feature compression, which is proposed for synthetically analyzing the texture patterns of carbon fiber plain-woven prepreg and classifying them into defect-free, bubble defect, and wrinkle defect classes. Texture features are extracted using a gray level co-occurrence matrix (GLCM). The types of defects are further defined according to their texture features. Finally, k-means clustering (KMC) is used to confirm the types of defects and thereby assess the quality of the prepreg for effective processing. This method can accurately extract the texture feature information of carbon fiber plain-woven prepreg, making inspection results more reliable and realistic.

2. Methodology

2.1. Image Preprocessing

Prepreg images acquired via digital cameras are embedded with errors such as noise, fickle shadows, and illumination changes; these can resemble defects caused by the manufacturing process and degrade image quality. Image preprocessing can therefore dampen the adverse effects of such errors, which is vital for feature extraction [31,32]. Three kinds of preprocessing methods were used here to enhance the captured images and to determine crossover points or floats (the technical details of these calculations can be found in Appendix A). The main preprocessing methods are as follows.
Step 1. Histogram equalization: The plain-woven texture of the images was equalized by cumulative distribution function (CDF), which is helpful for enhancing their contrast.
Step 2. Gray-level morphology: The bottom-hat image was subtracted from the sum of the original and top-hat images in order to maximize the contrast between the objects and the gaps and distinguish them from each other.
Step 3. Steerable filters: Image edges were obtained by convolving the image with templates generated in different directions. Then, a Hough transform was used to strengthen the weft yarn in the image, and the image was corrected according to the inclination angle. Meanwhile, performance evaluation metrics were utilized to compare the results of the vertical and horizontal filters.

2.2. Image Compression

The woven pattern of carbon fiber plain-woven prepreg has different describable symmetry features depending on the warp and weft yarns. Based on this symmetrical woven pattern, a new image compression algorithm which can remove noise data, retain texture features, and compress and simplify the original image data is proposed here. At the same time, the texture primitive of plain-woven patterns can be extracted from defect-free images. The texture primitive is a unit that can form a whole pattern via simple translation without rotation [33]. In addition, it is a marker for recognizing the woven pattern of carbon fiber plain-woven prepreg.
As shown in Figure 1, taking a defect-free carbon fiber plain-woven prepreg as an example, the image compression process here was carried out as follows.
Step 1. Preparation: The original matrix Oorg can be obtained from a preprocessed image, and the pixel width w and pixel height h can be determined as well. In the process of capturing the surface image of each prepreg, the focal length, field of view, and working distance of the camera were fixed, and the morphological structure of the prepreg did not change. Therefore, the numbers of warps and wefts in each image could be determined; they are defined here as nx and ny, respectively.
$$O_{org} = \begin{bmatrix} o_{11} & \cdots & o_{1j} \\ \vdots & \ddots & \vdots \\ o_{k1} & \cdots & o_{kj} \end{bmatrix}, \quad j = 1,2,\ldots,w; \quad k = 1,2,\ldots,h \tag{1}$$
Step 2. Segment horizontally: the original matrix, Oorg, is segmented horizontally according to the number of wefts, ny. As a result, ny weft-block images and the corresponding ny weft-block matrices can be obtained, as shown in Equation (2). The number of rows and columns in each matrix is respectively rh and w, where rh = h/ny and x represents the gray value in the weft-block image. Then, the weft-block and original matrices can be expressed as shown in Equations (3) and (4):
$$x_j^i = \begin{bmatrix} x_{1j}^i & \cdots & x_{kj}^i \end{bmatrix}^T, \quad i = 1,2,\ldots,n_y; \quad j = 1,2,\ldots,w; \quad k = 1,2,\ldots,r_h \tag{2}$$
$$W^i = \begin{bmatrix} x_1^i & \cdots & x_j^i \end{bmatrix} = \begin{bmatrix} x_{11}^i & \cdots & x_{1j}^i \\ \vdots & \ddots & \vdots \\ x_{k1}^i & \cdots & x_{kj}^i \end{bmatrix} \tag{3}$$
$$O_{org} = \begin{bmatrix} W^1, W^2, \ldots, W^i \end{bmatrix}^T \tag{4}$$
Step 3. Compute the binary threshold value, t̄: the elements (gray values) of each column in the matrix Wi are summed and then multiplied by a correction coefficient, λ, as shown in Equation (5). The weft-block matrix Wi thus becomes a new matrix, Vi, as shown in Equation (6):
$$v_j^i = \lambda \times \sum_{k=1}^{r_h} x_{kj}^i, \quad i = 1,2,\ldots,n_y; \quad j = 1,2,\ldots,w \tag{5}$$
$$V^i = \begin{bmatrix} v_1^i, v_2^i, \ldots, v_j^i \end{bmatrix} \tag{6}$$
Afterwards, the element sequence of Vi is used as the x-coordinate and the element value as the y-coordinate to draw the grayscale transformation graph; in other words, the elements of Vi are plotted as grayscale values against pixel position. A sub-threshold value, ti, can be easily read off the grayscale transformation graph. The binary threshold value, t̄, is then obtained by averaging the sub-thresholds of all the Vi, as expressed in Equation (7):
$$\bar{t} = \frac{1}{n_y}\sum_{i=1}^{n_y} t_i, \quad i = 1,2,\ldots,n_y \tag{7}$$
Step 4. Binary operation: the binary threshold, t̄, is used to divide the element values of each column vector x_j^i into binary data (0 or α) in matrix Wi, as shown in Equation (8). Thus, the original matrix, Oorg, and the weft-block matrix Wi become the binary matrices Obnr and Wibnr, respectively.
$$x_j^i = \begin{cases} 0, & v_j^i < \bar{t} \\ \alpha, & v_j^i \geq \bar{t} \end{cases} \tag{8}$$
Generally, the pixel values 0 and 255 represent black and white in a grayscale image. For a good data visualization effect, the pixel value α is usually defined as 255.
Step 5. Segment vertically: the binary matrix Wibnr is segmented vertically according to the number of warps, nx. As a result, nx crossing-block images and the corresponding nx crossing-block matrices Cil in each Wibnr can be obtained, as shown in Equation (9), with Cil being a binary matrix. The number of rows and columns of each matrix Cil is rh and rw, respectively, where rh = h/ny and rw = w/nx. Then, the binary weft-block and binary original matrices can be expressed as shown in Equations (10) and (11):
$$C_i^l = \begin{bmatrix} x_1^{il} & \cdots & x_j^{il} \end{bmatrix} = \begin{bmatrix} x_{11}^{il} & \cdots & x_{1j}^{il} \\ \vdots & \ddots & \vdots \\ x_{k1}^{il} & \cdots & x_{kj}^{il} \end{bmatrix}, \quad l = 1,2,\ldots,n_x; \quad j = 1,2,\ldots,r_w; \quad k = 1,2,\ldots,r_h \tag{9}$$
$$W_{bnr}^i = \begin{bmatrix} C_i^1 & \cdots & C_i^l \end{bmatrix} \tag{10}$$
$$O_{bnr} = \begin{bmatrix} W_{bnr}^1, W_{bnr}^2, \ldots, W_{bnr}^i \end{bmatrix}^T \tag{11}$$
Step 6. Black/White classification: the proportions of black and white elements are calculated for each crossing-block matrix Cil; the minority color is replaced by the majority color, so that in terms of a grayscale image each matrix Cil constitutes either a black block or a white block. A new matrix, Ocls, is obtained by updating the element values of Obnr.
Step 7. Extract the output matrix F: Convolution operations are performed between Ocls and a convolution kernel K. The kernel K is an rw × rh matrix in which all elements are equal to ζ, calculated by Equation (12); the horizontal step is rw and the vertical step is rh. The output matrix F is then obtained from Equation (13):
$$\zeta = (r_w \times r_h \times \alpha)^{-1} \tag{12}$$
$$F = O_{cls} \otimes K \tag{13}$$
where ⊗ is the convolution operator.
As a result, the output matrix F is obtained, in which the number of rows equals the number of weft yarns and the number of columns equals the number of warp yarns; it is a binary matrix whose elements are either zero or one. Thus, the texture feature of a defect-free carbon fiber plain-woven prepreg is clearly mapped by matrix F with the least amount of data.
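As an illustration, Steps 1-7 above can be sketched in Python with numpy. The function name is hypothetical, the graphically derived sub-threshold t_i is replaced by a simple per-band mean, and the black/white classification is reduced to a majority vote; this is a simplified sketch of the described procedure, not the authors' exact implementation.

```python
import numpy as np

def compress_to_pattern(img, n_x, n_y):
    """Sketch of Steps 1-7: compress a preprocessed grayscale prepreg
    image into an n_y x n_x binary output matrix F, one bit per
    warp/weft crossing. The band mean stands in for the graphically
    selected sub-threshold t_i."""
    h, w = img.shape
    r_h, r_w = h // n_y, w // n_x
    img = img[: r_h * n_y, : r_w * n_x].astype(float)  # drop remainder pixels
    F = np.zeros((n_y, n_x), dtype=int)
    for i in range(n_y):                          # Step 2: weft-block W^i
        band = img[i * r_h : (i + 1) * r_h]
        v = band.sum(axis=0)                      # Step 3: column sums V^i
        binary = np.where(v >= v.mean(), 255, 0)  # Step 4: binarize columns
        for l in range(n_x):                      # Step 5: crossing block C_i^l
            block = binary[l * r_w : (l + 1) * r_w]
            F[i, l] = int(block.mean() >= 128)    # Steps 6-7: majority vote -> 0/1
    return F
```

For a plain weave, F comes out as an alternating 0/1 checkerboard, which serves as the defect-free template for the subsequent texture feature extraction.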

2.3. Texture Feature Extraction

The GLCM is a texture feature extraction method based on gray-level spatial dependence. Each element in the matrix represents the number of occurrences of a grayscale combination. Assuming that f(x, y) is a two-dimensional digital image with a size of M × N and a gray level of h, the GLCM for a given pixel spacing and spatial relationship is expressed as Equation (14) [34]:
$$P(i,j) = \#\left\{ \left( (x_1,y_1), (x_2,y_2) \right) \in M \times N \mid f(x_1,y_1) = i,\ f(x_2,y_2) = j \right\} \tag{14}$$
where # denotes the number of elements in the set, d is the distance between (x1, y1) and (x2, y2), and θ is the angle between the vector and the axis of the coordinate.
In general, d is chosen between 1 and 8, while θ takes values of 0°, 45°, 90°, and 135°. According to the output matrix F provided by the texture feature compression algorithm, only one kind of texture feature of the weft and warp yarns can be extracted when the angle θ is 45° or 135°, as shown in Figure 2. Thus, θ values of 45° and 135° can be selected in order to improve the contrast of the feature data, thereby facilitating the discrimination of defect and defect-free images. The co-occurrence matrices for the two conditions are tabulated in Table 1.
Therefore, texture features can be represented by the values of contrast and homogeneity along with the angular second moment [35,36]. These three secondary statistics can reflect the textural features of carbon fiber plain-woven prepreg.
The schematic for defect-free fiber texture feature extraction is shown in Figure 2; the different colors of the square sections represent the different values of different textural features. The value of the white section is 1, the value of the black section is 0, and the values of the dark gray and light gray sections are equal to the calculation results. Taking the defect-free image as an example, whether θ is 45° or 135°, the size of the co-occurrence matrix is 2 × 2 and one diagonal value is 0. Therefore, the contrast, homogeneity, and angular second moment can be easily obtained using basic arithmetic (the specific operations are shown in Appendix A).
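To make the computation concrete, here is a minimal numpy sketch of a two-level GLCM over the compressed matrix F together with the three secondary statistics. The function name and the (row, column) offset convention are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def glcm_features(F, dr, dc, levels=2):
    """Build the gray level co-occurrence matrix for offset (dr, dc)
    over a small integer-valued matrix F, normalize it, and return
    (contrast, homogeneity, angular second moment)."""
    P = np.zeros((levels, levels))
    rows, cols = F.shape
    for r in range(rows):
        for c in range(cols):
            r2, c2 = r + dr, c + dc
            if 0 <= r2 < rows and 0 <= c2 < cols:
                P[F[r, c], F[r2, c2]] += 1      # count the gray-level pair
    M = P / P.sum()                             # normalized co-occurrence matrix
    i, j = np.indices(M.shape)
    contrast = float(np.sum((i - j) ** 2 * M))
    homogeneity = float(np.sum(M / (1 + (i - j) ** 2)))
    asm = float(np.sum(M ** 2))
    return contrast, homogeneity, asm
```

For the alternating plain-weave template, a 45° offset such as (dr, dc) = (-1, 1) pairs crossings of the same value, so all mass stays on the diagonal (contrast 0, homogeneity 1), while a horizontal offset pairs opposite values and pushes all mass off the diagonal.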

2.4. Defect Inspection

Bubble and wrinkle defects are two typical kinds of defects arising in prepreg during the laying-up process. The KMC algorithm was adopted to inspect the textural features in the defect-free, wrinkle defect, and bubble defect cases. By calculating the clustering center for each condition, the KMC algorithm looks for the best way of grouping images with a minimal value of the mean similarity [37,38]. The accuracy of the KMC algorithm can be improved and verified using training and validation sets with images for the three conditions. Moreover, the KMC algorithm can consolidate the defined defect-type partition [39].
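The grouping step can be sketched with a bare-bones Lloyd-style k-means in numpy; this is a generic stand-in for the KMC algorithm, not the authors' implementation, and the random or user-supplied seeding is a simplifying assumption.

```python
import numpy as np

def kmeans(X, k, iters=50, init=None, seed=0):
    """Minimal k-means: assign each feature vector to its nearest
    center, then move each center to the mean of its members."""
    rng = np.random.default_rng(seed)
    if init is None:
        centers = X[rng.choice(len(X), size=k, replace=False)].astype(float)
    else:
        centers = np.asarray(init, dtype=float).copy()
    for _ in range(iters):
        # squared Euclidean distance of every point to every center
        d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
        labels = d2.argmin(axis=1)          # nearest-center assignment
        for j in range(k):
            if np.any(labels == j):         # guard against empty clusters
                centers[j] = X[labels == j].mean(axis=0)
    return labels, centers
```

With the three-element (contrast, homogeneity, angular second moment) arrays as input and k = 3, the clusters correspond to the defect-free, bubble, and wrinkle groups.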

3. Materials and Experiments

Carbon fiber plain-woven prepreg (WP-3011, Guangwei Composite Material Co., Ltd., Weihai, China) was used for this study. The prepreg was laid on the mold with the assistance of a robotic arm (KRC4, KUKA Robotics Co., Ltd., Shanghai, China). Then, images were captured using a digital camera (MV-CH050-10UM, HIKROBOT Co., Ltd., Hangzhou, China). The camera captured images of other areas by coordinated movement and stored them on the computer. The computer configuration used was a Windows 7 64-bit operating system and an Intel Core i7-8565U (1.8 GHz) CPU with 8 GB of RAM. The process of defect inspection for carbon fiber plain-woven prepreg is shown in Figure 3. The working distance of the camera was 450 mm. Defect sizes were no more than 150 mm × 180 mm and not smaller than the size of the texture primitive. Representative original images of the defect-free, bubble defect, and wrinkle defect samples are shown in Figure 4.
As a result, 1200 prepreg surface images (200 defect-free images, 500 images with bubbles, and 500 images with wrinkles) were collected as the total dataset and used for improving the proposed plain-woven prepreg defect automatic inspection method.
Python 3.7 and OpenCV module were used for the development of the proposed defect automatic inspection method; a flowchart is shown in Figure 5. The enhanced grayscale images of the prepreg were obtained using three preprocessing algorithms. A texture feature compression algorithm was used to compress and simplify the preprocessed images. Then, the texture features of the compressed output matrix F, consisting of contrast, homogeneity, and the angular second moment, were extracted by the GLCM algorithm (1-pixel and 45°, 135° directions). The mean value of two directions was utilized as the texture feature. These three features formed a three-element array as the input of the KMC algorithm. In the end, the results of clustering recognition were obtained.

4. Results and Discussion

As shown in Figure 6, a series of preprocessed defect-free images were used to illustrate the image preprocessing steps. The original grayscale image is shown in Figure 6a. To enhance the contrast in the original grayscale image, the plain-woven texture of the image was equalized by CDF, as shown in Figure 6b. Then, the histogram-equalized image was filtered through a box filtering function to reduce noise, as shown in Figure 6c. After that, a top-hat transform (as shown in Figure 6d) and bottom-hat transform (as shown in Figure 6e) were applied to the filtered image to obtain the gray-level morphology, as shown in Figure 6f. The gray-level morphology algorithm allowed us to minimize the effects of tensile fiber on fiber texture extraction. Finally, a steerable filtering algorithm, which included a horizontal steerable filter (as shown in Figure 6g) and a vertical steerable filter (as shown in Figure 6h), was used to obtain clear edges for each yarn, with the templates generated in different directions by convolution. Based on this comparison, the vertical steerable image was selected for Gaussian filtering to complete the whole image preprocessing step, as shown in Figure 6i.
The comparisons between the original grayscale and histogram-equalized images and their histograms under the three conditions (defect-free, bubble defect, and wrinkle defect) are shown in Figure 7. Comparing the images, it can easily be seen that histogram equalization evens out the uneven brightness of the original grayscale image caused by the reflective resin on the prepreg surface and improves the clarity of the plain-woven texture. Comparing the histograms, it can be seen that a uniform distribution of gray levels was obtained, which is helpful for the extraction of texture features.
The gray-level morphological operation processing is shown in Figure 8. An area of 10 (warp yarns) × 10 (weft yarns) was intercepted for image observation. The weave points of warp and weft yarns in the image were regarded as mountains and valleys. The contrast was improved by minimizing the number of valleys. The top-hat and bottom-hat images contained the mountains and the valleys of the yarns, respectively. Thus, gray-level morphological operation processing was carried out by adding the top-hat operated image to the original grayscale image and then subtracting the bottom-hat operated image from the sum. Following this gray-level morphology process, the gray-level morphological image was obtained, which was similar to the ideal image.
A steerable filter was adopted to filter the gray-level morphological image in order to emphasize the borders of warp and weft yarns in the final preprocessing step. The effects of vertical filtering and horizontal filtering are compared in Figure 9. The performance evaluation metrics for the vertical and horizontal steerable filtered images are shown in Table 2. From Table 2, it can be seen that (i) the ACC values of the vertical steerable filter are much higher than those of the horizontal steerable filter; (ii) the TPR values of the horizontal steerable filter in the full-size image are very low, specifically, 41.64% below the maximum value; and (iii) the FPR values of the vertical steerable filters are much lower than those of the horizontal steerable filters. Therefore, the vertical steerable filtered images were selected to improve recognition accuracy before the final preprocessing step.
An image compression algorithm was proposed to compress and simplify the preprocessed image data while retaining the texture features for further analysis; the image compression process is shown in Figure 10. A preprocessed image (shown in Figure 10a) was segmented horizontally (shown in Figure 10b) based on the number of weft yarns, then used to compute the binary threshold value for threshold classification (shown in Figure 10c). The binary sub-image arrays were then segmented vertically based on the number of warp yarns to obtain crossing-block image arrays as the preliminary texture pattern, which are shown in Figure 10d. Then, a dichotomy operation was carried out for the Black/White classification process, as shown in Figure 10e. After that, the crossing-block image arrays were merged to obtain the final texture pattern, as shown in Figure 10f. Using a convolution operation, the final texture pattern was converted into the output matrix, F (obtained by Python and shown in Figure 10g), which was used as a standard defect-free template for further analysis. Following the same image compression process, the texture patterns of the original images of carbon fiber plain-woven prepreg with and without defects (corresponding to Figure 4) were obtained, and are shown in Figure 11.
Texture features were extracted based on the GLCM method. The mean values of the texture features are listed in Table 3. Three texture feature parameters, namely, the contrast and homogeneity along with the angular second moment, made up a three-dimensional array which was used to present the texture features.
The KMC algorithm classified 180 samples by three conditions, defect-free (60 samples), with bubble defect (60 samples), and with wrinkle defect (60 samples), by recognizing different representative defect-free, bubble, and wrinkle texture features of carbon fiber plain-woven prepreg samples randomly selected from the 1200 total samples. The results were plotted as a three-dimensional scatter plot, shown in Figure 12. Each texture feature of these three conditions has distinguishable differences, which are defined according to their locations in the plot.
Moreover, a total of 1000 samples under the same three conditions (defect-free, bubble, and wrinkle) with proportions of 2:3:5, 2:4:4, and 2:5:3, respectively, were used to verify the recognition rate. The plots for each proportion are shown in Figure 13. The initial and final center points of the clustering results are shown in Table 4 and Table 5, respectively. The overall recognition accuracy (IE) listed in Table 6 is 94.41%, where TN, TB, and TW are the numbers of defect-free, bubble, and wrinkle samples, RN, RB, and RW are the numbers of correctly recognized samples, IN, IB, and IW are the per-class recognition accuracies, and IM is their average.
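The bookkeeping behind the per-class accuracies and their average can be expressed as a small helper; the function name and the sample counts in the test below are illustrative values, not the paper's data.

```python
def recognition_rates(totals, correct):
    """Per-class recognition accuracies (I_N, I_B, I_W) and their
    mean (I_M), given the total and correctly recognized sample
    counts for each class."""
    per_class = {k: correct[k] / totals[k] for k in totals}
    mean = sum(per_class.values()) / len(per_class)
    return per_class, mean
```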
When texture feature extraction and defect type inspection were performed only by GLCM and KMC, the recognition accuracy was 91.96%, as shown in Table 7. At the same time, the image processing time was longer than that of the proposed method, as shown in Table 8. In contrast with general image preprocessing and the GLCM algorithm alone, this paper further refines the processing of texture features. Therefore, the recognition accuracy can be significantly improved for different proportions, as shown in Figure 14. Furthermore, the proposed method was compared with several existing computer vision and machine learning methods based on fiber recognition, with the results shown in Figure 15. Despite the different datasets employed by these methods, their final purpose is ultimately to effectively recognize fiber morphologies and defect types. The proposed method can describe the morphological characteristics of defects in carbon fiber prepreg more comprehensively and more simply, and shows a remarkable improvement in recognition accuracy; the results show that the proposed methodology is competitive with or better than these traditional techniques. The three kinds of surface morphologies (defect-free, bubble, and wrinkle) in carbon fiber prepreg were all recognized effectively.

5. Conclusions

Here, we have proposed a novel defect inspection method to improve the efficiency of automatic online detection of defects in carbon fiber plain-woven prepregs, which can help in developing an automatic laying-up process for their production. The proposed method compresses a plain-woven prepreg image while preserving its texture features, which are calculated using GLCM. These features can be regarded as a three-dimensional array and used as the input of the KMC algorithm, which is then applied to define the defect types and realize defect inspection. It should be noted that when the performance of the proposed method was compared to several different inspection methods, it showed significant improvement; the recognition rate and processing time of the proposed method reached 94.41% and 136 ms, respectively. Moreover, the proposed method could be used for surface defect identification in other woven patterns, such as twill and satin.

Author Contributions

Conceptualization, L.L. and J.Q.; methodology, L.L. and S.X.; software, L.L.; validation, S.X. and Y.W.; formal analysis, L.L.; investigation, J.Q.; resources, H.G.; data curation, L.L.; writing—original draft preparation, L.L.; writing—review and editing, L.L. and S.X.; visualization, L.L.; supervision, S.X.; project administration, Y.W. and H.G.; funding acquisition, Y.W. and H.G. All authors have read and agreed to the published version of the manuscript.

Funding

This research was supported by the National Natural Science Foundation of China (NSFC)-Liaoning Joint Funding [grant number U1708256] and the Fundamental Research Funds for the Central Universities [grant number DUT21LAB108]. The authors would like to acknowledge the above financial support.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

The data presented in this study are available on request from the corresponding author.

Conflicts of Interest

The authors declare no conflict of interest.

Nomenclature

Oorg: original image matrix
w: image width (px)
h: image height (px)
nx: number of warps
ny: number of wefts
Wi: weft-block matrix
t̄: binary threshold value
Obnr: binary matrix
Cil: crossing-block matrix
F: output matrix
Mij: co-occurrence matrix
C: contrast
H: homogeneity
E: angular second moment

Appendix A

Recall that the prepreg images captured in the process of automatic placement need to be enhanced and analyzed using the following methods: (1) histogram equalization; (2) gray-level morphology; (3) steerable filters; (4) performance evaluation metrics; (5) gray level co-occurrence matrix. The specific implementations of these methods are as follows.
(1)
Histogram equalization: Firstly, the global image is equalized directly. Suppose that the histogram distribution of the captured image A is HA(D). The monotone nonlinear mapping is used to change image A into image B, that is, function transformation f is applied to each pixel in image A, and the histogram of image B is obtained as HB(D).
The whole process can be understood as changing all the DA in image A into DB as follows:
$$\int_{D_A}^{D_A+\Delta D_A} H_A(D)\,dD = \int_{D_B}^{D_B+\Delta D_B} H_B(D)\,dD \tag{A1}$$
In order to achieve histogram equalization, the following holds in particular:
$$\int_{0}^{D_A} H_A(D)\,dD = \int_{0}^{D_B} H_B(D)\,dD \tag{A2}$$
Because the target histogram is uniformly distributed, we have HB(D) = A0/L, where A0 is the number of pixels and L is the grayscale depth (usually 256). We then obtain:
$$\int_{0}^{D_A} H_A(D)\,dD = \frac{A_0 D_B}{L} = \frac{A_0 f(D_A)}{L} \tag{A3}$$
Therefore, let us define f:
$$f(D_A) = \frac{L}{A_0} \int_{0}^{D_A} H_A(D)\,dD \tag{A4}$$
The discrete form is:
$$f(D_A) = \frac{L}{A_0} \sum_{u=0}^{D_A} H_A(u) \tag{A5}$$
Then the histogram distribution in the local area window is used to construct the mapping function to equalize the local image area.
Finally, in order to avoid discontinuities and excessive enhancement of the image, interpolation is used to accelerate the histogram equalization. The bilinear interpolation formula is as follows:
$$f(D) = (1-\Delta y)\left[(1-\Delta x)\, f_{ul}(D) + \Delta x\, f_{bl}(D)\right] + \Delta y \left[(1-\Delta x)\, f_{ur}(D) + \Delta x\, f_{br}(D)\right] \tag{A6}$$
where ful(D), fbl(D), fur(D), and fbr(D) are the mapping values of the histogram CDFs of the four adjacent windows at the central pixel, and Δx and Δy are the ratios of the distance between the central pixel and the window pixel to the window size.
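The discrete mapping of Equation (A5) can be sketched directly in numpy as a lookup table built from the cumulative histogram; the function name is an assumption, and rounding/clipping conventions vary between implementations.

```python
import numpy as np

def equalize(img, levels=256):
    """Global histogram equalization: f(D_A) = (L / A0) * sum_u H_A(u),
    implemented as a lookup table over the cumulative histogram."""
    hist = np.bincount(img.ravel(), minlength=levels)  # H_A(u)
    cdf = np.cumsum(hist)                              # running sum of H_A
    lut = np.clip(np.round(levels * cdf / img.size), 0, levels - 1)
    return lut.astype(np.uint8)[img]                   # apply mapping per pixel
```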
(2)
Gray-level morphology: The operation object of gray-level mathematical morphology is not a set, but an image function. The dilation and erosion operations of the input image f(x, y) with structure element b(x, y) are respectively defined as:
$$(f \oplus b)(s,t) = \max\left\{ f(s-x, t-y) + b(x,y) \mid (s-x, t-y) \in D_f;\ (x,y) \in D_b \right\} \tag{A7}$$
$$(f \ominus b)(s,t) = \min\left\{ f(s+x, t+y) - b(x,y) \mid (s+x, t+y) \in D_f;\ (x,y) \in D_b \right\} \tag{A8}$$
Opening and closing operations are respectively defined as:
$$f_{opn} = (f \ominus b) \oplus b \tag{A9}$$
$$f_{cls} = (f \oplus b) \ominus b \tag{A10}$$
The operations of top-hat and bottom-hat are respectively defined as:
$$f_{top} = f - f_{opn} \tag{A11}$$
$$f_{btm} = f_{cls} - f \tag{A12}$$
Therefore, we obtain the final gray-level morphology preprocessing formula:
$$f_{final} = f + f_{top} - f_{btm} \tag{A13}$$
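A compact numpy sketch of Equations (A7)-(A13) for the special case of a flat (all-zero) structuring element, where dilation and erosion reduce to moving max and min filters; the helper names and the edge-padding choice are assumptions of this sketch.

```python
import numpy as np
from numpy.lib.stride_tricks import sliding_window_view

def _rank_filter(img, k, fn):
    """Apply max (flat dilation) or min (flat erosion) over a k x k window."""
    pad = k // 2
    p = np.pad(img, pad, mode="edge")
    return fn(sliding_window_view(p, (k, k)), axis=(-2, -1))

def morph_enhance(img, k=3):
    """f_final = f + f_top - f_btm with a flat k x k structuring element."""
    img = img.astype(float)
    opening = _rank_filter(_rank_filter(img, k, np.min), k, np.max)  # (A9)
    closing = _rank_filter(_rank_filter(img, k, np.max), k, np.min)  # (A10)
    f_top = img - opening                                            # (A11)
    f_btm = closing - img                                            # (A12)
    return img + f_top - f_btm                                       # (A13)
```

Narrow bright peaks (the "mountains" of the weave points) are amplified and narrow dark valleys are pushed down, which is the contrast behavior described in Section 4.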
(3)
Steerable filters: Considering the woven pattern, steerable vertical filters based on the following basic 2-D Gaussian function are used in this paper.
f(x, y) = \frac{1}{2\pi \sigma_x \sigma_y} \exp\left[ -\left( \frac{x^2}{2\sigma_x^2} + \frac{y^2}{2\sigma_y^2} \right) \right]
where σx and σy are the standard deviation in the x and y directions, respectively.
The rotation matrix (Equation (A15)) is used to rotate the axes by the required angle θ in the function.
\begin{bmatrix} X \\ Y \end{bmatrix} = \begin{bmatrix} \cos\theta & \sin\theta \\ -\sin\theta & \cos\theta \end{bmatrix} \begin{bmatrix} x \\ y \end{bmatrix}
Therefore, the steerable vertical filter is derived as Equation (A16). After defining appropriate values of the parameters σx, σy, and θ, the filter is convolved with the above-mentioned morphology-enhanced image.
f(x, y) = \frac{1}{2\pi \sigma_x \sigma_y} \exp\left[ -\left( \frac{(x\cos\theta + y\sin\theta)^2}{2\sigma_x^2} + \frac{(-x\sin\theta + y\cos\theta)^2}{2\sigma_y^2} \right) \right]
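Equation (A16) translates directly into a filter kernel; a small NumPy sketch follows, where the kernel size and parameter values are illustrative rather than those tuned in the paper:

```python
import numpy as np

def oriented_gaussian(size, sigma_x, sigma_y, theta):
    # Rotated anisotropic 2-D Gaussian kernel. A "vertical" filter would use
    # theta = 0 with sigma_y > sigma_x so the kernel is elongated along y.
    r = size // 2
    y, x = np.mgrid[-r:r + 1, -r:r + 1]
    xr = x * np.cos(theta) + y * np.sin(theta)    # rotated X coordinate
    yr = -x * np.sin(theta) + y * np.cos(theta)   # rotated Y coordinate
    g = np.exp(-(xr**2 / (2 * sigma_x**2) + yr**2 / (2 * sigma_y**2)))
    return g / (2 * np.pi * sigma_x * sigma_y)

k = oriented_gaussian(7, sigma_x=1.0, sigma_y=3.0, theta=0.0)
```

Convolving this kernel with the morphology-enhanced image (e.g., via a 2-D convolution routine) emphasizes yarns aligned with the kernel's long axis, which is how the vertical and horizontal responses in Table 2 would be produced.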
(4)
Performance evaluation metrics: First, the steerable-filtered images (binary images after filtering) are compared numerically with the template images pixel by pixel. Specifically, a pixel whose value is 255 in both the filtered and template images counts as a true positive (TP), and one whose value is 0 in both as a true negative (TN). A pixel whose value is 255 in the filtered image but 0 in the manually labeled defect image is a false positive (FP); the reverse case is a false negative (FN). The following metrics are used to compare the results of the vertical and horizontal steerable filters:
ACC = \frac{TP + TN}{TP + FN + FP + TN}
TPR = \frac{TP}{TP + FN}
FPR = \frac{FP}{FP + TN}
PPV = \frac{TP}{TP + FP}
NPV = \frac{TN}{TN + FN}
In the above formulas, ACC is the accuracy, TPR the true positive rate, FPR the false positive rate, PPV the positive predictive value, and NPV the negative predictive value.
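These pixel-wise metrics are straightforward to compute; below is a sketch with a 2 × 2 toy pair of binary images (values 0/255), assuming none of the denominators is zero:

```python
import numpy as np

def pixel_metrics(filtered, template):
    # Pixel-by-pixel confusion counts between binary images with values 0/255
    tp = np.sum((filtered == 255) & (template == 255))
    tn = np.sum((filtered == 0) & (template == 0))
    fp = np.sum((filtered == 255) & (template == 0))
    fn = np.sum((filtered == 0) & (template == 255))
    return {
        'ACC': (tp + tn) / (tp + fn + fp + tn),
        'TPR': tp / (tp + fn),
        'FPR': fp / (fp + tn),
        'PPV': tp / (tp + fp),
        'NPV': tn / (tn + fn),
    }

template = np.array([[255, 255], [0, 0]])   # manually labeled ground truth
filtered = np.array([[255, 0], [0, 0]])     # steerable-filter output
m = pixel_metrics(filtered, template)
```

Here one of the two defect pixels is missed, so TPR is 0.5 while PPV stays at 1.0; the same counts scale directly to the full-size comparison in Table 2.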
(5)
Gray level co-occurrence matrix: All values of the co-occurrence matrix are normalized before the texture features are calculated. The normalized co-occurrence matrix Mij is given by Equation (A22):
M_{ij} = \frac{P_{ij}}{\sum_{i=0}^{l} \sum_{j=0}^{l} P_{ij}}
Contrast, homogeneity and angular second moment are used to describe the carbon fiber texture features.
Contrast: The contrast reflects the sharpness of the image and the depth of the texture patterns. When the patterns are deep, the image is clear and the contrast is large; conversely, when the patterns are shallow, the image is fuzzy and the contrast is small.
C = \sum_{t=0}^{N-1} t^2 \sum_{i=0}^{l-1} \sum_{j=0}^{l-1} P(i, j), \quad t = |i - j|
Homogeneity: The homogeneity measures how much the image texture changes locally. The larger the value, the smaller the variation between different regions of the image texture, and the more uniform the local texture.
D = \sum_{i=0}^{l-1} \sum_{j=0}^{l-1} P(i, j)\, \frac{1}{1 + (i - j)^2}
Angular second moment (ASM): The ASM is the sum of the squares of the element values in the gray level co-occurrence matrix. It reflects the coarseness of the texture and the evenness of the image gray-level distribution. A larger value indicates a more uniform gray-level distribution and a finer texture; a smaller value indicates the opposite.
E = \sum_{i=0}^{l-1} \sum_{j=0}^{l-1} P^2(i, j, d, \theta)
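Putting the three features together, here is a small NumPy sketch over a normalized GLCM. For brevity it uses a horizontal (0°) offset rather than the 45°/135° offsets of Table 1, and the function names and toy image are ours:

```python
import numpy as np

def glcm(img, levels, d=1):
    # Gray level co-occurrence matrix for a horizontal offset (d, 0 degrees),
    # normalized so that all entries sum to 1 (Equation (A22))
    m = np.zeros((levels, levels))
    h, w = img.shape
    for y in range(h):
        for x in range(w - d):
            m[img[y, x], img[y, x + d]] += 1
    return m / m.sum()

def texture_features(m):
    # Contrast, homogeneity, and angular second moment of a normalized GLCM
    i, j = np.indices(m.shape)
    contrast = np.sum((i - j) ** 2 * m)
    homogeneity = np.sum(m / (1 + (i - j) ** 2))
    asm = np.sum(m ** 2)
    return contrast, homogeneity, asm

img = np.array([[0, 0, 1, 1],
                [0, 0, 1, 1],
                [0, 2, 2, 2],
                [2, 2, 3, 3]], dtype=int)
c, h, a = texture_features(glcm(img, levels=4))
```

A smooth defect-free texture concentrates mass on the GLCM diagonal (low contrast, high homogeneity and ASM), matching the trend of the mean feature values in Table 3.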

References

  1. Zhang, J.; Chaisombat, K.; He, S.; Wang, C.H. Hybrid composite laminates reinforced with glass/carbon woven fabrics for lightweight load bearing structures. Mater. Des. 2018, 36, 75–80.
  2. Deng, B.; Shi, Y.; Yu, T.; Zhao, P. Influence mechanism and optimization analysis of technological parameters for the composite prepreg tape winding process. Polymers 2020, 12, 1843.
  3. Tan, Y.; Meng, L.; Zhang, D. Strain sensing characteristic of ultrasonic excitation-fiber Bragg gratings damage detection technique. Measurement 2013, 46, 294–304.
  4. Crane, R.L.; Chang, F.F.; Allinikov, S. Use of radiographically opaque fibers to aid the inspection of composites. Mater. Eval. 1978, 36, 69–71.
  5. Addepalli, S.; Zhao, Y.; Roy, R.; Galhenege, W.; Colle, M.; Yu, J.; Ucur, A. Non-destructive evaluation of localised heat damage occurring in carbon composites using thermography and thermal diffusivity measurement. Measurement 2019, 131, 706–713.
  6. Aisyah, H.A.; Tahir, P.M.; Sapuan, S.M.; Ilyas, R.A.; Khalina, A.; Nurazzi, N.M.; Lee, S.H.; Lee, C.H. A comprehensive review on advanced sustainable woven natural fibre polymer composites. Polymers 2021, 13, 471.
  7. Hanbay, K.; Talu, M.F.; Ozguven, O.F. Fabric defect detection systems and methods: A systematic literature review. Optik 2016, 127, 11960–11973.
  8. Ngan, H.Y.T.; Pang, G.K.H.; Yung, N.H.C. Automated fabric defect detection: A review. Image Vis. Comput. 2011, 29, 442–458.
  9. Song, K.Y.; Kittler, J.; Petrou, M. Defect detection in random colour textures. Image Vis. Comput. 1996, 14, 667–683.
  10. Bodnarova, A.; Bennamoun, M.; Kubik, K.K. Defect detection in textile materials based on aspects of the HVS. In Proceedings of the 1998 IEEE International Conference on Systems, Man, and Cybernetics, San Diego, CA, USA, 14 October 1998; Volume 5, pp. 4423–4428.
  11. Tabassian, M.; Ghaderi, R.; Ebrahimpour, R. Knitted fabric defect classification for uncertain labels based on Dempster-Shafer theory of evidence. Expert Syst. Appl. 2011, 38, 5259–5267.
  12. Tong, L.; Wong, W.K.; Kwong, C.K. Differential evolution-based optimal Gabor filter model for fabric inspection. Neurocomputing 2016, 173, 1386–1401.
  13. Tsai, D.M.; Hsieh, C.Y. Automated surface inspection for directional textures. Image Vis. Comput. 1999, 18, 49–62.
  14. Alata, O.; Ramananjarasoa, C. Unsupervised textured image segmentation using 2-D quarter plane autoregressive model with four prediction supports. Pattern Recognit. Lett. 2005, 26, 1069–1081.
  15. Bouhamidi, A.; Jbilou, K. An iterative method for Bayesian Gauss-Markov image restoration. Appl. Math. Model. 2009, 33, 361–372.
  16. Monaco, J.P.; Madabhushi, A. Class-specific weighting for Markov random field estimation: Application to medical image segmentation. Med. Image Anal. 2012, 16, 1477–1489.
  17. Zhang, Y.X.F.; Bresee, R.R. Fabric defect detection and classification using image-analysis. Text. Res. J. 1995, 65, 1–9.
  18. Latif-Amet, A.; Ertuzun, A.; Ercil, A. An efficient method for texture defect detection: Sub-band domain co-occurrence matrices. Image Vis. Comput. 2000, 18, 543–553.
  19. Ng, H.F. Automatic thresholding for defect detection. Pattern Recognit. Lett. 2006, 27, 1644–1649.
  20. Jia, L.; Liang, J.Z. Fabric defect inspection based on isotropic lattice segmentation. J. Frankl. Inst. Eng. Appl. Math. 2017, 354, 5694–5738.
  21. Ng, M.K.; Ngan, H.Y.T.; Yuan, X.M.; Zhang, W.X. Patterned fabric inspection and visualization by the method of image decomposition. IEEE Trans. Autom. Sci. Eng. 2014, 11, 943–947.
  22. Ngan, H.Y.T.; Pang, G.K.H.; Yung, N.H.C. Motif-based defect detection for patterned fabric. Pattern Recognit. 2008, 41, 1878–1894.
  23. Ngan, H.Y.T.; Pang, G.K.H. Novel method for patterned fabric inspection using Bollinger bands. Opt. Eng. 2006, 45, 087202.
  24. Ngan, H.Y.T.; Pang, G.K.H. Regularity analysis for patterned texture inspection. IEEE Trans. Autom. Sci. Eng. 2009, 6, 131–144.
  25. Tsang, C.S.C.; Ngan, H.Y.T.; Pang, G.K.H. Fabric inspection based on the Elo rating method. Pattern Recognit. 2016, 51, 378–394.
  26. Ngan, H.Y.T.; Pang, G.K.H.; Yung, S.P.; Ng, M.K. Wavelet based methods on patterned fabric defect detection. Pattern Recognit. 2005, 38, 559–576.
  27. Starck, J.L.; Elad, M.; Donoho, D. Redundant multiscale transforms and their application for morphological component separation. Adv. Imag. Elect. Phys. 2004, 132, 287–348.
  28. Xu, L.; Yan, Q.; Xia, Y.; Jia, J.Y. Structure extraction from texture via relative total variation. ACM Trans. Graph. 2012, 31, 1–10.
  29. Ng, M.K.; Yuan, X.M.; Zhang, W.X. Coupled variational image decomposition and restoration model for blurred cartoon-plus-texture images with missing pixels. IEEE Trans. Image Process. 2013, 22, 2233–2246.
  30. Jia, L.; Chen, C.; Xu, S.K.; Shen, J. Fabric defect inspection based on lattice segmentation and template statistics. Inf. Sci. 2020, 512, 964–984.
  31. Zhang, H.M.; Pei, Z.L.; Zhang, Z.G. Design and implementation of image processing system based on MATLAB. In Proceedings of the International Conference on Logistics Engineering, Management and Computer Science, Shenyang, China, 19 July 2015; Volume 117, pp. 1356–1359.
  32. Wang, T.T.; Dong, J.C. Application of image enhancement in the electronation of ancient books. Nat. Sci. Ed. 2015, 33, 26–29.
  33. Haralick, R.M. Statistical and structural approaches to texture. Proc. IEEE 1979, 67, 786–804.
  34. Soh, L.K.; Tsatsoulis, C. Texture analysis of SAR sea ice imagery using gray level co-occurrence matrices. IEEE Trans. Geosci. Remote Sens. 1999, 37, 780–795.
  35. Xin, B.; Zhang, J.; Zhang, R.; Wu, X. Color texture classification of yarn-dyed woven fabric based on dual-side scanning and co-occurrence matrix. Text. Res. J. 2017, 87, 1883–1895.
  36. Baraldi, A.; Parmiggiani, F. An investigation of the textural characteristics associated with gray level co-occurrence matrix statistical parameters. IEEE Trans. Geosci. Remote Sens. 1995, 33, 293–303.
  37. Kuo, C.F.J.; Tsai, C.C. Automatic recognition of fabric nature by using the approach of texture analysis. Text. Res. J. 2006, 76, 375–382.
  38. Du, Z.; Gao, R.; Zhou, T.; He, L. Determination of featured parameters to cluster stiffness handle of fabrics by the CHES-FY system. Fiber. Polym. 2013, 14, 1768–1775.
  39. Likas, A.; Vlassis, N.; Verbeek, J.J. The global k-means clustering algorithm. Pattern Recognit. 2003, 36, 451–461.
  40. Mei, S.; Yang, H.; Yin, Z. An unsupervised-learning-based approach for automated defect inspection on textured surfaces. IEEE Trans. Instrum. Meas. 2018, 67, 1266–1277.
  41. Li, Y.; Zhao, W.; Pan, J. Deformable patterned fabric defect detection with fisher criterion-based deep learning. IEEE Trans. Autom. Sci. Eng. 2017, 14, 1256–1264.
  42. Zhang, K.; Li, P.; Dong, A.; Zhang, K. Yarn-dyed fabric defect classification based on convolutional neural network. Opt. Eng. 2017, 56, 093104.
Figure 1. The schematic of texture features in the compression process.
Figure 2. The schematic of three defect-free fiber texture feature extractions.
Figure 3. The process of defect inspection for carbon fiber plain-woven prepreg.
Figure 4. The original images of carbon fiber plain-woven prepreg with and without defects, captured by digital camera.
Figure 5. The flowchart of the proposed defect automatic inspection method.
Figure 6. Image preprocessing steps: (a) original grayscale image; (b) histogram equalization; (c) box filtering; (d) top-hat transform; (e) bottom-hat transform; (f) gray-level morphology; (g) horizontal steerable filter; (h) vertical steerable filter; and (i) Gaussian filter.
Figure 7. Three kinds of grayscale images (defect-free, bubble, and wrinkle) and their gray histograms with/without histogram equalization.
Figure 8. Gray-level morphological operation processing.
Figure 9. Comparison of the results of the vertical and horizontal steerable filters, Gaussian filter, and binary operation.
Figure 10. The image compression process: (a) preprocessed image; (b) grayscale sub-image arrays of ten weft yarns; (c) binary sub-image arrays of ten weft yarns; (d) preliminary texture pattern; (e) preliminary texture pattern processed by dichotomy; (f) final texture pattern; and (g) the output matrix, F.
Figure 11. The texture patterns of the original images of carbon fiber plain-woven prepreg with and without defects, corresponding to Figure 4.
Figure 12. Plot of KMC algorithm classification for the three conditions.
Figure 13. Plot of KMC algorithm classification for the three conditions with proportions of (a) 2:3:5, (b) 2:4:4, and (c) 2:5:3.
Figure 14. Diagram of the algorithm comparison.
Figure 15. Comparison of different recognition methods (MSCDAE [40], SDAE [41], LSTS [30], BB [23], ER [25], RB [24], WGIS [26], NN [42]).
Table 1. The co-occurrence matrices of two conditions [34].
| Function | Position | Conditions |
| --- | --- | --- |
| M(i, j, d, 45°) | RRD(d): (x1 − x2 = d, y1 − y2 = −d) or (x1 − x2 = −d, y1 − y2 = d) | {RRD(d), f(x1, y1) = i, f(x2, y2) = j} |
| M(i, j, d, 135°) | RLD(d): (x1 − x2 = d, y1 − y2 = d) or (x1 − x2 = −d, y1 − y2 = −d) | {RLD(d), f(x1, y1) = i, f(x2, y2) = j} |
Table 2. Performance evaluation metrics for vertical and horizontal steerable filtered images.
| Image Size | Direction | ACC (%) | TPR (%) | FPR (%) | PPV (%) | NPV (%) |
| --- | --- | --- | --- | --- | --- | --- |
| 10 × 10 | Vertical | 82.27 | 78.13 | 13.31 | 86.27 | 78.74 |
| 10 × 10 | Horizontal | 72.89 | 72.24 | 26.39 | 74.71 | 71.07 |
| Full-size | Vertical | 77.33 | 69.57 | 19.29 | 81.43 | 73.36 |
| Full-size | Horizontal | 63.58 | 36.49 | 47.33 | 90.01 | 56.19 |
Table 3. The mean value of textural features.
| Specimens | Contrast | Homogeneity | Angular Second Moment |
| --- | --- | --- | --- |
| Defect-free | 0.032 | 0.022 | 0.960 |
| Bubble | 0.272 | 0.242 | 0.680 |
| Wrinkle | 0.839 | 0.830 | 0.177 |
Table 4. The initial cluster center of three surface morphologies with different proportions.
| Defect-Free:Bubble:Wrinkle | Defect-Free Cluster Center Coordinates | Bubble Cluster Center Coordinates | Wrinkle Cluster Center Coordinates |
| --- | --- | --- | --- |
| 2:3:5 | (0.0352, 0.0225, 0.9556) | (0.2746, 0.2456, 0.6734) | (0.8345, 0.9934, 0.1579) |
| 2:4:4 | (0.0352, 0.0225, 0.9556) | (0.2746, 0.2456, 0.6734) | (0.8345, 0.9934, 0.1579) |
| 2:5:3 | (0.0352, 0.0225, 0.9556) | (0.2746, 0.2456, 0.6734) | (0.8345, 0.9934, 0.1579) |
Table 5. The final cluster center of three surface morphologies with different proportions.
| Defect-Free:Bubble:Wrinkle | Defect-Free Cluster Center Coordinates | Bubble Cluster Center Coordinates | Wrinkle Cluster Center Coordinates |
| --- | --- | --- | --- |
| 2:3:5 | (0.0321, 0.0203, 0.9596) | (0.2696, 0.2425, 0.6759) | (0.8389, 0.8297, 0.1779) |
| 2:4:4 | (0.0322, 0.0217, 0.9598) | (0.2719, 0.2427, 0.6799) | (0.8393, 0.8304, 0.1773) |
| 2:5:3 | (0.0319, 0.0221, 0.9603) | (0.2734, 0.2408, 0.6811) | (0.8432, 0.8314, 0.1763) |
Table 6. The recognition accuracy of the proposed method in this paper.
| Proportion | TN/TB/TW | RN/RB/RW | IN/% | IB/% | IW/% | IM/% | IE/% |
| --- | --- | --- | --- | --- | --- | --- | --- |
| 2:3:5 | 200:300:500 | 188:283:471 | 94.00 | 94.33 | 94.20 | 94.18 | 94.41 |
| 2:4:4 | 200:400:400 | 189:379:380 | 94.50 | 94.75 | 95.00 | 94.75 | |
| 2:5:3 | 200:500:300 | 189:469:284 | 94.50 | 93.80 | 94.67 | 94.32 | |
Table 7. The recognition accuracy of the GLCM algorithm.
| Data Set | Proportion | Recognition Accuracy (%) | Average Value (%) |
| --- | --- | --- | --- |
| 1 | 2:3:5 | 92.12 | 91.96 |
| 2 | 2:4:4 | 92.41 | |
| 3 | 2:5:3 | 91.37 | |
Table 8. The effect of image compression on performance time.
| Method | GLCM & KMC Method | The Proposed Method of This Article |
| --- | --- | --- |
| Each testing time (ms) | 472 | 136 |
| Total testing time (min) | 7.87 | 2.27 |
Li, L.; Wang, Y.; Qi, J.; Xiao, S.; Gao, H. A Novel High Recognition Rate Defect Inspection Method for Carbon Fiber Plain-Woven Prepreg Based on Image Texture Feature Compression. Polymers 2022, 14, 1855. https://doi.org/10.3390/polym14091855