Article

Defect Detection in Textures through the Use of Entropy as a Means for Automatically Selecting the Wavelet Decomposition Level

by
Pedro J. Navarro
1,†,
Carlos Fernández-Isla
1,†,
Pedro María Alcover
1,*,† and
Juan Suardíaz
2,†
1
División de Sistemas e Ingeniería Electrónica (DSIE), Universidad Politécnica de Cartagena, Campus Muralla del Mar, s/n, Cartagena E-30202, Spain
2
División de Innovación en Sistemas Telemáticos y Tecnología Electrónica (DINTEL), Universidad Politécnica de Cartagena, Campus Muralla del Mar, s/n, Cartagena E-30202, Spain
*
Author to whom correspondence should be addressed.
† These authors contributed equally to this work.
Sensors 2016, 16(8), 1178; https://doi.org/10.3390/s16081178
Submission received: 12 May 2016 / Revised: 11 July 2016 / Accepted: 22 July 2016 / Published: 27 July 2016
(This article belongs to the Section Physical Sensors)

Abstract

This paper presents a robust method for defect detection in textures, entropy-based automatic selection of the wavelet decomposition level (EADL), based on a wavelet reconstruction scheme, for detecting defects in a wide variety of structural and statistical textures. Two main features are presented. The first is an original use of the normalized absolute function value (NABS), calculated from the wavelet coefficients derived at different decomposition levels, to identify textures where the defect can be isolated by eliminating the texture pattern at the first decomposition level. The second is the use of Shannon’s entropy, calculated over the detail subimages, for automatic selection of the band for image reconstruction, which, unlike other techniques, such as those based on the co-occurrence matrix or on energy calculation, provides a lower decomposition level, thus avoiding excessive degradation of the image and allowing a more accurate defect segmentation. A metric analysis of the results of the proposed method with nine different thresholding algorithms determined that selecting the appropriate thresholding method is important to achieve optimum performance in defect detection. As a consequence, different thresholding algorithms are proposed depending on the type of texture.

1. Introduction

Defect detection plays a vital role in automatic inspection in most production processes (food, textile, bottling, timber, steel industries, etc.). In many of these processes, quality controls still depend to a large extent on the training of specialized inspectors. Manual inspection involves limitations in terms of accuracy, coherence and efficiency when detecting defects, because inspectors are prone to fatigue and boredom, or simply fail to pay sufficient attention, owing to the repetitive nature of their tasks [1]. To deal with these problems, human inspectors are being replaced by automatic visual inspection systems [2,3,4].
Texture analysis provides a very powerful tool to detect defects in applications for visual inspection, since textures provide valuable information about the features of different materials.
In computer vision, texture is broadly classified into two main categories: statistical and structural [5]. Textures that are random in nature are well suited for statistical characterization. Statistical textures do not have easily identifiable primitives; however, some visual properties can usually be observed, such as directionality (directional versus isotropic), appearance (coarse versus fine) or regularity (regular versus irregular) (e.g., wool, sand, wood, etc.) (Figure 1).
On the other hand, structural textures, also called patterned textures, are characterized by a set of primitives (texels) and placement rules. The placement rules define the spatial relationships between the texels, and these spatial relationships may be expressed in terms of adjacency, closest distance or periodicities. The texels themselves may be defined by their gray level, shape or homogeneity of some local property (e.g., milled surfaces, fabric, etc.) (Figure 2). Statistical texture patterns are isotropic, while patterned textures can be classified into oriented (directional) or non-oriented (isotropic) [6] (see Figure 3). A homogeneous texture contains repetitive properties everywhere in an image [7]; if repetitive self-similar patterns can be found, it is possible to talk about homogeneous structural texture (Figure 4a). A homogeneous statistical texture cannot be described with texture primitives and displacement rules; the spatial distribution of gray levels is rather stochastic, but the repetition, self-similarity properties still hold (Figure 4b). If there is no repetition or spatial self-similarity, a texture may be defined as inhomogeneous (Figure 4c).
More recently, Ngan [8] provided a new approach, which defines patterned textures in terms of an underlying lattice, composed of one or more motifs, whose symmetry properties are governed by the 17 wallpaper groups; wallpaper groups, also known as crystallographic groups, are well defined in algebra [9]. Figure 2c,d shows two patterned textures classified in the p1 wallpaper group.
In a recent and complete review of defect detection in textures, Xie [10] classified texture analysis techniques into the following categories: statistical techniques [11,12,13], structural techniques [14,15], filter-based techniques [16,17,18] and model-based techniques [19,20,21], while in his review, Kumar [22] defines three categories of defect detection in fabric: the statistical approach, the spectral approach and the model-based approach. Ngan [23] proposes a new classification of techniques for defect detection in textures into motif-based and non-motif-based approaches. While the traditional (non-motif) techniques can be sub-divided into statistical, spectral, model-based, learning and structural methods (since texture is composed of one main lattice with only one motif), the motif-based techniques [23,24] use the difference and energy variance among different motifs.
Clustering techniques are used in many defect detection methods; these methods are mainly based on the extraction of texture features. Such features are obtained using different techniques, such as co-occurrence matrix [25,26,27,28], Fourier transform [7,29,30,31,32], Gabor transform [33,34,35,36,37] or the wavelet transform [38,39,40,41].
Spectral-approach techniques provide either the frequency contents of a texture image (Fourier transform) or spatial-frequency analysis (Gabor filters, wavelet transform (WT)).
Fourier transform shows good results when applied over texture patterns with high directionality or regularity, because the information about the directionality and periodicity of the texture pattern is well recognizable in the 2D spectrum, although it fails when attempting to determine the spatial localization of such patterns. Concerning spatial localization, Gabor filters provide better accuracy, but they show a lack of reliability when processing natural textures, since there is no single filter resolution that can localize a structure. The main advantage that WT has over the Gabor transform is that it makes it possible to represent the textures in the appropriate scale because of the variation of the spatial resolution it provides.
The suitability of WT for use in image analysis is well established: a representation in terms of the frequency content of local regions over a range of scales provides an ideal framework for the analysis of image features, which in general are of different sizes and can often be characterized by their frequency domain properties [42]. This makes the wavelet transform an attractive option when segmenting textures, as reported by Truchetet [43] in his review of industrial applications of wavelet-based image processing. He reported several successful machine vision applications of wavelet analysis:
  • detecting defects in manufacturing applications for the production of furniture, textiles, integrated circuits, etc., from their wavelet transformation and vector quantization-related properties of the associated wavelet coefficients;
  • sorting ceramic tiles and recognizing metallic paints for car refinishing by combining color and texture information through a multiscale decomposition of each color channel in order to feed a classifier;
  • printing defect identification and classification (applied to printed decoration and tampoprint images) by analyzing the fractal properties of a textured image;
  • image database retrieval algorithms basing texture matching on energy coefficients in the pyramid wavelet transform using Daubechies wavelets;
  • face recognition using the “symlet” wavelet because of its symmetry and regularity;
  • online inspection of a loom under construction using a specific class of the 2D discrete wavelet transform, the multiscale wavelet representation, with the objectives of attenuating the background texture and accentuating the defects;
  • an online fabric inspection device performing an independent component analysis on a sub-band decomposition provided by a two-level DWT in order to increase the defect detection rate.

Background and Contributions

The wavelet transform has given rise to two groups of techniques for detecting defects: (1) direct thresholding methods [25,44,45,46,47], whose operation is based on the background attenuation provided by the WT in the successive decomposition levels (such attenuation enhances defects, which can then be segmented by direct thresholding [48]); and (2) methods based on extracting texture features by means of WT [41,42]. After the texture features are obtained, feature vectors are formed and different classifiers are applied to them (neural networks, Bayes classifiers, etc.).
In the present paper, a direct thresholding method was chosen because of the several disadvantages of methods based on feature vectors (high computational cost and the difficulty of setting stopping criteria in proximity-based techniques or of training in learning-based techniques). If direct thresholding methods are used, two important challenges have to be faced: (a) how to select the optimal wavelet decomposition level; and (b) how to determine which bands allow eliminating the maximum amount of information relative to the texture pattern. This involves two risks: (i) features of the texture pattern can merge with those of the defects due to excessive wavelet decomposition; and (ii) false positives may appear due to a defective reconstruction scheme.
A review of the literature on texture defect detection methods using wavelet transforms shows a great deal of interest in methods for automatic selection of the optimal wavelet decomposition level, due to the potential industrial applications. Of these, the best performance is offered by the methods proposed in [25,46,47], with the Tsai and Han methods reporting the best results. Tsai proposes an image reconstruction approach based on the analysis and synthesis of wavelet transforms that is well suited for inspecting surface defects embedded in homogeneous structural and statistical textures. To achieve this, the method removes all regular and repetitive texture patterns from the restored image by selecting proper approximation or detail subimages for wavelet synthesis. In this way, an image where defects are highlighted is obtained. Finally, the method carries out a binarization based on the average and the variance to segment the defects. To find defects in statistical textures with isotropic patterns, the Tsai method uses the approximation image obtained with an optimal resolution level calculated by means of a ratio between two consecutive energy levels. Han reports a method for defect detection in images with a high-frequency texture background. The wavelet transform is used to decompose the texture images into approximation, horizontal, vertical and diagonal subimages. This method uses only the approximation subimage, because textures are always high-frequency elements of the image. The optimal decomposition level is the one with the maximum variation of the local homogeneity calculated at two successive wavelet decomposition levels. Local homogeneity is one of the 28 features defined by Haralick et al. for texture characterization, computed over the co-occurrence matrix [49]. Once the appropriate decomposition level has been selected, Han uses the approximation subimage to reconstruct the image. After that, Otsu’s method is applied to segment the defects. However, as will be seen later in this text, both methods present serious difficulties when processing the large set of textures presented in this paper and when quantifying the size of the defect. This last feature is crucial for many industrial applications: for example, in maintenance tasks where the product is not discarded but repaired.
This paper proposes a new approach to address the aforementioned problems, based on the normalized absolute function value (NABS) and on Shannon’s entropy. The main contributions can be summarized as follows:
  • A numeric analysis of the normalized absolute function value (NABS) at different wavelet decomposition levels, which makes it possible to determine the texture directionality. The proposed algorithm classifies patterns into high and low directionality, depending on the variation of the NABS trend-line slopes.
  • A novel use of the normalized Shannon’s entropy, calculated over the different subimages, in order to determine the optimal decomposition level in textures with low directionality. For this purpose, the optimal decomposition level is obtained from the maximum variation of the ratio between the entropy of the approximation subimage and the total entropy, computed as the sum of the entropies of every subimage. This ratio provides better results when detecting defects in a wider range of textures.
This article is organized as follows: Section 1 gives a brief overview of the problem and highlights the paper’s contributions; Section 2 examines the background of wavelet decomposition and presents the mathematical notation used throughout the rest of the paper; Section 3 presents the proposed method, entropy-based automatic selection of the wavelet decomposition level (EADL); Section 4 presents the results obtained when the developed method was applied to different real textures; finally, Section 5 presents the conclusions.

2. Wavelet Decomposition and Mathematical Notation

The wavelet transform can be applied to an image $f(x, y)$ with 256 gray levels and of size $X \times Y$ by means of a convolution (linear, periodic or cyclic) with two filters (a low-pass filter $L$ and a band-pass filter $H$), taking one sample out of two from the result. Applying this process to all the rows and then, with the resulting rows, to all the columns provides four subimages, defined as $LL$ ($L$ filter on rows and columns), $LH$ ($H$ filter on rows and $L$ filter on columns), $HL$ ($L$ filter on rows and $H$ filter on columns) and $HH$ ($H$ filter on rows and columns). These four subimages share the same origin, and their size is a quarter of the original size. One of them ($LL$) is referred to as the approximation subimage; it is denoted $f_{LL}^{1}(x, y)$ and represents an approximation of the original image. The other three subimages are referred to as detail subimages: the horizontal detail subimage $f_{LH}^{1}(x, y)$, the vertical detail subimage $f_{HL}^{1}(x, y)$ and the diagonal detail subimage $f_{HH}^{1}(x, y)$. According to this notation, the original image can be represented as $f(x, y) = f_{LL}^{0}(x, y)$.
From this first decomposition level, the wavelet transform can be applied again to the previous approximation image, obtaining in this way a new approximation image and three new detail subimages. $f_{LL}^{j}(x, y)$ denotes the approximation image obtained at decomposition level $j$. From that image, the subimages $f_{LL}^{j+1}(x, y)$, $f_{LH}^{j+1}(x, y)$, $f_{HL}^{j+1}(x, y)$ and $f_{HH}^{j+1}(x, y)$ can then be obtained. Such subimages form the set of images of decomposition level $j + 1$. A detailed description of the algorithm can be found in [50].
The recovery process of the approximation image $f_{LL}^{j}(x, y)$ from the four subimages of decomposition level $(j + 1)$ is also described in the basic literature on wavelet transforms. Such a reconstructed image is called $F$:
$F = W^{-1}\left\{ f_{LL}^{j}(x, y) \right\}$ (1)
This procedure can be recursively applied until reaching the starting level ($j = 0$).
Figure 5 graphically represents the decomposition process of the wavelet transform. Image $f_{LL}^{j}(x, y)$ is decomposed into the three detail subimages of level $(j + 1)$, as well as the new approximation image of level $(j + 1)$, which, in turn, is decomposed into four subimages at level $(j + 2)$. Every subimage of level $j$ is of size $X/2^{j} \times Y/2^{j}$, where $X$ and $Y$ represent the dimensions of the original image at level $j = 0$. Thus, a multiresolution analysis of the original image is achieved at different scales, and, according to the wavelet signal used in the convolution, it is then possible to find periodic structures, at one scale or another, in a horizontal, vertical or diagonal distribution.
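To make the notation concrete, the following is a minimal sketch of this decomposition using the PyWavelets library; the paper’s implementation is in C, so the library choice, image and variable names here are merely illustrative:

```python
# Minimal sketch of the DWT-2D decomposition described above, using the
# PyWavelets library (pywt) rather than the authors' C implementation.
import numpy as np
import pywt

# Hypothetical 256-gray-level test image of size X x Y.
f = np.random.randint(0, 256, size=(256, 256)).astype(np.float64)

# One decomposition level: f_LL^1 (approximation) and the three detail
# subimages f_LH^1 (horizontal), f_HL^1 (vertical), f_HH^1 (diagonal).
f_LL1, (f_LH1, f_HL1, f_HH1) = pywt.dwt2(f, 'haar')
print(f.shape, f_LL1.shape)   # each subimage has half the rows and columns

# The transform can be applied again to the approximation image to obtain
# the subimages of level j + 1.
f_LL2, (f_LH2, f_HL2, f_HH2) = pywt.dwt2(f_LL1, 'haar')
```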

3. Entropy-Based Method for Automatic Selection of the Wavelet Decomposition Level

Entropy has been used in many image processing methods: image segmentation [51,52]; thresholding methods [53,54]; Haralick’s texture descriptor, computed over gray-level co-occurrence matrices; entropy measurements incorporated into feature vectors [49,55], etc.
A three-stage method for detecting defects in textures is proposed:
  • In order to determine if the texture has high directionality, the NABS value is calculated. If so, the subimage that contains the maximum directionality information is removed, together with the approximation subimage.
  • If the directionality value is low, then the optimum decomposition level is estimated by means of Shannon’s entropy.
  • Finally, the optimum thresholding method is determined.

3.1. Automatic Detection of High Directionality in Texture

In patterns with high horizontal or vertical directionality, the texture information relies to a great extent on the approximation subimage and either on the horizontal or on the vertical detail subimage, as shown in Figure 6. In these cases, to isolate defects, it is enough to reconstruct the image eliminating the approximation subimage (Figure 6b,g) together with the detail subimage that contains the maximum information about the pattern directionality in the first decomposition level (Figure 6c,i).
In this work, the normalized absolute function value (NABS) was used to separate structural texture patterns with high directionality from the rest. To determine the directionality of a texture, the NABS values of the detail subimages ($f_{LH}^{(j)}$, $f_{HL}^{(j)}$, $f_{HH}^{(j)}$) are calculated at different decomposition levels (see their graphical representation in Figure 7), and the trend line slopes for each of the three sets of values are obtained ($NABS_h$, $NABS_v$ and $NABS_d$). If one of these three slopes is much greater than the other two, the texture can be considered to have high directionality (h, v or d). Thus, the corresponding detail subimage (h, v or d), together with the approximation subimage of the first decomposition level, is eliminated. The approximation subimage is eliminated in the reconstruction process because it constitutes a rough representation of the original image and, therefore, contains a great deal of information about the pattern’s directionality [46].
The normalized absolute values (NABS) of the horizontal, vertical and diagonal detail subimages at level $j$ are given by Equations (2)–(4):
$NABS_{h}^{j} = \frac{1}{N_{pixels}^{j}} \cdot \sum_{x} \sum_{y} \left| f_{LH}^{(j)}(x, y) \right|$ (2)
$NABS_{v}^{j} = \frac{1}{N_{pixels}^{j}} \cdot \sum_{x} \sum_{y} \left| f_{HL}^{(j)}(x, y) \right|$ (3)
$NABS_{d}^{j} = \frac{1}{N_{pixels}^{j}} \cdot \sum_{x} \sum_{y} \left| f_{HH}^{(j)}(x, y) \right|$ (4)
for $j = 1, 2, \ldots, J$ (where $J = \log_{2} N$), and where $N_{pixels}^{j} = \frac{X}{2^{j}} \times \frac{Y}{2^{j}}$ is the number of pixels at each decomposition level $j$.
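The following sketch illustrates how the NABS values of Equations (2)–(4) could be computed across decomposition levels; it assumes the PyWavelets library and a Haar wavelet, and the function name is an illustrative choice rather than the authors’ code:

```python
# Sketch of the NABS values of Equations (2)-(4) at levels j = 1..levels.
import numpy as np
import pywt

def nabs_per_level(image, levels=4, wavelet='haar'):
    """Return lists of NABS_h, NABS_v, NABS_d for j = 1..levels."""
    nabs_h, nabs_v, nabs_d = [], [], []
    approx = np.asarray(image, dtype=np.float64)
    for j in range(1, levels + 1):
        approx, (lh, hl, hh) = pywt.dwt2(approx, wavelet)
        n_pixels = lh.size                       # (X / 2^j) * (Y / 2^j)
        nabs_h.append(np.abs(lh).sum() / n_pixels)
        nabs_v.append(np.abs(hl).sum() / n_pixels)
        nabs_d.append(np.abs(hh).sum() / n_pixels)
    return nabs_h, nabs_v, nabs_d
```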
Figure 7 shows representative images of statistical (Figure 7a,c,d) and structural (Figure 7b) textures. The first two images represent patterns with high horizontal and vertical directionality, respectively, while the other two images present statistical textures with a coarse-to-fine appearance.
The graphs to the right of each image show the NABS values of the horizontal, vertical and diagonal details ($NABS_{h}^{j}$, $NABS_{v}^{j}$, $NABS_{d}^{j}$) as a function of the decomposition level $j = 1, 2, 3, 4$. Analysis of the graphs shows considerable differences between the NABS values for the high-directionality texture patterns and for the non-directional patterns. This difference is very noticeable in the slopes of the trend line equations, which were obtained using the least squares approach.
It can therefore be concluded that a pattern presents high horizontal, vertical or diagonal directionality if the first-order trend line coefficients ($a_{\{h,v,d\}}$) show variations greater than a given percentage of the maximum of the coefficients.
To simplify the programming, a scale change is applied to the ratio between each of the coefficients and the maximum, expressed as a percentage (Equation (5)), so that the value $R_{\{h,v,d\}}$ is zero for the trend line with the maximum slope. Based on the empirical results, a texture may be considered highly directional (horizontal, vertical or diagonal) if the two values of $R_{\{h,v,d\}}$ different from zero are higher than 70%.
$R_{\{h,v,d\}} = 100 \cdot \left( 1 - \frac{a_{\{h,v,d\}}}{a_{\{h,v,d\}}^{max}} \right)$ (5)
Table 1 shows the significant difference between the vertical ($R_v$, 95.4%) and diagonal ($R_d$, 93.5%) details of the image in Figure 7a and its horizontal detail ($R_h$, 0.0%). The table also shows the variation of the horizontal detail ($R_h$, 94.7%) and the diagonal detail ($R_d$, 93.0%) with respect to the vertical detail ($R_v$, 0.0%) of the image in Figure 7b. The images in Figure 7a,b can therefore be said to show a high degree of horizontal and vertical directionality, respectively.
The resulting image $F$ is derived from the composition of the remaining detail subimages. In the case of the textures in Figure 7a,b, the new images are derived through the following expressions:
$F = W^{-1}\left\{ f_{HL}^{(1)} + f_{HH}^{(1)} \right\}$
$F = W^{-1}\left\{ f_{LH}^{(1)} + f_{HH}^{(1)} \right\}$
Regarding the behavior of the first-order trend line coefficients for the texture patterns shown in Figure 7a–d, it can be noticed that, in the statistical and structural textures with high directionality, there are variations greater than 90% in the trend line coefficients of two of the detail subimages with respect to the remaining one (Figure 7a,b). Therefore, as indicated above, a texture may be considered highly directional if two of the values $R_{\{h,v,d\}}$ are nonzero and higher than 70%.
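A possible implementation of this directionality test (Equation (5) with the 70% rule) and of the reconstruction that removes the approximation and the dominant detail subimage is sketched below; the least-squares slope fit, the threshold parameter and the helper names are illustrative assumptions, using NumPy and PyWavelets rather than the published code:

```python
# Sketch of the directionality test (Equation (5)) and of the reconstruction
# that removes the approximation and the dominant detail subimage.
import numpy as np
import pywt

def directionality(nabs_h, nabs_v, nabs_d, threshold=70.0):
    """Return 'h', 'v' or 'd' if the texture is highly directional, else None."""
    levels = np.arange(1, len(nabs_h) + 1, dtype=np.float64)
    # First-order trend-line coefficients (y = a*x + b) via least squares.
    slopes = {k: np.polyfit(levels, np.asarray(v, dtype=np.float64), 1)[0]
              for k, v in (('h', nabs_h), ('v', nabs_v), ('d', nabs_d))}
    a_max = max(slopes.values())
    ratios = {k: 100.0 * (1.0 - a / a_max) for k, a in slopes.items()}
    others = [k for k in ratios if ratios[k] != 0.0]
    # Highly directional if the two nonzero ratios exceed the 70% threshold.
    if len(others) == 2 and all(ratios[k] > threshold for k in others):
        return min(ratios, key=ratios.get)       # direction with R = 0
    return None

def reconstruct_without(image, direction, wavelet='haar'):
    """Rebuild F removing the approximation and the dominant detail subimage."""
    ll, (lh, hl, hh) = pywt.dwt2(np.asarray(image, dtype=np.float64), wavelet)
    details = {'h': lh, 'v': hl, 'd': hh}
    details[direction] = np.zeros_like(details[direction])
    return pywt.idwt2((np.zeros_like(ll),
                       (details['h'], details['v'], details['d'])), wavelet)
```

For the texture of Figure 7a, for example, this sketch would report high horizontal directionality and rebuild $F$ from the vertical and diagonal details only, as in the first expression above.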

3.2. Automatic Selection of the Appropriate Decomposition Level Using Shannon’s Entropy

The process described in Section 3.1 shows that, at certain decomposition levels, texture patterns with high directionality are clearly detected and, at the same time, so are the defects on them. It is very useful to have algorithms that automatically determine the optimal decomposition level, at which textures and defects are more easily detected. An algorithm based on Shannon’s entropy has been developed and is presented below.
It is well known that Shannon’s entropy describes the level of randomness or uncertainty in an image, i.e., how much information such an image provides. It is possible to state that the higher the value of the entropy, the greater the image quality [56]. Figure 8 shows how texture patterns fade as the decomposition level of the image is increased. The image degradation at successive levels can be measured and quantified through Shannon’s entropy.
The entropy value [57,58] is calculated according to Equation (6):
$S(X) = - \sum_{i=1}^{T} p(x_{i}) \cdot \log p(x_{i})$ (6)
where $X = \{x_{1}, x_{2}, \ldots, x_{T}\}$ is the set of $T$ values on which the entropy function is applied, and $p(x_{i})$ is the probability of occurrence associated with the value $x_{i}$. Since our analysis images are 256 gray-level images, the set $X$ consists of these 256 possible values, and the probability $p(x_{i})$ is obtained as the number of pixels in the image that have the gray level $x_{i}$ divided by the total number of pixels in the image (also referred to as $N_{t}$).
When performing the convolution of the image with a specific wavelet signal, unbounded real values, both positive and negative, are obtained. Thus, in order to calculate Shannon’s entropy in each subimage and for each decomposition level $j$, the values of each subimage are first mapped onto the range of integer values between 0 and 255. In addition to allowing the visualization of the resulting subimages, this makes it possible to calculate the entropy. For each decomposition level, Shannon’s entropy provides four values: $S_{LL}^{j}$, $S_{LH}^{j}$, $S_{HL}^{j}$ and $S_{HH}^{j}$. Each of these values is normalized by dividing it by the total number of pixels of the subimage at decomposition level $j$. Such values are referred to as $S_{s}^{j}$ (or $S_{LL}^{j}$) for the entropy of the approximation subimage and as $S_{h}^{j}$, $S_{v}^{j}$ and $S_{d}^{j}$ (or, respectively, $S_{LH}^{j}$, $S_{HL}^{j}$ and $S_{HH}^{j}$) for the entropy values of the horizontal, vertical and diagonal detail subimages.
The value of Shannon’s entropy indicates how much information on the texture of the original image remains in each subimage; it is calculated on each subimage derived from each decomposition level. According to Equation (6), entropy provides a measure of the histogram: the higher the entropy, the greater the uniformity of the histogram, i.e., the greater the information that the image contains on the texture. As the decomposition level increases, the subimages lose information about the texture patterns.
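A minimal sketch of the normalized entropy computed for a single subimage is shown below; the rescaling to 256 integer gray levels and the division by the number of pixels follow the description above, while the function name and NumPy-based implementation are illustrative assumptions:

```python
# Sketch of the normalized Shannon entropy of one wavelet subimage:
# values are mapped to integers in [0, 255], Equation (6) is applied,
# and the result is divided by the number of pixels of the subimage.
import numpy as np

def normalized_entropy(subimage):
    """Shannon entropy of a subimage rescaled to 256 gray levels, per pixel."""
    sub = np.asarray(subimage, dtype=np.float64)
    rng = sub.max() - sub.min()
    scaled = np.zeros_like(sub) if rng == 0 else (sub - sub.min()) / rng * 255.0
    gray = np.rint(scaled).astype(np.int64)
    hist = np.bincount(gray.ravel(), minlength=256).astype(np.float64)
    p = hist / hist.sum()
    p = p[p > 0]                                  # ignore empty gray levels
    entropy = -np.sum(p * np.log(p))              # Equation (6)
    return entropy / sub.size                     # normalized by pixel count
```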
At an optimal decomposition level, it would be possible to eliminate the texture pattern with no significant loss of information on the defects. In order to determine the optimal decomposition level, a ratio $r_{j}$ between the entropy value of the approximation subimage and the sum of the entropies of the four subimages is used:
$r_{j} = \frac{S_{s}^{j}}{S_{s}^{j} + S_{h}^{j} + S_{v}^{j} + S_{d}^{j}}$ (7)
Variations of this ratio allow detecting changes in the amount of texture information between two consecutive decomposition levels. The goal is to find the decomposition level causing the maximum variation of the value expressed in Equation (7) with regard to the value in the previous decomposition level. This would indicate that the texture pattern is still present at level $j$ and that it disappears at level $(j + 1)$, where, however, information on the defects would remain.
For this calculation process, the coefficient $ADR_{j}$ is defined as the difference between the values of $r_{j}$ at two consecutive decomposition levels:
$ADR_{j} = \begin{cases} 0 & j = 1 \\ r_{j} - r_{j-1} & j = 2, \ldots, J \end{cases}$ (8)
The optimal decomposition level ($J^{*}$) is defined as the level at which $ADR_{j}$ takes the highest value:
$J^{*} = \arg\max_{j} \{ ADR_{j} \}$ (9)
At that level, the decomposition process may end. Beyond that value (for $j > J^{*}$), it is possible to assume that the approximation image is already sufficiently smoothed: most of the texture patterns have been eliminated, and continuing the decomposition would lead to a loss of information on the defects.
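The selection of $J^{*}$ from Equations (7)–(9) could then be sketched as follows; this again assumes PyWavelets and a Haar wavelet, and the compact `_entropy` helper repeats the per-subimage entropy sketched earlier so that the block stays self-contained:

```python
# Sketch of the optimal-level selection: entropy ratio r_j (Equation (7)),
# coefficient ADR_j (Equation (8)) and J* (Equation (9)).
import numpy as np
import pywt

def _entropy(sub):
    """Shannon entropy of a subimage rescaled to 256 gray levels, per pixel."""
    sub = np.asarray(sub, dtype=np.float64)
    rng = sub.max() - sub.min()
    gray = np.zeros(sub.shape, dtype=np.int64) if rng == 0 else \
        np.rint((sub - sub.min()) / rng * 255.0).astype(np.int64)
    p = np.bincount(gray.ravel(), minlength=256) / gray.size
    p = p[p > 0]
    return -np.sum(p * np.log(p)) / sub.size

def optimal_level(image, max_level=4, wavelet='haar'):
    """Return J*: the level where the entropy ratio r_j varies the most."""
    approx = np.asarray(image, dtype=np.float64)
    r = []
    for _ in range(max_level):
        approx, (lh, hl, hh) = pywt.dwt2(approx, wavelet)
        s = [_entropy(x) for x in (approx, lh, hl, hh)]   # S_s, S_h, S_v, S_d
        r.append(s[0] / sum(s))                           # Equation (7)
    adr = [0.0] + [r[j] - r[j - 1] for j in range(1, len(r))]  # Equation (8)
    return int(np.argmax(adr)) + 1                        # Equation (9)
```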
Table 2 gathers the values of Shannon’s entropy calculated for the images in Figure 8, together with the coefficients $r_{j}$ calculated for each image at each decomposition level and the corresponding values of $ADR_{j}$.
Once the optimal decomposition level is determined, the process ends with the reconstruction of the image according to Equation (10).
$F(x, y) = W^{-1}\left\{ f_{LL}^{(J^{*})}(x, y) \right\}$ (10)
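A sketch of this final reconstruction, keeping only the approximation coefficients at level $J^{*}$ and zeroing all detail coefficients before the inverse transform, is shown below; it is PyWavelets-based and illustrative, not the authors’ implementation:

```python
# Sketch of Equation (10): rebuild the image from the approximation subimage
# at level J*, with all detail coefficients set to zero.
import numpy as np
import pywt

def reconstruct_from_approximation(image, j_star, wavelet='haar'):
    coeffs = pywt.wavedec2(np.asarray(image, dtype=np.float64),
                           wavelet, level=j_star)
    # coeffs[0] is f_LL at level J*; the remaining entries are detail tuples.
    coeffs = [coeffs[0]] + [tuple(np.zeros_like(d) for d in det)
                            for det in coeffs[1:]]
    return pywt.waverec2(coeffs, wavelet)
```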

3.3. Optimum Thresholding Method

The wavelet-based texture defect detection methods reviewed here conclude with a thresholding stage. Most authors use recognized methods, such as [59], or methods based on empirical adjustment [46]. These methods do not specify the criteria for the selection of the thresholding technique, nor do they provide objective measurements of the performance of the method. Sezgin’s survey [48] reviews a large number of thresholding methods for defect detection and categorizes them into six groups according to the information they use. These categories are:
  • histogram shape-based methods,
  • clustering-based methods,
  • entropy-based methods,
  • object attribute-based methods,
  • spatial methods and
  • local methods.
Sezgin’s findings show that the best thresholding results were achieved with the first three groups of methods.
Several thresholding methods have been used in the final stage of the EADL method in order to determine the following:
  • How does the selection of thresholding methods influence the final performance of the algorithm?
  • Which is the most suitable thresholding method for segmenting defects according to texture type?
To answer the above questions, the following thresholding methods, drawn from the first three groups proposed by Sezgin, were selected (an illustrative sketch of one of them follows the list):
  • Histogram shape-based methods: the thresholding method based on the gray-level average (Ave) and the thresholding method based on computing the minima of the maxima of the histogram (MiMa) [60].
  • Clustering-based methods: Ridler’s method [61], Trussell’s method [62], Otsu’s method [59] and Kittler’s method [63].
  • Entropy-based methods: Pun’s method [64], Kapur’s method [52] and Johanssen’s method [51].
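As an illustration of one method from the clustering-based group above, the following sketch implements Otsu’s thresholding [59] from its standard definition (maximizing the between-class variance of the gray-level histogram); it is not the authors’ implementation, and it assumes an 8-bit input image:

```python
# Otsu's thresholding, implemented from its standard definition.
import numpy as np

def otsu_threshold(img_u8):
    """Return the gray level that maximizes the between-class variance."""
    hist = np.bincount(np.asarray(img_u8, dtype=np.uint8).ravel(),
                       minlength=256).astype(np.float64)
    prob = hist / hist.sum()
    omega = np.cumsum(prob)                      # class-0 probability at each t
    mu = np.cumsum(prob * np.arange(256))        # cumulative mean
    mu_total = mu[-1]
    denom = omega * (1.0 - omega)
    denom[denom == 0] = np.nan                   # avoid division by zero
    sigma_b2 = (mu_total * omega - mu) ** 2 / denom
    return int(np.nanargmax(sigma_b2))

# Defect segmentation: pixels above the threshold form the binary defect mask,
# e.g. mask = reconstructed_image > otsu_threshold(reconstructed_image)
```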
In order to perform a quantitative analysis of the segmentation method proposed in this paper, appropriate metrics must be selected. From the set of metrics proposed by Sezgin [48] and Zhang [65,66], the misclassification error (ME) has been selected. ME represents a measurement of the number of misclassified pixels; misclassification occurs if the foreground (defect) is identified as the background (texture pattern), or vice versa.
$ME = 1 - \frac{\left| B_{P} \cap B_{T} \right| + \left| O_{P} \cap O_{T} \right|}{\left| B_{P} \right| + \left| O_{P} \right|}$ (11)
ME is calculated, using Equation (11), as the relation between the pixels of the test image (segmented by means of the method proposed in this paper) and those of a pattern image (manually segmented). $B_{T}$ (background test) and $O_{T}$ (object test) respectively indicate the number of pixels of the texture pattern and those of the defect in the test image. $B_{P}$ (background pattern) and $O_{P}$ (object pattern) respectively indicate the number of pixels of the texture pattern and those of the defect in the pattern image. The equation shows that the more both images look alike, the smaller $ME$ will be, being zero in the case of a perfect match between the manual segmentation and the automatic segmentation; this indicates maximum efficiency of the method proposed in the present paper.
Equation (12) is used to assess the yield of the segmentation method:
$\eta = 100 \cdot (1 - ME)$ (12)
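A sketch of how ME and $\eta$ could be computed from two binary masks (the manually segmented pattern image and the automatically segmented test image) is given below; the function and variable names are illustrative assumptions:

```python
# Sketch of the misclassification error (Equation (11)) and the yield
# (Equation (12)) computed from two boolean defect masks.
import numpy as np

def misclassification_error(pattern_mask, test_mask):
    """ME = 1 - (|B_P & B_T| + |O_P & O_T|) / (|B_P| + |O_P|)."""
    o_p = np.asarray(pattern_mask, dtype=bool)   # defect pixels, ground truth
    o_t = np.asarray(test_mask, dtype=bool)      # defect pixels, test image
    b_p, b_t = ~o_p, ~o_t                        # background (texture) pixels
    matched = np.logical_and(b_p, b_t).sum() + np.logical_and(o_p, o_t).sum()
    return 1.0 - matched / (b_p.sum() + o_p.sum())

def yield_percentage(pattern_mask, test_mask):
    """Equation (12): eta = 100 * (1 - ME)."""
    return 100.0 * (1.0 - misclassification_error(pattern_mask, test_mask))
```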
Figure 9 contains the flowchart that summarizes the EADL method. The most computationally expensive processes are those colored red in the flowchart. The computational complexity of each one is:
  • Compute $NABS$: $T(n) = 10 n^{2} + 10 n + 12 \in \Theta(n^{2})$.
  • Shannon entropy: $T(n) = 8 n^{2} + 10 n + k$, where $k \approx 200{,}000$; for values $n \geq 180$: $\Theta(n^{2})$.
  • Compute $r_{j}$ and $J^{*}$: $\Theta(\log_{2} n)$.
From the above individual computational complexities, a total computational complexity of order $\Theta(n^{2})$ is derived. On the other hand, the computational complexity of the algorithm used for wavelet decomposition is $90 n^{3} + \alpha n^{2} + 33 n + 47 + \beta$, with $\alpha = 212.5$ in the most favorable case and $\alpha = 219.5$ in the worst case; the value of $\beta$ indicates the number of instructions executed by the MbufPut2d() function of the Matrox Imaging Library (MIL). Therefore, the computational complexity of the EADL method is $\Theta(n^{3} \cdot \log_{2} n)$. From these calculations, it can be concluded that EADL does not increase the complexity of the process of calculating the wavelet transform.

4. Results

The EADL method, whose algorithm is shown in Figure 9, was implemented using the C programming language. The EADL method, the automatic band selection method [46] and the adaptive level-selecting method [25] for wavelet reconstruction were tested on a set of 223 texture images: 115 structural textures, namely milled surface (29), fabric (67) and bamboo weave (19); and 108 statistical textures, namely sandpaper surface (29), wood surface (19), wool surface (19), painted surface (21) and cast metal (20).
After checking different mother wavelets (Haar, symlets, biorthonormal, Meyer and coiflets), the Haar function with two coefficients was selected as the mother wavelet, because it shows the best ratio between yield and computational cost. The Haar wavelet has been applied up to the fourth decomposition level; higher decomposition levels have proven to lead to the merging of the defects with the texture pattern, which prevents their segmentation.

4.1. Statistical Textures

Figure 10 shows a representative set of different types of defects in statistical textures obtained from the group of 108 images used to test the EADL method: wood (a), painted surface (f), sandpaper (k), wool (p) and cast metal (u).
Table 3 shows the value $r_{j}$ calculated from the Shannon entropy ratio at different decomposition levels for the images of Figure 10 classified as statistical textures; it also shows the optimal decomposition level calculated by means of the EADL method, the automatic band selection method and the adaptive level-selecting method, respectively. Image (a) does not show the $r_{j}$ and $ADR_{j}$ figures, since its optimal decomposition level was found with the NABS algorithm ($J^{*} = 1$).
Metric analysis (Table 4) showed that, if only one thresholding method is used in the EADL programming for the group of statistical textures considered, that method should be MiMa, since it was the one that offered the best average performance in defect detection (95.00%). The maximum performance for each type of texture is underlined. On the other hand, if the thresholding method with the best performance is selected according to the kind of texture (wood, sandpaper, wool, painted surface or cast metal), two thresholding methods should be used:
  • The Kapur method for natural directional textures: wood (98.38%), sandpaper (98.47%) and wool (96.75%).
  • The MiMa method for artificial irregular isotropic statistical textures: painted surface (92.22%) and cast metal (92.25%).

4.2. Structural Textures

Figure 11 shows five types of defects in patterns of structural texture obtained from the group of 115 images used to test the EADL method: (a) milled surface, (f) fabric with fine appearance, (k) fabric with medium appearance, (p) fabric with coarse appearance and (u) bamboo weave.
Table 5 shows the value $r_{j}$ calculated from the Shannon entropy ratio at different decomposition levels for the images of Figure 11 classified as structural textures. It also shows the optimal decomposition level calculated by means of the EADL method, the automatic band selection method and the adaptive level-selecting method, respectively. Image (a) does not show the $r_{j}$ and $ADR_{j}$ figures, since its optimal decomposition level was found with the NABS algorithm ($J^{*} = 1$).
Table 6 depicts the numeric values obtained for the selected thresholding algorithms: a value of 100% indicates that the defect detected by EADL fully matches the same defect segmented by a qualified human inspector. The maximum performance for each type of texture is underlined.
As in the previous Section 4.1, the efficiency of the EADL method is determined when using a single thresholding method and when using several. The metric analysis results (Table 6) show that MiMa achieved the best average performance (92.22%). However, if the best-performing thresholding method is to be selected according to the type of texture (milled surfaces, bamboo weave and fabric), it is best to select:
  • Kapur’s method for artificial directional structural textures: milled surfaces (99.25%) and bamboo weave (91.11%).
  • The MiMa method for natural isotropic structural textures: fabric with fine appearance (96.25%) and fabric with coarse appearance (86.82%).
Table 7 shows the yields obtained in defect detection with the Tsai, Han and EADL methods on 115 structural and 108 statistical texture images. Table 7 also shows the optimal levels of wavelet decomposition ($J^{*}$) obtained by each method. As can be seen, the EADL method performance is higher for the two groups of analyzed textures. It can also be seen that the EADL method provides a lower average level of decomposition than the other methods discussed. A lower decomposition level means less degradation of the original image and, therefore, greater detail in detecting defects.

5. Conclusions

This paper presents a robust method for detecting defects in a wide variety of structural and statistical textures. An image reconstruction scheme based on the automatic selection of (1) the band, using NABS, and (2) the optimal wavelet transform resolution level, using Shannon’s entropy, has been used.
Valuable information about the directionality of the texture patterns can be extracted from the analysis of the NABS value of the horizontal, vertical and diagonal details at different decomposition levels.
A correct wavelet reconstruction scheme has been implemented to remove the texture patterns and highlight the defects in the resulting images.
It is demonstrated that the optimal decomposition levels computed from the Shannon entropy are lower than the ones provided by other methods based on the co-occurrence matrix (Han method) or on energy calculation (Tsai method). This implies that more information is retained in the image resulting from the wavelet reconstruction scheme. This characteristic, together with the optimal selection of a thresholding method (MiMa), has allowed the EADL method to achieve high performances in defect detection: 95.00% in statistical textures and 92.22% in structural textures.
An analysis of the results of the EADL method with nine different thresholding algorithms showed that selecting the appropriate thresholding method is important for achieving optimum performance in defect detection. On the basis of a metric analysis of 223 images, the most appropriate thresholding algorithm for each texture is proposed. The MiMa method proved to be the most appropriate for the textures, such as painted surfaces, cast metal and fabric. However, the Kapur method has been demonstrated to be better with wood, sandpaper surface, wool, milled surface and bamboo weave.

Acknowledgments

The work submitted here has been conducted within the framework of the ViSel-TR project (Selective Computer Vision Techniques for Non-structured Environments, Grant TIN2012-39279) funded by the Spanish government under the National Plan for R&D. This article is the result of the activity carried out under the “Research Programme for Groups of Scientific Excellence at Region of Murcia” of the Seneca Foundation (Agency for Science and Technology of the Region of Murcia—19895/GERM/15).

Author Contributions

Pedro J. Navarro, Carlos Fernández-Isla, Pedro María Alcover and Juan Suardíaz conceived and designed the algorithm; Pedro J. Navarro, Carlos Fernández-Isla, Pedro María Alcover and Juan Suardíaz performed the tests with the texture image database; Pedro J. Navarro and Pedro María Alcover carried out the metric analysis; Carlos Fernández-Isla contributed an exhaustive bibliography analysis; Pedro J. Navarro, Carlos Fernández-Isla, Pedro María Alcover and Juan Suardíaz wrote the paper; and Pedro J. Navarro, Carlos Fernández-Isla, Pedro María Alcover and Juan Suardíaz corrected the draft and approved the final version.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Wang, M.J.J.; Huang, C.L. Evaluating the eye fatigue problem in wafer inspection. IEEE Trans. Semiconduct. Manuf. 2004, 17, 444–447. [Google Scholar] [CrossRef]
  2. Fernández, C.; Suardíaz, J.; Navarro, P.J. Automated Visual Inspection Application within the Industry of Preserved Vegetables. In Proceedings of the International Conference on Quality Control by Artificial Vision (QCAV’01), Le Creusoft, France, 21–23 May 2001; pp. 1–6.
  3. Navarro, P.; Suardiaz, J.; Alcover, P.; Borraz, R.; Mateo, A.; Iborra, A. Teleoperated Visual Inspection System for Hull Spot-Blasting. In Proceedings of the IECON 2006—32nd Annual Conference on IEEE Industrial Electronics, Paris, France, 7–10 November 2006; pp. 3845–3850.
  4. Pernkopf, F.; O’Leary, P. Visual Inspection of Machined Metallic High-Precision Surfaces. EURASIP J. Adv. Signal Process. 2002, 2002, 667–678. [Google Scholar] [CrossRef]
  5. Jain, A.K. Fundamentals of Digital Image Processing; Prentice-Hall, Inc.: Upper Saddle River, NJ, USA, 1989. [Google Scholar]
  6. Lefebvre, A.; Corpetti, T.; Moy, L.H. Estimation of the orientation of textured patterns via wavelet analysis. Pattern Recognit. Lett. 2011, 32, 190–196. [Google Scholar] [CrossRef]
  7. Tsai, D.M.; Kuo, C.C. Defect detection in inhomogeneously textured sputtered surfaces using 3D Fourier image reconstruction. Mach. Vis. Appl. 2007, 18, 383–400. [Google Scholar] [CrossRef]
  8. Ngan, H.Y.; Pang, G.K.; Yung, N.H. Automated fabric defect detection—A review. Image Vis. Comput. 2011, 29, 442–458. [Google Scholar] [CrossRef]
  9. Armstrong, M.A. Groups and Symmetry; Springer-Verlag: New York, NY, USA, 1988. [Google Scholar]
  10. Xie, X. A Review of Recent Advances in Surface Defect Detection using Texture analysis Techniques. ELCVIA Electron. Lett. Comput. Vis. Image Anal. 2008, 7. [Google Scholar] [CrossRef]
  11. Dutta, S.; Datta, A.; Chakladar, N.D.; Pal, S.; Mukhopadhyay, S.; Sen, R. Detection of tool condition from the turned surface images using an accurate grey level co-occurrence technique. Precis. Eng. 2012, 36, 458–466. [Google Scholar] [CrossRef]
  12. Bi, M.; Sun, Z.; Li, Y. Textural Fabric Defect Detection using Adaptive Quantized Gray-level Co-occurrence Matrix and Support Vector Description Data. Inf. Technol. J. 2012, 11, 673–685. [Google Scholar]
  13. Hoseini, E.; Farhadi, F.; Tajeripour, F. Fabric Defect Detection Using Auto-Correlation Function. Int. J. Comput. Theory Eng. 2013, 5, 114–117. [Google Scholar] [CrossRef]
  14. NagaRaju, C.; NagaMani, S.; rakesh Prasad, G.; Sunitha, S. Morphological Edge Detection Algorithm Based on Multi-Structure Elements of Different Directions. Int. J. Inf. Commun. Technol. Res. 2011, 1, 37–43. [Google Scholar]
  15. Narendra, V.G.; Hareesh, K.S. Study and comparison of various image edge detection techniques used in quality inspection and evaluation of agricultural and food products by computer vision. Int. J. Agric. Biol. Eng. 2011, 4, 83–90. [Google Scholar]
  16. Fathi, A.; Monadjemi, A.H.; Mahmoudi, F. Defect Detection of Tiles with Combined Undecimated Wavelet Transform and GLCM Features. Int. J. Soft Comput. Eng. 2012, 2, 30–34. [Google Scholar]
  17. Ai, Y.; Xu, K. Feature extraction based on contourlet transform and its application to surface inspection of metals. Opt. Eng. 2012, 51, 113605–113605. [Google Scholar] [CrossRef]
  18. Chen, S.; Feng, J.; Zou, L. Study of fabric defects detection through Gabor filter based on scale transformation. In Proceedings of the 2010 International Conference on Image Analysis and Signal Processing, Beijing, China, 24–28 October 2010; pp. 97–99.
  19. Joshi, M.S.; Bartakke, P.P.; Sutaone, M.S. Texture representation using autoregressive models. In Proceedings of the International Conference on Advances in Computational Tools for Engineering Applications, (ACTEA ’09), Beirut, Lebanon, 15–17 July 2009; pp. 386–390.
  20. Bu, H.G.; Huang, X.B.; Wang, J.; Chen, X. Detection of Fabric Defects by Auto-Regressive Spectral Analysis and Support Vector Data Description. Text. Res. J. 2010, 80, 579–589. [Google Scholar] [CrossRef]
  21. Singh, S.; Kaur, M. Machine Vision System for Automated Visual Inspection of Tile’s Surface Quality. IOSR J. Eng. 2012, 2, 429–432. [Google Scholar] [CrossRef]
  22. Kumar, A. Computer-Vision-Based Fabric Defect Detection: A Survey. IEEE Trans. Ind. Electron. 2008, 55, 348–363. [Google Scholar] [CrossRef]
  23. Ngan, H.Y.; Pang, G.K.; Yung, N.H. Motif-based defect detection for patterned fabric. Pattern Recognit. 2008, 41, 1878–1894. [Google Scholar] [CrossRef]
  24. Ngan, H.Y.T.; Pang, G.K.H.; Yung, N.H.C. Performance Evaluation for Motif-Based Patterned Texture Defect Detection. IEEE Trans. Autom. Sci. Eng. 2010, 7, 58–72. [Google Scholar] [CrossRef]
  25. Han, Y.; Shi, P. An adaptive level-selecting wavelet transform for texture defect detection. Image Vis. Comput. 2007, 25, 1239–1248. [Google Scholar] [CrossRef]
  26. Zhu, D.; Pan, R.; Gao, W.; Zhang, J. Yarn-Dyed Fabric Defect Detection Based On Autocorrelation Function And GLCM. Autex Res. J. 2015, 15, 226–232. [Google Scholar] [CrossRef]
  27. Zhang, D.; Zhao, M.; Zhou, Z.; Pan, S. Characterization of Wire Rope Defects with Gray Level Co-occurrence Matrix of Magnetic Flux Leakage Images. J. Nondestruct. Eval. 2013, 32, 37–43. [Google Scholar] [CrossRef]
  28. Obula Konda Reddy, R.; Eswara Reddy, B.; Keshava Reddy, E. Classifying Similarity and Defect Fabric Textures based on GLCM and Binary Pattern Schemes. Int. J. Inf. Eng. Electron. Bus. 2013, 5, 25–33. [Google Scholar] [CrossRef]
  29. Tsai, D.M.; Huang, T.Y. Automated surface inspection for statistical textures. Image Vis. Comput. 2003, 21, 307–323. [Google Scholar] [CrossRef]
  30. Tsai, D.M.; Wu, S.C.; Li, W.C. Defect detection of solar cells in electroluminescence images using Fourier image reconstruction. Sol. Energy Mater. Sol. Cells 2012, 99, 250–262. [Google Scholar] [CrossRef]
  31. Hu, G.H.; Wang, Q.H.; Zhang, G.H. Unsupervised defect detection in textiles based on Fourier analysis and wavelet shrinkage. Appl. Opt. 2015, 54, 2963–2980. [Google Scholar] [CrossRef] [PubMed]
  32. Tsai, D.M.; Lin, C.P.; Huang, K.T. Defect detection in coloured texture surfaces using Gabor filters. Imaging Sci. J. 2005, 53, 27–37. [Google Scholar] [CrossRef]
  33. Jing, J.; Yang, P.; Li, P.; Kang, X. Supervised defect detection on textile fabrics via optimal Gabor filter. J. Ind. Text. 2014, 44, 40–57. [Google Scholar] [CrossRef]
  34. Abdollah Mirmahdave, S.; Abdollah, A.; Ahmadyfard, A.; Mosavi, M. Random Texture Defect Detection by Modeling the Extracted Features from the Optimal Gabor Filter. J. Adv. Comput. Res. 2015, 6, 656–685. [Google Scholar]
  35. Hu, G.H. Optimal ring Gabor filter design for texture defect detection using a simulated annealing algorithm. In Proceedings of the 2014 International Conference on Information Science, Electronics and Electrical Engineering (ISEEE), Sapporo, Japan, 26–28 April 2014; Volume 2, pp. 860–864.
  36. Hu, G.H. Automated defect detection in textured surfaces using optimal elliptical Gabor filters. Opt. Int. J. Light Electron Opt. 2015, 126, 1331–1340. [Google Scholar] [CrossRef]
  37. Lambert, G.; Bock, F. Wavelet methods for texture defect detection. In Proceedings of the International Conference on Image Processing, Santa Barbara, CA, USA, 26–29 October 1997; Volume 3, pp. 201–204.
  38. Sari, L.; Ertüzün, A. Texture Defect Detection Using Independent Vector Analysis in Wavelet Domain. In Proceedings of the 2014 22nd International Conference on Pattern Recognition (ICPR), Stockholm, Sweden, 24–28 August 2014; pp. 1639–1644.
  39. Li, W.C.; Tsai, D.M. Wavelet-based defect detection in solar wafer images with inhomogeneous texture. Pattern Recognit. 2012, 45, 742–756. [Google Scholar] [CrossRef]
  40. Ying, Y.L.; Chen, D.E. Defect Detection in Patterned Fabrics Using Wavelet Filter. Adv. Mater. Res. 2013, 756, 3831–3834. [Google Scholar] [CrossRef]
  41. Ghorai, S.; Mukherjee, A.; Gangadaran, M.; Dutta, P.K. Automatic Defect Detection on Hot-Rolled Flat Steel Products. IEEE Trans. Instrum. Meas. 2013, 62, 612–621. [Google Scholar] [CrossRef]
  42. Çelik, H.; Dülger, L.; Topalbekiroğlu, M. Development of a machine vision system: Real-time fabric defect detection and classification with neural networks. J. Text. Inst. 2014, 105, 575–585. [Google Scholar]
  43. Truchetet, F.; Laligant, O. Review of industrial applications of wavelet and multiresolution-based signal and image processing. J. Electron. Imag. 2008, 17, 031102. [Google Scholar] [CrossRef]
  44. Fujiwara, H.; Zhang, Z.; Hashimoto, K. Toward automated inspection of textile surfaces: Removing the textural information by using wavelet shrinkage. In Proceedings of the 2001 ICRA IEEE International Conference on Robotics and Automation, Seoul, Korea, 21–26 May 2001; Volume 4, pp. 3529–3534.
  45. Ngan, H.Y.; Pang, G.K.; Yung, S.; Ng, M.K. Wavelet based methods on patterned fabric defect detection. Pattern Recognit. 2005, 38, 559–576. [Google Scholar] [CrossRef]
  46. Tsai, D.M.; Chiang, C.H. Automatic band selection for wavelet reconstruction in the application of defect detection. Image Vis. Comput. 2003, 21, 413–431. [Google Scholar] [CrossRef]
  47. Tsai, D.M.; Hsiao, B. Automatic surface inspection using wavelet reconstruction. Pattern Recognit. 2001, 34, 1285–1305. [Google Scholar] [CrossRef]
  48. Sezgin, M.; Sankur, B. Survey over image thresholding techniques and quantitative performance evaluation. J. Electron. Imag. 2004, 13, 146–168. [Google Scholar]
  49. Haralick, R.M.; Shanmugam, K.; Dinstein, I. Textural Features for Image Classification. IEEE Trans. Syst. Man Cybern. 1973, SMC-3, 610–621. [Google Scholar] [CrossRef]
  50. Mallat, S.G. A theory for multiresolution signal decomposition: The wavelet representation. IEEE Trans. Pattern Anal. Mach. Intell. 1989, 11, 674–693. [Google Scholar] [CrossRef]
  51. Johannsen, G.; Bille, J. A Threshold Selection Method Using Information Measures. In Proceedings of the Sixth Int’l Conference Pattern Recognition, Munich, Germany, 19–22 October 1982; pp. 140–143.
  52. Kapur, J.; Sahoo, P.; Wong, A. A new method for gray-level picture thresholding using the entropy of the histogram. Comput. Vis. Graph. Image Process. 1985, 29, 273–285. [Google Scholar] [CrossRef]
  53. Tao, W.B.; Tian, J.W.; Liu, J. Image segmentation by three-level thresholding based on maximum fuzzy entropy and genetic algorithm. Pattern Recognit. Lett. 2003, 24, 3069–3078. [Google Scholar] [CrossRef]
  54. Yan, C.; Sang, N.; Zhang, T. Local entropy-based transition region extraction and thresholding. Pattern Recognit. Lett. 2003, 24, 2935–2941. [Google Scholar] [CrossRef]
  55. Melgani, F.; Serpico, S.B. A statistical approach to the fusion of spectral and spatio-temporal contextual information for the classification of remote-sensing images. Pattern Recognit. Lett. 2002, 23, 1053–1061. [Google Scholar] [CrossRef]
  56. Tsai, D.Y.; Lee, Y.; Matsuyama, E. Information Entropy Measure for Evaluation of Image Quality. J. Digit. Imag. 2008, 21, 338–347. [Google Scholar] [CrossRef] [PubMed]
  57. Coifman, R.R.; Wickerhauser, M.V. Entropy-based algorithms for best basis selection. IEEE Trans. Inf. Theory 1992, 38, 713–718. [Google Scholar] [CrossRef]
  58. Shannon, C.E. A mathematical theory of communication. Bell Syst. Tech. J. 1948, 27, 379–423. [Google Scholar] [CrossRef]
  59. Otsu, N. A Threshold Selection Method from Gray-Level Histograms. IEEE Trans. Syst. Man Cybern. 1979, 9, 62–66. [Google Scholar]
  60. Parker, J.R. Gray Level Thresholding in Badly Illuminated Images. IEEE Trans. Pattern Anal. Mach. Intell. 1991, 13, 813–819. [Google Scholar] [CrossRef]
  61. Ridler, T.; Calvard, S. Picture Thresholding Using an Iterative Selection Method. IEEE Trans. Syst. Man Cybern. 1978, 8, 630–632. [Google Scholar]
  62. Magid, A.; Rotman, S.R.; Weiss, A.M. Comments on Picture thresholding using an iterative selection method. IEEE Trans. Syst. Man Cybern. 1990, 20, 1238–1239. [Google Scholar] [CrossRef]
  63. Kittler, J.; Illingworth, J. Minimum error thresholding. Pattern Recognit. 1986, 19, 41–47. [Google Scholar] [CrossRef]
  64. Pun, T. Entropic thresholding, a new approach. Comput. Graph. Image Process. 1981, 16, 210–239. [Google Scholar] [CrossRef]
  65. Zhang, Y.J. A review of recent evaluation methods for image segmentation. In Proceedings of the Sixth International Symposium on Signal Processing and its Applications, Kuala Lumpur, Malaysia, 13–16 August 2001; Volume 1, pp. 148–151.
  66. Abak, A.T.; Baris, U.; Sankur, B. The performance evaluation of thresholding algorithms for optical character recognition. In Proceedings of the Fourth International Conference on Document Analysis and Recognition, Ulm, Germany, 18–20 August 1997; Volume 2, pp. 697–700.
Figure 1. Some statistical textures: (a) sandpaper, coarse appearance; (b) sandpaper, fine appearance; (c) painted surface, fine appearance; (d) wool, isotropic; (e) wood, high directionality; (f) cast metal, irregular.
Figure 2. Several examples of structural textures: (a) milled surface; (b) bamboo weave; (c) fabric, coarse appearance (p1); (d) fabric, fine appearance (p1).
Figure 3. Some homogeneous textures: (a) structural texture with directional pattern; (b) isotropic statistical texture; (c) isotropic patterned texture.
Figure 4. Homogeneous and inhomogeneous textures: (a) homogeneous structural texture; (b) homogeneous statistical texture; (c) inhomogeneous texture.
Figure 5. Visual representation of the algorithm of the two-dimension discrete wavelet transform (DWT-2D), where a filtering operation is carried out with a low-pass filter and a band-pass filter in each row. Both filters are applied again on the two resulting matrices, this time in the columns. The result is four subimages, whose size is a quarter of the original image; one of these images is known as the approximation image (double filter $LL$), and the remaining three are known as detail images: horizontal ($LH$), vertical ($HL$) or diagonal ($HH$).
Figure 6. Texture patterns (a,f) and subimages of the first wavelet decomposition level for directional textures: approximation (b,g); horizontal (c,h); vertical (d,i); and diagonal (e,j).
Figure 7. Graphic representation of the normalized absolute function values (NABS) ( N A B S h j , N A B S v j , N A B S d j ) and their trend line equations calculated at four decomposition levels of images (ad). Notice that textures (a,b) are selected to be highly directional since they show a trend line slope significantly greater than the others corresponding to certain subimages.
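The directionality test illustrated in Figure 7 amounts to fitting a first-order trend line to the NABS values of each detail orientation across decomposition levels and comparing the slopes. In the sketch below the nabs() body (mean absolute detail coefficient) is only a placeholder for the paper's NABS definition, and the Haar wavelet is an arbitrary choice.

```python
# Hedged sketch of the per-orientation trend-line fit of Figure 7.
import numpy as np
import pywt

def nabs(coeffs: np.ndarray) -> float:
    # Placeholder for the paper's NABS definition: mean absolute coefficient value.
    return float(np.mean(np.abs(coeffs)))

def detail_slopes(image: np.ndarray, levels: int = 4, wavelet: str = "haar") -> dict:
    values = {"h": [], "v": [], "d": []}
    approx = image
    for _ in range(levels):
        approx, (h, v, d) = pywt.dwt2(approx, wavelet)
        values["h"].append(nabs(h))
        values["v"].append(nabs(v))
        values["d"].append(nabs(d))
    x = np.arange(1, levels + 1)
    # First-order coefficient 'a' of the trend line y = a*x + b for each orientation;
    # a texture is treated as highly directional when one slope dominates the others.
    return {k: float(np.polyfit(x, np.array(vals), 1)[0]) for k, vals in values.items()}
```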
Figure 8. Approximation subimages (f_LL^(j)) of four wavelet decomposition levels for different textures. (a) Sandpaper (coarse appearance); (b) sandpaper (fine appearance) and (c) painted surface: statistical textures; (d) fabric (coarse appearance) and (e) fabric (fine appearance): structural textures.
Figure 9. Flowchart of the entropy-based automatic selection of the wavelet decomposition level (EADL) method. See Equation (5) for the definition of R_h, R_v and R_d.
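As a rough illustration of the flow in Figure 9, the sketch below decomposes the image level by level, computes a Shannon entropy for each subimage, and keeps the level at which the approximation-to-total entropy ratio changes the most. The histogram-based entropy and the ratio/ADR rule are assumptions consistent with Table 2 below, not a transcription of Equations (5)–(9).

```python
# Hedged sketch of the EADL level-selection loop (Figure 9).
import numpy as np
import pywt

def shannon_entropy(subimage: np.ndarray, bins: int = 256) -> float:
    # Shannon entropy of the grey-level histogram; the paper's normalization may differ.
    hist, _ = np.histogram(subimage, bins=bins)
    p = hist / hist.sum()
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

def eadl_select_level(image: np.ndarray, max_level: int = 4, wavelet: str = "haar") -> int:
    r, approx = [], image
    for _ in range(max_level):
        approx, (h, v, d) = pywt.dwt2(approx, wavelet)
        s = [shannon_entropy(x) for x in (approx, h, v, d)]
        r.append(s[0] / sum(s))                   # r_j: share of the approximation entropy
    adr = [abs(r[j] - r[j - 1]) for j in range(1, max_level)]
    return int(np.argmax(adr)) + 2                # J*: level with the largest ADR_j
```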
Figure 10. Defects in different statistical textures: (a) wood, (f) painted surface, (k) sandpaper, (p) wool, (u) cast metal; (b,g,l,q,v) are the images resulting from the Tsai method; (c,h,m,r,w) are the images resulting from the Han method; (d,i,n,s,x) are the images resulting from the EADL method; (e,j,o,t,y) are the ground-truth images resulting from segmentation carried out by human inspectors.
Figure 11. Defects in different structural textures: (a) milled surface, (f) fabric (fine appearance), (k) fabric (medium appearance), (p) fabric (coarse appearance), (u) bamboo weave; (b,g,l,q,v) are the images resulting from the Tsai method; (c,h,m,r,w) are the images resulting from the Han method; (d,i,n,s,x) are the images resulting from the EADL method; (e,j,o,t,y) are the ground-truth images resulting from segmentation carried out by human inspectors.
Table 1. Trend line equation coefficients and maximum variation ratios of the images in Figure 7.
Image | a_h | a_v | a_d | R_h | R_v | R_d
Figure 7a, wood | 77.6 | 3.5 | 5.0 | 0% | 95.4% | 93.5%
Figure 7b, milled surface | 4.2 | 78.9 | 5.6 | 94.7% | 0% | 93.0%
Figure 7c, sandpaper (coarse) | 29.4 | 30.8 | 27.8 | 4.5% | 0% | 9.7%
Figure 7d, sandpaper (fine) | 5.6 | 6.5 | 8.6 | 34.9% | 24.8% | 0%
a_h, a_v, a_d: first-order coefficients of the trend line y = ax + b; R_h, R_v, R_d: maximum variation ratios.
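A quick reading of Table 1, offered only as an inference and not as Equation (5) itself: the maximum variation ratios behave as if R_x = (1 − a_x / max(a_h, a_v, a_d)) × 100%, so the dominant orientation scores 0% and a strongly directional texture scores close to 100% in the remaining orientations. The snippet below reproduces the wood row to within the rounding of the tabulated slopes.

```python
# Consistency check against the wood row of Table 1 (inferred formula, hypothetical).
slopes = {"h": 77.6, "v": 3.5, "d": 5.0}          # trend-line slopes for Figure 7a, wood
a_max = max(slopes.values())
R = {k: (1 - a / a_max) * 100 for k, a in slopes.items()}
print({k: round(v, 1) for k, v in R.items()})     # {'h': 0.0, 'v': 95.5, 'd': 93.6} vs. 0%, 95.4%, 93.5%
```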
Table 2. Normalized entropies at four decomposition levels for the statistical and structural isotropic textures of Figure 8. The optimal resolution level (J*) is highlighted for every texture.
Decomposition level | S_s^j | S_h^j | S_v^j | S_d^j | r_j | ADR_j

Sandpaper, coarse appearance
j = 1 | 0.0001337 | 0.0001237 | 0.0001210 | 0.0001020 | 0.2782 | -
j = 2 | 0.0005385 | 0.0005169 | 0.0005102 | 0.0004868 | 0.2624 | 0.0158
j = 3 | 0.0017080 | 0.0020344 | 0.0020738 | 0.0020759 | 0.2164 | 0.0460
j = 4 (J*) | 0.0080432 | 0.0072046 | 0.0071588 | 0.0068548 | 0.2749 | 0.0585

Sandpaper, fine appearance
j = 1 | 0.0001255 | 0.0001228 | 0.0001231 | 0.0000972 | 0.2678 | -
j = 2 | 0.0005295 | 0.0005121 | 0.0005184 | 0.0005024 | 0.2568 | 0.0111
j = 3 | 0.0013122 | 0.0017993 | 0.0018528 | 0.0020630 | 0.1867 | 0.0700
j = 4 (J*) | 0.0069050 | 0.0058039 | 0.0060676 | 0.0059565 | 0.2792 | 0.0925

Painted surface
j = 1 | 0.0001273 | 0.0000464 | 0.0000485 | 0.0000295 | 0.5060 | -
j = 2 (J*) | 0.0005614 | 0.0003787 | 0.0003889 | 0.0002652 | 0.3522 | 0.1538
j = 3 | 0.0021846 | 0.0017481 | 0.0018008 | 0.0016123 | 0.2974 | 0.0548
j = 4 | 0.0083969 | 0.0076082 | 0.0076374 | 0.0066278 | 0.2774 | 0.0200

Fabric, coarse appearance
j = 1 | 0.0001309 | 0.0001189 | 0.0001249 | 0.0000966 | 0.2778 | -
j = 2 | 0.0005388 | 0.0005157 | 0.0005202 | 0.0005054 | 0.2590 | 0.0187
j = 3 (J*) | 0.0014766 | 0.0018525 | 0.0019925 | 0.0021241 | 0.1983 | 0.0607
j = 4 | 0.0068620 | 0.0068641 | 0.0068436 | 0.0063836 | 0.2546 | 0.0563

Fabric, fine appearance
j = 1 | 0.0001309 | 0.0001078 | 0.0001066 | 0.0000894 | 0.3012 | -
j = 2 (J*) | 0.0005310 | 0.0005461 | 0.0005289 | 0.0005220 | 0.2495 | 0.0517
j = 3 | 0.0020043 | 0.0022291 | 0.0022342 | 0.0021072 | 0.2337 | 0.0158
j = 4 | 0.0080267 | 0.0078364 | 0.0078882 | 0.0079990 | 0.2528 | 0.0191
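The values in Table 2 are internally consistent with a simple reading of the selection rule: r_j is the approximation entropy divided by the sum of the four subimage entropies, ADR_j is |r_j − r_{j−1}|, and J* is the level with the largest ADR_j. The snippet below checks this for the painted-surface rows; it is a consistency check against the table, not a restatement of Equations (7)–(9).

```python
# Consistency check for the painted-surface rows of Table 2.
S = {  # (S_s, S_h, S_v, S_d) per decomposition level, copied from the table
    1: (0.0001273, 0.0000464, 0.0000485, 0.0000295),
    2: (0.0005614, 0.0003787, 0.0003889, 0.0002652),
    3: (0.0021846, 0.0017481, 0.0018008, 0.0016123),
    4: (0.0083969, 0.0076082, 0.0076374, 0.0066278),
}
r = {j: s[0] / sum(s) for j, s in S.items()}           # ~0.5060, 0.3522, 0.2974, 0.2774
adr = {j: abs(r[j] - r[j - 1]) for j in (2, 3, 4)}     # ~0.1538, 0.0548, 0.0200
print(max(adr, key=adr.get))                           # 2, the highlighted J* for this texture
```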
Table 3. J* calculated from the statistical textures of Figure 10a,f,k,p,u with the EADL method (r_j coefficients are shown), with the automatic band selection method (Tsai [46]) and with the adaptive level-selecting method (Han [25]). (For r_j, ADR_j and J*, see Equations (7)–(9).)
Image in Figure 10 |  | j = 1 | j = 2 | j = 3 | j = 4 | J* EADL | J* Tsai | J* Han
(a) wood | r_j | - | - | - | - | 1 | 1 | 4
(a) wood | ADR_j | - | - | - | -
(f) painted surface | r_j | 0.3550 | 0.3020 | 0.2954 | 0.2696 | 2 | 4 | 3
(f) painted surface | ADR_j | - | 0.0530 | 0.0067 | 0.0258
(k) sandpaper | r_j | 0.3550 | 0.3020 | 0.2954 | 0.2696 | 3 | 4 | 4
(k) sandpaper | ADR_j | - | 0.0147 | 0.0311 | 0.0225
(p) wool | r_j | 0.2700 | 0.2469 | 0.2171 | 0.2548 | 4 | 3 | 4
(p) wool | ADR_j | - | 0.0231 | 0.0298 | 0.0377
(u) cast metal | r_j | 0.2765 | 0.2687 | 0.2540 | 0.2529 | 3 | 4 | 4
(u) cast metal | ADR_j | - | 0.0078 | 0.0148 | 0.0011
Table 4. Numerical values obtained using different thresholding methods on statistical textures: (1) the average method, (2) the minima of the maxima of the histogram (MiMa) method, (3) the Riddler method, (4) the Thrussel method, (5) the Otsu method, (6) the Pun method, (7) the Kapur method, (8) the Johanssen method and (9) the Kittler method.
Statistical Textures | N | (1) | (2) | (3) | (4) | (5) | (6) | (7) | (8) | (9)
Wood | 19 | 91.86 | 97.81 | 97.45 | 94.64 | 93.57 | 63.60 | 98.38 * | 97.21 | 63.62
Sandpaper surface | 29 | 56.48 | 98.39 | 54.72 | 73.24 | 78.96 | 55.00 | 98.47 * | 91.34 | 65.31
Wool | 19 | 56.16 | 94.34 | 55.12 | 55.96 | 73.51 | 52.23 | 96.75 * | 87.47 | 41.21
Painted surface | 21 | 61.26 | 92.22 * | 71.84 | 73.97 | 83.94 | 59.35 | 91.12 | 91.77 | 86.38
Cast metal | 20 | 58.14 | 92.25 * | 63.65 | 62.57 | 59.60 | 57.42 | 62.50 | 57.54 | 65.28
Averages | 108 | 64.78 | 95.00 * | 68.56 | 72.08 | 77.91 | 57.52 | 89.44 | 85.07 | 64.36
N: number of images; *: best result for each texture (highlighted in the original).
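For context, applying any of the nine benchmarked thresholding algorithms to the wavelet-reconstructed image reduces to a single binarization step. The sketch below uses Otsu's method (column (5)) via scikit-image purely as an illustration; it does not reproduce the paper's scoring of the methods against the ground-truth masks.

```python
# Illustrative binarization of a reconstructed image with Otsu's threshold (column (5)).
import numpy as np
from skimage.filters import threshold_otsu

def segment_defects(reconstructed: np.ndarray) -> np.ndarray:
    """Return a boolean defect mask for the wavelet-reconstructed image."""
    t = threshold_otsu(reconstructed)
    return reconstructed > t
```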
Table 5. J* calculated from the structural textures of Figure 11a,f,k,p,u with the EADL method (r_j coefficients are shown), with the automatic band selection method (Tsai [46]) and with the adaptive level-selecting method (Han [25]). (For r_j, ADR_j and J*, see Equations (7)–(9).)
Image in Figure 11 |  | j = 1 | j = 2 | j = 3 | j = 4 | J* EADL | J* Tsai | J* Han
(a) milled surface | r_j | - | - | - | - | 1 | 1 | 3
(a) milled surface | ADR_j | - | - | - | -
(f) fabric, fine appearance | r_j | 0.3962 | 0.2872 | 0.2745 | 0.2741 | 2 | 4 | 3
(f) fabric, fine appearance | ADR_j | - | 0.1090 | 0.0127 | 0.0005
(k) fabric, medium appearance | r_j | 0.4855 | 0.2578 | 0.2320 | 0.2370 | 2 | 4 | 4
(k) fabric, medium appearance | ADR_j | - | 0.2278 | 0.0258 | 0.0050
(p) fabric, coarse appearance | r_j | 0.2669 | 0.2463 | 0.2069 | 0.2349 | 3 | 1 | 4
(p) fabric, coarse appearance | ADR_j | - | 0.0206 | 0.0394 | 0.0280
(u) bamboo weave | r_j | 0.3908 | 0.2855 | 0.2580 | 0.2426 | 2 | 4 | 4
(u) bamboo weave | ADR_j | - | 0.1053 | 0.0275 | 0.0154
Table 6. Numerical values obtained using different thresholding methods on structural textures: (1) the average method, (2) the MiMa method, (3) the Riddler method, (4) the Thrussel method, (5) the Otsu method, (6) the Pun method, (7) the Kapur method, (8) the Johanssen method and (9) the Kittler method.
Structural Textures | N | (1) | (2) | (3) | (4) | (5) | (6) | (7) | (8) | (9)
Milled surface | 29 | 78.87 | 97.25 | 98.05 | 97.56 | 85.86 | 34.26 | 99.25 * | 98.10 | 34.26
Fabric (fine, medium) | 38 | 73.78 | 96.25 * | 75.23 | 79.29 | 84.00 | 66.99 | 91.84 | 79.56 | 70.45
Fabric (coarse) | 29 | 56.20 | 86.82 * | 60.02 | 61.26 | 60.68 | 53.16 | 83.95 | 61.07 | 59.20
Bamboo weave | 19 | 52.12 | 88.56 | 65.76 | 61.54 | 52.59 | 51.88 | 91.11 * | 62.76 | 39.55
Averages | 115 | 65.24 | 92.22 * | 74.77 | 74.91 | 70.78 | 51.57 | 91.54 | 75.37 | 50.87
N: number of images; *: best result for each texture (highlighted in the original).
Table 7. Yield values in defect detection for the Tsai [46], Han [25] and EADL methods, together with the corresponding optimal decomposition levels (J*).
Statistical Textures | N | Tsai [46] | J* | Han [25] | J* | EADL | J*
Wood | 19 | 92.75 | 4.00 | 82.17 | 3.75 | 97.81 | 1.05
Sandpaper surface | 29 | 91.42 | 4.00 | 77.81 | 3.93 | 98.39 | 2.27
Wool | 19 | 92.75 | 4.00 | 52.88 | 1.00 | 94.34 | 2.00
Painted surface | 21 | 80.73 | 3.78 | 49.49 | 3.13 | 92.22 | 2.03
Cast metal | 20 | 78.50 | 4.00 | 63.79 | 4.00 | 92.25 | 2.00
Averages | 108 | 87.23 | 3.96 | 65.23 | 3.16 | 95.00 | 1.87
Structural Textures | N | Tsai | J* | Han | J* | EADL | J*
Milled surfaces | 29 | 65.05 | 4.00 | 72.82 | 1.00 | 97.25 | 1.10
Fabric (fine, medium) | 38 | 58.20 | 3.95 | 62.21 | 3.45 | 96.25 | 1.74
Fabric (coarse) | 29 | 73.08 | 4.00 | 51.59 | 3.97 | 86.82 | 2.00
Bamboo weave | 19 | 81.73 | 3.80 | 54.51 | 4.00 | 88.56 | 2.00
Averages | 115 | 69.51 | 3.94 | 60.28 | 3.10 | 92.22 | 1.71
