Article

An Analysis of Global and Adaptive Thresholding for Biometric Images Based on Neutrosophic Overset and Underset Approaches

by Vinoth Dhatchinamoorthy and Ezhilmaran Devarasan *
Department of Mathematics, School of Advanced Sciences, Vellore Institute of Technology, Vellore 632014, India
* Author to whom correspondence should be addressed.
Symmetry 2023, 15(5), 1102; https://doi.org/10.3390/sym15051102
Submission received: 21 January 2023 / Revised: 6 March 2023 / Accepted: 11 March 2023 / Published: 18 May 2023
(This article belongs to the Section Computer)

Abstract

The study introduces new thresholding methods based on neutrosophic sets, applying the neutrosophic overset and underset concepts to image thresholding. Two types of thresholding are considered: the global threshold method and the adaptive threshold method. In professional disciplines, images may be symmetrical or asymmetrical; governments maintain facial image databases as symmetrical, whereas general-purpose images need not be symmetrical. It is therefore essential to know how thresholding functions in both scenarios. Since the article focuses on biometric image data, face and fingerprint data were considered for the analysis. The proposal provides six techniques for the global threshold method based on neutrosophic memberships: the neutrosophic TF overset (NO_TF), neutrosophic TI overset (NO_TI), neutrosophic TIF overset (NO_TIF), neutrosophic TF underset (NU_TF), neutrosophic TI underset (NU_TI), and neutrosophic TIF underset (NU_TIF); similarly, six novel approaches were generated for the adaptive method. These techniques were investigated on biometric data such as fingerprints and facial images, achieving 98% accuracy for facial image data and 100% accuracy for fingerprint data.

1. Introduction

Enhancing an image is necessary to perfect its appearance and to highlight the aspects of the contained information relevant to further processing. Thresholding techniques comprise various algorithms based on the characteristics of the image. The gray-level histogram of the image plays a major role in thresholding, as it allows objects to be separated from the image background; proper thresholding requires this separation, and the characteristics of the image can affect the outcome directly or indirectly. Neutrosophic theory, as applied to digital image processing, includes a truth membership function, an indeterminacy function, and a falsity (non-membership) function. The authors implemented the min–max normalization method to reduce uncertain noise in the images. Through the computed membership functions, the non-linearity problem was solved by applying activation functions. The transformed sets of neutrosophic images helped us to find the similarities and dissimilarities of the image. In addition, to reduce uncertain noise in an image, Jha [1] discussed a novel neutrosophic set (NS) approach to image segmentation that overcomes uncertain intensity issues arising from missing data. Interval neutrosophic sets (INS) enable the transformation of an image and the description of the intervals of the membership functions; this approach helps to evaluate the contrast between the membership functions and also defines a score function for INS. Yuan [2] proposed a new image segmentation method based on INS, and the experimental results showed that it achieved higher PSNR values and performed better than the k-means clustering algorithm. Song [3] proposed a fast image segmentation method that combines a saliency map with NS (SMNS) to achieve higher accuracy. The SMNS method can effectively solve under-segmentation and over-segmentation issues, and it performs well in the presence of salt-and-pepper noise, Gaussian white noise, and mixed noise. Multi-class image segmentation can handle uncertainty management via weak continuity constraints in the NS domain [4]; this method segments images using their spatial and boundary data. An advantage of this method is that it can perform segmentation without prior knowledge of the number of classes in the image, using an iterative technique. The modified Cramer–Rao bound is used to statistically validate noise perturbations. Ji [5] also proposed a neutrosophic c-means clustering method for color image segmentation that incorporates gradient-based structural similarity to address misclassification issues; instead of a maximum membership rule, the method applies similarity over superpixels. Additionally, the linguistic neutrosophic cubic set (LNCS) method is used for NS membership degrees, which are combined using aggregation operators; the LNCS method is validated with various noise types, including Gaussian, speckle, and Poisson noise. The neutrosophic convolutional neural network (NCNN) incorporates NS theory into CNN techniques to segment or classify images; NCNN achieves 5.11% and 2.21% on the MNIST and CIFAR-10 datasets, respectively, across five different noise levels. Yang [6] introduced the adaptive local ternary pattern (ALTP) using Weber's law; the ALTP features select the threshold for local ternary patterns automatically.
That proposal focused on face recognition with center-symmetric adaptive local ternary (CSALT) patterns, which extract more discriminative information from facial images. For the ORL and JAFFE face datasets, weighted nonnegative matrix factorization (WNMF) achieved 98% and 100%, respectively, which is more efficient than the PCA algorithm [7]. Alagarsamy [8] proposed ear and face recognition via a Runge–Kutta threshold with ring projection; this proposal was examined on the IIT Delhi dataset and ORL face data, achieving approximately 96%. An analysis of ear symmetry is necessary to understand whether an individual's left and right ears can be matched; occluded portions of the ear in surveillance video were reconstructed, and ear symmetry was assessed geometrically using symmetry operators and Iannarelli's measurements [9]. Das [10] proposed a new pixel-based scheme to segment fingerprint images, consisting of three phases: image enhancement, threshold evaluation, and post-processing. Based on the analysis, the author concluded that the SVM algorithm is not suitable when recognition speed is the key factor. Using the opening and closing morphological operations, Wan [11] improved the robustness of Otsu's algorithm. In this direction, we have presented a literature review of the individual contributions in this area.
Regarding biometric data such as fingerprint and face data, NS and thresholding play important roles. Our objective is to utilize NS, with its membership functions, to produce better outcomes by analyzing the data in three ways. Each NS-domain image contains more features than the classical image features, which is advantageous for making decisions in scenarios where indeterminacy exists. Since biometric data contain both symmetrical and asymmetrical images, it is essential to analyze both aspects. In this study, we limited our focus to fingerprint and facial images. Our aim is to implement the NS overset and underset concepts in the thresholding process; this method thresholds images by dual- or triple-step conditions. We apply neutrosophic thresholding to face and fingerprint images because our goal is to integrate these concepts. The remaining sections of this article are organized as follows: Section 2 presents the preliminaries, Section 3 explains the proposed method, and Section 4 summarizes the findings and discusses the results obtained. Our conclusions and the scope of future work are presented in Section 5.

2. Preliminaries

Definition 1.
Let X be a universe of discourse, with a generic element in X denoted by x; then a neutrosophic set A is an object with the form [12]
$$A = \{\langle x, T_A(x), I_A(x), F_A(x)\rangle : x \in X\}$$
where the functions $T_A, I_A, F_A : X \to\ ]^-0, 1^+[$ define, respectively, the degree of truth, the degree of indeterminacy, and the degree of falsity of the element $x \in X$ to the set, subject to the condition
$$^-0 \le T_A(x) + I_A(x) + F_A(x) \le 3^+$$
Definition 2 ([13]).
Let X be a space of points (objects), with a generic element in X denoted by x. A single-valued neutrosophic set (SVNS) A in X is characterized by the truth membership function $T_A$, the indeterminacy membership function $I_A$, and the falsity membership function $F_A$. For each point x in X, $T_A, I_A, F_A \in [0, 1]$. When X is continuous, an SVNS A can be written as
$$A = \int_X \langle T(x), I(x), F(x)\rangle / x, \quad x \in X$$
When X is discrete, an SVNS A can be written as
$$A = \sum_{i=1}^{n} \langle T(x_i), I(x_i), F(x_i)\rangle / x_i, \quad x_i \in X$$
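For image data, an SVNS can be held directly as three membership maps. The following is a minimal illustrative sketch (our own, not from the paper; the class name SVNSImage is hypothetical) of such a representation in Python:

```python
import numpy as np
from dataclasses import dataclass

@dataclass
class SVNSImage:
    """Single-valued neutrosophic image: one membership map per component."""
    T: np.ndarray  # truth membership, values in [0, 1]
    I: np.ndarray  # indeterminacy membership, values in [0, 1]
    F: np.ndarray  # falsity membership, values in [0, 1]

    def __post_init__(self):
        # Enforce the SVNS constraint T, I, F in [0, 1] per pixel.
        self.T = np.clip(self.T, 0.0, 1.0)
        self.I = np.clip(self.I, 0.0, 1.0)
        self.F = np.clip(self.F, 0.0, 1.0)
```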
Definition 3.
A neutrosophic image $P_{NR}$ is characterized by the neutrosophic components T, I, F, where $P_{NR}$ are the pixel values of the image. Neutrosophic images are universally approached as gray-level images. Therefore, the neutrosophic image set is defined as [14]
$$P_{NR}(i,j) = \{T(i,j), I(i,j), F(i,j)\}$$
In general, the mean values and standard deviations of the image are taken as the truth and indeterminacy memberships. The image pixels are transformed by the following formulae:
$$T(i,j) = \frac{\bar{p}(i,j) - \bar{p}_{\min}}{\bar{p}_{\max} - \bar{p}_{\min}}, \qquad \bar{p}(i,j) = \frac{1}{w \times w} \sum_{m=i-w/2}^{i+w/2} \; \sum_{n=j-w/2}^{j+w/2} p(m,n)$$
$$I(i,j) = \frac{\delta(i,j) - \delta_{\min}}{\delta_{\max} - \delta_{\min}}, \qquad \delta(i,j) = \lvert p(i,j) - \bar{p}(i,j)\rvert$$
$$F(i,j) = 1 - T(i,j)$$
where $\bar{p}(i,j)$ is the pixel mean in the $w \times w$ region and, generally, $w = 2n + 1$ $(n \ge 1)$.
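A minimal sketch of this transformation, assuming a grayscale image stored as a float NumPy array and using scipy.ndimage.uniform_filter for the w × w local mean (the small epsilon guarding against division by zero is our addition):

```python
import numpy as np
from scipy.ndimage import uniform_filter

def neutrosophic_memberships(img: np.ndarray, w: int = 3):
    """Map a grayscale image to (T, I, F) following Definition 3."""
    img = img.astype(np.float64)
    p_bar = uniform_filter(img, size=w)          # local mean over a w x w window
    T = (p_bar - p_bar.min()) / (p_bar.max() - p_bar.min() + 1e-12)
    delta = np.abs(img - p_bar)                  # deviation from the local mean
    I = (delta - delta.min()) / (delta.max() - delta.min() + 1e-12)
    F = 1.0 - T
    return T, I, F
```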
Definition 4 (Single-valued neutrosophic overset [15]).
A single-valued neutrosophic overset A is defined as $A = \{\langle x, T_A(x), I_A(x), F_A(x)\rangle : x \in X\}$, such that at least one element of A has at least one neutrosophic component that is > 1, and no element has a neutrosophic component that is < 0.
Definition 5 (Single-valued neutrosophic underset [15]).
A single-valued neutrosophic underset A is defined as $A = \{\langle x, T_A(x), I_A(x), F_A(x)\rangle : x \in X\}$, such that at least one element of A has at least one neutrosophic component that is < 0, and no element has a neutrosophic component that is > 1.
The performance is measured by [16]
$$\eta = \frac{\lvert B_0 \cap B_T\rvert + \lvert F_0 \cap F_T\rvert}{\lvert B_0\rvert + \lvert F_0\rvert}$$
where $B_0$ is the background of the ground truth image, $F_0$ denotes the foreground of the ground truth image, $B_T$ represents the background area pixels, $F_T$ denotes the foreground area pixels in the thresholded image, and $\lvert\cdot\rvert$ is the cardinality of the set.
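As an illustration, the measure can be computed from two binary masks (foreground = 1) as follows; the function name eta and the mask conventions are our assumptions:

```python
import numpy as np

def eta(gt: np.ndarray, pred: np.ndarray) -> float:
    """Performance measure eta = (|B0 ∩ BT| + |F0 ∩ FT|) / (|B0| + |F0|)."""
    F0, B0 = gt.astype(bool), ~gt.astype(bool)      # ground-truth foreground/background
    FT, BT = pred.astype(bool), ~pred.astype(bool)  # thresholded foreground/background
    return ((B0 & BT).sum() + (F0 & FT).sum()) / (B0.sum() + F0.sum())
```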

3. Proposed Method

In this article, we adapt the overset and underset concepts for image thresholding. In this setting, the neutrosophic set and the single-valued neutrosophic set perform very similarly; the basic concepts are addressed in terms of neutrosophic sets.
Definition 6.
Let $f(x,y) = I(i,j)_{m \times n} \in \mathbb{R}^2$ be an image; then the zero padding for the neutrosophic image $P_k^0$ is defined with respect to h as follows:
$$P_k^0(g(x,y)) = \begin{cases} f(x,y) & \text{if } x+h, y+h \le \max m, \max n \text{ and } x-h, y-h \ge \min m, \min n \\ 0 & \text{if } x-h, y-h < \min m, \min n \text{ or } x+h, y+h > \max m, \max n \end{cases}$$
where $k = 2N + 1$, $3 \le k \le \min(m,n)$, and $h = k \bmod 2$.
Definition 7.
Let $f(x,y) = I(i,j)_{m \times n} \in \mathbb{R}^2$ be an image; then the one padding for the neutrosophic image $P_k^1$ is defined with respect to h as
$$P_k^1(A) = \begin{cases} f(x,y) & \text{if } x+h, y+h \le \max m, \max n \text{ and } x-h, y-h \ge \min m, \min n \\ 1 & \text{if } x-h, y-h < \min m, \min n \text{ or } x+h, y+h > \max m, \max n \end{cases}$$
where $k = 2N + 1$, $3 \le k \le \min(m,n)$, and $h = k \bmod 2$.
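A sketch of both paddings using NumPy; note that the border width ⌊k/2⌋ used here is our assumption (the usual amount needed for a k × k window to fit at every pixel), not a quantity stated in the definition:

```python
import numpy as np

def pad_neutrosophic(img: np.ndarray, k: int, value: float) -> np.ndarray:
    """Zero padding (value=0, P_k^0) or one padding (value=1, P_k^1)
    for an odd window size k; border width floor(k/2) is an assumption."""
    assert k % 2 == 1 and 3 <= k <= min(img.shape)
    border = k // 2
    return np.pad(img, pad_width=border, mode="constant", constant_values=value)
```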
Definition 8.
Let $A = I(i,j)_{m \times n} \in \mathbb{R}^2$ be an image; then the set of arithmetic mean (μ) values for h of the image is defined as
$$g_\mu(A) = \{f_{1\mu_1}, f_{2\mu_2}, \ldots, f_{c\mu_c}\}, \qquad f_{c\mu_c} = f(A, P_k(A), h) = \frac{1}{h^2} \sum_{k=i-\Delta i}^{i+\Delta i} \; \sum_{l=j-\Delta j}^{j+\Delta j} f_c(k,l)$$
where $c = \{1, 2, \ldots, \min(m,n)\}$ and $\Delta i, \Delta j = \{1, 2, \ldots, h\}$.
Definition 9.
Let $f(x,y) = I(i,j)_{m \times n} \in \mathbb{R}^2$ be an image; then the set of standard deviation (σ) values for h of the image is defined as
$$g_\sigma(x,y) = \{f_{1\sigma_1}, f_{2\sigma_2}, \ldots, f_{c\sigma_c}\}, \qquad f_{c\sigma_c} = f(A, P_k, h) = \sqrt{\frac{1}{h^2} \sum_{k=i-\Delta i}^{i+\Delta i} \; \sum_{l=j-\Delta j}^{j+\Delta j} \bigl(f_c(k,l) - f_{c\mu_c}\bigr)^2}$$
where $c = \{1, 2, \ldots, \min(m,n)\}$ and $\Delta i, \Delta j = \{1, 2, \ldots, h\}$.
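Both quantities can be approximated with sliding-window filters; a minimal sketch, assuming a float grayscale image:

```python
import numpy as np
from scipy.ndimage import uniform_filter

def block_mean_std(img: np.ndarray, h: int):
    """Sliding-window arithmetic mean g_mu and standard deviation g_sigma
    over h x h neighbourhoods (cf. Definitions 8 and 9)."""
    img = img.astype(np.float64)
    mu = uniform_filter(img, size=h)
    # E[x^2] - (E[x])^2, clipped to avoid tiny negatives from rounding
    var = np.clip(uniform_filter(img**2, size=h) - mu**2, 0.0, None)
    return mu, np.sqrt(var)
```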
Figure 1 displays the truth, indeterminacy, and falsity memberships for the Lena image. Figure 1b–d exhibit the corresponding memberships for truth, indeterminacy, and falsity, respectively. From the membership figures, readers can recognize the indeterminacy peaks that may affect the thresholding of the image; likewise, the indeterminacy peaks that correspond to truth and falsity can impair decision making. This is why we present various threshold methods in this article.

3.1. Neutrosophic Overset Global Threshold

Let $A = I(i,j)_{m \times n} \in \mathbb{R}^2$ be an image; then the neutrosophic components of A are $T, I, F \in [0, 1]$ with respect to h.
Definition 10 (T F Overset).
The neutrosophic overset of A based on the T, F memberships for the threshold value α satisfies the following conditions:
i. $T(x,y) > \alpha$ and $F(x,y) \le \alpha$
ii. $I(x,y) \cdot \arg\max(g_\sigma(A)) > \alpha \cdot \arg\max(g_\sigma(A))$
where
$$\langle T, I, F \rangle = f(A, P_k^0, h, P_{NR}), \qquad f_\sigma(A) = f(g_\sigma, P_k^0, h)$$
then the binary image for the global threshold value α is defined as
$$\mathrm{NO}_{TF}(A_\alpha) = \begin{cases} 1 & \text{if conditions i and ii hold} \\ 0 & \text{otherwise.} \end{cases}$$
Definition 11 (T I Overset).
The neutrosophic overset of A based on the T, I memberships for the threshold value α satisfies the following conditions:
i. $T(x,y) > \alpha$ and $I(x,y) < \alpha$
ii. $F(x,y) \le \alpha$
then the binary image for the global threshold value α is defined as
$$\mathrm{NO}_{TI}(A_\alpha) = \begin{cases} 1 & \text{if conditions i and ii hold} \\ 0 & \text{otherwise.} \end{cases}$$
Definition 12 (T I F Overset).
The neutrosophic overset of A based on the T, I, F memberships for the threshold value α satisfies the following conditions:
i. $T(x,y) > \alpha$
ii. $I(x,y) < \alpha$
iii. $F(x,y) \le \alpha$
then the binary image for the global threshold value α is defined as
$$\mathrm{NO}_{TIF}(A_\alpha) = \begin{cases} 1 & \text{if conditions i, ii, and iii hold} \\ 0 & \text{otherwise.} \end{cases}$$
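As an illustration only (not the authors' released code), the overset global thresholds reduce to pixel-wise Boolean masks once T, I, and F are available; condition ii of Definition 10 is omitted here for simplicity, and in this reading Definitions 11 and 12 impose the same tests:

```python
import numpy as np

def no_tf(T, F, alpha):
    """NO_TF, condition i of Definition 10: truth high, falsity low."""
    return ((T > alpha) & (F <= alpha)).astype(np.uint8)

def no_tif(T, I, F, alpha):
    """NO_TIF (Definition 12); Definition 11 (NO_TI) combines the same tests."""
    return ((T > alpha) & (I < alpha) & (F <= alpha)).astype(np.uint8)
```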
Figure 2 shows how the accuracy tends to vary with the α value. Three different types of ground truth are considered: binarization using the Otsu method, normalization, and the min–max normalization method. These metrics are computed for the neutrosophic overset thresholds NO_TF (Figure 2a), NO_TI (Figure 2b), and NO_TIF (Figure 2c).

3.2. Neutrosophic Overset Adaptive Threshold

Let $A = I(i,j)_{m \times n} \in \mathbb{R}^2$ be an image; then the neutrosophic components of A are $T, I, F \in [0, 1]$ with respect to h.
Definition 13 (T F Overset).
The neutrosophic overset of A based on the T, F memberships for the adaptive threshold satisfies the following conditions:
i. $T(x,y) \times (L-1) > \alpha_t(x,y)$ and $T(x,y) > I(x,y)$
ii. $F(x,y) \times (L-1) < \alpha_t(x,y)$ and $F(x,y) > I(x,y)$
where
$$\langle T, I, F \rangle = f(A, P_k^0, h, P_{NR}), \qquad \alpha_t = \{\alpha_1, \alpha_2, \ldots, \alpha_t\} = f(A, P_k^0, h, g_\mu)$$
The adaptive image over T F of A is
$$\mathrm{NO}_{TF}(A_{\alpha_t}) = \begin{cases} g_\mu & \text{if conditions i and ii hold} \\ 0 & \text{otherwise} \end{cases}$$
then the binary image over T F of A is
$$\mathrm{NO}_{TF}(A_{bin,\alpha_t}) = \begin{cases} 1 & \text{if } g_\mu > 0 \\ 0 & \text{otherwise} \end{cases}$$
Definition 14 (T I Overset).
The neutrosophic overset of A based on the T, I memberships for the adaptive threshold satisfies the following conditions:
i. $T(x,y) \times (L-1) > \alpha_t(x,y)$ and $T(x,y) > I(x,y)$
ii. $I(x,y) \times (L-1) > \alpha_t(x,y)$ and $I(x,y) > F(x,y)$
The adaptive image over T I of A is
$$\mathrm{NO}_{TI}(A_{\alpha_t}) = \begin{cases} g_\mu & \text{if conditions i and ii hold} \\ 0 & \text{otherwise} \end{cases}$$
then the binary image over T I of A is
$$\mathrm{NO}_{TI}(A_{bin,\alpha_t}) = \begin{cases} 1 & \text{if } g_\mu > 0 \\ 0 & \text{otherwise} \end{cases}$$
Definition 15 (T I F Overset).
The neutrosophic overset of A based on the T, I, F memberships for the adaptive threshold satisfies the following conditions:
i. $T(x,y) \times (L-1) > \alpha_t(x,y)$ or $F(x,y) \times (L-1) < \alpha_t(x,y)$
ii. $T(x,y) > I(x,y)$ or $F(x,y) < I(x,y)$
The adaptive image over T I F of A is
$$\mathrm{NO}_{TIF}(A_{\alpha_t}) = \begin{cases} g_\mu & \text{if conditions i and ii hold} \\ 0 & \text{otherwise} \end{cases}$$
then the binary image over T I F of A is
$$\mathrm{NO}_{TIF}(A_{bin,\alpha_t}) = \begin{cases} 1 & \text{if } g_\mu > 0 \\ 0 & \text{otherwise} \end{cases}$$
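A sketch of the adaptive overset threshold under stated assumptions: the per-pixel threshold map α_t is taken as the h × h local mean (g_μ) of the image on the [0, L−1] scale, and conditions i and ii of Definition 13 are applied:

```python
import numpy as np
from scipy.ndimage import uniform_filter

def no_tf_adaptive(img, T, I, F, h=15, L=256):
    """Adaptive NO_TF sketch: per-pixel threshold alpha_t from the local mean."""
    img = img.astype(np.float64)
    g_mu = uniform_filter(img, size=h)                 # local mean = alpha_t map
    cond_i = (T * (L - 1) > g_mu) & (T > I)            # Definition 13, condition i
    cond_ii = (F * (L - 1) < g_mu) & (F > I)           # Definition 13, condition ii
    adaptive = np.where(cond_i & cond_ii, g_mu, 0.0)   # keep g_mu where both hold
    return (adaptive > 0).astype(np.uint8)             # binary image NO_TF(A_bin)
```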
The comparison between the mean adaptive threshold and the Gaussian adaptive threshold for the multivariate normal distribution is depicted in Figure 3. Figure 3a–f show NO_TF (mean adaptive), NO_TF (Gaussian adaptive), NO_TI (mean adaptive), NO_TI (Gaussian adaptive), NO_TIF (mean adaptive), and NO_TIF (Gaussian adaptive), respectively.

3.3. Neutrosophic Underset Global Threshold

Let $A = I(i,j)_{m \times n} \in \mathbb{R}^2$ be an image; then the single-valued neutrosophic components of A are $T, I, F \in [0, 1]$ with respect to h.
Definition 16 (T F Underset).
The single-valued neutrosophic underset of A based on the T, F memberships for the threshold value α satisfies the following conditions:
i. $T(x,y) < \alpha$ and $F(x,y) \ge \alpha$
ii. $I(x,y) \cdot \arg\max(g_\sigma(A)) > \alpha \cdot \arg\max(g_\sigma(A))$
where
$$\langle T, I, F \rangle = f(A, P_k^1, h, P_{NR}), \qquad f_\sigma(A) = f(g_\sigma, P_k^1, h)$$
then the binary image for the global threshold value α is defined as
$$\mathrm{NU}_{TF}(A_\alpha) = \begin{cases} 0 & \text{if conditions i and ii hold} \\ 1 & \text{otherwise.} \end{cases}$$
Definition 17 (T I Underset).
The neutrosophic underset of A based on the T, I memberships for the threshold value α satisfies the following conditions:
i. $T(x,y) < \alpha$ and $I(x,y) < \alpha$
ii. $F(x,y) \ge \alpha$
then the binary image for the global threshold value α is defined as
$$\mathrm{NU}_{TI}(A_\alpha) = \begin{cases} 0 & \text{if conditions i and ii hold} \\ 1 & \text{otherwise.} \end{cases}$$
Definition 18 (T I F Underset).
The neutrosophic underset of A based on the T, I, F memberships for the threshold value α satisfies the following conditions:
i. $T(x,y) < \alpha$
ii. $I(x,y) < \alpha$
iii. $F(x,y) \ge \alpha$
then the binary image for the global threshold value α is defined as
$$\mathrm{NU}_{TIF}(A_\alpha) = \begin{cases} 0 & \text{if conditions i, ii, and iii hold} \\ 1 & \text{otherwise.} \end{cases}$$
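The underset rules invert the output convention: pixels satisfying the low-truth/high-falsity tests are set to 0. A minimal sketch of NU_TIF under the same operator assumptions as above:

```python
import numpy as np

def nu_tif(T, I, F, alpha):
    """NU_TIF: 0 where T < alpha, I < alpha and F >= alpha; 1 elsewhere."""
    underset = (T < alpha) & (I < alpha) & (F >= alpha)
    return np.where(underset, 0, 1).astype(np.uint8)
```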
Figure 4 demonstrates how the α value alters the accuracy of the neutrosophic underset global threshold. The accuracy study for the neutrosophic underset global threshold uses the Lena image.

3.4. Neutrosophic Underset Adaptive Threshold

Let $A = I(i,j)_{m \times n} \in \mathbb{R}^2$ be an image; then the single-valued neutrosophic components of A are $T, I, F \in [0, 1]$ with respect to h.
Definition 19 (T F Underset).
The neutrosophic underset of A based on the T, F memberships for the adaptive threshold satisfies the following conditions:
i. $T(x,y) \times (L-1) > \alpha_t(x,y)$ and $T(x,y) > I(x,y)$
ii. $F(x,y) \times (L-1) < \alpha_t(x,y)$ and $F(x,y) > I(x,y)$
where
$$\langle T, I, F \rangle = f(A, P_k^1, h, P_{NR}), \qquad \alpha_t = \{\alpha_1, \alpha_2, \ldots, \alpha_t\} = f(A, P_k^1, h, g_\sigma)$$
The adaptive image over T F of A is
$$\mathrm{NU}_{TF}(A_{\alpha_t}) = \begin{cases} g_\mu & \text{if conditions i and ii hold} \\ 0 & \text{otherwise} \end{cases}$$
then the binary image over T F of A is
$$\mathrm{NU}_{TF}(A_{bin,\alpha_t}) = \begin{cases} 1 & \text{if } g_\mu > 0 \\ 0 & \text{otherwise} \end{cases}$$
Definition 20 (T I Underset).
The neutrosophic underset of A based on the T, I memberships for the adaptive threshold satisfies the following conditions:
i. $T(x,y) \times (L-1) > \alpha_t(x,y)$ and $T(x,y) > I(x,y)$
ii. $I(x,y) \times (L-1) > \alpha_t(x,y)$ and $I(x,y) > F(x,y)$
The adaptive image over T I of A is
$$\mathrm{NU}_{TI}(A_{\alpha_t}) = \begin{cases} g_\mu & \text{if conditions i and ii hold} \\ 0 & \text{otherwise} \end{cases}$$
then the binary image over T I of A is
$$\mathrm{NU}_{TI}(A_{bin,\alpha_t}) = \begin{cases} 1 & \text{if } g_\mu > 0 \\ 0 & \text{otherwise} \end{cases}$$
Definition 21 (T I F Underset).
The neutrosophic underset of A based on the T, I, F memberships for the adaptive threshold satisfies the following conditions:
i. $T(x,y) \times (L-1) > \alpha_t(x,y)$ or $F(x,y) \times (L-1) \le \alpha_t(x,y)$
ii. $T(x,y) > I(x,y)$ or $F(x,y) \le I(x,y)$
The adaptive image over T I F of A is
$$\mathrm{NU}_{TIF}(A_{\alpha_t}) = \begin{cases} g_\mu & \text{if conditions i and ii hold} \\ 0 & \text{otherwise} \end{cases}$$
then the binary image over T I F of A is
$$\mathrm{NU}_{TIF}(A_{bin,\alpha_t}) = \begin{cases} 1 & \text{if } g_\mu > 0 \\ 0 & \text{otherwise} \end{cases}$$
Figure 5 contrasts the neutrosophic underset adaptive threshold with the analogous overset adaptive method for the multivariate distribution.
The accuracy of a sample image under the neutrosophic global and adaptive threshold approaches is presented in Table 1 and Table 2, respectively.

4. Results and Discussion

The analysis was run on an 11th Gen Intel(R) Core(TM) i5-1135G7 @ 2.40 GHz with 16 GB of RAM.
As previously discussed, symmetrical images are present on authenticated documents such as passports, driver's licenses, voter identification cards, Aadhaar cards, etc. However, a face cannot always be symmetrical; some faces may even be asymmetric owing to wounds or natural misalignments. Due to the patterns of the ridge structures, symmetric fingerprint images are very uncommon. Biometric images thus include both symmetric and asymmetric formations, so the proposed methods must perform efficiently in both scenarios.
Sample images for each method are visualized in Figure 6. The original image is displayed in the first column of the figure, followed by the NO_TF, NO_TI, NO_TIF, NU_TF, NU_TI, and NU_TIF global and adaptive thresholds, in that order. Similarly, the symmetric-image outputs of the proposed thresholding methods are displayed in Figure 7. The thresholded image is validated using average accuracy, recall, and F1 score. The ground truth for the global threshold is determined by normalization, min–max normalization, and Otsu binarization, while the adaptive threshold is based on the Gaussian and mean thresholds.
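For this validation step, the binary ground-truth and thresholded masks can be flattened and scored with standard metrics; a sketch using scikit-learn (the library choice is ours, not stated in the article):

```python
import numpy as np
from sklearn.metrics import accuracy_score, recall_score, f1_score

def score_threshold(gt: np.ndarray, pred: np.ndarray):
    """Accuracy, recall, and F1 between ground-truth and thresholded masks."""
    y_true, y_pred = gt.ravel().astype(int), pred.ravel().astype(int)
    return (accuracy_score(y_true, y_pred),
            recall_score(y_true, y_pred),
            f1_score(y_true, y_pred))
```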
The article first investigates biometric data collected from different databases: FVC 2000 [17], FVC 2002 [18], FVC 2004 [19], SD302a [20], SD302d [21], and Soco [22]. The FVC 2000 dataset includes a variety of sensors: low-cost optical, low-cost capacitive, and optical sensors. Similarly, FVC 2002 includes an optical sensor, a capacitive sensor, and a synthetic fingerprint generator that produces images for 110 fingers with 8 impressions per finger. The FVC 2004 dataset contains updated versions of the optical sensor "V300", the optical sensor "U4000", the thermal sweeping sensor "FCD4B14CB", and synthetic fingerprint generation version 3; these databases are significantly more challenging than FVC 2002 and FVC 2000 due to intentional perturbations introduced into the data. The NIST datasets (SD302a and SD302d) were generated from activity scenarios in which subjects would plausibly leave their fingerprints on various objects; the activities and objects were selected to simulate the types of items frequently encountered in actual law enforcement casework and different latent print development techniques. The analysis was made from 18 sample images across the various datasets.
Table 3 lists the performance accuracy of each approach for the considered fingerprint data. According to the table, the following techniques are advised for each dataset. Global threshold: FVC 2000, NO_TI; FVC 2002, NO_TF and NU_TF; FVC 2004, NU_TI; SD302a, NU_TF and NU_TI; SD302d, NU_TI; Soco, NU_TI and NU_TIF. Adaptive threshold: FVC 2000, NO_TF; FVC 2002, NO_TF and NO_TI; FVC 2004, NO_TIF; SD302a, NO_TIF; SD302d, NO_TI; Soco, NO_TIF. By threshold type, the best method for the global threshold is NU_TI and the best method for the adaptive threshold is NO_TIF, as shown in Figure 8. Overall, NU_TI is preferable to the other methods for fingerprint images.
Based on our data, NO_TF performs poorly in comparison to the other global threshold methods. In a set-by-set comparison, the underset methods are recommended for the global threshold, but they performed poorly in the adaptive threshold: even though the set-wise analysis favored the undersets, the overset methods dominated the adaptive threshold. The majority of fingerprint images were asymmetrical, so choosing among the suggested threshold methods on symmetry grounds makes little difference; for fingerprint images, whether an image is symmetric or asymmetric is essentially irrelevant to a symmetry analysis. Instead, we use the global threshold method with the neutrosophic underset approaches. The sample image for the SD302a dataset contained more fingerprint patterns than the background's brightness level, and it had a few shaded regions where intensity measurements were missing; the image in this dataset with the highest accuracy had more such shades or missing measurements. The neutrosophic overset method is preferable when the image contains missing data, especially for the SD302a dataset. The samples in the SD302d dataset, which were completely covered by patterns, performed remarkably well under both the neutrosophic underset and overset global threshold methods; however, the overset method outperformed the underset in the adaptive approach. With an equivalently bright background, the Soco dataset samples contained more dark shades; this dataset performed similarly to SD302d under the global threshold method, and the overset method was more advantageous than the underset in the adaptive case. FVC 2000 was completely distinct from the previous image types because of the gray features that make up its fingerprint images; these samples progressed well in both global threshold tasks, and the underset approach in the adaptive method attained a similar level of accuracy. Visually, the FVC 2002 and 2004 samples are of a similar type, with some ambiguous fingerprint patterns mixed with dark shading noise; the undersets performed extraordinarily well in global thresholding for both datasets, while the oversets achieved noticeable accuracy in the adaptive methods. Overall, the results indicate that the NO_TF method typically performs poorly in some global threshold scenarios.
The images used for the facial image analysis were from Caltech Web Faces [23], MIT-CBCL [24], RF (real and fake) data [25], NIST MEDS (Multiple Encounter Dataset)-I [26], and NIST MEDS-II [27]. Google Image Search was used to collect images of people for the Caltech Web Faces dataset by entering frequently given names. The dataset is accompanied by a ground truth file that provides the positions of the centers of each frontal face's eyes, nose, and mouth; it comprises a total of 10,524 human faces in various scenarios and resolutions, including portrait photos and groups of people. The MIT-CBCL database contains ten faces; these data include high-resolution images of faces in frontal, half-profile, and profile views, along with synthetic images rendered from 3D head models obtained by fitting a morphable model. In MEDS-I and MEDS-II, the resolution and image sizes vary significantly; the MEDS data offer instances of repeated observations of the same person over time and include people of different ages. The baseline face detection benchmark was carried out by the MITRE Corporation using Google Picasa. MEDS-I and MEDS-II are intended to promote research and assist the NIST multiple biometric evaluations; the MEDS-II update roughly doubles the number of images and expands the metadata to support research and evaluation of pose conformance and local facial features. These data are available to assist the FBI and partner organizations in developing face recognition tools, techniques, and procedures in support of next-generation identification (NGI), forensic comparison, training, analysis, facial image conformance standards, and inter-agency exchange standards. The MITRE Corporation developed MEDS-I and MEDS-II in the FBI Data Analysis Support Laboratory (DASL). The facial analysis was made from 15 sample images across the various datasets.
Images of faces typically differ from fingerprints: fingerprint images do not contain multiple objects such as ears, eyes, noses, or facial expressions, so for face data NS must threshold multi-object images. In the global threshold method for the considered face data (Table 4), CWF—NO_TI; CBCL and RF—all methods except NO_TIF; MEDS-I—NU_TI and NU_TIF; and MEDS-II—NO_TI, NU_TI, and NU_TIF perform exceptionally well. Typically, NO_TI is preferable for the global threshold on face data. For the adaptive approach, CWF—NU_TF and NU_TIF; CBCL—all underset methods (NU_TF, NU_TI, NU_TIF); RF—NO_TF; and MEDS-I and II—NU_TF, NU_TI, and NU_TIF provide impressive results. In terms of overall performance, both NU_TF and NU_TIF perform admirably; however, regarding the error rate, NU_TF is preferable for adaptive purposes, and NU_TIF is the best method combining the global and adaptive approaches for the considered face datasets. For facial images a symmetrical analysis is possible, so using symmetry-based images for analysis makes sense. In the CWF, CBCL, and MEDS-II datasets, symmetric images perform better than asymmetric images under the neutrosophic global threshold; only MEDS-I under-performs in this type of threshold, and RF contains only asymmetric images. Furthermore, the adaptive threshold method also favors symmetrical facial images. As the article suggests, symmetrical facial images perform much better than asymmetrical ones under both the global and adaptive thresholds; readers can also verify this in Figure 8. In a set-by-set comparison of the global and adaptive techniques, the underset models outperform the overset models, and NO_TIF performed less efficiently than the other methods in both approaches. When an image contains multiple objects, the adaptive method is the best solution. The CWF sample data consisted of facial images, facial images with text, and multi-object faces with text. Except for the TF-based approaches, the techniques used to compute the global threshold have good accuracy; however, when the TF method was switched to the adaptive approach it overcame its flaws, and all of the suggested methods produced positive results. Comparatively, the global threshold performs very well against the adaptive method in most scenarios. The CBCL samples had bright backgrounds and covered faces; in both the global and adaptive approaches, all proposed methods achieved accuracy of around 90%. The results revealed that all methods performed well, and more similarly, when the image did not contain multiple objects or a bright background. The RF sample data included images with expression scenarios; expression images passed the global threshold successfully, above 91%, but it was challenging to reach 90% with the adaptive method. As a result, when the α value is well-known, a global threshold is preferable. The MEDS-I and II datasets, which contain facial images with various backgrounds, with or without objects, are similar. In a global threshold environment, MEDS-I performed better than MEDS-II. For MEDS-I, the adaptive method did not surpass the global method, whereas for MEDS-II the adaptive method's accuracy exceeded that of the global method.

5. Conclusions

This research demonstrates the effectiveness of the neutrosophic set concept in handling indeterminacy challenges, applied here to image data. We explored twelve novel approaches for globally and adaptively thresholding images. These techniques reveal various image patterns, and with the help of these multiple patterns it may be possible to overcome challenges in the thresholding task. Biometric image data, namely fingerprint and face data, were considered for the threshold task, and symmetrical and asymmetrical aspects were also investigated. As per the results, the symmetry concept was ineffective for fingerprint images but effective for facial images; asymmetric faces typically produced worse results than symmetric ones. Face data scored 98% at best, and fingerprint images reached maximum accuracy. Through this study, we obtained excellent quantitative results. The proposed methods were used mainly on the biometrics datasets and examined different types of fingerprint and face data, such as sensor type, background intensity, and bright and dark shades in the images; each dataset was unique. Every NO method except NO_TF achieved greater than 90% accuracy in the majority of global threshold cases, and the NU methods achieved 100% accuracy for fingerprint images, depending on the data. When there were missing measurements in the image and shades present, NU had an advantage over NO. We obtained lower accuracy with the adaptive methods than with the global ones; a precise alpha value is essential for the global threshold, so adaptive techniques are preferable when alpha is unknown. The proposed methods evaluate the alpha values from the mean intensity of the local blocks, so the alpha values vary from block to block; the advantage of this is that it functions well when an image contains multiple objects. With very few errors, these methods achieved a maximum of 94% accuracy for fingerprint images. The difference between the classical threshold and the proposed threshold methods was analyzed based on their ability to handle indeterminacy. The proposed methods were also tested on facial images containing multiple objects; all proposed methods except NO_TF yielded positive results in the global thresholding approach. The CBCL dataset had a maximum face-data accuracy of 98%. Comparing the overset and underset, the underset was preferable; the MEDS-II data for adaptive estimation reached 96%, and the underset method is recommended for the image threshold. The overall finding is that the TF-based overset and underset lacked accuracy, and improving on the better results would be challenging. Feature segmentation-based techniques have the potential to improve biometric recognition by enabling gender, age, and expression detection, among other things. Furthermore, these techniques could be applied to various types of image data, including medical and animal images; if they prove effective in all domains, they could become a highly impactful image pre-processing technique. Our primary goal is to implement these techniques in all segments of image analysis, and in future work the proposed methods will be applied to binary image classification for various types of image datasets.

Author Contributions

Concept, methodology, validation: V.D.; manuscript preparation, enhancement, discussions: E.D. All authors have read and agreed to the published version of the manuscript.

Funding

The APC was funded by the Vellore Institute of Technology.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

The data that support the findings of this study are openly available. The data sources are mentioned in Section 4.

Acknowledgments

The authors would like to express their appreciation to the reviewers for their valuable and constructive suggestions during the development of this research work.

Conflicts of Interest

The authors declare no conflict of interest.

Abbreviations

| Abbreviation/Symbol | Definition |
| h | Block size |
| T | Truth membership |
| I | Indeterminacy membership |
| F | Falsity membership |
| P_k^0 | Zero padding |
| P_k^1 | One padding |
| g_μ | Arithmetic mean function |
| g_σ | Standard deviation function |
| η | Performance measure |
| α | Global threshold value |
| α_t | Adaptive threshold value |
| I(i,j)_{m×n} | Image with (i,j) coordinates, m rows, and n columns |
| NO_TF(A_α) | TF-based neutrosophic overset global threshold of image A |
| NO_TI(A_α) | TI-based neutrosophic overset global threshold of image A |
| NO_TIF(A_α) | TIF-based neutrosophic overset global threshold of image A |
| NO_TF(A_αt) | TF-based neutrosophic overset adaptive threshold of image A |
| NO_TI(A_αt) | TI-based neutrosophic overset adaptive threshold of image A |
| NO_TIF(A_αt) | TIF-based neutrosophic overset adaptive threshold of image A |
| NU_TF(A_α) | TF-based neutrosophic underset global threshold of image A |
| NU_TI(A_α) | TI-based neutrosophic underset global threshold of image A |
| NU_TIF(A_α) | TIF-based neutrosophic underset global threshold of image A |
| NU_TF(A_αt) | TF-based neutrosophic underset adaptive threshold of image A |
| NU_TI(A_αt) | TI-based neutrosophic underset adaptive threshold of image A |
| NU_TIF(A_αt) | TIF-based neutrosophic underset adaptive threshold of image A |

References

  1. Jha, S.; Kumar, R.; Priyadarshini, I.; Smarandache, F.; Long, H.V. Neutrosophic image segmentation with dice coefficients. Measurement 2019, 134, 762–772. [Google Scholar] [CrossRef]
  2. Yuan, Y.; Ren, Y.; Liu, X.; Wang, J. Approach to image segmentation based on interval neutrosophic set. Numer. Algebr. Control. Optim. 2020, 10, 1–11. [Google Scholar] [CrossRef]
  3. Song, S.; Jia, Z.; Yang, J.; Kasabov, N.K. A fast image segmentation algorithm based on saliency map and neutrosophic set theory. IEEE Photonics J. 2020, 12, 1–6. [Google Scholar] [CrossRef]
  4. Dhar, S.; Kundu, M.K. Accurate multi-class image segmentation using weak continuity constraints and neutrosophic set. Appl. Soft Comput. 2021, 112, 107759. [Google Scholar] [CrossRef]
  5. Ji, B.; Hu, X.; Ding, F.; Ji, Y.; Gao, H. An effective color image segmentation approach using superpixel-neutrosophic C-means clustering and gradient-structural similarity. Optik 2022, 260, 169039. [Google Scholar] [CrossRef]
  6. Yang, W.; Wang, Z.; Zhang, B. Face recognition using adaptive local ternary patterns method. Neurocomputing 2016, 213, 183–190. [Google Scholar] [CrossRef]
  7. Zhou, J. Research of SWNMF with New Iteration Rules for Facial Feature Extraction and Recognition. Symmetry 2019, 11, 354. [Google Scholar] [CrossRef]
  8. Alagarsamy, S.B.; Murugan, K. Multimodal of ear and face biometric recognition using adaptive approach runge–kutta threshold segmentation and classifier with score level fusion. Wirel. Pers. Commun. 2022, 124, 1061–1080. [Google Scholar] [CrossRef]
  9. Abaza, A.; Ross, A. Towards understanding the symmetry of human ears: A biometric perspective. In Proceedings of the 2010 Fourth IEEE International Conference on Biometrics: Theory, Applications and Systems (BTAS), Washington, DC, USA, 27–29 September 2010; pp. 1–7. [Google Scholar]
  10. Das, D. A fingerprint segmentation scheme based on adaptive threshold estimation. In Proceedings of the 2018 11th International Congress on Image and Signal Processing, Biomedical Engineering and Informatics (CISP-BMEI), Beijing, China, 13–15 October 2018; pp. 1–6. [Google Scholar]
  11. Wan, G.C.; Xu, H.; Zhou, F.Z.; Tong, M.S. Improved Fingerprint Segmentation Based on Gradient and Otsu’s Method for Online Fingerprint Recognition. In Proceedings of the 2019 PhotonIcs & Electromagnetics Research Symposium-Spring (PIERS-Spring), Rome, Italy, 17–20 June 2019; pp. 1050–1054. [Google Scholar]
  12. Smarandache, F. Neutrosophy: Neutrosophic Probability, Set, and Logic: Analytic Synthesis & Synthetic Analysis; American Research Press: Pasadena, CA, USA, 1998. [Google Scholar]
  13. Smarandache, F. Multispace & Multistructure. Neutrosophic Transdisciplinarity (100 Collected Papers of Sciences), Volume IV. 2010. Available online: https://digitalrepository.unm.edu/math_fsp/39/ (accessed on 27 May 2022).
  14. Guo, Y.; Cheng, H.D.; Zhang, Y. A new neutrosophic approach to image denoising. New Math. Nat. Comput. 2009, 5, 65. [Google Scholar] [CrossRef]
  15. Smarandache, F. Neutrosophic Overset, Neutrosophic Underset, and Neutrosophic Offset. Similarly for Neutrosophic Over-/Under-/Off-Logic, Probability, and Statistics. Infinite Study. 2016. Available online: https://digitalrepository.unm.edu/math_fsp/26 (accessed on 20 October 2022).
  16. Tizhoosh, H.R. Image thresholding using type II fuzzy sets. Pattern Recognit. 2005, 38, 2363–2372. [Google Scholar] [CrossRef]
  17. Maio, D.; Maltoni, D.; Cappelli, R.; Wayman, J.L.; Jain, A.K. FVC2000: Fingerprint verification competition. IEEE Trans. Pattern Anal. Mach. Intell. 2002, 24, 402–412. [Google Scholar] [CrossRef]
  18. Maio, D.; Maltoni, D.; Cappelli, R.; Wayman, J.L.; Jain, A.K. FVC2002: Second fingerprint verification competition. In Proceedings of the 2002 International Conference on Pattern Recognition, Quebec City, QC, Canada, 11–15 August 2002; Volume 3, pp. 811–814. [Google Scholar]
  19. Maio, D.; Maltoni, D.; Cappelli, R.; Wayman, J.L.; Jain, A.K. FVC2004: Third fingerprint verification competition. In Biometric Authentication: First International Conference, ICBA 2004, Hong Kong, China, 15–17 July 2004; Springer: Berlin/Heidelberg, Germany, 2004; pp. 1–7. [Google Scholar]
  20. Fiumara, G.P.; Flanagan, P.A.; Grantham, J.D.; Ko, K.; Marshall, K.; Schwarz, M.; Tabassi, E.; Woodgate, B.; Boehnen, C. NIST Special Database 302: Nail to Nail Fingerprint Challenge. 2019. Available online: https://www.nist.gov/publications/nist-special-database-302-nail-nail-fingerprint-challenge (accessed on 5 December 2022).
  21. Fiumara, G.; Schwarz, M.; Heising, J.; Peterson, J.; Flanagan, P.A.; Marshall, K. NIST Special Database 302: Supplemental Release of Latent Annotations. 2021. Available online: https://www.nist.gov/publications/nist-special-database-302-supplemental-release-latent-annotations (accessed on 5 December 2022).
  22. Shehu, Y.I.; Ruiz-Garcia, A.; Palade, V.; James, A. Sokoto coventry fingerprint dataset. arXiv 2018, arXiv:1807.10609. [Google Scholar]
  23. Li, F.-F.; Andreeto, M.; Ranzato, M.; Perona, P. Caltech 101 (1.0). In CaltechDATA; Computational Vision Group, California Institute of Technology: London, ON, Canada, 2022. [Google Scholar] [CrossRef]
  24. Weyrauch, B.; Heisele, B.; Huang, J.; Blanz, V. Component-based face recognition with 3D morphable models. In Proceedings of the 2004 Conference on Computer Vision and Pattern Recognition Workshop, Washington, DC, USA, 27 June–2 July 2004; p. 85. [Google Scholar]
  25. Nam, S.; Oh, S.W.; Kang, J.Y. Real and Fake Face Detection. January 2019. Available online: https://www.kaggle.com/datasets/ciplab/real-and-fake-face-detection (accessed on 24 November 2022).
  26. Watson, C.I. Multiple Encounter Dataset I (MEDS-I), NIST Interagency/Internal Report (NISTIR); National Institute of Standards and Technology: Gaithersburg, MD, USA, 2010. Available online: https://tsapps.nist.gov/publication/get_pdf.cfm?pub_id=904685 (accessed on 24 November 2022).
  27. Founds, A.; Orlans, N.; Genevieve, W.; Watson, C. NIST Special Database 32—Multiple Encounter Dataset II (MEDS-II), NIST Interagency/Internal Report (NISTIR); National Institute of Standards and Technology: Gaithersburg, MD, USA, 2011. Available online: https://tsapps.nist.gov/publication/get_pdf.cfm?pub_id=908383 (accessed on 24 November 2022).
Figure 1. Neutrosophic membership visualization. Lena image source: https://en.wikipedia.org/wiki/File:Lenna_(test_image).png (accessed on 14 December 2022).
Figure 2. Neutrosophic overset global threshold accuracy analysis for various α values.
Figure 3. Neutrosophic overset adaptive thresholding method. (a) NO_TF c-means method, (b) NO_TF Gaussian method, (c) NO_TI c-means method, (d) NO_TI Gaussian method, (e) NO_TIF c-means method, and (f) NO_TIF Gaussian method. Sudoku image source: https://www.fatalerrors.org/a/opencv-threshold-segmentation.html (accessed on 15 December 2022).
Figure 4. Neutrosophic underset global threshold accuracy analysis for various α values.
Figure 5. Neutrosophic underset adaptive thresholding method. (a) NU_TF c-means method, (b) NU_TF Gaussian method, (c) NU_TI c-means method, (d) NU_TI Gaussian method, (e) NU_TIF c-means method, and (f) NU_TIF Gaussian method.
Figure 6. Proposed outputs of the sample images.
Figure 7. Proposed output of the symmetric sample image. Original image source: https://en.wikipedia.org/wiki/File:Braus_1921_395.png (accessed on 28 January 2023).
Figure 8. Method-wise analysis of the global and adaptive thresholds for fingerprint and facial image data: (a) global method for fingerprint data, (b) adaptive method for fingerprint data, (c) global method for face data, (d) adaptive method for face data.
Table 1. Accuracy of the global threshold method for α = 0.5 (Lena image).

| Method | η_norm | η_minmax | η_Otsu+bin | Avg |
| NO_TF | 99.378 | 93.524 | 98.787 | 97.23 |
| NO_TI | 99.578 | 93.724 | 98.587 | 97.296 |
| NO_TIF | 99.107 | 95.04 | 97.271 | 97.139 |
| NU_TF | 99.378 | 93.524 | 98.787 | 97.23 |
| NU_TI | 99.36 | 93.507 | 98.804 | 97.224 |
| NU_TIF | 99.147 | 93.293 | 99.018 | 97.153 |
Table 2. Accuracy of the adaptive threshold method based on α_t (Sudoku image).

| Method | η_gauss | η_mean | Avg |
| NO_TF | 92.791 | 90.844 | 91.818 |
| NO_TI | 92.729 | 90.782 | 91.756 |
| NO_TIF | 92.653 | 90.707 | 91.68 |
| NU_TF | 85.013 | 83.067 | 84.04 |
| NU_TI | 84.947 | 83 | 83.973 |
| NU_TIF | 84.947 | 83 | 83.973 |
Table 3. Accuracy of fingerprint data (three sample images per dataset).

Global Threshold

| Dataset | NO_TF | NO_TI | NO_TIF | NU_TF | NU_TI | NU_TIF |
| SD302a | 0.331 ± 0.022 | 0.934 ± 0.054 | 0.921 ± 0.053 | 0.886 ± 0.054 | 0.925 ± 0.054 | 0.888 ± 0.053 |
| | 0.324 ± 0.034 | 0.927 ± 0.003 | 0.905 ± 0.005 | 0.923 ± 0.079 | 0.923 ± 0.003 | 0.906 ± 0.003 |
| | 0.312 ± 0.027 | 0.939 ± 0.016 | 0.932 ± 0.016 | 0.893 ± 0.07 | 0.934 ± 0.015 | 0.91 ± 0.015 |
| SD302d | 0.312 ± 0.025 | 0.945 ± 0.021 | 0.934 ± 0.023 | 0.911 ± 0.064 | 0.943 ± 0.021 | 0.924 ± 0.021 |
| | 0.29 ± 0.021 | 0.945 ± 0.023 | 0.942 ± 0.023 | 0.924 ± 0.052 | 0.943 ± 0.023 | 0.929 ± 0.024 |
| | 0.946 ± 0.028 | 0.944 ± 0.028 | 0.929 ± 0.027 | 0.946 ± 0.028 | 0.935 ± 0.028 | 0.894 ± 0.028 |
| Soco | 0.429 ± 0.002 | 0.989 ± 0.009 | 0.964 ± 0.009 | 0.664 ± 0.006 | 0.995 ± 0.009 | 0.991 ± 0.008 |
| | 0.464 ± 0.009 | 0.98 ± 0.031 | 0.957 ± 0.03 | 0.533 ± 0.017 | 0.982 ± 0.031 | 0.977 ± 0.031 |
| | 0.39 ± 0.01 | 0.979 ± 0.028 | 0.951 ± 0.029 | 0.777 ± 0.023 | 0.984 ± 0.028 | 0.978 ± 0.027 |
| FVC2000 | 0.382 ± 0.018 | 0.965 ± 0.036 | 0.936 ± 0.035 | 0.795 ± 0.039 | 0.966 ± 0.036 | 0.953 ± 0.034 |
| | 0.394 ± 0.046 | 0.937 ± 0.095 | 0.893 ± 0.091 | 0.805 ± 0.096 | 0.938 ± 0.095 | 0.936 ± 0.094 |
| | 0.972 ± 0.031 | 0.969 ± 0.031 | 0.938 ± 0.029 | 0.972 ± 0.032 | 0.972 ± 0.032 | 0.97 ± 0.03 |
| FVC2002 | 0.924 ± 0.08 | 0.921 ± 0.081 | 0.896 ± 0.083 | 0.924 ± 0.08 | 0.918 ± 0.08 | 0.893 ± 0.078 |
| | 0.445 ± 0.003 | 0.977 ± 0.005 | 0.933 ± 0.006 | 0.701 ± 0.015 | 0.982 ± 0.006 | 0.968 ± 0.005 |
| | 0.948 ± 0.037 | 0.944 ± 0.037 | 0.925 ± 0.037 | 0.948 ± 0.037 | 0.945 ± 0.037 | 0.931 ± 0.037 |
| FVC2004 | 0.427 ± 0.0 | 0.999 ± 0.0 | 0.982 ± 0.0 | 0.608 ± 0.0 | 1.0 ± 0.0 | 1.0 ± 0.0 |
| | 0.408 ± 0.016 | 0.938 ± 0.053 | 0.92 ± 0.053 | 0.657 ± 0.039 | 0.938 ± 0.054 | 0.938 ± 0.054 |
| | 0.411 ± 0.005 | 0.991 ± 0.014 | 0.974 ± 0.015 | 0.648 ± 0.01 | 0.992 ± 0.014 | 0.992 ± 0.014 |

Adaptive Threshold

| Dataset | NO_TF | NO_TI | NO_TIF | NU_TF | NU_TI | NU_TIF |
| SD302a | 0.835 ± 0.002 | 0.835 ± 0.002 | 0.849 ± 0.001 | 0.837 ± 0.002 | 0.837 ± 0.002 | 0.837 ± 0.002 |
| | 0.686 ± 0.002 | 0.686 ± 0.002 | 0.851 ± 0.002 | 0.846 ± 0.003 | 0.846 ± 0.003 | 0.846 ± 0.003 |
| | 0.858 ± 0.004 | 0.858 ± 0.004 | 0.805 ± 0.004 | 0.794 ± 0.004 | 0.794 ± 0.004 | 0.794 ± 0.004 |
| SD302d | 0.869 ± 0.006 | 0.869 ± 0.005 | 0.788 ± 0.005 | 0.778 ± 0.005 | 0.779 ± 0.006 | 0.779 ± 0.006 |
| | 0.752 ± 0.001 | 0.752 ± 0.001 | 0.88 ± 0.001 | 0.874 ± 0.002 | 0.874 ± 0.002 | 0.874 ± 0.002 |
| | 0.883 ± 0.004 | 0.883 ± 0.004 | 0.856 ± 0.003 | 0.843 ± 0.004 | 0.843 ± 0.004 | 0.843 ± 0.004 |
| Soco | 0.852 ± 0.004 | 0.854 ± 0.004 | 0.838 ± 0.001 | 0.791 ± 0.003 | 0.793 ± 0.003 | 0.793 ± 0.003 |
| | 0.824 ± 0.008 | 0.825 ± 0.008 | 0.834 ± 0.005 | 0.802 ± 0.004 | 0.803 ± 0.004 | 0.803 ± 0.004 |
| | 0.718 ± 0.003 | 0.72 ± 0.004 | 0.856 ± 0.003 | 0.81 ± 0.004 | 0.81 ± 0.004 | 0.81 ± 0.004 |
| FVC2000 | 0.901 ± 0.004 | 0.901 ± 0.004 | 0.909 ± 0.001 | 0.903 ± 0.003 | 0.903 ± 0.003 | 0.903 ± 0.003 |
| | 0.858 ± 0.009 | 0.858 ± 0.009 | 0.889 ± 0.001 | 0.851 ± 0.009 | 0.851 ± 0.009 | 0.851 ± 0.009 |
| | 0.633 ± 0.01 | 0.634 ± 0.01 | 0.645 ± 0.0 | 0.754 ± 0.014 | 0.755 ± 0.014 | 0.755 ± 0.014 |
| FVC2002 | 0.847 ± 0.007 | 0.845 ± 0.006 | 0.846 ± 0.006 | 0.831 ± 0.006 | 0.829 ± 0.006 | 0.829 ± 0.006 |
| | 0.728 ± 0.004 | 0.738 ± 0.005 | 0.851 ± 0.004 | 0.849 ± 0.008 | 0.853 ± 0.008 | 0.854 ± 0.007 |
| | 0.937 ± 0.006 | 0.94 ± 0.006 | 0.921 ± 0.006 | 0.871 ± 0.008 | 0.87 ± 0.008 | 0.87 ± 0.008 |
| FVC2004 | 0.848 ± 0.006 | 0.848 ± 0.006 | 0.942 ± 0.001 | 0.77 ± 0.006 | 0.77 ± 0.006 | 0.77 ± 0.006 |
| | 0.644 ± 0.003 | 0.645 ± 0.004 | 0.848 ± 0.002 | 0.8 ± 0.005 | 0.8 ± 0.005 | 0.8 ± 0.005 |
| | 0.832 ± 0.01 | 0.832 ± 0.009 | 0.898 ± 0.001 | 0.764 ± 0.007 | 0.764 ± 0.008 | 0.764 ± 0.008 |
Table 4. Accuracy of face data (three sample images per dataset).

Global Threshold

| Dataset | NO_TF | NO_TI | NO_TIF | NU_TF | NU_TI | NU_TIF |
| CWF | 0.365 ± 0.027 | 0.94 ± 0.054 | 0.925 ± 0.056 | 0.759 ± 0.072 | 0.941 ± 0.053 | 0.94 ± 0.054 |
| | 0.469 ± 0.008 | 0.955 ± 0.04 | 0.934 ± 0.039 | 0.489 ± 0.031 | 0.957 ± 0.04 | 0.95 ± 0.036 |
| | 0.424 ± 0.005 | 0.988 ± 0.016 | 0.973 ± 0.015 | 0.606 ± 0.012 | 0.984 ± 0.015 | 0.977 ± 0.015 |
| CBCL | 0.958 ± 0.028 | 0.952 ± 0.028 | 0.94 ± 0.028 | 0.958 ± 0.028 | 0.958 ± 0.028 | 0.957 ± 0.027 |
| | 0.979 ± 0.018 | 0.979 ± 0.018 | 0.978 ± 0.018 | 0.979 ± 0.018 | 0.979 ± 0.018 | 0.978 ± 0.018 |
| | 0.987 ± 0.013 | 0.987 ± 0.013 | 0.971 ± 0.013 | 0.987 ± 0.013 | 0.987 ± 0.013 | 0.987 ± 0.013 |
| RF | 0.915 ± 0.073 | 0.915 ± 0.073 | 0.91 ± 0.074 | 0.915 ± 0.073 | 0.915 ± 0.073 | 0.914 ± 0.073 |
| | 0.961 ± 0.003 | 0.96 ± 0.003 | 0.956 ± 0.004 | 0.961 ± 0.003 | 0.961 ± 0.003 | 0.958 ± 0.003 |
| | 0.971 ± 0.041 | 0.971 ± 0.041 | 0.964 ± 0.04 | 0.971 ± 0.041 | 0.971 ± 0.041 | 0.971 ± 0.041 |
| MEDS-I | 0.961 ± 0.029 | 0.961 ± 0.029 | 0.954 ± 0.029 | 0.961 ± 0.029 | 0.961 ± 0.029 | 0.958 ± 0.029 |
| | 0.344 ± 0.013 | 0.97 ± 0.01 | 0.96 ± 0.01 | 0.797 ± 0.032 | 0.97 ± 0.01 | 0.97 ± 0.01 |
| | 0.937 ± 0.063 | 0.937 ± 0.063 | 0.925 ± 0.065 | 0.937 ± 0.063 | 0.937 ± 0.063 | 0.937 ± 0.063 |
| MEDS-II | 0.383 ± 0.014 | 0.955 ± 0.01 | 0.945 ± 0.01 | 0.688 ± 0.04 | 0.955 ± 0.01 | 0.955 ± 0.01 |
| | 0.431 ± 0.043 | 0.883 ± 0.103 | 0.873 ± 0.101 | 0.545 ± 0.115 | 0.884 ± 0.107 | 0.884 ± 0.107 |
| | 0.356 ± 0.08 | 0.797 ± 0.188 | 0.793 ± 0.183 | 0.516 ± 0.165 | 0.798 ± 0.188 | 0.797 ± 0.187 |

Adaptive Threshold

| Dataset | NO_TF | NO_TI | NO_TIF | NU_TF | NU_TI | NU_TIF |
| CWF | 0.631 ± 0.005 | 0.631 ± 0.006 | 0.826 ± 0.002 | 0.856 ± 0.007 | 0.856 ± 0.007 | 0.856 ± 0.007 |
| | 0.832 ± 0.007 | 0.838 ± 0.007 | 0.81 ± 0.007 | 0.738 ± 0.009 | 0.744 ± 0.009 | 0.744 ± 0.009 |
| | 0.854 ± 0.011 | 0.854 ± 0.011 | 0.841 ± 0.004 | 0.903 ± 0.01 | 0.902 ± 0.01 | 0.903 ± 0.009 |
| CBCL | 0.923 ± 0.013 | 0.923 ± 0.013 | 0.923 ± 0.012 | 0.909 ± 0.013 | 0.909 ± 0.013 | 0.909 ± 0.013 |
| | 0.909 ± 0.003 | 0.909 ± 0.003 | 0.909 ± 0.004 | 0.925 ± 0.006 | 0.925 ± 0.006 | 0.925 ± 0.006 |
| | 0.895 ± 0.004 | 0.895 ± 0.004 | 0.895 ± 0.004 | 0.934 ± 0.009 | 0.934 ± 0.009 | 0.934 ± 0.009 |
| RF | 0.85 ± 0.004 | 0.85 ± 0.004 | 0.849 ± 0.004 | 0.855 ± 0.014 | 0.855 ± 0.014 | 0.855 ± 0.014 |
| | 0.892 ± 0.001 | 0.892 ± 0.001 | 0.85 ± 0.001 | 0.753 ± 0.006 | 0.753 ± 0.006 | 0.753 ± 0.006 |
| | 0.742 ± 0.011 | 0.742 ± 0.011 | 0.853 ± 0.0 | 0.874 ± 0.011 | 0.874 ± 0.011 | 0.874 ± 0.011 |
| MEDS-I | 0.521 ± 0.008 | 0.521 ± 0.008 | 0.597 ± 0.001 | 0.835 ± 0.014 | 0.835 ± 0.014 | 0.835 ± 0.014 |
| | 0.694 ± 0.006 | 0.694 ± 0.006 | 0.808 ± 0.002 | 0.906 ± 0.012 | 0.906 ± 0.012 | 0.906 ± 0.012 |
| | 0.637 ± 0.004 | 0.637 ± 0.004 | 0.636 ± 0.007 | 0.919 ± 0.015 | 0.919 ± 0.015 | 0.919 ± 0.015 |
| MEDS-II | 0.498 ± 0.008 | 0.498 ± 0.009 | 0.708 ± 0.0 | 0.857 ± 0.015 | 0.857 ± 0.015 | 0.857 ± 0.015 |
| | 0.442 ± 0.003 | 0.442 ± 0.003 | 0.547 ± 0.0 | 0.964 ± 0.006 | 0.964 ± 0.006 | 0.964 ± 0.006 |
| | 0.76 ± 0.008 | 0.761 ± 0.008 | 0.761 ± 0.008 | 0.818 ± 0.008 | 0.819 ± 0.008 | 0.819 ± 0.008 |


