Article

Robust Harmonic Fuzzy Partition Local Information C-Means Clustering for Image Segmentation

School of Electronic Engineering, Xi’an University of Posts and Telecommunications, Xi’an 710121, China
*
Author to whom correspondence should be addressed.
Symmetry 2024, 16(10), 1370; https://doi.org/10.3390/sym16101370
Submission received: 5 September 2024 / Revised: 4 October 2024 / Accepted: 12 October 2024 / Published: 15 October 2024
(This article belongs to the Section Computer)

Abstract
Considering the shortcomings of Ruspini partition-based fuzzy clustering in revealing the intrinsic correlation between different classes, a series of harmonic fuzzy local information C-means clustering algorithms for noisy image segmentation is proposed. Firstly, aiming at the shortcomings of Zadeh’s fuzzy sets, a new concept of generalized harmonic fuzzy sets is originally introduced and the corresponding harmonic fuzzy partition is further defined. Then, based on the concept of symmetric harmonic partition, a new harmonic fuzzy local information C-means clustering (HLICM) is proposed and the local convergence of the algorithm is rigorously proved using Zangwill’s theorem. Finally, inspired by the improved fuzzy local information C-means clustering (IFLICM) and kernel-based weighted fuzzy local information C-means clustering (KWFLICM), two enhanced robust HLICM algorithms are constructed to further improve the ability of the algorithm to suppress noise. Compared with existing state-of-the-art robust fuzzy clustering algorithms, the two proposed algorithms are confirmed to be significantly competitive and superior.

1. Introduction

Clustering analysis [1] is the process of dividing a set of abstract objects or physical targets into multiple classes containing similar features for analysis and understanding. The main purpose of cluster analysis [2] is to group data samples based on their similarities, so that samples in the same class are similar to each other and samples in different classes are different from each other. Clustering [3] is an unsupervised partition method that does not rely on a priori knowledge and does not require human intervention. The origin of cluster analysis [4] can be traced back to many different fields, such as mathematics, statistics, economics, and biology. It is widely used in image recognition, data mining [5], and other fields to determine the similarity between different data and achieve data partitioning. Common clustering analysis methods include hierarchical-based clustering [6], partition-based clustering [7], and density-based clustering [8]. Among them, partition-based clustering methods have been widely applied in recent years, including hard partition and fuzzy partition [9]. Hard C-means (HCM) clustering [10] is one of the classical algorithms for implementing a hard C-partition of a dataset. It uses a hard clustering strategy [11] to assign data points to a fixed number of clusters, and each sample can only belong to one cluster. Considering the obvious drawbacks of the HCM algorithm, Dunn [12] extended the HCM algorithm to a fuzzy form based on the concept of fuzzy partition given by Ruspini [13]. Subsequently, Bezdek [14] further generalized Dunn’s objective function to a more general form, enabling the FCM algorithm to preserve more information in the original data. However, the biggest drawback of the FCM algorithm is that it does not utilize any spatial information in the image [15], which makes the algorithm sensitive to noise and outliers and prevents it from obtaining satisfactory segmentation results in noise-corrupted images.
In view of this, researchers have proposed many robust fuzzy clustering algorithms that utilize the spatial information of images [16]. Firstly, Ahmed et al. [17] introduced spatial neighborhood terms into the objective function of the FCM algorithm and proposed a spatial fuzzy C-means clustering algorithm (FCM-S) that combines spatial information, which was applied to the segmentation of MRI images. To reduce the computational complexity of the FCM-S algorithm, Chen and Zhang [18] proposed the improved algorithms FCM-S1 and FCM-S2. In addition, Chen and Zhang [18] proposed kernel versions of the improved algorithms that use a kernel-induced distance metric instead of the squared Euclidean distance. Note that the number of gray levels in an image is generally much smaller than the number of pixels. Based on this fact, Szilagyi et al. [19] proposed the enhanced fuzzy C-means clustering algorithm (EnFCM) in an attempt to further improve the efficiency of image segmentation. It should be noted that the EnFCM algorithm achieves segmentation performance comparable to the FCM-S algorithm while significantly reducing computational complexity. Subsequently, Cai et al. [20] proposed the fast generalized fuzzy C-means clustering algorithm (FGFCM). The FGFCM algorithm first defines a novel linearly weighted sum image using the gray values and spatial location information within the pixel neighborhood window of the original image and then performs clustering on the gray histogram of the image. Similar to the EnFCM algorithm, the FGFCM algorithm consumes less running time while achieving satisfactory segmentation results.
In recent years, Krinidis and Chatzis [21] proposed the fuzzy local information C-means (FLICM) clustering algorithm, which suppresses image noise and preserves image details by constructing fuzzy local factors incorporating local spatial information, achieving better segmentation performance in noisy image segmentation. To make FCM and FGFCM clustering more robust to noise, Mithra and Emmanuel [22] proposed an improved fuzzy-based clustering algorithm for sputum images called IFLICM. It combines local spatial and grayscale information in a fuzzy way to maintain robustness and noise insensitivity. A new improved kernel factor is introduced into the traditional FCM algorithm to improve the segmentation performance of sputum images. It manages the effect of neighboring pixels based on their distance from the center pixel. The FLICM algorithm balances the membership values corresponding to each pixel point in the local neighborhood window by fuzzy local factors so that the membership values of non-noise pixels and noise pixels in the neighborhood window converge to similar values and therefore improves the robustness of the algorithm. The KWFLICM algorithm proposed by Gong et al. [23] improved the fuzzy local factor. By introducing the spatial weight influencing factor composed of local variance coefficients into the algorithm more information about spatial interactions can be obtained, resulting in more satisfactory segmentation results.
Problem statement: Due to the limitation that membership degrees must sum to one, existing fuzzy clustering algorithms have not fully considered the intrinsic correlation between classes. To address this, GLCA, a special unsupervised clustering algorithm [24], uses a non-additive Sugeno fuzzy measure to represent the intrinsic correlation between classes. This enables GLCA to maintain the simplicity and robustness of Bezdek-type clustering methods while effectively extracting expected outliers. However, the fuzzy memberships in GLCA may take negative values and the algorithm does not converge locally [25], so it has not attracted much attention from scholars. After studying the characteristics of the GLCA algorithm, Leski [26] proposed the generalized weighted conditional fuzzy clustering (GWCFCM) algorithm. The fuzzy memberships in GWCFCM may be greater than 1, which conflicts with the existing definition of membership in Zadeh’s fuzzy set theory. Therefore, Zadeh’s fuzzy set theory cannot support these fuzzy clustering algorithms, which poses a great challenge to their development.
Motivation: Based on the above considerations, this paper first introduces a new concept of generalized harmonic fuzzy sets. Then, inspired by the theoretical research on the GWCFCM algorithm, it is emphasized that clustering algorithms need to fully consider the intrinsic correlation between different categories. Therefore, this paper modifies the constraints of the GWCFCM algorithm by using the idea of non-linear constraints in GLCA to define non-additive harmonic fuzzy partitions, which are supported by the new harmonic fuzzy sets. Finally, based on the concept of harmonic fuzzy partition, the related theories and algorithms of harmonic fuzzy C-means clustering and harmonic fuzzy local information C-means clustering are studied.
The main contributions of this paper are summarized below:
  • The new concept of generalized harmonic fuzzy sets is defined for the first time, and on this basis, the concept of symmetric harmonic fuzzy partition is proposed.
  • On the basis of existing robust fuzzy local information clustering for noisy image segmentation, a harmonic fuzzy local information C-means clustering algorithm is proposed through extension. The convergence of the proposed clustering algorithm is rigorously analyzed using Zangwill’s theorem and the bordered Hessian matrix theorem.
  • For images with high noise, two improved algorithms, IHLICM and KWHLICM, are proposed to improve the HLICM algorithm by using the idea of the KWFLICM algorithm.
  • The experimental results indicate that the proposed harmonic fuzzy clustering correlation algorithms are superior to the existing state-of-the-art robust fuzzy clustering correlation algorithms.
The rest of the paper is organized as follows: Section 2 introduces the theory of fuzzy sets, fuzzy partition, and fuzzy clustering-related algorithms. Section 3 presents the basic concepts of generalized harmonic fuzzy sets and harmonic fuzzy partition and constructs the harmonic fuzzy C-means clustering algorithm and its improved robust segmentation algorithms. Section 4 compares and analyzes the clustering results of the proposed HLICM and its improved algorithms with other state-of-the-art robust fuzzy clustering algorithms. Finally, Section 5 provides a summary of the entire paper and offers prospects for future work.

2. Related Work

In this section, we focus on introducing the basic theories involved in this paper, including fuzzy sets, fuzzy partition, and fuzzy C-means clustering and its robust algorithms based on local information.

2.1. Fuzzy Sets and Fuzzy Partition

Basic fuzzy sets, also known as Zadeh fuzzy sets or type-1 fuzzy sets, were proposed by Zadeh in 1965 [27]. The membership function, the decomposition theorem, and the extension principle constitute the three pillars of classical fuzzy set theory.
Definition 1.
[27]. Let $X=\{x_i,\ i=1,\dots,n\}$ be a nonempty finite set, where $i$ is the index of an element, $n$ is the number of elements, and $x_i$ is the value taken by element $i$. For any element $x_i$ in this set, the fuzzy set $A$ built on $X$ can be defined as $A=\{(x_i,\mu_A(x_i))\mid x_i\in X\}$, where $\mu_A(x_i)\in[0,1]$ is the fuzzy membership degree of $x_i$ belonging to the finite set $X$.
Zadeh generalized various common concepts such as union and intersection. He defined the following basic operations on fuzzy sets (equality, containment, complementation, intersection, and union) for classical Zadeh fuzzy sets $A,B$ in any universe of discourse $X$ as follows.
(a) $A=B$ if and only if $\mu_A(x)=\mu_B(x),\ \forall x\in X$;
(b) $A\subseteq B$ if and only if $\mu_A(x)\le\mu_B(x),\ \forall x\in X$;
(c) $A^{c}$ is the complement of $A$ if and only if $\mu_{A^{c}}(x)=1-\mu_A(x),\ \forall x\in X$;
(d) $A\cup B$ if and only if $\mu_{A\cup B}(x)=\max(\mu_A(x),\mu_B(x)),\ \forall x\in X$;
(e) $A\cap B$ if and only if $\mu_{A\cap B}(x)=\min(\mu_A(x),\mu_B(x)),\ \forall x\in X$.
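These pointwise operations are straightforward to compute on a finite universe. As a minimal illustration (not part of the original paper), the following Python sketch applies them to membership vectors; the example values in mu_A and mu_B are arbitrary.

```python
# Zadeh's fuzzy-set operations applied elementwise to membership vectors;
# a minimal sketch with arbitrary example memberships, not the paper's code.
import numpy as np

mu_A = np.array([0.2, 0.7, 1.0, 0.4])   # membership degrees of A over X
mu_B = np.array([0.5, 0.7, 0.3, 0.4])   # membership degrees of B over X

print(np.allclose(mu_A, mu_B))           # (a) A = B?
print(np.all(mu_A <= mu_B))              # (b) A contained in B?
print(1.0 - mu_A)                        # (c) complement of A
print(np.maximum(mu_A, mu_B))            # (d) union of A and B
print(np.minimum(mu_A, mu_B))            # (e) intersection of A and B
```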
So far, fuzzy sets have been widely used in pattern recognition, image processing, and intelligent decision making [28].
With the deeper study of fuzzy set theory, Ruspini [29,30] defined the concept of fuzzy partition as a generalization of hard partition. The definition of fuzzy partition is as follows.
Definition 2.
[29]. Let $\tilde{X}=\{A_k\mid k=1,2,\dots,c\}$, where $A_k$ is a fuzzy set on the domain $X=\{x_1,x_2,\dots,x_n\}$. $\tilde{X}$ is a fuzzy partition if and only if
(a) $\forall k\in\{1,2,\dots,c\},\ A_k\neq\phi$;
(b) $\forall x\in X,\ \sum_{k=1}^{c}\mu_{A_k}(x)=1$.
Later, Pan and Xu [31] also gave a generalized axiomatization definition of fuzzy partition as follows.
Definition 3.
[31]. Let $X=[a,b]$. A fuzzy partition of $X$ is an object of the following form: $\tilde{X}=\{\mu_{A_1}(x),\mu_{A_2}(x),\dots,\mu_{A_c}(x)\},\ c\in\mathbb{Z}^{+}$, where the fuzzy membership functions $\mu_{A_k}(x):X\to[0,1]\ (k=1,2,\dots,c)$ define the degrees of membership of an element $x\in X$ belonging to the classes $A_k$, respectively, and satisfy the following conditions:
(a) For any $x\in X$, there is at least one $k\in\{1,2,\dots,c\}$ such that $\mu_{A_k}(x)>0$;
(b) For any $k\in\{1,2,\dots,c\}$, $\mu_{A_k}(x)$ is continuous on $X$;
(c) For any $k\in\{1,2,\dots,c\}$, there is at least one $x_0\in X$ such that $\mu_{A_k}(x_0)=1$;
(d) For any $k\in\{1,2,\dots,c\}$, if $\mu_{A_k}(x_0)=1$ for $x_0\in X$, then $\mu_{A_k}(x)$ is non-decreasing on $[a,x_0]$ and non-increasing on $[x_0,b]$;
(e) $0<\sum_{k=1}^{c}\mu_{A_k}(x)\le 1$ holds for $\forall x\in X$.
If $\sum_{k=1}^{c}\mu_{A_k}(x)=1,\ \forall x\in X$, then $\tilde{X}$ is said to be a regular fuzzy partition, namely Ruspini’s fuzzy partition.
In addition, Mesiar and Rybátrik [32] used a normed generating function to define the following g-fuzzy partition.
Definition 4.
[32]. Let $\tilde{X}=\{A_k\mid k=1,2,\dots,c\}$, where $A_k$ is a fuzzy set on the domain $X=\{x_1,x_2,\dots,x_n\}$. $\tilde{X}$ is a generalized fuzzy partition if and only if $\forall x\in X$, $\sum_{k=1}^{c}g(\mu_{A_k}(x))=1$, where $g(\cdot)$ is a normed generator; i.e., it is a continuous strictly increasing function $g:[0,1]\to[0,1]$ with $g(0)=0$ and $g(1)=1$. Typical normed generators are $g(x)=x^{r}\ (r>0)$, $g(x)=\ln(1+x)/\ln 2$, and $g(x)=(\exp(ax)-1)/(\exp(a)-1)\ (a>0)$.
Overall, fuzzy partition is the theoretical basis of fuzzy clustering [33]. For a long time, the fuzzy partition problem in fuzzy clustering theory has attracted extensive attention from scholars [34,35].

2.2. Fuzzy Partition-Related Clustering

The FCM algorithm is an unsupervised fuzzy clustering method proposed by Dunn [12] and developed by Bezdek [14]. The optimization model for FCM is as follows:
$$\min J_{FCM}(U,V)=\sum_{k=1}^{c}\sum_{i=1}^{n}u_{ki}^{m}d^{2}(x_i,v_k)$$
s.t. (a) $u_{ki}\in[0,1],\ \forall i,k$; (b) $\sum_{k=1}^{c}u_{ki}=1,\ \forall i$; (c) $0<\sum_{i=1}^{n}u_{ki}<n,\ \forall k$.
The fuzzy membership $u_{ki}$ and the clustering center $v_k$ corresponding to the optimization model are obtained as follows:
$$u_{ki}=1\Big/\sum_{j=1}^{c}\left(d^{2}(x_i,v_k)/d^{2}(x_i,v_j)\right)^{1/(m-1)}$$
$$v_k=\sum_{i=1}^{n}u_{ki}^{m}x_i\Big/\sum_{i=1}^{n}u_{ki}^{m}$$
The FCM algorithm calculates the fuzzy membership of each sample to the different clusters in the range [0, 1], ensuring that the sum of the membership degrees of each sample over all clusters is 1. Namely, the fuzzy partition space in the FCM algorithm is described as follows:
$$\bar{P}_{fc}=\left\{U=[u_{ki}]\in\mathbb{R}^{c\times n}\ \middle|\ u_{ki}\in[0,1],\ \forall i,k;\ \sum_{k=1}^{c}u_{ki}=1,\ \forall i\right\}$$
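To make the alternating updates concrete, the following Python sketch implements one possible FCM loop under the updates above; the initialization scheme, tolerance, and variable names are illustrative assumptions rather than the paper’s implementation.

```python
# A minimal FCM sketch: alternate the membership and center updates above
# until the centers stabilize. Initialization and tolerance are assumptions.
import numpy as np

def fcm(X, c, m=2.0, max_iter=100, eps=1e-6, seed=0):
    n = X.shape[0]
    rng = np.random.default_rng(seed)
    V = X[rng.choice(n, size=c, replace=False)]            # initial centers
    for _ in range(max_iter):
        d2 = ((X[None, :, :] - V[:, None, :]) ** 2).sum(-1) + 1e-12  # (c, n)
        # u_ki = 1 / sum_j (d2_ki / d2_ji)^(1/(m-1))
        U = 1.0 / ((d2[:, None, :] / d2[None, :, :]) ** (1.0 / (m - 1))).sum(1)
        W = U ** m
        V_new = (W @ X) / W.sum(axis=1, keepdims=True)     # weighted means
        if np.abs(V_new - V).max() < eps:
            return U, V_new
        V = V_new
    return U, V
```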
However, this clustering method fails to reveal the inherent correlations between different categories and has some difficulty in dealing with data with noise or outliers. Considering the shortcomings of the FCM algorithms, Wang et al. [36] and Zhu et al. [37] proposed a double-indices-induced fuzzy C-means clustering method, and the corresponding optimization model is established as follows:
$$\min J_{DIFCM}(U,V;X)=\sum_{k=1}^{c}\sum_{i=1}^{n}u_{ki}^{m}d^{2}(x_i,v_k)$$
s.t. (a) $u_{ki}\in[0,1],\ \forall i,k$; (b) $\sum_{k=1}^{c}u_{ki}^{r}=1,\ \forall i,\ 0<r$; (c) $0<\sum_{i=1}^{n}u_{ki}<n,\ \forall k$.
The iterated formulas of the membership and the clustering center are given by
$$u_{ki}=1\Big/\left[\sum_{j=1}^{c}\left(d^{2}(x_i,v_k)/d^{2}(x_i,v_j)\right)^{r/(m-r)}\right]^{1/r}$$
$$v_k=\sum_{i=1}^{n}u_{ki}^{m}x_i\Big/\sum_{i=1}^{n}u_{ki}^{m}$$
For double-indices-induced fuzzy C-means clustering, if $r=1$, the algorithm degenerates to the classical FCM algorithm. Therefore, the FCM algorithm is a specific case of the DIFCM algorithm. In addition, from the perspective of fuzzy partition, the DIFCM algorithm is a generalized fuzzy partition clustering method [32], and its g-fuzzy partition space may be defined as follows:
$$\bar{P}_{gc}=\left\{U=[u_{ki}]\in\mathbb{R}^{c\times n}\ \middle|\ u_{ki}\in[0,1],\ \forall i,k;\ \sum_{k=1}^{c}u_{ki}^{r}=1\ (0<r),\ \forall i\right\}$$
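A hedged sketch of the DIFCM membership update as reconstructed above (assuming $m\neq r$); setting $r=1$ recovers the FCM update, illustrating the degeneration noted in the text.

```python
# DIFCM membership update under the constraint sum_k u_ki^r = 1; a sketch
# of the reconstructed formula, with r = 1 reducing to the FCM update.
import numpy as np

def difcm_membership(d2, m=2.0, r=1.0):
    # d2: (c, n) array of squared distances d^2(x_i, v_k); requires m != r
    ratio = (d2[:, None, :] / d2[None, :, :]) ** (r / (m - r))
    return 1.0 / ratio.sum(axis=1) ** (1.0 / r)

d2 = np.array([[1.0, 4.0], [2.0, 1.0]])      # toy distances: c = 2, n = 2
U = difcm_membership(d2, m=2.0, r=1.0)
print(U.sum(axis=0))                          # columns sum to 1 when r = 1
```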

2.3. Robust Fuzzy Clustering with Local Information

The FCM algorithm introduces fuzziness to the attributes of each image pixel, allowing the FCM algorithm to preserve more detailed information in the original image and achieve more satisfactory segmentation results. However, the biggest drawback of the FCM algorithm in image segmentation is that it does not consider any spatial information in the image [15], which makes the algorithm sensitive to noise and unable to obtain satisfactory segmentation results on noisy images. Krinidis and Chatzis [21] defined the concept of a fuzzy local factor using the local spatial information of image pixels, called the local spatial fuzzy factor [38], and introduced this factor into the FCM algorithm to propose the fuzzy local information C-means clustering algorithm (FLICM). Because the local spatial fuzzy factor fuses the spatial distance and grayscale information of neighboring pixels in a fuzzy manner [39], it can balance noise suppression and image detail preservation to a certain extent. The optimization model of the FLICM algorithm is as follows:
$$\min J_{FLICM}(U,V)=\sum_{k=1}^{c}\sum_{i=1}^{n}u_{ki}^{m}\left[d^{2}(x_i,v_k)+G_{ki}\right]$$
s.t. (a) $0\le u_{ki}\le 1,\ \forall i,k$; (b) $\sum_{k=1}^{c}u_{ki}=1,\ \forall i$; (c) $0<\sum_{i=1}^{n}u_{ki}<n,\ \forall k$,
with the fuzzy local information factor $G_{ki}=\sum_{j\in N_i,j\neq i}\frac{1}{d_{ji}+1}(1-u_{kj})^{m}d^{2}(x_j,v_k)$, where $N_i$ is the set of neighbors of pixel $i$ and $d_{ji}$ is the spatial distance between pixels $j$ and $i$.
Using the Lagrange multiplier method, the following fuzzy membership $u_{ki}$ and clustering center $v_k$ can be obtained:
$$u_{ki}=1\Big/\sum_{l=1}^{c}\left(\frac{d^{2}(x_i,v_k)+G_{ki}}{d^{2}(x_i,v_l)+G_{li}}\right)^{1/(m-1)}$$
$$v_k=\sum_{i=1}^{n}u_{ki}^{m}x_i\Big/\sum_{i=1}^{n}u_{ki}^{m}$$
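The following Python sketch (an illustration with assumed variable names, not the authors’ implementation) computes the fuzzy local factor $G_{ki}$ for a single pixel of a grayscale image; U holds memberships per class and pixel, V the class centers, and d_ji is the spatial Euclidean distance between pixel coordinates.

```python
# Computing the FLICM local factor G_ki for one pixel i over a w x w window;
# an illustrative sketch, not the authors' implementation.
import numpy as np

def flicm_local_factor(img, U, V, row, col, k, m=2.0, w=3):
    rows, cols = img.shape
    r = w // 2
    G = 0.0
    for dr in range(-r, r + 1):
        for dc in range(-r, r + 1):
            if dr == 0 and dc == 0:
                continue                         # skip the center pixel j = i
            jr, jc = row + dr, col + dc
            if 0 <= jr < rows and 0 <= jc < cols:
                d_ji = np.hypot(dr, dc)          # spatial distance to center
                G += ((1.0 - U[k, jr, jc]) ** m
                      * (img[jr, jc] - V[k]) ** 2) / (d_ji + 1.0)
    return G
```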
To make FCM clustering more robust to noise, an improved fuzzy local information clustering algorithm, IFLICM, was also proposed in [22] for image segmentation. The optimization model of the IFLICM algorithm is as follows:
$$\min J_{IFLICM}(U,V)=\sum_{k=1}^{c}\sum_{i=1}^{n}u_{ki}^{m}\left[d^{2}(x_i,v_k)+G_{ki}\right]$$
s.t. (a) $0\le u_{ki}\le 1,\ \forall i,k$; (b) $\sum_{k=1}^{c}u_{ki}=1,\ \forall i$; (c) $0<\sum_{i=1}^{n}u_{ki}<n,\ \forall k$,
where the improved fuzzy local information factor is defined as
$$G_{ki}=\sum_{j\in N_i,j\neq i}s_{ji}(1-u_{kj})^{m}d^{2}(x_j,v_k)$$
in which $s_{ji}=\frac{1}{2}\left[\exp\left(-|x_i-x_j|/255\right)+\exp\left(-|x_i-\bar{x}_i|/255\right)\right]$ and $\bar{x}_i=\frac{1}{|N_i|}\sum_{j\in N_i}x_j$.
Using the Lagrange multiplier method, the following membership $u_{ki}$ and clustering center $v_k$ can also be obtained:
$$u_{ki}=1\Big/\sum_{l=1}^{c}\left(\frac{d^{2}(x_i,v_k)+G_{ki}}{d^{2}(x_i,v_l)+G_{li}}\right)^{1/(m-1)}$$
$$v_k=\sum_{i=1}^{n}u_{ki}^{m}x_i\Big/\sum_{i=1}^{n}u_{ki}^{m}$$
The FLICM algorithm introduces unsupervised parameters but still lacks sufficient robustness to high noise. In view of this, Gong et al. [23] proposed a kernel metric-based fuzzy local information FCM algorithm (KWFLICM), which introduces a kernel function [40] and a weighted factor $w_{ji}$ [41] to further improve the noise resistance. The combination of the coefficient of variation and the weighted factor $w_{ji}$ better expresses the difference between the center pixel and its neighboring pixels and can obtain more information about spatial interactions [39], thereby achieving segmentation results with less noise. The objective function of the KWFLICM algorithm is as follows:
$$J_{KWFLICM}(U,V)=\sum_{i=1}^{n}\sum_{k=1}^{c}u_{ki}^{m}\left[\left(1-K(x_i,v_k)\right)+G_{ki}\right]$$
where $U=(u_{ki})_{c\times n}$ represents the membership matrix of size $c\times n$, $n$ represents the total number of samples, $c$ represents the number of categories into which the samples are classified, $u_{ki}$ represents the degree to which sample $x_i$ belongs to category $k$, and $m$ is the fuzzy weighting exponent, which defaults to $m=2$ in this paper.
The weighted fuzzy local information factor with kernel metric $G_{ki}$ is defined as follows:
$$G_{ki}=\sum_{j\in N_i,j\neq i}w_{ji}(1-u_{kj})^{m}\left(1-K(x_j,v_k)\right)$$
where the weighted factor $w_{ji}$ consists of spatial information $w_{sc}$ and neighborhood gray-scale information $w_{gc}$. Using the least squares method to optimize the objective function, the update formulas for the degree of membership and the clustering center are obtained as follows:
$$u_{ki}=1\Big/\sum_{l=1}^{c}\left(\frac{\left(1-K(x_i,v_k)\right)+G_{ki}}{\left(1-K(x_i,v_l)\right)+G_{li}}\right)^{1/(m-1)}$$
$$v_k=\sum_{i=1}^{n}u_{ki}^{m}K(x_i,v_k)x_i\Big/\sum_{i=1}^{n}u_{ki}^{m}K(x_i,v_k)$$
Overall, the KWFLICM algorithm combines neighborhood spatial information and grayscale features [39] to establish a new fuzzy local information factor, which greatly improves the robustness of the algorithm to noise [42].

3. Proposed Clustering Algorithm

To address the shortcomings of existing fuzzy clustering algorithms, a harmonic fuzzy set is first defined; secondly, the concept of a harmonic fuzzy partition is proposed; and finally, a series of harmonic fuzzy local information C-means algorithms for noisy image segmentation are constructed, and their convergence is rigorously proved.

3.1. New Harmonic Fuzzy Set Theory

Zadeh [43] proposed a fuzzy set which uses the degree of membership to indicate the positivity level of elements in the set. However, the fuzzy membership values in fuzzy sets are limited to the range of [0, 1] and their shortcomings are gradually exposed in practical applications. Therefore, this paper will propose a new concept of generalized harmonic fuzzy sets, providing theoretical support for the research of harmonic fuzzy partition and harmonic fuzzy C-means clustering algorithms.
Definition 5.
Let $X=\{x_1,x_2,\dots,x_n\}$, and let the set of all generalized harmonic fuzzy sets on $X$ be denoted as $\mathcal{H}(X)$. The generalized harmonic fuzzy set $A$ built on $X$ is defined as $A=\{(x_i,h_A(x_i))\mid x_i\in X\}$, where $h_A(x):X\to(1,+\infty)$ is the generalized harmonic fuzzy membership function and $h_A(x_i)\in(1,+\infty)$ is the harmonic fuzzy membership degree of $x_i$ belonging to the finite set $X$.
We generalize the various common concepts of union, intersection, etc. and define the basic operations (equality, containment, complementation, intersection, and union) on generalized harmonic fuzzy sets $A,B$ in any universe of discourse $X$ as follows.
(a) $A=B$ if and only if $h_A(x)=h_B(x),\ \forall x\in X$;
(b) $A\subseteq B$ if and only if $h_A(x)\le h_B(x),\ \forall x\in X$;
(c) $A^{c}$ is the standard complement of $A$ if and only if $h_{A^{c}}(x)=\frac{h_A(x)}{h_A(x)-1},\ \forall x\in X$;
(d) $A\cup B$ if and only if $h_{A\cup B}(x)=\max(h_A(x),h_B(x)),\ \forall x\in X$;
(e) $A\cap B$ if and only if $h_{A\cap B}(x)=\min(h_A(x),h_B(x)),\ \forall x\in X$.
Definition 6.
The harmonic fuzzy complement operator is defined as an operation $N^{(h)}(h_A(x)):(1,+\infty)\to(1,+\infty)$ that assigns a value to each harmonic fuzzy membership degree $h_A(x)$ of any given harmonic fuzzy set $A$. Some common harmonic fuzzy complement operators are given as follows:
(a) The standard harmonic fuzzy complement operator $N_{S}^{(h)}(h_A(x))=\frac{h_A(x)}{h_A(x)-1},\ \forall x\in X$;
(b) Yager’s harmonic fuzzy complement operator $N_{Yager}^{(h)}(h_A(x))=\frac{h_A(x)}{\left((h_A(x))^{\alpha}-1\right)^{1/\alpha}}\ (\alpha>0),\ \forall x\in X$;
(c) Sugeno’s harmonic fuzzy complement operator $N_{Sugeno}^{(h)}(h_A(x))=\frac{h_A(x)+\lambda}{h_A(x)-1}\ (\lambda>-1),\ \forall x\in X$.
For the standard harmonic fuzzy complement operator $N_{S}^{(h)}(\cdot)$: if $h_A(x)=1.5$, then $N_{S}^{(h)}(h_A(x))=3.0$; if $h_A(x)=5.0$, then $N_{S}^{(h)}(h_A(x))=1.25$; and if $h_A(x)=2.0$, then $N_{S}^{(h)}(h_A(x))=h_A(x)=2.0$. Therefore, 2.0 is a fixed point of the standard harmonic complement operator; namely, 2.0 is the intermediary element of generalized harmonic fuzzy logic and reasoning.
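The worked values above are easy to verify; a quick Python check of the standard harmonic complement and its fixed point:

```python
# Verifying the standard harmonic complement N(h) = h / (h - 1) on the
# example values above; h = 2.0 is the fixed point.
def harmonic_complement(h):
    return h / (h - 1.0)

print(harmonic_complement(1.5))   # -> 3.0
print(harmonic_complement(5.0))   # -> 1.25
print(harmonic_complement(2.0))   # -> 2.0 (fixed point)
```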
To measure the fuzzy uncertainty of generalized harmonic fuzzy sets, we give the axiomatic definition of fuzzy entropy of generalized harmonic fuzzy sets according to existing fuzzy entropy theory of Zadeh’s fuzzy sets [44].
Definition 7.
Let $E^{(h)}$ denote a mapping $E^{(h)}:\mathcal{H}(X)\to[0,1]$, where $\mathcal{H}(X)$ is the set composed of all generalized harmonic fuzzy sets on the finite set $X$. Then $E^{(h)}$ is called the fuzzy entropy of generalized harmonic fuzzy sets if it satisfies the following conditions:
(a) $E^{(h)}(A)\to 0$ if and only if $h_A(x)\to 1^{+}$ or $h_A(x)\to+\infty$, $\forall x\in X$;
(b) $E^{(h)}(A)=1$ if and only if $h_A(x)=2.0$, $\forall x\in X$;
(c) $E^{(h)}(A)\le E^{(h)}(B)$ if and only if the generalized harmonic fuzzy set $A$ has less overall fuzziness than the generalized harmonic fuzzy set $B$.
In practical applications, some formulas of fuzzy entropy for generalized harmonic fuzzy sets are given as follows:
(a) $E^{(h)}(A)=\frac{2}{n}\sum_{i=1}^{n}\min\left(\frac{1}{h_A(x_i)},\frac{h_A(x_i)-1}{h_A(x_i)}\right)$;
(b) $E^{(h)}(A)=\frac{\sum_{i=1}^{n}\min\left(\frac{1}{h_A(x_i)},\frac{h_A(x_i)-1}{h_A(x_i)}\right)}{\sum_{i=1}^{n}\max\left(\frac{1}{h_A(x_i)},\frac{h_A(x_i)-1}{h_A(x_i)}\right)}$.
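As an illustrative check (with assumed helper names, not from the paper), the two entropy formulas can be evaluated directly on a vector of harmonic memberships; both return 1 when every membership equals the intermediary element 2.0.

```python
# Evaluating entropy formulas (a) and (b) for a generalized harmonic fuzzy
# set given as a vector h of memberships with h > 1; a hedged sketch.
import numpy as np

def entropy_a(h):
    return 2.0 / h.size * np.minimum(1.0 / h, (h - 1.0) / h).sum()

def entropy_b(h):
    lo = np.minimum(1.0 / h, (h - 1.0) / h)
    hi = np.maximum(1.0 / h, (h - 1.0) / h)
    return lo.sum() / hi.sum()

h = np.array([2.0, 2.0, 2.0])
print(entropy_a(h), entropy_b(h))   # -> 1.0 1.0 (maximal fuzziness)
```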
In order to measure the similarity between generalized harmonic fuzzy sets, we give an axiomatic definition of the fuzzy similarity measure based on the existing similarity theory of Zadeh’s fuzzy sets [45].
Definition 8.
A mapping $S^{(h)}:\mathcal{H}(X)\times\mathcal{H}(X)\to[0,1]$ is a similarity measure between generalized harmonic fuzzy sets if $S^{(h)}$ satisfies the following conditions:
(a) $0\le S^{(h)}(A,B)\le 1,\ \forall A,B\in\mathcal{H}(X)$;
(b) $S^{(h)}(A,B)=S^{(h)}(B,A),\ \forall A,B\in\mathcal{H}(X)$;
(c) If $A=B$, then $S^{(h)}(A,B)=1$;
(d) If $A\subseteq B\subseteq C$, $A,B,C\in\mathcal{H}(X)$, then $S^{(h)}(A,C)\le S^{(h)}(A,B)$ and $S^{(h)}(A,C)\le S^{(h)}(B,C)$.
Some common formulas for the fuzzy similarity measure are given as follows:
(a) $S^{(h)}(A,B)=\frac{\sum_{i=1}^{n}\min\left(h_A(x_i),h_B(x_i)\right)}{\sum_{i=1}^{n}\max\left(h_A(x_i),h_B(x_i)\right)}$;
(b) $S^{(h)}(A,B)=\frac{\sum_{i=1}^{n}\min\left(h_A^{-1}(x_i),h_B^{-1}(x_i)\right)}{\sum_{i=1}^{n}\max\left(h_A^{-1}(x_i),h_B^{-1}(x_i)\right)}$.
In addition, we use the similarity measure between generalized harmonic fuzzy sets to induce a new formula of fuzzy entropy as follows [46]:
$$E^{(h)}(A)=S^{(h)}\left(A\cap A^{c},A\cup A^{c}\right)=\frac{\sum_{i=1}^{n}\min\left(\frac{1}{h_A(x_i)},\frac{h_A(x_i)-1}{h_A(x_i)}\right)}{\sum_{i=1}^{n}\max\left(\frac{1}{h_A(x_i)},\frac{h_A(x_i)-1}{h_A(x_i)}\right)}$$
In summary, the fuzzy entropy and similarity of generalized harmonic fuzzy sets can be widely used in pattern recognition, image processing, and intelligent decision making [47].

3.2. Harmonic Fuzzy Partition

In 2003, Leski [25] proposed a weighted generalized conditional fuzzy clustering algorithm that introduces additional information about the fuzzy environment as a constraint in the clustering process. The algorithm is effective for clustering abnormal data and constructing radial basis neural networks. The specific constraints are constructed below:
$$u_{i}^{(\alpha,\beta)}=\left(\sum_{k=1}^{c}\beta_{k}(u_{ki})^{\alpha}\right)^{1/\alpha}$$
where $\alpha\in(-\infty,0)\cup(0,+\infty)$ and the weights satisfy $\beta_k\ge 0$, $\sum_{k=1}^{c}\beta_k=1$. We analyze the following three special cases:
(1) If $\alpha=1.0$, $u_{i}^{(\alpha,\beta)}=\left(\sum_{k=1}^{c}\beta_k(u_{ki})^{\alpha}\right)^{1/\alpha}=\sum_{k=1}^{c}\beta_k u_{ki}$;
(2) If $\alpha\to 0$, $u_{ki}\ge 0$, $\lim_{\alpha\to 0}u_{i}^{(\alpha,\beta)}=\lim_{\alpha\to 0}\left(\sum_{k=1}^{c}\beta_k(u_{ki})^{\alpha}\right)^{1/\alpha}=\prod_{k=1}^{c}u_{ki}^{\beta_k}$;
(3) If $\alpha=-1.0$, $u_{i}^{(\alpha,\beta)}=\left(\sum_{k=1}^{c}\beta_k(u_{ki})^{\alpha}\right)^{1/\alpha}=1\Big/\sum_{k=1}^{c}\beta_k/u_{ki}$.
The weighted generalized conditional fuzzy clustering optimization model is as follows:
$$\min J_{GWCFCM}(U,V;X)=\sum_{k=1}^{c}\sum_{i=1}^{n}\beta_{k}u_{ki}^{m}d^{2}(x_i,v_k)$$
s.t. (a) $u_{ki}\in[0,+\infty),\ \forall i,k$; (b) $c\left(\sum_{k=1}^{c}\beta_{k}u_{ki}^{\alpha}\right)^{1/\alpha}=f_i,\ \forall i$; (c) $0<\sum_{i=1}^{n}u_{ki}<n,\ \forall k$.
The iterated formulas of the fuzzy membership $u_{ki}$ and the clustering center $v_k$ are shown below:
$$u_{ki}=\frac{f_i}{c}\left[\sum_{j=1}^{c}\beta_{j}\left(d^{2}(x_i,v_k)/d^{2}(x_i,v_j)\right)^{\alpha/(m-\alpha)}\right]^{-1/\alpha}$$
$$v_k=\sum_{i=1}^{n}u_{ki}^{m}x_i\Big/\sum_{i=1}^{n}u_{ki}^{m}$$
When $\alpha\to 0$, $\beta_k=1/c\ (\forall k)$, and $f_i=1\ (\forall i)$, the fuzzy membership is $u_{ki}=\prod_{j=1}^{c}\left(d^{2}(x_i,v_j)\right)^{1/(cm)}\Big/\left(d^{2}(x_i,v_k)\right)^{1/m}$; this corresponds to generalized multiplicative fuzzy clustering based on a non-additive multiplicative fuzzy partition and can also be explained by the theory of generalized multiplicative fuzzy sets [48]. If $\alpha=1.0$, $\beta_k=1/c\ (\forall k)$, and $f_i=1\ (\forall i)$, the membership is $u_{ki}=1\Big/\sum_{j=1}^{c}\left(d^{2}(x_i,v_k)/d^{2}(x_i,v_j)\right)^{1/(m-1)}$; this corresponds to the classical fuzzy C-means clustering based on Ruspini’s additive fuzzy partition and can also be explained by the theory of Zadeh’s fuzzy sets. However, if $\alpha=-1.0$, $\beta_k=1/c\ (\forall k)$, and $f_i=1\ (\forall i)$, the membership is $u_{ki}=\sum_{j=1}^{c}\left(d^{2}(x_i,v_j)/d^{2}(x_i,v_k)\right)^{1/(m+1)}$. This clustering has not attracted the attention of relevant scholars, but it can be explained by the theory of the generalized harmonic fuzzy sets proposed in this paper. Therefore, on the basis of the harmonic fuzzy set theory proposed in this paper, we will explore a new partition theory and optimization model to support this clustering method and promote its widespread application in machine learning and machine intelligence.
Based on the generalized harmonic fuzzy sets proposed in this paper, a new concept of harmonic fuzzy partition is originally proposed as follows.
Definition 9.
Given a dataset $X=\{x_1,x_2,\dots,x_n\}$ whose samples are categorized into $c$ classes, let $H=[h_{ki}]_{c\times n}$ be a symmetric harmonic fuzzy partition matrix of size $c\times n$, where $h_{ki}$ denotes the harmonic fuzzy membership of each sample belonging to the different classes. The corresponding harmonic fuzzy partition space $\tilde{P}_{hc}$ is defined as follows:
$$\tilde{P}_{hc}=\left\{H=[h_{ki}]\in\mathbb{R}^{c\times n}\ \middle|\ h_{ki}\in(1,+\infty),\ \forall i,k;\ 1\Big/\sum_{k=1}^{c}(1/h_{ki})=1,\ \forall i\right\}$$
The above concept of symmetric harmonic fuzzy partition provides theoretical support for the research of the generalized harmonic fuzzy partition C-means clustering.
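For a concrete feel for Definition 9, the sketch below (toy values, not from the paper) checks the harmonic-sum constraint column by column: each sample's memberships must satisfy $1\big/\sum_{k}(1/h_{ki})=1$.

```python
# Checking the symmetric harmonic fuzzy partition constraint of Definition 9
# on a toy c x n matrix H: every column must satisfy sum_k 1/h_ki = 1.
import numpy as np

H = np.array([[3.0, 2.0],
              [3.0, 4.0],
              [3.0, 4.0]])              # c = 3 classes, n = 2 samples, h > 1
print(1.0 / (1.0 / H).sum(axis=0))     # -> [1. 1.], so H is a valid partition
```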

3.3. Modeling of Harmonic Fuzzy Clustering

Based on the related theories of harmonic fuzzy sets and harmonic fuzzy partition, the harmonic fuzzy C-means clustering algorithm (HCM) is studied, which satisfies the constraint that the harmonic sum of the membership degrees of each sample over all classes is 1. Therefore, it can reveal the intrinsic correlation between data and helps further discover the representation of data structures [49].
The HCM algorithm is an abstract process to find the minimum value of the objective function by dividing the dataset X into c classes. It can be modeled as follows:
$$\min J_{GHFCM}(H,V)=\sum_{k=1}^{c}\sum_{i=1}^{n}h_{ki}^{m}d^{2}(x_i,v_k)\qquad(26)$$
s.t. (a) $h_{ki}\in(1,+\infty),\ \forall i,k$; (b) $1\big/\sum_{k=1}^{c}(1/h_{ki})=1,\ \forall i$; (c) $0<\sum_{i=1}^{n}(1/h_{ki})<n,\ \forall k$,
where $m$ is the fuzzy weighting exponent, which affects the fuzziness of the clustering structure, and $h_{ki}$ denotes the harmonic fuzzy membership degree of each sample belonging to the different classes.
Using the Lagrange multiplier method, the following unconstrained objective function can be constructed:
$$L(H,V,\lambda)=\sum_{k=1}^{c}\sum_{i=1}^{n}h_{ki}^{m}d^{2}(x_i,v_k)+\sum_{i=1}^{n}\lambda_i\left(1-\Big(\sum_{k=1}^{c}h_{ki}^{-1}\Big)^{-1}\right)\qquad(27)$$
In addition, according to the constraint condition $1=\left(\sum_{k=1}^{c}h_{ki}^{-1}\right)^{-1}$, we may obtain a new unconstrained objective function as follows:
$$L(H,V,\lambda)=\sum_{k=1}^{c}\sum_{i=1}^{n}h_{ki}^{m}d^{2}(x_i,v_k)+\sum_{i=1}^{n}\lambda_i\left(\sum_{k=1}^{c}h_{ki}^{-1}-1\right)\qquad(28)$$
By taking the partial derivatives of $L(H,V,\lambda)$ with respect to the harmonic fuzzy membership $h_{ki}$ and the clustering center $v_k$ in Equation (28) and setting them to zero, we obtain the following:
$$h_{ki}=\frac{\sum_{l=1}^{c}\left(d^{2}(x_i,v_l)\right)^{1/(m+1)}}{\left(d^{2}(x_i,v_k)\right)^{1/(m+1)}}\qquad(29)$$
$$v_k=\sum_{i=1}^{n}h_{ki}^{m}x_i\Big/\sum_{i=1}^{n}h_{ki}^{m}\qquad(30)$$
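A hedged Python sketch of one alternating step of the HCM updates in Equations (29) and (30); the epsilon guard against zero distances is an implementation assumption.

```python
# One HCM step: harmonic memberships via Eq. (29), then centers via Eq. (30).
# A sketch with an added epsilon to guard against zero distances.
import numpy as np

def hcm_step(X, V, m=2.0):
    d2 = ((X[None, :, :] - V[:, None, :]) ** 2).sum(-1) + 1e-12   # (c, n)
    p = d2 ** (1.0 / (m + 1.0))
    H = p.sum(axis=0, keepdims=True) / p        # h_ki > 1, sum_k 1/h_ki = 1
    W = H ** m
    V_new = (W @ X) / W.sum(axis=1, keepdims=True)
    return H, V_new
```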
Like the existing Bezdek-type fuzzy clustering, harmonic fuzzy clustering has difficulty suppressing the impact of noise when directly applied to image segmentation. To this end, inspired by the robust FLICM algorithm, a series of improvements to harmonic fuzzy clustering are proposed. Figure 1 shows the main framework of the proposed algorithm.

3.3.1. Robust Harmonic Fuzzy Clustering with Local Information

However, the above harmonic fuzzy C-means clustering algorithm has the same drawback as the classical FCM algorithm; i.e., it is very sensitive to noise and outliers. Inspired by the existing robust FLICM algorithm [21], this paper further proposes the harmonic fuzzy local information C-means (HLICM) algorithm to solve the above problem. The optimization model of the robust harmonic fuzzy algorithm for image segmentation is constructed as follows:
$$\min J_{HLICM}(H,V)=\sum_{k=1}^{c}\sum_{i=1}^{n}h_{ki}^{m}\left[d^{2}(x_i,v_k)+H_{ki}\right]\qquad(31)$$
s.t. (a) $h_{ki}\in(1,+\infty),\ \forall i,k$; (b) $1\big/\sum_{k=1}^{c}(1/h_{ki})=1,\ \forall i$; (c) $0<\sum_{i=1}^{n}(1/h_{ki})<n,\ \forall k$,
where $m$ is the fuzzy weighting exponent and $h_{ki}$ represents the harmonic fuzzy membership of each sample belonging to the different categories. The harmonic fuzzy local information factor is defined as
$$H_{ki}=\sum_{j\in N_i,j\neq i}\frac{1}{d_{ji}+1}\left(h_{kj}/(h_{kj}-1)\right)^{m}d^{2}(x_j,v_k)\qquad(32)$$
where $h_{kj}/(h_{kj}-1)$ is the standard complement of the harmonic fuzzy membership $h_{kj}$. Obviously, the equation $\frac{1}{h_{kj}}+\frac{1}{h_{kj}/(h_{kj}-1)}=1$ holds.
Compared with the FLICM algorithm, the proposed algorithm replaces the Zadeh fuzzy set in the model with a harmonic fuzzy set, which establishes a harmonic fuzzy local information C-means clustering algorithm, and therefore, the above model is reasonable.
Using the Lagrange multiplier method, the unconstrained objective function is constructed as follows:
$$J_{HLICM}(H,V,\lambda)=\sum_{k=1}^{c}\sum_{i=1}^{n}h_{ki}^{m}\left[d^{2}(x_i,v_k)+H_{ki}\right]+\sum_{i=1}^{n}\lambda_i\left(\sum_{k=1}^{c}1/h_{ki}-1\right)\qquad(33)$$
We take the partial derivatives of the objective function with respect to $h_{ki}$, $v_k$, and $\lambda_i$ and set them equal to zero.
1. Solution for $h_{ki}$:
$$\frac{\partial J}{\partial h_{ki}}=mh_{ki}^{m-1}\left[d^{2}(x_i,v_k)+H_{ki}\right]-\lambda_i h_{ki}^{-2}=0\qquad(34)$$
$$\frac{\partial J}{\partial\lambda_i}=\sum_{k=1}^{c}h_{ki}^{-1}-1=0\qquad(35)$$
From Equation (34), it follows that
$$h_{ki}=\left(\frac{\lambda_i}{m\left[d^{2}(x_i,v_k)+H_{ki}\right]}\right)^{\frac{1}{m+1}}\qquad(36)$$
Substituting Equation (36) into Equation (35), we can obtain
$$\sum_{k=1}^{c}\left(\lambda_i/m\right)^{-1/(m+1)}\left(d^{2}(x_i,v_k)+H_{ki}\right)^{1/(m+1)}=1\qquad(37)$$
In this way, we get
$$\left(\frac{\lambda_i}{m}\right)^{1/(m+1)}=\sum_{k=1}^{c}\left(d^{2}(x_i,v_k)+H_{ki}\right)^{1/(m+1)}\qquad(38)$$
Substituting Equation (38) into Equation (36), we obtain
$$h_{ki}=\frac{\sum_{l=1}^{c}\left(d^{2}(x_i,v_l)+H_{li}\right)^{\frac{1}{m+1}}}{\left(d^{2}(x_i,v_k)+H_{ki}\right)^{\frac{1}{m+1}}}\qquad(39)$$
2. Solution for $V$:
According to $\partial L/\partial v_k=0$, we can obtain
$$v_k=\frac{\sum_{i=1}^{n}h_{ki}^{m}\left(x_i+\sum_{j\in N_i,j\neq i}\frac{1}{d_{ji}+1}\left(h_{kj}/(h_{kj}-1)\right)^{m}x_j\right)}{\sum_{i=1}^{n}h_{ki}^{m}\left(1+\sum_{j\in N_i,j\neq i}\frac{1}{d_{ji}+1}\left(h_{kj}/(h_{kj}-1)\right)^{m}\right)}\qquad(40)$$
To decrease the computational complexity, this clustering center is often approximated by the following equation:
$$v_k=\sum_{i=1}^{n}h_{ki}^{m}x_i\Big/\sum_{i=1}^{n}h_{ki}^{m}\qquad(41)$$
Equations (39) and (41) are the iterative formulas of the harmonic fuzzy membership and the clustering center, respectively, and they solve Equation (31) by alternately iterating between the harmonic fuzzy partition matrix and the clustering centers. The specific procedure, Algorithm 1, is described as follows:
Algorithm 1 Harmonic fuzzy local information C-means clustering
Input: Dataset $X=\{x_1,x_2,\dots,x_n\}$, where $n$ is the number of samples.
Output: Harmonic fuzzy partition matrix $H=[h_{ki}]_{c\times n}$ and the clustering center matrix $V$.
Initialization: Set the algorithm stopping error $\varepsilon$, set the maximum number of iterations $T$, initialize the clustering centers $v_k^{(0)}$, set the iteration counter $t=0$, and set the size of the neighborhood window $w\times w$.
Repeat: Calculate the harmonic fuzzy membership degrees $h_{ki}^{(t)}$ using Equation (39);
  Calculate the clustering centers $v_k^{(t+1)}$ using Equation (41);
  If $\|V^{(t+1)}-V^{(t)}\|<\varepsilon$ or the number of iterations $t>T$, the algorithm ends; otherwise, set $t=t+1$.
End: Algorithm is over.
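For readers who prefer code to pseudocode, the following Python sketch mirrors Algorithm 1 on a grayscale image. It is an illustrative reading, not the authors’ implementation: borders are handled by wrap-around shifts for brevity, the initialization H = c is one feasible choice (it satisfies $\sum_k 1/h_{ki}=1$), and a larger $h_{ki}$ is treated as stronger class membership when labeling.

```python
# A sketch of Algorithm 1 (HLICM): alternate Eq. (39) and Eq. (41) with the
# local factor H_ki of Eq. (32) over a fixed 3x3 neighborhood. Wrap-around
# borders and the feasible start H = c are simplifying assumptions.
import numpy as np

def hlicm(img, c, m=2.0, T=100, eps=1e-4, seed=0):
    rows, cols = img.shape
    X = img.astype(float).ravel()
    rng = np.random.default_rng(seed)
    V = np.sort(rng.choice(X, size=c, replace=False))   # initial centers
    H = np.full((c, X.size), float(c))                  # sum_k 1/h_ki = 1
    for t in range(T):
        comp = (H / (H - 1.0)) ** m                     # (h/(h-1))^m term
        Hk = np.zeros((c, X.size))
        for k in range(c):
            g = comp[k].reshape(rows, cols)
            acc = np.zeros_like(g)
            for dr in (-1, 0, 1):                       # 3x3 window, j != i
                for dc in (-1, 0, 1):
                    if dr == dc == 0:
                        continue
                    d_ji = np.hypot(dr, dc)
                    gj = np.roll(np.roll(g, dr, 0), dc, 1)
                    xj = np.roll(np.roll(img.astype(float), dr, 0), dc, 1)
                    acc += gj * (xj - V[k]) ** 2 / (d_ji + 1.0)
            Hk[k] = acc.ravel()
        e = (X[None, :] - V[:, None]) ** 2 + Hk + 1e-12
        p = e ** (1.0 / (m + 1.0))
        H = p.sum(axis=0, keepdims=True) / p            # Eq. (39)
        W = H ** m
        V_new = (W @ X) / W.sum(axis=1)                 # Eq. (41), approx.
        if np.abs(V_new - V).max() < eps:
            V = V_new
            break
        V = V_new
    labels = H.argmax(axis=0).reshape(rows, cols)       # strongest membership
    return labels, V
```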

3.3.2. Harmonic Fuzzy Clustering with Weighted Local Information

However, because the segmentation results and noise resistance of the HLICM algorithm are not as good as expected, this paper, inspired by the KWFLICM algorithm [23], introduces a weighted local coefficient $w_{ji}$ into the HLICM algorithm to construct a new robust fuzzy local clustering algorithm, IHLICM, with good noise immunity and segmentation performance. The optimization model of the IHLICM algorithm is constructed below:
$$\min J_{IHLICM}(H,V)=\sum_{k=1}^{c}\sum_{i=1}^{n}h_{ki}^{m}\left[d^{2}(x_i,v_k)+H_{ki}\right]$$
s.t. (a) $h_{ki}\in(1,+\infty),\ \forall i,k$; (b) $1\big/\sum_{k=1}^{c}(1/h_{ki})=1,\ \forall i$; (c) $0<\sum_{i=1}^{n}(1/h_{ki})<n,\ \forall k$,
where $m$ is the fuzzy weighting exponent, which defaults to the range of [5, 10], and $h_{ki}$ denotes the harmonic fuzzy membership of each sample belonging to the different categories.
The weighted harmonic fuzzy local information factor $H_{ki}$ is defined as follows:
$$H_{ki}=\sum_{j\in N_i,j\neq i}w_{ji}\left(h_{kj}/(h_{kj}-1)\right)^{m}d^{2}(x_j,v_k)$$
Using the least squares method to solve this objective function, the updated formulas for the harmonic fuzzy membership and the clustering center are obtained as follows:
$$h_{ki}=\frac{\sum_{l=1}^{c}\left(d^{2}(x_i,v_l)+H_{li}\right)^{1/(m+1)}}{\left(d^{2}(x_i,v_k)+H_{ki}\right)^{1/(m+1)}}$$
$$v_k=\frac{\sum_{i=1}^{n}h_{ki}^{m}\left(x_i+\sum_{j\in N_i,j\neq i}w_{ji}\left(h_{kj}/(h_{kj}-1)\right)^{m}x_j\right)}{\sum_{i=1}^{n}h_{ki}^{m}\left(1+\sum_{j\in N_i,j\neq i}w_{ji}\left(h_{kj}/(h_{kj}-1)\right)^{m}\right)}$$
To reduce the computational complexity, this clustering center is often approximated by the following formula:
$$v_k=\sum_{i=1}^{n}h_{ki}^{m}x_i\Big/\sum_{i=1}^{n}h_{ki}^{m}$$

3.3.3. Robust Kernelized Harmonic Fuzzy Clustering with Local Information

The two robust HCM algorithms show significant improvements in noise resistance and segmentation performance, but there are still shortcomings compared to the KWFLICM algorithm. Therefore, inspired once again by the KWFLICM algorithm, this paper proposes a novel high-performance KWHLICM algorithm. The optimization model of the KWHLICM algorithm is constructed as follows:
$$\min J_{KWHLICM}(H,V)=\sum_{k=1}^{c}\sum_{i=1}^{n}h_{ki}^{m}\left[\left(1-K(x_i,v_k)\right)+H_{ki}\right]$$
s.t. (a) $h_{ki}\in(1,+\infty),\ \forall i,k$; (b) $1\big/\sum_{k=1}^{c}(1/h_{ki})=1,\ \forall i$; (c) $0<\sum_{i=1}^{n}(1/h_{ki})<n,\ \forall k$,
where $m$ is the fuzzy weighting exponent, which defaults to the range of [5, 10], $h_{ki}$ denotes the harmonic fuzzy membership of each sample belonging to the different categories, and the kernel-weighted harmonic fuzzy local information factor $H_{ki}$ is defined as follows:
$$H_{ki}=\sum_{j\in N_i,j\neq i}w_{ji}\left(h_{kj}/(h_{kj}-1)\right)^{m}\left(1-K(x_j,v_k)\right)$$
where $K(x_j,v_k)=\exp\left(-d^{2}(x_j,v_k)/\sigma^{2}\right)$ and $\sigma$ is the scale parameter of the Gaussian function.
The weighted local coefficient $w_{ji}$ in [23] consists of spatial information $w_{sc}$ and neighborhood gray-scale information $w_{gc}$. The least squares method is used to solve this objective function, giving the updated formulas for the harmonic membership and the clustering center, respectively, as follows:
$$h_{ki}=\frac{\sum_{l=1}^{c}\left(\left(1-K(x_i,v_l)\right)+H_{li}\right)^{1/(m+1)}}{\left(\left(1-K(x_i,v_k)\right)+H_{ki}\right)^{1/(m+1)}}$$
$$v_k=\frac{\sum_{i=1}^{n}h_{ki}^{m}\left(K(x_i,v_k)x_i+\sum_{j\in N_i,j\neq i}w_{ji}\left(h_{kj}/(h_{kj}-1)\right)^{m}K(x_j,v_k)x_j\right)}{\sum_{i=1}^{n}h_{ki}^{m}\left(K(x_i,v_k)+\sum_{j\in N_i,j\neq i}w_{ji}\left(h_{kj}/(h_{kj}-1)\right)^{m}K(x_j,v_k)\right)}$$
To reduce the computational complexity, this clustering center is often approximated by the following formula:
$$v_k=\sum_{i=1}^{n}h_{ki}^{m}K(x_i,v_k)x_i\Big/\sum_{i=1}^{n}h_{ki}^{m}K(x_i,v_k)$$
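The kernelized updates differ from HLICM only in replacing squared distances by $1-K(\cdot,\cdot)$; the short sketch below (assumed names; the weights $w_{ji}$ and factors $H_{ki}$ are taken as precomputed) shows the Gaussian kernel and the resulting membership update.

```python
# KWHLICM building blocks: the Gaussian kernel and the harmonic membership
# update with kernel-induced distance 1 - K; Hk (the weighted local factors)
# is assumed to be precomputed. An illustrative sketch only.
import numpy as np

def gaussian_kernel(d2, sigma):
    return np.exp(-d2 / sigma ** 2)          # K(x, v) = exp(-d^2 / sigma^2)

def kwhlicm_membership(d2, Hk, sigma, m=2.0):
    # d2, Hk: (c, n) arrays of squared distances and local factors
    e = (1.0 - gaussian_kernel(d2, sigma)) + Hk
    p = (e + 1e-12) ** (1.0 / (m + 1.0))
    return p.sum(axis=0, keepdims=True) / p  # columns satisfy sum_k 1/h = 1
```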
Similar to the KWFLICM algorithm, the proposed KWHLICM algorithm also combines neighborhood spatial information and grayscale features to establish a new fuzzy local information factor, which greatly improves the robustness of the algorithm to noise. The KWHLICM algorithm was tested on a noisy image, with the dynamic iteration process for image segmentation shown in Figure 2.
As shown in Figure 3, a 3 × 3 block of neighborhood pixels is selected from the noisy image. The iterative changes of the fuzzy memberships and the clustering centers intuitively reflect that the proposed algorithm is a two-level alternating iterative algorithm. In a neighborhood of the noisy image, we identify normal pixels with grayscale values of 118, 120, 124, and 140, and noisy pixels with grayscale values of 0, 26, 208, and 230. In the first iteration, the memberships of the noise pixels are relatively low; after several iterations of the KWHLICM algorithm, the noisy pixels and the surrounding pixels have similar memberships. The iteration results show that the proposed KWHLICM algorithm can effectively suppress the effect of noise on the fuzzy clustering algorithm.
In summary, these two improved algorithms have a similar process to the HLICM algorithm; they are not repeated here due to the limited length of this paper.

3.4. Algorithm Convergence Analysis

Only the convergence of alternating iterative algorithms can ensure their effectiveness and rationality. So far, the related algorithms for fuzzy C-means clustering with constraints as additive expressions have been rigorously proved to be locally convergent and have been widely used in pattern analysis and data mining [50,51]. Saha and Das [52] provided a general convergence analysis method for iterative algorithms using Zangwill’s theorem. This paper uses this method to analyze the convergence of the proposed HLICM algorithm. If the objective function of the HLICM algorithm is decreasing, it indicates that the algorithm is convergent.
Considering the limited length of this paper, the proof of the continuity and compactness conditions for the HLICM algorithm is the same as that for the FCM algorithm [53,54,55], and the IHLICM and KWHLICM algorithms are similar to the HLICM method. Therefore, in this article, we only prove that the objective function of the HLICM algorithm is a decreasing function, thus omitting the proof of the other two conditions in the theorem and the subsequent proofs for the improved algorithms.
Theorem 1.
[56] (Lagrange’s theorem). Let the functions $f:D_f\to\mathbb{R}$, $D_f\subseteq\mathbb{R}^{N}$, and $g_i:D_{g_i}\to\mathbb{R}$, $D_{g_i}\subseteq\mathbb{R}^{N}$, $i=1,2,\dots,t$, $t<N$, be continuously partially differentiable and let $x^{0}=(x_1^{0},\dots,x_N^{0})\in D_f$ be a local extreme point of the function $f$ subject to the constraints $g_i(x_1,\dots,x_N)=0$, $i=1,2,\dots,t$. Define $L(x;\lambda)=f(x_1,\dots,x_N)+\sum_{i=1}^{t}\lambda_i g_i(x_1,\dots,x_N)$ and assume that $\det\begin{pmatrix}\frac{\partial g_1(x)}{\partial x_1}&\cdots&\frac{\partial g_1(x)}{\partial x_t}\\ \vdots&\ddots&\vdots\\ \frac{\partial g_t(x)}{\partial x_1}&\cdots&\frac{\partial g_t(x)}{\partial x_t}\end{pmatrix}\neq 0$ at the point $x^{0}$. Then the gradient of $L(x;\lambda)$ at the point $(x^{0},\lambda^{0})$ is 0; i.e., $\nabla L(x^{0};\lambda^{0})=0$.
Theorem 2.
[56] (local sufficient conditions). Let the functions $f:D_f\to\mathbb{R}$, $D_f\subseteq\mathbb{R}^{N}$, and $g_i:D_{g_i}\to\mathbb{R}$, $D_{g_i}\subseteq\mathbb{R}^{N}$, $i=1,2,\dots,t$, $t<N$, be continuously partially differentiable and let $x^{0}=(x_1^{0},\dots,x_N^{0})\in D_f$ be a solution of the system $\nabla L(x^{0};\lambda^{0})=0$. Let
$$H_L(x;\lambda)=\begin{pmatrix}0&\cdots&0&\frac{\partial^{2}L}{\partial\lambda_1\partial x_1}&\cdots&\frac{\partial^{2}L}{\partial\lambda_1\partial x_N}\\ \vdots&\ddots&\vdots&\vdots&&\vdots\\ 0&\cdots&0&\frac{\partial^{2}L}{\partial\lambda_t\partial x_1}&\cdots&\frac{\partial^{2}L}{\partial\lambda_t\partial x_N}\\ \frac{\partial^{2}L}{\partial x_1\partial\lambda_1}&\cdots&\frac{\partial^{2}L}{\partial x_1\partial\lambda_t}&\frac{\partial^{2}L}{\partial x_1\partial x_1}&\cdots&\frac{\partial^{2}L}{\partial x_1\partial x_N}\\ \vdots&&\vdots&\vdots&\ddots&\vdots\\ \frac{\partial^{2}L}{\partial x_N\partial\lambda_1}&\cdots&\frac{\partial^{2}L}{\partial x_N\partial\lambda_t}&\frac{\partial^{2}L}{\partial x_N\partial x_1}&\cdots&\frac{\partial^{2}L}{\partial x_N\partial x_N}\end{pmatrix}$$
be the bordered Hessian matrix and consider its leading principal minors $\bar{H}_r(x^{0};\lambda^{0})$ of order $r=2t+1,2t+2,\dots,N+t$ at the point $(x^{0},\lambda^{0})$. Therefore, the following expressions can be derived:
(1) If all leading principal minors $\bar{H}_r(x^{0};\lambda^{0})$, $2t+1\le r\le N+t$, have the sign $(-1)^{t}$, then $x^{0}=(x_1^{0},\dots,x_N^{0})$ is a local minimum point of the function $f$ subject to the constraints $g_i(x)=0$, $i=1,2,\dots,t$.
(2) If the signs of the leading principal minors $\bar{H}_r(x^{0};\lambda^{0})$, $2t+1\le r\le N+t$, alternate, with the sign of $\bar{H}_{N+t}(x^{0};\lambda^{0})=\bar{H}_L(x^{0};\lambda^{0})$ being that of $(-1)^{N}$, then $x^{0}=(x_1^{0},\dots,x_N^{0})$ is a local maximum point of the function $f$ subject to the constraints $g_i(x)=0$, $i=1,2,\dots,t$.
(3) If neither the conditions of (1) nor those of (2) are satisfied, then $x^{0}$ is not a local extreme point of the function $f$ subject to the constraints $g_i(x)=0$, $i=1,2,\dots,t$. Here, the case in which one or several leading principal minors have a value of zero is not considered a violation of condition (1) or (2).
For the optimization model of the HLICM algorithm, using the constraint $1=1\big/\sum_{k=1}^{c}(1/h_{ki})$, the final unconstrained objective function is constructed as follows:
$$J_{HLICM}(H,V,\lambda)=\sum_{k=1}^{c}\sum_{i=1}^{n}h_{ki}^{m}\Big[d^{2}(x_i,v_k)+\sum_{j\in N_i,j\neq i}\frac{1}{d_{ji}+1}\left(h_{kj}/(h_{kj}-1)\right)^{m}d^{2}(x_j,v_k)\Big]+\sum_{i=1}^{n}\lambda_i\Big(\sum_{k=1}^{c}\frac{1}{h_{ki}}-1\Big)\qquad(52)$$
where $H_{ki}=\sum_{j\in N_i,j\neq i}\frac{1}{d_{ji}+1}\left(h_{kj}/(h_{kj}-1)\right)^{m}d^{2}(x_j,v_k)$.
If $J_{HLICM}(H,V,\lambda)$ is a decreasing function, then the following propositions should hold true.
Proposition 1.
Given a clustering center matrix $V$, let $F(H)=J_{HLICM}(H,V,\lambda)$. Then $h_{ki}^{*}$ is a local minimum of the function $F(H)$ if and only if $h_{ki}^{*}$ is computed by Equation (39).
Proof. 
Given $V$, if it is assumed that the minimum point of $F(H)$ is $h^{*}$, then
$$\frac{\partial F(H)}{\partial h_{ki}}=mh_{ki}^{m-1}\left[d^{2}(x_i,v_k)+H_{ki}\right]-\lambda_i h_{ki}^{-2}=0\qquad(53)$$
$$\lambda_i=mh_{ki}^{m+1}\left[d^{2}(x_i,v_k)+H_{ki}\right]\qquad(54)$$
$$h_{ki}=\left(\lambda_i\big/\left(m\left[d^{2}(x_i,v_k)+H_{ki}\right]\right)\right)^{\frac{1}{m+1}}\qquad(55)$$
Finally, we can obtain
$$h_{ki}=h_{ki}^{*}=\frac{\sum_{l=1}^{c}\left(d^{2}(x_i,v_l)+H_{li}\right)^{\frac{1}{m+1}}}{\left(d^{2}(x_i,v_k)+H_{ki}\right)^{\frac{1}{m+1}}}\qquad(56)$$
Subsequently, we find the Hessian matrix of the function $F(H)$ at $h_{ki}=h_{ki}^{*}$:
$$\frac{\partial}{\partial h_{ki}}\left(\frac{\partial F(H)}{\partial h_{ki}}\right)=m(m-1)h_{ki}^{m-2}\left(d^{2}(x_i,v_k)+H_{ki}\right)+\frac{2\left(mh_{ki}^{m+1}\left(d^{2}(x_i,v_k)+H_{ki}\right)\right)}{h_{ki}^{3}}=m(m+1)h_{ki}^{m-2}\left(d^{2}(x_i,v_k)+H_{ki}\right)\qquad(57)$$
As a result, we obtain
$$\frac{\partial}{\partial h_{ab}}\left(\frac{\partial F(H)}{\partial h_{ki}}\right)=\begin{cases}0,&a\neq k\ \text{or}\ b\neq i\\ m(m+1)h_{ki}^{m-2}\left(d^{2}(x_i,v_k)+H_{ki}\right),&a=k\ \text{and}\ b=i\end{cases}\qquad(58)$$
According to Equation (58), this Hessian matrix is positive definite when $m>1$. Moreover, for the HLICM algorithm, the model makes sense for any $m>0$, and since $m(m+1)h_{ki}^{m-2}\left(d^{2}(x_i,v_k)+H_{ki}\right)>0$ for all $m>0$, this Hessian matrix is positive definite when $m$ takes any real number greater than 0.
At the same time, we know that the sufficient condition for determining a strict minimum of the objective function is to analyze the Jacobian and Hessian matrices. Therefore, in addition to the above analysis, we also need to evaluate the bordered Hessian matrix [56]. Given the clustering center matrix $V$, the bordered Hessian matrix of $h_i=[h_{1i},h_{2i},\dots,h_{ci}]$ and $\lambda_i$ is as follows:
$$H_L(h_i,\lambda_i)=\begin{pmatrix}0&\frac{\partial^{2}L}{\partial\lambda_i\partial h_{1i}}&\cdots&\frac{\partial^{2}L}{\partial\lambda_i\partial h_{ci}}\\ \frac{\partial^{2}L}{\partial\lambda_i\partial h_{1i}}&\frac{\partial^{2}L}{\partial h_{1i}\partial h_{1i}}&\cdots&\frac{\partial^{2}L}{\partial h_{1i}\partial h_{ci}}\\ \vdots&\vdots&\ddots&\vdots\\ \frac{\partial^{2}L}{\partial\lambda_i\partial h_{ci}}&\frac{\partial^{2}L}{\partial h_{ci}\partial h_{1i}}&\cdots&\frac{\partial^{2}L}{\partial h_{ci}\partial h_{ci}}\end{pmatrix}\qquad(59)$$
Since $\frac{\partial^{2}L}{\partial\lambda_i\partial h_{ki}}=-\frac{1}{h_{ki}^{2}}$, $1\le k\le c$, and the off-diagonal entries of the $h$-block vanish, the leading principal minors of the matrix $H_L(h_i,\lambda_i)$ are as follows:
$$\bar{H}_3(h_i,\lambda_i)=\det\begin{pmatrix}0&\frac{\partial^{2}L}{\partial\lambda_i\partial h_{1i}}&\frac{\partial^{2}L}{\partial\lambda_i\partial h_{2i}}\\ \frac{\partial^{2}L}{\partial\lambda_i\partial h_{1i}}&\frac{\partial^{2}L}{\partial h_{1i}\partial h_{1i}}&\frac{\partial^{2}L}{\partial h_{1i}\partial h_{2i}}\\ \frac{\partial^{2}L}{\partial\lambda_i\partial h_{2i}}&\frac{\partial^{2}L}{\partial h_{2i}\partial h_{1i}}&\frac{\partial^{2}L}{\partial h_{2i}\partial h_{2i}}\end{pmatrix}=-\left(\frac{\partial^{2}L}{\partial\lambda_i\partial h_{2i}}\right)^{2}\frac{\partial^{2}L}{\partial h_{1i}\partial h_{1i}}-\left(\frac{\partial^{2}L}{\partial\lambda_i\partial h_{1i}}\right)^{2}\frac{\partial^{2}L}{\partial h_{2i}\partial h_{2i}}\Bigg|_{h_i=h_i^{*},\lambda_i=\lambda_i^{*}}<0\qquad(60)$$
$$\bar{H}_4(h_i,\lambda_i)=-\left(\sum_{k=1}^{3}\left(\frac{\partial^{2}L}{\partial\lambda_i\partial h_{ki}}\right)^{2}\prod_{k_1=1,k_1\neq k}^{3}\frac{\partial^{2}L}{\partial h_{k_1 i}\partial h_{k_1 i}}\right)\Bigg|_{h_i=h_i^{*},\lambda_i=\lambda_i^{*}}<0\qquad(61)$$
$$\bar{H}_{c+1}(h_i,\lambda_i)=-\left(\sum_{k=1}^{c}\left(\frac{\partial^{2}L}{\partial\lambda_i\partial h_{ki}}\right)^{2}\prod_{k_1=1,k_1\neq k}^{c}\frac{\partial^{2}L}{\partial h_{k_1 i}\partial h_{k_1 i}}\right)\Bigg|_{h_i=h_i^{*},\lambda_i=\lambda_i^{*}}<0\qquad(62)$$
According to the bordered Hessian matrix theorem above (local sufficient conditions), since $t=1$ here and all leading principal minors have the sign $(-1)^{1}$, $h_{ki}^{*}=\frac{\sum_{l=1}^{c}\left(d^{2}(x_i,v_l)+H_{li}\right)^{\frac{1}{m+1}}}{\left(d^{2}(x_i,v_k)+H_{ki}\right)^{\frac{1}{m+1}}}$ is a local minimum point under the condition $1=1\big/\sum_{k=1}^{c}(1/h_{ki}),\ \forall i$. □
Proposition 2.
Given $H=[h_{ki}]_{c\times n}$, let $F(V)=J_{HLICM}(H,V,\lambda)$. Then $v_k^{*}$ is a local minimum of the function $F(V)$ if and only if $v_k^{*}$ is solved from Equation (40).
Proof. 
Given $H$, if it is assumed that $v_k^{*}$ is a minimum of the function $F(V)$, then
$$\frac{\partial F(V)}{\partial v_k}=-2\sum_{i=1}^{n}h_{ki}^{m}\left[(x_i-v_k)+\sum_{j\in N_i,j\neq i}\frac{1}{d_{ji}+1}\left(h_{kj}/(h_{kj}-1)\right)^{m}(x_j-v_k)\right]=0\qquad(63)$$
$$v_k=\frac{\sum_{i=1}^{n}h_{ki}^{m}\left(x_i+\sum_{j\in N_i,j\neq i}\frac{1}{d_{ji}+1}\left(h_{kj}/(h_{kj}-1)\right)^{m}x_j\right)}{\sum_{i=1}^{n}h_{ki}^{m}\left(1+\sum_{j\in N_i,j\neq i}\frac{1}{d_{ji}+1}\left(h_{kj}/(h_{kj}-1)\right)^{m}\right)}\qquad(64)$$
Finally, we obtain the Hessian matrix of the function $F(V)$ at $V=V^{*}$:
$$\frac{\partial}{\partial v_l}\left(\frac{\partial F(V)}{\partial v_k}\right)=\begin{cases}0,&l\neq k\\ 2\sum_{i=1}^{n}h_{ki}^{m}\left[1+\sum_{j\in N_i,j\neq i}\frac{1}{d_{ji}+1}\left(h_{kj}/(h_{kj}-1)\right)^{m}\right],&l=k\end{cases}\qquad(65)$$
According to Equation (65), all diagonal elements of this Hessian matrix are greater than 0 and all off-diagonal elements are 0. Therefore, this Hessian matrix is positive definite. □
Through the previous deduction, this paper can conclude that the objective function of the HLICM algorithm decreases accordingly. In addition, the proof of the proposed algorithms with continuity and compactness constraints is similar to the proof of the FCM algorithm [54,55]. Overall, according to Zangwill’s theorem, the proposed HLICM algorithm is locally convergent. Since the IHLICM and KWHLICM algorithms are improved algorithms of HLICM and their derivation principles are similar to those of the HLICM algorithm, after proving that the HLICM algorithm is locally convergent, we can also show that the two improved algorithms proposed in this paper are also locally convergent, which can be proved in the same way as above.

4. Experimental Results and Analysis

To verify the effectiveness of the proposed algorithms, we selected grayscale images and color natural images for segmentation testing and compared the results of the proposed algorithms with those of the following algorithms: ARFCM [57], FLICMLNLI [58], FCM_VMF [59], DSFCM_N [60], KWFLICM [23], PFLSCM [61], FCM_SICM [62], FSC_LNML [63], and FLICM [21]. In addition, Gaussian noise, Rician noise, speckle noise, and salt and pepper noise were added to these images, and the segmentation results of the noisy images were visually inspected for evaluation. We also used many evaluation metrics, such as accuracy (Acc), sensitivity (Sen), the Jaccard coefficient, segmentation accuracy (SA), the Kappa coefficient (Kappa), mean intersection over union (mIoU), peak signal-to-noise ratio (PSNR), and the Dice similarity coefficient (DICE), to objectively evaluate the segmentation results. During the testing process, the local neighborhood window size of the HLICM algorithm was 3 × 3, with a fuzzy weighting exponent of 2.0. The local neighborhood window size of the IHLICM and KWHLICM algorithms was 3 × 3, with fuzzy weighting exponents in the range of [5, 10]. All experiments were run in Matlab R2021b under Windows 11 on a hardware platform with an i7 CPU at 2.70 GHz and 16.0 GB of RAM. For the comparison algorithms, the local neighborhood window sizes of the ARFCM, FLICMLNLI, FCM_VMF, DSFCM_N, KWFLICM, PFLSCM, FCM_SICM, FSC_LNML, and FLICM algorithms were 5 × 5, 3 × 3, 4 × 4, 3 × 3, 3 × 3, 3 × 3, 7 × 7, 7 × 7, and 3 × 3, respectively. The number of clusters used by each fuzzy clustering algorithm for image segmentation was determined according to the Matlab-based clustering validity index toolbox [64]. For convenience of description, this section uses $\mu$ to denote the mean of Gaussian noise, $\sigma_n^2$ to represent the normalized variance of Gaussian noise, $p$ to denote the intensity level of salt and pepper noise, and $\sigma$ to denote the standard deviation of Rician noise. For example, Gaussian noise with a mean of 0 and a normalized variance of 0.1 is denoted GN(0, 0.1), salt and pepper noise with an intensity of 30% is denoted SPN(0.3), speckle noise with a normalized variance of 0.2 is denoted SN(0.2), and Rician noise with a standard deviation of 80 is denoted RN(80).

4.1. Evaluation Indicators

4.1.1. Peak Signal-to-Noise Ratio (PSNR) [65]

Peak signal-to-noise ratio (PSNR) is an indicator used to measure the quality of an image or video. It is usually used to compare the difference between the original image and a compressed or processed image. PSNR is calculated using the following formula:
$$PSNR=10\log_{10}\left(\frac{MAX^{2}}{MSE}\right)$$
$$MSE=\frac{1}{MN}\sum_{x=0}^{M-1}\sum_{y=0}^{N-1}\left[I(x,y)-O(x,y)\right]^{2}$$
where M A X represents the maximum possible value of image pixels and M S E is the mean squared error (MSE), which represents the square of the average difference in pixel values between the original image and the processed image. The higher the PSNR value, the better the image quality, as this means that the difference between the original image and the processed image is smaller.
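A direct translation of the two formulas (a sketch; MAX = 255 is assumed for 8-bit grayscale images):

```python
# PSNR/MSE as defined above; MAX = 255 assumed for 8-bit grayscale images.
import numpy as np

def psnr(I, O, max_val=255.0):
    mse = np.mean((I.astype(float) - O.astype(float)) ** 2)
    return 10.0 * np.log10(max_val ** 2 / mse)
```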

4.1.2. Segmentation Accuracy (SA) [57]

Image segmentation is the process of dividing an image into different regions or objects, and segmentation accuracy is the evaluation of the degree of similarity between the segmentation results and the ground truth.
$$SA=\frac{\sum_{j=1}^{c}|A_j\cap C_j|}{\sum_{l=1}^{c}|C_l|}$$
where $A_j$ denotes the set of pixels assigned to the $j$-th class by the algorithm and $C_j$ denotes the corresponding ground-truth set.

4.1.3. Mean Intersection over Union (mIoU) [66]

The mean intersection over union is an indicator used to evaluate the quality of image segmentation; it is the average of the IoU (intersection over union) across categories. The mIoU index integrates the degree of similarity between the segmentation results and the ground truth across all categories and is a commonly used indicator of segmentation accuracy.
$$mIoU=\frac{1}{c}\sum_{i=1}^{c}\frac{|A_i\cap C_i|}{|A_i\cup C_i|}$$
In image segmentation tasks, the higher the mIoU value, the closer the segmentation result is to the ground truth, which means that the algorithm has better segmentation performance in different categories.

4.1.4. Accuracy (Acc), Sensitivity (Sen), the Jaccard Coefficient, and the DICE [67]

TP denotes true positive, TN denotes true negative, FP denotes false positive, and FN denotes false negative. The accuracy value ranges from 0 to 1. The closer the value is to 1, the higher the accuracy of classification or segmentation.
Accuracy:
$$Acc=\frac{TP+TN}{TP+TN+FP+FN}$$
Sensitivity:
$$Sen=\frac{TP}{TP+FN}$$
Jaccard coefficient:
$$Jaccard=\frac{TP}{TP+FP+FN}$$
Dice similarity coefficient (DICE):
$$DICE=\frac{2TP}{2TP+FP+FN}$$
Accuracy is the ratio of the number of correctly classified or segmented samples to the total number of samples in a classification or segmentation task. Sensitivity is the proportion of true cases correctly identified as positive cases, also known as the true case rate. The Jaccard coefficient, also known as Intersection over Union (IoU), is used to evaluate the similarity between two sets. In segmentation tasks, the Jaccard coefficient index is often used to evaluate the degree of overlap between the segmentation results and the ground truth. The Dice similarity coefficient (DICE) is a commonly used indicator to evaluate the quality of image segmentation and is also widely used to assess the performance of medical image segmentation tasks.

4.1.5. The Kappa Coefficient [67]

The Kappa coefficient considers the consistency between the classification or segmentation results and the ground truth, avoiding the limitation of relying solely on superficial indicators such as accuracy to evaluate performance.
$$Kappa=\frac{p_o-p_e}{1-p_e}$$
where $p_o$ is the observed agreement between the segmentation result and the ground truth and $p_e$ is the expected agreement by chance.
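For completeness, the binary-segmentation versions of the above metrics can all be derived from one confusion matrix, as in the hedged sketch below; the two-class $p_o$/$p_e$ used for Kappa are the standard definitions, assumed here since the text leaves them implicit.

```python
# Confusion-matrix metrics for a binary segmentation: Acc, Sen, Jaccard,
# DICE, and Kappa. The two-class p_o/p_e for Kappa follow the standard
# definitions, assumed here since the text leaves them implicit.
import numpy as np

def segmentation_metrics(pred, gt):
    pred, gt = pred.astype(bool), gt.astype(bool)
    TP = np.sum(pred & gt);   TN = np.sum(~pred & ~gt)
    FP = np.sum(pred & ~gt);  FN = np.sum(~pred & gt)
    n = TP + TN + FP + FN
    acc = (TP + TN) / n
    sen = TP / (TP + FN)
    jaccard = TP / (TP + FP + FN)
    dice = 2 * TP / (2 * TP + FP + FN)
    p_o = acc
    p_e = ((TP + FP) * (TP + FN) + (FN + TN) * (FP + TN)) / n ** 2
    kappa = (p_o - p_e) / (1 - p_e)
    return dict(Acc=acc, Sen=sen, Jaccard=jaccard, DICE=dice, Kappa=kappa)
```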

4.2. Test and Analysis of Algorithm Robustness

4.2.1. Synthetic Images

To verify the robustness of the proposed algorithm, four synthetic images shown in Figure 3 were selected and various types of noise were added to these synthetic images for testing the proposed algorithm and all the comparison algorithms.
RN(80) was added to Figure 3a, GN(0, 0.1) was added to Figure 3b, SPN(0.2) was added to Figure 3c, and SN(0.2) was added to Figure 3d. These images with noise were segmented using various fuzzy clustering algorithms. The segmentation results are shown in Figure 4. Table 1 gives the evaluation metrics corresponding to the various algorithms.
From Figure 4, it can be seen that when the image contains high noise, the segmentation results of the FCM_VMF and FLICM algorithms are poor, indicating that they do not have good noise resistance. The ARFCM algorithm has good noise resistance, but over-segmentation may occur. The PFLSCM and HLICM algorithms cannot effectively suppress the noise, while the FLICMLNLI algorithm can suppress the noise but may also cause over-segmentation of the image. The FCM_SICM and FSC_LNML algorithms can suppress noise, but their results have uneven edges. The DSFCM_N algorithm is robust to salt and pepper noise but sensitive to the other types of noise. The KWFLICM and IHLICM algorithms can effectively suppress a large amount of noise, but their segmentation results are poor. The KWHLICM algorithm is more effective on the synthetic images and is more suitable than the other comparison algorithms. From Table 1, it can be seen that the evaluation indexes of the KWHLICM algorithm are all higher than those of the other algorithms; therefore, the KWHLICM algorithm is the best overall choice for segmenting images contaminated by noise.

4.2.2. Natural Images

To further test the performance of the algorithm, four natural images from BSDS500 [68] and VOC2010 [69] were selected for segmentation testing.
GN(0, 0.1) was added to Figure 5a,b and SPN(0.3) was added to Figure 5c,d. These images with noise were segmented using various fuzzy clustering algorithms. The segmentation results are shown in Figure 6. Table 2 gives the evaluation metrics corresponding to various algorithms.
From Figure 6, it can be seen that the ARFCM and KWFLICM algorithms can effectively suppress noise, but their segmentation results are not satisfactory. The FCM_VMF, DSFCM_N, and PFLSCM algorithms cannot effectively suppress noise. The FLICMLNLI algorithm can suppress most of the noise, but it is prone to over-segmentation. The FCM_SICM and FSC_LNML algorithms can suppress noise, but they are prone to losing detailed information in the image during segmentation. The HLICM and FLICM algorithms can suppress some noise, but their segmentation results are also unsatisfactory. The KWHLICM algorithm can suppress a large amount of noise, indicating its robustness, and its segmentation results are the best among all the algorithms. In summary, the KWHLICM algorithm has good application prospects for natural image segmentation.

4.2.3. Remote Sensing Images

To illustrate the effectiveness of the algorithm, four remote sensing images from the UC Merced Land Use dataset [70] were selected for segmentation testing.
SN(0.2) was added to Figure 7a,b, SPN(0.4) was added to Figure 7c,d, and these noisy images were segmented using various fuzzy clustering algorithms. The segmentation results are shown in Figure 8. Table 3 gives the evaluation metrics corresponding to the various algorithms.
From Figure 8, it can be seen that the ARFCM, KWFLICM, and FLICM algorithms can effectively suppress the noise, but their segmentation results are poor. The FCM_VMF, DSFCM_N, PFLSCM, and HLICM algorithms cannot effectively suppress the noise. The FLICMLNLI algorithm cannot effectively segment the image while suppressing the noise. The FCM_SICM and FSC_LNML algorithms can suppress noise, but they are prone to losing detailed information in the image during segmentation. The KWHLICM algorithm can suppress a large amount of noise, indicating its robustness, and its segmentation results are the best among all the algorithms. In addition, Table 3 shows that the proposed KWHLICM algorithm attains the highest values of all the evaluation indexes among all the algorithms. In summary, the KWHLICM algorithm has good application prospects for remote sensing image segmentation.

4.2.4. Medical Images

To illustrate the effectiveness of the algorithm, four MR images were selected from the Brain Tumor MRI Dataset [71] for segmentation testing.
RN(80) was added to Figure 9a, RN(90) was added to Figure 9b, and GN(0, 0.1) was added to Figure 9c,d. These images with noise were segmented using various fuzzy clustering algorithms. The segmentation results are shown in Figure 10. Table 4 gives the evaluation metrics corresponding to the various algorithms.
From Figure 10, it can be seen that the ARFCM, KWFLICM, and FLICMLNLI algorithms can effectively suppress noise, but their segmentation results are not satisfactory. The FCM_VMF, PFLSCM, HLICM, and DSFCM_N algorithms cannot effectively suppress noise. The FCM_SICM and FSC_LNML algorithms can suppress noise, but often lose the detailed information of the image during the segmentation process. The KWFLICM algorithm suppresses noise at the cost of a large amount of detail, which is undesirable. The KWHLICM algorithm can suppress a large amount of noise, indicating its robustness, and its segmentation result is the best among all the algorithms. From Table 4, all the evaluation indexes of the proposed KWHLICM algorithm are the highest among all the algorithms. In summary, the KWHLICM algorithm has good application prospects for MR image segmentation.

4.3. Testing and Analyzing the Effect of Noise Intensity on Algorithm Performance

Rician noise with a standard deviation of 50 to 80 was added to the medical image in Figure 9a. The noisy images were processed using the 12 fuzzy clustering algorithms, and the curves of the performance metrics of the different algorithms against the standard deviation of the Rician noise were obtained, as shown in Figure 11. The evaluation metrics of most of the algorithms vary little, and the FCM_VMF, HLICM, and ARFCM algorithms have low evaluation metrics on medical images with Rician noise. As the standard deviation of the Rician noise increases, the evaluation metrics of the proposed KWHLICM algorithm remain higher than those of all the comparison algorithms; the KWHLICM algorithm is therefore more stable and robust to changes in Rician noise.
Gaussian noise with different normalized variances was added to the synthetic image in Figure 3b. Segmentation tests were performed on these images using the 12 algorithms, and the curves of the evaluation metrics against the normalized variance of the Gaussian noise were obtained, as shown in Figure 12. For most of the algorithms, the evaluation metrics vary little with the noise; among them, the HLICM, FCM_VMF, and ARFCM algorithms have lower evaluation metrics. Most of the performance metrics of the proposed KWHLICM method are better than those of all the comparison algorithms. In addition, the performance curve of the KWHLICM algorithm changes slowly with the normalized variance of the Gaussian noise. Therefore, the proposed KWHLICM algorithm outperforms all the comparison algorithms in the presence of Gaussian noise.
Salt and pepper noise with different intensity levels was added to the natural image in Figure 5c, and the 12 fuzzy clustering algorithms were used to segment these noisy images; the curves of the evaluation metrics of the different algorithms against the intensity of the salt and pepper noise were obtained, as shown in Figure 13. Overall, the metrics of all the algorithms decrease as the intensity of the salt and pepper noise increases. When the noise intensity is below 0.21, the metrics of the PFLSCM algorithm are higher than those of the other algorithms, but as the noise increases beyond 0.21, all the evaluation metrics of the KWHLICM algorithm are higher than those of the other algorithms. In addition, the metrics of the ARFCM and FCM_VMF algorithms are very low. Therefore, the proposed KWHLICM algorithm outperforms all the comparison algorithms in the presence of salt and pepper noise.
Speckle noise with different normalized variances was added to the remote sensing image in Figure 7a, and the 12 fuzzy clustering algorithms were used to segment these noisy images; the curves of the evaluation metrics of the different algorithms against the intensity of the speckle noise were obtained, as shown in Figure 14. As the normalized variance of the speckle noise increases, most of the evaluation metrics of all the algorithms decrease. The evaluation indexes of the DSFCM_N algorithm are higher when the noise is below 0.05, but as the noise increases, the KWHLICM algorithm obtains higher evaluation values than all the comparison algorithms. The evaluation indexes of the HLICM and FCM_VMF algorithms are generally low. Overall, the segmentation performance of the proposed KWHLICM algorithm is minimally affected by the variation of the noise, and it is more robust to speckle noise than all the comparison algorithms.

4.4. Analysis and Testing of Algorithm Complexity

Algorithm complexity is an important indicator for measuring algorithm performance. Table 5 gives the computational complexity of all the algorithms in this paper, where γ is the number of pixels in the image, w is the size of the local neighborhood window under robust fuzzy clustering, c is the number of classes, t is the number of iterations for algorithm convergence, $w_1$ is the size of the local neighborhood window under Gaussian and bootstrap bilateral filtering, $w_2$ is the size of the local neighborhood window under non-local mean filtering, and $w_3$ is the size of the local search window under non-local mean filtering.
As shown in Table 5, the computational complexities of the ARFCM, FLICMLNLI, and PFLSCM algorithms are significantly higher than other compared algorithms. To verify the computational complexity of these algorithms, we tested and analyzed them using different images. By analyzing the time cost of these algorithms for processing noisy images, we confirmed the computational complexity of the algorithms covered in this paper. RN(80), GN(0, 0.1), SPN(0.3), and SN(0.2) were added to the two synthetic images in Figure 3b,d; SPN(0.3) and GN(0, 0.3) were added to the two natural images in Figure 5b,d; SPN(0.2) and SN(0.2) were added to the two remote sensing images in Figure 7a,c; and RN(80) and GN(0, 0.1) were added to two medical images in Figure 9a,d. Figure 15 shows the time cost bar chart of the different algorithms for these noisy images, where RN is Rician noise, SPN is salt and pepper noise, GN is Gaussian noise, and SN is speckle noise.
As shown in Figure 15, the ARFCM, FLICMLNLI and PFLSCM algorithms consume significantly more time in processing noisy images than the other algorithms, and all three algorithms are relatively ineffective in segmenting images. However, the proposed KWHLICM, HLICM, and IHLICM algorithms consume less time but still struggle to meet the requirements of large-scale real-time image processing. In the near future, we will study fast algorithms related to the proposed algorithms [72,73] to meet the requirements of real-time image processing.

4.5. Impact of Neighborhood Size on Algorithm Performance

To investigate the influence of the local neighborhood window size on the algorithms, we selected local neighborhood windows of 3 × 3, 5 × 5, 7 × 7, 9 × 9, and 11 × 11 to test images with various types of noise and analyzed the segmentation results to determine the impact of the window size on each algorithm.
GN(0, 0.1) was added to the synthetic image in Figure 3b, SPN(0.3) was added to the natural image in Figure 5d, SN(0.2) was added to the remote sensing image in Figure 7b, and RN(80) was added to the medical image in Figure 9a. The segmentation results of the three proposed algorithms, HLICM, IHLICM, and KWHLICM, are shown in Figure 16, Figure 17 and Figure 18, respectively.
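To make the role of the window concrete, the sketch below illustrates how a w × w neighborhood aggregates local gray-level information. This is only an analogy (a plain local mean), not the HLICM fuzzy factor itself, and the image is a random stand-in.

```python
import numpy as np
from scipy.ndimage import uniform_filter

img = np.random.default_rng(0).random((64, 64))  # stand-in for a noisy image

# Mean gray level over a w-by-w neighborhood of every pixel: a larger
# window averages more pixels, suppressing more noise but blurring
# more detail, which mirrors the trade-off observed below.
for w in (3, 5, 7, 9, 11):
    local_mean = uniform_filter(img, size=w, mode='reflect')
```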

4.5.1. Impact Testing of Neighborhood Window Size on HLICM

From Figure 16, it can be seen that even though the tested images contain only a small amount of noise, the segmentation results deteriorate as the neighborhood window size increases. From Table 6, it can be seen that the segmentation evaluation indexes of the tested images are highest when the neighborhood window size is 3 × 3. Therefore, the proposed HLICM algorithm obtains satisfactory segmentation results with a 3 × 3 neighborhood window.

4.5.2. Impact Testing of Window Size on IHLICM

From Figure 17, even though the tested images contain only a small amount of noise, the segmentation results deteriorate as the window size increases. From Table 7, it can be seen that the segmentation evaluation indexes of the tested images are highest when the neighborhood window size is 3 × 3. Therefore, the proposed IHLICM algorithm obtains satisfactory segmentation results with a 3 × 3 neighborhood window.

4.5.3. Impact Testing of Window Size on KWHLICM

From Figure 18, it can be seen that when the neighborhood window size is 3 × 3, the segmentation results of the remote sensing and medical images still contain a small amount of noise, which is undesirable, whereas with a 5 × 5 window the noise is suppressed and the segmentation results are satisfactory. As the neighborhood window size increases further, the algorithm's ability to suppress noise is enhanced, but some details are lost during segmentation. From Table 8, the segmentation evaluation indexes of the synthetic image are highest with a 3 × 3 window, while those of the natural, remote sensing, and medical images are highest with a 5 × 5 window. Therefore, the KWHLICM algorithm achieves satisfactory segmentation results with window sizes between 3 × 3 and 5 × 5.

4.6. Impact of the Fuzzy Weighting Exponent on Algorithm Performance

The fuzzy weighting exponent is an important parameter in fuzzy clustering algorithms that has a certain impact on clustering performance. The appropriate range of the fuzzy weighting exponent varies from algorithm to algorithm; in general, the fuzzy weighting exponent of FCM-related algorithms lies in [1.5, 2.5] [74]. For the proposed harmonic fuzzy C-means clustering algorithm, extensive experimental tests confirm that the fuzzy weighting exponent is reasonably selected within the interval [5, 10]. Therefore, the HLICM, IHLICM, and KWHLICM algorithms proposed in this paper select values within this interval for testing, and the impact of the fuzzy weighting exponent on segmentation performance is analyzed objectively.

4.6.1. Fuzzy Weighting Exponent in HCM

This paper selected four representative numerical datasets (https://github.com/milaan9/Clustering-Datasets) (accessed on 15 October 2023) for clustering testing. These numerical data included Iris with 150 samples and four features, Triangle1 with 1000 samples and two features, Seeds with 210 samples and seven features, and Wine with 178 samples and thirteen features. The common accuracy (ACC) and normalized mutual information (NMI) indexes are used to evaluate the clustering performance of the proposed HCM algorithm, and the test results are displayed in Figure 19.
As shown in Figure 19, when $m \geq 5$, the ACC and NMI values remain essentially unchanged as $m$ varies. Therefore, we conclude that the fuzzy weighting exponent in the HCM algorithm can take any value of 5.0 or above.
In addition, we also selected four different types of images for segmentation testing. These images included an MRI brain image (https://www.kaggle.com/preetviradiya/brian-tumor-dataset) (accessed on 15 October 2023), two MSRC natural images (https://www.microsoft.com/en-us/research/project/image-understanding) (accessed on 15 October 2023), and an uneven illumination image. The common accuracy (ACC) and Jaccard similarity coefficient (Jaccard) were used to evaluate the segmentation performance of the proposed HCM algorithm, and the test results are displayed in Figure 20.
As shown in Figure 20, when $m \geq 5$ the two indicators change little, and it can be concluded that for image data the reasonable range of the weighting exponent $m$ in the HCM algorithm is also greater than or equal to 5.
Based on the experimental results in Figure 19 and Figure 20, the values of ACC, NMI, and Jaccard change only within a small range as $m$ varies, with fluctuations generally within 0.1. This indicates that the fuzzy weighting exponent has little impact on the HCM algorithm, so the reasonable range of the weighting exponent $m$ in the HCM algorithm for both numerical and image data is [5, 10].
Since the reasonable range of the fuzzy weighting exponent in the HCM algorithm is [5, 10], the same value range is adopted for the HLICM, IHLICM, and KWHLICM algorithms.

4.6.2. The Fuzzy Weighting Exponent in the HLICM, IHLICM, and KWHLICM Algorithms

GN(0, 0.1) was used to corrupt the synthetic image in Figure 3b, SPN(0.3) was used to corrupt the natural image in Figure 5d, SN(0.2) was used to corrupt the remote sensing image in Figure 7b, and RN(80) was used to corrupt the MR image in Figure 9a. The proposed algorithm was tested with different fuzzy weighting exponents using corresponding noisy images. Figure 21 shows the curve of the algorithm performance as a function of the fuzzy weighting exponent.
As shown in Figure 21, the comparison across the four types of noise shows that the three proposed algorithms are more sensitive to speckle noise, while for the other three types of noise they are less sensitive to changes in the fuzzy weighting exponent. Overall, the performance of the three proposed algorithms remains stable as the fuzzy weighting exponent changes, and the range of fuzzy weighting exponents adopted for the proposed algorithms is reasonable.

4.7. Impact of Initial Clustering Centers on Algorithm Performance

To verify the sensitivity of the algorithms to the initial clustering centers, the gray levels between the minimum and maximum values of the noisy image were equally divided into c segments and the gray levels with frequency 0 were removed. At each execution, one value was selected from each of the c segments to form the initial clustering centers [75], and five sets of initial clustering centers were selected for segmentation testing. We selected the synthetic image in Figure 3b, corrupted by Gaussian noise; the natural image in Figure 5d, corrupted by salt and pepper noise; the remote sensing image in Figure 7b, corrupted by speckle noise; and the MR image in Figure 9a, corrupted by Rician noise. Figure 22, Figure 23, Figure 24 and Figure 25 show box plots of algorithm performance as a function of the initial clustering centers.
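A minimal sketch of this initialization scheme, assuming 8-bit gray levels, might look as follows; the function name and the fallback for empty segments are our own illustrative choices.

```python
import numpy as np

def initial_centers(img, c, seed=None):
    """Histogram-based initial clustering centers (illustrative sketch)."""
    rng = np.random.default_rng(seed)
    # np.unique returns only gray levels that actually occur, so levels
    # with frequency 0 are excluded automatically.
    levels = np.unique(img.astype(int))
    edges = np.linspace(levels.min(), levels.max(), c + 1)
    centers = []
    for lo, hi in zip(edges[:-1], edges[1:]):
        seg = levels[(levels >= lo) & (levels <= hi)]
        # Draw one occurring gray level per segment; fall back to the
        # segment midpoint if a segment happens to be empty.
        centers.append(rng.choice(seg) if seg.size else (lo + hi) / 2)
    return np.array(centers, dtype=float)
```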
As shown in Figure 22, Figure 23, Figure 24 and Figure 25, across images with different types of noise, the IHLICM and KWHLICM algorithms are more sensitive to the initial clustering centers on images containing speckle noise and salt and pepper noise, while the HLICM algorithm is more sensitive on images containing Rician noise and speckle noise. The three proposed algorithms are insensitive to the initialization under Gaussian noise and, overall, are not very sensitive to the initial clustering centers.

4.8. Testing of the Algorithms’ Generalization Performance

To verify the adaptability of the algorithms, we selected a large number of images from the BSDS500 dataset to verify the effectiveness and generalizability of the HLICM, IHLICM, and KWHLICM algorithms. Owing to space limitations, only eight images from the BSDS500 dataset are shown, segmented using the 12 fuzzy clustering algorithms. GN(0, 0.1) was used to corrupt #120003, #296028, #385028, and #353013, and SPN(0.3) was used to corrupt #56028, #140075, #223061, and #97033. The segmentation results of these algorithms for the noisy natural images are shown in Figure 26.
From Figure 26, it can be seen that the segmentation results of the FCM_VMF and FLICMLNLI algorithms are the worst on the noisy images, indicating that these algorithms lack robustness to noise. The ARFCM algorithm loses a large amount of image information during segmentation; the segmentation results of the DSFCM_N, PFLSCM, FLICM, and HLICM algorithms contain a large amount of noise, which is unsatisfactory; the FCM_SICM and FSC_LNML algorithms suppress a large amount of noise, but the edges of the segmented images are not smooth; and although the KWFLICM and IHLICM algorithms perform well in image segmentation, there is still noise in their results. Compared with the other algorithms, the segmentation results of the KWHLICM algorithm are closer to the ground truth and preserve more details. From Figure 27, it can be seen that KWHLICM outperforms the other algorithms in terms of segmentation performance. Therefore, the KWHLICM algorithm has a significant advantage in segmenting natural images with noise.
To further verify the generalizability of the algorithms, we selected a large number of images from the BSDS500 dataset for noise-free segmentation testing. The experimental results show that KWHLICM retains good segmentation performance on noiseless images. Figure 28 shows the segmentation results for eight noiseless natural images.
From Figure 28, it can be seen that when segmenting natural images without noise, the ARFCM and FCM_VMF algorithms severely lose the detailed information of the image, while the FCM_SICM and FSC_LNML algorithms blur the image boundaries. For #306005, the FCM_SICM algorithm failed to segment, and for #106020 and #21077, the FSC_LNML algorithm cannot perform reasonable segmentation. The HLICM algorithm produced misclassifications. For the FLICMLNLI, DSFCM_N, KWFLICM, PFLSCM, FLICM, KWHLICM, and IHLICM algorithms, the differences between the segmentation results are small and difficult to distinguish by eye. From Figure 29, it can be seen that the ACC and PSNR values of the KWHLICM algorithm are better than those of the other algorithms. Overall, the proposed KWHLICM algorithm generalizes better to noise-free images than existing fuzzy clustering-related algorithms.

4.9. Testing and Analysis of the Algorithms for Color Image

To verify the adaptability of the algorithms to color images, a large number of images from the BSDS500 and UC Merced Land Use remote sensing datasets were selected to verify the generalizability and effectiveness of the proposed algorithms. SPN(0.2) was added to #296059 and #304034, GN(0, 0.1) was added to #189011 and #232038, and SN(0.2) was added to buildings18. These noise-free and noisy images were segmented using the 12 algorithms; the segmentation results are shown in Figure 30.
From Figure 30, it can be seen that for noise-free color images, all the algorithms except the ARFCM, FCM_VMF, and FCM_SICM algorithms can effectively extract the target in the image. For color images contaminated by noise, the DSFCM_N, PFLSCM, FLICMLNLI, FLICM, and HLICM algorithms cannot completely suppress the noise, and their segmentation results still contain some noise. The ARFCM, FCM_SICM, and FSC_LNML algorithms can suppress most of the noise in the image, but the edges of the segmented image are neither smooth nor satisfactory. The KWHLICM algorithm achieves better results, which are closer to the ground truths and also retain some details. From Figure 31, it can be seen that for both noiseless and noisy color images, the KWHLICM algorithm has higher metrics than the other algorithms. Therefore, the proposed KWHLICM algorithm also has good universality for color image segmentation.

4.10. Statistical Comparisons by the Friedman Test

To systematically compare the different algorithms, this paper uses the Friedman test [76] to evaluate the running efficiency (time) and segmentation quality (ACC, PSNR, and mIoU) of the 12 algorithms on the 16 images in Figure 3, Figure 5, Figure 7 and Figure 9. If the Friedman test rejects the null hypothesis that all algorithms perform equivalently, the Nemenyi post-hoc test is applied: the performances of two algorithms are significantly different if their average rankings differ by at least the critical difference, defined as
$$CD = q_\alpha \sqrt{\frac{K(K+1)}{6N}},$$
where $K$ is the number of algorithms, $N$ is the number of datasets (here, images), and $q_\alpha$ is the critical value of the studentized range statistic. The CD plots are shown in Figure 32.
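For concreteness, the critical difference for the setting in this section (K = 12 algorithms on N = 16 images) can be evaluated as in the sketch below; the $q_\alpha$ value is a placeholder to be looked up in published tables (e.g., Demšar [76]), not a value taken from this paper.

```python
import numpy as np

def critical_difference(k, n, q_alpha):
    """Nemenyi critical difference CD = q_alpha * sqrt(k(k+1)/(6n))."""
    return q_alpha * np.sqrt(k * (k + 1) / (6.0 * n))

# 12 algorithms on 16 images; q_alpha is a placeholder for the tabulated
# critical value of the studentized range statistic at the chosen alpha.
print(critical_difference(k=12, n=16, q_alpha=3.268))
```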
According to the CD plots in Figure 32, the KWHLICM algorithm has a statistical advantage over the other compared algorithms in terms of segmentation quality on the 16 images. In Figure 32a, the FLICMLNLI, PFLSCM, ARFCM, and DSFCM_N algorithms outperform the KWHLICM algorithm in terms of running efficiency. The KWHLICM algorithm therefore does not dominate the comparison algorithms in runtime, but it achieves a good compromise between segmentation quality and running efficiency.

4.11. Algorithm Convergence Test

In this article, we monitor the convergence of each algorithm by counting the number of iterations required for the change in the clustering centers to fall below a preset threshold. Updating the cluster centers is crucial during the clustering iteration; when the change in the cluster centers is less than the preset threshold, the iterative algorithm can be considered to have converged and the cluster centers to have reached a stable state. To test the convergence of the different algorithms, four images from the BSDS500, UC Merced Land Use, and Brain Tumor MRI datasets were selected for segmentation testing. Table 9 gives the numbers of iterations of the algorithms for different noisy images.
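Written generically, the stopping rule described above amounts to the loop sketched below; `update_centers`, `eps`, and `max_iter` are illustrative placeholders, not the settings used in the experiments.

```python
import numpy as np

def run_until_converged(update_centers, v0, eps=1e-5, max_iter=100):
    """Iterate a clustering update until the centers stabilize."""
    v = np.asarray(v0, dtype=float)
    for t in range(1, max_iter + 1):
        v_new = update_centers(v)  # one full clustering iteration
        if np.linalg.norm(v_new - v) < eps:
            return v_new, t        # converged after t iterations
        v = v_new
    return v, max_iter
```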
From Table 9, it can be seen that the HLICM algorithm requires fewer iterations than the other algorithms, while the IHLICM and KWHLICM algorithms require more. However, comparing the iteration counts of the different algorithms shows that the HLICM algorithm is significantly better than the other compared algorithms in terms of convergence speed, while the convergence speed of the KWHLICM and IHLICM algorithms is not significantly different from that of the other algorithms. Overall, the proposed KWHLICM and IHLICM algorithms not only have good segmentation performance but also high computational efficiency.

5. Conclusions and Outlook

This paper first defined the new concepts of harmonic fuzzy sets and harmonic fuzzy partition. Then, based on these basic concepts, a new harmonic fuzzy local information C-means clustering algorithm was proposed, and the local convergence of this algorithm was rigorously proved using Zangwill's theorem. Meanwhile, in order to improve the generalizability of the algorithms, inspired by the IFLICM and KWFLICM algorithms, two enhanced robust image segmentation algorithms were proposed using local information, and their good performance was verified through experiments. In conclusion, the originality of our work lies in the integration of harmonic fuzzy membership and local spatial information in the clustering process:
(1)
The use of the harmonic fuzzy partition to constrain the fuzzy memberships in all the algorithms;
(2)
In the proposed algorithms, samples are clustered using harmonic fuzzy memberships together with the neighborhood information of pixels;
(3)
The fuzzy membership degrees in the proposed algorithms take values in $(1, +\infty)$, which is supported only by the harmonic fuzzy set theory proposed in this paper.
Overall, the proposed algorithms achieve satisfactory clustering results. These algorithms and the related findings not only help to reveal the intrinsic structure of data but also promote the development of fuzzy clustering-related theories. They can be widely used in data mining, image processing, and medical diagnosis. Clustering algorithms have important applications in various fields and can help people better understand data and make effective decisions.
There are many directions in which future work can be expanded. The following are some examples:
(1)
Exploring the theory of harmonic fuzzy graphs [77] for solving complex engineering problems;
(2)
Constructing harmonic fuzzy logic to solve problems related to fuzzy inference [78];
(3)
Constructing harmonic fuzzy neural networks [79,80] for solving complex engineering problems such as intelligent transportation and industrial automation.

Author Contributions

Conceptualization, C.W.; methodology, C.W. and S.Z.; writing—review and editing, C.W.; software, S.Z.; application to real data, S.Z.; and writing—original draft, S.Z. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Data Availability Statement

All data used to support the findings of this study are included within the article.

Acknowledgments

The authors would like to thank everyone involved for their contributions to this article. They would also like to thank the anonymous reviewers for their helpful comments and suggestions.

Conflicts of Interest

The authors declare no conflicts of interest.

References

  1. Schaefer, A.; Scolaro, T.P.; Ghisi, E.E. Cluster analysis applied to obtaining reference models for building thermal performance studies. J. Build. Eng. 2024, 89, 109273. [Google Scholar] [CrossRef]
  2. Lee, M.; Hwang, S.; Park, Y.; Choi, B. Factors affecting bike-sharing system demand by inferred trip purpose: Integration of clustering of travel patterns and geospatial data analysis. Int. J. Sustain. Transp. 2022, 16, 847–860. [Google Scholar] [CrossRef]
  3. Yang, P.; Zhu, L.; Zhang, Y.; Ma, C.; Liu, L.; Yu, X.; Hu, W. On the relative value of clustering techniques for Unsupervised Effort-Aware Defect Prediction. Expert Syst. Appl. 2024, 245, 123041. [Google Scholar] [CrossRef]
  4. Taylor, H.H.; Bremner, A.J. Cluster kinds and the developmental origins of consciousness. Trends Cognit. Sci. 2024, 28, 586–587. [Google Scholar] [CrossRef]
  5. Yan, X.; Shi, K.; Ye, Y.; Yu, H. Deep correlation mining for multi-task image clustering. Expert Syst. Appl. 2022, 187, 115973. [Google Scholar] [CrossRef]
  6. Choi, S.; Lim, H.; Lim, J.; Yoon, S. Retrofit building energy performance evaluation using an energy signature-based symbolic hierarchical clustering method. Build. Environ. 2024, 251, 111206. [Google Scholar] [CrossRef]
  7. Gong, H.; Zhang, S.; Zhang, X.; Chen, Y. A method for chromatin domain partitioning based on hypergraph clustering. Comput. Struct. Biotec. 2024, 23, 1584–1593. [Google Scholar] [CrossRef]
  8. Mo, W.; Ni, S.; Zhou, M.; Wen, J.; Qi, D.; Huang, J.; Yang, Y.; Xu, Y.; Wang, X.; Zhao, Z. An electron density clustering based adaptive segmentation method for protein Raman spectrum calculation. Spectrochim. Acta A 2024, 314, 124155. [Google Scholar] [CrossRef] [PubMed]
  9. Laclau, C.; Nadif, M. Hard and fuzzy diagonal co-clustering for document-term partitioning. Neurocomputing 2016, 193, 133–147. [Google Scholar] [CrossRef]
  10. Zhi, X.; Fan, J.; Zhao, F. Robust local feature weighting hard c-means clustering algorithm. Neurocomputing 2014, 134, 20–29. [Google Scholar] [CrossRef]
  11. Ferreira, M.R.P.; Carvalho, F.A.T.; Simões, E.C. Kernel-based hard clustering methods with kernelization of the metric and automatic weighting of the variables. Pattern Recognit. 2016, 51, 310–321. [Google Scholar] [CrossRef]
  12. Dunn, J.C. A fuzzy relative of the ISODATA process and its use in detecting compact well separated clusters. J. Cybern. 1973, 3, 32–57. [Google Scholar] [CrossRef]
  13. Cardone, B.; Martino, F.D.; Senatore, S. Real estate price estimation through a fuzzy partition-driven genetic algorithm. Inf. Sci. 2024, 667, 120442. [Google Scholar] [CrossRef]
  14. Bezdek, J.C. Cluster validity with fuzzy sets. Cybernet. Syst. 1973, 3, 58–73. [Google Scholar] [CrossRef]
  15. Wu, J.; Wang, X.; Wei, T.; Fang, C. Full-parameter adaptive fuzzy clustering for noise image segmentation based on non-local and local spatial information. Comput. Vis. Image Underst. 2023, 235, 103765. [Google Scholar] [CrossRef]
  16. Mújica-Vargas, D.; Gallegos-Funes, F.; Rosales-Silva, A.J. A fuzzy clustering algorithm with spatial robust estimation constraint for noisy color image segmentation. Pattern Recogn. Lett. 2013, 34, 400–413. [Google Scholar] [CrossRef]
  17. Ahmed, M.N.; Yamany, S.M.; Mohamed, N.; Farag, A.A.; Moriarty, T. A modified fuzzy c-means algorithm for bias field estimation and segmentation of MRI data. IEEE Trans. Med. Imaging 2002, 21, 193–199. [Google Scholar] [CrossRef]
  18. Chen, S.; Zhang, D. Robust image segmentation using FCM with spatial constraints based on new kernel-induced distance measure. IEEE Trans. Syst. Man Cybern. Part B Cybern. 2004, 34, 1907–1916. [Google Scholar] [CrossRef]
19. Szilagyi, L.; Benyo, Z.; Szilagyi, S.M.; Adam, H.S. MR brain image segmentation using an enhanced fuzzy C-means algorithm. In Proceedings of the Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC), Cancun, Mexico, 17–21 September 2003; pp. 17–21. [Google Scholar]
  20. Cai, W.; Chen, S.; Zhang, D. Fast and robust fuzzy c-means clustering algorithms incorporating local information for image segmentation. Pattern Recogn. 2007, 40, 825–838. [Google Scholar] [CrossRef]
  21. Krinidis, S.; Chatzis, V. A robust fuzzy local information C-means clustering algorithm. IEEE Trans. Image Process. 2010, 19, 1328–1337. [Google Scholar] [CrossRef]
  22. Li, Z.; Zheng, Z. Fuzzy C-mean clustering based on improved local information for MR image segmentation. Autom. Appl. 2024, 65, 225–228. [Google Scholar]
  23. Gong, M.; Liang, Y.; Shi, J.; Ma, W.; Ma, J. Fuzzy C-means clustering with local information and kernel metric for image segmentation. IEEE Trans. Image Process. 2013, 22, 573–584. [Google Scholar] [CrossRef] [PubMed]
  24. Leszczyński, K.; Penczek, P.; Grochulski, W. Sugeno’s fuzzy measure and fuzzy clustering. Fuzzy Sets Syst. 1985, 15, 147–158. [Google Scholar] [CrossRef]
  25. Li, J.J.; Zhang, L.; Yao, T.S. Convergence analysis of the GLCA algorithm. J. Comp. Res. Dev. 1999, 36, 978–981. [Google Scholar]
  26. Leski, J.M. Generalized weighted conditional fuzzy clustering. IEEE Trans. Fuzzy Syst. 2003, 11, 709–715. [Google Scholar] [CrossRef]
  27. Zadeh, L.A. Similarity relations and fuzzy ordering. Inf. Sci. 1971, 3, 177–200. [Google Scholar] [CrossRef]
  28. Huang, W.; Zhang, F.; Wang, S.; Kong, F. A novel knowledge-based similarity measure on intuitionistic fuzzy sets and its applications in pattern recognition. Expert Syst. Appl. 2024, 249, 123835. [Google Scholar] [CrossRef]
  29. Ruspini, E.H. A new approach to clustering. Inf. Control. 1969, 15, 22–32. [Google Scholar] [CrossRef]
  30. Ruspini, E.H. New experimental results in fuzzy clustering. Inf. Sci. 1973, 6, 273–284. [Google Scholar] [CrossRef]
  31. Pan, X.; Xu, Y. Redefinition of the concept of fuzzy set based on vague partition from the perspective of axiomatization. Soft Comput. 2018, 22, 1777–1789. [Google Scholar] [CrossRef]
  32. Mesiar, R.; Rybátrik, J. Entropy of fuzzy partitions: A general model. Fuzzy Sets Syst. 1998, 99, 73–79. [Google Scholar] [CrossRef]
33. Torra, V.; Jurío, A.; Bustince, H.; Aliahmadipour, L. Fuzzy sets in clustering: On fuzzy partitions. Advances in Intelligent Systems and Computing. In Proceedings of the International Conference on Intelligent and Fuzzy Systems, INFUS 2019, Istanbul, Turkey, 23–25 July 2019; Springer: Cham, Switzerland, 2019; Volume 1029, pp. 7–14. [Google Scholar]
  34. Bezdek, J.C.; Harris, J.D. Fuzzy partitions and relations; an axiomatic basis for clustering. Fuzzy Sets Syst. 1978, 1, 111–127. [Google Scholar] [CrossRef]
35. Torra, V.; Miyamoto, S. A definition for I-fuzzy partitions. Soft Comput. 2011, 15, 363–369. [Google Scholar] [CrossRef]
  36. Wang, J.; Chung, F.; Wang, S.; Deng, Z. Double indices-induced FCM clustering and its integration with fuzzy subspace clustering. Pattern Analysis Appl. 2014, 17, 549–566. [Google Scholar] [CrossRef]
  37. Zhu, J.; Li, K.; Xia, K.; Gu, X.; Xue, J.; Qiu, S.; Jiang, Y.; Qian, P. A novel double-index-constrained, multi-view, fuzzy-clustering algorithm and its application for detecting epilepsy electroencephalogram signals. IEEE Access 2019, 7, 103823–103832. [Google Scholar] [CrossRef]
  38. Zhang, X.; Juan, A.; Tauler, R. Local rank-based spatial information for improvement of remote sensing hyperspectral imaging resolution. Talanta 2016, 146, 1–9. [Google Scholar] [CrossRef]
  39. Siriapisith, T.; Kusakunniran, W.; Haddawy, P. Pyramid graph cut: Integrating intensity and gradient information for grayscale medical image segmentation. Comput. Biol. Med. 2020, 126, 103997. [Google Scholar] [CrossRef]
  40. Wu, C.; Zhang, X. A novel kernelized total Bregman divergence-based fuzzy clustering with local information for image segmentation. Int. J. Approx. Reason 2021, 136, 281–305. [Google Scholar] [CrossRef]
  41. Memon, K.H.; Lee, D.H. Generalised kernel weighted fuzzy C-means clustering algorithm with local information. Fuzzy Sets Syst. 2018, 340, 91–108. [Google Scholar] [CrossRef]
  42. Nik-Khorasani, A.; Mehrizi, A.; Sadoghi-Yazd, H. Robust hybrid learning approach for adaptive neuro-fuzzy inference systems. Fuzzy Sets Syst. 2024, 481, 108890. [Google Scholar] [CrossRef]
  43. Zadeh, L.A. Fuzzy sets. Inf. Control 1965, 8, 338–353. [Google Scholar] [CrossRef]
  44. Aggarwal, M. Redefining fuzzy entropy with a general framework. Expert Syst. Appl. 2021, 164, 113671. [Google Scholar] [CrossRef]
  45. Marín, N.; Rivas-Gervilla, G.; Sánchez, D.; Yager, R.R. Specificity measures based on fuzzy set similarity. Fuzzy Sets Syst. 2020, 401, 189–199. [Google Scholar] [CrossRef]
  46. Li, J.; Deng, G.; Li, H.; Zeng, W. The relationship between similarity measure and entropy of intuitionistic fuzzy sets. Inf. Sci. 2012, 188, 314–321. [Google Scholar] [CrossRef]
  47. Nagoev, Z.; Pshenokova, I.; Bzhikhatlov, K.; Kankulov, S.; Atalikov, B. Multi-agent neurocognitive architecture of an intelligent agent pattern recognition system. Procedia Comput. Sci. 2022, 213, 504–509. [Google Scholar] [CrossRef]
  48. Wu, C.; Li, M. Generalized multiplicative fuzzy possibilistic product partition C-means clustering. Inf. Sci. 2024, 670, 120588. [Google Scholar] [CrossRef]
  49. Xu, Z. Priority weight intervals derived from intuitionistic multiplicative preference. IEEE Trans. Fuzzy Syst. 2013, 21, 642–654. [Google Scholar]
  50. Chen, J.; Zhu, J.; Jiang, H.; Yang, H.; Nie, F. Sparsity fuzzy C-means clustering with principal component analysis embedding. IEEE Trans. Fuzzy Syst. 2023, 31, 2099–2111. [Google Scholar] [CrossRef]
  51. Li, H.; Wei, M. Fuzzy clustering based on feature weights for multivariate time series. Knowl. Based Syst. 2020, 197, 105907. [Google Scholar] [CrossRef]
  52. Saha, A.; Das, S. Stronger convergence results for the center-based fuzzy clustering with convex divergence measure. IEEE Trans. Cybern. 2019, 49, 4229–4242. [Google Scholar] [CrossRef]
53. Bezdek, J.C. A convergence theorem for the fuzzy ISODATA clustering algorithms. IEEE Trans. Pattern Anal. Mach. Intell. 1980, 2, 1–8. [Google Scholar] [CrossRef] [PubMed]
  54. Yang, M.S. Convergence properties of the generalized fuzzy C-means clustering algorithms. Comput. Math. Appl. 1993, 25, 3–11. [Google Scholar] [CrossRef]
  55. Gröll, L.; Jäkel, J. A new convergence proof of fuzzy C-means. IEEE Trans. Fuzzy Syst. 2005, 13, 717–720. [Google Scholar] [CrossRef]
  56. Tian, Y.; Yang, M.S. Bias-correction fuzzy clustering algorithms. Inf. Sci. 2015, 309, 138–162. [Google Scholar]
  57. Gao, Y.; Wang, Z.; Xie, J.; Pan, J. A new robust fuzzy c-means clustering method based on adaptive elastic distance. Knowl. Based Syst. 2022, 237, 107769. [Google Scholar] [CrossRef]
  58. Zhang, X.; Sun, Y.; Liu, H.; Hou, Z.; Zhao, F.; Zhang, C. Improved clustering algorithms for image segmentation based on non-local information and back projection. Inf. Sci. 2021, 550, 129–144. [Google Scholar] [CrossRef]
  59. Zhang, H.; Li, H.; Chen, N.; Chen, S.; Liu, J. Novel fuzzy clustering algorithm with variable multi-pixel fitting spatial information for image segmentation. Pattern Recogn. 2022, 121, 108201. [Google Scholar] [CrossRef]
  60. Ghosh, S.; Hazarika, A.P.; Chandra, A.; Mudi, R.K. Adaptive neighbor constrained deviation sparse variant fuzzy C-means clustering for brain MRI of AD subject. Visual Inf. 2021, 5, 67–80. [Google Scholar] [CrossRef]
  61. Tang, Y.; Ren, F.; Pedrycz, W. Fuzzy C-Means clustering through SSIM and patch for image segmentation. Appl. Soft Comput. 2020, 87, 105928. [Google Scholar] [CrossRef]
  62. Wang, Q.; Wang, X.; Fang, C.; Yang, W. Robust fuzzy c-means clustering algorithm with adaptive spatial & intensity constraint and membership linking for noise image segmentation. Appl. Soft Comput. 2020, 92, 106318. [Google Scholar]
  63. Wei, T.; Wang, X.; Li, X.; Zhu, S. Fuzzy subspace clustering noisy image segmentation algorithm with adaptive local variance non-local information and mean membership linking. Eng. Appl. Artifi. Intell. 2022, 110, 104672. [Google Scholar] [CrossRef]
  64. José-García, A.; Gómez-Flores, W. CVIK: A Matlab-based cluster validity index toolbox for automatic data clustering. SoftwareX 2023, 22, 101359. [Google Scholar] [CrossRef]
  65. Wang, Q.; Wang, X.; Fang, C.; Jiao, J. Fuzzy image clustering incorporating local and region-level information with median memberships. Appl. Soft Comput. 2021, 105, 107245. [Google Scholar] [CrossRef]
  66. Gharieb, R.R.; Gendy, G.; Selim, H. A hard C-means clustering algorithm incorporating membership KL divergence and local data information for noisy image segmentation. Int. J. Pattern Recogn. 2018, 32, 1850012. [Google Scholar] [CrossRef]
  67. Zheng, X.; Chen, T. High spatial resolution remote sensing image segmentation based on the multiclassification model and the binary classification model. Neural Comput. Appl. 2023, 35, 3597–3604. [Google Scholar] [CrossRef]
68. The Berkeley Segmentation Dataset and Benchmark (BSDS500). Available online: https://www2.eecs.berkeley.edu/Research/Projects/CS/vision/grouping/resources.html (accessed on 10 March 2024).
  69. Visual Object Classes Challenge 2012 (VOC2012). Available online: http://host.robots.ox.ac.uk/pascal/VOC/voc2012/index.html (accessed on 12 March 2024).
  70. UC Merced Land Use Dataset. Available online: http://weegee.vision.ucmerced.edu/datasets/landuse.html (accessed on 14 March 2024).
  71. Brain Tumor MRI Dataset. Available online: https://www.kaggle.com/datasets/masoudnickparvar/brain-tumor-mri-dataset (accessed on 16 March 2024).
  72. Hu, L.; Pan, X.; Tan, Z.; Luo, X. A fast fuzzy clustering algorithm for complex networks via a generalized momentum method. IEEE Trans. Fuzzy Syst. 2022, 30, 3473–3485. [Google Scholar] [CrossRef]
  73. Zhang, X.; Dai, L. Fast bilateral filtering. Electron. Lett. 2019, 55, 258–260. [Google Scholar] [CrossRef]
74. Pal, N.R.; Bezdek, J.C. On clustering validity for the fuzzy C-means model. IEEE Trans. Fuzzy Syst. 1995, 3, 370–379. [Google Scholar] [CrossRef]
  75. Wu, C.; Guo, X. A novel single fuzzifier interval type-2 fuzzy C-means clustering with local information for land-cover segmentation. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2021, 14, 5903–5917. [Google Scholar] [CrossRef]
  76. Demšar, J. Statistical comparisons of classifiers over multiple data sets. J. Mach. Learn. Res. 2006, 7, 1–30. [Google Scholar]
  77. Bhattacharya, A.; Pal, M. A fuzzy graph theory approach to the facility location problem: A case study in the Indian banking system. Mathematics 2023, 11, 2992. [Google Scholar] [CrossRef]
  78. Pal, M.; Jana, C.; Bhattacharya, A. Fundamentals of Fuzzy Optimization and Decision-Making Problems; Springer: Cham, Switzerland, 2023; pp. 1–30. [Google Scholar]
  79. Bhattacharya, A.; Pal, M. Prediction on nature of cancer by fuzzy graphoidal covering number using artificial neural network. Artif. Intell. Med. 2024, 148, 102783. [Google Scholar] [CrossRef] [PubMed]
  80. Li, K.; Ni, T.; Xue, J.; Jiang, Y. Deep soft clustering: Simultaneous deep embedding and soft-partition clustering. J. Amb. Intell. Hum. Comp. 2023, 14, 5581–5593. [Google Scholar] [CrossRef]
Figure 1. The main framework of the proposed algorithm.
Figure 2. Schematic diagram of dynamic iteration process for the proposed KWHLICM algorithm.
Figure 3. Synthetic images. (a) Regular image with three classes; (b) regular image with three classes; (c) irregular image with three classes; and (d) regular image with three classes.
Figure 4. Segmentation results of different algorithms for noisy synthetic images. (a) Noisy images; (b) ARFCM; (c) FLICMLNLI; (d) FCM_VMF; (e) DSFCM_N; (f) KWFLICM; (g) PFLSCM; (h) FCM_SICM; (i) FSC_LNML; (j) FLICM; (k) HLICM; (l) IHLICM; and (m) KWHLICM.
Figure 5. Natural images. (a) 35010 from BSDS500; (b) 2007_000738 from VOC2010; (c) 2009_003974 from VOC2010; and (d) 48017 from BSDS500.
Figure 6. Segmentation results of different algorithms for noisy natural images. (a) Noisy image; (b) ARFCM; (c) FLICMLNLI; (d) FCM_VMF; (e) DSFCM_N; (f) KWFLICM; (g) PFLSCM; (h) FCM_SICM; (i) FSC_LNML; (j) FLICM; (k) HLICM; (l) IHLICM; and (m) KWHLICM.
Figure 7. Remote sensing images. (a) overpass91; (b) baseballdiamond03; (c) storagetanks42; and (d) runway22.
Figure 8. Segmentation results of different algorithms for noisy remote sensing images. (a) Noisy image; (b) ARFCM; (c) FLICMLNLI; (d) FCM_VMF; (e) DSFCM_N; (f) KWFLICM; (g) PFLSCM; (h) FCM_SICM; (i) FSC_LNML; (j) FLICM; (k) HLICM; (l) IHLICM; and (m) KWHLICM.
Figure 9. MR images from the Brain Tumor MRI dataset. (a) Tr-me_0180; (b) Te-no_0013; (c) 37 no; and (d) Tr-me_0235.
Figure 10. Segmentation results of different algorithms for noisy MRI images. (a) Noisy image; (b) ARFCM; (c) FLICMLNLI; (d) FCM_VMF; (e) DSFCM_N; (f) KWFLICM; (g) PFLSCM; (h) FCM_SICM; (i) FSC_LNML; (j) FLICM; (k) HLICM; (l) IHLICM; and (m) KWHLICM.
Figure 11. Performance curves of different algorithms varying with the standard deviation of Rician noise. (a) Acc; (b) Sen; (c) Jaccard; (d) PSNR; (e) SA; (f) Kappa; (g) mIoU; and (h) DICE.
Figure 12. Performance curves of different algorithms varying with the normalized variance of Gaussian noise. (a) Acc; (b) Sen; (c) Jaccard; (d) PSNR; (e) SA; (f) Kappa; (g) mIoU; and (h) DICE.
Figure 13. Performance curves of different algorithms varying with the intensity of salt and pepper noise. (a) Acc; (b) Sen; (c) Jaccard; (d) PSNR; (e) SA; (f) Kappa; (g) mIoU; and (h) DICE.
Figure 14. Performance curves of different algorithms varying with the normalized variance of speckle noise. (a) Acc; (b) Sen; (c) Jaccard; (d) PSNR; (e) SA; (f) Kappa; (g) mIoU; and (h) DICE.
Figure 15. Bar chart of time cost of different algorithms for noisy images. (a) Synthetic images; (b) natural images; (c) remote sensing images; and (d) MR images.
Figure 16. Segmentation results of the proposed HLICM algorithm varying with the size of the neighborhood window for noisy images. (a) Noisy image; (b) ground truths; (c) 3 × 3; (d) 5 × 5; (e) 7 × 7; (f) 9 × 9; and (g) 11 × 11.
Figure 17. Segmentation results of the proposed IHLICM algorithm varying with the size of the neighborhood window for noisy images. (a) Noisy image; (b) ground truths; (c) 3 × 3; (d) 5 × 5; (e) 7 × 7; (f) 9 × 9; and (g) 11 × 11.
Figure 18. Segmentation results of the proposed KWHLICM algorithm varying with the size of the neighborhood window for noisy images. (a) Noisy image; (b) ground truths; (c) 3 × 3; (d) 5 × 5; (e) 7 × 7; (f) 9 × 9; and (g) 11 × 11.
Figure 19. Clustering evaluation results of the HCM algorithm varying with the fuzzy weighting exponent for numerical data. (a) Iris; (b) Triangle1; (c) Seeds; and (d) Wine.
Figure 20. Segmentation evaluation results of the HCM algorithm varying with the fuzzy weighting exponent for image data. (a) Tr-me_0336; (b) 166_6645; (c) 112_1264; and (d) uneven illumination image.
Figure 21. Algorithm performance curves with various fuzzy weighting exponents. (a) Synthetic image with Gaussian noise; (b) natural image with salt and pepper noise; (c) remote sensing image with speckle noise; and (d) MR image with Rician noise.
Figure 22. Box plots of algorithm performance varying with initial clustering centers under Gaussian noise with different normalized variances. (a) GN(0, 0.05); (b) GN(0, 0.07); (c) GN(0, 0.1); and (d) GN(0, 0.12).
Figure 23. Box plots of algorithm performance varying with initial clustering centers under salt and pepper noise with different intensity levels. (a) SPN(0.15); (b) SPN(0.2); (c) SPN(0.25); and (d) SPN(0.3).
Figure 24. Box plots of algorithm performance varying with different initial clustering centers under speckle noise with different normalized variances. (a) SN(0.05); (b) SN(0.1); (c) SN(0.15); and (d) SN(0.2).
Figure 25. Box plots of algorithm performance varying with different initial clustering centers under Rician noise with different standard deviations. (a) RN(50); (b) RN(60); (c) RN(70); and (d) RN(80).
Figure 26. Segmentation results of all algorithms for noisy images from the BSDS500 dataset. (a) 120003; (b) 296028; (c) 385028; (d) 353013; (e) 56028; (f) 140075; (g) 223061; and (h) 97033.
Figure 27. Bar charts of the ACC and PSNR indexes of different algorithms for noisy natural images. (a) ACC; (b) PSNR.
Figure 28. Segmentation results of all algorithms for noise-free images from the BSDS500 dataset. (a) 208078; (b) 217090; (c) 8143; (d) 55067; (e) 106020; (f) 21077; (g) 306005; and (h) 197017.
Figure 29. Bar charts of the ACC and PSNR indexes of different algorithms for noise-free natural images. (a) ACC; (b) PSNR.
Figure 30. Segmentation results of all algorithms for color images from the BSDS500 and UC Merced Land Use datasets. (a) 36046; (b) 65019; (c) 241004; (d) 189011; (e) 232038; (f) 296059; (g) 304034; and (h) buildings18.
Figure 31. Bar charts of the ACC and PSNR indexes of different algorithms for color images with or without noise. (a) ACC; (b) PSNR.
Figure 32. CD diagrams of performance indexes of different algorithms for sixteen images. (a) Time; (b) ACC; (c) PSNR; and (d) mIoU.
Table 1. Evaluation indexes of different algorithms for the noisy synthetic images.

| Image | Algorithm | Acc | Sen | Jaccard | PSNR | SA | Kappa | mIoU | DICE |
|---|---|---|---|---|---|---|---|---|---|
| Figure 3a + RN(80) | ARFCM | 0.6673 | 0.501 | 0.3342 | 8.5511 | 0.501 | 0.2973 | 11.1393 | 0.501 |
| | FLICMLNLI | 0.9808 | 0.9712 | 0.944 | 21.437 | 0.9712 | 0.9539 | 31.4661 | 0.9712 |
| | FCM_VMF | 0.5678 | 0.3516 | 0.2133 | 7.3093 | 0.3516 | 0.0346 | 7.111 | 0.3516 |
| | DSFCM_N | 0.6787 | 0.5181 | 0.3496 | 9.2906 | 0.5181 | 0.3578 | 11.6542 | 0.5181 |
| | KWFLICM | 0.9937 | 0.9906 | 0.9813 | 26.2168 | 0.9906 | 0.9849 | 32.7095 | 0.9906 |
| | PFLSCM | 0.988 | 0.9819 | 0.9645 | 23.3581 | 0.9819 | 0.9713 | 32.1502 | 0.9819 |
| | FCM_SICM | 0.9793 | 0.9689 | 0.9396 | 21.2332 | 0.9689 | 0.9511 | 31.321 | 0.9689 |
| | FSC_LNML | 0.9803 | 0.9704 | 0.9426 | 21.2592 | 0.9704 | 0.953 | 31.4195 | 0.9704 |
| | FLICM | 0.9928 | 0.9892 | 0.9787 | 25.6239 | 0.9892 | 0.9828 | 32.6226 | 0.9892 |
| | HLICM | 0.5902 | 0.3854 | 0.2387 | 7.5407 | 0.3854 | 0.1477 | 7.9558 | 0.3854 |
| | IHLICM | 0.9914 | 0.9871 | 0.9746 | 25.0162 | 0.9871 | 0.9795 | 32.4868 | 0.9871 |
| | KWHLICM | 0.9946 | 0.992 | 0.9841 | 26.9084 | 0.992 | 0.9872 | 32.8021 | 0.992 |
| Figure 3b + GN(0, 0.1) | ARFCM | 0.7701 | 0.7221 | 0.4223 | 12.4469 | 0.5043 | 0.3984 | 11.239 | 0.5939 |
| | FLICMLNLI | 0.8876 | 0.9746 | 0.6688 | 18.9598 | 0.6806 | 0.5796 | 17.196 | 0.8015 |
| | FCM_VMF | 0.7177 | 0.6096 | 0.3345 | 4.9834 | 0.4257 | 0.2299 | 9.0141 | 0.5013 |
| | DSFCM_N | 0.6667 | 0.5 | 0.2588 | 10.2282 | 0.3492 | 0.2582 | 7.0511 | 0.4112 |
| | KWFLICM | 0.8966 | 0.9938 | 0.6911 | 22.6226 | 0.694 | 0.5962 | 17.7149 | 0.8173 |
| | PFLSCM | 0.8963 | 0.9933 | 0.6905 | 22.6854 | 0.6937 | 0.5958 | 17.7012 | 0.8169 |
| | FCM_SICM | 0.8954 | 0.9912 | 0.688 | 20.8113 | 0.6922 | 0.5895 | 17.6434 | 0.8152 |
| | FSC_LNML | 0.8926 | 0.9852 | 0.681 | 21.641 | 0.688 | 0.5882 | 17.481 | 0.8102 |
| | FLICM | 0.8938 | 0.9878 | 0.6841 | 21.0176 | 0.6899 | 0.5885 | 17.552 | 0.8124 |
| | HLICM | 0.4431 | 0.0199 | 0.0083 | 2.467 | 0.0139 | 0.3469 | 0.2333 | 0.0164 |
| | IHLICM | 0.9923 | 0.9885 | 0.9772 | 22.8343 | 0.9885 | 0.9827 | 32.5731 | 0.9885 |
| | KWHLICM | 0.999 | 0.9985 | 0.997 | 29.6905 | 0.9985 | 0.9977 | 33.2318 | 0.9985 |
| Figure 3c + SPN(0.2) | ARFCM | 0.7889 | 0.6834 | 0.519 | 10.4928 | 0.6834 | 0.5153 | 17.3007 | 0.6834 |
| | FLICMLNLI | 0.7883 | 0.6825 | 0.518 | 5.2762 | 0.6825 | 0.5176 | 17.2669 | 0.6825 |
| | FCM_VMF | 0.7418 | 0.6127 | 0.4417 | 9.0293 | 0.6127 | 0.4084 | 14.7226 | 0.6127 |
| | DSFCM_N | 0.9945 | 0.9918 | 0.9836 | 26.8906 | 0.9918 | 0.9876 | 32.7879 | 0.9918 |
| | KWFLICM | 0.9954 | 0.9931 | 0.9862 | 23.8601 | 0.9931 | 0.9896 | 32.8736 | 0.9931 |
| | PFLSCM | 0.9922 | 0.9884 | 0.977 | 21.768 | 0.9884 | 0.9825 | 32.5664 | 0.9884 |
| | FCM_SICM | 0.9896 | 0.9843 | 0.9691 | 23.8148 | 0.9843 | 0.9765 | 32.3046 | 0.9843 |
| | FSC_LNML | 0.9915 | 0.9872 | 0.9747 | 24.9698 | 0.9872 | 0.9808 | 32.49 | 0.9872 |
| | FLICM | 0.9365 | 0.9047 | 0.826 | 16.1069 | 0.9047 | 0.8575 | 27.5323 | 0.9047 |
| | HLICM | 0.3421 | 0.0132 | 0.0066 | 2.2015 | 0.0132 | 0.4843 | 0.2211 | 0.0132 |
| | IHLICM | 0.8928 | 0.9938 | 0.6799 | 22.0172 | 0.6828 | 0.5844 | 17.2772 | 0.8094 |
| | KWHLICM | 0.9974 | 0.9962 | 0.9923 | 28.3798 | 0.9962 | 0.9942 | 33.0781 | 0.9962 |
| Figure 3d + SN(0.2) | ARFCM | 0.8624 | 0.7248 | 0.5684 | 12.4959 | 0.7248 | 0.5005 | 14.2102 | 0.7248 |
| | FLICMLNLI | 0.8724 | 0.7448 | 0.5934 | 9.2425 | 0.7448 | 0.6128 | 14.834 | 0.7448 |
| | FCM_VMF | 0.8692 | 0.7384 | 0.5853 | 9.5715 | 0.7384 | 0.5829 | 14.6329 | 0.7384 |
| | DSFCM_N | 0.9288 | 0.8576 | 0.7507 | 16.5031 | 0.8576 | 0.7746 | 18.7673 | 0.8576 |
| | KWFLICM | 0.9474 | 0.8948 | 0.8097 | 19.1373 | 0.8948 | 0.8387 | 20.2421 | 0.8948 |
| | PFLSCM | 0.9098 | 0.8197 | 0.6945 | 16.2311 | 0.8197 | 0.7056 | 17.3613 | 0.8197 |
| | FCM_SICM | 0.8865 | 0.7729 | 0.6299 | 15.8017 | 0.7729 | 0.66 | 15.7477 | 0.7729 |
| | FSC_LNML | 0.9573 | 0.9145 | 0.8425 | 22.6646 | 0.9145 | 0.8664 | 21.0621 | 0.9145 |
| | FLICM | 0.9233 | 0.8466 | 0.7339 | 18.4919 | 0.8466 | 0.7704 | 18.3485 | 0.8466 |
| | HLICM | 0.5539 | 0.1077 | 0.0569 | 3.504 | 0.1077 | 0.138 | 1.4233 | 0.1077 |
| | IHLICM | 0.9368 | 0.8736 | 0.7756 | 18.0537 | 0.8736 | 0.7879 | 19.3901 | 0.8736 |
| | KWHLICM | 0.9658 | 0.9316 | 0.872 | 20.9993 | 0.9316 | 0.8926 | 21.8009 | 0.9316 |
RN(80) is Rician noise with a standard deviation of 80; SPN(0.2) is salt-and-pepper noise with an intensity of 20%; GN(0, 0.1) is Gaussian noise with zero mean and a normalized variance of 0.1; and SN(0.2) is speckle noise with a normalized variance of 0.2.
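The overlap-type indexes in Tables 1–4 are mutually constrained, e.g., DICE = 2 × Jaccard/(1 + Jaccard), which is a useful sanity check when reproducing these numbers. The sketch below gives the standard definitions for binary masks as a reference implementation; it assumes the conventional formulas, while the paper's SA, Kappa, and mIoU conventions (and scaling) may differ.

```python
import numpy as np

def binary_metrics(pred, gt):
    """Common overlap metrics between a binary segmentation and ground truth."""
    pred, gt = pred.astype(bool), gt.astype(bool)
    tp = np.sum(pred & gt)          # true positives
    tn = np.sum(~pred & ~gt)        # true negatives
    fp = np.sum(pred & ~gt)         # false positives
    fn = np.sum(~pred & gt)         # false negatives
    acc = (tp + tn) / (tp + tn + fp + fn)   # accuracy
    sen = tp / (tp + fn)                    # sensitivity (recall)
    jaccard = tp / (tp + fp + fn)           # Jaccard index (IoU)
    dice = 2 * tp / (2 * tp + fp + fn)      # DICE = 2J / (1 + J)
    return acc, sen, jaccard, dice

def psnr(reference, result, peak=255.0):
    """Peak signal-to-noise ratio in dB between two images."""
    mse = np.mean((reference.astype(float) - result.astype(float)) ** 2)
    return 10 * np.log10(peak ** 2 / mse)
```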
Table 2. Evaluation indexes of different algorithms for natural images with noise.

| Image | Algorithm | Acc | Sen | Jaccard | PSNR | SA | Kappa | mIoU | DICE |
| 35010 + GN(0, 0.1) | ARFCM | 0.8071 | 0.7107 | 0.5512 | 9.7139 | 0.7107 | 0.5357 | 18.3738 | 0.7107 |
| | FLICMLNLI | 0.879 | 0.8185 | 0.6928 | 13.1209 | 0.8185 | 0.7268 | 23.0924 | 0.8185 |
| | FCM_VMF | 0.6083 | 0.4125 | 0.2599 | 3.886 | 0.4125 | 0.0399 | 8.6619 | 0.4125 |
| | DSFCM_N | 0.9416 | 0.9123 | 0.8388 | 16.4171 | 0.9123 | 0.8656 | 27.9603 | 0.9123 |
| | KWFLICM | 0.9217 | 0.8826 | 0.7898 | 14.8158 | 0.8826 | 0.8213 | 26.3282 | 0.8826 |
| | PFLSCM | 0.9359 | 0.9038 | 0.8245 | 16.2609 | 0.9038 | 0.8525 | 27.4819 | 0.9038 |
| | FCM_SICM | 0.8672 | 0.8007 | 0.6677 | 12.8065 | 0.8007 | 0.6992 | 22.2559 | 0.8007 |
| | FSC_LNML | 0.9054 | 0.8582 | 0.7516 | 13.986 | 0.8582 | 0.7831 | 25.052 | 0.8582 |
| | FLICM | 0.8984 | 0.8475 | 0.7354 | 14.1298 | 0.8475 | 0.7709 | 24.5143 | 0.8475 |
| | HLICM | 0.6619 | 0.4928 | 0.327 | 8.8141 | 0.4928 | 0.228 | 10.8985 | 0.4928 |
| | IHLICM | 0.9333 | 0.8999 | 0.818 | 15.9346 | 0.8999 | 0.8484 | 27.2667 | 0.8999 |
| | KWHLICM | 0.9426 | 0.914 | 0.8416 | 16.6862 | 0.914 | 0.8694 | 28.0524 | 0.914 |
| 2007_000738 + GN(0, 0.1) | ARFCM | 0.8456 | 0.7684 | 0.624 | 11.7094 | 0.7684 | 0.6417 | 20.7985 | 0.7684 |
| | FLICMLNLI | 0.9496 | 0.9244 | 0.8594 | 16.5232 | 0.9244 | 0.8858 | 28.6478 | 0.9244 |
| | FCM_VMF | 0.5614 | 0.3421 | 0.2064 | 3.7281 | 0.3421 | 0.0047 | 6.8784 | 0.3421 |
| | DSFCM_N | 0.8134 | 0.7201 | 0.5626 | 11.5132 | 0.7201 | 0.5601 | 18.7527 | 0.7201 |
| | KWFLICM | 0.9524 | 0.9286 | 0.8667 | 16.8869 | 0.9286 | 0.8917 | 28.8896 | 0.9286 |
| | PFLSCM | 0.9539 | 0.9308 | 0.8706 | 17.7253 | 0.9308 | 0.8954 | 29.0215 | 0.9308 |
| | FCM_SICM | 0.9412 | 0.9118 | 0.8379 | 16.0725 | 0.9118 | 0.866 | 27.929 | 0.9118 |
| | FSC_LNML | 0.9518 | 0.9277 | 0.8652 | 16.734 | 0.9277 | 0.8903 | 28.839 | 0.9277 |
| | FLICM | 0.9487 | 0.923 | 0.857 | 16.7076 | 0.923 | 0.8834 | 28.566 | 0.923 |
| | HLICM | 0.713 | 0.5695 | 0.3981 | 9.4704 | 0.5695 | 0.3391 | 13.2694 | 0.5695 |
| | IHLICM | 0.9408 | 0.9112 | 0.8368 | 16.4846 | 0.9112 | 0.8668 | 27.8941 | 0.9112 |
| | KWHLICM | 0.9592 | 0.9388 | 0.8847 | 18.0792 | 0.9388 | 0.9078 | 29.4916 | 0.9388 |
| 2009_003974 + SPN(0.3) | ARFCM | 0.9331 | 0.9331 | 0.8746 | 11.7456 | 0.9331 | 0.8473 | 43.7294 | 0.9331 |
| | FLICMLNLI | 0.7359 | 0.7359 | 0.5822 | 5.7823 | 0.7359 | 0.2075 | 29.1076 | 0.7359 |
| | FCM_VMF | 0.8383 | 0.8383 | 0.7216 | 7.9127 | 0.8383 | 0.6381 | 36.0802 | 0.8383 |
| | DSFCM_N | 0.9716 | 0.9716 | 0.9448 | 15.4732 | 0.9716 | 0.9355 | 47.2424 | 0.9716 |
| | KWFLICM | 0.9809 | 0.9809 | 0.9626 | 17.1941 | 0.9809 | 0.9559 | 48.1277 | 0.9809 |
| | PFLSCM | 0.9523 | 0.9523 | 0.9089 | 13.2147 | 0.9523 | 0.8908 | 45.447 | 0.9523 |
| | FCM_SICM | 0.9654 | 0.9654 | 0.9331 | 14.6104 | 0.9654 | 0.9216 | 46.6565 | 0.9654 |
| | FSC_LNML | 0.9791 | 0.9791 | 0.959 | 16.7933 | 0.9791 | 0.9513 | 47.9504 | 0.9791 |
| | FLICM | 0.9627 | 0.9627 | 0.928 | 14.2793 | 0.9627 | 0.9156 | 46.4012 | 0.9627 |
| | HLICM | 0.9587 | 0.9587 | 0.9206 | 13.8372 | 0.9587 | 0.9041 | 46.0309 | 0.9587 |
| | IHLICM | 0.9779 | 0.9779 | 0.9567 | 16.5499 | 0.9779 | 0.9486 | 47.8348 | 0.9779 |
| | KWHLICM | 0.9824 | 0.9824 | 0.9654 | 17.539 | 0.9824 | 0.9591 | 48.2681 | 0.9824 |
| 2007_003330 + SPN(0.3) | ARFCM | 0.7441 | 0.6161 | 0.4452 | 6.8606 | 0.6161 | 0.2414 | 14.8401 | 0.6161 |
| | FLICMLNLI | 0.4389 | 0.1583 | 0.086 | 2.6778 | 0.1583 | 0.0797 | 2.8659 | 0.1583 |
| | FCM_VMF | 0.5663 | 0.3495 | 0.2117 | 7.8681 | 0.3495 | 0.136 | 7.0578 | 0.3495 |
| | DSFCM_N | 0.8804 | 0.8206 | 0.6957 | 13.4025 | 0.8206 | 0.7106 | 23.1903 | 0.8206 |
| | KWFLICM | 0.8942 | 0.8313 | 0.7117 | 13.4062 | 0.8313 | 0.7205 | 24.0562 | 0.8313 |
| | PFLSCM | 0.7245 | 0.5868 | 0.4152 | 9.0832 | 0.5868 | 0.2914 | 13.8393 | 0.5868 |
| | FCM_SICM | 0.8343 | 0.7515 | 0.6019 | 11.92 | 0.7515 | 0.6019 | 20.0634 | 0.7515 |
| | FSC_LNML | 0.8334 | 0.75 | 0.6001 | 11.6917 | 0.75 | 0.6072 | 20.0017 | 0.75 |
| | FLICM | 0.6228 | 0.4342 | 0.2773 | 8.2936 | 0.4342 | 0.2558 | 9.2436 | 0.4342 |
| | HLICM | 0.5408 | 0.3112 | 0.1843 | 7.0116 | 0.3112 | 0.1161 | 6.1429 | 0.3112 |
| | IHLICM | 0.8913 | 0.8369 | 0.7196 | 13.4113 | 0.8369 | 0.7212 | 23.9863 | 0.8369 |
| | KWHLICM | 0.91 | 0.865 | 0.7622 | 14.371 | 0.865 | 0.7754 | 25.4052 | 0.865 |
Table 3. Evaluation indexes of different algorithms for noisy remote sensing images.

| Image | Algorithm | Acc | Sen | Jaccard | PSNR | SA | Kappa | mIoU | DICE |
| buildings69 + SN(0.2) | ARFCM | 0.6864 | 0.5295 | 0.3601 | 7.7277 | 0.5295 | 0.2995 | 12.0035 | 0.5295 |
| | FLICMLNLI | 0.7429 | 0.6143 | 0.4433 | 9.5792 | 0.6143 | 0.4146 | 14.7778 | 0.6143 |
| | FCM_VMF | 0.5672 | 0.3508 | 0.2127 | 7.8796 | 0.3508 | 0.023 | 7.091 | 0.3508 |
| | DSFCM_N | 0.8233 | 0.7349 | 0.5809 | 11.603 | 0.7349 | 0.6034 | 19.3644 | 0.7349 |
| | KWFLICM | 0.8097 | 0.7146 | 0.5559 | 11.4341 | 0.7146 | 0.5696 | 18.5312 | 0.7146 |
| | PFLSCM | 0.7758 | 0.6637 | 0.4966 | 9.9682 | 0.6637 | 0.4971 | 16.5544 | 0.6637 |
| | FCM_SICM | 0.782 | 0.673 | 0.5072 | 9.949 | 0.673 | 0.5126 | 16.9066 | 0.673 |
| | FSC_LNML | 0.7873 | 0.6809 | 0.5162 | 10.9229 | 0.6809 | 0.5184 | 17.2059 | 0.6809 |
| | FLICM | 0.7943 | 0.6914 | 0.5284 | 11.0962 | 0.6914 | 0.5334 | 17.6131 | 0.6914 |
| | HLICM | 0.527 | 0.2905 | 0.1699 | 4.6904 | 0.2905 | 0.0755 | 5.6633 | 0.2905 |
| | IHLICM | 0.6965 | 0.5448 | 0.3743 | 9.3695 | 0.5448 | 0.3168 | 12.4779 | 0.5448 |
| | KWHLICM | 0.8267 | 0.7401 | 0.5874 | 11.6808 | 0.7401 | 0.6094 | 19.5808 | 0.7401 |
| tenniscourt95 + SN(0.2) | ARFCM | 0.6902 | 0.5353 | 0.3655 | 9.1703 | 0.5353 | 0.3213 | 12.182 | 0.5353 |
| | FLICMLNLI | 0.9519 | 0.9278 | 0.8653 | 17.0663 | 0.9278 | 0.8577 | 28.8447 | 0.9278 |
| | FCM_VMF | 0.548 | 0.322 | 0.1919 | 7.2992 | 0.322 | 0.1678 | 6.3963 | 0.322 |
| | DSFCM_N | 0.5571 | 0.3357 | 0.2017 | 7.9079 | 0.3357 | 0.2106 | 6.723 | 0.3357 |
| | KWFLICM | 0.9542 | 0.9314 | 0.8715 | 17.5641 | 0.9314 | 0.8575 | 29.0516 | 0.9314 |
| | PFLSCM | 0.7513 | 0.6269 | 0.4565 | 10.3653 | 0.6269 | 0.4455 | 15.2183 | 0.6269 |
| | FCM_SICM | 0.9607 | 0.941 | 0.8886 | 18.2487 | 0.941 | 0.8771 | 29.6188 | 0.941 |
| | FSC_LNML | 0.9693 | 0.954 | 0.9121 | 19.3537 | 0.954 | 0.9056 | 30.4021 | 0.954 |
| | FLICM | 0.5647 | 0.347 | 0.2099 | 8.0035 | 0.347 | 0.1758 | 6.9974 | 0.347 |
| | HLICM | 0.5433 | 0.3149 | 0.1869 | 7.5337 | 0.3149 | 0.1772 | 6.229 | 0.3149 |
| | IHLICM | 0.8934 | 0.8401 | 0.7242 | 13.8614 | 0.8401 | 0.6497 | 24.1415 | 0.8401 |
| | KWHLICM | 0.9707 | 0.956 | 0.9157 | 19.5066 | 0.956 | 0.9097 | 30.5232 | 0.956 |
| storagetanks42 + SPN(0.4) | ARFCM | 0.7013 | 0.5519 | 0.3811 | 7.1462 | 0.5519 | 0.178 | 12.7038 | 0.5519 |
| | FLICMLNLI | 0.5905 | 0.3858 | 0.239 | 3.0518 | 0.3858 | 0.2223 | 7.9657 | 0.3858 |
| | FCM_VMF | 0.7182 | 0.5774 | 0.4058 | 9.6883 | 0.5774 | 0.267 | 13.528 | 0.5774 |
| | DSFCM_N | 0.8015 | 0.7023 | 0.5412 | 10.6572 | 0.7023 | 0.5443 | 18.0396 | 0.7023 |
| | KWFLICM | 0.9145 | 0.8718 | 0.7727 | 12.0773 | 0.8718 | 0.7832 | 25.7568 | 0.8718 |
| | PFLSCM | 0.7302 | 0.5953 | 0.4238 | 8.0759 | 0.5953 | 0.4086 | 14.1271 | 0.5953 |
| | FCM_SICM | 0.6502 | 0.4753 | 0.3117 | 8.0735 | 0.4753 | 0.2375 | 10.3897 | 0.4753 |
| | FSC_LNML | 0.7775 | 0.6662 | 0.4995 | 10.1503 | 0.6662 | 0.4959 | 16.6491 | 0.6662 |
| | FLICM | 0.4582 | 0.1873 | 0.1033 | 6.2131 | 0.1873 | 0.1096 | 3.4439 | 0.1873 |
| | HLICM | 0.4845 | 0.2268 | 0.1279 | 6.8809 | 0.2268 | 0.1515 | 4.2636 | 0.2268 |
| | IHLICM | 0.8344 | 0.7516 | 0.602 | 11.7069 | 0.7516 | 0.5862 | 20.0678 | 0.7516 |
| | KWHLICM | 0.9187 | 0.878 | 0.7826 | 12.2518 | 0.878 | 0.7958 | 26.0855 | 0.878 |
| runway22 + SPN(0.4) | ARFCM | 0.6441 | 0.4661 | 0.3039 | 7.6938 | 0.4661 | 0.1317 | 10.1287 | 0.4661 |
| | FLICMLNLI | 0.4776 | 0.2165 | 0.1214 | 3.4518 | 0.2165 | 0.0048 | 4.0455 | 0.2165 |
| | FCM_VMF | 0.4977 | 0.2466 | 0.1406 | 3.6175 | 0.2466 | 0.0388 | 4.688 | 0.2466 |
| | DSFCM_N | 0.9033 | 0.8549 | 0.7466 | 14.0114 | 0.8549 | 0.7759 | 24.8875 | 0.8549 |
| | KWFLICM | 0.934 | 0.901 | 0.8198 | 13.6366 | 0.901 | 0.8476 | 27.3262 | 0.901 |
| | PFLSCM | 0.7454 | 0.6181 | 0.4473 | 8.3616 | 0.6181 | 0.4375 | 14.9095 | 0.6181 |
| | FCM_SICM | 0.8067 | 0.71 | 0.5504 | 8.3693 | 0.71 | 0.5678 | 18.3477 | 0.71 |
| | FSC_LNML | 0.8454 | 0.768 | 0.6234 | 11.2611 | 0.768 | 0.6391 | 20.7808 | 0.768 |
| | FLICM | 0.4353 | 0.1529 | 0.0828 | 3.8178 | 0.1529 | 0.2488 | 2.7594 | 0.1529 |
| | HLICM | 0.6178 | 0.4267 | 0.2712 | 6.4005 | 0.4267 | 0.1023 | 9.0416 | 0.4267 |
| | IHLICM | 0.8736 | 0.8104 | 0.6813 | 13.2393 | 0.8104 | 0.6918 | 22.7091 | 0.8104 |
| | KWHLICM | 0.9366 | 0.905 | 0.8264 | 14.5244 | 0.905 | 0.8539 | 27.5469 | 0.905 |
Table 4. Evaluation indexes of different algorithms for MR images with noise.

| Image | Algorithm | Acc | Sen | Jaccard | PSNR | SA | Kappa | mIoU | DICE |
| Tr-me_0180 + RN(80) | ARFCM | 0.8089 | 0.7134 | 0.5544 | 11.0986 | 0.7134 | 0.5777 | 18.481 | 0.7134 |
| | FLICMLNLI | 0.869 | 0.8036 | 0.6716 | 12.9467 | 0.8036 | 0.6929 | 22.3873 | 0.8036 |
| | FCM_VMF | 0.5256 | 0.2885 | 0.1685 | 4.8038 | 0.2885 | 0.0528 | 5.618 | 0.2885 |
| | DSFCM_N | 0.9296 | 0.8944 | 0.809 | 15.7027 | 0.8944 | 0.8343 | 26.9663 | 0.8944 |
| | KWFLICM | 0.9368 | 0.9052 | 0.8269 | 16.0956 | 0.9052 | 0.8487 | 27.5619 | 0.9052 |
| | PFLSCM | 0.8279 | 0.7419 | 0.5897 | 11.8537 | 0.7419 | 0.6183 | 19.6554 | 0.7419 |
| | FCM_SICM | 0.9275 | 0.8912 | 0.8037 | 15.5499 | 0.8912 | 0.8261 | 26.7907 | 0.8912 |
| | FSC_LNML | 0.9353 | 0.9029 | 0.823 | 16.0768 | 0.9029 | 0.8438 | 27.4346 | 0.9029 |
| | FLICM | 0.519 | 0.2785 | 0.1618 | 7.1822 | 0.2785 | 0.0242 | 5.3927 | 0.2785 |
| | HLICM | 0.6549 | 0.4823 | 0.3178 | 8.6043 | 0.4823 | 0.3064 | 10.5927 | 0.4823 |
| | IHLICM | 0.9388 | 0.9083 | 0.8319 | 16.3982 | 0.9083 | 0.8481 | 27.731 | 0.9083 |
| | KWHLICM | 0.9494 | 0.9241 | 0.8589 | 17.1511 | 0.9241 | 0.8782 | 28.6309 | 0.9241 |
| Te-no_0013 + RN(90) | ARFCM | 0.7977 | 0.6965 | 0.5343 | 10.3466 | 0.6965 | 0.4949 | 17.811 | 0.6965 |
| | FLICMLNLI | 0.7946 | 0.6919 | 0.5289 | 10.9236 | 0.6919 | 0.4889 | 17.6306 | 0.6919 |
| | FCM_VMF | 0.4321 | 0.1482 | 0.08 | 3.3136 | 0.1482 | 0.0272 | 2.6673 | 0.1482 |
| | DSFCM_N | 0.6493 | 0.4739 | 0.3106 | 8.5867 | 0.4739 | 0.2995 | 10.3524 | 0.4739 |
| | KWFLICM | 0.859 | 0.7884 | 0.6508 | 12.6522 | 0.7884 | 0.6064 | 21.6926 | 0.7884 |
| | PFLSCM | 0.7539 | 0.6308 | 0.4607 | 10.1671 | 0.6308 | 0.4521 | 15.3577 | 0.6308 |
| | FCM_SICM | 0.8704 | 0.8055 | 0.6744 | 13.0409 | 0.8055 | 0.6793 | 22.4799 | 0.8055 |
| | FSC_LNML | 0.838 | 0.757 | 0.609 | 12.0918 | 0.757 | 0.6167 | 20.2996 | 0.757 |
| | FLICM | 0.4315 | 0.1472 | 0.0795 | 5.9744 | 0.1472 | 0.4162 | 2.6484 | 0.1472 |
| | HLICM | 0.621 | 0.4316 | 0.2752 | 7.9845 | 0.4316 | 0.2397 | 9.1717 | 0.4316 |
| | IHLICM | 0.8536 | 0.7804 | 0.6399 | 12.6573 | 0.7804 | 0.6222 | 21.3286 | 0.7804 |
| | KWHLICM | 0.9009 | 0.8514 | 0.7412 | 14.2151 | 0.8514 | 0.757 | 24.7082 | 0.8514 |
| 37 no + GN(0, 0.1) | ARFCM | 0.8391 | 0.7586 | 0.6111 | 10.7429 | 0.7586 | 0.5924 | 20.3684 | 0.7586 |
| | FLICMLNLI | 0.9226 | 0.8838 | 0.7918 | 14.3431 | 0.8838 | 0.7997 | 26.395 | 0.8838 |
| | FCM_VMF | 0.5679 | 0.3519 | 0.2135 | 5.4027 | 0.3519 | 0.1204 | 7.1161 | 0.3519 |
| | DSFCM_N | 0.809 | 0.7135 | 0.5546 | 10.6189 | 0.7135 | 0.5205 | 18.4869 | 0.7135 |
| | KWFLICM | 0.9356 | 0.9035 | 0.8239 | 15.5901 | 0.9035 | 0.8283 | 27.4647 | 0.9035 |
| | PFLSCM | 0.9456 | 0.9184 | 0.8491 | 16.813 | 0.9184 | 0.8565 | 28.3044 | 0.9184 |
| | FCM_SICM | 0.9253 | 0.888 | 0.7985 | 15.0557 | 0.888 | 0.8038 | 26.6163 | 0.888 |
| | FSC_LNML | 0.9464 | 0.9196 | 0.8512 | 16.7574 | 0.9196 | 0.8587 | 28.3749 | 0.9196 |
| | FLICM | 0.9447 | 0.917 | 0.8467 | 16.7861 | 0.917 | 0.8529 | 28.2228 | 0.917 |
| | HLICM | 0.7885 | 0.6827 | 0.5183 | 10.6503 | 0.6827 | 0.3551 | 17.2767 | 0.6827 |
| | IHLICM | 0.9454 | 0.918 | 0.8485 | 16.8143 | 0.918 | 0.8562 | 28.2827 | 0.918 |
| | KWHLICM | 0.9529 | 0.9294 | 0.8681 | 17.4791 | 0.9294 | 0.8759 | 28.9369 | 0.9294 |
| Tr-me_0235 + GN(0, 0.1) | ARFCM | 0.7905 | 0.6858 | 0.5218 | 9.558 | 0.6858 | 0.5328 | 17.3946 | 0.6858 |
| | FLICMLNLI | 0.8931 | 0.8397 | 0.7237 | 13.1501 | 0.8397 | 0.7484 | 24.1233 | 0.8397 |
| | FCM_VMF | 0.5382 | 0.3073 | 0.1815 | 6.542 | 0.3073 | 0.0512 | 6.0514 | 0.3073 |
| | DSFCM_N | 0.929 | 0.8935 | 0.8075 | 15.7062 | 0.8935 | 0.8329 | 26.9177 | 0.8935 |
| | KWFLICM | 0.9375 | 0.9063 | 0.8286 | 16.3113 | 0.9063 | 0.8498 | 27.6196 | 0.9063 |
| | PFLSCM | 0.9483 | 0.9224 | 0.856 | 17.1174 | 0.9224 | 0.879 | 28.5339 | 0.9224 |
| | FCM_SICM | 0.8748 | 0.8123 | 0.6839 | 13.168 | 0.8123 | 0.6949 | 22.7958 | 0.8123 |
| | FSC_LNML | 0.9216 | 0.8824 | 0.7895 | 15.2904 | 0.8824 | 0.8134 | 26.3183 | 0.8824 |
| | FLICM | 0.9108 | 0.8662 | 0.764 | 14.7483 | 0.8662 | 0.7831 | 25.4668 | 0.8662 |
| | HLICM | 0.6067 | 0.4101 | 0.2579 | 7.9267 | 0.4101 | 0.1141 | 8.5969 | 0.4101 |
| | IHLICM | 0.9348 | 0.9022 | 0.8219 | 16.1389 | 0.9022 | 0.8415 | 27.3966 | 0.9022 |
| | KWHLICM | 0.9549 | 0.9323 | 0.8731 | 17.7196 | 0.9323 | 0.8919 | 29.1048 | 0.9323 |
Table 5. Computational complexity of different fuzzy clustering algorithms.

| Algorithm | Computational Complexity |
| ARFCM | O(n × c × w × t) |
| FLICMLNLI | O(n × w₂ × w₃ + c × n × w₂ × t) |
| FCM_VMF | O(n × w + n × c × t) |
| DSFCM_N | O(c × n × w × t) |
| KWFLICM | O(n × w + w × n × c × t) |
| PFLSCM | O(n × w₂ + n × c × w₂ × t) |
| FCM_SICM | O(n × w × log₂(n × w) + n + n × c × t) |
| FSC_LNML | O(n × w₂ + n × w + n × c × t) |
| FLICM | O(n × c × t × w) |
| HLICM | O(n × c × t × w) |
| KWHLICM | O(n × w + w × n × c × t) |
| IHLICM | O(n × w + w × n × c × t) |
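To make the dominant O(n × c × w × t) entry of Table 5 concrete, the sketch below shows a schematic FLICM-style loop on a 1-D signal in which the four factors appear as four nested loops: t iterations over n points, c clusters, and roughly w neighbors per point. This is an illustration of the cost structure under standard FCM-type update rules, not the paper's HLICM implementation; the distance weighting used here is a simplified stand-in.

```python
import numpy as np

def flicm_like_demo(x, c=2, w=1, t=10, m=2.0, seed=0):
    """Schematic FLICM-style loop whose nested loops make the
    O(n * c * w * t) cost explicit (illustrative, not the paper's HLICM)."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x, dtype=float)
    n = x.size
    v = rng.choice(x, size=c, replace=False)        # initial cluster centers
    u = np.full((n, c), 1.0 / c)                    # fuzzy membership matrix
    for _ in range(t):                              # t iterations
        d = np.empty((n, c))
        for i in range(n):                          # n data points
            nbrs = [j for j in range(max(0, i - w), min(n, i + w + 1)) if j != i]
            for k in range(c):                      # c clusters
                g = sum((1 - u[j, k]) ** m * (x[j] - v[k]) ** 2 / (abs(j - i) + 1)
                        for j in nbrs)              # ~w neighbors: local fuzzy term
                d[i, k] = (x[i] - v[k]) ** 2 + g
        u = np.maximum(d, 1e-12) ** (-1.0 / (m - 1))
        u /= u.sum(axis=1, keepdims=True)           # membership update, O(n * c)
        v = (u ** m * x[:, None]).sum(axis=0) / (u ** m).sum(axis=0)
    return u.argmax(axis=1), v
```

For image data, n is the pixel count and w the neighborhood cardinality, so the window-size experiments in Tables 6–8 directly trade segmentation quality against this per-iteration cost.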
Table 6. Evaluation indexes of HLICM varying with neighborhood size for noisy images.

| Image | Window Size | Acc | Sen | Jaccard | PSNR | SA | Kappa | mIoU | DICE |
| Figure 3b + GN(0, 0.1) | 3 × 3 | 0.7846 | 0.677 | 0.5117 | 10.5861 | 0.677 | 0.5039 | 17.0561 | 0.677 |
| | 5 × 5 | 0.7561 | 0.6342 | 0.4643 | 9.2284 | 0.6342 | 0.4394 | 15.4771 | 0.6342 |
| | 7 × 7 | 0.7139 | 0.5708 | 0.3994 | 7.6556 | 0.5708 | 0.3446 | 13.3123 | 0.5708 |
| | 9 × 9 | 0.5783 | 0.3675 | 0.2251 | 4.3919 | 0.3675 | 0.0473 | 7.5036 | 0.3675 |
| | 11 × 11 | 0.585 | 0.3774 | 0.2326 | 4.6463 | 0.3774 | 0.0606 | 7.754 | 0.3774 |
| Figure 5d + SPN(0.3) | 3 × 3 | 0.7794 | 0.6691 | 0.5028 | 10.4415 | 0.6691 | 0.4586 | 16.7591 | 0.6691 |
| | 5 × 5 | 0.7449 | 0.6173 | 0.4465 | 9.4334 | 0.6173 | 0.3764 | 14.882 | 0.6173 |
| | 7 × 7 | 0.7388 | 0.6082 | 0.437 | 8.8233 | 0.6082 | 0.3526 | 14.5654 | 0.6082 |
| | 9 × 9 | 0.7262 | 0.5893 | 0.4177 | 8.237 | 0.5893 | 0.3325 | 13.9249 | 0.5893 |
| | 11 × 11 | 0.6984 | 0.5476 | 0.3771 | 7.3094 | 0.5476 | 0.2761 | 12.5687 | 0.5476 |
| Figure 7b + SN(0.2) | 3 × 3 | 0.8707 | 0.8061 | 0.6751 | 11.9614 | 0.8061 | 0.5579 | 22.5042 | 0.8061 |
| | 5 × 5 | 0.9221 | 0.8832 | 0.7908 | 13.3359 | 0.8832 | 0.7728 | 26.3599 | 0.8832 |
| | 7 × 7 | 0.8849 | 0.8274 | 0.7056 | 11.9199 | 0.8274 | 0.6739 | 23.5193 | 0.8274 |
| | 9 × 9 | 0.8475 | 0.7712 | 0.6276 | 10.8084 | 0.7712 | 0.5805 | 20.9212 | 0.7712 |
| | 11 × 11 | 0.8107 | 0.716 | 0.5577 | 9.8215 | 0.716 | 0.4913 | 18.5891 | 0.716 |
| Figure 9a + RN(80) | 3 × 3 | 0.7709 | 0.6564 | 0.4885 | 9.3521 | 0.6564 | 0.4212 | 16.2841 | 0.6564 |
| | 5 × 5 | 0.7216 | 0.5824 | 0.4109 | 8.8048 | 0.5824 | 0.3115 | 13.6955 | 0.5824 |
| | 7 × 7 | 0.7135 | 0.5703 | 0.3989 | 9.2374 | 0.5703 | 0.2749 | 13.2959 | 0.5703 |
| | 9 × 9 | 0.6884 | 0.5325 | 0.3629 | 8.5715 | 0.5325 | 0.2317 | 12.0962 | 0.5325 |
| | 11 × 11 | 0.6671 | 0.5007 | 0.334 | 8.5147 | 0.5007 | 0.1775 | 11.1324 | 0.5007 |
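Tables 6–8 vary the neighborhood window from 3 × 3 to 11 × 11; performance generally degrades for large windows because the local information term then averages across region boundaries. A minimal sketch of per-pixel window extraction is given below, using reflective border padding (an assumed convention; the paper does not state its padding scheme, and `local_windows` is a hypothetical helper).

```python
import numpy as np
from numpy.lib.stride_tricks import sliding_window_view

def local_windows(img, size=3):
    """Return the size-by-size neighborhood of every pixel of a 2-D image,
    using reflective padding at the borders (assumed convention)."""
    r = size // 2
    padded = np.pad(img, r, mode="reflect")
    # shape (H, W, size, size): one window per original pixel
    return sliding_window_view(padded, (size, size))

# e.g., a local-mean image for a 5 x 5 window:
# smoothed = local_windows(img, 5).mean(axis=(2, 3))
```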
Table 7. Evaluation indexes of IHLICM varying with neighborhood size for noisy images.

| Image | Window Size | Acc | Sen | Jaccard | PSNR | SA | Kappa | mIoU | DICE |
| Figure 3b + GN(0, 0.1) | 3 × 3 | 0.9924 | 0.9886 | 0.9774 | 21.9386 | 0.9886 | 0.9828 | 32.58 | 0.9886 |
| | 5 × 5 | 0.9864 | 0.9796 | 0.9601 | 19.5093 | 0.9796 | 0.9694 | 32.0034 | 0.9796 |
| | 7 × 7 | 0.9809 | 0.9713 | 0.9442 | 17.94 | 0.9713 | 0.9569 | 31.4723 | 0.9713 |
| | 9 × 9 | 0.9755 | 0.9632 | 0.9291 | 16.8368 | 0.9632 | 0.9448 | 30.9687 | 0.9632 |
| | 11 × 11 | 0.9696 | 0.9544 | 0.9128 | 15.9371 | 0.9544 | 0.9316 | 30.4254 | 0.9544 |
| Figure 5d + SPN(0.3) | 3 × 3 | 0.8884 | 0.8326 | 0.7132 | 13.4479 | 0.8326 | 0.7156 | 23.7723 | 0.8326 |
| | 5 × 5 | 0.8578 | 0.7867 | 0.6484 | 12.1451 | 0.7867 | 0.6384 | 21.6143 | 0.7867 |
| | 7 × 7 | 0.8399 | 0.7599 | 0.6128 | 11.286 | 0.7599 | 0.5936 | 20.4254 | 0.7599 |
| | 9 × 9 | 0.8189 | 0.7283 | 0.5727 | 10.2177 | 0.7283 | 0.5372 | 19.0915 | 0.7283 |
| | 11 × 11 | 0.7987 | 0.6981 | 0.5362 | 9.5646 | 0.6981 | 0.4826 | 17.8723 | 0.6981 |
| Figure 7b + SN(0.2) | 3 × 3 | 0.8933 | 0.84 | 0.7241 | 13.8606 | 0.84 | 0.6496 | 24.1377 | 0.84 |
| | 5 × 5 | 0.8851 | 0.8277 | 0.706 | 13.4943 | 0.8277 | 0.604 | 23.5348 | 0.8277 |
| | 7 × 7 | 0.8724 | 0.8086 | 0.6787 | 13.0383 | 0.8086 | 0.55 | 22.6222 | 0.8086 |
| | 9 × 9 | 0.8684 | 0.8026 | 0.6703 | 12.9038 | 0.8026 | 0.5318 | 22.3441 | 0.8026 |
| | 11 × 11 | 0.865 | 0.7975 | 0.6632 | 12.7963 | 0.7975 | 0.5166 | 22.1082 | 0.7975 |
| Figure 9a + RN(80) | 3 × 3 | 0.9388 | 0.9082 | 0.8318 | 16.3974 | 0.9082 | 0.848 | 27.7261 | 0.9082 |
| | 5 × 5 | 0.9309 | 0.8964 | 0.8123 | 15.8696 | 0.8964 | 0.8258 | 27.0755 | 0.8964 |
| | 7 × 7 | 0.9172 | 0.8758 | 0.7791 | 15.0826 | 0.8758 | 0.7892 | 25.97 | 0.8758 |
| | 9 × 9 | 0.9014 | 0.8521 | 0.7424 | 14.3348 | 0.8521 | 0.7462 | 24.746 | 0.8521 |
| | 11 × 11 | 0.8844 | 0.8266 | 0.7045 | 13.652 | 0.8266 | 0.699 | 23.483 | 0.8266 |
Table 8. Evaluation indexes of KWHLICM varying with window size for noisy images.

| Image | Window Size | Acc | Sen | Jaccard | PSNR | SA | Kappa | mIoU | DICE |
| Figure 3b + GN(0, 0.1) | 3 × 3 | 0.999 | 0.9985 | 0.997 | 29.6905 | 0.9985 | 0.9977 | 33.2318 | 0.9985 |
| | 5 × 5 | 0.9984 | 0.9977 | 0.9953 | 27.7219 | 0.9977 | 0.9965 | 33.1781 | 0.9977 |
| | 7 × 7 | 0.9977 | 0.9965 | 0.993 | 26.227 | 0.9965 | 0.9947 | 33.1012 | 0.9965 |
| | 9 × 9 | 0.9972 | 0.9957 | 0.9915 | 25.2658 | 0.9957 | 0.9936 | 33.0507 | 0.9957 |
| | 11 × 11 | 0.9962 | 0.9943 | 0.9886 | 24.0446 | 0.9943 | 0.9914 | 32.953 | 0.9943 |
| Figure 5d + SPN(0.3) | 3 × 3 | 0.8845 | 0.8267 | 0.7046 | 13.423 | 0.8267 | 0.7185 | 23.4873 | 0.8267 |
| | 5 × 5 | 0.8854 | 0.828 | 0.7065 | 13.5333 | 0.828 | 0.7184 | 23.5514 | 0.828 |
| | 7 × 7 | 0.8768 | 0.8152 | 0.688 | 13.2182 | 0.8152 | 0.6981 | 22.9335 | 0.8152 |
| | 9 × 9 | 0.8662 | 0.7994 | 0.6658 | 12.8518 | 0.7994 | 0.6737 | 22.1923 | 0.7994 |
| | 11 × 11 | 0.8557 | 0.7835 | 0.6441 | 12.5009 | 0.7835 | 0.6495 | 21.4689 | 0.7835 |
| Figure 7b + SN(0.2) | 3 × 3 | 0.9574 | 0.9361 | 0.8799 | 18.0612 | 0.9361 | 0.8772 | 29.3299 | 0.9361 |
| | 5 × 5 | 0.9768 | 0.9652 | 0.9327 | 20.5707 | 0.9652 | 0.9298 | 31.0901 | 0.9652 |
| | 7 × 7 | 0.9671 | 0.9507 | 0.906 | 18.9988 | 0.9507 | 0.8996 | 30.201 | 0.9507 |
| | 9 × 9 | 0.96 | 0.9399 | 0.8867 | 18.1137 | 0.9399 | 0.8768 | 29.5563 | 0.9399 |
| | 11 × 11 | 0.9542 | 0.9314 | 0.8715 | 17.5149 | 0.9314 | 0.8584 | 29.0516 | 0.9314 |
| Figure 9a + RN(80) | 3 × 3 | 0.932 | 0.898 | 0.8149 | 15.8557 | 0.898 | 0.838 | 27.1649 | 0.898 |
| | 5 × 5 | 0.9494 | 0.9241 | 0.8589 | 17.1482 | 0.9241 | 0.8783 | 28.6316 | 0.9241 |
| | 7 × 7 | 0.9479 | 0.9219 | 0.8551 | 17.0072 | 0.9219 | 0.8745 | 28.5048 | 0.9219 |
| | 9 × 9 | 0.9414 | 0.9121 | 0.8384 | 16.4742 | 0.9121 | 0.8588 | 27.9469 | 0.9121 |
| | 11 × 11 | 0.9314 | 0.8971 | 0.8133 | 15.757 | 0.8971 | 0.8352 | 27.1113 | 0.8971 |
Table 9. Iteration number of different algorithms for noisy images.

| Algorithm | #36046 + SPN(0.3) | #36046 + GN(0, 0.1) | #37 no + RN(80) | buildings18 + SN(0.2) |
| ARFCM | 100 | 39 | 100 | 100 |
| FLICMLNLI | 20 | 20 | 20 | 20 |
| FCM_VMF | 33 | 40 | 41 | 45 |
| DSFCM_N | 695 | 387 | 190 | 608 |
| KWFLICM | 42 | 43 | 42 | 32 |
| PFLSCM | 5 | 5 | 5 | 5 |
| FCM_SICM | 22 | 31 | 17 | 24 |
| FSC_LNML | 61 | 65 | 31 | 75 |
| FLICM | 38 | 39 | 37 | 29 |
| HLICM | 4 | 4 | 4 | 4 |
| IHLICM | 69 | 72 | 79 | 71 |
| KWHLICM | 69 | 72 | 79 | 71 |
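Iteration counts such as those in Table 9 depend on the stopping rule each algorithm uses: typically the loop ends when the cluster centers (or memberships) change by less than a tolerance, or when an iteration cap is hit (ARFCM's repeated value of 100 suggests such a cap). A generic sketch of this bookkeeping follows; the exact criterion and tolerances used per algorithm are not specified here, and `update_centers` is a caller-supplied placeholder for one full center-update step.

```python
import numpy as np

def run_until_converged(update_centers, v0, tol=1e-5, max_iter=100):
    """Generic stopping rule: iterate a center-update map until the centers
    move less than `tol` (or `max_iter` is hit) and report the count."""
    v = np.asarray(v0, dtype=float)
    for it in range(1, max_iter + 1):
        v_new = update_centers(v)
        if np.linalg.norm(v_new - v) < tol:
            return v_new, it          # converged after `it` iterations
        v = v_new
    return v, max_iter                # cap reached without convergence
```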
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
