Article

Interval Intuitionistic Fuzzy Clustering Algorithm Based on Symmetric Information Entropy

1 Institute of Big Data for Agriculture and Forestry, Fujian Agriculture and Forestry University, Fuzhou 350002, China
2 College of Computer and Information Sciences, Fujian Agriculture and Forestry University, Fuzhou 350002, China
3 Beijing Intelligent Logistics System Collaborative Innovation Center, Beijing Wuzi University, Beijing 101149, China
* Author to whom correspondence should be addressed.
Symmetry 2020, 12(1), 79; https://doi.org/10.3390/sym12010079
Received: 19 November 2019 / Revised: 19 December 2019 / Accepted: 31 December 2019 / Published: 2 January 2020

Abstract:
Based on a continuous optimal aggregation operator, a novel distance measure is proposed to deal with interval intuitionistic fuzzy clustering problems. The optimal ordered weighted intuitionistic fuzzy quasi-averaging (OOWIFQ) operator and the continuous OOWIFQ operator are presented to aggregate all the values in an interval intuitionistic fuzzy number, and some of their desirable properties are studied. The OOWIFQ operator can describe the fuzzy state of things more realistically and present fuzzy properties more accurately. Because the opinions of experts are very important, the OOWIFQ operators take expert preferences into account to reduce systematic errors. Considering the hesitancy of things and avoiding distortion of information, we put forward a distance measure for interval intuitionistic fuzzy numbers by using symmetric information entropy. Based on the continuous OOWIFQ operator and the proposed distance measure, a new interval intuitionistic fuzzy clustering (IIFC) algorithm is proposed. An application to soil clustering shows the validity and practicability of the IIFC algorithm.

1. Introduction

Clustering is an unsupervised form of classification and an important way for people to understand society and nature. In recent decades, clustering has attracted extensive attention and played an important role in many fields, such as pattern recognition [1,2,3,4,5], image segmentation [6,7,8,9,10], and data mining [11,12,13]. Traditional clustering algorithms can be divided into partitioning methods [14,15], hierarchical methods [16,17], density-based methods [18,19,20], grid-based methods [21,22,23,24], and model-based methods [25,26]. Traditional clustering algorithms assume that any data vector belongs to only one class. This approach naturally clusters compact and well-separated data sets. However, clusters often overlap, and some data vectors partially belong to more than one cluster.
In the real world, the boundaries between many objective things are often vague. Categorizing such things is inevitably accompanied by ambiguity, which motivates fuzzy clustering analysis. Ruspini [27] put forward the concept of fuzzy partition. Many scholars then proposed a variety of clustering methods, such as methods based on similarity relations [28] and fuzzy relations [29,30], the transitive closure method based on fuzzy equivalence relations [31,32], the maximum tree method based on fuzzy graph theory [33], and dynamic programming [34].
In recent years, taking the hesitancy of things into account, research on intuitionistic fuzzy sets and interval intuitionistic fuzzy sets has become a hot topic. The facts show that intuitionistic fuzzy sets and interval intuitionistic fuzzy sets can describe and portray the ambiguous nature of the objective world more delicately. However, most interval intuitionistic fuzzy clustering algorithms only consider a single value in each interval, which causes information loss and distortion. In addition, existing clustering algorithms do not take the preference of the decision maker into account, which can easily make the result inconsistent with expectations. In the current literature there is little research on interval intuitionistic fuzzy clustering algorithms, so such research has important practical significance. The proposed distance measure considers all the information in the continuous interval, so distortion and loss of information are avoided. Aiming at the shortcomings of existing algorithms and combining them with the distance measure based on symmetric information entropy, an interval intuitionistic fuzzy clustering algorithm with preference is proposed. The algorithm considers not only the preference of the decision maker but also all the values of the interval. The algorithm is applied to soil clustering to provide guidance for scientific fertilization.
The rest of this paper is organized as follows. Section 2 introduces some basic knowledge of intuitionistic fuzzy sets and relevant aggregation operators. In Section 3, a continuous optimal aggregation operator based on the chi-squared deviation is proposed. In Section 4, a new distance measure based on symmetric information entropy is proposed. The interval intuitionistic fuzzy clustering (IIFC) algorithm and its application are analyzed in Section 5 and Section 6, respectively. This work is concluded in the last section.

2. Preliminaries

2.1. Some Basic Concepts of Intuitionistic Fuzzy Set

Intuitionistic fuzzy set theory was developed from fuzzy set theory by Atanassov [35]. An intuitionistic fuzzy set considers the membership, non-membership, and hesitancy of the input data. Therefore, in practical applications, intuitionistic fuzzy sets have greater power to represent fuzzy and uncertain information than fuzzy sets. In the following, we briefly review some basic concepts of intuitionistic fuzzy sets and introduce the distance measure of intuitionistic fuzzy sets.
Definition 1.
Let $X$ be a fixed set. An intuitionistic fuzzy set $\tilde{\alpha}$ is defined as [35]:
$$\tilde{\alpha} = \{ \langle x, u_{\alpha}(x), v_{\alpha}(x) \rangle \mid x \in X \}$$
where the functions $u_{\alpha}(x): X \to [0,1]$ and $v_{\alpha}(x): X \to [0,1]$ indicate the degree of membership and the degree of non-membership, respectively. For any $x \in X$, $0 \le u_{\alpha}(x) + v_{\alpha}(x) \le 1$.
The third parameter of the intuitionistic fuzzy set $\tilde{\alpha}$ is $\pi_{\alpha}(x)$, called the hesitancy degree, where $\pi_{\alpha}(x) = 1 - u_{\alpha}(x) - v_{\alpha}(x)$. Evidently, $0 \le \pi_{\alpha}(x) \le 1$. When $\pi_{\alpha}(x) = 0$, we have $u_{\alpha}(x) + v_{\alpha}(x) = 1$ and the intuitionistic fuzzy set reduces to a traditional fuzzy set. For convenience, $\tilde{\alpha} = (u_{\alpha}, v_{\alpha})$ is called an intuitionistic fuzzy number, where $u_{\alpha} \in [0,1]$, $v_{\alpha} \in [0,1]$, and $u_{\alpha} + v_{\alpha} \le 1$.
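The constraints above can be captured in a few lines of code. The following sketch (the class name `IFN` is ours, not from the paper) validates $u, v \in [0,1]$ and $u + v \le 1$, and exposes the hesitancy degree $\pi = 1 - u - v$:

```python
# Minimal sketch of an intuitionistic fuzzy number (hypothetical class name).
from dataclasses import dataclass

@dataclass(frozen=True)
class IFN:
    u: float  # membership degree, in [0, 1]
    v: float  # non-membership degree, in [0, 1]

    def __post_init__(self):
        # Enforce the defining constraints u, v in [0, 1] and u + v <= 1.
        if not (0.0 <= self.u <= 1.0 and 0.0 <= self.v <= 1.0
                and self.u + self.v <= 1.0 + 1e-12):
            raise ValueError("requires u, v in [0, 1] and u + v <= 1")

    @property
    def pi(self) -> float:
        """Hesitancy degree pi = 1 - u - v."""
        return 1.0 - self.u - self.v

a = IFN(0.6, 0.3)
print(a.pi)  # ≈ 0.1; when pi == 0 the IFN reduces to an ordinary fuzzy value
```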
Definition 2.
Let $X$ be a finite set. An interval intuitionistic fuzzy set $\tilde{A}$ on $X$ is expressed as [36]:
$$\tilde{A} = \{ \langle x, [u_A^-(x), u_A^+(x)], [v_A^-(x), v_A^+(x)] \rangle \mid x \in X \}$$
where $u_A^-(x) \ge 0$ and $v_A^-(x) \ge 0$, and for any $x \in X$, $\tilde{A}$ satisfies $u_A^+(x) + v_A^+(x) \le 1$. For convenience, $([u_A^-, u_A^+], [v_A^-, v_A^+])$ is called an interval intuitionistic fuzzy number, where $u_A^-, v_A^- \ge 0$ and $u_A^+ + v_A^+ \le 1$.
Definition 3.
For any interval intuitionistic fuzzy numbers $\tilde{a} = ([u_A^-, u_A^+],[v_A^-, v_A^+])$ and $\tilde{b} = ([u_B^-, u_B^+],[v_B^-, v_B^+])$, the relationship between $\tilde{a}$ and $\tilde{b}$ is defined as follows [36]:
(a) If $u_A^- \le u_B^-$, $u_A^+ \le u_B^+$, $v_A^- \ge v_B^-$ and $v_A^+ \ge v_B^+$, then $\tilde{a} \le \tilde{b}$.
(b) If $u_A^- = u_B^-$, $u_A^+ = u_B^+$, $v_A^- = v_B^-$ and $v_A^+ = v_B^+$, then $\tilde{a} = \tilde{b}$.

2.2. Continuous Aggregation Operators

On a continuous interval, Yager [37] proposed a continuous ordered weighted average operator (C-OWA) based on the OWA operator as follows:
Definition 4.
A C-OWA operator is a mapping $F: \Gamma \to R^+$ associated with a basic unit-interval monotonic function $Q$, such that
$$F_Q([a,b]) = \int_0^1 [\, b - x(b-a) \,]\, dQ(x)$$
where $\Gamma$ is the set of all positive interval numbers and $R^+$ is the set of positive real numbers. $Q$ is the basic unit-interval monotonic (BUM) function, satisfying $Q(0)=0$, $Q(1)=1$, and $Q(x) \ge Q(y)$ for any $x \ge y$ with $x, y \in [0,1]$.
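As a quick numerical illustration (ours, not from the paper), the C-OWA integral can be approximated by summing the integrand at subinterval midpoints against the exact weight masses $dQ$; with $Q(x) = x^2$ and $[a,b] = [2,5]$ the closed form works out to $(2a+b)/3 = 3$:

```python
# Sketch: Riemann-Stieltjes approximation of the C-OWA operator
# F_Q([a, b]) = int_0^1 [b - x(b - a)] dQ(x). Function name is hypothetical.
def c_owa(a, b, Q, n=10000):
    total = 0.0
    for j in range(1, n + 1):
        x_mid = (j - 0.5) / n            # midpoint of the j-th subinterval
        dQ = Q(j / n) - Q((j - 1) / n)   # exact weight mass on the subinterval
        total += (b - x_mid * (b - a)) * dQ
    return total

print(c_owa(2.0, 5.0, lambda x: x ** 2))  # ≈ 3.0 = (2*2 + 5)/3
```

With $Q(x) = x$ the operator reduces to the plain interval midpoint $(a+b)/2$, illustrating how the BUM function encodes an attitudinal bias toward the lower or upper end of the interval.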
Based on the idea of geometric mean, Yager and Xu [38] proposed the continuous ordered weighted geometric (C-OWG) operator:
Definition 5.
A C-OWG operator is a mapping $G: \Gamma \to R^+$ associated with a BUM function $Q$, satisfying
$$G_Q([a,b]) = b \left( \frac{a}{b} \right)^{\int_0^1 \frac{dQ(x)}{dx}\, x\, dx}$$
In 2008, Chen et al. [39] proposed the following continuous ordered weighted harmonic (C-OWH) operator based on the harmonic mean and the C-OWA operator:
Definition 6.
A C-OWH operator is a mapping $H: \Gamma \to R^+$ associated with a BUM function $Q$, such that
$$H_Q([a,b]) = \left( \int_0^1 \frac{dQ(x)}{dx} \left[ \frac{1}{b} + x \left( \frac{1}{a} - \frac{1}{b} \right) \right] dx \right)^{-1}$$
In addition, on the continuous interval number, Liu et al. [40] proposed the continuous quasi-ordered weighted averaging (C-QOWA) operators.
Definition 7.
A C-QOWA operator is a mapping $L: \Gamma \to R^+$ associated with the BUM function $Q$, such that
$$L_Q([a,b]) = f^{-1} \left\{ \int_0^1 \frac{dQ(x)}{dx} \left[ f(b) - x \left( f(b) - f(a) \right) \right] dx \right\}$$
where $f$ is a continuous strictly monotonic function on $[a,b]$.

3. Continuous OOWIFQ Operator for Aggregating Interval Intuitionistic Fuzzy Numbers Based on Chi-Squared Deviation

Let $\tilde{\alpha}_1 = (u_1, v_1), \tilde{\alpha}_2 = (u_2, v_2), \ldots, \tilde{\alpha}_n = (u_n, v_n)$ be $n$ intuitionistic fuzzy numbers. If the aggregation result is $(u, v)$ and $W = (w_1, w_2, \ldots, w_n)$ is a weight vector, the total deviation between $u$ and $u_1, u_2, \ldots, u_n$ should be as small as possible, and likewise the total deviation between $v$ and $v_1, v_2, \ldots, v_n$. Therefore, the following optimization model is constructed:
$$\min J_1,\ \min J_2 \quad \text{s.t.} \quad \begin{cases} J_1 = \sum_{i=1}^{n} w_i \dfrac{\left( f(u_i) - f(u) \right)^2}{f(u_i)\, f(u)} \\[6pt] J_2 = \sum_{i=1}^{n} w_i \dfrac{\left( f(v_i) - f(v) \right)^2}{f(v_i)\, f(v)} \end{cases}$$
where $f$ is a strictly monotonic function.
where f is strictly monotonic function.
From the perspective of the fairness principle, assuming that the decision-maker has no additional preference between the two objectives, the following combined optimization model is constructed:
$$\min J = \lambda J_1 + (1-\lambda) J_2 = \sum_{i=1}^{n} w_i \left[ \lambda \left( \frac{f(u_i)}{f(u)} + \frac{f(u)}{f(u_i)} \right) + (1-\lambda) \left( \frac{f(v_i)}{f(v)} + \frac{f(v)}{f(v_i)} \right) - 2 \right]$$
Taking the partial derivatives of $J$ with respect to $u$ and $v$, respectively, we have
$$\frac{\partial J}{\partial u} = \lambda \sum_{i=1}^{n} w_i \left( -\frac{f(u_i)}{f^2(u)} + \frac{1}{f(u_i)} \right) f'(u)$$
$$\frac{\partial J}{\partial v} = (1-\lambda) \sum_{i=1}^{n} w_i \left( -\frac{f(v_i)}{f^2(v)} + \frac{1}{f(v_i)} \right) f'(v)$$
Since $f$ is a strictly monotonic function, setting $\frac{\partial J}{\partial u} = 0$ and $\frac{\partial J}{\partial v} = 0$ yields
$$u = f^{-1} \left( \left( \frac{\sum_{i=1}^{n} w_i f(u_i)}{\sum_{i=1}^{n} w_i / f(u_i)} \right)^{1/2} \right)$$
$$v = f^{-1} \left( \left( \frac{\sum_{i=1}^{n} w_i f(v_i)}{\sum_{i=1}^{n} w_i / f(v_i)} \right)^{1/2} \right)$$
Based on the ordered weighted average operator, sorting the $u_i$ and $v_i$ in descending order, the optimal ordered weighted intuitionistic fuzzy quasi-averaging (OOWIFQ) operator based on the chi-squared deviation is derived as follows:
Definition 8.
Let $\tilde{\alpha}_i = (u_i, v_i)$ be an intuitionistic fuzzy number and $\tilde{X}$ be the set of all intuitionistic fuzzy numbers. An OOWIFQ operator is a mapping $\mathrm{OOWIFQ}: \tilde{X}^n \to \tilde{X}$, defined by an associated weighting vector $w = (w_1, w_2, \ldots, w_n)$, such that
$$\mathrm{OOWIFQ}(\tilde{\alpha}_1, \tilde{\alpha}_2, \ldots, \tilde{\alpha}_n) = \left( f^{-1} \left( \left( \frac{\sum_{i=1}^{n} w_i f(u_{\sigma_i})}{\sum_{i=1}^{n} w_i / f(u_{\sigma_i})} \right)^{1/2} \right),\ f^{-1} \left( \left( \frac{\sum_{i=1}^{n} w_i f(v_{\varepsilon_i})}{\sum_{i=1}^{n} w_i / f(v_{\varepsilon_i})} \right)^{1/2} \right) \right)$$
where $u_{\sigma_i} \ge u_{\sigma_{i+1}}$ and $v_{\varepsilon_i} \ge v_{\varepsilon_{i+1}}$ hold for any $i = 1, 2, \ldots, n$.
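A minimal sketch of this aggregation (ours; the helper name `oowifq` is hypothetical), with $f$ taken as the identity: each component is the square root of the ratio between a weighted mean and a weighted sum of reciprocals of the ordered arguments.

```python
import math

def oowifq(ifns, w, f=lambda x: x, f_inv=lambda y: y):
    """Aggregate intuitionistic fuzzy numbers (u_i, v_i) with the OOWIFQ
    operator; sketch with f = identity by default."""
    us = sorted((u for u, _ in ifns), reverse=True)   # u_{sigma_i}, descending
    vs = sorted((v for _, v in ifns), reverse=True)   # v_{epsilon_i}, descending
    def agg(xs):
        num = sum(wi * f(x) for wi, x in zip(w, xs))  # weighted mean part
        den = sum(wi / f(x) for wi, x in zip(w, xs))  # weighted reciprocal part
        return f_inv(math.sqrt(num / den))
    return agg(us), agg(vs)

u, v = oowifq([(0.6, 0.3), (0.5, 0.4), (0.7, 0.2)], [0.4, 0.3, 0.3])
print(u, v)  # aggregated membership and non-membership, each within the input range
```

Note that the result lies between the smallest and largest input in each component, consistent with the boundedness one expects from an averaging operator.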
In real life, data tend to be continuous rather than discrete. An intuitionistic fuzzy number records only a single value from a continuous interval, which may lose important information and bias the clustering result. To overcome this shortcoming, information fusion is needed over all values on the continuous interval. Therefore, we propose a continuous OOWIFQ (C-OOWIFQ) operator based on the OOWIFQ operator.
Definition 9.
Let $\tilde{\beta} = ([u^-, u^+], [v^-, v^+])$ be an interval intuitionistic fuzzy number. A C-OOWIFQ operator is a mapping $CO_{f,Q}: \tilde{I} \to \tilde{X}$ associated with a BUM function $Q$, satisfying
$$CO_{f,Q}([u^-,u^+],[v^-,v^+]) = \left( f^{-1}\!\left( \left( \frac{\int_0^1 \frac{dQ(u)}{du} \left( f(u^+) - ( f(u^+) - f(u^-) ) u \right) du}{\int_0^1 \frac{dQ(u)}{du} \cdot \frac{1}{f(u^+) - ( f(u^+) - f(u^-) ) u}\, du} \right)^{1/2} \right),\ f^{-1}\!\left( \left( \frac{\int_0^1 \frac{dQ(v)}{dv} \left( f(v^-) + ( f(v^+) - f(v^-) ) v \right) dv}{\int_0^1 \frac{dQ(v)}{dv} \cdot \frac{1}{f(v^-) + ( f(v^+) - f(v^-) ) v}\, dv} \right)^{1/2} \right) \right)$$
where $\tilde{I}$ is the set of all interval intuitionistic fuzzy numbers.
Next, we introduce the derivation of Equation (13). Let $\delta = \frac{f(u^+) - f(u^-)}{n}$, $\eta = \frac{f(v^+) - f(v^-)}{m}$, $u_j = f^{-1}( f(u^+) - j\delta )$ and $v_i = f^{-1}( f(v^-) + i\eta )$, for all $i = 1, 2, \ldots, m$ and $j = 1, 2, \ldots, n$.
(a) When $f$ is strictly monotonic increasing, $\delta \ge 0$ and $\eta \ge 0$ hold. As a result, we have
$$f(u^+) - j\delta \le f(u^+) - (j-1)\delta, \qquad f(v^-) + (i-1)\eta \le f(v^-) + i\eta.$$
Since $f^{-1}$ is also strictly monotonic increasing, we get
$$u_j = f^{-1}( f(u^+) - j\delta ) \le f^{-1}( f(u^+) - (j-1)\delta ) = u_{j-1}$$
$$v_{i-1} = f^{-1}( f(v^-) + (i-1)\eta ) \le f^{-1}( f(v^-) + i\eta ) = v_i$$
(b) On the contrary, when $f$ is strictly monotonic decreasing, $\delta \le 0$ and $\eta \le 0$ hold. It follows that
$$f(u^+) - j\delta \ge f(u^+) - (j-1)\delta, \qquad f(v^-) + (i-1)\eta \ge f(v^-) + i\eta,$$
which, since $f^{-1}$ is strictly monotonic decreasing, can be further expressed as
$$u_j = f^{-1}( f(u^+) - j\delta ) \le f^{-1}( f(u^+) - (j-1)\delta ) = u_{j-1}$$
$$v_{i-1} = f^{-1}( f(v^-) + (i-1)\eta ) \le f^{-1}( f(v^-) + i\eta ) = v_i$$
According to Equation (11), the approximation of $CO_{f,Q}([u^-,u^+],[v^-,v^+])$ can be derived as follows:
$$CO_{f,Q}([u^-,u^+],[v^-,v^+]) \approx \left( f^{-1}\!\left( \left( \frac{\sum_{j=1}^{n} w_j f(u_{\sigma_j})}{\sum_{j=1}^{n} w_j / f(u_{\sigma_j})} \right)^{1/2} \right),\ f^{-1}\!\left( \left( \frac{\sum_{i=1}^{m} w_i f(v_{\varepsilon_i})}{\sum_{i=1}^{m} w_i / f(v_{\varepsilon_i})} \right)^{1/2} \right) \right) = \left( f^{-1}\!\left( \left( \frac{\sum_{j=1}^{n} w_j f(u_j)}{\sum_{j=1}^{n} w_j / f(u_j)} \right)^{1/2} \right),\ f^{-1}\!\left( \left( \frac{\sum_{i=1}^{m} w_i f(v_i)}{\sum_{i=1}^{m} w_i / f(v_i)} \right)^{1/2} \right) \right)$$
where $w_j$ and $w_i$ are the associated weights of the ordered weighted average operator. Using the BUM function, we can obtain the associated weights $w_j$ and $w_i$ as [41]
$$w_j = Q\left( \frac{j}{n} \right) - Q\left( \frac{j-1}{n} \right), \quad j = 1, 2, \ldots, n,$$
$$w_i = Q\left( \frac{i}{m} \right) - Q\left( \frac{i-1}{m} \right), \quad i = 1, 2, \ldots, m,$$
where $\sum_{j=1}^{n} w_j = 1$ and $\sum_{i=1}^{m} w_i = 1$. Accordingly, we can write $CO_{f,Q}$ as
$$CO_{f,Q}([u^-,u^+],[v^-,v^+]) \approx \left( f^{-1}\!\left( \left( \frac{\sum_{j=1}^{n} \left( Q(\frac{j}{n}) - Q(\frac{j-1}{n}) \right) f(u_j)}{\sum_{j=1}^{n} \left( Q(\frac{j}{n}) - Q(\frac{j-1}{n}) \right) / f(u_j)} \right)^{1/2} \right),\ f^{-1}\!\left( \left( \frac{\sum_{i=1}^{m} \left( Q(\frac{i}{m}) - Q(\frac{i-1}{m}) \right) f(v_i)}{\sum_{i=1}^{m} \left( Q(\frac{i}{m}) - Q(\frac{i-1}{m}) \right) / f(v_i)} \right)^{1/2} \right) \right)$$
Let $\Delta u = \frac{1}{n}$ and $\Delta v = \frac{1}{m}$; then
$$CO_{f,Q}([u^-,u^+],[v^-,v^+]) \approx \left( f^{-1}\!\left( \left( \frac{\sum_{j=1}^{n} \left( Q(j\Delta u) - Q(j\Delta u - \Delta u) \right) \left( f(u^+) - ( f(u^+) - f(u^-) ) j\Delta u \right)}{\sum_{j=1}^{n} \frac{Q(j\Delta u) - Q(j\Delta u - \Delta u)}{f(u^+) - ( f(u^+) - f(u^-) ) j\Delta u}} \right)^{1/2} \right),\ f^{-1}\!\left( \left( \frac{\sum_{i=1}^{m} \left( Q(i\Delta v) - Q(i\Delta v - \Delta v) \right) \left( f(v^-) + ( f(v^+) - f(v^-) ) i\Delta v \right)}{\sum_{i=1}^{m} \frac{Q(i\Delta v) - Q(i\Delta v - \Delta v)}{f(v^-) + ( f(v^+) - f(v^-) ) i\Delta v}} \right)^{1/2} \right) \right)$$
Finally, let $n \to \infty$ and $m \to \infty$, writing $u = j\Delta u$ and $v = i\Delta v$. For all $i = 1, 2, \ldots, m$ and $j = 1, 2, \ldots, n$ we then have $u \in [0,1]$, $v \in [0,1]$, and
$$CO_{f,Q}([u^-,u^+],[v^-,v^+]) = \left( f^{-1}\!\left( \left( \frac{\int_0^1 \frac{dQ(u)}{du} \left( f(u^+) - ( f(u^+) - f(u^-) ) u \right) du}{\int_0^1 \frac{dQ(u)}{du} \cdot \frac{1}{f(u^+) - ( f(u^+) - f(u^-) ) u}\, du} \right)^{1/2} \right),\ f^{-1}\!\left( \left( \frac{\int_0^1 \frac{dQ(v)}{dv} \left( f(v^-) + ( f(v^+) - f(v^-) ) v \right) dv}{\int_0^1 \frac{dQ(v)}{dv} \cdot \frac{1}{f(v^-) + ( f(v^+) - f(v^-) ) v}\, dv} \right)^{1/2} \right) \right)$$
Denoting $\tilde{u} = [u^-, u^+]$ and $\tilde{v} = [v^-, v^+]$, for convenience we abbreviate Equation (21) as:
$$CO_{f,Q}([u^-,u^+],[v^-,v^+]) = \left( \underline{C}_{f,Q}(\tilde{u}),\ \overline{C}_{f,Q}(\tilde{v}) \right)$$
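The discretization used in the derivation above doubles as a practical way to evaluate one component of the operator: sample the interval with the BUM weights $w_j = Q(j/n) - Q((j-1)/n)$ and take the square root of the weighted-mean-to-reciprocal-sum ratio. The sketch below is ours (function name hypothetical), with $f$ defaulting to the identity:

```python
import math

def c_oowifq_interval(lo, hi, Q, f=lambda x: x, f_inv=lambda y: y, n=20000):
    """Approximate one component of the C-OOWIFQ operator over [lo, hi]
    using the BUM weights w_j = Q(j/n) - Q((j-1)/n)."""
    num = den = 0.0
    for j in range(1, n + 1):
        w = Q(j / n) - Q((j - 1) / n)           # BUM weight mass
        x = f(hi) - (f(hi) - f(lo)) * (j / n)   # sampled value of f on the interval
        num += w * x
        den += w / x
    return f_inv(math.sqrt(num / den))

# With the identity BUM function Q(x) = x, the result stays inside [lo, hi]
# (boundedness, Property 1), and a degenerate interval is returned unchanged
# (idempotency, Property 3).
print(c_oowifq_interval(0.4, 0.8, Q=lambda x: x))
print(c_oowifq_interval(0.5, 0.5, Q=lambda x: x))  # ≈ 0.5
```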
The C-OOWIFQ operator has the following properties, which are proved as follows:
Property 1.
For every strictly monotonic continuous function $f$ and BUM function $Q$, $(u^-, v^+) \le CO_{f,Q}([u^-,u^+],[v^-,v^+]) \le (u^+, v^-)$ holds.
Proof. 
Let us consider different cases of function f :
(a) If $f$ is strictly monotonic increasing, then $f(u^-) \le f(u^+)$. Thus, for all $u \in [0,1]$, we have $f(u^-) \le f(u^+) - ( f(u^+) - f(u^-) ) u \le f(u^+)$. It follows that
$$f(u^-) \int_0^1 \frac{dQ(u)}{du}\, du \le \int_0^1 \frac{dQ(u)}{du} \left( f(u^+) - ( f(u^+) - f(u^-) ) u \right) du \le f(u^+) \int_0^1 \frac{dQ(u)}{du}\, du$$
and
$$\frac{1}{f(u^+)} \int_0^1 \frac{dQ(u)}{du}\, du \le \int_0^1 \frac{dQ(u)}{du} \cdot \frac{1}{f(u^+) - ( f(u^+) - f(u^-) ) u}\, du \le \frac{1}{f(u^-)} \int_0^1 \frac{dQ(u)}{du}\, du.$$
By considering
$$\int_0^1 \frac{dQ(u)}{du}\, du = Q(1) - Q(0) = 1,$$
the two inequalities above can be further written as
$$f(u^-) \le \int_0^1 \frac{dQ(u)}{du} \left( f(u^+) - ( f(u^+) - f(u^-) ) u \right) du \le f(u^+)$$
and
$$\frac{1}{f(u^+)} \le \int_0^1 \frac{dQ(u)}{du} \cdot \frac{1}{f(u^+) - ( f(u^+) - f(u^-) ) u}\, du \le \frac{1}{f(u^-)}.$$
Accordingly, we have
$$\left( f(u^-) \right)^2 \le \frac{\int_0^1 \frac{dQ(u)}{du} \left( f(u^+) - ( f(u^+) - f(u^-) ) u \right) du}{\int_0^1 \frac{dQ(u)}{du} \cdot \frac{1}{f(u^+) - ( f(u^+) - f(u^-) ) u}\, du} \le \left( f(u^+) \right)^2.$$
Since $f^{-1}$ is also strictly monotonic increasing, it holds that
$$u^- \le f^{-1}\!\left( \left( \frac{\int_0^1 \frac{dQ(u)}{du} \left( f(u^+) - ( f(u^+) - f(u^-) ) u \right) du}{\int_0^1 \frac{dQ(u)}{du} \cdot \frac{1}{f(u^+) - ( f(u^+) - f(u^-) ) u}\, du} \right)^{1/2} \right) \le u^+.$$
Similarly, for any $v \in [0,1]$, it can be proved that
$$v^- \le f^{-1}\!\left( \left( \frac{\int_0^1 \frac{dQ(v)}{dv} \left( f(v^-) + ( f(v^+) - f(v^-) ) v \right) dv}{\int_0^1 \frac{dQ(v)}{dv} \cdot \frac{1}{f(v^-) + ( f(v^+) - f(v^-) ) v}\, dv} \right)^{1/2} \right) \le v^+.$$
(b) If $f$ is strictly monotonic decreasing, then $f(u^-) \ge f(u^+)$. Thus, for all $u \in [0,1]$, we have $f(u^+) \le f(u^+) - ( f(u^+) - f(u^-) ) u \le f(u^-)$. Similarly, we have
$$f(u^+) \le \int_0^1 \frac{dQ(u)}{du} \left( f(u^+) - ( f(u^+) - f(u^-) ) u \right) du \le f(u^-)$$
and
$$\frac{1}{f(u^-)} \le \int_0^1 \frac{dQ(u)}{du} \cdot \frac{1}{f(u^+) - ( f(u^+) - f(u^-) ) u}\, du \le \frac{1}{f(u^+)}.$$
Because $f^{-1}$ is strictly monotonic decreasing, we again obtain
$$u^- \le f^{-1}\!\left( \left( \frac{\int_0^1 \frac{dQ(u)}{du} \left( f(u^+) - ( f(u^+) - f(u^-) ) u \right) du}{\int_0^1 \frac{dQ(u)}{du} \cdot \frac{1}{f(u^+) - ( f(u^+) - f(u^-) ) u}\, du} \right)^{1/2} \right) \le u^+.$$
Similarly, for any $v \in [0,1]$, we have
$$v^- \le f^{-1}\!\left( \left( \frac{\int_0^1 \frac{dQ(v)}{dv} \left( f(v^-) + ( f(v^+) - f(v^-) ) v \right) dv}{\int_0^1 \frac{dQ(v)}{dv} \cdot \frac{1}{f(v^-) + ( f(v^+) - f(v^-) ) v}\, dv} \right)^{1/2} \right) \le v^+.$$
Therefore, Property 1 is proved. □
Property 2.
For any BUM function $Q$, if $u_1^- \le u_2^-$, $u_1^+ \le u_2^+$, $v_1^- \ge v_2^-$ and $v_1^+ \ge v_2^+$, then $CO_{f,Q}([u_1^-, u_1^+],[v_1^-, v_1^+]) \le CO_{f,Q}([u_2^-, u_2^+],[v_2^-, v_2^+])$.
Proof. 
(a) When $f$ is strictly monotonic increasing, for any $u \in [0,1]$ we have
$$f(u_1^+) - ( f(u_1^+) - f(u_1^-) ) u \le f(u_2^+) - ( f(u_2^+) - f(u_2^-) ) u.$$
It follows that
$$\frac{\int_0^1 \frac{dQ(u)}{du} \left( f(u_1^+) - ( f(u_1^+) - f(u_1^-) ) u \right) du}{\int_0^1 \frac{dQ(u)}{du} \cdot \frac{1}{f(u_1^+) - ( f(u_1^+) - f(u_1^-) ) u}\, du} \le \frac{\int_0^1 \frac{dQ(u)}{du} \left( f(u_2^+) - ( f(u_2^+) - f(u_2^-) ) u \right) du}{\int_0^1 \frac{dQ(u)}{du} \cdot \frac{1}{f(u_2^+) - ( f(u_2^+) - f(u_2^-) ) u}\, du}$$
Since $f^{-1}$ is also strictly monotonic increasing, we have
$$f^{-1}\!\left( \left( \frac{\int_0^1 \frac{dQ(u)}{du} \left( f(u_1^+) - ( f(u_1^+) - f(u_1^-) ) u \right) du}{\int_0^1 \frac{dQ(u)}{du} \cdot \frac{1}{f(u_1^+) - ( f(u_1^+) - f(u_1^-) ) u}\, du} \right)^{1/2} \right) \le f^{-1}\!\left( \left( \frac{\int_0^1 \frac{dQ(u)}{du} \left( f(u_2^+) - ( f(u_2^+) - f(u_2^-) ) u \right) du}{\int_0^1 \frac{dQ(u)}{du} \cdot \frac{1}{f(u_2^+) - ( f(u_2^+) - f(u_2^-) ) u}\, du} \right)^{1/2} \right)$$
By a similar proof process, we can obtain
$$f^{-1}\!\left( \left( \frac{\int_0^1 \frac{dQ(v)}{dv} \left( f(v_1^-) + ( f(v_1^+) - f(v_1^-) ) v \right) dv}{\int_0^1 \frac{dQ(v)}{dv} \cdot \frac{1}{f(v_1^-) + ( f(v_1^+) - f(v_1^-) ) v}\, dv} \right)^{1/2} \right) \ge f^{-1}\!\left( \left( \frac{\int_0^1 \frac{dQ(v)}{dv} \left( f(v_2^-) + ( f(v_2^+) - f(v_2^-) ) v \right) dv}{\int_0^1 \frac{dQ(v)}{dv} \cdot \frac{1}{f(v_2^-) + ( f(v_2^+) - f(v_2^-) ) v}\, dv} \right)^{1/2} \right)$$
(b) When $f$ is strictly monotonic decreasing, for any $u \in [0,1]$ we have
$$f(u_1^+) - ( f(u_1^+) - f(u_1^-) ) u \ge f(u_2^+) - ( f(u_2^+) - f(u_2^-) ) u.$$
Obviously, it holds that
$$\frac{\int_0^1 \frac{dQ(u)}{du} \left( f(u_1^+) - ( f(u_1^+) - f(u_1^-) ) u \right) du}{\int_0^1 \frac{dQ(u)}{du} \cdot \frac{1}{f(u_1^+) - ( f(u_1^+) - f(u_1^-) ) u}\, du} \ge \frac{\int_0^1 \frac{dQ(u)}{du} \left( f(u_2^+) - ( f(u_2^+) - f(u_2^-) ) u \right) du}{\int_0^1 \frac{dQ(u)}{du} \cdot \frac{1}{f(u_2^+) - ( f(u_2^+) - f(u_2^-) ) u}\, du}$$
Because $f^{-1}$ is also strictly monotonic decreasing, we get
$$f^{-1}\!\left( \left( \frac{\int_0^1 \frac{dQ(u)}{du} \left( f(u_1^+) - ( f(u_1^+) - f(u_1^-) ) u \right) du}{\int_0^1 \frac{dQ(u)}{du} \cdot \frac{1}{f(u_1^+) - ( f(u_1^+) - f(u_1^-) ) u}\, du} \right)^{1/2} \right) \le f^{-1}\!\left( \left( \frac{\int_0^1 \frac{dQ(u)}{du} \left( f(u_2^+) - ( f(u_2^+) - f(u_2^-) ) u \right) du}{\int_0^1 \frac{dQ(u)}{du} \cdot \frac{1}{f(u_2^+) - ( f(u_2^+) - f(u_2^-) ) u}\, du} \right)^{1/2} \right)$$
By a similar proof process, we can also get
$$f^{-1}\!\left( \left( \frac{\int_0^1 \frac{dQ(v)}{dv} \left( f(v_1^-) + ( f(v_1^+) - f(v_1^-) ) v \right) dv}{\int_0^1 \frac{dQ(v)}{dv} \cdot \frac{1}{f(v_1^-) + ( f(v_1^+) - f(v_1^-) ) v}\, dv} \right)^{1/2} \right) \ge f^{-1}\!\left( \left( \frac{\int_0^1 \frac{dQ(v)}{dv} \left( f(v_2^-) + ( f(v_2^+) - f(v_2^-) ) v \right) dv}{\int_0^1 \frac{dQ(v)}{dv} \cdot \frac{1}{f(v_2^-) + ( f(v_2^+) - f(v_2^-) ) v}\, dv} \right)^{1/2} \right)$$
From the above analysis, we have $CO_{f,Q}([u_1^-,u_1^+],[v_1^-,v_1^+]) \le CO_{f,Q}([u_2^-,u_2^+],[v_2^-,v_2^+])$. Thus, the property is proved. □
Property 3.
For every BUM function $Q$, if $u^- = u^+ = a$ and $v^- = v^+ = b$, then $CO_{f,Q}([u^-,u^+],[v^-,v^+]) = (a, b)$.
Proof. 
Since u = u + = a , v = v + = b and 0 1 d Q ( u ) d u d u = 1 , 0 1 d Q ( v ) d v d v = 1 (for all u , v [ 0 , 1 ] ), we can get
0 1 d Q ( u ) d x ( f ( u + ) ( f ( u + ) f ( u ) ) x ) d x 0 1 ( d Q ( u ) d u f ( u + ) ( f ( u + ) f ( u ) ) x ) d x = 0 1 d Q ( u ) d x f ( a ) d x 0 1 ( d Q ( u ) d u f ( a ) ) d x = f ( a ) 0 1 d Q ( u ) d x d x 1 f ( a ) 0 1 ( d Q ( u ) d u ) d x = ( f ( a ) ) 2 ,
In very similar way, we can also get
0 1 d Q ( v ) d v ( f ( v ) + ( f ( v + ) f ( v ) ) v ) d v 0 1 ( d Q ( v ) d v f ( v ) + ( f ( v + ) f ( v ) ) v ) d v = ( f ( b ) ) 2 .
So, we have C O f , Q ( [ u , u + ] , [ v , v + ] ) = ( f 1 ( ( f ( a ) ) 2 ) 1 2 , f 1 ( ( f ( b ) ) 2 ) 1 2 ) = ( a , b ) . □
Property 4.
If $Q_1(u) \ge Q_2(u)$ and $Q_1(v) \ge Q_2(v)$ for all $u, v \in [0,1]$, then $CO_{f,Q_1}([u^-,u^+],[v^-,v^+]) \ge CO_{f,Q_2}([u^-,u^+],[v^-,v^+])$.
Proof. 
For every BUM function $Q$, integration by parts gives:
$$\int_0^1 \frac{dQ(u)}{du} \left( f(u^+) - ( f(u^+) - f(u^-) ) u \right) du = \left[ \left( f(u^+) - ( f(u^+) - f(u^-) ) u \right) Q(u) \right]_0^1 + ( f(u^+) - f(u^-) ) \int_0^1 Q(u)\, du = f(u^-) + ( f(u^+) - f(u^-) ) \int_0^1 Q(u)\, du.$$
By a similar process, the following equation can be obtained:
$$\int_0^1 \frac{dQ(v)}{dv} \left( f(v^-) + ( f(v^+) - f(v^-) ) v \right) dv = f(v^+) - ( f(v^+) - f(v^-) ) \int_0^1 Q(v)\, dv.$$
Moreover, we have
$$\int_0^1 \frac{dQ(u)}{du} \cdot \frac{1}{f(u^+) - ( f(u^+) - f(u^-) ) u}\, du = \left[ \frac{Q(u)}{f(u^+) - ( f(u^+) - f(u^-) ) u} \right]_0^1 - \int_0^1 \frac{f(u^+) - f(u^-)}{\left( f(u^+) - ( f(u^+) - f(u^-) ) u \right)^2}\, Q(u)\, du = \frac{1}{f(u^-)} - \int_0^1 \frac{f(u^+) - f(u^-)}{\left( f(u^+) - ( f(u^+) - f(u^-) ) u \right)^2}\, Q(u)\, du$$
and
$$\int_0^1 \frac{dQ(v)}{dv} \cdot \frac{1}{f(v^-) + ( f(v^+) - f(v^-) ) v}\, dv = \frac{1}{f(v^+)} + \int_0^1 \frac{f(v^+) - f(v^-)}{\left( f(v^-) + ( f(v^+) - f(v^-) ) v \right)^2}\, Q(v)\, dv.$$
(a) When $f$ is strictly monotonic increasing, $f(u^-) \le f(u^+)$.
Since $Q_1(u) \ge Q_2(u)$ for all $u \in [0,1]$, we have $\int_0^1 Q_1(u)\, du \ge \int_0^1 Q_2(u)\, du$. It holds that
$$f(u^-) + ( f(u^+) - f(u^-) ) \int_0^1 Q_1(u)\, du \ge f(u^-) + ( f(u^+) - f(u^-) ) \int_0^1 Q_2(u)\, du.$$
Likewise, we have
$$\frac{1}{f(u^-)} - \int_0^1 \frac{f(u^+) - f(u^-)}{\left( f(u^+) - ( f(u^+) - f(u^-) ) u \right)^2}\, Q_1(u)\, du \le \frac{1}{f(u^-)} - \int_0^1 \frac{f(u^+) - f(u^-)}{\left( f(u^+) - ( f(u^+) - f(u^-) ) u \right)^2}\, Q_2(u)\, du,$$
which can be written as
$$\int_0^1 \frac{dQ_1(u)}{du} \left( f(u^+) - ( f(u^+) - f(u^-) ) u \right) du \ge \int_0^1 \frac{dQ_2(u)}{du} \left( f(u^+) - ( f(u^+) - f(u^-) ) u \right) du$$
and
$$\int_0^1 \frac{dQ_1(u)}{du} \cdot \frac{1}{f(u^+) - ( f(u^+) - f(u^-) ) u}\, du \le \int_0^1 \frac{dQ_2(u)}{du} \cdot \frac{1}{f(u^+) - ( f(u^+) - f(u^-) ) u}\, du.$$
Because $f^{-1}$ is strictly monotonic increasing, we have
$$f^{-1}\!\left( \left( \frac{\int_0^1 \frac{dQ_1(u)}{du} \left( f(u^+) - ( f(u^+) - f(u^-) ) u \right) du}{\int_0^1 \frac{dQ_1(u)}{du} \cdot \frac{1}{f(u^+) - ( f(u^+) - f(u^-) ) u}\, du} \right)^{1/2} \right) \ge f^{-1}\!\left( \left( \frac{\int_0^1 \frac{dQ_2(u)}{du} \left( f(u^+) - ( f(u^+) - f(u^-) ) u \right) du}{\int_0^1 \frac{dQ_2(u)}{du} \cdot \frac{1}{f(u^+) - ( f(u^+) - f(u^-) ) u}\, du} \right)^{1/2} \right).$$
By a similar process, we get
$$f^{-1}\!\left( \left( \frac{\int_0^1 \frac{dQ_1(v)}{dv} \left( f(v^-) + ( f(v^+) - f(v^-) ) v \right) dv}{\int_0^1 \frac{dQ_1(v)}{dv} \cdot \frac{1}{f(v^-) + ( f(v^+) - f(v^-) ) v}\, dv} \right)^{1/2} \right) \le f^{-1}\!\left( \left( \frac{\int_0^1 \frac{dQ_2(v)}{dv} \left( f(v^-) + ( f(v^+) - f(v^-) ) v \right) dv}{\int_0^1 \frac{dQ_2(v)}{dv} \cdot \frac{1}{f(v^-) + ( f(v^+) - f(v^-) ) v}\, dv} \right)^{1/2} \right).$$
Thus, we have $CO_{f,Q_1}([u^-,u^+],[v^-,v^+]) \ge CO_{f,Q_2}([u^-,u^+],[v^-,v^+])$.
(b) When $f$ is strictly monotonic decreasing, $f(u^-) \ge f(u^+)$. Since $Q_1(u) \ge Q_2(u) \ge 0$, the intermediate inequalities reverse:
$$\int_0^1 \frac{dQ_1(u)}{du} \left( f(u^+) - ( f(u^+) - f(u^-) ) u \right) du \le \int_0^1 \frac{dQ_2(u)}{du} \left( f(u^+) - ( f(u^+) - f(u^-) ) u \right) du,$$
$$\int_0^1 \frac{dQ_1(u)}{du} \cdot \frac{1}{f(u^+) - ( f(u^+) - f(u^-) ) u}\, du \ge \int_0^1 \frac{dQ_2(u)}{du} \cdot \frac{1}{f(u^+) - ( f(u^+) - f(u^-) ) u}\, du.$$
Because $f^{-1}$ is also strictly monotonic decreasing, we get
$$f^{-1}\!\left( \left( \frac{\int_0^1 \frac{dQ_1(u)}{du} \left( f(u^+) - ( f(u^+) - f(u^-) ) u \right) du}{\int_0^1 \frac{dQ_1(u)}{du} \cdot \frac{1}{f(u^+) - ( f(u^+) - f(u^-) ) u}\, du} \right)^{1/2} \right) \ge f^{-1}\!\left( \left( \frac{\int_0^1 \frac{dQ_2(u)}{du} \left( f(u^+) - ( f(u^+) - f(u^-) ) u \right) du}{\int_0^1 \frac{dQ_2(u)}{du} \cdot \frac{1}{f(u^+) - ( f(u^+) - f(u^-) ) u}\, du} \right)^{1/2} \right).$$
Similarly, we have
$$f^{-1}\!\left( \left( \frac{\int_0^1 \frac{dQ_1(v)}{dv} \left( f(v^-) + ( f(v^+) - f(v^-) ) v \right) dv}{\int_0^1 \frac{dQ_1(v)}{dv} \cdot \frac{1}{f(v^-) + ( f(v^+) - f(v^-) ) v}\, dv} \right)^{1/2} \right) \le f^{-1}\!\left( \left( \frac{\int_0^1 \frac{dQ_2(v)}{dv} \left( f(v^-) + ( f(v^+) - f(v^-) ) v \right) dv}{\int_0^1 \frac{dQ_2(v)}{dv} \cdot \frac{1}{f(v^-) + ( f(v^+) - f(v^-) ) v}\, dv} \right)^{1/2} \right).$$
Therefore, we have $CO_{f,Q_1}([u^-,u^+],[v^-,v^+]) \ge CO_{f,Q_2}([u^-,u^+],[v^-,v^+])$.
In summary, Property 4 is proved. □
Through the above proofs, the C-OOWIFQ operator possesses several desirable properties: boundedness, monotonicity, idempotency, and monotonicity with respect to the BUM function $Q$. The C-OOWIFQ operator aggregates all points in a closed interval and also takes the preferences of experts into account, so clustering based on the C-OOWIFQ operator can analyze objects more comprehensively and effectively. The proposed IIFC algorithm thus improves the reliability of the analysis results.
Theorem 1.
Let ( [ u , u + ] , [ v , v + ] ) be an interval intuitionistic fuzzy number. For any strictly monotonic function f and BUM function Q , C O f , Q ( [ u , u + ] , [ v , v + ] ) is an intuitionistic fuzzy number.
Proof. 
It is easy to see that $u^-, v^- \ge 0$ and $u^+ + v^+ \le 1$. By Property 1 we have
$$0 \le u^- \le f^{-1}\!\left( \left( \frac{\int_0^1 \frac{dQ(u)}{du} \left( f(u^+) - ( f(u^+) - f(u^-) ) u \right) du}{\int_0^1 \frac{dQ(u)}{du} \cdot \frac{1}{f(u^+) - ( f(u^+) - f(u^-) ) u}\, du} \right)^{1/2} \right) \le u^+,$$
$$0 \le v^- \le f^{-1}\!\left( \left( \frac{\int_0^1 \frac{dQ(v)}{dv} \left( f(v^-) + ( f(v^+) - f(v^-) ) v \right) dv}{\int_0^1 \frac{dQ(v)}{dv} \cdot \frac{1}{f(v^-) + ( f(v^+) - f(v^-) ) v}\, dv} \right)^{1/2} \right) \le v^+.$$
It follows that the sum of the two components of $CO_{f,Q}([u^-,u^+],[v^-,v^+])$ is at most $u^+ + v^+ \le 1$.
Therefore, C O f , Q ( [ u , u + ] , [ v , v + ] ) is an intuitionistic fuzzy number. □

4. Distance Measure of Interval Intuitionistic Fuzzy Numbers Based on Symmetric Information Entropy

An interval intuitionistic fuzzy number describes uncertainty in a fuzzy setting. Since entropy can effectively measure uncertainty, we propose an interval intuitionistic fuzzy distance measure based on symmetric information entropy, as follows:
Theorem 2.
Let $\tilde{a} = ([u_A^-, u_A^+],[v_A^-, v_A^+])$ and $\tilde{b} = ([u_B^-, u_B^+],[v_B^-, v_B^+])$ be two interval intuitionistic fuzzy numbers. The distance measure between $\tilde{a}$ and $\tilde{b}$ is:
$$d(\tilde{a}, \tilde{b}) = \frac{1}{2 \ln 2} \left( \left| \hat{u}_A - \hat{u}_B \right| \ln \frac{\left| \hat{u}_A - \hat{u}_B \right| + \left| \hat{v}_A - \hat{v}_B \right|}{\left| \hat{u}_A - \hat{u}_B \right|} + \left| \hat{v}_A - \hat{v}_B \right| \ln \frac{\left| \hat{u}_A - \hat{u}_B \right| + \left| \hat{v}_A - \hat{v}_B \right|}{\left| \hat{v}_A - \hat{v}_B \right|} \right)$$
where $\hat{u}_A = \underline{C}_{f,Q}(\tilde{u}_A) + \underline{C}_{f,Q}(\tilde{u}_A)\, \pi_A$, $\hat{v}_A = \overline{C}_{f,Q}(\tilde{v}_A) + \overline{C}_{f,Q}(\tilde{v}_A)\, \pi_A$ and $\pi_A = 1 - \underline{C}_{f,Q}(\tilde{u}_A) - \overline{C}_{f,Q}(\tilde{v}_A)$, with $\hat{u}_B$, $\hat{v}_B$ and $\pi_B$ defined analogously. Denoting $\Delta\hat{u} = | \hat{u}_A - \hat{u}_B |$ and $\Delta\hat{v} = | \hat{v}_A - \hat{v}_B |$, Equation (22) can be abbreviated to the following formula:
$$d(\tilde{a}, \tilde{b}) = \frac{1}{2 \ln 2} \left( \Delta\hat{u} \ln \frac{\Delta\hat{u} + \Delta\hat{v}}{\Delta\hat{u}} + \Delta\hat{v} \ln \frac{\Delta\hat{u} + \Delta\hat{v}}{\Delta\hat{v}} \right)$$
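The abbreviated formula above is straightforward to compute once $\Delta\hat{u}$ and $\Delta\hat{v}$ are known. The sketch below is ours (function name hypothetical) and adopts the usual entropy convention $0 \cdot \ln(\cdot) = 0$ for degenerate differences:

```python
import math

def entropy_distance(du, dv):
    """Symmetric-information-entropy distance:
    d = (1 / (2 ln 2)) * (du * ln((du+dv)/du) + dv * ln((du+dv)/dv)),
    where du = |u_hat_A - u_hat_B| and dv = |v_hat_A - v_hat_B|
    come from the C-OOWIFQ aggregates; 0 * ln(.) is taken as 0."""
    s = du + dv
    if s == 0.0:
        return 0.0
    term = lambda x: x * math.log(s / x) if x > 0 else 0.0
    return (term(du) + term(dv)) / (2.0 * math.log(2.0))

print(entropy_distance(1.0, 1.0))  # ≈ 1.0, the maximum
print(entropy_distance(0.2, 0.2))  # ≈ 0.2
```

A handy sanity check: for equal differences $\Delta\hat{u} = \Delta\hat{v} = x$ the formula collapses to $d = x$, so the distance interpolates linearly between 0 and its maximum of 1.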
Theorem 3.
Let $\tilde{a} = ([u_A^-, u_A^+],[v_A^-, v_A^+])$, $\tilde{b} = ([u_B^-, u_B^+],[v_B^-, v_B^+])$ and $\tilde{c} = ([u_C^-, u_C^+],[v_C^-, v_C^+])$ be three interval intuitionistic fuzzy numbers. The distance measure $d$ satisfies the following properties:
(a) $0 \le d(\tilde{a}, \tilde{b}) \le 1$.
(b) $d(\tilde{a}, \tilde{b}) = 1$ if and only if the interval intuitionistic fuzzy numbers $\tilde{a}$ and $\tilde{b}$ reduce to $\tilde{a} = (1, 0)$ and $\tilde{b} = (0, 1)$, or $\tilde{a} = (0, 1)$ and $\tilde{b} = (1, 0)$.
(c) $d(\tilde{a}, \tilde{b}) = d(\tilde{b}, \tilde{a})$.
(d) If $\tilde{a} \le \tilde{b} \le \tilde{c}$, then $d(\tilde{a}, \tilde{b}) \le d(\tilde{a}, \tilde{c})$ and $d(\tilde{b}, \tilde{c}) \le d(\tilde{a}, \tilde{c})$.
Proof. 
For the proposed distance measure, four properties (a)–(d) should be satisfied, and the proof is as follows:
Taking the partial derivative of Equation (23) with respect to $\Delta\hat{u}$, we obtain
$$\frac{\partial d(\tilde{a},\tilde{b})}{\partial \Delta\hat{u}} = \frac{1}{2\ln 2} \left( \ln \frac{\Delta\hat{u}+\Delta\hat{v}}{\Delta\hat{u}} + \Delta\hat{u} \left( \frac{1}{\Delta\hat{u}+\Delta\hat{v}} - \frac{1}{\Delta\hat{u}} \right) + \frac{\Delta\hat{v}}{\Delta\hat{u}+\Delta\hat{v}} \right) = \frac{1}{2\ln 2} \ln \left( 1 + \frac{\Delta\hat{v}}{\Delta\hat{u}} \right) > 0.$$
Similarly, we get
$$\frac{\partial d(\tilde{a},\tilde{b})}{\partial \Delta\hat{v}} = \frac{1}{2\ln 2} \ln \left( 1 + \frac{\Delta\hat{u}}{\Delta\hat{v}} \right) > 0.$$
Thus, $d(\tilde{a}, \tilde{b})$ is an increasing function of the independent variables $\Delta\hat{u}$ and $\Delta\hat{v}$.
(a) From Equation (23), we have
$$d(\tilde{a}, \tilde{b}) = \frac{1}{2 \ln 2} \left( \Delta\hat{u} \ln \frac{\Delta\hat{u} + \Delta\hat{v}}{\Delta\hat{u}} + \Delta\hat{v} \ln \frac{\Delta\hat{u} + \Delta\hat{v}}{\Delta\hat{v}} \right).$$
Since $\Delta\hat{u} \ge 0$ and $\Delta\hat{v} \ge 0$, we have $\frac{\Delta\hat{u} + \Delta\hat{v}}{\Delta\hat{u}} \ge 1$ and $\frac{\Delta\hat{u} + \Delta\hat{v}}{\Delta\hat{v}} \ge 1$, so $d(\tilde{a}, \tilde{b}) \ge 0$ holds. Since $d(\tilde{a}, \tilde{b})$ is an increasing function of $\Delta\hat{u}$ and $\Delta\hat{v}$, it attains its maximum of 1 when $\Delta\hat{u} = \Delta\hat{v} = 1$.
(b) The proof of this property can be analyzed in the following two cases.
(i) $d(\tilde{a}, \tilde{b})$ is an increasing function of the independent variables $\Delta\hat{u}$ and $\Delta\hat{v}$, so the maximum value $d(\tilde{a}, \tilde{b}) = 1$ is obtained when $\Delta\hat{u} = 1$ and $\Delta\hat{v} = 1$. From $\Delta\hat{u} = 1$ and $\Delta\hat{v} = 1$, we get that $\tilde{a} = (1, 0)$ and $\tilde{b} = (0, 1)$, or $\tilde{a} = (0, 1)$ and $\tilde{b} = (1, 0)$.
(ii) When $\tilde{a} = (1, 0)$ and $\tilde{b} = (0, 1)$, or $\tilde{a} = (0, 1)$ and $\tilde{b} = (1, 0)$, substituting $\Delta\hat{u} = \Delta\hat{v} = 1$ into Equation (23) gives $d(\tilde{a}, \tilde{b}) = 1$.
(c) According to Equation (23), the symmetry of the distance measure obviously holds.
(d) When $\tilde{a} \le \tilde{b} \le \tilde{c}$, from Definition 3 we have $u_A^- \le u_B^- \le u_C^-$, $u_A^+ \le u_B^+ \le u_C^+$, $v_A^- \ge v_B^- \ge v_C^-$ and $v_A^+ \ge v_B^+ \ge v_C^+$. Denote $\Delta\hat{u}_{\tilde{a},\tilde{b}} = |\hat{u}_A - \hat{u}_B|$, $\Delta\hat{v}_{\tilde{a},\tilde{b}} = |\hat{v}_A - \hat{v}_B|$, $\Delta\hat{u}_{\tilde{a},\tilde{c}} = |\hat{u}_A - \hat{u}_C|$ and $\Delta\hat{v}_{\tilde{a},\tilde{c}} = |\hat{v}_A - \hat{v}_C|$, respectively. It is clear that $\Delta\hat{u}_{\tilde{a},\tilde{b}} \le \Delta\hat{u}_{\tilde{a},\tilde{c}}$ and $\Delta\hat{v}_{\tilde{a},\tilde{b}} \le \Delta\hat{v}_{\tilde{a},\tilde{c}}$. Therefore, we get $d(\tilde{a}, \tilde{b}) \le d(\tilde{a}, \tilde{c})$. In the same way, $d(\tilde{b}, \tilde{c}) \le d(\tilde{a}, \tilde{c})$ also holds. □

5. Interval Intuitionistic Fuzzy Clustering Algorithm

For a multiple-attribute clustering problem with interval intuitionistic fuzzy information, let $G = \{G_1, G_2, \ldots, G_m\}$ be the set of $m$ objects and $H = \{H_1, H_2, \ldots, H_n\}$ the attribute set. In real life, the membership and non-membership degrees of an attribute for an object sometimes cannot be measured exactly; since repeated measurements often fluctuate within a certain interval, interval intuitionistic fuzzy numbers handle this problem well. Let the attribute matrix be $\tilde{A} = (\tilde{a}_{ij})_{m \times n}$, where $\tilde{a}_{ij} = ([u_{ij}^-, u_{ij}^+],[v_{ij}^-, v_{ij}^+])$ $(i = 1, 2, \ldots, m;\ j = 1, 2, \ldots, n)$. Here $[u_{ij}^-, u_{ij}^+]$ indicates that object $G_i$ possesses attribute $H_j$ with a membership degree between $u_{ij}^-$ and $u_{ij}^+$ and a non-membership degree between $v_{ij}^-$ and $v_{ij}^+$.
Based on the C-OOWIFQ operator and the symmetric information entropy based distance measure, we propose the following interval intuitionistic fuzzy clustering (IIFC) algorithm. The flow chart of the IIFC algorithm is illustrated in Figure 1.
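Step 2 below aggregates each interval-valued entry into a point intuitionistic fuzzy value. Equation (12) is not reproduced in this section; under the assumption that with the BUM function Q(x) = x and generator f(x) = eˣ the C-OOWIFQ operator reduces to the exponential quasi-arithmetic mean f⁻¹((1/(b − a)) ∫ₐᵇ f(t) dt) applied to each bound pair, a minimal sketch is:

```python
import math

def exp_quasi_mean(a, b):
    # Exponential quasi-arithmetic mean of the interval [a, b]:
    # ln( (1/(b-a)) * integral_a^b e^t dt ) = ln( (e^b - e^a) / (b - a) ).
    if a == b:                     # a degenerate interval aggregates to itself
        return a
    return math.log((math.exp(b) - math.exp(a)) / (b - a))

def aggregate(ivifn):
    # ([u-, u+], [v-, v+])  ->  intuitionistic fuzzy pair (u, v)
    (ul, uu), (vl, vu) = ivifn
    return (exp_quasi_mean(ul, uu), exp_quasi_mean(vl, vu))

# Sample 1, attribute OM from Table 1: ([0.1, 0.5], [0.0, 0.1])
u, v = aggregate(([0.1, 0.5], [0.0, 0.1]))
print(round(u, 2), round(v, 2))   # 0.31 0.05
```

Under this assumption the result matches the corresponding OM entry (0.31, 0.05) of Table 2 to two decimals; the exact Equation (12) may weight the interval differently, so this reduction should be read as an illustrative check rather than the paper's definition.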
Step 1
Select a strictly monotonic function f(x) and a BUM function Q(x), and input the cluster number k and the sample attribute matrix Ã = (ã_ij)_{m×n}, where ã_ij = ([u_ij⁻, u_ij⁺], [v_ij⁻, v_ij⁺]) (i = 1, 2, …, m; j = 1, 2, …, n).
Step 2
Calculate b_ij = CO_{f,Q}([u_ij⁻, u_ij⁺], [v_ij⁻, v_ij⁺]) by Equation (12), obtaining the intuitionistic fuzzy matrix B̃ = (b_ij)_{m×n}.
Step 3
Select a random sample from B ˜ as the initial clustering center c 1 .
Step 4
First, calculate the shortest distance between each object and the existing cluster centers, denoted by D(x); then calculate the probability D(x)² / Σ_{x∈X} D(x)² that each object is selected as the next cluster center. Finally, select a cluster center by the roulette method.
Step 5
Repeat Step 4 until k cluster centers are selected.
Step 6
For each object x_i = {b_i1, b_i2, …, b_in} in B̃, calculate its symmetric entropy distance to each of the k cluster centers by Equation (23), and assign it to the class of the cluster center with the smallest distance.
Step 7
For each cluster c_i, recalculate its cluster center c_i = (1/|c_i|) Σ_{x∈c_i} x.
Step 8
Repeat Steps 6 and 7 until the positions of the cluster centers no longer change.
Step 9
End.
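Steps 1–9 above can be sketched compactly. The seeding in Steps 3–5 is the k-means++-style roulette scheme with selection probability D(x)² / Σ D(x)². Since Equation (23) is not reproduced here, the distance below is a Euclidean placeholder on lists of (u, v) pairs; substituting the symmetric entropy distance yields the IIFC algorithm proper.

```python
import math
import random

def distance(x, y):
    # Placeholder for the symmetric entropy distance of Equation (23):
    # any nonnegative symmetric distance on lists of (u, v) pairs fits here.
    return math.sqrt(sum((u1 - u2) ** 2 + (v1 - v2) ** 2
                         for (u1, v1), (u2, v2) in zip(x, y)))

def seed_centers(B, k, rng):
    # Steps 3-5: first center uniformly at random, then roulette selection
    # with probability D(x)^2 / sum_x D(x)^2 until k centers are chosen.
    centers = [list(rng.choice(B))]
    while len(centers) < k:
        d2 = [min(distance(x, c) for c in centers) ** 2 for x in B]
        r, acc = rng.random() * sum(d2), 0.0
        for x, w in zip(B, d2):
            acc += w
            if acc >= r:
                centers.append(list(x))
                break
    return centers

def mean_center(members):
    # Step 7: componentwise mean of the (u, v) pairs over a cluster.
    n = len(members)
    return [(sum(x[j][0] for x in members) / n,
             sum(x[j][1] for x in members) / n)
            for j in range(len(members[0]))]

def iifc(B, k, seed=0, max_iter=100):
    rng = random.Random(seed)
    centers = seed_centers(B, k, rng)
    labels = None
    for _ in range(max_iter):                      # Steps 6-8
        new = [min(range(k), key=lambda c: distance(x, centers[c])) for x in B]
        if new == labels:                          # stable assignments imply
            break                                  # stable centers: stop
        labels = new
        for c in range(k):
            members = [x for x, l in zip(B, labels) if l == c]
            if members:
                centers[c] = mean_center(members)
    return labels, centers
```

A usage example on a toy matrix with one attribute per object: `iifc([[(0.9, 0.0)], [(0.8, 0.1)], [(0.1, 0.8)], [(0.0, 0.9)]], 2)` separates the first two samples from the last two. The stopping test compares assignments rather than center positions, which is equivalent here since centers are recomputed from the assignments.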

6. Numerical Example

This article focuses on the theoretical research of the algorithm. The IIFC algorithm can be applied to many fields such as data mining, image segmentation, feature extraction, and soil attribute analysis. This paper verifies the feasibility and effectiveness of the IIFC algorithm through a clustering example. The soil composition can be measured by conventional soil agrochemical analysis methods. Assume that the following experimental sample attributes are selected: total nitrogen (TN), total phosphorus (TP), organic matter (OM), available nitrogen (AN), available phosphorus (AP), and available potassium (AK). Suppose that fifteen soil samples are considered here; their attribute values are given as interval intuitionistic fuzzy numbers, shown in Table 1.
According to the steps of IIFC algorithm, the specific clustering process is analyzed as follows.
Step 1
Input the sample attribute matrix A ˜ , set f ( x ) = e x , Q ( x ) = x , and the classification number k = 4 .
Step 2
Calculate each b i j with Equation (12), and the calculated result is B ˜ = ( b i j ) m × n .
Step 3
The resulting intuitionistic fuzzy matrix B̃ is shown in Table 2.
Step 4
Randomly select a sample from B ˜ as the initial cluster center No. 1.
Step 5
Calculate the shortest distance between each object and the existing cluster centers, denoted by D(x); then calculate the probability D(x)² / Σ_{x∈X} D(x)² that each object is selected as the next cluster center, and select a cluster center by the roulette method.
Step 6
Repeat Step 5 until k cluster centers are selected. The initial cluster centers are shown in Table 3.
Step 7
For each object x_i = {b_i1, b_i2, …, b_in} in B̃, the symmetric entropy distance to each of the k cluster centers is calculated by Equation (23), and the object is assigned to the class of the cluster center with the smallest distance.
Step 8
For each cluster c_i, recalculate its cluster center c_i = (1/|c_i|) Σ_{x∈c_i} x.
After three iterations, the cluster centers no longer change; the final cluster centers are shown in Table 4.
The soil samples were successfully divided into four categories: (a) very poor: 2, 5, 11; (b) poor: 4, 6, 12, 15; (c) fertile: 1, 3, 8, 10; (d) very fertile: 7, 9, 13, 14. The clustering results derived by the IIFC algorithm are illustrated in Figure 2. By analyzing the soil composition of each sample, effective fertilization guidance can be given on a scientific basis.

7. Conclusions

In this paper, a new continuous optimal aggregation operator based on Chi-squared deviation is proposed, which can effectively convert interval intuitionistic fuzzy numbers into intuitionistic fuzzy numbers. The distance between interval intuitionistic fuzzy numbers is computed by a newly constructed distance measure based on symmetric information entropy. The C-OOWIFQ operator and this distance measure are applied to soil clustering. The main advantages of this paper are as follows:
(1)
Compared with traditional clustering and fuzzy clustering, interval intuitionistic fuzzy clustering describes the fuzzy nature of things in finer detail.
(2)
The symmetric information entropy based distance measure considers all the information in the continuous interval. Thus, the distortion and loss of information are avoided, and the result is more accurate and effective.
(3)
The C-OOWIFQ operator takes into account the preferences of decision makers.
In addition, the IIFC algorithm can effectively solve soil clustering problems. In follow-up studies, we will apply the distance measure and symmetric information entropy to pattern recognition, data mining, medical diagnosis, and other fields.

Author Contributions

J.L.: conceived and designed the study, project administration, and wrote the manuscript; G.D.: investigation, methodology; Z.T.: software, investigation. All authors have read and agreed to the published version of the manuscript.

Funding

This work was supported by the National Natural Science Foundation of China (No. 71601049), the Humanities and Social Sciences Fund of the Ministry of Education (No. 16YJC630064), the Foundation of Beijing Intelligent Logistics System Collaborative Innovation Center (BILSCIC-2019KF-16) and the University Training Program in Scientific Research for Outstanding Young Talents of Fujian Province: cost sharing strategy of incomplete cooperative game and its application in water pollution control.

Conflicts of Interest

The authors declare no conflict of interest.

Figure 1. The flow chart of interval intuitionistic fuzzy clustering (IIFC) algorithm.
Figure 2. The clustering results derived by IIFC algorithm.
Table 1. The attribute values of the soil samples.

| No. | TN | TP | OM | AN | AP | AK |
|---|---|---|---|---|---|---|
| 1 | ([0.6,0.6],[0.3,0.3]) | ([0.1,0.7],[0.2,0.3]) | ([0.1,0.5],[0.0,0.1]) | ([0.3,0.3],[0.1,0.4]) | ([0.3,0.4],[0.1,0.1]) | ([0.2,0.7],[0.0,0.1]) |
| 2 | ([0.1,0.1],[0.5,0.5]) | ([0.6,0.9],[0.1,0.1]) | ([0.1,0.1],[0.1,0.1]) | ([0.2,0.2],[0.1,0.2]) | ([0.3,0.3],[0.0,0.3]) | ([0.0,0.3],[0.2,0.2]) |
| 3 | ([0.2,0.4],[0.1,0.2]) | ([0.6,0.8],[0.0,0.1]) | ([0.0,0.7],[0.0,0.1]) | ([0.0,0.0],[0.1,0.1]) | ([0.5,0.8],[0.1,0.2]) | ([0.5,0.9],[0.0,0.0]) |
| 4 | ([0.4,0.6],[0.2,0.2]) | ([0.3,0.4],[0.1,0.3]) | ([0.5,0.6],[0.2,0.3]) | ([0.0,0.1],[0.1,0.1]) | ([0.4,0.9],[0.1,0.1]) | ([0.0,0.1],[0.1,0.5]) |
| 5 | ([0.0,0.0],[0.6,0.7]) | ([0.0,0.5],[0.0,0.0]) | ([0.0,0.1],[0.0,0.3]) | ([0.3,0.4],[0.1,0.2]) | ([0.5,0.8],[0.0,0.0]) | ([0.1,0.3],[0.0,0.0]) |
| 6 | ([0.1,0.6],[0.3,0.4]) | ([0.2,0.3],[0.1,0.6]) | ([0.0,0.8],[0.2,0.2]) | ([0.5,0.5],[0.3,0.5]) | ([0.2,0.2],[0.3,0.4]) | ([0.0,0.3],[0.3,0.4]) |
| 7 | ([0.6,0.6],[0.1,0.1]) | ([0.2,0.4],[0.0,0.1]) | ([0.1,0.2],[0.0,0.1]) | ([0.1,0.2],[0.2,0.3]) | ([0.4,0.7],[0.0,0.2]) | ([0.1,0.2],[0.1,0.1]) |
| 8 | ([0.4,0.5],[0.0,0.1]) | ([0.0,0.1],[0.6,0.8]) | ([0.1,0.9],[0.0,0.0]) | ([0.2,0.4],[0.3,0.6]) | ([0.0,0.5],[0.0,0.1]) | ([0.3,0.9],[0.0,0.1]) |
| 9 | ([0.6,0.7],[0.1,0.2]) | ([0.0,0.6],[0.0,0.0]) | ([0.3,0.8],[0.0,0.1]) | ([0.1,0.1],[0.3,0.6]) | ([0.0,0.0],[0.0,0.2]) | ([0.1,0.6],[0.1,0.1]) |
| 10 | ([0.2,0.4],[0.0,0.2]) | ([0.0,0.7],[0.1,0.1]) | ([0.2,0.2],[0.2,0.4]) | ([0.5,0.8],[0.0,0.1]) | ([0.6,0.9],[0.0,0.1]) | ([0.6,0.8],[0.0,0.0]) |
| 11 | ([0.0,0.1],[0.3,0.6]) | ([0.4,0.9],[0.0,0.0]) | ([0.1,0.1],[0.0,0.0]) | ([0.3,0.5],[0.1,0.5]) | ([0.2,0.2],[0.1,0.6]) | ([0.1,0.1],[0.1,0.5]) |
| 12 | ([0.3,0.3],[0.4,0.6]) | ([0.1,0.1],[0.1,0.2]) | ([0.5,1.0],[0.0,0.0]) | ([0.4,0.7],[0.1,0.1]) | ([0.1,0.6],[0.4,0.4]) | ([0.1,0.2],[0.2,0.6]) |
| 13 | ([0.7,0.8],[0.0,0.0]) | ([0.2,0.9],[0.0,0.0]) | ([0.1,0.5],[0.0,0.1]) | ([0.0,0.1],[0.5,0.8]) | ([0.0,0.0],[0.3,0.7]) | ([0.0,0.1],[0.4,0.5]) |
| 14 | ([0.4,0.4],[0.4,0.5]) | ([0.5,1.0],[0.0,0.0]) | ([0.7,0.8],[0.1,0.1]) | ([0.3,0.4],[0.1,0.6]) | ([0.1,0.1],[0.5,0.7]) | ([0.0,0.3],[0.1,0.3]) |
| 15 | ([0.3,0.5],[0.1,0.3]) | ([0.4,0.6],[0.2,0.3]) | ([0.2,0.6],[0.0,0.0]) | ([0.3,0.5],[0.1,0.2]) | ([0.3,0.4],[0.0,0.1]) | ([0.0,0.0],[0.0,0.1]) |
Table 2. Intuitionistic fuzzy matrix B̃.

| No. | TN | TP | OM | AN | AP | AK |
|---|---|---|---|---|---|---|
| 1 | (0.60,0.30) | (0.43,0.25) | (0.31,0.05) | (0.30,0.26) | (0.35,0.10) | (0.47,0.05) |
| 2 | (0.10,0.50) | (0.76,0.10) | (0.10,0.10) | (0.20,0.15) | (0.30,0.16) | (0.16,0.20) |
| 3 | (0.30,0.15) | (0.70,0.05) | (0.39,0.05) | (0.00,0.10) | (0.66,0.15) | (0.71,0.00) |
| 4 | (0.50,0.20) | (0.35,0.20) | (0.55,0.25) | (0.05,0.10) | (0.67,0.10) | (0.05,0.31) |
| 5 | (0.00,0.65) | (0.27,0.00) | (0.05,0.16) | (0.35,0.15) | (0.66,0.00) | (0.20,0.00) |
| 6 | (0.37,0.35) | (0.25,0.37) | (0.45,0.20) | (0.50,0.40) | (0.20,0.35) | (0.16,0.35) |
| 7 | (0.60,0.10) | (0.30,0.05) | (0.15,0.05) | (0.15,0.25) | (0.56,0.10) | (0.15,0.10) |
| 8 | (0.45,0.05) | (0.05,0.70) | (0.55,0.00) | (0.30,0.46) | (0.27,0.05) | (0.63,0.05) |
| 9 | (0.65,0.15) | (0.33,0.00) | (0.57,0.05) | (0.10,0.46) | (0.00,0.10) | (0.37,0.10) |
| 10 | (0.30,0.10) | (0.39,0.10) | (0.20,0.30) | (0.66,0.05) | (0.76,0.05) | (0.70,0.00) |
| 11 | (0.05,0.46) | (0.67,0.00) | (0.10,0.00) | (0.40,0.31) | (0.20,0.37) | (0.10,0.31) |
| 12 | (0.30,0.50) | (0.10,0.15) | (0.77,0.00) | (0.56,0.10) | (0.37,0.40) | (0.15,0.41) |
| 13 | (0.75,0.00) | (0.59,0.00) | (0.31,0.05) | (0.05,0.66) | (0.00,0.51) | (0.05,0.45) |
| 14 | (0.40,0.45) | (0.77,0.00) | (0.75,0.10) | (0.35,0.37) | (0.10,0.60) | (0.16,0.20) |
| 15 | (0.40,0.20) | (0.50,0.25) | (0.41,0.00) | (0.40,0.15) | (0.35,0.05) | (0.00,0.05) |
Table 3. Initial cluster centers.

| No. | TN | TP | OM | AN | AP | AK |
|---|---|---|---|---|---|---|
| 1 | (0.60,0.30) | (0.43,0.25) | (0.31,0.05) | (0.30,0.26) | (0.35,0.10) | (0.47,0.05) |
| 2 | (0.50,0.20) | (0.35,0.20) | (0.55,0.25) | (0.05,0.10) | (0.67,0.10) | (0.05,0.31) |
| 3 | (0.05,0.46) | (0.67,0.00) | (0.10,0.00) | (0.40,0.31) | (0.20,0.37) | (0.10,0.31) |
| 4 | (0.30,0.15) | (0.70,0.05) | (0.39,0.05) | (0.00,0.10) | (0.66,0.15) | (0.71,0.00) |
Table 4. The final cluster centers.

| No. | TN | TP | OM | AN | AP | AK |
|---|---|---|---|---|---|---|
| 1 | (0.54,0.16) | (0.32,0.25) | (0.40,0.03) | (0.25,0.31) | (0.31,0.08) | (0.32,0.07) |
| 2 | (0.54,0.18) | (0.40,0.19) | (0.44,0.17) | (0.20,0.39) | (0.29,0.32) | (0.09,0.37) |
| 3 | (0.21,0.48) | (0.57,0.06) | (0.43,0.05) | (0.38,0.23) | (0.24,0.38) | (0.14,0.28) |
| 4 | (0.20,0.30) | (0.45,0.05) | (0.21,0.17) | (0.34,0.10) | (0.69,0.07) | (0.54,0.00) |

Lin, J.; Duan, G.; Tian, Z. Interval Intuitionistic Fuzzy Clustering Algorithm Based on Symmetric Information Entropy. Symmetry 2020, 12, 79. https://doi.org/10.3390/sym12010079
