Article

A New Intuitionistic Fuzzy Entropy and Application in Multi-Attribute Decision Making

Manfeng Liu and Haiping Ren
1 School of Information Management, Jiangxi University of Finance and Economics, Nanchang 330013, China
2 School of Software, Jiangxi University of Science and Technology, Nanchang 330013, China
* Author to whom correspondence should be addressed.
† These authors contributed equally to this work.
Information 2014, 5(4), 587-601; https://doi.org/10.3390/info5040587
Submission received: 7 August 2014 / Revised: 1 November 2014 / Accepted: 4 November 2014 / Published: 11 November 2014

Abstract

In this paper, we first put forward a new intuitionistic fuzzy (IF) entropy measure, which takes into account both the deviation between membership and non-membership degrees and the hesitancy degree of an IF set. A comparison with existing entropy measures shows the advantage of the new measure. Second, based on the new entropy measure, we propose a decision making method for multi-attribute decision making problems in which attribute values are expressed as IF values. For the cases in which the attribute weights are completely unknown and in which they are partially known, two methods are constructed to determine the weights: one extends the ordinary entropy weight method, and the other builds an optimization model according to the minimum entropy principle. Finally, two practical examples are given to illustrate the effectiveness and practicability of the proposed method.

1. Introduction

For multi-attribute decision making problems, such as supplier selection, material selection in manufacturing and the evaluation of a firm’s safety performance, many factors must be considered simultaneously. This makes the problem complex, and it is difficult to find the best solution. In many situations, crisp data are inadequate or insufficient for modeling realistic decision problems [1,2], because the problems are vague or fuzzy in nature and cannot be represented by crisp numbers. In these cases, a better way to model human judgments is to adopt fuzzy sets or extended fuzzy sets, such as interval numbers, triangular fuzzy numbers or intuitionistic fuzzy (IF) sets, which are extensions of Zadeh’s fuzzy set [3,4,5,6,7,8]. The IF set, first proposed by Atanassov [9], extends Zadeh’s fuzzy set and seems more suitable than crisp numbers, fuzzy sets or linguistic variables for expressing a decision maker’s satisfaction and/or dissatisfaction degrees [10,11,12,13]. Gau and Buehrer [14] defined vague sets in 1993, and Bustince and Burillo [15] pointed out that the notion of vague sets coincides with that of IF sets. Many studies also show that the IF set is a useful tool for handling imprecise data and vague expressions in a way that is more natural than rigid mathematical rules and equations. Consequently, many IF multi-attribute decision making (MADM) methods have been developed to deal with such situations [16,17,18].
Entropy is an effective measure for depicting the fuzziness of a fuzzy set. Zadeh [19] first introduced the entropy of a fuzzy event in 1968. Later, in 1972, De Luca and Termini [20] gave the definition of fuzzy entropy and proposed a fuzzy entropy measure based on Shannon’s function. Since then, many authors have recognized the importance of entropy and have constructed fuzzy entropy measures from different viewpoints [21,22,23]. Burillo and Bustince [24] introduced an entropy measure into IF theory for measuring the degree of fuzziness or the uncertain information of IF sets, and the study and application of IF entropy has since attracted considerable attention. Zhang and Jiang [25] defined an IF entropy measure by generalizing the De Luca and Termini [20] logarithmic fuzzy entropy; Ye [26] proposed two IF entropy measures using trigonometric functions; Verma and Sharma [27] defined an exponential IF entropy measure by generalizing the Pal and Pal [23] exponential fuzzy entropy. However, all of the above IF entropy measures consider only the deviation between membership and non-membership degrees and ignore the effect of the hesitancy degree of the IF set. Some authors have already recognized this shortcoming and proposed new IF entropy measures; for example, Wei et al. [28] proposed an IF entropy measure using a cosine function, and Wang et al. [29] proposed an IF entropy measure using a cotangent function.
In this paper, we put forward a new IF entropy measure, which considers not only the deviation between membership and non-membership degrees but also the effect of the hesitancy degree of the IF set. TOPSIS (Technique for Order Preference by Similarity to Ideal Solution) is one of the important techniques for dealing with MADM problems. It simultaneously considers the shortest distance from a positive ideal solution (PIS) and the farthest distance from a negative ideal solution (NIS), and the alternatives are ranked according to their relative closeness coefficients [30,31]. TOPSIS has been widely applied to traditional crisp and fuzzy MADM problems [32,33,34]. Based on the proposed IF entropy measure and TOPSIS, we give a new MADM method. The rest of this paper is organized as follows: In Section 2, the basic definitions and notation of IF sets are reviewed. In Section 3, the new IF entropy is constructed, and its advantages over other IF entropy measures are analyzed. In Section 4, an intuitionistic fuzzy MADM method is put forward, in which the attribute weights are obtained from the proposed IF entropy measure. Two examples are given in Section 5. Finally, conclusions are drawn in Section 6.

2. Preliminaries

Definition 1 [9]. Suppose that $X$ is a given universal set. A set $A$ is called an IF set if
$$A = \{\langle x_i, \mu_A(x_i), \nu_A(x_i)\rangle \mid x_i \in X\},$$
where the functions $\mu_A: X \to [0,1]$ and $\nu_A: X \to [0,1]$ give the membership degree and non-membership degree of $x_i$, and for every $x_i \in X$, $0 \le \mu_A(x_i) + \nu_A(x_i) \le 1$. Furthermore, $\pi_A(x_i) = 1 - \mu_A(x_i) - \nu_A(x_i)$ is called the IF index or hesitancy degree of $x_i$. For convenience, if there is only one element in $X$, we call $A$ an IF number, abbreviated as $A = (\mu_A, \nu_A)$.
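Since the hesitancy degree $\pi_A$ is used throughout the paper, a minimal Python sketch of an IF number may help fix the notation; the class name IFN and the validity check are illustrative choices made here, not part of the original text.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class IFN:
    """An intuitionistic fuzzy number (mu, nu) with 0 <= mu, nu and mu + nu <= 1."""
    mu: float  # membership degree
    nu: float  # non-membership degree

    def __post_init__(self):
        if not (0.0 <= self.mu <= 1.0 and 0.0 <= self.nu <= 1.0 and self.mu + self.nu <= 1.0):
            raise ValueError("invalid IF number: need 0 <= mu, nu <= 1 and mu + nu <= 1")

    @property
    def pi(self) -> float:
        """Hesitancy degree pi = 1 - mu - nu."""
        return 1.0 - self.mu - self.nu

print(IFN(0.4, 0.1).pi)  # 0.5: half of the information about x is undecided
```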
Definition 2. Suppose that $A = \{\langle x_i, \mu_A(x_i), \nu_A(x_i)\rangle \mid x_i \in X\}$ and $B = \{\langle x_i, \mu_B(x_i), \nu_B(x_i)\rangle \mid x_i \in X\}$ are two IF sets; then the following operations can be found in [9,10]:
(1) $A \subseteq B$ if and only if $\mu_A(x_i) \le \mu_B(x_i)$ and $\nu_A(x_i) \ge \nu_B(x_i)$ for all $x_i \in X$;
(2) $A = B$ if and only if $A \subseteq B$ and $B \subseteq A$;
(3) The complementary set of $A$, denoted by $A^C$, is $A^C = \{\langle x_i, \nu_A(x_i), \mu_A(x_i)\rangle \mid x_i \in X\}$;
(4) $A \preceq B$, i.e., $A$ is less fuzzy than $B$, if for all $x_i \in X$:
  • if $\mu_B(x_i) \le \nu_B(x_i)$, then $\mu_A(x_i) \le \mu_B(x_i)$ and $\nu_A(x_i) \ge \nu_B(x_i)$;
  • if $\mu_B(x_i) \ge \nu_B(x_i)$, then $\mu_A(x_i) \ge \mu_B(x_i)$ and $\nu_A(x_i) \le \nu_B(x_i)$.
Definition 3 [35]. Suppose that $A = \{\langle x_i, \mu_A(x_i), \nu_A(x_i)\rangle \mid x_i \in X\}$ and $B = \{\langle x_i, \mu_B(x_i), \nu_B(x_i)\rangle \mid x_i \in X\}$ are two IF sets and the weight of $x_i$ is $w_i$; then the weighted Hamming distance between $A$ and $B$ is defined as:
$$d(A,B) = \frac{1}{2}\sum_{i=1}^{n} w_i \big(|\mu_A(x_i)-\mu_B(x_i)| + |\nu_A(x_i)-\nu_B(x_i)| + |\pi_A(x_i)-\pi_B(x_i)|\big)$$
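As a quick illustration, Equation (2) transcribes directly into Python; the function names and the list-of-pairs encoding of an IF set are choices made here, not notation from the paper.

```python
def hesitancy(mu: float, nu: float) -> float:
    """Hesitancy degree pi = 1 - mu - nu of an IF value (mu, nu)."""
    return 1.0 - mu - nu

def weighted_hamming(A, B, w):
    """Weighted Hamming distance between two IF sets given as lists of (mu, nu) pairs.

    A and B are defined over the same universe x_1, ..., x_n and w holds the weights w_i.
    """
    return 0.5 * sum(
        w_i * (abs(mu_a - mu_b)
               + abs(nu_a - nu_b)
               + abs(hesitancy(mu_a, nu_a) - hesitancy(mu_b, nu_b)))
        for w_i, (mu_a, nu_a), (mu_b, nu_b) in zip(w, A, B)
    )

# Distance of a single IF value (0.45, 0.35) from the "ideal" value (1, 0), with weight 1:
print(weighted_hamming([(0.45, 0.35)], [(1.0, 0.0)], [1.0]))  # 0.55
```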
Szmidt and Kacprzyk [24] first axiomatized the intuitionistic fuzzy entropy measure in 2001, extending the De Luca and Termini axioms [20] formulated in 1972 for fuzzy sets. The axioms of the intuitionistic fuzzy entropy measure are formulated in the following way.
Definition 4 [24]. A map $E: \mathrm{IFS}(X) \to [0,1]$ is called an IF entropy if it satisfies the following properties:
(i) $E(A) = 0$ if and only if $A$ is a crisp set;
(ii) $E(A) = 1$ if and only if $\mu_A(x_i) = \nu_A(x_i)$ for all $x_i \in X$;
(iii) $E(A) = E(A^C)$;
(iv) if $A \preceq B$, then $E(A) \le E(B)$.

3. A New Effective Intuitionistic Fuzzy Entropy

First, we review several existing IF entropy measures from the literature.
(1) Zhang and Jiang’s IF entropy measure [25]:
$$E_1(A) = -\frac{1}{n}\sum_{i=1}^{n}\left[\frac{\mu_A(x_i)+1-\nu_A(x_i)}{2}\log\frac{\mu_A(x_i)+1-\nu_A(x_i)}{2} + \frac{\nu_A(x_i)+1-\mu_A(x_i)}{2}\log\frac{\nu_A(x_i)+1-\mu_A(x_i)}{2}\right]$$
(2) Ye’s IF entropy measures [26]:
$$E_{Y1}(A) = \frac{1}{n}\sum_{i=1}^{n}\left[\left(\sin\frac{\pi\,(\mu_A(x_i)+1-\nu_A(x_i))}{4} + \sin\frac{\pi\,(\nu_A(x_i)+1-\mu_A(x_i))}{4} - 1\right)\times\frac{1}{\sqrt{2}-1}\right]$$
$$E_{Y2}(A) = \frac{1}{n}\sum_{i=1}^{n}\left[\left(\cos\frac{\pi\,(\mu_A(x_i)+1-\nu_A(x_i))}{4} + \cos\frac{\pi\,(\nu_A(x_i)+1-\mu_A(x_i))}{4} - 1\right)\times\frac{1}{\sqrt{2}-1}\right]$$
Wei et al. [28] have shown that the above two IF entropy formulas (3) and (4) are mathematically identical and have given the simplified version:
$$E_2(A) = \frac{1}{n}\sum_{i=1}^{n}\left[\left(\sqrt{2}\,\cos\frac{\pi\,(\mu_A(x_i)-\nu_A(x_i))}{4} - 1\right)\times\frac{1}{\sqrt{2}-1}\right]$$
(3) Verma and Sharma’s exponential IF entropy measure [27]:
$$E_3(A) = \frac{1}{n(\sqrt{e}-1)}\sum_{i=1}^{n}\left[\frac{\mu_A(x_i)+1-\nu_A(x_i)}{2}\,e^{\,1-\frac{\mu_A(x_i)+1-\nu_A(x_i)}{2}} + \frac{\nu_A(x_i)+1-\mu_A(x_i)}{2}\,e^{\,1-\frac{\nu_A(x_i)+1-\mu_A(x_i)}{2}} - 1\right]$$
All of the above IF entropy measures consider only the deviation between membership and non-membership degrees and ignore the effect of the hesitation degree of the IF set. For example, for the two IF sets $A_1 = (0.4, 0.1)$ and $A_2 = (0.6, 0.3)$, $A_1$ is obviously fuzzier than $A_2$ in a real assessment. However, the entropy measures $E_1$, $E_2$ and $E_3$ assign them the same entropy value, which is not consistent with the true situation. Some authors have already recognized this disadvantage and proposed new IF entropy measures, such as the IF entropy measure proposed by Wei et al. [28]:
$$E_4(A) = \frac{1}{n}\sum_{i=1}^{n}\cos\frac{(\mu_A(x_i)-\nu_A(x_i))\,\pi}{4\,(1+\pi_A(x_i))}$$
and the IF entropy measure proposed by Wang and Wang [29]:
$$E_5(A) = \frac{1}{n}\sum_{i=1}^{n}\cot\left(\frac{\pi}{4} + \frac{|\mu_A(x_i)-\nu_A(x_i)|\,\pi}{4\,(1+\pi_A(x_i))}\right)$$
In this paper, we also devote ourselves to the development of IF entropy measures and construct a new IF entropy as follows:
$$E(A) = \frac{1}{n}\sum_{i=1}^{n}\cot\left(\frac{\pi}{4} + \frac{|\mu_A^2(x_i)-\nu_A^2(x_i)|\,\pi}{4}\right)$$
Since $\mu_A(x_i)+\nu_A(x_i) = 1-\pi_A(x_i)$, the new IF entropy measure can equivalently be written as
$$E(A) = \frac{1}{n}\sum_{i=1}^{n}\cot\left(\frac{\pi}{4} + \frac{|\mu_A(x_i)-\nu_A(x_i)|\,(1-\pi_A(x_i))\,\pi}{4}\right),$$
which considers not only the deviation between the membership and non-membership degrees but also the hesitancy degree of the IF set.
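The new measure is straightforward to compute; the following sketch (function names are ours) evaluates Equation (9) and checks numerically that the rewritten form above gives the same values.

```python
import math

def new_if_entropy(ifs):
    """Proposed IF entropy, Equation (9): mean of cot(pi/4 + |mu^2 - nu^2| * pi/4)."""
    total = 0.0
    for mu, nu in ifs:
        angle = math.pi / 4 + abs(mu**2 - nu**2) * math.pi / 4
        total += 1.0 / math.tan(angle)  # cot(x) = 1 / tan(x)
    return total / len(ifs)

def new_if_entropy_alt(ifs):
    """Equivalent rewritten form, using |mu - nu| * (1 - pi_A) = |mu^2 - nu^2|."""
    total = 0.0
    for mu, nu in ifs:
        pi_a = 1.0 - mu - nu
        angle = math.pi / 4 + abs(mu - nu) * (1.0 - pi_a) * math.pi / 4
        total += 1.0 / math.tan(angle)
    return total / len(ifs)

A1, A2 = [(0.4, 0.1)], [(0.6, 0.3)]
print(new_if_entropy(A1), new_if_entropy(A2))                    # A1 comes out fuzzier than A2
print(abs(new_if_entropy(A1) - new_if_entropy_alt(A1)) < 1e-12)  # the two forms coincide
```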
Theorem 1. The measure given by Equation (9) is an IF entropy.
Proof. To prove that the measure given by Equation (9) is an IF entropy, we only need to show that it satisfies the properties in Definition 4. Obviously, for every $x_i$ we have
$$\frac{\pi}{4} \le \frac{\pi}{4} + \frac{|\mu_A^2(x_i)-\nu_A^2(x_i)|\,\pi}{4} \le \frac{\pi}{2},$$
and therefore
$$0 \le \cot\left(\frac{\pi}{4} + \frac{|\mu_A^2(x_i)-\nu_A^2(x_i)|\,\pi}{4}\right) \le 1,$$
so that $0 \le E(A) \le 1$.
(i) Let $A$ be a crisp set, i.e., for every $x_i \in X$, either $\mu_A(x_i)=0,\ \nu_A(x_i)=1$ or $\mu_A(x_i)=1,\ \nu_A(x_i)=0$. It is obvious that $E(A)=0$.
Conversely, if $E(A) = \frac{1}{n}\sum_{i=1}^{n}\cot\left(\frac{\pi}{4} + \frac{|\mu_A^2(x_i)-\nu_A^2(x_i)|\,\pi}{4}\right) = 0$, then for every $x_i \in X$ we have
$$\cot\left(\frac{\pi}{4} + \frac{|\mu_A^2(x_i)-\nu_A^2(x_i)|\,\pi}{4}\right) = 0,$$
so $|\mu_A^2(x_i)-\nu_A^2(x_i)| = 1$, which gives $\mu_A(x_i)=0,\ \nu_A(x_i)=1$ or $\mu_A(x_i)=1,\ \nu_A(x_i)=0$. Therefore $A$ is a crisp set.
(ii) If $\mu_A(x_i) = \nu_A(x_i)$ for all $x_i \in X$, then from Equation (9) we have $E(A)=1$.
Conversely, assume that $E(A)=1$; then for all $x_i \in X$ we have
$$\cot\left(\frac{\pi}{4} + \frac{|\mu_A^2(x_i)-\nu_A^2(x_i)|\,\pi}{4}\right) = 1,$$
so $|\mu_A^2(x_i)-\nu_A^2(x_i)| = 0$, and we obtain $\mu_A(x_i) = \nu_A(x_i)$ for all $x_i \in X$.
(iii) By $A^C = \{\langle x_i, \nu_A(x_i), \mu_A(x_i)\rangle \mid x_i \in X\}$ and Equation (9), we have
$$E(A^C) = \frac{1}{n}\sum_{i=1}^{n}\cot\left(\frac{\pi}{4} + \frac{|\nu_A^2(x_i)-\mu_A^2(x_i)|\,\pi}{4}\right) = \frac{1}{n}\sum_{i=1}^{n}\cot\left(\frac{\pi}{4} + \frac{|\mu_A^2(x_i)-\nu_A^2(x_i)|\,\pi}{4}\right) = E(A).$$
(iv) Construct the function
$$f(x,y) = \cot\left(\frac{\pi}{4} + \frac{|x^2-y^2|\,\pi}{4}\right),$$
where $x, y \in [0,1]$.
When $x \le y$, we have $f(x,y) = \cot\left(\frac{\pi}{4} + \frac{(y^2-x^2)\,\pi}{4}\right)$, and we need to show that $f(x,y)$ is increasing in $x$ and decreasing in $y$.
The partial derivatives of $f(x,y)$ with respect to $x$ and $y$ are easily derived as
$$\frac{\partial f(x,y)}{\partial x} = \frac{\pi x}{2}\,\csc^2\left(\frac{\pi}{4} + \frac{(y^2-x^2)\,\pi}{4}\right),$$
$$\frac{\partial f(x,y)}{\partial y} = -\frac{\pi y}{2}\,\csc^2\left(\frac{\pi}{4} + \frac{(y^2-x^2)\,\pi}{4}\right).$$
Hence, when $x \le y$, we have $\frac{\partial f(x,y)}{\partial x} \ge 0$ and $\frac{\partial f(x,y)}{\partial y} \le 0$, so $f(x,y)$ is increasing in $x$ and decreasing in $y$. Thus, when $\mu_B(x_i) \le \nu_B(x_i)$, $\mu_A(x_i) \le \mu_B(x_i)$ and $\nu_A(x_i) \ge \nu_B(x_i)$, we have $f(\mu_A(x_i), \nu_A(x_i)) \le f(\mu_B(x_i), \nu_B(x_i))$.
Similarly, when $x \ge y$, we can show that $\frac{\partial f(x,y)}{\partial x} \le 0$ and $\frac{\partial f(x,y)}{\partial y} \ge 0$, so $f(x,y)$ is decreasing in $x$ and increasing in $y$; thus, when $\mu_B(x_i) \ge \nu_B(x_i)$, $\mu_A(x_i) \ge \mu_B(x_i)$ and $\nu_A(x_i) \le \nu_B(x_i)$, we have $f(\mu_A(x_i), \nu_A(x_i)) \le f(\mu_B(x_i), \nu_B(x_i))$.
Therefore, if $A \preceq B$, then $\frac{1}{n}\sum_{i=1}^{n} f(\mu_A(x_i), \nu_A(x_i)) \le \frac{1}{n}\sum_{i=1}^{n} f(\mu_B(x_i), \nu_B(x_i))$, i.e., $E(A) \le E(B)$.
Example 1. Let $A = \{\langle x_i, \mu_A(x_i), \nu_A(x_i)\rangle \mid x_i \in X\}$ be an IF set in $X = \{x_1, x_2, \ldots, x_n\}$. For any positive real number $n$, De et al. [36] defined the IF set $A^n$ as follows:
$$A^n = \{\langle x_i, [\mu_A(x_i)]^n, 1-[1-\nu_A(x_i)]^n\rangle \mid x_i \in X\}$$
We consider the IF set $A$ in $X = \{6, 7, 8, 9, 10\}$ defined in De et al. [36] as
$$A = \{\langle 6, 0.1, 0.8\rangle, \langle 7, 0.3, 0.5\rangle, \langle 8, 0.6, 0.2\rangle, \langle 9, 0.9, 0.0\rangle, \langle 10, 1.0, 0.0\rangle\}.$$
Taking into consideration the characterization of linguistic variables, De et al. [36] regarded $A$ as “LARGE” on $X$. Using the above operation, we have:
$A^{1/2}$ may be treated as “More or less LARGE”;
$A^2$ may be treated as “Very LARGE”;
$A^3$ may be treated as “Quite very LARGE”;
$A^4$ may be treated as “Very very LARGE”.
Now we use these IF sets to compare the above entropy measures. From a logical point of view, the entropies of these IF sets are required to follow the pattern:
$$E(A^{1/2}) > E(A) > E(A^2) > E(A^3) > E(A^4)$$
The numerical values of the six entropy measures for these IF sets are given in Table 1.
Table 1. Values of the different entropy measures for $A^{1/2}$, $A$, $A^2$, $A^3$ and $A^4$.

            E1      E2      E3      E4      E5      E
A^{1/2}   0.3786  0.5016  0.5106  0.8660  0.3645  0.3686
A         0.3810  0.4939  0.5054  0.8685  0.3564  0.3633
A^2       0.3160  0.3953  0.4065  0.8437  0.3339  0.3407
A^3       0.2700  0.3330  0.3438  0.8263  0.2512  0.2643
A^4       0.2403  0.2938  0.3044  0.8147  0.2142  0.2313
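The ordering in Table 1 can be checked with a short self-contained script (our own sketch); the printed values of the proposed entropy $E$ should reproduce the last column of Table 1 up to rounding.

```python
import math

# The IF set "LARGE" on X = {6, 7, 8, 9, 10} from De et al. [36].
A = [(0.1, 0.8), (0.3, 0.5), (0.6, 0.2), (0.9, 0.0), (1.0, 0.0)]

def power(ifs, n):
    """De et al.'s operation A^n: (mu, nu) -> (mu^n, 1 - (1 - nu)^n)."""
    return [(mu**n, 1.0 - (1.0 - nu)**n) for mu, nu in ifs]

def E(ifs):
    """Proposed entropy, Equation (9)."""
    return sum(1.0 / math.tan(math.pi / 4 + abs(mu**2 - nu**2) * math.pi / 4)
               for mu, nu in ifs) / len(ifs)

for label, n in [("A^1/2", 0.5), ("A", 1), ("A^2", 2), ("A^3", 3), ("A^4", 4)]:
    print(label, round(E(power(A, n)), 4))
# Expected ordering: E(A^1/2) > E(A) > E(A^2) > E(A^3) > E(A^4)
```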
From Table 1, we see that $E_2$, $E_3$, $E_5$ and $E$ (the new IF entropy) perform well, since they satisfy the ordering in Equation (15), whereas $E_1$ and $E_4$ do not. For further comparison of these entropy measures, another example is given in Example 2.
Example 2. Suppose that $X = \{x\}$ and consider four IF sets, which can also be seen as IF numbers: $A_1 = (0.4, 0.1)$, $A_2 = (0.6, 0.3)$, $A_3 = (0.2, 0.6)$, and $A_4 = (0.13, 0.565)$. Intuitively, $A_1$ is fuzzier than $A_2$, and $A_4$ is fuzzier than $A_3$. The numerical values of the six entropy measures are shown in Table 2.
Table 2. Values of the different entropy measures for $A_i$ ($i = 1, 2, 3, 4$).

        E1      E2      E3      E4      E5      E
A1    0.6474  0.9057  0.9138  0.9877  0.7265  0.7883
A2    0.6474  0.9057  0.9138  0.9771  0.6427  0.6457
A3    0.6109  0.8329  0.8463  0.9659  0.5774  0.5914
A4    0.5953  0.8027  0.8180  0.9659  0.5774  0.6103
From Table 2, we see that the entropies of $A_1$ and $A_2$ are equal under the measures $E_1$, $E_2$ and $E_3$, which is not consistent with our intuition. The reason is that these entropy measures do not include the hesitation degree in their formulas, although the hesitation degree is an important source of uncertainty in an IF set. The measures $E_4$, $E_5$ and $E$ give the correct result because they take the effect of the hesitation degree into account. For $A_3$ and $A_4$, the measures $E_4$ and $E_5$ cannot distinguish them, while the proposed entropy $E$ gives $E(A_4) > E(A_3)$, which agrees with our intuition.
According to the above two examples, the proposed entropy measure performs better than $E_1$, $E_2$, $E_3$, $E_4$ and $E_5$. Furthermore, the new measure considers both aspects of an IF set (i.e., the uncertainty depicted by the deviation between membership and non-membership, and the unknown part reflected by the hesitation degree [29]), and thus it is a good entropy measure for IF sets.

4. Intuitionistic Fuzzy MADM Method Based on the New IF Entropy

For a MADM problem, suppose that $A = \{A_1, A_2, \ldots, A_m\}$ is a set of $m$ alternatives and $O = \{o_1, o_2, \ldots, o_n\}$ is a set of $n$ attributes. Suppose that the alternative set consists of $m$ non-inferior alternatives from which the most desirable one is to be selected. The rating of alternative $A_i \in A$ on attribute $o_j \in O$ is expressed by the IF number $\tilde{a}_{ij} = (\mu_{ij}, \nu_{ij})$, where $\mu_{ij}$ and $\nu_{ij}$ are the membership (satisfaction) and non-membership (dissatisfaction) degrees of the alternative $A_i \in A$ on the attribute $o_j \in O$ with respect to the fuzzy concept “excellence” given by the decision maker, satisfying $0 \le \mu_{ij} \le 1$, $0 \le \nu_{ij} \le 1$ and $0 \le \mu_{ij} + \nu_{ij} \le 1$ ($i = 1, 2, \ldots, m$; $j = 1, 2, \ldots, n$).
In MADM problems, the IF values can be obtained according to Liu and Wang [37] as follows. To obtain the degrees to which alternative $A_i$ satisfies and does not satisfy attribute $o_j$ ($i = 1, 2, \ldots, m$; $j = 1, 2, \ldots, n$), we use a statistical method. Suppose we invite $n$ experts to make the judgment. Each expert is asked to answer “yes”, “no” or “I do not know” to the question of whether alternative $A_i$ satisfies attribute $o_j$. Let $n_Y(i,j)$ and $n_N(i,j)$ denote the numbers of “yes” and “no” answers, respectively, among the $n$ experts. Then the degrees to which alternative $A_i$ satisfies and does not satisfy attribute $o_j$ are calculated as
$$\mu_{ij} = \frac{n_Y(i,j)}{n} \quad \text{and} \quad \nu_{ij} = \frac{n_N(i,j)}{n}.$$
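As a small illustration of this counting step (names are ours): if, say, 10 experts are consulted and 6 answer “yes”, 3 answer “no” and 1 answers “I do not know”, the rating becomes the IF value (0.6, 0.3) with hesitancy degree 0.1.

```python
def rating_from_votes(n_yes: int, n_no: int, n_experts: int):
    """IF value (mu, nu) from expert votes; abstentions end up in the hesitancy degree."""
    assert 0 <= n_yes + n_no <= n_experts
    return n_yes / n_experts, n_no / n_experts

mu, nu = rating_from_votes(6, 3, 10)
print(mu, nu, 1.0 - mu - nu)  # 0.6 0.3 0.1
```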
Thus, a MADM problem can be expressed by the decision matrix $D = (\tilde{a}_{ij})_{m \times n}$ as follows:
$$D = (\tilde{a}_{ij})_{m \times n} = \begin{pmatrix} (\mu_{11},\nu_{11}) & (\mu_{12},\nu_{12}) & \cdots & (\mu_{1n},\nu_{1n}) \\ (\mu_{21},\nu_{21}) & (\mu_{22},\nu_{22}) & \cdots & (\mu_{2n},\nu_{2n}) \\ \vdots & \vdots & \ddots & \vdots \\ (\mu_{m1},\nu_{m1}) & (\mu_{m2},\nu_{m2}) & \cdots & (\mu_{mn},\nu_{mn}) \end{pmatrix},$$
where the rows correspond to the alternatives $A_1, \ldots, A_m$ and the columns to the attributes $o_1, \ldots, o_n$.
Let $w = (w_1, w_2, \ldots, w_n)^T$ be the weight vector of the attributes, where $w_j$ ($0 \le w_j \le 1$) is the weight of attribute $o_j \in O$ ($j = 1, 2, \ldots, n$) and $\sum_{j=1}^{n} w_j = 1$. The attribute weight information is usually unknown or only partially known, owing to the decision makers’ insufficient knowledge or limited time in the decision making process. Therefore, determining the attribute weights is an important issue in MADM problems in which the weights are partially known or unknown. In this paper, we put forward two methods to determine the attribute weights for these two cases.

4.1. MADM Problem with Unknown Attribute Weights Information

Chen et al. [38] and Ye [39] discussed intuitionistic fuzzy MADM problems with unknown attribute weights using IF entropy measures. Based on their work, when the attribute weights are completely unknown, we can use the proposed IF entropy to determine them:
$$w_j = \frac{1-e_j}{n - \sum_{j=1}^{n} e_j}, \quad j = 1, 2, \ldots, n,$$
where $e_j = \frac{1}{m}\sum_{i=1}^{m} E(\tilde{a}_{ij})$ and $E(\tilde{a}_{ij}) = \cot\left(\frac{\pi}{4} + \frac{|\mu_{ij}^2-\nu_{ij}^2|\,\pi}{4}\right)$ is the IF entropy of $\tilde{a}_{ij} = (\mu_{ij}, \nu_{ij})$.
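The weight formula above can be sketched as follows (a hypothetical implementation of ours); applied to the decision matrix of Table 3 in Example 3 below, it should return approximately the weight vector (0.3349, 0.3573, 0.3078) reported there.

```python
import math

def entropy_ifn(mu: float, nu: float) -> float:
    """Proposed IF entropy of a single IF value: cot(pi/4 + |mu^2 - nu^2| * pi/4)."""
    return 1.0 / math.tan(math.pi / 4 + abs(mu**2 - nu**2) * math.pi / 4)

def entropy_weights(D):
    """Attribute weights w_j = (1 - e_j) / (n - sum_k e_k) for an m x n IF decision matrix."""
    m, n = len(D), len(D[0])
    e = [sum(entropy_ifn(*D[i][j]) for i in range(m)) / m for j in range(n)]
    denom = n - sum(e)
    return [(1.0 - e_j) / denom for e_j in e]

# Decision matrix of Example 3 (Table 3): rows = alternatives A1..A4, columns = o1..o3.
D = [[(0.45, 0.35), (0.50, 0.30), (0.20, 0.55)],
     [(0.65, 0.25), (0.65, 0.25), (0.55, 0.15)],
     [(0.45, 0.35), (0.55, 0.35), (0.55, 0.20)],
     [(0.75, 0.15), (0.65, 0.20), (0.35, 0.15)]]
print([round(w, 4) for w in entropy_weights(D)])  # roughly [0.3349, 0.3573, 0.3078]
```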

4.2. MADM Problem with Partially Known Attribute Weights Information

In general, there are additional constraint conditions on the weight vector $w = (w_1, w_2, \ldots, w_n)$. We denote by $H$ the set of known weight information. To determine the attribute weights of a MADM problem with partially known weights under an intuitionistic fuzzy environment, Xu [40] proposed an optimization model based on Chen and Tan’s score function [41], while Wu and Zhang [42] and Wang and Wang [29] determined the attribute weights by establishing programming models according to the minimum entropy principle. In this paper, we use the new IF entropy measure to determine the attribute weights, in a manner similar to Wang and Wang [29]. The specific process is as follows.
To rank the alternatives according to the decision matrix $D = (\tilde{a}_{ij})_{m \times n}$, we obtain the attribute weight vector by means of the proposed IF entropy measure. The entropy measure describes the degree of fuzziness and intuitionism: the smaller the intuitionistic fuzzy entropy, the smaller the fuzziness of the attribute evaluation information, and hence the more certain the decision information. We can therefore use the principle of minimum entropy to obtain the attribute weight vector by solving the following programming model:
$$\begin{aligned} \min\ & E(A_i) = \sum_{j=1}^{n} w_j E(\tilde{a}_{ij}) = \sum_{j=1}^{n} w_j \cot\left(\frac{\pi}{4} + \frac{|\mu_{ij}^2-\nu_{ij}^2|\,\pi}{4}\right) \\ \text{s.t.}\ & \sum_{j=1}^{n} w_j = 1, \quad w \in H \end{aligned}$$
Because the alternatives compete fairly, the weight coefficient of the same attribute should be equal across alternatives, so we obtain the following single optimization model:
$$\begin{aligned} \min\ & E = \sum_{i=1}^{m} E(A_i) = \sum_{i=1}^{m}\sum_{j=1}^{n} w_j \cot\left(\frac{\pi}{4} + \frac{|\mu_{ij}^2-\nu_{ij}^2|\,\pi}{4}\right) \\ \text{s.t.}\ & \sum_{j=1}^{n} w_j = 1, \quad w \in H \end{aligned}$$
Hence, by solving Equation (19), the optimal solution $w^* = \arg\min E$ is chosen as the vector of attribute weights.
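Since the objective of the model above is linear in $w$, the minimization over $H$ reduces to a linear program whenever $H$ consists of linear constraints (as in Example 4 below). The following is a hedged sketch using scipy.optimize.linprog, with box constraints standing in for $H$; the function name and interface are ours.

```python
import math
from scipy.optimize import linprog

def entropy_ifn(mu: float, nu: float) -> float:
    """Proposed IF entropy of a single IF value: cot(pi/4 + |mu^2 - nu^2| * pi/4)."""
    return 1.0 / math.tan(math.pi / 4 + abs(mu**2 - nu**2) * math.pi / 4)

def min_entropy_weights(D, bounds):
    """Solve min_w sum_i sum_j w_j * E(a_ij) s.t. sum_j w_j = 1 and bounds on each w_j.

    D is an m x n list of (mu, nu) pairs; bounds is a list of (low, high) tuples, one per attribute.
    """
    m, n = len(D), len(D[0])
    # Objective coefficients: column sums of the entropies of the IF ratings.
    c = [sum(entropy_ifn(*D[i][j]) for i in range(m)) for j in range(n)]
    res = linprog(c, A_eq=[[1.0] * n], b_eq=[1.0], bounds=bounds)
    return res.x

# Applied to Example 4 in Section 5 below, this should reproduce the weights obtained there.
```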

4.3. The New MADM Method Based on the Proposed IF Entropy

In this subsection, we put forward the new MADM method based on the above-mentioned work and the concept of TOPSIS. The specific calculation steps are given as follows:
Step 1. Calculate the attribute weights according to Section 4.1 and Section 4.2;
Step 2. Determine the positive ideal solution (PIS) and negative ideal solution (NIS) of the intuitionistic fuzzy MADM problem.
The PIS is defined as
$$A^* = \big((\mu_1^*, \nu_1^*), (\mu_2^*, \nu_2^*), \ldots, (\mu_n^*, \nu_n^*)\big),$$
where $(\mu_j^*, \nu_j^*) = (1, 0)$ ($j = 1, 2, \ldots, n$).
The NIS is defined as
$$A^- = \big((\mu_1^-, \nu_1^-), (\mu_2^-, \nu_2^-), \ldots, (\mu_n^-, \nu_n^-)\big),$$
where $(\mu_j^-, \nu_j^-) = (0, 1)$ ($j = 1, 2, \ldots, n$).
Step 3. According to the weighted Hamming distance of Definition 3, the distances of alternative $A_i$ from the PIS and the NIS are calculated, respectively, as
$$d(A_i, A^*) = \frac{1}{2}\sum_{j=1}^{n} w_j\big(|\mu_{ij}-\mu_j^*| + |\nu_{ij}-\nu_j^*| + |\pi_{ij}-\pi_j^*|\big) = \frac{1}{2}\sum_{j=1}^{n} w_j\big(|1-\mu_{ij}| + |\nu_{ij}| + |1-\mu_{ij}-\nu_{ij}|\big),$$
$$d(A_i, A^-) = \frac{1}{2}\sum_{j=1}^{n} w_j\big(|\mu_{ij}-\mu_j^-| + |\nu_{ij}-\nu_j^-| + |\pi_{ij}-\pi_j^-|\big) = \frac{1}{2}\sum_{j=1}^{n} w_j\big(|\mu_{ij}| + |1-\nu_{ij}| + |1-\mu_{ij}-\nu_{ij}|\big).$$
Step 4. Calculate the relative closeness coefficient of each alternative.
The closeness coefficient $C_i$ accounts for the distances to the PIS and the NIS simultaneously. It is calculated as
$$C_i = \frac{d(A_i, A^-)}{d(A_i, A^*) + d(A_i, A^-)}.$$
Step 5. Rank the alternatives according to the closeness coefficients $C_i$ in decreasing order. The best alternative is the one closest to the PIS and farthest from the NIS, i.e., the one with the largest $C_i$.
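A compact sketch of Steps 2–5 (our own function names) computes the closeness coefficient of every alternative from a weighted IF decision matrix. Applied with the matrix and weights of Example 3 below, it should yield closeness coefficients close to the values reported there (about 0.4989, 0.6722, 0.5901, 0.6705), although the absolute distances depend on the normalization constant used in the distance measure.

```python
def closeness_coefficients(D, w):
    """Relative closeness C_i of each alternative, using the Hamming distance of
    Definition 3 to the crisp ideal solutions A* = (1, 0) and A- = (0, 1)."""
    def dist(row, ideal):
        mu_s, nu_s = ideal
        return 0.5 * sum(
            w_j * (abs(mu - mu_s) + abs(nu - nu_s)
                   + abs((1.0 - mu - nu) - (1.0 - mu_s - nu_s)))
            for w_j, (mu, nu) in zip(w, row))
    coeffs = []
    for row in D:
        d_pos = dist(row, (1.0, 0.0))  # distance to the PIS
        d_neg = dist(row, (0.0, 1.0))  # distance to the NIS
        coeffs.append(d_neg / (d_pos + d_neg))
    return coeffs

# Example 3 data (Table 3) with the entropy weights of Section 4.1:
D = [[(0.45, 0.35), (0.50, 0.30), (0.20, 0.55)],
     [(0.65, 0.25), (0.65, 0.25), (0.55, 0.15)],
     [(0.45, 0.35), (0.55, 0.35), (0.55, 0.20)],
     [(0.75, 0.15), (0.65, 0.20), (0.35, 0.15)]]
w = [0.3349, 0.3573, 0.3078]
C = closeness_coefficients(D, w)
ranking = sorted(range(len(C)), key=lambda i: C[i], reverse=True)
print([round(c, 4) for c in C], [f"A{i + 1}" for i in ranking])  # A2 ranks first
```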

5. Numerical Examples

In order to illustrate the application of the proposed MADM method, two examples are given as follows:
Example 3. (This is the case in which the attribute weights are completely unknown.) Suppose that a company wants to invest a sum of money in the best option, and there are four candidate alternatives: $A_1$ (a car company), $A_2$ (a food company), $A_3$ (a computer company), and $A_4$ (an arms company). The evaluation attributes are $o_1$ (risk analysis), $o_2$ (growth analysis), and $o_3$ (environmental impact analysis) (this example is adopted from Herrera and Herrera-Viedma [43] and Ye [39]). Using the statistical method described above, we obtain the membership degree $\mu_{ij}$ (the satisfaction degree) and the non-membership degree $\nu_{ij}$ (the dissatisfaction degree) of alternative $A_i$ with respect to attribute $o_j$. The IF decision matrix provided by the experts is shown in Table 3.
Table 3. Intuitionistic fuzzy decision matrix.

Alternative        o1            o2            o3
A1            (0.45, 0.35)  (0.50, 0.30)  (0.20, 0.55)
A2            (0.65, 0.25)  (0.65, 0.25)  (0.55, 0.15)
A3            (0.45, 0.35)  (0.55, 0.35)  (0.55, 0.20)
A4            (0.75, 0.15)  (0.65, 0.20)  (0.35, 0.15)
The calculation steps of the proposed method are given as follows:
Step 1. According to Section 4.1, the attribute weight vector is obtained as
$$w = (w_1, w_2, w_3)^T = (0.3349, 0.3573, 0.3078)^T.$$
Step 2. The PIS ($A^*$) and NIS ($A^-$) are given, respectively, as
$$A^* = \big((\mu_1^*, \nu_1^*), (\mu_2^*, \nu_2^*), (\mu_3^*, \nu_3^*)\big) = \big((1, 0), (1, 0), (1, 0)\big),$$
$$A^- = \big((\mu_1^-, \nu_1^-), (\mu_2^-, \nu_2^-), (\mu_3^-, \nu_3^-)\big) = \big((0, 1), (0, 1), (0, 1)\big).$$
Step 3. The distances of each alternative from the PIS and the NIS are calculated as
$$d(A_1, A^*) = 0.3045,\quad d(A_2, A^*) = 0.1904,\quad d(A_3, A^*) = 0.2417,\quad d(A_4, A^*) = 0.2044,$$
$$d(A_1, A^-) = 0.3032,\quad d(A_2, A^-) = 0.3904,\quad d(A_3, A^-) = 0.3481,\quad d(A_4, A^-) = 0.4161.$$
Step 4. The relative closeness coefficients are calculated as
$$C_1 = 0.4989,\quad C_2 = 0.6722,\quad C_3 = 0.5901,\quad C_4 = 0.6705.$$
Step 5. Therefore, the ranking order of the alternatives is $A_2 \succ A_4 \succ A_3 \succ A_1$, and $A_2$ is the most desirable alternative. This ranking agrees with the result obtained by Ye [39].
Example 4. (This is the case in which the attribute weights are partially known.) This example is adopted from Li [44] and considers an air-condition system selection problem. Suppose that three air-condition systems $A_i$ ($i = 1, 2, 3$) are to be evaluated. The evaluation attributes are $o_1$ (economical), $o_2$ (function), and $o_3$ (operationality). Using the statistical method, we obtain the membership degree $\mu_{ij}$ and non-membership degree $\nu_{ij}$ of alternative $A_i$ with respect to attribute $o_j$. The IF decision matrix provided by the experts is shown in Table 4.
Table 4. Intuitionistic fuzzy decision matrix.

Air-condition system       o1            o2            o3
A1                    (0.75, 0.10)  (0.60, 0.25)  (0.80, 0.20)
A2                    (0.80, 0.15)  (0.68, 0.20)  (0.45, 0.50)
A3                    (0.40, 0.45)  (0.75, 0.05)  (0.60, 0.30)
Assume that the attribute weights are only partially known and satisfy
$$H = \{0.25 \le w_1 \le 0.75,\ 0.35 \le w_2 \le 0.60,\ 0.30 \le w_3 \le 0.35\}.$$
Then the calculation steps of the proposed decision making method are given as follows:
Step 1. According to Equation (19), we establish the following programming model:
$$\begin{aligned} \min\ & E = 1.6119\,w_1 + 1.4631\,w_2 + 1.8986\,w_3 \\ \text{s.t.}\ & 0.25 \le w_1 \le 0.75,\quad 0.35 \le w_2 \le 0.60,\quad 0.30 \le w_3 \le 0.35,\quad w_1 + w_2 + w_3 = 1. \end{aligned}$$
We use MATLAB software to solve this model and obtain the optimal attribute weight vector
$$w = (0.25, 0.45, 0.30)^T.$$
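For readers without MATLAB, this small linear program can also be checked with scipy.optimize.linprog (a sketch of ours using the coefficients stated above); it should return the same weight vector.

```python
from scipy.optimize import linprog

c = [1.6119, 1.4631, 1.8986]                         # objective coefficients from the model above
bounds = [(0.25, 0.75), (0.35, 0.60), (0.30, 0.35)]  # the weight set H
res = linprog(c, A_eq=[[1.0, 1.0, 1.0]], b_eq=[1.0], bounds=bounds)
print([round(x, 4) for x in res.x])  # expected: [0.25, 0.45, 0.3]
```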
Step 2. The PIS ($A^*$) and NIS ($A^-$) are given, respectively, as
$$A^* = \big((\mu_1^*, \nu_1^*), (\mu_2^*, \nu_2^*), (\mu_3^*, \nu_3^*)\big) = \big((1, 0), (1, 0), (1, 0)\big),$$
$$A^- = \big((\mu_1^-, \nu_1^-), (\mu_2^-, \nu_2^-), (\mu_3^-, \nu_3^-)\big) = \big((0, 1), (0, 1), (0, 1)\big).$$
Step 3. The distances of each alternative from the PIS and the NIS are calculated as
$$d(A_1, A^*) = 0.1512,\quad d(A_2, A^*) = 0.1795,\quad d(A_3, A^*) = 0.1913,$$
$$d(A_1, A^-) = 0.4013,\quad d(A_2, A^-) = 0.3612,\quad d(A_3, A^-) = 0.3875.$$
Step 4. The relative closeness coefficients are calculated as
$$C_1 = 0.7262,\quad C_2 = 0.6681,\quad C_3 = 0.6695.$$
Step 5. Based on the $C_i$ values, the ranking of the alternatives in descending order is $A_1 \succ A_3 \succ A_2$, so $A_1$ is the most desirable air-condition system. This result agrees with the ranking in Li [44].

6. Conclusions

IF sets are well suited to describing and handling the uncertain and vague information that occurs in many MADM problems. In this paper, we have proposed a new IF entropy measure that considers not only the deviation between membership and non-membership degrees but also the hesitation degree of the IF set. Comparison with other IF entropy measures shows that the new measure is more reasonable and has clear advantages. The proposed entropy can also be applied in fields such as image processing, pattern recognition and medical diagnosis. Based on the proposed IF entropy measure, a new method for determining attribute weights was put forward and used to solve the multi-attribute decision making problem. Two numerical examples illustrate the feasibility and practicability of the proposed MADM method, which can also be applied to other selection problems, such as the evaluation of project investment risk, site selection and credit evaluation. In future work, we will use the entropy measure to determine the weights of experts in group decision problems under an IF environment, and we will study entropy measures of interval-valued IF sets based on the ideas of this article.

Acknowledgments

This work was supported by the National Natural Science Foundation of China (No. 71263020 and No. 71061006), the Natural Science Foundation of Jiangxi Province (No. 20132BAB211015 and No. 2014BAB201009), the Science and Technology Research Project of the Jiangxi Provincial Education Department (No. GJJ14449), and the Natural Science Foundation of Jiangxi University of Science and Technology (JXUST) (No. NSFJ2014-G38).

Author Contributions

Haiping Ren developed the idea of the article and wrote the manuscript. Manfeng Liu constructed the optimization model for determining the attribute weights. Both authors have read and approved the final manuscript.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Carlsson, C.; Fuller, R. Multiobjective linguistic optimization. Fuzzy Sets Syst. 2000, 115, 5–10. [Google Scholar] [CrossRef]
  2. Chen, C.T. Extensions of the TOPSIS for group decision-making under fuzzy environment. Fuzzy Sets Syst. 2000, 114, 1–9. [Google Scholar] [CrossRef]
  3. Jahanshahloo, G.R.; Lotfi, F.H.; Izadikhah, M. An algorithmic method to extend TOPSIS for decision-making problems with interval data. Appl. Math. Comput. 2006, 175, 1375–1284. [Google Scholar] [CrossRef]
  4. Amiri, M.; Nosratian, N.E.; Jamshidi, A.; Kazemi, A. Developing a new ELECTRE method with interval data in multiple attribute decision making problems. J. Appl. Sci. 2008, 8, 4017–4028. [Google Scholar] [CrossRef]
  5. Wang, Y. A fuzzy multi-criteria decision-making model by associating technique for order preference by similarity to ideal solution with relative preference relation. Inf. Sci. 2014, 268, 169–184. [Google Scholar] [CrossRef]
  6. Jing, Y.; Bai, H.; Wang, J. A fuzzy multi-criteria decision-making model for CCHP systems driven by different energy sources. Energy Policy 2012, 42, 286–296. [Google Scholar] [CrossRef]
  7. Wang, J.; Zhang, H. Multicriteria decision-making approach based on Atanassov’s intuitionistic fuzzy sets with incomplete certain information on weights. IEEE Trans. Fuzzy Syst. 2013, 21, 510–515. [Google Scholar] [CrossRef]
  8. Yue, Z.; Jia, Y.; Ye, G. An approach for multiple attribute group decision making based on intuitionistic fuzzy information. Int. J. Uncertain. Fuzziness Knowl.-Based Syst. 2009, 17, 317–332. [Google Scholar] [CrossRef]
  9. Atanassov, K.T. Intuitionistic fuzzy sets. Fuzzy Sets Syst. 1986, 20, 87–96. [Google Scholar] [CrossRef]
  10. Atanassov, K.T. Intuitionistic Fuzzy Sets; Springer-Verlag: New York, NY, USA, 1999. [Google Scholar]
  11. Chen, T.Y.; Li, C.H. Objective weights with intuitionistic fuzzy entropy measures and computational experiment analysis. Appl. Soft Comput. 2011, 11, 5411–5423. [Google Scholar] [CrossRef]
  12. Beliakov, G.; Bustince, H.; Goswami, D.P.; Mukherjee, U.K.; Pal, N.R. On averaging operators for Atanassov’s intuitionistic fuzzy sets. Inf. Sci. 2011, 181, 1161–1124. [Google Scholar] [CrossRef]
  13. Pei, Z.; Zheng, L. A novel approach to multi-attribute decision making based on intuitionistic fuzzy sets. Expert Syst. Appl. 2012, 39, 2560–2566. [Google Scholar] [CrossRef]
  14. Gau, W.; Buehrer, D.J. Vague sets. IEEE Trans. Syst. Man Cybern. 1993, 23, 610–614. [Google Scholar] [CrossRef]
  15. Bustince, H.; Burillo, P. Vague sets are intuitionistic fuzzy sets. Fuzzy Sets Syst. 1996, 79, 403–405. [Google Scholar] [CrossRef]
  16. Szmidt, E.; Kacprzyk, J. Using intuitionistic fuzzy sets in group decision making. Control Cybern. 2002, 31, 1037–1054. [Google Scholar]
  17. Xu, Z. A deviation-based approach to intuitionistic fuzzy multiple attribute group decision making. Group Decis. Negot. 2010, 19, 57–76. [Google Scholar] [CrossRef]
  18. Zeng, S.; Balezentis, T.; Chen, J.; Luo, G. A projection method for multiple attribute group decision making with intuitionistic fuzzy information. Informatica 2013, 24, 485–503. [Google Scholar]
  19. Zadeh, L.A. Probability measures of fuzzy events. J. Math. Anal. Appl. 1968, 23, 421–427. [Google Scholar] [CrossRef]
  20. De Luca, A.; Termini, S. A definition of non-probabilistic entropy in the setting of fuzzy set theory. Inf. Control 1972, 20, 301–312. [Google Scholar] [CrossRef]
  21. Bhandari, D.; Pal, N.R. Some new information measures for fuzzy sets. Inf. Sci. 1993, 67, 209–228. [Google Scholar] [CrossRef]
  22. Fan, J. Some new fuzzy entropy formulas. Fuzzy Sets Syst. 2002, 128, 277–284. [Google Scholar] [CrossRef]
  23. Pal, N.R.; Pal, S.K. Object background segmentation using new definitions of entropy. IEE Proc. E 1989, 366, 284–295. [Google Scholar]
  24. Burillo, P.; Bustince, H. Entropy on intuitionistic fuzzy sets and on interval-valued fuzzy sets. Fuzzy Sets Syst. 2001, 118, 305–316. [Google Scholar]
  25. Zhang, Q.; Jiang, S. A note on information entropy measure for vague sets. Inf. Sci. 2008, 178, 4184–4191. [Google Scholar] [CrossRef]
  26. Ye, J. Two effective measures of intuitionistic fuzzy entropy. Computing 2010, 87, 55–62. [Google Scholar] [CrossRef]
  27. Verma, R.; Sharma, B.D. Exponential entropy on intuitionistic fuzzy sets. Kybernetika 2013, 49, 114–127. [Google Scholar]
  28. Wei, C.; Gao, Z.; Guo, T. An intuitionistic fuzzy entropy measure based on trigonometric function. Control Decis. 2012, 27, 571–574. [Google Scholar]
  29. Wang, J.; Wang, P. Intuitionistic linguistic fuzzy multi-criteria decision-making method based on intuitionistic fuzzy entropy. Control Decis. 2012, 27, 1694–1698. [Google Scholar]
  30. Hwang, C.L.; Yoon, K.P. Multiple Attribute Decision Making: Methods and Applications; Springer-Verlag: New York, NY, USA, 1981. [Google Scholar]
  31. Jiang, J.; Chen, Y.; Yang, K. TOPSIS with fuzzy belief structure for group belief multiple criteria decision making. Expert Syst. Appl. 2011, 38, 9400–9406. [Google Scholar] [CrossRef]
  32. Krohling, R.A.; Campanharo, V.C. Fuzzy TOPSIS for group decision making: A case study for accidents with oil spill in the sea. Expert Syst. Appl. 2011, 38, 4190–4197. [Google Scholar] [CrossRef]
  33. Yue, Z. A method for group decision-making based on determining weights of decision makers using TOPSIS. Appl. Math. Model. 2011, 35, 1926–1936. [Google Scholar] [CrossRef]
  34. Amiri, M.P. Project selection for oil-fields development by using the AHP and fuzzy TOPSIS methods. Expert Syst. Appl. 2010, 37, 6218–6224. [Google Scholar] [CrossRef]
  35. Li, D.F. Intuitionistic Fuzzy Set Decision and Game Analysis Methodologies; National Defense Industry Press: Beijing, China, 2012. [Google Scholar]
  36. De, S.K.; Biswas, R.; Roy, A.R. Some operations on intuitionistic fuzzy sets. Fuzzy Sets Syst. 2000, 114, 477–484. [Google Scholar] [CrossRef]
  37. Liu, H.; Wang, G. Multi-criteria decision-making methods based on intuitionistic fuzzy sets. Eur. J. Oper. Res. 2007, 179, 220–233. [Google Scholar] [CrossRef]
  38. Chen, T.; Li, C. Determining objective weights with intuitionistic fuzzy entropy measures: A comparative analysis. Inf. Sci. 2010, 180, 4207–4222. [Google Scholar] [CrossRef]
  39. Ye, J. Fuzzy decision-making method based on the weighted correlation coefficient under intuitionistic fuzzy environment. Eur. J. Oper. Res. 2010, 205, 202–204. [Google Scholar] [CrossRef]
  40. Xu, Z. Multi-person multi-attribute decision making models under intuitionistic fuzzy environment. Fuzzy Optim. Decis. Mak. 2007, 6, 221–236. [Google Scholar] [CrossRef]
  41. Chen, S.; Tan, J. Handling multicriteria fuzzy decision-making problems based on vague set theory. Fuzzy Sets Syst. 1994, 67, 163–172. [Google Scholar] [CrossRef]
  42. Wu, J.; Zhang, Q. Multicriteria decision making method based on intuitionistic fuzzy weighted entropy. Expert Syst. Appl. 2011, 38, 916–922. [Google Scholar] [CrossRef]
  43. Herrera, F.; Herrera-Viedma, E. Linguistic decision analysis: Steps for solving decision problems under linguistic information. Fuzzy Sets Syst. 2000, 115, 67–82. [Google Scholar] [CrossRef]
  44. Li, D. Multiattribute decision making models and methods using intuitionistic fuzzy sets. J. Comput. Syst. Sci. 2005, 70, 73–85. [Google Scholar] [CrossRef]
