A New Intuitionistic Fuzzy Entropy and Application in Multi-Attribute Decision Making

Abstract: In this paper, a new intuitionistic fuzzy (IF) entropy is first put forward, which considers both the uncertainty and the hesitancy degree of IF sets. Comparison with other entropy measures shows the advantage of the new measure. Secondly, based on the new entropy measure, a new decision making method is proposed for multi-attribute decision making problems in which the attribute values are expressed with IF values. For the cases where the attribute weights are completely unknown and where they are partially known, two methods are constructed to determine them: one is an extension of the ordinary entropy weight method, and the other constructs an optimization model according to the minimum entropy principle. Finally, two practical examples are given to illustrate the effectiveness and practicability of the proposed method.


Introduction
For multi-attribute decision problems, such as supplier selection, material selection in manufacturing and the evaluation of a firm's safety performance, it is necessary to consider many factors simultaneously. This makes the problem complex, and it is difficult to find the best solution. We often notice that, in many situations, crisp data are inadequate or insufficient for setting up a model of realistic decision problems [1,2], because the problems are vague or fuzzy in nature and cannot be represented by crisp numbers. In these cases, a better approach to modeling human judgments is to adopt fuzzy sets or extended fuzzy sets, such as interval numbers, triangular fuzzy numbers or intuitionistic fuzzy (IF) sets, which are extensions of Zadeh's fuzzy set [3][4][5][6][7][8]. The IF set, first proposed by Atanassov [9], is an extension of Zadeh's fuzzy set. IF sets seem to be more suitable for expressing a decision maker's satisfaction and/or dissatisfaction degrees than crisp numbers, fuzzy sets or linguistic variables [10][11][12][13]. Gau and Buehrer [14] defined vague sets in 1993. Bustince and Burillo [15] pointed out that the notion of vague sets is the same as that of IF sets. Many studies also reveal that the IF set is a useful tool for handling imprecise data and vague expressions, more natural than rigid mathematical rules and equations. Many IF multi-attribute decision making (MADM) methods have therefore been developed to deal with such situations [16][17][18].
Entropy is an effective measure for depicting the fuzziness of a fuzzy set. Zadeh [19] first introduced the entropy of a fuzzy event in 1968. Later, in 1972, De Luca and Termini [20] gave the definition of fuzzy entropy and proposed a fuzzy entropy based on Shannon's function. Since then, many authors have realized the importance of entropy and have constructed fuzzy entropy measures from different viewpoints [21][22][23]. Burillo and Bustince [24] introduced the entropy measure into IF theory for measuring the fuzziness degree or uncertain information of IF sets. As a result, the research and application of IF entropy have attracted considerable attention. Zhang and Jiang [25] defined a measure of IF entropy by generalizing the De Luca and Termini [20] logarithmic fuzzy entropy; Ye [26] proposed two IF entropy measures using trigonometric functions; Verma and Sharma [27] defined an exponential IF entropy measure by generalizing the Pal and Pal [23] exponential fuzzy entropy. However, all of the above-mentioned IF entropy measures consider only the deviation between membership and non-membership, and do not consider the effect of the hesitancy degree of the IF set. Some authors have already realized this shortcoming, and new IF entropy measures have been proposed: Wei et al. [28] proposed an IF entropy measure using a cosine function, and Wang and Wang [29] proposed an IF entropy measure using a cotangent function.
In this paper, we put forward a new IF entropy measure which not only considers the deviation between membership and non-membership, but also considers the effect of the hesitancy degree of the IF set. TOPSIS (Technique for Order Preference by Similarity to Ideal Solution) is one of the important techniques for dealing with MADM problems. It simultaneously considers the shortest distance from a positive ideal solution (PIS) and the farthest distance from a negative ideal solution (NIS), and the alternatives are ranked according to their relative closeness coefficients [30,31]. TOPSIS has been widely applied to traditional crisp and fuzzy MADM problems [32][33][34]. Based on the proposed IF entropy measure and TOPSIS, we give a new MADM method. The subsequent contents of this paper are organized as follows: In Section 2, the basic definitions and notations of IF sets are reviewed. In Section 3, a new IF entropy is constructed, and its advantages compared with other IF entropy measures are analyzed. In Section 4, an intuitionistic fuzzy MADM method is put forward, in which the weights of the attributes are obtained according to the proposed IF entropy measure. Two examples are given in Section 5. Finally, conclusions are given in Section 6.

Preliminaries
Definition 1 [9]. Suppose that X = {x_1, x_2, ..., x_n} is a given universal set. An IF set A on X is an object of the form A = {⟨x_i, μ_A(x_i), ν_A(x_i)⟩ | x_i ∈ X}, where the functions μ_A(x_i) and ν_A(x_i) are the membership degree and non-membership degree of x_i, and for every x_i ∈ X, 0 ≤ μ_A(x_i) + ν_A(x_i) ≤ 1. The hesitancy degree of x_i in A is π_A(x_i) = 1 − μ_A(x_i) − ν_A(x_i). Conveniently, if there is only one element in X, we call A an IF number, abbreviated as A = (μ_A, ν_A). If A and B are two IF sets, the usual operations on them can be found in [9,10]. If the weight of x_i is w_i, then the weighted Hamming distance measure of A and B is defined as follows:

d(A, B) = (1/2) Σ_{i=1}^{n} w_i (|μ_A(x_i) − μ_B(x_i)| + |ν_A(x_i) − ν_B(x_i)| + |π_A(x_i) − π_B(x_i)|)

Szmidt and Kacprzyk [24] (2001) first axiomatized the intuitionistic fuzzy entropy measure, as an extension of the De Luca and Termini axioms [20] (1972) for fuzzy sets. The axioms of intuitionistic fuzzy entropy were formulated in the following way.
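As a concrete illustration, the weighted Hamming distance above can be sketched in a few lines of Python. Representing an IF set as a list of (μ, ν) pairs and the function names are our own assumptions, not the paper's notation.

```python
# Sketch of the weighted Hamming distance d(A, B) between two IF sets,
# assuming the standard form
#   d(A, B) = (1/2) * sum_i w_i * (|dmu_i| + |dnu_i| + |dpi_i|),
# where pi = 1 - mu - nu is the hesitancy degree.

def hesitancy(mu, nu):
    """Hesitancy degree pi = 1 - mu - nu of an IF value (mu, nu)."""
    return 1.0 - mu - nu

def weighted_hamming(A, B, w):
    """A, B: lists of (mu, nu) pairs over the same universe; w: element weights."""
    total = 0.0
    for (ma, na), (mb, nb), wi in zip(A, B, w):
        total += wi * (abs(ma - mb) + abs(na - nb)
                       + abs(hesitancy(ma, na) - hesitancy(mb, nb)))
    return 0.5 * total
```

For instance, the distance between the crisp IF numbers (1, 0) and (0, 1) under unit weight is 1, the maximum value.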

Definition 4 [24]. A map E: IFS(X) → [0, 1] is called an IF entropy if it satisfies the following properties:
(P1) E(A) = 0 if and only if A is a crisp set;
(P2) E(A) = 1 if and only if μ_A(x_i) = ν_A(x_i) for every x_i ∈ X;
(P3) E(A) ≤ E(B) if A is less fuzzy than B, i.e., μ_A(x_i) ≤ μ_B(x_i) and ν_A(x_i) ≥ ν_B(x_i) when μ_B(x_i) ≤ ν_B(x_i), or μ_A(x_i) ≥ μ_B(x_i) and ν_A(x_i) ≤ ν_B(x_i) when μ_B(x_i) ≥ ν_B(x_i);
(P4) E(A) = E(A^c), where A^c is the complement of A.

A New Effective Intuitionistic Fuzzy Entropy
First, we review several existing IF entropy measures from the literature.
(1) Zhang and Jiang's IF entropy measure [25]; (2) Ye's IF entropy measure [26]. Wei et al. [28] have shown that the above two IF entropy formulas (3) and (4) are mathematically identical and have given a simplified version. (3) Verma and Sharma's exponential IF entropy measure [27]. All of the above IF entropy measures consider only the deviation between membership and non-membership and do not consider the effect of the hesitancy degree of the IF set. Thus, for two IF sets A_1 and A_2 whose membership and non-membership degrees have the same deviation but whose hesitancy degrees differ, A_1 may in a real assessment be obviously more fuzzy than A_2; nevertheless, the entropy measures E_1, E_2 and E_3 assign them the same entropy values, which is not consistent with the true situation.
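This blind spot is easy to demonstrate numerically. In the sketch below, the example IF values are hypothetical (the paper's own example pair is not reproduced here); the point is only that a measure driven solely by |μ − ν| cannot react to hesitancy.

```python
# Illustration only: any entropy that depends solely on the deviation
# |mu - nu| assigns equal entropy to IF values with very different
# hesitancy degrees.

def deviation(mu, nu):
    """The only quantity that measures E1-E3 react to."""
    return abs(mu - nu)

def hesitancy(mu, nu):
    """pi = 1 - mu - nu: the quantity those measures ignore."""
    return 1.0 - mu - nu

# Two hypothetical IF values with equal deviation but different hesitancy:
a1 = (0.1, 0.1)   # hesitancy 0.8: the assessor is largely undecided
a2 = (0.4, 0.4)   # hesitancy 0.2: much firmer information
same_deviation = deviation(*a1) == deviation(*a2)   # True
```

Any deviation-only entropy therefore rates a1 and a2 as equally fuzzy, even though a1 carries far more hesitancy.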
Some authors have realized this disadvantage, and new IF entropy measures have been proposed, such as the cosine-based IF entropy measure of Wei et al. [28] and the cotangent-based IF entropy measure of Wang and Wang [29]. In this paper, we also devote ourselves to the development of IF entropy measures, and we construct a new IF entropy measure, given by Equation (9), which not only considers the deviation between the membership and non-membership degrees, but also considers the hesitancy degree of the IF set.
Theorem 1. The measure given by Equation (9) is an IF entropy.
Proof. To prove that the measure given by Equation (9) is an IF entropy, we only need to prove that it satisfies the properties in Definition 4. Obviously, for every x_i we have 0 ≤ E(A) ≤ 1.
If A is a crisp set, then for every x_i ∈ X we have μ_A(x_i) = 0, ν_A(x_i) = 1 or μ_A(x_i) = 1, ν_A(x_i) = 0, and Equation (9) gives E(A) = 0; conversely, E(A) = 0 only if A is crisp. If μ_A(x_i) = ν_A(x_i) for all x_i ∈ X, then by Equation (9) we have E(A) = 1. Now assume that E(A) = 1; then, for all x_i ∈ X, the equality condition in Equation (9) forces μ_A(x_i) = ν_A(x_i), and we obtain the conclusion.
For the monotonicity property, write the summand of Equation (9) as a function f(x, y) of x = μ_A(x_i) and y = ν_A(x_i), with x, y ∈ [0, 1]. When x ≤ y, we need to prove that f(x, y) is increasing in x and decreasing in y. The partial derivatives of f(x, y) with respect to x and to y are easily derived and show that f(x, y) is indeed increasing in x and decreasing in y; thus, when μ_A(x_i) ≤ μ_B(x_i) and ν_A(x_i) ≥ ν_B(x_i) with μ_B(x_i) ≤ ν_B(x_i), we have E(A) ≤ E(B). Similarly, we can prove that when x ≥ y, f(x, y) is decreasing in x and increasing in y; thus, when μ_A(x_i) ≥ μ_B(x_i) and ν_A(x_i) ≤ ν_B(x_i), we have f(μ_A(x_i), ν_A(x_i)) ≤ f(μ_B(x_i), ν_B(x_i)) and hence again E(A) ≤ E(B). Finally, E(A) = E(A^c) holds because Equation (9) is symmetric in μ_A(x_i) and ν_A(x_i). □
Example 1. For any positive real number n, De et al. [36] defined the IF set A^n as A^n = {⟨x, [μ_A(x)]^n, 1 − [1 − ν_A(x)]^n⟩ | x ∈ X}. We consider the IF set A in X = {6, 7, 8, 9, 10} defined in De et al. [36] as A = {⟨6, 0.1, 0.8⟩, ⟨7, 0.3, 0.5⟩, ⟨8, 0.6, 0.2⟩, ⟨9, 0.9, 0.0⟩, ⟨10, 1.0, 0.0⟩}. Taking the characterization of linguistic variables into consideration, De et al. [36] regarded A as "LARGE" on X. Using the above operations, A^{1/2} may be treated as "More or less LARGE", A^2 as "Very LARGE", A^3 as "Quite very LARGE" and A^4 as "Very very LARGE". Now we consider these IF sets in order to compare the above entropy measures. From the logical point of view, the entropies of these IF sets are required to follow the pattern
E(A^{1/2}) > E(A) > E(A^2) > E(A^3) > E(A^4). (15)
The calculated numerical values of the six entropy functions for these cases are given in Table 1. The new entropy measure satisfies this pattern; furthermore, it considers the two aspects of an IF set (i.e., the uncertainty depicted by the deviation between membership and non-membership, and the unknown reflected by the hesitancy degree of the IF set [29]), and thus the proposed measure is a good entropy formula for IF sets.
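De et al.'s power operation used in this comparison is straightforward to compute. Below is a minimal sketch with the "LARGE" set from the text; variable names are our own.

```python
# De et al.'s power of an IF set: A^n = {<x, mu^n, 1 - (1 - nu)^n>}.
# A^2 and A^(1/2) act as the linguistic hedges "very" and "more or less".
LARGE = [(0.1, 0.8), (0.3, 0.5), (0.6, 0.2), (0.9, 0.0), (1.0, 0.0)]

def power(A, n):
    """Raise every IF value (mu, nu) of A to the power n, element-wise."""
    return [(mu ** n, 1.0 - (1.0 - nu) ** n) for mu, nu in A]

very_large = power(LARGE, 2)            # "Very LARGE"
more_or_less_large = power(LARGE, 0.5)  # "More or less LARGE"
```

Feeding A^{1/2}, A, A^2, A^3 and A^4 produced this way into the competing entropy measures yields the comparison summarized in Table 1.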

Intuitionistic Fuzzy MADM Method Based on the New IF Entropy
For a MADM problem, suppose that O = {o_1, o_2, ..., o_n} is a set of n attributes, and suppose that there exists an alternative set A = {A_1, A_2, ..., A_m} consisting of m non-inferior alternatives from which the most desirable alternative is to be selected. The rating of alternative A_i on attribute o_j is expressed as an IF value (μ_ij, ν_ij), where μ_ij and ν_ij are the membership (satisfactory) and non-membership (non-satisfactory) degrees of the alternative A_i ∈ A on the attribute o_j ∈ O with respect to the fuzzy concept "excellence" given by the decision maker, so that they satisfy the condition 0 ≤ μ_ij + ν_ij ≤ 1. In MADM problems, the IF values are obtained according to Liu and Wang [37] as follows. For the sake of obtaining the degrees to which alternative A_i satisfies and does not satisfy attribute o_j, we use the statistical method. Suppose we invite n experts to make the judgment. They are expected to answer "yes", "no" or "I do not know" to the question of whether alternative A_i satisfies attribute o_j. We use Y(i, j) and N(i, j) to denote the number of "yes" and "no" answers, respectively, from the n experts. Then the degrees to which alternative A_i satisfies and does not satisfy attribute o_j can be calculated as μ_ij = Y(i, j)/n and ν_ij = N(i, j)/n. Thus, a MADM problem can be expressed with the decision matrix D = (⟨μ_ij, ν_ij⟩)_{m×n}. Let w = (w_1, w_2, ..., w_n)^T be the weight vector of all attributes, where 0 ≤ w_j ≤ 1 and Σ_{j=1}^{n} w_j = 1.
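The expert-polling elicitation of Liu and Wang described above can be sketched as follows; the function name and the toy counts are illustrative only.

```python
# mu_ij = Y(i, j) / n and nu_ij = N(i, j) / n, where Y and N count "yes" and
# "no" answers among n experts; abstentions ("I do not know") become the
# hesitancy degree pi = 1 - mu - nu.

def if_value(yes, no, n_experts):
    """IF value (mu, nu) elicited from expert counts."""
    assert 0 <= yes and 0 <= no and yes + no <= n_experts
    return yes / n_experts, no / n_experts
```

With 10 experts answering 6 "yes", 3 "no" and 1 "I do not know", the rating is (0.6, 0.3) with hesitancy 0.1.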


The attribute weight information is usually unknown or only partially known, due to the insufficient knowledge or time limitations of the decision makers in the decision-making process. Therefore, the determination of attribute weights is an important issue in MADM problems in which the attribute weights are partially known or unknown. In this paper, we put forward two methods to determine the attribute weights for these two cases.

MADM Problem with Unknown Attribute Weights Information
Chen et al. [38] and Ye [39] discussed intuitionistic fuzzy MADM problems with unknown attribute weights using IF entropy measures. Based on their work, when the attribute weights are completely unknown, we can use the proposed IF entropy to determine the attribute weights by the entropy weight method:

w_j = (1 − E_j) / Σ_{k=1}^{n} (1 − E_k), j = 1, 2, ..., n,

where E_j = (1/m) Σ_{i=1}^{m} E(r_ij) is the average entropy of the j-th attribute over all alternatives: the smaller the entropy of an attribute, the larger its weight.
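A minimal sketch of the entropy weight method, assuming the standard normalization w_j = (1 − E_j) / Σ_k (1 − E_k), where E_j is the column-average entropy of attribute j; names are illustrative.

```python
# Entropy weight method: attributes whose column entropies are low carry more
# discriminating information and therefore receive larger weights.

def entropy_weights(E):
    """E: per-attribute (column-average) entropy values in [0, 1]."""
    complements = [1.0 - e for e in E]
    total = sum(complements)
    return [c / total for c in complements]
```

Two attributes with entropies 0.2 and 0.6 receive weights 2/3 and 1/3, while equal entropies yield equal weights.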

MADM Problem with Partially Known Attribute Weights Information
Generally, there are additional constraint conditions on the weight vector w. We denote by H the set of known weight information. To determine the attribute weights for MADM problems with partially known attribute weights under an intuitionistic fuzzy environment, Xu [40] proposed an optimization model based on Chen and Tan's score function [41], while Wu and Zhang [42] and Wang and Wang [29] determined the attribute weights by establishing a programming model according to the minimum entropy principle. In this paper, we use the new IF entropy measure to determine the attribute weights, with a method similar to that of Wang and Wang [29]. The specific process is given as follows.
To rank the alternatives according to the decision matrix D, we propose a method for obtaining the attribute weight vector by means of the proposed IF entropy measure. The entropy measure describes the degree of fuzziness and intuitionism: the smaller the intuitionistic fuzzy entropy, the smaller the fuzzy degree of the attribute evaluation information, and thus the more certain the decision-making information. Hence, we can utilize the principle of the minimum entropy value to obtain the weight vector of the attributes by computing the following programming model for each alternative A_i:

min Σ_{j=1}^{n} w_j E(r_ij), s.t. w ∈ H, Σ_{j=1}^{n} w_j = 1, w_j ≥ 0.

Because each alternative competes fairly, the weight coefficient with respect to the same attribute should be equal for all alternatives; summing over the alternatives, we obtain the following optimization model:

min E(w) = Σ_{i=1}^{m} Σ_{j=1}^{n} w_j E(r_ij), s.t. w ∈ H, Σ_{j=1}^{n} w_j = 1, w_j ≥ 0. (19)

Hence, by solving Equation (19), the optimal solution w* = arg min E(w) is chosen as the vector of optimal attribute weights.
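The paper solves model (19) with MATLAB. As a small sketch, if we assume the known weight information H consists only of lower and upper bounds on each w_j, the linear objective of (19) can be minimized greedily without an LP solver: pour the remaining weight mass into the attributes with the smallest average entropy first. The bound-only form of H and all names are our assumptions.

```python
# Greedy minimization of sum_j w[j] * Ebar[j] subject to
#   lower[j] <= w[j] <= upper[j] and sum_j w[j] = 1.
# Valid because a linear objective over this feasible set attains its minimum
# at a vertex reached by favouring the smallest coefficients first.

def min_entropy_weights(Ebar, lower, upper):
    """Ebar[j]: average entropy of attribute j over all alternatives."""
    assert sum(lower) <= 1.0 <= sum(upper)
    w = list(lower)                      # start every weight at its lower bound
    remaining = 1.0 - sum(lower)
    for j in sorted(range(len(Ebar)), key=lambda j: Ebar[j]):
        add = min(upper[j] - w[j], remaining)  # fill lowest-entropy slots first
        w[j] += add
        remaining -= add
    return w
```

With average entropies (0.3, 0.6) and each weight constrained to [0.2, 0.7], the low-entropy attribute is pushed to its upper bound, giving w = (0.7, 0.3).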

The New MADM Method Based on the Proposed IF Entropy
In this subsection, we put forward the new MADM method based on the above-mentioned work and the concept of TOPSIS. The specific calculation steps are given as follows:
Step 1. Calculate the attribute weights according to Section 4.1 or Section 4.2.
Step 2. Determine the positive ideal solution (PIS) and the negative ideal solution (NIS) of the intuitionistic fuzzy MADM problem.
Step 3. Calculate the distance of each alternative from the PIS (A*) and the NIS (A−) using the weighted Hamming distance.
Step 4. Compute the closeness coefficient C_i, which accounts for the distances from the PIS and the NIS simultaneously. The closeness coefficient of each alternative is calculated as:

C_i = d(A_i, A−) / (d(A_i, A*) + d(A_i, A−))

Step 5. Rank the alternatives according to the closeness coefficients C_i in decreasing order. The best alternative is the one closest to the PIS and farthest from the NIS.
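Steps 4 and 5 can be sketched as follows, where d_pos[i] and d_neg[i] denote precomputed distances of alternative i from the PIS and the NIS; the helper names are ours.

```python
# TOPSIS closeness coefficient C_i = d(A_i, NIS) / (d(A_i, PIS) + d(A_i, NIS));
# a larger C_i means closer to the PIS and farther from the NIS.

def closeness(d_pos, d_neg):
    """Closeness coefficients from per-alternative PIS/NIS distances."""
    return [dn / (dp + dn) for dp, dn in zip(d_pos, d_neg)]

def rank(C):
    """Indices of the alternatives sorted by closeness, best first."""
    return sorted(range(len(C)), key=lambda i: C[i], reverse=True)
```

An alternative at distance 0.2 from the PIS and 0.8 from the NIS gets C = 0.8 and is ranked ahead of one with C = 0.5.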

Numerical Examples
In order to illustrate the application of the proposed MADM method, two examples are given as follows.
Example 3. (This is the case where the attribute weights are completely unknown.) Suppose that a company wants to invest a sum of money in the best option, and there are four parallel alternatives to be selected: A_1 (a car company), A_2 (a food company), A_3 (a computer company) and A_4 (an arms company). The evaluation attributes are o_1 (the risk analysis), o_2 (the growth analysis) and o_3 (the environmental impact analysis) (this example is adopted from Herrera and Herrera-Viedma [43] and Ye [39]). Using the statistical method, we obtain the membership degree μ_ij (i.e., the satisfactory degree) and the non-membership degree ν_ij (i.e., the non-satisfactory degree) of each alternative A_i on each attribute o_j. The IF decision matrix provided by the experts is shown in Table 3. The calculation steps of the proposed method are given as follows:
Step 1. According to Section 4.1, the attribute weight vector is obtained from the entropy values.
Step 2. The PIS (A*) and the NIS (A−) are determined; in particular, the NIS is A− = ((0, 1), (0, 1), (0, 1)).
Step 3. The distance measures of each alternative from the PIS are calculated as: d(A_1, A*) = 0.3032, d(A_2, A*) = 0.3904, d(A_3, A*) = 0.3481, d(A_4, A*) = 0.4161.
Step 4. The relative closeness coefficients are calculated, and the alternatives are ranked accordingly.
For Example 4, the calculation steps of the proposed decision making method are given as follows:
Step 1. According to Equation (19), we establish the corresponding programming model; carrying out the remaining steps yields the ranking A_1 ≻ A_3 ≻ A_2, so A_1 is the best desirable alternative. This result is in agreement with that of Li [44].

Conclusions
IF sets are suitable for describing and dealing with the uncertain and vague information that occurs in many MADM problems. In this paper, we have proposed a new IF entropy measure which not only considers the deviation between the membership and non-membership degrees, but also considers the hesitancy degree of the IF set. Through comparison with other IF entropy measures, the new measure is shown to be more reasonable and to have more advantages. The proposed entropy can also be applied in fields such as image processing, pattern recognition and medical diagnosis. Based on the proposed IF entropy measure, a new attribute weight determination method has been put forward, which we then used to approach the multi-attribute decision making problem. Two numerical examples illustrate the feasibility and practicability of the proposed MADM method. The MADM method proposed in this paper can be applied to other selection problems, such as the evaluation of project investment risk, site selection and credit evaluation. In the future, we will use the entropy measure to determine the weights of experts in group decision problems under an IF environment, and we will study entropy measures of interval IF sets based on the concepts of this article.

Step 2. The PIS (A*) and NIS (A−) are respectively given as:

Steps 2–5. We use the MATLAB software to solve this model and obtain the optimal attribute weight vector w = (0.25, 0.45, 0.30)^T. The PIS (A*) and NIS (A−) are then determined, the distance measures of each alternative from the PIS and the NIS are calculated, and the relative closeness coefficients are obtained. Based on the C_i values, the ranking of the alternatives in descending order is A_1 ≻ A_3 ≻ A_2.

Table 1. Values of the different entropy measures for A^{1/2}, A, A^2, A^3 and A^4. From Table 1, the new IF entropy satisfies the required pattern of Equation (15). For a further comparison of these entropy measures, another example is given in Example 2.

Table 4. Intuitionistic fuzzy decision matrix.
Example 4. (This is the case where the attribute weights are partially known.) The example is adopted from Li [44] and considers an air-condition system selection problem. Suppose there are three air-condition systems A_i (i = 1, 2, 3). The IF decision matrix provided by the experts is shown in Table 4.