# New Distance Measure for Atanassov’s Intuitionistic Fuzzy Sets and Its Application in Decision Making


## Abstract


## 1. Introduction

## 2. Preliminaries

**Definition 1.** Let a non-empty set $X=\{{x}_{1},{x}_{2},\cdots ,{x}_{n}\}$ be the universe of discourse. Then a fuzzy set $A$ in $X$ is defined as follows:

**Definition 2.** An intuitionistic fuzzy set $A$ in $X$ defined by Atanassov can be written as:

**Definition 3.** For two AIFSs $A\in AIFSs(X)$ and $B\in AIFSs(X)$, some relations between them are defined as:

- (R1)
- $A\subseteq B$ ⇔ $\forall x\in X$ ${\mu}_{A}(x)\le {\mu}_{B}(x),{v}_{A}(x)\ge {v}_{B}(x)$;
- (R2)
- $A=B$ ⇔ $\forall x\in X$ ${\mu}_{A}(x)={\mu}_{B}(x),{v}_{A}(x)={v}_{B}(x)$;
- (R3)
- ${A}^{C}=\{\langle x,{v}_{A}(x),{\mu}_{A}(x)\rangle |x\in X\}$, where ${A}^{C}$ is called the complement of the AIFS $A$.

**Definition 4.** For two IFVs $a=\langle {\mu}_{a},{v}_{a}\rangle $ and $b=\langle {\mu}_{b},{v}_{b}\rangle $, they can be ranked by the partial order $a\le b\iff {\mu}_{a}\le {\mu}_{b},{v}_{a}\ge {v}_{b}$. Under this order, the smallest IFV is $\langle 0,1\rangle $, denoted by **0**, and the largest IFV is $\langle 1,0\rangle $, denoted by **1**.
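As an illustration (the helper function below is hypothetical, not from the paper), the partial order of Definition 4 is straightforward to express in code; note that the order is only partial, so some pairs of IFVs are incomparable:

```python
def ifv_leq(a, b):
    """Partial order on IFVs a = (mu_a, v_a), b = (mu_b, v_b):
    a <= b iff mu_a <= mu_b and v_a >= v_b."""
    return a[0] <= b[0] and a[1] >= b[1]

# <0,1> is the smallest IFV and <1,0> the largest under this order.
print(ifv_leq((0, 1), (0.3, 0.5)))   # True
print(ifv_leq((0.3, 0.5), (1, 0)))   # True
# Some pairs are incomparable, e.g. <0.5,0.1> and <0.6,0.3>:
print(ifv_leq((0.5, 0.1), (0.6, 0.3)), ifv_leq((0.6, 0.3), (0.5, 0.1)))  # False False
```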

**Definition 5.** Let $D$ denote a mapping $D:AIFS\times AIFS\to [0,1]$. If $D(A,B)$ satisfies the following properties, then $D(A,B)$ is called a distance measure between $A\in AIFSs(X)$ and $B\in AIFSs(X)$:

- (DP1)
- $0\le D(A,B)\le 1$;
- (DP2)
- $D(A,B)=0$, if and only if $A=B$;
- (DP3)
- $D(A,B)=D(B,A)$;
- (DP4)
- If $A\subseteq B\subseteq C$, then $D(A,B)\le D(A,C)$ and $D(B,C)\le D(A,C)$.

**Definition 6.** A mapping $S:AIFS\times AIFS\to [0,1]$ is called a similarity measure between $A\in AIFSs(X)$ and $B\in AIFSs(X)$, if $S(A,B)$ satisfies the following properties:

- (SP1)
- $0\le S(A,B)\le 1$;
- (SP2)
- $S(A,B)=1$, if and only if $A=B$;
- (SP3)
- $S(A,B)=S(B,A)$;
- (SP4)
- If $A\subseteq B\subseteq C$, then $S(A,B)\ge S(A,C)$, and $S(B,C)\ge S(A,C)$.

## 3. Review of Existing Distance and Similarity Measures

Li and Cheng [14] defined the similarity measure $S_{DC}$ as:

Mitchell [15] modified it and obtained the similarity measure $S_{HB}$, defined as:

Szmidt and Kacprzyk [19] proposed the similarity measure $S_{IFS}(A,B)$:

where $B^{C}$ is the complement of $B$ and $D_{NH}(A,B)$ is the normalized Hamming distance.

## 4. A New Distance Measure between AIFSs

For an AIFS $A$ in $X$, the exact membership degree of $x_{i}$ to $A$ is unknown, with lower bound ${\mu}_{A}({x}_{i})$ and upper bound $1-{v}_{A}({x}_{i})$. In other words, the membership degree of $x_{i}$ to $A$ is an interval $[{\mu}_{A}({x}_{i}),1-{v}_{A}({x}_{i})]$, $i=1,2,\cdots ,n$. So the distance between two AIFSs $A$ and $B$ in the same universe $X=\{{x}_{1},{x}_{2},\cdots ,{x}_{n}\}$ can be measured by comparing the interval sequences $[{\mu}_{A}({x}_{i}),1-{v}_{A}({x}_{i})]$ and $[{\mu}_{B}({x}_{i}),1-{v}_{B}({x}_{i})]$, $i=1,2,\cdots ,n$.

**Definition 7.** Given two interval values $a=[{a}_{1},{a}_{2}]$ and $b=[{b}_{1},{b}_{2}]$, the distance between them is defined as:
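As a hedged sketch, assuming the Wasserstein-type interval distance in the spirit of References [35,36] (a midpoint difference plus a down-weighted half-width difference), Definition 7 can be implemented as follows:

```python
import math

def interval_distance(a, b):
    """Sketch of a distance between intervals a = [a1, a2] and b = [b1, b2],
    assuming the Wasserstein-type form of References [35,36]:
    d(a, b) = sqrt((m_a - m_b)^2 + (r_a - r_b)^2 / 3),
    where m is the interval midpoint and r its half-width."""
    m_a, r_a = (a[0] + a[1]) / 2, (a[1] - a[0]) / 2
    m_b, r_b = (b[0] + b[1]) / 2, (b[1] - b[0]) / 2
    return math.sqrt((m_a - m_b) ** 2 + (r_a - r_b) ** 2 / 3)

print(interval_distance([0.2, 0.5], [0.2, 0.5]))  # 0.0
print(interval_distance([0, 0], [1, 1]))          # 1.0
```

The half-width term is down-weighted (coefficient 1/3), so disagreement in interval width contributes less than disagreement in interval location.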

**Theorem 1.**

**Proof.**

- (i)
- Given ${D}^{I}(A,B)=0$, we can get $\frac{{\mu}_{A}({x}_{i})-{v}_{A}({x}_{i})}{2}-\frac{{\mu}_{B}({x}_{i})-{v}_{B}({x}_{i})}{2}=0$ and $\frac{{\mu}_{A}({x}_{i})+{v}_{A}({x}_{i})}{2}-\frac{{\mu}_{B}({x}_{i})+{v}_{B}({x}_{i})}{2}=0$ $\forall i\in \{1,2,\cdots ,n\}$, which can be written identically as:$${\mu}_{A}({x}_{i})-{v}_{A}({x}_{i})={\mu}_{B}({x}_{i})-{v}_{B}({x}_{i})$$$${\mu}_{A}({x}_{i})+{v}_{A}({x}_{i})={\mu}_{B}({x}_{i})+{v}_{B}({x}_{i})$$
- (ii)
- For two AIFS A and B defined in $X=\{{x}_{1},{x}_{2},\cdots ,{x}_{n}\}$, we have the following relation:$$A=B\Rightarrow \forall i\in \{1,2,\cdots ,n\},{\mu}_{A}({x}_{i})={\mu}_{B}({x}_{i}),{v}_{A}({x}_{i})={v}_{B}({x}_{i})\Rightarrow {D}^{I}(A,B)=0.$$
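A hedged implementation sketch of $D^I$, consistent with the two difference terms in the proof above and with the normalization $\frac{1}{2n}\sum \sqrt{f(\cdot)}$ used in the proof of Theorem 3 (the exact coefficients are an assumption, not the paper's verbatim formula), is:

```python
import math

def d_I(A, B):
    """Hedged reconstruction of the proposed distance D^I between two AIFSs,
    each given as a list of (mu, v) pairs over the same universe. Per element
    it combines the difference of mu - v with the (down-weighted) difference
    of mu + v = 1 - pi."""
    n = len(A)
    total = 0.0
    for (mu_a, v_a), (mu_b, v_b) in zip(A, B):
        d1 = (mu_a - v_a) - (mu_b - v_b)  # membership/non-membership difference
        d2 = (mu_a + v_a) - (mu_b + v_b)  # hesitance-related difference
        total += math.sqrt(d1 ** 2 + d2 ** 2 / 3)
    return total / (2 * n)

# D^I is 0 for identical AIFSs and 1 between <1,0> and <0,1>.
print(d_I([(0.5, 0.3)], [(0.5, 0.3)]))  # 0.0
print(d_I([(1, 0)], [(0, 1)]))          # 1.0
```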

**Theorem 2.**

**Proof.**

**Theorem 3.**

**Proof.**

- (i)
- Given the condition $0\le a\le x\le 1$ and $0\le y\le b\le 1$, we can obtain that $\partial f/\partial x\ge 0$ and $\partial f/\partial y\le 0$. Thus, $f(x,y)$ is an increasing function of the variable x and a decreasing function of the variable y. Let $a={\mu}_{A}({x}_{i})$, $b={v}_{A}({x}_{i})$; then $a={\mu}_{A}({x}_{i})\le {\mu}_{B}({x}_{i})\le {\mu}_{C}({x}_{i})$ and $b={v}_{A}({x}_{i})\ge {v}_{B}({x}_{i})\ge {v}_{C}({x}_{i})$, $i\in \{1,2,\cdots ,n\}$. Considering the monotonicity of $f(x,y)$, we have $f({\mu}_{B}({x}_{i}),{v}_{B}({x}_{i}))\le f({\mu}_{C}({x}_{i}),{v}_{C}({x}_{i}))$, $\forall i\in \{1,2,\cdots ,n\}$. Under the condition $a={\mu}_{A}({x}_{i})$, $b={v}_{A}({x}_{i})$, $\forall i\in \{1,2,\cdots ,n\}$, the following expressions hold:$${D}^{I}(A,C)=\frac{1}{2n}{\displaystyle \sum _{i=1}^{n}\sqrt{f({\mu}_{C}({x}_{i}),{v}_{C}({x}_{i}))}},\text{}{D}^{I}(A,B)=\frac{1}{2n}{\displaystyle \sum _{i=1}^{n}\sqrt{f({\mu}_{B}({x}_{i}),{v}_{B}({x}_{i}))}}.$$
- (ii)
- Under the condition $0\le x\le a\le 1$ and $0\le b\le y\le 1$, we have $\partial f/\partial x\le 0$ and $\partial f/\partial y\ge 0$. So $f(x,y)$ is a decreasing function of the variable x and an increasing function of the variable y. Set $a={\mu}_{C}({x}_{i})$, $b={v}_{C}({x}_{i})$; then ${\mu}_{A}({x}_{i})\le {\mu}_{B}({x}_{i})\le {\mu}_{C}({x}_{i})=a$ and ${v}_{A}({x}_{i})\ge {v}_{B}({x}_{i})\ge {v}_{C}({x}_{i})=b$, $i\in \{1,2,\cdots ,n\}$. Considering the monotonicity of $f(x,y)$, we have $f({\mu}_{B}({x}_{i}),{v}_{B}({x}_{i}))\le f({\mu}_{A}({x}_{i}),{v}_{A}({x}_{i}))$, $\forall i\in \{1,2,\cdots ,n\}$. For $a={\mu}_{C}({x}_{i})$, $b={v}_{C}({x}_{i})$, $\forall i\in \{1,2,\cdots ,n\}$, the following expressions hold:$${D}^{I}(A,C)=\frac{1}{2n}{\displaystyle \sum _{i=1}^{n}\sqrt{f({\mu}_{A}({x}_{i}),{v}_{A}({x}_{i}))}},\text{}{D}^{I}(B,C)=\frac{1}{2n}{\displaystyle \sum _{i=1}^{n}\sqrt{f({\mu}_{B}({x}_{i}),{v}_{B}({x}_{i}))}}.$$

**Theorem 4.**

**Proof.**

**Theorem 1** says that ${D}^{I}(A,B)=0$ when $A=B$.

**Definition 3**.

According to **Theorem 3**, the condition ${F}_{\ast}\subseteq A\subseteq B\subseteq {F}^{\ast}$ implies that ${D}^{I}(A,B)\le {D}^{I}({F}_{\ast},{F}^{\ast})$. Since the AIFSs $A$ and $B$ are arbitrary, the relation ${D}^{I}(A,B)\le {D}^{I}({F}_{\ast},{F}^{\ast})$ holds in the set of all AIFSs defined in $X$.

**Theorem 5.**

**Proof.**

It follows directly from **Theorems 1**–**4**. ☐

**Theorem 6.**

**Proof.**

It follows directly from **Theorems 1**–**4**. ☐

## 5. Numerical Examples

**Example 1.**

Consider two patterns represented by AIFSs in the universe $X=\{x_1,x_2,x_3\}$. The two AIFSs representing the patterns are expressed as:

$A_1=\{\langle x_1,0.1,0.1\rangle ,\langle x_2,0.1,0.3\rangle ,\langle x_3,0.1,0.9\rangle \}$ and $A_2=\{\langle x_1,0.7,0.2\rangle ,\langle x_2,0.1,0.8\rangle ,\langle x_3,0.4,0.4\rangle \}$.

The sample to be classified is $B=\{\langle x_1,0.4,0.4\rangle ,\langle x_2,0.6,0.2\rangle ,\langle x_3,0,0.8\rangle \}$.

Applying the proposed distance measure $D^I$, we have:

The sample $B$ is classified as $A_1$ based on the principle of minimum distance. It is also shown that the proposed measure $D^I$ has greater discrimination capability, since the distance ${D}^{I}({A}_{1},B)$ is much less than ${D}^{I}({A}_{2},B)$. Nevertheless, the two close values ${D}_{NE}({A}_{1},B)$ and ${D}_{NE}({A}_{2},B)$ generated by the normalized Euclidean distance are not helpful for making a sound decision.
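As a numerical check under a hedged reconstruction of $D^I$ (the coefficient choices are an assumption, not the paper's verbatim formula), Example 1 can be replayed; the classification of $B$ as $A_1$ is recovered:

```python
import math

def d_I(A, B):
    # hedged reconstruction of D^I: per-element combination of the
    # mu - v difference and the (down-weighted) mu + v difference
    total = sum(math.sqrt(((ma - va) - (mb - vb)) ** 2
                          + ((ma + va) - (mb + vb)) ** 2 / 3)
                for (ma, va), (mb, vb) in zip(A, B))
    return total / (2 * len(A))

A1 = [(0.1, 0.1), (0.1, 0.3), (0.1, 0.9)]
A2 = [(0.7, 0.2), (0.1, 0.8), (0.4, 0.4)]
B  = [(0.4, 0.4), (0.6, 0.2), (0.0, 0.8)]
print(d_I(A1, B))  # ~0.184
print(d_I(A2, B))  # ~0.401, so B is classified as A1
```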

**Example 2.**

Consider three patterns represented by AIFSs in the universe $X=\{x_1,x_2,x_3\}$. The three corresponding AIFSs are given as:

$A_1=\{\langle x_1,0.4,0.5\rangle ,\langle x_2,0.7,0.1\rangle ,\langle x_3,0.3,0.3\rangle \}$,

$A_2=\{\langle x_1,0.5,0.4\rangle ,\langle x_2,0.7,0.2\rangle ,\langle x_3,0.4,0.3\rangle \}$,

$A_3=\{\langle x_1,0.4,0.5\rangle ,\langle x_2,0.7,0.1\rangle ,\langle x_3,0.4,0.3\rangle \}$.

The sample to be classified is $B=\{\langle x_1,0.1,0.1\rangle ,\langle x_2,1,0\rangle ,\langle x_3,0,1\rangle \}$.

The proposed distance measure $D^I$ leads to the following results:

Based on the principle of minimum distance, the sample $B$ is classified as $A_1$.

**Example 3.**

Consider three patterns represented by AIFSs in the universe $X=\{x_1,x_2,x_3\}$, which are listed as:

$A_1=\{\langle x_1,0.3,0.2\rangle ,\langle x_2,0.4,0.4\rangle ,\langle x_3,0.2,0.2\rangle \}$,

$A_2=\{\langle x_1,0.1,0.4\rangle ,\langle x_2,0,0.4\rangle ,\langle x_3,0.3,0.3\rangle \}$,

$A_3=\{\langle x_1,0.2,0.5\rangle ,\langle x_2,0.3,0.3\rangle ,\langle x_3,0.5,0.1\rangle \}$.

The sample to be classified is $B=\{\langle x_1,0.4,0.5\rangle ,\langle x_2,0.3,0.5\rangle ,\langle x_3,0.4,0.2\rangle \}$.

The proposed distance measure $D^I$ generates the following results:

The proposed measure $D^I$ can distinguish the three patterns by assigning different values to the distance between the sample and each pattern. Based on the principle of minimum distance, we can classify the sample as $A_3$.

**Example 4.**

Consider three patterns represented by AIFSs in the universe $X=\{x_1,x_2,x_3\}$, which are listed as:

$A_1=\{\langle x_1,0.1,0.5\rangle ,\langle x_2,0.2,0.4\rangle ,\langle x_3,0.5,0.3\rangle \}$,

$A_2=\{\langle x_1,0.2,0.4\rangle ,\langle x_2,0.1,0.6\rangle ,\langle x_3,0.8,0.1\rangle \}$,

$A_3=\{\langle x_1,0.1,0.4\rangle ,\langle x_2,0.1,0.1\rangle ,\langle x_3,0.5,0.2\rangle \}$.

The sample to be classified is $B=\{\langle x_1,0.2,0.3\rangle ,\langle x_2,0.1,0.4\rangle ,\langle x_3,0.6,0.2\rangle \}$.

The proposed distance measure $D^I$ generates the following results:

The similarity measures $S_{HB}$ and $S_{HK}$ cannot be used to classify the sample $B$ in this example. The proposed distance measure $D^I$ can assign different values to the distance between the sample and each pattern, so it can be used to classify the sample as $A_1$ based on the principle of minimum distance.

**Example 5.**

Consider three patterns represented by AIFSs in the universe $X=\{x_1,x_2,x_3,x_4\}$. They are given as:

$A_1=\{\langle x_1,0.3,0.4\rangle ,\langle x_2,0.3,0.4\rangle ,\langle x_3,0.6,0.1\rangle ,\langle x_4,0.6,0.1\rangle \}$,

$A_2=\{\langle x_1,0.4,0.4\rangle ,\langle x_2,0.3,0.5\rangle ,\langle x_3,0.7,0.1\rangle ,\langle x_4,0.6,0.2\rangle \}$,

$A_3=\{\langle x_1,0.4,0.4\rangle ,\langle x_2,0.3,0.4\rangle ,\langle x_3,0.7,0.1\rangle ,\langle x_4,0.6,0.1\rangle \}$.

The sample to be classified is $B=\{\langle x_1,0.35,0.65\rangle ,\langle x_2,0.55,0.45\rangle ,\langle x_3,0.65,0.1\rangle ,\langle x_4,0.6,0.15\rangle \}$.

Setting the weights ${\omega}_{1}={\omega}_{2}={\omega}_{3}=1/3$, we can get the following results:

Applying the proposed distance measure $D^I$, we can get:

The proposed measure $D^I$ generates different values for the distance between the sample and the three patterns, so the sample $B$ is classified as $A_1$ based on the minimum distance value.

**Example 6.**

Consider three patterns represented by AIFSs in the universe $X=\{x_1,x_2,x_3\}$. They are given as:

$A_1=\{\langle x_1,0.2,0.3\rangle ,\langle x_2,0.1,0.4\rangle ,\langle x_3,0.2,0.6\rangle \}$,

$A_2=\{\langle x_1,0.3,0.2\rangle ,\langle x_2,0.4,0.1\rangle ,\langle x_3,0.5,0.3\rangle \}$,

$A_3=\{\langle x_1,0.2,0.3\rangle ,\langle x_2,0.4,0.1\rangle ,\langle x_3,0.5,0.3\rangle \}$.

The sample to be classified is $B=\{\langle x_1,0.1,0.2\rangle ,\langle x_2,0.4,0.5\rangle ,\langle x_3,0,0\rangle \}$.

Applying the proposed distance measure $D^I$, the distance degrees between $B$ and the three patterns can be obtained as follows:

The proposed distance measure $D^I$ classifies the sample $B$ as $A_1$ based on the minimum distance principle.

**Example 7.**

Consider two patterns represented by AIFSs in the universe $X=\{x_1,x_2,x_3\}$. They are given as:

$A_1=\{\langle x_1,0.2,0.4\rangle ,\langle x_2,0.1,0.4\rangle ,\langle x_3,0.2,0.5\rangle \}$,

$A_2=\{\langle x_1,0.3,0.3\rangle ,\langle x_2,0.2,0.3\rangle ,\langle x_3,0.4,0.3\rangle \}$.

The sample to be classified is $B=\{\langle x_1,0,0\rangle ,\langle x_2,0,0\rangle ,\langle x_3,0,0\rangle \}$.

Applying the proposed distance measure $D^I$, the distance degrees between $B$ and the two patterns can be obtained as follows:

For some existing measures, the similarity between $B$ and $A_1$ is identical to the similarity between $B$ and $A_2$. Although the given sample is hard to classify, we can see that it is closer to $A_2$, whose membership and non-membership grades differ less. The proposed distance measure $D^I$ gives ${D}^{I}({A}_{2},B)<{D}^{I}({A}_{1},B)$, which coincides with this intuitive analysis. So the sample $B$ should be classified as $A_2$ based on the minimum distance principle.
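Under a hedged reconstruction of $D^I$ (an assumption; the paper's verbatim formula may differ in coefficients), this claim can be checked numerically: the completely unknown sample $B$ indeed comes out closer to $A_2$:

```python
import math

def d_I(A, B):
    # hedged reconstruction of D^I: per-element combination of the
    # mu - v difference and the (down-weighted) mu + v difference
    total = sum(math.sqrt(((ma - va) - (mb - vb)) ** 2
                          + ((ma + va) - (mb + vb)) ** 2 / 3)
                for (ma, va), (mb, vb) in zip(A, B))
    return total / (2 * len(A))

A1 = [(0.2, 0.4), (0.1, 0.4), (0.2, 0.5)]
A2 = [(0.3, 0.3), (0.2, 0.3), (0.4, 0.3)]
B  = [(0.0, 0.0)] * 3
print(d_I(A1, B))  # ~0.220
print(d_I(A2, B))  # ~0.178 < d_I(A1, B), so B is classified as A2
```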

**Example 8.**

Consider two patterns represented by AIFSs in the universe $X=\{x_1,x_2\}$. They are given as:

$A_1=\{\langle x_1,0.1,0.2\rangle ,\langle x_2,0.1,0.7\rangle \}$,

$A_2=\{\langle x_1,0.3,0.4\rangle ,\langle x_2,0.75,0.15\rangle \}$.

The sample to be classified is $B=\{\langle x_1,0.4,0.5\rangle ,\langle x_2,0.4,0.4\rangle \}$.

Applying the proposed distance measure $D^I$, we can get:

The proposed distance measure $D^I$ classifies the sample as $A_2$, while the complex similarity measures with additional parameters cannot assign this sample to a specific pattern.

**Example 9.**

Consider two patterns represented by AIFSs in the universe $X=\{x_1,x_2\}$. They are denoted as:

$A_1=\{\langle x_1,0.1,0.3\rangle ,\langle x_2,0.3,0.3\rangle \}$,

$A_2=\{\langle x_1,0.2,0.2\rangle ,\langle x_2,0.4,0.2\rangle \}$.

The sample to be classified is $B=\{\langle x_1,0.4,0.4\rangle ,\langle x_2,0.6,0.4\rangle \}$.

Using the similarity measure $S_{XY}$, we can obtain:

Applying the proposed distance measure $D^I$, we can derive:

The proposed distance measure $D^I$ classifies the sample as $A_2$, while the complex similarity measures ${S}_{{G}_{k,\lambda}}^{\eta}$ with additional parameters and the similarity measure developed from the Hamming distance cannot assign this sample to a specific pattern.

The analysis of the proposed distance measure $D^I$ shows that both the difference between the membership and non-membership degrees, i.e., $\mu (x)-v(x)$, and the hesitance degree, entering through $\mu (x)+v(x)=1-\pi (x)$, are considered in the definition of the new distance measure. Moreover, their contributions to the distance measure are different. So the proposed measure can quantify the difference between AIFSs more effectively. In the following section, we will discuss its performance in the application of multi-attribute decision making (MADM) under an intuitionistic fuzzy environment.
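Concretely, a reconstruction of $D^I$ consistent with the proofs of Theorems 1 and 3 (the defining equation itself is not reproduced here, so the exact coefficients are an assumption) makes both contributions explicit:

$$D^{I}(A,B)=\frac{1}{2n}\sum_{i=1}^{n}\sqrt{\big[({\mu}_{A}({x}_{i})-{v}_{A}({x}_{i}))-({\mu}_{B}({x}_{i})-{v}_{B}({x}_{i}))\big]^{2}+\frac{1}{3}\big[({\mu}_{A}({x}_{i})+{v}_{A}({x}_{i}))-({\mu}_{B}({x}_{i})+{v}_{B}({x}_{i}))\big]^{2}}$$

The first term carries the membership/non-membership difference with full weight, while the hesitance-related term enters with coefficient $1/3$, so the two contribution degrees differ, as stated above.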

## 6. Application to Multi-Attribute Decision Making

Suppose the weight of attribute $a_i$ is $w_i$, where $0\le w_i\le 1$, $i=1,2,\cdots ,n$, and ${\sum}_{i=1}^{n}{w}_{i}=1$. The decision matrix given by the decision maker is presented as:

where each entry ${d}_{ij}=\langle {\mu}_{ij},{v}_{ij}\rangle$ is the evaluation of alternative $x_i$ with respect to attribute $a_j$, with $i=1,2,\cdots ,m$ and $j=1,2,\cdots ,n$. The new method for MADM based on the proposed distance measure and the TOPSIS method [37] can be described as follows:

**Step 1**. According to the partial order relation between IFVs, obtain the relative positive ideal solution ${\alpha}^{+}=({\alpha}_{1}^{+},{\alpha}_{2}^{+},\cdots ,{\alpha}_{n}^{+})$ of the attributes, where each relative value ${\alpha}_{j}^{+}$ is calculated as:

**Step 2.** Utilize the distance measure of IFVs defined in Equation (34) to calculate the similarity degree between the evaluating result ${d}_{ij}=\langle {\mu}_{ij},{v}_{ij}\rangle$ of alternative $x_i$ with respect to attribute $a_j$ and the relative positive ideal value ${\alpha}_{j}^{+}$ of attribute $a_j$, with $i=1,2,\cdots ,m$ and $j=1,2,\cdots ,n$. Construct the positive similarity matrix ${G}^{+}={({g}_{ij}^{+})}_{m\times n}$. The similarity degree is expressed as:

**Step 3.** Apply the distance measure of IFVs defined in Equation (34) to calculate the similarity degree between the evaluating result ${d}_{ij}=\langle {\mu}_{ij},{v}_{ij}\rangle$ of alternative $x_i$ with respect to attribute $a_j$ and the relative negative ideal value ${\alpha}_{j}^{-}$ of attribute $a_j$, with $i=1,2,\cdots ,m$ and $j=1,2,\cdots ,n$. Construct the negative similarity matrix ${G}^{-}={({g}_{ij}^{-})}_{m\times n}$. The similarity degree is expressed as:

**Step 4.** Based on the attribute weights $w_j$ and the similarity matrices ${G}^{+}$ and ${G}^{-}$, calculate the weighted positive score ${S}^{+}({x}_{i})$ and the weighted negative score ${S}^{-}({x}_{i})$ for each alternative $x_i$, $i=1,2,\cdots ,m$. The positive and negative scores are calculated as ${S}^{+}({x}_{i})={\displaystyle {\sum}_{j=1}^{n}{w}_{j}{g}_{ij}^{+}}$ and ${S}^{-}({x}_{i})={\displaystyle {\sum}_{j=1}^{n}{w}_{j}{g}_{ij}^{-}}$.

**Step 5.** Obtain the relative closeness degree $T({x}_{i})$ of each alternative $x_i$ as $T({x}_{i})={S}^{+}({x}_{i})/({S}^{+}({x}_{i})+{S}^{-}({x}_{i}))$, $i=1,2,\cdots ,m$.

**Step 6.** Obtain the preference order of all alternatives by comparing their relative closeness degrees; a larger closeness degree indicates a better alternative.
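Steps 1–6 can be sketched end-to-end in code. This is a hedged sketch under three assumptions: the ideal solutions are taken as the componentwise best/worst IFVs under the partial order of Definition 4, the similarities of Equations (39) and (40) are taken as one minus the IFV distance, and the IFV distance of Equation (34) is a reconstruction (its defining equation is not reproduced above):

```python
import math

def d_ifv(a, b):
    # hedged reconstruction of the IFV distance of Equation (34)
    d1 = (a[0] - a[1]) - (b[0] - b[1])
    d2 = (a[0] + a[1]) - (b[0] + b[1])
    return 0.5 * math.sqrt(d1 ** 2 + d2 ** 2 / 3)

def topsis_rank(D, w):
    """D: m x n decision matrix of IFVs (mu, v); w: attribute weights, sum 1.
    Returns alternatives sorted best-first and their closeness degrees."""
    m, n = len(D), len(D[0])
    # Step 1: relative positive/negative ideal IFV of each attribute column
    pos = [(max(D[i][j][0] for i in range(m)),
            min(D[i][j][1] for i in range(m))) for j in range(n)]
    neg = [(min(D[i][j][0] for i in range(m)),
            max(D[i][j][1] for i in range(m))) for j in range(n)]
    T = []
    for i in range(m):
        # Steps 2-4: weighted similarity (1 - distance) to the two ideals
        s_pos = sum(w[j] * (1 - d_ifv(D[i][j], pos[j])) for j in range(n))
        s_neg = sum(w[j] * (1 - d_ifv(D[i][j], neg[j])) for j in range(n))
        # Step 5: relative closeness degree
        T.append(s_pos / (s_pos + s_neg))
    # Step 6: larger closeness degree -> better alternative
    order = sorted(range(m), key=lambda i: T[i], reverse=True)
    return order, T

# Toy check: the first alternative dominates in both attributes.
D = [[(0.8, 0.1), (0.7, 0.2)],
     [(0.3, 0.5), (0.2, 0.6)]]
order, T = topsis_rank(D, [0.5, 0.5])
print(order)  # [0, 1]
```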

**Example 10.**

Suppose an investment company intends to invest a sum of money in the best option among five candidate companies: (1) $x_1$: the food company; (2) $x_2$: the chemical company; (3) $x_3$: the car company; (4) $x_4$: the computer company; (5) $x_5$: the furniture company. Four attributes are taken into account: (1) $a_1$: the benefit rate; (2) $a_2$: the risk of investment; (3) $a_3$: the difficulty of investment; and (4) $a_4$: the influence on the environment. The attribute weights of $a_1$, $a_2$, $a_3$ and $a_4$ are 0.25, 0.4, 0.2 and 0.15, respectively, i.e., $w_1=0.25$, $w_2=0.4$, $w_3=0.2$ and $w_4=0.15$. The decision matrix provided by the decision maker is:

- (1)
- Comparing the IFVs in each column, we can get the relative positive and negative ideal solutions as:$${\alpha}_{1}^{+}=\langle 0.9,0.1\rangle ,\text{}{\alpha}_{2}^{+}=\langle 0.8,0.1\rangle ,\text{}{\alpha}_{3}^{+}=\langle 0.6,0.1\rangle ,\text{}{\alpha}_{4}^{+}=\langle 0.5,0.2\rangle ;\phantom{\rule{0ex}{0ex}}{\alpha}_{1}^{-}=\langle 0.6,0.3\rangle ,\text{}{\alpha}_{2}^{-}=\langle 0.3,0.3\rangle ,\text{}{\alpha}_{3}^{-}=\langle 0.2,0.5\rangle ,\text{}{\alpha}_{4}^{-}=\langle 0.1,0.6\rangle .$$
- (2)
- Based on the distance measure $D^I$ defined in Equation (39), the positive similarity matrix can be constructed as:$${\mathit{G}}^{+}=\left(\begin{array}{llll}0.7483& 0.7918& 0.6000& 0.6000\\ 0.9000& 1.0000& 1.0000& 0.8000\\ 0.7483& 0.6945& 0.8472& 1.0000\\ 1.0000& 0.7918& 0.6945& 0.6488\\ 0.8845& 0.6785& 0.9243& 0.9243\end{array}\right)$$
- (3)
- Using the distance measure $D^I$ of IFVs and the expression in Equation (40), we can get the negative similarity matrix as:$${\mathit{G}}^{-}=\left(\begin{array}{llll}1.0000& 0.8472& 1.0000& 1.0000\\ 0.8472& 0.6394& 0.6000& 0.8000\\ 1.0000& 0.9423& 0.7483& 0.6000\\ 0.7483& 0.8472& 0.8845& 0.9423\\ 0.8472& 0.9423& 0.6488& 0.6488\end{array}\right)$$
- (4)
- According to the attribute weights, we can get the weighted positive and negative scores of all alternatives as:$${S}^{+}({x}_{1})=0.7138,{S}^{+}({x}_{2})=0.9450,{S}^{+}({x}_{3})=0.7834,{S}^{+}({x}_{4})=0.8030,{S}^{+}({x}_{5})=0.8223.$$$${S}^{-}({x}_{1})=0.9389,{S}^{-}({x}_{2})=0.7076,{S}^{-}({x}_{3})=0.8666,{S}^{-}({x}_{4})=0.8442,{S}^{-}({x}_{5})=0.8158.$$
- (5)
- The relative closeness degree of each alternative can be calculated as:$$T({x}_{1})=0.4319,T({x}_{2})=0.5718,T({x}_{3})=0.4751,T({x}_{4})=0.4875,T({x}_{5})=0.5020.$$
- (6)
- Since $T({x}_{2})>T({x}_{5})>T({x}_{4})>T({x}_{3})>T({x}_{1})$, we can rank the five alternatives $x_1$, $x_2$, $x_3$, $x_4$ and $x_5$ in the preference order: ${x}_{2}\succ {x}_{5}\succ {x}_{4}\succ {x}_{3}\succ {x}_{1}$.

**Example 11.**

Suppose that five candidates $x_1$, $x_2$, $x_3$, $x_4$ and $x_5$ are to be evaluated. They will be assessed from four aspects, namely, work performance, academic performance, leadership, and personality, which are denoted by four attributes $a_1$, $a_2$, $a_3$ and $a_4$, respectively. The weights of $a_1$, $a_2$, $a_3$ and $a_4$ are 0.1, 0.2, 0.3 and 0.4, respectively, so $w_1=0.1$, $w_2=0.2$, $w_3=0.3$ and $w_4=0.4$. The decision matrix expressed by IFVs is as follows:

- (1)
- Based on the partial order relation between IFVs, the relative positive and negative ideal solutions of each attribute can be obtained as:$${\alpha}_{1}^{+}=\langle 0.5,0.4\rangle ,\text{}{\alpha}_{2}^{+}=\langle 0.5,0.4\rangle ,\text{}{\alpha}_{3}^{+}=\langle 0.4,0.4\rangle ,\text{}{\alpha}_{4}^{+}=\langle 0.5,0.3\rangle ;\phantom{\rule{0ex}{0ex}}{\alpha}_{1}^{-}=\langle 0.3,0.6\rangle ,\text{}{\alpha}_{2}^{-}=\langle 0.2,0.6\rangle ,\text{}{\alpha}_{3}^{-}=\langle 0.1,0.9\rangle ,\text{}{\alpha}_{4}^{-}=\langle 0.3,0.6\rangle .$$
- (2)
- We can get the positive similarity matrix according to the proposed distance measures:$${\mathit{G}}^{+}=\left(\begin{array}{llll}0.9000& 0.8000& 1.0000& 1.0000\\ 0.9423& 1.0000& 0.9000& 0.8472\\ 0.8472& 0.9423& 0.8845& 0.7918\\ 0.8845& 0.7483& 0.5959& 0.9000\\ 1.0000& 0.8000& 0.9000& 0.8750\end{array}\right)$$
- (3)
- The negative similarity matrix can be obtained based on the proposed distance measure $D^I$ of IFVs and the expression in Equation (40):$${\mathit{G}}^{-}=\left(\begin{array}{llll}0.9000& 0.9423& 0.5959& 0.7483\\ 0.8472& 0.7483& 0.6945& 0.8845\\ 0.9423& 0.7918& 0.7000& 0.9423\\ 0.8845& 1.0000& 1.0000& 0.8472\\ 0.8000& 0.9423& 0.6945& 0.8635\end{array}\right)$$
- (4)
- According to the attribute weights $w_1=0.1$, $w_2=0.2$, $w_3=0.3$ and $w_4=0.4$, we can get the weighted positive and negative scores of all alternatives as:$${S}^{+}({x}_{1})=0.9500,{S}^{+}({x}_{2})=0.9031,{S}^{+}({x}_{3})=0.8553,{S}^{+}({x}_{4})=0.7769,{S}^{+}({x}_{5})=0.8800.$$$${S}^{-}({x}_{1})=0.7565,{S}^{-}({x}_{2})=0.7966,{S}^{-}({x}_{3})=0.8395,{S}^{-}({x}_{4})=0.9274,{S}^{-}({x}_{5})=0.8222.$$
- (5)
- The relative closeness degree of each alternative can be calculated as:$$T({x}_{1})=0.5567,T({x}_{2})=0.5314,T({x}_{3})=0.5047,T({x}_{4})=0.4559,T({x}_{5})=0.5170.$$
- (6)
- Since $T({x}_{1})>T({x}_{2})>T({x}_{5})>T({x}_{3})>T({x}_{4})$, we can rank the five alternatives $x_1$, $x_2$, $x_3$, $x_4$ and $x_5$ in the preference order: ${x}_{1}\succ {x}_{2}\succ {x}_{5}\succ {x}_{3}\succ {x}_{4}$.

The above two examples show that the MADM method based on the proposed distance measure $D^I$ performs as well as the existing methods in References [38] and [40], which are also developed based on the TOPSIS method. However, we note that the distance measure used in Reference [38] is defined based on the Hausdorff metric, which has weaker discrimination ability. Moreover, Joshi and Kumar’s [38] method also suffers from the problem of “division by zero”. The method proposed in Reference [40] is also effective for solving MADM in an intuitionistic fuzzy environment, but the similarity measure it uses is developed based on right-angled triangular fuzzy numbers [28], which is more complex than the proposed distance measure. So the MADM method based on the proposed distance measure and the TOPSIS method is easier to implement, with less computational complexity.

## 7. Conclusions

## Author Contributions

## Funding

## Conflicts of Interest

## References

1. Zadeh, L.A. Fuzzy sets. Inf. Control **1965**, 8, 338–353.
2. Atanassov, K. Intuitionistic fuzzy sets. Fuzzy Sets Syst. **1986**, 20, 87–96.
3. Gau, W.L.; Buehrer, D.J. Vague sets. IEEE Trans. Syst. Man Cybern. **1993**, 23, 610–614.
4. Bustince, H.; Burillo, P. Vague sets are intuitionistic fuzzy sets. Fuzzy Sets Syst. **1996**, 79, 403–405.
5. Fan, C.; Song, Y.; Fu, Q.; Lei, L.; Wang, X. New operators for aggregating intuitionistic fuzzy information with their application in decision making. IEEE Access **2018**, 6, 27214–27238.
6. Song, Y.; Wang, X.; Lei, L.; Xue, A. A novel similarity measure on intuitionistic fuzzy sets with its applications. Appl. Intell. **2015**, 42, 252–261.
7. Song, Y.; Wang, X.; Quan, W.; Huang, W. A new approach to construct similarity measure for intuitionistic fuzzy sets. Soft Comput. **2017**, 1–14.
8. Wang, X.; Song, Y. Uncertainty measure in evidence theory with its applications. Appl. Intell. **2018**, 48, 1672–1688.
9. Song, Y.; Wang, X.; Zhu, J.; Lei, L. Sensor dynamic reliability evaluation based on evidence and intuitionistic fuzzy sets. Appl. Intell. **2018**, 1–13.
10. Fan, C.; Song, Y.; Lei, L.; Wang, X.; Bai, S. Evidence reasoning for temporal uncertain information based on relative reliability evaluation. Expert Syst. Appl. **2018**, 113, 264–276.
11. Szmidt, E.; Kacprzyk, J. Distances between intuitionistic fuzzy sets. Fuzzy Sets Syst. **2000**, 114, 505–518.
12. Vlachos, K.I.; Sergiadis, G.D. Intuitionistic fuzzy information—Applications to pattern recognition. Pattern Recognit. Lett. **2007**, 28, 197–206.
13. Wang, W.; Xin, X. Distance measure between intuitionistic fuzzy sets. Pattern Recognit. Lett. **2005**, 26, 2063–2069.
14. Li, D.; Cheng, C. New similarity measures of intuitionistic fuzzy sets and application to pattern recognition. Pattern Recognit. Lett. **2002**, 23, 221–225.
15. Mitchell, H.B. On the Dengfeng–Chuntian similarity measure and its application to pattern recognition. Pattern Recognit. Lett. **2003**, 24, 3101–3104.
16. Liang, Z.; Shi, P. Similarity measures on intuitionistic fuzzy sets. Pattern Recognit. Lett. **2003**, 24, 2687–2693.
17. Hung, W.L.; Yang, M.S. On similarity measures between intuitionistic fuzzy sets. Int. J. Intell. Syst. **2008**, 23, 364–383.
18. Xu, Z.S.; Chen, J. An overview of distance and similarity measures of intuitionistic fuzzy sets. Int. J. Uncertain. Fuzz. Knowl.-Based Syst. **2008**, 16, 529–555.
19. Szmidt, E.; Kacprzyk, J. A similarity measure for intuitionistic fuzzy sets and its application in supporting medical diagnostic reasoning. In Proceedings of the 7th International Conference on Artificial Intelligence and Soft Computing (ICAISC 2004), Zakopane, Poland, 7–11 June 2004; pp. 388–393.
20. Xu, Z.S.; Yager, R.R. Intuitionistic and interval-valued intuitionistic fuzzy preference relations and their measures of similarity for the evaluation of agreement within a group. Fuzzy Optim. Decis. Mak. **2009**, 8, 123–139.
21. Xia, M.M.; Xu, Z.S. Some new similarity measures for intuitionistic fuzzy values and their application in group decision making. J. Syst. Sci. Syst. Eng. **2010**, 19, 430–452.
22. Wei, C.P.; Wang, P.; Zhang, Y.Z. Entropy, similarity measure of interval-valued intuitionistic fuzzy sets and their applications. Inf. Sci. **2011**, 181, 4273–4286.
23. Ye, J. Cosine similarity measures for intuitionistic fuzzy sets and their applications. Math. Comput. Model. **2011**, 53, 91–97.
24. Hwang, C.M.; Yang, M.S.; Hung, W.L.; Lee, M.G. A similarity measure of intuitionistic fuzzy sets based on the Sugeno integral with its application to pattern recognition. Inf. Sci. **2012**, 189, 93–109.
25. Li, J.Q.; Deng, G.N.; Li, H.X.; Zeng, W.Y. The relationship between similarity measure and entropy of intuitionistic fuzzy sets. Inf. Sci. **2012**, 188, 314–321.
26. Zhang, H.; Yu, L. New distance measures between intuitionistic fuzzy sets and interval-valued fuzzy sets. Inf. Sci. **2013**, 245, 181–196.
27. Boran, F.E.; Akay, D. A biparametric similarity measure on intuitionistic fuzzy sets with applications to pattern recognition. Inf. Sci. **2014**, 255, 45–57.
28. Chen, S.M.; Cheng, S.H.; Lan, T.C. A novel similarity measure between intuitionistic fuzzy sets based on the centroid points of transformed fuzzy numbers with applications to pattern recognition. Inf. Sci. **2016**, 343, 15–40.
29. Atanassov, K.; Gargov, G. Interval-valued intuitionistic fuzzy sets. Fuzzy Sets Syst. **1989**, 31, 343–349.
30. Chen, S.M.; Tan, J.M. Handling multicriteria fuzzy decision making problems based on vague set theory. Fuzzy Sets Syst. **1994**, 67, 163–172.
31. Hong, D.H.; Choi, C.H. Multicriteria fuzzy decision-making problems based on vague set theory. Fuzzy Sets Syst. **2000**, 114, 103–113.
32. Xu, Z.S.; Yager, R.R. Some geometric aggregation operators based on intuitionistic fuzzy sets. Int. J. Gen. Syst. **2006**, 35, 417–433.
33. Xu, Z.S. Some similarity measures of intuitionistic fuzzy sets and their applications to multiple attribute decision making. Fuzzy Optim. Decis. Mak. **2007**, 6, 109–121.
34. Hong, D.H.; Kim, C. A note on similarity measures between vague sets and between elements. Inf. Sci. **1999**, 115, 83–96.
35. Irpino, A.; Verde, R. Dynamic clustering of interval data using a Wasserstein-based distance. Pattern Recognit. Lett. **2008**, 29, 1648–1658.
36. Tran, L.; Duckstein, L. Comparison of fuzzy numbers using a fuzzy distance measure. Fuzzy Sets Syst. **2002**, 130, 331–341.
37. Lai, Y.J.; Liu, T.Y.; Hwang, C.L. TOPSIS for MODM. Eur. J. Oper. Res. **1994**, 76, 486–500.
38. Joshi, D.; Kumar, S. Intuitionistic fuzzy entropy and distance measure based on TOPSIS method for multi-criteria decision making. Egypt. Inf. J. **2014**, 15, 97–104.
39. Wu, M.C.; Chen, T.Y. The ELECTRE multicriteria analysis approach based on Atanassov’s intuitionistic fuzzy sets. Expert Syst. Appl. **2011**, 38, 12318–12327.
40. Chen, S.M.; Cheng, S.H.; Lan, T.C. Multicriteria decision making based on the TOPSIS method and similarity measures between intuitionistic fuzzy values. Inf. Sci. **2016**, 367, 279–295.

© 2018 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).

## Share and Cite

**MDPI and ACS Style**

Ke, D.; Song, Y.; Quan, W.
New Distance Measure for Atanassov’s Intuitionistic Fuzzy Sets and Its Application in Decision Making. *Symmetry* **2018**, *10*, 429.
https://doi.org/10.3390/sym10100429
