Article

Generalized Distance-Based Entropy and Dimension Root Entropy for Simplified Neutrosophic Sets

Department of Electrical Engineering and Automation, Shaoxing University, 508 Huancheng West Road, Shaoxing 312000, China
* Author to whom correspondence should be addressed.
Entropy 2018, 20(11), 844; https://doi.org/10.3390/e20110844
Submission received: 8 October 2018 / Revised: 1 November 2018 / Accepted: 2 November 2018 / Published: 4 November 2018

Abstract: In order to quantify the fuzziness in the simplified neutrosophic setting, this paper proposes a generalized distance-based entropy measure and a dimension root entropy measure of simplified neutrosophic sets (NSs) (containing interval-valued and single-valued NSs) and verifies their properties. Then, a comparison with existing related interval-valued NS entropy measures through a numerical example is carried out to demonstrate the feasibility and rationality of the presented generalized distance-based entropy and dimension root entropy measures of simplified NSs. Lastly, a decision-making example is presented to illustrate their applicability, and the decision results indicate that the presented entropy measures are effective and reasonable. Hence, this study enriches the simplified neutrosophic entropy theory and measure approaches.

1. Introduction

Since entropy is an effective approach to quantifying the uncertainty degree of objects, a lot of research on fuzzy entropy has been done along with the development of fuzzy theory. Zadeh [1] first defined fuzzy entropy for fuzzy sets regarding the probability distribution of a fuzzy event. Then, De Luca and Termini [2] formulated axioms of fuzzy entropy and proposed a non-probabilistic logarithmic fuzzy entropy. Exponential fuzzy entropy was presented by Pal and Pal [3]. Yager [4] put forward a metric distance-based entropy by measuring the lack of distinction between a fuzzy set and its complement. A weighted fuzzy entropy with trigonometric functions of the membership degree was constructed by Parkash and Sharma [5]. Thereafter, the generalized parametric exponential fuzzy entropy of order-α was introduced by Verma and Sharma [6], which reduces to the Pal and Pal exponential entropy [3] when α = 1, and becomes the De Luca and Termini logarithmic entropy [2] when α → 0. For an intuitionistic fuzzy set (IFS), which extends a fuzzy set (FS) by adding a non-membership degree, Burillo and Bustince [7] first proposed IFS and interval-valued IFS entropy measures and their axiomatic requirements. Then, Szmidt and Kacprzyk [8] redefined De Luca and Termini's axioms [2] in the IFS setting and presented an intuitionistic non-probabilistic fuzzy entropy measure based on a geometric interpretation and a ratio of distances of IFSs. Valchos and Sergiadis [9] constructed a new logarithmic entropy of IFSs on the basis of the De Luca and Termini fuzzy entropy logarithm [2]. As an extension of the logarithmic entropy [2], Zhang and Jiang [10] proposed vague entropy based on the intersection and union of the non-membership and membership degrees for vague sets, and defined vague cross-entropy for IFSs. Further, the cosine and sine entropies of IFSs were defined by Ye [11].
An exponential entropy measure of IFSs was proposed by Verma and Sharma [12], and then intuitionistic fuzzy entropies corresponding to order-α [13] and the R-norm [14] were proposed. Additionally, for interval-valued IFSs (IvIFSs), Ye [15] put forward sine and cosine entropies of IvIFSs. Wei et al. [16] also presented entropy and similarity measures of IvIFSs and described their relationships. Then, Zhang et al. [17] defined the distance-based entropy of IvIFSs and its related axiomatic requirements. Tian et al. [18] proposed a pair of generalized entropy measures on IFSs and IvIFSs.
Recently, the neutrosophic set (NS) was introduced to describe uncertain and inconsistent information by adding an indeterminacy degree to the IFS. After that, the single-valued NS (SvNS), the interval-valued NS (IvNS), and the simplified NS containing SvNS and IvNS were proposed as subsets of NS and successively used in practical applications. To measure the fuzziness of NSs, Majumder and Samanta [19] developed the entropy of SvNSs. Aydoğdu [20] introduced entropy and similarity measures of IvNSs. Then, Ye and Du [21] put forward distance, entropy, and similarity measures of IvNSs and depicted their relationships. Ye and Cui further proposed exponential entropy [22] and sine entropy [23] for simplified NSs. However, some distance-based entropy measures have not yet been developed for simplified NSs in the existing literature. Hence, it is necessary to introduce such distance-based entropy measures of simplified NSs as a complement.
Motivated by distance measures and the dimension root similarity measure [24], we propose the generalized distance-based entropy and dimension root entropy of simplified NSs in this paper. As for the framework of this paper, we introduce some concepts of simplified NSs in Section 2, and then Section 3 proposes the simplified neutrosophic generalized distance-based entropy and dimension root entropy. In Section 4, a comparative analysis of entropy measures for IvNSs is carried out to show the effectiveness and rationality of the presented entropy measures. In Section 5, a decision-making example is used to illustrate the applicability of the novel entropy measures. Lastly, the conclusions and future work of this study are given in Section 6.

2. Simplified Neutrosophic Sets

Simplified NS, which contains both SvNS and IvNS, was presented by Ye [25] as a subset of NS for convenient application. Assume there is a universal set A = {a1, a2, ..., an}; then a simplified NS B in A can be given by B = {<ai, TB(ai), UB(ai), FB(ai)> | ai ∊ A}, where B is a SvNS if TB(ai), UB(ai), FB(ai) ∊ [0, 1] and 0 ≤ TB(ai) + UB(ai) + FB(ai) ≤ 3, whereas B is an IvNS if TB(ai) = [TB−(ai), TB+(ai)], UB(ai) = [UB−(ai), UB+(ai)], and FB(ai) = [FB−(ai), FB+(ai)] with the conditions [TB−(ai), TB+(ai)], [UB−(ai), UB+(ai)], [FB−(ai), FB+(ai)] ⊆ [0, 1] and 0 ≤ TB+(ai) + UB+(ai) + FB+(ai) ≤ 3.
Provided that there are two simplified NSs B = {<ai, TB(ai), UB(ai), FB(ai)> | ai ∊ A} and C = {<ai, TC(ai), UC(ai), FC(ai)> | ai ∊ A}, then some operations between B and C can be given as follows [25,26]:
(1)
The sufficient and necessary condition of B ⊆ C for SvNSs is TB(ai) ≤ TC(ai), UB(ai) ≥ UC(ai), and FB(ai) ≥ FC(ai), while that for IvNSs is TB−(ai) ≤ TC−(ai), TB+(ai) ≤ TC+(ai), UB−(ai) ≥ UC−(ai), UB+(ai) ≥ UC+(ai), FB−(ai) ≥ FC−(ai), and FB+(ai) ≥ FC+(ai);
(2)
The sufficient and necessary condition of B = C is B ⊆ C and C ⊆ B;
(3)
The complement of a SvNS B is Bc = {<ai, FB(ai), 1 − UB(ai), TB(ai)> | ai ∊ A}, and that of an IvNS B is Bc = {<ai, [FB−(ai), FB+(ai)], [1 − UB+(ai), 1 − UB−(ai)], [TB−(ai), TB+(ai)]> | ai ∊ A};
(4)
If B and C are SvNSs, then:
B ∪ C = {<ai, TB(ai) ∨ TC(ai), UB(ai) ∧ UC(ai), FB(ai) ∧ FC(ai)> | ai ∊ A},
B ∩ C = {<ai, TB(ai) ∧ TC(ai), UB(ai) ∨ UC(ai), FB(ai) ∨ FC(ai)> | ai ∊ A},
B ⊕ C = {<ai, TB(ai) + TC(ai) − TB(ai)TC(ai), UB(ai)UC(ai), FB(ai)FC(ai)> | ai ∊ A},
B ⊗ C = {<ai, TB(ai)TC(ai), UB(ai) + UC(ai) − UB(ai)UC(ai), FB(ai) + FC(ai) − FB(ai)FC(ai)> | ai ∊ A},
γB = {<ai, 1 − (1 − TB(ai))^γ, UB^γ(ai), FB^γ(ai)> | ai ∊ A} for γ > 0,
B^γ = {<ai, TB^γ(ai), 1 − (1 − UB(ai))^γ, 1 − (1 − FB(ai))^γ> | ai ∊ A} for γ > 0.
However, if B and C are IvNSs, then:
B ∪ C = {<ai, [TB−(ai) ∨ TC−(ai), TB+(ai) ∨ TC+(ai)], [UB−(ai) ∧ UC−(ai), UB+(ai) ∧ UC+(ai)], [FB−(ai) ∧ FC−(ai), FB+(ai) ∧ FC+(ai)]> | ai ∊ A},
B ∩ C = {<ai, [TB−(ai) ∧ TC−(ai), TB+(ai) ∧ TC+(ai)], [UB−(ai) ∨ UC−(ai), UB+(ai) ∨ UC+(ai)], [FB−(ai) ∨ FC−(ai), FB+(ai) ∨ FC+(ai)]> | ai ∊ A},
B ⊕ C = {<ai, [TB−(ai) + TC−(ai) − TB−(ai)TC−(ai), TB+(ai) + TC+(ai) − TB+(ai)TC+(ai)], [UB−(ai)UC−(ai), UB+(ai)UC+(ai)], [FB−(ai)FC−(ai), FB+(ai)FC+(ai)]> | ai ∊ A},
B ⊗ C = {<ai, [TB−(ai)TC−(ai), TB+(ai)TC+(ai)], [UB−(ai) + UC−(ai) − UB−(ai)UC−(ai), UB+(ai) + UC+(ai) − UB+(ai)UC+(ai)], [FB−(ai) + FC−(ai) − FB−(ai)FC−(ai), FB+(ai) + FC+(ai) − FB+(ai)FC+(ai)]> | ai ∊ A},
γB = {<ai, [1 − (1 − TB−(ai))^γ, 1 − (1 − TB+(ai))^γ], [(UB−(ai))^γ, (UB+(ai))^γ], [(FB−(ai))^γ, (FB+(ai))^γ]> | ai ∊ A} for γ > 0,
B^γ = {<ai, [(TB−(ai))^γ, (TB+(ai))^γ], [1 − (1 − UB−(ai))^γ, 1 − (1 − UB+(ai))^γ], [1 − (1 − FB−(ai))^γ, 1 − (1 − FB+(ai))^γ]> | ai ∊ A} for γ > 0.
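As an illustration, the SvNS operations above can be sketched in Python, with an element represented as a (T, U, F) tuple. This is a minimal sketch, assuming that representation; the function names are illustrative and not taken from the cited papers.

```python
# Illustrative sketch of the SvNS operations above; an element is a (T, U, F) tuple.

def complement(e):
    # B^c = <F, 1 - U, T>, as in operation (3).
    t, u, f = e
    return (f, 1 - u, t)

def add(e1, e2):
    # B (+) C: probabilistic sum on T, products on U and F.
    t1, u1, f1 = e1
    t2, u2, f2 = e2
    return (t1 + t2 - t1 * t2, u1 * u2, f1 * f2)

def power(e, g):
    # B^gamma for gamma > 0: T is raised to gamma; U and F use 1 - (1 - x)^gamma.
    t, u, f = e
    return (t ** g, 1 - (1 - u) ** g, 1 - (1 - f) ** g)
```

For instance, complement((1, 0, 0)) yields (0, 1, 1), so a crisp element remains crisp under the complement.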

3. Simplified Neutrosophic Generalized Distance-based Entropy and Dimension Root Entropy

In this section, two novel simplified neutrosophic entropy measures, containing a simplified neutrosophic generalized distance-based entropy measure and a simplified neutrosophic dimension root entropy measure, are defined below.

3.1. Simplified Neutrosophic Generalized Distance-Based Entropy

Definition 1.
Assume a simplified NS H in a universal set A = {a1, a2, ..., an} is H = {<ai, TH(ai), UH(ai), FH(ai)> | ai ∊ A}. Then, a new generalized distance-based entropy measure of H can be defined as:
EA1^ρ(H) = (1/(3n)) Σ_{i=1}^{n} [1 − 2^ρ|TH(ai) − 0.5|^ρ + 1 − 2^ρ|UH(ai) − 0.5|^ρ + 1 − 2^ρ|FH(ai) − 0.5|^ρ] for the SvNS H and ρ > 0,   (1)
EA2^ρ(H) = (1/(6n)) Σ_{i=1}^{n} [1 − 2^ρ|TH−(ai) − 0.5|^ρ + 1 − 2^ρ|TH+(ai) − 0.5|^ρ + 1 − 2^ρ|UH−(ai) − 0.5|^ρ + 1 − 2^ρ|UH+(ai) − 0.5|^ρ + 1 − 2^ρ|FH−(ai) − 0.5|^ρ + 1 − 2^ρ|FH+(ai) − 0.5|^ρ] for the IvNS H and ρ > 0,   (2)
where ρ is an integer value.
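A minimal numerical sketch of Equations (1) and (2), assuming a SvNS is given as a list of (T, U, F) tuples and an IvNS as a list of ((T−, T+), (U−, U+), (F−, F+)) triples (the function names are illustrative):

```python
def ea1(svns, rho):
    # Equation (1): average of 1 - 2^rho * |x - 0.5|^rho over the T, U, F values.
    n = len(svns)
    total = sum(1 - (2 ** rho) * abs(x - 0.5) ** rho
                for (t, u, f) in svns for x in (t, u, f))
    return total / (3 * n)

def ea2(ivns, rho):
    # Equation (2): same form over the six interval bounds of each element.
    n = len(ivns)
    total = sum(1 - (2 ** rho) * abs(x - 0.5) ** rho
                for (t, u, f) in ivns for pair in (t, u, f) for x in pair)
    return total / (6 * n)
```

A crisp set gives entropy 0 and the fuzziest set gives entropy 1, matching properties EAP1 and EAP2 below.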
According to the axiomatic definition of the IvNS entropy measure [21], the proposed generalized distance-based entropy measure of a simplified NS has the theorem below.
Theorem 1.
Set A as a universal set A = {a1, a2, ..., an}. Assume there is a fuzziest simplified NS B = {b1, b2, ..., bn} = {<ai, TB(ai), UB(ai), FB(ai)> | ai ∊ A} in the universal set A along with each element bi = <ai, 0.5, 0.5, 0.5> (i = 1, 2, ..., n) for SvNS or bi = <ai, [0.5, 0.5], [0.5, 0.5], [0.5, 0.5]> (i = 1, 2, ..., n) for IvNS. Then the entropy measure EAk^ρ(H) (k = 1, 2; ρ > 0) of the simplified NS H = {h1, h2, ..., hn} = {<ai, TH(ai), UH(ai), FH(ai)> | ai ∊ A} satisfies the following properties:
(EAP1) EAk^ρ(H) = 0 (k = 1, 2; ρ > 0) if H is a crisp set whose element is <ai, 1, 0, 0> or <ai, 0, 0, 1> (i = 1, 2, ..., n) for SvNS and <ai, [1, 1], [0, 0], [0, 0]> or <ai, [0, 0], [0, 0], [1, 1]> for IvNS;
(EAP2) EAk^ρ(H) = 1 (k = 1, 2; ρ > 0) if and only if hi = bi for i = 1, 2, ..., n;
(EAP3) If one simplified NS H is closer to the fuzziest simplified NS B than the other simplified NS L, then H is fuzzier than L with EAk^ρ(L) < EAk^ρ(H) (k = 1, 2; ρ > 0);
(EAP4) If the complement of H is HC, then EAk^ρ(H) = EAk^ρ(HC) (k = 1, 2; ρ > 0).
Proof. 
(EAP1) If a crisp set H = {h1, h2, ..., hn} is a SvNS with hi = <ai, 1, 0, 0> or hi = <ai, 0, 0, 1> (i = 1, 2, ..., n), by Equation (1) we can obtain:
EA1^ρ(H) = (1/(3n)) Σ_{i=1}^{n} [1 − 2^ρ|TH(ai) − 0.5|^ρ + 1 − 2^ρ|UH(ai) − 0.5|^ρ + 1 − 2^ρ|FH(ai) − 0.5|^ρ] = (n/(3n))[1 − 2^ρ|1 − 0.5|^ρ + 1 − 2^ρ|0 − 0.5|^ρ + 1 − 2^ρ|0 − 0.5|^ρ] = 0
or EA1^ρ(H) = (n/(3n))[1 − 2^ρ|0 − 0.5|^ρ + 1 − 2^ρ|0 − 0.5|^ρ + 1 − 2^ρ|1 − 0.5|^ρ] = 0 for ρ > 0,
while if H = {h1, h2, ..., hn} is an IvNS with hi = <ai, [1, 1], [0, 0], [0, 0]> or hi = <ai, [0, 0], [0, 0], [1, 1]> (i = 1, 2, ..., n), by Equation (2) we have:
EA2^ρ(H) = (1/(6n)) Σ_{i=1}^{n} [1 − 2^ρ|TH−(ai) − 0.5|^ρ + 1 − 2^ρ|TH+(ai) − 0.5|^ρ + 1 − 2^ρ|UH−(ai) − 0.5|^ρ + 1 − 2^ρ|UH+(ai) − 0.5|^ρ + 1 − 2^ρ|FH−(ai) − 0.5|^ρ + 1 − 2^ρ|FH+(ai) − 0.5|^ρ] = (1/(6n)) Σ_{i=1}^{n} [1 − 2^ρ|1 − 0.5|^ρ + 1 − 2^ρ|1 − 0.5|^ρ + 1 − 2^ρ|0 − 0.5|^ρ + 1 − 2^ρ|0 − 0.5|^ρ + 1 − 2^ρ|0 − 0.5|^ρ + 1 − 2^ρ|0 − 0.5|^ρ] = 0,
or:
EA2^ρ(H) = (1/(6n)) Σ_{i=1}^{n} [1 − 2^ρ|0 − 0.5|^ρ + 1 − 2^ρ|0 − 0.5|^ρ + 1 − 2^ρ|0 − 0.5|^ρ + 1 − 2^ρ|0 − 0.5|^ρ + 1 − 2^ρ|1 − 0.5|^ρ + 1 − 2^ρ|1 − 0.5|^ρ] = 0 for ρ > 0.
(EAP2) Let f(xi) = 1 − 2^ρ|xi − 0.5|^ρ (ρ > 0) be a function for xi ∊ [0, 1] (i = 1, 2, ..., n). We find the extreme values of f(xi) on the closed interval [0, 1] using calculus techniques.
First, by removing the absolute value symbol, the function f(xi) can be expressed as:
f(xi) = { 1 − 2^ρ(0.5 − xi)^ρ, for 0 ≤ xi < 0.5; 1, for xi = 0.5; 1 − 2^ρ(xi − 0.5)^ρ, for 0.5 < xi ≤ 1 } for ρ > 0.   (3)
For ρ = 1, the first derivative of f(xi) with respect to xi apart from xi = 0.5 is:
f′(xi) = df(xi)/dxi = { 2, for 0 ≤ xi < 0.5; −2, for 0.5 < xi ≤ 1 } for ρ = 1.
It is clear that f(xi) is monotonically increasing when xi ∊ [0, 0.5) and decreasing when xi ∊ (0.5, 1]. Thus, on the interval [0, 1], f(xi) attains its maximum value of 1 at the critical point xi = 0.5 for ρ = 1.
Then, when ρ is not equal to 1, the first derivative of f(xi) with respect to xi can be calculated by:
f′(xi) = df(xi)/dxi = { 2^ρ·ρ·(0.5 − xi)^(ρ−1), for 0 ≤ xi < 0.5; 0, for xi = 0.5; −2^ρ·ρ·(xi − 0.5)^(ρ−1), for 0.5 < xi ≤ 1 }.   (4)
Obviously, the first derivative of f(xi) is equal to zero only at the point of xi = 0.5. Because f’(xi) is positive for 0 ≤ xi < 0.5 and negative for 0.5 < xi ≤ 1, the maximum of f(xi) = 1 on the closed interval [0, 1] can be obtained at the critical point xi = 0.5.
Regarding the definition of f(xi), the entropy measure of the simplified NS H = {h1, h2, ..., hn} = {<ai, TH(ai), UH(ai), FH(ai)> | ai ∊ A} can be rewritten as:
EA1^ρ(H) = (1/(3n)) Σ_{i=1}^{n} [f(TH(ai)) + f(UH(ai)) + f(FH(ai))] for the SvNS H,
EA2^ρ(H) = (1/(6n)) Σ_{i=1}^{n} [f(TH−(ai)) + f(TH+(ai)) + f(UH−(ai)) + f(UH+(ai)) + f(FH−(ai)) + f(FH+(ai))] for the IvNS H.
It is clear that if and only if hi = <ai, 0.5, 0.5, 0.5>, the maximum value of the entropy measure is EA1^ρ(H) = 1, and if and only if hi = <ai, [0.5, 0.5], [0.5, 0.5], [0.5, 0.5]>, the maximum value of the entropy measure is EA2^ρ(H) = 1.
(EAP3) According to Equations (3) and (4), f(xi) is monotonically increasing when xi ∊ [0, 0.5], and monotonically decreasing when xi ∊ [0.5, 1]. Therefore, the closer the simplified NS H is to the fuzziest set B than L, the fuzzier H is than L with EAk^ρ(L) < EAk^ρ(H) (k = 1, 2; ρ > 0).
(EAP4) When the complement of the SvNS H = {<ai, TH(ai), UH(ai), FH(ai)> | ai ∊ A} is HC = {<ai, FH(ai), 1 − UH(ai), TH(ai)> | ai ∊ A}, by Equation (1) we can obtain:
EA1^ρ(HC) = (1/(3n)) Σ_{i=1}^{n} [1 − 2^ρ|FH(ai) − 0.5|^ρ + 1 − 2^ρ|1 − UH(ai) − 0.5|^ρ + 1 − 2^ρ|TH(ai) − 0.5|^ρ] = (1/(3n)) Σ_{i=1}^{n} [1 − 2^ρ|TH(ai) − 0.5|^ρ + 1 − 2^ρ|UH(ai) − 0.5|^ρ + 1 − 2^ρ|FH(ai) − 0.5|^ρ] = EA1^ρ(H) for ρ > 0.
When the complement of the IvNS H = {<ai, [TH−(ai), TH+(ai)], [UH−(ai), UH+(ai)], [FH−(ai), FH+(ai)]> | ai ∊ A} is HC = {<ai, [FH−(ai), FH+(ai)], [1 − UH+(ai), 1 − UH−(ai)], [TH−(ai), TH+(ai)]> | ai ∊ A}, we can also have EA2^ρ(HC) = EA2^ρ(H).
Thus, the proof of the Theorem 1 is completed.  □

3.2. Simplified Neutrosophic Dimension Root Entropy

For two SvNSs B = {<ai, TB(ai), UB(ai), FB(ai)> | ai ∊ A} and C = {<ai, TC(ai), UC(ai), FC(ai)> | ai ∊ A} in the universal set A, Ye [24] defined a dimension root distance of SvNSs as follows:
D(B, C) = (1/n) Σ_{i=1}^{n} [((TB(ai) − TC(ai))^2 + (UB(ai) − UC(ai))^2 + (FB(ai) − FC(ai))^2)/3]^(1/3).
Based on the dimension root distance, we can present simplified neutrosophic dimension root entropy for a simplified NS.
Definition 2.
Assume H = {<ai, TH(ai), UH(ai), FH(ai)> | ai ∊ A} is a simplified NS in a universal set A = {a1, a2, ..., an}. Then, we can define the following dimension root entropy measure for the simplified NS H:
EB1(H) = 1 − (1/n) Σ_{i=1}^{n} [(4(TH(ai) − 0.5)^2 + 4(UH(ai) − 0.5)^2 + 4(FH(ai) − 0.5)^2)/3]^(1/3) for the SvNS H,   (5)
EB2(H) = 1 − (1/n) Σ_{i=1}^{n} [(4(TH−(ai) − 0.5)^2 + 4(TH+(ai) − 0.5)^2 + 4(UH−(ai) − 0.5)^2 + 4(UH+(ai) − 0.5)^2 + 4(FH−(ai) − 0.5)^2 + 4(FH+(ai) − 0.5)^2)/6]^(1/3) for the IvNS H.   (6)
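Equations (5) and (6) can likewise be sketched numerically, using the same tuple representations as before (a sketch with illustrative names, not a reference implementation):

```python
def eb1(svns):
    # Equation (5): one minus the average cube root of the mean squared
    # deviation (scaled by 4) from the fuzziest element (0.5, 0.5, 0.5).
    n = len(svns)
    total = sum(((4 * (t - 0.5) ** 2 + 4 * (u - 0.5) ** 2 + 4 * (f - 0.5) ** 2) / 3)
                ** (1 / 3) for (t, u, f) in svns)
    return 1 - total / n

def eb2(ivns):
    # Equation (6): the interval-valued counterpart over six bounds per element.
    n = len(ivns)
    total = sum((sum(4 * (x - 0.5) ** 2 for pair in (t, u, f) for x in pair) / 6)
                ** (1 / 3) for (t, u, f) in ivns)
    return 1 - total / n
```

As with the distance-based entropy, a crisp set yields 0, the fuzziest set yields 1, and the measure is invariant under the complement <F, 1 − U, T>.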
Similar to the proposed simplified neutrosophic distance-based entropy, the dimension root entropy of simplified NSs also has the following theorem.
Theorem 2.
Assume there is a fuzziest simplified NS B = {b1, b2, ..., bn} = {<ai, TB(ai), UB(ai), FB(ai)> | ai ∊ A} in the universal set A = {a1, a2, ..., an} with each element bi = <ai, 0.5, 0.5, 0.5> (i = 1, 2, ..., n) for SvNS or bi = <ai, [0.5, 0.5], [0.5, 0.5], [0.5, 0.5]> (i = 1, 2, ..., n) for IvNS. Then the entropy measure EBk(H) (k = 1, 2) of the simplified NS H = {h1, h2, ..., hn} = {<ai, TH(ai), UH(ai), FH(ai)> | ai ∊ A} satisfies the following properties:
(EBP1) EBk(H) = 0 if H = {h1, h2, ..., hn} is a crisp set with each element hi = <ai, 1, 0, 0> or hi = <ai, 0, 0, 1> (i = 1, 2, ..., n) for SvNS, and hi = <ai, [1, 1], [0, 0], [0, 0]> or hi = <ai, [0, 0], [0, 0], [1, 1]> (i = 1, 2, ..., n) for IvNS;
(EBP2) EBk(H) = 1 if and only if hi = bi (i = 1, 2, ..., n);
(EBP3) If one simplified NS H is closer to the fuzziest simplified NS B than the other simplified NS L, then H is fuzzier than L with EBk(L) < EBk(H) for k = 1, 2;
(EBP4) EBk(H) = EBk(HC) if HC is the complement of H.
Proof. 
(EBP1) For a crisp SvNS H = {h1, h2, ..., hn} with hi = <ai, 1, 0, 0> or hi = <ai, 0, 0, 1> (i = 1, 2, ..., n), by Equation (5) we can obtain:
EB1(H) = 1 − (1/n) Σ_{i=1}^{n} [(4(TH(ai) − 0.5)^2 + 4(UH(ai) − 0.5)^2 + 4(FH(ai) − 0.5)^2)/3]^(1/3) = 1 − (1/n) Σ_{i=1}^{n} [(4(1 − 0.5)^2 + 4(0 − 0.5)^2 + 4(0 − 0.5)^2)/3]^(1/3) = 0
or EB1(H) = 1 − (1/n) Σ_{i=1}^{n} [(4(0 − 0.5)^2 + 4(0 − 0.5)^2 + 4(1 − 0.5)^2)/3]^(1/3) = 0.
Similarly, for an IvNS H with hi = <ai, [1, 1], [0, 0], [0, 0]> or hi = <ai, [0, 0], [0, 0], [1, 1]> (i = 1, 2, ..., n), by Equation (6) we have:
EB2(H) = 1 − (1/n) Σ_{i=1}^{n} [(4(TH−(ai) − 0.5)^2 + 4(TH+(ai) − 0.5)^2 + 4(UH−(ai) − 0.5)^2 + 4(UH+(ai) − 0.5)^2 + 4(FH−(ai) − 0.5)^2 + 4(FH+(ai) − 0.5)^2)/6]^(1/3) = 1 − (1/n) Σ_{i=1}^{n} [(4(1 − 0.5)^2 + 4(1 − 0.5)^2 + 4(0 − 0.5)^2 + 4(0 − 0.5)^2 + 4(0 − 0.5)^2 + 4(0 − 0.5)^2)/6]^(1/3) = 0
or EB2(H) = 1 − (1/n) Σ_{i=1}^{n} [(4(0 − 0.5)^2 + 4(0 − 0.5)^2 + 4(0 − 0.5)^2 + 4(0 − 0.5)^2 + 4(1 − 0.5)^2 + 4(1 − 0.5)^2)/6]^(1/3) = 0.
(EBP2) Let f(xi) = 4(xi − 0.5)^2 be a function for xi ∊ [0, 1] (i = 1, 2, ..., n). It is clear that the minimum value f(xi) = 0 is obtained at the critical point xi = 0.5. Based on the function f(xi), by Equations (5) and (6) the dimension root entropy of H can be rewritten in the following form:
EB1(H) = 1 − (1/n) Σ_{i=1}^{n} [(f(TH(ai)) + f(UH(ai)) + f(FH(ai)))/3]^(1/3) for the SvNS H,
EB2(H) = 1 − (1/n) Σ_{i=1}^{n} [(f(TH−(ai)) + f(TH+(ai)) + f(UH−(ai)) + f(UH+(ai)) + f(FH−(ai)) + f(FH+(ai)))/6]^(1/3) for the IvNS H.
Obviously, if and only if TH(ai) = UH(ai) = FH(ai) = 0.5, the maximum value of the entropy measure is EB1(H) = 1; while if and only if TH−(ai) = TH+(ai) = UH−(ai) = UH+(ai) = FH−(ai) = FH+(ai) = 0.5, the maximum value of the entropy measure is EB2(H) = 1.
Thus, the property EBP2 can hold for the dimension root entropy.
(EBP3) It is obvious that f(xi) = 4(xi − 0.5)^2 is monotonically decreasing when xi ∊ [0, 0.5], and monotonically increasing when xi ∊ [0.5, 1]. Therefore, the closer the simplified NS H is to the fuzziest simplified NS B than a simplified NS L, the fuzzier H is than L with EBk(L) < EBk(H) for k = 1, 2.
(EBP4) Since the complement of the SvNS H = {<ai, TH(ai), UH(ai), FH(ai)> | ai ∊ A} is HC = {<ai, FH(ai), 1 − UH(ai), TH(ai)> | ai ∊ A}, by Equation (5) we have:
EB1(HC) = 1 − (1/n) Σ_{i=1}^{n} [(4(FH(ai) − 0.5)^2 + 4(1 − UH(ai) − 0.5)^2 + 4(TH(ai) − 0.5)^2)/3]^(1/3) = 1 − (1/n) Σ_{i=1}^{n} [(4(TH(ai) − 0.5)^2 + 4(UH(ai) − 0.5)^2 + 4(FH(ai) − 0.5)^2)/3]^(1/3) = EB1(H).
When the complement of the IvNS H = {<ai, [TH−(ai), TH+(ai)], [UH−(ai), UH+(ai)], [FH−(ai), FH+(ai)]> | ai ∊ A} is HC = {<ai, [FH−(ai), FH+(ai)], [1 − UH+(ai), 1 − UH−(ai)], [TH−(ai), TH+(ai)]> | ai ∊ A}, we can also obtain EB2(HC) = EB2(H).
Thus, the proof of the theorem is completed.  □

4. Comparative Analysis of Entropy Measures for IvNSs

This section compares the presented simplified neutrosophic entropy measures with the existing entropy measures of simplified NSs. Since a SvNS is a special case of an IvNS in which the two bounds of each interval are equal, the example adopted from [21] is illustrated only in the IvNS setting. The existing entropy measures [19,20,21,22,23] of the IvNS H used for the comparison are introduced as follows:
R1(H) = 1 − (1/(3n)) Σ_{i=1}^{n} [|TH−(ai) − 0.5| + |TH+(ai) − 0.5| + |UH−(ai) − 0.5| + |UH+(ai) − 0.5| + |FH−(ai) − 0.5| + |FH+(ai) − 0.5|],   (7)
R2(H) = 1 − 2{(1/(6n)) Σ_{i=1}^{n} [(TH−(ai) − 0.5)^2 + (TH+(ai) − 0.5)^2 + (UH−(ai) − 0.5)^2 + (UH+(ai) − 0.5)^2 + (FH−(ai) − 0.5)^2 + (FH+(ai) − 0.5)^2]}^(1/2),   (8)
R3(H) = 1 − (2/(3n)) Σ_{i=1}^{n} {max[|TH−(ai) − 0.5|, |TH+(ai) − 0.5|] + max[|UH−(ai) − 0.5|, |UH+(ai) − 0.5|] + max[|FH−(ai) − 0.5|, |FH+(ai) − 0.5|]},   (9)
R4(H) = 1 − (2/n) Σ_{i=1}^{n} max[(|TH−(ai) − 0.5| + |TH+(ai) − 0.5|)/2, (|UH−(ai) − 0.5| + |UH+(ai) − 0.5|)/2, (|FH−(ai) − 0.5| + |FH+(ai) − 0.5|)/2],   (10)
R5(H) = 1 − (1/(2n)) Σ_{i=1}^{n} {[TH−(ai) + FH−(ai)]·|UH−(ai) − (UH−(ai))^c| + [TH+(ai) + FH+(ai)]·|UH+(ai) − (UH+(ai))^c|},   (11)
R6(H) = (1/n) Σ_{i=1}^{n} (2 − |TH−(ai) − FH−(ai)| − |TH+(ai) − FH+(ai)| − |UH−(ai) − UH+(ai)|)/(2 + |TH−(ai) − FH−(ai)| + |TH+(ai) − FH+(ai)| + |UH−(ai) − UH+(ai)|),   (12)
R7(H) = (1/(6n(√e − 1))) Σ_{i=1}^{n} [TH−(ai)·e^(1 − TH−(ai)) + (1 − TH−(ai))·e^(TH−(ai)) − 1 + UH−(ai)·e^(1 − UH−(ai)) + (1 − UH−(ai))·e^(UH−(ai)) − 1 + FH−(ai)·e^(1 − FH−(ai)) + (1 − FH−(ai))·e^(FH−(ai)) − 1 + TH+(ai)·e^(1 − TH+(ai)) + (1 − TH+(ai))·e^(TH+(ai)) − 1 + UH+(ai)·e^(1 − UH+(ai)) + (1 − UH+(ai))·e^(UH+(ai)) − 1 + FH+(ai)·e^(1 − FH+(ai)) + (1 − FH+(ai))·e^(FH+(ai)) − 1],   (13)
R8(H) = (1/(6n)) Σ_{i=1}^{n} [sin(TH−(ai)π) + sin(TH+(ai)π) + sin(UH−(ai)π) + sin(UH+(ai)π) + sin(FH−(ai)π) + sin(FH+(ai)π)].   (14)
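Of the existing measures above, the Hamming-distance-based R1 (Equation (7)) and the sine entropy R8 (Equation (14)) are simple enough to sketch numerically, using the same ((T−, T+), (U−, U+), (F−, F+)) representation as before (an illustration, not a reference implementation):

```python
from math import sin, pi

def r1(ivns):
    # Equation (7): one minus the mean absolute deviation from 0.5.
    n = len(ivns)
    total = sum(abs(x - 0.5)
                for (t, u, f) in ivns for pair in (t, u, f) for x in pair)
    return 1 - total / (3 * n)

def r8(ivns):
    # Equation (14): average of sin(x * pi) over all six bounds per element.
    n = len(ivns)
    total = sum(sin(x * pi)
                for (t, u, f) in ivns for pair in (t, u, f) for x in pair)
    return total / (6 * n)
```

Both measures return 0 for a crisp IvNS and 1 for the fuzziest one, as required by the entropy axioms.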
Assume an IvNS H = {<ai, [TH−(ai), TH+(ai)], [UH−(ai), UH+(ai)], [FH−(ai), FH+(ai)]> | ai ∊ A} in the universal set A = {a1, a2, ..., an}. Then, Hn for n > 0 can be expressed as:
Hn = {<ai, [(TH−(ai))^n, (TH+(ai))^n], [1 − (1 − UH−(ai))^n, 1 − (1 − UH+(ai))^n], [1 − (1 − FH−(ai))^n, 1 − (1 − FH+(ai))^n]> | ai ∊ A}.
Provided that an IvNS H in A = {a1, a2, a3, a4, a5} = {1, 2, 3, 4, 5} is evaluated by H = {<1, [0.2, 0.3], [0.6, 0.6], [0.7, 0.8]>, <2, [0.3, 0.3], [0.5, 0.6], [0.5, 0.6]>, <3, [0.4, 0.5], [0.5, 0.5], [0, 0.1]>, <4, [1, 1], [0.4, 0.4], [0, 0.1]>, <5, [0.7, 0.8], [0.5, 0.5], [0, 0]>}, then regarding the characteristics of variables corresponding to these operations [21]: (1) H can be regarded as “large” in A; (2) H2 can be regarded as “very large”; (3) H3 can be regarded as “quite very large”; (4) H4 can be regarded as “very very large”. Then the operational results are shown in Table 1.
The entropy measure values calculated by Equations (2) and (6)–(14) are shown in Table 2, and the entropy measure curves of EA2^ρ(Hn) for n = 1, 2, 3, 4 and ρ ∊ [1, 100] are shown in Figure 1.
Intuitively, an acceptable entropy measure of IvNSs should satisfy the ranking order EA2^ρ(H) > EA2^ρ(H2) > EA2^ρ(H3) > EA2^ρ(H4). From Table 2, except for EA2^100(Hn), with the ranking order EA2^100(H) = EA2^100(H2) > EA2^100(H3) > EA2^100(H4), and R6(Hn), with the ranking order R6(H) > R6(H4) > R6(H3) > R6(H2), the entropy measure values of EA2^ρ(Hn) for ρ ∊ [1, 100), EB2(Hn), and the existing measures R1(Hn)–R5(Hn), R7(Hn), and R8(Hn) all satisfy this ranking requirement. Furthermore, Figure 1 shows that the ranking order based on the entropy measure values of EA2^ρ(Hn) is robust with respect to ρ from 1 to 100. However, as the parameter ρ increases, especially when ρ > 30, the entropy measure values of EA2^ρ(H) and EA2^ρ(H2) tend to the same value of 0.8.
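The intuitive ranking can also be checked numerically for ρ = 1; the sketch below re-implements Equation (2) and the power operation Hn for the example IvNS above (the tuple representation and function names are illustrative):

```python
# The example IvNS H over A = {1, ..., 5}; each element is ((T-, T+), (U-, U+), (F-, F+)).
H = [((0.2, 0.3), (0.6, 0.6), (0.7, 0.8)),
     ((0.3, 0.3), (0.5, 0.6), (0.5, 0.6)),
     ((0.4, 0.5), (0.5, 0.5), (0.0, 0.1)),
     ((1.0, 1.0), (0.4, 0.4), (0.0, 0.1)),
     ((0.7, 0.8), (0.5, 0.5), (0.0, 0.0))]

def ivns_power(ivns, n):
    # H^n: T bounds are raised to n; U and F bounds become 1 - (1 - x)^n.
    return [(tuple(x ** n for x in t),
             tuple(1 - (1 - x) ** n for x in u),
             tuple(1 - (1 - x) ** n for x in f)) for (t, u, f) in ivns]

def ea2(ivns, rho):
    # Equation (2): generalized distance-based entropy of an IvNS.
    total = sum(1 - (2 ** rho) * abs(x - 0.5) ** rho
                for (t, u, f) in ivns for pair in (t, u, f) for x in pair)
    return total / (6 * len(ivns))

values = [ea2(ivns_power(H, n), rho=1) for n in (1, 2, 3, 4)]
```

For ρ = 1 the computed values decrease strictly from H to H4, which agrees with the intuitive ranking stated above.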

5. Decision-Making Example Using Simplified Neutrosophic Entropy in IvNS Setting

In this section, the proposed entropy measures are applied to a decision-making problem and compared with the existing entropy measures. For convenience, an investment decision-making example adopted from [21] is used for the application. In this problem, the decision makers are requested to assess four investment projects (alternatives), namely a clothing company (g1), a food company (g2), a computer company (g3), and a house-building company (g4), over three attributes: growth (a1), risk (a2), and environmental impact (a3). They then select the best alternative for the investment company. The evaluation information of the alternative set G = {g1, g2, g3, g4} over the attribute set A = {a1, a2, a3} is given in the form of IvNSs as the following matrix:
M =
g1: <a1, [0.4, 0.6], [0.1, 0.3], [0.2, 0.3]>, <a2, [0.7, 0.9], [0.2, 0.3], [0.4, 0.5]>, <a3, [0.4, 0.5], [0.2, 0.3], [0.3, 0.4]>
g2: <a1, [0.3, 0.6], [0.3, 0.5], [0.8, 0.9]>, <a2, [0.6, 0.7], [0.1, 0.2], [0.2, 0.3]>, <a3, [0.6, 0.7], [0.1, 0.2], [0.2, 0.3]>
g3: <a1, [0.5, 0.6], [0.2, 0.3], [0.3, 0.4]>, <a2, [0.3, 0.6], [0.2, 0.3], [0.3, 0.4]>, <a3, [0.4, 0.5], [0.2, 0.5], [0.7, 0.9]>
g4: <a1, [0.5, 0.6], [0.3, 0.4], [0.8, 0.9]>, <a2, [0.7, 0.8], [0, 0.1], [0.1, 0.2]>, <a3, [0.6, 0.7], [0.1, 0.2], [0.1, 0.3]>
By applying the proposed entropy measures of Equations (2) and (6) and the existing entropy measures of Equations (7)–(14) to the above decision-making problem, the corresponding entropy measure results and ranking orders are listed in Table 3.
Obviously, the entropy measure values of EA2^ρ(gi) (i = 1, 2, 3, 4) for ρ ≤ 20 indicate the identical ranking result g3 > g1 > g2 > g4, which changes to g3 = g1 > g2 > g4 for ρ = 30 and to g3 = g1 = g2 > g4 for values of ρ ranging from 40 to 100. Hence, EA2^ρ(gi) is a better selection with relatively small values of ρ, such as ρ ≤ 20. On the other hand, the measure values of the proposed entropy EB2(gi) (i = 1, 2, 3, 4) and of the existing entropies R1(gi)–R5(gi) and R7(gi) give the same ranking order g3 > g1 > g2 > g4, while that of R6(gi) is g3 > g1 > g4 > g2. In all these ranking orders, the best alternative is g3.
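The ranking for small ρ can be reproduced with a short sketch that scores each alternative by Equation (2) at ρ = 1 and sorts by descending entropy (the dictionary layout is an illustrative encoding of the matrix M):

```python
# Evaluation matrix M: each alternative maps to its three attribute evaluations,
# each given as ((T-, T+), (U-, U+), (F-, F+)).
M = {
    "g1": [((0.4, 0.6), (0.1, 0.3), (0.2, 0.3)),
           ((0.7, 0.9), (0.2, 0.3), (0.4, 0.5)),
           ((0.4, 0.5), (0.2, 0.3), (0.3, 0.4))],
    "g2": [((0.3, 0.6), (0.3, 0.5), (0.8, 0.9)),
           ((0.6, 0.7), (0.1, 0.2), (0.2, 0.3)),
           ((0.6, 0.7), (0.1, 0.2), (0.2, 0.3))],
    "g3": [((0.5, 0.6), (0.2, 0.3), (0.3, 0.4)),
           ((0.3, 0.6), (0.2, 0.3), (0.3, 0.4)),
           ((0.4, 0.5), (0.2, 0.5), (0.7, 0.9))],
    "g4": [((0.5, 0.6), (0.3, 0.4), (0.8, 0.9)),
           ((0.7, 0.8), (0.0, 0.1), (0.1, 0.2)),
           ((0.6, 0.7), (0.1, 0.2), (0.1, 0.3))],
}

def ea2(ivns, rho):
    # Equation (2), restated here so the sketch is self-contained.
    total = sum(1 - (2 ** rho) * abs(x - 0.5) ** rho
                for (t, u, f) in ivns for pair in (t, u, f) for x in pair)
    return total / (6 * len(ivns))

# Rank alternatives by their entropy values (largest first).
ranking = sorted(M, key=lambda g: ea2(M[g], rho=1), reverse=True)
```

At ρ = 1 this reproduces the ranking g3 > g1 > g2 > g4 reported for small ρ.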

6. Conclusions

This study originally presented the generalized distance-based entropy measure and the dimension root entropy measure of simplified NSs, containing both the SvNS and IvNS generalized distance-based entropy measures and the SvNS and IvNS dimension root entropy measures. Then, their properties were discussed based on the axioms of an entropy measure of IvNSs defined in [21]. After that, a comparison between the proposed entropy measures and existing related entropy measures through a numerical example in the IvNS setting showed that the proposed entropy measures are effective and rational. An application of the two proposed entropy measures to an actual decision-making problem illustrated their feasibility and rationality by comparison with the existing ones, especially for relatively small values of the parameter ρ, such as ρ ≤ 20. The proposed simplified NS entropy measures not only complement the entropy theory of simplified NSs, but also present a new effective way of measuring uncertainty in the simplified NS setting. Our future work will focus on extending the proposed entropy measures to applications in diverse engineering fields.

Author Contributions

J.Y. proposed the entropy measures of simplified NSs and their proof; W.H.C. provided the actual decision-making example and comparative analysis; all authors wrote this paper together.

Funding

This research was funded by the National Natural Science Foundation of China (No. 61703280).

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Zadeh, L.A. Probability measure of fuzzy events. J. Math. Anal. Appl. 1968, 23, 421–427. [Google Scholar] [CrossRef]
  2. De Luca, A.S.; Termini, S. A definition of nonprobabilistic entropy in the setting of fuzzy set theory. Inf. Control. 1972, 20, 301–312. [Google Scholar] [CrossRef]
  3. Pal, N.R.; Pal, S.K. Object background segmentation using new definitions of entropy. IEEE Proc. 1989, 366, 284–295. [Google Scholar] [CrossRef]
  4. Yager, R.R. On the measure of fuzziness and negation. Part I: Membership in the unit interval. Int. J. Gen. Syst. 1979, 5, 221–229. [Google Scholar] [CrossRef]
  5. Parkash, O.; Sharma, P.K.; Mahajan, R. New measures of weighted fuzzy entropy and their applications for the study of maximum weighted fuzzy entropy principle. Inf. Sci. 2008, 178, 2389–2395. [Google Scholar] [CrossRef]
  6. Verma, R.; Sharma, B.D. On generalized exponential fuzzy entropy. Int. J. Math. Comput. Sci. 2011, 5, 1895–1898. [Google Scholar]
  7. Burillo, P.; Bustince, H. Entropy on intuitionistic fuzzy sets and on interval-valued fuzzy sets. Fuzzy Sets Syst. 1996, 78, 305–316. [Google Scholar] [CrossRef]
  8. Szmidt, E.; Kacprzyk, J. Entropy on intuitionistic fuzzy sets. Fuzzy Sets Syst. 2001, 118, 467–477. [Google Scholar] [CrossRef]
  9. Valchos, I.K.; Sergiadis, G.D. Intuitionistic fuzzy information–Applications to pattern recognition. Pattern Recognit. Lett. 2007, 28, 197–206. [Google Scholar] [CrossRef]
  10. Zhang, Q.S.; Jiang, S.Y. A note on information entropy measure for vague sets. Inf. Sci. 2008, 178, 4184–4191. [Google Scholar] [CrossRef]
  11. Ye, J. Two effective measures of intuitionistic fuzzy entropy. Computing 2010, 87, 55–62. [Google Scholar] [CrossRef]
  12. Verma, R.; Sharma, B.D. Exponential entropy on intuitionistic fuzzy sets. Kybernetika 2013, 49, 114–127. [Google Scholar]
  13. Verma, R.; Sharma, B.D. On intuitionistic fuzzy entropy of order-alpha. Adv. Fuzzy Syst. 2014, 14, 1–8. [Google Scholar] [CrossRef]
  14. Verma, R.; Sharma, B.D. R-norm entropy on intuitionistic fuzzy sets. J. Intell. Fuzzy Syst. 2015, 28, 327–335. [Google Scholar]
  15. Ye, J. Multicriteria fuzzy decision-making method using entropy weights-based correlation coefficients of interval-valued intuitionistic fuzzy sets. Appl. Math. Model. 2010, 34, 3864–3870. [Google Scholar] [CrossRef]
  16. Wei, C.P.; Wang, P.; Zhang, Y.Z. Entropy, similarity measure of interval valued intuitionistic sets and their applications. Inf. Sci. 2011, 181, 4273–4286. [Google Scholar] [CrossRef]
  17. Zhang, Q.S.; Xing, H.Y.; Liu, F.C.; Ye, J.; Tang, P. Some new entropy measures for interval-valued intuitionistic fuzzy sets based on distances and their relationships with similarity and inclusion measures. Inf. Sci. 2014, 283, 55–69. [Google Scholar] [CrossRef]
  18. Tian, H.; Li, J.; Zhang, F.; Xu, Y.; Cui, C.; Deng, Y.; Xiao, S. Entropy analysis on intuitionistic fuzzy sets and interval-valued intuitionistic fuzzy sets and its applications in mode assessment on open communities. J. Adv. Comput. Intell. Intell. Inform. 2018, 22, 147–155. [Google Scholar] [CrossRef]
  19. Majumder, P.; Samanta, S.K. On similarity and entropy of neutrosophic sets. J. Intell. Fuzzy Syst. 2014, 26, 1245–1252. [Google Scholar] [CrossRef]
  20. Aydoğdu, A. On entropy and similarity measure of interval valued neutrosophic sets. Neutrosophic Sets Syst. 2015, 9, 47–49. [Google Scholar]
  21. Ye, J.; Du, S.G. Some distances, similarity and entropy measures for interval-valued neutrosophic sets and their relationship. Int. J. Mach. Learn. Cybern. 2017, 1–9. [Google Scholar] [CrossRef]
  22. Ye, J.; Cui, W.H. Exponential entropy for simplified neutrosophic sets and its application in decision making. Entropy 2018, 20, 357. [Google Scholar] [CrossRef]
  23. Cui, W.H.; Ye, J. Improved symmetry measures of simplified neutrosophic sets and their decision-making method based on a sine entropy weight model. Symmetry 2018, 10, 225. [Google Scholar] [CrossRef]
  24. Ye, J. Fault diagnoses of hydraulic turbine using the dimension root similarity measure of single-valued neutrosophic sets. Intell. Autom. Soft Comput. 2017, 1–8. [Google Scholar] [CrossRef]
  25. Ye, J. A multicriteria decision-making method using aggregation operators for simplified neutrosophic sets. J. Intell. Fuzzy Syst. 2014, 26, 2459–2466. [Google Scholar] [CrossRef]
  26. Peng, J.-J.; Wang, J.-Q.; Wang, J.; Zhang, H.; Chen, X. Simplified neutrosophic sets and their applications in multi-criteria group decision-making problems. Int. J. Syst. Sci. 2016, 47, 2342–2358. [Google Scholar] [CrossRef]
Figure 1. The entropy measure curves of E_A2^ρ(Hn) for H1, H2, H3, H4 and ρ ∊ [1, 100].
Table 1. Operation results of Hn for n = 1, 2, 3, 4.

| Hn | a1 = 1 | a2 = 2 | a3 = 3 | a4 = 4 | a5 = 5 |
|---|---|---|---|---|---|
| H | <1, [0.2, 0.3], [0.6, 0.6], [0.7, 0.8]> | <2, [0.3, 0.3], [0.5, 0.6], [0.5, 0.6]> | <3, [0.4, 0.5], [0.5, 0.5], [0, 0.1]> | <4, [1, 1], [0.4, 0.4], [0, 0.1]> | <5, [0.7, 0.8], [0.5, 0.5], [0, 0]> |
| H2 | <1, [0.04, 0.09], [0.84, 0.84], [0.91, 0.96]> | <2, [0.09, 0.09], [0.75, 0.84], [0.75, 0.84]> | <3, [0.16, 0.25], [0.75, 0.75], [0, 0.19]> | <4, [1, 1], [0.64, 0.64], [0, 0.19]> | <5, [0.49, 0.64], [0.75, 0.75], [0, 0]> |
| H3 | <1, [0.008, 0.027], [0.936, 0.936], [0.973, 0.992]> | <2, [0.027, 0.027], [0.875, 0.936], [0.875, 0.936]> | <3, [0.064, 0.125], [0.875, 0.875], [0, 0.271]> | <4, [1, 1], [0.784, 0.784], [0, 0.271]> | <5, [0.343, 0.512], [0.875, 0.875], [0, 0]> |
| H4 | <1, [0.0016, 0.0081], [0.9744, 0.9744], [0.9919, 0.9984]> | <2, [0.0081, 0.0081], [0.9375, 0.9744], [0.9375, 0.9744]> | <3, [0.0256, 0.0625], [0.9375, 0.9375], [0, 0.3439]> | <4, [1, 1], [0.8704, 0.8704], [0, 0.3439]> | <5, [0.2401, 0.4096], [0.9375, 0.9375], [0, 0]> |
Table 2. All values of various entropy measures of IvNSs.

| Entropy value | H | H2 | H3 | H4 | Ranking order |
|---|---|---|---|---|---|
| E_A2^1(Hn) | 0.5733 | 0.3293 | 0.2083 | 0.1465 | H > H2 > H3 > H4 |
| E_A2^2(Hn) | 0.6853 | 0.4850 | 0.3193 | 0.2228 | H > H2 > H3 > H4 |
| E_A2^3(Hn) | 0.7317 | 0.5749 | 0.3950 | 0.2743 | H > H2 > H3 > H4 |
| E_A2^4(Hn) | 0.7551 | 0.6313 | 0.4503 | 0.3143 | H > H2 > H3 > H4 |
| E_A2^5(Hn) | 0.7686 | 0.6688 | 0.4927 | 0.3473 | H > H2 > H3 > H4 |
| E_A2^10(Hn) | 0.7922 | 0.7484 | 0.6110 | 0.4589 | H > H2 > H3 > H4 |
| E_A2^20(Hn) | 0.7992 | 0.7848 | 0.6963 | 0.5667 | H > H2 > H3 > H4 |
| E_A2^30(Hn) | 0.7999 | 0.7942 | 0.7309 | 0.6191 | H > H2 > H3 > H4 |
| E_A2^40(Hn) | 0.8000 | 0.7976 | 0.7499 | 0.6505 | H > H2 > H3 > H4 |
| E_A2^50(Hn) | 0.8000 | 0.7990 | 0.7618 | 0.6719 | H > H2 > H3 > H4 |
| E_A2^100(Hn) | 0.8000 | 0.8000 | 0.7862 | 0.7247 | H = H2 > H3 > H4 |
| E_B2(Hn) | 0.3534 | 0.2013 | 0.1231 | 0.0829 | H > H2 > H3 > H4 |
| R1(Hn) [21] | 0.5733 | 0.3293 | 0.2083 | 0.1465 | H > H2 > H3 > H4 |
| R2(Hn) [21] | 0.4390 | 0.2824 | 0.1750 | 0.1184 | H > H2 > H3 > H4 |
| R3(Hn) [21] | 0.5200 | 0.2707 | 0.1477 | 0.0811 | H > H2 > H3 > H4 |
| R4(Hn) [21] | 0.2400 | 0.1000 | 0.0556 | 0.0228 | H > H2 > H3 > H4 |
| R5(Hn) [19,21] | 0.9000 | 0.6109 | 0.4464 | 0.3660 | H > H2 > H3 > H4 |
| R6(Hn) [20] | 0.2938 | 0.2684 | 0.2698 | 0.2719 | H > H4 > H3 > H2 |
| R7(Hn) [22] | 0.6886 | 0.4919 | 0.3255 | 0.2272 | H > H2 > H3 > H4 |
| R8(Hn) [23] | 0.6695 | 0.4521 | 0.2902 | 0.2027 | H > H2 > H3 > H4 |
Table 3. All the results of the proposed entropy and existing entropy measures of IvNSs.

| Entropy value | g1 | g2 | g3 | g4 | Ranking order |
|---|---|---|---|---|---|
| E_A2^1(gi) | 0.6333 | 0.5333 | 0.6556 | 0.4444 | g3 > g1 > g2 > g4 |
| E_A2^2(gi) | 0.8111 | 0.7333 | 0.8378 | 0.6356 | g3 > g1 > g2 > g4 |
| E_A2^3(gi) | 0.8867 | 0.8320 | 0.9116 | 0.7351 | g3 > g1 > g2 > g4 |
| E_A2^4(gi) | 0.9252 | 0.8869 | 0.9466 | 0.7945 | g3 > g1 > g2 > g4 |
| E_A2^5(gi) | 0.9477 | 0.9203 | 0.9653 | 0.8332 | g3 > g1 > g2 > g4 |
| E_A2^10(gi) | 0.9870 | 0.9804 | 0.9930 | 0.9132 | g3 > g1 > g2 > g4 |
| E_A2^20(gi) | 0.9987 | 0.9981 | 0.9994 | 0.9412 | g3 > g1 > g2 > g4 |
| E_A2^30(gi) | 0.9999 | 0.9998 | 0.9999 | 0.9441 | g3 = g1 > g2 > g4 |
| E_A2^40(gi) | 1.0000 | 1.0000 | 1.0000 | 0.9444 | g3 = g1 = g2 > g4 |
| E_A2^50(gi) | 1.0000 | 1.0000 | 1.0000 | 0.9444 | g3 = g1 = g2 > g4 |
| E_A2^100(gi) | 1.0000 | 1.0000 | 1.0000 | 0.9444 | g3 = g1 = g2 > g4 |
| E_B2(gi) | 0.4302 | 0.3572 | 0.4571 | 0.2945 | g3 > g1 > g2 > g4 |
| R1(gi) [21] | 0.6333 | 0.5333 | 0.6556 | 0.4444 | g3 > g1 > g2 > g4 |
| R2(gi) [21] | 0.5654 | 0.4836 | 0.5972 | 0.3963 | g3 > g1 > g2 > g4 |
| R3(gi) [21] | 0.5111 | 0.4222 | 0.5333 | 0.3333 | g3 > g1 > g2 > g4 |
| R4(gi) [21] | 0.4333 | 0.3000 | 0.4667 | 0.2333 | g3 > g1 > g2 > g4 |
| R5(gi) [19,21] | 0.5200 | 0.5133 | 0.5700 | 0.3933 | g3 > g1 > g2 > g4 |
| R6(gi) [20] | 0.5687 | 0.3640 | 0.5728 | 0.3818 | g3 > g1 > g4 > g2 |
| R7(gi) [22] | 0.8165 | 0.7406 | 0.8429 | 0.6431 | g3 > g1 > g2 > g4 |
| R8(gi) [23] | 0.7852 | 0.6985 | 0.8129 | 0.5997 | g3 > g1 > g2 > g4 |
