Article

An Improved Belief Entropy to Measure Uncertainty of Basic Probability Assignments Based on Deng Entropy and Belief Interval

Yonggang Zhao, Duofa Ji, Xiaodong Yang, Liguo Fei and Changhai Zhai
1 Key lab of Structures Dynamic Behaviour and Control of the Ministry of Education, Harbin Institute of Technology, Harbin 150090, China
2 Key lab of Smart Prevention and Mitigation of Civil Engineering Disasters of the Ministry of Industry and Information Technology, Harbin Institute of Technology, Harbin 150090, China
3 School of Management, Harbin Institute of Technology, Harbin 150001, China
* Author to whom correspondence should be addressed.
Entropy 2019, 21(11), 1122; https://doi.org/10.3390/e21111122
Submission received: 23 October 2019 / Revised: 11 November 2019 / Accepted: 12 November 2019 / Published: 15 November 2019
(This article belongs to the Section Entropy and Biology)

Abstract

It is still an open issue to measure the uncertainty of a basic probability assignment function under the Dempster-Shafer theory framework, which is the foundation and preliminary work for conflict degree measurement and combination of evidence. This paper proposes an improved belief entropy to measure the uncertainty of a basic probability assignment based on Deng entropy and the belief interval, which takes the belief function and the plausibility function as the lower bound and the upper bound, respectively. Specifically, the center and the span of the belief interval are employed to define the total uncertainty degree. It can be proved that the improved belief entropy degenerates to Shannon entropy when the basic probability assignment is Bayesian. The results of numerical examples and a case study show that it is more efficient and flexible than previous uncertainty measures.

1. Introduction

Uncertainty results from both the objective world and human subjective cognition, that is, aleatory uncertainty and epistemic uncertainty [1]. Aleatory uncertainty is mainly caused by variance and randomness, which is closely associated with probability; hence, it is ineluctable. Epistemic uncertainty stems from the lack of knowledge, so it can be reduced to some extent as people's knowledge improves. Usually, uncertainty causes negative effects and consequences for decision makers, who attempt to adopt principles that avoid risk [2,3]. Handling and reducing the uncertainty hidden in information has always been a difficult problem in various fields. Nevertheless, the measurement of uncertainty is of vital importance because quantifying the uncertain degree of information is the foundation and prerequisite of further information processing and fusion [4,5].
A physical quantity, called entropy, was initially proposed by Clausius to measure uncertainty in statistical thermodynamics [6]. Then, Shannon entropy [7], developed by Shannon, was extended to measure uncertainty under probability theory and proved effective for handling uncertainty in some application systems [8,9,10]. Nevertheless, it does not produce the desired results when measuring the uncertainty of a basic probability assignment (BPA) in Dempster-Shafer (D-S) theory, which was put forward by Dempster [11] and then developed by Shafer [12]. The theory has proved to have significant advantages in representing, processing and fusing uncertain information or data by assigning probability to subsets of a set comprising multiple solutions rather than to each individual solution [13,14], and it has been accepted as a de facto standard in many fields [15,16,17,18,19,20,21,22,23,24,25,26,27,28,29,30,31,32,33,34], such as risk assessment [16,17,18,19], fault diagnosis [20,21,22,23], pattern classification [24,25,26,27], knowledge reasoning [28,29], and sensor network analysis [30].
Measuring the uncertainty of a BPA in the framework of D-S theory remains an open issue. Many efforts have been made to extend Shannon entropy to measure the uncertainty of a BPA. There are two main perspectives for measuring uncertainty, namely discord [35,36] and non-specificity [37]. For the former, "Confusion" [38], "Dissonance" [39] and "Strife" [40] were introduced to measure or quantify uncertainty. For the latter, a generalized Hartley entropy originally proposed by Dubois and Prade was employed to represent it [37]. Yager [39] and Körner [41] gave their own definitions and methods to measure non-specificity. These methods consider only either discord or non-specificity when measuring uncertainty. However, a mass function is a generalized probability assigned on the power set of the frame of discernment (FOD), and a focal element of the FOD contains one or more events [42]. Hence, discord and non-specificity should be incorporated together to measure the uncertainty of a BPA. In this way, Deng entropy [43] was put forward by taking total non-specificity and discord into consideration simultaneously based on Shannon entropy, and it has attracted plenty of attention [21,44,45].
The belief interval also provides a new insight into measuring the uncertainty of a BPA [46]; it takes the belief function and the plausibility function as the lower bound and the upper bound, respectively. Based on the belief interval, a distance-based total uncertainty measurement was proposed by Yang and Han [47] under the D-S theory framework, in which the average distance between the belief interval of each singleton and the most uncertain case represents the total uncertainty degree. Then, Deng et al. [48] gave an improved method that calculates the distance between the belief interval and the so-called most uncertain interval to define the uncertainty measurement. Although this method makes up for some deficiencies of conventional methods, it does not degenerate into Shannon entropy when the BPA is Bayesian. This is counterintuitive because D-S theory is considered a generalization of probability theory. Pan and Deng [46] proposed a new belief entropy to measure the uncertainty of a BPA considering the belief function and plausibility function while ignoring the span of the belief interval, which contains additional information and variance. Wang and Song [49] provided an uncertainty measure $SU$ which considers the imprecision of the belief interval; $SU$ is sensitive to changes in belief structures and carries little computational burden. However, $SU$ violates monotonicity, which is a crucial property of an uncertainty measure.
In this paper, an improved belief entropy is proposed based on Deng entropy and the belief interval. The improved belief entropy takes advantage of the central value and the span of the belief interval: it replaces the BPA in Deng entropy with the central value of the belief interval and adds a correction factor associated with the span of the belief interval, which represents the imprecision of the belief interval. Several numerical examples and a case study about fault diagnosis are employed to verify the effectiveness and applicability of the improved belief entropy.
The rest of this paper is organized as follows. Preliminaries of D-S theory and uncertainty measures are briefly introduced in Section 2. In Section 3, the improved belief entropy is proposed based on the belief interval and Deng entropy. Section 4 gives some numerical examples to verify the efficiency and flexibility of the improved belief entropy. In Section 5, a case study of fault diagnosis is presented to show the applicability. Finally, conclusions are summarized in Section 6.

2. Preliminaries

2.1. D-S Theory

Several preliminaries are briefly introduced [4,50].
Let Θ be a nonempty finite set of events or propositions that are mutually exclusive and exhaustive. Θ, called the frame of discernment (FOD), is defined as follows:

$$\Theta = \{\theta_1, \theta_2, \ldots, \theta_i, \ldots, \theta_N\}.$$

The power set of Θ, denoted $2^\Theta$, is represented as:

$$2^\Theta = \{\emptyset, \{\theta_1\}, \{\theta_2\}, \ldots, \{\theta_i\}, \ldots, \{\theta_N\}, \{\theta_1, \theta_2\}, \ldots, \{\theta_1, \theta_2, \ldots, \theta_i\}, \ldots, \Theta\},$$

where $\emptyset$ denotes the empty set. Each element of the power set $2^\Theta$ is called a hypothesis or proposition.
In the FOD Θ, a mass function, also called a BPA or belief structure, is defined as follows:

$$m: 2^\Theta \rightarrow [0, 1].$$

A BPA should meet the following conditions:

$$m(\emptyset) = 0, \qquad \sum_{A \in 2^\Theta} m(A) = 1.$$
A is called a focal element when $m(A) > 0$, and the set of all focal elements together with their corresponding BPAs composes a body of evidence (BOE). $m(A)$ represents how strongly the evidence supports the proposition A.
The belief function and plausibility function are defined as follows, respectively:

$$Bel(A) = \sum_{B \subseteq A} m(B), \qquad Pl(A) = \sum_{B \cap A \neq \emptyset} m(B).$$

It is obvious that $\forall A \subseteq \Theta$, $Bel(A) \le Pl(A)$. $Bel(A)$ and $Pl(A)$ represent the lower and the upper boundary of the degree to which the evidence supports A, and $[Bel(A), Pl(A)]$ is considered the belief interval for A.
In D-S theory, two evidences, denoted as m 1 and m 2 , can be combined according to Dempster’s rule of combination [11,12] as follows:
$$m(A) = (m_1 \oplus m_2)(A) = \frac{1}{1-k} \sum_{B \cap C = A} m_1(B)\, m_2(C),$$
where k is called the conflict coefficient, which measures the degree of conflict between $m_1$ and $m_2$. If k = 0, there is no conflict between $m_1$ and $m_2$; if k = 1, the conflict is absolute. In other words, the greater k is, the higher the degree of conflict. k is defined as follows:
$$k = \sum_{B \cap C = \emptyset} m_1(B)\, m_2(C).$$
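As a minimal computational sketch of these definitions (assuming a BPA is encoded as a Python dict mapping frozensets of hypotheses to masses; the function names here are illustrative, not from the paper):

```python
from itertools import product

def bel(bpa, a):
    """Belief function: total mass of the focal elements contained in A."""
    return sum(m for b, m in bpa.items() if b <= a)

def pl(bpa, a):
    """Plausibility function: total mass of the focal elements intersecting A."""
    return sum(m for b, m in bpa.items() if b & a)

def combine(m1, m2):
    """Dempster's rule of combination with conflict normalization by 1 - k."""
    fused, k = {}, 0.0
    for (b, mb), (c, mc) in product(m1.items(), m2.items()):
        inter = b & c
        if inter:
            fused[inter] = fused.get(inter, 0.0) + mb * mc
        else:
            k += mb * mc  # conflict coefficient k
    return {a: v / (1.0 - k) for a, v in fused.items()}

# Sensor reports E1, E2, E3 of Table 2; fusing all three reproduces Table 3.
T = frozenset({'F1', 'F2', 'F3'})
e1 = {frozenset({'F1'}): 0.60, frozenset({'F2'}): 0.10, frozenset({'F2', 'F3'}): 0.10, T: 0.20}
e2 = {frozenset({'F1'}): 0.05, frozenset({'F2'}): 0.80, frozenset({'F2', 'F3'}): 0.05, T: 0.10}
e3 = {frozenset({'F1'}): 0.70, frozenset({'F2'}): 0.10, frozenset({'F2', 'F3'}): 0.10, T: 0.10}
print(combine(combine(e1, e2), e3))  # m({F1}) = 0.4519, m({F2}) = 0.5048, ...
```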

2.2. Shannon Entropy and Derivatives for D-S Framework

Shannon entropy, also known as information entropy and proposed by Shannon in 1948, is closely associated with uncertainty. Shannon referred to the concept of thermal entropy, which is a physical quantity indicating the degree of chaos of molecular states in thermodynamics. He thought that there was a close link between information volume and uncertainty, and then defined information entropy as follows:
$$H_s = -\sum_{i=1}^{N} p_i \log_b p_i,$$

where $H_s$ denotes Shannon entropy (information entropy), N is the number of basic states, and $p_i$ satisfies $\sum_{i=1}^{N} p_i = 1$. The base b is assigned a value of 2 when the unit of information is the bit. Then, Shannon entropy can be expressed as follows:

$$H_s = -\sum_{i=1}^{N} p_i \log_2 p_i.$$
Greater information entropy indicates more complexity and uncertainty in the information, and Shannon entropy largely succeeds in measuring the uncertainty of information under the framework of probability theory. Nevertheless, there are still some limitations [48]. The concept of entropy is also an open issue under the framework of D-S theory. Definitions of some typical uncertainty measures in D-S theory are briefly described as follows:
Dubois and Prade’s weighted Hartley entropy is shown as follows [37]:
$$H_{DP}(m) = \sum_{A \in 2^\Theta} m(A) \log_2 |A|,$$

where A is a focal element and $|A|$ is the cardinality of A.
Höhle’s confusion measure is shown as follows [38]:
$$H_H(m) = -\sum_{A \in 2^\Theta} m(A) \log_2 Bel(A).$$
The dissonance measure of Yager is defined as follows [39]:
$$H_Y(m) = -\sum_{A \in 2^\Theta} m(A) \log_2 Pl(A).$$
The discord measure of Klir and Ramer is defined as follows [35]:
$$H_{KR}(m) = -\sum_{A \in 2^\Theta} m(A) \log_2 \sum_{B \in 2^\Theta} m(B) \frac{|A \cap B|}{|B|},$$

where $|B|$ is the cardinality of B, which is also a focal element.
Klir and Parviz gave their strife measure of entropy. It is defined as follows [40]:
$$H_{KP}(m) = -\sum_{A \in 2^\Theta} m(A) \log_2 \sum_{B \in 2^\Theta} m(B) \frac{|A \cap B|}{|A|}.$$
George and Pal proposed a method called conflict measure for entropy [51]:
$$H_{GP}(m) = \sum_{A \in 2^\Theta} m(A) \sum_{B \in 2^\Theta} m(B) \left(1 - \frac{|A \cap B|}{|A \cup B|}\right),$$

where $|A \cap B|$ and $|A \cup B|$ represent the cardinality of $A \cap B$ and $A \cup B$, respectively.
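For illustration, two of these measures translate directly into code under the same dict-of-frozensets encoding used above (a sketch; the function names are ours, and the pl helper comes from the Section 2.1 snippet):

```python
import math

def weighted_hartley(bpa):
    """Dubois and Prade's weighted Hartley entropy: sum of m(A) * log2|A|.
    It captures non-specificity only and is zero for Bayesian BPAs."""
    return sum(m * math.log2(len(a)) for a, m in bpa.items() if m > 0)

def yager_dissonance(bpa):
    """Yager's dissonance measure: -sum of m(A) * log2(Pl(A))."""
    return -sum(m * math.log2(pl(bpa, a)) for a, m in bpa.items() if m > 0)
```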

2.3. Deng Entropy

For mass functions, a new uncertainty measure called Deng entropy under D-S framework was defined as follows [43]:
$$E_d(m) = -\sum_{A \in 2^\Theta} m(A) \log_2 \frac{m(A)}{2^{|A|}-1}.$$
As shown in the above definition, Deng entropy is similar in form to classical Shannon entropy. However, the mass function of each focal element is divided by a term $(2^{|A|}-1)$, which represents the scale of the focal element A. If each focal element contains only one element, Deng entropy degenerates to Shannon entropy as follows:
$$E_d(m) = -\sum_{A \in 2^\Theta} m(A) \log_2 \frac{m(A)}{2^{|A|}-1} = -\sum_{A \in 2^\Theta} m(A) \log_2 m(A).$$
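Under the same encoding, Deng entropy is a one-line extension of Shannon entropy (a sketch; the function name is ours):

```python
import math

def deng_entropy(bpa):
    """Deng entropy: -sum of m(A) * log2( m(A) / (2^|A| - 1) )."""
    return -sum(m * math.log2(m / (2 ** len(a) - 1))
                for a, m in bpa.items() if m > 0)

# For a Bayesian BPA (all focal elements are singletons), the divisor is 1
# and the result equals Shannon entropy, e.g. log2(5) = 2.3219 here.
bayesian = {frozenset({i}): 0.2 for i in range(5)}
print(deng_entropy(bayesian))
```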

3. The Improved Belief Entropy

By reviewing the relevant literature, we conclude that the mass function, the belief function and the plausibility function are the quantities employed by mainstream uncertainty measures. The belief interval $[Bel(A), Pl(A)]$, which contains more of the information in the D-S framework, is often overlooked; only a few articles on uncertainty measurement mention it [5,46,47,49,50,52]. In this article, an improved belief entropy involving the belief interval is proposed based on Deng entropy and the belief interval, defined as follows:
$$H_{inter}(m) = -\sum_{i=1}^{n} \frac{Bel(\theta_i)+Pl(\theta_i)}{2} \log_2 \left( \frac{Bel(\theta_i)+Pl(\theta_i)}{2}\, e^{-(Pl(\theta_i)-Bel(\theta_i))} \right) - \sum_{A \neq \theta_i,\, A \in 2^\Theta} m(A) \log_2 \left( \frac{m(A)}{2^{|A|}-1}\, e^{-(Pl(A)-Bel(A))} \right).$$
The improved belief entropy proposed in this article has two improvements over Deng entropy. First, the mass function $m(A)$ used in Deng entropy is replaced by the central value of the belief interval [46]. $Bel(A)$ and $Pl(A)$ are the lower and upper limit functions of the probability with which proposition A is supported [53], and the belief interval has higher accuracy than the mass function in illustrating how strongly the evidence supports A under a strict D-S framework. For simplicity, the mean of $Bel(A)$ and $Pl(A)$ is employed to discretize the belief interval for computing. Second, a coefficient $e^{-(Pl(A)-Bel(A))}$ is added to measure the non-specificity of the belief structure: the non-specificity of the belief interval can be quantified by its imprecision degree, which is related to the span of the belief interval [49]. Additionally, the properties proposed by Klir and Wierman [54] for total uncertainty measures are examined for the improved belief entropy as follows:
Probabilistic consistency: If all the focal elements of a BPA are singletons, then $m(x) = Bel(x) = Pl(x)$ for every $x \in \Theta$. Obviously, the improved belief entropy then degenerates to Shannon entropy. Hence, the probabilistic consistency property is satisfied.
Set consistency: Set consistency requires that $H(m) = \log_2(|a|)$ whenever m is categorical with focal element a, i.e., $m(a) = 1$. For the improved belief entropy, when $m(a) = 1$:

$$H_{inter}(m) = \log_2(2^{|a|}-1) \neq \log_2(|a|),$$

where $|a|$ is the cardinality of a. Therefore, the improved belief entropy is not set consistent.
Range: The range property requires that for any BPA $m_X$ in X, $0 \le H(m_X) \le \log_2(|X|)$. A simple counterexample is employed:
Let $\Theta = \{\theta_1, \theta_2, \theta_3\}$ be the FOD and consider the mass function $m(\{\theta_1, \theta_2\}) = 0.5$, $m(\{\theta_3\}) = 0.1$, $m(\{\theta_2\}) = 0.1$, and $m(\Theta) = 0.3$. Then,

$$Bel(\theta_1) = 0, \quad Pl(\theta_1) = 0.8, \quad Bel(\{\theta_1, \theta_2\}) = 0.6, \quad Pl(\{\theta_1, \theta_2\}) = 0.9, \quad Bel(\theta_3) = 0.1, \quad Pl(\theta_3) = 0.4, \quad Bel(\theta_2) = 0.1, \quad Pl(\theta_2) = 0.9,$$

$$H_{inter} = -0.4\log_2(0.4e^{-0.8}) - 0.5\log_2(0.5e^{-0.8}) - 0.25\log_2(0.25e^{-0.3}) - 0.5\log_2\left(\frac{0.5}{3}e^{-0.3}\right) - 0.3\log_2\frac{0.3}{7} = 5.5479 > \log_2 3 = 1.5850.$$
Obviously, the improved belief entropy does not satisfy the range property.
Subadditivity: To investigate whether the improved belief entropy satisfies subadditivity, we take the following example:
Let $X \times Y$ be the product space of the sets $X = \{x_1, x_2, x_3\}$ and $Y = \{y_1, y_2\}$. Consider the following joint BPA m on $X \times Y$:

$$m(\{z_{11}, z_{12}, z_{21}\}) = 0.2, \quad m(\{z_{31}, z_{32}\}) = 0.3, \quad m(\{z_{21}\}) = 0.1, \quad m(X \times Y) = 0.4,$$

where $z_{ij} = (x_i, y_j)$. The marginal BPAs of m on X and Y are $m_1$ and $m_2$:

$$m_1(\{x_1, x_2\}) = 0.2, \quad m_1(\{x_3\}) = 0.3, \quad m_1(\{x_2\}) = 0.1, \quad m_1(X) = 0.4; \qquad m_2(\{y_1\}) = 0.1, \quad m_2(Y) = 0.9.$$
Therefore,
$$Bel(x_1) = 0, \quad Pl(x_1) = 0.6, \quad Bel(\{x_1, x_2\}) = 0.3, \quad Pl(\{x_1, x_2\}) = 0.7, \quad Bel(x_3) = 0.3, \quad Pl(x_3) = 0.7, \quad Bel(x_2) = 0.1, \quad Pl(x_2) = 0.7, \quad Bel(y_1) = 0.1, \quad Pl(y_1) = 1,$$
$$Bel(\{z_{11}, z_{12}, z_{21}\}) = 0.3, \quad Pl(\{z_{11}, z_{12}, z_{21}\}) = 0.7, \quad Bel(\{z_{11}\}) = Bel(\{z_{12}\}) = Bel(\{z_{22}\}) = Bel(\{z_{31}\}) = Bel(\{z_{32}\}) = 0, \quad Bel(\{z_{31}, z_{32}\}) = 0.3,$$
$$Pl(\{z_{11}\}) = Pl(\{z_{12}\}) = 0.6, \quad Pl(\{z_{31}\}) = Pl(\{z_{32}\}) = 0.7, \quad Pl(\{z_{31}, z_{32}\}) = 0.7, \quad Bel(\{z_{21}\}) = 0.1, \quad Pl(\{z_{21}\}) = 0.7, \quad Pl(\{z_{22}\}) = 0.4,$$
$$H_{inter}(m_1) + H_{inter}(m_2) = 8.8473, \qquad H_{inter}(m) = 8.8729.$$
It is obvious that $H_{inter}(m_1) + H_{inter}(m_2) < H_{inter}(m)$. Hence, the subadditivity property is not satisfied.
Additivity: To examine additivity, the notation of the previous example is employed. Let $X \times Y$ be the product space of the sets $X = \{x_1, x_2, x_3\}$ and $Y = \{y_1, y_2\}$, with the marginal BPAs $m_1$ on X and $m_2$ on Y as follows:

$$m_1(\{x_1, x_2\}) = 0.2, \quad m_1(\{x_3\}) = 0.3, \quad m_1(\{x_2\}) = 0.1, \quad m_1(X) = 0.4; \qquad m_2(\{y_1\}) = 0.1, \quad m_2(Y) = 0.9.$$

The BPA $m = m_1 \times m_2$ on $X \times Y$ is built; the marginal BPAs of m are $m_1$ and $m_2$, which are noninteractive. The masses of m are as follows:

$$m(\{z_{11}, z_{12}, z_{21}\}) = 0.2, \quad m(\{z_{31}, z_{32}\}) = 0.3, \quad m(\{z_{21}\}) = 0.1, \quad m(X \times Y) = 0.4,$$
where $z_{ij} = (x_i, y_j)$. It can be calculated that:

$$H_{inter}(m_1) + H_{inter}(m_2) = 8.8473, \qquad H_{inter}(m) = 8.8729.$$
The result that $H_{inter}(m_1) + H_{inter}(m_2) \neq H_{inter}(m)$ shows that additivity is not satisfied either. In summary, the improved entropy satisfies probabilistic consistency but does not satisfy set consistency, range, additivity or subadditivity. It is true that these five requirements help identify whether a definition of belief entropy makes sense. However, they are not the only criteria for judging the rationality and effectiveness of a belief entropy. On the one hand, these requirements are motivated by the properties of Shannon entropy, which are not entirely applicable to the D-S theory framework, and there is currently no uniform definition of uncertainty measurement in D-S theory. It should be tolerable that very few uncertainty measures under the D-S framework satisfy all the requirements; they deserve the opportunity to be tested in practice, and the same is true of the improved belief entropy in this paper. On the other hand, the properties of belief entropy should keep pace with the times. It was twenty years ago that Klir and Wierman proposed these five requirements, and research on uncertainty measures in D-S theory has advanced significantly since then. Some shortcomings in the properties of Klir and Wierman were stated by Jiroušek and Shenoy [55], who proposed a list of six desired properties of entropy for D-S theory that differ from those of Klir and Wierman; none of the existing definitions satisfy all six properties except their own. Hence, the properties of uncertainty measures in D-S theory need further research.

4. Numerical Examples

In this section, several examples are employed to show the efficiency of the improved belief entropy.
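Before walking through the examples, the improved belief entropy can be sketched in code as follows (a minimal sketch under the dict-of-frozensets encoding used earlier; it reuses the bel and pl helpers from the Section 2.1 snippet, and the function name is ours). It reproduces, for instance, the value 2.3219 for the uniform Bayesian BPA of Example 2 and 1.5389 for $m_1$ of Example 4:

```python
import math

def h_inter(bpa, frame):
    """Improved belief entropy: singleton terms use the belief-interval center
    (Bel + Pl) / 2 with the correction factor e^{-(Pl - Bel)}; non-singleton
    focal elements keep Deng entropy's scale factor 2^|A| - 1."""
    total = 0.0
    for x in frame:                          # first sum: singletons of the FOD
        a = frozenset({x})
        b, p = bel(bpa, a), pl(bpa, a)
        center, span = (b + p) / 2.0, p - b
        if center > 0:
            total -= center * math.log2(center * math.exp(-span))
    for a, m in bpa.items():                 # second sum: non-singleton focal elements
        if len(a) > 1 and m > 0:
            span = pl(bpa, a) - bel(bpa, a)
            total -= m * math.log2(m / (2 ** len(a) - 1) * math.exp(-span))
    return total
```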

4.1. Example 1

Let $\Theta = \{\theta_1, \theta_2, \ldots, \theta_n\}$ be the FOD and consider the vacuous BPA $m(\Theta) = 1$ under this FOD. It is obvious that:

$$Bel(\Theta) = Pl(\Theta) = 1.$$

The associated Shannon entropy and improved belief entropy are as follows:

$$H_s(m) = -1 \times \log_2 1 = 0, \qquad H_{inter}(m) = -1 \times \log_2 \frac{1}{2^n - 1} = \log_2(2^n - 1).$$
Again, the above results verify that the improved belief entropy degenerates to Shannon entropy when there is only a single element in the vacuous BPA ($n = 1$). Moreover, $H_{inter} > H_s$ if $n > 1$, and $H_{inter} \to n$ as $n \to +\infty$. This behavior, which is the same as that of Deng entropy, is logical because uncertainty increases as the number of elements in the vacuous BPA increases. It can also be seen that the application of Shannon entropy in D-S theory has limitations.

4.2. Example 2

Given the FOD $\Theta = \{\theta_1, \theta_2, \theta_3, \theta_4, \theta_5\}$, let $m(\theta_i) = 0.2$ for $i = 1, 2, 3, 4, 5$. Then, $Bel(\theta_i) = Pl(\theta_i) = 0.2$ for $i = 1, 2, 3, 4, 5$. The associated Shannon entropy and improved belief entropy are as follows:

$$H_s(m) = -0.2 \log_2 0.2 \times 5 = 2.3219, \qquad H_{inter}(m) = -\frac{0.2+0.2}{2} \log_2 \left( \frac{0.2+0.2}{2(2^1-1)}\, e^{-(0.2-0.2)} \right) \times 5 = 2.3219.$$
Shannon entropy equals the improved belief entropy in this example, which verifies that the improved belief entropy is probabilistically consistent when the BPA is Bayesian.

4.3. Example 3

Let m be a belief structure on $\Theta = \{a, b, c, d\}$, and consider two BPAs on this FOD: $m_1(\{a, b\}) = 0.4$, $m_1(\{c, d\}) = 0.6$ and $m_2(\{a, c\}) = 0.4$, $m_2(\{b, c\}) = 0.6$. The result with Deng entropy is as follows [4]:

$$E_d(m_1) = -0.4 \log_2 \frac{0.4}{2^2-1} - 0.6 \log_2 \frac{0.6}{2^2-1} = 2.5559, \qquad E_d(m_2) = -0.4 \log_2 \frac{0.4}{2^2-1} - 0.6 \log_2 \frac{0.6}{2^2-1} = 2.5559.$$
The limitations of Deng entropy are transparent here because the result is counterintuitive. The mass values of the two BPAs are the same, but the focal elements of $m_1$ cover four events a, b, c, d, while those of $m_2$ cover only three events a, b, c. Intuitively, the uncertainty of $m_2$ should be less than that of $m_1$. However, the equality of the two Deng entropies illustrates that Deng entropy does not recognize the difference between $m_1$ and $m_2$.
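Using the deng_entropy sketch from Section 2.3, this indistinguishability can be checked directly:

```python
m1 = {frozenset({'a', 'b'}): 0.4, frozenset({'c', 'd'}): 0.6}
m2 = {frozenset({'a', 'c'}): 0.4, frozenset({'b', 'c'}): 0.6}
# Both evaluate to 2.5559...: Deng entropy cannot tell the two BPAs apart.
print(deng_entropy(m1), deng_entropy(m2))
```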
With the improved belief entropy, the result is calculated as follows:
$$H_{inter}(m_1) = -0.4 \log_2(0.2 e^{-0.4}) - 0.6 \log_2(0.3 e^{-0.6}) - 0.4 \log_2\left(\frac{0.4}{3} e^{-0.6}\right) - 0.6 \log_2\left(\frac{0.6}{3} e^{-0.4}\right) = 5.9696,$$
$$H_{inter}(m_2) = -0.2 \log_2(0.2 e^{-0.4}) - 0.3 \log_2(0.3 e^{-0.6}) - 0.5 \log_2(0.5 e^{-1}) - 0.4 \log_2\left(\frac{0.4}{3} e^{-0.6}\right) - 0.6 \log_2\left(\frac{0.6}{3} e^{-0.4}\right) = 5.8303.$$
The results show that the improved belief entropy of $m_2$ is less than that of $m_1$. It can be concluded that the improved belief entropy effectively measures the difference between these two BPAs by taking into consideration the more reliable information implied in them.

4.4. Example 4

Let the FOD Θ be $\{\theta_1, \theta_2\}$ and assume that there are two BPAs $m_1$ and $m_2$ over Θ:

$$m_1(\{\theta_2\}) = 0.8, \quad m_1(\{\theta_1, \theta_2\}) = 0.2; \qquad m_2(\{\theta_1\}) = 0.2, \quad m_2(\{\theta_2\}) = 0.8.$$
A comparative experiment is conducted with another uncertainty measure, $SU$, which is also based on the belief interval [49]. The definition of $SU$ is as follows:
Let m be a BPA defined on the FOD $\Theta = \{\theta_1, \theta_2, \ldots, \theta_i, \ldots, \theta_N\}$. The total uncertainty degree of m can be expressed by

$$SU(m) = \sum_{i=1}^{N} \left[ -\frac{Bel(\{\theta_i\}) + Pl(\{\theta_i\})}{2} \log_2 \frac{Bel(\{\theta_i\}) + Pl(\{\theta_i\})}{2} + \frac{Pl(\{\theta_i\}) - Bel(\{\theta_i\})}{2} \right].$$
For $m_1$:

$$[Bel_{m_1}(\{\theta_1\}), Pl_{m_1}(\{\theta_1\})] = [0, 0.2], \quad [Bel_{m_1}(\{\theta_2\}), Pl_{m_1}(\{\theta_2\})] = [0.8, 1], \quad [Bel_{m_1}(\{\theta_1, \theta_2\}), Pl_{m_1}(\{\theta_1, \theta_2\})] = [1, 1].$$
For $m_2$:

$$[Bel_{m_2}(\{\theta_1\}), Pl_{m_2}(\{\theta_1\})] = [0.2, 0.2], \quad [Bel_{m_2}(\{\theta_2\}), Pl_{m_2}(\{\theta_2\})] = [0.8, 0.8], \quad [Bel_{m_2}(\{\theta_1, \theta_2\}), Pl_{m_2}(\{\theta_1, \theta_2\})] = [1, 1].$$
Thus,

$$\forall A \subseteq \Theta: [Bel_{m_1}(A), Pl_{m_1}(A)] \supseteq [Bel_{m_2}(A), Pl_{m_2}(A)].$$
The results are as follows:
$$SU(m_1) = \left(-\frac{0+0.2}{2} \log_2 \frac{0+0.2}{2} + \frac{0.2-0}{2}\right) + \left(-\frac{0.8+1}{2} \log_2 \frac{0.8+1}{2} + \frac{1-0.8}{2}\right) = 0.6690,$$
$$SU(m_2) = -0.2 \log_2 0.2 - 0.8 \log_2 0.8 = 0.7219.$$
It is obvious that $SU(m_1) < SU(m_2)$. This result shows that the monotonicity defined by Abellan [56] is violated by the uncertainty measure $SU$ between $m_1$ and $m_2$. The property of monotonicity is defined as follows:
Given an uncertainty measure $UM$ and two arbitrary BPAs $m_1$ and $m_2$ over the FOD Θ, $UM$ satisfies monotonicity if

$$\forall A \subseteq \Theta: [Bel_{m_1}(A), Pl_{m_1}(A)] \supseteq [Bel_{m_2}(A), Pl_{m_2}(A)]$$

implies $UM(m_1) \ge UM(m_2)$.
With the improved belief entropy, the result of the example above is calculated as follows:
$$H_{inter}(m_1) = -0.1 \log_2(0.1 e^{-0.2}) - 0.9 \log_2(0.9 e^{-0.2}) - 0.2 \log_2 \frac{0.2}{3} = 1.5389, \qquad H_{inter}(m_2) = -0.2 \log_2 0.2 - 0.8 \log_2 0.8 = 0.7219.$$
The result that $H_{inter}(m_1) > H_{inter}(m_2)$ shows that the improved belief entropy performs better than $SU$ in this example in terms of monotonicity. A rigorous proof of the monotonicity property needs further research.
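As a check on this example, the $SU$ measure defined above can be sketched alongside h_inter (reusing the earlier helpers; the names are ours):

```python
import math

def su(bpa, frame):
    """Wang and Song's SU measure: per singleton, an entropy term on the
    belief-interval center plus half the interval width (its imprecision)."""
    total = 0.0
    for x in frame:
        a = frozenset({x})
        b, p = bel(bpa, a), pl(bpa, a)
        center = (b + p) / 2.0
        if center > 0:
            total += -center * math.log2(center) + (p - b) / 2.0
    return total

frame = ('t1', 't2')
m1 = {frozenset({'t2'}): 0.8, frozenset({'t1', 't2'}): 0.2}
m2 = {frozenset({'t1'}): 0.2, frozenset({'t2'}): 0.8}
print(su(m1, frame), su(m2, frame))            # 0.6690 < 0.7219: monotonicity violated
print(h_inter(m1, frame), h_inter(m2, frame))  # 1.5389 > 0.7219: order preserved
```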

4.5. Example 5

There is a FOD $\Theta = \{1, 2, \ldots, 14, 15\}$ with 15 elements. The BPA on Θ is as follows:

$$m(\{7\}) = 0.05, \quad m(\{3, 4, 5\}) = 0.05, \quad m(A) = 0.8, \quad m(\Theta) = 0.1.$$
The number of elements in the proposition A changes from 1 to 14. To verify the advantages of the improved belief entropy, another nine uncertainty measures are introduced for comparison: Deng entropy [43], Höhle's confusion measure [38], Yager's dissonance measure [39], Dubois and Prade's weighted Hartley entropy [37], Klir and Ramer's discord measure [35], Klir and Parviz's strife measure [40], George and Pal's conflict measure [51], Pan and Zhou's measure [50] and Zhou et al.'s measure [4].
The experimental results are listed in Table 1, and the results of all the uncertainty measures are plotted in Figure 1. To make the comparison more visible, the results of the methods in [35,37,38,39,40,51] are extracted and plotted in Figure 2. These curves can be divided into two categories. The first category includes Höhle's confusion measure, Yager's dissonance measure, Klir and Ramer's discord measure, Klir and Parviz's strife measure, and George and Pal's conflict measure: they are flat or on a downward trend as the number of elements in A rises, which is counterintuitive, because these measures capture only the discord uncertainty while ignoring the non-specificity uncertainty. The other category consists of Deng entropy, Pan and Zhou's measure, Dubois and Prade's weighted Hartley entropy and the improved belief entropy proposed in this paper. What they have in common is a rising trend as the number of elements in A increases; these methods take non-specificity into consideration, so their results are rational. However, Dubois and Prade's weighted Hartley entropy considers only non-specificity while ignoring discord, Deng entropy fails to detect BPAs in which an element belongs to different focal elements, and Pan and Zhou's method applies the pignistic transformation and the plausibility transformation, which have deficiencies and limitations in D-S theory. Compared with Zhou et al.'s measure, which is a fairly comprehensive uncertainty measure of BPA, the improved belief entropy follows the same trend. Therefore, the improved belief entropy, which considers both the central value and the span of the belief interval, is relatively more effective and reasonable than the other uncertainty measures under the D-S theory framework.

5. Application

To verify the effectiveness and applicability of the improved belief entropy in practice, the case study in [57] and the fault diagnosis method in [44] are employed in this part. The difference is that this paper replaces the Deng entropy applied in [44] with the improved belief entropy for comparison.
The problem is described as follows. There are three fault types, denoted $F_1$, $F_2$, and $F_3$, so the FOD is $\Theta = \{F_1, F_2, F_3\}$. Three sensor reports of the diagnostic result are listed in Table 2. With Dempster's rule of combination in Equation (6), the combination result is shown in Table 3. It is difficult to judge which fault type has occurred because the combined BPAs of $F_1$ and $F_2$ are very close; Dempster's rule of combination alone does not resolve this case.
To solve this problem, a fault diagnosis method based on Deng entropy [44] was put forward, in which the uncertainty or reliability of the sensor data is modeled as a weight for each BPA, defined as follows:

$$w(i) = w_s(i) \times w_d(i),$$
where $w_s(i)$ denotes the static reliability and $w_d(i)$ represents the dynamic reliability. The $w_s(i)$ of each BOE is listed in Table 4. $w_d(i)$ is defined as follows:

$$w_d(i) = Crd(i) \times \frac{E_d(m_i)}{\max\{E_d(m_i)\}},$$
where $Crd(i)$ is the credibility degree of $E_i$, $E_d(m_i)$ is the Deng entropy of $E_i$, and $\max\{E_d(m_i)\}$ is the maximum over all the $E_d(m_i)$. The $Crd(i)$ and $E_d(m_i)$ of the three BOEs are shown in Table 4; details can be found in the work of Yuan et al. [44]. The ultimate weight of each BOE based on the improved belief entropy $H_{inter}(m_i)$ proposed in this paper is defined as follows:
$$w(i) = w_s(i) \times Crd(i) \times \frac{\max\{H_{inter}(m_i)\}}{H_{inter}(m_i)}.$$
It can be seen that Equation (23) is a little different from that in [44]. Intuitively, a piece of evidence with less uncertainty should be endowed with a higher weight, which is consistent with the principle of the entropy weight method; therefore, the method in [44] is modified and replaced with Equation (23). The improved belief entropy of each BOE is calculated as follows:
$$H_{inter}(m_1) = -\sum_{i=1}^{n} \frac{Bel_1(\theta_i)+Pl_1(\theta_i)}{2} \log_2 \left( \frac{Bel_1(\theta_i)+Pl_1(\theta_i)}{2}\, e^{-(Pl_1(\theta_i)-Bel_1(\theta_i))} \right) - \sum_{A \neq \theta_i,\, A \in 2^\Theta} m_1(A) \log_2 \left( \frac{m_1(A)}{2^{|A|}-1}\, e^{-(Pl_1(A)-Bel_1(A))} \right)$$
$$= -0.7 \log_2(0.7 e^{-0.2}) - 0.25 \log_2(0.25 e^{-0.3}) - 0.15 \log_2(0.15 e^{-0.3}) - 0.1 \log_2\left(\frac{0.1}{3} e^{-0.2}\right) - 0.2 \log_2 \frac{0.2}{7} = 3.1912,$$

$$H_{inter}(m_2) = -0.1 \log_2(0.1 e^{-0.1}) - 0.875 \log_2(0.875 e^{-0.15}) - 0.075 \log_2(0.075 e^{-0.15}) - 0.05 \log_2\left(\frac{0.05}{3} e^{-0.1}\right) - 0.1 \log_2 \frac{0.1}{7} = 1.9165,$$
$$H_{inter}(m_3) = -0.75 \log_2(0.75 e^{-0.1}) - 0.2 \log_2(0.2 e^{-0.2}) - 0.1 \log_2(0.1 e^{-0.2}) - 0.1 \log_2\left(\frac{0.1}{3} e^{-0.1}\right) - 0.1 \log_2 \frac{0.1}{7} = 2.4207.$$
The weight of each BOE based on the improved belief entropy is calculated as follows:
$$w(1) = w_s(1) \times Crd(1) \times \frac{H_{inter}(m_1)}{H_{inter}(m_1)} = 1 \times 1 \times \frac{3.1912}{3.1912} = 1,$$
$$w(2) = w_s(2) \times Crd(2) \times \frac{H_{inter}(m_1)}{H_{inter}(m_2)} = 0.2040 \times 0.5523 \times \frac{3.1912}{1.9165} = 0.1876,$$
$$w(3) = w_s(3) \times Crd(3) \times \frac{H_{inter}(m_1)}{H_{inter}(m_3)} = 1 \times 0.9660 \times \frac{3.1912}{2.4207} = 1.2735.$$
The final weight of each BOE after normalization is shown as follows:
$$w(1) = \frac{w(1)}{w(1)+w(2)+w(3)} = 0.4063, \qquad w(2) = \frac{w(2)}{w(1)+w(2)+w(3)} = 0.0762, \qquad w(3) = \frac{w(3)}{w(1)+w(2)+w(3)} = 0.5175.$$
The BPA of each proposition is modified by the final weight as follows:
$$m(\{F_1\}) = 0.4063 \times 0.6 + 0.0762 \times 0.05 + 0.5175 \times 0.7 = 0.6099,$$
$$m(\{F_2\}) = 0.4063 \times 0.1 + 0.0762 \times 0.8 + 0.5175 \times 0.1 = 0.1533,$$
$$m(\{F_2, F_3\}) = 0.4063 \times 0.1 + 0.0762 \times 0.05 + 0.5175 \times 0.10 = 0.0962,$$
$$m(\Theta) = 0.4063 \times 0.2 + 0.0762 \times 0.10 + 0.5175 \times 0.10 = 0.1406.$$
The fused result of the weighted BPA under Dempster's rule of combination is calculated as follows:

$$m(\{F_1\}) = ((m \oplus m) \oplus m)(\{F_1\}) = 0.8763, \quad m(\{F_2\}) = ((m \oplus m) \oplus m)(\{F_2\}) = 0.0961,$$
$$m(\{F_2, F_3\}) = ((m \oplus m) \oplus m)(\{F_2, F_3\}) = 0.0219, \quad m(\Theta) = ((m \oplus m) \oplus m)(\Theta) = 0.0057.$$
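Putting the pieces together, the whole weighting-and-fusion pipeline of this section can be sketched as follows (reusing combine and h_inter from the earlier snippets; the $w_s(i)$ and $Crd(i)$ values come from Table 4):

```python
F = frozenset({'F1', 'F2', 'F3'})
reports = [  # sensor reports E1, E2, E3 of Table 2
    {frozenset({'F1'}): 0.60, frozenset({'F2'}): 0.10, frozenset({'F2', 'F3'}): 0.10, F: 0.20},
    {frozenset({'F1'}): 0.05, frozenset({'F2'}): 0.80, frozenset({'F2', 'F3'}): 0.05, F: 0.10},
    {frozenset({'F1'}): 0.70, frozenset({'F2'}): 0.10, frozenset({'F2', 'F3'}): 0.10, F: 0.10},
]
ws, crd = [1.0, 0.2040, 1.0], [1.0, 0.5523, 0.9660]     # static reliability, credibility

h = [h_inter(m, F) for m in reports]                    # 3.1912, 1.9165, 2.4207
w = [ws[i] * crd[i] * max(h) / h[i] for i in range(3)]  # Equation (23)
total = sum(w)
w = [wi / total for wi in w]                            # normalized: 0.4063, 0.0762, 0.5175

# Weighted-average BPA, then two applications of Dempster's rule.
focal = set().union(*reports)
avg = {a: sum(w[i] * reports[i].get(a, 0.0) for i in range(3)) for a in focal}
print(combine(combine(avg, avg), avg))                  # m({F1}) = 0.8763, ...
```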
The fused results with the improved belief entropy based on Dempster's rule of combination are compared with several other methods in Table 5. Intuitively, $F_1$ should be the fault type that occurred, because both $E_1$ and $E_3$ strongly support $F_1$ (0.60 and 0.70), while $E_2$ may come from an abnormal sensor compared with the other two BOEs. The support for $F_1$ obtained with Yuan et al.'s method based on the improved belief entropy is as high as that of Fan et al.'s method, Yuan et al.'s method and Zhou et al.'s method, and all conclude that $F_1$ is the fault type that occurred. This case study verifies the applicability of the improved belief entropy.

6. Conclusions

The measurement of uncertainty under the D-S theory framework is still an open issue. An improved belief entropy, which takes the central value and the span of the belief interval into consideration together when defining the uncertainty measure of a BPA, is proposed in this paper based on Deng entropy and the belief interval. Importantly, as an uncertainty measure of BPA, it degenerates to Shannon entropy when the BPA is Bayesian, which is consistent with probability theory. Several numerical examples are conducted to verify the efficiency and flexibility of the improved belief entropy, and their results show that it performs better than other methods. To verify the applicability, a case study about fault diagnosis is employed. The improved belief entropy will provide an insight into measuring uncertainty in various fields (e.g., decision making, risk analysis, and pattern recognition) and in further information processing. Although this study shows promising results, some limitations are worth considering. First, the formula of the improved belief entropy is relatively complex, which leads to a high computational burden when faced with a large number of evidences. Second, some critical properties (e.g., monotonicity) of the improved belief entropy are not proved in this study. Given these limitations, future research is necessary to simplify the formula and investigate the critical properties of the improved belief entropy.

Author Contributions

Conceptualization, Y.Z. and L.F.; methodology, X.Y.; software, Y.Z.; validation, D.J.; resources, C.Z.; writing—original draft preparation, Y.Z.; and writing—review and editing, D.J.

Funding

This research was funded by the National Natural Science Foundation of China (No. 51825801).

Acknowledgments

The authors greatly appreciate the reviewers' valuable suggestions, the editor's encouragement, and the support of the National Natural Science Foundation of China.

Conflicts of Interest

The authors declare no conflict of interest.

References

1. Haukaas, T.; Der Kiureghian, A. Methods and object-oriented software for FE reliability and sensitivity analysis with application to a bridge structure. J. Comput. Civil. Eng. 2007, 21, 151–163.
2. Hattis, D.; Anderson, E.L. What should be the implications of uncertainty, variability, and inherent "biases"/"conservatism" for risk management decision-making? Risk Anal. 1999, 19, 95–107.
3. Der Kiureghian, A.; Ditlevsen, O. Aleatory or epistemic? Does it matter? Struct. Saf. 2009, 31, 105–112.
4. Zhou, D.; Tang, Y.; Jiang, W. An improved belief entropy and its application in decision-making. Complexity 2017, 2017, 4359195.
5. Deng, X.; Xiao, F.; Deng, Y. An improved distance-based total uncertainty measure in belief function theory. Appl. Intell. 2017, 46, 898–915.
6. Rényi, A. On measures of entropy and information. In Proceedings of the Fourth Berkeley Symposium on Mathematical Statistics and Probability; Contributions to the Theory of Statistics; The Regents of the University of California: Oakland, CA, USA, 1961; Volume 1.
7. Shannon, C.E. A mathematical theory of communication. ACM SIGMOBILE Mob. Comput. Commun. Rev. 2001, 5, 3–55.
8. Yin, L.; Deng, Y. Toward uncertainty of weighted networks: An entropy-based model. Physica A 2018, 508, 176–186.
9. Zavadskas, E.K.; Podvezko, V. Integrated determination of objective criteria weights in MCDM. Int. J. Inf. Technol. Decis. Mak. 2016, 15, 267–283.
10. Krylovas, A.; Kosareva, N.; Zavadskas, E.K. WEBIRA-comparative analysis of weight balancing method. Int. J. Comput. Commun. Control 2018, 12, 238–253.
11. Dempster, A.P. Upper and lower probabilities induced by a multivalued mapping. In Classic Works of the Dempster-Shafer Theory of Belief Functions; Springer: Berlin/Heidelberg, Germany, 2008; pp. 57–72.
12. Shafer, G. A Mathematical Theory of Evidence; Princeton University Press: Princeton, NJ, USA, 1976; Volume 42.
13. He, Z.; Jiang, W. A new belief Markov chain model and its application in inventory prediction. Int. J. Prod. Res. 2018, 56, 2800–2817.
14. Deng, X. Analyzing the monotonicity of belief interval based uncertainty measures in belief function theory. Int. J. Intell. Syst. 2018, 33, 1869–1879.
15. Dong, Y.; Zhang, J.; Li, Z.; Hu, Y.; Deng, Y. Combination of evidential sensor reports with distance function and belief entropy in fault diagnosis. Int. J. Comput. Commun. Control 2019, 14, 329–343.
16. Du, W.B.; Cao, X.B.; Hu, M.B.; Wang, W.X. Asymmetric cost in snowdrift game on scale-free networks. EPL 2009, 87, 60004.
17. Jiang, W.; Yang, Y.; Luo, Y.; Qin, X. Determining basic probability assignment based on the improved similarity measures of generalized fuzzy numbers. Int. J. Comput. Commun. Control 2015, 10, 333–347.
18. Neshat, A.; Pradhan, B. Risk assessment of groundwater pollution with a new methodological framework: Application of Dempster-Shafer theory and GIS. Nat. Hazards 2015, 78, 1565–1585.
19. Zhang, X.; Mahadevan, S.; Deng, X. Reliability analysis with linguistic data: An evidential network approach. Reliab. Eng. Syst. Saf. 2017, 162, 111–121.
20. Fu, C.; Wang, Y. An interval difference based evidential reasoning approach with unknown attribute weights and utilities of assessment grades. Comput. Ind. Eng. 2015, 81, 109–117.
21. Jiang, W.; Xie, C.; Zhuang, M.; Shou, Y.; Tang, Y. Sensor data fusion with z-numbers and its application in fault diagnosis. Sensors 2016, 16, 1509.
22. Jiang, W.; Xie, C.; Zhuang, M.; Tang, Y. Failure mode and effects analysis based on a novel fuzzy evidential method. Appl. Soft. Comput. 2017, 57, 672–683.
23. Vasu, J.Z.; Deb, A.K.; Mukhopadhyay, S. MVEM-based fault diagnosis of automotive engines using Dempster-Shafer theory and multiple hypotheses testing. IEEE Trans. Syst. Man Cybern.-Syst. 2015, 45, 977–989.
24. Xu, S.; Jiang, W.; Deng, X.; Shou, Y. A modified Physarum-inspired model for the user equilibrium traffic assignment problem. Appl. Math. Model. 2018, 55, 340–353.
25. Roychowdhury, S.; Koozekanani, D.D.; Parhi, K.K. DREAM: Diabetic retinopathy analysis using machine learning. IEEE J. Biomed. Health Inform. 2013, 18, 1717–1728.
26. Perez, A.; Tabia, H.; Declercq, D.; Zanotti, A. Using the conflict in Dempster-Shafer evidence theory as a rejection criterion in classifier output combination for 3D human action recognition. Image Vis. Comput. 2016, 55, 149–157.
27. Bhattacharyya, A.; Saraswat, V.; Manimaran, P.; Rao, S. Evidence theoretic classification of ballistic missiles. Appl. Soft. Comput. 2015, 37, 479–489.
28. Fei, L.; Deng, Y. A new divergence measure for basic probability assignment and its applications in extremely uncertain environments. Int. J. Intell. Syst. 2019, 34, 584–600.
29. Denoeux, T. Maximum likelihood estimation from uncertain data in the belief function framework. IEEE Trans. Knowl. Data Eng. 2011, 25, 119–130.
30. Deng, X.; Jiang, W.; Wang, Z. Zero-sum polymatrix games with link uncertainty: A Dempster-Shafer theory solution. Appl. Math. Comput. 2019, 340, 101–112.
31. Dzitac, I. The fuzzification of classical structures: A general view. Int. J. Comput. Commun. Control 2015, 10, 12–28.
32. Jiang, W.; Wei, B.; Zhan, J.; Xie, C.; Zhou, D. A visibility graph power averaging aggregation operator: A methodology based on network analysis. Comput. Ind. Eng. 2016, 101, 260–268.
33. Moosavian, A.; Khazaee, M.; Najafi, G.; Kettner, M.; Mamat, R. Spark plug fault recognition based on sensor fusion and classifier combination using Dempster-Shafer evidence theory. Appl. Acoust. 2015, 93, 120–129.
34. Yang, J.B.; Xu, D.L. Evidential reasoning rule for evidence combination. Artif. Intell. 2013, 205, 1–29.
35. Klir, G.J.; Ramer, A. Uncertainty in the Dempster-Shafer theory: A critical re-examination. Int. J. Gen. Syst. 1990, 18, 155–166.
36. Dubois, D.J.; Wellman, M.P.; D'Ambrosio, B. Uncertainty in Artificial Intelligence: Proceedings of the Eighth Conference (1992); Morgan Kaufmann: Burlington, MA, USA, 2014.
37. Dubois, D.; Prade, H. A note on measures of specificity for fuzzy sets. Int. J. Gen. Syst. 1985, 10, 279–283.
38. Höhle, U. Entropy with respect to plausibility measures. In Proceedings of the 12th IEEE International Symposium on Multiple-Valued Logic, Paris, France, 25–27 May 1982.
39. Yager, R.R. Entropy and specificity in a mathematical theory of evidence. Int. J. Gen. Syst. 1983, 9, 249–260.
40. Klir, G.J.; Parviz, B. A note on the measure of discord. In Uncertainty in Artificial Intelligence; Elsevier: Amsterdam, The Netherlands, 1992; pp. 138–141.
41. Körner, R.; Näther, W. On the specificity of evidences. Fuzzy Sets Syst. 1995, 71, 183–196.
42. Tang, Y.; Zhou, D.; Xu, S.; He, Z. A weighted belief entropy-based uncertainty measure for multi-sensor data fusion. Sensors 2017, 17, 928.
43. Deng, Y. Deng entropy. Chaos Solitons Fractals 2016, 91, 549–553.
44. Yuan, K.; Xiao, F.; Fei, L.; Kang, B.; Deng, Y. Modeling Sensor Reliability in Fault Diagnosis Based on Evidence Theory. Sensors 2016, 16, 113.
45. Yuan, K.; Xiao, F.; Fei, L.; Kang, B.; Deng, Y. Conflict management based on belief function entropy in sensor fusion. Springerplus 2016, 5, 638.
46. Pan, L.; Deng, Y. A new belief entropy to measure uncertainty of basic probability assignments based on belief function and plausibility function. Entropy 2018, 20, 842.
47. Yang, Y.; Han, D. A new distance-based total uncertainty measure in the theory of belief functions. Knowl.-Based Syst. 2016, 94, 114–123.
48. Li, Y.; Deng, Y. Generalized Ordered Propositions Fusion Based on Belief Entropy. Int. J. Comput. Commun. Control 2018, 13, 792–807.
49. Wang, X.; Song, Y. Uncertainty measure in evidence theory with its applications. Appl. Intell. 2018, 48, 1672–1688.
50. Pan, Q.; Zhou, D.; Tang, Y.; Li, X.; Huang, J. A Novel Belief Entropy for Measuring Uncertainty in Dempster-Shafer Evidence Theory Framework Based on Plausibility Transformation and Weighted Hartley Entropy. Entropy 2019, 21, 163.
51. George, T.; Pal, N.R. Quantification of conflict in Dempster-Shafer framework: A new approach. Int. J. Gen. Syst. 1996, 24, 407–423.
52. Jiang, W.; Wang, S. An Uncertainty Measure for Interval-valued Evidences. Int. J. Comput. Commun. Control 2017, 12, 631–644.
53. Deng, X.; Yong, H.; Yong, D.; Mahadevan, S. Environmental impact assessment based on D numbers. Expert Syst. Appl. 2014, 41, 635–643.
54. Klir, G.J.; Wierman, M.J. Uncertainty-Based Information: Elements of Generalized Information Theory; Physica: Berlin, Germany, 2013; Volume 15.
55. Jiroušek, R.; Shenoy, P.P. A new definition of entropy of belief functions in the Dempster-Shafer theory. Int. J. Approx. Reason. 2018, 92, 49–65.
56. Abellan, J.; Masegosa, A. Requirements for total uncertainty measures in Dempster-Shafer theory of evidence. Int. J. Gen. Syst. 2008, 37, 733–747.
57. Fan, X.; Mingj, Z. Fault diagnosis of machines based on D-S evidence theory. Part 1: D-S evidence theory and its improvement. Pattern Recognit. Lett. 2006, 27, 366–376.
58. Zhou, D.; Tang, Y.; Jiang, W. A modified belief entropy in Dempster-Shafer framework. PLoS ONE 2017, 12, e0176832.
Figure 1. Comparison among different uncertainty measures (Pan and Zhou [50], Yager [39], Deng entropy [43], Dubois and Prade [37], Höhle [38], Klir and Ramer [35], Klir and Parviz [40], George and Pal [51], Zhou et al. [4], and the improved entropy).
Figure 2. Comparison among different uncertainty measures (Yager [39], Dubois and Prade [37], Höhle [38], Klir and Ramer [35], Klir and Parviz [40], and George and Pal [51]).
Table 1. The results of different uncertainty measures.

| Cases | Pan and Zhou | Yager | Deng Entropy | Dubois and Prade | Höhle | Klir and Ramer | Klir and Parviz | George and Pal | Zhou et al. | The Improved Belief Entropy |
|---|---|---|---|---|---|---|---|---|---|---|
| A = {1} | 1.9757 | 0.3952 | 2.6623 | 0.4699 | 1.0219 | 6.4419 | 3.3804 | 0.3317 | 2.5180 | 5.9870 |
| A = {1, 2} | 2.3362 | 0.3952 | 3.9303 | 1.2699 | 1.0219 | 5.6419 | 3.2956 | 0.3210 | 3.7090 | 9.2881 |
| A = {1, 2, 3} | 2.5232 | 0.1997 | 4.9082 | 1.7379 | 1.0219 | 4.2823 | 2.9709 | 0.2943 | 4.6100 | 11.2461 |
| A = {1, 2, 3, 4} | 2.7085 | 0.1997 | 5.7878 | 2.0699 | 1.0219 | 3.6863 | 2.8132 | 0.2677 | 5.4127 | 12.9904 |
| A = {1, 2, 3, 4, 5} | 2.8749 | 0.1997 | 6.6256 | 2.3274 | 1.0219 | 3.2946 | 2.7121 | 0.2410 | 6.1736 | 14.6352 |
| A = {1, 2,..., 6} | 3.0516 | 0.0074 | 7.4441 | 2.5379 | 1.0219 | 2.4888 | 2.4992 | 0.2250 | 6.9151 | 16.3330 |
| A = {1, 2,..., 7} | 3.0647 | 0.0074 | 8.2532 | 2.7158 | 1.0219 | 2.4562 | 2.5198 | 0.2219 | 7.6473 | 17.9447 |
| A = {1, 2,..., 8} | 3.2042 | 0.0074 | 9.0578 | 2.8699 | 1.0219 | 2.4230 | 2.5336 | 0.2170 | 8.3749 | 19.6287 |
| A = {1, 2,..., 9} | 3.3300 | 0.0074 | 9.8600 | 3.0059 | 1.0219 | 2.3898 | 2.5431 | 0.2108 | 9.1002 | 21.3103 |
| A = {1, 2,..., 10} | 3.4445 | 0.0074 | 10.6612 | 3.1275 | 1.0219 | 2.3568 | 2.5494 | 0.2037 | 9.8244 | 22.9908 |
| A = {1, 2,..., 11} | 3.5497 | 0.0074 | 11.4617 | 3.2375 | 1.0219 | 2.3241 | 2.5536 | 0.1959 | 10.5480 | 24.6708 |
| A = {1, 2,..., 12} | 3.6469 | 0.0074 | 12.2620 | 3.3379 | 1.0219 | 2.2920 | 2.5562 | 0.1877 | 11.2714 | 26.3504 |
| A = {1, 2,..., 13} | 3.7374 | 0.0074 | 13.0622 | 3.4303 | 1.0219 | 2.2605 | 2.5577 | 0.1791 | 11.9946 | 28.0300 |
| A = {1, 2,..., 14} | 3.8219 | 0.0074 | 13.8622 | 3.5158 | 1.0219 | 2.2296 | 2.5582 | 0.1701 | 12.7177 | 29.7094 |
Table 2. BPAs of the sensor reports.

| Sensor Report | {F1} | {F2} | {F2, F3} | Θ |
|---|---|---|---|---|
| E1: m1(·) | 0.60 | 0.10 | 0.10 | 0.20 |
| E2: m2(·) | 0.05 | 0.80 | 0.05 | 0.10 |
| E3: m3(·) | 0.70 | 0.10 | 0.10 | 0.10 |
Table 3. Fused results of the sensor reports (Dempster's rule of combination).

| | {F1} | {F2} | {F2, F3} | Θ |
|---|---|---|---|---|
| Fused results | 0.4519 | 0.5048 | 0.0336 | 0.0096 |
Table 4. $w_s(i)$, $Crd(i)$, and $E_d(m_i)$ of the three BOEs.

| | E1 | E2 | E3 |
|---|---|---|---|
| $w_s(i)$ | 1.0000 | 0.2040 | 1.0000 |
| $Crd(i)$ | 1.0000 | 0.5523 | 0.9660 |
| $E_d(m_i)$ | 2.2909 | 1.3819 | 1.7960 |
Table 5. The comparison of the fused results among different methods.

| Methods | {F1} | {F2} | {F2, F3} | Θ |
|---|---|---|---|---|
| Dempster's rule of combination [11,12] | 0.4519 | 0.5048 | 0.0336 | 0.0096 |
| Fan et al.'s method [57] | 0.8119 | 0.1096 | 0.0526 | 0.0259 |
| Yuan et al.'s method [44] | 0.8948 | 0.0739 | 0.0241 | 0.0072 |
| Zhou et al.'s method [58] | 0.8951 | 0.0738 | 0.0240 | 0.0071 |
| The improved belief entropy | 0.8763 | 0.0961 | 0.0219 | 0.0057 |
