Article

A New Belief Entropy in Dempster–Shafer Theory Based on Basic Probability Assignment and the Frame of Discernment

Jiapeng Li and Qian Pan
1 School of Automation, Northwestern Polytechnical University, Xi’an 710072, China
2 School of Electronics and Information, Northwestern Polytechnical University, Xi’an 710072, China
* Authors to whom correspondence should be addressed.
Entropy 2020, 22(6), 691; https://doi.org/10.3390/e22060691
Submission received: 30 April 2020 / Revised: 8 June 2020 / Accepted: 10 June 2020 / Published: 20 June 2020
(This article belongs to the Section Information Theory, Probability and Statistics)

Abstract

Dempster–Shafer (D-S) theory has been widely used in many applications, especially in the measurement of information uncertainty. However, under D-S theory, how to use belief entropy to measure uncertainty is still an open issue. In this paper, we list some significant properties for such a measure. The main contribution of this paper is to propose a new entropy, for which some properties are discussed. Our new model has two components: the first is Nguyen entropy; the second is the product of the cardinality of the frame of discernment (FOD) and Dubois entropy. In addition, under certain conditions, the new belief entropy reduces to Shannon entropy. Compared with the others, the new entropy considers the impact of the FOD. Through some numerical examples and a simulation, the proposed belief entropy is shown to measure uncertainty accurately.

1. Introduction

How to measure uncertainty is a meaningful question to be solved, and our work discusses this issue. First of all, we need to know what uncertainty is.
Uncertainty is a broad notion: it means not certainly known, questionable, or problematic. Uncertainty can mainly be divided into three types: vagueness, which is boundary uncertainty; nonspecificity, which is size (cardinality) uncertainty; and discord, which expresses conflict. Correspondingly, there are theories to address these problems: fuzzy set theory [1], probability theory [2], evidence theory [3,4], and rough sets [5]. Besides, some extended theories have also been presented for uncertainty measurement, e.g., generalized evidence theory [6], complex numbers [7], fuzzy numbers [8,9,10], Z numbers [11,12], D numbers theory [13,14,15,16], and so on [17,18,19,20,21,22]. In this paper, we use evidence theory to study these open issues.
In 1967, Dempster [3] proposed upper and lower probabilities to solve the multivalued mapping problem. In 1976, Shafer [4] completed the theory proposed by Dempster, forming evidence theory, also called D-S theory. After years of exploration, D-S theory has become a very effective tool for modeling and processing information uncertainty. In 1948, Shannon [23] used concepts from thermodynamics to define information entropy. Under probability theory, Shannon entropy is very good at measuring the degree of information uncertainty. However, D-S theory requires less prior data than probability theory and has the advantage of fusing data. Thus, we adopt D-S theory in place of probability theory for measuring uncertainty.
D-S theory uses the basic probability assignment (BPA), defined on the frame of discernment (FOD), to represent the degree of support for each focal element. Different FODs may have different BPAs. Besides, the core of D-S theory is Dempster’s combination rule, which provides a way to fuse different BPAs. Evidence theory thus provides mathematical support for the establishment of uncertainty models.
On the other hand, it is well known in information theory that Hartley’s [24] and Shannon’s measures are both effective ways to deal with information uncertainty. Meanwhile, D-S theory, as an extension of probability theory, contains much ignorance information. Thus, Höhle [25] was the earliest to combine D-S theory and Shannon entropy, yielding Höhle entropy. Subsequently, Nguyen [26], Dubois and Prade [27], Klir [28], Jiroušek and Shenoy [29], Nikhil R. Pal [30,31], Deng [32], Pan and Deng [33], and Wang [34] defined their own uncertainty models. Some of them have been successfully applied in real situations [35,36]. However, these models are not effective in some situations [37,38].
Most of these entropies focus only on the BPA and cardinality of each focal element, or use the belief and plausibility functions, to measure uncertainty; none of them considers the FOD itself. Obviously, the scale of the FOD can impact the degree of uncertainty. Since uncertainty is determined by both the BPA and the FOD, we combined different models and propose a new uncertainty measure, namely B&F entropy, which reflects well the impact of the FOD on uncertainty. In the end, we give a few examples to compare the new model with others. Besides, we design a simulation to illustrate the feasibility and effectiveness of the proposed model.
The outline of the remainder of the paper is as follows. In Section 2, we briefly review the Hartley and Shannon measures and D-S theory. Some essential properties are briefly introduced in Section 3. Section 4 presents existing uncertainty measures. In Section 5, we discuss some properties and define a new entropy. In Section 6, some significant numerical examples and simulations are carried out to illustrate the feasibility and effectiveness of the proposed belief entropy. In Section 7, we summarize our findings and conclude with some open questions.

2. Preliminaries

This paper is based on D-S theory and information entropy, so we divide this section into two parts. In the D-S theory part, some basic concepts are briefly introduced. In the information entropy part, we introduce two typical representatives, the Hartley measure and Shannon entropy.

2.1. D-S Theory

Dempster–Shafer theory, also called evidence reasoning or evidence theory, originated from Dempster [3] and was developed by his student Shafer [4]. Through a series of improvements and reinforcements, a method of uncertainty reasoning using “evidence” and “combination” was formed. In a way, D-S theory is a generalization of Bayesian reasoning. Dempster–Shafer theory is often applied to pattern recognition [39,40,41,42,43,44], fault diagnosis [45,46,47,48], uncertainty modeling [20,49], clustering [50], decision making [51,52], risk analysis [53,54,55,56], and other hot fields [57,58].
The idea of D-S theory is based on the frame of discernment $X = \{x_1, x_2, \ldots, x_n\}$. The set of all subsets of $X$ is called the power set $2^X$, which contains $2^{|X|}$ elements; $|X|$ denotes the cardinality of $X$, i.e., the number of elements in $X$. Under this frame, Dempster and Shafer defined some basic concepts as follows.

2.1.1. Basic Belief Assignment

Based on the above power set $2^X$, a function $m: 2^X \to [0, 1]$ satisfies:

$$\sum_{a \in 2^X} m(a) = 1, \qquad m(\emptyset) = 0$$
The function $m(a)$ is also called a basic probability assignment (BPA) or mass function. If $m(a) > 0$, then $a$ is a focal element. $m(a)$ represents the degree of trust that the object belongs to $a$: the larger $m(a)$ is, the higher the trust.
Two special BPAs are worth naming. The vacuous BPA, $m(X) = 1$, means the true result is entirely unknown; in contrast, a Bayesian BPA, whose focal elements are all singletons, tells us to which category the target should belong.
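To make the definition concrete, here is a minimal Python sketch (our own illustration, not code from the paper) that represents a mass function as a dictionary keyed by frozensets and checks the two axioms above; the element names are hypothetical:

```python
def is_valid_bpa(m, tol=1e-9):
    """Check the two BPA axioms: masses sum to 1 and the empty set gets 0."""
    return abs(sum(m.values()) - 1.0) < tol and m.get(frozenset(), 0.0) == 0.0

# Example BPA on the FOD X = {x1, x2}: neither vacuous nor Bayesian.
m = {frozenset({"x1"}): 0.3,
     frozenset({"x2"}): 0.5,
     frozenset({"x1", "x2"}): 0.2}
assert is_valid_bpa(m)
```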

2.1.2. Belief Function

The belief function is the sum of the basic probability assignments for all subsets of a and is given by:
$$Bel(a) = \sum_{b \subseteq a} m(b), \quad a \in 2^X$$
It is the lower limit of support for a.

2.1.3. Plausibility Function

The plausibility function is the sum of the basic probability assignments for all subsets that intersect with a and is given by:
$$Pl(a) = \sum_{b \cap a \neq \emptyset} m(b), \quad a \in 2^X$$
It is the upper limit of support for a.
The interval between the belief function and the plausibility function represents the degree of uncertainty of the evidence.
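Continuing the illustrative sketch above (again our own, with hypothetical element names), both functions reduce to sums over subset and intersection tests:

```python
def bel(m, a):
    """Belief of a: total mass of focal elements contained in a."""
    return sum(v for b, v in m.items() if b <= a)

def pl(m, a):
    """Plausibility of a: total mass of focal elements intersecting a."""
    return sum(v for b, v in m.items() if b & a)

m = {frozenset({"x1"}): 0.3, frozenset({"x2"}): 0.5,
     frozenset({"x1", "x2"}): 0.2}
a = frozenset({"x1"})
print(bel(m, a), pl(m, a))  # 0.3 0.5 -> uncertainty interval [0.3, 0.5]
```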

2.1.4. Dempster’s Combination Rule

Dempster’s combination rule is the most commonly used method in evidence fusion. This rule takes into account the degree of conflict between pieces of evidence and defines a conflict coefficient $k$ to measure that conflict.
Suppose $m_1$ and $m_2$ are independent BPAs from different evidence sources. The fusion result of $m_1$ and $m_2$ under Dempster’s combination rule is:

$$m(a) = \begin{cases} 0, & a = \emptyset \\[4pt] \dfrac{\sum_{b \cap c = a} m_1(b)\, m_2(c)}{1 - k}, & a \neq \emptyset \end{cases}$$

where $k$ is the conflict coefficient, defined by:

$$k = \sum_{b \cap c = \emptyset} m_1(b)\, m_2(c)$$
Notice that Dempster’s combination rule is invalid if two bodies of evidence completely conflict ($k = 1$); in that case, the rule cannot be applied to fuse the two BPAs.
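The rule can be sketched in a few lines under the same dictionary representation (an illustration of ours, not the authors' code):

```python
from itertools import product

def dempster_combine(m1, m2):
    """Fuse two independent BPAs (dicts mapping frozensets to masses)."""
    fused, k = {}, 0.0
    for (b, v1), (c, v2) in product(m1.items(), m2.items()):
        inter = b & c
        if inter:                      # the two focal elements agree on inter
            fused[inter] = fused.get(inter, 0.0) + v1 * v2
        else:                          # conflicting mass accumulates in k
            k += v1 * v2
    if k >= 1.0:                       # complete conflict: rule is invalid
        raise ValueError("k = 1: Dempster's rule cannot be applied")
    return {a: v / (1.0 - k) for a, v in fused.items()}

m1 = {frozenset({"x1"}): 0.6, frozenset({"x1", "x2"}): 0.4}
m2 = {frozenset({"x1"}): 0.5, frozenset({"x2"}): 0.5}
print(dempster_combine(m1, m2))  # mass concentrates on {'x1'} after fusion
```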

2.2. Origin of Information Entropy

Different authors have measured information uncertainty in a variety of ways, and Hartley and Shannon laid the foundation for it. Information entropy and its extended models have been applied to many fields [59]. Next, we briefly introduce the Hartley measure and Shannon entropy.

2.2.1. Hartley Measure

Suppose $X$ is an FOD and $a$ is a subset of $X$. Then, the Hartley measure [24] is defined as:

$$H(a) = \log_2 |a|$$

where $|a|$ denotes the cardinality of $a$.
Obviously, the measure grows with the cardinality of $a$. When $a$ is a singleton of $X$, $H(a) = 0$, which means there is no uncertainty. Unfortunately, the Hartley measure does not show the effect of the probability distribution on the degree of uncertainty.

2.2.2. Shannon Entropy

In 1948, Shannon [23] proposed information entropy, namely Shannon entropy. His model uses the concept of entropy from thermodynamics:

$$H(x) = \sum_{x \in X} P(x) \log_2 \frac{1}{P(x)}$$

where $P(x)$ is the probability of $x$ and satisfies $\sum_{x \in X} P(x) = 1$.
As Shannon said in his paper, the role of information is to eliminate uncertainty, and Shannon entropy is an excellent way to measure and eliminate it. It played a crucial role in solving probability problems. From his definition, we see that it is based on the probability distribution. With the emergence of D-S theory, information entropy was given a new meaning. The format of our new model is also derived from Shannon entropy.
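For reference, Shannon entropy in bits is a one-liner (a small sketch of our own):

```python
import math

def shannon_entropy(p):
    """Shannon entropy in bits; p is an iterable of probabilities."""
    return sum(pi * math.log2(1.0 / pi) for pi in p if pi > 0)

print(shannon_entropy([0.5, 0.5]))  # 1.0 bit: maximal uncertainty for 2 outcomes
print(shannon_entropy([1.0]))       # 0.0: a certain outcome carries no uncertainty
```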

3. Properties of the Uncertainty Measure in D-S Theory

According to Klir and Wierman [60] and Klir and Folger [61], we introduce some important properties of entropy for D-S theory: non-negativity, maximum entropy, monotonicity, probability consistency, additivity, sub-additivity, and range. These properties, for a measure that captures both discord and non-specificity, are defined as follows.

3.1. Non-Negativity

Suppose $m$ is a BPA on FOD $X$; the entropy $H(m)$ must satisfy:

$$H(m) \geq 0$$

with equality if and only if $m(x) = 1$ for some $x \in X$.
Only when an entropy satisfies the non-negativity property does it provide a standard for measuring uncertainty.

3.2. Maximum Entropy

It makes sense that the uncertainty of the vacuous BPA $m_v$ is larger than that of any other normal BPA $m_n$. Thus, the maximum entropy property is defined as:

$$H(m_v) > H(m_n)$$

3.3. Monotonicity

As the number of elements in the FOD increases, so should the degree of uncertainty. The monotonicity property is defined as:

$$H(m_X) > H(m_Y)$$

where $m_X$ and $m_Y$ are the vacuous BPAs on FOD $X$ and FOD $Y$, with $|X| > |Y|$.

3.4. Probability Consistency

Let $m_B$ be a Bayesian BPA; then, the entropy should coincide with Shannon entropy. Therefore, the probability consistency property follows as:

$$H(m_B) = H_S(P_X) = \sum_{x \in X} P_X(x) \log_2 \frac{1}{P_X(x)}$$

where $H_S$ is the Shannon entropy and $P_X$ is the probability distribution on $X$ corresponding to $m_B$.

3.5. Additivity

Let $m_X$ and $m_Y$ be independent BPAs on FOD $X$ and FOD $Y$, respectively, and let $\oplus$ denote Dempster’s combination rule. The additivity property is defined as:

$$H(m_X \oplus m_Y) = H(m_X) + H(m_Y)$$

where $m_X \oplus m_Y$ is a BPA on the FOD $X \times Y$. Note that $m(a \times b) = m_X(a)\, m_Y(b)$, where $m$ is the combination of $m_X$ and $m_Y$ by Dempster’s rule.

3.6. Sub-Additivity

Let $m$ be a BPA on the FOD $X \times Y$, and let $m_X$ and $m_Y$ be the marginal BPAs on FOD $X$ and FOD $Y$. Then, sub-additivity requires:

$$H(m) \leq H(m_X) + H(m_Y)$$

3.7. Range

As Klir and Wierman defined, the range of $H(m)$ is $[0, \log_2 |X|]$.

4. The Development of Entropy Based on D-S Theory

In this section, some belief entropies of BPAs in D-S theory proposed by others are reviewed. We also discuss whether or not these models satisfy the properties we list.
Yager [62] defined a belief entropy using the conflict coefficient between focal elements, simplified as follows:

$$H_Y(m) = \sum_{a \in 2^X} m(a) \log_2 \frac{1}{Pl(a)}$$

where $Pl(a)$ is the plausibility function associated with $a$ under $m$. Yager’s entropy only measures the degree of conflict between evidence. $H_Y(m)$ only satisfies the additivity property.
Dubois and Prade [27] used a new information measurement method to obtain a new entropy:

$$H_D(m) = \sum_{a \in 2^X} m(a) \log_2 |a|$$

From this definition, the entropy only addresses the non-specific part of uncertainty: if $m$ is a Bayesian BPA, then $H_D(m) = 0$. Noticeably, $H_D(m)$ is a weighted Hartley [24] measure. $H_D(m)$ satisfies the maximum entropy and monotonicity properties.
Nguyen [26] defined a new entropy following Shannon entropy:

$$H_N(m) = \sum_{a \in 2^X} m(a) \log_2 \frac{1}{m(a)}$$

From its form, it uses only the BPA and hence captures only the conflict part, which is inaccurate for uncertainty measurement. It only satisfies the probability consistency and additivity properties.
Lamata and Moral [63] combined the entropies proposed by Yager and Dubois:

$$H_{L\&M}(m) = H_Y(m) + H_D(m) = \sum_{a \in 2^X} m(a) \log_2 \frac{|a|}{Pl(a)}$$

It has two components: one measures the innate contradiction, while the other measures the imprecision of the information. This definition does not satisfy the maximum entropy and sub-additivity properties.
Jiroušek and Shenoy’s [29] entropy is a combination of the Shannon and Dubois definitions:

$$H_{J\&S}(m) = H_S(Pl\_P_m) + H_D(m) = \sum_{x \in X} Pl\_P_m(x) \log_2 \frac{1}{Pl\_P_m(x)} + \sum_{a \in 2^X} m(a) \log_2 |a|$$

where $Pl\_P_m$ is the plausibility function $Pl_m$ normalized over the singletons. The first part measures conflict based on Shannon entropy, and the second part measures the non-specificity portion of uncertainty. $H_{J\&S}(m)$ satisfies non-negativity, maximum entropy, monotonicity, probability consistency, and additivity.
Klir and Ramer [28] defined:

$$H_{K\&R}(m) = \sum_{a \in 2^X} m(a) \log_2 \frac{1}{\sum_{b \in 2^X} m(b) \frac{|b \cap a|}{|b|}}$$

Because Yager’s entropy does not take a broad view of conflict (it only considers the case $b \cap a = \emptyset$), Klir and Ramer proposed this method to solve the problem. It is easy to see that this entropy measures, in bits, the conflict of evidential claims within each body of evidence. However, under certain conditions, it is difficult for $H_{K\&R}(m)$ to express other aspects of uncertainty. It fails only the maximum entropy property.
Nikhil R. Pal [30,31] focused on nonspecificity and randomness in a total uncertainty environment:

$$H_P(m) = \sum_{a \in 2^X} m(a) \log_2 \frac{|a|}{m(a)} = \sum_{a \in 2^X} m(a) \log_2 \frac{1}{m(a)} + \sum_{a \in 2^X} m(a) \log_2 |a|$$

This work built on the methods proposed by Lamata and Moral and by Klir and Ramer, pointing out that those could contradict common sense in certain situations. The first component measures the conflict of the body of evidence, in some sense analogously to Yager’s entropy, while the second measures non-specificity. It does not satisfy the maximum entropy property.
Jousselme’s [64] entropy is based on the pignistic transformation $BetP_m(a)$ [65]:

$$H_J(m) = \sum_{a \in 2^X} BetP_m(a) \log_2 \frac{1}{BetP_m(a)}$$

The authors showed that this entropy is sensitive to changes in the evidence.
Deng [32] defined an entropy:

$$H_{Deng}(m) = \sum_{a \in 2^X} m(a) \log_2 \frac{2^{|a|} - 1}{m(a)}$$

As proven by Joaquín Abellán [66], Deng entropy does not satisfy the monotonicity, additivity, and sub-additivity properties.
Pan and Deng [33] developed Deng entropy further and defined:

$$H_{Bel}(m) = \sum_{a \in 2^X} \frac{Bel(a) + Pl(a)}{2} \log_2 \frac{2\,(2^{|a|} - 1)}{Bel(a) + Pl(a)}$$

where $Bel(a)$ and $Pl(a)$ are the belief function and plausibility function, respectively. $H_{Bel}(m)$ uses the interval probability to measure the discord and non-specificity uncertainty of a BPA. It does not satisfy the maximum entropy, additivity, sub-additivity, and range properties.
Wang [34] proposed another modified model based on Deng entropy:

$$H_W(m) = -\sum_{a \subseteq X} m(a) \log_2 \left( \frac{m(a)}{2^{|a|} - 1}\,(1 + \epsilon)^{f(X)} \right)$$

where $\epsilon \geq 0$ is a constant and $f(X)$ is a function of the cardinality of $X$. Different values of $\epsilon$ represent different entropies; however, changing $\epsilon$ has little effect on the value of W entropy [34].
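To make the formulas above concrete, the following sketch (our own illustration, reusing the frozenset-keyed mass functions from Section 2) implements three of the reviewed measures; the example BPA is hypothetical:

```python
import math

def nguyen_entropy(m):
    # H_N(m) = sum_a m(a) * log2(1 / m(a))
    return sum(v * math.log2(1.0 / v) for v in m.values() if v > 0)

def dubois_entropy(m):
    # H_D(m) = sum_a m(a) * log2(|a|)
    return sum(v * math.log2(len(a)) for a, v in m.items() if v > 0)

def deng_entropy(m):
    # H_Deng(m) = sum_a m(a) * log2((2^|a| - 1) / m(a))
    return sum(v * math.log2((2 ** len(a) - 1) / v) for a, v in m.items() if v > 0)

m = {frozenset({"x1"}): 0.3, frozenset({"x2"}): 0.5,
     frozenset({"x1", "x2"}): 0.2}
print(nguyen_entropy(m), dubois_entropy(m), deng_entropy(m))
```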

5. A New Belief Entropy Based on Evidence Theory

As introduced at the start of the first chapter of Shafer’s book [4], D-S theory is a theory of evidence: it uses a mathematical form to express the degree of support for evidence.
Based on the entropies proposed by previous scholars, several aspects of the frame of discernment remain relatively unexplored in uncertainty measurement. In D-S theory, if two bodies of evidence have focal elements of the same cardinality but different FODs, their measured uncertainty should differ. However, most of the definitions listed above focus only on the value of the BPA or the cardinality of each focal element, totally ignoring the effect of the FOD; hence, they cannot distinguish degrees of uncertainty under different FODs. To remedy this deficiency, we argue that the FOD also matters for uncertainty measurement and introduce the scale of the FOD into our new entropy. The new belief entropy based on D-S theory, namely B&F entropy, is defined as follows:
$$H_{B\&F}(m) = \sum_{a \in 2^X} m(a) \log_2 \frac{|a|^{|X|}}{m(a)}$$

where $|a|$ denotes the cardinality of the focal element $a$ and $|X|$ equals the number of elements in the FOD.
Like some of the definitions mentioned above, the new definition can be expressed as a combination of other entropies:

$$H_{B\&F}(m) = \sum_{a \in 2^X} m(a) \log_2 \frac{1}{m(a)} + |X| \sum_{a \in 2^X} m(a) \log_2 |a| = H_N(m) + |X| \cdot H_D(m)$$

where $H_N(m)$ is Nguyen’s entropy and $H_D(m)$ is Dubois’ entropy. Obviously, the new entropy is the sum of $H_N(m)$ and $|X|$ times $H_D(m)$. As in most belief entropies, the first component, $\sum_{a \in 2^X} m(a) \log_2 \frac{1}{m(a)}$, measures the discord uncertainty of the BPA. The second component, $|X| \sum_{a \in 2^X} m(a) \log_2 |a|$, measures the non-specificity of the mass function among the various focal elements [27,32,61] and captures the information about the size of the cardinality. When $m$ is a Bayesian BPA or the cardinality of the FOD equals one, the new entropy degenerates to Pal’s definition.
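The definition transcribes directly into code under the same representation (an illustrative sketch, not the authors' implementation):

```python
import math

def bf_entropy(m, fod_size):
    """B&F entropy: sum_a m(a) * log2(|a|^|X| / m(a)), with |X| = fod_size."""
    return sum(v * math.log2(len(a) ** fod_size / v)
               for a, v in m.items() if v > 0)

# Sanity check against Section 5: a vacuous BPA on an FOD of size n
# should give n * log2(n).
vacuous = {frozenset(range(4)): 1.0}
print(bf_entropy(vacuous, 4))  # 8.0 = 4 * log2(4)
```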
The most important information about the FOD is its number of elements, namely $|X|$: if $|X|$ is modified, the accuracy of the uncertainty measurement will be affected. Here, we use an example to show that $|X|$ is the best way to represent the information of the FOD.
As shown in Figure 1, $\log_2 |X|$ and $2^{|X|}$ cannot reflect the effect of the FOD on entropy very well: when the cardinality of the FOD is greater than 10, $\log_2 |X|$ is almost constant, while $2^{|X|}$ becomes very large. Thus, $|X|$ itself best captures the information of the FOD size.
The new entropy connects the degree of information uncertainty with the FOD, thereby improving the information uncertainty measurement method.
According to Section 3, the basic properties of the new belief entropy are proven as follows:
(P1) Non-negativity:
Let $|a|$ be the cardinality of a focal element and $|X|$ the cardinality of the FOD. Since $|a|^{|X|} \geq 1$, every term in $H_{B\&F}(m)$ is non-negative, so $H_{B\&F}(m) \geq 0$, with equality if and only if $m$ is a Bayesian BPA with $m(x) = 1$ for a single element $x$. Therefore, the new definition satisfies the non-negativity property.
(P2) Maximum entropy:
Let $m_b$ be a uniform Bayesian BPA and $m_v$ a vacuous BPA; then $H_{B\&F}(m_b) = \log_2 |X|$ and $H_{B\&F}(m_v) = |X| \log_2 |X|$. Although this calculation gives $H_{B\&F}(m_v) > H_{B\&F}(m_b)$, it does not follow that $H_{B\&F}(m_v)$ is the maximum value; we give only a brief explanation here and examine the maximum further through simulation later.
As introduced above, Nguyen’s entropy does not satisfy the maximum entropy property, and the new belief entropy consists of Nguyen’s entropy and Dubois’ entropy. Thus, the maximum entropy property is not satisfied by the new belief entropy.
(P3) Monotonicity:
Suppose $m_v$ denotes the vacuous BPA; then $H_{B\&F}(m_v) = |X| \log_2 |X|$, which obviously increases with $|X|$. Therefore, $H_{B\&F}(m)$ satisfies the monotonicity property.
(P4) Probability consistency:
When $m_b$ is a Bayesian BPA, $|a| = 1$ for every focal element, so $H_{B\&F}(m) = \sum_{a \in 2^X} m(a) \log_2 \frac{1}{m(a)}$, which is exactly Shannon entropy. From this result, we conclude that the new belief entropy satisfies the probability consistency property.
(P5) Additivity and sub-additivity:
Let $c = a \times b \in 2^{X \times Y}$, where $a$, $b$, $c$ are focal elements, $X$ and $Y$ are FODs, $a \in 2^X$, and $b \in 2^Y$. According to the definition of the additivity property, $m(c) = m_X(a) \times m_Y(b)$, where $m_X$ is the marginal BPA on $X$ and $m_Y$ is the marginal BPA on $Y$. Then:
$$\begin{aligned}
H_{B\&F}(m) &= \sum_{c \in 2^{X \times Y}} m(c) \log_2 \frac{|c|^{|X||Y|}}{m(c)} \\
&= \sum_{a \in 2^X} \sum_{b \in 2^Y} m_X(a)\, m_Y(b) \log_2 (|a|\,|b|)^{|X||Y|} - \sum_{a \in 2^X} \sum_{b \in 2^Y} m_X(a)\, m_Y(b) \log_2 \big(m_X(a)\, m_Y(b)\big) \\
&= \sum_{a \in 2^X} m_X(a) \log_2 |a|^{|X||Y|} + \sum_{b \in 2^Y} m_Y(b) \log_2 |b|^{|X||Y|} - \sum_{a \in 2^X} m_X(a) \log_2 m_X(a) - \sum_{b \in 2^Y} m_Y(b) \log_2 m_Y(b) \\
&= \sum_{a \in 2^X} m_X(a) \log_2 \frac{|a|^{|X||Y|}}{m_X(a)} + \sum_{b \in 2^Y} m_Y(b) \log_2 \frac{|b|^{|X||Y|}}{m_Y(b)} \\
&\geq \sum_{a \in 2^X} m_X(a) \log_2 \frac{|a|^{|X|}}{m_X(a)} + \sum_{b \in 2^Y} m_Y(b) \log_2 \frac{|b|^{|Y|}}{m_Y(b)} = H_{B\&F}(m_X) + H_{B\&F}(m_Y)
\end{aligned}$$

(using $\sum_{a \in 2^X} m_X(a) = \sum_{b \in 2^Y} m_Y(b) = 1$ in the third step)
From the above derivation, the new entropy satisfies the additivity property if and only if $|X| = |Y| = 1$; otherwise, the new belief entropy satisfies neither the additivity nor the sub-additivity property.
To be more intuitive, we consider the following example:
Let $Z$ be the product of FOD $X = \{x_1, x_2\}$ and FOD $Y = \{y_1, y_2, y_3\}$. Let the BPA on $Z$ be $m$, with marginal BPAs $m_X$ on $X$ and $m_Y$ on $Y$. We suppose the BPA on $Z$ is as follows:
$m(\{z_{11}\}) = 0.1$, $m(\{z_{12}\}) = 0.1$, $m(\{z_{13}\}) = 0.1$
$m(\{z_{21}\}) = 0.1$, $m(\{z_{22}\}) = 0.2$, $m(\{z_{23}\}) = 0.2$
$m(Z) = 1 - m(\{z_{11}\}) - m(\{z_{12}\}) - m(\{z_{13}\}) - m(\{z_{21}\}) - m(\{z_{22}\}) - m(\{z_{23}\}) = 0.2$

where $z_{ij} = (x_i, y_j)$. Thus, the marginal BPAs on $X$ and $Y$ are:
$m_X(\{x_1\}) = 0.3$, $m_X(\{x_2\}) = 0.5$, $m_X(X) = 1 - m_X(\{x_1\}) - m_X(\{x_2\}) = 0.2$
$m_Y(\{y_1\}) = 0.2$, $m_Y(\{y_2\}) = 0.3$, $m_Y(\{y_3\}) = 0.3$, $m_Y(Y) = 1 - m_Y(\{y_1\}) - m_Y(\{y_2\}) - m_Y(\{y_3\}) = 0.2$
The calculation results, with $|Z| = 6$, $|X| = 2$, and $|Y| = 3$, are as follows:

$$\begin{aligned}
H_{B\&F}(m) &= \sum_{z \in 2^{X \times Y}} m(z) \log_2 \frac{|z|^6}{m(z)} = 4 \times 0.1 \log_2 \frac{1}{0.1} + 2 \times 0.2 \log_2 \frac{1}{0.2} + 0.2 \log_2 \frac{6^6}{0.2} \\
&= 0.6644 + 0.6644 + 0.9288 + 3.5663 = 5.8239 \\
H_{B\&F}(m_X) &= \sum_{a \in 2^X} m_X(a) \log_2 \frac{|a|^2}{m_X(a)} = 0.3 \log_2 \frac{1}{0.3} + 0.5 \log_2 \frac{1}{0.5} + 0.2 \log_2 \frac{2^2}{0.2} \\
&= 0.5211 + 0.5 + 0.8644 = 1.8855 \\
H_{B\&F}(m_Y) &= \sum_{b \in 2^Y} m_Y(b) \log_2 \frac{|b|^3}{m_Y(b)} = 0.2 \log_2 \frac{1}{0.2} + 2 \times 0.3 \log_2 \frac{1}{0.3} + 0.2 \log_2 \frac{3^3}{0.2} \\
&= 0.4644 + 0.5211 + 0.5211 + 1.4154 = 2.9220 \\
H_{B\&F}(m_X) + H_{B\&F}(m_Y) &= 1.8855 + 2.9220 = 4.8075
\end{aligned}$$
Obviously, $H_{B\&F}(m) > H_{B\&F}(m_X) + H_{B\&F}(m_Y)$. Therefore, the additivity and sub-additivity properties are not satisfied by the new entropy.
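The counterexample can be checked numerically with the bf_entropy sketch from above (our own illustration):

```python
import math

def bf_entropy(m, n):  # B&F entropy with |X| = n, as sketched earlier
    return sum(v * math.log2(len(a) ** n / v) for a, v in m.items() if v > 0)

# The BPA on Z = X x Y (|Z| = 6) and its marginals, from the example above.
masses = {(1, 1): 0.1, (1, 2): 0.1, (1, 3): 0.1,
          (2, 1): 0.1, (2, 2): 0.2, (2, 3): 0.2}
mZ = {frozenset({z}): v for z, v in masses.items()}
mZ[frozenset(masses)] = 0.2  # m(Z): the whole product frame
mX = {frozenset({1}): 0.3, frozenset({2}): 0.5, frozenset({1, 2}): 0.2}
mY = {frozenset({1}): 0.2, frozenset({2}): 0.3,
      frozenset({3}): 0.3, frozenset({1, 2, 3}): 0.2}
print(round(bf_entropy(mZ, 6), 4))                      # 5.8239
print(round(bf_entropy(mX, 2) + bf_entropy(mY, 3), 4))  # 4.8074 (the paper's
# 4.8075 comes from summing the individually rounded terms)
```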
(P6) Range:
As shown under the maximum entropy property, the value of the new entropy for a vacuous BPA is $H_{B\&F}(m_v) = |X| \log_2 |X| > \log_2 |X|$ for $|X| > 1$. Thus, it does not satisfy the range property.
From the results proven above, the new belief entropy satisfies the non-negativity, monotonicity, and probability consistency properties, and does not satisfy the maximum entropy, additivity, sub-additivity, and range properties.

6. Numerical Example and Simulation

In the first part of this section, some examples are given to illustrate the effectiveness of the new belief entropy. The influence of different BPAs on B&F entropy is shown in the second part.

6.1. Numerical Example

6.1.1. Example 1

Let FOD $X = \{a\}$, and suppose we obtain a BPA from a sensor as $m(\{a\}) = 1$. Shannon entropy and the new definition proposed by the authors give the same result:

$$H_{B\&F}(m) = H_S(m) = 1 \times \log_2 1 = 0$$

6.1.2. Example 2

Suppose there are three FODs: $X_1 = \{x_1, x_2\}$, $X_2 = \{x_1, x_2, x_3, x_4\}$, and $X_3 = \{x_1, x_2, x_3, x_4, x_5\}$. The Bayesian BPAs $m_1$, $m_2$, $m_3$ on these FODs are uniform:

$m_1(\{x_1\}) = m_1(\{x_2\}) = 0.5$
$m_2(\{x_1\}) = m_2(\{x_2\}) = m_2(\{x_3\}) = m_2(\{x_4\}) = 0.25$
$m_3(\{x_1\}) = m_3(\{x_2\}) = m_3(\{x_3\}) = m_3(\{x_4\}) = m_3(\{x_5\}) = 0.2$
The new belief entropy is calculated as follows:

$$H_{B\&F}(m_1) = 2 \times 0.5 \log_2 \frac{1^2}{0.5} = 1$$
$$H_{B\&F}(m_2) = 4 \times 0.25 \log_2 \frac{1^4}{0.25} = 2$$
$$H_{B\&F}(m_3) = 5 \times 0.2 \log_2 \frac{1^5}{0.2} = 2.3219$$
It is obvious that uncertainty increases as the number of focal elements increases, which is reasonable.
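These values can be reproduced with the bf_entropy sketch from Section 5 (our own illustration):

```python
import math

def bf_entropy(m, n):  # B&F entropy with |X| = n, as sketched in Section 5
    return sum(v * math.log2(len(a) ** n / v) for a, v in m.items() if v > 0)

for k in (2, 4, 5):  # uniform Bayesian BPAs on FODs of size 2, 4, 5
    m = {frozenset({i}): 1.0 / k for i in range(k)}
    print(round(bf_entropy(m, k), 4))  # 1.0, 2.0, 2.3219, as in Example 2
```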

6.1.3. Example 3

Using the FODs from Example 2 and the vacuous BPAs $m_1(\{x_1, x_2\}) = 1$, $m_2(\{x_1, x_2, x_3, x_4\}) = 1$, and $m_3(\{x_1, x_2, x_3, x_4, x_5\}) = 1$, the new entropy results are calculated as follows:

$$H_{B\&F}(m_1) = 1 \times \log_2 \frac{2^2}{1} = 2$$
$$H_{B\&F}(m_2) = 1 \times \log_2 \frac{4^4}{1} = 8$$
$$H_{B\&F}(m_3) = 1 \times \log_2 \frac{5^5}{1} = 11.6096$$
Comparing Example 2 and Example 3, it is easy to see that the results for the vacuous BPAs are larger than those for the Bayesian BPAs.

6.1.4. Example 4

In this example, we compare Pal entropy and B&F entropy. Let FOD $X_1 = \{x_1, x_2, x_3, x_4, x_5\}$ and $X_2 = \{x_1, x_2, x_3, x_4\}$, and suppose the following two situations:

$C_1$: $m_1(\{x_1, x_2, x_3\}) = 0.3$, $m_1(\{x_4, x_5\}) = 0.7$ (on $X_1$)
$C_2$: $m_2(\{x_1, x_2, x_3\}) = 0.3$, $m_2(\{x_3, x_4\}) = 0.7$ (on $X_2$)
The Pal entropy and B&F entropy results are calculated and compared as follows:

$$\begin{aligned}
C_1:\; H_{B\&F}(m_1) &= 0.3 \log_2 \frac{3^5}{0.3} + 0.7 \log_2 \frac{2^5}{0.7} = 2.8985 + 3.8602 = 6.7587 \\
C_1:\; H_P(m_1) &= 0.3 \log_2 \frac{3}{0.3} + 0.7 \log_2 \frac{2}{0.7} = 0.9966 + 1.0602 = 2.0568 \\
C_2:\; H_{B\&F}(m_2) &= 0.3 \log_2 \frac{3^4}{0.3} + 0.7 \log_2 \frac{2^4}{0.7} = 2.4230 + 3.1602 = 5.5832 \\
C_2:\; H_P(m_2) &= 0.3 \log_2 \frac{3}{0.3} + 0.7 \log_2 \frac{2}{0.7} = 0.9966 + 1.0602 = 2.0568
\end{aligned}$$
We can draw the following conclusions:
$$H_{B\&F}(m_1) > H_{B\&F}(m_2), \qquad H_P(m_1) = H_P(m_2)$$
By comparison, the B&F result is more reasonable: $C_2$ has a smaller FOD, and its two focal elements share the element $x_3$; therefore, the uncertainty of $C_1$ should be larger than that of $C_2$.
From an overall view, as long as the focal elements of two BPAs are the same, Pal entropy stays constant even when the FOD sizes differ, which is unreasonable. The new belief entropy, however, reflects the impact of the FOD size on information uncertainty; obviously, the degree of information uncertainty should grow with the size of the FOD. Thus, the definition proposed in this paper is more reasonable in this example.
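The comparison is easy to reproduce with the earlier sketches (our own illustration; Pal entropy is coded from its Section 4 formula):

```python
import math

def bf_entropy(m, n):  # B&F entropy with |X| = n
    return sum(v * math.log2(len(a) ** n / v) for a, v in m.items() if v > 0)

def pal_entropy(m):    # H_P(m) = sum_a m(a) * log2(|a| / m(a))
    return sum(v * math.log2(len(a) / v) for a, v in m.items() if v > 0)

c1 = {frozenset({1, 2, 3}): 0.3, frozenset({4, 5}): 0.7}  # on X1, |X1| = 5
c2 = {frozenset({1, 2, 3}): 0.3, frozenset({3, 4}): 0.7}  # on X2, |X2| = 4
print(round(bf_entropy(c1, 5), 4), round(pal_entropy(c1), 4))  # 6.7587 2.0568
print(round(bf_entropy(c2, 4), 4), round(pal_entropy(c2), 4))  # 5.5832 2.0568
```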

6.1.5. Example 5

In this example, we suppose an FOD with ten elements, $X = \{1, 2, \ldots, 10\}$, and a mass function with four focal elements: $m(\{3, 4, 5\}) = 0.05$, $m(\{7\}) = 0.05$, $m(B_i) = 0.8$, $m(X) = 0.1$, where $B_i \in 2^X$ and $i$ equals the cardinality of $B_i$. We chose ten subsets of $2^X$ to assign to $B$, denoting the resulting BPA by $m_i$, and used Dubois entropy, Deng entropy, Pan–Deng entropy, and the new belief entropy for comparison; these definitions were listed in Section 4. As $B_i$ changes, the values are computed in MATLAB, with the results shown in Table 1.
Table 1 and Figure 2 show that the new belief entropy is larger than Deng entropy and Dubois entropy. On the other hand, the growth trend of the new belief entropy is slower than Deng entropy and Pan–Deng entropy and the same as Dubois entropy. For example, we chose $B_1, B_2$ and $B_9, B_{10}$ to illustrate the impact of each additional element in $B_i$ on uncertainty, under different cardinalities of $B_i$. From Table 1, we get:

$$H_{Dubois}(m_2) - H_{Dubois}(m_1) = 0.8 > H_{Dubois}(m_{10}) - H_{Dubois}(m_9) = 0.1216$$
$$H_{Deng}(m_2) - H_{Deng}(m_1) = 1.2680 > H_{Deng}(m_{10}) - H_{Deng}(m_9) = 0.8012$$
$$H_{P\&D}(m_2) - H_{P\&D}(m_1) = 1.3473 > H_{P\&D}(m_{10}) - H_{P\&D}(m_9) = 1.1735$$
$$H_{B\&F}(m_2) - H_{B\&F}(m_1) = 8 > H_{B\&F}(m_{10}) - H_{B\&F}(m_9) = 1.2161$$

The P&D entropy in Figure 2 is the $H_{Bel}(m)$ listed in Section 4.
Although all four entropy values in Figure 2 increased, their slopes were different. Deng entropy and Pan–Deng entropy increased linearly, while the slopes of Dubois entropy and the new entropy decreased with the increase of the cardinality of $B$. We believe that the growth trend of the latter was more reasonable, because the scale of $B$ is an important indicator of the change in information uncertainty, and its per-element effect should change with the size of the cardinality. With the same cardinality of $B_i$, our new belief entropy was larger than Dubois entropy, so it could well reflect the degree of uncertainty. Therefore, through comprehensive analysis, we consider the new belief entropy more accurate.
Yager entropy, Pal entropy, Klir and Ramer entropy, and Jiroušek and Shenoy entropy are plotted in Figure 3.
From Figure 3, it can be seen that these definitions kept small values. The degrees of uncertainty measured by Klir and Ramer and by Yager decreased visibly with the increasing number of elements in $B$, which is understandable. The uncertainty measures proposed by Pal and by Jiroušek and Shenoy were nearly linear in the cardinality of $B$, with the same growth trend as Deng entropy. The J&S entropy in Figure 3 is the $H_{J\&S}(m)$ listed in Section 4.

6.1.6. Example 6

In recent years, many modifications based on Deng entropy have been proposed [33,34,37]. In this example, we compare W entropy with our new model.
Although W entropy takes the scale of the FOD into account, the effect of the FOD scale on W entropy is very limited [34]. As Equation (26) shows, the value of our new model changes exponentially with the scale of the FOD. As shown in the examples of [34], when $\epsilon$ increased from zero to 10, the trend of W entropy was almost the same as Deng entropy. However, as demonstrated in Section 6.1.5, the growth trend of B&F entropy differs from Deng entropy. Therefore, we can see the effectiveness and superiority of the proposed entropy.

6.1.7. Example Summary

Based on the examples proposed above, we list some typical cases that may affect the new belief entropy and compare it with other entropies. Section 6.1.2 and Section 6.1.3 show that the new entropy is more sensitive to the vacuous BPA. Section 6.1.4 shows the limitations of existing entropies and how the new entropy solves the problem caused by FODs of different sizes. Section 6.1.5 reflects the change of the new entropy and other entropies as the number of elements increases. In Section 6.1.6, we made a simple comparison between W entropy and B&F entropy.

6.2. Simulation

Here, we use MATLAB to complete the test, which shows more intuitively how the new belief entropy changes with different BPAs.
We suppose an FOD $X = \{x_1, x_2\}$ with three BPAs: $m(\{x_1\}) = p_1$, $m(\{x_2\}) = p_2$, and $m(\{x_1, x_2\}) = 1 - p_1 - p_2$. In principle, $p_1$ and $p_2$ can take any value from zero to one; however, according to the D-S theory in Section 2, we limit them so that $m(\{x_1\}) + m(\{x_2\}) = p_1 + p_2 \leq 1$. Obviously, $m(\{x_1, x_2\})$ is a focal element only when $p_1 + p_2 < 1$. The simulation results are shown in Figure 4 and Figure 5, where the x-axis is $m(\{x_1\})$, the y-axis is $m(\{x_2\})$, and the z-axis is the value of the new entropy.
When $p_1 + p_2 = 1$, the maximum of the new entropy is $H_{B\&F}(m) = 1$, attained at $p_1 = p_2 = 0.5$. When the mass $m(\{x_1, x_2\})$ is allowed to be positive, $m(\{x_1\}) = 0.17$, $m(\{x_2\}) = 0.16$, and $m(\{x_1, x_2\}) = 0.67$ yield the maximum of the new entropy, $H_{B\&F}^{max}(m) = 2.585$. From this maximum, we again see that the new definition does not satisfy the maximum entropy property.
Analysis: These simulation results show the main trend of the new entropy as the BPAs change. They also indicate that, when $p_1 + p_2 < 1$, the new entropy increases as the mass on the vacuous set increases, which is reasonable. Therefore, the new entropy reflects well the degree of information uncertainty.
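The sweep behind Figures 4 and 5 can be replicated with a simple grid search (a sketch of our own; the MATLAB original is not shown in the paper):

```python
import math

def bf_entropy(m, n):  # B&F entropy with |X| = n
    return sum(v * math.log2(len(a) ** n / v) for a, v in m.items() if v > 0)

best_h, best_p = 0.0, None
steps = 100  # grid resolution of 0.01 on each axis
for i in range(steps + 1):
    for j in range(steps + 1 - i):           # enforce p1 + p2 <= 1
        p1, p2 = i / steps, j / steps
        m = {frozenset({"x1"}): p1, frozenset({"x2"}): p2,
             frozenset({"x1", "x2"}): 1.0 - p1 - p2}
        h = bf_entropy(m, 2)
        if h > best_h:
            best_h, best_p = h, (p1, p2)
print(best_h, best_p)  # ~2.585 near p1 = p2 = 1/6, matching the reported maximum
```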

7. Conclusions and Discussion

First of all, we reviewed some earlier definitions proposed by Hartley, Shannon, Yager, Nguyen, Lamata and Moral, Jiroušek and Shenoy, Klir and Ramer, Dubois, Nikhil R. Pal, Jousselme, Deng, and Pan–Deng. However, none of them reflects the effect of the FOD’s size on uncertainty.
We discussed an open issue: how to measure information uncertainty. Our principle was to include as much known information as possible under D-S theory. Thus, in this paper, we considered the cardinality of the FOD and defined a new model to measure uncertainty; meanwhile, some properties of the new entropy were discussed. The examples and simulation showed that the new entropy is more effective and accurate compared with other entropies.
When the target belongs to a set of clusters and the total number of targets cannot be determined, our method can still obtain the information uncertainty of the target accurately. Compared with traditional methods, the new entropy is easy to calculate, which means it can process more data in the same amount of time. In future work, we will apply it to solve practical problems and improve it in real applications.

Author Contributions

Conceptualization, Q.P.; Data curation, J.L.; Formal analysis, J.L.; Funding acquisition, J.L.; Investigation, J.L.; Methodology, J.L.; Project administration, J.L.; Resources, Q.P.; Software, J.L.; Supervision, Q.P.; Validation, J.L.; Visualization, J.L.; Writing–original draft, J.L.; Writing–review & editing, Q.P. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Acknowledgments

The authors greatly appreciate the reviewers’ suggestions and the editor’s encouragement.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Zadeh, L.A. Fuzzy sets. Inf. Control. 1965, 8, 338–353. [Google Scholar] [CrossRef] [Green Version]
  2. Feller, W. An Introduction to Probability Theory and Its Applications; John Wiley & Sons: Hoboken, NJ, USA, 2008; Volume 2. [Google Scholar]
  3. Dempster, A.P. Upper and lower probabilities induced by a multivalued mapping. In Classic Works of the Dempster–Shafer Theory of Belief Functions; Springer: Berlin, Germany, 2008; pp. 57–72. [Google Scholar]
  4. Shafer, G. A Mathematical Theory of Evidence; Princeton University Press: Princeton, NJ, USA, 1976; Volume 42. [Google Scholar]
  5. Pawlak, Z. Rough sets. Int. J. Comput. Inf. Sci. 1982, 11, 341–356. [Google Scholar] [CrossRef]
  6. Deng, Y. Generalized evidence theory. Appl. Intell. 2015, 43, 530–543. [Google Scholar] [CrossRef] [Green Version]
  7. Yager, R.R.; Abbasov, A.M. Pythagorean membership grades, complex numbers, and decision making. Int. J. Intell. Syst. 2013, 28, 436–452. [Google Scholar] [CrossRef]
  8. Song, Y.; Wang, X.; Quan, W.; Huang, W. A new approach to construct similarity measure for intuitionistic fuzzy sets. Soft Comput. 2019, 23, 1985–1998. [Google Scholar] [CrossRef]
  9. Pan, Y.; Zhang, L.; Li, Z.; Ding, L. Improved fuzzy Bayesian network-based risk analysis with interval-valued fuzzy sets and DS evidence theory. In IEEE Transactions on Fuzzy Systems; IEEE: Piscataway, NJ, USA, 2019. [Google Scholar]
  10. Düğenci, M. A new distance measure for interval valued intuitionistic fuzzy sets and its application to group decision making problems with incomplete weights information. Appl. Soft Comput. 2016, 41, 120–134. [Google Scholar] [CrossRef]
  11. Liu, Q.; Tian, Y.; Kang, B. Derive knowledge of Z-number from the perspective of Dempster–Shafer evidence theory. Eng. Appl. Artif. Intell. 2019, 85, 754–764. [Google Scholar] [CrossRef]
  12. Jiang, W.; Cao, Y.; Deng, X. A novel Z-network model based on Bayesian network and Z-number. In IEEE Transactions on Fuzzy Systems; IEEE: Piscataway, NJ, USA, 2019. [Google Scholar]
  13. Deng, Y. D numbers: theory and applications. J. Inf. Comput. Sci. 2012, 9, 2421–2428. [Google Scholar]
  14. Liu, B.; Deng, Y. Risk Evaluation in Failure Mode and Effects Analysis Based on D Numbers Theory. Int. J. Comput. Commun. Control 2019, 14, 672–691. [Google Scholar]
  15. Deng, X.; Jiang, W. Evaluating green supply chain management practices under fuzzy environment: a novel method based on D number theory. Int. J. Fuzzy Syst. 2019, 21, 1389–1402. [Google Scholar] [CrossRef]
  16. Zhao, J.; Deng, Y. Performer Selection in Human Reliability Analysis: D numbers Approach. Int. J. Comput. Commun. Control 2019, 14, 437–452. [Google Scholar] [CrossRef] [Green Version]
  17. George, T.; Pal, N.R. Quantification of conflict in Dempster–Shafer framework: a new approach. Int. J. Gen. Syst. 1996, 24, 407–423. [Google Scholar] [CrossRef]
  18. Sabahi, F.; Akbarzadeh-T, M.R. A qualified description of extended fuzzy logic. Inf. Sci. 2013, 244, 60–74. [Google Scholar] [CrossRef]
  19. Deng, Y.; Liu, Y.; Zhou, D. An improved genetic algorithm with initial population strategy for symmetric TSP. Math. Probl. Eng. 2015, 2015, 212794. [Google Scholar] [CrossRef] [Green Version]
  20. Yang, Y.; Han, D. A new distance-based total uncertainty measure in the theory of belief functions. Knowl.-Based Syst. 2016, 94, 114–123. [Google Scholar] [CrossRef]
  21. Sabahi, F.; Akbarzadeh-T, M.R. Introducing validity in fuzzy probability for judicial decision-making. Int. J. Approx. Reason. 2014, 55, 1383–1403. [Google Scholar] [CrossRef]
  22. Deng, Y. Fuzzy analytical hierarchy process based on canonical representation on fuzzy numbers. J. Comput. Anal. Appl. 2017, 22, 201–228. [Google Scholar]
  23. Shannon, C.E. A mathematical theory of communication. Bell Syst. Tech. J. 1948, 27, 379–423. [Google Scholar] [CrossRef] [Green Version]
  24. Hartley, R.V. Transmission of information 1. Bell Syst. Tech. J. 1928, 7, 535–563. [Google Scholar] [CrossRef]
  25. Höhle, U. Entropy with respect to plausibility measures. In Proceedings of the 12th IEEE International Symposium on Multiple-Valued Logic, Paris, France, 25–26 May 1982. [Google Scholar]
  26. Nguyen, H.T. On entropy of random sets and possibility distributions. Anal. Fuzzy Inf. 1987, 1, 145–156. [Google Scholar]
  27. Dubois, D.; Prade, H. Properties of measures of information in evidence and possibility theories. Fuzzy Sets Syst. 1987, 24, 161–182. [Google Scholar] [CrossRef]
  28. Klir, G.J.; Ramer, A. Uncertainty in the Dempster–Shafer theory: a critical re-examination. Int. J. Gen. Syst. 1990, 18, 155–166. [Google Scholar] [CrossRef]
  29. Jiroušek, R.; Shenoy, P.P. A new definition of entropy of belief functions in the Dempster–Shafer theory. Int. J. Approx. Reason. 2018, 92, 49–65. [Google Scholar] [CrossRef] [Green Version]
  30. Pal, N.R.; Bezdek, J.C.; Hemasinha, R. Uncertainty measures for evidential reasoning I: A review. Int. J. Approx. Reason. 1992, 7, 165–183. [Google Scholar] [CrossRef] [Green Version]
  31. Pal, N.R.; Bezdek, J.C.; Hemasinha, R. Uncertainty measures for evidential reasoning II: A new measure of total uncertainty. Int. J. Approx. Reason. 1993, 8, 1–16. [Google Scholar] [CrossRef] [Green Version]
  32. Deng, Y. Deng entropy. Chaos Solitons Fractals 2016, 91, 549–553. [Google Scholar] [CrossRef]
  33. Pan, L.; Deng, Y. A new belief entropy to measure uncertainty of basic probability assignments based on belief function and plausibility function. Entropy 2018, 20, 842. [Google Scholar] [CrossRef] [Green Version]
  34. Wang, D.; Gao, J.; Wei, D. A New Belief Entropy Based on Deng Entropy. Entropy 2019, 21, 987. [Google Scholar] [CrossRef] [Green Version]
  35. Frikha, A.; Moalla, H. Analytic hierarchy process for multi-sensor data fusion based on belief function theory. Eur. J. Oper. Res. 2015, 241, 133–147. [Google Scholar] [CrossRef]
  36. Khodabandeh, M.; Shahri, A.M. Uncertainty evaluation for a Dezert–Smarandache theory-based localization problem. Int. J. Gen. Syst. 2014, 43, 610–632. [Google Scholar] [CrossRef]
  37. Zhou, D.; Tang, Y.; Jiang, W. A modified belief entropy in Dempster–Shafer framework. PLoS ONE 2017, 12, e0176832. [Google Scholar] [CrossRef] [PubMed]
  38. Tang, Y.; Zhou, D.; He, Z.; Xu, S. An improved belief entropy–based uncertainty management approach for sensor data fusion. Int. J. Distrib. Sens. Networks 2017, 13, 1550147717718497. [Google Scholar] [CrossRef]
  39. Denoeux, T. A k-nearest neighbor classification rule based on Dempster–Shafer theory. In Classic Works of the Dempster–Shafer Theory of Belief Functions; Springer: Berlin, Germany, 2008; pp. 737–760. [Google Scholar]
  40. Liu, Z.G.; Pan, Q.; Dezert, J. A new belief-based K-nearest neighbor classification method. Pattern Recognit. 2013, 46, 834–844. [Google Scholar] [CrossRef]
  41. Ma, J.; Liu, W.; Miller, P.; Zhou, H. An evidential fusion approach for gender profiling. Inf. Sci. 2016, 333, 10–20. [Google Scholar] [CrossRef] [Green Version]
  42. Liu, Z.G.; Pan, Q.; Dezert, J.; Mercier, G. Credal classification rule for uncertain data based on belief functions. Pattern Recognit. 2014, 47, 2532–2541. [Google Scholar] [CrossRef]
  43. Han, D.; Liu, W.; Dezert, J.; Yang, Y. A novel approach to pre-extracting support vectors based on the theory of belief functions. Knowl.-Based Syst. 2016, 110, 210–223. [Google Scholar] [CrossRef]
  44. Liu, Z.G.; Pan, Q.; Dezert, J.; Martin, A. Adaptive imputation of missing values for incomplete pattern classification. Pattern Recognit. 2016, 52, 85–95. [Google Scholar] [CrossRef] [Green Version]
  45. Jiang, W.; Wei, B.; Xie, C.; Zhou, D. An evidential sensor fusion method in fault diagnosis. Adv. Mech. Eng. 2016, 8, 1687814016641820. [Google Scholar] [CrossRef] [Green Version]
  46. Yuan, K.; Xiao, F.; Fei, L.; Kang, B.; Deng, Y. Modeling sensor reliability in fault diagnosis based on evidence theory. Sensors 2016, 16, 113. [Google Scholar] [CrossRef] [Green Version]
  47. Yuan, K.; Xiao, F.; Fei, L.; Kang, B.; Deng, Y. Conflict management based on belief function entropy in sensor fusion. SpringerPlus 2016, 5, 638. [Google Scholar] [CrossRef] [Green Version]
  48. Jiang, W.; Xie, C.; Zhuang, M.; Shou, Y.; Tang, Y. Sensor data fusion with z-numbers and its application in fault diagnosis. Sensors 2016, 16, 1509. [Google Scholar] [CrossRef] [PubMed]
  49. Yager, R.R.; Liu, L. Classic Works of the Dempster–Shafer Theory of Belief Functions; Springer: Berlin, Germany, 2008; Volume 219. [Google Scholar]
  50. Liu, Z.G.; Pan, Q.; Dezert, J.; Mercier, G. Credal c-means clustering method based on belief functions. Knowl.-Based Syst. 2015, 74, 119–132. [Google Scholar] [CrossRef]
  51. Yager, R.R.; Alajlan, N. Decision making with ordinal payoffs under Dempster–Shafer type uncertainty. Int. J. Intell. Syst. 2013, 28, 1039–1053. [Google Scholar] [CrossRef]
  52. Merigó, J.M.; Casanovas, M. Induced aggregation operators in decision making with the Dempster–Shafer belief structure. Int. J. Intell. Syst. 2009, 24, 934–954. [Google Scholar] [CrossRef] [Green Version]
  53. Wang, Y.M.; Elhag, T.M. A comparison of neural network, evidential reasoning and multiple regression analysis in modelling bridge risks. Expert Syst. Appl. 2007, 32, 336–348. [Google Scholar] [CrossRef]
  54. Su, X.; Deng, Y.; Mahadevan, S.; Bao, Q. An improved method for risk evaluation in failure modes and effects analysis of aircraft engine rotor blades. Eng. Fail. Anal. 2012, 26, 164–174. [Google Scholar] [CrossRef]
  55. Fu, C.; Yang, J.B.; Yang, S.L. A group evidential reasoning approach based on expert reliability. Eur. J. Oper. Res. 2015, 246, 886–893. [Google Scholar] [CrossRef]
  56. Zhang, X.; Mahadevan, S.; Deng, X. Reliability analysis with linguistic data: An evidential network approach. Reliab. Eng. Syst. Saf. 2017, 162, 111–121. [Google Scholar] [CrossRef]
  57. Yager, R.R. Arithmetic and other operations on Dempster–Shafer structures. Int. J. Man-Mach. Stud. 1986, 25, 357–366. [Google Scholar] [CrossRef]
  58. Li, Y.; Deng, Y. Intuitionistic evidence sets. IEEE Access 2019, 7, 106417–106426. [Google Scholar] [CrossRef]
  59. Song, Y.; Deng, Y. Divergence measure of belief function and its application in data fusion. IEEE Access 2019, 7, 107465–107472. [Google Scholar] [CrossRef]
  60. Klir, G.J.; Wierman, M.J. Uncertainty-Based Information: Elements of Generalized Information Theory; Springer: Berlin, Germany, 2013; Volume 15. [Google Scholar]
  61. Klir, G.; Folger, T. Fuzzy Sets, Uncertainty, and Information; Prentice Hall: Englewood Cliffs, NJ, USA, 1988. [Google Scholar]
  62. Yager, R.R. Entropy and specificity in a mathematical theory of evidence. Int. J. Gen. Syst. 1983, 9, 249–260. [Google Scholar] [CrossRef]
  63. Lamata, M.T.; Moral, S. Measures of entropy in the theory of evidence. Int. J. Gen. Syst. 1988, 14, 297–305. [Google Scholar] [CrossRef]
  64. Jousselme, A.L.; Liu, C.; Grenier, D.; Bossé, É. Measuring ambiguity in the evidence theory. In IEEE Transactions on Systems, Man, and Cybernetics-Part A: Systems and Humans; IEEE: Piscataway, NJ, USA, 2006; Volume 36, pp. 890–903. [Google Scholar]
  65. Smets, P. Constructing the Pignistic Probability Function in a Context of Uncertainty. UAI 1989, 89, 29–40. [Google Scholar]
  66. Abellán, J. Analyzing properties of Deng entropy in the theory of evidence. Chaos Solitons Fractals 2017, 95, 195–199. [Google Scholar] [CrossRef]
Figure 1. Comparison of different frame of discernment (FOD) information.
Figure 2. Comparison between the new belief entropy and other entropies.
Figure 3. Results’ comparison of other entropies.
Figure 4. The value of the new belief entropy with changes of BPA.
Figure 5. The value of the new belief entropy with changes of BPA.
Table 1. The value of different definitions when $B_i$ changes.

Cases                  Dubois Entropy   Deng Entropy   Pan–Deng Entropy   New Entropy
B1 = {1}               0.4114           2.6623         16.1443            5.1363
B2 = {1, 2}            1.2114           3.9303         17.4916            13.1363
B3 = {1, 2, 3}         1.6794           4.9082         19.8608            17.8160
B4 = {1, 2, …, 4}      2.0114           5.7878         20.8229            21.1363
B5 = {1, 2, …, 5}      2.2690           6.6256         21.8314            23.7118
B6 = {1, 2, …, 6}      2.4794           7.4441         22.7521            25.8160
B7 = {1, 2, …, 7}      2.6573           8.2532         24.1331            27.5952
B8 = {1, 2, …, 8}      2.8114           9.0578         25.0685            29.1363
B9 = {1, 2, …, 9}      2.9474           9.8600         26.0212            30.4957
B10 = {1, 2, …, 10}    3.0690           10.6612        27.1947            31.7118

