A New Total Uncertainty Measure from A Perspective of Maximum Entropy Requirement

The Dempster-Shafer theory (DST) is an information fusion framework that is widely used in many fields. However, measuring the uncertainty of a basic probability assignment (BPA) remains an open issue in DST. Many methods have been proposed to quantify the uncertainty of BPAs, but the existing methods have limitations. In this paper, a new total uncertainty measure is proposed from the perspective of the maximum entropy requirement. The proposed measure quantifies both dissonance and non-specificity in a BPA and consists of two components. The first component is consistent with Yager's dissonance measure; the second measures non-specificity and admits different function forms. We also prove that the proposed measure satisfies several desirable properties. Besides, numerical examples and applications are provided to illustrate the effectiveness of the proposed total uncertainty measure.


Introduction
With the development of sensor technology, it has become common for complex systems to be equipped with multiple sensors. Compared with single-sensor monitoring, multi-sensor monitoring offers higher reliability, so effectively fusing multi-sensor information is a key issue. Many techniques have been proposed for this purpose, such as the Dempster-Shafer theory (DST) [1,2], Kalman filtering (KF) [3,4], fuzzy theory [5], Bayesian reasoning [6,7], neural networks [8], and so on. However, the real world contains many kinds of uncertainty, such as randomness and imprecision, and the treatment of uncertainty is an important aspect of information fusion theories. Among these techniques, DST is an effective framework for dealing with uncertain information. The theory was first proposed by Dempster [9] and further developed by Shafer [10]. It is widely used in fault diagnosis [11][12][13][14], decision-making [15,16], risk assessment [17], and so on. Many studies in recent years have focused on conflict resolution [18][19][20], evidence revision [21], combination rules [22][23][24][25], and information volume [26,27]. Many methods for uncertainty quantification have also been proposed [28]. However, the existing methods have limitations, and the uncertainty measure of BPAs is still an open issue in DST.
The concept of entropy was first proposed by the German physicist Clausius in 1865. In thermodynamics, entropy is a measure of the "chaos" of a system [29]. In information theory, entropy, also known as Shannon entropy, is a measure of the uncertainty of a random variable [30]. Besides, Ilya Prigogine made a famous statement: "The entropy . . . leads to an ordering process" [31]. Parker and Jeynes also showed from a MaxEnt standpoint that the (stupendously gigantic) entropy of the supermassive black hole at the centre of the Milky Way can account for the geometrical stability of the galaxy [32]. Among these notions, Shannon entropy, verified over the past few decades, is an effective way to measure uncertainty in probability theory (PT), but its direct application in DST is inappropriate. This is because PT describes the probability of the occurrence of singletons, while evidence theory is based on non-additive probabilities, which can represent the possibility of propositions containing multiple elements [33].
Based on the above analysis, many scholars have proposed entropy-like measures to quantify the uncertainty of bodies of evidence (BOEs) in DST. For instance, Nguyen proposed a belief entropy based on the original basic probability assignment (BPA) [34]. Dubois and Prade proposed a weighted Hartley entropy for measuring the non-specificity of BPAs [35]. In addition, many other belief entropies have been proposed, including Höhle's entropy [36], Yager's dissonance measure [37], Klir and Ramer's discord measure [38], Klir and Parviz's strife measure [39], Jousselme's ambiguity measure (AM) [40], Deng entropy [41], Yang and Han's measure [42], the aggregated uncertainty measure (AU) [43], Wang and Song's measure (SU) [44], Jirousek and Shenoy's entropy (JS) [45], Deng's measure [46], and so on [47][48][49]. These methods can effectively measure the uncertainty of BOEs in some cases and satisfy some desirable properties of uncertainty quantification in DST [50]. Intuitively, when the system is completely unknown, that is, m(Ω) = 1, where Ω is the frame of discernment (FOD), the uncertainty of the evidence is greatest; this is called the maximum entropy property. Some of the existing methods do not satisfy this property. However, we argue that maximum entropy is a property that must be satisfied.
Motivated by the above discussions, a new total uncertainty measure from the perspective of the maximum entropy requirement is proposed. The proposed method can measure both dissonance (conflict) and non-specificity in a BPA, and it consists of two components. The first component is consistent with Yager's dissonance measure. The second component measures non-specificity and admits different function forms. We also prove most of the desired properties of the proposed method, such as non-negativity, monotonicity, probability consistency, and so on. The main contributions are summarized as follows.

• We propose a new total uncertainty measure from the perspective of the maximum entropy requirement to quantify the uncertainty of BPAs in DST. Properties of the proposed method are also proved, such as non-negativity, monotonicity, maximum entropy, and so on.

• We conduct numerical examples to evaluate the effectiveness of the proposed method. The simulation results indicate that the proposed total uncertainty measure degrades to Shannon entropy when the BPA is a Bayesian mass function. Furthermore, the proposed entropy can effectively deal with the redundant information of focal elements.
The remainder of this paper is organized as follows. Section 1 reviews the related concepts and works. Some preliminaries are introduced in Section 2. In Section 3, we illustrate the proposed method in detail. Simulation results and a discussion in comparison with other methods are presented in Section 4. In Section 5, we give an application to feature evaluation for pattern classification based on the Iris dataset. Section 6 concludes the paper.

Preliminaries
Some basic concepts and existing methods are briefly introduced in this section.

Dempster-Shafer Theory
The Dempster-Shafer theory, proposed by Dempster [9] and expanded by Shafer [10], is a mathematical theory for fusing multi-source information. It can effectively cope with uncertainty and is widely used in target identification, fusion decision-making, and so on. Some definitions of this theory are as follows.
• Frame of Discernment (FOD). If Θ = {θ_1, θ_2, · · · , θ_r} is a finite complete set of r mutually exclusive elements, it is called a frame of discernment [9,10,51].
• Basic Probability Assignment (BPA). Let Θ be a FOD; its power set 2^Θ constitutes a set of propositions. If a function m : 2^Θ → [0, 1] satisfies the following formula [9,10,52]:

m(∅) = 0,  ∑_{S⊆Θ} m(S) = 1,

then the mass function m is a BPA. In this definition, m(S) is the basic probability number of proposition S and indicates the belief assigned to S.

• Dempster Combination Rule (DCR). Let m_1 and m_2 be two BPAs on Θ; then the Dempster combination rule is as follows [9]:

m(A) = (1 / (1 − k)) ∑_{S∩B=A} m_1(S) m_2(B),  A ≠ ∅,

where k = ∑_{S∩B=∅} m_1(S) m_2(B) measures the conflict between the two BPAs, and m(∅) = 0.
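The combination rule above can be sketched in Python. This is a minimal sketch under our own representation (mass functions as dicts keyed by frozenset focal elements), not code from the paper:

```python
from itertools import product

def combine(m1, m2):
    """Dempster's rule for two mass functions keyed by frozenset focal elements."""
    fused = {}
    k = 0.0  # total mass assigned to conflicting (empty) intersections
    for (s1, v1), (s2, v2) in product(m1.items(), m2.items()):
        inter = s1 & s2
        if inter:
            fused[inter] = fused.get(inter, 0.0) + v1 * v2
        else:
            k += v1 * v2
    if k >= 1.0:
        raise ValueError("total conflict: Dempster's rule is undefined")
    # Normalize the non-conflicting mass by 1 - k
    return {s: v / (1.0 - k) for s, v in fused.items()}
```

For example, combining m_1({a}) = 0.6, m_1({a, b}) = 0.4 with m_2({b}) = 0.5, m_2({a, b}) = 0.5 gives k = 0.3 and a fused mass that again sums to 1.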

Belief and Plausibility Function
Let m be a BPA on FOD Θ. If Bel : 2^Θ → [0, 1] satisfies [9]:

Bel(S) = ∑_{B⊆S} m(B),

then Bel(S) is called the belief measure of proposition S. Pl(S) is the plausibility function, defined as follows:

Pl(S) = ∑_{B∩S≠∅} m(B).

Pl(S) is the degree to which you do not disagree with proposition S.
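The two functions can be sketched directly from their definitions (a minimal sketch; the dict-of-frozensets representation is ours):

```python
def bel(m, S):
    """Belief of S: total mass of focal elements contained in S."""
    return sum(v for B, v in m.items() if B <= S)

def pl(m, S):
    """Plausibility of S: total mass of focal elements intersecting S."""
    return sum(v for B, v in m.items() if B & S)
```

By construction Bel(S) ≤ Pl(S), and Pl(S) = 1 − Bel of the complement of S.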

Shannon Entropy
Let X be a sample space with possible values {x_1, x_2, · · · , x_n}. Then, the Shannon entropy is defined as [30]:

H = −∑_{i=1}^{n} p(x_i) log_2 p(x_i),

where p(x_i) is the probability of x_i, satisfying ∑_{i=1}^{n} p(x_i) = 1.

Some Existing Entropies in DST
In information theory, Shannon entropy has been widely used, but it has inherent limitations for handling uncertainty in DST. Nonetheless, the idea of Shannon entropy still plays a crucial role in guiding the uncertainty measurement of BPAs. Many scholars have proposed uncertainty measures. Let X be a FOD; some existing uncertainty measures in DST are listed as follows.
Nguyen's entropy. Nguyen [34] proposed a belief entropy based on the original BPA:

H_n(m) = −∑_{S∈2^X} m(S) log_2 m(S).

Weighted Hartley entropy. Dubois and Prade [35] proposed an entropy for the non-specificity measure:

H_DP(m) = ∑_{S∈2^X} m(S) log_2 |S|,

where |S| is the cardinality of S.

Aggregated uncertainty measure (AU). Harmanec and Klir [43] proposed a total uncertainty measure of non-specificity and inconsistency:

AU(m) = max { −∑_{x∈X} p_x log_2 p_x },

where the maximum is taken over all probability distributions {p_x} consistent with m, that is, p_x ∈ [0, 1] for all x ∈ X, ∑_{x∈X} p_x = 1, and Bel(S) ≤ ∑_{x∈S} p_x ≤ Pl(S) for all S ⊆ X.

Yager's entropy. Yager [37] proposed a dissonance measure of BPAs based on the plausibility function:

E_Y(m) = −∑_{S∈2^X} m(S) log_2 Pl(S),

where Pl(S) is the plausibility measurement of S in m.

Deng entropy. Deng [41] proposed a new uncertainty measurement method, namely, "Deng entropy". It is defined as:

E_d(m) = −∑_{S∈2^X} m(S) log_2 ( m(S) / (2^|S| − 1) ).

Höhle entropy. Höhle [36] proposed a belief entropy based on the belief function, which is defined as:

H_o(m) = −∑_{S∈2^X} m(S) log_2 Bel(S),

where Bel(S) is the belief measurement of S.

Yang and Han's measure (TU^I). Yang and Han [42] defined a total uncertainty measurement based on the belief intervals [Bel({x}), Pl({x})]:

TU^I(m) = 1 − (1/n) ∑_{x∈X} d^I([Bel({x}), Pl({x})], [0, 1]),

where n is the number of elements of the FOD X and d^I(·, ·) is a distance between two interval numbers.

Deng's measure (TU^E_I). In addition, Deng et al. [46] proposed an improved total uncertainty measurement method based on belief intervals:

TU^E_I(m) = 1 − (1/n) ∑_{x∈X} d^E_I([Bel({x}), Pl({x})], [0, 1]),

where d^E_I(·, ·) indicates the Euclidean distance between the two interval numbers [Bel({x}), Pl({x})] and [0, 1].
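A few of the measures above can be sketched directly from their formulas (a minimal sketch in our dict-of-frozensets representation; the closed-form values quoted in the comments follow from the formulas, not from the paper's tables):

```python
from math import log2

def weighted_hartley(m):
    # Dubois-Prade non-specificity: sum of m(S) * log2 |S|
    return sum(v * log2(len(S)) for S, v in m.items() if v > 0)

def yager_dissonance(m):
    # E_Y(m) = -sum of m(S) * log2 Pl(S)
    pl = lambda S: sum(v for B, v in m.items() if B & S)
    return -sum(v * log2(pl(S)) for S, v in m.items() if v > 0)

def deng_entropy(m):
    # E_d(m) = -sum of m(S) * log2( m(S) / (2^|S| - 1) )
    return -sum(v * log2(v / (2 ** len(S) - 1)) for S, v in m.items() if v > 0)
```

For the vacuous BPA m(X) = 1 on a two-element FOD, these give log_2 2 = 1, 0, and log_2 3 ≈ 1.585, respectively.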

The Proposed Method
In this section, a new total uncertainty measure of BPAs in DST is proposed from the perspective of the maximum entropy requirement. It quantifies the total uncertainty of BPAs, including conflict and non-specificity. For the conflict measure of BPAs, we adopt Yager's dissonance entropy [37]. For the non-specificity measure of BPAs, H_{n−s}(m), we require that H_{n−s}(m) be consistent with the maximum entropy requirement. For example, the uncertainty of the BPA m(S) = 1 defined on FOD X, S ⊂ X, should be equal to the uncertainty of the vacuous BPA m′(Ω) = 1 defined on the FOD Ω = S. Furthermore, the uncertainty of m(S) = a should hence be a function of a and of the uncertainty degree of the BPA m(S) = 1, rather than being measured by the weighted Hartley entropy.
Based on the above idea, the proposed new total uncertainty measure is defined as follows:

H(m) = −∑_{S∈2^X} m(S) log_2 Pl(S) + ∑_{S∈2^X} m(S) Q_S,

where X is the FOD, and Q_S represents the maximum entropy in S, that is, the uncertainty of m(S) = 1. Logically, for S ⊆ X, Q_S is a monotonically increasing function of the cardinality of S, so that Q_X ≥ Q_S. In addition, when the BPA is a Bayesian mass function, we expect the new entropy to degrade to Shannon entropy; therefore, Q_S = 0 when |S| = 1. In summary, Q_S is a function Q_S : |S| → R satisfying (i) Q_S = 0 if |S| = 1; and (ii) dQ_S / d|S| ≥ 0.
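Read operationally, the measure is Yager's dissonance term plus a mass-weighted Q term. A minimal sketch, with Q any function of the cardinality satisfying the two conditions above (the dict-of-frozensets representation is ours):

```python
from math import log2

def total_uncertainty(m, Q):
    """H(m) = -sum m(S) log2 Pl(S) + sum m(S) Q(|S|)."""
    pl = lambda S: sum(v for B, v in m.items() if B & S)
    dissonance = -sum(v * log2(pl(S)) for S, v in m.items() if v > 0)
    non_specificity = sum(v * Q(len(S)) for S, v in m.items() if v > 0)
    return dissonance + non_specificity
```

For any admissible Q, a Bayesian mass function has Pl({x}) = m({x}) and Q(1) = 0, so H degrades to the Shannon entropy of the singleton masses.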

Properties of the Proposed Method
Similar to probability theory (PT), there are some properties which should be satisfied by an uncertainty measure in DST, such as probability consistency, additivity, non-negativity, and so forth. The properties of the proposed entropy are analyzed as follows.

Property 1 (Non-Negativity). For any BPA m defined on a FOD X, H(m) ≥ 0.

Proof. Given X = {x_1, x_2, · · · , x_n}, for any focal element S ∈ 2^X, we have 0 < m(S) ≤ Pl(S) ≤ 1, so −m(S) log_2 Pl(S) ≥ 0; moreover, Q_S ≥ 0 by conditions (i) and (ii). Hence H(m) ≥ 0.

Property 2 (Set Monotonicity). Let m_1(X_1) = 1 be a vacuous BPA on FOD X_1, and m_2(X_2) = 1 be also a vacuous BPA on FOD X_2. If |X_1| < |X_2|, then H(m_1) < H(m_2).
Proof. For any vacuous BPA m(X) = 1, there is Pl(X) = 1, so H(m) = −log_2 Pl(X) + Q_X = Q_X. From the analysis in Section 3.1, Q_S is a monotonically increasing function of the cardinality, so Q_{X_1} < Q_{X_2} if |X_1| < |X_2|. Hence, the proposed method satisfies the set monotonicity property.

Property 3 (Maximum Entropy). For all BPAs defined on a FOD X, the vacuous BPA m(X) = 1 has the greatest uncertainty.
Proof. Let m be a BPA on FOD X. According to the analysis of Section 3.1, Q_S is a monotonically increasing function of the cardinality of S, so Q_S ≤ Q_X for every S ∈ 2^X, with H(m) = Q_X for the vacuous BPA. Therefore, the proposed method satisfies the maximum entropy property.
Property 4 (Probability Consistency). When m is a Bayesian mass function, the proposed entropy degrades to Shannon entropy.

Proof. If m is a Bayesian mass function, every focal element is a singleton {x}, so Pl({x}) = m({x}) and Q_{{x}} = 0. Then H(m) = −∑_{x∈X} m({x}) log_2 m({x}), which is exactly the Shannon entropy. Therewith, the proposed method satisfies the property of probability consistency.

Property 5 (Range). The range of the proposed entropy is [0, Q_X], where Q_X is a function of |X|.
Property 6 (Non-Additivity). Let m_X and m_Y be two BPAs defined on FODs X and Y, respectively, and let m_{X×Y} denote their joint BPA on X × Y. Then, in general,

H(m_{X×Y}) ≠ H(m_X) + H(m_Y).

Therefore, the proposed method does not satisfy the additivity property.

Property 7 (Generalized Set Consistency). The proposed entropy satisfies generalized set consistency if, and only if, Q_S = log_2 |S|.

Proof. Assume m is a BPA defined on FOD X with m(S) = 1 for some S ∈ 2^X. The uncertainty measured by the proposed entropy is:

H(m) = −m(S) log_2 Pl(S) + m(S) Q_S = Q_S,

since Pl(S) = 1. Generalized set consistency requires H(m) = log_2 |S|. Therefore, the proposed entropy satisfies the property of generalized set consistency if, and only if, Q_S = log_2 |S|.

Numerical Examples
In this section, we give three different forms of Q S . Some numerical examples are given to verify the rationality and effectiveness of the proposed method.
Case 1 (Q^1_S). According to [45], the maximum entropy is 2 log_2 |X|, where X is a FOD. In this paper, Q_S represents the maximum entropy in S. Hence, based on the above analysis, one function form of Q_S can be defined as:

Q^1_S = 2 log_2 |S|.

Case 2 (Q^2_S). According to [26], the maximum Deng entropy is log_2 ∑_{S∈2^X} (2^|S| − 1), which is inconsistent with our idea.
In this paper, we consider that the uncertainty of m(S) = a should be a function of a and of the uncertainty degree of the BPA m(S) = 1. Hence, among all B ∈ 2^S, only the single situation B = S carries mass, and the Deng entropy of m(S) = 1 is log_2 (2^|S| − 1). Therefore, another function form of Q_S can be defined as:

Q^2_S = log_2 (2^|S| − 1).

Case 3 (Q^3_S). According to [44,46], the maximum entropy is |X|, where X is a FOD. Similarly, the third function form of Q_S can be defined as:

Q^3_S = |S| − 1.

Then, the proposed entropy can be written as follows:

H_i(m) = −∑_{S∈2^X} m(S) log_2 Pl(S) + ∑_{S∈2^X} m(S) Q^i_S,  i = 1, 2, 3.
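The three cases can be sketched as plain functions of the focal-element cardinality. This is our reading of Cases 1-3 (in particular, the linear form for Case 3 is an assumption chosen so that Q(1) = 0); for the vacuous BPA m(X) = 1 the measure reduces to Q(|X|), so each variant is monotone in |X|:

```python
from math import log2

def q1(k):
    return 2 * log2(k)       # Case 1: from the maximum of JS entropy, 2 log2 |X|

def q2(k):
    return log2(2 ** k - 1)  # Case 2: Deng entropy of the BPA m(S) = 1

def q3(k):
    return k - 1             # Case 3 (our reading): linear, q3(1) = 0

# Check conditions (i) and (ii) from Section 3.1 on small cardinalities.
for q in (q1, q2, q3):
    values = [q(n) for n in range(1, 6)]
    assert values[0] == 0                                  # Shannon consistency
    assert all(a < b for a, b in zip(values, values[1:]))  # increasing in |S|
```

On a vacuous BPA with |X| = 3, for instance, the three variants give 2 log_2 3, log_2 7, and 2, respectively.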

Example 1
This example is adapted from [53]. Let the FOD be X = {x_1, x_2, · · · , x_n}. We give a BPA as m({x_i}) = 1/n for i = 1, 2, · · · , n. Then, we calculate the uncertainty of this BPA as n changes.
According to the definition and the desired properties of the entropy, it can be inferred that as n increases, the uncertainty of this BPA increases. In addition, when the BPA is a Bayesian mass function, the uncertainty of the BPA should be consistent with Shannon entropy.
We calculate the uncertainty of this BPA based on the proposed method and some existing methods, as shown in Figure 1. For this example, all the methods give exactly the same result as the Deng entropy, except the weighted Hartley entropy and the two belief-interval measures of Yang and Han (TU^I) and Deng (TU^E_I). In this paper, "Deng entropy" refers to the measure proposed by Deng in 2016 to quantify the uncertainty of BPAs, while "Deng's measure (TU^E_I)", proposed by Deng et al. in 2017, is an improved total uncertainty measure based on belief intervals.
In Figure 1, the uncertainty calculated by Yang and Han's measure, TU^E_I, and the weighted Hartley entropy shows a downward trend as n increases, which is inconsistent with intuition. In contrast, the uncertainty calculated by the methods proposed in this paper and by the remaining existing methods gradually increases with n, consistent with the results calculated by Shannon entropy. Therefore, the proposed methods are effective when the BPA is a Bayesian mass function.

Example 2
Let the FOD be X = {x_1, x_2, · · · , x_n}. We give a vacuous BPA m({x_1, · · · , x_n}) = 1. When n increases from 1 to 14, the uncertainty of the BPA measured by the proposed method and other existing methods is shown in Table 1. In addition, to visualize how the measurement results change with n, the results of the different methods are plotted in Figure 2. As shown in Figure 2, the uncertainty measured by Yager's dissonance entropy is always 0. Intuitively, however, the uncertainty of this BPA should increase as n increases; therefore, Yager's dissonance entropy obtains wrong results when m is a vacuous BPA. The results obtained by AU, the weighted Hartley entropy, and AM are identical, because for a vacuous BPA all three methods reduce to log_2 n. The uncertainty obtained by these three methods increases with n, which is consistent with expectations. Similarly, the uncertainty calculated by the methods proposed in this paper (H_1, H_2, H_3), Deng entropy, SU, JS, Yang and Han's measure, and TU^E_I also increases with n. Additionally, the growth trend of the proposed method H_2 is essentially the same as that of the Deng entropy: for a vacuous BPA, m(X) = Pl(X), so the two measures have the same functional form. The proposed method H_3 also shares its growth trend with SU, Yang and Han's measure, and TU^E_I. We consider that when the trend of a measure is consistent with the theoretical connotation of uncertainty, it can be regarded as a reasonable and effective measure. Hence, the proposed methods are all effective when the BPA is a vacuous BPA.
For this example, the proposed method H 2 gave the same results as the Deng entropy.

Example 3
Let X = {a, b, c, d} be the FOD. We give two BPAs, m_1 and m_2, which assign the same mass values 1/5, 1/5, and 3/5 but to different focal elements. For m_1, the uncertainty based on the proposed method H_2 is:

H_2(m_1) = (1/5) log_2 [(2^1 − 1)/(4/5)] + (1/5) log_2 [(2^1 − 1)/(4/5)] + (3/5) log_2 [(2^2 − 1)/1].

In addition, the uncertainties measured by the other methods are shown in Table 2. Obviously, owing to the differences in the focal elements, the uncertainty degrees of m_1 and m_2 are different even though the mass values of the two BPAs are the same, and H(m_1) should be less than H(m_2). However, the results obtained by the Deng entropy and the weighted Hartley entropy are the same for m_1 and m_2. The results obtained by the other methods are as expected, although Yager's dissonance entropy does not consider the non-specificity measure. The methods proposed in this paper obtain reasonable results and account for the total uncertainty. Therefore, when the focal elements are different but the BPA values are the same, the proposed method can effectively measure the degree of uncertainty.
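The H_2 calculation for m_1 can be checked numerically. The mass assignment below is the one implied by the displayed terms (masses 1/5, 1/5, 3/5 on {a}, {b}, {a, b}, giving Pl values 4/5, 4/5, 1):

```python
from math import log2

def h2(m):
    # H_2(m) = sum of m(S) * log2( (2^|S| - 1) / Pl(S) )
    pl = lambda S: sum(v for B, v in m.items() if B & S)
    return sum(v * log2((2 ** len(S) - 1) / pl(S)) for S, v in m.items() if v > 0)

m1 = {frozenset({'a'}): 0.2, frozenset({'b'}): 0.2, frozenset({'a', 'b'}): 0.6}

# Term-by-term expansion matching the displayed calculation
expected = 0.2 * log2(1 / 0.8) + 0.2 * log2(1 / 0.8) + 0.6 * log2(3 / 1)
assert abs(h2(m1) - expected) < 1e-9
```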

Example 4
Let X = {a, b, c, d} be a FOD. Two BPAs defined on the FOD are given as follows.
There is an intersection relationship between the propositions. The uncertainties measured by the different methods are shown in Table 3. For the body of evidence (BOE) m_1, the intersection between its two propositions {a, b} and {c, d} is empty. For BOE m_2, the intersection between its two propositions is the single element c, although the mass values of the two BPAs are the same. Based on the above analysis, the uncertainties of the two BOEs are obviously different. However, according to Table 3, the results measured by Deng entropy, AU, the weighted Hartley entropy, Yang and Han's measure, and Deng's measure are the same for m_1 and m_2. Besides, the uncertainty of BOE m_2 calculated by Yager's dissonance entropy is 0, which is clearly wrong: because the two propositions of m_2 intersect, Yager's dissonance entropy eliminates the distinction between them when calculating the plausibility function. On the contrary, the three methods we propose can all distinguish the uncertainty difference between the two BOEs. Therefore, the proposed method can effectively distinguish the uncertainty when there is an intersection relationship between propositions.

Example 5
Let X = {1, 2, · · · , 15} be a FOD with 15 elements. A BPA is defined on the FOD in which A is a variable subset of X whose number of elements changes from 1 to 14. This example is adapted from [53].
The results are shown in Tables 4 and 5 and Figure 3. Table 4 covers the cases where the number of elements in A changes from 1 to 7, and Table 5 covers the cases from 8 to 14. All the results are plotted in Figure 3. As shown in Figure 3, as the number of elements in A increases, the uncertainty calculated by Yager's dissonance entropy shows a downward trend, which is inconsistent with the connotation of uncertainty. The reason is that as the number of elements in A increases, A gradually intersects with the other propositions, and Yager's entropy does not reflect this difference. This suggests that Yager's entropy does not correctly measure the uncertainty of the evidence in this example. From a "common sense" point of view, the uncertainty of the BPA increases as the number of elements in A increases, and the other methods show an increasing trend on the whole; the corresponding values can be found in Tables 4 and 5. However, it should be noted that when A changes from {1, 2} to {1, 2, 3}, A begins to intersect with the proposition {3, 4, 5}; therefore, the increase in uncertainty should be slightly smaller than when A changes from {1} to {1, 2}. From Figure 3, it can be seen that the proposed methods reflect this change. As for the uncertainty measure of BOEs, as far as we know, there is no accepted evaluation index at present, and it is not certain that greater uncertainty is better. Nevertheless, when the trend of a measure is consistent with the theoretical connotation of uncertainty, it can be considered a reasonable and effective measurement method.

Example 6
Let X = {θ_1, θ_2} be a FOD with two elements. We give a BPA as m({θ_1}) = a, m({θ_2}) = b, m(X) = 1 − a − b, where a, b ∈ [0, 0.5]. This example is adapted from [53]. Here, we calculate the uncertainty values based on the proposed methods and some existing methods as a and b change. The results are shown in Figure 4. From Figure 4, it can be found that, for the proposed methods, the maximum uncertainty is obtained when m(X) = 1, which is consistent with the maximum entropy property. In addition, as the value of m(X) decreases, the uncertainty of the BOE decreases gradually. As for Deng entropy, according to [26], its maximum uncertainty is obtained at

m(θ_1) = (2^|{θ_1}| − 1) / [(2^|{θ_1}| − 1) + (2^|{θ_2}| − 1) + (2^|{θ_1,θ_2}| − 1)] = 0.2,  m(θ_2) = 0.2,  m(X) = 0.6,

which is consistent with Figure 4(d) and does not satisfy the maximum entropy property. For Yager's dissonance entropy, the maximum uncertainty is obtained at m(θ_1) = 0.5, m(θ_2) = 0.5, m(X) = 0; as the value of m(X) decreases, the uncertainty increases, which is counter-intuitive. Hence, in this example, Yager's dissonance entropy fails to measure the uncertainty of BOEs. Besides, the method of AU also obviously fails to measure the uncertainty of BOEs in this example.
The above numerical examples can be summarized as follows. In Example 1, since the BPA is a Bayesian mass function, the uncertainty measured by the proposed method is proportional to the number of elements n in the FOD. This can also be understood from the concept of entropy: as the number of elements of the FOD increases, the degree of "chaos" of the information also increases. The results obtained by all methods are consistent with this understanding, except the weighted Hartley entropy and the two belief-interval measures of Yang and Han and of Deng. In Example 2, a vacuous BPA is given; obviously, as the number of elements in the FOD increases, the disorder of the system increases. All methods reflect this except Yager's dissonance entropy, because it only measures dissonance, not non-specificity. In Examples 3 and 4, the two BPAs in each example are assigned the same belief values but different propositions. Because the uncertainty of a BPA depends on both its belief values and its propositions, the uncertainty is obviously different when the propositions are completely disjoint and when they partially intersect. Example 5 further illustrates this point: the degree of intersection between the propositions of the BPA changes gradually, that is, the belief values for the elements of the FOD change gradually, so the degree of confusion in the system, and hence the result of the uncertainty measure, changes accordingly. Overall, as the number of elements in A increases, the confusion that the BPA ascribes to the system should also increase. In Example 6, different belief values are assigned to the same propositions. Obviously, when m({θ_1, θ_2}) = 1, the system is in a completely unknown state, so the uncertainty should be at its maximum, which is the maximum entropy.
The proposed method can effectively measure the uncertainty of BOEs in the above examples. However, for the non-specificity measure of a BPA, the Q_S function is determined based on the maximum entropy of three existing uncertainty measures. In fact, many other entropies could be considered, such as info-entropy [32]. This is a good guide for our future research direction.

Application
In this section, feature evaluation is performed on the Iris dataset to further verify the rationality of the proposed uncertainty measure. The Iris dataset covers three types of iris plants: "Setosa", "Versicolour", and "Virginica". Sepal length (SL), sepal width (SW), petal length (PL), and petal width (PW) are taken as the four features. For each iris class, each feature of the instances follows a Gaussian distribution with a different standard deviation and mean, as shown in Table 6 and Figure 5. As shown in Figure 5, PL intuitively has the best class discriminability, which is attributed to the best separation of the Gaussian probability density functions (PDFs) of the three iris types, while the PDFs of the three iris types for SW almost overlap; thus, the class discriminability of SW is the worst.
In addition, the method proposed in [54] is utilized to quantify the discriminability of the different features, as shown below:

J = tr(S_b) / tr(S_w),

where tr(·) is the trace of a matrix, and S_w and S_b are the within-types scatter matrix and between-types scatter matrix, respectively, with

S_w = ∑_c ∑_{X∈c} (X − M_c)(X − M_c)^T,  S_b = ∑_c n_c (M_c − M)(M_c − M)^T,

where X is a feature vector of a sample, M_c and n_c are the centroid and the number of samples of type c, and M is the mean of all types' centroids. The feature evaluation procedure is as follows.

Step 1 (BPA generation). For each feature in {SL, SW, PL, PW}, we generate the BPA corresponding to each sample in the dataset according to [55].
Step 2 (Uncertainty measure of BPAs). For each feature, calculate the uncertainty of each BPA using all the above uncertainty measures.
Step 3 (Average uncertainty measure). Calculate the average uncertainty value on each feature for each method.
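The scatter-matrix criterion can be sketched compactly: since only traces are needed, J can be computed from squared distances without building full matrices. A sketch with illustrative toy data, not the Iris values:

```python
def trace_ratio(classes):
    """J = tr(S_b) / tr(S_w) for a list of per-class lists of feature vectors."""
    def mean(vecs):
        return [sum(col) / len(vecs) for col in zip(*vecs)]

    def sqdist(u, v):
        return sum((x - y) ** 2 for x, y in zip(u, v))

    centroids = [mean(c) for c in classes]
    overall = mean(centroids)  # M: mean of the class centroids, as in the text
    tr_w = sum(sqdist(x, mu) for c, mu in zip(classes, centroids) for x in c)
    tr_b = sum(len(c) * sqdist(mu, overall) for c, mu in zip(classes, centroids))
    return tr_b / tr_w

# A well-separated 1-D feature scores higher than an overlapping one.
separated = [[[0.0], [0.1]], [[5.0], [5.1]]]
overlapping = [[[0.0], [1.0]], [[0.5], [1.5]]]
assert trace_ratio(separated) > trace_ratio(overlapping)
```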
The results are shown in Table 7 and visually represented as a histogram in Figure 6. Features with smaller average uncertainty have better discriminability. It can be found from Table 7 and Figure 6 that, for the proposed methods H_1, H_2, and H_3, the average uncertainty on feature PL is the smallest, indicating that PL has the best ability to distinguish the iris types. The ranking of the discriminability of the four features is PL ≻ PW ≻ SL ≻ SW, which is consistent with the intuition obtained from Figure 5. The same ranking is obtained by Deng entropy, AM, SU, JS, Yang and Han's measure, and Deng's measure, but not by the weighted Hartley entropy and AU. Therefore, the application demonstrates the effectiveness of the proposed method.

Conclusions
In this work, we proposed a new total uncertainty measure from the perspective of the maximum entropy requirement. The properties of the proposed method are analyzed, such as non-negativity, monotonicity, maximum entropy, and so on. Besides, we give three uncertainty measure functions for bodies of evidence, and analyze the effectiveness and reasonableness of the proposed methods through several numerical examples and an application. These examples show that our methods are in general agreement with the connotation of uncertainty. Compared with Deng entropy, the proposed method can effectively measure the uncertainty of BPAs whose propositions intersect while carrying the same belief values, and it satisfies the maximum entropy property. In addition, a greater uncertainty value for a BPA does not by itself indicate a better measure. How can uncertainty measure methods in DST be evaluated more rationally? Is there a reasonable indicator system for such an evaluation? We hold that when the uncertainty trend produced by the proposed method is consistent with the theory, the method can be considered reasonable and effective. Our study provides a framework for further studies assessing the performance characteristics of uncertainty functions. Our results are encouraging and should be validated in application areas such as decision-making, fault diagnosis, target recognition, and many others; we will pursue these applications in depth in future work. Beyond that, Parker and Jeynes showed from a maximum entropy argument that the entropy of the supermassive black hole at the centre of the Milky Way can account for the geometrical stability of the galaxy [32]. We believe this is a good guide for our future work on uncertainty measures.

Conflicts of Interest:
The authors declare no conflict of interest for publishing in this journal.

Abbreviations
The following abbreviations are used in this manuscript:
PT    probability theory
DST   Dempster-Shafer theory
FOD   frame of discernment
BPA   basic probability assignment
BOE   body of evidence