Logarithmic Negation of Basic Probability Assignment and Its Application in Target Recognition

The negation of a probability distribution provides a new perspective from which to obtain information. Dempster–Shafer (D–S) evidence theory, as an extension of possibility theory, is widely used in decision-making-level fusion. However, how to reasonably construct the negation of basic probability assignment (BPA) in D–S evidence theory is an open issue. This paper proposes a new negation of BPA, logarithmic negation. It solves the shortcoming of Yin's negation that maximal entropy cannot be obtained when there are only two focal elements in the BPA. At the same time, the logarithmic negation of BPA inherits the good properties of the negation of probability, such as order reversal, involution, convergence, degeneration, and maximal entropy. Logarithmic negation degenerates into Gao's negation when the values of the elements all approach 0. In addition, the data fusion method based on logarithmic negation achieves a higher belief value of the correct target in target recognition applications.

Events have two sides, and in D-S evidence theory we usually describe and express information from the positive side. In fact, the opposite of an event can provide a new perspective and help us obtain more information in order to solve problems better. For example, if the highest probability of rain the next day is predicted to be 90%, it is still hard to make a decision, as the uncertainty cannot be resolved. Consider the negation of this situation: if the highest probability of no rain the next day is predicted to be 10%, decision making is relatively easy, and the uncertainty is small [41]. On the basis of the above discussion, studying the negation of BPA [42][43][44][45] is of great significance for dealing with uncertainty. How to reasonably construct the negation of BPA, however, is an open issue. Luo [41] proposed a matrix method of BPA negation in which BPAs were represented as vectors and negation was realized with matrix operators. This method interprets the matrix operators well, but its calculation process is complicated. Gao [42] proposed a new negation that can be seen as arithmetic negation; it better presents the connection between changes in the uncertainty and entropy of random sets. Yin's [44] negation of BPA can measure the uncertainty of the BPA well. However, its maximal entropy cannot be obtained when there are only two focal elements in the BPA. In this paper, we propose a novel negation of BPA based on the logarithmic function, named logarithmic negation, which can be seen as geometric negation and solves the shortcoming of Yin's negation. At the same time, the logarithmic negation of BPA inherits the good properties of the negation of probability, such as order reversal, involution, convergence, degeneration, and maximal entropy. Logarithmic negation degenerates into Gao's negation when the values of the elements all approach 0.
In addition, the data fusion method based on logarithmic negation had a higher belief value of the correct target in the application of target recognition.
The remainder of this paper is organized as follows. The preliminaries are introduced in Section 2. In Section 3, the logarithmic negation is proposed, and its properties are analyzed and proved. In Section 4, some numerical examples are used to test the feasibility of logarithmic negation. In Section 5, two target recognition applications are used to demonstrate the effectiveness of the data fusion approach based on logarithmic negation. Conclusions are given in Section 6.

Frame of Discernment
Θ is assumed to be a set of mutually exclusive and exhaustive elements F_i (i = 1, 2, 3, ..., N), and it can be defined as [46]:

Θ = {F_1, F_2, F_3, ..., F_N},   (1)

where Θ is the frame of discernment (FOD), and F_i is a single-subset proposition. We define 2^Θ as a power set that contains 2^N elements and can be described as follows [47]:

2^Θ = {∅, {F_1}, ..., {F_N}, {F_1, F_2}, ..., {F_1, F_2, ..., F_i}, ..., Θ},   (2)

where ∅ is the empty set in Equation (2).

Basic Probability Assignment
Basic probability assignment (BPA) function m, also called a mass function, is defined as a mapping of the power set 2^Θ to [0, 1] [48]:

m: 2^Θ → [0, 1],

which satisfies

m(∅) = 0 and Σ_{A∈2^Θ} m(A) = 1.

If m(A) > 0, A is called a focal element or subset. The mass function of the empty set, m(∅), is equal to 0 in classical D-S evidence theory.
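As an illustration of the definition above, a BPA can be modeled as a mapping from subsets (here, Python frozensets) to masses; the helper below checks the two classical conditions. The representation and names are illustrative, not taken from the paper.

```python
# Sketch: a BPA over the frame of discernment Theta = {a, b},
# represented as a dict from frozenset (subset of Theta) to mass.

def is_valid_bpa(m, tol=1e-9):
    """Check the two classical D-S conditions: m(empty) = 0 and masses sum to 1."""
    if m.get(frozenset(), 0.0) != 0.0:
        return False
    return abs(sum(m.values()) - 1.0) < tol

# Example BPA on the power set of {a, b}; the empty set is omitted (mass 0).
m = {
    frozenset({"a"}): 0.6,
    frozenset({"b"}): 0.1,
    frozenset({"a", "b"}): 0.3,
}
```

Here m({a, b}) = 0.3 is mass assigned to the compound proposition, which a plain probability distribution cannot express.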

Dempster's Combination Rule
In D-S evidence theory, two BPAs m_1 and m_2 can be combined with Dempster's combination rule, defined as follows [49]:

m(A) = (m_1 ⊕ m_2)(A) = (1/(1 − k)) Σ_{B∩C=A} m_1(B) m_2(C),   (3)

in which

k = Σ_{B∩C=∅} m_1(B) m_2(C),

where ⊕ represents Dempster's combination rule, and k is the conflict coefficient.
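Dempster's rule can be sketched as follows, with BPAs as dicts from frozenset to mass (an illustrative representation, not the paper's code):

```python
# Sketch of Dempster's combination rule for two BPAs.

def dempster_combine(m1, m2):
    """Combine two BPAs; raises if the evidence is totally conflicting (k = 1)."""
    combined = {}
    k = 0.0  # conflict coefficient: total mass on empty intersections
    for b, mb in m1.items():
        for c, mc in m2.items():
            inter = b & c
            if inter:
                combined[inter] = combined.get(inter, 0.0) + mb * mc
            else:
                k += mb * mc
    if k >= 1.0:
        raise ValueError("total conflict: k = 1")
    # Normalize by 1 - k so the result is again a BPA.
    return {a: v / (1.0 - k) for a, v in combined.items()}
```

For example, combining m_1 = {a: 0.6, b: 0.4} with m_2 = {a: 0.5, b: 0.5} gives k = 0.5 and the normalized result {a: 0.6, b: 0.4}.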

Shannon Entropy
Shannon entropy is an uncertainty measure of information volume and is denoted by [50]:

H = −Σ_{i=1}^{N} p_i log_2 p_i,

where p_i is the probability of state i.
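A minimal sketch of Shannon entropy in bits:

```python
import math

def shannon_entropy(p):
    """H = -sum p_i * log2 p_i; zero-probability states contribute 0."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)
```

A uniform two-state distribution has H = 1 bit, and a deterministic distribution has H = 0.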

Deng Entropy
Deng entropy is defined as follows [51]:

E_d = −Σ_{A⊆Θ} m(A) log_2 (m(A) / (2^{|A|} − 1)),   (6)

where |A| is the cardinality of proposition A.
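Deng entropy can be sketched directly from Equation (6), again with BPAs as dicts from frozenset to mass (an illustrative representation):

```python
import math

def deng_entropy(m):
    """E_d = -sum m(A) * log2( m(A) / (2^|A| - 1) )."""
    return -sum(v * math.log2(v / (2 ** len(a) - 1))
                for a, v in m.items() if v > 0)
```

When every focal element is a singleton, |A| = 1 and 2^|A| − 1 = 1, so Deng entropy degenerates to Shannon entropy; compound focal elements contribute extra uncertainty through the 2^|A| − 1 term.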

Yin's Negation of BPA
Yin's negation of BPA is defined as follows [44]:

m̄(e_i) = (1 − m(e_i)) / (n − 1),

where n is the number of focal elements, and m(e_i) is the mass of focal element e_i.
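Taking Yin's negation in the form m̄(e_i) = (1 − m(e_i))/(n − 1) over the n focal elements, a minimal sketch (illustrative data structures) is:

```python
# Sketch of Yin's negation: only focal elements (mass > 0) take part.

def yin_negation(m):
    """m_bar(e_i) = (1 - m(e_i)) / (n - 1), n = number of focal elements."""
    focal = {a: v for a, v in m.items() if v > 0}
    n = len(focal)
    if n < 2:
        raise ValueError("Yin's negation is undefined for fewer than two focal elements")
    return {a: (1 - v) / (n - 1) for a, v in focal.items()}
```

For two focal elements with masses 0.7 and 0.3 the negation simply swaps them, so applying it twice returns the original BPA; this is exactly the cyclic, non-converging behavior discussed later.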

Gao's Negation of BPA
Gao's negation of BPA is defined as follows [42]:

m̄(A_i) = (1 − m(A_i)) / (2^n − 2),

where n is the number of elements in the FOD, and m(A_i) is the mass function, with A_i ranging over the 2^n − 1 non-empty subsets of the FOD.
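Taking Gao's negation in the form m̄(A_i) = (1 − m(A_i))/(2^n − 2) over all non-empty subsets, a sketch (illustrative representation) is:

```python
from itertools import combinations

# Sketch of Gao's negation: every non-empty subset of the FOD takes part,
# including those with zero mass in the original BPA.

def gao_negation(m, theta):
    """m_bar(A_i) = (1 - m(A_i)) / (2^n - 2), A_i over non-empty subsets of theta."""
    n = len(theta)
    subsets = [frozenset(c) for r in range(1, n + 1)
               for c in combinations(sorted(theta), r)]
    return {a: (1 - m.get(a, 0.0)) / (2 ** n - 2) for a in subsets}
```

Note that the result is always a valid BPA: the numerators sum to (2^n − 1) − 1 = 2^n − 2, and the uniform BPA 1/(2^n − 1) is a fixed point.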

Proposed Negation
D-S evidence theory, with its more accurate expression of information and better processing ability, is an extension of possibility theory. The negation of BPA shows the other side of the information and offers a new perspective on processing information. In this section, a new negation of BPA, named logarithmic negation, is proposed.
Assuming that the FOD Θ has N elements, its power set 2^Θ is described as:

2^Θ = {∅, A_1, A_2, ..., A_{2^N − 1}}.

Since the negation of BPA is considered under classical D-S evidence theory, the mass function m(∅) = 0 is not involved. The power set can therefore be expressed as:

{A_1, A_2, ..., A_{2^N − 1}},

which satisfies

Σ_{i=1}^{2^N − 1} m(A_i) = 1.

The definition of logarithmic negation is as follows.
m̄(A_i) = K (1 − ln(1 + m(A_i))), i = 1, 2, ..., 2^N − 1,   (12)

where N is the number of elements in the FOD, and K is a normalization constant for every given BPA. As we know, the negation must itself be a BPA:

Σ_{i=1}^{2^N − 1} m̄(A_i) = 1.

Then,

K Σ_{i=1}^{2^N − 1} (1 − ln(1 + m(A_i))) = 1.

We simplify and establish

K = 1 / ((2^N − 1) − Σ_{i=1}^{2^N − 1} ln(1 + m(A_i))).

It can be verified that applying the negation twice does not, in general, recover the original BPA; that is, the logarithmic negation does not satisfy involution.
Theorem 3 (Convergence). Let m_t denote the BPA after t iterations of logarithmic negation, so that m_{t+1} represents the next moment after t. Since logarithmic negation is noninvolutionary, the value of the negation constantly changes from iteration to iteration. The iteration reaches a fixed point if and only if m_t(A_1) = m_t(A_2) = ... = m_t(A_{2^N − 1}) = 1/(2^N − 1), in which case we can calculate the result of the next iteration as m_{t+1}(A_i) = 1/(2^N − 1). So, the negative value is fixed from the tth iteration; namely, the iterated negation converges into 1/(2^N − 1), and the belief distribution becomes uniform.
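As a numerical check of the convergence property, the following sketch assumes a logarithmic negation of the form m̄(A_i) = K (1 − ln(1 + m(A_i))) with K the normalization constant; the BPA representation and helper names are illustrative. Under that assumption, iterating the negation drives every mass toward the uniform value 1/(2^N − 1):

```python
import math
from itertools import combinations

def log_negation(m, theta):
    """Assumed form: m_bar(A_i) = K * (1 - ln(1 + m(A_i))) over non-empty subsets."""
    n = len(theta)
    subsets = [frozenset(c) for r in range(1, n + 1)
               for c in combinations(sorted(theta), r)]
    raw = {a: 1 - math.log(1 + m.get(a, 0.0)) for a in subsets}
    K = 1.0 / sum(raw.values())  # normalization constant for this BPA
    return {a: K * v for a, v in raw.items()}

# Iterate: the belief distribution converges toward uniform 1/(2^N - 1) = 1/3.
theta = {"a", "b"}
m = {frozenset({"a"}): 0.7, frozenset({"b"}): 0.3}
for _ in range(30):
    m = log_negation(m, theta)
```

One iteration already shows order reversal (the largest original mass receives the smallest negated mass), and repeated iteration flattens the distribution, matching Theorem 3.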

Theorem 4. Logarithmic negation degenerates into Gao's negation [42] when m(A_1), m(A_2), ..., m(A_{2^N − 1}) → 0.

Proof. According to the commonly used equivalent infinitesimal relation ln(1 + x) ≈ x as x → 0, when m(A_1), m(A_2), ..., m(A_{2^N − 1}) → 0 we simplify and establish the following:

m̄(A_i) = K (1 − ln(1 + m(A_i))) ≈ (1 − m(A_i)) / Σ_{j=1}^{2^N − 1} (1 − m(A_j)) = (1 − m(A_i)) / (2^N − 2),

which is exactly Gao's negation.
Theorem 5. After each logarithmic negation operation, the entropy of the information keeps increasing until it reaches its maximum.

Proof.
To further simplify the calculation, we can use the approximation from Theorem 4, m̄(A_i) ≈ (1 − m(A_i))/(2^N − 2). Through the Lagrangian multiplier method, we assume that

F = −Σ_{i=1}^{2^N − 1} m̄(A_i) ln m̄(A_i) + λ (Σ_{i=1}^{2^N − 1} m̄(A_i) − 1).

We then take the partial derivatives:

∂F/∂m̄(A_i) = −ln m̄(A_i) − 1 + λ = 0, i = 1, 2, ..., 2^N − 1,

so that m̄(A_1) = m̄(A_2) = ... = m̄(A_{2^N − 1}). According to the derivation in [42], entropy therefore increases with each negation operation until the belief distribution of the logarithmic negation follows the uniform distribution; at that point H(m̄) = H(m), and entropy reaches its maximum and cannot increase anymore.
Note: The more elements there are in the FOD, the faster the convergence is. The more elements there are in the FOD, the more uncertain information the BPA contains, and the more elements are involved in the logarithmic negation. From the perspective of increasing entropy, a larger FOD means that more elements consume information under the same conditions. The overall rate of information consumption is then faster, entropy reaches its maximum more quickly, and the belief distribution converges quickly. Furthermore, the larger the number of elements is, the smaller the value 1/(2^N − 1) into which the logarithmic negation converges. Thus, belief redistribution by logarithmic negation drives the belief distribution toward the uniform distribution. This is illustrated by numerical Examples 2-4.
Logarithmic negation measures the uncertainty of evidence by reassigning belief. From the point of view of information entropy, the essence of logarithmic negation is a process of consuming information, and the entropy converges into maximal entropy. Detailed numerical examples are given in the next section to help in understanding the concept.

Numerical Examples
In this section, several numerical examples are used to illustrate the theorems.
Example 1 considers the special case where there is only one focal element in the closed world. Yin's negation [44] cannot handle this case: since Yin's negation can only construct negations from elements with belief values greater than 0, it inevitably causes a loss of information in the FOD and produces unreasonable results.
The logarithmic negation of this BPA assigns a positive belief value to every non-empty subset, so the proposed negation obtains intuitive results. Although the BPAs m(b) and m(a, b) were equal to 0, they contained important information, namely that events b and {a, b} were unlikely to happen. The proposed method captures this hidden information and processes it, rendering the expression of information more complete and effective.

Table 1 shows the BPA and entropy after Yin's negation iterations, and Figure 1 visualizes them. When the BPA contained only two focal elements, the negation was reversible: applying Yin's negation twice returned the original value of m(a). No matter how many negation iterations were performed, the belief values of m(a) and m(b) cyclically alternated between 0.7 and 0.3 and never converged to a fixed value. Correspondingly, entropy no longer increased and could not reach its maximum.

The above shows that m̄(a) < m̄(b) when m(a) > m(b), which is consistent with order reversal. In addition, applying the logarithmic negation twice does not return the original value of m(a), which verifies its irreversibility. Table 2 shows the BPA and entropy after logarithmic negation iterations, and Figure 2 visualizes them. The belief distribution finally converged into the uniform distribution, that is, the negative values all became equal to 1/(2^N − 1). Entropy kept increasing to its maximum. Theorems 3 and 5 were thus fully verified, and the proposed negation overcomes the shortcomings of Yin's negation well.

Table 3 shows the BPA and entropy after logarithmic negation iterations in Example 3, and Figure 3 visualizes them. The belief distribution gradually converged as the number of iterations increased, and the BPA finally converged into the uniform distribution. At the same time, maximal entropy was obtained. We can understand this phenomenon in terms of information consumption: each iteration of negation is a process of consuming information, and the consumption of information means an increase in entropy. When maximal entropy was reached, there was no more information to consume in the BPA, and entropy stopped increasing. Theorems 3 and 5 of logarithmic negation were verified again.

Table 4 shows the BPA and entropy after logarithmic negation iterations in Example 4. The FOD contained two elements in Example 2, three elements in Example 3, and four elements in Example 4. Tables 2-4 show that the numbers of iterations required for the convergence of the logarithmic negation were 8, 5, and 4, respectively. In other words, the convergence speed of logarithmic negation accelerated significantly as the scale of the FOD grew, meaning that the BPA needed fewer negation iterations to reach maximal entropy. This interesting point is understood from the perspective of increasing entropy: a larger FOD indicates that more elements consume information under the same conditions, so the overall rate of information consumption is faster and maximal entropy is achieved sooner.

Application
In this section, two target recognition applications are used to verify the effectiveness of the data fusion approach based on the logarithmic negation. On the basis of the proposed negation, we adopted Li's data fusion method [45] to run the target recognition application. The detailed steps of Li's method are described as follows.
Step 1: construct the logarithmic negation m̄_n of each piece of evidence m_n by using Equation (12), where n = 1, 2, ..., k and k is the number of pieces of evidence.
Step 2: calculate the Deng entropy of the initial evidence and its negation by using Equation (6).
Step 3: calculate the credibility of each piece of evidence by using Equation (16), where |E_d(m_n) − E_d(m̄_n)| represents the absolute value of the entropy difference between the initial evidence and its negation.
Step 4: calculate the weight w_n of each piece of evidence by using Equation (17).
Step 5: the weighted average evidence m̃ is calculated as follows:

m̃(A) = Σ_{n=1}^{k} w_n m_n(A).

Step 6: Dempster's combination rule is used k − 1 times to combine the weighted average evidence according to Equation (3). Then, the final combination result is calculated as follows:

m(A) = (m̃ ⊕ m̃ ⊕ ... ⊕ m̃)(A), with k − 1 applications of ⊕.
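Steps 5 and 6 above can be sketched as follows; the weights from Steps 3-4 (Equations (16)-(17)) are taken as given inputs rather than recomputed, and the data structures are illustrative:

```python
# Sketch of Steps 5-6 of Li's fusion method: weighted averaging of the
# evidence, then k-1 applications of Dempster's rule to the average.

def weighted_average(bpas, weights):
    """m_tilde(A) = sum_n w_n * m_n(A)."""
    avg = {}
    for m, w in zip(bpas, weights):
        for a, v in m.items():
            avg[a] = avg.get(a, 0.0) + w * v
    return avg

def dempster_combine(m1, m2):
    combined, k = {}, 0.0
    for b, mb in m1.items():
        for c, mc in m2.items():
            inter = b & c
            if inter:
                combined[inter] = combined.get(inter, 0.0) + mb * mc
            else:
                k += mb * mc
    return {a: v / (1.0 - k) for a, v in combined.items()}

def fuse(bpas, weights):
    """Combine the weighted average with itself k-1 times (k = len(bpas))."""
    avg = weighted_average(bpas, weights)
    result = avg
    for _ in range(len(bpas) - 1):
        result = dempster_combine(result, avg)
    return result
```

With two identical pieces of evidence and equal weights, the weighted average equals the original BPA and one Dempster combination sharpens the dominant hypothesis, which is the intended reinforcing behavior.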

Application 1
In a multisensor automatic target recognition system, it is assumed that the actual target is A. The collected sensor reports from the system, modelled as BPAs, are shown in Table 5. The application is cited from [45]. Step 1: construct the logarithmic negationm n of each piece of evidence by using Equation (12). Step 2: calculate the Deng entropy of the initial evidence and its negation by using Equation (6).
Step 4: calculate the weight of each piece of evidence by using Equation (17). Step 6: Dempster's combination rule is used 3 times to combine the weighted average evidence according to Equation (3). Then, the final combination result is calculated. Evidence m_2 is highly conflicting with the other pieces of evidence. As shown in Table 6 and Figure 4, Dempster's method did not identify the correct target A. The operation of a one-vote veto produced the counterintuitive result that the belief value of target A was always 0. However, the other methods all overcame the impact of the conflicting evidence and obtained reasonable decision-making results. The experimental results show that the belief values of the correct target A for the three other methods were 0.6027, 0.7773, and 0.8491; the degree of belief was relatively low. The proposed data fusion method achieved better performance in combining conflicting pieces of evidence, as it had the highest belief value (0.9653) for the correct target A. Compared with the three other methods, the belief value of the correct target was improved by 36.26%, 17.8%, and 11.62%, respectively. The main reason is that the logarithmic negation extracts more useful information in the process of conflict handling from the perspective of geometric negation. The accuracy of the fusion results was improved, and the target type could be identified more accurately. Furthermore, since the entropy of conflicting evidence is usually low, conflicting and normal evidence can be distinguished with Deng entropy: the entropy difference between the conflicting evidence and its negation was larger than that of normal evidence. Therefore, the proposed method assigned a lower credibility value to the conflicting evidence, and more reasonable results were obtained. This application demonstrates the validity of the proposed method.

Application 2
Consider a multisensor target recognition problem associated with sensor reports collected from five different types of sensors {S_1, S_2, S_3, S_4, S_5}, each of which is allocated to a different position in order to monitor the objectives. The FOD consisting of three types of target is given by Θ = {F_1, F_2, F_3}. These sensors collect the target information and generate reports, which are modeled as BPAs denoted by m_1(·), m_2(·), m_3(·), m_4(·), and m_5(·) in Table 7. The application is cited from [54]. The combined results are shown in Figure 5 and Table 8. Even though the second piece of evidence was in major conflict with the other evidence, all the methods successfully identified the correct target type, which was in line with our intuition. The belief values of the correct target for the four other methods were 0.8657, 0.9885, 0.9888, and 0.9892. Although the overall belief values were relatively high, the accuracy of the fusion results still had some room for improvement. The proposed method had the highest belief value (0.9897) for the correct target. The belief value of the correct target with the proposed method was only improved by 0.05% compared with Wang's method [55], but even such a margin is of significance to a target recognition system operating at such high belief values. The main reason is that the proposed method takes more uncertainty information into account in the process of logarithmic negation of the BPA; that is, it makes good use of the information of elements whose belief value equals 0. The operation of logarithmic negation consumed a large amount of information in the conflicting evidence, which led to a rapid increase in the entropy of the conflicting evidence's negation. Then, the entropy difference between the conflicting evidence and its negation grew, and the conflicting evidence was assigned a smaller weight value. Lastly, the conflicting evidence was handled effectively.
In addition, the combination of Deng entropy and logarithmic negation improved the accuracy of the fusion results and reduced the loss of information, which also ensured the ability to deal with conflicting evidence. The validity of the proposed method was verified again.

Conclusions
How to reasonably construct the negation of BPA is an open issue. To address this problem, a novel negation of BPA named logarithmic negation was proposed; it solves the shortcoming of Yin's negation that maximal entropy cannot be obtained when there are only two focal elements in the BPA. At the same time, the logarithmic negation of BPA inherits the good properties of the negation of probability, such as order reversal, involution, convergence, degeneration, and maximal entropy. Logarithmic negation degenerates into Gao's negation when the values of the elements all approach 0. The operation of logarithmic negation causes an increase in entropy, and the convergence speed grows with the number of elements in the FOD. Some numerical examples were presented for analysis and verification. Lastly, the data fusion method based on logarithmic negation achieved the highest belief value of the correct target in two target recognition applications.