Search Results (18)

Search Parameters:
Keywords = Dempster–Shafer evidence theory (DST)

33 pages, 2537 KB  
Article
Efficient Deep Wavelet Gaussian Markov Dempster–Shafer Network-Based Spectrum Sensing at Very Low SNR in Cognitive Radio Networks
by Sunil Jatti and Anshul Tyagi
Sensors 2025, 25(23), 7361; https://doi.org/10.3390/s25237361 - 3 Dec 2025
Viewed by 516
Abstract
Cognitive radio networks (CRNs) rely heavily on spectrum sensing to detect primary user (PU) activity, yet detection at low signal-to-noise ratios (SNRs) remains a major challenge. Hence, a novel “Deep Wavelet Cyclostationary Independent Gaussian Markov Fourier Transform Dempster–Shafer Network” is proposed. When the signal waveform is submerged within the noise envelope and residual correlation emerges in the noise, the white Gaussian assumption is violated, leading to misidentification of signal presence. To resolve this, the Adaptive Continuous Wavelet Cyclostationary Denoising Autoencoder (ACWC-DAE) is introduced, in which the Adaptive Continuous Wavelet Transform (ACWT), Cyclostationary Independent Component Analysis Detection (CICAD), and Denoising Autoencoder (DAE) are embedded in the first hidden layer of a Deep Q-Network (DQN). It restores the bursty signal structure, separates the structured noise, and reconstructs clean signals, leading to accurate signal detection. Additionally, bursty and fading-affected primary user signals become fragmented and dip below the noise floor, causing conventional fixed-window sensing to fail to accumulate reliable evidence for detection under intermittent and low-duty-cycle conditions. Therefore, the Adaptive Gaussian Short-Time Fourier Transform Dempster–Shafer Model (AGSTFT-DSM) is incorporated into the second DQN layer: Adaptive Gaussian Mixture Hidden Markov Modeling (AGMHMM) tracks the hidden activity states, the Adaptive Short-Time Fourier Transform (ASFT) resolves brief signal bursts, and Dempster–Shafer theory (DST) fuses uncertain evidence to infer occupancy, thereby enabling accurate detection of the user signal. The proposed model achieves a low error of 0.12, a detection time of 30.10 ms, and a high accuracy of 97.8%, revealing that adaptive wavelet denoising, combined with uncertainty-aware evidence fusion, supports reliable spectrum detection under low-SNR conditions where existing models fail. Full article
(This article belongs to the Section Sensor Networks)

22 pages, 311 KB  
Article
A Dempster–Shafer, Fusion-Based Approach for Malware Detection
by Patricio Galdames, Simon Yusuf Enoch, Claudio Gutiérrez-Soto and Marco A. Palomino
Mathematics 2025, 13(16), 2677; https://doi.org/10.3390/math13162677 - 20 Aug 2025
Cited by 1 | Viewed by 1607
Abstract
Dempster–Shafer theory (DST), a generalization of probability theory, is well suited for managing uncertainty and integrating information from diverse sources. In recent years, DST has gained attention in cybersecurity research. However, despite the growing interest, there is still a lack of systematic comparisons of DST implementation strategies for malware detection. In this paper, we present a comprehensive evaluation of DST-based ensemble mechanisms for malware detection, addressing critical methodological questions regarding optimal mass function construction and combination rules. Our systematic analysis was tested with 630,504 benign and malicious samples collected from four public datasets (BODMAS, DREBIN, AndroZoo, and BMPD) to train malware detection models. We explored three approaches for converting classifier outputs into probability mass functions: global confidence using fixed values derived from performance metrics, class-specific confidence with separate values for each class, and computationally optimized confidence values. The results establish that all approaches yield comparable performance, although fixed values offer significant computational and interpretability advantages. Additionally, we introduced a novel linear combination rule for evidence fusion, which delivers results on par with conventional DST rules while enhancing interpretability. Our experiments show consistently low false positive rates—ranging from 0.16% to 3.19%. This comprehensive study provides the first systematic methodology comparison for DST-based malware detection, establishing evidence-based guidelines for practitioners on optimal implementation strategies. Full article
(This article belongs to the Special Issue Analytical Frameworks and Methods for Cybersecurity, 2nd Edition)
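
For orientation, the sketch below shows the standard Dempster rule of combination applied to two hypothetical detector mass functions over the frame {malware, benign}; the mass-construction strategies and the linear combination rule evaluated in the paper are not reproduced here, and all numbers are made up.

```python
# Minimal sketch of Dempster's rule of combination (standard DST, not the
# paper's linear rule). Focal elements are frozensets; THETA is the full frame.
from itertools import product

THETA = frozenset({"malware", "benign"})

def dempster_combine(m1: dict, m2: dict) -> dict:
    """Combine two mass functions defined on subsets of THETA."""
    combined, conflict = {}, 0.0
    for (a, p), (b, q) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + p * q
        else:
            conflict += p * q  # mass falling on the empty set
    if conflict >= 1.0:
        raise ValueError("Total conflict: Dempster's rule is undefined.")
    return {a: v / (1.0 - conflict) for a, v in combined.items()}

# Hypothetical masses from two detectors; part of each detector's confidence
# is left on THETA to represent ignorance (one possible construction).
m_det1 = {frozenset({"malware"}): 0.7, frozenset({"benign"}): 0.1, THETA: 0.2}
m_det2 = {frozenset({"malware"}): 0.6, frozenset({"benign"}): 0.3, THETA: 0.1}
print(dempster_combine(m_det1, m_det2))
```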

19 pages, 3930 KB  
Article
Enhancing Probabilistic Solar PV Forecasting: Integrating the NB-DST Method with Deterministic Models
by Tawsif Ahmad, Ning Zhou, Ziang Zhang and Wenyuan Tang
Energies 2024, 17(10), 2392; https://doi.org/10.3390/en17102392 - 16 May 2024
Cited by 7 | Viewed by 2442
Abstract
Accurate quantification of uncertainty in solar photovoltaic (PV) generation forecasts is imperative for the efficient and reliable operation of the power grid. In this paper, a data-driven non-parametric probabilistic method based on the Naïve Bayes (NB) classification algorithm and Dempster–Shafer theory (DST) of evidence is proposed for day-ahead probabilistic PV power forecasting. This NB-DST method extends traditional deterministic solar PV forecasting methods by quantifying the uncertainty of their forecasts through estimation of the cumulative distribution functions (CDFs) of their forecast errors and forecast variables. The statistical performance of this method is compared with the analog ensemble method and the persistence ensemble method under three different weather conditions using real-world data. The study results reveal that the proposed NB-DST method, coupled with an artificial neural network model, outperforms the other methods: its estimated CDFs have lower spread and higher reliability, yielding sharper and more accurate probabilistic forecasts. Full article
(This article belongs to the Section A2: Solar Energy and Photovoltaic Systems)
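
As a rough, hedged illustration of the non-parametric idea, the snippet below builds an empirical CDF from a vector of made-up day-ahead forecast errors; the NB classification step and the DST evidence fusion used in the paper are not shown.

```python
# Hypothetical illustration: empirical CDF of forecast errors, the kind of
# non-parametric distribution a probabilistic forecasting method estimates.
import numpy as np

errors_kw = np.array([-3.1, -1.2, -0.4, 0.0, 0.6, 1.8, 2.5, 4.0])  # made-up errors

def empirical_cdf(samples: np.ndarray):
    """Return sorted support points and the empirical CDF values at them."""
    x = np.sort(samples)
    F = np.arange(1, len(x) + 1) / len(x)
    return x, F

def prob_error_below(samples: np.ndarray, threshold: float) -> float:
    """P(error <= threshold) under the empirical distribution."""
    return float(np.mean(samples <= threshold))

x, F = empirical_cdf(errors_kw)
print(prob_error_below(errors_kw, 2.0))  # e.g. probability the error is <= 2 kW
```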

21 pages, 800 KB  
Article
Probabilistic Hesitant Fuzzy Evidence Theory and Its Application in Capability Evaluation of a Satellite Communication System
by Jiahuan Liu, Ping Jian, Desheng Liu and Wei Xiong
Entropy 2024, 26(1), 94; https://doi.org/10.3390/e26010094 - 22 Jan 2024
Cited by 6 | Viewed by 2145
Abstract
Evaluating the capabilities of a satellite communication system (SCS) is challenging due to its complexity and ambiguity. Uncertain situations are hard to analyze accurately, which makes it difficult for experts to determine appropriate evaluation values. To address this problem, this paper proposes an innovative approach that extends Dempster–Shafer evidence theory (DST) to a probabilistic hesitant fuzzy evidence theory (PHFET). The proposed approach introduces the concept of the probabilistic hesitant fuzzy basic probability assignment (PHFBPA) to measure the degree of support for propositions, along with a combination rule and a decision approach. Two methods are developed to generate PHFBPAs, based on multi-classifier and distance techniques, respectively. In order to improve the consistency of evidence, discounting factors are proposed using an entropy measure and the Jousselme distance of PHFBPAs. In addition, a model for evaluating the degree of satisfaction of SCS capability requirements based on PHFET is presented. Experimental classification and evaluation of SCS capability requirements are performed to demonstrate the effectiveness and stability of the PHFET method. By employing the DST framework and probabilistic hesitant fuzzy sets, PHFET provides a compelling solution for handling ambiguous data in multi-source information fusion, thereby improving the evaluation of SCS capabilities. Full article
(This article belongs to the Section Information Theory, Probability and Statistics)
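
Since the abstract builds its discounting factors on an entropy measure and the Jousselme distance, the classical (non-hesitant) Jousselme distance between two BPAs is sketched below for reference; the probabilistic hesitant fuzzy extension proposed in the paper is not reproduced, and the example masses are invented.

```python
# Sketch of the classical Jousselme distance between two BPAs (crisp DST case):
#   d(m1, m2) = sqrt( 0.5 * (m1 - m2)^T D (m1 - m2) ),  D[A, B] = |A ∩ B| / |A ∪ B|
import numpy as np

def jousselme_distance(m1: dict, m2: dict) -> float:
    focal = sorted(set(m1) | set(m2), key=lambda s: (len(s), sorted(s)))
    v1 = np.array([m1.get(A, 0.0) for A in focal])
    v2 = np.array([m2.get(A, 0.0) for A in focal])
    # Jaccard-type similarity matrix between focal elements
    D = np.array([[len(A & B) / len(A | B) for B in focal] for A in focal])
    diff = v1 - v2
    return float(np.sqrt(0.5 * diff @ D @ diff))

A, B = frozenset({"a"}), frozenset({"b"})
m1 = {A: 0.8, frozenset({"a", "b"}): 0.2}
m2 = {B: 0.7, frozenset({"a", "b"}): 0.3}
print(jousselme_distance(m1, m2))
```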

18 pages, 1300 KB  
Article
A Novel Multi-Source Domain Adaptation Method with Dempster–Shafer Evidence Theory for Cross-Domain Classification
by Min Huang and Chang Zhang
Mathematics 2022, 10(15), 2797; https://doi.org/10.3390/math10152797 - 6 Aug 2022
Cited by 4 | Viewed by 3265
Abstract
In this era of big data, Multi-source Domain Adaptation (MDA) has become increasingly popular as a way to make full use of source data collected from several different but related domains. Although multiple source domains provide much information, handling the domain shifts becomes more challenging, especially when learning a common domain-invariant representation for all domains. Moreover, it is counter-intuitive to treat multiple source domains equally, as most existing MDA algorithms do. Therefore, the domain-specific distribution for each source–target domain pair is aligned separately. Nevertheless, it is hard to combine adaptation outputs from different domain-specific classifiers effectively because of ambiguity on the category boundary. Subjective Logic (SL) is introduced to measure the uncertainty (credibility) of each domain-specific classifier, so that MDA can be bridged with Dempster–Shafer evidence theory (DST). Owing to its strength in information fusion, DST is utilized to reduce the category boundary ambiguity and to output reasonable decisions by combining the adaptation outputs according to their uncertainty. Finally, extensive comparative experiments on three popular benchmark datasets for cross-domain image classification are conducted to evaluate the performance of the proposed method from various aspects. Full article

19 pages, 873 KB  
Article
An Evidential Software Risk Evaluation Model
by Xingyuan Chen and Yong Deng
Mathematics 2022, 10(13), 2325; https://doi.org/10.3390/math10132325 - 2 Jul 2022
Cited by 48 | Viewed by 3901
Abstract
Software risk management is an important factor in ensuring software quality. Therefore, software risk assessment has become a significant and challenging research area. The aim of this study is to establish a data-driven software risk assessment model named DDERM. In the proposed model, experts’ risk assessments of probability and severity can be transformed into basic probability assignments (BPAs). Deng entropy was used to measure the uncertainty of the evaluation and to calculate the criteria weights given by experts. In addition, the adjusted BPAs were fused using the rules of Dempster–Shafer evidence theory (DST). Finally, a risk matrix was used to get the risk priority. A case application demonstrates the effectiveness of the proposed method. The proposed risk modeling framework is a novel approach that provides a rational assessment structure for imprecision in software risk and is applicable to solving similar risk management problems in other domains. Full article
(This article belongs to the Special Issue Probability and Statistics in Quality and Reliability Engineering)

37 pages, 6640 KB  
Article
Decision Theory versus Conventional Statistics for Personalized Therapy of Breast Cancer
by Michael Kenn, Rudolf Karch, Dan Cacsire Castillo-Tong, Christian F. Singer, Heinz Koelbl and Wolfgang Schreiner
J. Pers. Med. 2022, 12(4), 570; https://doi.org/10.3390/jpm12040570 - 2 Apr 2022
Cited by 2 | Viewed by 3439
Abstract
The presence or absence of estrogen and progesterone receptors is one of the most important biomarkers for therapy selection in breast cancer patients. Conventional measurement by immunohistochemistry (IHC) involves errors, and numerous attempts have been made to increase precision with additional information from gene expression. This raises the question of how to fuse information, in particular when there is disagreement. It is the primary domain of Dempster–Shafer decision theory (DST) to deal with contradicting evidence on the same item (here: receptor status) obtained through different techniques. DST is widely used in technical settings, such as self-driving cars and aviation, and also promises to deliver significant advantages in medicine. Using data from breast cancer patients already presented in previous work, we focus here on comparing DST with classical statistics, to pave the way for its application in medicine. First, we explain how DST not only considers probabilities (a single number per sample), but also incorporates uncertainty in a concept of ‘evidence’ (two numbers per sample). This allows for very powerful displays of patient data in so-called ternary plots, a novel and crucial advantage for medical interpretation. Results are obtained according to conventional statistics (ODDS) and, in parallel, according to DST. Agreement and differences are evaluated, and the particular merits of DST are discussed. The presented application demonstrates how decision theory introduces new levels of confidence in diagnoses derived from medical data. Full article
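
As a hedged illustration of the 'two numbers per sample' idea, the snippet below computes belief and plausibility for a hypothetical binary receptor-status frame and reads off the (belief, disbelief, ignorance) triple that a ternary plot would display; the paper's actual construction of evidence from IHC and gene-expression data is not reproduced.

```python
# Hypothetical binary frame {positive, negative} for receptor status.
POS, NEG = frozenset({"pos"}), frozenset({"neg"})
THETA = POS | NEG

# Made-up evidence: some mass is left on THETA, encoding ignorance.
m = {POS: 0.55, NEG: 0.15, THETA: 0.30}

def belief(m, A):
    """Sum of masses of all focal elements contained in A."""
    return sum(v for B, v in m.items() if B <= A)

def plausibility(m, A):
    """Sum of masses of all focal elements intersecting A."""
    return sum(v for B, v in m.items() if B & A)

bel, pl = belief(m, POS), plausibility(m, POS)
# (belief, disbelief, ignorance) sums to 1 and can be placed in a ternary
# plot: here (0.55, 0.15, 0.30).
print(bel, 1 - pl, pl - bel)
```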

23 pages, 1728 KB  
Article
A Novel Conflict Management Method Based on Uncertainty of Evidence and Reinforcement Learning for Multi-Sensor Information Fusion
by Fanghui Huang, Yu Zhang, Ziqing Wang and Xinyang Deng
Entropy 2021, 23(9), 1222; https://doi.org/10.3390/e23091222 - 17 Sep 2021
Cited by 7 | Viewed by 3682
Abstract
Dempster–Shafer theory (DST), which is widely used in information fusion, can process uncertain information without prior information; however, when the evidence to combine is highly conflicting, it may lead to counter-intuitive results. Moreover, the existing methods are not strong enough to process real-time and online conflicting evidence. In order to solve these problems, a novel information fusion method is proposed in this paper. The proposed method combines the uncertainty of evidence with reinforcement learning (RL). Specifically, we consider two uncertainty degrees: the uncertainty of the original basic probability assignment (BPA) and the uncertainty of its negation, both measured by the Deng entropy. The two uncertainty degrees serve as the criterion for measuring information quality. Adaptive conflict processing is then performed by RL combined with the two uncertainty degrees, and Dempster's combination rule (DCR) is applied to achieve multi-sensor information fusion. Finally, a decision scheme based on a correlation coefficient is used to make the decision. The proposed method not only realizes adaptive management of conflicting evidence, but also improves the accuracy of multi-sensor information fusion and reduces information loss. Numerical examples verify the effectiveness of the proposed method. Full article
(This article belongs to the Special Issue Recent Progress of Deng Entropy)
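
Because the method hinges on measuring BPA uncertainty with the Deng entropy, a minimal implementation of that measure is sketched below; the negation step and the RL-based conflict handling of the paper are not shown, and the example BPA is made up.

```python
# Minimal sketch of the Deng entropy of a BPA:
#   E_d(m) = - sum_A m(A) * log2( m(A) / (2^|A| - 1) )
import math

def deng_entropy(m: dict) -> float:
    return -sum(v * math.log2(v / (2 ** len(A) - 1)) for A, v in m.items() if v > 0)

# Hypothetical BPA on the frame {a, b, c}.
m = {frozenset({"a"}): 0.6,
     frozenset({"b", "c"}): 0.3,
     frozenset({"a", "b", "c"}): 0.1}
print(deng_entropy(m))
```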

13 pages, 274 KB  
Article
Fractional Deng Entropy and Extropy and Some Applications
by Mohammad Reza Kazemi, Saeid Tahmasebi, Francesco Buono and Maria Longobardi
Entropy 2021, 23(5), 623; https://doi.org/10.3390/e23050623 - 17 May 2021
Cited by 47 | Viewed by 4160
Abstract
Deng entropy and extropy are two measures useful in the Dempster–Shafer evidence theory (DST) to study uncertainty, following the idea that extropy is the dual concept of entropy. In this paper, we present their fractional versions named fractional Deng entropy and extropy and compare them to other measures in the framework of DST. Here, we study the maximum for both of them and give several examples. Finally, we analyze a problem of classification in pattern recognition in order to highlight the importance of these new measures. Full article
(This article belongs to the Special Issue Measures of Information)

17 pages, 1302 KB  
Article
Measuring the Uncertainty in the Original and Negation of Evidence Using Belief Entropy for Conflict Data Fusion
by Yutong Chen and Yongchuan Tang
Entropy 2021, 23(4), 402; https://doi.org/10.3390/e23040402 - 28 Mar 2021
Cited by 14 | Viewed by 3257
Abstract
Dempster–Shafer (DS) evidence theory is widely used in various fields of uncertain information processing, but it may produce counterintuitive results when dealing with conflicting data. Therefore, this paper proposes a new data fusion method that combines the Deng entropy and the negation of the basic probability assignment (BPA). In this method, the uncertainty of the original BPA and of its negation are considered simultaneously: both are measured by the Deng entropy, and the two measurement results are integrated as the final uncertainty degree of the evidence. This new method can not only handle the fusion of conflicting evidence, but also obtain more uncertainty information through the negation of the BPA, which helps improve the accuracy of information processing and reduce information loss. We apply it to numerical examples and fault diagnosis experiments to verify the effectiveness and superiority of the method. In addition, some open issues in current work, such as the limitations of Dempster–Shafer theory (DST) under the open-world assumption and the necessary properties of uncertainty measurement methods, are also discussed in this paper. Full article
(This article belongs to the Section Information Theory, Probability and Statistics)

15 pages, 456 KB  
Article
Deng Entropy Weighted Risk Priority Number Model for Failure Mode and Effects Analysis
by Haixia Zheng and Yongchuan Tang
Entropy 2020, 22(3), 280; https://doi.org/10.3390/e22030280 - 28 Feb 2020
Cited by 32 | Viewed by 5787
Abstract
Failure mode and effects analysis (FMEA), as a commonly used risk management method, has been extensively applied in the engineering domain. A vital parameter in FMEA is the risk priority number (RPN), which is the product of the occurrence (O), severity (S), and detection (D) of a failure mode. To deal with the uncertainty in the assessments given by domain experts, a novel Deng entropy weighted risk priority number (DEWRPN) for FMEA is proposed in the framework of Dempster–Shafer evidence theory (DST). DEWRPN takes into consideration the relative importance of both the risk factors and the FMEA experts. The uncertain degree of the objective assessments coming from experts is measured by the Deng entropy. An expert's weight is composed of the three risk factors' weights obtained independently from that expert's assessments. In DEWRPN, the strategy for assigning a weight to each expert is flexible and compatible with the real decision-making situation. The entropy-based relative weight represents the relative importance: the higher the uncertainty of a risk factor from an expert, the lower the weight of the corresponding risk factor, and vice versa. We utilize the Deng entropy to construct the exponential weight of each risk factor as well as an expert's relative importance on an FMEA item. A case study is adopted to verify the practicability and effectiveness of the proposed model. Full article
(This article belongs to the Special Issue Entropy: The Scientific Tool of the 21st Century)
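
For context, the conventional RPN referenced above is simply the product of the three risk factors; a small worked instance with made-up ratings is shown below (the Deng-entropy weighting that DEWRPN adds on top is not reproduced):

```latex
\mathrm{RPN} = O \times S \times D, \qquad
\text{e.g. } O = 7,\; S = 5,\; D = 4 \;\Rightarrow\; \mathrm{RPN} = 7 \times 5 \times 4 = 140 .
```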

14 pages, 359 KB  
Article
Paradox Elimination in Dempster–Shafer Combination Rule with Novel Entropy Function: Application in Decision-Level Multi-Sensor Fusion
by Md Nazmuzzaman Khan and Sohel Anwar
Sensors 2019, 19(21), 4810; https://doi.org/10.3390/s19214810 - 5 Nov 2019
Cited by 39 | Viewed by 5073
Abstract
Multi-sensor data fusion technology is an important tool for building decision-making applications. Modified Dempster–Shafer (DS) evidence theory can handle conflicting sensor inputs and can be applied without any prior information. As a result, DS-based information fusion is very popular in decision-making applications, but the original DS theory produces counterintuitive results when combining highly conflicting evidence from multiple sensors. An effective algorithm offering fusion of highly conflicting information in the spatial domain is not widely reported in the literature. In this paper, a fusion algorithm is proposed that addresses these limitations of the original Dempster–Shafer (DS) framework. A novel entropy function based on Shannon entropy is proposed, which captures uncertainty better than the Shannon and Deng entropies. An eight-step algorithm is developed which can eliminate the inherent paradoxes of classical DS theory. Multiple examples are presented to show that the proposed method is effective in handling conflicting information in the spatial domain. Simulation results show that the proposed algorithm has a competitive convergence rate and accuracy compared to other methods presented in the literature. Full article
(This article belongs to the Collection Multi-Sensor Information Fusion)
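
The counterintuitive behaviour of the original rule mentioned above is usually illustrated with Zadeh's classic example, reproduced here as textbook background rather than as the paper's own material:

```latex
\begin{aligned}
&m_1(A) = 0.99,\quad m_1(B) = 0.01, \qquad m_2(B) = 0.01,\quad m_2(C) = 0.99,\\
&K = \sum_{X \cap Y = \emptyset} m_1(X)\,m_2(Y) = 0.9999, \qquad
m_{12}(B) = \frac{m_1(B)\,m_2(B)}{1 - K} = \frac{0.0001}{0.0001} = 1 ,
\end{aligned}
```

i.e., the combined evidence assigns total belief to the hypothesis that both sources consider almost impossible.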

28 pages, 1632 KB  
Article
Risk Assessment in Urban Large-Scale Public Spaces Using Dempster-Shafer Theory: An Empirical Study in Ningbo, China
by Jibiao Zhou, Xinhua Mao, Yiting Wang, Minjie Zhang and Sheng Dong
Int. J. Environ. Res. Public Health 2019, 16(16), 2942; https://doi.org/10.3390/ijerph16162942 - 16 Aug 2019
Cited by 11 | Viewed by 4017
Abstract
Urban Large-scale Public Spaces (ULPS) are important areas of urban culture and economic development, but they are also places with potential safety hazards. ULPS safety assessment has therefore played a crucial role in the theory and practice of urban sustainable development. The primary objective of this study is to explore the interaction between ULPS safety risk and its influencing factors. In the first stage, an index sensitivity analysis method was applied to identify the safety risk assessment index system, and a Delphi method and an information entropy method were applied to collect and calculate the weights of the risk assessment indicators. In the second stage, a Dempster–Shafer theory (DST) method with an evidence fusion technique was utilized to analyze the interaction between the ULPS safety risk level and the multiple-index variables, measured by four observed performance indicators, i.e., environmental, human, equipment, and management factors. Finally, an empirical study of the DST approach for ULPS safety performance analysis is presented. Full article
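
The information-entropy weighting of indicators mentioned above is commonly implemented with the standard entropy-weight method; the sketch below assumes that formulation (the paper's exact indicator data and normalisation may differ), with made-up scores.

```python
# Sketch of the standard entropy-weight method for indicator weighting
# (assumed here; the paper's exact data and normalisation may differ).
import numpy as np

def entropy_weights(X: np.ndarray) -> np.ndarray:
    """X: (n_samples, n_indicators) matrix of non-negative indicator scores."""
    P = X / X.sum(axis=0, keepdims=True)          # column-wise proportions
    with np.errstate(divide="ignore", invalid="ignore"):
        plogp = np.where(P > 0, P * np.log(P), 0.0)
    E = -plogp.sum(axis=0) / np.log(X.shape[0])   # normalized entropy per indicator
    d = 1.0 - E                                   # degree of divergence
    return d / d.sum()                            # weights summing to 1

# Made-up scores for four public spaces on the four factors named in the
# abstract (environmental, human, equipment, management).
scores = np.array([[0.8, 0.6, 0.7, 0.5],
                   [0.4, 0.9, 0.6, 0.7],
                   [0.6, 0.5, 0.9, 0.6],
                   [0.7, 0.7, 0.5, 0.8]])
print(entropy_weights(scores))
```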

11 pages, 362 KB  
Article
A Novel Uncertainty Management Approach for Air Combat Situation Assessment Based on Improved Belief Entropy
by Ying Zhou, Yongchuan Tang and Xiaozhe Zhao
Entropy 2019, 21(5), 495; https://doi.org/10.3390/e21050495 - 14 May 2019
Cited by 15 | Viewed by 4901
Abstract
Uncertain information exists in each procedure of an air combat situation assessment. To address this issue, this paper proposes an improved method for fusing the uncertain information of air combat situation assessment in the Dempster–Shafer evidence theory (DST) framework. A better fusion result regarding the prediction of military intention can be helpful for decision-making in an air combat situation. To obtain a more accurate fusion result, an improved belief entropy (IBE) is applied to preprocess the uncertainty of the situation assessment information. Data fusion of the preprocessed assessment information is then based on the classical Dempster's rule of combination. The results of an illustrative example validate the rationality and effectiveness of the proposed method. Full article

19 pages, 903 KB  
Article
A Novel Belief Entropy for Measuring Uncertainty in Dempster-Shafer Evidence Theory Framework Based on Plausibility Transformation and Weighted Hartley Entropy
by Qian Pan, Deyun Zhou, Yongchuan Tang, Xiaoyang Li and Jichuan Huang
Entropy 2019, 21(2), 163; https://doi.org/10.3390/e21020163 - 10 Feb 2019
Cited by 32 | Viewed by 4700
Abstract
Dempster–Shafer evidence theory (DST) has shown great advantages in tackling uncertainty in a wide variety of applications. However, how to quantify the information-based uncertainty of a basic probability assignment (BPA) with a belief entropy in the DST framework is still an open issue. The main work of this study is to define a new belief entropy for measuring the uncertainty of a BPA. The proposed belief entropy has two components. The first component is based on the summation of the probability mass function (PMF) of the single events contained in each BPA, obtained using the plausibility transformation. The second component is the same as the weighted Hartley entropy. The two components can effectively measure the discord uncertainty and the non-specificity uncertainty found in the DST framework, respectively. The proposed belief entropy is proved to satisfy the majority of the desired properties for an uncertainty measure in the DST framework. In addition, when the BPA is a probability distribution, the proposed measure degenerates to the Shannon entropy. The feasibility and superiority of the new belief entropy are verified by the results of numerical experiments. Full article
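
For readers who want to trace the two components, the snippet below sketches the plausibility transformation of a BPA into a PMF and the weighted Hartley (non-specificity) term; how the paper combines these two quantities into its final belief entropy is not reproduced here, and the example BPA is invented.

```python
# Sketch of two standard ingredients referenced in the abstract:
#   plausibility transformation  Pl_P(x) = Pl({x}) / sum_y Pl({y})
#   weighted Hartley entropy     H_w(m)  = sum_A m(A) * log2(|A|)
import math

def plausibility_transform(m: dict, frame: frozenset) -> dict:
    pl = {x: sum(v for A, v in m.items() if x in A) for x in frame}
    total = sum(pl.values())
    return {x: p / total for x, p in pl.items()}

def weighted_hartley(m: dict) -> float:
    return sum(v * math.log2(len(A)) for A, v in m.items())

FRAME = frozenset({"a", "b", "c"})
m = {frozenset({"a"}): 0.5, frozenset({"a", "b"}): 0.3, FRAME: 0.2}
print(plausibility_transform(m, FRAME))
print(weighted_hartley(m))
```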
