Open Access Article

Structure Learning of Bayesian Network Based on Adaptive Thresholding

Zhang, Y. 1,2, Wang, L. 1,2, Duan, Z. 1,2 and Sun, M. 1,2,*

1 College of Computer Science and Technology, Jilin University, Changchun 130012, China
2 Key Laboratory of Symbolic Computation and Knowledge Engineering of Ministry of Education, Jilin University, Changchun 130012, China
* Author to whom correspondence should be addressed.
Entropy 2019, 21(7), 665; https://doi.org/10.3390/e21070665
Received: 1 June 2019 / Revised: 2 July 2019 / Accepted: 5 July 2019 / Published: 8 July 2019

Abstract

Direct dependencies and conditional dependencies in restricted Bayesian network classifiers (BNCs) are two basic kinds of dependencies. Traditional approaches, such as filter and wrapper, have proved beneficial for identifying non-significant dependencies one by one, but their high computational overhead makes them inefficient, especially for BNCs with high structural complexity. Studying the distributions of information-theoretic measures provides a feasible approach to identifying non-significant dependencies in batches, which may help increase structural reliability and avoid overfitting. In this paper, we investigate two extensions to the k-dependence Bayesian classifier: MI-based feature selection and CMI-based dependence selection. These two techniques apply a novel adaptive thresholding method to filter out redundancy and can work jointly. Experimental results on 30 datasets from the UCI machine learning repository demonstrate that adaptive thresholds help distinguish between dependencies and independencies, and that the proposed algorithm achieves competitive classification performance compared to several state-of-the-art BNCs in terms of 0–1 loss, root mean squared error, bias, and variance.
Keywords: Bayesian network classifiers; mutual information; conditional mutual information; thresholding
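The abstract describes the method only at a high level. As a rough illustration of the quantities it relies on, the sketch below estimates mutual information I(X;Y) and conditional mutual information I(X;Y|Z) from discrete data and filters attributes with a simple mean-based cutoff. The threshold rule, function names, and data layout here are assumptions chosen for illustration; they are not the adaptive thresholding criterion proposed in the paper.

    import numpy as np
    from collections import Counter

    def mutual_information(x, y):
        # Plug-in estimate of I(X;Y) in bits from two discrete sequences.
        n = len(x)
        px, py, pxy = Counter(x), Counter(y), Counter(zip(x, y))
        mi = 0.0
        for (a, b), c in pxy.items():
            p_joint = c / n
            mi += p_joint * np.log2(p_joint / ((px[a] / n) * (py[b] / n)))
        return mi

    def conditional_mutual_information(x, y, z):
        # Plug-in estimate of I(X;Y|Z): average I(X;Y) within each stratum of Z.
        n = len(z)
        cmi = 0.0
        for v in set(z):
            xs = [xi for xi, zi in zip(x, z) if zi == v]
            ys = [yi for yi, zi in zip(y, z) if zi == v]
            cmi += (len(xs) / n) * mutual_information(xs, ys)
        return cmi

    def select_features(X_cols, y):
        # Illustrative "adaptive" cutoff (an assumption, not the paper's rule):
        # keep attributes whose MI with the class exceeds the mean MI.
        mis = [mutual_information(col, y) for col in X_cols]
        threshold = float(np.mean(mis))
        return [i for i, mi in enumerate(mis) if mi > threshold]

    # Toy usage: three discrete attributes and a binary class label.
    X_cols = [[0, 1, 0, 1, 1, 0], [1, 1, 0, 0, 1, 0], [0, 0, 0, 1, 1, 1]]
    y = [0, 1, 0, 1, 1, 0]
    print(select_features(X_cols, y))  # indices of attributes above the mean-MI cutoff

The same MI/CMI estimates could in principle drive the dependence-selection step as well, by scoring candidate parent links with I(X;Y|C) instead of I(X;C); the specific adaptive thresholds used in the paper are derived from the distributions of these measures, as described in the article itself.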
This is an open access article distributed under the Creative Commons Attribution License (CC BY 4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Cite This Article

MDPI and ACS Style

Zhang, Y.; Wang, L.; Duan, Z.; Sun, M. Structure Learning of Bayesian Network Based on Adaptive Thresholding. Entropy 2019, 21, 665.


