Structure Extension of Tree-Augmented Naive Bayes
Abstract
Due to the simplicity and competitive classification performance of naive Bayes (NB), researchers have proposed many approaches that improve NB by weakening its attribute independence assumption. A theoretical analysis based on Kullback–Leibler divergence shows that the difference between NB and its variants lies in the different orders of conditional mutual information represented by the augmenting edges in the tree-shaped network structure. In this paper, we propose to relax the independence assumption by generalizing tree-augmented naive Bayes (TAN) from a 1-dependence Bayesian network classifier (BNC) to arbitrary k-dependence. Sub-models of TAN, each built to represent a specific conditional dependence relationship, may "best match" the conditional probability distribution over the training data. Extensive experimental results show that the proposed algorithm achieves a bias–variance trade-off and substantially better generalization performance than state-of-the-art classifiers such as logistic regression.
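As background for the abstract above: classical TAN (Friedman et al.) weights each candidate augmenting edge by the conditional mutual information I(Xi; Xj | C) and keeps a maximum spanning tree over the attributes. The following is a minimal sketch of that base procedure, not the paper's k-dependence extension; the function names and the Prim-style tree construction are illustrative choices, not taken from the article.

```python
import numpy as np
from itertools import combinations

def conditional_mutual_information(xi, xj, c):
    """Empirical estimate of I(Xi; Xj | C) for discrete samples."""
    n = len(c)
    cmi = 0.0
    for cv in np.unique(c):
        mask = c == cv
        pc = mask.sum() / n                # P(C = cv)
        xi_c, xj_c = xi[mask], xj[mask]
        for a in np.unique(xi_c):
            for b in np.unique(xj_c):
                p_ab = np.mean((xi_c == a) & (xj_c == b))
                if p_ab > 0:
                    p_a = np.mean(xi_c == a)
                    p_b = np.mean(xj_c == b)
                    cmi += pc * p_ab * np.log(p_ab / (p_a * p_b))
    return cmi

def tan_edges(X, y):
    """Maximum spanning tree over attributes, edge-weighted by CMI.

    Returns d-1 undirected edges (i, j); in full TAN these would then
    be oriented away from a chosen root and augmented with the class
    node as a parent of every attribute.
    """
    d = X.shape[1]
    w = {(i, j): conditional_mutual_information(X[:, i], X[:, j], y)
         for i, j in combinations(range(d), 2)}
    in_tree = {0}                          # grow the tree Prim-style
    edges = []
    while len(in_tree) < d:
        best = max((e for e in w if (e[0] in in_tree) ^ (e[1] in in_tree)),
                   key=lambda e: w[e])
        edges.append(best)
        in_tree.update(best)
    return edges
```

A k-dependence generalization, as the abstract describes, would allow each attribute up to k augmenting parents instead of the single tree parent selected here.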
Share & Cite This Article
Long, Y.; Wang, L.; Sun, M. Structure Extension of Tree-Augmented Naive Bayes. Entropy 2019, 21, 721.
Note that from the first issue of 2016, MDPI journals use article numbers instead of page numbers.