Open Access Article

Structure Extension of Tree-Augmented Naive Bayes

College of Software, Jilin University, Changchun 130012, China
Key Laboratory of Symbolic Computation and Knowledge Engineering of Ministry of Education, Jilin University, Changchun 130012, China
College of Computer Science and Technology, Jilin University, Changchun 130012, China
Author to whom correspondence should be addressed.
Entropy 2019, 21(8), 721;
Received: 20 May 2019 / Revised: 16 July 2019 / Accepted: 23 July 2019 / Published: 25 July 2019
(This article belongs to the Special Issue Information Theoretic Measures and Their Applications)


Owing to the simplicity and competitive classification performance of naive Bayes (NB), researchers have proposed many approaches that improve NB by weakening its attribute independence assumption. A theoretical analysis based on Kullback–Leibler divergence shows that the difference between NB and its variants lies in the different orders of conditional mutual information represented by the augmenting edges in the tree-shaped network structure. In this paper, we propose to relax the independence assumption by further generalizing tree-augmented naive Bayes (TAN) from a 1-dependence Bayesian network classifier (BNC) to arbitrary k-dependence. Sub-models of TAN, each built to represent a specific conditional dependence relationship, may “best match” the conditional probability distribution over the training data. Extensive experimental results reveal that the proposed algorithm achieves a bias-variance trade-off and substantially better generalization performance than state-of-the-art classifiers such as logistic regression.
Keywords: tree-augmented naive Bayes; Kullback–Leibler divergence; attribute independence assumption; probability distribution
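The TAN structure that the paper generalizes is commonly learned in a Chow–Liu style: estimate the conditional mutual information I(X_i; X_j | C) between every pair of attributes given the class, then build a maximum spanning tree over those weights so that each attribute gains one augmenting parent besides the class. The sketch below is only illustrative of that standard construction, not the authors' k-dependence extension; all function names and the root choice (attribute 0) are our own assumptions.

```python
import math
from collections import Counter
from itertools import combinations

def cond_mutual_info(data, i, j, c_idx):
    """Estimate I(X_i; X_j | C) from rows of discrete samples."""
    n = len(data)
    cnt_ijc = Counter((row[i], row[j], row[c_idx]) for row in data)
    cnt_ic = Counter((row[i], row[c_idx]) for row in data)
    cnt_jc = Counter((row[j], row[c_idx]) for row in data)
    cnt_c = Counter(row[c_idx] for row in data)
    cmi = 0.0
    for (xi, xj, c), n_ijc in cnt_ijc.items():
        # p(xi,xj,c) * log[ p(xi,xj|c) / (p(xi|c) p(xj|c)) ], in count form
        cmi += (n_ijc / n) * math.log(
            (n_ijc * cnt_c[c]) / (cnt_ic[(xi, c)] * cnt_jc[(xj, c)])
        )
    return cmi

def tan_tree(data, n_attrs, c_idx):
    """Maximum spanning tree over CMI weights (Prim's algorithm),
    directed away from attribute 0, taken here as the root."""
    w = {(i, j): cond_mutual_info(data, i, j, c_idx)
         for i, j in combinations(range(n_attrs), 2)}
    in_tree = {0}
    edges = []
    while len(in_tree) < n_attrs:
        # pick the heaviest edge crossing the cut (tree -> non-tree)
        best = max(
            ((u, v) for u in in_tree for v in range(n_attrs) if v not in in_tree),
            key=lambda e: w[(min(e), max(e))],
        )
        edges.append(best)
        in_tree.add(best[1])
    return edges
```

On data where one attribute is a copy of another, the copied pair carries the highest conditional mutual information and is selected as an augmenting edge, which matches the intuition that TAN's edges encode the strongest pairwise dependences given the class.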

Figure 1

This is an open access article distributed under the Creative Commons Attribution License (CC BY 4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Cite This Article

MDPI and ACS Style

Long, Y.; Wang, L.; Sun, M. Structure Extension of Tree-Augmented Naive Bayes. Entropy 2019, 21, 721.


Note that from the first issue of 2016, MDPI journals use article numbers instead of page numbers.

Entropy EISSN 1099-4300, published by MDPI AG, Basel, Switzerland.