Open Access Article
Entropy 2017, 19(12), 651; https://doi.org/10.3390/e19120651

K-Dependence Bayesian Classifier Ensemble

Key Laboratory of Symbolic Computation and Knowledge Engineering of Ministry of Education, Jilin University, Changchun 130012, China
* Author to whom correspondence should be addressed.
Received: 6 September 2017 / Revised: 23 November 2017 / Accepted: 27 November 2017 / Published: 30 November 2017
(This article belongs to the Special Issue Symbolic Entropy Analysis and Its Applications)

Abstract

To maximize the benefit that can be derived from the information implicit in big data, ensemble methods generate multiple models with sufficient diversity through randomization or perturbation. The k-dependence Bayesian classifier (KDB) is a highly scalable learning algorithm with excellent time and space complexity and high expressivity. This paper introduces a new ensemble of KDBs, the k-dependence forest (KDF), which induces a specific attribute order and specific conditional dependencies between attributes for each subclassifier. We demonstrate that these subclassifiers are diverse and complementary. Our extensive experimental evaluation on 40 datasets shows that this ensemble method achieves better classification performance than state-of-the-art out-of-core ensemble learners such as the averaged one-dependence estimator (AODE) and averaged tree-augmented naive Bayes (ATAN).
Keywords: k-dependence forest; diversity; conditional dependencies
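For readers unfamiliar with KDB, the sketch below illustrates the standard KDB structure-learning step (attributes ranked by mutual information with the class, each attribute then adopting up to k higher-ranked attributes as parents by conditional mutual information). It is a minimal illustration with assumed count-based estimators, not the authors' KDF implementation; the `attribute_order` parameter is only a hypothetical hook for where an ensemble might inject per-subclassifier diversity.

```python
# Minimal sketch of standard KDB structure learning, not the authors' KDF.
# Count-based estimators and all names here are illustrative assumptions;
# a real implementation would add smoothing and probability estimation.
import numpy as np

def mutual_information(x, y):
    """I(X;Y) for two discrete 1-D arrays, estimated from empirical counts."""
    mi = 0.0
    for xv in np.unique(x):
        for yv in np.unique(y):
            pxy = np.mean((x == xv) & (y == yv))
            if pxy > 0:
                px, py = np.mean(x == xv), np.mean(y == yv)
                mi += pxy * np.log(pxy / (px * py))
    return mi

def conditional_mutual_information(x, y, z):
    """I(X;Y|Z) for discrete 1-D arrays, estimated from empirical counts."""
    cmi = 0.0
    for zv in np.unique(z):
        mask = (z == zv)
        cmi += np.mean(mask) * mutual_information(x[mask], y[mask])
    return cmi

def kdb_structure(X, y, k=2, attribute_order=None):
    """Return {attribute index: list of parent attribute indices}.

    Attributes are processed in `attribute_order` (by default, descending
    I(X_i; C)); each attribute takes the class plus at most k already-placed
    attributes with the highest conditional mutual information as parents.
    """
    n_attrs = X.shape[1]
    if attribute_order is None:
        mi_with_class = [mutual_information(X[:, i], y) for i in range(n_attrs)]
        attribute_order = list(np.argsort(mi_with_class)[::-1])
    parents, placed = {}, []
    for i in attribute_order:
        cmi = [(conditional_mutual_information(X[:, i], X[:, j], y), j)
               for j in placed]
        parents[i] = [j for _, j in sorted(cmi, reverse=True)[:k]]
        placed.append(i)
    return parents
```

An ensemble in this spirit might call kdb_structure once per subclassifier, each time with a different attribute_order (for example, starting the order from a different attribute), and then average the subclassifiers' posterior estimates; how KDF actually induces its attribute orders and conditional dependencies is described in the paper itself.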

This is an open access article distributed under the Creative Commons Attribution License (CC BY 4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Cite This Article

MDPI and ACS Style

Duan, Z.; Wang, L. K-Dependence Bayesian Classifier Ensemble. Entropy 2017, 19, 651.


Note that from the first issue of 2016, MDPI journals use article numbers instead of page numbers.
