Open Access Article
Entropy 2015, 17(6), 3766-3786; doi:10.3390/e17063766

Learning a Flexible K-Dependence Bayesian Classifier from the Chain Rule of Joint Probability Distribution

1 Key Laboratory of Symbolic Computation and Knowledge Engineering of Ministry of Education, Jilin University, Changchun 130012, China
2 School of Software, Jilin University, Changchun 130012, China
* Author to whom correspondence should be addressed.
Academic Editor: Antonio M. Scarfone
Received: 30 November 2014 / Accepted: 3 June 2015 / Published: 8 June 2015
(This article belongs to the Section Statistical Mechanics)

Abstract

As one of the most common types of graphical models, the Bayesian classifier has become an extremely popular approach to dealing with uncertainty and complexity. The scoring functions originally proposed and widely used for Bayesian networks are not appropriate for Bayesian classifiers, in which the class variable C is treated as a distinguished variable. In this paper, we aim to clarify the working mechanism of Bayesian classifiers from the perspective of the chain rule of the joint probability distribution. By establishing the mapping relationship between the conditional probability distribution and mutual information, a new scoring function, Sum_MI, is derived and applied to evaluate the rationality of Bayesian classifiers. To achieve global optimization and high dependence representation, the proposed learning algorithm, the flexible K-dependence Bayesian (FKDB) classifier, applies greedy search to extract more information from the K-dependence network structure. Meanwhile, during the learning procedure, the optimal attribute order is determined dynamically rather than fixed rigidly in advance. In the experimental study, functional dependency analysis is used to improve model interpretability when the structure complexity is restricted.
Keywords: Bayesian classifier; chain rule; optimal attribute order; information quantity
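To make the greedy, mutual-information-driven construction described in the abstract more concrete, the following is a minimal sketch, not the authors' implementation. It assumes discrete attributes and uses illustrative names (mutual_info, cond_mutual_info, learn_fkdb_structure, parameter K); the order heuristic and parent selection are a plausible KDB-style reading of the abstract, not the exact FKDB procedure.

```python
# Minimal sketch of KDB-style structure learning guided by (conditional)
# mutual information, loosely following the FKDB idea in the abstract.
# All names and the exact heuristics are illustrative assumptions.
import numpy as np
from itertools import product

def mutual_info(x, c):
    """I(X; C) estimated from empirical frequencies of two discrete arrays."""
    mi = 0.0
    for xv, cv in product(np.unique(x), np.unique(c)):
        p_xc = np.mean((x == xv) & (c == cv))
        if p_xc > 0:
            p_x, p_c = np.mean(x == xv), np.mean(c == cv)
            mi += p_xc * np.log(p_xc / (p_x * p_c))
    return mi

def cond_mutual_info(x, y, c):
    """I(X; Y | C) estimated by averaging I(X; Y) over the slices C = c."""
    return sum(np.mean(c == cv) * mutual_info(x[c == cv], y[c == cv])
               for cv in np.unique(c))

def learn_fkdb_structure(X, y, K=2):
    """Greedily choose the next attribute by its information about the class,
    then keep at most K of the already-placed attributes as extra parents,
    ranked by conditional mutual information given the class."""
    remaining = set(range(X.shape[1]))
    order, parents = [], {}
    while remaining:
        # attribute order is decided dynamically, one step at a time
        best = max(remaining, key=lambda i: mutual_info(X[:, i], y))
        remaining.remove(best)
        ranked = sorted(order,
                        key=lambda j: cond_mutual_info(X[:, best], X[:, j], y),
                        reverse=True)
        parents[best] = ranked[:K]  # at most K parents besides the class
        order.append(best)
    return order, parents
```

Restricting each attribute to at most K parents besides the class is what keeps the structure complexity bounded, which is the setting in which the paper applies functional dependency analysis to preserve interpretability.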

This is an open access article distributed under the Creative Commons Attribution License (CC BY 4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Share & Cite This Article

MDPI and ACS Style

Wang, L.; Zhao, H. Learning a Flexible K-Dependence Bayesian Classifier from the Chain Rule of Joint Probability Distribution. Entropy 2015, 17, 3766-3786.

