Open Access Article
Entropy 2013, 15(7), 2716-2735; https://doi.org/10.3390/e15072716

Efficient Approximation of the Conditional Relative Entropy with Applications to Discriminative Learning of Bayesian Network Classifiers

1. Department of Electrical Engineering, IST, University of Lisbon, Lisbon 1049-001, Portugal
2. PIA, Instituto de Telecomunicações, Lisbon 1049-001, Portugal
3. Department of Computer Science, IST, University of Lisbon, Lisbon 1049-001, Portugal
4. SQIG, Instituto de Telecomunicações, Lisbon 1049-001, Portugal
5. Department of Mathematics, IST, University of Lisbon, Lisbon 1049-001, Portugal
* Author to whom correspondence should be addressed.
Received: 8 June 2013 / Revised: 3 July 2013 / Accepted: 3 July 2013 / Published: 12 July 2013
(This article belongs to the Special Issue Estimating Information-Theoretic Quantities from Data)
Download PDF [2244 KB, uploaded 24 February 2015]

Abstract

We propose a minimum-variance unbiased approximation to the conditional relative entropy of the distribution induced by observed frequency estimates, for multi-class classification tasks. This approximation extends a decomposable scoring criterion, named approximate conditional log-likelihood (aCLL), primarily used for discriminative learning of augmented Bayesian network classifiers. Our contribution is twofold: (i) it addresses multi-class classification tasks, not only binary ones; and (ii) it covers broader stochastic assumptions than a uniform distribution over the parameters. Specifically, we consider a Dirichlet distribution over the parameters, which is experimentally shown to yield a very good approximation to the conditional log-likelihood (CLL). In addition, for Bayesian network classifiers, a closed-form expression is found for the parameters that maximize the scoring criterion.
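To make the quantity concrete, the sketch below computes the conditional log-likelihood (CLL) that aCLL approximates, for a naive Bayes classifier (the simplest augmented Bayesian network classifier) with a symmetric Dirichlet prior over the parameters. This is an illustrative assumption on our part: the function names, the naive Bayes structure, and the add-alpha smoothing are not taken from the paper, which treats general network structures and derives the approximation analytically.

```python
import numpy as np

def fit_naive_bayes(X, y, n_classes, n_values, alpha=1.0):
    """Estimate class priors and per-feature conditionals from counts,
    smoothed with a symmetric Dirichlet prior (add-alpha).
    X: (n, d) integer feature matrix; y: (n,) integer class labels."""
    n, d = X.shape
    prior = (np.bincount(y, minlength=n_classes) + alpha) / (n + alpha * n_classes)
    cond = np.empty((d, n_classes, n_values))
    for j in range(d):
        for c in range(n_classes):
            counts = np.bincount(X[y == c, j], minlength=n_values)
            cond[j, c] = (counts + alpha) / (counts.sum() + alpha * n_values)
    return prior, cond

def conditional_log_likelihood(X, y, prior, cond):
    """CLL = sum_i log P(y_i | x_i): the discriminative objective,
    obtained by normalizing the joint log-scores over the classes."""
    n, d = X.shape
    # log P(c) + sum_j log P(x_j | c), shape (n, n_classes)
    log_joint = np.log(prior)[None, :] + sum(
        np.log(cond[j][:, X[:, j]].T) for j in range(d)
    )
    # subtract log sum_c exp(log_joint) to get log posteriors
    log_post = log_joint - np.logaddexp.reduce(log_joint, axis=1, keepdims=True)
    return log_post[np.arange(n), y].sum()
```

Maximizing this CLL directly is intractable for structure search because it does not decompose over the network's families; decomposable surrogates such as aCLL exist precisely to recover that property.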
Keywords: conditional relative entropy; approximation; discriminative learning; Bayesian network classifiers

This is an open access article distributed under the Creative Commons Attribution License (CC BY 3.0).

Cite This Article

MDPI and ACS Style

Carvalho, A.M.; Adão, P.; Mateus, P. Efficient Approximation of the Conditional Relative Entropy with Applications to Discriminative Learning of Bayesian Network Classifiers. Entropy 2013, 15, 2716-2735.


Entropy EISSN 1099-4300, Published by MDPI AG, Basel, Switzerland