Open Access Article

An Optimization Approach of Deriving Bounds between Entropy and Error from Joint Distribution: Case Study for Binary Classifications

1 NLPR/LIAMA, Institute of Automation, Chinese Academy of Sciences, Beijing 100190, China
2 College of Mathematics and Information Science, Hebei University, Baoding 071002, China
* Author to whom correspondence should be addressed.
Academic Editors: Badong Chen and Jose C. Principe
Entropy 2016, 18(2), 59; https://doi.org/10.3390/e18020059
Received: 3 December 2015 / Revised: 3 February 2016 / Accepted: 4 February 2016 / Published: 19 February 2016
(This article belongs to the Special Issue Information Theoretic Learning)
In this work, we propose a new approach to deriving bounds between entropy and error from a joint distribution by means of optimization. A specific case study is given for binary classifications. Two basic types of classification errors are investigated, namely, Bayesian and non-Bayesian errors. Non-Bayesian errors are considered because most classifiers produce non-Bayesian solutions. For both types of errors, we derive closed-form relations between each bound and the error components. When Fano’s lower bound in a diagram of “Error Probability vs. Conditional Entropy” is realized with this approach, its interpretations are enlarged to include non-Bayesian errors and the two situations in which the variables are independent. A new upper bound for the Bayesian error is derived with respect to the minimum prior probability, which is generally tighter than Kovalevskij’s upper bound.
Keywords: entropy; error probability; Bayesian errors; error types; upper bound; lower bound
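As a quick numerical illustration of the quantities discussed in the abstract, the sketch below checks the classical binary-case bounds in the “error probability vs. conditional entropy” plane: Fano’s lower bound (invert the binary entropy of the error) and Kovalevskij’s upper bound P_e ≤ H(Y|X)/2 in bits. This is not the paper’s optimization-based derivation; the joint distribution, variable names, and the specific check are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import brentq

def binary_entropy(p):
    """Binary entropy h(p) in bits, with h(0) = h(1) = 0."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * np.log2(p) - (1 - p) * np.log2(1 - p)

# A made-up joint distribution P(X, Y) over a binary feature X and binary label Y.
joint = np.array([[0.40, 0.10],
                  [0.15, 0.35]])

p_x = joint.sum(axis=1)          # marginal P(X)
post = joint / p_x[:, None]      # posterior P(Y | X = x)

# Bayesian error: for each x, the optimal classifier picks the more probable label.
bayes_error = np.sum(p_x * np.minimum(post[:, 0], post[:, 1]))

# Conditional entropy H(Y|X) = sum_x P(x) h(P(Y=1|x)), in bits.
h_y_given_x = np.sum(p_x * np.array([binary_entropy(p) for p in post[:, 1]]))

# Fano's lower bound (binary case): P_e >= h^{-1}(H(Y|X)) on [0, 1/2].
fano_lower = brentq(lambda e: binary_entropy(e) - h_y_given_x, 1e-12, 0.5)

# Kovalevskij's upper bound (binary case): P_e <= H(Y|X) / 2.
kovalevskij_upper = h_y_given_x / 2.0

print(f"H(Y|X)            = {h_y_given_x:.4f} bits")
print(f"Fano lower bound  = {fano_lower:.4f}")
print(f"Bayes error       = {bayes_error:.4f}")
print(f"Kovalevskij upper = {kovalevskij_upper:.4f}")
```

For this example distribution the printed values satisfy fano_lower ≤ bayes_error ≤ kovalevskij_upper, which is the ordering of the two bounds the paper sharpens.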
MDPI and ACS Style

Hu, B.-G.; Xing, H.-J. An Optimization Approach of Deriving Bounds between Entropy and Error from Joint Distribution: Case Study for Binary Classifications. Entropy 2016, 18, 59.

