Open Access Article
Entropy 2016, 18(2), 59; doi:10.3390/e18020059

An Optimization Approach of Deriving Bounds between Entropy and Error from Joint Distribution: Case Study for Binary Classifications

1 NLPR/LIAMA, Institute of Automation, Chinese Academy of Sciences, Beijing 100190, China
2 College of Mathematics and Information Science, Hebei University, Baoding 071002, China
* Author to whom correspondence should be addressed.
Academic Editors: Badong Chen and Jose C. Principe
Received: 3 December 2015 / Revised: 3 February 2016 / Accepted: 4 February 2016 / Published: 19 February 2016
(This article belongs to the Special Issue Information Theoretic Learning)

Abstract

In this work, we propose a new approach to deriving bounds between entropy and error from a joint distribution by means of optimization. A specific case study is given for binary classification. Two basic types of classification errors are investigated, namely Bayesian and non-Bayesian errors. Non-Bayesian errors are considered because most classifiers in practice produce non-Bayesian solutions. For both types of errors, we derive closed-form relations between each bound and the error components. When Fano’s lower bound in a diagram of “Error Probability vs. Conditional Entropy” is realized with this approach, its interpretations are enlarged to cover non-Bayesian errors and the two situations associated with independence properties of the variables. A new upper bound on the Bayesian error is derived with respect to the minimum prior probability, which is generally tighter than Kovalevskij’s upper bound.
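For orientation, the two classical bounds named in the abstract can be written in their standard binary-case forms. These are textbook statements given here for context, not the refined bounds derived in the article's full text; P_e denotes the error probability, H(Y | Ŷ) the conditional entropy of the true label Y given the decision Ŷ, and logarithms are base 2.

% Fano's inequality (binary case): the binary entropy of the error probability
% upper-bounds the conditional entropy, which yields a lower bound on P_e after
% inverting the binary entropy function on [0, 1/2].
\[
  -P_e \log_2 P_e \;-\; (1 - P_e)\log_2(1 - P_e) \;\ge\; H(Y \mid \hat{Y}).
\]
% Kovalevskij's upper bound in its commonly cited binary form: the error
% probability is at most half of the conditional entropy (in bits).
\[
  P_e \;\le\; \tfrac{1}{2}\, H(Y \mid \hat{Y}).
\]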
Keywords: entropy; error probability; Bayesian errors; error types; upper bound; lower bound
This is an open access article distributed under the Creative Commons Attribution License (CC BY 4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Share & Cite This Article

MDPI and ACS Style

Hu, B.-G.; Xing, H.-J. An Optimization Approach of Deriving Bounds between Entropy and Error from Joint Distribution: Case Study for Binary Classifications. Entropy 2016, 18, 59.


