Open Access Article
Entropy 2017, 19(5), 206; doi:10.3390/e19050206

Divergence and Sufficiency for Convex Optimization

GSK Department, Copenhagen Business College, Nørre Voldgade 34, 1358 Copenhagen K, Denmark
Academic Editor: Renaldas Urniezius
Received: 30 December 2016 / Revised: 11 April 2017 / Accepted: 2 May 2017 / Published: 3 May 2017
(This article belongs to the Special Issue Convex Optimization and Entropy)

Abstract

Logarithmic score and information divergence appear in information theory, statistics, statistical mechanics, and portfolio theory. We demonstrate that all of these topics involve some kind of optimization that leads directly to regret functions, and such regret functions are often given by Bregman divergences. If a regret function also fulfills a sufficiency condition, it must be proportional to information divergence. We demonstrate that sufficiency is equivalent to the apparently weaker notion of locality and that it is also equivalent to the apparently stronger notion of monotonicity. These sufficiency conditions have quite different relevance in the different areas of application, and often they are not fulfilled. Therefore, sufficiency conditions can be used to explain when results from one area can be transferred directly to another and when one will experience differences.
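As background for the abstract's central objects (an illustrative sketch added here, not taken from the article itself): the Bregman divergence generated by a differentiable, strictly convex function F is

    D_F(x, y) = F(x) - F(y) - \langle \nabla F(y),\, x - y \rangle ,

and choosing F(p) = \sum_i p_i \ln p_i on the probability simplex recovers the information (Kullback-Leibler) divergence,

    D(p \,\|\, q) = \sum_i p_i \ln \frac{p_i}{q_i} ,

which illustrates the sense in which a regret function given by a Bregman divergence can be proportional to information divergence.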
Keywords: Bregman divergence; entropy; exergy; Kraft’s inequality; locality; monotonicity; portfolio; regret; scoring rule; sufficiency

This is an open access article distributed under the Creative Commons Attribution License (CC BY 4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Share & Cite This Article

MDPI and ACS Style

Harremoës, P. Divergence and Sufficiency for Convex Optimization. Entropy 2017, 19, 206.


Note that from the first issue of 2016, MDPI journals use article numbers instead of page numbers.
