Open Access Article
Entropy 2018, 20(8), 593; https://doi.org/10.3390/e20080593

The Duality of Entropy/Extropy, and Completion of the Kullback Information Complex

Frank Lad, Giuseppe Sanfilippo and Gianna Agrò
1 Department of Mathematics and Statistics, University of Canterbury, 8140 Christchurch, New Zealand
2 Department of Mathematics and Computer Science, University of Palermo, 90123 Palermo, Italy
3 Department of Economics, Business, and Statistics, University of Palermo, 90128 Palermo, Italy
* Author to whom correspondence should be addressed.
Received: 3 July 2018 / Revised: 26 July 2018 / Accepted: 6 August 2018 / Published: 9 August 2018
(This article belongs to the Special Issue Entropy: From Physics to Information Sciences and Geometry)

Abstract

The refinement axiom for entropy has been provocative in providing foundations of information theory, recognised as worthy of thought in the writings of both Shannon and Jaynes. A resolution to their concerns has been provided recently by the discovery that the entropy measure of a probability distribution has a dual measure, a complementary companion designated as “extropy”. We report here the main results that identify this fact, specifying the dual equations and exhibiting some of their structure. The duality extends beyond a simple assessment of entropy, to the formulation of relative entropy and the Kullback symmetric distance between two forecasting distributions, defined as the sum of a pair of directed divergences. Examining the defining equation, we notice that this symmetric measure can be generated by two other explicable pairs of functions as well, neither of which is a Bregman divergence. The Kullback information complex is constituted by the symmetric measure of entropy/extropy along with one of each of these three function pairs. It is intimately related to the total logarithmic score of two distinct forecasting distributions for a quantity under consideration, this being a complete proper score. The information complex is isomorphic to the expectations that the two forecasting distributions assess for their achieved scores, each for its own score and for the score achieved by the other. Analysis of the scoring problem exposes a Pareto optimal exchange of the forecasters’ scores that both are willing to engage in. Both would support its evaluation for assessing the relative quality of the information they provide regarding the observation of an unknown quantity of interest. We present our results without proofs, as these appear in the referenced source articles; the focus here is on their content. The mathematical syntax of probability we employ relies upon the operational subjective constructions of Bruno de Finetti.
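As a concrete companion to the abstract, the sketch below computes the quantities it names for a small example. It is a minimal illustration under stated assumptions, not the authors' derivation: it assumes the standard definitions from the cited source articles, namely entropy H(p) = -sum_i p_i log p_i, its dual extropy J(p) = -sum_i (1 - p_i) log(1 - p_i), and the Kullback symmetric divergence as the sum of the two directed divergences; natural logarithms are used and the example distributions are arbitrary.

```python
import math

def entropy(p):
    # Shannon entropy: H(p) = -sum_i p_i * log(p_i)
    return -sum(pi * math.log(pi) for pi in p)

def extropy(p):
    # The dual (extropy) measure: J(p) = -sum_i (1 - p_i) * log(1 - p_i)
    return -sum((1 - pi) * math.log(1 - pi) for pi in p)

def directed_divergence(p, q):
    # Relative entropy (directed divergence): D(p || q) = sum_i p_i * log(p_i / q_i)
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))

# Two illustrative forecasting distributions over three outcomes,
# with all masses strictly between 0 and 1 so the logarithms are defined.
p = [0.5, 0.3, 0.2]
q = [0.4, 0.4, 0.2]

print("H(p) =", entropy(p))
print("J(p) =", extropy(p))
# The sum H(p) + J(p) is the Fermi-Dirac entropy named in the keywords.
print("H(p) + J(p) =", entropy(p) + extropy(p))
# Kullback symmetric divergence: the sum of the two directed divergences.
print("D(p||q) + D(q||p) =", directed_divergence(p, q) + directed_divergence(q, p))
```

For a distribution over only two outcomes, entropy and extropy coincide, since 1 - p_1 = p_2; the two measures separate as the number of possible outcomes grows.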
Keywords: entropy; extropy; relative entropy/extropy; prevision; duality; Fermi–Dirac entropy; Kullback symmetric divergence; total logarithmic scoring rule; Pareto optimal exchange
Figures: Figure 1
This is an open access article distributed under the Creative Commons Attribution License (CC BY 4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Cite This Article

MDPI and ACS Style

Lad, F.; Sanfilippo, G.; Agrò, G. The Duality of Entropy/Extropy, and Completion of the Kullback Information Complex. Entropy 2018, 20, 593.


Note that from the first issue of 2016, MDPI journals use article numbers instead of page numbers.

