Open Access Article
Entropy 2015, 17(5), 3253-3318; doi:10.3390/e17053253

The Homological Nature of Entropy

1
Max Planck Institute for Mathematics in the Sciences, Inselstrasse 22, 04103 Leipzig, Germany
2
Université Paris Diderot-Paris 7, UFR de Mathématiques, Équipe Géométrie et Dynamique, Bâtiment Sophie Germain, 5 rue Thomas Mann, 75205 Paris Cedex 13, France
This paper is an extended version of our paper published in Proceedings of the MaxEnt 2014 Conference on Bayesian Inference and Maximum Entropy Methods in Science and Engineering, Amboise, France, 21–26 September 2014.
*
Author to whom correspondence should be addressed.
Received: 31 January 2015 / Revised: 3 May 2015 / Accepted: 5 May 2015 / Published: 13 May 2015
(This article belongs to the Special Issue Information, Entropy and Their Geometric Structures)

Abstract

We propose that entropy is a universal cohomological class in a theory associated to a family of observable quantities and a family of probability distributions. Three cases are presented: (1) classical probabilities and random variables; (2) quantum probabilities and observable operators; (3) dynamic probabilities and observation trees. This gives rise to a new kind of topology for information processes, which accounts for the main information functions: entropy, mutual information at all orders, and the Kullback–Leibler divergence, and generalizes them in several ways. The article is divided into two parts, which can be read independently. In the first part, the introduction, we provide an overview of the results, some open questions, directions for future research, and a brief discussion of the application to complex data. In the second part we give the complete definitions and proofs of the theorems A, C and E stated in the introduction, which show why entropy is the first homological invariant of a structure of information in four contexts: static classical or quantum probability, and the dynamics of classical or quantum strategies of observation of a finite system.
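The code below is not part of the article; it is a minimal numerical sketch of the three information functions the abstract names (entropy, mutual information, Kullback–Leibler divergence) for a discrete joint distribution, together with a check of the chain rule H(X,Y) = H(X) + H(Y|X), the additivity relation that underlies the cocycle interpretation of entropy in the paper. The joint distribution used is an arbitrary illustrative choice.

```python
import math

def entropy(p):
    """Shannon entropy H(p) in bits of a discrete distribution."""
    return -sum(x * math.log2(x) for x in p if x > 0)

def kl_divergence(p, q):
    """Kullback-Leibler divergence D(p || q) in bits."""
    return sum(x * math.log2(x / y) for x, y in zip(p, q) if x > 0)

# An illustrative joint distribution of two binary variables X, Y
# (rows indexed by X, columns by Y).
joint = [[0.3, 0.2],
         [0.1, 0.4]]
flat = [p for row in joint for p in row]

# Marginal distributions.
px = [sum(row) for row in joint]        # distribution of X
py = [sum(col) for col in zip(*joint)]  # distribution of Y

h_xy = entropy(flat)
h_x = entropy(px)
h_y = entropy(py)

# Mutual information I(X;Y) = H(X) + H(Y) - H(X,Y).
mi = h_x + h_y - h_xy

# Conditional entropy H(Y|X) as the average entropy of the rows,
# so that the chain rule H(X,Y) = H(X) + H(Y|X) can be verified.
h_y_given_x = sum(
    px[i] * entropy([joint[i][j] / px[i] for j in range(2)])
    for i in range(2)
)
assert abs(h_xy - (h_x + h_y_given_x)) < 1e-12
```

The chain rule checked in the last assertion is the classical identity that, in the authors' framework, becomes the 1-cocycle equation characterizing entropy up to a multiplicative constant.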
Keywords: Shannon information; homology theory; entropy; quantum information; homotopy of links; mutual informations; Kullback–Leibler divergence; trees; monads; partitions
This is an open access article distributed under the Creative Commons Attribution License (CC BY 4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Share & Cite This Article

MDPI and ACS Style

Baudot, P.; Bennequin, D. The Homological Nature of Entropy. Entropy 2015, 17, 3253-3318.


Entropy, EISSN 1099-4300, published by MDPI AG, Basel, Switzerland.