Open Access Article

The Poincaré-Shannon Machine: Statistical Physics and Machine Learning Aspects of Information Cohomology

by P. Baudot 1,2
1 Inserm UNIS UMR1072—Université Aix-Marseille, 13015 Marseille, France
2 Median Technologies, 06560 Valbonne, France
Entropy 2019, 21(9), 881; https://doi.org/10.3390/e21090881
Received: 5 July 2019 / Revised: 26 August 2019 / Accepted: 3 September 2019 / Published: 10 September 2019
Previous works established that entropy is characterized uniquely as the first cohomology class in a topos and described some of its applications to the unsupervised classification of gene expression modules or cell types. These studies raised important questions regarding the statistical meaning of the resulting cohomology of information and its interpretation and consequences with respect to usual data analysis and statistical physics. This paper presents the computational methods of information cohomology and proposes interpretations in terms of statistical physics and machine learning. To further underline the cohomological nature of information functions and chain rules, the computation of the cohomology in low degrees is detailed to show more directly that the k-variate mutual informations (I_k) are (k-1)-coboundaries. The (k-1)-cocycle condition corresponds to I_k = 0, which generalizes statistical independence to arbitrary degree k. Hence, the cohomology can be interpreted as quantifying statistical dependences and the obstruction to factorization. I develop the computationally tractable subcase of simplicial information cohomology, represented by entropy (H_k) and information (I_k) landscapes and their respective paths, allowing investigation of Shannon information in the multivariate case without assuming independence or identically distributed variables. I give an interpretation of this cohomology in terms of phase transitions in a model of k-body interactions, holding both for statistical physics without mean-field approximations and for data points. The I_1 components define a self-internal energy functional U_k, and the (-1)^k I_k, k >= 2, components define the contribution of the k-body interactions to a free energy functional G_k (the total correlation).
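As background for the quantities named in the abstract (these are the standard alternating-sum conventions for multivariate mutual information and total correlation; the paper's own sign conventions should be checked against the full text), I_k and its relation to the total correlation can be written as:

```latex
I_k(X_1;\dots;X_k) \;=\; \sum_{i=1}^{k} (-1)^{i-1}
  \sum_{\substack{I \subseteq \{1,\dots,k\} \\ |I| = i}} H(X_I),
\qquad
G_k(X_1;\dots;X_k) \;=\; \sum_{i=1}^{k} H(X_i) \;-\; H(X_1,\dots,X_k)
\;=\; \sum_{i=2}^{k} (-1)^{i} \sum_{|I| = i} I_i(X_I).
```

For k = 2 this reduces to I_2 = H(X_1) + H(X_2) - H(X_1, X_2) = G_2, the usual mutual information; the (-1)^i signs in the expansion of G_k match the (-1)^k I_k contributions described in the abstract.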
A basic mean-field model is developed and computed on genetic data, reproducing usual free energy landscapes with phase transitions and sustaining the analogy of clustering with condensation. The set of information paths in simplicial structures is in bijection with the symmetric group and with random processes, providing a trivial topological expression of the second law of thermodynamics. The local minima of free energy, related to conditional information negativity and conditional independence, characterize a minimum free energy complex. This complex formalizes the minimum free energy principle in topology, provides a definition of a complex system, and characterizes a multiplicity of local minima that quantifies the diversity observed in biology. I give an interpretation of this complex in terms of unsupervised deep learning, where the neural network architecture is given by the chain complex, and conclude by discussing future supervised applications.
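To make the I_k quantities concrete, here is a minimal sketch (not the paper's implementation; the function names are hypothetical) that computes empirical multivariate mutual information by inclusion-exclusion over subset entropies. The XOR triple below exhibits the information negativity (synergy) that the abstract relates to local free energy minima: the variables are pairwise independent (I_2 = 0) yet jointly dependent (I_3 = -1 bit).

```python
import itertools
import math
from collections import Counter

def entropy_bits(samples, idx):
    """Empirical Shannon entropy (in bits) of the joint marginal on variables idx."""
    counts = Counter(tuple(s[i] for i in idx) for s in samples)
    n = len(samples)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

def mutual_info_k(samples, idx):
    """k-variate mutual information by inclusion-exclusion:
    I_k = sum_{i=1..k} (-1)^(i-1) * sum_{|I|=i} H(X_I)."""
    total = 0.0
    for i in range(1, len(idx) + 1):
        for sub in itertools.combinations(idx, i):
            total += (-1) ** (i - 1) * entropy_bits(samples, sub)
    return total

# XOR triple: Z = X xor Y, all four (x, y) combinations equiprobable.
samples = [(x, y, x ^ y) for x in (0, 1) for y in (0, 1)]
print(mutual_info_k(samples, (0, 1)))     # I_2(X;Y)   -> 0.0 bits (pairwise independent)
print(mutual_info_k(samples, (0, 1, 2)))  # I_3(X;Y;Z) -> -1.0 bits (purely synergistic)
```

The negative I_3 here is the simplest instance of the conditional information negativity the abstract invokes: conditioning on Z couples the otherwise independent X and Y.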
Keywords: algebraic topology; machine learning; information theory; statistical physics; deep neural networks; unsupervised learning; multivariate mutual information; statistical dependences; k-body interactions; synergy; clustering
MDPI and ACS Style

Baudot, P. The Poincaré-Shannon Machine: Statistical Physics and Machine Learning Aspects of Information Cohomology. Entropy 2019, 21, 881.

Note that from the first issue of 2016, MDPI journals use article numbers instead of page numbers.
