Special Issue "Information Geometry"
A special issue of Entropy (ISSN 1099-4300).
Deadline for manuscript submissions: closed (31 March 2014)
Dr. Geert Verdoolaege
Research Unit Nuclear Fusion, Department of Applied Physics, Ghent University, Sint-Pietersnieuwstraat 41, 9000 Gent, Belgium
Interests: probability theory; Bayesian inference; machine learning; information geometry; similarity measurement for probability distributions; nuclear fusion; plasma physics; continuum mechanics; spectroscopy
The mathematical field of Information Geometry originated from the observation by C.R. Rao in 1945 that the Fisher information can be used to define a Riemannian metric on spaces of probability distributions. This led to a geometrical description of probability theory and statistics, allowing the study of the invariant properties of statistical manifolds. Through the work of S.-I. Amari and others, it was later realized that the differential-geometric structure of a statistical manifold can be derived from divergence functions, yielding a Riemannian metric and a pair of dually coupled affine connections.
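As a minimal illustration of Rao's construction, the sketch below numerically recovers the Fisher information metric of the univariate Gaussian family N(μ, σ²), parameterized by θ = (μ, σ): the metric is the expectation of the outer product of the score (the gradient of the log-likelihood). The function name and parameter values are illustrative choices, not taken from this call.

```python
import numpy as np

def fisher_metric_gaussian(mu, sigma, n_grid=20001, half_width=10.0):
    """Approximate the Fisher information metric g(theta) = E[s s^T] for
    N(mu, sigma^2) by integrating the score outer product on a fine grid."""
    x = np.linspace(mu - half_width * sigma, mu + half_width * sigma, n_grid)
    dx = x[1] - x[0]
    pdf = np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))
    # Score components: partial derivatives of the log-density.
    s_mu = (x - mu) / sigma**2                       # d/dmu log p(x; mu, sigma)
    s_sigma = ((x - mu) ** 2 - sigma**2) / sigma**3  # d/dsigma log p(x; mu, sigma)
    g = np.empty((2, 2))
    g[0, 0] = np.sum(s_mu * s_mu * pdf) * dx
    g[0, 1] = g[1, 0] = np.sum(s_mu * s_sigma * pdf) * dx
    g[1, 1] = np.sum(s_sigma * s_sigma * pdf) * dx
    return g

g = fisher_metric_gaussian(mu=0.0, sigma=2.0)
# Closed form for this parameterization: diag(1/sigma^2, 2/sigma^2) = diag(0.25, 0.5)
print(np.round(g, 6))
```

The off-diagonal terms vanish, reflecting the orthogonality of the μ and σ coordinates in this family.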
Since then, Information Geometry has become a truly interdisciplinary field with applications in various domains. It enables a deeper understanding of the methods of statistical inference and machine learning, while also providing a powerful framework for deriving new algorithms. As such, Information Geometry has many applications in optimization (e.g., on matrix manifolds), signal and image processing, computer vision, neural networks, and other subfields of the information sciences. Furthermore, the methods of Information Geometry have been applied to a wide variety of topics in physics, mathematical finance, biology, and the neurosciences. In physics, there are many links through the fields of (nonextensive) statistical mechanics and quantum mechanics, since these are also based on probability theory. In addition, there are recent indications that the formalism of Information Geometry may be suitable for explaining or deriving other physical laws as well.
For this special issue we welcome submissions related to the foundations and applications of Information Geometry. We envisage contributions that aim at clarifying the connection of Information Geometry with both the information sciences and the physical sciences, so as to demonstrate the profound impact of the field on these disciplines. In addition, we hope to receive original papers illustrating the wide variety of applications of the methods of Information Geometry.
Dr. Geert Verdoolaege
Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once registered, go to the submission form to submit your manuscript. Manuscripts can be submitted until the deadline. Papers will be published continuously (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles, and communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.
Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are refereed through a peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Entropy is an international peer-reviewed Open Access monthly journal published by MDPI.
- information geometry
- probability theory
- machine learning
- signal processing
- image processing
- statistical mechanics
- quantum mechanics
Article: Information Geometry of Positive Measures and Positive-Definite Matrices: Decomposable Dually Flat Structure
Entropy 2014, 16(4), 2131-2145; doi:10.3390/e16042131
Received: 14 February 2014; in revised form: 9 April 2014 / Accepted: 10 April 2014 / Published: 14 April 2014
Entropy 2014, 16(4), 2023-2055; doi:10.3390/e16042023
Received: 12 February 2014; in revised form: 11 March 2014 / Accepted: 24 March 2014 / Published: 8 April 2014
Article: Learning from Complex Systems: On the Roles of Entropy and Fisher Information in Pairwise Isotropic Gaussian Markov Random Fields
Entropy 2014, 16(2), 1002-1036; doi:10.3390/e16021002
Received: 4 December 2013; Accepted: 30 January 2014 / Published: 18 February 2014
Last update: 20 August 2013