Differential Geometrical Theory of Statistics

A special issue of Entropy (ISSN 1099-4300). This special issue belongs to the section "Statistical Physics".

Deadline for manuscript submissions: closed (15 September 2016) | Viewed by 94339

Printed Edition Available!
A printed edition of this Special Issue is available here.

Special Issue Editors


Dr. Frédéric Barbaresco, Guest Editor
Prof. Dr. Frank Nielsen, Guest Editor

Special Issue Information

Dear Colleagues,

This Special Issue will collect a limited number of selected invited and contributed talks presented during the conference GSI'15 on "Geometric Science of Information", held at Ecole Polytechnique, Paris-Saclay Campus, France, in October 2015. Conference website: http://www.see.asso.fr/gsi2015.

Dr. Frédéric Barbaresco
Prof. Dr. Frank Nielsen
Guest Editors

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, click here to go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles as well as short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Entropy is an international peer-reviewed open access monthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2600 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.


Published Papers (16 papers)


Research

Article
A Sequence of Escort Distributions and Generalizations of Expectations on q-Exponential Family
by Hiroshi Matsuzoe
Entropy 2017, 19(1), 7; https://doi.org/10.3390/e19010007 - 25 Dec 2016
Cited by 11 | Viewed by 4540
Abstract
In the theory of complex systems, long tailed probability distributions are often discussed. For such a probability distribution, a deformed expectation with respect to an escort distribution is more useful than the standard expectation. In this paper, by generalizing such escort distributions, a sequence of escort distributions is introduced. As a consequence, it is shown that deformed expectations with respect to sequential escort distributions effectively work for anomalous statistics. In particular, it is shown that a Fisher metric on a q-exponential family can be obtained from the escort expectation with respect to the second escort distribution, and a cubic form (or an Amari–Chentsov tensor field, equivalently) is obtained from the escort expectation with respect to the third escort distribution. Full article
(This article belongs to the Special Issue Differential Geometrical Theory of Statistics)
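As a quick illustration of the escort construction the abstract refers to (our own sketch, not code from the paper), the snippet below computes the standard escort distribution, in which each probability p_i is replaced by p_i^q and renormalized, together with the corresponding escort (deformed) expectation; the paper's sequence of escort distributions builds on this basic operation.

```python
import numpy as np

def escort(p, q):
    """Standard escort distribution: P_i proportional to p_i**q, renormalized."""
    w = np.asarray(p, dtype=float) ** q
    return w / w.sum()

def escort_expectation(x, p, q):
    """Deformed expectation of x taken with respect to the escort of p."""
    return float(np.sum(escort(p, q) * np.asarray(x, dtype=float)))

p = np.array([0.6, 0.25, 0.1, 0.05])    # toy discrete distribution
x = np.array([1.0, 2.0, 3.0, 4.0])
print(escort_expectation(x, p, q=1.0))  # q = 1 recovers the ordinary expectation
print(escort_expectation(x, p, q=2.0))  # q > 1 puts more weight on high-probability states
```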
Article
Guaranteed Bounds on Information-Theoretic Measures of Univariate Mixtures Using Piecewise Log-Sum-Exp Inequalities
by Frank Nielsen and Ke Sun
Entropy 2016, 18(12), 442; https://doi.org/10.3390/e18120442 - 09 Dec 2016
Cited by 50 | Viewed by 9639
Abstract
Information-theoretic measures, such as the entropy, the cross-entropy and the Kullback–Leibler divergence between two mixture models, are core primitives in many signal processing tasks. Since the Kullback–Leibler divergence of mixtures provably does not admit a closed-form formula, it is in practice either estimated using costly Monte Carlo stochastic integration, approximated or bounded using various techniques. We present a fast and generic method that builds algorithmically closed-form lower and upper bounds on the entropy, the cross-entropy, the Kullback–Leibler and the α-divergences of mixtures. We illustrate the versatile method by reporting our experiments for approximating the Kullback–Leibler and the α-divergences between univariate exponential mixtures, Gaussian mixtures, Rayleigh mixtures and Gamma mixtures. Full article
(This article belongs to the Special Issue Differential Geometrical Theory of Statistics)
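The costly Monte Carlo baseline mentioned in the abstract can be sketched in a few lines for univariate Gaussian mixtures; this is only the naive estimator that the paper's closed-form bounds are designed to replace, with mixture parameters chosen arbitrarily for illustration.

```python
import numpy as np
from scipy.stats import norm

def gmm_pdf(x, weights, means, stds):
    """Density of a univariate Gaussian mixture evaluated at the points x."""
    x = np.asarray(x)[:, None]
    return np.sum(np.asarray(weights) * norm.pdf(x, means, stds), axis=1)

def mc_kl(p_mix, q_mix, n=200_000, seed=0):
    """Monte Carlo estimate of KL(p || q), sampling from the mixture p."""
    rng = np.random.default_rng(seed)
    w, mu, sd = (np.asarray(a, dtype=float) for a in p_mix)
    comp = rng.choice(len(w), size=n, p=w)          # pick a component per sample
    x = rng.normal(mu[comp], sd[comp])              # draw from that component
    return float(np.mean(np.log(gmm_pdf(x, *p_mix) / gmm_pdf(x, *q_mix))))

p_mix = ([0.5, 0.5], [-1.0, 1.0], [0.5, 0.5])
q_mix = ([0.7, 0.3], [-0.5, 1.5], [0.6, 0.4])
print(mc_kl(p_mix, q_mix))
```

Sampling from p and averaging log(p/q) is unbiased but needs many samples for a stable estimate, which is what motivates deterministic lower and upper bounds.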
Article
Foliations-Webs-Hessian Geometry-Information Geometry-Entropy and Cohomology
by Michel Nguiffo Boyom
Entropy 2016, 18(12), 433; https://doi.org/10.3390/e18120433 - 02 Dec 2016
Cited by 13 | Viewed by 5591
Abstract
IN MEMORIAM OF ALEXANDER GROTHENDIECK. THE MAN. Full article
(This article belongs to the Special Issue Differential Geometrical Theory of Statistics)
Article
Anisotropically Weighted and Nonholonomically Constrained Evolutions on Manifolds
by Stefan Sommer
Entropy 2016, 18(12), 425; https://doi.org/10.3390/e18120425 - 26 Nov 2016
Cited by 10 | Viewed by 5785
Abstract
We present evolution equations for a family of paths that results from anisotropically weighting curve energies in non-linear statistics of manifold-valued data. This situation arises when performing inference on data that have non-trivial covariance and are anisotropically distributed. The family can be interpreted as most probable paths for a driving semi-martingale that is mapped to the manifold through stochastic development. We discuss how the paths are projections of geodesics for a sub-Riemannian metric on the frame bundle of the manifold, and how the curvature of the underlying connection appears in the sub-Riemannian Hamilton–Jacobi equations. Evolution equations for both metric and cometric formulations of the sub-Riemannian metric are derived. We furthermore show how rank-deficient metrics can be mixed with an underlying Riemannian metric, and we relate the paths to geodesics and polynomials in Riemannian geometry. Examples from the family of paths are visualized on embedded surfaces, and we explore computational representations on finite-dimensional landmark manifolds with geometry induced from right-invariant metrics on diffeomorphism groups. Full article
(This article belongs to the Special Issue Differential Geometrical Theory of Statistics)
Article
The Information Geometry of Sparse Goodness-of-Fit Testing
by Paul Marriott, Radka Sabolová, Germain Van Bever and Frank Critchley
Entropy 2016, 18(12), 421; https://doi.org/10.3390/e18120421 - 24 Nov 2016
Cited by 1 | Viewed by 4629
Abstract
This paper takes an information-geometric approach to the challenging issue of goodness-of-fit testing in the high dimensional, low sample size context where—potentially—boundary effects dominate. The main contributions of this paper are threefold: first, we present and prove two new theorems on the behaviour of commonly used test statistics in this context; second, we investigate—in the novel environment of the extended multinomial model—the links between information geometry-based divergences and standard goodness-of-fit statistics, allowing us to formalise relationships which have been missing in the literature; finally, we use simulation studies to validate and illustrate our theoretical results and to explore currently open research questions about the way that discretisation effects can dominate sampling distributions near the boundary. Novelly accommodating these discretisation effects contrasts sharply with the essentially continuous approach of skewness and other corrections flowing from standard higher-order asymptotic analysis. Full article
(This article belongs to the Special Issue Differential Geometrical Theory of Statistics)
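For readers unfamiliar with the commonly used test statistics in question, the sketch below computes Pearson's X² and the likelihood-ratio statistic G² for a sparse multinomial sample (many cells, few observations). It is a generic illustration under our own toy setup, not the paper's boundary analysis.

```python
import numpy as np

def gof_statistics(counts, probs):
    """Pearson's X^2 and likelihood-ratio G^2 for a multinomial sample."""
    counts = np.asarray(counts, dtype=float)
    n = counts.sum()
    expected = n * np.asarray(probs, dtype=float)
    x2 = np.sum((counts - expected) ** 2 / expected)
    observed = counts > 0          # cells with zero counts contribute 0 to G^2
    g2 = 2.0 * np.sum(counts[observed] * np.log(counts[observed] / expected[observed]))
    return x2, g2

# "high dimensional, low sample size": many cells, few observations
k, n = 200, 30
rng = np.random.default_rng(1)
probs = np.full(k, 1.0 / k)
counts = rng.multinomial(n, probs)
print(gof_statistics(counts, probs))
```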
Article
Geometry Induced by a Generalization of Rényi Divergence
by David C. De Souza, Rui F. Vigelis and Charles C. Cavalcante
Entropy 2016, 18(11), 407; https://doi.org/10.3390/e18110407 - 17 Nov 2016
Cited by 15 | Viewed by 4158
Abstract
In this paper, we propose a generalization of Rényi divergence, and then we investigate its induced geometry. This generalization is given in terms of a φ-function, the same function that is used in the definition of non-parametric φ-families. The properties of φ-functions proved to be crucial in the generalization of Rényi divergence. Assuming appropriate conditions, we verify that the generalized Rényi divergence reduces, in a limiting case, to the φ-divergence. In a generalized statistical manifold, the φ-divergence induces a pair of dual connections D^(−1) and D^(1). We show that the family of connections D^(α) induced by the generalization of Rényi divergence satisfies the relation D^(α) = ((1 − α)/2) D^(−1) + ((1 + α)/2) D^(1), with α ∈ [−1, 1]. Full article
(This article belongs to the Special Issue Differential Geometrical Theory of Statistics)
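For reference, the classical Rényi divergence being generalized here is (standard definition, not the paper's φ-deformed version)

```latex
D_\alpha(p \,\|\, q) \;=\; \frac{1}{\alpha-1}\,
\log \int p(x)^{\alpha}\, q(x)^{1-\alpha}\, \mathrm{d}x,
\qquad \alpha \in (0,1)\cup(1,\infty),
```

which tends to the Kullback–Leibler divergence as α → 1.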
Article
Kernel Density Estimation on the Siegel Space with an Application to Radar Processing
by Emmanuel Chevallier, Thibault Forget, Frédéric Barbaresco and Jesus Angulo
Entropy 2016, 18(11), 396; https://doi.org/10.3390/e18110396 - 11 Nov 2016
Cited by 19 | Viewed by 10512
Abstract
This paper studies probability density estimation on the Siegel space. The Siegel space is a generalization of the hyperbolic space. Its Riemannian metric provides an interesting structure to the Toeplitz block Toeplitz matrices that appear in the covariance estimation of radar signals. The main techniques of probability density estimation on Riemannian manifolds are reviewed. For computational reasons, we chose to focus on the kernel density estimation. The main result of the paper is the expression of Pelletier’s kernel density estimator. The computation of the kernels is made possible by the symmetric structure of the Siegel space. The method is applied to density estimation of reflection coefficients from radar observations. Full article
(This article belongs to the Special Issue Differential Geometrical Theory of Statistics)
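Pelletier's estimator, whose explicit expression on the Siegel space is the paper's main result, has the following general form on a d-dimensional Riemannian manifold (our notation; conventions for the kernel support and normalization vary):

```latex
\hat{f}_n(p) \;=\; \frac{1}{n}\sum_{i=1}^{n}
\frac{1}{h^{d}\,\theta_{x_i}(p)}\,
K\!\left(\frac{d_g(p, x_i)}{h}\right),
```

where d_g is the Riemannian distance, h the bandwidth, K a compactly supported kernel, and θ_{x_i}(p) the volume density function of the manifold at x_i.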
Article
Geometric Theory of Heat from Souriau Lie Groups Thermodynamics and Koszul Hessian Geometry: Applications in Information Geometry for Exponential Families
by Frédéric Barbaresco
Entropy 2016, 18(11), 386; https://doi.org/10.3390/e18110386 - 04 Nov 2016
Cited by 26 | Viewed by 9984
Abstract
We introduce the symplectic structure of information geometry based on Souriau's Lie group thermodynamics model, with a covariant definition of Gibbs equilibrium via invariances through the co-adjoint action of a group on its moment space, defining physical observables like energy, heat, and moment as pure geometrical objects. Using the geometric Planck temperature of Souriau's model and the notion of symplectic cocycle, the Fisher metric is identified as a Souriau geometric heat capacity. The Souriau model is based on affine representations of Lie groups and Lie algebras, which we compare with Koszul's work on the homogeneous space G/K and the bijective correspondence between the set of G-invariant flat connections on G/K and the set of affine representations of the Lie algebra of G. In the framework of Lie group thermodynamics, an Euler-Poincaré equation is elaborated with respect to thermodynamic variables, and a new variational principle for thermodynamics is built through an invariant Poincaré-Cartan-Souriau integral. The Souriau-Fisher metric is linked to the KKS (Kostant–Kirillov–Souriau) 2-form that associates a canonical homogeneous symplectic manifold to the co-adjoint orbits. We apply this model in the framework of information geometry for the action of an affine group for exponential families, and provide some illustrations of use cases for multivariate Gaussian densities. Information geometry is presented in the context of the seminal work of Fréchet and his Clairaut-Legendre equation. The Souriau model of statistical physics is validated as compatible with the Balian gauge model of thermodynamics. We recall the precursor work of Casalis on affine group invariance for natural exponential families. Full article
(This article belongs to the Special Issue Differential Geometrical Theory of Statistics)
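As background for the abstract above, Souriau's generalized Gibbs state attached to a geometric (Planck) temperature β in the Lie algebra is usually written as (our notation; the paper states the covariant version involving the symplectic cocycle)

```latex
p_{\beta}(\xi) \;=\; \exp\!\big(\Phi(\beta) - \langle U(\xi), \beta\rangle\big),
\qquad
\Phi(\beta) \;=\; -\log \int_{M}
\exp\!\big(-\langle U(\xi), \beta\rangle\big)\, \mathrm{d}\lambda(\xi),
```

where U is the momentum map and Φ plays the role of the Massieu characteristic function.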
Article
Explicit Formula of Koszul–Vinberg Characteristic Functions for a Wide Class of Regular Convex Cones
by Hideyuki Ishi
Entropy 2016, 18(11), 383; https://doi.org/10.3390/e18110383 - 26 Oct 2016
Cited by 4 | Viewed by 4578
Abstract
The Koszul–Vinberg characteristic function plays a fundamental role in the theory of convex cones. We give an explicit description of the function and related integral formulas for a new class of convex cones, including homogeneous cones and cones associated with chordal (decomposable) graphs appearing in statistics. Furthermore, we discuss an application to maximum likelihood estimation for a certain exponential family over a cone of this class. Full article
(This article belongs to the Special Issue Differential Geometrical Theory of Statistics)
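The object named in the title has a standard definition: for a regular open convex cone Ω with dual cone Ω*, the Koszul–Vinberg characteristic function is

```latex
\varphi_{\Omega}(x) \;=\; \int_{\Omega^{*}} e^{-\langle x,\, y\rangle}\, \mathrm{d}y,
\qquad x \in \Omega .
```

The paper's contribution is an explicit formula for φ_Ω on a class of cones that includes homogeneous cones and cones associated with chordal graphs.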
Article
Non-Asymptotic Confidence Sets for Circular Means
by Thomas Hotz, Florian Kelma and Johannes Wieditz
Entropy 2016, 18(10), 375; https://doi.org/10.3390/e18100375 - 20 Oct 2016
Cited by 3 | Viewed by 4460
Abstract
The mean of data on the unit circle is defined as the minimizer of the average squared Euclidean distance to the data. Based on Hoeffding’s mass concentration inequalities, non-asymptotic confidence sets for circular means are constructed which are universal in the sense that they require no distributional assumptions. These are then compared with asymptotic confidence sets in simulations and for a real data set. Full article
(This article belongs to the Special Issue Differential Geometrical Theory of Statistics)
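The circular mean defined in the abstract (the minimizer of the average squared Euclidean distance on the unit circle) is simply the direction of the resultant vector whenever that vector is non-zero; a minimal sketch, with made-up angles:

```python
import numpy as np

def circular_mean(angles):
    """Extrinsic circular mean: minimizer of the average squared Euclidean
    distance on the unit circle, i.e. the direction of the resultant vector.
    Not unique when the resultant vector is zero (here: raises)."""
    z = np.exp(1j * np.asarray(angles, dtype=float))
    r = z.mean()
    if np.abs(r) == 0.0:
        raise ValueError("resultant is zero: the circular mean is not unique")
    return float(np.angle(r))

angles = np.radians([350.0, 10.0, 20.0])
print(np.degrees(circular_mean(angles)))  # about 6.7 degrees, not the arithmetic 126.7
```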
Article
From Tools in Symplectic and Poisson Geometry to J.-M. Souriau’s Theories of Statistical Mechanics and Thermodynamics
by Charles-Michel Marle
Entropy 2016, 18(10), 370; https://doi.org/10.3390/e18100370 - 19 Oct 2016
Cited by 34 | Viewed by 5415
Abstract
I present in this paper some tools in symplectic and Poisson geometry in view of their applications in geometric mechanics and mathematical physics. After a short discussion of the Lagrangian and Hamiltonian formalisms, including the use of symmetry groups, and a presentation of Tulczyjew's isomorphisms (which explain some aspects of the relations between these formalisms), I explain the concept of the manifold of motions of a mechanical system and its use, due to J.-M. Souriau, in statistical mechanics and thermodynamics. The generalization of the notion of thermodynamic equilibrium, in which the one-dimensional group of time translations is replaced by a multi-dimensional, possibly non-commutative Lie group, is fully discussed, and examples of applications in physics are given. Full article
(This article belongs to the Special Issue Differential Geometrical Theory of Statistics)
Article
Entropy Minimizing Curves with Application to Flight Path Design and Clustering
by Stéphane Puechmorel and Florence Nicol
Entropy 2016, 18(9), 337; https://doi.org/10.3390/e18090337 - 15 Sep 2016
Cited by 5 | Viewed by 4627
Abstract
Air traffic management (ATM) aims at providing companies with a safe and ideally optimal aircraft trajectory planning. Air traffic controllers act on flight paths in such a way that no pair of aircraft come closer than the regulatory separation norms. With the increase of traffic, it is expected that the system will reach its limits in the near future: a paradigm change in ATM is planned with the introduction of trajectory-based operations. In this context, sets of well-separated flight paths are computed in advance, tremendously reducing the number of unsafe situations that must be dealt with by controllers. Unfortunately, automated tools used to generate such plans generally issue trajectories that do not comply with operational practices or even flight dynamics. In this paper, a means of producing realistic air routes from the output of an automated trajectory design tool is investigated. For that purpose, the entropy of a system of curves is first defined, and a means of iteratively minimizing it is presented. The resulting curves form a route network that is suitable for use in a semi-automated ATM system with a human in the loop. The tool introduced in this work is quite versatile and may also be applied to unsupervised classification of curves: an example is given for French traffic. Full article
(This article belongs to the Special Issue Differential Geometrical Theory of Statistics)
Article
A Proximal Point Algorithm for Minimum Divergence Estimators with Application to Mixture Models
by Diaa Al Mohamad and Michel Broniatowski
Entropy 2016, 18(8), 277; https://doi.org/10.3390/e18080277 - 27 Jul 2016
Cited by 1 | Viewed by 4084
Abstract
Estimators derived from a divergence criterion such as φ-divergences are generally more robust than maximum likelihood ones. We are interested in particular in the so-called minimum dual φ-divergence estimator (MDφDE), an estimator built using a dual representation of φ-divergences. We present in this paper an iterative proximal point algorithm that permits the calculation of such an estimator. The algorithm contains by construction the well-known Expectation Maximization (EM) algorithm. Our work is based on the paper of Tseng on the likelihood function. We provide some convergence properties by adapting the ideas of Tseng. We improve Tseng's results by relaxing the identifiability condition on the proximal term, a condition which is not verified for most mixture models and is hard to verify for "non-mixture" ones. Convergence of the EM algorithm in a two-component Gaussian mixture is discussed in the spirit of our approach. Several experimental results on mixture models are provided to confirm the validity of the approach. Full article
(This article belongs to the Special Issue Differential Geometrical Theory of Statistics)
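The abstract notes that the proximal point algorithm contains the classical EM algorithm by construction. The sketch below is that classical special case for a two-component Gaussian mixture (our own toy implementation, not the MDφDE algorithm itself):

```python
import numpy as np
from scipy.stats import norm

def em_two_gaussians(x, n_iter=100):
    """Classical EM for a two-component univariate Gaussian mixture."""
    x = np.asarray(x, dtype=float)
    w = np.array([0.5, 0.5])
    mu = np.array([x.min(), x.max()])      # crude but non-degenerate initialization
    sd = np.array([x.std(), x.std()])
    for _ in range(n_iter):
        # E-step: posterior responsibility of each component for each point
        dens = w * norm.pdf(x[:, None], mu, sd)
        resp = dens / dens.sum(axis=1, keepdims=True)
        # M-step: weighted maximum likelihood updates
        nk = resp.sum(axis=0)
        w = nk / len(x)
        mu = (resp * x[:, None]).sum(axis=0) / nk
        sd = np.sqrt((resp * (x[:, None] - mu) ** 2).sum(axis=0) / nk)
    return w, mu, sd

rng = np.random.default_rng(0)
x = np.concatenate([rng.normal(-2, 1, 300), rng.normal(3, 0.5, 200)])
print(em_two_gaussians(x))
```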
Article
Link between Lie Group Statistical Mechanics and Thermodynamics of Continua
by Géry De Saxcé
Entropy 2016, 18(7), 254; https://doi.org/10.3390/e18070254 - 12 Jul 2016
Cited by 12 | Viewed by 4558
Abstract
In this work, we consider the value of the momentum map of symplectic mechanics as an affine tensor called the momentum tensor. From this point of view, we analyze the underlying geometric structure of the theories of Lie group statistical mechanics and relativistic thermodynamics of continua, formulated by Souriau independently of each other. We bridge the gap between them in the classical Galilean context. These geometric structures of thermodynamics are rich, and we think they might be a source of inspiration for the geometric theory of information based on the concept of entropy. Full article
(This article belongs to the Special Issue Differential Geometrical Theory of Statistics)
Article
Syntactic Parameters and a Coding Theory Perspective on Entropy and Complexity of Language Families
by Matilde Marcolli
Entropy 2016, 18(4), 110; https://doi.org/10.3390/e18040110 - 07 Apr 2016
Cited by 8 | Viewed by 4593
Abstract
We present a simple computational approach to assigning a measure of complexity and information/entropy to families of natural languages, based on syntactic parameters and the theory of error-correcting codes. We associate to each language a binary string of syntactic parameters, and to a language family a binary code whose code words are the binary strings associated with its languages. We then evaluate the code parameters (rate and relative minimum distance) and the position of the parameters with respect to the asymptotic bound of error-correcting codes and the Gilbert–Varshamov bound. These bounds are, respectively, related to the Kolmogorov complexity and the Shannon entropy of the code, and this gives us a computationally simple way to obtain estimates on the complexity and information, not of individual languages, but of language families. This notion of complexity is related, from the linguistic point of view, to the degree of variability of syntactic parameters across languages belonging to the same (historical) family. Full article
(This article belongs to the Special Issue Differential Geometrical Theory of Statistics)
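The code parameters mentioned in the abstract, the rate and the relative minimum distance, are straightforward to compute once each language is encoded as a fixed-length binary string. The sketch below uses made-up 5-bit parameter strings, not actual syntactic data:

```python
import numpy as np
from itertools import combinations

def code_parameters(codewords):
    """Rate R = log2(|C|)/n and relative minimum distance delta = d_min/n
    of a binary code given as equal-length 0/1 strings (one per language)."""
    words = np.array([[int(b) for b in w] for w in codewords])
    n = words.shape[1]
    rate = np.log2(len(words)) / n
    d_min = min(int(np.sum(a != b)) for a, b in combinations(words, 2))
    return rate, d_min / n

# hypothetical 5-bit parameter strings for a 4-language "family"
family = ["10110", "10010", "11110", "00111"]
print(code_parameters(family))
```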
Article
Riemannian Laplace Distribution on the Space of Symmetric Positive Definite Matrices
by Hatem Hajri, Ioana Ilea, Salem Said, Lionel Bombrun and Yannick Berthoumieu
Entropy 2016, 18(3), 98; https://doi.org/10.3390/e18030098 - 16 Mar 2016
Cited by 11 | Viewed by 5562
Abstract
The Riemannian geometry of the space Pm, of m × m symmetric positive definite matrices, has provided effective tools to the fields of medical imaging, computer vision and radar signal processing. Still, an open challenge remains, which consists of extending these tools to correctly handle the presence of outliers (or abnormal data), arising from excessive noise or faulty measurements. The present paper tackles this challenge by introducing new probability distributions, called Riemannian Laplace distributions on the space Pm. First, it shows that these distributions provide a statistical foundation for the concept of the Riemannian median, which offers improved robustness in dealing with outliers (in comparison to the more popular concept of the Riemannian center of mass). Second, it describes an original expectation-maximization algorithm, for estimating mixtures of Riemannian Laplace distributions. This algorithm is applied to the problem of texture classification, in computer vision, which is considered in the presence of outliers. It is shown to give significantly better performance with respect to other recently-proposed approaches. Full article
(This article belongs to the Special Issue Differential Geometrical Theory of Statistics)
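The Riemannian structure on Pm referred to above is the affine-invariant one, for which the distance between two SPD matrices A and B is the Frobenius norm of log(A^{-1/2} B A^{-1/2}); a minimal numpy sketch (our illustration, not the paper's Laplace density or EM code):

```python
import numpy as np

def _spd_power(a, p):
    """Matrix power of a symmetric positive definite matrix via eigendecomposition."""
    vals, vecs = np.linalg.eigh(a)
    return (vecs * vals**p) @ vecs.T

def spd_distance(a, b):
    """Affine-invariant Riemannian distance on SPD matrices:
    d(A, B) = || log(A^{-1/2} B A^{-1/2}) ||_F."""
    a_isqrt = _spd_power(a, -0.5)
    c = a_isqrt @ b @ a_isqrt
    vals = np.linalg.eigvalsh(c)     # c is SPD, so its eigenvalues are positive
    return float(np.sqrt(np.sum(np.log(vals) ** 2)))

a = np.array([[2.0, 0.5], [0.5, 1.0]])
b = np.eye(2)
print(spd_distance(a, b))
```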