Entropy 2014, 16(7), 4015-4031; doi:10.3390/e16074015
Article

New Riemannian Priors on the Univariate Normal Model

Salem Said *, Lionel Bombrun and Yannick Berthoumieu
Received: 17 April 2014; in revised form: 23 June 2014 / Accepted: 9 July 2014 / Published: 17 July 2014
(This article belongs to the Special Issue Information Geometry)
This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
Abstract: The current paper introduces new prior distributions on the univariate normal model, with the aim of applying them to the classification of univariate normal populations. These new prior distributions are entirely based on the Riemannian geometry of the univariate normal model, so that they can be thought of as “Riemannian priors”. Precisely, if {p_θ ; θ ∈ Θ} is any parametrization of the univariate normal model, the paper considers prior distributions G(θ̄, γ) with hyperparameters θ̄ ∈ Θ and γ > 0, whose density with respect to Riemannian volume is proportional to exp(−d²(θ, θ̄)/2γ²), where d²(θ, θ̄) is the square of Rao’s Riemannian distance. The distributions G(θ̄, γ) are termed Gaussian distributions on the univariate normal model. The motivation for considering a distribution G(θ̄, γ) is that it gives a geometric representation of a class or cluster of univariate normal populations. Indeed, G(θ̄, γ) has a unique mode θ̄ (precisely, θ̄ is the unique Riemannian center of mass of G(θ̄, γ), as shown in the paper), and its dispersion away from θ̄ is given by γ. Therefore, members of the class represented by G(θ̄, γ) can be thought of as centered around θ̄ and lying within a typical distance determined by γ. The paper rigorously defines the Gaussian distributions G(θ̄, γ) and describes an algorithm for computing maximum likelihood estimates of their hyperparameters. Based on this algorithm and on the Laplace approximation, it describes how the distributions G(θ̄, γ) can be used as prior distributions for the Bayesian classification of large univariate normal populations. In a concrete application to texture image classification, this is shown to lead to an improvement in performance over the use of conjugate priors.
Keywords: Fisher information; Riemannian metric; prior distribution; univariate normal distribution; image classification
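To make the quantities named in the abstract concrete, the following is a minimal numerical sketch (not the authors' code). It assumes the standard Fisher–Rao geometry of N(μ, σ²), which is isometric, up to a factor √2, to the Poincaré half-plane via the change of coordinates (μ/√2, σ); the function names rao_distance, gaussian_prior_unnormalised and fit_hyperparameters are illustrative, and the hyperparameter fit is only a crude stand-in for the maximum likelihood algorithm described in the paper.

```python
# Sketch of Rao's distance, the unnormalised density of G(theta_bar, gamma),
# and a rough hyperparameter fit for a cluster of univariate normal populations.
import numpy as np
from scipy.optimize import minimize

def rao_distance(theta1, theta2):
    """Rao's Riemannian distance between univariate normals theta = (mu, sigma)."""
    (m1, s1), (m2, s2) = theta1, theta2
    # Hyperbolic distance in the half-plane model, scaled by sqrt(2) for the Fisher metric.
    arg = 1.0 + ((m1 - m2) ** 2 / 2.0 + (s1 - s2) ** 2) / (2.0 * s1 * s2)
    return np.sqrt(2.0) * np.arccosh(arg)

def gaussian_prior_unnormalised(theta, theta_bar, gamma):
    """Unnormalised density of G(theta_bar, gamma) w.r.t. Riemannian volume."""
    return np.exp(-rao_distance(theta, theta_bar) ** 2 / (2.0 * gamma ** 2))

def fit_hyperparameters(thetas):
    """Crude estimate of (theta_bar, gamma): theta_bar as the Riemannian center of
    mass (minimiser of the sum of squared Rao distances), gamma as the root-mean-square
    Rao distance to it. The paper's own maximum likelihood algorithm is more elaborate."""
    def cost(x):
        mu, log_sigma = x
        return sum(rao_distance((mu, np.exp(log_sigma)), t) ** 2 for t in thetas)
    x0 = np.array([np.mean([t[0] for t in thetas]),
                   np.log(np.mean([t[1] for t in thetas]))])
    res = minimize(cost, x0, method="Nelder-Mead")
    theta_bar = (res.x[0], np.exp(res.x[1]))
    gamma = np.sqrt(np.mean([rao_distance(theta_bar, t) ** 2 for t in thetas]))
    return theta_bar, gamma

# Example: a small cluster of normal populations centered near (mu, sigma) = (0, 1).
cluster = [(0.1, 1.2), (-0.2, 0.9), (0.05, 1.1), (0.0, 1.0)]
theta_bar, gamma = fit_hyperparameters(cluster)
print(theta_bar, gamma)
```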
MDPI and ACS Style

Said, S.; Bombrun, L.; Berthoumieu, Y. New Riemannian Priors on the Univariate Normal Model. Entropy 2014, 16, 4015-4031.

AMA Style

Said S, Bombrun L, Berthoumieu Y. New Riemannian Priors on the Univariate Normal Model. Entropy. 2014; 16(7):4015-4031.

Chicago/Turabian Style

Said, Salem; Bombrun, Lionel; Berthoumieu, Yannick. 2014. "New Riemannian Priors on the Univariate Normal Model." Entropy 16, no. 7: 4015-4031.

Entropy EISSN 1099-4300, Published by MDPI AG, Basel, Switzerland