Open Access Article
Entropy 2015, 17(6), 3989-4027; doi:10.3390/e17063989

Entropy, Information Theory, Information Geometry and Bayesian Inference in Data, Signal and Image Processing and Inverse Problems

Ali Mohammad-Djafari

Laboratoire des Signaux et Systèmes, UMR 8506 CNRS-SUPELEC-UNIV PARIS SUD, SUPELEC, Plateau de Moulon, 3 rue Joliot-Curie, 91192 Gif-sur-Yvette, France
This paper is an extended version of the paper published in Proceedings of the 34th International Workshop on Bayesian Inference and Maximum Entropy Methods in Science and Engineering (MaxEnt 2014), Amboise, France, 21–26 September 2014.
Received: 20 November 2014 / Revised: 4 May 2015 / Accepted: 5 May 2015 / Published: 12 June 2015
(This article belongs to the Special Issue Information, Entropy and Their Geometric Structures)

Abstract

This review article first presents the main inference tools based on Bayes' rule, the maximum entropy principle (MEP), information theory, relative entropy and the Kullback–Leibler (KL) divergence, and Fisher information with its corresponding geometries. For each of these tools, the precise context of its use is described. The second part of the paper focuses on the ways these tools have been used in data, signal and image processing and in the inverse problems that arise in different physical sciences and engineering applications. A few example applications are described: entropy in independent component analysis (ICA) and in blind source separation, Fisher information in data model selection, different maximum entropy-based methods in time series spectral estimation and in linear inverse problems and, finally, Bayesian inference for general inverse problems. Some original material concerning approximate Bayesian computation (ABC) and, in particular, variational Bayesian approximation (VBA) methods is also presented. VBA is used to propose an alternative Bayesian computational tool to the classical Markov chain Monte Carlo (MCMC) methods. We will also see that VBA encompasses joint maximum a posteriori (MAP) estimation, as well as the different expectation-maximization (EM) algorithms, as particular cases.
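As a concrete illustration of the VBA idea summarized above, the sketch below applies a factorized (mean-field) approximation q(mu, tau) = q(mu) q(tau) to the textbook problem of inferring the mean and precision of i.i.d. Gaussian data. The model, priors, variable names and update equations are the standard conjugate Gaussian-Gamma example, given here only as an illustrative assumption; they are not taken from the paper itself.

# Minimal VBA sketch (assumed toy model, not the paper's): infer the mean mu
# and precision tau of i.i.d. Gaussian data with a factorized posterior
# q(mu, tau) = q(mu) q(tau), using the standard conjugate Gaussian-Gamma
# coordinate-ascent updates.
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(loc=2.0, scale=0.5, size=100)   # synthetic data
N, xbar, sum_x2 = len(x), x.mean(), np.sum(x**2)

# Broad priors: mu ~ N(mu0, (lam0*tau)^-1), tau ~ Gamma(a0, b0)
mu0, lam0, a0, b0 = 0.0, 1e-3, 1e-3, 1e-3

E_tau = 1.0  # initial guess for E[tau]
for _ in range(50):
    # Update q(mu) = N(mu_N, 1/lam_N), holding q(tau) fixed
    mu_N = (lam0 * mu0 + N * xbar) / (lam0 + N)
    lam_N = (lam0 + N) * E_tau
    E_mu, E_mu2 = mu_N, mu_N**2 + 1.0 / lam_N

    # Update q(tau) = Gamma(a_N, b_N), holding q(mu) fixed
    a_N = a0 + 0.5 * (N + 1)
    b_N = b0 + 0.5 * (sum_x2 - 2 * E_mu * N * xbar + N * E_mu2
                      + lam0 * (E_mu2 - 2 * mu0 * E_mu + mu0**2))
    E_tau = a_N / b_N

print(f"posterior mean of mu ~ {mu_N:.3f}, of tau ~ {E_tau:.3f} "
      f"(true values: 2.0 and {1 / 0.5**2:.1f})")

Each update optimizes one factor while the other is held fixed, which monotonically tightens a lower bound on the evidence (equivalently, decreases the KL divergence from q to the true posterior). This deterministic, convergent behavior is what makes VBA attractive as an alternative to MCMC sampling.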
Keywords: Bayes; Laplace; entropy; Bayesian inference; maximum entropy principle; information theory; Kullback–Leibler divergence; Fisher information; geometrical science of information; inverse problems
This is an open access article distributed under the Creative Commons Attribution License (CC BY 4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.


Share & Cite This Article

MDPI and ACS Style

Mohammad-Djafari, A. Entropy, Information Theory, Information Geometry and Bayesian Inference in Data, Signal and Image Processing and Inverse Problems. Entropy 2015, 17, 3989-4027.

