Reprint

Information, Entropy and Their Geometric Structures

Edited by
September 2015
552 pages
  • ISBN 978-3-03842-103-0 (Hardback)
  • ISBN 978-3-03842-104-7 (PDF)

This book is a reprint of the Special Issue Information, Entropy and Their Geometric Structures that was published in Entropy.

Subjects
  • Chemistry & Materials Science
  • Computer Science & Mathematics
  • Physical Sciences
Format
  • Hardback
License
© 2015 MDPI; under the CC BY-NC-ND license
Keywords
Koszul-Vinberg characteristic function; Koszul forms; Koszul entropy; temperature vector; covariant thermodynamics; Souriau-Gibbs equilibrium state; Shannon’s formula; Hartley’s rule; additive noise channel; differential entropy; channel capacity; signal-to-noise ratio; pulse-amplitude modulation (PAM); additive white Gaussian noise (AWGN) channel; uniform noise channel; characteristic function; uniform B-spline function; uniform sum distribution; central limit theorem; mechanics; thermodynamics; potentials; Lagrange’s equations; Joseph John Thomson; Pierre Duhem; black-box; optimization; geodesics; Gaussian; information geometry; natural gradient; Noether; learning rate; IGO; xNES; metamorphic systems; entropy; Bernoulli approximation; homology measures; Kähler manifold; information geometry; Bayesian prediction; superharmonic prior; Kähler manifold; information geometry; cepstrum; time series model; Bayesian prediction; superharmonic prior; Fisher metric; probability measure; geodesic; Busemann function; barycenter; Lie group; Lie algebra; statistics; pseudo-Riemannian; Shannon information; homology theory; entropy; quantum information; homotopy of links; mutual informations; Kullback–Leibler divergence; trees; monads; partitions; prior probabilities; hyperplanes; geometrical probability; neural networks; censored observations; non-parametric maximum likelihood; constrained MaxEnt; regularization; Bayes; Laplace; entropy; Bayesian inference; maximum entropy principle; information theory; Kullback–Leibler divergence; Fisher information; geometrical science of information; inverse problems; information geometry; stochastic relaxation; natural gradient flow; expectation parameters; toric models; α-embedding; monotone embedding; conjugate embedding; generalized Fisher–Rao metric; Amari–Chentsov tensor; deformed logarithm; representation duality; (ρ,τ)-geometry; regression analysis; information geometry; geodesic distance; scaling laws; nuclear fusion; multi-task learning; Itakura–Saito distance; pseudo model; un-normalized model