Reprint

The Statistical Foundations of Entropy

Edited by
April 2022
182 pages
  • ISBN 978-3-0365-3557-9 (Hardback)
  • ISBN 978-3-0365-3558-6 (PDF)

This book is a reprint of the Special Issue The Statistical Foundations of Entropy that was published in

Chemistry & Materials Science
Computer Science & Mathematics
Physical Sciences
Summary

In the last two decades, the understanding of complex dynamical systems has undergone important conceptual shifts. The catalyst was the infusion of new ideas from the theory of critical phenomena (scaling laws, the renormalization group, etc.), (multi)fractals and trees, random matrix theory, network theory, and non-Shannonian information theory. The usual Boltzmann–Gibbs statistics proved grossly inadequate in this context: while successful in describing stationary systems characterized by ergodicity or metric transitivity, they fail to reproduce the complex statistical behavior of many real-world systems in biology, astrophysics, geology, and the economic and social sciences. The aim of this Special Issue was to extend the state of the art with original contributions to the ongoing discussion on the statistical foundations of entropy, with particular emphasis on non-conventional entropies that go significantly beyond the Boltzmann, Gibbs, and Shannon paradigms. The accepted contributions addressed information-theoretic, thermodynamic, and quantum aspects of complex systems, and presented several important applications of generalized entropies in a variety of systems.
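As a concrete illustration of the "non-conventional entropies" mentioned above (this sketch is not drawn from the book itself), the Rényi and Tsallis entropies generalize the Shannon entropy through a parameter and recover it in the limit as that parameter tends to 1. A minimal Python sketch, using the standard textbook definitions:

```python
import math

def shannon_entropy(p):
    """Shannon entropy H = -sum_i p_i ln p_i (in nats)."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

def renyi_entropy(p, alpha):
    """Rényi entropy H_alpha = ln(sum_i p_i^alpha) / (1 - alpha), alpha != 1."""
    return math.log(sum(pi ** alpha for pi in p)) / (1 - alpha)

def tsallis_entropy(p, q):
    """Tsallis entropy S_q = (1 - sum_i p_i^q) / (q - 1), q != 1."""
    return (1 - sum(pi ** q for pi in p)) / (q - 1)

p = [0.5, 0.25, 0.125, 0.125]

# Both generalized entropies approach the Shannon entropy as the
# deformation parameter approaches 1:
print(shannon_entropy(p))
print(renyi_entropy(p, 1.001))
print(tsallis_entropy(p, 1.001))
```

For parameters far from 1, the two families weight rare and frequent events differently, which is what makes them useful for the non-ergodic, heavy-tailed systems discussed in this volume.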

Format
  • Hardback
License
© 2022 by the authors; CC BY-NC-ND license
Keywords
ecological inference; generalized cross entropy; distributional weighted regression; matrix adjustment; entropy; critical phenomena; renormalization; multiscale thermodynamics; GENERIC; non-Newtonian calculus; non-Diophantine arithmetic; Kolmogorov–Nagumo averages; escort probabilities; generalized entropies; maximum entropy principle; MaxEnt distribution; calibration invariance; Lagrange multipliers; entropy; generalized Bilal distribution; adaptive Type-II progressive hybrid censoring scheme; maximum likelihood estimation; Bayesian estimation; Lindley’s approximation; confidence interval; Markov chain Monte Carlo method; Rényi entropy; Tsallis entropy; entropic uncertainty relations; quantum metrology; non-equilibrium thermodynamics; entropy; variational entropy; Rényi entropy; Tsallis entropy; Landsberg–Vedral entropy; Gaussian entropy; Sharma–Mittal entropy; α-mutual information; α-channel capacity; maximum entropy; Bayesian inference; updating probabilities