
The Statistical Foundations of Entropy

A special issue of Entropy (ISSN 1099-4300). This special issue belongs to the section "Statistical Physics".

Deadline for manuscript submissions: closed (19 May 2021) | Viewed by 25199

Printed Edition Available!
A printed edition of this Special Issue is available here.

Special Issue Editors

Dr. Petr Jizba
Faculty of Nuclear Sciences and Physical Engineering, Czech Technical University in Prague, Prague, Czech Republic
Interests: non-equilibrium thermodynamics; stochastic processes; superstatistics; (multi)fractal data analysis; transfer entropies; black-swan processes; Feynman's path integral; coherent states; critical phenomena
Dr. Jan Korbel
Department of Science for Complex Systems, Medical University of Vienna & CSH Associate Faculty, 1080 Vienna, Austria
Interests: complex systems; econophysics; stochastic thermodynamics; theory of information; networks; generalized entropies

Special Issue Information

Dear Colleagues,

During the last two decades, the understanding of complex dynamical systems has undergone important conceptual shifts. The catalyst was the infusion of new ideas from the theory of critical phenomena (scaling laws, renormalization group, etc.), (multi)fractals and trees, random matrix theory, network theory, and non-Shannonian information theory. On the other hand, the usual Boltzmann–Gibbs statistics have proven to be grossly inadequate in this context. While successful in describing stationary systems characterized by ergodicity or metric transitivity, Boltzmann–Gibbs statistics fail to reproduce the complex statistical behavior of many real-world systems in biology, astrophysics, geology, and the economic and social sciences.

The aim of this Special Issue is to encourage researchers to present original work that could contribute to the ongoing discussion on the statistical foundations of entropy, with particular emphasis on non-conventional entropies that go significantly beyond the Boltzmann, Gibbs, and Shannon paradigms. Expected contributions should address, on the one hand, purely conceptual issues ranging from non-equilibrium statistical physics and (quantum) thermodynamics to information and estimation theory, and, on the other hand, applications, e.g., in complex dynamical systems, micromechanics, network structures, or stochastic thermodynamics.

Dr. Petr Jizba
Dr. Jan Korbel
Guest Editors

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, click here to go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Entropy is an international peer-reviewed open access monthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2600 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • Generalized entropies
  • Non-equilibrium processes
  • Generalizations of statistical mechanics
  • Quantum systems
  • Information-theoretic entropies
  • Axiomatic approaches

Published Papers (10 papers)


Editorial


3 pages, 175 KiB  
Editorial
The Statistical Foundations of Entropy
by Petr Jizba and Jan Korbel
Entropy 2021, 23(10), 1367; https://doi.org/10.3390/e23101367 - 19 Oct 2021
Viewed by 2234
Abstract
During the last few decades, the notion of entropy has become omnipresent in many scientific disciplines, ranging from traditional applications in statistical physics and chemistry, information theory, and statistical estimation to more recent applications in biology, astrophysics, geology, financial markets, or social networks [...]

Research


20 pages, 321 KiB  
Article
Entropy, Information, and the Updating of Probabilities
by Ariel Caticha
Entropy 2021, 23(7), 895; https://doi.org/10.3390/e23070895 - 14 Jul 2021
Cited by 7 | Viewed by 2540
Abstract
This paper is a review of a particular approach to the method of maximum entropy as a general framework for inference. The discussion emphasizes pragmatic elements in the derivation. An epistemic notion of information is defined in terms of its relation to the Bayesian beliefs of ideally rational agents. The method of updating from a prior to a posterior probability distribution is designed through an eliminative induction process. The logarithmic relative entropy is singled out as a unique tool for updating (a) that is of universal applicability, (b) that recognizes the value of prior information, and (c) that recognizes the privileged role played by the notion of independence in science. The resulting framework—the ME method—can handle arbitrary priors and arbitrary constraints. It includes the MaxEnt and Bayes’ rules as special cases and, therefore, unifies entropic and Bayesian methods into a single general inference scheme. The ME method goes beyond the mere selection of a single posterior, and also addresses the question of how much less probable other distributions might be, which provides a direct bridge to the theories of fluctuations and large deviations.
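For readers who want to see the updating rule in action, here is a minimal numerical sketch (our illustration, not code from the paper): on a discrete space, maximizing the relative entropy −Σᵢ pᵢ ln(pᵢ/qᵢ) subject to normalization and a single expectation constraint ⟨f⟩ = F yields a posterior of the form pᵢ ∝ qᵢ exp(−β fᵢ), with β fixed by the constraint.

```python
# Minimal sketch of entropic updating on a discrete space (illustration
# only): the posterior p maximizes -sum_i p_i ln(p_i / q_i) subject to
# normalization and sum_i p_i f_i = F.  The maximizer is
# p_i ∝ q_i exp(-beta * f_i), with beta fixed by the constraint.
import numpy as np
from scipy.optimize import brentq

def entropic_update(q, f, F):
    """Update prior q to the MaxEnt posterior with <f> = F."""
    def constraint_gap(beta):
        w = q * np.exp(-beta * f)
        p = w / w.sum()
        return p @ f - F
    beta = brentq(constraint_gap, -50.0, 50.0)  # assumes a root is bracketed
    w = q * np.exp(-beta * f)
    return w / w.sum()

# Example: update a uniform prior on {0, ..., 5} to have mean 1.5
q = np.full(6, 1 / 6)
f = np.arange(6.0)
p = entropic_update(q, f, 1.5)
print(p, p @ f)  # posterior and its constrained mean
```

Note that an unconstrained update (β = 0) returns the prior unchanged, which is exactly the sense in which the method "recognizes the value of prior information".
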
23 pages, 730 KiB  
Article
On the α-q-Mutual Information and the α-q-Capacities
by Velimir M. Ilić and Ivan B. Djordjević
Entropy 2021, 23(6), 702; https://doi.org/10.3390/e23060702 - 01 Jun 2021
Cited by 2 | Viewed by 2537
Abstract
The measures of information transfer which correspond to non-additive entropies have intensively been studied in previous decades. The majority of the work includes the ones belonging to the Sharma–Mittal entropy class, such as the Rényi, the Tsallis, the Landsberg–Vedral and the Gaussian entropies. All of the considerations follow the same approach, mimicking some of the various and mutually equivalent definitions of Shannon information measures, and the information transfer is quantified by an appropriately defined measure of mutual information, while the maximal information transfer is considered as a generalized channel capacity. However, all of the previous approaches fail to satisfy at least one of the ineluctable properties which a measure of (maximal) information transfer should satisfy, leading to counterintuitive conclusions and predicting nonphysical behavior even in the case of very simple communication channels. This paper fills the gap by proposing two parameter measures named the α-q-mutual information and the α-q-capacity. In addition to standard Shannon approaches, special cases of these measures include the α-mutual information and the α-capacity, which are well established in the information theory literature as measures of additive Rényi information transfer, while the cases of the Tsallis, the Landsberg–Vedral and the Gaussian entropies can also be accessed by special choices of the parameters α and q. It is shown that, unlike the previous definition, the α-q-mutual information and the α-q-capacity satisfy the set of properties, which are stated as axioms, by which they reduce to zero in the case of totally destructive channels and to the (maximal) input Sharma–Mittal entropy in the case of perfect transmission, which is consistent with the maximum likelihood detection error. In addition, they are non-negative and less than or equal to the input and the output Sharma–Mittal entropies, in general. Thus, unlike the previous approaches, the proposed (maximal) information transfer measures do not manifest nonphysical behaviors such as sub-capacitance or super-capacitance, which could qualify them as appropriate measures of the Sharma–Mittal information transfer.
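For orientation, the Sharma–Mittal family referred to above can be written, in one common convention (our notation, not necessarily the paper's), as

```latex
\[
S_{\alpha,q}(p) \;=\; \frac{1}{1-q}\left[\Bigl(\sum_i p_i^{\alpha}\Bigr)^{\frac{1-q}{1-\alpha}} - 1\right].
\]
```

The Rényi entropy is recovered in the limit q → 1, the Tsallis entropy at q = α, and the Shannon entropy when both parameters tend to 1, which is why special choices of α and q give access to all the entropy classes mentioned in the abstract.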

16 pages, 318 KiB  
Article
Classical and Quantum H-Theorem Revisited: Variational Entropy and Relaxation Processes
by Carlos Medel-Portugal, Juan Manuel Solano-Altamirano and José Luis E. Carrillo-Estrada
Entropy 2021, 23(3), 366; https://doi.org/10.3390/e23030366 - 19 Mar 2021
Cited by 2 | Viewed by 2350
Abstract
We propose a novel framework to describe the time-evolution of dilute classical and quantum gases, initially out of equilibrium and with spatial inhomogeneities, towards equilibrium. Briefly, we divide the system into small cells and consider the local equilibrium hypothesis. We subsequently define a global functional that is the sum of cell H-functionals. Each cell functional recovers the corresponding Maxwell–Boltzmann, Fermi–Dirac, or Bose–Einstein distribution function, depending on the classical or quantum nature of the gas. The time-evolution of the system is described by the relationship dH/dt ≤ 0, and the equality condition occurs if the system is in the equilibrium state. Via the variational method, proof of the previous relationship, which might be an extension of the H-theorem for inhomogeneous systems, is presented for both classical and quantum gases. Furthermore, the H-functionals are in agreement with the correspondence principle. We discuss how the H-functionals can be identified with the system’s entropy and analyze the relaxation processes of out-of-equilibrium systems.
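For context, the classical single-cell objects behind this construction are the standard Boltzmann H-functional and its monotonicity (textbook forms; the paper's cell-summed variational functional generalizes these to inhomogeneous systems):

```latex
\[
H(t) \;=\; \int f(\mathbf{v},t)\,\ln f(\mathbf{v},t)\,\mathrm{d}^3 v,
\qquad
\frac{\mathrm{d}H}{\mathrm{d}t} \;\le\; 0,
\]
```

with equality precisely at the equilibrium (Maxwell–Boltzmann) distribution.
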
24 pages, 625 KiB  
Article
From Rényi Entropy Power to Information Scan of Quantum States
by Petr Jizba, Jacob Dunningham and Martin Prokš
Entropy 2021, 23(3), 334; https://doi.org/10.3390/e23030334 - 12 Mar 2021
Cited by 3 | Viewed by 2870
Abstract
In this paper, we generalize the notion of Shannon’s entropy power to the Rényi-entropy setting. With this, we propose generalizations of the de Bruijn identity, isoperimetric inequality, or Stam inequality. This framework not only allows for finding new estimation inequalities, but it also provides a convenient technical framework for the derivation of a one-parameter family of Rényi-entropy-power-based quantum-mechanical uncertainty relations. To illustrate the usefulness of the Rényi entropy power obtained, we show how the information probability distribution associated with a quantum state can be reconstructed in a process that is akin to quantum-state tomography. We illustrate the inner workings of this with the so-called “cat states”, which are of fundamental interest and practical use in schemes such as quantum metrology. Salient issues, including the extension of the notion of entropy power to Tsallis entropy and ensuing implications in estimation theory, are also briefly discussed.
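A small numerical illustration of the starting point (our sketch; the paper's Rényi generalization uses its own normalization): Shannon's entropy power N(X) = exp(2h(X))/(2πe) returns the variance of a Gaussian, and one can probe what happens when the Rényi entropy h_α is substituted for h.

```python
# Illustration only: Shannon's entropy power equals the variance for a
# Gaussian.  Below we check this and evaluate the analogous quantity with
# the Rényi entropy substituted in (the paper develops the proper
# Rényi generalization and the ensuing uncertainty relations).
import numpy as np

def renyi_entropy_gaussian(sigma, alpha):
    """Rényi entropy of N(0, sigma^2); alpha -> 1 recovers Shannon."""
    if np.isclose(alpha, 1.0):
        return 0.5 * np.log(2 * np.pi * np.e * sigma**2)
    return 0.5 * np.log(2 * np.pi * sigma**2) + np.log(alpha) / (2 * (alpha - 1))

def naive_power(h):
    # Shannon-form entropy power, applied to whichever entropy is passed in
    return np.exp(2 * h) / (2 * np.pi * np.e)

sigma = 1.7
print(naive_power(renyi_entropy_gaussian(sigma, 1.0)))  # ~ sigma**2 = 2.89
print(naive_power(renyi_entropy_gaussian(sigma, 2.0)))  # Rényi-2 analogue
```
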

24 pages, 1057 KiB  
Article
Estimation for Entropy and Parameters of Generalized Bilal Distribution under Adaptive Type II Progressive Hybrid Censoring Scheme
by Xiaolin Shi, Yimin Shi and Kuang Zhou
Entropy 2021, 23(2), 206; https://doi.org/10.3390/e23020206 - 08 Feb 2021
Cited by 6 | Viewed by 1454
Abstract
Entropy measures the uncertainty associated with a random variable. It has important applications in cybernetics, probability theory, astrophysics, life sciences and other fields. Recently, many authors have focused on the estimation of entropy for different lifetime distributions. However, the estimation of entropy for the generalized Bilal (GB) distribution has not yet been studied. In this paper, we consider the estimation of the entropy and the parameters of the GB distribution based on adaptive Type-II progressive hybrid censored data. Maximum likelihood estimates of the entropy and the parameters are obtained using the Newton–Raphson iteration method. Bayesian estimates under different loss functions are provided with the help of Lindley’s approximation. The approximate confidence intervals and the Bayesian credible intervals of the parameters and entropy are obtained using the delta and Markov chain Monte Carlo (MCMC) methods, respectively. Monte Carlo simulation studies are carried out to assess the performance of the different point and interval estimates. Finally, a real data set is analyzed for illustrative purposes.
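The estimation pattern described in the abstract can be sketched compactly (a stand-in illustration: the generalized Bilal density is not reproduced here, so an exponential lifetime model is used, and the censoring scheme is simplified; only the log-density and log-survival terms would change for the GB model).

```python
# Hedged sketch of censored maximum likelihood via Newton-Raphson, with a
# delta-method confidence interval.  Exponential lifetimes stand in for
# the GB model; the censoring below is a simplified Type-II-like scheme.
import numpy as np

rng = np.random.default_rng(1)
lam_true = 0.5
x = rng.exponential(1 / lam_true, size=100)
c = np.quantile(x, 0.7)                      # censor the largest 30%
obs = x[x <= c]
cen = np.full((x > c).sum(), c)              # censored units, recorded at c

def score_and_hess(lam):
    # d(loglik)/dlam and d2(loglik)/dlam2 for censored exponential data
    d, T = len(obs), obs.sum() + cen.sum()
    return d / lam - T, -d / lam**2

lam = 0.1                                    # start inside the convergence basin
for _ in range(25):                          # Newton-Raphson iterations
    s, h = score_and_hess(lam)
    lam -= s / h

se = np.sqrt(-1 / score_and_hess(lam)[1])    # delta-method standard error
print(f"MLE={lam:.3f}, 95% CI=({lam - 1.96 * se:.3f}, {lam + 1.96 * se:.3f})")
```
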

10 pages, 262 KiB  
Article
Calibration Invariance of the MaxEnt Distribution in the Maximum Entropy Principle
by Jan Korbel
Entropy 2021, 23(1), 96; https://doi.org/10.3390/e23010096 - 11 Jan 2021
Cited by 9 | Viewed by 2459
Abstract
The maximum entropy principle consists of two steps: The first step is to find the distribution which maximizes entropy under given constraints. The second step is to calculate the corresponding thermodynamic quantities. The second part is determined by the Lagrange multipliers’ relation to measurable physical quantities such as temperature or Helmholtz free energy/free entropy. We show that for a given MaxEnt distribution, a whole class of entropies and constraints leads to the same distribution but generally different thermodynamics. Two simple classes of transformations that preserve the MaxEnt distributions are studied: The first case is a transform of the entropy to an arbitrary increasing function of that entropy. The second case is the transform of the energetic constraint to a combination of the normalization and energetic constraints. We derive group transformations of the Lagrange multipliers corresponding to these transformations and determine their connections to thermodynamic quantities. For each case, we provide a simple example of this transformation.
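The first invariance class admits a one-line justification. Maximizing f(S) − α Σᵢ pᵢ − β Σᵢ εᵢ pᵢ with an increasing f gives the stationarity condition

```latex
\[
f'(S)\,\frac{\partial S}{\partial p_i} \;=\; \alpha + \beta\,\epsilon_i
\quad\Longleftrightarrow\quad
\frac{\partial S}{\partial p_i} \;=\; \tilde{\alpha} + \tilde{\beta}\,\epsilon_i,
\qquad
\tilde{\alpha} = \frac{\alpha}{f'(S)}, \quad
\tilde{\beta} = \frac{\beta}{f'(S)},
\]
```

i.e. the same condition as for S itself, only with rescaled Lagrange multipliers: hence the identical MaxEnt distribution but generally different thermodynamic relations.
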
20 pages, 334 KiB  
Article
Unifying Aspects of Generalized Calculus
by Marek Czachor
Entropy 2020, 22(10), 1180; https://doi.org/10.3390/e22101180 - 19 Oct 2020
Cited by 13 | Viewed by 3058
Abstract
Non-Newtonian calculus naturally unifies various ideas that have occurred over the years in the field of generalized thermostatistics, or in the borderland between classical and quantum information theory. The formalism, being very general, is as simple as the calculus we know from undergraduate courses of mathematics. Its theoretical potential is huge, and yet it remains unknown or unappreciated.
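A toy instance of the bijection-based calculus (our choice of bijection, not an example taken from the paper): a bijection φ induces deformed arithmetic and a deformed derivative by conjugation, and φ = exp on both domain and codomain reproduces the bigeometric calculus.

```python
# Toy non-Newtonian calculus: a bijection phi induces deformed arithmetic
# x (+) y = phi(phi_inv(x) + phi_inv(y)) and a deformed derivative obtained
# by conjugating the ordinary derivative with phi on domain and codomain.
import numpy as np

phi, phi_inv = np.exp, np.log          # bijection R -> (0, inf)

def oplus(x, y):                       # deformed addition
    return phi(phi_inv(x) + phi_inv(y))

def D(f, x, h=1e-6):
    """Deformed derivative: ordinary difference quotient under phi."""
    g = lambda r: phi_inv(f(phi(r)))   # pull f back to ordinary calculus
    r = phi_inv(x)
    return phi((g(r + h) - g(r)) / h)

# With phi = exp, deformed addition is ordinary multiplication:
print(oplus(2.0, 3.0))                 # 6.0
# and the bigeometric derivative of the identity is the constant e:
print(D(lambda x: x, 5.0))             # ~ 2.718...
```
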

19 pages, 360 KiB  
Article
Dynamic and Renormalization-Group Extensions of the Landau Theory of Critical Phenomena
by Miroslav Grmela, Václav Klika and Michal Pavelka
Entropy 2020, 22(9), 978; https://doi.org/10.3390/e22090978 - 02 Sep 2020
Cited by 4 | Viewed by 2041
Abstract
We place the Landau theory of critical phenomena into the larger context of multiscale thermodynamics. The thermodynamic potentials with which the Landau theory begins arise as Lyapunov-like functions in the investigation of the relations among different levels of description. By seeing the renormalization-group approach to critical phenomena as an expression of the inseparability of levels at the critical point, we can adopt the renormalization-group viewpoint into the Landau theory and, by doing so, bring its predictions closer to the results of experimental observations.
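For orientation, the kind of potential "with which the Landau theory begins" is the standard textbook expansion in the order parameter m:

```latex
\[
F(m;T) \;=\; F_0 + \tfrac{a}{2}\,(T - T_c)\,m^2 + \tfrac{b}{4}\,m^4,
\qquad a, b > 0,
\]
```

whose minimization gives m = 0 above T_c and m = ±√(a(T_c − T)/b) below it, i.e. the mean-field exponent β = 1/2 that renormalization-group corrections subsequently improve.
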

12 pages, 260 KiB  
Article
Entropy-Based Solutions for Ecological Inference Problems: A Composite Estimator
by Rosa Bernardini Papalia and Esteban Fernandez Vazquez
Entropy 2020, 22(7), 781; https://doi.org/10.3390/e22070781 - 17 Jul 2020
Cited by 4 | Viewed by 1758
Abstract
Information-based estimation techniques are becoming more popular in the field of Ecological Inference. Within this branch of estimation techniques, two alternative approaches can be pointed out. The first is the Generalized Maximum Entropy (GME) approach, based on a matrix adjustment problem where the only observable information is given by the margins of the target matrix. An alternative approach is based on a distributionally weighted regression (DWR) equation. These two approaches have so far been studied as completely separate streams, even though there are clear connections between them. In this paper, we present these connections explicitly. More specifically, we show that under certain conditions the generalized cross-entropy (GCE) solution for a matrix adjustment problem and the GME estimator of a DWR equation differ only in terms of the a priori information considered. We then move a step forward and propose a composite estimator that combines the two priors considered in both approaches. Finally, we present a numerical experiment and an empirical application based on Spanish data for the year 2010.
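The matrix-adjustment side of the GCE problem can be sketched compactly (our illustration, with made-up numbers): minimizing the cross-entropy to a prior matrix Q subject to known row and column margins is exactly the problem solved by iterative proportional fitting.

```python
# Minimal sketch of the GCE matrix-adjustment problem: recover a matrix
# from its row and column margins by minimizing cross-entropy to a prior
# Q.  Iterative proportional fitting (IPF) computes this solution; the
# prior and margins below are invented for illustration.
import numpy as np

def cross_entropy_adjust(Q, row, col, iters=200):
    P = Q.astype(float).copy()
    for _ in range(iters):
        P *= (row / P.sum(axis=1))[:, None]   # rescale to match row margins
        P *= (col / P.sum(axis=0))[None, :]   # rescale to match column margins
    return P

Q = np.array([[0.2, 0.2], [0.3, 0.3]])        # a priori structure
row = np.array([0.4, 0.6])                    # observed row margins
col = np.array([0.5, 0.5])                    # observed column margins
P = cross_entropy_adjust(Q, row, col)
print(P, P.sum(axis=1), P.sum(axis=0))
```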