Facets of Entropy - Papers presented at the workshop in Copenhagen (24-26 October 2007)

A special issue of Entropy (ISSN 1099-4300).

Deadline for manuscript submissions: closed (28 February 2008) | Viewed by 64509

Special Issue Editor


Guest Editor
Copenhagen Business College, Rønne Alle 1, st., 2860 Søborg, Denmark
Interests: cause and effect; entropy; exponential families; graphical models; information divergence; minimum description length; quantum information; statistical mechanics

Published Papers (6 papers)


Research

Article
Axiomatic Characterizations of Information Measures
by Imre Csiszár
Entropy 2008, 10(3), 261-273; https://doi.org/10.3390/e10030261 - 19 Sep 2008
Cited by 202 | Viewed by 13979
Abstract
Axiomatic characterizations of Shannon entropy, Kullback I-divergence, and some generalized information measures are surveyed. Three directions are treated: (A) Characterization of functions of probability distributions suitable as information measures. (B) Characterization of set functions on the subsets of {1, …, N} representable by joint entropies of components of an N-dimensional random vector. (C) Axiomatic characterization of MaxEnt and related inference rules. The paper concludes with a brief discussion of the relevance of the axiomatic approach for information theory.
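As a quick illustration of the two measures named in the abstract, a minimal Python sketch (not part of the paper; the function names are for illustration only), computing Shannon entropy and Kullback I-divergence in nats:

```python
import math

def shannon_entropy(p):
    """Shannon entropy H(p) = -sum_i p_i log p_i, in nats.
    Terms with p_i = 0 contribute 0 by convention."""
    return -sum(x * math.log(x) for x in p if x > 0)

def kl_divergence(p, q):
    """Kullback I-divergence D(p || q) = sum_i p_i log(p_i / q_i).
    Assumes q_i > 0 wherever p_i > 0 (absolute continuity)."""
    return sum(x * math.log(x / y) for x, y in zip(p, q) if x > 0)
```

For a fair coin, `shannon_entropy([0.5, 0.5])` gives ln 2, and the divergence of any distribution from itself is 0, consistent with the axiomatic properties these measures are characterized by.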
Article
Deformed Generalization of the Semiclassical Entropy
by Gustavo Ferri, Fernando Olivares, Flavia Pennini, Angel Plastino, Anel R. Plastino and Montserrat Casas
Entropy 2008, 10(3), 240-247; https://doi.org/10.3390/e10030240 - 19 Sep 2008
Cited by 1 | Viewed by 8829 | Correction
Abstract
We explicitly obtain here a novel expression for the semiclassical Wehrl entropy using deformed algebras built up with the q-coherent states (see Arik and Coon [J. Math. Phys. 17, 524 (1976)] and Quesne [J. Phys. A 35, 9213 (2002)]). The generalization is investigated with emphasis on (i) its behavior as a function of temperature and (ii) the results obtained when the deformation parameter tends to unity.
Figure 1

Article
Modeling Non-Equilibrium Dynamics of a Discrete Probability Distribution: General Rate Equation for Maximal Entropy Generation in a Maximum-Entropy Landscape with Time-Dependent Constraints
by Gian Paolo Beretta
Entropy 2008, 10(3), 160-182; https://doi.org/10.3390/entropy-e10030160 - 14 Aug 2008
Cited by 27 | Viewed by 9015
Abstract
A rate equation for a discrete probability distribution is discussed as a route to describe smooth relaxation towards the maximum entropy distribution compatible at all times with one or more linear constraints. The resulting dynamics follows the path of steepest entropy ascent compatible with the constraints. The rate equation is consistent with the Onsager theorem of reciprocity and the fluctuation-dissipation theorem. The mathematical formalism was originally developed to obtain a quantum theoretical unification of mechanics and thermodynamics. It is presented here in a general, non-quantal formulation as a part of an effort to develop tools for the phenomenological treatment of non-equilibrium problems with applications in engineering, biology, sociology, and economics. The rate equation is also extended to include the case of assigned time-dependences of the constraints and the entropy, such as for modeling non-equilibrium energy and entropy exchanges. Full article
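The equilibrium endpoint of the relaxation described above — the maximum-entropy distribution compatible with a linear constraint — can be illustrated with a minimal sketch (not code from the paper; the function name and bisection bracket are assumptions). For a fixed mean energy, the maximizer is the canonical form p_i ∝ exp(-β e_i), with β fixed by the constraint:

```python
import math

def maxent_distribution(energies, mean_energy, tol=1e-10):
    """Maximum-entropy distribution with a prescribed mean energy:
    p_i proportional to exp(-beta * e_i), with beta found by bisection
    so that sum_i p_i * e_i = mean_energy.
    Assumes min(energies) < mean_energy < max(energies)."""
    def mean_at(beta):
        w = [math.exp(-beta * e) for e in energies]
        z = sum(w)
        return sum(wi * e for wi, e in zip(w, energies)) / z

    lo, hi = -50.0, 50.0  # bracketing beta; mean_at is decreasing in beta
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if mean_at(mid) > mean_energy:
            lo = mid  # mean still too high: need larger beta
        else:
            hi = mid
    beta = 0.5 * (lo + hi)
    w = [math.exp(-beta * e) for e in energies]
    z = sum(w)
    return [wi / z for wi in w]
```

With a mean energy exactly halfway between two levels, β = 0 and the distribution is uniform; the paper's rate equation describes a smooth path toward distributions of this form.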
Article
Generalised Exponential Families and Associated Entropy Functions
by Jan Naudts
Entropy 2008, 10(3), 131-149; https://doi.org/10.3390/entropy-e10030131 - 16 Jul 2008
Cited by 65 | Viewed by 9797
Abstract
A generalised notion of exponential families is introduced. It is based on the variational principle, borrowed from statistical physics. It is shown that inequivalent generalised entropy functions lead to distinct generalised exponential families. The well-known result that the inequality of Cramér and Rao becomes an equality in the case of an exponential family can be generalised. However, this requires the introduction of escort probabilities. Full article
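A standard concrete example of the deformed functions underlying such generalised families is the Tsallis q-logarithm and q-exponential, which recover ln and exp as the deformation parameter q tends to unity. A minimal sketch (an illustration, not code from the paper):

```python
import math

def q_log(x, q):
    """Deformed logarithm ln_q(x) = (x^(1-q) - 1) / (1-q); tends to ln x as q -> 1."""
    if q == 1.0:
        return math.log(x)
    return (x ** (1.0 - q) - 1.0) / (1.0 - q)

def q_exp(x, q):
    """Deformed exponential exp_q(x) = [1 + (1-q) x]^(1/(1-q)) where positive,
    0 otherwise; inverse of ln_q on its domain, tends to exp(x) as q -> 1."""
    if q == 1.0:
        return math.exp(x)
    base = 1.0 + (1.0 - q) * x
    return base ** (1.0 / (1.0 - q)) if base > 0 else 0.0
```

Replacing exp by exp_q in the canonical form p_i ∝ exp(-β e_i) yields a deformed exponential family of the kind whose entropy functions and Cramér-Rao-type relations are analysed in the paper.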
Article
Incremental Entropy Relation as an Alternative to MaxEnt
by Angelo Plastino, Angel R. Plastino, Evaldo M. F. Curado and Montse Casas
Entropy 2008, 10(2), 124-130; https://doi.org/10.3390/entropy-e10020124 - 24 Jun 2008
Cited by 3 | Viewed by 10443
Abstract
We show that, to generate the statistical operator appropriate for a given system, and as an alternative to Jaynes’ MaxEnt approach, which refers to the entropy S, one can instead use the increments δS in S. To this effect, one uses the macroscopic thermodynamic relation that links δS to changes in (i) the internal energy E and (ii) the remaining M relevant extensive quantities Ai, i = 1, …, M, that characterize the context one is working with. Full article
Article
Estimating the Entropy of Binary Time Series: Methodology, Some Theory and a Simulation Study
by Yun Gao, Ioannis Kontoyiannis and Elie Bienenstock
Entropy 2008, 10(2), 71-99; https://doi.org/10.3390/entropy-e10020071 - 17 Jun 2008
Cited by 78 | Viewed by 11806
Abstract
Partly motivated by entropy-estimation problems in neuroscience, we present a detailed and extensive comparison between some of the most popular and effective entropy estimation methods used in practice: the plug-in method, four different estimators based on the Lempel-Ziv (LZ) family of data compression algorithms, an estimator based on the Context-Tree Weighting (CTW) method, and the renewal entropy estimator. METHODOLOGY: Three new entropy estimators are introduced: two new LZ-based estimators, and the “renewal entropy estimator,” which is tailored to data generated by a binary renewal process. For two of the four LZ-based estimators, a bootstrap procedure is described for evaluating their standard error, and a practical rule of thumb is heuristically derived for selecting the values of their parameters in practice. THEORY: We prove that, unlike their earlier versions, the two new LZ-based estimators are universally consistent, that is, they converge to the entropy rate for every finite-valued, stationary and ergodic process. An effective method is derived for the accurate approximation of the entropy rate of a finite-state hidden Markov model (HMM) with known distribution. Heuristic calculations are presented and approximate formulas are derived for evaluating the bias and the standard error of each estimator. SIMULATION: All estimators are applied to a wide range of data generated by numerous different processes with varying degrees of dependence and memory. The main conclusions drawn from these experiments include: (i) For all estimators considered, the main source of error is the bias. (ii) The CTW method is repeatedly and consistently seen to provide the most accurate results. (iii) The performance of the LZ-based estimators is often comparable to that of the plug-in method. (iv) The main drawback of the plug-in method is its computational inefficiency; with small word-lengths it fails to detect longer-range structure in the data, and with longer word-lengths the empirical distribution is severely undersampled, leading to large biases. Full article
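The plug-in method used as a baseline in this comparison can be sketched in a few lines (an illustrative implementation under assumptions, not the authors' code): estimate the empirical distribution of overlapping words of a chosen length and divide the resulting word entropy by the word length.

```python
import math
from collections import Counter

def plugin_entropy_rate(bits, word_len):
    """Plug-in estimate of the entropy rate (bits per symbol):
    empirical entropy of overlapping length-word_len words, divided by word_len.
    Subject to the undersampling bias described in the abstract when
    word_len is large relative to the data length."""
    words = [tuple(bits[i:i + word_len]) for i in range(len(bits) - word_len + 1)]
    n = len(words)
    counts = Counter(words)
    h_words = -sum((c / n) * math.log2(c / n) for c in counts.values())
    return h_words / word_len
```

A constant sequence gives rate 0, and a balanced binary sequence with word length 1 gives 1 bit per symbol; the word-length trade-off discussed in conclusion (iv) shows up directly in the choice of `word_len`.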
Figure 1
