

Special Issue "Facets of Entropy - Papers presented at the workshop in Copenhagen (24-26 October 2007)"


A special issue of Entropy (ISSN 1099-4300).

Deadline for manuscript submissions: closed (28 February 2008)

Special Issue Editor

Guest Editor
Dr. Peter Harremoës

Copenhagen Business College, Rønne Alle 1, st., DK-2860 Søborg, Denmark
Interests: symmetry; information divergence; cause and effect; Maxwell's demon; probability and statistics

Published Papers (6 papers)


Research

Open Access Article: Deformed Generalization of the Semiclassical Entropy
Entropy 2008, 10(3), 240-247; doi:10.3390/e10030240
Received: 5 March 2008 / Revised: 25 August 2008 / Accepted: 25 August 2008 / Published: 19 September 2008
Cited by 1 | PDF Full-text (166 KB) | Correction | Supplementary Files
Abstract
We explicitly obtain a novel expression for the semiclassical Wehrl entropy using deformed algebras built up with the q-coherent states (see Arik and Coon [J. Math. Phys. 17, 524 (1976)] and Quesne [J. Phys. A 35, 9213 (2002)]). The generalization is investigated with emphasis on (i) its behavior as a function of temperature and (ii) the results obtained when the deformation parameter tends to unity.
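For orientation, the undeformed (q → 1) baseline that this paper generalizes is the ordinary Wehrl entropy, computed from the Husimi Q-function. A minimal numeric sketch for the harmonic-oscillator vacuum state (illustrative only, not from the paper):

```python
import numpy as np

# Husimi Q-function of the harmonic-oscillator vacuum state:
# Q(alpha) = exp(-|alpha|^2) / pi, normalized over the phase plane.
r = np.linspace(1e-6, 8.0, 4000)          # radial grid, r = |alpha|
Q = np.exp(-r**2) / np.pi

# Wehrl entropy S_W = -∫ Q ln Q d^2(alpha), with d^2(alpha) = 2*pi*r dr
integrand = -Q * np.log(Q) * 2 * np.pi * r
S_W = float(np.sum(integrand) * (r[1] - r[0]))

print(S_W)   # close to the exact value 1 + ln(pi)
```

The known closed-form result for the vacuum, S_W = 1 + ln π, provides a check on the quadrature.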
Open Access Article: Axiomatic Characterizations of Information Measures
Entropy 2008, 10(3), 261-273; doi:10.3390/e10030261
Received: 1 September 2008 / Accepted: 12 September 2008 / Published: 19 September 2008
Cited by 58 | PDF Full-text (186 KB)
Abstract
Axiomatic characterizations of Shannon entropy, Kullback I-divergence, and some generalized information measures are surveyed. Three directions are treated: (A) characterization of functions of probability distributions suitable as information measures; (B) characterization of set functions on the subsets of {1, ..., N} representable by joint entropies of components of an N-dimensional random vector; (C) axiomatic characterization of MaxEnt and related inference rules. The paper concludes with a brief discussion of the relevance of the axiomatic approach for information theory.
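One of the basic identities underlying direction (B), the chain rule H(X,Y) = H(X) + H(Y|X) for joint entropies, can be checked numerically on an arbitrary joint distribution (an illustrative sketch, not taken from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)

def H(p):
    """Shannon entropy (in bits) of a probability vector."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

# A random joint distribution of two components (X, Y)
pxy = rng.random((4, 5))
pxy /= pxy.sum()

px = pxy.sum(axis=1)                  # marginal distribution of X
H_joint = H(pxy.ravel())              # H(X, Y)
H_x = H(px)                           # H(X)
# H(Y|X) as the px-weighted average of the conditional entropies
H_y_given_x = sum(px[i] * H(pxy[i] / px[i]) for i in range(len(px)))

print(abs(H_joint - (H_x + H_y_given_x)))   # chain rule: essentially zero
```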
Open Access Article: Modeling Non-Equilibrium Dynamics of a Discrete Probability Distribution: General Rate Equation for Maximal Entropy Generation in a Maximum-Entropy Landscape with Time-Dependent Constraints
Entropy 2008, 10(3), 160-182; doi:10.3390/entropy-e10030010
Received: 17 December 2007 / Revised: 30 July 2008 / Accepted: 30 July 2008 / Published: 14 August 2008
Cited by 20 | PDF Full-text (235 KB)
Abstract
A rate equation for a discrete probability distribution is discussed as a route to describe smooth relaxation towards the maximum entropy distribution compatible at all times with one or more linear constraints. The resulting dynamics follows the path of steepest entropy ascent compatible with the constraints. The rate equation is consistent with the Onsager reciprocity theorem and the fluctuation-dissipation theorem. The mathematical formalism was originally developed to obtain a quantum theoretical unification of mechanics and thermodynamics. It is presented here in a general, non-quantal formulation as part of an effort to develop tools for the phenomenological treatment of non-equilibrium problems with applications in engineering, biology, sociology, and economics. The rate equation is also extended to include the case of assigned time-dependences of the constraints and the entropy, such as for modeling non-equilibrium energy and entropy exchanges.
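The attractor of such a relaxation, the maximum-entropy distribution under a single linear constraint, has the familiar Gibbs form p_i ∝ exp(-β e_i). A minimal sketch (illustrative; the example levels and target mean are invented, and the paper's rate equation itself is not implemented here) that solves for β by bisection on the constrained mean:

```python
import numpy as np

# Hypothetical discrete levels and a target mean (the linear constraint)
e = np.array([0.0, 1.0, 2.0, 3.0])
target_mean = 1.2

def gibbs(beta):
    """MaxEnt distribution under a fixed mean: p_i ∝ exp(-beta * e_i)."""
    w = np.exp(-beta * e)
    return w / w.sum()

# The mean <e> is monotone decreasing in beta, so bisect on it.
lo, hi = -50.0, 50.0
for _ in range(200):
    mid = (lo + hi) / 2
    if gibbs(mid) @ e > target_mean:
        lo = mid          # mean too high -> increase beta
    else:
        hi = mid
p = gibbs((lo + hi) / 2)

print(p @ e)   # matches target_mean
```

In the steepest-entropy-ascent picture, this distribution is the fixed point toward which the rate equation drives the probabilities.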
Open Access Article: Generalised Exponential Families and Associated Entropy Functions
Entropy 2008, 10(3), 131-149; doi:10.3390/entropy-e10030131
Received: 26 February 2008 / Revised: 1 July 2008 / Accepted: 14 July 2008 / Published: 16 July 2008
Cited by 39 | PDF Full-text (213 KB)
Abstract
A generalised notion of exponential families is introduced. It is based on the variational principle, borrowed from statistical physics. It is shown that inequivalent generalised entropy functions lead to distinct generalised exponential families. The well-known result that the Cramér-Rao inequality becomes an equality in the case of an exponential family can be generalised. However, this requires the introduction of escort probabilities.
Open Access Article: Incremental Entropy Relation as an Alternative to MaxEnt
Entropy 2008, 10(2), 124-130; doi:10.3390/entropy-e10020124
Received: 5 March 2008 / Revised: 14 June 2008 / Accepted: 22 June 2008 / Published: 24 June 2008
Cited by 2 | PDF Full-text (158 KB)
Abstract
We show that, to generate the statistical operator appropriate for a given system, one can use, as an alternative to Jaynes' MaxEnt approach (which refers to the entropy S itself), the increments δS in S. To this effect, one uses the macroscopic thermodynamic relation that links δS to changes in (i) the internal energy E and (ii) the remaining M relevant extensive quantities Ai, i = 1, ..., M, that characterize the context one is working with.
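For orientation, a macroscopic relation of the kind alluded to can be written, in one conventional form (a sketch in standard thermodynamic notation, not necessarily the authors' exact conventions):

```latex
\delta S \;=\; \frac{1}{T}\,\delta E \;+\; \sum_{i=1}^{M} \lambda_i\, \delta A_i ,
```

where T is the temperature and each λ_i is the intensive parameter conjugate to the extensive quantity A_i (for example, p/T for the volume).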
Open Access Article: Estimating the Entropy of Binary Time Series: Methodology, Some Theory and a Simulation Study
Entropy 2008, 10(2), 71-99; doi:10.3390/entropy-e10020071
Received: 6 March 2008 / Revised: 9 June 2008 / Accepted: 17 June 2008 / Published: 17 June 2008
Cited by 25 | PDF Full-text (354 KB)
Abstract
Partly motivated by entropy-estimation problems in neuroscience, we present a detailed and extensive comparison between some of the most popular and effective entropy estimation methods used in practice: the plug-in method, four different estimators based on the Lempel-Ziv (LZ) family of data compression algorithms, an estimator based on the Context-Tree Weighting (CTW) method, and the renewal entropy estimator. Methodology: Three new entropy estimators are introduced: two new LZ-based estimators, and the "renewal entropy estimator," which is tailored to data generated by a binary renewal process. For two of the four LZ-based estimators, a bootstrap procedure is described for evaluating their standard error, and a practical rule of thumb is heuristically derived for selecting the values of their parameters in practice. Theory: We prove that, unlike their earlier versions, the two new LZ-based estimators are universally consistent, that is, they converge to the entropy rate for every finite-valued, stationary and ergodic process. An effective method is derived for the accurate approximation of the entropy rate of a finite-state hidden Markov model (HMM) with known distribution. Heuristic calculations are presented and approximate formulas are derived for evaluating the bias and the standard error of each estimator. Simulation: All estimators are applied to a wide range of data generated by numerous different processes with varying degrees of dependence and memory. The main conclusions drawn from these experiments include: (i) for all estimators considered, the main source of error is the bias; (ii) the CTW method is repeatedly and consistently seen to provide the most accurate results; (iii) the performance of the LZ-based estimators is often comparable to that of the plug-in method; (iv) the main drawback of the plug-in method is its computational inefficiency; with small word-lengths it fails to detect longer-range structure in the data, and with longer word-lengths the empirical distribution is severely undersampled, leading to large biases.
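The plug-in method discussed above is simple enough to sketch: estimate the empirical distribution of overlapping k-blocks and take its entropy per symbol. A minimal illustration (the word-length k = 5 and the test sequences are arbitrary choices, not the paper's settings):

```python
import numpy as np
from collections import Counter

def plugin_entropy_rate(bits, k):
    """Plug-in estimate of the entropy rate (bits/symbol) of a binary
    sequence: empirical entropy of overlapping k-blocks, divided by k."""
    blocks = [tuple(bits[i:i + k]) for i in range(len(bits) - k + 1)]
    counts = np.array(list(Counter(blocks).values()), dtype=float)
    p = counts / counts.sum()
    return -np.sum(p * np.log2(p)) / k

rng = np.random.default_rng(1)
fair = rng.integers(0, 2, 100_000)                  # i.i.d. fair coin
biased = (rng.random(100_000) < 0.9).astype(int)    # i.i.d. Bernoulli(0.9)

print(plugin_entropy_rate(fair, 5))     # close to the true rate of 1 bit
print(plugin_entropy_rate(biased, 5))   # close to h(0.9) ≈ 0.469 bits
```

The undersampling problem noted in conclusion (iv) appears when k grows: with 2^k possible blocks and a fixed sample size, the empirical block distribution becomes sparse and the estimate is biased downward.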

Journal Contact

MDPI AG
Entropy Editorial Office
St. Alban-Anlage 66, 4052 Basel, Switzerland
entropy@mdpi.com
Tel. +41 61 683 77 34
Fax: +41 61 302 89 18