
Information and Entropy

A special issue of Entropy (ISSN 1099-4300).

Deadline for manuscript submissions: closed (31 October 2009) | Viewed by 133047

Special Issue Editor

Dr. Peter Harremoës
Copenhagen Business College, Rønne Alle 1, st., 2860 Søborg, Denmark
Interests: cause and effect; entropy; exponential families; graphical models; information divergence; minimum description length; quantum information; statistical mechanics
* Dr. Harremoës also serves as the Editor-in-Chief of Entropy

Keywords

  • entropy
  • information
  • information theory

Published Papers (15 papers)


Research


Article
Recovering Matrices of Economic Flows from Incomplete Data and a Composite Prior
by Esteban Fernández-Vázquez
Entropy 2010, 12(3), 516-527; https://doi.org/10.3390/e12030516 - 12 Mar 2010
Cited by 3 | Viewed by 8137
Abstract
In several socioeconomic applications, matrices containing information on flows (trade, income or migration flows, for example) are usually not constructed from direct observation but are estimated, since compiling the required information is often extremely expensive and time-consuming. The estimation process takes as its point of departure another matrix, which is adjusted until it optimizes some divergence criterion while remaining consistent with partial information on the target matrix (its row and column margins). Among the possible criteria, one of the most popular is the Kullback-Leibler divergence [1], which leads to the well-known Cross-Entropy technique. This paper proposes a composite Cross-Entropy approach that allows a mixture of two types of a priori information (two possible matrices) to serve as the point of departure in the estimation process. By means of a Monte Carlo simulation experiment, we show that under some circumstances this approach outperforms other competing estimators. A real-world case with a matrix of interregional trade is also included to show the applicability of the suggested technique.
(This article belongs to the Special Issue Information and Entropy)
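The margin-constrained cross-entropy problem the abstract describes (without the composite prior) is solved exactly by iterative proportional fitting, also known as the RAS method. A minimal sketch, assuming a single prior matrix and known row and column totals; the function name and toy data are illustrative, not from the paper:

```python
import numpy as np

def cross_entropy_fit(prior, row_sums, col_sums, tol=1e-10, max_iter=1000):
    """Among all non-negative matrices with the given row and column margins,
    return the one minimizing the Kullback-Leibler divergence from `prior`
    (iterative proportional fitting / RAS). Margins must share one total."""
    X = prior.astype(float).copy()
    for _ in range(max_iter):
        X *= (row_sums / X.sum(axis=1))[:, np.newaxis]  # rescale rows
        X *= (col_sums / X.sum(axis=0))[np.newaxis, :]  # rescale columns
        if np.allclose(X.sum(axis=1), row_sums, atol=tol):
            break
    return X

# Toy example: adjust a uniform prior to observed trade margins.
X = cross_entropy_fit(np.ones((3, 3)),
                      row_sums=np.array([4.0, 5.0, 6.0]),
                      col_sums=np.array([3.0, 5.0, 7.0]))
```

The paper's composite approach mixes two such priors before fitting; that step is not reproduced here.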

Article
The Quantum-Classical Transition as an Information Flow
by Andres M. Kowalski, Maria T. Martin, Luciano Zunino, Angelo Plastino and Montserrat Casas
Entropy 2010, 12(1), 148-160; https://doi.org/10.3390/e12010148 - 26 Jan 2010
Cited by 5 | Viewed by 9253
Abstract
We investigate the classical limit of the semiclassical evolution with reference to a well-known model that represents the interaction between matter and a given field. This is done by recourse to a special statistical quantifier called the “symbolic transfer entropy”. We find that the quantum-classical transition is thereby described as the sign reversal of the dominant direction of the information flow between the classical and quantal variables.
(This article belongs to the Special Issue Information and Entropy)
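Symbolic transfer entropy, in its standard form, encodes each series as a sequence of ordinal patterns and applies a plug-in transfer-entropy estimate to the symbols. A rough sketch of that construction, not the authors' code; `m` is the embedding dimension:

```python
import numpy as np
from collections import Counter

def symbolize(x, m=3):
    """Encode a series as ordinal patterns: the rank order of each m-window."""
    return [tuple(np.argsort(x[i:i + m])) for i in range(len(x) - m + 1)]

def transfer_entropy(sym_x, sym_y):
    """Plug-in estimate of T_{Y->X} = sum p(x1,x0,y0) log2[p(x1|x0,y0)/p(x1|x0)]."""
    triples = Counter(zip(sym_x[1:], sym_x[:-1], sym_y[:-1]))
    pairs_xy = Counter(zip(sym_x[:-1], sym_y[:-1]))
    pairs_xx = Counter(zip(sym_x[1:], sym_x[:-1]))
    singles = Counter(sym_x[:-1])
    n = sum(triples.values())
    return sum((c / n) * np.log2((c / pairs_xy[(x0, y0)])
                                 / (pairs_xx[(x1, x0)] / singles[x0]))
               for (x1, x0, y0), c in triples.items())
```

The sign of T_{Y->X} - T_{X->Y} then indicates the dominant direction of information flow, the quantity whose reversal the paper tracks across the transition.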

Article
A Dynamic Model of Information and Entropy
by Michael C. Parker and Stuart D. Walker
Entropy 2010, 12(1), 80-88; https://doi.org/10.3390/e12010080 - 07 Jan 2010
Cited by 9 | Viewed by 7534
Abstract
We discuss the possibility of a relativistic relationship between information and entropy, closely analogous to the classical Maxwell electromagnetic wave equations. Inherent to the analysis is the description of information as residing in points of non-analyticity, yet ultimately also exhibiting a distributed characteristic: additionally analogous, therefore, to the wave-particle duality of light. At cosmological scales our vector differential equations predict the conservation of information in black holes, whereas regular- and Z-DNA molecules correspond to helical solutions at microscopic levels. We further propose that regular- and Z-DNA are equivalent to the alternative words chosen from an alphabet to maintain the equilibrium of an information transmission system.
(This article belongs to the Special Issue Information and Entropy)

Article
Redundancy in Systems Which Entertain a Model of Themselves: Interaction Information and the Self-Organization of Anticipation
by Loet Leydesdorff
Entropy 2010, 12(1), 63-79; https://doi.org/10.3390/e12010063 - 06 Jan 2010
Cited by 25 | Viewed by 11942
Abstract
Mutual information among three or more dimensions (μ* = −Q) has been considered as interaction information. However, Krippendorff [1,2] has shown that this measure cannot be interpreted as a unique property of the interactions, and has proposed an alternative measure of interaction information based on iterative approximation of maximum entropies. Q can then be considered a measure of the difference between interaction information and the redundancy generated in a model entertained by an observer. I argue that this provides us with a measure of the imprint of a second-order observing system (a model entertained by the system itself) on the underlying information processing. The second-order system communicates meaning hyper-incursively; an observation instantiates this meaning-processing within the information processing. The net result may add to or reduce the prevailing uncertainty. The model is tested empirically for the case where textual organization can be expected to contain intellectual organization in terms of distributions of title words, author names, and cited references.
(This article belongs to the Special Issue Information and Entropy)
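The quantity μ* = −Q for three dimensions has the standard inclusion-exclusion form in Shannon entropies. A small illustrative sketch (not Leydesdorff's code) computing it from a joint distribution p[x, y, z]:

```python
import numpy as np

def H(p):
    """Shannon entropy (bits) of a probability array of any shape."""
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

def mu_star(p):
    """mu* = H(x)+H(y)+H(z) - H(xy) - H(xz) - H(yz) + H(xyz); the sign of
    this quantity is exactly what the interpretive debate above concerns."""
    hx, hy, hz = H(p.sum(axis=(1, 2))), H(p.sum(axis=(0, 2))), H(p.sum(axis=(0, 1)))
    hxy, hxz, hyz = H(p.sum(axis=2)), H(p.sum(axis=1)), H(p.sum(axis=0))
    return hx + hy + hz - hxy - hxz - hyz + H(p)
```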

Article
On the Spectral Entropy of Thermodynamic Paths for Elementary Systems
by Daniel J. Graham
Entropy 2009, 11(4), 1025-1041; https://doi.org/10.3390/e11041025 - 07 Dec 2009
Cited by 1 | Viewed by 9417
Abstract
Systems do not elect thermodynamic pathways on their own; they operate in tandem with their surroundings. Pathway selection and traversal require coordinated work and heat exchanges along with parallel tuning of the system variables. Previous research by the author [1] focused on the information expressed in thermodynamic pathways. Examined here is how spectral entropy arises as a by-product of that information and depends intricately on the pathway structure. Spectral entropy has proven to be a valuable tool in diverse fields. This paper illustrates the contact between spectral entropy and the properties that distinguish ideal from non-ideal gases. The role of spectral entropy in the first and second laws of thermodynamics and in heat → work conversions is also discussed.
(This article belongs to the Special Issue Information and Entropy)
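In its common signal-processing form, spectral entropy is the Shannon entropy of the normalized power spectrum. A generic sketch of that definition; the paper develops its own pathway-based construction, which is not reproduced here:

```python
import numpy as np

def spectral_entropy(signal):
    """Shannon entropy (bits) of the normalized power spectrum of a signal."""
    psd = np.abs(np.fft.rfft(signal)) ** 2
    p = psd / psd.sum()
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

# A pure tone gives near-zero spectral entropy; white noise is near-maximal.
```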

Communication
Dispersal (Entropy) and Recognition (Information) as Foundations of Emergence and Dissolvence
by Bernard Testa
Entropy 2009, 11(4), 993-1000; https://doi.org/10.3390/e11040993 - 03 Dec 2009
Cited by 9 | Viewed by 6306
Abstract
The objective of this essay is to reflect on a possible relation between entropy and emergence; a qualitative, relational approach is followed. We begin by highlighting that entropy includes the concept of dispersal, which is relevant to our enquiry. Emergence in complex systems arises from the coordinated behavior of their parts. Coordination in turn necessitates recognition between parts, i.e., information exchange. What is argued here is that the scope of recognition processes between parts is increased when preceded by their dispersal, which multiplies the number of encounters and creates a richer potential for recognition. A process intrinsic to emergence is dissolvence (also known as submergence, or top-down constraints), which participates in the information-entropy interplay underlying the creation, evolution and breakdown of higher-level entities.
(This article belongs to the Special Issue Information and Entropy)

Article
Equiprobability, Entropy, Gamma Distributions and Other Geometrical Questions in Multi-Agent Systems
by Ricardo López-Ruiz, Jaime Sañudo and Xavier Calbet
Entropy 2009, 11(4), 959-971; https://doi.org/10.3390/e11040959 - 02 Dec 2009
Cited by 6 | Viewed by 9162
Abstract
A set of many identical interacting agents obeying a global additive constraint is considered. Under the hypothesis of equiprobability in the high-dimensional volume delimited in phase space by the constraint, the statistical behavior of a generic agent over the ensemble is worked out. The asymptotic distribution of that statistical behavior is derived from geometrical arguments. This distribution is related to the Gamma distributions found in several multi-agent economy models, and the parallelism with all these systems is established. As a collateral result, a formula for the volume of high-dimensional symmetrical bodies is proposed.
(This article belongs to the Special Issue Information and Entropy)
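For the simplest additive constraint, sum_i w_i = E, equiprobability over the constrained volume can be sampled directly, and the single-agent marginal tends to the exponential (Boltzmann-Gibbs) member of the Gamma family. A minimal sketch under those assumptions, not taken from the paper:

```python
import numpy as np

# Normalizing i.i.d. exponential draws yields a uniform sample of the
# simplex sum(w) = E, i.e., equiprobability under the additive constraint.
rng = np.random.default_rng(0)
N, E = 1000, 1000.0
w = rng.exponential(size=N)
w *= E / w.sum()

# One agent's wealth is then asymptotically exponential with mean E/N.
print(w.mean(), w.var())  # both close to 1.0 when E/N = 1
```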

Article
A Lower-Bound for the Maximin Redundancy in Pattern Coding
by Aurélien Garivier
Entropy 2009, 11(4), 634-642; https://doi.org/10.3390/e11040634 - 22 Oct 2009
Cited by 7 | Viewed by 7992
Abstract
We show that the maximin average redundancy in pattern coding is eventually larger than 1.84 (n/log n)^(1/3) for messages of length n. This improves recent results on pattern redundancy, although it does not fill the gap between the known lower and upper bounds. The pattern of a string is obtained by replacing each symbol by the index of its first occurrence. The problem of pattern coding is of interest because strongly universal codes have been proved to exist for patterns, while universal message coding is impossible for memoryless sources on an infinite alphabet. The proof uses fine combinatorial results on partitions with small summands.
(This article belongs to the Special Issue Information and Entropy)
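The pattern construction in the abstract is easy to make concrete; a small illustration with a hypothetical helper, not from the paper:

```python
def pattern(s):
    """Replace each symbol by the 1-based index of its first occurrence."""
    seen = {}
    return [seen.setdefault(c, len(seen) + 1) for c in s]

assert pattern("abracadabra") == [1, 2, 3, 1, 4, 1, 5, 1, 2, 3, 1]
```

Coding the pattern sequence rather than the string itself is what permits strongly universal codes even over infinite alphabets.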

Article
Landauer’s Principle and Divergenceless Dynamical Systems
by Claudia Zander, Angel Ricardo Plastino, Angelo Plastino, Montserrat Casas and Sergio Curilef
Entropy 2009, 11(4), 586-597; https://doi.org/10.3390/e11040586 - 13 Oct 2009
Cited by 5 | Viewed by 10215
Abstract
Landauer’s principle is one of the pillars of the physics of information. It constitutes one of the foundations behind the idea that “information is physical”. Landauer’s principle establishes the smallest amount of energy that has to be dissipated when one bit of information is erased from a computing device. Here we explore an extended Landauer-like principle valid for general dynamical systems (not necessarily Hamiltonian) governed by divergenceless phase-space flows.
(This article belongs to the Special Issue Information and Entropy)
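For reference, the standard statement of Landauer's bound that the paper extends: erasing one bit of information at temperature $T$ dissipates at least

$$
E_{\min} = k_B T \ln 2,
$$

where $k_B$ is Boltzmann's constant.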
Article
Scale-Based Gaussian Coverings: Combining Intra and Inter Mixture Models in Image Segmentation
by Fionn Murtagh, Pedro Contreras and Jean-Luc Starck
Entropy 2009, 11(3), 513-528; https://doi.org/10.3390/e11030513 - 24 Sep 2009
Cited by 4 | Viewed by 8966
Abstract
By a “covering” we mean a Gaussian mixture model fit to observed data. Approximations of the Bayes factor can be used to judge model fit to the data within a given Gaussian mixture model. Between families of Gaussian mixture models, we propose the Rényi quadratic entropy as an excellent and tractable model-comparison framework. We exemplify this using the segmentation of an MRI image volume, based (1) on a direct Gaussian mixture model applied to the marginal distribution function, and (2) on a Gaussian mixture model fit through k-means applied to the 4D multivalued image volume furnished by the wavelet transform. Visual preference for one model over another is not immediate, but the Rényi quadratic entropy allows us to show clearly that one of these modelings is superior to the other.
(This article belongs to the Special Issue Information and Entropy)
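What makes the Rényi quadratic entropy $H_2 = -\log \int p(x)^2\,dx$ tractable here is that the integral is closed-form for Gaussian mixtures. A one-dimensional illustrative sketch, not the authors' code:

```python
import numpy as np
from scipy.stats import norm

def renyi_quadratic_entropy(weights, means, stds):
    """H2 = -log int p(x)^2 dx for a 1-D Gaussian mixture, using the closed
    form int N(x;m1,s1^2) N(x;m2,s2^2) dx = N(m1; m2, s1^2 + s2^2)."""
    w, m, s = (np.asarray(a, dtype=float) for a in (weights, means, stds))
    scale = np.sqrt(s[:, None] ** 2 + s[None, :] ** 2)
    cross = norm.pdf(m[:, None], loc=m[None, :], scale=scale)
    return float(-np.log((w[:, None] * w[None, :] * cross).sum()))

# Lower H2 indicates a more concentrated mixture density.
```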

Article
Information, Deformed κ-Wehrl Entropies and Semiclassical Delocalization
by Flavia Pennini, Angelo Plastino, Gustavo L. Ferri, Felipe Olivares and Montse Casas
Entropy 2009, 11(1), 32-41; https://doi.org/10.3390/e11010032 - 27 Jan 2009
Cited by 4 | Viewed by 6949
Abstract
Semiclassical delocalization in phase space constitutes a manifestation of the Uncertainty Principle, an indispensable part of the present understanding of Nature, and the Wehrl entropy is widely regarded as the foremost localization indicator. We readdress the matter here within the framework of the celebrated semiclassical Husimi distributions and their associated Wehrl entropies, suitably κ-deformed. We show that it is possible to significantly improve on the extant phase-space classical-localization power.
(This article belongs to the Special Issue Information and Entropy)
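For context, the undeformed Wehrl entropy is the classical entropy of the Husimi distribution over coherent states $|\alpha\rangle$ (conventions for the $1/\pi$ factor vary):

$$
W = -\int d^2\alpha \; Q(\alpha) \ln Q(\alpha), \qquad Q(\alpha) = \frac{1}{\pi}\langle \alpha | \rho | \alpha \rangle,
$$

and Lieb's theorem guarantees $W \geq 1$, the bound that makes it a natural localization indicator.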

Article
Generalized Measure of Departure from No Three-Factor Interaction Model for 2 × 2 × K Contingency Tables
by Kouji Yamamoto, Yohei Ban and Sadao Tomizawa
Entropy 2008, 10(4), 776-785; https://doi.org/10.3390/e10040776 - 22 Dec 2008
Cited by 47 | Viewed by 7405
Abstract
For 2 × 2 × K contingency tables, Tomizawa considered a Shannon-entropy-type measure to represent the degree of departure from a log-linear model of no three-factor interaction (the NOTFI model). This paper proposes a generalization of Tomizawa's measure for 2 × 2 × K tables. The proposed measure is expressed using the Patil-Taillie diversity index or the Cressie-Read power divergence, and a special case of it includes Tomizawa's measure. The proposed measure should be useful for comparing the degrees of departure from the NOTFI model across several tables.
(This article belongs to the Special Issue Information and Entropy)
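For context, the Cressie-Read power divergence on which the generalized measure is built has the one-parameter form

$$
I^{\lambda}(p : q) = \frac{1}{\lambda(\lambda+1)} \sum_{i} p_i \left[ \left( \frac{p_i}{q_i} \right)^{\lambda} - 1 \right],
$$

which recovers the Kullback-Leibler divergence (the Shannon-entropy case) in the limit $\lambda \to 0$.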

Review


Review
Quantum Entropy and Its Applications to Quantum Communication and Statistical Physics
by Masanori Ohya and Noboru Watanabe
Entropy 2010, 12(5), 1194-1245; https://doi.org/10.3390/e12051194 - 07 May 2010
Cited by 29 | Viewed by 8484
Abstract
Quantum entropy is a fundamental concept of quantum information that has recently been developed in various directions. We review the mathematical aspects of quantum entropy (entropies) and discuss some applications to quantum communication and statistical physics. All topics treated here are related in some way to the quantum entropy that the present authors have studied. Many other fields recently developed in quantum information theory, such as quantum algorithms, quantum teleportation and quantum cryptography, are discussed in full in the book [60].
(This article belongs to the Special Issue Information and Entropy)
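The review's central object, the von Neumann entropy $S(\rho) = -\mathrm{Tr}(\rho \log \rho)$, is straightforward to compute from a density matrix; a minimal sketch:

```python
import numpy as np

def von_neumann_entropy(rho):
    """S(rho) = -Tr(rho log2 rho), via the eigenvalues of the Hermitian,
    unit-trace density matrix rho; 0 log 0 is taken as 0."""
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]
    return float(-(evals * np.log2(evals)).sum())

print(von_neumann_entropy(np.eye(2) / 2))  # maximally mixed qubit -> 1.0
```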
Review
Processing Information in Quantum Decision Theory
by Vyacheslav I. Yukalov and Didier Sornette
Entropy 2009, 11(4), 1073-1120; https://doi.org/10.3390/e11041073 - 14 Dec 2009
Cited by 63 | Viewed by 12721
Abstract
A survey is given summarizing the state of the art of describing information processing in Quantum Decision Theory, which has recently been advanced as a novel variant of decision making based on the mathematical theory of separable Hilbert spaces. This mathematical structure captures the effect of superposition of composite prospects, including many incorporated intended actions. The theory characterizes entangled decision making, the non-commutativity of subsequent decisions, and intention interference. The self-consistent procedure of decision making, in the frame of quantum decision theory, takes into account both the available objective information and subjective contextual effects. This quantum approach avoids the paradoxes typical of classical decision theory. Conditional maximization of entropy, equivalent to the minimization of an information functional, makes it possible to connect the quantum and classical decision theories, showing that the latter is the limit of the former under vanishing interference terms.
(This article belongs to the Special Issue Information and Entropy)

Other


Comment
Comment on “Equiprobability, Entropy, Gamma Distributions and Other Geometrical Questions in Multi-Agent Systems”, Entropy 2009, 11, 959-971
by Raúl Toral
Entropy 2009, 11(4), 1121-1122; https://doi.org/10.3390/e11041121 - 22 Dec 2009
Cited by 8 | Viewed by 7780
Abstract
The volume of the body enclosed by the n-dimensional Lamé curve defined by $\sum_{i=1}^{n} x_i^b = E$ is computed.
(This article belongs to the Special Issue Information and Entropy)
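For context, the classical Dirichlet-integral evaluation of such a volume (quoted here as the standard result, not copied from the comment) gives, for the positive-orthant region $\sum_{i=1}^{n} x_i^b \le E$ with $x_i \ge 0$,

$$
V_n(E) = \frac{\Gamma\!\left(1 + \tfrac{1}{b}\right)^{n}}{\Gamma\!\left(1 + \tfrac{n}{b}\right)} \, E^{n/b}.
$$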