Special Issue "Distance in Information and Statistical Physics Volume 2"

A special issue of Entropy (ISSN 1099-4300).

Deadline for manuscript submissions: closed (30 June 2013)

Special Issue Editor

Guest Editor
Dr. Takuya Yamano

Department of Mathematics and Physics, Faculty of Science, Kanagawa University, 2946, 6-233 Tsuchiya, Hiratsuka, Kanagawa 259-1293, Japan
Phone: +81 45 472 8796
Fax: +81 45 473 1280
Interests: Fisher information; nonextensivity; information theory; nonlinear Fokker-Planck equations; nonlinear Schrödinger equations; complexity measure; irreversibility; tumor growth; etc.

Special Issue Information

Dear Colleagues,

The notion of distance plays a pivotal role in the information sciences and in statistical physics. For example, relative entropy helps us understand the asymptotic behavior of systems and serves to quantify how distinguishable two distributions are. It is no exaggeration to say that much effort revolves around clarifying the information structure pertaining to distance measures (entropies). This special issue should provide a forum to present and discuss recent progress on the topics listed in the keywords below.

Takuya Yamano
Guest Editor
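
As a minimal illustration of the theme, the sketch below computes the relative entropy (Kullback-Leibler divergence) between two nearby discrete distributions; the distributions and the function name are illustrative choices, not taken from any paper in this issue.

```python
import numpy as np

def kl_divergence(p, q):
    """Relative entropy D(p || q) = sum_i p_i * log(p_i / q_i), in nats.

    Convention: terms with p_i = 0 contribute 0; D is infinite if
    q_i = 0 anywhere that p_i > 0.
    """
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    mask = p > 0
    if np.any(q[mask] == 0):
        return np.inf
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

# Two nearby distributions are hard to distinguish: D is small.
p = [0.5, 0.3, 0.2]
q = [0.4, 0.4, 0.2]
print(kl_divergence(p, q))  # ~0.025 nats
print(kl_divergence(q, p))  # ~0.026 nats; note D is not symmetric
```

The asymmetry visible in the last two lines is one reason the literature distinguishes divergences from true metric distances.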

Submission

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, go to the submission form. Manuscripts can be submitted until the deadline. Papers will be published continuously (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles, and communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are refereed through a peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Entropy is an international peer-reviewed Open Access monthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 1400 CHF (Swiss Francs).

Keywords

  • relative entropy
  • Kullback-Leibler divergence
  • typicality
  • quantum thermodynamics
  • nonequilibrium entropy
  • fluctuation
  • 2nd law of thermodynamics
  • information geometry
  • Fisher information

Published Papers (11 papers)


Research

Open Access Article: Local Softening of Information Geometric Indicators of Chaos in Statistical Modeling in the Presence of Quantum-Like Considerations
Entropy 2013, 15(11), 4622-4633; doi:10.3390/e15114622
Received: 8 September 2013 / Revised: 21 October 2013 / Accepted: 22 October 2013 / Published: 28 October 2013
Abstract
In a previous paper (C. Cafaro et al., 2012), we compared an uncorrelated 3D Gaussian statistical model to an uncorrelated 2D Gaussian statistical model obtained from the former model by introducing a constraint that resembles the quantum mechanical canonical minimum uncertainty relation. Analysis was completed by way of the information geometry and the entropic dynamics of each system. This analysis revealed that the chaoticity of the 2D Gaussian statistical model, quantified by means of the Information Geometric Entropy (IGE), is softened or weakened with respect to the chaoticity of the 3D Gaussian statistical model, due to the accessibility of more information. In this companion work, we further constrain the system in the context of a correlation constraint among the system’s micro-variables and show that the chaoticity is further weakened, but only locally. Finally, the physicality of the constraints is briefly discussed, particularly in the context of quantum entanglement.
(This article belongs to the Special Issue Distance in Information and Statistical Physics Volume 2)
Open Access Article: Correlation Distance and Bounds for Mutual Information
Entropy 2013, 15(9), 3698-3713; doi:10.3390/e15093698
Received: 21 June 2013 / Revised: 21 August 2013 / Accepted: 3 September 2013 / Published: 6 September 2013
Cited by 3
Abstract
The correlation distance quantifies the statistical independence of two classical or quantum systems, via the distance from their joint state to the product of the marginal states. Tight lower bounds are given for the mutual information between pairs of two-valued classical variables and quantum qubits, in terms of the corresponding classical and quantum correlation distances. These bounds are stronger than the Pinsker inequality (and refinements thereof) for relative entropy. The classical lower bound may be used to quantify properties of statistical models that violate Bell inequalities. Partially entangled qubits can have lower mutual information than can any two-valued classical variables having the same correlation distance. The qubit correlation distance also provides a direct entanglement criterion, related to the spin covariance matrix. Connections of results with classically-correlated quantum states are briefly discussed.
(This article belongs to the Special Issue Distance in Information and Statistical Physics Volume 2)
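
A small numerical check of the setting (a generic sketch, not the paper's tighter bounds): take the classical correlation distance of two binary variables to be the total variation distance between the joint distribution and the product of its marginals, and compare the mutual information against the standard Pinsker lower bound. The joint distribution below is arbitrary.

```python
import numpy as np

# Joint distribution of two binary variables as a 2x2 array p[x, y].
p_xy = np.array([[0.4, 0.1],
                 [0.1, 0.4]])

p_x = p_xy.sum(axis=1)        # marginal of X
p_y = p_xy.sum(axis=0)        # marginal of Y
prod = np.outer(p_x, p_y)     # product of marginals

# Correlation distance, taken here as the total variation distance
# between the joint and the product of marginals.
corr_dist = 0.5 * np.abs(p_xy - prod).sum()

# Mutual information I(X;Y) = D(p_xy || p_x p_y), in nats.
mask = p_xy > 0
mutual_info = np.sum(p_xy[mask] * np.log(p_xy[mask] / prod[mask]))

# Pinsker's inequality gives I >= 2 * corr_dist**2 (in nats).
print(corr_dist, mutual_info, 2 * corr_dist**2)  # 0.3, ~0.193, 0.18
assert mutual_info >= 2 * corr_dist**2
```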
Open Access Article: Information Geometry of Complex Hamiltonians and Exceptional Points
Entropy 2013, 15(9), 3361-3378; doi:10.3390/e15093361
Received: 16 July 2013 / Revised: 12 August 2013 / Accepted: 16 August 2013 / Published: 23 August 2013
Cited by 5
Abstract
Information geometry provides a tool to systematically investigate the parameter sensitivity of the state of a system. If a physical system is described by a linear combination of eigenstates of a complex (that is, non-Hermitian) Hamiltonian, then there can be phase transitions where dynamical properties of the system change abruptly. In the vicinities of the transition points, the state of the system becomes highly sensitive to the changes of the parameters in the Hamiltonian. The parameter sensitivity can then be measured in terms of the Fisher-Rao metric and the associated curvature of the parameter-space manifold. A general scheme for the geometric study of parameter-space manifolds of eigenstates of complex Hamiltonians is outlined here, leading to generic expressions for the metric.
(This article belongs to the Special Issue Distance in Information and Statistical Physics Volume 2)
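
The core idea, that the Fisher-Rao metric quantifies how sensitively a state depends on its parameters, can be checked in a simple classical analogue. The sketch below uses the ordinary Gaussian family, not the complex-Hamiltonian eigenstates treated in the paper, and verifies numerically that the relative entropy between nearby parameter values recovers the known metric components g = diag(1/sigma^2, 2/sigma^2).

```python
import numpy as np

def kl_gauss(mu0, s0, mu1, s1):
    """Closed-form KL divergence between univariate normals N(mu, s^2)."""
    return np.log(s1 / s0) + (s0**2 + (mu0 - mu1) ** 2) / (2 * s1**2) - 0.5

# For nearby parameters, D(p_theta || p_{theta + d}) ~ 0.5 * d^T g d,
# so second differences of D recover the Fisher-Rao metric components.
mu, s, eps = 0.0, 2.0, 1e-4
g_mumu = 2 * kl_gauss(mu, s, mu + eps, s) / eps**2
g_ss = 2 * kl_gauss(mu, s, mu, s + eps) / eps**2
print(g_mumu, 1 / s**2)   # both ~0.25
print(g_ss, 2 / s**2)     # both ~0.5
```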
Open Access Article: Time Evolution of Relative Entropies for Anomalous Diffusion
Entropy 2013, 15(8), 2989-3006; doi:10.3390/e15082989
Received: 26 June 2013 / Revised: 17 July 2013 / Accepted: 18 July 2013 / Published: 26 July 2013
Cited by 8
Abstract
The entropy production paradox for anomalous diffusion processes describes a phenomenon where one-parameter families of dynamical equations, falling between the diffusion and wave equations, have entropy production rates (Shannon, Tsallis or Renyi) that unexpectedly increase toward the wave equation limit. Moreover, and also surprisingly, the entropy does not order the bridging regime between diffusion and waves at all. However, it has been found that relative entropies, with an appropriately chosen reference distribution, do. Relative entropies thus provide a physically sensible way of deciding which process is “nearer” to pure diffusion than another, placing pure wave propagation, desirably, “furthest” from pure diffusion. We examine here the time behavior of the relative entropies under the evolution dynamics of the underlying one-parameter family of dynamical equations based on space-fractional derivatives.
(This article belongs to the Special Issue Distance in Information and Statistical Physics Volume 2)
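
A rough numerical sketch of the setting, under assumed normalizations (the t**(1/alpha) scaling of the stable densities and the choice of reference distribution are illustrative, not the paper's exact conventions): solutions of the symmetric space-fractional diffusion equation of order alpha are alpha-stable densities, and one can track a relative entropy to the pure-diffusion (Gaussian) solution over time.

```python
import numpy as np
from scipy.stats import levy_stable, norm

# Order alpha = 2 recovers ordinary diffusion; smaller alpha gives
# heavier-tailed, superdiffusive spreading with width ~ t**(1/alpha).
x = np.linspace(-40.0, 40.0, 801)
dx = x[1] - x[0]
alpha = 1.5
for t in (0.5, 1.0, 2.0):
    q = norm.pdf(x, scale=np.sqrt(2.0 * t))                       # pure diffusion
    p = levy_stable.pdf(x, alpha, 0.0, scale=t ** (1.0 / alpha))  # fractional case
    # D(q || p): with the Gaussian in front, the heavy tails of p keep
    # the integrand well behaved; D(p || q) would diverge.
    mask = q > 0
    d = np.sum(q[mask] * np.log(q[mask] / p[mask])) * dx
    print(t, d)
```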
Open Access Article: Pushing for the Extreme: Estimation of Poisson Distribution from Low Count Unreplicated Data—How Close Can We Get?
Entropy 2013, 15(4), 1202-1220; doi:10.3390/e15041202
Received: 15 January 2013 / Revised: 21 March 2013 / Accepted: 25 March 2013 / Published: 8 April 2013
Abstract
Studies of learning algorithms typically concentrate on situations where a potentially ever-growing training sample is available. Yet, there can be situations (e.g., detection of differentially expressed genes from unreplicated data, or estimation of time delay in non-stationary gravitationally lensed photon streams) where only extremely small samples can be used to perform an inference. On unreplicated data, the inference has to be performed on the smallest sample possible: a sample of size one. We study whether anything useful can be learnt in such extreme situations by concentrating on a Bayesian approach that can account for possible prior information on expected counts. We perform a detailed information-theoretic study of such Bayesian estimation and quantify the effect of Bayesian averaging on its first two moments. Finally, to analyze the potential benefits of the Bayesian approach, we also consider maximum likelihood (ML) estimation as a baseline. We show both theoretically and empirically that Bayesian model averaging can be beneficial.
(This article belongs to the Special Issue Distance in Information and Statistical Physics Volume 2)
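
The conjugate setup behind such single-observation inference fits in a few lines. This is a generic Gamma-Poisson update, not the paper's specific estimator, and the prior values are illustrative.

```python
# Bayesian estimation of a Poisson rate from a single observed count x,
# with a conjugate Gamma(a, b) prior (shape a, rate b): the posterior is
# Gamma(a + x, b + 1), so the posterior mean shrinks the ML estimate
# (which is just x itself) toward the prior mean a / b.
def posterior_mean(x, a, b):
    return (a + x) / (b + 1.0)

x = 3                    # the single low count observed (illustrative)
a, b = 2.0, 1.0          # prior with mean a / b = 2 (illustrative)
print("ML estimate:    ", x)                        # 3
print("Posterior mean: ", posterior_mean(x, a, b))  # 2.5
```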
Open Access Article: Kullback–Leibler Divergence Measure for Multivariate Skew-Normal Distributions
Entropy 2012, 14(9), 1606-1626; doi:10.3390/e14091606
Received: 16 July 2012 / Revised: 25 August 2012 / Accepted: 27 August 2012 / Published: 4 September 2012
Cited by 10
Abstract
The aim of this work is to provide the tools to compute the well-known Kullback–Leibler divergence measure for the flexible family of multivariate skew-normal distributions. In particular, we use the Jeffreys divergence measure to compare the multivariate normal distribution with the multivariate skew-normal distribution, showing that this is equivalent to comparing univariate versions of these distributions. Finally, we apply our results to a seismological catalogue data set related to the 2010 Maule earthquake. Specifically, we compare the distributions of the local magnitudes of the regions formed by the aftershocks.
(This article belongs to the Special Issue Distance in Information and Statistical Physics Volume 2)
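
For orientation, the baseline Gaussian case admits a closed form. The sketch below computes the Kullback–Leibler and Jeffreys (symmetrized) divergences between two multivariate normal distributions with arbitrary example parameters; the skew-normal extension developed in the paper is not reproduced here.

```python
import numpy as np

def kl_mvn(mu0, S0, mu1, S1):
    """Closed-form KL divergence D(N0 || N1) between multivariate normals."""
    k = len(mu0)
    S1_inv = np.linalg.inv(S1)
    diff = mu1 - mu0
    return 0.5 * (np.trace(S1_inv @ S0)
                  + diff @ S1_inv @ diff
                  - k
                  + np.log(np.linalg.det(S1) / np.linalg.det(S0)))

def jeffreys(mu0, S0, mu1, S1):
    """Jeffreys divergence J = D(P || Q) + D(Q || P)."""
    return kl_mvn(mu0, S0, mu1, S1) + kl_mvn(mu1, S1, mu0, S0)

mu0, S0 = np.zeros(2), np.eye(2)
mu1, S1 = np.array([1.0, 0.0]), np.array([[1.5, 0.3], [0.3, 1.0]])
print(jeffreys(mu0, S0, mu1, S1))
```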
Open Access Article: Geometry of q-Exponential Family of Probability Distributions
Entropy 2011, 13(6), 1170-1185; doi:10.3390/e13061170
Received: 11 February 2011 / Revised: 1 June 2011 / Accepted: 2 June 2011 / Published: 14 June 2011
Cited by 21
Abstract
The Gibbs distribution of statistical physics is an exponential family of probability distributions, which has a mathematical basis of duality in the form of the Legendre transformation. Recent studies of complex systems have found many distributions that obey power laws rather than the standard Gibbs-type distributions. The Tsallis q-entropy is a typical example capturing such phenomena. We treat the q-Gibbs distribution, or the q-exponential family, by generalizing the exponential function to the q-family of power functions, which is useful for studying various complex or non-standard physical phenomena. We give a new mathematical structure to the q-exponential family, different from those previously given. It has a dually flat geometrical structure derived from the Legendre transformation, and conformal geometry is useful for understanding it. The q-version of the maximum entropy theorem is naturally induced from the q-Pythagorean theorem. We also show that the maximizer of the q-escort distribution is a Bayesian MAP (Maximum A Posteriori Probability) estimator.
(This article belongs to the Special Issue Distance in Information and Statistical Physics Volume 2)
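
The q-deformation at the heart of this family is elementary to write down. A minimal sketch of the q-exponential and q-logarithm, which recover exp and log as q -> 1:

```python
import math

def exp_q(x, q):
    """Tsallis q-exponential: [1 + (1 - q) x]_+^{1/(1 - q)}."""
    if abs(q - 1.0) < 1e-12:
        return math.exp(x)
    base = max(1.0 + (1.0 - q) * x, 0.0)  # cut off at zero outside the support
    return base ** (1.0 / (1.0 - q))

def log_q(x, q):
    """q-logarithm (x > 0), the inverse of exp_q on its range."""
    if abs(q - 1.0) < 1e-12:
        return math.log(x)
    return (x ** (1.0 - q) - 1.0) / (1.0 - q)

for q in (0.5, 1.0, 1.5):
    print(q, exp_q(log_q(2.0, q), q))  # each line prints q and ~2.0
```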

Open Access Article: Distances in Probability Space and the Statistical Complexity Setup
Entropy 2011, 13(6), 1055-1075; doi:10.3390/e13061055
Received: 11 April 2011 / Accepted: 27 May 2011 / Published: 3 June 2011
Cited by 13
Abstract
Statistical complexity measures (SCM) are the composition of two ingredients: (i) entropies and (ii) distances in probability space. In consequence, SCMs provide a simultaneous quantification of the randomness and the correlational structures present in the system under study. We address in this review important topics underlying the SCM structure, viz., (a) a good choice of probability metric space and (b) how to assess the best distance choice, which in this context is called a “disequilibrium” and is denoted by the letter Q. Q, indeed the crucial SCM ingredient, is cast in terms of an associated distance D. Since our input data consist of time series, we also discuss the best way of extracting a probability distribution P from a time series. As an illustration, we show just how these issues affect the description of the classical limit of quantum mechanics.
(This article belongs to the Special Issue Distance in Information and Statistical Physics Volume 2)
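
The two-ingredient recipe can be sketched directly. Below, a statistical complexity measure is assembled from the normalized Shannon entropy and a Jensen-Shannon "disequilibrium" Q against the uniform distribution; the constant that would normalize Q to [0, 1] is omitted, so values are meaningful only up to a factor.

```python
import numpy as np

def shannon(p):
    """Shannon entropy in bits; zero-probability terms contribute nothing."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def statistical_complexity(p):
    """SCM sketch: C = H_norm * Q, with Q a Jensen-Shannon divergence
    from the uniform distribution (left unnormalized here)."""
    p = np.asarray(p, dtype=float)
    n = len(p)
    u = np.full(n, 1.0 / n)
    h_norm = shannon(p) / np.log2(n)   # randomness, in [0, 1]
    m = 0.5 * (p + u)
    q_diseq = shannon(m) - 0.5 * shannon(p) - 0.5 * shannon(u)
    return h_norm * q_diseq

print(statistical_complexity(np.array([1.0, 0.0, 0.0, 0.0])))  # 0.0: fully ordered
print(statistical_complexity(np.full(4, 0.25)))                # 0.0: fully random
print(statistical_complexity(np.array([0.6, 0.2, 0.1, 0.1])))  # > 0: in between
```

Complexity vanishes at both extremes (perfect order and perfect randomness) and peaks in between, which is exactly the behavior the entropy-times-disequilibrium composition is designed to capture.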
Open Access Article: Parametric Bayesian Estimation of Differential Entropy and Relative Entropy
Entropy 2010, 12(4), 818-843; doi:10.3390/e12040818
Received: 16 November 2009 / Revised: 28 March 2010 / Accepted: 2 April 2010 / Published: 9 April 2010
Cited by 14
Abstract
Given iid samples drawn from a distribution with known parametric form, we propose the minimization of expected Bregman divergence to form Bayesian estimates of differential entropy and relative entropy, and derive such estimators for the uniform, Gaussian, Wishart, and inverse Wishart distributions. Additionally, formulas are given for a log gamma Bregman divergence and the differential entropy and relative entropy for the Wishart and inverse Wishart. The results, as always with Bayesian estimates, depend on the accuracy of the prior parameters, but example simulations show that the performance can be substantially improved compared to maximum likelihood or state-of-the-art nonparametric estimators.
(This article belongs to the Special Issue Distance in Information and Statistical Physics Volume 2)
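
A toy version of the contrast studied here, with an assumed known mean and a non-informative prior rather than the paper's Bregman-optimal estimators: a plug-in maximum likelihood estimate of the differential entropy of a Gaussian versus a posterior-averaged estimate.

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(loc=0.0, scale=2.0, size=20)   # small iid Gaussian sample

# Plug-in (ML) estimate of the differential entropy of a Gaussian,
# h = 0.5 * log(2 * pi * e * sigma^2), using the ML variance.
var_ml = np.var(x)
h_ml = 0.5 * np.log(2 * np.pi * np.e * var_ml)

# A crude Bayesian alternative: average h over posterior draws of the
# variance (conjugate inverse-gamma posterior, known mean 0, vague prior);
# a sketch of "Bayesian averaging", not the paper's optimal estimator.
n = len(x)
a_post = 0.5 * n
b_post = 0.5 * np.sum(x**2)
var_draws = b_post / rng.gamma(a_post, 1.0, size=10000)  # inverse-gamma draws
h_bayes = np.mean(0.5 * np.log(2 * np.pi * np.e * var_draws))

print(h_ml, h_bayes)
```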
Open Access Article: Entropy and Divergence Associated with Power Function and the Statistical Application
Entropy 2010, 12(2), 262-274; doi:10.3390/e12020262
Received: 29 December 2009 / Revised: 20 February 2010 / Accepted: 23 February 2010 / Published: 25 February 2010
Cited by 11
Abstract
In statistical physics, Boltzmann-Shannon entropy provides a good understanding of the equilibrium states of a number of phenomena. In statistics, the entropy corresponds to the maximum likelihood method, in which the Kullback-Leibler divergence connects Boltzmann-Shannon entropy and the expected log-likelihood function. Maximum likelihood estimation is favored for its optimal performance, but this optimality is known to break down easily in the presence of even a small degree of model uncertainty. To deal with this problem, a new statistical method, closely related to Tsallis entropy, is proposed and shown to be robust to outliers, and we discuss a local learning property associated with the method.
(This article belongs to the Special Issue Distance in Information and Statistical Physics Volume 2)
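
The robustness idea can be illustrated with the closely related density power (beta-) divergence for a Gaussian location model; this is a generic sketch rather than the specific divergence constructed in the paper, and beta = 0.5 is an arbitrary choice.

```python
import numpy as np
from scipy.optimize import minimize_scalar

# Mostly N(0, 1) data contaminated by a few gross outliers at +10.
rng = np.random.default_rng(1)
x = np.concatenate([rng.normal(0.0, 1.0, 95), np.full(5, 10.0)])

def dpd_objective(mu, data, beta, sigma=1.0):
    """Empirical density-power-divergence objective for a Gaussian location
    model with known sigma; its minimizer approaches the MLE as beta -> 0,
    while larger beta downweights outliers."""
    f = np.exp(-0.5 * ((data - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))
    integral = (2 * np.pi * sigma**2) ** (-beta / 2) / np.sqrt(1 + beta)
    return integral - (1 + 1 / beta) * np.mean(f ** beta)

mu_ml = x.mean()  # maximum likelihood: dragged toward the outliers (~0.5)
mu_dpd = minimize_scalar(dpd_objective, bounds=(-3, 3), method="bounded",
                         args=(x, 0.5)).x
print(mu_ml, mu_dpd)  # the robust estimate stays near 0
```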
Open Access Article: Transport of Heat and Charge in Electromagnetic Metrology Based on Nonequilibrium Statistical Mechanics
Entropy 2009, 11(4), 748-765; doi:10.3390/e11040748
Received: 11 September 2009 / Accepted: 26 October 2009 / Published: 3 November 2009
Abstract
Current research is probing transport on ever smaller scales. Modeling of the electromagnetic interaction with nanoparticles or small collections of dipoles, and of its associated energy transport and nonequilibrium characteristics, requires a detailed understanding of transport properties. The goal of this paper is to use a nonequilibrium statistical-mechanical method to obtain exact time-correlation functions, fluctuation-dissipation theorems (FD), heat and charge transport, and associated transport expressions under electromagnetic driving. We extend the time-symmetric Robertson statistical-mechanical theory to study the exact time evolution of relevant variables and the entropy rate in the electromagnetic interaction with materials. In this exact statistical-mechanical theory, a generalized canonical density is used to define an entropy in terms of a set of relevant variables and associated Lagrange multipliers. The entropy production rate is then defined through the relevant variables. The influence of the nonrelevant variables enters the equations through the projection-like operator and thereby influences the entropy. We present applications to the response functions for the electrical and thermal conductivity, specific heat, generalized temperature, Boltzmann's constant, and noise. The analysis can be performed either classically or quantum-mechanically, and only a few modifications are needed to transfer between the approaches. As an application, we study the energy, generalized temperature, and charge transport equations that are valid in nonequilibrium, and relate them to heat flow and temperature relations in equilibrium states.
(This article belongs to the Special Issue Distance in Information and Statistical Physics Volume 2)

Journal Contact

MDPI AG
Entropy Editorial Office
St. Alban-Anlage 66, 4052 Basel, Switzerland
entropy@mdpi.com
Tel. +41 61 683 77 34
Fax: +41 61 302 89 18