
Fisher Information and Semiclassical Treatments

Departamento de Física, Universidad Católica del Norte, Av. Angamos 0610, Antofagasta, Chile
Instituto de Física La Plata (IFLP), Facultad de Ciencias Exactas, Universidad Nacional de La Plata (UNLP), Consejo Nacional de Investigaciones (CCT-CONICET), C.C. 727, (1900) La Plata, Argentina
Facultad de Ciencias Exactas y Naturales, Universidad Nacional de La Pampa, Uruguay 151, Santa Rosa, La Pampa, Argentina
Author to whom correspondence should be addressed.
Entropy 2009, 11(4), 972-992.
Received: 30 October 2009 / Accepted: 27 November 2009 / Published: 3 December 2009
(This article belongs to the Special Issue Maximum Entropy)


We review here the difference between quantum statistical treatments and semiclassical ones, using as the main concomitant tool a semiclassical, shift-invariant Fisher information measure built up with Husimi distributions. Its semiclassical character notwithstanding, this measure also contains abundant information of a purely quantal nature. Such a tool allows us to refine the celebrated Lieb bound for Wehrl entropies and to discover thermodynamic-like relations that involve the degree of delocalization. Fisher-related thermal uncertainty relations are developed and the degree of purity of canonical distributions, regarded as mixed states, is connected to this Fisher measure as well.

1. Introduction

A quarter of a century before Shannon, R.A. Fisher advanced a method to measure the information content of continuous (rather than digital) inputs, using not binary computer codes but the statistical distributions of classical probability theory [1]. Already in 1980, Wootters pointed out that Fisher's information measure (FIM) and quantum mechanics share a common formalism, in that both relate probabilities to the squares of continuous functions [2].
The present review draws on much interesting recent work devoted to the physical applications of Fisher's information measure (see, for instance, [1,3,4,5,6]). Frieden and Soffer [3] have shown that Fisher's information measure provides a powerful variational principle, that of extreme physical information, which yields most of the canonical Lagrangians of theoretical physics [1,3]. Additionally, FIM has been shown to provide a characterization of the "arrow of time" alternative to the one associated with Boltzmann's entropy [7,8]. Thus, unravelling the multiple FIM facets and their links to physics should be of general interest. The Legendre-transform structure of thermodynamics can be replicated as well, without any change, if one replaces the Boltzmann-Gibbs-Shannon entropy S by Fisher's information measure I. In particular, I possesses the all-important concavity property [5], and use of Fisher's measure allows for the development of a thermodynamics that seems able to treat equilibrium and non-equilibrium situations in a manner entirely similar to the conventional one [5]. Here, following [9], the focus of our attention will be the thermal description of the harmonic oscillator (HO).
The semiclassical approximation (SC) has had a long and distinguished history and remains today a very important weapon in the physics armory. It is indeed indispensable in many areas of scientific endeavor, and it facilitates, in many circumstances, an intuitive understanding of the underlying physics that may remain hidden in extensive numerical solutions of Schrödinger's equation. Although the SC approach is as old as quantum mechanics itself, it remains an active research area, as reported, for example, in [10] and [11].
Our emphasis in this review will be placed on the differences between (i) statistical treatments of a purely quantal nature and (ii) semiclassical ones. We will show that these differences can be neatly expressed entirely in terms of a special version of Fisher's information measure, to be called I_τ: the shift-invariant Fisher measure [1] associated with phase space. I_τ is a functional of a semiclassical distribution function, namely, the Husimi function μ(x,p). The phase-space measure I_τ will be shown to (1) refine the so-called Lieb bound [12] and (2) connect this refinement with delocalization in phase space, which can, of course, be visualized as information loss. I_τ will also be related to an interesting semiclassical measure introduced early on to characterize the same phenomenon: the Wehrl entropy W [12],
$$ W = -k_B\, \langle \ln \mu \rangle_\mu $$
for which Lieb established the above-cited lower bound W ≥ 1, a manifestation of the uncertainty principle [13]. Here k_B is Boltzmann's constant; henceforth we set k_B = 1, for the sake of simplicity.
For the convenience of the reader, in the following section we describe some fundamental aspects of the HO canonical-ensemble description from a coherent states’ viewpoint [9], the Husimi probability distribution function, and the Wehrl information measure.

2. Background Notions

2.1. HO’s coherent states

In [9] the authors discuss quantum-mechanical phase-space distributions expressed in terms of the celebrated coherent states |z⟩ of the harmonic oscillator, eigenstates of the annihilation operator â [14,15], i.e.,
$$ \hat{a}\,|z\rangle = z\,|z\rangle $$
with z a complex combination of the phase space coordinates x, p
$$ z = \frac{x}{2\sigma_x} + i\,\frac{p}{2\sigma_p} $$
where σ_x = (ℏ/2mω)^{1/2}, σ_p = (ℏmω/2)^{1/2}, and σ_x σ_p = ℏ/2.
Coherent states span Hilbert's space, constitute an over-complete basis, and obey the resolution of unity [15]
$$ \int \frac{d^2 z}{\pi}\; |z\rangle\langle z| = \int \frac{dx\,dp}{2\pi\hbar}\; |x,p\rangle\langle x,p| = \hat{1} $$
where the differential element of area in the z-plane is d²z = dx dp/(2ℏ), and the integration is carried out over the whole complex plane.
The coherent state | z can be expanded in terms of the states of the HO as follows
$$ |z\rangle = e^{-|z|^2/2} \sum_{n=0}^{\infty} \frac{z^n}{\sqrt{n!}}\; |n\rangle $$
where |n⟩ are eigenstates of the HO Hamiltonian Ĥ = ℏω(â†â + 1/2), and we have
$$ |\langle z|n\rangle|^2 = \frac{|z|^{2n}}{n!}\; e^{-|z|^2} $$

2.2. HO-expressions

We write down now, for future reference, the well-known quantal HO expressions for, respectively, the partition function Z, the entropy S, the mean energy U, the mean excitation energy E, the free energy F = U − TS, and the specific heat C [16]:
$$ Z = \frac{e^{-\beta\hbar\omega/2}}{1 - e^{-\beta\hbar\omega}} $$
$$ S = \frac{\beta\hbar\omega}{e^{\beta\hbar\omega} - 1} - \ln(1 - e^{-\beta\hbar\omega}) $$
$$ U = \frac{\hbar\omega}{2} + E $$
$$ E = \frac{\hbar\omega}{e^{\beta\hbar\omega} - 1} $$
$$ F = \frac{\hbar\omega}{2} + T \ln(1 - e^{-\beta\hbar\omega}) $$
$$ C = (\beta\hbar\omega)^2\, \frac{e^{\beta\hbar\omega}}{(e^{\beta\hbar\omega} - 1)^2} $$
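As a quick numerical sanity check of the expressions above, the following minimal sketch (not part of the original paper; the helper name `ho_thermo` is ours, and we work in units where ℏ = ω = k_B = 1) evaluates them and verifies the identities F = U − TS and F = −T ln Z:

```python
import math

def ho_thermo(beta):
    """Quantal HO quantities at inverse temperature beta (hbar = omega = k_B = 1)."""
    x = beta  # x = beta * hbar * omega
    Z = math.exp(-x / 2) / (1 - math.exp(-x))           # partition function
    E = 1 / (math.exp(x) - 1)                           # mean excitation energy
    U = 0.5 + E                                         # mean energy
    S = x * E - math.log(1 - math.exp(-x))              # entropy
    F = 0.5 + (1 / beta) * math.log(1 - math.exp(-x))   # free energy
    C = x**2 * math.exp(x) / (math.exp(x) - 1)**2       # specific heat
    return Z, E, U, S, F, C

Z, E, U, S, F, C = ho_thermo(0.7)
assert abs(F - (U - S / 0.7)) < 1e-12          # F = U - T S
assert abs(F - (-math.log(Z) / 0.7)) < 1e-12   # F = -T ln Z
```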

2.3. Husimi probability distribution

In the wake of a discussion advanced in [17], we will be mainly concerned with building "Husimi–Fisher" bridges. It is well known that the oldest and most elaborate phase-space (PS) formulation of quantum mechanics is that of Wigner [18,19]. To every quantum state a PS function (the Wigner one) can be assigned. This PS function can, regrettably enough, assume negative values, so that a probabilistic interpretation becomes questionable. Such a limitation was overcome, among others, by Husimi [20]. In terms of the concomitant Husimi probability distributions, quantum mechanics can be completely reformulated [21,22,23,24]. This phase-space distribution has the form
$$ \mu(x,p) \equiv \mu(z) = \langle z|\hat{\rho}|z\rangle $$
where ρ ^ is the density operator of the system and | z are the coherent states (see, for instance, [25] and references therein). The function μ ( x , p ) is normalized in the fashion
$$ \int \frac{dx\,dp}{2\pi\hbar}\; \mu(x,p) = 1 $$
For a thermal-equilibrium case, ρ̂ = Z⁻¹ e^{−βĤ}, where Z = Tr(e^{−βĤ}) is the partition function, β = 1/T, and T is the temperature. Specializing to the HO of frequency ω, with eigenstates |n⟩ associated with the eigenenergies E_n = ℏω(n + 1/2), one has
$$ \langle z|\hat{\rho}|z\rangle = \frac{1}{Z} \sum_{n} e^{-\beta E_n}\; |\langle z|n\rangle|^2 $$
with | z | n | 2 given by Equation (6), and the normalized Husimi probability distribution is
$$ \mu(z) = (1 - e^{-\beta\hbar\omega})\; e^{-(1 - e^{-\beta\hbar\omega})\,|z|^2} $$
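As a small illustration (a sketch of ours, in units ℏ = ω = 1), one can check the normalization of μ numerically: with the measure d²z/π and the substitution u = |z|², the normalization integral reduces to ∫₀^∞ I_τ e^{−I_τ u} du = 1, with I_τ = 1 − e^{−βℏω}:

```python
import math

beta = 1.3
I_tau = 1 - math.exp(-beta)   # I_tau = 1 - e^{-beta*hbar*omega}

# midpoint-rule integration over u = |z|^2  (d^2z/pi -> du in polar coordinates)
du, umax = 1e-3, 80.0
norm, u = 0.0, 0.5 * du
while u < umax:               # the exponential tail beyond u ~ 80 is negligible
    norm += I_tau * math.exp(-I_tau * u) * du
    u += du
assert abs(norm - 1.0) < 1e-3
```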

2.4. Wehrl entropy

The Wehrl entropy is defined as [12]
$$ W = -\int \frac{dx\,dp}{2\pi\hbar}\; \mu(x,p)\, \ln \mu(x,p) $$
where μ(x,p) is the "classical" distribution function (13) associated with the density operator ρ̂ of the system. The uncertainty principle manifests itself through the inequality W ≥ 1, which was first conjectured by Wehrl [12] and later proved by Lieb [13]; equality holds if ρ̂ is a coherent state. For the thermal HO, after integration over all of phase space, the Wehrl entropy turns out to be [9]
$$ W = 1 - \ln(1 - e^{-\beta\hbar\omega}) $$
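This closed form is easy to cross-check. The sketch below (ours, not from the paper; units ℏ = ω = k_B = 1, with an illustrative helper `wehrl_numeric`) integrates −μ ln μ over u = |z|² by a midpoint rule and confirms Lieb's bound:

```python
import math

def wehrl_numeric(beta, du=1e-3, umax=80.0):
    """Wehrl entropy of the HO Husimi Gaussian, by direct quadrature over u = |z|^2."""
    I_tau = 1 - math.exp(-beta)
    W, u = 0.0, 0.5 * du
    while u < umax:
        mu = I_tau * math.exp(-I_tau * u)   # Husimi distribution at u = |z|^2
        W -= mu * math.log(mu) * du
        u += du
    return W

beta = 0.9
W_closed = 1 - math.log(1 - math.exp(-beta))
assert abs(wehrl_numeric(beta) - W_closed) < 1e-3
assert W_closed >= 1.0                      # Lieb's bound W >= 1
```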

3. Fisher’s Information Measure

Let us consider a system that is specified by a physical parameter θ, while x is a real stochastic variable and f θ ( x ) , which in turn depends on the parameter θ, is the probability density for x . An observer makes a measurement of x and estimates θ from this measurement, represented by θ ˜ = θ ˜ ( x ) . One wonders how well θ can be determined. Estimation theory [26] asserts that the best possible estimator θ ˜ ( x ) , after a very large number of x -samples is examined, suffers a mean-square error e 2 from θ that obeys a relationship involving Fisher’s I, namely, I e 2 = 1 , where the Fisher information measure I is of the form
$$ I(\theta) = \int dx\; f_\theta(x) \left[ \frac{\partial \ln f_\theta(x)}{\partial \theta} \right]^2 $$
This "best" estimator is called the efficient estimator. Any other estimator must have a larger mean-square error, the only proviso being that all estimators be unbiased, i.e., satisfy ⟨θ̃(x)⟩ = θ. Thus, Fisher's information measure has a lower bound, in the sense that, no matter what parameter of the system we choose to measure, I has to be larger than or equal to the inverse of the mean-square error associated with the concomitant experiment. This result, I e² ≥ 1, is referred to as the Cramer–Rao bound [1,27]. A particular I-case is of great importance: that of translation families [1,4], i.e., distribution functions (DF) whose form does not change under θ-displacements. These DF are shift-invariant (à la Mach, no absolute origin for θ), and for them Fisher's information measure adopts the somewhat simpler appearance [1]
$$ I = \int dx\; f(x) \left[ \frac{\partial \ln f(x)}{\partial x} \right]^2 $$
Fisher's measure is additive [1]: if x and p are independent variables, I(x+p) = I(x) + I(p). Notice that, for τ = (x,p), a point in phase space, we face a shift-invariance situation. Since, in defining z in terms of the variables x and p, these are scaled by their respective variances, the Fisher measure associated with the probability distribution μ(x,p) will be of the form [4]
$$ I_\tau = \int \frac{dx\,dp}{2\pi\hbar}\; \mu(x,p)\, A $$
$$ A = \sigma_x^2 \left[ \frac{\partial \ln \mu(x,p)}{\partial x} \right]^2 + \sigma_p^2 \left[ \frac{\partial \ln \mu(x,p)}{\partial p} \right]^2 $$
Given the μ-expression (16), I τ becomes
$$ I_\tau = 1 - e^{-\beta\hbar\omega} $$
which, immediately yields
$$ I_\tau\; e^2_{|z|}(\beta,\omega) = 1 \qquad (\text{CR bound reached}) $$
We realize at this point that the Fisher measure built up with Husimi distributions is best employed to estimate the "phase-space position" |z|. Further, efficient estimation is possible at all temperatures, a rather significant result. Comparison with Equation (18) allows one now to write
$$ W = 1 - \ln I_\tau \;\;\Longleftrightarrow\;\; W + \ln I_\tau = 1 $$
Since both W and I_τ are positive-definite quantities, (25) tells us that they are complementary informational quantities: if one of them gains, the other loses. Following Anderson et al. [9], let us now analyze the high- and low-temperature limits. Given the form (23), when the temperature goes to zero (β → ∞), I_τ → 1, its maximum possible value, since we know that the ground state will be the only one to be populated. If, on the other hand, the temperature tends to infinity (β → 0), then I_τ ≈ βℏω tends to zero, because we know beforehand that, in this limit, all energy levels become uniformly populated, and the uniform distribution is that of maximum ignorance [28,29,30,31]. The range of I_τ is 0 ≤ I_τ ≤ 1, and that of W is 1 ≤ W < ∞, as we can see in Figure 1. Using I_τ together with Equation (7) we notice that
$$ I_\tau = \frac{e^{-\beta\hbar\omega/2}}{Z} $$
so that it coincides with the canonical ensemble probability for finding the system in its ground state.
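The three statements above (Cramer–Rao bound attained, complementarity with W, and the ground-state-occupation reading of I_τ) can be sketched numerically as follows (our illustration, units ℏ = ω = 1; the helper name `checks` is ours, and the mean-square error in estimating |z| is computed as ⟨|z|²⟩ by quadrature):

```python
import math

def checks(beta, du=1e-3, umax=80.0):
    """Return (CR product, W + ln I_tau, I_tau, ground-state occupation)."""
    I_tau = 1 - math.exp(-beta)
    # <|z|^2> under the Husimi Gaussian, via midpoint rule over u = |z|^2
    mean_z2, u = 0.0, 0.5 * du
    while u < umax:
        mean_z2 += u * I_tau * math.exp(-I_tau * u) * du
        u += du
    cr_product = I_tau * mean_z2                     # should equal 1
    W = 1 - math.log(I_tau)                          # Wehrl entropy
    Z = math.exp(-beta / 2) / (1 - math.exp(-beta))  # HO partition function
    p_gs = math.exp(-beta / 2) / Z                   # ground-state occupation
    return cr_product, W + math.log(I_tau), I_tau, p_gs

cr, comp, I_tau, p_gs = checks(1.0)
assert abs(cr - 1.0) < 1e-3        # CR bound reached at all temperatures
assert abs(comp - 1.0) < 1e-12     # W + ln(I_tau) = 1
assert abs(I_tau - p_gs) < 1e-12   # I_tau = canonical ground-state probability
```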
Figure 1. Fisher ( I τ ) and Wehrl (W) information measures vs. T (in ω units) for HO-Husimi distribution.

4. Fisher, Thermodynamics’ Third Law, and Thermodynamic Quantities

Consider now the general definition (19) of Fisher’s information measure in terms of the DF μ ( x , p )
$$ I_\beta = \int \frac{dx\,dp}{2\pi\hbar}\; \mu(x,p) \left[ \frac{\partial \ln \mu(x,p)}{\partial \beta} \right]^2 $$
with θ ≡ β the parameter to be estimated. Since
$$ \frac{\partial \ln \mu(x,p)}{\partial \beta} = \frac{\hbar\omega}{e^{\beta\hbar\omega} - 1} \left[ 1 - (1 - e^{-\beta\hbar\omega})\,|z|^2 \right] $$
one readily ascertains that (i) the μ-mean value of (28) vanishes and (ii)
$$ I_\beta = \left( \frac{\hbar\omega}{e^{\beta\hbar\omega} - 1} \right)^2; \qquad T \in [0,\infty) \;\Rightarrow\; I_\beta \in [0,\infty) $$
which, in view of (12), entails
$$ I_\beta = e^{-\beta\hbar\omega}\; \beta^{-2}\; C $$
Reflection upon the I_β-range (29) might lead one to conclude that it constitutes a Fisher manifestation of thermodynamics' third law: not only Shannon's measure but also Fisher's (for the HO, at least) vanishes at zero temperature. Replacing now (23) and (29) into the entropy expression (8) we immediately arrive at the relation
$$ S = \beta\, I_\beta^{1/2} - \ln I_\tau $$
The HO entropy can be expressed as the sum of two terms: one associated with the Fisher information I β and the other with the Fisher information for translation families I τ corresponding to the phase space variables ( x , p ) . Using Equation (7) we also have
$$ \ln I_\tau = -\frac{\beta\hbar\omega}{2} - \ln Z = -\left( \beta E_{gs} + \ln Z \right) $$
with E g s denoting the ground state energy. Thus,
$$ S = \beta \left( \frac{\hbar\omega}{2} + I_\beta^{1/2} \right) + \ln Z $$
which is to be compared with the well-known canonical-ensemble expression connecting S and the mean energy U [16],
$$ S = \ln Z + \beta U $$
We see that I_β is related to the excited-spectrum contribution to U, while I_τ is linked to the partition function. We will look now for a new connection between the Fisher measures I_τ and I_β. From (29) it is possible to rewrite I_β in the form
$$ I_\beta = \left( \frac{\hbar\omega\; e^{-\beta\hbar\omega}}{1 - e^{-\beta\hbar\omega}} \right)^2 $$
and therefore
$$ I_\tau\; I_\beta^{1/2} = \hbar\omega\; e^{-\beta\hbar\omega} = -\frac{\partial}{\partial \beta}\, e^{-\beta\hbar\omega} $$
i.e., the product on the left-hand side is, up to a sign, the β-derivative of the Boltzmann factor (energy-wise constant) at the inverse temperature β. In other words, I_τ I_β^{1/2} measures the β-gradient of the Boltzmann factor.
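A numerical sketch of these I_β relations (ours, not from the paper; units ℏ = ω = k_B = 1, with the Boltzmann-factor gradient estimated by a central difference):

```python
import math

beta = 0.8
I_tau = 1 - math.exp(-beta)
E = 1 / (math.exp(beta) - 1)                  # mean excitation energy
I_beta = E**2                                 # I_beta = [hw/(e^{b hw}-1)]^2
C = beta**2 * math.exp(beta) / (math.exp(beta) - 1)**2

# relation to the specific heat: I_beta = e^{-beta} * beta^{-2} * C
assert abs(I_beta - math.exp(-beta) * C / beta**2) < 1e-12

# I_tau * sqrt(I_beta) equals minus the beta-derivative of the Boltzmann factor
db = 1e-6
grad = (math.exp(-(beta + db)) - math.exp(-(beta - db))) / (2 * db)
assert abs(I_tau * math.sqrt(I_beta) + grad) < 1e-6

# entropy decomposition S = beta * sqrt(I_beta) - ln(I_tau)
S = beta * E - math.log(1 - math.exp(-beta))
assert abs(S - (beta * math.sqrt(I_beta) - math.log(I_tau))) < 1e-12
```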
Equation (23) implies, via Equations (7) to (12), that the quantal HO expressions for the most important thermodynamic quantities can be expressed in terms of the semiclassical measure I_τ. To this end we define the semiclassical free energy
$$ F_{sc} = T \ln I_\tau $$
which is the semiclassical contribution to the HO free energy F = ℏω/2 + F_sc. The thermodynamic quantities can therefore be expressed as follows:
$$ Z = \frac{e^{-\beta\hbar\omega/2}}{I_\tau} $$
$$ E = \hbar\omega\; \frac{1 - I_\tau}{I_\tau} $$
$$ S = \beta\hbar\omega\; \frac{1 - I_\tau}{I_\tau} - \frac{F_{sc}}{T} $$
$$ C = (\beta\hbar\omega)^2\; \frac{1 - I_\tau}{I_\tau^2} $$
which shows that the semiclassical, Husimi-based I_τ information measure does contain purely quantum statistical information. Furthermore, since from Helmholtz' free energy F we can derive all of the HO quantal thermodynamics [16], we see that the HO quantum thermostatistics is, as far as information is concerned, entirely of a semiclassical nature, being completely expressed in terms of a semiclassical measure. We emphasize thus the fact that the semiclassical quantity I_τ contains all the relevant HO statistical quantum information.
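The claim is easily verified numerically. The sketch below (ours; units ℏ = ω = k_B = 1) recovers Z, E, S, and C from I_τ = 1 − e^{−βℏω} alone and compares them with the standard quantal expressions:

```python
import math

beta = 1.7
I_tau = 1 - math.exp(-beta)

# quantities recovered purely from the semiclassical measure I_tau
Z_from_I = math.exp(-beta / 2) / I_tau
E_from_I = (1 - I_tau) / I_tau
S_from_I = beta * (1 - I_tau) / I_tau - math.log(I_tau)   # -F_sc/T = -ln I_tau
C_from_I = beta**2 * (1 - I_tau) / I_tau**2

# standard quantal HO expressions, for comparison
Z = math.exp(-beta / 2) / (1 - math.exp(-beta))
E = 1 / (math.exp(beta) - 1)
S = beta * E - math.log(1 - math.exp(-beta))
C = beta**2 * math.exp(beta) / (math.exp(beta) - 1)**2

for a, b in ((Z_from_I, Z), (E_from_I, E), (S_from_I, S), (C_from_I, C)):
    assert abs(a - b) < 1e-12
```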

5. HO-Semiclassical Fisher’s Measure

5.1. MaxEnt approach

All the previous results are exact; no reference whatsoever has been made to Jaynes' Maximum Entropy Principle (MaxEnt) [31] up to this point. We wish now to consider a MaxEnt viewpoint. It is shown in [14] that the expectation value of the HO Hamiltonian with respect to the coherent state |z⟩ is the sum of the ground-state energy ℏω/2 plus a semiclassical excitation energy. One has for the semiclassical excitation HO energy e(z) at z [14]
$$ e(z) = \langle z|H|z\rangle - \frac{\hbar\omega}{2} = \hbar\omega\,|z|^2, \qquad \text{i.e.,} \qquad E_\nu = \hbar\omega\; \langle |z|^2 \rangle_\nu $$
where the last expectation value is computed using the distribution ν ( z ) . This semiclassical excitation energy E μ is given, for a Husimi distribution μ, by [25]
$$ E_\mu = \langle e(z) \rangle_\mu = \frac{\hbar\omega}{I_\tau} $$
Note now that, from Equation (16), we can conveniently recast the HO-expression for μ into the Gaussian fashion
$$ \mu(z) = I_\tau\; e^{-I_\tau |z|^2} $$
peaked at the origin. The probability density μ of Equation (44) is clearly of MaxEnt form [31].
As a consequence, it proves convenient, at this stage, to view I_τ in the following light. The semiclassical form of the entropy S has been exhaustively studied by Wehrl: it is (cf. Section 1) Shannon's information measure evaluated with Husimi distributions [12]. Assume we know a priori the value E_ν = ℏω⟨|z|²⟩_ν, and we wish to determine the distribution ν(z) that maximizes the Wehrl entropy subject to this E_ν constraint. The MaxEnt distribution will be [31]
$$ \nu(z) = e^{-\lambda_o}\; e^{-\eta\, e(z)} $$
with λ_o the normalization Lagrange multiplier and η the multiplier associated with E_ν. According to MaxEnt tenets we have [31]
$$ \lambda_o = \lambda_o(\eta) = \ln \int \frac{d^2z}{\pi}\; e^{-\eta\hbar\omega|z|^2} = -\ln(\eta\hbar\omega) $$
Now, the η multiplier is determined by the relation [31]
$$ E_\nu = -\frac{\partial \lambda_o}{\partial \eta} = \frac{1}{\eta} $$
If we choose the Fisher–Husimi constraint given by Equation (43), this results in η = I_τ/(ℏω); from Equation (46) we then get λ_o = −ln I_τ, i.e., e^{−λ_o} = I_τ, and we consequently arrive at the desired outcome
$$ \nu(z) = I_\tau\; e^{-I_\tau|z|^2} \equiv \mu(z) $$
We have thus shown that the HO-Husimi distributions are MaxEnt-ones with the semiclassical excitation energy (43) as a constraint. It is clear from Equation (48) that I τ plays there the role of an “inverse temperature”.
The preceding argument suggests that we are tacitly envisioning the existence of a quantity T_W (the inverse of η), associated with the Wehrl measure being extremized here. This Wehrl temperature T_W governs the width of our Gaussian Husimi distribution, on account of
$$ \mu(z) = I_\tau\; e^{-(I_\tau/\hbar\omega)\,\hbar\omega|z|^2} = I_\tau\; e^{-e(z)/T_W} $$
which entails
$$ T_W = \frac{\hbar\omega}{I_\tau} $$
and it is easy to see from the range of I_τ that the range of T_W is ℏω ≤ T_W < ∞. Due to the semiclassical nature of both W and μ, T_W has a lower bound greater than zero.
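The MaxEnt construction above can be sketched by direct quadrature (our illustration; units ℏ = ω = 1, variable names ours): with η = I_τ, the distribution ν normalizes with λ_o = −ln η and reproduces the constraint value E_ν = 1/η.

```python
import math

beta = 1.2
I_tau = 1 - math.exp(-beta)
eta = I_tau                      # eta = I_tau/(hbar*omega), with hbar*omega = 1
lam0 = -math.log(eta)            # lambda_o = -ln(eta*hbar*omega)

# midpoint-rule integration over u = |z|^2  (d^2z/pi -> du)
du, umax = 1e-3, 80.0
norm = mean_e = 0.0
u = 0.5 * du
while u < umax:
    nu = math.exp(-lam0) * math.exp(-eta * u)   # MaxEnt distribution nu(z)
    norm += nu * du                             # normalization
    mean_e += u * nu * du                       # <e(z)> with e(z) = omega*|z|^2
    u += du
assert abs(norm - 1.0) < 1e-3
assert abs(mean_e - 1.0 / eta) < 1e-2           # E_nu = 1/eta
```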

5.2. Delocalization

The two quantities W and I τ have been shown to be related, for the HO, according to Equation (25). Since the Wehrl temperature T W yields the width of our Gaussian Husimi distribution, we can conceive of introducing a “delocalization factor” D
$$ D = \frac{T_W}{\hbar\omega} $$
The above definition leads to the relation
$$ W = 1 + \ln T_W - \ln(\hbar\omega) = 1 + \ln D $$
As stressed above, W was constructed as a delocalization measure [12]. The preceding considerations clearly motivate one to regard the Fisher measure built up with Husimi distributions as a "localization estimator" in phase space. The HO-Gaussian expression (44) for μ makes clear that the Fisher measure controls both the height and the spread (which is [2I_τ]⁻¹) of the distribution. Obviously, the spread is here a "phase-space delocalization indicator", a fact reflected by the quantity D introduced above.
Thus, an original physical interpretation of Fisher's measure emerges: localization control. The inverse of the Fisher measure, D = 1/I_τ, turns out to be a delocalization indicator. Differentiating Fisher's measure (23) with respect to T, notice also that
$$ \frac{dI_\tau}{dT} = -\frac{\hbar\omega}{T^2}\; e^{-\beta\hbar\omega} $$
so that Fisher's information decreases exponentially as the temperature grows: our Gaussian distribution loses phase-space "localization" as energy and/or temperature is injected into the system, as reflected via T_W or D. Notice that (52) complements the Lieb bound W ≥ 1: it tells us by just how much W exceeds unity, namely, by virtue of delocalization effects. Moreover, this fact can be expressed using the shift-invariant Fisher measure. We will now show that D is proportional to the system's energy fluctuations.

5.3. Second moment of the Husimi distribution

The second moment of the Husimi distribution μ ( z ) is an important measure to ascertain the “degree of complexity” of a quantum state (see below). It is a measure of the delocalization-degree of the Husimi distribution in phase space (see Reference [32] for details and discussions). It is defined as
$$ M_2 = \int \frac{d^2z}{\pi}\; \mu^2(z) $$
which, after explicit evaluation of M_2 using Equation (44), reads
$$ M_2 = \frac{I_\tau}{2} $$
Using now (52) we conclude that
$$ M_2(D) = \frac{1}{2D} $$
Thus, the energy fluctuations turn out to be
$$ \Delta_\mu e = \frac{\hbar\omega}{I_\tau} = \hbar\omega\, D $$
with (Δ_μ e)² = ⟨e²⟩_μ − E_μ². As a consequence, we get
$$ D = \frac{\Delta_\mu e}{\hbar\omega} $$
An important result is thus obtained: the delocalization factor D represents the energy fluctuations expressed in ℏω units. Delocalization is clearly seen to be the counterpart of energy fluctuations.
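Both statements (M_2 = I_τ/2 and Δ_μe = ℏωD) can be sketched by quadrature over u = |z|² (our illustration; units ℏ = ω = 1):

```python
import math

beta = 1.1
I_tau = 1 - math.exp(-beta)
du, umax = 1e-3, 80.0
M2 = e1 = e2 = 0.0
u = 0.5 * du
while u < umax:
    mu = I_tau * math.exp(-I_tau * u)   # Husimi distribution at u = |z|^2
    M2 += mu * mu * du                  # second moment  int (d^2z/pi) mu^2
    e1 += u * mu * du                   # <e(z)>   with e(z) = omega*|z|^2
    e2 += u * u * mu * du               # <e(z)^2>
    u += du

D = 1.0 / I_tau                         # delocalization factor
assert abs(M2 - I_tau / 2) < 1e-3               # M_2 = I_tau/2 = 1/(2D)
assert abs(math.sqrt(e2 - e1**2) - D) < 1e-3    # energy fluctuation = omega*D
```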

6. Thermodynamics-Like Relations

Let us now go back to Equation (37) and revisit the entropic expression. One immediately realizes that the entropy S can be recast in terms of the quantal mean excitation energy E and the delocalization factor D as
$$ \frac{E}{T} = S - \ln D $$
i.e., if one injects into the system some excitation energy E, expressed in "natural" T units, it is apportioned partly as heat dissipation, via S, and partly via delocalization. More precisely, the part of this energy that is not dissipated is the part employed to delocalize the system in phase space. Now, since W = 1 − ln I_τ = 1 + ln D, the above equation can be recast in the alternative forms
$$ S = \frac{E}{T} + \ln D = \frac{E}{T} - \ln I_\tau $$
or
$$ W = 1 + S - \frac{E}{T} $$
$$ W - S \to 0 \quad \text{for} \quad T \to \infty $$
which is a physically sensible result and
$$ W - S \to 1 \quad \text{for} \quad T \to 0 $$
as it should, since S = 0 at T = 0 (third law of thermodynamics), while W there attains its Lieb lower bound of unity.
One finds in Equation (60) some degree of resemblance to thermodynamics' first law. To reassure ourselves in this respect, we slightly changed the underlying canonical probabilities μ, multiplying them by factors δF = (random number)/100. Specifically, we generated random numbers according to the normal distribution and divided them by 100 to obtain the factors δF. This process leads to new "perturbed" probabilities P = μ + δμ, conveniently normalized. With them we evaluated the concomitant changes dS and dE (we did this 50 times, with different random numbers in each instance), and were able to verify numerically that dS − β dE ≈ 0. The concomitant results are plotted in Figure 2. Since, as stated, numerically dS = (1/T) dE, Equation (60) entails dI_τ/I_τ ≈ 0. The physical connotations are as follows: if the only modification effected is a change δμ [16] in the canonical distribution μ, the system undergoes a heat-transfer process [16] for which thermodynamics' first law implies dU = T dS. This is numerically confirmed in the plots of Figure 2. The null contribution of ln I_τ to this process suggests that delocalization (not a thermodynamic effect, but a dynamic one) can be regarded as behaving (thermodynamically) like a kind of "work".
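The experiment just described can be reproduced along the following lines (a sketch of ours, not the authors' original code; units ℏ = ω = k_B = 1, levels truncated at a finite n_max):

```python
import math
import random

random.seed(2)
beta, n_max = 1.0, 200

# canonical probabilities p_n over the (truncated) HO excitation spectrum
p = [math.exp(-beta * n) for n in range(n_max)]
s = sum(p)
p = [x / s for x in p]

def entropy(q):
    return -sum(x * math.log(x) for x in q if x > 0)

def exc_energy(q):                       # mean excitation energy sum_n n*q_n
    return sum(n * x for n, x in enumerate(q))

S0, E0 = entropy(p), exc_energy(p)
for _ in range(50):
    # perturb each probability by a normal random factor of order 1/100
    q = [x * (1 + random.gauss(0.0, 1.0) / 100) for x in p]
    s = sum(q)
    q = [x / s for x in q]               # conveniently normalized
    dS, dE = entropy(q) - S0, exc_energy(q) - E0
    # heat-transfer relation dS = beta*dE, up to second-order residuals
    assert abs(dS - beta * dE) < 2e-3
```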
Figure 2. Numerical computation results for the HO: changes d U and d I τ vs. d S that ensue after randomly generating variations δ p i in the underlying microscopic canonical probabilities p i .
Now, since (a) I_τ = 1 − e^{−βℏω} and (b) the mean excitation energy is E = ℏω/(e^{βℏω} − 1), one also finds, for the quantum-semiclassical difference (QsCD) W − S, the result
$$ W - S = 1 + \frac{1 - I_\tau}{I_\tau}\; \ln(1 - I_\tau) \equiv F_1(I_\tau) $$
Moreover, since 0 ≤ F_1(I_τ) ≤ 1, we see that always W ≥ S, as expected, since the semiclassical treatment contains less information than the quantal one. Note that the QsCD can be expressed exclusively in Fisher-information terms: the quantum-semiclassical entropic difference W − S may be given in terms of I_τ only. Figure 3 depicts S, βE, and ln D vs. the dimensionless quantity t = T/ℏω. Accordingly, entropy is apportioned in such a way that
  • part of it originates from excitation energy and
  • the remaining is accounted for by phase space delocalization.
A bit of algebra allows one now to express the rate of entropic change per unit temperature increase as
$$ \frac{dS}{dT} = \beta\, \frac{dE}{dT} = \beta C = \frac{\hbar\omega}{T}\; \frac{dD}{dT} $$
$$ C = \hbar\omega\; \frac{dD}{dT} $$
Figure 3. S, ln D , and β E as a function of t = T / ω .
In the case of the one-dimensional HO we see that the specific heat measures the delocalization change per unit temperature increase. Also, dE/dT ∝ dD/dT, providing us with a very simple relationship between mean-excitation-energy changes and delocalization ones:
$$ \frac{dE}{dD} = \hbar\omega $$
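These thermodynamics-like relations can be sketched with a simple finite-difference check (ours; units ℏ = ω = k_B = 1):

```python
import math

def D_of(T):
    """Delocalization factor D = 1/I_tau at temperature T."""
    return 1.0 / (1 - math.exp(-1.0 / T))

def E_of(T):
    """Mean excitation energy at temperature T."""
    return 1.0 / (math.exp(1.0 / T) - 1)

T, dT = 0.8, 1e-6
beta = 1.0 / T
C = beta**2 * math.exp(beta) / (math.exp(beta) - 1)**2    # specific heat

dD_dT = (D_of(T + dT) - D_of(T - dT)) / (2 * dT)
dE_dT = (E_of(T + dT) - E_of(T - dT)) / (2 * dT)

assert abs(C - dD_dT) < 1e-5               # C = omega * dD/dT
assert abs(dE_dT / dD_dT - 1.0) < 1e-6     # dE/dD = omega = 1
```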

7. On Thermal Uncertainties

Additional considerations are in order with regard to thermal uncertainties, which express the effect of temperature on Heisenberg's celebrated relations (see, for instance, [6,33,34,35]). We use now a result obtained in [9] (Equation (3.12)), where the authors cast Wehrl's information measure in terms of the variances Δ_μx and Δ_μp, obtaining
$$ W = \ln \left( \frac{e\; \Delta_\mu x\, \Delta_\mu p}{\hbar} \right) $$
In the present context, the relation W = 1 ln I τ allows us to conclude that [17]
$$ I_\tau\; \Delta_\mu x\, \Delta_\mu p = \hbar $$
which can be regarded as a "Fisher uncertainty principle" and adds still another meaning to I_τ: since, necessarily, Δ_μx Δ_μp ≥ ℏ/2, it is clear that I_τ/2 is the "correcting factor" that permits one to reach the uncertainty lower bound ℏ/2, a rather interesting result.
Phase-space "localization" is possible, with Husimi distributions, only up to ℏ [14]. This is to be compared with the uncertainties evaluated in a purely quantal fashion, without using Husimi distributions, and in particular with a previous result of [17]. With the virial theorem [16] one easily ascertains [17] that
$$ \Delta x\, \Delta p = \frac{\hbar}{2}\; \frac{e^{\beta\hbar\omega} + 1}{e^{\beta\hbar\omega} - 1} $$
which, together with (69), yields
$$ \Delta_\mu x\, \Delta_\mu p = \frac{2\; \Delta x\, \Delta p}{1 + e^{-\beta\hbar\omega}} $$
Thus we see that, as β → ∞, Δ_μ ≡ Δ_μx Δ_μp becomes twice the minimum quantum value ℏ/2 for Δx Δp, so that Δ_μ → ℏ, the "minimal" phase-space cell. The quantum and semiclassical results do coincide at very high temperature, though. Indeed, one readily verifies [17] that Heisenberg's uncertainty relation, as a function of both frequency and temperature, is governed by a thermal "uncertainty function" F that acquires the aspect
$$ F(\beta,\omega) = \Delta x\, \Delta p = \frac{1}{2} \left[ \Delta_\mu + \frac{E}{\omega} \right] $$
Coming back to results derivable within the present context, we realize here that F can be recast as
$$ F(\beta,\omega) = \frac{1}{2} \left[ \hbar D + \frac{E}{\omega} \right] $$
so that, for T varying in [0,∞), the range of possible Δx Δp values is [ℏ/2, ∞). Equation (73) is a "Heisenberg–Fisher" thermal uncertainty relation (for a discussion of this concept see, for instance, [6,33,34]).
F(β,ω) grows with both E and D. The usual result ℏ/2 is attained for minimum D and zero excitation energy. As for dF/dT, one is able to set F ≡ F(E,D), since 2 dF = ℏ dD + ω⁻¹ dE. Remarkably enough, the two contributions to dF/dT are easily seen to be equal, and dF/dT → 1/ω for T → ∞. One can also write
$$ \left( \frac{\partial F}{\partial D} \right)_E = \frac{\hbar}{2}; \qquad \left( \frac{\partial F}{\partial E} \right)_D = \frac{1}{2\omega} $$
providing us with a thermodynamic "costume" for the uncertainty function F that sheds some new light on the meaning of both ℏ and ω. In particular, we see that ℏ/2 is the derivative of the uncertainty function F with respect to the delocalization factor D. Increases dF of the thermal uncertainty function F are of two types:
  • from the excitation energy, that supplies a C / ω contribution and
  • from the delocalization factor D.
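The uncertainty relations of this section can be sketched numerically as follows (our check; units ℏ = ω = 1):

```python
import math

for beta in (0.2, 1.0, 5.0):
    I_tau = 1 - math.exp(-beta)
    # quantal thermal uncertainty, (hbar/2) * coth(beta*hbar*omega/2)
    dxdp_q = 0.5 * (math.exp(beta) + 1) / (math.exp(beta) - 1)
    # semiclassical (Husimi) uncertainty, hbar/I_tau
    dxdp_mu = 1.0 / I_tau

    assert abs(I_tau * dxdp_mu - 1.0) < 1e-12                     # Fisher UP
    assert abs(dxdp_mu - 2 * dxdp_q / (1 + math.exp(-beta))) < 1e-12

    # thermal uncertainty function F = DxDp = (1/2)[hbar*D + E/omega]
    D = 1.0 / I_tau
    E = 1.0 / (math.exp(beta) - 1)
    assert abs(dxdp_q - 0.5 * (D + E)) < 1e-12
```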

8. Degrees of Purity Relations

8.1. Semiclassical purity

The quantal concept of degree of purity of a general density operator ρ ^ is expressed via Tr ρ ^ 2 [36,37]. Its inverse, the so-called participation ratio
$$ R = \frac{1}{\mathrm{Tr}\,\hat{\rho}^2} $$
is particularly convenient for calculations [38]. It varies from unity for pure states to N for totally mixed states [38]. It may be interpreted as the effective number of pure states that enter a quantum mixture. Here we will consider the “degree of purity” d μ of a semiclassical distribution, given by
$$ d_\mu = \int \frac{d^2z}{\pi}\; \mu^2(z) \le 1 $$
Clearly, d μ coincides with the second moment of the Husimi distribution (44) given by Equation (54), i.e.,
$$ d_\mu = M_2 = \frac{I_\tau}{2} $$
Using now (52) we relate the semiclassical degree of purity to the delocalization factor and to the Wehrl temperature T W
$$ d_\mu = \frac{1}{2D} = \frac{\hbar\omega}{2\,T_W} $$
and also to our semiclassical energy-fluctuations (57)
$$ d_\mu = \frac{\hbar\omega}{2\,\Delta_\mu e} $$
Since ℏω ≤ T_W, the "best" purity attainable at the semiclassical level equals one-half.

8.2. Quantal purity

For the quantum mixed HO state ρ̂ = e^{−βH}/Z, where H is the Hamiltonian of the harmonic oscillator and the partition function Z is given by Equation (7) [16], the degree of purity d_ρ̂ is given by (see the detailed study by Dodonov [35])
$$ d_{\hat{\rho}} = \frac{e^{-\beta\hbar\omega}}{Z^2} \sum_{n=0}^{\infty} e^{-2\beta\hbar\omega n} $$
leading to
$$ d_{\hat{\rho}} = \tanh(\beta\hbar\omega/2) $$
where 0 ≤ d_ρ̂ ≤ 1. Thus, Heisenberg's uncertainty relation can be cast in the fashion
$$ \Delta x\, \Delta p = \frac{\hbar}{2}\, \coth(\beta\hbar\omega/2) $$
where Δx and Δp are the quantum variances of the canonically conjugate observables x and p [35], so that
$$ \Delta x\, \Delta p = \frac{\hbar}{2}\; \frac{1}{d_{\hat{\rho}}} $$
which is to be compared with the semiclassical result derived above (cf. (71)).
We now relate the degree of purity of our thermal state, in both its quantal and semiclassical versions, to various physical quantities. Using Equations (71) and (77) we get
$$ d_\mu = \frac{I_\tau}{2} = (1 - d_\mu)\; d_{\hat{\rho}} $$
which leads to
$$ d_{\hat{\rho}} = \frac{d_\mu}{1 - d_\mu} = \frac{I_\tau}{2 - I_\tau} $$
$$ d_\mu = \frac{d_{\hat{\rho}}}{1 + d_{\hat{\rho}}} $$
which clearly shows that (i) d_μ ≤ d_ρ̂, and (ii) for a pure state, again, its semiclassical counterpart has a degree of purity equal to 1/2.
Additionally, on account of Equation (69), on the one hand, and since the semiclassical degree of purity reads d_μ = I_τ/2, on the other, we are led to an uncertainty relation for mixed states in terms of d_μ, namely,
$$ \Delta_\mu x\, \Delta_\mu p = \frac{\hbar}{2}\; \frac{1}{d_\mu} $$
which tells us just how the uncertainty grows as the participation ratio R = 1/d_μ augments. Equation (86) is of semiclassical origin, which makes it a bit different from the one that results from a purely quantal treatment (see [35], Equation (4)). Moreover, notice how information concerning the purely quantal notion of purity d_ρ̂ is already contained in the semiclassical measure I_τ.
Figure 4. Semiclassical purity d μ vs. T / h ν .
We appreciate the fact that R increases as delocalization grows, a quite sensible result. Figure 4 depicts d_μ(T), a monotonically decreasing function, which tells us that the degree of purity of a mixed state acts here as a thermometer, allowing one to assign a T value to any of our mixed states.
Also, using once again the result d_μ = I_τ/2 together with the uncertainties just derived, we see that βℏω = −ln(1 − I_τ). Thus, we can rewrite Equation (64) in the form
$$ W - S = 1 - \beta\hbar\omega\; \frac{1 - 2d_\mu}{2\,d_\mu} = 1 - \beta\hbar\omega\; \frac{1 - d_{\hat{\rho}}}{2\,d_{\hat{\rho}}} $$
which casts the difference between the quantal and semiclassical entropies in terms of the degrees of purity. From Equation (87) we can also give the quantal mean excitation energy E in terms of d_μ, using (38):
$$ E = \frac{\hbar\omega}{2}\; \frac{1 - 2d_\mu}{d_\mu} = \frac{\hbar\omega}{2}\; \frac{1 - d_{\hat{\rho}}}{d_{\hat{\rho}}} $$
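The purity relations of this section can be sketched numerically as follows (our check; units ℏ = ω = 1):

```python
import math

for beta in (0.4, 1.5, 3.0):
    I_tau = 1 - math.exp(-beta)
    d_mu = I_tau / 2                   # semiclassical degree of purity
    d_rho = math.tanh(beta / 2)        # quantal degree of purity

    assert abs(d_rho - d_mu / (1 - d_mu)) < 1e-12   # d_rho = d_mu/(1 - d_mu)
    assert abs(d_mu - d_rho / (1 + d_rho)) < 1e-12  # d_mu = d_rho/(1 + d_rho)

    # mean excitation energy in purity terms: E = (omega/2)(1 - d_rho)/d_rho
    E = 1.0 / (math.exp(beta) - 1)
    assert abs(E - 0.5 * (1 - d_rho) / d_rho) < 1e-12

    # semiclassical mixed-state uncertainty: Dx_mu Dp_mu = hbar/(2 d_mu)
    assert abs(1.0 / I_tau - 0.5 / d_mu) < 1e-12
```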

9. Conclusions

We have explored in this review connections between canonical-ensemble quantities and two Fisher information measures, associated with the estimation of phase-space location (I_τ) and temperature (I_β). Our most important result is, perhaps, to have shown that there exists a "Fisher-associated" third law of thermodynamics (at least for the HO). From a purely information-theoretic viewpoint, we have obtained significant results, namely,
  • a connection between Wehrl’s entropy and I τ (cf. Equation (25)),
  • an interpretation of I τ as the HO’s ground state occupation probability (cf. Equation (26)),
  • an interpretation of I_β as proportional to the HO's specific heat (cf. Equation (30)),
  • the possibility of expressing the HO’s entropy as a sum of two terms, one for each of the above FIM realizations (cf. Equation (31)),
  • a new form of Heisenberg’s uncertainty relations in Fisher terms (cf. Equation (73)),
  • that efficient | z | -estimation can be achieved with I τ at all temperatures, since the Cramér–Rao minimum is always reached (cf. Equation (24)).
Our statistical semiclassical treatment yielded, we believe, some new interesting physics that we proceed to summarize. We have, for the HO,
  • established that the semiclassical Fisher measure I τ contains all relevant statistical quantum information,
  • shown that the Husimi distributions are MaxEnt ones, with the semiclassical excitation energy E as the only constraint,
  • complemented the Lieb bound on the Wehrl entropy using I τ ,
  • observed in detailed fashion how delocalization becomes the counterpart of energy fluctuations,
  • written down the difference W − S between the semiclassical and quantal entropies also in I τ terms,
  • provided a relation between energy excitation and degree of delocalization,
  • shown that the derivative of twice the uncertainty function F ( β ℏ ω ) = Δ x Δ p with respect to I τ − 1 is the Planck constant ℏ,
  • established a semiclassical uncertainty relation in terms of the semiclassical purity d μ , and
  • expressed both d μ and the quantal degree of purity in terms of I τ .
These results are, of course, restricted to the harmonic oscillator. However, this is such an important system that HO insights usually have a wide impact, as the HO constitutes much more than a mere example. Nowadays it is of particular interest for the dynamics of bosonic or fermionic atoms contained in magnetic traps [39,40,41], as well as for any system that exhibits an equidistant level spacing in the vicinity of the ground state, like nuclei or Luttinger liquids. The treatment of Hamiltonians including anharmonic terms is the next logical step, a task we are currently undertaking. To this end, analytical considerations do not suffice and numerical methods are required. The ensuing results will be published elsewhere.


Acknowledgments

F. Pennini is grateful for financial support from FONDECYT, grant 1080487.

References and Notes

  1. Frieden, B.R. Science from Fisher Information, 2nd ed.; Cambridge University Press: Cambridge, UK, 2004.
  2. Wootters, W.K. The acquisition of information from quantum measurements. Ph.D. Dissertation, University of Texas, Austin, TX, USA, 1980.
  3. Frieden, B.R.; Soffer, B.H. Lagrangians of physics and the game of Fisher-information transfer. Phys. Rev. E 1995, 52, 2274–2286.
  4. Pennini, F.; Plastino, A.R.; Plastino, A. Rényi entropies and Fisher informations as measures of nonextensivity in a Tsallis setting. Physica A 1998, 258, 446–457.
  5. Frieden, B.R.; Plastino, A.; Plastino, A.R.; Soffer, B.H. Fisher-based thermodynamics: Its Legendre transformations and concavity properties. Phys. Rev. E 1999, 60, 48–53.
  6. Pennini, F.; Plastino, A.; Plastino, A.R.; Casas, M. How fundamental is the character of thermal uncertainty relations? Phys. Lett. A 2002, 302, 156–162.
  7. Plastino, A.R.; Plastino, A. Symmetries of the Fokker-Planck equation and the Fisher-Frieden arrow of time. Phys. Rev. E 1996, 54, 4423–4426.
  8. Plastino, A.; Plastino, A.R.; Miller, H.G. On the relationship between the Fisher-Frieden-Soffer arrow of time, and the behaviour of the Boltzmann and Kullback entropies. Phys. Lett. A 1997, 235, 129–134.
  9. Anderson, A.; Halliwell, J.J. Information-theoretic measure of uncertainty due to quantum and thermal fluctuations. Phys. Rev. D 1993, 48, 2753–2765.
  10. Dimassi, M.; Sjöstrand, J. Spectral Asymptotics in the Semiclassical Limit; Cambridge University Press: Cambridge, UK, 1999.
  11. Brack, M.; Bhaduri, R.K. Semiclassical Physics; Addison-Wesley: Boston, MA, USA, 1997.
  12. Wehrl, A. On the relation between classical and quantum entropy. Rep. Math. Phys. 1979, 16, 353–358.
  13. Lieb, E.H. Proof of an entropy conjecture of Wehrl. Commun. Math. Phys. 1978, 62, 35–41.
  14. Schnack, J. Thermodynamics of the harmonic oscillator using coherent states. Europhys. Lett. 1999, 45, 647–652.
  15. Klauder, J.R.; Skagerstam, B.S. Coherent States; World Scientific: Singapore, 1985.
  16. Pathria, R.K. Statistical Mechanics; Pergamon Press: Exeter, UK, 1993.
  17. Pennini, F.; Plastino, A. Heisenberg–Fisher thermal uncertainty measure. Phys. Rev. E 2004, 69, 057101.
  18. Wigner, E.P. On the quantum correction for thermodynamic equilibrium. Phys. Rev. 1932, 40, 749–759.
  19. Wlodarz, J.J. Entropy and Wigner distribution functions revisited. Int. J. Theor. Phys. 2003, 42, 1075–1084.
  20. Husimi, K. Some formal properties of the density matrix. Proc. Phys. Math. Soc. Jpn. 1940, 22, 264–283.
  21. O’Connell, R.F.; Wigner, E.P. Some properties of a non-negative quantum-mechanical distribution function. Phys. Lett. A 1981, 85, 121–126.
  22. Mizrahi, S.S. Quantum mechanics in the Gaussian wave-packet phase space representation. Physica A 1984, 127, 241–264.
  23. Mizrahi, S.S. Quantum mechanics in the Gaussian wave-packet phase space representation II: Dynamics. Physica A 1986, 135, 237–250.
  24. Mizrahi, S.S. Quantum mechanics in the Gaussian wave-packet phase space representation III: From phase space probability functions to wave-functions. Physica A 1988, 150, 541–554.
  25. Pennini, F.; Plastino, A. Escort Husimi distributions, Fisher information and nonextensivity. Phys. Lett. A 2004, 326, 20–26.
  26. Cramér, H. Mathematical Methods of Statistics; Princeton University Press: Princeton, NJ, USA, 1946.
  27. Rao, C.R. Information and accuracy attainable in the estimation of statistical parameters. Bull. Calcutta Math. Soc. 1945, 37, 81–91.
  28. Jaynes, E.T. Information theory and statistical mechanics I. Phys. Rev. 1957, 106, 620–630.
  29. Jaynes, E.T. Information theory and statistical mechanics II. Phys. Rev. 1957, 108, 171–190.
  30. Jaynes, E.T. Papers on Probability, Statistics and Statistical Physics; Rosenkrantz, R.D., Ed.; Kluwer Academic Publishers: Norwell, MA, USA, 1987.
  31. Katz, A. Principles of Statistical Mechanics: The Information Theory Approach; Freeman and Co.: San Francisco, CA, USA, 1967.
  32. Sugita, A.; Aiba, H. Second moment of the Husimi distribution as a measure of complexity of quantum states. Phys. Rev. E 2002, 65, 036205.
  33. Mandelbrot, B. The role of sufficiency and of estimation in thermodynamics. Ann. Math. Stat. 1962, 33, 1021–1038.
  34. Pennini, F.; Plastino, A. Power-law distributions and Fisher’s information measure. Physica A 2004, 334, 132–138.
  35. Dodonov, V.V. Purity- and entropy-bounded uncertainty relations for mixed quantum states. J. Opt. B Quantum Semiclass. Opt. 2002, 4, S98–S108.
  36. Munro, W.J.; James, D.F.V.; White, A.G.; Kwiat, P.G. Maximizing the entanglement of two mixed qubits. Phys. Rev. A 2003, 64, 030302.
  37. Fano, U. Description of states in quantum mechanics by density matrix and operator techniques. Rev. Mod. Phys. 1957, 29, 74–93.
  38. Batle, J.; Plastino, A.R.; Casas, M.; Plastino, A. Conditional q-entropies and quantum separability: A numerical exploration. J. Phys. A Math. Gen. 2002, 35, 10311–10324.
  39. Anderson, M.H.; Ensher, J.R.; Matthews, M.R.; Wieman, C.E.; Cornell, E.A. Observation of Bose-Einstein condensation in a dilute atomic vapor. Science 1995, 269, 198–201.
  40. Davis, K.B.; Mewes, M.-O.; Andrews, M.R.; van Druten, N.J.; Durfee, D.S.; Kurn, D.M.; Ketterle, W. Bose-Einstein condensation in a gas of sodium atoms. Phys. Rev. Lett. 1995, 75, 3969–3973.
  41. Bradley, C.C.; Sackett, C.A.; Hulet, R.G. Bose-Einstein condensation of lithium: Observation of limited condensate number. Phys. Rev. Lett. 1997, 78, 985–989.
  42. Curilef, S.; Pennini, F.; Plastino, A.; Ferri, G.L. Fisher information, delocalization and the semiclassical description of molecular rotation. J. Phys. A Math. Theor. 2007, 40, 5127–5140.
