Departamento de Física, Universidad Católica del Norte, Av. Angamos 0610, Antofagasta, Chile
Instituto de Física La Plata (IFLP), Facultad de Ciencias Exactas, Universidad Nacional de La Plata (UNLP), Consejo Nacional de Investigaciones (CCT-CONICET), C.C. 727, (1900) La Plata, Argentina
Facultad de Ciencias Exactas y Naturales, Universidad Nacional de La Pampa, Uruguay 151, Santa Rosa, La Pampa, Argentina
Author to whom correspondence should be addressed.
Received: 30 October 2009 / Accepted: 27 November 2009 / Published: 3 December 2009
We review here the difference between quantum statistical treatments and semiclassical ones, using as the main concomitant tool a semiclassical, shift-invariant Fisher information measure built up with Husimi distributions. Its semiclassical character notwithstanding, this measure also contains abundant information of a purely quantal nature. Such a tool allows us to refine the celebrated Lieb bound for Wehrl entropies and to discover thermodynamic-like relations that involve the degree of delocalization. Fisher-related thermal uncertainty relations are developed and the degree of purity of canonical distributions, regarded as mixed states, is connected to this Fisher measure as well.
A quarter of a century before Shannon, R.A. Fisher advanced a method to measure the information content of continuous (rather than digital) inputs, using not binary computer codes but the statistical distributions of classical probability theory. Already in 1980, Wootters pointed out that Fisher’s information measure (FIM) and quantum mechanics share a common formalism, as both relate probabilities to the squares of continuous functions.
The present review draws on much interesting work reported recently and devoted to the physical applications of Fisher’s information measure (see, for instance, [1,3,4,5,6]). Frieden and Soffer have shown that Fisher’s information measure provides one with a powerful variational principle—extreme physical information—that yields most of the canonical Lagrangians of theoretical physics [1,3]. Additionally, FIM has been shown to provide an interesting characterization of the “arrow of time”, alternative to the one associated with Boltzmann’s entropy [7,8]. Thus, unravelling the multiple FIM facets and their links to physics should be of general interest. The Legendre-transform structure of thermodynamics can be replicated as well, without any change, if one replaces the Boltzmann–Gibbs–Shannon entropy S by Fisher’s information measure I. In particular, I possesses the all-important concavity property, and use of Fisher’s measure allows for the development of a thermodynamics that seems able to treat equilibrium and non-equilibrium situations in a manner entirely similar to the conventional one. Here, the focus of our attention will be the thermal description of the harmonic oscillator (HO).
The semiclassical approximation (SC) has had a long and distinguished history and remains today a very important weapon in the physics armory. It is indeed indispensable in many areas of scientific endeavor. Also, it facilitates, in many circumstances, an intuitive understanding of the underlying physics that may remain hidden in extensive numerical solutions of Schrödinger’s equation. Although the SC approach is as old as quantum mechanics itself, it remains an active field of research, as recent reports attest.
Our emphasis in this review will be placed on the study of the differences between (i) statistical treatments of a purely quantal nature, on the one hand, and (ii) semiclassical ones, on the other. We will show that these differences can be neatly expressed entirely in terms of a special version of Fisher’s information measure, to be called $I_\tau$: the shift-invariant Fisher measure associated with phase space. Additionally, $I_\tau$ is a functional of a semiclassical distribution function, namely, the Husimi function. This phase-space measure will be shown to help to (1) refine the so-called Lieb bound and (2) connect this refinement with delocalization in phase space. The latter can, of course, be visualized as information loss. $I_\tau$ will also be related to an interesting semiclassical measure introduced early on to characterize the same phenomenon: the Wehrl entropy W,
for which Lieb established the above-cited lower bound $W \geq 1$, which is a manifestation of the uncertainty principle. $k_B$ is Boltzmann’s constant. Henceforth we will set $k_B = 1$, for the sake of simplicity.
For the convenience of the reader, in the following section we describe some fundamental aspects of the HO canonical-ensemble description from a coherent-states viewpoint, the Husimi probability distribution function, and the Wehrl information measure.
2. Background Notions
2.1. HO’s coherent states
Several authors discuss quantum-mechanical phase-space distributions expressed in terms of the celebrated coherent states $|z\rangle$ of the harmonic oscillator, eigenstates of the annihilation operator $\hat{a}$ [14,15], i.e.,
with z a complex combination of the phase space coordinates x, p
where $\sigma_x = (\hbar/2m\omega)^{1/2}$, $\sigma_p = (\hbar m\omega/2)^{1/2}$, and $\sigma_x \sigma_p = \hbar/2$.
Coherent states span Hilbert’s space, constitute an over-complete basis, and obey the resolution of unity
where the differential element of area in the plane is and the integration is carried out over the whole complex plane.
The coherent state $|z\rangle$ can be expanded in terms of the eigenstates $|n\rangle$ of the HO as follows
where $|n\rangle$ are eigenstates of the HO Hamiltonian, with eigenenergies $E_n = \hbar\omega(n + 1/2)$, and we have
We write down now, for future reference, the well-known quantal HO expressions for, respectively, the partition function Z, the entropy S, the mean energy U, the mean excitation energy E, the free energy F, and the specific heat C
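Since the displayed formulas did not survive extraction here, the following sketch collects the standard textbook expressions for these canonical-ensemble quantities (with $k_B = 1$); it is a stand-in illustration, not the article’s own listing.

```python
import math

def ho_thermo(beta, w=1.0, hbar=1.0):
    """Canonical-ensemble quantities for the 1D harmonic oscillator (k_B = 1).
    Standard textbook expressions, supplied as an illustration."""
    x = beta * hbar * w                                  # hbar*omega / T
    Z = math.exp(-x / 2) / (1.0 - math.exp(-x))          # partition function
    U = hbar * w * (0.5 + 1.0 / (math.exp(x) - 1.0))     # mean energy
    E = U - hbar * w / 2.0                               # mean excitation energy
    F = -math.log(Z) / beta                              # free energy
    S = x / (math.exp(x) - 1.0) - math.log(1.0 - math.exp(-x))  # entropy
    C = x ** 2 * math.exp(x) / (math.exp(x) - 1.0) ** 2  # specific heat
    return dict(Z=Z, U=U, E=E, F=F, S=S, C=C)
```

At low temperature E, S, and C all vanish (third law), while at high temperature C approaches the classical value of unity.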
2.3. Husimi probability distribution
In the wake of a discussion advanced in the literature, we will be mainly concerned with building “Husimi–Fisher” bridges. It is well known that the oldest and most elaborate phase-space (PS) formulation of quantum mechanics is that of Wigner [18,19]. To every quantum state a PS function (the Wigner one) can be assigned. This PS function can, regrettably enough, assume negative values, so that a probabilistic interpretation becomes questionable. Such a limitation was overcome, among others, by Husimi. In terms of the concomitant Husimi probability distributions, quantum mechanics can be completely reformulated [21,22,23,24]. This phase-space distribution has the form
where $\hat{\rho}$ is the density operator of the system and $|z\rangle$ are the coherent states. The function $\mu(z)$ is normalized in the fashion
For a thermal equilibrium case, $\hat{\rho} = Z^{-1} e^{-\beta \hat{H}}$, where Z is the partition function and $\beta = 1/T$, with T being the temperature. Specializing things to the HO of frequency ω, with eigenstates $|n\rangle$ associated to the eigenenergies $E_n = \hbar\omega(n + 1/2)$, one has
with the expansion coefficients given by Equation (6), and the normalized Husimi probability distribution is
where $\mu(z)$ is the “classical” distribution function (13) associated to the density operator of the system. The uncertainty principle manifests itself through the inequality $W \geq 1$, which was first conjectured by Wehrl and later proved by Lieb. Equality holds if $\hat{\rho}$ is a coherent state. After integration over all phase space, W turns out to be
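The Lieb bound can be illustrated numerically. The sketch below integrates the Gaussian thermal-HO Husimi function; the closed forms $\mu = a\,e^{-a|z|^2}$ with $a = 1 - e^{-\beta\hbar\omega}$ and $W = 1 - \ln a$ are my reading of the (lost) displayed equations, hedged accordingly.

```python
import math

def wehrl_numeric(beta, w=1.0, hbar=1.0, rmax=12.0, n=20000):
    """W = -∫ (d^2z/pi) mu ln mu for mu(z) = a*exp(-a|z|^2),
    a = 1 - exp(-beta*hbar*w); midpoint rule in the radius r = |z|."""
    a = 1.0 - math.exp(-beta * hbar * w)
    dr, W = rmax / n, 0.0
    for i in range(n):
        r = (i + 0.5) * dr
        mu = a * math.exp(-a * r * r)
        W += -2.0 * r * mu * math.log(mu) * dr   # d^2z/pi -> 2 r dr
    return W

def wehrl_closed(beta, w=1.0, hbar=1.0):
    # closed form as read from the text: W = 1 - ln(1 - exp(-beta*hbar*w))
    return 1.0 - math.log(1.0 - math.exp(-beta * hbar * w))
```

For any temperature the value stays above Lieb’s lower bound of unity and matches the closed form.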
3. Fisher’s Information Measure
Let us consider a system specified by a physical parameter θ, while x is a real stochastic variable and $f_\theta(x)$, which in turn depends on the parameter θ, is the probability density for x. An observer makes a measurement of x and estimates θ from this measurement, the estimate being $\tilde{\theta} = \tilde{\theta}(x)$. One wonders how well θ can be determined. Estimation theory asserts that the best possible estimator $\tilde{\theta}(x)$, after a very large number of x-samples is examined, suffers a mean-square error $e^2$ from θ that obeys a relationship involving Fisher’s I, namely, $I e^2 = 1$, where the Fisher information measure I is of the form
This “best” estimator is called the efficient estimator. Any other estimator must have a larger mean-square error. The only proviso to the above result is that all estimators be unbiased, i.e., satisfy $\langle \tilde{\theta} \rangle = \theta$. Thus, Fisher’s information measure has a lower bound, in the sense that, no matter which parameter of the system we choose to measure, I has to be larger than or equal to the inverse of the mean-square error associated with the concomitant experiment. This result is referred to as the Cramér–Rao bound [1,27]. A particular I-case is of great importance: that of translation families [1,4], i.e., distribution functions (DF) whose form does not change under θ-displacements. These DF are shift-invariant (à la Mach, no absolute origin for θ), and for them Fisher’s information measure adopts the somewhat simpler appearance
Fisher’s measure is additive: if x and p are independent variables, $I(x,p) = I(x) + I(p)$. Notice that, for $\theta = (x,p)$ (a point in phase space), we face a shift-invariance situation. Since, in defining z in terms of the variables x and p, these are scaled by their respective variances, the Fisher measure associated with the probability distribution $\mu(z)$ will be of the form
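The shift-invariant form just invoked, $I = \int dx\, f(x)\,[\partial_x \ln f(x)]^2$, is easy to check numerically: a Gaussian of variance $\sigma^2$ must give $I = 1/\sigma^2$. A minimal sketch (grid and σ value are arbitrary choices of mine):

```python
import math

def fisher_shift_invariant(f, xs):
    """I = ∫ dx f(x) [d ln f/dx]^2, via central differences on the grid xs."""
    dx = xs[1] - xs[0]
    I = 0.0
    for i in range(1, len(xs) - 1):
        dlnf = (math.log(f(xs[i + 1])) - math.log(f(xs[i - 1]))) / (2 * dx)
        I += f(xs[i]) * dlnf ** 2 * dx
    return I

# Gaussian test density: expect I = 1/sigma^2
sigma = 0.7
g = lambda x: math.exp(-x * x / (2 * sigma ** 2)) / (sigma * math.sqrt(2 * math.pi))
xs = [sigma * (-6 + 12 * i / 4000) for i in range(4001)]
```
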
We realize at this point that the Fisher measure built up with Husimi distributions is best employed to estimate “phase-space position” z. Furthermore, efficient estimation is possible at all temperatures, a rather significant result. Comparison with Equation (18) allows one now to write
Since both W and $I_\tau$ are positive-definite quantities, (25) tells us that they are complementary informational quantities: if one of them gains, the other loses. Following Anderson et al., let us now analyze the high- and low-temperature limits. Given the form (23), when the temperature goes to zero ($T \to 0$), $I_\tau \to 1$, its maximum possible value, since we know that the ground state will be the only one to be populated. If, on the other hand, the temperature tends to infinity ($T \to \infty$), then W grows without bound and $I_\tau$ tends to zero, because we know beforehand that, in this limit, all energy levels will be uniformly populated. The uniform distribution is that of maximum ignorance [28,29,30,31]. The range of $I_\tau$ is $0 \leq I_\tau \leq 1$; that of W is $1 \leq W < \infty$, as we can see in Figure 1. Using (23) together with Equation (7) we notice that
so that $I_\tau$ coincides with the canonical-ensemble probability of finding the system in its ground state.
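Taking the closed forms as read from the text—$I_\tau = 1 - e^{-\beta\hbar\omega}$ and the Wehrl–Fisher link of Equation (25) as $W = 1 - \ln I_\tau$—the limits and the ground-state identification of Equation (26) can be checked in a few lines (all formulas here are my reconstruction):

```python
import math

def I_tau(beta, w=1.0, hbar=1.0):
    # shift-invariant Fisher measure of the thermal-HO Husimi distribution
    return 1.0 - math.exp(-beta * hbar * w)

def wehrl(beta, w=1.0, hbar=1.0):
    # Eq. (25) as read here: W = 1 - ln(I_tau), complementary quantities
    return 1.0 - math.log(I_tau(beta, w, hbar))

def p_ground(beta, w=1.0, hbar=1.0):
    # canonical ground-state occupation probability exp(-beta*E_0)/Z
    x = beta * hbar * w
    Z = math.exp(-x / 2) / (1.0 - math.exp(-x))
    return math.exp(-x / 2) / Z
```
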
Figure 1. Fisher ($I_\tau$) and Wehrl (W) information measures vs. T (in $\hbar\omega$ units) for the HO-Husimi distribution.
4. Fisher, Thermodynamics’ Third Law, and Thermodynamic Quantities
Consider now the general definition (19) of Fisher’s information measure in terms of the DF
where β is the parameter to be estimated. Since
one readily ascertains that (i) the μ-mean value of (28) vanishes and (ii)
Reflection upon the $I_\beta$-range (29) might lead one to conclude that it constitutes a Fisher manifestation of thermodynamics’ third law. Not only Shannon’s measure but also Fisher’s (for the HO, at least) vanishes at zero temperature. Replacing now (23) and (29) into the entropy expression (8), we immediately arrive at the relation
The HO entropy can be expressed as the sum of two terms: one associated with the Fisher information $I_\beta$ and the other with the Fisher measure for translation families, $I_\tau$, corresponding to the phase-space variables (x, p). Using Equation (7) we also have
with $E_0 = \hbar\omega/2$ denoting the ground-state energy. Thus,
which is to be compared to the well known canonical ensemble general expression connecting S and the mean energy U 
we see that $I_\beta$ is related to the excited-spectrum contribution to U, while $I_\tau$ is to be linked to the partition function. We will now look for a new connection between the two Fisher measures. From (29) it is possible to rewrite $I_\beta$ in the form
i.e., the product on the left-hand side is the β-derivative of the Boltzmann factor (at constant energy) at the inverse temperature β. In other words, $I_\beta$ measures the β-gradient of the Boltzmann factor.
Equation (23) implies, via Equations (7) to (12), that the quantal HO expressions for the most important thermodynamic quantities can be expressed in terms of the semiclassical measure $I_\tau$. To this end we define the semiclassical free energy
which is the semiclassical contribution to the HO free energy F. Therefore, the thermodynamic quantities can be expressed as follows,
which shows that the semiclassical, Husimi-based information measure does contain purely quantum statistical information. Furthermore, since from Helmholtz’ free energy F we can derive all of the HO quantal thermodynamics, we see that the HO quantum thermostatistics is, as far as information is concerned, entirely of a semiclassical nature, as it is completely expressed in terms of a semiclassical measure. We emphasize thus the fact that the semiclassical quantity $I_\tau$ contains all the relevant HO statistical quantum information.
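This claim admits a direct numerical check under my reconstruction of the free-energy split: the exact quantal $F = -T\ln Z$ equals the ground-state energy plus a “semiclassical” piece $T\ln I_\tau$ carried entirely by the Fisher measure. The decomposition below is my reading of the text, not the article’s own equation listing.

```python
import math

def F_quantal(beta, w=1.0, hbar=1.0):
    x = beta * hbar * w
    Z = math.exp(-x / 2) / (1.0 - math.exp(-x))  # HO partition function
    return -math.log(Z) / beta                   # exact free energy

def F_semiclassical_split(beta, w=1.0, hbar=1.0):
    # ground-state energy + a term written entirely with I_tau
    I = 1.0 - math.exp(-beta * hbar * w)
    return hbar * w / 2.0 + math.log(I) / beta
```

The two routines agree identically for every temperature, which is the sense in which the full quantal thermostatistics is encoded in the semiclassical measure.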
5. HO-Semiclassical Fisher’s Measure
5.1. MaxEnt approach
All the previous results are exact. No reference whatsoever has needed to be made to Jaynes’ Maximum Entropy Principle (MaxEnt) up to this point. We wish now to adopt a MaxEnt viewpoint. It can be shown that the HO energy can be cast as the sum of the ground-state energy plus the expectation value of the HO Hamiltonian with respect to the coherent state $|z\rangle$, i.e., the ground-state energy plus a semiclassical excitation energy. One has, for the semiclassical excitation HO energy at z,
where the last expectation value is computed using the distribution . This semiclassical excitation energy is given, for a Husimi distribution μ, by 
Note now that, from Equation (16), we can conveniently recast the HO-expression for μ into the Gaussian fashion
peaked at the origin. The probability density μ of Equation (44) is clearly of the maximum entropy (MaxEnt) form.
As a consequence, it proves convenient, at this stage, to view μ in the following light. The semiclassical form of the entropy S has been exhaustively studied by Wehrl: it is Shannon’s information measure evaluated with Husimi distributions. Assume we know a priori the value of the semiclassical excitation energy. We wish to determine the distribution that maximizes the Wehrl entropy under this constraint. Accordingly, the MaxEnt distribution will be
with one Lagrange multiplier taking care of normalization and η the one associated with the excitation-energy constraint. According to MaxEnt tenets we have
Now, the multiplier η is determined by the relation
If we choose the Fisher–Husimi constraint given by Equation (43), then from Equation (46) we arrive at the desired outcome
We have thus shown that the HO Husimi distributions are MaxEnt ones, with the semiclassical excitation energy (43) as the constraint. It is clear from Equation (48) that η plays there the role of an “inverse temperature”.
The preceding argument suggests that we are tacitly envisioning the existence of a quantity (the inverse of η) associated with the Wehrl measure: a Wehrl temperature. This Wehrl temperature governs the width of our Gaussian Husimi distribution. On account of
it is easy to see from the range of $I_\tau$ that the range of values of the Wehrl temperature is bounded. Due to the semiclassical nature of both W and μ, the Wehrl temperature has a lower bound greater than zero.
The two quantities W and $I_\tau$ have been shown to be related, for the HO, according to Equation (25). Since the Wehrl temperature yields the width of our Gaussian Husimi distribution, we can conceive of introducing a “delocalization factor” D
The above definition leads to the relation
As stressed above, W has been constructed as a delocalization measure. The preceding considerations clearly motivate one to regard the Fisher measure built up with Husimi distributions as a “localization estimator” in phase space. The HO-Gaussian expression for μ (44) illuminates the fact that the Fisher measure controls both the height and the spread of the distribution. Obviously, the spread is here a “phase-space delocalization indicator”. This fact is reflected by the quantity D introduced above.
Thus, an original physical interpretation of Fisher’s measure emerges: localization control. The inverse of the Fisher measure, $D = 1/I_\tau$, turns out to be a delocalization indicator. Differentiating Fisher’s measure (23) with respect to T, notice also that
so that Fisher’s information decreases exponentially as the temperature grows. Our Gaussian distribution loses phase-space “localization” as energy and/or temperature is injected into the system, as reflected via $I_\tau$ or D. Notice that (52) complements the Lieb bound: it tells us by just how much W exceeds unity. We see that it does so by virtue of delocalization effects. Moreover, this fact can be expressed using the shift-invariant Fisher measure. We will now show that D is proportional to the system’s energy fluctuations.
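The temperature behavior claimed here can be verified directly, again assuming the closed form $I_\tau(T) = 1 - e^{-\hbar\omega/T}$ read from the text:

```python
import math

def I_tau_of_T(T, w=1.0, hbar=1.0):
    return 1.0 - math.exp(-hbar * w / T)

def dI_dT(T, w=1.0, hbar=1.0):
    # analytic derivative: negative for all T, so I_tau always decreases
    return -(hbar * w / T ** 2) * math.exp(-hbar * w / T)
```

A finite-difference check confirms the analytic derivative, and $D = 1/I_\tau$ correspondingly grows with T.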
5.3. Second moment of the Husimi distribution
The second moment of the Husimi distribution is an important measure to ascertain the “degree of complexity” of a quantum state (see below). It is a measure of the delocalization degree of the Husimi distribution in phase space (see the cited literature for details and discussion). It is defined as
which, after explicit evaluation using Equation (44), reads
An important result is thus obtained: the delocalization factor D represents energy fluctuations, expressed in $I_\tau$ terms. Delocalization is clearly seen to be the counterpart of energy fluctuations.
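One consistent reading of this fluctuation statement is that the canonical HO energy variance, $\langle \hat{H}^2\rangle - \langle \hat{H}\rangle^2 = (\hbar\omega)^2 e^{\beta\hbar\omega}/(e^{\beta\hbar\omega}-1)^2$, can be rewritten as $(\hbar\omega)^2 D(D-1)$ with $D = 1/I_\tau$. The identity below is my own algebra, offered as a consistency check rather than the article’s equation:

```python
import math

def energy_variance(beta, w=1.0, hbar=1.0):
    # canonical <H^2> - <H>^2 for the HO (standard result)
    x = beta * hbar * w
    return (hbar * w) ** 2 * math.exp(x) / (math.exp(x) - 1.0) ** 2

def via_delocalization(beta, w=1.0, hbar=1.0):
    # same quantity written with the delocalization factor D = 1/I_tau
    D = 1.0 / (1.0 - math.exp(-beta * hbar * w))
    return (hbar * w) ** 2 * D * (D - 1.0)
```
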
6. Thermodynamics-Like Relations
Let us now go back to Equation (37) and revisit the entropic expression. It is immediately realized that we can recast the entropy S in terms of the quantal mean excitation energy E and the delocalization factor D as
i.e., if one injects into the system some excitation energy E, expressed in “natural” T units, it is apportioned partly as heat dissipation, via S, and partly via delocalization. More precisely, the part of this energy not dissipated is that employed to delocalize the system in phase space. The above equation can also be recast in alternative forms, as
which is a physically sensible result and
as it should be, since S vanishes at $T = 0$ (third law of thermodynamics), while W attains there its Lieb lower bound of unity.
One finds in Equation (60) some degree of resemblance to thermodynamics’ first law. To reassure ourselves in this respect, we slightly changed our underlying canonical probabilities μ, multiplying them by factors $1 + a_n$. Specifically, we generated random numbers $a_n$ according to the normal distribution and divided them by 100 to obtain the above factors. This process leads to new “perturbed” probabilities, conveniently normalized. With them we evaluated the concomitant changes dS and dE (we did this 50 times, with different random numbers in each instance). We were then able to numerically verify that the difference $T\,dS - dE$ vanishes. The concomitant results are plotted in Figure 2. Since, as stated, numerically $T\,dS = dE$, Equation (60) entails a null change of the delocalization term. The physical connotations are as follows: if the only modification effected is a change dμ in the canonical distribution μ, the system undergoes a heat-transfer process $dQ = T\,dS$, for which thermodynamics’ first law implies $dE = T\,dS$. This is numerically confirmed in the plots of Figure 2. The null contribution of the delocalization term to this process suggests that delocalization (not a thermodynamic effect, but a dynamic one) can be regarded as behaving, thermodynamically, like a kind of “work”.
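The numerical experiment just described is easy to reproduce. The sketch below perturbs the canonical probabilities exactly as stated (Gaussian random factors divided by 100, parametrized here by eps) and compares $T\,dS$ with $dE$; the level truncation nmax, the seed, and all names are implementation choices of mine.

```python
import math, random

def first_law_check(beta=1.0, w=1.0, hbar=1.0, nmax=60, eps=0.01, seed=0):
    """Perturb canonical HO probabilities p_n by factors (1 + eps*a_n),
    a_n ~ N(0,1), renormalize, and compare T*dS with dE (k_B = 1)."""
    rng = random.Random(seed)
    E = [hbar * w * (n + 0.5) for n in range(nmax)]
    p = [math.exp(-beta * e) for e in E]
    Z = sum(p)
    p = [pi / Z for pi in p]
    q = [pi * (1.0 + eps * rng.gauss(0.0, 1.0)) for pi in p]
    N = sum(q)
    q = [qi / N for qi in q]
    entropy = lambda d: -sum(x * math.log(x) for x in d)
    dS = entropy(q) - entropy(p)
    dE = sum((qi - pi) * e for qi, pi, e in zip(q, p, E))
    return dS / beta, dE        # (T*dS, dE), with T = 1/beta
```

Across repeated trials the residual $T\,dS - dE$ stays at second order in the perturbation strength, as the text reports.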
Figure 2. Numerical computation results for the HO: the changes $T\,dS$ and $dE$ that ensue after randomly generating variations in the underlying microscopic canonical probabilities.
One also finds, for the quantum-semiclassical entropic difference (QsCD), the result
Moreover, we see that the QsCD is always positive, as expected, since the semiclassical treatment contains less information than the quantal one. Note that the QsCD can be expressed exclusively in Fisher-information terms; that is, the quantum-semiclassical entropic difference may be given in terms of $I_\tau$ alone. Figure 3 depicts S and its two contributions vs. the dimensionless quantity $T/\hbar\omega$. Accordingly, entropy is apportioned in such a way that
part of it originates from excitation energy and
the remainder is accounted for by phase-space delocalization.
A bit of algebra allows one now to express the rate of entropic change per unit temperature increase as
Figure 3. S and its two contributions as a function of the dimensionless quantity $T/\hbar\omega$.
In the case of the one-dimensional HO we see that the specific heat measures the delocalization change per unit temperature increase. This also provides us with a very simple relationship between mean excitation-energy changes and delocalization ones.
7. On Thermal Uncertainties
Additional considerations are in order with regard to thermal uncertainties, which express the effect of temperature on Heisenberg’s celebrated relations (see, for instance, [6,33,34,35]). We now use a result obtained by Anderson et al. (their Equation (3.12)), where the authors cast Wehrl’s information measure in terms of the variances $\Delta x$ and $\Delta p$, obtaining
In the present context, this relation allows us to conclude that
which can be regarded as a “Fisher uncertainty principle” and adds still another meaning to $I_\tau$: since, necessarily, $I_\tau \leq 1$, it is clear that $I_\tau$ is the “correcting factor” that permits one to reach the uncertainty lower bound, a rather interesting result.
Phase-space “localization” is possible, with Husimi distributions, only up to ℏ. This is to be compared with the uncertainties evaluated in a purely quantal fashion, without using Husimi distributions, and in particular with a previous result. With the virial theorem one can easily ascertain that
Thus we see that, as $T \to 0$, the semiclassical product $\Delta x\,\Delta p$ is twice the minimum quantum value $\hbar/2$, i.e., it equals ℏ, the “minimal” phase-space cell. The quantum and semiclassical results do coincide at very high temperature, though. Indeed, one readily verifies that Heisenberg’s uncertainty relation, as a function of both frequency and temperature, is governed by a thermal “uncertainty function” F that acquires the aspect
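The two uncertainty products just compared have simple closed forms: the quantal $\Delta x\,\Delta p = (\hbar/2)\coth(\beta\hbar\omega/2)$ is a standard result, while the semiclassical $\Delta x\,\Delta p = \hbar/(1 - e^{-\beta\hbar\omega}) = \hbar D$ is my reading of the Husimi variances. A sketch checking the two limits stated in the text:

```python
import math

def dxdp_quantal(T, w=1.0, hbar=1.0):
    # (hbar/2) coth(beta*hbar*w/2): purely quantal thermal uncertainty
    x = hbar * w / T
    return (hbar / 2.0) / math.tanh(x / 2.0)

def dxdp_husimi(T, w=1.0, hbar=1.0):
    # hbar * D: the semiclassical (Husimi) product, as read here
    x = hbar * w / T
    return hbar / (1.0 - math.exp(-x))
```

At low T the Husimi product saturates at ℏ, twice the quantal ℏ/2; at high T the two expressions merge.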
Coming back to results derivable within the present context, we realize here that F can be recast as
so that, for T varying in $[0, \infty)$, the range of possible values follows accordingly. Equation (73) is a “Heisenberg–Fisher” thermal uncertainty relation (for a discussion of this concept see, for instance, [6,33,34]).
F grows with both E and D. The usual result is attained for minimum D and zero excitation energy. Remarkably enough, the two contributions to F are easily seen to be equal. One can also write
providing us with a thermodynamic “costume” for the uncertainty function F that sheds some new light on the meaning of both ℏ and ω. In particular, we see that ℏ is, essentially, the derivative of the uncertainty function F with respect to the delocalization factor D. Increases of the thermal uncertainty function F are of two types:
from the excitation energy, which supplies one contribution, and
from the delocalization factor D.
8. Degrees of Purity Relations
8.1. Semiclassical purity
The quantal concept of degree of purity of a general density operator $\hat{\rho}$ is expressed via $\mathrm{Tr}\,\hat{\rho}^{2}$ [36,37]. Its inverse, the so-called participation ratio
is particularly convenient for calculations. It varies from unity for pure states to N for totally mixed states. It may be interpreted as the effective number of pure states that enter a quantum mixture. Here we will consider the “degree of purity” of a semiclassical distribution, given by
Clearly, this semiclassical purity coincides with the second moment of the Husimi distribution (44), given by Equation (54), i.e.,
Using now (52) we relate the semiclassical degree of purity to the delocalization factor and to the Wehrl temperature
and also to our semiclassical energy-fluctuations (57)
Since $I_\tau \leq 1$, the “best” purity attainable at the semiclassical level equals one-half.
8.2. Quantal purity
For the quantum mixed HO state $\hat{\rho} = Z^{-1} e^{-\beta \hat{H}}$, where $\hat{H}$ is the Hamiltonian of the harmonic oscillator and the partition function Z is given by Equation (7), we have a degree of purity given by (see the detailed study by Dodonov)
Thus, Heisenberg’s uncertainty relation can be cast in the fashion
where $\Delta x$ and $\Delta p$ are the quantum variances for the canonically conjugated observables x and p,
which is to be compared to the semiclassical result that was derived above (cf. Equation (71)).
We now relate the degree of purity of our thermal state, in both its quantal and semiclassical versions, to various physical quantities. Using Equations (71) and (77) we get
which leads to
which clearly shows that (i) the semiclassical purity is always smaller than the quantal one, and (ii) for a pure state, again, its semiclassical counterpart has a degree of purity equal to one-half.
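Numerically, the two purities (the quantal $\mathrm{Tr}\,\hat{\rho}^2 = \tanh(\beta\hbar\omega/2)$, a standard thermal-HO result, and the semiclassical $I_\tau/2$ stated above) satisfy a simple bridge, $d_{sc} = d_q/(1 + d_q)$ in my notation, which reproduces points (i) and (ii); the bridge formula is my own algebra offered as a check:

```python
import math

def purity_quantal(beta, w=1.0, hbar=1.0):
    # Tr rho^2 for the thermal HO state (standard result)
    return math.tanh(beta * hbar * w / 2.0)

def purity_semiclassical(beta, w=1.0, hbar=1.0):
    # I_tau / 2, the second moment of the Gaussian Husimi distribution
    return (1.0 - math.exp(-beta * hbar * w)) / 2.0
```
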
Additionally, on account of Equation (69), on the one hand, and since the semiclassical degree of purity reads $I_\tau/2$, on the other, we are led to an uncertainty relation for mixed states in terms of the participation ratio R, namely,
which tells us just how uncertainty grows as the participation ratio augments. Equation (86) is of semiclassical origin, which makes it a bit different from the one that results from a purely quantal treatment (see Equation (4) of the work cited above). Moreover, notice how information concerning the purely quantal notion of purity is already contained in the semiclassical measure $I_\tau$.
Figure 4. Semiclassical purity vs. temperature.
We appreciate the fact that R increases as delocalization grows, a quite sensible result. Figure 4 depicts the semiclassical purity, a monotonically decreasing function of the temperature, which tells us that the degree of purity of a mixed state acts here as a thermometer, and then allows one to assign a temperature to any of our mixed states.
Also, using once again this result together with the uncertainties just derived, we can rewrite Equation (64) in the following form,
which casts the difference between the quantal and semiclassical entropies in terms of the degrees of purity. From Equation (87) we can also give the quantal mean excitation energy E in terms of the purity, using (38).
We have explored in this review connections between canonical-ensemble quantities and two Fisher information measures, associated with the estimation of phase-space location ($I_\tau$) and temperature ($I_\beta$). Our most important result is, perhaps, to have shown that there exists a “Fisher-associated” third law of thermodynamics (at least for the HO). From a pure information-theoretic viewpoint, we have obtained significant results, namely,
a connection between Wehrl’s entropy and $I_\tau$ (cf. Equation (25)),
an interpretation of $I_\tau$ as the HO’s ground-state occupation probability (cf. Equation (26)),
an interpretation of $I_\beta$ as proportional to the HO’s specific heat (cf. Equation (30)),
the possibility of expressing the HO’s entropy as a sum of two terms, one for each of the above FIM realizations (cf. Equation (31)),
a new form of Heisenberg’s uncertainty relations in Fisher terms (cf. Equation (73)),
that efficient phase-space-location estimation can be achieved with $I_\tau$ at all temperatures, as the minimum Cramér–Rao value is always reached (cf. Equation (24)).
Our statistical semiclassical treatment yielded, we believe, some new interesting physics that we proceed to summarize. We have, for the HO,
established that the semiclassical Fisher measure $I_\tau$ contains all relevant statistical quantum information,
shown that the Husimi distributions are MaxEnt ones, with the semiclassical excitation energy as the only constraint,
complemented the Lieb bound on the Wehrl entropy using $I_\tau$,
observed in detailed fashion how delocalization becomes the counterpart of energy fluctuations,
written down the difference between the semiclassical and quantal entropies also in $I_\tau$ terms,
provided a relation between energy excitation and the degree of delocalization,
shown that the derivative of the thermal uncertainty function with respect to the delocalization factor D is essentially the Planck constant ℏ,
established a semiclassical uncertainty relation in terms of the semiclassical purity, and
expressed both the thermal uncertainties and the quantal degree of purity in terms of $I_\tau$.
These results are, of course, restricted to the harmonic oscillator. However, this is such an important system that HO insights usually have a wide impact, as the HO constitutes much more than a mere example. Nowadays it is of particular interest for the dynamics of bosonic or fermionic atoms contained in magnetic traps [39,40,41] as well as for any system that exhibits an equidistant level spacing in the vicinity of the ground state, like nuclei or Luttinger liquids. The treatment of Hamiltonians including anharmonic terms is the next logical step. We are currently undertaking such a task. To this end analytical considerations do not suffice, and numerical methods are required. The ensuing results will be published elsewhere.
F. Pennini is grateful for the financial support by FONDECYT, grant 1080487.
References and Notes
Frieden, B.R. Science from Fisher Information, 2nd ed.; Cambridge University Press: Cambridge, UK, 2004.
Wootters, W.K. The acquisition of information from quantum measurements. Ph.D. Dissertation, University of Texas, Austin, TX, USA, 1980.
Frieden, B.R.; Soffer, B.H. Lagrangians of physics and the game of Fisher-information transfer. Phys. Rev. E 1995, 52, 2274–2286.
Pennini, F.; Plastino, A.R.; Plastino, A. Rényi entropies and Fisher informations as measures of nonextensivity in a Tsallis setting. Physica A 1998, 258, 446–457.
Frieden, B.R.; Plastino, A.; Plastino, A.R.; Soffer, B.H. Fisher-based thermodynamics: Its Legendre transformations and concavity properties. Phys. Rev. E 1999, 60, 48–53.
Pennini, F.; Plastino, A.; Plastino, A.R.; Casas, M. How fundamental is the character of thermal uncertainty relations? Phys. Lett. A 2002, 302, 156–162.
Plastino, A.R.; Plastino, A. Symmetries of the Fokker-Planck equation and the Fisher-Frieden arrow of time. Phys. Rev. E 1996, 54, 4423–4426.
Plastino, A.; Plastino, A.R.; Miller, H.G. On the relationship between the Fisher-Frieden-Soffer arrow of time, and the behaviour of the Boltzmann and Kullback entropies. Phys. Lett. A 1997, 235, 129–134.
Anderson, A.; Halliwell, J.J. Information-theoretic measure of uncertainty due to quantum and thermal fluctuations. Phys. Rev. D 1993, 48, 2753–2765.
Dimassi, M.; Sjoestrand, J. Spectral Asymptotics in the Semiclassical Limit; Cambridge University Press: Cambridge, UK, 1999.
Brack, M.; Bhaduri, R.K. Semiclassical Physics; Addison-Wesley: Boston, MA, USA, 1997.
Wehrl, A. On the relation between classical and quantum entropy. Rep. Math. Phys. 1979, 16, 353–358.
Lieb, E.H. Proof of an entropy conjecture of Wehrl. Commun. Math. Phys. 1978, 62, 35–41.
Schnack, J. Thermodynamics of the harmonic oscillator using coherent states. Europhys. Lett. 1999, 45, 647–652.
Wigner, E.P. On the quantum correction for thermodynamic equilibrium. Phys. Rev. 1932, 40, 749–759.
Wlodarz, J.J. Entropy and Wigner distribution functions revisited. Int. J. Theor. Phys. 2003, 42, 1075–1084.
Husimi, K. Some formal properties of the density matrix. Proc. Phys. Math. Soc. Jpn. 1940, 22, 264–283.
O’Connell, R.F.; Wigner, E.P. Some properties of a non-negative quantum-mechanical distribution function. Phys. Lett. A 1981, 85, 121–126.
Mizrahi, S.S. Quantum mechanics in the Gaussian wave-packet phase space representation. Physica A 1984, 127, 241–264.
Mizrahi, S.S. Quantum mechanics in the Gaussian wave-packet phase space representation II: Dynamics. Physica A 1986, 135, 237–250.
Mizrahi, S.S. Quantum mechanics in the Gaussian wave-packet phase space representation III: From phase space probability functions to wave-functions. Physica A 1988, 150, 541–554.
Pennini, F.; Plastino, A. Escort Husimi distributions, Fisher information and nonextensivity. Phys. Lett. A 2004, 326, 20–26.
Cramer, H. Mathematical Methods of Statistics; Princeton University Press: Princeton, NJ, USA, 1946.
Rao, C.R. Information and accuracy attainable in the estimation of statistical parameters. Bull. Calcutta Math. Soc. 1945, 37, 81–91.
Jaynes, E.T. Information theory and statistical mechanics I. Phys. Rev. 1957, 106, 620–630.
Jaynes, E.T. Information theory and statistical mechanics II. Phys. Rev. 1957, 108, 171–190.
Jaynes, E.T. Papers on Probability, Statistics and Statistical Physics; Rosenkrantz, R.D., Ed.; Kluwer Academic Publishers: Norwell, MA, USA, 1987.
Katz, A. Principles of Statistical Mechanics: The Information Theory Approach; Freeman and Co.: San Francisco, CA, USA, 1967.
Sugita, A.; Aiba, H. Second moment of the Husimi distribution as a measure of complexity of quantum states. Phys. Rev. E 2002, 65, 36205:1–36205:10.
Mandelbrot, B. The role of sufficiency and of estimation in thermodynamics. Ann. Math. Stat. 1962, 33, 1021–1038.
Pennini, F.; Plastino, A. Power-law distributions and Fisher’s information measure. Physica A 2004, 334, 132–138.
Dodonov, V.V. Purity- and entropy-bounded uncertainty relations for mixed quantum states. J. Opt. B Quantum Semiclass. Opt. 2002, 4, S98–S108.
Munro, W.J.; James, D.F.V.; White, A.G.; Kwiat, P.G. Maximizing the entanglement of two mixed qubits. Phys. Rev. A 2003, 64, 30302:1–30302:4.
Fano, U. Description of states in quantum mechanics by density matrix and operator techniques. Rev. Mod. Phys. 1957, 29, 74–93.
Batle, J.; Plastino, A.R.; Casas, M.; Plastino, A. Conditional q-entropies and quantum separability: A numerical exploration. J. Phys. A Math. Gen. 2002, 35, 10311–11324.
Anderson, M.H.; Ensher, J.R.; Matthews, M.R.; Wieman, C.E.; Cornell, E.A. Observation of Bose–Einstein condensation in a dilute atomic vapor. Science 1995, 269, 198–201.
Davis, K.B.; Mewes, M.-O.; Andrews, M.R.; van Druten, N.J.; Durfee, D.S.; Kurn, D.M.; Ketterle, W. Bose–Einstein condensation in a gas of sodium atoms. Phys. Rev. Lett. 1995, 75, 3969–3973.