Proceeding Paper

Reciprocity Relations for Quantum Systems Based on Fisher Information †

by Mariela Portesi 1,2,*, Juan Manuel Pujol 1,2 and Federico Holik 1

1 IFLP, CONICET—UNLP, La Plata 1900, Argentina
2 Facultad de Ciencias Exactas, Universidad Nacional de La Plata, La Plata 1900, Argentina
* Author to whom correspondence should be addressed.
Presented at the 41st International Workshop on Bayesian Inference and Maximum Entropy Methods in Science and Engineering, Paris, France, 18–22 July 2022.
Phys. Sci. Forum 2022, 5(1), 44; https://doi.org/10.3390/psf2022005044
Published: 29 January 2023

Abstract: We study reciprocity relations between fluctuations of the probability distributions corresponding to position and momentum, and other observables, in quantum theory. These kinds of relations have been previously studied in terms of quantifiers based on the Lipschitz constants of the concomitant distributions. However, it turned out that they were not valid for all states. Here, we ask the following question: can those relations be described using other quantifiers? By appealing to the Fisher information, we study reciprocity relations for different families of states. In particular, we look for connections between this problem and previous works.

1. Introduction

Broadly speaking, information theory, developed after the seminal works of Claude E. Shannon in 1948 [1] and 1949 [2], provides a formal and abstract framework in which notions such as ignorance, uncertainty, or unpredictability with regard to a given variable or system can be made precise. This theory is the basis of many technological advancements in fields such as communication and cryptography. Furthermore, its extension to the physical sciences, including quantum mechanics, has been studied extensively, and relevant results have been obtained.
One of the most important quantities in information theory is entropy. Mathematically, an information entropy is a functional over random variables (or quantum states, in the case of quantum–mechanical systems) that aims to measure uncertainty, in the sense of ignorance or lack of information, associated with the distribution of the variable. While the Shannon entropy is the most popular one, there exist a great number of alternative families of entropies, such as the monoparametric forms of Rényi, Tsallis, or Kaniadakis, which recover the Shannon case in a suitable limit, or even more general entropies such as the (h, φ)-entropies (see, for instance, the quantal version in Ref. [3] and references therein). In a strictly quantum framework, the von Neumann entropy, the min-entropy, and the alluded generalizations are used. The conditional and joint entropies, the mutual information, and other relative forms serve, e.g., to relate and discriminate distributions for more than one variable.
Entropic measures are of much interest, in particular, due to the maximum entropy principle (known for short as MEP or MaxEnt), introduced by Edwin Jaynes in 1957 in the field of statistical mechanics. This principle states that the statistical distribution representing a system in equilibrium is the one that maximizes the entropy while fulfilling a set of constraints satisfied by the system; that is, MaxEnt selects the distribution that assumes the least information beyond the given constraints (see the sketch below). The principle has counterparts in problems from communications and pattern recognition, among other areas. While it was originally formulated using Shannon entropy, its form does not change substantially when generalized entropies are used. The informational quantities mentioned so far have been studied extensively in order to establish relations among them, in the form of equalities and inequalities that connect them from different perspectives or establish lower or upper bounds, also giving a meaning to the states or distributions that saturate those bounds. Of particular relevance, mainly in quantum theory but also in communication, are the efforts devoted to finding uncertainty relations, known as entropic uncertainty relations (EURs). There are reasons for which a treatment based on entropies is preferable to a variance-based one.
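As a minimal illustration of the principle (our own sketch, not part of the original discussion), consider distributions on the outcomes {1, ..., 6} of a die with a prescribed mean: the entropy maximizer takes the Gibbs form p_k ∝ exp(−λk), and the multiplier λ can be fixed numerically. The target value below is an arbitrary choice.

```python
# MaxEnt sketch (our addition): among all distributions on {1,...,6} with a
# prescribed mean, the entropy maximizer has the Gibbs form p_k ~ exp(-lam*k);
# the Lagrange multiplier lam is found with a root finder.
import numpy as np
from scipy.optimize import brentq

outcomes = np.arange(1, 7)
target_mean = 4.5                       # constraint: <k> = 4.5 (a loaded die)

def mean_given_lam(lam):
    w = np.exp(-lam * outcomes)
    return (w / w.sum()) @ outcomes

lam = brentq(lambda l: mean_given_lam(l) - target_mean, -10.0, 10.0)
w = np.exp(-lam * outcomes)
p = w / w.sum()
print(np.round(p, 4))                          # the MaxEnt distribution
print(p @ outcomes, -(p * np.log2(p)).sum())   # mean and entropy (bits)
```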
Among the most relevant features of quantum mechanics, the existence of uncertainty relations (also called trade-off or reciprocity inequalities) is particularly important. These types of relations represent a paradigm shift toward a fundamental unpredictability present in natural phenomena. Their extensive study is related not only to their importance for a better understanding of quantum mechanics but also to their applications in fields such as cryptography, information, and computer science.
The first of these kinds of relations is the so-called Heisenberg uncertainty principle. It was originally proposed as a trade-off involving the uncertainties or imprecisions of the position x and momentum p of a particle, in terms of Planck's constant $h = 6.62607015 \times 10^{-34}$ J Hz$^{-1}$, in the form
$\Delta x \, \Delta p \gtrsim h. \qquad (1)$
This was presented by Werner Heisenberg in 1927 [4] after the foundations of matrix and wave mechanics were already developed. He tried to make sense of this relation in the following way: “The more accurately the position is determined, the less accurately the momentum is known, and conversely”. Heisenberg also saw a “direct and intuitive interpretation” of commutation relations between the position and momentum operators.
Indeed, a derivation of this principle starting from a wave-mechanical formalism was given the same year by Earle Kennard, who provided an inequality for the standard deviations of the operators and established the lower bound as half the reduced Planck constant ($\hbar = h/(2\pi)$):
$\sigma_x \, \sigma_p \geq \frac{\hbar}{2}. \qquad (2)$
While this relation is stated for the variances of the position and momentum observables, in 1929 Howard Robertson [5], and the following year Erwin Schrödinger, showed a generalization for any two arbitrary operators. These forms included, respectively, the commutator $[\hat{A},\hat{B}] = \hat{A}\hat{B} - \hat{B}\hat{A}$ and the anticommutator $\{\hat{A},\hat{B}\} = \hat{A}\hat{B} + \hat{B}\hat{A}$ of any two (not necessarily commuting) quantum–mechanical observables $\hat{A}$ and $\hat{B}$. In its strongest form, it reads
$\sigma_{\hat{A}}^{2}\, \sigma_{\hat{B}}^{2} \;\geq\; \left| \tfrac{1}{2i} \langle [\hat{A},\hat{B}] \rangle \right|^{2} + \left| \tfrac{1}{2} \langle \{\hat{A},\hat{B}\} \rangle - \langle \hat{A} \rangle \langle \hat{B} \rangle \right|^{2}, \qquad (3)$
where the mean values represented by the brackets are computed for the corresponding wavefunction or, more generally, the density operator of the quantum system.
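As a quick sanity check (our addition, not in the original text), the inequality can be evaluated numerically in a finite-dimensional setting, e.g., for the Pauli observables σx and σy and a random pure qubit state:

```python
# Numerical check of the Robertson-Schrodinger relation (3) for two Pauli
# observables and a random qubit state; a sketch under our own conventions.
import numpy as np

rng = np.random.default_rng(0)
A = np.array([[0, 1], [1, 0]], dtype=complex)     # sigma_x
B = np.array([[0, -1j], [1j, 0]], dtype=complex)  # sigma_y

psi = rng.normal(size=2) + 1j * rng.normal(size=2)
psi /= np.linalg.norm(psi)                        # random pure state

mean = lambda O: np.vdot(psi, O @ psi)
var = lambda O: (mean(O @ O) - mean(O)**2).real

comm = A @ B - B @ A
anti = A @ B + B @ A
lhs = var(A) * var(B)
rhs = abs(mean(comm) / 2j)**2 + (mean(anti).real / 2
                                 - (mean(A) * mean(B)).real)**2
print(lhs, ">=", rhs, lhs >= rhs - 1e-12)
```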
This state dependence points precisely to one of the drawbacks of this sequence of relations: both sides of the inequalities commonly depend on the system's state. In other words, these are not universal relations for a given pair of observables. In passing, we note that more general variance-based relationships, with different implications, have also been provided. These comprise general powers (other than the square) of the involved operators (see, for instance, [6] and references therein), or even a compromise among more than two observables in a single trade-off. Other approaches, such as the Landau–Pollak inequality and geometry-inspired formulations, have also contributed to this problem from diverse viewpoints (see, e.g., [7]). As already mentioned, entropic uncertainty inequalities appeared as an effort to account for connections among quantum observables (or, depending on the context, among Fourier-related probability distributions). Within this framework, particularities of the states reaching the bounds have been analyzed [8,9,10].
Furthermore, in order to overcome weaknesses of variance-based relationships, and also to provide an alternative point of view to the entropic inequalities, other kinds of reciprocity relations have been envisaged, making use, for instance, of the Fisher information. It is interesting to notice that while the Shannon entropy provides a global measure of the dispersion of a function, the Fisher information is a cumulative measure that is sensitive to local changes in the density; roughly speaking, the bigger the Fisher information, the more localized the function, and the smaller the associated uncertainty.
In this work, we deal with reciprocity relations, mainly for the position and momentum operators of quantum systems, in terms of information quantifiers. In Section 2, we recall basic results on EURs for continuous quantum observables and refer to analogous inequalities for Fourier-transformed PDFs. Changes in the distributions corresponding to those observables are then considered from another viewpoint: we discuss some results on localized x–p fluctuations in Section 3, while Section 4 is devoted to the study of cumulative reciprocity relations. We provide some final remarks in Section 5.

2. Uncertainty Relations in Terms of Information Entropies

For a discrete random variable X that takes values on an alphabet $\mathcal{X}$ of cardinality $|\mathcal{X}|$ (possibly infinite), the Shannon entropy is given by
$H(X) = -\sum_{x \in \mathcal{X}} p(x) \log p(x), \qquad (4)$
where $\{p(x)\}_{x \in \mathcal{X}}$ is the corresponding probability function, and the logarithm in base 2 gives the value of H in bits (any other base could be used as well). The extension to the case of a continuous random variable gives rise to the so-called differential entropy,
$H(X) = -\int_{\mathcal{X}} p(x) \log p(x) \, dx. \qquad (5)$
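A minimal numerical sketch (our addition) of both definitions:

```python
# Minimal sketch (our addition): Shannon entropy of a discrete distribution,
# and the differential entropy of a Gaussian compared against the closed form
# (1/2) * log2(2*pi*e*sigma^2).
import numpy as np

def shannon_bits(p):
    p = np.asarray(p, dtype=float)
    p = p[p > 0]                               # convention: 0 log 0 = 0
    return float(-(p * np.log2(p)).sum())

print(shannon_bits([0.5, 0.25, 0.25]))         # 1.5 bits

sigma = 1.3
x = np.linspace(-12, 12, 20001)
dx = x[1] - x[0]
rho = np.exp(-x**2 / (2 * sigma**2)) / (sigma * np.sqrt(2 * np.pi))
H_num = -np.sum(rho * np.log2(rho)) * dx       # differential entropy, base 2
H_exact = 0.5 * np.log2(2 * np.pi * np.e * sigma**2)
print(H_num, H_exact)                          # both ~2.43 bits
```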
As mentioned, in order to account for the uncertainty principle quantitatively, variance-based inequalities, with their unwanted behaviors, were replaced by other information (or uncertainty) quantifiers. The most extensively studied ones are the entropy-based uncertainty relations. One of the first arguments in favor of EURs was given by David Deutsch in 1983 [9], who pointed out that Robertson's lower bound depends, in general, on the system's state $|\varphi\rangle$. In particular, this limit is trivial when $|\varphi\rangle$ yields a null mean value for the commutator $[\hat{A},\hat{B}]$ (which is always possible for finite-dimensional models).
It is interesting to note that, in the context of Fourier analysis, entropic inequalities had been envisaged by Hirschman and Everett [11,12] and were proved by Beckner in 1975 [13]. In particular, the following expression was obtained:
$H(|f|^2) + H(|g|^2) \geq \log \frac{e}{2} \qquad (6)$
for a function f and its Fourier transform $g = \tilde{f}$, where H is the differential entropy and $|\cdot|$ indicates the 2-norm of a function. This result was reinterpreted as an uncertainty relation for quantum systems by Bialynicki-Birula and Mycielski [8], in the form
$H(|\psi(\mathbf{r})|^2) + H(|\phi(\mathbf{p})|^2) \geq d\,(1 + \ln \pi), \qquad (7)$
for the amplitudes of the wavefunction in the position and momentum representations, where d is the spatial dimension. From these relations, alternatives and extensions have been developed, generating an active field of research and applications. One important fact to stress is that EUR (7) can be shown to imply inequality (2); thus, the former is stronger than the latter.
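A numerical check of (7) (our own sketch; we take ℏ = 1 and d = 1, and a Gaussian wavepacket, for which the bound is saturated):

```python
# Numerical check (a sketch under our own conventions: hbar = 1, d = 1) that a
# Gaussian wavepacket saturates the bound (7); the momentum amplitude is
# obtained from the position one with a discrete Fourier transform.
import numpy as np

N, L = 2**14, 80.0
dx = L / N
x = (np.arange(N) - N // 2) * dx
psi = np.pi**-0.25 * np.exp(-x**2 / 2)                 # Gaussian wavefunction

# phi(p) = (2*pi)^(-1/2) * Integral psi(x) exp(-i p x) dx, discretized via FFT
p = 2 * np.pi * np.fft.fftfreq(N, d=dx)
phi = dx / np.sqrt(2 * np.pi) * np.exp(-1j * p * x[0]) * np.fft.fft(psi)

def diff_entropy(rho, step):                           # in nats
    rho = rho[rho > 1e-300]
    return float(-np.sum(rho * np.log(rho)) * step)

H_x = diff_entropy(np.abs(psi)**2, dx)
H_p = diff_entropy(np.abs(phi)**2, abs(p[1] - p[0]))
print(H_x + H_p, 1 + np.log(np.pi))                    # both ~2.1447
```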

3. A Discussion on Reciprocity Relations for Position–Momentum Fluctuations

The reciprocity relations mentioned so far relate the uncertainty, in the sense of dispersion (or width), of pairs of probability distributions. It is also interesting to study the existence of relationships between the fluctuations of a given distribution and those of its Fourier transform.
In Ref. [14], the authors search for these kinds of inequalities, measuring the fluctuations of a function in terms of its Lipschitz constant (LC). Numerical values were obtained for the product of LCs, η x η p , for some families of functions (Cauchy–Lorentz, Student’s t, Hermite, and Gaussian), suggesting a lower limit as
$\eta_x \, \eta_p \geq 0.3. \qquad (8)$
However, no analytical proof is given in that work, nor is a universal lower bound shown to exist.
As mentioned, one of the examples of the former inequality is the case of Student’s t distributions, which are defined as
$f_s(x;n) = \frac{\Gamma\left(\frac{n+1}{2}\right)}{\sqrt{n\pi}\,\Gamma\left(\frac{n}{2}\right)} \left(1 + \frac{x^2}{n}\right)^{-\frac{n+1}{2}}, \qquad (9)$
where n is the number of degrees of freedom, as shown in Figure 1. Indeed, Pandit et al. [14] observe that for this distribution the product $\eta_x \eta_p$ takes its smallest value (close to the proposed bound) at $n = 2$, and that it converges to $\approx 0.48$ when $n \to \infty$.
It can be seen that this distribution is related to a certain type of functions called q-Gaussians. The q-Gaussian distribution is a generalization of the standard Gaussian distribution, and it has the form:
$f_q(x) = C \left[1 - (1-q)\,\beta\, x^2\right]_{+}^{\frac{1}{1-q}}, \qquad (10)$
where q and β are real parameters (here, $q \neq 1$, and the limit $q \to 1$ leads to the Gaussian case), and $[x]_+ = x\,\Theta(x)$, with Θ the step function. If one chooses $q = \frac{n+3}{n+1}$ and $\beta = \frac{1}{3-q}$, the Student's t distribution is recovered.
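This mapping is easy to verify numerically (a sketch of ours, using scipy for the reference density):

```python
# Quick check (our sketch) of the stated mapping: with q = (n+3)/(n+1) and
# beta = 1/(3-q), the q-Gaussian (10) reproduces the Student's t density (9),
# here compared against scipy's t.pdf after numerical normalization.
import numpy as np
from scipy.stats import t as student_t

n = 5
q = (n + 3) / (n + 1)
beta = 1 / (3 - q)

x = np.linspace(-60, 60, 60001)
dx = x[1] - x[0]
shape = np.maximum(1 - (1 - q) * beta * x**2, 0.0) ** (1 / (1 - q))
f_q = shape / (shape.sum() * dx)          # fixes the constant C numerically
f_t = student_t.pdf(x, df=n)
print(np.max(np.abs(f_q - f_t)))          # small; limited by grid truncation
```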
A particular feature of these distributions is that in the same way as the Gaussian distribution maximizes Shannon entropy, the q-Gaussian distributions maximize Tsallis entropy (see [15] and refs. therein), which is defined for a continuous probability distribution p ( x ) as
$S_q[p] = \frac{1}{q-1} \left( 1 - \int \big(p(x)\big)^q \, dx \right). \qquad (11)$
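As a short sketch (our addition), S_q can be computed on a grid, and one can check that it approaches the Shannon differential entropy as q → 1:

```python
# Sketch (our addition): Tsallis entropy S_q of a density sampled on a grid,
# checking that S_q tends to the Shannon differential entropy as q -> 1.
import numpy as np

x = np.linspace(-10, 10, 4001)
dx = x[1] - x[0]
rho = np.exp(-x**2 / 2) / np.sqrt(2 * np.pi)        # standard Gaussian

def tsallis(rho, q, dx):
    return (1.0 - np.sum(rho**q) * dx) / (q - 1.0)

H_shannon = -np.sum(rho * np.log(rho)) * dx          # = 0.5*ln(2*pi*e) ~ 1.419
for q in (1.5, 1.1, 1.01, 1.001):
    print(q, tsallis(rho, q, dx), "->", H_shannon)
```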
This entropy has proven to be relevant in a wide range of applications. Interestingly enough, Vignat et al. [16] show that q-Gaussians correspond to the ground state of a quantum system under a certain potential. To this end, they look for the expression of the potentials whose ground states are wavefunctions of the type (10), finding
$V(\mathbf{r}) = -\frac{\beta}{2}\, d + \frac{\beta^2 r^2 \left(d(q-1) + 3 - 2q\right)}{2 \left(1 - (q-1)\beta r^2\right)^2}. \qquad (12)$
When $q \to 1$, it is seen that $V(\mathbf{r}) = -\frac{d\beta}{2} + \frac{1}{2}\beta^2 r^2$, which is the potential of a harmonic oscillator. When $q > 1$, it presents a singularity at $r_w = \frac{1}{\sqrt{(q-1)\beta}}$, which is known as the "Tsallis cutoff condition". The connections established so far between the shape of the ground-state wavefunctions and their corresponding potentials allow us to give a physical meaning to the distributions employed in [14] as examples; i.e., the Student's t distribution in d = 1 arises from a potential of the form
$V(r) = \frac{n+1}{4n} \left( -1 + \frac{(n-1)\, r^2}{2n \left(1 - \frac{r^2}{n}\right)^2} \right) \qquad (13)$
in terms of n. Therefore, we see that the LC-based reciprocity relations in this case refer to this physical problem.
Going back to the proposed lower bound in Equation (8), as said, it was obtained numerically and for a limited number of distributions. However, a counterexample has been provided by Bialynicki-Birula [17]. Indeed, he proposed a function and its Fourier transform such that the product of their Lipschitz constants can be made arbitrarily small. Those functions have the shape
$\psi(x) = \frac{1}{\sqrt[4]{\pi}} \frac{1}{\sqrt{a}} \exp\left(-(1+ib)\,\frac{x^2}{2a^2}\right) \qquad (14)$
and
$\tilde{\psi}(p) = \frac{1}{\sqrt[4]{\pi}} \sqrt{\frac{a}{1+ib}} \, \exp\left(-\frac{a^2 p^2}{2(1+ib)}\right), \qquad (15)$
where a and b are assumed to be real parameters, as shown in Figure 2. It is easily seen that
$\eta_x \, \eta_p = \sqrt{\frac{2}{(1+b^2)\, e\, \pi}}, \qquad (16)$
which can, in fact, be made as small as desired by increasing the parameter b. This means that the bound proposed in Equation (8) is not valid for all wavefunctions. An interpretation of this situation is that the fluctuations are, in some sense, contained in a phase factor and thus do not show up in the probability amplitudes.
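The decreasing behavior can be reproduced numerically (our sketch below uses max |dρ/dx| on a grid as a simple proxy for the Lipschitz constant of each density; the prefactor depends on the convention adopted for η and need not match Equation (16), but the monotone decrease with b is apparent):

```python
# Illustration (our sketch): using max|d rho / d x| on a grid as a simple proxy
# for the Lipschitz constant of each density, the product of x- and p-space
# fluctuation measures for the family (14)-(15) decreases as b grows.
import numpy as np

def lipschitz_proxy(f, t):
    return float(np.max(np.abs(np.gradient(f, t))))

a = 1.0
x = np.linspace(-20, 20, 40001)
p = np.linspace(-20, 20, 40001)
for b in (0.0, 1.0, 3.0, 10.0):
    rho_x = np.abs(np.pi**-0.25 * a**-0.5
                   * np.exp(-(1 + 1j * b) * x**2 / (2 * a**2)))**2
    rho_p = np.abs(np.pi**-0.25 * np.sqrt(a / (1 + 1j * b))
                   * np.exp(-a**2 * p**2 / (2 * (1 + 1j * b))))**2
    print(b, lipschitz_proxy(rho_x, x) * lipschitz_proxy(rho_p, p))
```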

4. On Global Reciprocity Relations

As seen earlier, the Lipschitz constant of a function is a reasonable measure of its fluctuations. However, it has some features that are not ideal for such a measure; for example, it depends only on a single extreme value (the maximal rate of change) of the function. Another quantity that can be used to measure fluctuations is the so-called Fisher information [18].
Fisher information has a wide variety of uses and interpretations. In statistics, it quantifies the ability to estimate a parameter, through the Cramér–Rao bound, while in physics it can be interpreted as a measure of the order (or disorder) of a system.
The Fisher information of a PDF ρ ( x ) is defined as
$I[\rho] = \int \frac{\big(\rho'(x)\big)^2}{\rho(x)}\, dx = 4 \int \left(\frac{d}{dx}\sqrt{\rho(x)}\right)^2 dx = \int \rho(x) \left(\frac{d}{dx}\ln \rho(x)\right)^2 dx, \qquad (17)$
where the three expressions are equivalent. From them, it can be seen that I depends not only on ρ but also on its derivative. This implies that, unlike the entropy or the standard deviation, the Fisher information is sensitive to local changes in ρ: it is larger the more localized the function, and smaller the more spread out it is.
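A grid-based sketch (our addition) confirming that the three forms agree, here for a Gaussian $N(0,\sigma^2)$, whose Fisher information is $1/\sigma^2$:

```python
# Sketch (our addition): the three expressions in (17) computed on a grid for
# a Gaussian N(0, sigma^2); all should give the known value 1/sigma^2.
import numpy as np

sigma = 2.0
x = np.linspace(-30, 30, 60001)
dx = x[1] - x[0]
rho = np.exp(-x**2 / (2 * sigma**2)) / (sigma * np.sqrt(2 * np.pi))

drho = np.gradient(rho, x)
I1 = np.sum(drho**2 / rho) * dx
I2 = 4 * np.sum(np.gradient(np.sqrt(rho), x)**2) * dx
I3 = np.sum(rho * np.gradient(np.log(rho), x)**2) * dx
print(I1, I2, I3, 1 / sigma**2)            # all ~0.25
```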
More recently, the Fisher information has been used as a measure of information for probability distributions in quantum mechanics and, as a result, reciprocity relations involving this quantity have been studied [19,20,21]. In particular, it has been proven that when the wavefunction ψ(x), or its momentum-representation counterpart φ(p), is real, the following relation holds:
$I_x \, I_p \geq 4 d^2, \qquad (18)$
which also implies
$I_x + I_p \geq 4 d. \qquad (19)$
These expressions cannot, however, be extended to cases where both wavefunctions are complex.
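For instance (our sketch, taking ℏ = m = ω = 1), the real ground state of the harmonic oscillator saturates both bounds in d = 1, since its position and momentum densities are both $N(0, 1/2)$ and hence $I_x = I_p = 2$:

```python
# Check (our sketch, hbar = m = omega = 1): the real harmonic-oscillator ground
# state has Gaussian position and momentum densities N(0, 1/2), so I_x = I_p = 2
# and both (18) and (19) are saturated in d = 1.
import numpy as np

def fisher(rho, t):
    return float(np.sum(np.gradient(rho, t)**2 / rho) * (t[1] - t[0]))

x = np.linspace(-12, 12, 24001)
rho = np.exp(-x**2) / np.sqrt(np.pi)       # |psi_0|^2, same in x and p
I_x = I_p = fisher(rho, x)
print(I_x * I_p, I_x + I_p)                # ~4 = 4*d**2 and ~4 = 4*d
```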
With this in mind, it is interesting to note that the functions (14) and (15) are both complex. It should then come as no surprise that those functions do not obey inequality (8). Furthermore, we argue that, since the Fisher information depends on the derivative of the probability density function and involves an integral over the whole domain, it is a more appropriate quantity than the Lipschitz constant for measuring fluctuations.

5. Final Remarks

Entropy is a central quantity in information-theoretic studies, as it possesses various properties that correspond to an intuitive notion of a measure of information (or, indeed, of the lack of it, which can be interpreted as uncertainty or indeterminacy). The Shannon entropy H provides a global measure of the dispersion of a probability distribution: the more localized the distribution, the smaller its associated entropy.
In the realm of quantum mechanics, a foundational feature is the uncertainty principle for pairs of observables, which has been quantified in terms of entropic measures. Establishing entropic uncertainty relations, and studying the states that reach the corresponding lower bounds, has been the focus of many efforts, while practical applications have also been envisaged.
Another type of reciprocity relation is that based on the Fisher information. While the Shannon entropy provides a global measure of the spread of a function, the Fisher information is sensitive to local changes in the density: the greater the Fisher information, the more localized the distribution, and the smaller the associated uncertainty.
In light of the reciprocity relations discussed above, our interest lies in studying the existence of such relations, in terms of inequalities either already established or yet to emerge, while searching for new relationships between information measures.

Author Contributions

The authors contributed equally to this work. All authors have read and agreed to the published version of the manuscript.

Funding

The authors acknowledge support from the National Research Council (CONICET), Argentina. Financial assistance from UNLP under the project 11/X812, and from CONICET under the projects PIP 0519 and PUE 066, is also acknowledged. MP is grateful to GIPSA-Lab during her research stay at Université Grenoble-Alpes, France. FH was partially funded by the project “Per un’estensione semantica della Logica Computazionale Quantistica- Impatto teorico e ricadute implementative”, Regione Autonoma della Sardegna, (RAS: RASSR40341), L.R. 7/2017, annualità 2017- Fondo di Sviluppo e Coesione (FSC) 2014–2020.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

Not applicable.

Conflicts of Interest

The authors declare no conflict of interest.

References

1. Shannon, C.E. A mathematical theory of communication. Bell Syst. Tech. J. 1948, 27, 379–423.
2. Shannon, C.E. Communication theory of secrecy systems. Bell Syst. Tech. J. 1949, 28, 656–715.
3. Bosyk, G.M.; Zozor, S.; Holik, F.; Portesi, M.; Lamberti, P.W. A family of generalized quantum entropies: Definition and properties. Quantum Inf. Process. 2016, 15, 3393–3420.
4. Heisenberg, W. Über den anschaulichen Inhalt der quantentheoretischen Kinematik und Mechanik. Zeitschrift für Physik 1927, 43, 172–198.
5. Robertson, H.P. The Uncertainty Principle. Phys. Rev. 1929, 34, 163–164.
6. Zozor, S.; Portesi, M.; Sánchez-Moreno, P.; Dehesa, J.S. Position-momentum uncertainty relations based on moments of arbitrary order. Phys. Rev. A 2011, 83, 052107.
7. Bosyk, G.M.; Zozor, S.; Portesi, M.; Osán, T.M.; Lamberti, P.W. Geometric approach to extend Landau–Pollak uncertainty relations for positive operator-valued measures. Phys. Rev. A 2014, 90, 052114.
8. Białynicki-Birula, I.; Mycielski, J. Uncertainty relations for information entropy in wave mechanics. Commun. Math. Phys. 1975, 44, 129–132.
9. Deutsch, D. Uncertainty in Quantum Measurements. Phys. Rev. Lett. 1983, 50, 631–633.
10. Coles, P.J.; Berta, M.; Tomamichel, M.; Wehner, S. Entropic uncertainty relations and their applications. Rev. Mod. Phys. 2017, 89, 015002.
11. Hirschman, I.I. A Note on Entropy. Am. J. Math. 1957, 79, 152–156.
12. Everett, H., III. The Theory of the Universal Wave Function. Ph.D. Thesis, Princeton University, Princeton, NJ, USA, 1956.
13. Beckner, W. Inequalities in Fourier analysis. Ann. Math. 1975, 102, 159–182.
14. Pandit, M.; Bera, A.; Sen, A.; Sen, U. Quantum reciprocity relations for fluctuations of position and momentum. Phys. Rev. A 2019, 100, 012131.
15. Tsallis, C. Beyond Boltzmann–Gibbs–Shannon in Physics and Elsewhere. Entropy 2019, 21, 696.
16. Vignat, C.; Plastino, A.; Plastino, A.R.; Dehesa, J.S. Quantum potentials with q-Gaussian ground states. Phys. A Stat. Mech. Its Appl. 2012, 391, 1068–1073.
17. Bialynicki-Birula, I. Comment on "Quantum reciprocity relations for fluctuations of position and momentum". Phys. Rev. A 2019, 100, 046101.
18. Fisher, R.A. On the Mathematical Foundations of Theoretical Statistics. Philos. Trans. R. Soc. Lond. Ser. A 1922, 222, 309–368.
19. Sánchez-Moreno, P.; Plastino, A.R.; Dehesa, J.S. A quantum uncertainty relation based on Fisher's information. J. Phys. A Math. Theor. 2011, 44, 065301.
20. Dehesa, J.; Sorokin, V. Information-theoretic measures for Morse and Pöschl–Teller potentials. Mol. Phys. 2006, 104, 613–622.
21. Sánchez-Moreno, P.; González-Férez, R.; Dehesa, J.S. Improvement of the Heisenberg and Fisher-information-based uncertainty relations for D-dimensional central potentials. New J. Phys. 2006, 8, 330.
Figure 1. Student’s t distribution, Equation (9), for different degrees of freedom n.
Figure 1. Student’s t distribution, Equation (9), for different degrees of freedom n.
Psf 05 00044 g001
Figure 2. Examples of the wavefunctions (14) and (15): real (top) and imaginary (middle) parts, and square modulus (bottom).