1. Introduction
Broadly speaking, information theory, developed after the seminal works of Claude E. Shannon in 1948 [1] and 1949 [2], provides a formal and abstract framework which allows formalizing notions such as ignorance, uncertainty, or unpredictability with regard to a given variable or system. This theory is the basis of many of the technological advancements in the fields of communication and cryptography, for instance. Furthermore, its extension to the physical sciences, including quantum mechanics, has been vastly studied, and relevant results have been obtained.
One of the most important quantities in information theory is entropy. Mathematically, an information entropy is a functional over random variables (or quantum states, in the case of quantum–mechanical systems) which aims to measure uncertainty in the sense of ignorance or lack of information associated with the distribution of the variable. While Shannon entropy is the most popular one, there exist a great number of families of alternative entropies, such as the monoparametric forms provided by Rényi, Tsallis, or Kaniadakis, which reduce to the Shannon case in a limiting situation, as well as further generalized entropy families (see, for instance, the quantal version in Ref. [3] and references therein). In a strictly quantum framework, the von Neumann entropy, the min-entropy, and the alluded generalizations are used. The conditional and joint entropies, the mutual information, and other relative forms serve, e.g., to relate and discriminate distributions for more than one variable.
Entropic measures are of particular interest due to the maximum entropy principle (known for short as MEP or MaxEnt), which was introduced by Edwin Jaynes in 1957 in the field of statistical mechanics. This principle states that the statistical distribution representing a system in equilibrium is the one which maximizes the entropy while fulfilling a set of constraints satisfied by the system; that is, MEP selects the distribution that assumes the least amount of information beyond the given constraints. This principle has equivalent formulations in problems in the areas of communications and pattern recognition, among others. While it was originally formulated using Shannon entropy, its form does not change substantially when generalized entropies are employed. The informational quantities mentioned so far have been extensively studied in order to establish different relations among them, in terms of equalities and inequalities that connect them from different perspectives or provide lower or upper bounds, also giving a meaning to the states or distributions that saturate those bounds. Of particular relevance, mainly in quantum theory but also in communication, are the efforts devoted to finding uncertainty relations expressed through entropies, known as entropic uncertainty relations (EURs). There exist reasons for which a treatment based on entropies is favorable over a variance-based one.
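To make the above statement concrete, a standard textbook sketch of the MaxEnt prescription (not tied to any particular reference cited here) reads as follows: maximizing the Shannon entropy subject to normalization and to fixed expectation values $\langle A^{(k)}\rangle$ of a set of observables $A^{(k)}$ yields an exponential-family distribution,
$$\max_{\{p_i\}} \left(-\sum_i p_i \ln p_i\right) \ \ \text{s.t.}\ \ \sum_i p_i = 1,\ \ \sum_i p_i A^{(k)}_i = \langle A^{(k)}\rangle \ \ \Longrightarrow\ \ p_i = \frac{1}{Z(\boldsymbol{\lambda})}\exp\!\Big(-\sum_k \lambda_k A^{(k)}_i\Big),$$
with $Z(\boldsymbol{\lambda}) = \sum_i \exp\big(-\sum_k \lambda_k A^{(k)}_i\big)$ and the Lagrange multipliers $\lambda_k$ fixed by the constraints; taking $A^{(1)}$ as the energy recovers the canonical (Boltzmann–Gibbs) distribution of statistical mechanics.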
Among the most relevant features of quantum mechanics, the existence of uncertainty relations (also called trade-off or reciprocity inequalities) is particularly important. These types of relations represent a paradigm shift toward a fundamental unpredictability present in natural phenomena. Their extensive study is related not only to their importance for a better understanding of quantum mechanics but also to their applications in fields such as cryptography, information, and computer science.
The first of these kinds of relations is the so-called Heisenberg uncertainty principle. It was originally proposed as a trade-off involving the uncertainties or imprecisions for the position $x$ and momentum $p$ of a particle, in terms of Planck's constant $h \approx 6.626 \times 10^{-34}$ J Hz$^{-1}$, in the form
$$\Delta x \, \Delta p \gtrsim h. \qquad (1)$$
This was presented by Werner Heisenberg in 1927 [4], after the foundations of matrix and wave mechanics were already developed. He tried to make sense of this relation in the following way: “The more accurately the position is determined, the less accurately the momentum is known, and conversely”. Heisenberg also saw a “direct and intuitive interpretation” of the commutation relations between the position and momentum operators.
Indeed, a derivation of this principle starting from a wave-mechanical formalism was given the same year by Earle Kennard, who provided an inequality for the standard deviations of the corresponding operators and established the lower bound as half the reduced Planck constant ($\hbar = h/2\pi$):
$$\sigma_x \, \sigma_p \;\geq\; \frac{\hbar}{2}. \qquad (2)$$
While this relation is presented for the variances of the position and momentum observables, in 1929 Howard Robertson [5], and the following year Erwin Schrödinger, showed a generalization for any two arbitrary operators. These forms included, respectively, the commutator $[\hat{A},\hat{B}]$ and the anticommutator $\{\hat{A},\hat{B}\}$ of any two (not necessarily commuting) quantum–mechanical observables $\hat{A}$ and $\hat{B}$. In its strongest form, it reads
$$\sigma_A^2 \, \sigma_B^2 \;\geq\; \left|\tfrac{1}{2}\langle\{\hat{A},\hat{B}\}\rangle - \langle\hat{A}\rangle\langle\hat{B}\rangle\right|^2 + \left|\tfrac{1}{2i}\langle[\hat{A},\hat{B}]\rangle\right|^2, \qquad (3)$$
where the mean values represented by the brackets $\langle\,\cdot\,\rangle$ are computed for the corresponding wavefunction or, more generally, for the density operator of the quantum system.
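As a minimal numerical sketch of relation (3) (our own illustration, not taken from the cited references), one can draw a random state and two random Hermitian operators in a finite-dimensional space and check the inequality directly; all names and parameter choices below are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(0)
dim = 4  # dimension of a toy Hilbert space

def rand_hermitian(d):
    M = rng.normal(size=(d, d)) + 1j * rng.normal(size=(d, d))
    return (M + M.conj().T) / 2

A, B = rand_hermitian(dim), rand_hermitian(dim)

psi = rng.normal(size=dim) + 1j * rng.normal(size=dim)
psi /= np.linalg.norm(psi)             # normalized pure state

def mean(O):                           # <psi|O|psi> (real for Hermitian O)
    return np.real(psi.conj() @ O @ psi)

varA = mean(A @ A) - mean(A) ** 2
varB = mean(B @ B) - mean(B) ** 2

comm = A @ B - B @ A                   # commutator  [A, B]
anti = A @ B + B @ A                   # anticommutator {A, B}

lhs = varA * varB
rhs = abs(0.5 * (psi.conj() @ anti @ psi) - mean(A) * mean(B)) ** 2 \
    + abs(0.5 * (psi.conj() @ comm @ psi)) ** 2   # |<[A,B]>/(2i)| = |<[A,B]>|/2

print(lhs >= rhs - 1e-12, lhs, rhs)    # the Schrödinger-form bound holds
```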
The last assertion points precisely to one of the drawbacks of this sequence of relations: both sides of the inequalities commonly depend on the system's state. In other words, these are not universal relations for a given pair of observables. In passing, we note that more general variance-based relationships have been provided, with different implications. These comprise general powers (other than the square) of the involved operators (see, for instance, [6] and references therein), or even a compromise among more than two observables in a unique trade-off. Other attempts, such as the Landau–Pollak inequality and geometric-inspired formulations, have also contributed to this problem from diverse viewpoints (see, e.g., [7]). As already mentioned, entropic uncertainty inequalities appeared as an effort to account for connections among quantum observables (or, in other terms depending on the context, among Fourier-related probability distributions). Within this framework, particularities of those states reaching the bounds have been analyzed [8,9,10].
Furthermore, in order to overcome weaknesses present in variance-based relationships, and also to provide an alternative point of view to the entropic inequalities, other kinds of reciprocity relations have been envisaged that make use, for instance, of the Fisher information. It is interesting to notice that while Shannon entropy provides a global measure of the dispersion of a function, Fisher information is a cumulative measure sensitive to local changes in the density; roughly speaking, the bigger the Fisher information, the more localized the function, and the smaller the associated uncertainty.
In this work, we deal with reciprocity relations, mainly for position and momentum operators of quantum systems, in terms of information quantifiers. In Section 2, we recall basic results on EURs for continuous quantum observables and refer to analogous inequalities for Fourier-transformed PDFs. Changes in the distributions corresponding to those observables are considered from another viewpoint in Section 3, where we discuss some results for localized $x$–$p$ fluctuations, while Section 4 is devoted to the study of cumulative reciprocity relations. We provide some final remarks in Section 5.
2. Uncertainty Relations in Terms of Information Entropies
For a discrete random variable $X$ that assumes values on an alphabet $\mathcal{X}$ of cardinal $|\mathcal{X}|$ (possibly infinite), the Shannon entropy is given by
$$H(X) = -\sum_{x \in \mathcal{X}} p(x) \log_2 p(x), \qquad (4)$$
where $p(x)$ is the corresponding probability function, and the logarithm in base 2 gives the value of $H$ in bits (any other base could be used as well). The extension to the case of a continuous random variable with probability density $\rho(x)$ gives rise to the so-called differential entropy,
$$H[\rho] = -\int \rho(x) \ln \rho(x)\, dx. \qquad (5)$$
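The following short numerical sketch (ours, with arbitrary example distributions) illustrates both definitions: the Shannon entropy (4) of a simple discrete distribution and the differential entropy (5) of a Gaussian density, the latter compared with its known closed form $\tfrac{1}{2}\ln(2\pi e\,\sigma^2)$.

```python
import numpy as np

# Shannon entropy (in bits) of a discrete distribution, Eq. (4)
p = np.array([0.5, 0.25, 0.125, 0.125])
H_discrete = -np.sum(p * np.log2(p))          # = 1.75 bits

# Differential entropy (in nats) of a Gaussian density, Eq. (5), by quadrature
sigma = 2.0
x = np.linspace(-40, 40, 200001)
rho = np.exp(-x**2 / (2 * sigma**2)) / (sigma * np.sqrt(2 * np.pi))
H_diff = -np.trapz(rho * np.log(rho), x)

print(H_discrete)                                          # 1.75
print(H_diff, 0.5 * np.log(2 * np.pi * np.e * sigma**2))   # both ~2.112
```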
As mentioned, to account for the uncertainty principle quantitatively, the unwanted behaviors shown by variance-based inequalities led to their replacement by other information (or uncertainty) quantifiers. Those most extensively studied are the entropy-based uncertainty relations. One of the first arguments in favor of EURs was given by David Deutsch in 1983 [9], pointing out the dependence of Robertson's lower bound, in general, on the system's state. In particular, this limit is trivial when the state yields a null mean value for the commutator, which is always possible for finite-dimensional models (for instance, for a spin-1/2 system and the pair $\sigma_x$, $\sigma_y$, any state lying on the equator of the Bloch sphere gives $\langle \sigma_z \rangle = 0$ and hence a vanishing Robertson bound).
It is interesting to note that, in the context of Fourier analysis, entropic inequalities had been envisaged by Hirschman and Everett [11,12] and proved by Beckner in 1975 [13]. In particular, the following expression was obtained:
$$H(|f|^2) + H(|\tilde{f}|^2) \;\geq\; \ln(\pi e), \qquad \|f\|_2 = 1, \qquad (6)$$
for a function $f$ and its Fourier transform $\tilde{f}$, where $H$ is the differential entropy and $\|\cdot\|_2$ indicates the 2-norm of a function. This result was reinterpreted as an uncertainty relation for quantum systems by Bialynicki-Birula and Mycielski [8], under the form
$$H_x + H_p \;\geq\; d\,(1 + \ln\pi), \qquad (7)$$
for the amplitudes of the wavefunction in the position and momentum representations (here $H_x$ and $H_p$ denote the differential entropies of the densities $|\psi(x)|^2$ and $|\tilde{\psi}(p)|^2$, respectively), where $d$ is the spatial dimension. From these relations, alternative ones and extensions have been developed, generating an active field of research and applications. One important fact to stress is that it can be demonstrated that EUR (7) implies inequality (2), and thus the former is stronger than the latter.
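As an illustration of inequality (7) (a sketch of ours, assuming $d = 1$ and $\hbar = 1$), one can verify numerically that a Gaussian wave packet saturates the bound $1 + \ln\pi \approx 2.145$:

```python
import numpy as np

sigma = 0.7                                   # width of the position density
x = np.linspace(-12, 12, 1201)
psi = (2 * np.pi * sigma**2) ** (-0.25) * np.exp(-x**2 / (4 * sigma**2))

# momentum wavefunction via the unitary Fourier transform (direct quadrature)
p = np.linspace(-12, 12, 1201)
psi_p = np.exp(-1j * np.outer(p, x)) @ psi * (x[1] - x[0]) / np.sqrt(2 * np.pi)

def diff_entropy(density, grid):
    d = density / np.trapz(density, grid)     # renormalize numerically
    mask = d > 1e-300
    return -np.trapz(np.where(mask, d * np.log(np.where(mask, d, 1.0)), 0.0), grid)

H_x = diff_entropy(np.abs(psi) ** 2, x)
H_p = diff_entropy(np.abs(psi_p) ** 2, p)
print(H_x + H_p, 1 + np.log(np.pi))           # both ~2.1447: saturation for Gaussians
```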
3. A Discussion on Reciprocity Relations for Position–Momentum Fluctuations
The reciprocity relations mentioned so far relate the uncertainty, in the sense of dispersion (or width), for pairs of probability distributions. It is also interesting to study the existence of relationships between the fluctuations of a given distribution and its Fourier transform.
In Ref. [14], the authors search for these kinds of inequalities, measuring the fluctuations of a function in terms of its Lipschitz constant (LC). Numerical values were obtained for the product of the LCs of a function and of its Fourier transform for some families of functions (Cauchy–Lorentz, Student's $t$, Hermite, and Gaussian), suggesting the existence of a lower limit for this product (Equation (8)). However, no analytical proof is given, nor is a universal lower bound shown to exist in that work.
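Although we do not reproduce the analysis of Ref. [14] here, the kind of quantity involved can be illustrated with a small numerical sketch of ours: the LC of a smooth function can be estimated as the largest slope on a fine grid, here for a Gaussian wave packet and its Fourier-conjugate Gaussian (unitary convention, $\hbar = 1$); the particular definitions and normalizations adopted in [14] may differ.

```python
import numpy as np

def lipschitz_constant(f_vals, grid):
    """Numerical estimate of the Lipschitz constant: largest |slope| on the grid."""
    return np.max(np.abs(np.diff(f_vals) / np.diff(grid)))

sigma = 1.0
x = np.linspace(-12, 12, 4001)
psi_x = (2 * np.pi * sigma**2) ** (-0.25) * np.exp(-x**2 / (4 * sigma**2))

# Fourier-conjugate Gaussian: width 1/(2*sigma) in momentum space
sigma_p = 1 / (2 * sigma)
p = np.linspace(-12, 12, 4001)
psi_p = (2 * np.pi * sigma_p**2) ** (-0.25) * np.exp(-p**2 / (4 * sigma_p**2))

L_x = lipschitz_constant(psi_x, x)
L_p = lipschitz_constant(psi_p, p)
print(L_x, L_p, L_x * L_p)   # the product is independent of sigma for this family
```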
As mentioned, one of the examples considered for the former inequality is the case of the Student's $t$ distributions, which are defined as
$$p_n(x) = \frac{\Gamma\!\left(\frac{n+1}{2}\right)}{\sqrt{n\pi}\,\Gamma\!\left(\frac{n}{2}\right)} \left(1 + \frac{x^2}{n}\right)^{-\frac{n+1}{2}}, \qquad (9)$$
where $n$ is the number of degrees of freedom, as shown in Figure 1. Indeed, Pandit et al. [14] analyze, for this distribution, the behavior of the product of LCs as a function of $n$, including its limiting values (recall that the Gaussian case is recovered as $n \to \infty$).
It can be seen that this distribution is related to a certain type of functions called $q$-Gaussians. The $q$-Gaussian distribution is a generalization of the standard Gaussian distribution, and it has the form
$$G_q(x) = \frac{\sqrt{\beta}}{C_q}\, e_q\!\left(-\beta x^2\right), \qquad (10)$$
where $q$ and $\beta$ are real parameters (here, $q < 3$ and $\beta > 0$, and the limit $q \to 1$ leads to the Gaussian case), $C_q$ is a normalization constant, and the $q$-exponential is
$$e_q(u) = \left[1 + (1-q)\,u\right]^{\frac{1}{1-q}}\, \theta\!\left(1 + (1-q)\,u\right), \qquad (11)$$
with $\theta$ the step function. If one chooses $q$ such that $q = \frac{n+3}{n+1}$ and sets $\beta = \frac{1}{3-q}$, the Student's $t$ distribution is recovered.
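The stated correspondence can be checked numerically; the short sketch below (ours, using the parameter mapping quoted above) compares a numerically normalized $q$-Gaussian with SciPy's Student's $t$ density.

```python
import numpy as np
from scipy.stats import t as student_t

n = 5                        # degrees of freedom
q = (n + 3) / (n + 1)        # parameter mapping quoted in the text
beta = 1 / (3 - q)

x = np.linspace(-60, 60, 200001)
# unnormalized q-Gaussian; the clipping only matters for q < 1 (compact support)
g = np.maximum(1 - (1 - q) * beta * x**2, 0.0) ** (1 / (1 - q))
g /= np.trapz(g, x)          # normalize numerically

print(np.max(np.abs(g - student_t.pdf(x, df=n))))   # close to zero (~1e-8)
```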
A particular feature of these distributions is that, in the same way as the Gaussian distribution maximizes Shannon entropy, the $q$-Gaussian distributions maximize Tsallis entropy (see [15] and refs. therein), which is defined for a continuous probability distribution $\rho(x)$ as
$$S_q[\rho] = \frac{1}{q-1}\left(1 - \int \rho(x)^{\,q}\, dx\right). \qquad (12)$$
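As a consistency check (a standard limit, not specific to Ref. [15]), writing $\rho^q = \rho\, e^{(q-1)\ln\rho}$ and expanding to first order in $q-1$ shows that the differential Shannon entropy (5) is recovered from (12) in the limit $q \to 1$:
$$\lim_{q \to 1} S_q[\rho] = \lim_{q \to 1} \frac{1 - \int \rho(x)\left[1 + (q-1)\ln\rho(x) + \mathcal{O}\!\left((q-1)^2\right)\right] dx}{q-1} = -\int \rho(x)\ln\rho(x)\, dx.$$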
This entropy has proven to be relevant in a wide range of applications. Interestingly enough, Vignat et al. [16] show that $q$-Gaussians correspond to the ground state of a quantum system under a certain potential. To this end, they look for the expression of the potentials whose ground states are wavefunctions of the type (10), finding
$$V(x) = \frac{\hbar^2}{2m}\left[\frac{4 q\, \beta^2 x^2}{\left[1 - (1-q)\,\beta x^2\right]^{2}} - \frac{2\beta}{1 - (1-q)\,\beta x^2}\right] + E_0. \qquad (13)$$
When $q \to 1$, it is seen that $V(x) \to \frac{\hbar^2}{2m}\left(4\beta^2 x^2 - 2\beta\right) + E_0$, which is, up to a constant, the potential of a harmonic oscillator. When $q < 1$, the potential presents a singularity at $x = \pm\left[(1-q)\,\beta\right]^{-1/2}$, which is known as the “Tsallis cutoff condition”. The connections established so far between the shape of the ground-state wavefunctions and their corresponding potentials allow us to give a physical meaning to the distributions employed in [14] as examples; i.e., the Student's $t$ distribution (9) arises from a potential of the form (13) with $q = \frac{n+3}{n+1}$, that is, expressed in terms of $n$. Therefore, we see that the LC-based reciprocity relations in this case refer to this physical problem.
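A quick numerical sanity check of the reconstruction above (ours, assuming $\hbar = 2m = 1$ and purely illustrative parameter values) consists in building a $q$-Gaussian wavefunction on a grid, computing $\psi''/\psi$ by finite differences, and comparing it with the $x$-dependent part of (13):

```python
import numpy as np

q, beta = 0.5, 1.0                         # illustrative values (q < 1: compact support)
x = np.linspace(-1.0, 1.0, 20001)          # well inside the support |x| < [(1-q)*beta]**-0.5
u = 1 - (1 - q) * beta * x**2
psi = u ** (1 / (1 - q))                   # unnormalized q-Gaussian wavefunction

d2psi = np.gradient(np.gradient(psi, x), x)
lhs = d2psi / psi                                      # psi''/psi = 2m(V - E0)/hbar^2
rhs = 4 * q * beta**2 * x**2 / u**2 - 2 * beta / u     # x-dependent part of Eq. (13)

interior = slice(100, -100)                # discard finite-difference edge artifacts
print(np.max(np.abs(lhs[interior] - rhs[interior])))   # close to zero
```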
Going back to the lower bound proposed in Equation (8), as said, it was obtained numerically and for a limited number of distributions. However, a counterexample has been provided by Bialynicki-Birula [17]. Indeed, he proposed a function and its Fourier transform, given in Equations (14) and (15), such that the product of their Lipschitz constants can be made arbitrarily small; there, $a$ and $b$ are assumed to be real parameters, and the functions are shown in Figure 2. It is easily seen that the resulting product of LCs can in fact be made as small as desired by increasing the parameter $b$. This means that the bound proposed in Equation (8) is not valid for all wavefunctions. An interpretation of this situation is that the fluctuations are, in some sense, contained in a phase factor and thus do not appear when considering the probability amplitudes.
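The phase-factor mechanism can be illustrated with a simple toy example of ours (a chirped Gaussian, which is not the construction of Ref. [17]; it only mimics the effect described above): multiplying a fixed Gaussian envelope by the pure phase $e^{i b x^2/2}$ leaves the position density untouched, while the momentum density, whose closed form is used below, becomes arbitrarily flat as $b$ grows, so the product of the LCs of the two densities can be made as small as desired.

```python
import numpy as np

def lipschitz(f_vals, grid):
    """Largest |slope| on the grid: a numerical estimate of the Lipschitz constant."""
    return np.max(np.abs(np.diff(f_vals) / np.diff(grid)))

x = np.linspace(-20, 20, 40001)
p = np.linspace(-200, 200, 400001)

for b in [0.0, 2.0, 10.0, 50.0]:
    # psi_b(x) = pi**-0.25 * exp(-x**2/2) * exp(1j*b*x**2/2):
    rho_x = np.exp(-x**2) / np.sqrt(np.pi)                            # independent of b
    rho_p = np.exp(-p**2 / (1 + b**2)) / np.sqrt(np.pi * (1 + b**2))  # spreads with b
    print(b, lipschitz(rho_x, x), lipschitz(rho_p, p))                # product shrinks with b
```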
4. On Global Reciprocity Relations
As seen earlier, the Lipschitz constant of a function is a reasonable measure of its fluctuations. However, it has some aspects which are not ideal for such a measure; for example, it depends only on a single maximum (supremum) value. Another quantity which can be used to measure fluctuations is the so-called Fisher information [18].
Fisher information has a wide variety of uses and interpretations. In statistics, it is used as a measure of the ability to estimate a parameter, through the Cramér–Rao bound, while in physics, it can be interpreted as a measure of the order (or disorder) of a system.
The Fisher information of a PDF $\rho(x)$ is defined as
$$I[\rho] = \int \rho(x)\left[\frac{d}{dx}\ln\rho(x)\right]^{2} dx = \int \frac{\left[\rho'(x)\right]^{2}}{\rho(x)}\, dx = 4\int \left[\frac{d}{dx}\sqrt{\rho(x)}\right]^{2} dx,$$
where all three expressions are equivalent. From these expressions, it can be seen that $I$ depends not only on $\rho(x)$ but also on its derivative. This implies that, unlike the entropy or the standard deviation, Fisher information is sensitive to local changes in $\rho(x)$: it is greater the more localized the function, and smaller the more spread out it is.
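The equivalence of the three expressions, and the value $I = 1/\sigma^2$ for a Gaussian, can be verified with a short numerical sketch (ours; the grid and width are arbitrary):

```python
import numpy as np

sigma = 1.5
x = np.linspace(-20, 20, 200001)
rho = np.exp(-x**2 / (2 * sigma**2)) / (sigma * np.sqrt(2 * np.pi))

drho = np.gradient(rho, x)
dsqrt = np.gradient(np.sqrt(rho), x)

I1 = np.trapz(rho * (drho / rho) ** 2, x)   # rho * (d ln rho / dx)^2
I2 = np.trapz(drho**2 / rho, x)             # (rho')^2 / rho
I3 = 4 * np.trapz(dsqrt**2, x)              # 4 * (d sqrt(rho) / dx)^2

print(I1, I2, I3, 1 / sigma**2)             # all ~0.444 for this Gaussian
```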
More recently, Fisher information has been used as a measure of information for probability distributions in quantum mechanics and, as a result, there have been studies on reciprocity relations involving this quantity [19,20,21]. In particular, it has been proven that when the wavefunction $\psi(x)$, or its equivalent in the momentum representation $\tilde{\psi}(p)$, is real, the following relation holds true:
$$I_x\, I_p \;\geq\; \frac{4}{\hbar^{2}},$$
where $I_x$ and $I_p$ are the Fisher informations of the position and momentum densities, which also implies the variance-based relation (2),
$$\sigma_x\, \sigma_p \;\geq\; \frac{\hbar}{2}.$$
These expressions cannot, however, be extended to cases where both wavefunctions are complex.
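For a concrete real-wavefunction instance, the sketch below (ours, taking $\hbar = m = \omega = 1$) evaluates the position- and momentum-space Fisher informations for the first excited state of the harmonic oscillator, for which $I_x = I_p = 6$, so the product comfortably exceeds the bound of 4:

```python
import numpy as np

def fisher(density, grid):
    d = np.gradient(density, grid)
    mask = density > 1e-30
    return np.trapz(np.where(mask, d**2 / np.where(mask, density, 1.0), 0.0), grid)

# First excited harmonic-oscillator state (hbar = m = omega = 1): a real wavefunction
x = np.linspace(-15, 15, 200001)
psi_x = np.sqrt(2 / np.sqrt(np.pi)) * x * np.exp(-x**2 / 2)
# Its momentum wavefunction has the same functional form (up to a global phase)
p = x.copy()
psi_p = psi_x.copy()

I_x = fisher(psi_x**2, x)
I_p = fisher(psi_p**2, p)
print(I_x, I_p, I_x * I_p >= 4)   # I_x = I_p = 6, product 36 >= 4
```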
With this in mind, it is interesting to note that the functions (14) and (15) are both complex. Thus, it should not come as a surprise that those functions do not obey inequality (8). Furthermore, we argue that, since the Fisher information depends on the derivative of the probability distribution function and involves an integral over the whole domain, it is a more appropriate quantity for measuring fluctuations than the Lipschitz constant.
5. Final Remarks
Entropy is a central quantity in information-theoretic studies, as it possesses various properties that correspond to an intuitive notion of a measure of information (or, indeed, of the lack of it, which can be interpreted as uncertainty or indeterminacy). Shannon entropy $H$ provides a global measure of the dispersion of a probability distribution: the more localized the distribution, the smaller its associated entropy.
In the realm of quantum mechanics, a foundational feature is the uncertainty principle for pairs of observables, which has been quantified in terms of entropic measures. Establishing entropic uncertainty relations and studying those states reaching the lower bound has been the focus of many efforts, while practical applications have also been envisaged.
Another type of reciprocity relations are those based on Fisher information. While Shannon entropy provides a global measure of the spread of a function, Fisher information is sensitive to local changes in the density; the greater the Fisher information, the more localized the distribution, and the smaller the associated uncertainty.
In light of the reciprocity relations discussed here, our interest lies in studying the existence of relations in the form of inequalities, whether already established or emergent, while looking for new relationships between information measures.