Article

Information-Theoretic Models for Physical Observables

Department of Genetics, Microbiology and Statistics, Faculty of Biology, Universitat de Barcelona, 08028 Barcelona, Spain
* Author to whom correspondence should be addressed.
Entropy 2023, 25(10), 1448; https://doi.org/10.3390/e25101448
Submission received: 24 August 2023 / Revised: 7 October 2023 / Accepted: 12 October 2023 / Published: 14 October 2023

Abstract

This work addresses J.A. Wheeler’s critical idea that all things physical are information-theoretic in origin. In this paper, we introduce a novel mathematical framework based on information geometry, using the Fisher information metric as a particular Riemannian metric, defined in the parameter space of a smooth statistical manifold of normal probability distributions. Following this approach, we study the stationary states of the time-independent Schrödinger equation and find that the information can be represented and distributed over a set of quantum harmonic oscillators, one for each independent source of data, the coordinate of each oscillator being a parameter of the smooth statistical manifold to estimate. We observe that the estimator’s variance equals the energy levels of the quantum harmonic oscillator, proving that the estimator’s variance is definitively quantized, with the minimum variance attained at the minimum energy level of the oscillator. Interestingly, we demonstrate that quantum harmonic oscillators reach the Cramér–Rao lower bound on the estimator’s variance at the lowest energy level. In parallel, we find that the global probability density function of the collective mode of a set of quantum harmonic oscillators at the lowest energy level equals the posterior probability distribution calculated using Bayes’ theorem from the sources of information for all data values, taking as a prior the Riemannian volume of the informative metric. Interestingly, the converse also holds, since the prior is constant. Altogether, these results suggest that the sources of information can be broken down into small elements, namely quantum harmonic oscillators, with the square modulus of the collective mode at the lowest energy representing the most likely reality. This supports A. Zeilinger’s recent statement that the world is not broken up into physical but informational parts.

1. Introduction

This work starts from the premise that physical objects are information-theoretic in origin, an idea that has recurrently appeared throughout human history, especially in physics, and particularly in quantum physics research, for over a century. To put this study in context with other studies and opinions, we first consider the following views of great scientists, which reflect a similar point of view.
For example, when asked about an underlying quantum world, N. Bohr, physicist and founder of quantum mechanics, answered: “There is no quantum world. There is only an abstract quantum physical description. It is wrong to think that the task of physics is to find out how Nature is. Physics concerns what we can say about Nature.” [1]. N. Bohr insisted that what we model results from observation, not the world itself.
Contemporary with N. Bohr, W. Heisenberg, physicist and founder of quantum mechanics, said: “The laws of nature which we formulate mathematically in quantum theory deal no longer with the particles themselves”, but “with our knowledge of the elementary particles” [2]. For W. Heisenberg, the wave function codifies our knowledge of the system we study.
Decades later, in a brilliant 1989 paper, theoretical physicist J.A. Wheeler argued that information is the most fundamental entity, giving rise to the physical, a notion he summarized with the expression “it from bit.” Quoting him: “It from bit symbolizes the idea that every item of the physical world has at bottom—at a very deep bottom, in most instances—an immaterial source and explanation; that what we call reality arises in the last analysis from the posing of yes-no questions and the registering of equipment-evoked responses; in short, that all things physical are information-theoretic in origin and this is a participatory universe.” [3].
More recently, A. Zeilinger, quantum physicist and Nobel laureate in physics in 2022, has proposed an informational viewpoint to quantum mechanics in which the world is not broken up into physical but informational parts. According to him, a quantum system represents our knowledge of the world, not the world itself. Quoting him in a recent interview: “We can make measurements but cannot say anything about the essence of reality.” [4]. Therefore, we interpret his position as also in line with previous opinions.
On the other hand, according to neuroscience, we experience reality through interactions between ourselves, other beings, and multiple information sources. The nervous system’s different sensory modalities (smell, vision, audition, taste, and touch) generate evoked responses to the many questions we ask of the universe. One could say that sensory responses give rise to the phenomena we experience, with the brain being the natural organ capable of representing the surrounding reality. According to this view, physical observables—the “nature things”—may emerge after this preliminary process. We call this stage “pre-physics”, referring to the preprocessing of the source data information performed, in principle, by our sensory systems and the brain.
With this initial work, we want to support and follow up on these statements by renowned scientists, namely that what we observe is nothing but information, by providing information-theoretic models for representing physical observables. In general, information-theoretic models are theories designed to describe a whole behavior or a specific situation, with the idea that they will eventually be able to predict that behavior. We want to provide information-theoretic models that explain the “pre-physics” stage from which everything may emerge.
Toward this goal, we introduce a novel mathematical framework that can serve as the “informational” foundation for representing physical objects. The proposed approach is based on the field of information geometry, using the Fisher information metric as a particular Riemannian metric defined in the parameter space of a smooth statistical manifold of normal probability distributions. Information geometry, developed by C.R. Rao by extending results from P.C. Mahalanobis and R.A. Fisher, began in 1945 [5], but the subject lay largely dormant until 1982 [6]. Since then, many works have highlighted its properties. The most relevant is invariance under transformations of the model and under changes in the model’s reference measure, which makes it appropriate for addressing the formal properties of the observation process.
In developing the approach, we obtain exciting results about the building blocks of information and how they can be combined to represent the most likely reality. In this way, the laws of physics may come up afterward through a set of linear and nonlinear transformations of the parameters of an informative manifold. The approach is novel because it strives to lay down informational foundations for physics, especially quantum physics. The results are interesting because they connect fields such as information geometry, quantum physics, estimation theory, and probability theory. We certainly recognize that the project is very ambitious, but we plan to continue this work in future publications, of which this article is the first.

2. Mathematical Framework

The plan of this section, which, for didactic purposes, we divide into six subsections, is the following. In Section 2.1, we outline the modeling of a single source and the derivation of the Fisher information and the Riemannian manifold. Section 2.2 is devoted to analyzing the stationary states in the Riemannian manifold. In Section 2.3, we present the solutions of the stationary states in our formalism. In Section 2.4, we compute the probability density function, the estimator’s variance, and the Cramér–Rao lower bound for a single source. An extension of this approach to m independent sources is performed in Section 2.5 to compute the global probability density function at the ground-state level. In Section 2.6, we use Bayes’ theorem to obtain the posterior probability density function.

2.1. Single Source, the Fisher Information, and the Riemannian Manifold

We start our mathematical description by modeling a single source with a univariate normal probability distribution $N(\mu, \sigma)$ with $\sigma = 1$ [7]. This is a well-known parametric statistical model whose parameter space may be identified with the real line, i.e., $\Theta = \mathbb{R}$. We can compute all the quantities relevant to our purpose. Some steps may seem obvious to those familiar with information geometry.
For a single sample, the normal density, its logarithm (the log-density), and the derivative (or gradient) of the log-density with respect to the parameter $\mu$, respectively, are
$$f(x;\mu) = \frac{1}{\sqrt{2\pi}}\, e^{-\frac{1}{2}(x-\mu)^2}, \tag{1}$$
$$\ln f = -\frac{1}{2}\ln(2\pi) - \frac{1}{2}(x-\mu)^2, \tag{2}$$
$$\frac{\partial \ln f}{\partial \mu} = x - \mu. \tag{3}$$
Equation (3) is also called the score function. We can calculate Fisher’s information [8] as the variance of the score function:
$$I(\mu) = E_\mu\!\left[\left(\frac{\partial \ln f}{\partial \mu}\right)^{\!2}\right] = E_\mu\!\left[(x-\mu)^2\right] = \int_{-\infty}^{\infty} (x-\mu)^2\,\frac{1}{\sqrt{2\pi}}\, e^{-\frac{1}{2}(x-\mu)^2}\, dx = 1. \tag{4}$$
The Fisher information measures how much information is available about an informative parameter, in this case, the parameter $\mu$. If the source generates $n$ samples, $\{x_1, \ldots, x_n\}$, drawn independently from the univariate normal probability distribution (1), the likelihood function, the log-likelihood, and the score function, respectively, are
$$f_n(x_1,\ldots,x_n;\mu) = \prod_{i=1}^{n} f(x_i;\mu) = \left(\frac{1}{2\pi}\right)^{\!n/2} \exp\!\left(-\frac{1}{2}\sum_{i=1}^{n}(x_i-\mu)^2\right), \tag{5}$$
$$\ln f_n = -\frac{n}{2}\ln(2\pi) - \frac{1}{2}\sum_{i=1}^{n}(x_i-\mu)^2, \tag{6}$$
$$\frac{\partial \ln f_n}{\partial \mu} = \sum_{i=1}^{n}(x_i-\mu) = n(\bar{x}-\mu). \tag{7}$$
Likewise, from Equation (7), we can calculate Fisher’s information for n samples as
$$I_n(\mu) = E_\mu\!\left[\left(\frac{\partial \ln f_n}{\partial \mu}\right)^{\!2}\right] = E_\mu\!\left[n^2(\bar{x}-\mu)^2\right] = \int_{-\infty}^{\infty} n^2(\bar{x}-\mu)^2 \left(\frac{n}{2\pi}\right)^{\!1/2} e^{-\frac{n}{2}(\bar{x}-\mu)^2}\, d\bar{x} = n. \tag{8}$$
In other words,
$$I_n(\mu) = n\, I(\mu), \tag{9}$$
which shows the well-known additive property of the Fisher information for independent samples.
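As a quick numerical illustration of Equations (7)–(9) (ours, not part of the derivation), the following Python sketch estimates the variance of the score by Monte Carlo; the true mean, sample size, number of trials, and seed are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(0)               # arbitrary seed
mu_true, n, trials = 1.3, 50, 100_000        # arbitrary illustration values

# Draw `trials` independent datasets of n samples each from N(mu_true, 1).
x = rng.normal(mu_true, 1.0, size=(trials, n))

# Score of each dataset, Equation (7): n * (x_bar - mu).
score = n * (x.mean(axis=1) - mu_true)

# The variance of the score estimates the Fisher information I_n(mu).
print(score.var())                           # close to n = 50, i.e., n * I(mu)
```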
We can now introduce Riemannian manifolds [9]. These are smooth manifolds equipped with Riemannian metrics (smoothly varying choices of inner products on tangent spaces), which allow us to measure geometric quantities such as distances and angles.
The Riemannian metric for a single source with $n$ samples, derived from the Fisher information (8), is a scalar whose covariant component, contravariant component, and determinant, respectively, are
$$I_n(\mu) = g_{11}(\mu) = n, \tag{10}$$
$$g^{11}(\mu) = \frac{1}{n}, \tag{11}$$
$$\det(g(\mu)) = n. \tag{12}$$

2.2. Stationary States in the Riemannian Manifold

To calculate the stationary states, we can invoke the time-independent non-relativistic Schrödinger equation [10] or the principle of minimum Fisher information [11]; the two approaches have been demonstrated to be equivalent [11]. The wave equation reads as follows:
$$-k\,\nabla^2\psi(\mu) + U(\mu)\,\psi(\mu) = \lambda\,\psi(\mu), \tag{13}$$
where $U(\mu)$ is the potential energy and $k, \lambda$ are constants $> 0$. The solution must also satisfy $\lim_{\mu\to-\infty}\psi(\mu) = \lim_{\mu\to+\infty}\psi(\mu) = 0$ and $\int_{-\infty}^{\infty}\psi^2(\mu)\,d\mu = 1$. For simplicity, we will write $\psi$ instead of $\psi(\mu)$. We can use the squared norm of the score function (7), taken with respect to the metric (11), as the potential energy, up to a constant factor:
$$\left\|\frac{\partial \ln f_n}{\partial \mu}\right\|^2 = n(\bar{x}-\mu)\; g^{11}(\mu)\; n(\bar{x}-\mu) \tag{14a}$$
$$= n(\bar{x}-\mu)\,\frac{1}{n}\, n(\bar{x}-\mu) \tag{14b}$$
$$= n(\bar{x}-\mu)^2. \tag{14c}$$
Other approaches to the potential energy lead to similar expressions; for example, considering $\ln f_n(x_1,\ldots,x_n;\bar{x}) - \ln f_n(x_1,\ldots,x_n;\mu) = \frac{n}{2}(\bar{x}-\mu)^2$, which is almost identical to Equation (14c). In this way, the potential energy can be written as $U(\mu) = nC(\bar{x}-\mu)^2$, where $C$ is a constant $> 0$. Equation (13) then reads as
$$-k\,\nabla^2\psi + nC(\bar{x}-\mu)^2\,\psi = \lambda\,\psi. \tag{15}$$
We compute the Laplacian in Equation (15) as
$$\nabla^2\psi = \frac{1}{\sqrt{|g(\mu)|}}\,\frac{\partial}{\partial\mu}\!\left(\sqrt{|g(\mu)|}\; g^{11}(\mu)\,\frac{\partial\psi}{\partial\mu}\right) = \frac{1}{\sqrt{n}}\,\frac{\partial}{\partial\mu}\!\left(\sqrt{n}\,\frac{1}{n}\,\frac{\partial\psi}{\partial\mu}\right) = \frac{1}{n}\,\frac{\partial^2\psi}{\partial\mu^2} = \frac{1}{n}\,\psi''. \tag{16}$$
Inserting Equation (16) into Equation (15), we obtain
$$-\frac{k}{n}\,\psi'' + nC(\bar{x}-\mu)^2\,\psi = \lambda\,\psi, \tag{17}$$
which is Schrödinger’s equation for the quantum harmonic oscillator [12].

2.3. Solutions of the Quantum Harmonic Oscillator in the Riemannian Manifold

Some steps now may seem obvious to those familiar with quantum mechanics. Considering that $\psi$ has the following form:
$$\psi(\mu) = \gamma\, e^{-\eta}, \quad \text{with } \gamma > 0 \text{ real and } \eta \text{ a function of } \mu, \tag{18}$$
Equation (17) results in
$$-\frac{k}{n}\left[\gamma e^{-\eta}(\eta')^2 - \gamma e^{-\eta}\eta''\right] + nC(\bar{x}-\mu)^2\,\gamma e^{-\eta} = \lambda\,\gamma e^{-\eta},$$
$$-\frac{k}{n}\left[(\eta')^2 - \eta''\right] + nC(\bar{x}-\mu)^2 = \lambda. \tag{19}$$
Assuming a solution for η ( μ ) with the form
$$\eta(\mu) = \xi\, n\,(\bar{x}-\mu)^2, \quad \text{with } \xi > 0, \tag{20}$$
inserting this expression into Equation (19) gives
$$-\frac{k}{n}\left[4\xi^2 n^2(\bar{x}-\mu)^2 - 2\xi n\right] + nC(\bar{x}-\mu)^2 = \lambda, \tag{21}$$
which implies that
$$4k\xi^2 n = nC, \qquad 2k\xi = \lambda. \tag{22}$$
In other words, the constants $k, C, \lambda, \xi$ cannot be chosen arbitrarily because they have to satisfy these equations. For example, we can choose $k = \frac{2}{n}$ and $C = \frac{1}{2n}$, which forces $\xi = \frac{1}{4}$ and $\lambda = \frac{1}{n}$. Therefore, we can write
$$-\frac{2}{n^2}\,\psi'' + \frac{1}{2}(\bar{x}-\mu)^2\,\psi = \frac{1}{n}\,\psi, \tag{23}$$
whose solution is given by
$$\psi(\mu) = \gamma\, e^{-\frac{1}{4}n(\bar{x}-\mu)^2}. \tag{24}$$
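Before normalizing, one can check Equation (24) against Equation (23) symbolically. The following sympy snippet (our verification, with $n$ and $\gamma$ kept symbolic) confirms that the residual vanishes identically.

```python
import sympy as sp

mu, x_bar = sp.symbols('mu x_bar', real=True)
gamma, n = sp.symbols('gamma n', positive=True)

# Candidate ground state, Equation (24).
psi = gamma * sp.exp(-n * (x_bar - mu) ** 2 / 4)

# Residual of Equation (23): -(2/n^2) psi'' + (1/2)(x_bar - mu)^2 psi - (1/n) psi.
residual = -2 / n**2 * sp.diff(psi, mu, 2) + (x_bar - mu) ** 2 / 2 * psi - psi / n
print(sp.simplify(residual))   # prints 0
```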
With this configuration, we compute the normalization constant γ :
$$1 = \int_{-\infty}^{\infty}\psi^2(\mu)\,d\mu = \int_{-\infty}^{\infty}\gamma^2 e^{-\frac{1}{2}n(\bar{x}-\mu)^2}\,d\mu = 2\gamma^2\int_0^{\infty} e^{-\frac{1}{2}n t^2}\,dt, \tag{25}$$
where we used a first change of variable $\bar{x}-\mu = t$. Now, using a second change of variable $\frac{1}{2}nt^2 = s$, so that $dt = (2n)^{-1/2}\, s^{-1/2}\,ds$, Equation (25) reads as
$$1 = 2\gamma^2 \int_0^{\infty} e^{-s}\,(2n)^{-1/2}\, s^{-1/2}\,ds = \gamma^2\sqrt{\frac{2}{n}}\int_0^{\infty} s^{-1/2} e^{-s}\,ds = \gamma^2\sqrt{\frac{2\pi}{n}}. \tag{26}$$
Isolating $\gamma$ from Equation (26), we obtain $\gamma = \left(\frac{n}{2\pi}\right)^{1/4}$. Therefore, Equation (24) reads as
$$\psi(\mu) = \left(\frac{n}{2\pi}\right)^{\!1/4} e^{-\frac{1}{4}n(\bar{x}-\mu)^2} = \psi_0(\mu), \tag{27}$$
which is the ground-state wave function of the quantum harmonic oscillator. The general solutions of the quantum harmonic oscillator involve Hermite polynomials, which were introduced elsewhere [13,14]. In this way, we can prove, after some tedious but straightforward computations, that the wave function
$$\psi_1(\mu) = \gamma_1\,(\bar{x}-\mu)\, e^{-\frac{1}{4}n(\bar{x}-\mu)^2}, \quad \text{with } \gamma_1 > 0, \tag{28}$$
is also a solution of
$$-\frac{2}{n^2}\,\psi_1'' + \frac{1}{2}(\bar{x}-\mu)^2\,\psi_1 = \lambda_1\,\psi_1, \tag{29}$$
where $\lambda_1 = \frac{3}{n}$ is the energy of the first excited state, and $\gamma_1 = \left(\frac{n^3}{2\pi}\right)^{1/4}$ is the normalization constant. With this representation, the $\lambda$’s (energy levels) are given by
$$\lambda_\nu = \frac{2}{n}\left(\nu + \frac{1}{2}\right) = E_\nu, \quad \text{with } \nu = 0, 1, \ldots \tag{30}$$
Looking closely at Equation (30), we appreciate that the energy levels depend on two numbers, $\nu$ and $n$. The ground state at $\nu = 0$ has a finite energy $E_0 = \frac{1}{n}$, which can be made arbitrarily close to zero by massive sampling.
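As a numerical cross-check of the spectrum (our illustration, not part of the derivation), the Python sketch below discretizes the Hamiltonian of Equation (23) with finite differences and compares its lowest eigenvalues with Equation (30); the sample size, grid resolution, and domain half-width are arbitrary choices.

```python
import numpy as np

n = 50                            # number of samples (arbitrary)
x_bar = 0.0                       # sample mean; it only shifts the well
N, L = 1200, 1.5                  # grid points and half-width (arbitrary)
mu = np.linspace(x_bar - L, x_bar + L, N)
h = mu[1] - mu[0]

# Discretize H = -(2/n^2) d^2/dmu^2 + (1/2)(x_bar - mu)^2 from Equation (23).
diag = (2.0 / n**2) * 2.0 / h**2 + 0.5 * (x_bar - mu) ** 2
off = -(2.0 / n**2) / h**2 * np.ones(N - 1)
H = np.diag(diag) + np.diag(off, 1) + np.diag(off, -1)

print(np.linalg.eigvalsh(H)[:3])                 # approx. [0.02, 0.06, 0.10]
print([2 / n * (nu + 0.5) for nu in range(3)])   # E_nu from Equation (30)
```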

2.4. Probability Density Function, Estimator’s Variance, and Cramér–Rao Lower Bound

Assuming that the square modulus of the wave function can be interpreted as the probability density function
$$|\psi(\mu)|^2 = \psi^*(\mu)\,\psi(\mu) = \rho(\mu), \tag{31}$$
we can compute the performance of estimators of $\mu$. For instance, we can calculate the variance of the estimator $\bar{x}$ at the ground state (27):
$$E_{\mu,\rho_0(\mu)}\!\left[(\bar{x}-\mu)^2\right] = \frac{1}{n}. \tag{32}$$
Likewise, we can compute the variance of the estimator x ¯ at the first excited state (28):
$$E_{\mu,\rho_1(\mu)}\!\left[(\bar{x}-\mu)^2\right] = \frac{3}{n}. \tag{33}$$
The estimator’s variance equals the quantum harmonic oscillator’s energy levels, i.e., the estimator’s variance is definitively quantized. Interestingly, the estimator’s variance at the ground state (32) equals the Cramér–Rao lower bound (CRLB) [5,15] on the variance of the estimator x ¯ :
$$E_\mu\!\left[(\bar{x}-\mu)^2\right] \geq \frac{1}{I_n^Q(\mu)} = \frac{1}{I_n(\mu)} = \frac{1}{n}, \tag{34}$$
where $I_n^Q(\mu)$ is the quantum Fisher information for $n$ samples, and $I_n(\mu)$ is the classical Fisher information for $n$ samples, as computed in Equation (8). The quantum Fisher information coincides with the classical Fisher information when, in a polar representation of the wave function $\psi(\mu) = \sqrt{\rho(\mu)}\,\exp\{i\alpha(\mu)\}$, the phase $\alpha(\mu) = 0$ [16], which is the case under study. To summarize the findings, we reach the minimum variance at the minimum energy level. It is worth highlighting that the above equations entail a novel relation between energy and information: the energy is minimal at the ground-state level and increases if we do not have enough information or lose information. Notably, the loss of information is quantized. This phenomenon deserves to be further explored in forthcoming communications.
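To make Equations (32)–(34) concrete, here is a small numerical check (our illustration; the wave functions follow Equations (27) and (28), and the values of $n$ and $\bar{x}$ are arbitrary) that integrates the estimator’s variance under $\rho_0$ and $\rho_1$ and compares it with the bound $1/n$.

```python
import numpy as np

n, x_bar = 50, 0.0                           # arbitrary illustration values
mu = np.linspace(x_bar - 2, x_bar + 2, 100_001)
dmu = mu[1] - mu[0]

# Ground state, Equation (27), and first excited state, Equation (28).
psi0 = (n / (2 * np.pi)) ** 0.25 * np.exp(-n * (x_bar - mu) ** 2 / 4)
psi1 = (n**3 / (2 * np.pi)) ** 0.25 * (x_bar - mu) * np.exp(-n * (x_bar - mu) ** 2 / 4)

for name, psi in (("rho_0", psi0), ("rho_1", psi1)):
    rho = psi**2                             # density, Equation (31)
    var = ((x_bar - mu) ** 2 * rho).sum() * dmu
    print(name, var)                         # 1/n = 0.02 and 3/n = 0.06

print("CRLB:", 1 / n)                        # bound (34), attained by rho_0
```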

2.5. m Independent Sources and Global Probability Density Function

With $m$ independent sources, each generating $n$ samples, a finite set of $m$ quantum harmonic oscillators may represent reality. Presuming independence of the sources of information, the “global” wave function (also called the collective wave function) factors as the product of single wave functions. We can write the global wave function as
$$\psi(\boldsymbol{\mu}) = \psi(\mu_1,\ldots,\mu_m) = \psi(\mu_1)\cdots\psi(\mu_m). \tag{35}$$
When the sources do not evolve independently, the collective wave function for an m-system cannot be written as the product of individual wave functions; in this case, we speak of entangled states.
Equation (35) describes a many-body system, and we may refer to the vector $\boldsymbol{\mu}$ as the $\mu$ field. For example, in the case of modeling two independent sources, the global wave function at the ground state will be the product of single wave functions, each of them at the ground state:
$$\begin{aligned}
\psi_0(\boldsymbol{\mu}) &= \psi_0(\mu_1)\,\psi_0(\mu_2) \\
&= \left(\frac{n}{2\pi}\right)^{\!1/4} e^{-\frac{1}{4}n(\bar{x}_1-\mu_1)^2}\,\left(\frac{n}{2\pi}\right)^{\!1/4} e^{-\frac{1}{4}n(\bar{x}_2-\mu_2)^2} \\
&= \left(\frac{n}{2\pi}\right)^{\!1/2} \exp\!\left\{-\frac{1}{4}\left[n(\bar{x}_1-\mu_1)^2 + n(\bar{x}_2-\mu_2)^2\right]\right\} \\
&= \left(\frac{n}{2\pi}\right)^{\!1/2} \exp\!\left\{-\frac{1}{4}\,n\,(\bar{x}_1-\mu_1,\ \bar{x}_2-\mu_2)\begin{pmatrix}\bar{x}_1-\mu_1\\ \bar{x}_2-\mu_2\end{pmatrix}\right\}.
\end{aligned} \tag{36}$$
We can generalize Equation (36) to $m$ independent sources. The global wave function reads as
$$\psi_0(\boldsymbol{\mu}) = \left(\frac{n}{2\pi}\right)^{\!m/4} \exp\!\left\{-\frac{1}{4}n\,(\bar{\mathbf{x}}-\boldsymbol{\mu})^T(\bar{\mathbf{x}}-\boldsymbol{\mu})\right\}. \tag{37}$$
Using Equation (31), the global probability density function is
$$\rho_0(\boldsymbol{\mu}) = \left(\frac{n}{2\pi}\right)^{\!m/2} \exp\!\left\{-\frac{1}{2}n\,(\bar{\mathbf{x}}-\boldsymbol{\mu})^T(\bar{\mathbf{x}}-\boldsymbol{\mu})\right\}, \tag{38}$$
which “essentially” results from Schrödinger’s equation (or the principle of minimum Fisher’s information) on the Riemannian manifold assuming independence of sources.
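Equation (38) is simply a multivariate normal density centered at the sample means with covariance $\frac{1}{n}I_m$. The short Python sketch below (our illustration; $m$, $n$, the sample means, and the evaluation point are arbitrary) checks the factorization numerically.

```python
import numpy as np

m, n = 3, 50                              # arbitrary illustration values
x_bar = np.array([0.2, -1.0, 0.7])        # hypothetical sample means
mu = np.array([0.25, -0.9, 0.6])          # arbitrary evaluation point

# Global ground-state density, Equation (38).
rho0 = (n / (2 * np.pi)) ** (m / 2) * np.exp(-0.5 * n * (x_bar - mu) @ (x_bar - mu))

# Product of the m univariate ground-state densities psi_0^2, Equation (27).
rho_prod = np.prod(np.sqrt(n / (2 * np.pi)) * np.exp(-0.5 * n * (x_bar - mu) ** 2))

print(rho0, rho_prod)                     # equal: the collective mode factorizes
```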

2.6. Bayesian Framework and Posterior Probability Density Function

In parallel, we can compute, using Bayes’ theorem [17], the posterior probability density function from the $m$ sources of information for all data values, taking the Riemannian volume of the metric as a prior probability distribution. As proven elsewhere, the Riemannian volume can be considered an objective choice for a prior probability distribution [18]. In honor of its author, this measure is called Jeffreys’ prior probability distribution on the informative parameters.
To start with, we can model $m$ independent sources with a multivariate normal probability distribution $N_m(\boldsymbol{\mu}, \Sigma)$ with $\Sigma = I_m$. Some developments may now seem trivial to those familiar with probability theory. If every source generates $n$ samples drawn independently from the multivariate normal probability distribution, the likelihood probability distribution is
$$\begin{aligned}
f_{m,n}(\mathbf{x}_1,\ldots,\mathbf{x}_n;\boldsymbol{\mu}) &= \prod_{i=1}^{n} \frac{1}{(2\pi)^{m/2}\left(\det(I_m)\right)^{1/2}} \exp\!\left\{-\frac{1}{2}(\mathbf{x}_i-\boldsymbol{\mu})^T I_m^{-1}(\mathbf{x}_i-\boldsymbol{\mu})\right\} \\
&= (2\pi)^{-\frac{mn}{2}} \exp\!\left\{-\frac{1}{2}\sum_{i=1}^{n}(\mathbf{x}_i-\boldsymbol{\mu})^T(\mathbf{x}_i-\boldsymbol{\mu})\right\} \\
&= (2\pi)^{-\frac{mn}{2}} \exp\!\left\{-\frac{1}{2}\,\mathrm{Tr}\!\left[\sum_{i=1}^{n}(\mathbf{x}_i-\boldsymbol{\mu})(\mathbf{x}_i-\boldsymbol{\mu})^T\right]\right\}.
\end{aligned} \tag{39}$$
The summation term within the trace operator can be decomposed into two terms:
$$\sum_{i=1}^{n}(\mathbf{x}_i-\boldsymbol{\mu})(\mathbf{x}_i-\boldsymbol{\mu})^T = \sum_{i=1}^{n}(\mathbf{x}_i-\bar{\mathbf{x}})(\mathbf{x}_i-\bar{\mathbf{x}})^T + n(\bar{\mathbf{x}}-\boldsymbol{\mu})(\bar{\mathbf{x}}-\boldsymbol{\mu})^T = nS + n(\bar{\mathbf{x}}-\boldsymbol{\mu})(\bar{\mathbf{x}}-\boldsymbol{\mu})^T, \tag{40}$$
where $S$ is the sample covariance matrix, a constant symmetric matrix. Inserting Equation (40) into Equation (39) gives
$$f_{m,n}(\mathbf{x}_1,\ldots,\mathbf{x}_n;\boldsymbol{\mu}) = (2\pi)^{-\frac{mn}{2}} \exp\!\left\{-\frac{1}{2}n\,\mathrm{Tr}(S)\right\} \exp\!\left\{-\frac{1}{2}n\,(\bar{\mathbf{x}}-\boldsymbol{\mu})^T(\bar{\mathbf{x}}-\boldsymbol{\mu})\right\}. \tag{41}$$
To compute the posterior probability distribution, we choose a prior probability distribution proportional to the Riemannian volume:
$$\sqrt{\det g(\boldsymbol{\mu})} = n^{\frac{m}{2}}. \tag{42}$$
Thus, $p(\boldsymbol{\mu}) \propto n^{\frac{m}{2}}$, a constant. The posterior probability distribution will be proportional to the likelihood probability distribution (41) multiplied by the prior probability distribution (42):
$$\begin{aligned}
f_{m,n}(\boldsymbol{\mu};\mathbf{x}_1,\ldots,\mathbf{x}_n) &\propto f_{m,n}(\mathbf{x}_1,\ldots,\mathbf{x}_n;\boldsymbol{\mu})\; p(\boldsymbol{\mu}) \\
&\propto \underbrace{(2\pi)^{-\frac{mn}{2}} \exp\!\left\{-\frac{1}{2}n\,\mathrm{Tr}(S)\right\} n^{\frac{m}{2}}}_{\text{constant term}}\;\underbrace{\exp\!\left\{-\frac{1}{2}n\,(\bar{\mathbf{x}}-\boldsymbol{\mu})^T(\bar{\mathbf{x}}-\boldsymbol{\mu})\right\}}_{\boldsymbol{\mu}\text{-dependent term}} \\
&\propto \exp\!\left\{-\frac{1}{2}n\,(\bar{\mathbf{x}}-\boldsymbol{\mu})^T(\bar{\mathbf{x}}-\boldsymbol{\mu})\right\}.
\end{aligned} \tag{43}$$
We need to normalize Equation (43) so that it becomes a probability density function. The normalization integral is
$$\begin{aligned}
\int_{\mathbb{R}^m} \exp\!\left\{-\frac{1}{2}n(\bar{\mathbf{x}}-\boldsymbol{\mu})^T(\bar{\mathbf{x}}-\boldsymbol{\mu})\right\} d\mu_1\cdots d\mu_m
&= \int_{\mathbb{R}^m} \prod_{i=1}^{m} \exp\!\left\{-\frac{1}{2}n(\bar{x}_i-\mu_i)^2\right\} d\mu_1\cdots d\mu_m \\
&= \left[\int_{-\infty}^{\infty} \exp\!\left\{-\frac{1}{2}n(\bar{x}_i-\mu_i)^2\right\} d\mu_i\right]^m \\
&= \left[2\int_{\bar{x}_i}^{\infty} \exp\!\left\{-\frac{1}{2}n(\bar{x}_i-\mu_i)^2\right\} d\mu_i\right]^m,
\end{aligned} \tag{44}$$
where the integral over $\mu_i$ is doubled by symmetry around $\bar{x}_i$. Using the change of variable $t = \frac{1}{2}n(\mu_i-\bar{x}_i)^2$, with $\mu_i = \bar{x}_i + \sqrt{2t/n}$ and $d\mu_i = (2n)^{-1/2}\, t^{-1/2}\, dt$, Equation (44) reads as
$$\left[2\int_{0}^{\infty} e^{-t}\,(2n)^{-1/2}\, t^{-1/2}\, dt\right]^m = \left[\sqrt{\frac{2}{n}}\,\Gamma\!\left(\frac{1}{2}\right)\right]^m = \left(\frac{2\pi}{n}\right)^{\!\frac{m}{2}}. \tag{45}$$
Thus, the normalization constant is $\left(\frac{n}{2\pi}\right)^{\frac{m}{2}}$, and the posterior probability density function (43) is
$$f_{m,n}(\boldsymbol{\mu};\mathbf{x}_1,\ldots,\mathbf{x}_n) = \left(\frac{n}{2\pi}\right)^{\!\frac{m}{2}} \exp\!\left\{-\frac{1}{2}n\,(\bar{\mathbf{x}}-\boldsymbol{\mu})^T(\bar{\mathbf{x}}-\boldsymbol{\mu})\right\}, \tag{46}$$
which is the global probability density function of the global wave function at the ground state (38). Precisely, the global probability density function of $m$ quantum harmonic oscillators at the ground state corresponds to the posterior probability density function calculated using Bayes’ theorem from $m$ sources of information for all data values. Interestingly, the converse also holds, since the prior (42) is constant. In other words, we can also say that the likelihood probability distribution from $m$ sources of information for all data values equals the posterior probability density function calculated using Bayes’ theorem from $m$ quantum harmonic oscillators at the ground state. This unexpected and exciting result reveals a plausible relationship between energy and Bayes’ theorem.
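As a final sanity check (our illustration, with arbitrary data and sample size), the Python sketch below builds the posterior on a grid from the likelihood and a constant (Jeffreys) prior for a single coordinate of $\boldsymbol{\mu}$ and compares it with the ground-state density (38); by independence, the $m$-dimensional case factorizes into $m$ such one-dimensional checks.

```python
import numpy as np

rng = np.random.default_rng(1)              # arbitrary seed
n = 50
x = rng.normal(0.5, 1.0, size=n)            # n samples from N(mu_true, 1)
x_bar = x.mean()

mu = np.linspace(x_bar - 1.0, x_bar + 1.0, 20_001)
dmu = mu[1] - mu[0]

# Posterior on the grid: likelihood times a constant (Jeffreys) prior.
loglik = -0.5 * ((x[:, None] - mu[None, :]) ** 2).sum(axis=0)
post = np.exp(loglik - loglik.max())
post /= post.sum() * dmu                    # normalize numerically

# Ground-state density rho_0 for m = 1, Equation (38).
rho0 = np.sqrt(n / (2 * np.pi)) * np.exp(-0.5 * n * (x_bar - mu) ** 2)

print(np.abs(post - rho0).max())            # ~ 0 up to grid error
```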

3. Discussion

Over the past century, renowned scientists have proposed that what we observe is nothing but information. Among many scientists’ opinions, we find that what we actually model is the result of observation, not the world itself (N. Bohr); the laws of nature do not deal with the particles themselves but with our knowledge of elementary particles (W. Heisenberg); all things physical are information-theoretic in origin (J.A. Wheeler); and the world is not broken up into physical but informational parts (A. Zeilinger). From a pure neuroscientific perspective, we can fairly say that we experience reality depending on the central and peripheral nervous system interactions with multiple information sources. In this way, the physical observables—the “nature things”—may emerge after pre-processing the source data information performed by our sensory systems and the brain, a stage we call “pre-physics”, with the laws of physics appearing afterward.
With this initial work, we wanted to provide information-theoretic models to explain this generative process. We developed a novel mathematical framework for representing physical objects. Our approach is based on information geometry and uses the Fisher information metric as a Riemannian metric, defined on a smooth statistical manifold of normal probability distributions. We assumed that a single information source generating $n$ samples can be modeled as a univariate normal probability distribution $N(\mu, \sigma)$, setting $\sigma = 1$ for simplicity. With this modeling, the Fisher information equals $n$, the number of samples. We also modeled $m$ information sources with a multivariate normal probability distribution. This is, of course, an approximation, and we may explore other probability distributions.
Investigating the stationary states by invoking Schrödinger’s equation or the principle of minimum Fisher information, we discovered that information can be represented and distributed over a set of quantum harmonic oscillators, one for each independent source, whose coordinate is the $\mu$ parameter of the statistical manifold to estimate, under the assumptions of our modeling. We observed that the estimator’s variance equals the energy levels of the quantum harmonic oscillator, proving that the estimator’s variance is definitively quantized. The minimum estimator’s variance occurs at the minimum energy level of the oscillator and equals $\frac{1}{n}$, where $n$ is the number of samples. We also demonstrated that quantum harmonic oscillators reach the Cramér–Rao lower bound on the estimator’s variance at the lowest energy level, which indicates a relationship between energy and information that deserves to be studied in detail in future work.
We then showed that the global probability density function of the collective mode of a set of m quantum harmonic oscillators at the lowest energy level, calculated as the square modulus of the global wave function at the ground state, equals the posterior probability distribution calculated using Bayes’ theorem from the m sources of information for all data values, taking as a prior the Riemannian volume of the informative metric. Interestingly, the opposite is also true, as the prior is constant. These results suggest that the m sources of information could be broken into m little elements, in particular, m quantum harmonic oscillators, with the square modulus of the collective mode at the lowest energy representing the most likely reality, which supports A. Zeilinger’s statement that the world is not broken into physical but informational parts. In addition, these findings reveal a very interesting connection between energy and Bayes’ theorem to be explored in the future.
Apart from the theoretical results reported here to be further explored, namely the relation between energy and information and the connection between energy and Bayes’ theorem, this mathematical framework offers multiple other avenues that we are currently exploring: for example, the informational representation on statistical manifolds with $\sigma$ unknown, and the representation of multiple “dependent” sources. Moreover, in forthcoming studies, we will strive to generalize this approach by exploring other statistical manifolds and to depict how physical observables such as space and time may emerge from linear and nonlinear transformations of the parameters of a specific statistical manifold. In this way, the laws of physics, including time’s arrow, would appear afterward.
Indeed, we can also speculate about the possibility of physical entities being represented this way, for example, in our nervous system. This complex system, consisting of several components, including water molecules, cell types, and conduction systems, could harbor such a representation of information. However, we will not discuss these topics in this work; we leave them for internal conversations, in which we extensively discuss how we may sample and perceive reality, which is possibly the true motivation for initiating and developing this research project.

Author Contributions

Conceptualization, D.B.-C. and J.M.O.; writing—original draft preparation, D.B.-C.; writing—review and editing, D.B.-C. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

The study did not require ethical approval.

Data Availability Statement

No new data were created.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Petersen, A. The Philosophy of Niels Bohr. Bull. At. Sci. 1963, 19, 8–14.
  2. Heisenberg, W. The Representation of Nature in Contemporary Physics. Daedalus 1958, 87, 95–108.
  3. Wheeler, J. Information, Physics, Quantum: The Search for Links. In Proceedings of the III International Symposium on Foundations of Quantum Mechanics, Tokyo, Japan, 28–31 August 1989; pp. 354–358.
  4. Ansede, M. Nobel Winner Anton Zeilinger: Physicists Can Make Measurements, but Cannot Say Anything about the Essence of Reality. El País, 2023. Available online: https://english.elpais.com/science-tech/2023-06-14/nobel-winner-anton-zeilinger-physicists-can-make-measurements-but-cannot-say-anything-about-the-essence-of-reality.html (accessed on 1 July 2023).
  5. Rao, C.R. Information and Accuracy Attainable in Estimation of Statistical Parameters. Bull. Calcutta Math. Soc. 1945, 37, 81–91.
  6. Burbea, J.; Rao, C.R. Entropy Differential Metric, Distance and Divergence Measures in Probability Spaces: A Unified Approach. J. Multivar. Anal. 1982, 12, 575–596.
  7. de Moivre, A. Approximatio ad Summam Terminorum Binomii (a+b)^n in Seriem Expansi; Self-published pamphlet, 1733; pp. 1–7.
  8. Fisher, R.A. On the Mathematical Foundations of Theoretical Statistics. Philos. Trans. R. Soc. Lond. Ser. A 1922, 222, 309–368.
  9. Riemann, B. Über die Hypothesen, welche der Geometrie zu Grunde liegen (Mitgetheilt durch R. Dedekind), 1868. Available online: https://eudml.org/doc/135760 (accessed on 15 July 2023).
 10. Schrödinger, E. An Undulatory Theory of the Mechanics of Atoms and Molecules. Phys. Rev. 1926, 28, 1049–1070.
 11. Frieden, B.R. Fisher Information as the Basis for the Schrödinger Wave Equation. Am. J. Phys. 1989, 57, 1004–1008.
 12. Schrödinger, E. Quantisierung als Eigenwertproblem. Ann. Phys. 1926, 79, 489–527.
 13. Laplace, P.S. Mémoire sur les intégrales définies et leur application aux probabilités, et spécialement à la recherche du milieu qu’il faut choisir entre les résultats des observations. In Mémoires de la Classe des Sciences Mathématiques et Physiques de l’Institut Impérial de France; Institut Impérial de France: Paris, France, 1811; pp. 297–347.
 14. Hermite, C. Sur un nouveau développement en série de fonctions; Académie des Sciences: Paris, France, 1864.
 15. Cramér, H. Mathematical Methods of Statistics; Princeton University Press: Princeton, NJ, USA, 1946.
 16. Facchi, P.; Kulkarni, R.; Man’ko, V.; Marmo, G.; Sudarshan, E.; Ventriglia, F. Classical and Quantum Fisher Information in the Geometrical Formulation of Quantum Mechanics. Phys. Lett. A 2010, 374, 4801–4803.
 17. Bayes, T. An Essay towards Solving a Problem in the Doctrine of Chances. Phil. Trans. R. Soc. Lond. 1763, 53, 370–418.
 18. Jeffreys, H. An Invariant Form for the Prior Probability in Estimation Problems. Proc. R. Soc. Lond. Ser. A Math. Phys. Sci. 1946, 186, 453–461.