Fisher and Shannon Functionals for Hyperbolic Diffusion

The complexity measure for the space-time distribution of a finite-velocity diffusion process is calculated. Numerical results are presented for Fisher's information, Shannon's entropy, and the Cramér–Rao inequality, all associated with a positively normalized solution to the telegrapher's equation. In the framework of hyperbolic diffusion, the non-local Fisher's information with respect to the x-parameter is related to the local Fisher's information with respect to the t-parameter. A perturbation theory is presented for calculating Shannon's entropy of the telegrapher's equation at long times, together with a toy model describing the system as an attenuated wave in the ballistic regime (short times).

The study of the propagation of thermal waves is also a fundamental theoretical and experimental subject [23,24], where the temperature profile ψ(x, t) > 0 can be described by the TE:

∂_t² ψ(x, t) + (1/τ) ∂_t ψ(x, t) = v² ∂_x² ψ(x, t);  (1)

thus, the wave packet propagates at the velocity v and is attenuated at a rate of τ⁻¹.
The same TE serves as the starting point for studying electromagnetic field transport in waveguides, where Ohm's law plays a fundamental role in describing the conducting media and characterizing the dissipative parameter τ⁻¹ [1,2]. In this case, an electromagnetic dissipative wave is the solution to (1), but it is not necessary to impose positivity and normalization on it. It is interesting to note that Equation (1) has two extreme cases:

• The wave limit. Taking τ → ∞, we recover the wave equation:

∂_t² ψ(x, t) = v² ∂_x² ψ(x, t).

Its solution can then be considered a wave packet that moves either to the right or to the left, represented as ψ_WE(x ± vt), without changing its form throughout the whole domain.

• The diffusion limit. Taking τ → 0 and v → ∞, such that τv² → D, we recover the diffusion equation:

(∂_t − D ∂_x²) ψ_W(x, t) = 0,

where its solution is given by the Wiener process:

ψ_W(x, t) = exp(−x²/4Dt)/√(4πDt).

In the following sections, we will present a Shannon entropic analysis [25] for the TE. Fisher's information [26] is naturally linked to variations of entropy; therefore, it can be related to the control of disorder in a transport process. Fisher's measure of indeterminacy has several natural and important applications in the design of codes and protocols, biophysics transport, machine learning, etc. In fact, this measure tells us how much information one (input) parameter carries about another (output) value, and so Fisher's information is widely used in optimal experimental design in many areas of research [27,28]. In the next sections, the analysis of the Shannon and Fisher functionals will be presented for the finite-velocity diffusion process (hyperbolic diffusion).
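The diffusion limit can be illustrated numerically. A persistent random walk (a particle moving at ±v whose velocity flips at rate 1/(2τ)) has a density obeying the TE, and for t ≫ τ its second moment approaches the Wiener value 2Dt with D = τv². The following sketch is our own illustration, with arbitrary parameter values not taken from the paper; it checks a Monte Carlo estimate against the known second cumulant of the telegraph process, K₂(t) = 2Dt − 2Dτ(1 − e^{−t/τ}):

```python
import numpy as np

rng = np.random.default_rng(0)

# Persistent random walk: velocity +/- v, flipping at rate 1/(2*tau).
tau, v = 0.05, np.sqrt(1.0 / 0.05)   # chosen so that D = tau * v**2 = 1
D = tau * v**2
T, dt, n = 5.0, 0.001, 20000         # final time, time step, number of walkers

x = np.zeros(n)
vel = v * rng.choice([-1.0, 1.0], size=n)
flip_p = dt / (2.0 * tau)            # flip probability per time step
for _ in range(int(T / dt)):
    x += vel * dt
    vel[rng.random(n) < flip_p] *= -1.0

# Known second cumulant of the telegraph process; for t >> tau it tends to 2*D*t.
K2_exact = 2 * D * T - 2 * D * tau * (1 - np.exp(-T / tau))
K2_mc = x.var()
print(K2_mc, K2_exact)
```

Since here T/τ = 100, the walkers are deep in the diffusive regime and the variance is essentially the Wiener value 2DT.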
The paper is organized as follows: In Section 2, we present hyperbolic diffusion as a novel integro-differential equation for a non-Markov diffusion process. In Section 3, we present, for the first time to our knowledge, the Fisher and Shannon functionals for the hyperbolic diffusion problem; in the Markovian limit, the Fisher and Shannon functionals for Wiener's diffusion are recovered. In Section 4, we calculate Shannon's entropy, Fisher's information, the complexity measure, and the Cramér–Rao bound. All of these novel results are obtained by solving the TE (1) numerically. In Section 5, we present conclusions on the present approach, as well as its future extensions and applications. In addition, Appendices A–C are used to present the algebra necessary to prove the diffusion convergence, introduce a novel toy model for short times, and develop a perturbation theory for the calculation of Shannon's entropy at long times.

Hyperbolic Diffusion
In the following, we are concerned with the hyperbolic diffusion problem. Therefore, in the present work, we are interested in solutions to (1) under the following conditions:

ψ(x, t) ≥ 0,  ∫ dx ψ(x, t) = 1,  (2)

and we look for a solution that fulfills the particular initial conditions:

ψ(x, t)|_{t=0} = δ(x),  ∂_t ψ(x, t)|_{t=0} = 0,  (3)

where δ(x) is the Dirac delta function. An analytic solution can straightforwardly be obtained by working with Equation (1) in the Fourier–Laplace representation; see Appendix A.
Other initial conditions, such as a "bullet packet", can also be worked out, but will not be considered in the present work.
It is interesting to note that the TE can also be written as an integral operator:

∂_t ψ(x, t) = v² ∫₀ᵗ e^{−(t−t′)/τ} ∂_x² ψ(x, t′) dt′.  (4)

We note that, from this equation, we cannot use a bullet initial condition, because a moving packet ψ(x, t)|_{t=0} = δ(x − vt)|_{t=0} requires a non-vanishing initial time derivative, while (4) imposes ∂_t ψ(x, t)|_{t=0} = 0. Equation (4) demonstrates that the solution to the hyperbolic diffusion can be thought of as a non-Markov diffusion process (see chap. 7 in [29]).
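The equivalence between the memory-kernel form (4) and the TE can be checked numerically by introducing the auxiliary field φ(x, t) = v² ∫₀ᵗ e^{−(t−t′)/τ} ∂_x² ψ(x, t′) dt′, which converts (4) into the local system ∂_t ψ = φ, ∂_t φ = −φ/τ + v² ∂_x² ψ. The sketch below is our own illustration (grid sizes and parameter values are arbitrary): it evolves this system from a narrow Gaussian and compares the second moment with the exact result ⟨x²⟩(t) = σ² + 2Dt − 2Dτ(1 − e^{−t/τ}), D = τv².

```python
import numpy as np

# Semi-implicit integration of the auxiliary-field form of the memory kernel (4):
#   d(psi)/dt = phi,   d(phi)/dt = -phi/tau + v^2 * d2(psi)/dx2
tau, v, sigma = 1.0, 1.0, 0.2
D = tau * v**2
L_box, nx = 16.0, 320
dx = L_box / nx
x = (np.arange(nx) - nx // 2) * dx
dt, T = 0.01, 3.0

psi = np.exp(-x**2 / (2 * sigma**2)) / np.sqrt(2 * np.pi * sigma**2)
psi /= psi.sum() * dx                 # enforce discrete normalization
phi = np.zeros(nx)                    # zero initial time derivative, as in (3)

def lap(f):                           # periodic second difference
    return (np.roll(f, 1) - 2 * f + np.roll(f, -1)) / dx**2

for _ in range(int(T / dt)):
    phi += dt * (v**2 * lap(psi) - phi / tau)
    psi += dt * phi                   # update psi with the new phi (semi-implicit)

m2 = np.sum(x**2 * psi) * dx          # numerical second moment
m2_exact = sigma**2 + 2 * D * T - 2 * D * tau * (1 - np.exp(-T / tau))
norm = psi.sum() * dx
print(norm, m2, m2_exact)
```

Because the Laplacian sums to zero on a periodic grid and φ starts with zero mean, the scheme conserves the normalization of ψ exactly, which mirrors condition (2).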

Fisher and Shannon Functionals
The Shannon entropy can be defined as follows:

S(t) = −∫ dx ψ(x, t) ln[∆ ψ(x, t)];  (5)

in this formula, we use normalization:

∫ dx ψ(x, t) = 1,

and explicitly write the small space parameter ∆, which is necessary for the correct definition of Shannon's entropy for any distribution, making the argument of the logarithm dimensionless ([dx ψ(x, t)] = dimensionless). Thus, S(t) − ln(1/∆) ≡ ∆S(t) can be interpreted as a relative entropy with respect to a sharp initial condition at the origin.
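As a concrete illustration (our own sketch, not the paper's code, with arbitrary values of σ and ∆), the ∆-regularized entropy can be computed on a grid; for a Gaussian of width σ, the combination S + ln ∆ recovers the familiar differential entropy ln√(2πeσ²), independently of the choice of ∆:

```python
import numpy as np

# Delta-regularized Shannon entropy S = -sum psi * ln(Delta * psi) * dx on a grid.
sigma, Delta = 0.7, 1e-3            # Gaussian width and small space parameter
x = np.linspace(-8 * sigma, 8 * sigma, 4001)
dx = x[1] - x[0]
psi = np.exp(-x**2 / (2 * sigma**2)) / np.sqrt(2 * np.pi * sigma**2)

S = -np.sum(psi * np.log(Delta * psi)) * dx
S_diff = S + np.log(Delta)          # remove the Delta-dependent offset
S_exact = np.log(np.sqrt(2 * np.pi * np.e * sigma**2))
print(S_diff, S_exact)
```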

Wiener Diffusion Case
Here, we show a relation between the Shannon and Fisher information functionals for the diffusion process. Applying the time derivative to the definition of Shannon's entropy of the Wiener process, denoted as S_W(t), we obtain:

dS_W(t)/dt = −∫ dx ∂_t ψ_W(x, t) ln ψ_W(x, t),  (6)

where we use the normalization of ψ_W(x, t). For the Wiener process, the distribution fulfills the diffusion equation, ∂_t ψ_W(x, t) = D ∂_x² ψ_W(x, t); then, replacing this operator in (6), we obtain the following expression:

dS_W(t)/dt = −D ∫ dx ∂_x² ψ_W(x, t) ln ψ_W(x, t).

Integrating by parts on the right-hand side (RHS) and using [ln ψ_W(x, t) ∂_x ψ_W(x, t)]_{x→±∞} → 0, we obtain the following:

dS_W(t)/dt = D ∫ dx [∂_x ψ_W(x, t)]² / ψ_W(x, t).  (7)

The RHS expression in (7) is proportional to Fisher's information:

I(t) = ⟨[∂_x ln ψ(x, t)]²⟩ = ∫ dx [∂_x ψ(x, t)]² / ψ(x, t);  (8)

in this case, Fisher's information concerns the x-parameter, and the symbol ⟨···⟩ represents the mean value over the PDF ψ(x, t). Then, we obtain the following:

dS_W(t)/dt = D I_W(t).  (9)

Using the Shannon entropy of the Wiener process, S_W(t) = ln√(4πeDt), in (9), we arrive at dS_W(t)/dt = 1/(2t); thus, Fisher's information for the Wiener process is as follows:

I_W(t) = 1/(2Dt).  (10)

Then, we write a closed expression connecting Shannon's entropy and Fisher's information for the Wiener process: I_W(t) e^{2S_W(t)} = 2πe. After a little algebra, we can write the following:

C_W(t) = (1/2πe) I_W(t) e^{2S_W(t)} = 1,  (11)

where C_W(t) is the complexity measure. This complexity combines Shannon's entropy and Fisher's information for Brownian motion in a simple manner. In general, the complexity measure is defined as follows [27,28]:

C(t) = (1/2πe) I(t) e^{2S(t)}.  (12)

In addition, it is simple to see from (10) that the Cramér–Rao bound [30,31] is fulfilled for the Wiener process. Indeed, using I_W(t) = 1/⟨x(t)²⟩ and taking into account the dispersion ⟨x(t)²⟩ = ∆x(t)², we can write the following bound:

I_W(t) ∆x(t)² = 1,  (13)

and so we propose a generic Cramér–Rao lower bound to be used in the TE:

I(t) ∆x(t)² ≥ 1.  (14)

The Cramér–Rao inequality is a fundamental tool in information analysis and is widely used in optimal experimental design. Due to the reciprocity of estimator variance and Fisher's information, minimizing variance corresponds to maximizing information.
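These closed-form Wiener relations are easy to verify on a grid. The sketch below is our own check (the values of D and t are arbitrary): it computes S_W, I_W, and C_W numerically for the Wiener Gaussian and compares them with ln√(4πeDt), 1/(2Dt), and 1.

```python
import numpy as np

# Numerical check of the Wiener-process identities on a spatial grid.
D, t = 0.5, 2.0
var = 2 * D * t                        # second moment of the Wiener packet
x = np.linspace(-10 * np.sqrt(var), 10 * np.sqrt(var), 4001)
dx = x[1] - x[0]
psi = np.exp(-x**2 / (2 * var)) / np.sqrt(2 * np.pi * var)

S = -np.sum(psi * np.log(psi)) * dx                   # Shannon entropy
dpsi = np.gradient(psi, dx)
I = np.sum(dpsi**2 / psi) * dx                        # Fisher information
C = I * np.exp(2 * S) / (2 * np.pi * np.e)            # complexity measure

print(S, np.log(np.sqrt(4 * np.pi * np.e * D * t)))   # entropy vs closed form
print(I, 1 / (2 * D * t))                             # Fisher vs 1/(2 D t)
print(C)                                              # should be close to 1
```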
In Appendix B, we present the general definition of the θ-parameter Fisher's information functional.

Hyperbolic Diffusion Case
Here, we present a relationship between the Shannon and Fisher functionals for the solution to the TE. We will show that there is a non-local connection between Shannon's entropy and Fisher's information. This can be seen by considering that the TE also comes from a non-Markov operator. Using (4) in the formula for the time derivative of S_TE(t), we obtain the following:

dS_TE(t)/dt = −∫ dx ∂_t ψ(x, t) ln ψ(x, t) = v² ∫₀ᵗ dt′ e^{−(t−t′)/τ} ∫ dx ∂_x ψ(x, t′) ∂_x ln ψ(x, t);  (15)

the last line is obtained by integration by parts, and it shows that dS_TE(t)/dt is connected to the non-local Fisher's information:

I(t′, t) = ∫ dx ∂_x ψ(x, t′) ∂_x ln ψ(x, t).

A connection between the non-local x-parameter Fisher's information and the t-parameter Fisher's information is demonstrated in Appendix B, as shown in (A18).
As can be seen from (15), only in the limit τ → 0 (with τv² → D) do we recover a local relation, obtaining (9). The case τ → ∞ must be taken with care because a wave solution ψ_WE(x − vt) intrinsically represents a bullet-like initial condition. Therefore, the surface terms that come from integration by parts cannot be taken as zero. In addition, as time goes on, for a wave packet that moves without deformation, there must not be an increase in disorder or loss of information. To make this analysis, we can take, asymptotically, the limit t/τ ≪ 1; then, from (15), we obtain a first-order approximation showing that dS_TE(t)/dt is small. Therefore, in the limit t/τ ≪ 1, we can approximate S_TE(t) as constant in the ballistic regime. See Appendix A.

Estimation Theory
In many experimental situations, a quantity of interest cannot be measured directly; it can only be inferred from a sample of data. This situation is covered by inference theory, sometimes referred to as the estimation approach [32,33]. For a sample of independent and identically distributed (i.i.d.) random variables, the Fisher measure provides an invaluable tool for analyzing this problem, particularly through the Cramér–Rao lower bound. For stochastic processes, the situation is much more involved. In particular, in the diffusion case, where the increments of the Wiener process are i.i.d. random variables, the analysis is quite accessible, while in hyperbolic diffusion, where there are strong correlations, it is widely unexplored.
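For the i.i.d. case, the Cramér–Rao machinery is easy to demonstrate. As a minimal sketch (our own example with arbitrary numbers, unrelated to the TE): for Gaussian samples with known σ, the Fisher information per sample about the location parameter μ is 1/σ², so the variance of the sample-mean estimator approaches the Cramér–Rao bound 1/(nI) = σ²/n.

```python
import numpy as np

rng = np.random.default_rng(1)

# Cramer-Rao check for an i.i.d. Gaussian sample: the sample mean is an
# efficient estimator of mu, so its variance saturates the bound sigma^2 / n.
mu, sigma, n, reps = 0.3, 2.0, 50, 4000
estimates = rng.normal(mu, sigma, size=(reps, n)).mean(axis=1)
var_emp = estimates.var()
cr_bound = sigma**2 / n          # = 1 / (n * Fisher information per sample)
print(var_emp, cr_bound)
```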
Estimating the finite velocity of diffusion v in the most precise way possible, by sampling from the probability distribution of the hyperbolic diffusion, is a very interesting issue. Likelihood estimation in the TE is very important in many experimental situations, and we believe that these ideas will help promote such an analysis. Nevertheless, we note that this subject is outside the scope of the present work and will be considered in the future.

Numerical Results for the Telegrapher's Equation
Here, we are going to show the numerical results from the direct integration of the TE (1) for different values of the parameters {τ, v}. First, we calculate Shannon's entropy S_TE(t) for the solution to the TE. Unlike for the Wiener process, we cannot find an analytical expression for it, so we perform its calculation numerically. Nevertheless, in Appendix C, we present a cumulant perturbation approach for τ ≪ 1. In Figure 1, we show ∆S(t) = S(t) − S(0) for the TE process and compare this plot against the analytical result for the Wiener case: S_W(t) = ln√(4πeDt). In all cases, we use a thin Gaussian as the initial condition, ψ(x, t)|_{t=0} = exp(−x²/2σ²)/√(2πσ²), with dispersion σ = 0.07. The parameters of the TE are τ = 5 and v = 1, and for the Wiener process, we take D = 5. In Figure 2, we show I(t) versus S(t) for the solution to the TE. This plot also shows a linear fit (in red) at short times 0 ≤ t ≲ 10, while at long times, the behavior is non-linear. The parameters of the TE are as in Figure 1, i.e., τ = 5 and v = 1. In Figure 3, we show the complexity measure C(t) for the solution to the TE. While this measure is constant for the Wiener process, as shown in (11), for the TE it is a non-trivial function of time. The parameters of the TE are as in Figures 1 and 2, i.e., τ = 5 and v = 1. This complexity C(t) shows a maximum at time t_max ≈ 25. This time is one order of magnitude larger than the relaxation time t_Relax ≈ 2τ in the TE; see Appendix A. In Figure 4, we show the complexity measure C(t) for the TE, with parameters τ = 1 and v = 1. This complexity C(t) shows a narrow peak and a maximum at time t_max ≈ 4.
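A minimal finite-difference sketch of such a computation is given below (our own implementation, not the authors' code; the grid sizes, a wider initial σ for numerical stability, and the Fisher–Shannon normalization C = I e^{2S}/(2πe) are assumptions of this sketch). It integrates the TE with a damped leapfrog scheme and tracks S(t), I(t), and C(t):

```python
import numpy as np

# Damped leapfrog for the TE: psi_tt + psi_t / tau = v^2 psi_xx (periodic box).
tau, v, sigma = 5.0, 1.0, 0.5
L_box, nx = 100.0, 2000
dx = L_box / nx
x = (np.arange(nx) - nx // 2) * dx
dt = 0.4 * dx / v                      # CFL-safe time step
a = dt / (2.0 * tau)
c2 = (v * dt / dx) ** 2

psi = np.exp(-x**2 / (2 * sigma**2)) / np.sqrt(2 * np.pi * sigma**2)
psi /= psi.sum() * dx
psi_old = psi.copy()                   # zero initial time derivative, as in (3)

def lap(f):                            # periodic second difference
    return np.roll(f, 1) - 2 * f + np.roll(f, -1)

def functionals(p):
    q = np.maximum(p, 1e-300)          # guard the logarithm in the far tails
    S = -np.sum(q * np.log(q)) * dx                      # Shannon entropy
    I = np.sum(np.gradient(q, dx)**2 / q) * dx           # Fisher information
    return S, I, I * np.exp(2 * S) / (2 * np.pi * np.e)  # complexity C(t)

C_trace, t = [], 0.0
while t < 40.0:
    psi, psi_old = (2 * psi - (1 - a) * psi_old + c2 * lap(psi)) / (1 + a), psi
    t += dt
    C_trace.append(functionals(psi)[2])

S_T, I_T, C_T = functionals(psi)
norm = psi.sum() * dx
print(norm, S_T, C_T, max(C_trace))
```

The update rule is the standard centered discretization of (1); the algebra of the scheme conserves the discrete normalization exactly on a periodic grid, so the positivity/normalization conditions (2) are respected throughout the run.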
The time t_max at which the complexity attains its maximum value is described by the following relation:

dC(t)/dt|_{t=t_max} = 0  ⟺  İ(t)/I(t)|_{t=t_max} = −2 Ṡ(t)|_{t=t_max}.

This implies that the maximum emerges when the modulus of the relative velocity of Fisher's information, |İ(t)/I(t)|, equals twice the modulus of the entropy velocity, 2|Ṡ(t)|. In hyperbolic diffusion, the loss of information and the increase in disorder are not equivalent measures. This issue is connected to the fact that the solution of the TE has two quite different behaviors at short and long times, as seen in Appendix A.2. In Figure 5a,b, we show the behavior of the complexity C(t); that is, we plot the maximum value C(t_max) and the time t_max as a function of τ. These values can be well fitted by simple expressions as functions of τ. For τ ≫ 1, we can use the toy model (A10) to characterize the ballistic regime. Then, using (A9) and (A23), it is possible to calculate t_max as a function of all the physical parameters (Formula (22)). In Figure 6, we show the Cramér–Rao inequality (14) applied to the solution of the TE under the parameters τ = 5 and v = 1. The plot shows the time-dependent behavior of the product I(t)K₂(t). It is to be noted that when the initial condition for ψ(x, t) is not a delta function, as in (3), the second cumulant K₂(t ∼ 0) does not go to zero. In fact, the second cumulant is given by (A28) and goes to a constant for t → 0. In this case, Fisher's information does not diverge at t = 0. This issue can be seen in the logarithmic representation in Figure 6 (thus, lim_{t→0} I(t)K₂(t) → 1). In the same figure, we plot the Cramér–Rao inequality calculated analytically with our toy model based on the approximation (A10), which is valid for τ ≫ 1. The Cramér–Rao product tends, for t → ∞, to the lower bound of one, corresponding to the Wiener process, as shown in (13).
We note that the convergence in time from the solution to the TE toward the Wiener process can also be proved by calculating the time-dependent kurtosis, as shown in Appendix A.3. In addition, in the limit τ ≪ 1, a perturbation theory can be presented in terms of all cumulants of ψ(x, t). Therefore, for example, Shannon's entropy for the TE can be calculated, as shown in Appendix C.

Figure 6. The Cramér–Rao product I(t)K₂(t) for the solution to the TE, with the initial condition ψ(x, t)|_{t=0} = exp(−x²/2σ²)/√(2πσ²) (toy model (A10) in red dashed line). The vertical blue dashed line represents the location of t_max given by our Formula (22). The parameters are τ = 5, v = 1, σ = 0.07.

Conclusions
The Wiener process is ubiquitous in nature because it is a time-dependent Gaussian process. For the Wiener process, the complexity C and the Cramér–Rao bound are well known, while if the diffusion process has a finite velocity of propagation, the situation is much less explored. In the present work, we studied the time-dependent Fisher I_TE(t) and Shannon S_TE(t) functionals associated with hyperbolic diffusion, as characterized by the telegrapher's equation. The solution to the telegrapher's equation shows ballistic behavior at short times (t ≪ τ), while at long times (t ≫ τ), the behavior is diffusive, and the crossover between both regimes occurs on the order of the relaxation time τ. Therefore, it is important to know how information measures behave in a hyperbolic diffusion protocol. In this context, we characterized the complexity C(t) of this hyperbolic diffusion process as a function of time, showing that this measure has a maximum at t_max before relaxing to a constant value. We also presented the response of this time t_max as a function of the parameter τ (τ⁻¹ is the rate of dissipation). The same occurs with the Cramér–Rao bound connecting Fisher's information and the spatial uncertainty ∆x(t)² in the process. A relation between the non-local x-parameter Fisher's information and the local t-parameter Fisher's information has been established (see (A18)), as well as the time behavior of the Fisher and Shannon functionals for the solution to the telegrapher's equation. A toy model approximation, valid for large τ, was used to calculate analytically the Cramér–Rao inequality as a function of time, as well as the relative entropy in the ballistic regime; see Figure 1.
Numerical results have been used to study the Fisher and Shannon functionals as functions of the parameter τ and time t. In addition, a perturbation approach (in a cumulant series) for small τ for the solution to the hyperbolic diffusion is presented in Appendix C. This perturbation is a useful tool for analytically approximating many statistical objects in information theory.
For many decades, Fisher's information has been used to find limits on the precision of codes and protocols. On the other hand, the telegrapher's equation has found applications in many areas of interest where finite-velocity diffusion is the crucial ingredient, such as engineering code problems, the transport of electromagnetic signals, the biophysics of neuronal responses, machine learning, etc. We believe that the present work will stimulate research on information theory for hyperbolic diffusion.
Extensions of the present approach can also be carried out when the rate of absorption of energy (characterized by the parameter τ⁻¹) has time fluctuations (noise) [11]. In this case, the general interest lies in averaging statistical objects over realizations of the disorder (time fluctuations in the rate τ⁻¹). Work in this direction is in progress.
Our results can also be extended to consider the issue of random initial conditions in the telegrapher's equation and to model temperature fluctuations. Therefore, the present approach may help in estimating and statistically inferring physical parameters derived from cosmic microwave background data [17].

Figure 1. ∆S(t) as a function of time for a Wiener process (red dashed line) and the solution to the TE (full black line). The blue circle line represents ∆S(t) from the toy model solution to the TE; the fit is valid for t/τ ≪ 1; see Appendix B.1. The parameters are τ = 5, v = 1.

Figure 2. Fisher's information as a function of entropy for the solution to the TE (black circles). The linear fit (for values of t ≲ 10) is denoted by the dashed red line. The parameters are τ = 5, v = 1. The equation of the linear fit is I(S) = −41.57 S + 166.

Figure 3. Complexity C(t) as a function of time for the solution to the TE (full black line). The inset corresponds to the Wiener process (red dashed line). The parameters are τ = 5, v = 1.

Figure 4. Complexity C(t) as a function of time for the solution to the TE (full black line). The inset corresponds to the Wiener process (red dashed line). The parameters are τ = 1, v = 1.

Figure 5. Characteristics of the complexity measure C(t). (a) Maximum time t_max as a function of τ. (b) Maximum value of the complexity as a function of τ.