# ϕ-Informational Measures: Some Results and Interrelations


## Abstract


## 1. Introduction

## 2. ϕ-Entropies—Direct and Inverse Maximum Entropy Problems

**Definition 1.** Let $\varphi :\mathcal{Y}\subseteq {\mathbb{R}}_{+}\mapsto \mathbb{R}$ be a convex function defined on a convex set $\mathcal{Y}$. Then, if $f$ is a probability distribution defined with respect to a general measure $\mu$ on a set $\mathcal{X}\subseteq {\mathbb{R}}^{d}$ such that $f\left(\mathcal{X}\right)\subseteq \mathcal{Y}$, the ϕ-entropy of $f$ is defined, when this quantity exists, by
$$H_{\varphi}\left[f\right]=-\int_{\mathcal{X}}\varphi\left(f\left(x\right)\right)\,\mathrm{d}\mu\left(x\right).$$
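As a concrete numerical illustration, the ϕ-entropy can be approximated by quadrature on a grid. The sketch below assumes the usual convention $H_{\varphi}[f]=-\int\varphi(f)\,\mathrm{d}\mu$, under which $\varphi(y)=y\log y$ recovers the Shannon entropy; `phi_entropy` is an illustrative helper, not a function from the paper.

```python
import numpy as np

# Sketch: the phi-entropy H_phi[f] = -∫ phi(f(x)) dmu(x), approximated by a
# Riemann sum on a grid (phi_entropy is an illustrative helper).
def phi_entropy(f_vals, dx, phi):
    return -np.sum(phi(f_vals)) * dx

x = np.linspace(-10.0, 10.0, 200001)
dx = x[1] - x[0]
f = np.exp(-x**2 / 2) / np.sqrt(2 * np.pi)  # standard normal density

# Shannon case phi(y) = y*log(y); the clamp avoids log(0) on an extended grid.
shannon = phi_entropy(f, dx, lambda y: y * np.log(np.maximum(y, 1e-300)))
print(shannon)  # ≈ 0.5*log(2*pi*e) ≈ 1.4189
```

For the standard normal, this matches the closed form $\tfrac{1}{2}\log(2\pi e)$ to grid accuracy.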

**Definition 2.** With the same assumptions as in Definition 1, the Bregman divergence associated with $\varphi$, defined on the convex set $\mathcal{Y}$, is the function defined on $\mathcal{Y}\times \mathcal{Y}$ by
$$d_{\varphi}\left(u,v\right)=\varphi\left(u\right)-\varphi\left(v\right)-{\varphi}^{\prime}\left(v\right)\left(u-v\right).$$
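A minimal sketch of the pointwise Bregman divergence $d_\varphi(u,v)=\varphi(u)-\varphi(v)-\varphi'(v)(u-v)$, here with the Shannon choice $\varphi(y)=y\log y$ (so $\varphi'(y)=\log y + 1$), which yields the familiar form $u\log(u/v)-u+v$; convexity of $\varphi$ guarantees nonnegativity, with zero iff $u=v$.

```python
import numpy as np

# Pointwise Bregman divergence d_phi(u, v) = phi(u) - phi(v) - phi'(v)(u - v),
# with phi(y) = y*log(y): this reduces to u*log(u/v) - u + v.
def bregman(u, v, phi, dphi):
    return phi(u) - phi(v) - dphi(v) * (u - v)

phi  = lambda y: y * np.log(y)
dphi = lambda y: np.log(y) + 1.0

u, v = 0.3, 0.7
d = bregman(u, v, phi, dphi)
print(d)  # > 0, since phi is strictly convex and u != v
print(bregman(v, v, phi, dphi))  # 0 at u = v
```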

#### 2.1. Maximum Entropy Principle: The Direct Problem

**Proposition 1.**

**Proof.**

#### 2.2. Maximum Entropy Principle: The Inverse Problems

- the domain of definition of ${\varphi}^{\prime}$ must include $f\left(\mathcal{X}\right)$; this will be satisfied by construction;
- from the strict convexity property of $\varphi $, ${\varphi}^{\prime}$ must be strictly increasing.

- (C1) $f\left(x\right)$ and $\sum_{i=1}^{n}{\lambda}_{i}\,{T}_{i}\left(x\right)$ must have the same variations, i.e., $\sum_{i=1}^{n}{\lambda}_{i}\,{T}_{i}\left(x\right)$ is increasing (resp. decreasing, constant) where $f$ is increasing (resp. decreasing, constant);
- (C2) $f\left(x\right)$ and $\sum_{i=1}^{n}{\lambda}_{i}\,{T}_{i}\left(x\right)$ must have the same level sets,$$f\left({x}_{1}\right)=f\left({x}_{2}\right)\iff \sum_{i=1}^{n}{\lambda}_{i}\,{T}_{i}\left({x}_{1}\right)=\sum_{i=1}^{n}{\lambda}_{i}\,{T}_{i}\left({x}_{2}\right)$$

- for $\mathcal{X}={\mathbb{R}}_{+}$ and ${T}_{1}\left(x\right)=x$: ${\lambda}_{1}$ must be negative and $f\left(x\right)$ must be decreasing;
- for $\mathcal{X}=\mathbb{R}$ and ${T}_{1}\left(x\right)={x}^{2}$ or ${T}_{1}\left(x\right)=\left|x\right|$: ${\lambda}_{1}$ must be negative and $f\left(x\right)$ must be even and unimodal.
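For instance, taking $T_1(x)=x^2$ and the standard normal density restricted to $x\ge 0$ (where $f$ is strictly decreasing, so $f^{-1}$ is well defined), the construction $\varphi'(y)=\lambda_0+\lambda_1\,T_1\!\left(f^{-1}(y)\right)$ produces a $\varphi'$ that is affine in $\log y$, i.e., it recovers the Shannon entropic functional up to affine terms. A minimal numerical sketch (the multipliers `lam0`, `lam1` are illustrative, with `lam1 < 0` as required above):

```python
import numpy as np

# Sketch of the inverse construction phi'(y) = lambda_0 + lambda_1 * T_1(f^{-1}(y)),
# for T_1(x) = x^2 and f the standard normal density on x >= 0.
x = np.linspace(0.0, 5.0, 1000)
f = np.exp(-x**2 / 2) / np.sqrt(2 * np.pi)

lam0, lam1 = 0.0, -0.5
phi_prime = lam0 + lam1 * x**2  # phi'(f(x)), i.e., phi'(y) along the curve y = f(x)

# Since log f(x) = -x^2/2 - log(sqrt(2*pi)), phi'(y) is exactly affine in log(y).
slope, intercept = np.polyfit(np.log(f), phi_prime, 1)
resid = np.max(np.abs(phi_prime - (intercept + slope * np.log(f))))
print(slope, resid)  # slope ≈ 1 (Shannon-like functional), resid ≈ 0
```

Note that $\varphi'(y)=\log y+\mathrm{const}$ is strictly increasing, consistent with the strict convexity requirement on $\varphi$.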

#### 2.3. Second Inverse Maximum Entropy Problem: Some Examples

**Example 1.**

**Example 2.**

**Example 3.**

## 3. State-Dependent Entropic Functionals and Minimization Revisited

**Definition 3.**

**Proposition 2.** Suppose that there exists a probability distribution $f$ satisfying

**Proof.**

- design a partition $({\mathcal{X}}_{1},\dots ,{\mathcal{X}}_{k})$ so that (C2) is satisfied in each ${\mathcal{X}}_{l}$ (at least, such that f is either strictly monotonic, or constant, on ${\mathcal{X}}_{l}$) and
- determine ${\varphi}_{l}$ as in Equation (7) in each ${\mathcal{X}}_{l}$, that is$${\varphi}_{l}^{\prime}\left(y\right)=\sum_{i=0}^{n}{\lambda}_{i}\,{T}_{i}\left({f}_{l}^{-1}\left(y\right)\right)$$

**Example 4.**

**Example 5.**

## 4. $\mathbf{\varphi}$-Escort Distribution, $(\mathbf{\varphi},\mathbf{\alpha})$-Moments, $(\mathbf{\varphi},\mathbf{\beta})$-Fisher Information, Generalized Cramér–Rao Inequalities

**Definition 4.** Let $\varphi :\mathcal{X}\times \mathcal{Y}\mapsto \mathbb{R}$ be such that, for any $x\in \mathcal{X}\subseteq {\mathbb{R}}^{d}$, the function $\varphi (x,\cdot)$ is a strictly convex, twice differentiable function defined on the closed convex set $\mathcal{Y}\subseteq {\mathbb{R}}_{+}$. Then, if $f$ is a probability distribution defined with respect to a general measure $\mu$ on a set $\mathcal{X}$ such that $f\left(\mathcal{X}\right)\subseteq \mathcal{Y}$, and such that

**Example 1 (cont.).**

**Example 2 (cont.).**

**Example 3 (cont.).**
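In the Tsallis special case, the escort construction of Definition 4 reduces to the familiar $q$-escort ${E}_{q}\left[f\right]={f}^{q}/\int {f}^{q}\,\mathrm{d}\mu$, which is again a probability density. A minimal numerical sketch (the grid and the choice $q=1.5$ are illustrative):

```python
import numpy as np

# Sketch: the q-escort E_q[f] = f^q / ∫ f^q dmu of a standard normal density.
x = np.linspace(-10.0, 10.0, 100001)
dx = x[1] - x[0]
f = np.exp(-x**2 / 2) / np.sqrt(2 * np.pi)  # standard normal density

q = 1.5
escort = f**q / (np.sum(f**q) * dx)

# The escort of a Gaussian is again Gaussian, with variance 1/q.
print(np.sum(escort) * dx)          # ≈ 1: normalized
print(np.sum(x**2 * escort) * dx)   # ≈ 1/q
```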

**Definition 5.** Under the assumptions of Definition 4, with $\mathcal{X}$ equipped with a norm ${\parallel \cdot \parallel}_{\chi}$, we define the $(\alpha ,\varphi )$-moment of a random variable $X$ associated with distribution $f$ by

**Example 1 (cont.).**

**Example 2 (cont.).**

**Example 3 (cont.).**

**Definition 6.** With the same assumptions as in Definition 4, and denoting by ${\parallel \cdot \parallel}_{{\chi}^{*}}$ the dual norm (the norm induced in the dual space, given here by ${\parallel z\parallel}_{{\chi}^{*}}=\underset{{\parallel x\parallel}_{\chi}=1}{sup}\,{z}^{t}x$ [105,106]), for any differentiable density $f$ we define the quantity
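The dual norm is easy to make concrete in the $\ell_p$ case: for ${\parallel \cdot \parallel}_{\chi}$ the $\ell_p$ norm, the dual norm is the $\ell_q$ norm with $1/p+1/q=1$, and the Hölder-extremal direction $x\propto \operatorname{sign}(z)\,|z|^{q-1}$ attains the supremum exactly. A short sketch (the vector `z` and the pair $p=3$, $q=3/2$ are illustrative):

```python
import numpy as np

# Dual norm ||z||_* = sup_{||x||_p = 1} z^t x equals the l_q norm, 1/p + 1/q = 1.
z = np.array([1.0, -2.0, 0.5])
p, q = 3.0, 1.5

x = np.sign(z) * np.abs(z) ** (q - 1)  # Hölder-extremal direction
x /= np.linalg.norm(x, ord=p)          # normalize in the primal l_p norm

print(z @ x, np.linalg.norm(z, ord=q))  # equal: the sup is the l_q norm of z
```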

**Definition 7.**

**Example 1 (cont.).**

**Example 2 (cont.).**

**Example 3 (cont.).**

**Proposition 3.** Assume that a differentiable probability density function $f$ with respect to a measure $\mu$, defined on a domain $\mathcal{X}$, admits an $(\alpha ,\varphi )$-moment and an $({\alpha}^{*},\varphi )$-Fisher information, with $\alpha \ge 1$ and ${\alpha}^{*}$ its Hölder conjugate, $\frac{1}{\alpha}+\frac{1}{{\alpha}^{*}}=1$, and that $x\,f\left(x\right)$ vanishes at the boundary of $\mathcal{X}$. Then the density $f$ satisfies the $(\alpha ,\varphi )$ extended Cramér–Rao inequality

**Proof.**
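As a numerical sanity check (an illustration, not part of the proof): in the classical special case $\alpha ={\alpha}^{*}=2$ with Shannon conventions, the inequality reduces to $E\left[{X}^{2}\right]\,I\left[f\right]\ge 1$, with equality for the Gaussian, which is the extremal density.

```python
import numpy as np

# Classical case of the extended Cramér–Rao inequality: E[X^2] * I[f] >= 1,
# checked on the standard normal, which saturates the bound.
x = np.linspace(-12.0, 12.0, 400001)
dx = x[1] - x[0]
f = np.exp(-x**2 / 2) / np.sqrt(2 * np.pi)  # standard normal density

moment2 = np.sum(x**2 * f) * dx        # second moment E[X^2]
score = np.gradient(np.log(f), dx)     # d/dx log f(x), here = -x
fisher = np.sum(score**2 * f) * dx     # nonparametric Fisher information I[f]

print(moment2 * fisher)  # ≈ 1
```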

**Proposition 4.** Let $f$ be a probability density function with respect to a general measure $\mu$ defined over a set $\mathcal{X}$, where $f$ is parameterized by $\theta \in \Theta \subseteq {\mathbb{R}}^{m}$ and satisfies the conditions of Definition 7. Assume that neither $\mu$ nor $\mathcal{X}$ depends on $\theta$, that $f$ is a jointly measurable function of $x$ and $\theta$, integrable with respect to $x$ and absolutely continuous with respect to $\theta$, and that the derivatives of $f$ with respect to each component of $\theta$ are locally integrable. Then, for any estimator $\widehat{\theta}\left(X\right)$ of $\theta$ that does not depend on $\theta$, we have

**Proof.**
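A Monte-Carlo sanity check of the classical form of the parametric bound (an illustration, not part of the proof): for ${X}_{1},\dots ,{X}_{n}$ i.i.d. $N(\theta ,1)$, the sample mean is an unbiased estimator with variance $1/n$, which meets the classical Cramér–Rao bound $1/(nI)$ with Fisher information $I=1$. The values of `theta`, `n`, and `trials` are illustrative.

```python
import numpy as np

# Sample mean of n i.i.d. N(theta, 1) draws: unbiased, variance 1/n = 1/(n*I).
rng = np.random.default_rng(42)
theta, n, trials = 3.0, 50, 200000
est = rng.normal(theta, 1.0, size=(trials, n)).mean(axis=1)

print(est.mean(), est.var())  # ≈ theta (unbiased), ≈ 1/n = 0.02
```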

**Example 1 (cont.).**

**Example 2 (cont.).**

**Example 3 (cont.).**

## 5. $\mathbf{\varphi}$-Heat Equation and Extended de Bruijn Identity

**Proposition 5.** Let $f$ be a probability distribution with respect to a measure $\mu$. Suppose that $f$ is parameterized by $\theta \in \Theta \subseteq {\mathbb{R}}^{m}$ and is defined over a set $\mathcal{X}\subset {\mathbb{R}}^{d}$. Assume that neither $\mathcal{X}$ nor $\mu$ depends on $\theta$, and that $f$ satisfies the nonlinear ϕ-heat Equation (24) for a twice differentiable convex function ϕ. Assume that ${\nabla}_{\theta}\varphi \left(f\right)$ is absolutely integrable and locally integrable with respect to $\theta$, and that the function ${\parallel {\nabla}_{x}{\varphi}^{\prime}\left(f\right)\parallel}_{{\chi}^{*}}^{\beta -2}\,{\nabla}_{x}{\varphi}^{\prime}\left(f\right)$ vanishes at the boundary of $\mathcal{X}$. Then the distribution $f$ satisfies the extended de Bruijn identity relating the ϕ-entropy of $f$ and its nonparametric $(\beta ,\varphi )$-Fisher information, as follows,

**Proof.**
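The classical special case of the identity is easy to verify in closed form (an illustration, not part of the proof): with the linear heat equation and Shannon entropy, $\frac{\mathrm{d}}{\mathrm{d}t}H\left[{f}_{t}\right]=\frac{1}{2}I\left[{f}_{t}\right]$, where ${f}_{t}=N(0,t)$ solves $\partial_{t}f=\frac{1}{2}\partial_{x}^{2}f$, $H(t)=\frac{1}{2}\log (2\pi et)$, and $I(t)=1/t$.

```python
import numpy as np

# Classical de Bruijn identity for the Gaussian heat flow f_t = N(0, t):
# d/dt H[f_t] = (1/2) I[f_t], with H(t) = 0.5*log(2*pi*e*t) and I(t) = 1/t.
def shannon_entropy(t):
    return 0.5 * np.log(2 * np.pi * np.e * t)

t, h = 2.0, 1e-5
dH_dt = (shannon_entropy(t + h) - shannon_entropy(t - h)) / (2 * h)
fisher = 1.0 / t  # Fisher information of N(0, t)

print(dH_dt, 0.5 * fisher)  # both ≈ 0.25
```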

**Example 1 (cont.).**

**Example 2 (cont.).**

## 6. Concluding Remarks

## Author Contributions

## Funding

## Institutional Review Board Statement

## Informed Consent Statement

## Data Availability Statement

## Acknowledgments

## Conflicts of Interest

## Appendix A. Inverse Maximum Entropy Problem and Associated Inequalities: Some Examples

#### Appendix A.1. Normal Distribution and Second-Order Moment

#### Appendix A.2. q-Gaussian Distribution and Second-Order Moment

#### Appendix A.3. q-Exponential Distribution and First-Order Moment

#### Appendix A.4. The Arcsine Distribution

#### Appendix A.4.1. Second-Order Moment

#### Appendix A.4.2. (Partial) First-Order Moment(s)

**Figure A1.** Univalued entropic functional ${\varphi}_{\mathrm{u}}$ derived from the arcsine distribution with the partial constraints ${T}_{\pm ,1}\left(x\right)=x\,{\mathbb{1}}_{{\mathcal{X}}_{\pm}}\left(x\right)$.

#### Appendix A.5. The Logistic Distribution

#### Appendix A.5.1. Second Order Moment Constraint

#### Appendix A.5.2. (Partial) First-Order Moment(s) Constraint(s)

**Figure A2.** Entropy functional ${\varphi}_{\mathrm{u}}$ derived from the logistic distribution: (**a**) with ${T}_{1}\left(x\right)={x}^{2}$ and (**b**) with ${T}_{\pm ,1}\left(x\right)=x\,{\mathbb{1}}_{{\mathcal{X}}_{\pm}}\left(x\right)$.

#### Appendix A.6. The Gamma Distribution and (Partial) P-Order Moment(s)

- The constraints degenerate to a single uniform constraint ${T}_{1}\left(x\right)={x}^{p}$;
- In this limit, conditions (C1) and (C2) are both satisfied;
- The entropic functional becomes state-independent (uniform), and only the branch ${\varphi}_{-1}$ remains.

**Figure A3.** Multiform entropy functional ${\varphi}_{\mathrm{u}}$ derived from the gamma distribution with the partial moment constraints ${T}_{k,1}\left(x\right)=x\,{\mathbb{1}}_{{\mathcal{X}}_{k}}\left(x\right)$ ($p=1$), $k\in \{0,-1\}$, for $q=1.02,\,1.25,\,1.5,\,1.75,\,2,\,2.25,\,2.5$. (**a**): ${\varphi}_{0,\mathrm{u}}-{\gamma}_{0}-\beta u$ (${\alpha}_{0}=1$); (**b**): ${\varphi}_{-1,\mathrm{u}}$ with ${\alpha}_{-1}=\beta =1$, ${\gamma}_{-1}=-\Gamma \left(q\right)$, and the Shannon entropic functional $u\,\log u$ (thin line).

## References

- Von Neumann, J. Thermodynamik quantenmechanischer Gesamtheiten. Nachr. Ges. Wiss. Gött.
**1927**, 1, 273–291. [Google Scholar] - Shannon, C.E. A Mathematical Theory of Communication. Bell Syst. Tech. J.
**1948**, 27, 623–656. [Google Scholar] [CrossRef] - Boltzmann, L. Lectures on Gas Theory (Translated by S. G. Brush); Dover: Leipzig, Germany, 1964. [Google Scholar]
- Boltzmann, L. Vorlesungen Über Gastheorie—I; Verlag von Johann Ambrosius Barth: Leipzig, Germany, 1896. [Google Scholar]
- Boltzmann, L. Vorlesungen Über Gastheorie—II; Verlag von Johann Ambrosius Barth: Leipzig, Germany, 1898. [Google Scholar]
- Planck, M. Eight Lectures on Theoretical Physics; Columbia University Press: New York, NY, USA, 2015. [Google Scholar]
- Maxwell, J.C. The Scientific Papers of James Clerk Maxwell; Dover: New York, NY, USA, 1952; Volume 2. [Google Scholar]
- Jaynes, E.T. Gibbs vs Boltzmann Entropies. Am. J. Phys.
**1965**, 33, 391–398. [Google Scholar] [CrossRef] - Müller, I.; Müller, W.H. Fundamentals of Thermodynamics and Applications. With Historical Annotations and Many Citations from Avogadro to Zermelo; Springer: Berlin/Heidelberg, Germany, 2009. [Google Scholar] [CrossRef]
- Rényi, A. On measures of entropy and information. Proc. Berkeley Symp. Math. Stat. Probab.
**1961**, 1, 547–561. [Google Scholar] - Varma, R.S. Generalization of Rényi’s Entropy of Order α. J. Math. Sci.
**1966**, 1, 34–48. [Google Scholar] - Havrda, J.; Charvát, F. Quantification Method of Classification Processes: Concept of Structural α-Entropy. Kybernetika
**1967**, 3, 30–35. [Google Scholar] - Csiszàr, I. Information-Type Measures of Difference of Probability Distributions and Indirect Observations. Stud. Sci. Math. Hung.
**1967**, 2, 299–318. [Google Scholar] - Daróczy, Z. Generalized Information Functions. Inf. Control
**1970**, 16, 36–51. [Google Scholar] [CrossRef] [Green Version] - Aczél, J.; Daróczy, Z. On Measures of Information and Their Characterizations; Academic Press: New York, NY, USA, 1975. [Google Scholar]
- Daróczy, Z.; Járai, A. On the measurable solution of a functional equation arising in information theory. Acta Math. Acad. Sci. Hung.
**1979**, 34, 105–116. [Google Scholar] [CrossRef] - Tsallis, C. Possible Generalization of Boltzmann-Gibbs Statistics. J. Stat. Phys.
**1988**, 52, 479–487. [Google Scholar] [CrossRef] - Salicrú, M. Funciones de entropía asociada a medidas de Csiszár. Qüestiió
**1987**, 11, 3–12. [Google Scholar] - Salicrú, M.; Menéndez, M.L.; Morales, D.; Pardo, L. Asymptotic distribution of (h,ϕ)-entropies. Commun. Stat. Theory Methods
**1993**, 22, 2015–2031. [Google Scholar] [CrossRef] - Salicrú, M. Measures of information associated with Csiszár’s divergences. Kybernetica
**1994**, 30, 563–573. [Google Scholar] - Liese, F.; Vajda, I. On Divergence and Informations in Statistics and Information Theory. IEEE Trans. Inf. Theory
**2006**, 52, 4394–4412. [Google Scholar] [CrossRef] - Basseville, M. Divergence measures for statistical data processing—An annotated bibliography. Signal Process.
**2013**, 93, 621–633. [Google Scholar] [CrossRef] - Panter, P.F.; Dite, W. Quantization distortion in pulse-count modulation with nonuniform spacing of levels. Proc. IRE
**1951**, 39, 44–48. [Google Scholar] [CrossRef] - Lloyd, S.P. Least Squares Quantization in PCM. IEEE Trans. Inf. Theory
**1982**, 28, 129–137. [Google Scholar] [CrossRef] - Gersho, A.; Gray, R.M. Vector Quantization and Signal Compression; Kluwer: Boston, MA, USA, 1992. [Google Scholar] [CrossRef]
- Campbell, L.L. A coding theorem and Rényi’s entropy. Inf. Control
**1965**, 8, 423–429. [Google Scholar] [CrossRef] [Green Version] - Bercher, J.F. Source coding with escort distributions and Rényi entropy bounds. Phys. Lett. A
**2009**, 373, 3235–3238. [Google Scholar] [CrossRef] [Green Version] - Burbea, J.; Rao, C.R. On the Convexity of Some Divergence Measures Based on Entropy Functions. IEEE Trans. Inf. Theory
**1982**, 28, 489–495. [Google Scholar] [CrossRef] - Menéndez, M.L.; Morales, D.; Pardo, L.; Salicrú, M. (h,Φ)-entropy differential metric. Appl. Math.
**1997**, 42, 81–98. [Google Scholar] [CrossRef] - Pardo, L. Statistical Inference Based on Divergence Measures; Chapman & Hall: Boca Raton, FL, USA, 2006. [Google Scholar]
- Jaynes, E.T. Information Theory and Statistical Mechanics. Phys. Rev.
**1957**, 106, 620–630. [Google Scholar] [CrossRef] - Kapur, J.N. Maximum Entropy Model in Sciences and Engineering; Wiley Eastern Limited: New Dehli, India, 1989. [Google Scholar]
- Arndt, C. Information Measures: Information and Its Description in Sciences and Engeniering; Springer: Berlin/Heidelberg, Germany, 2001. [Google Scholar] [CrossRef] [Green Version]
- Cover, T.M.; Thomas, J.A. Elements of Information Theory, 2nd ed.; John Wiley & Sons: Hoboken, NJ, USA, 2006. [Google Scholar]
- Gokhale, D.V. Maximum entropy characterizations of some distributions. In A Modern Course on Statistical Distributions in Scientific Work; Patil, S.K., Ord, J.K., Eds.; Reidel: Dordrecht, The Netherlands, 1975; Volume III, pp. 299–304. [Google Scholar] [CrossRef] [Green Version]
- Jaynes, E.T. Prior probabilities. IEEE Trans. Syst. Sci. Cybern.
**1968**, 4, 227–241. [Google Scholar] [CrossRef] - Csiszàr, I. Why Least Squares and Maximum Entropy? An Axiomatic Approach to Inference for Linear Inverse Problems. Ann. Stat.
**1991**, 19, 2031–2066. [Google Scholar] [CrossRef] - Frigyik, B.A.; Srivastava, S.; Gupta, M.R. Functional Bregman Divergence and Bayesian Estimation of Distributions. IEEE Trans. Infom. Theory
**2008**, 54, 5130–5139. [Google Scholar] [CrossRef] [Green Version] - Robert, C.P. The Bayesian Choice. From Decision-Theoretic Foundations to Computational Implementation, 2nd ed.; Springer: New York, NY, USA, 2007. [Google Scholar]
- Jaynes, E.T. On the rational of maximum-entropy methods. Proc. IEEE
**1982**, 70, 939–952. [Google Scholar] [CrossRef] - Jones, L.K.; Byrne, C.L. General Entropy Criteria for Inverse Problems, with Applications to Data Compression, Pattern Classification, and Cluster Analysis. IEEE Trans. Inf. Theory
**1990**, 36, 23–30. [Google Scholar] [CrossRef] - Hero III, A.O.; Ma, B.; Michel, O.J.J.; Gorman, J. Application of Entropic Spanning Graphs. IEEE Signal Process. Mag.
**2002**, 19, 85–95. [Google Scholar] [CrossRef] - Park, S.Y.; Bera, A.K. Maximum entropy autoregressive conditional heteroskedasticity model. J. Econom.
**2009**, 150, 219–230. [Google Scholar] [CrossRef] - Vasicek, O. A Test for Normality Based on Sample Entropy. J. R. Stat. Soc. B
**1976**, 38, 54–59. [Google Scholar] [CrossRef] - Gokhale, D. On entropy-based goodness-of-fit tests. Comput. Stat. Data Anal.
**1983**, 1, 157–165. [Google Scholar] [CrossRef] - Song, K.S. Goodness-of-fit tests based on Kullback-Leibler discrimination information. IEEE Trans. Inf. Theory
**2002**, 48, 1103–1117. [Google Scholar] [CrossRef] - Lequesne, J. A goodness-of-fit test of Student distributions based on Rényi entropy. In AIP Conference Proceedings of the 34th International Workshop on Bayesian Inference and Maximum Entropy Methods (MaxEnt’14); Djafari, A., Barbaresco, F., Barbaresco, F., Eds.; American Institute of Physics: College Park, MD, USA, 2014; Volume 1641, pp. 487–494. [Google Scholar] [CrossRef]
- Lequesne, J. Tests Statistiques Basés sur la Théorie de L’information, Applications en Biologie et en Démographie. Ph.D. Thesis, Université de Caen Basse-Normandie, Caen, France, 2015. [Google Scholar]
- Girardin, V.; Regnault, P. Escort distributions minimizing the Kullback-Leibler divergence for a large deviations principle and tests of entropy level. Ann. Inst. Stat. Math.
**2015**, 68, 439–468. [Google Scholar] [CrossRef] - Kesavan, H.K.; Kapur, J.N. The Generalized Maximum Entropy Principle. IEEE Trans. Syst. Man Cybern.
**1989**, 19, 1042–1052. [Google Scholar] [CrossRef] - Borwein, J.M.; Lewis, A.S. Duality Relationships for Entropy-Like Minimization Problems. SIAM J. Control Optim.
**1991**, 29, 325–338. [Google Scholar] [CrossRef] [Green Version] - Borwein, J.M.; Lewis, A.S. Convergence of best entropy estimates. SIAM J. Optim.
**1991**, 1, 191–205. [Google Scholar] [CrossRef] [Green Version] - Borwein, J.M.; Lewis, A.S. Partially-finite programming in L
_{1}and the existence of maximum entropy estimates. SIAM J. Optim.**1993**, 3, 248–267. [Google Scholar] [CrossRef] [Green Version] - Mézard, M.; Montanari, A. Information, Physics, and Computation; Oxford University Press: New York, NY, USA, 2009. [Google Scholar]
- Darmois, G. Sur les lois de probabilités à estimation exhaustive. C. R. l’Acadéie Sci.
**1935**, 200, 1265–1966. [Google Scholar] - Koopman, B.O. On distributions admitting a sufficient statistic. Trans. Am. Math. Soc.
**1936**, 39, 399–409. [Google Scholar] [CrossRef] - Pitman, E.J.G. Sufficient statistics and intrinsic accuracy. Math. Proc. Camb. Philos. Soc.
**1936**, 32, 567–579. [Google Scholar] [CrossRef] - Lehmann, E.L.; Casella, G. Theory of Point Estimation, 2nd ed.; Springer: New York, NY, USA, 1998. [Google Scholar]
- Mukhopadhyay, N. Probability and Statistical Inference, 5th ed.; Statistics: Textbooks and Monographs; Marcel Dekker: New York, NY, USA, 2000; Volume 162. [Google Scholar]
- Rao, C.R. Linear Statistical Inference and Its Applications; John Wiley & Sons: New York, NY, USA, 2001. [Google Scholar]
- Tsallis, C.; Mendes, R.M.; Plastino, A.R. The role of constraints within generalized nonextensive statistics. Physica A
**1998**, 261, 534–554. [Google Scholar] [CrossRef] - Tsallis, C. Nonextensive Statistics: Theoretical, Experimental and Computational Evidences and Connections. Braz. J. Phys.
**1999**, 29, 1–35. [Google Scholar] [CrossRef] - Tsallis, C. Introduction to Nonextensive Statistical Mechanics—Approaching a Complex World; Springer: New York, NY, USA, 2009. [Google Scholar] [CrossRef] [Green Version]
- Essex, C.; Schulzsky, C.; Franz, A.; Hoffmann, K.H. Tsallis and Rényi entropies in fractional diffusion and entropy production. Physica A
**2000**, 284, 299–308. [Google Scholar] [CrossRef] - Parvan, A.S.; Biró, T.S. Extensive Rényi statistics from non-extensive entropy. Phys. Lett. A
**2005**, 340, 375–387. [Google Scholar] [CrossRef] [Green Version] - Kay, S.M. Fundamentals for Statistical Signal Processing: Estimation Theory; Prentice Hall: Upper Saddle River, NJ, USA, 1993; Volume 1. [Google Scholar]
- Frieden, B.R. Science from Fisher Information: A Unification; Cambridge University Press: Cambridge, UK, 2004. [Google Scholar]
- Jeffrey, H. An Invariant Form for the Prior Probability in Estimation Problems. Proc. R. Soc. A
**1946**, 186, 453–461. [Google Scholar] [CrossRef] [Green Version] - Vignat, C.; Bercher, J.F. Analysis of signals in the Fisher-Shannon information plane. Phys. Lett. A
**2003**, 312, 27–33. [Google Scholar] [CrossRef] - Romera, E.; Angulo, J.C.; Dehesa, J.S. Fisher entropy and uncertainty like relationships in many-body systems. Phys. Rev. A
**1999**, 59, 4064–4067. [Google Scholar] [CrossRef] [Green Version] - Romera, E.; Sánchez-Moreno, P.; Dehesa, J.S. Uncertainty relation for Fisher information of D-dimensional single-particle systems with central potentials. J. Math. Phys.
**2006**, 47, 103504. [Google Scholar] [CrossRef] - Sánchez-Moreno, P.; González-Férez, R.; Dehesa, J.S. Improvement of the Heisenberg and Fisher-information-based uncertainty relations for D-dimensional potentials. New J. Phys.
**2006**, 8, 330. [Google Scholar] [CrossRef] [Green Version] - Toranzo, I.V.; Lopez-Rosa, S.; Esquivel, R.; Dehesa, J.S. Heisenberg-like and Fisher-information uncertainties relations for N-fermion d-dimensional systems. Phys. Rev. A
**2015**, in press. [Google Scholar] [CrossRef] - Stam, A.J. Some Inequalities Satisfied by the Quantities of Information of Fisher and Shannon. Inf. Control
**1959**, 2, 101–112. [Google Scholar] [CrossRef] [Green Version] - Dembo, A.; Cover, T.M.; Thomas, J.A. Information Theoretic Inequalities. IEEE Trans. Inf. Theory
**1991**, 37, 1501–1518. [Google Scholar] [CrossRef] [Green Version] - Guo, D.; Shamai, S.; Verdú, S. Mutual Information and Minimum Mean-Square Error in Gaussian Channels. IEEE Trans. Inf. Theory
**2005**, 51, 1261–1282. [Google Scholar] [CrossRef] [Green Version] - Folland, G.B.; Sitaram, A. The uncertainty principle: A mathematical survey. J. Fourier Anal. Appl.
**1997**, 3, 207–233. [Google Scholar] [CrossRef] - Sen, K.D. Statistical Complexity. Application in Electronic Structure; Springer: New York, NY, USA, 2011. [Google Scholar] [CrossRef]
- Vajda, I. χ
^{α}-divergence and generalized Fisher’s information. In Transactions of the 6th Prague Conference on Information Theory, Statistics, Decision Functions and Random Processes; Academia: Prague, Czech Republic, 1973; pp. 873–886. [Google Scholar] - Boekee, D.E. An extension of the Fisher information measure. Topics in information theory. In Proceedings 2nd Colloquium on Information Theory; Csiszàr, I., Elias., P., Eds.; Series: Colloquia Mathematica Societatis János Bolyai; North-Holland: Keszthely, Hungary, 1977; Volume 16, pp. 113–123. [Google Scholar]
- Hammad, P. Mesure d’ordre α de l’information au sens de Fisher. Rev. Stat. Appl.
**1978**, 26, 73–84. [Google Scholar] - Boekee, D.E.; Van der Lubbe, J.C.A. The R-Norm Information Measure. Inf. Control
**1980**, 45, 136–155. [Google Scholar] [CrossRef] [Green Version] - Lutwak, E.; Yang, D.; Zhang, G. Moment-Entropy Inequalities. Ann. Probab.
**2004**, 32, 757–774. [Google Scholar] [CrossRef] - Lutwak, E.; Yang, D.; Zhang, G. Cramér-Rao and Moment-Entropy Inequalities for Rényi Entropy and Generalized Fisher Information. IEEE Trans. Inf. Theory
**2005**, 51, 473–478. [Google Scholar] [CrossRef] - Lutwak, E.; Yang, D.; Zhang, G. Moment-Entropy Inequalities for a Random Vector. IEEE Trans. Inf. Theory
**2007**, 53, 1603–1607. [Google Scholar] [CrossRef] - Lutwak, E.; Lv, S.; Yang, D.; Zhang, G. Extension of Fisher Information and Stam’s Inequality. IEEE Trans. Inf. Theory
**2012**, 58, 1319–1327. [Google Scholar] [CrossRef] - Bercher, J.F. On a (β,q)-generalized Fisher information and inequalities invoving q-Gaussian distributions. J. Math. Phys.
**2012**, 53, 063303. [Google Scholar] [CrossRef] - Bercher, J.F. On generalized Cramér-Rao inequalities, generalized Fisher information and characterizations of generalized q-Gaussian distributions. J. Phys. A
**2012**, 45, 255303. [Google Scholar] [CrossRef] [Green Version] - Bercher, J.F. On multidimensional generalized Cramér-Rao inequalities, uncertainty relations and characterizations of generalized q-Gaussian distributions. J. Phys. A
**2013**, 46, 095303. [Google Scholar] [CrossRef] [Green Version] - Bercher, J.F. Some properties of generalized Fisher information in the context of nonextensive thermostatistics. Physica A
**2013**, 392, 3140–3154. [Google Scholar] [CrossRef] [Green Version] - Bregman, L.M. The relaxation method of finding the common point of convex sets and its application to the solution of problem in convex programming. USSR Comput. Math. Math. Phys.
**1967**, 7, 200–217. [Google Scholar] [CrossRef] - Nielsen, F.; Nock, R. Generalizing Skew Jensen Divergences and Bregman Divergences With Comparative Convexity. IEEE Signal Process. Lett.
**2017**, 24, 1123–1127. [Google Scholar] [CrossRef] - Ben-Tal, A.; Bornwein, J.M.; Teboulle, M. Spectral Estimation via Convex Programming. In Systems and Management Science by Extremal Methods; Phillips, F.Y., Rousseau, J.J., Eds.; Springer US: New York, NY, USA, 1992. [Google Scholar] [CrossRef]
- Teboulle, M.; Vajda, I. Convergence of Best ϕ-Entropy Estimates. IEEE Trans. Inf. Theory
**1993**, 39, 297–301. [Google Scholar] [CrossRef] - Girardin, V. Méthodes de réalisation de produit scalaire et de problème de moments avec maximisation d’entropie. Stud. Math.
**1997**, 124, 199–213. [Google Scholar] [CrossRef] [Green Version] - Girardin, V. Relative Entropy and Spectral Constraints: Some Invariance Properties of the ARMA Class. J. Time Ser. Anal.
**2007**, 28, 844–866. [Google Scholar] [CrossRef] - Costa, J.A.; Hero III, A.O.; Vignat, C. On Solutions to Multivariate Maximum α-Entropy Problems. In 4th International Workshop on Energy Minimization Methods in Computer Vision and Pattern Recognition (EMMCVPR); Lecture Notes in Computer, Sciences; Rangarajan, A., Figueiredo, M.A.T., Zerubia, J., Eds.; Springer: Lisbon, Portugal, 2003; Volume 2683, pp. 211–226. [Google Scholar]
- Johnson, N.L.; Kotz, S.; Balakrishnan, N. Continuous Univariate Distributions, 2nd ed.; John Wiley & Sons: New York, NY, USA, 1995; Volume 2. [Google Scholar]
- Chhabra, A.; Jensen, R.V. Direct determination of the f(α) singularity spectrum. Phys. Rev. Lett.
**1989**, 62, 1327. [Google Scholar] [CrossRef] - Beck, C.; Schögl, F. Thermodynamics of Chaotic Systems: An Introduction; Cambridge University Press: Cambridge, UK, 1993. [Google Scholar] [CrossRef]
- Naudts, J. Generalized Thermostatistics; Springer: London, UK, 2011. [Google Scholar] [CrossRef]
- Martínez, S.; Nicolás, F.; Pennini, F.; Plastino, A. Tsallis’ entropy maximization procedure revisited. Physica A
**2000**, 286, 489–502. [Google Scholar] [CrossRef] [Green Version] - Chimento, L.P.; Pennini, F.; Plastino, A. Naudts-like duality and the extreme Fisher information principle. Phys. Rev. E
**2000**, 62, 7462–7465. [Google Scholar] [CrossRef] [PubMed] [Green Version] - Casas, M.; Chimento, L.; Pennini, F.; Plastino, A.; Plastino, A.R. Fisher information in a Tsallis non-extensive environment. Chaos Solitons Fractals
**2002**, 13, 451–459. [Google Scholar] [CrossRef] - Rudin, W. Functional Analysis, 2nd ed.; McGraw-Hill: New York, NY, USA, 1991. [Google Scholar]
- Morrison, T.J. Functional Analysis. An Introduction to Banach Space Theory; John Wiley & Sons: New York, NY, USA, 2000. [Google Scholar]
- Rioul, O. Information Theoretic Proofs of Entropy Power Inequalities. IEEE Trans. Inf. Theory
**2011**, 57, 33–55. [Google Scholar] [CrossRef] [Green Version] - Rao, C.R.; Wishart, J. Minimum variance and the estimation of several parameters. Math. Proc. Camb. Philos. Soc.
**1947**, 43, 280–283. [Google Scholar] [CrossRef] - Van den Bos, A. Parameter Estimation for Scientists and Engineers; John Wiley & Sons: Hoboken, NJ, USA, 2007. [Google Scholar]
- Magnus, J.R.; Neudecker, H. Matrix Differential Calculus with Applications in Statistics and Econometrics, 3rd ed.; John Wiley & Sons: New York, NY, USA, 1999. [Google Scholar]
- Guo, D.; Shamai, S.; Verdú, S. Additive Non-Gaussian Noise Channels: Mutual Information and Conditional Mean Estimation. IEEE Int. Symp. Inf. Theory
**2005**, 719–723. [Google Scholar] [CrossRef] - Palomar, D.P.; Verdú, S. Gradient of Mutual Information in Linear Vector Gaussian Channels. IEEE Trans. Inf. Theory
**2006**, 52, 141–154. [Google Scholar] [CrossRef] [Green Version] - Verdú, S. Mismatched Estimation and Relative Entropy. IEEE Trans. Inf. Theory
**2010**, 56, 3712–3720. [Google Scholar] [CrossRef] [Green Version] - Barron, A.R. Entropy and the Central Limit Theorem. Ann. Probab.
**1986**, 14, 336–342. [Google Scholar] [CrossRef] - Johnson, O. Information Theory and the Central Limit Theorem; Imperial College Press: London, UK, 2004. [Google Scholar]
- Madiman, M.; Barron, A. Generalized Entropy Power Inequalities and Monotonicity Properties of Information. IEEE Trans. Inf. Theory
**2007**, 53, 2317–2329. [Google Scholar] [CrossRef] [Green Version] - Toranzo, I.V.; Zozor, S.; Brossier, J.M. Generalization of the de Bruijn identity to general ϕ-entropies and ϕ-Fisher informations. IEEE Trans. Inf. Theory
**2018**, 64, 6743–6758. [Google Scholar] [CrossRef] - Widder, D.V. The Heat Equation; Academic Press: New York, NY, USA, 1975. [Google Scholar]
- Roubíček, T. Nonlinear Partial Differential Equations with Applications; Birkhaäuser: Basel, Switzerland, 2005. [Google Scholar]
- Tsallis, C.; Lenzi, E.K. Anomalous diffusion: Nonlinear fractional Fokker-Planck equation. Chem. Phys.
**2002**, 284, 341–347. [Google Scholar] [CrossRef] - Vázquez, J.L. Smoothing and Decay Estimates for Nonlinear Diffusion Equations—Equation of Porous Medium Type; Oxford University Press: New York, NY, USA, 2006. [Google Scholar]
- Gilding, B.H.; Kersner, R. Travelling Waves in Nonlinear Diffusion-Convection Reaction; Springer: Basel, Switzerland, 2004. [Google Scholar] [CrossRef] [Green Version]
- Price, R. A Useful Theorem for Nonlinear Devices Having Gaussian Inputs. IEEE Trans. Inf. Theory
**1958**, 4, 69–72. [Google Scholar] [CrossRef] - Pawula, R. A modified version of Price’s theorem. IEEE Trans. Inf. Theory
**1967**, 13, 285–288. [Google Scholar] [CrossRef] - Riba, J.; de Cabrera, F. A Proof of de Bruijn Identity based on Generalized Price’s Theorem. IEEE Int. Symp. Inf. Theory
**2019**, 2509–2513. [Google Scholar] [CrossRef] - Lieb, E.H. Proof of an Entropy Conjecture of Wehrl. Commun. Math. Phys.
**1978**, 62, 35–41. [Google Scholar] [CrossRef] - Costa, M.; Cover, T. On the Similarity of the Entropy Power Inequality and the Brunn-Minkowski Inequality. IEEE Trans. Inf. Theory
**1984**, 30, 837–839. [Google Scholar] [CrossRef] - Carlen, E.A.; Soffer, A. Entropy Production by Block Variable Summation and Central Limit Theorems. Commun. Math. Phys.
**1991**, 140, 339–371. [Google Scholar] [CrossRef] - Harremoës, P.; Vignat, C. An Entropy Power Inequality for the Binomial Family. J. Inequalities Pure Appl. Math.
**2003**, 4, 93. [Google Scholar] - Johnson, O.; Yu, Y. Monotonicity, Thinning, and Discrete Versions of the Entropy Power Inequality. IEEE Trans. Inf. Theory
**2010**, 56, 5387–5395. [Google Scholar] [CrossRef] [Green Version] - Haghighatshoar, S.; Abbe, E.; Telatar, I.E. A New Entropy Power Inequality for Integer-Valued Random Variables. IEEE Trans. Inf. Theory
**2014**, 60, 3787–3796. [Google Scholar] [CrossRef] [Green Version] - Bobkov, S.G.; Chistyakov, G.P. Entropy Power Inequality for the Rényi Entropy. IEEE Trans. Inf. Theory
**2015**, 61, 708–714. [Google Scholar] [CrossRef] - Costa, M. A New Entropy Power Inequality. IEEE Trans. Inf. Theory
**1985**, 31, 751–760. [Google Scholar] [CrossRef] - Dembo, A. Simple Proof of the Concavity of the Entropy Power with Respect to Added Gaussian Noise. IEEE Trans. Inf. Theory
**1989**, 35, 887–888. [Google Scholar] [CrossRef] - Villani, C. A Short Proof of the “Concavity of Entropy Power”. IEEE Trans. Inf. Theory
**2000**, 46, 1695–1696. [Google Scholar] [CrossRef] [Green Version] - Toscani, G. Heat Equation and Convolution Inequalities. Milan J. Math.
**2014**, 82, 183–212. [Google Scholar] [CrossRef] - Toscani, G. A Strengthened Entropy Power Inequality for Log-Concave Densities. IEEE Trans. Inf. Theory
**2015**, 61, 6550–6559. [Google Scholar] [CrossRef] [Green Version] - Ram, E.; Sason, I. On Rényi Entropy Power Inequalities. IEEE Trans. Inf. Theory
**2016**, 62, 6800–6815. [Google Scholar] [CrossRef] [Green Version] - Bobkov, S.G.; Marsiglietti, A. Variants of the Entropy Power Inequality. IEEE Trans. Inf. Theory
**2017**, 63, 7747–7752. [Google Scholar] [CrossRef] - Savaré, G.; Toscani, G. The Concavity of Rényi Entropy Power. IEEE Trans. Inf. Theory
**2014**, 60, 2687–2693. [Google Scholar] [CrossRef] [Green Version] - Zozor, S.; Puertas-Centeno, D.; Dehesa, J.S. On Generalized Stam Inequalities and Fisher–Rényi Complexity Measures. Entropy
**2017**, 19, 493. [Google Scholar] [CrossRef] [Green Version] - Rioul, O. Yet Another Proof of the Entropy Power Inequality. IEEE Trans. Inf. Theory
**2017**, 63, 3595–3599. [Google Scholar] [CrossRef] [Green Version] - Rosenblatt, M. Remarks on Some Nonparametric Estimates of a Density Function. Ann. Math. Stat.
**1956**, 27, 832–837. [Google Scholar] [CrossRef] - Parzen, E. On Estimation of a Probability Density Function and Mode. Ann. Math. Stat.
**1962**, 33, 1065–1076. [Google Scholar] [CrossRef] - Beirlant, J.; Dudewicz, E.J.; Györfi, L.; van der Meulen, E.C. Nonparametric Entropy Estimation: An Overview. Int. J. Math. Stat. Sci.
**1997**, 6, 17–39. [Google Scholar] - Leonenko, N.; Pronzato, L.; Savani, V. A Class of Rényi Information Estimators for Multidimensional Densities. Ann. Stat.
**2008**, 36, 2153–2182. [Google Scholar] [CrossRef] - Johnson, N.L.; Kotz, S.; Balakrishnan, N. Continuous Univariate Distributions, 2nd ed.; John Wiley & Sons: New York, NY, USA, 1995; Volume 1. [Google Scholar]
- Corless, R.M.; Gonnet, G.H.; Hare, D.E.G.; Jeffrey, D.J.; Knuth, D.E. On the Lambert W Function. Adv. Comput. Math.
**1996**, 5, 329–359. [Google Scholar] [CrossRef] - Abramowitz, M.; Stegun, I.A. Handbook of Mathematical Functions with Formulas, Graphs, and Mathematical Tables, 9th ed.; Dover: New York, NY, USA, 1970. [Google Scholar]
- Gradshteyn, I.S.; Ryzhik, I.M. Table of Integrals, Series, and Products, 8th ed.; Academic Press: San Diego, CA, USA, 2015. [Google Scholar]
- Alzahrani, F.; Salem, A. Sharp bounds for the Lambert W function. Integral Transform. Spec. Funct.
**2018**, 29, 971–978. [Google Scholar] [CrossRef]

Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.

© 2021 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).

## Share and Cite

**MDPI and ACS Style**

Zozor, S.; Bercher, J.-F.
*ϕ*-Informational Measures: Some Results and Interrelations. *Entropy* **2021**, *23*, 911.
https://doi.org/10.3390/e23070911
