Skewed Jensen–Fisher Divergence and Its Bounds
Abstract
1. Introduction
2. Definition via Jensen–Shannon Divergence
3. Definition via Relative Fisher Information
Some Background on Relative Fisher Information
4. A Bound by Skewed Jensen–Shannon Divergence
5. A Lower Bound by Variational Distance
6. Summary and Discussion
Funding
Institutional Review Board Statement
Informed Consent Statement
Acknowledgments
Conflicts of Interest
© 2021 by the author. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Yamano, T. Skewed Jensen–Fisher Divergence and Its Bounds. Foundations 2021, 1, 256-264. https://doi.org/10.3390/foundations1020018