# Entropic Updating of Probabilities and Density Matrices

## Abstract


## 1. Introduction

## 2. The Design of Entropic Inference

We define **information** operationally $(\ast)$ as the rationale that causes a probability distribution to change (inspired by and adapted from [7]). Directly from [7]:

Our goal is to design a method that allows a systematic search for the preferred posterior distribution. The central idea, first proposed in [4], is disarmingly simple: to select the posterior, first rank all candidate distributions in increasing order of preference and then pick the distribution that ranks the highest. Irrespective of what it is that makes one distribution preferable over another (we will get to that soon enough), it is clear that any ranking according to preference must be transitive: if distribution ${\rho}_{1}$ is preferred over distribution ${\rho}_{2}$, and ${\rho}_{2}$ is preferred over ${\rho}_{3}$, then ${\rho}_{1}$ is preferred over ${\rho}_{3}$. Such transitive rankings are implemented by assigning to each $\rho (x)$ a real number $S[\rho ]$, which is called the entropy of $\rho $, in such a way that if ${\rho}_{1}$ is preferred over ${\rho}_{2}$, then $S[{\rho}_{1}]>S[{\rho}_{2}]$. The selected distribution (one or possibly many, for there may be several equally preferred distributions) is that which maximizes the entropy functional.
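The ranking idea quoted above can be sketched numerically. The functional below is the standard relative entropy that the design criteria eventually single out; the prior and candidate distributions are toy values chosen for illustration, not taken from the paper.

```python
import numpy as np

def relative_entropy(rho, phi):
    """Entropy functional S[rho, phi] = -sum_x rho(x) ln(rho(x)/phi(x)).

    Assigns a real number to each candidate rho so that the preference
    ranking is transitive; higher S means more preferred."""
    rho, phi = np.asarray(rho, float), np.asarray(phi, float)
    mask = rho > 0  # 0 * ln 0 -> 0 by the usual convention
    return -np.sum(rho[mask] * np.log(rho[mask] / phi[mask]))

prior = np.array([0.25, 0.25, 0.25, 0.25])  # phi(x)
candidates = [
    np.array([0.40, 0.30, 0.20, 0.10]),
    np.array([0.25, 0.25, 0.25, 0.25]),  # identical to the prior
    np.array([0.70, 0.10, 0.10, 0.10]),
]
# Select the candidate that maximizes the entropy functional.
best = max(candidates, key=lambda r: relative_entropy(r, prior))
```

With no constraints at all, the selected distribution is the prior itself, since $S[\rho,\phi]\le 0$ with equality only at $\rho=\phi$: no information, no change.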

**The Principle of Minimal Updating (PMU):** A probability distribution should only be updated to the extent required by the new information.

**Subdomain Independence:** When information is received about one set of propositions, it should not affect or change the state of knowledge (probability distribution) of the other propositions (otherwise, information would also have been received about them).

**Subsystem Independence:** When two systems are a priori believed to be independent and we only receive information about one, then the state of knowledge of the other system remains unchanged.

#### The Design Criteria and the Standard Relative Entropy

**Case 1:** We receive the extremely constraining information that the posterior distribution for system 1 is completely specified to be $P_1(x_1)$, while we receive no information at all about system 2. We treat the two systems jointly. Maximize the joint entropy $S[\rho(x_1,x_2),\phi(x_1)\phi(x_2)]$ subject to the following constraints on $\rho(x_1,x_2)$:

**Case 1—Conclusion:** When system 2 is not updated, the dependence on $\phi_2$ and $x_2$ drops out,
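Case 1's conclusion can be checked with a small numerical example: among candidate posteriors satisfying only the constraint on the $x_1$ marginal, the one that leaves system 2's distribution at $\phi_2$ outranks one that needlessly updates it. The prior, constraint, and candidates here are illustrative toy values.

```python
import numpy as np

def S(rho, phi):
    # Standard relative entropy S[rho, phi] = -sum rho ln(rho/phi).
    mask = rho > 0
    return -np.sum(rho[mask] * np.log(rho[mask] / phi[mask]))

# Independent prior phi(x1) phi(x2) on a 2 x 3 joint space.
phi1 = np.array([0.5, 0.5])
phi2 = np.array([0.2, 0.3, 0.5])
prior = np.outer(phi1, phi2)

# Constraint: the posterior x1 marginal is fixed to P1; nothing on x2.
P1 = np.array([0.8, 0.2])

untouched = np.outer(P1, phi2)                    # system 2 left alone
skewed = np.outer(P1, np.array([0.5, 0.3, 0.2]))  # system 2 also updated
best = max([untouched, skewed], key=lambda r: S(r, prior))
```

Both candidates satisfy the marginal constraint (`rho.sum(axis=1) == P1`), yet the maximizer is `untouched`, i.e., the posterior takes the form $P_1(x_1)\phi_2(x_2)$ and system 2 is unaffected.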

**Case 2:** Now consider a different special case in which the marginal posterior distributions for systems 1 and 2 are both completely specified to be $P_1(x_1)$ and $P_2(x_2)$, respectively. Maximize the joint entropy $S[\rho(x_1,x_2),\phi(x_1)\phi(x_2)]$ subject to the following constraints on $\rho(x_1,x_2)$:

**Case 2—Conclusion:** Substituting back into (25) gives us a functional equation for $\varphi$,

## 3. The Design of the Quantum Relative Entropy

#### 3.1. Designing the Quantum Relative Entropy

**Case 1:** We receive the extremely constraining information that the posterior distribution for system 1 is completely specified to be $\widehat{\rho}_1$, while we receive no information at all about system 2. We treat the two systems jointly. Maximize the joint entropy $S[\widehat{\rho}_{12},\widehat{\phi}_1\otimes\widehat{\phi}_2]$, subject to the following constraints on $\widehat{\rho}_{12}$,

**Case 1—Conclusion:** The analysis leads us to conclude that when system 2 is not updated, the dependence on $\widehat{\phi}_2$ drops out,
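The quantum analogue can be illustrated with the Umegaki relative entropy, $S[\widehat{\rho},\widehat{\phi}] = -\mathrm{Tr}[\widehat{\rho}(\ln\widehat{\rho} - \ln\widehat{\phi})]$, which is the form the design procedure ultimately selects. The function name and toy full-rank states below are assumptions for illustration; `scipy.linalg.logm` requires non-singular matrices.

```python
import numpy as np
from scipy.linalg import logm

def quantum_relative_entropy(rho, phi):
    """Umegaki form S[rho, phi] = -Tr[rho (ln rho - ln phi)].

    Both density matrices must be full rank here, since the matrix
    logarithm of a singular matrix is undefined."""
    return -np.trace(rho @ (logm(rho) - logm(phi))).real

# Independent prior phi1 (x) phi2 for two qubits (toy full-rank states).
phi1 = np.diag([0.6, 0.4])
phi2 = np.diag([0.7, 0.3])
prior = np.kron(phi1, phi2)

# Case 1 information: the posterior of system 1 is fixed to rho1.
rho1 = np.diag([0.9, 0.1])

untouched = np.kron(rho1, phi2)              # system 2 left alone
skewed = np.kron(rho1, np.diag([0.5, 0.5]))  # system 2 also updated
best = max([untouched, skewed],
           key=lambda r: quantum_relative_entropy(r, prior))
```

As in the classical case, the maximizer is `untouched`, i.e., $\widehat{\rho}_{12} = \widehat{\rho}_1\otimes\widehat{\phi}_2$: the state of system 2 remains unchanged.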

**Case 2:** Now consider a different special case in which the marginal posterior distributions for systems 1 and 2 are both completely specified to be $\widehat{\rho}_1$ and $\widehat{\rho}_2$, respectively. Maximize the joint entropy, $S[\widehat{\rho}_{12},\widehat{\phi}_1\otimes\widehat{\phi}_2]$, subject to the following constraints on $\widehat{\rho}_{12}$,

**Case 2—Conclusion:** Substituting back into (59) gives us a functional equation for $\varphi$,

#### 3.2. Remarks

## 4. Conclusions

## Acknowledgments

## Conflicts of Interest

## Appendix A

#### Appendix A.1. Simple Functional Equations

**Theorem A1.**

**Proof.**

**Theorem A2.**

**Theorem A3.**

#### Appendix A.2. Functional Equations with Multiple Arguments

#### Appendix A.3. Relative Entropy

#### Appendix A.4. Matrix Functional Equations

#### Appendix A.5. Quantum Relative Entropy

## Appendix B. Spin Example

**Proof.**

## References

- Shore, J.E.; Johnson, R.W. Axiomatic derivation of the Principle of Maximum Entropy and the Principle of Minimum Cross-Entropy. IEEE Trans. Inf. Theory **1980**, 26, 26–37.
- Shore, J.E.; Johnson, R.W. Properties of Cross-Entropy Minimization. IEEE Trans. Inf. Theory **1981**, 27, 472–482.
- Csiszár, I. Why least squares and maximum entropy: An axiomatic approach to inference for linear inverse problems. Ann. Stat. **1991**, 19, 2032.
- Skilling, J. The Axioms of Maximum Entropy. In Maximum-Entropy and Bayesian Methods in Science and Engineering; Erickson, G.J., Smith, C.R., Eds.; Kluwer Academic Publishers: Dordrecht, The Netherlands, 1988.
- Skilling, J. Classic Maximum Entropy. In Maximum-Entropy and Bayesian Methods in Science and Engineering; Kluwer Academic Publishers: Dordrecht, The Netherlands, 1988.
- Skilling, J. Quantified Maximum Entropy. In Maximum-Entropy and Bayesian Methods in Science and Engineering; Fougère, P.F., Ed.; Kluwer Academic Publishers: Dordrecht, The Netherlands, 1990.
- Caticha, A. Entropic Inference and the Foundations of Physics (Monograph Commissioned by the 11th Brazilian Meeting on Bayesian Statistics—EBEB-2012). Available online: http://www.albany.edu/physics/ACaticha-EIFP-book.pdf (accessed on 30 November 2017).
- Hiai, F.; Petz, D. The Proper Formula for Relative Entropy and its Asymptotics in Quantum Probability. Commun. Math. Phys. **1991**, 143, 99–114.
- Petz, D. Characterization of the Relative Entropy of States of Matrix Algebras. Acta Math. Hung. **1992**, 59, 449–455.
- Ohya, M.; Petz, D. Quantum Entropy and Its Use; Springer: New York, NY, USA, 1993; ISBN 0-387-54881-5.
- Wilming, H.; Gallego, R.; Eisert, J. Axiomatic Characterization of the Quantum Relative Entropy and Free Energy. Entropy **2017**, 19, 241.
- Jaynes, E.T. Information Theory and Statistical Mechanics. Phys. Rev. **1957**, 106, 620–630.
- Jaynes, E.T. Probability Theory: The Logic of Science; Cambridge University Press: Cambridge, UK, 2003.
- Jaynes, E.T. Information Theory and Statistical Mechanics II. Phys. Rev. **1957**, 108, 171–190.
- Balian, R.; Vénéroni, M. Incomplete descriptions, relevant information, and entropy production in collision processes. Ann. Phys. **1987**, 174, 229–244.
- Balian, R.; Balazs, N.L. Equiprobability, inference and entropy in quantum theory. Ann. Phys. **1987**, 179, 97–144.
- Balian, R. Justification of the Maximum Entropy Criterion in Quantum Mechanics. In Maximum Entropy and Bayesian Methods; Skilling, J., Ed.; Kluwer Academic Publishers: Dordrecht, The Netherlands, 1989; pp. 123–129.
- Balian, R. On the principles of quantum mechanics. Am. J. Phys. **1989**, 57, 1019–1027.
- Balian, R. Gain of information in a quantum measurement. Eur. J. Phys. **1989**, 10, 208–213.
- Balian, R. Incomplete descriptions and relevant entropies. Am. J. Phys. **1999**, 67, 1078–1090.
- Blankenbecler, R.; Partovi, H. Uncertainty, Entropy, and the Statistical Mechanics of Microscopic Systems. Phys. Rev. Lett. **1985**, 54, 373–376.
- Blankenbecler, R.; Partovi, H. Quantum Density Matrix and Entropic Uncertainty. In Proceedings of the Fifth Workshop on Maximum Entropy and Bayesian Methods in Applied Statistics, Laramie, WY, USA, 5–8 August 1985.
- Von Neumann, J. Mathematische Grundlagen der Quantenmechanik; Springer: Berlin, Germany, 1932; English translation: Mathematical Foundations of Quantum Mechanics; Princeton University Press: Princeton, NJ, USA, 1983.
- Ali, S.A.; Cafaro, C.; Giffin, A.; Lupo, C.; Mancini, S. On a Differential Geometric Viewpoint of Jaynes' Maxent Method and its Quantum Extension. AIP Conf. Proc. **2012**, 1443, 120–128.
- Caticha, A. Entropic Dynamics: Quantum Mechanics from Entropy and Information Geometry. Available online: https://arxiv.org/abs/1711.02538 (accessed on 30 November 2017).
- Reginatto, M.; Hall, M.J.W. Quantum-classical interactions and measurement: A consistent description using statistical ensembles on configuration space. J. Phys. Conf. Ser. **2009**, 174, 012038.
- Reginatto, M.; Hall, M.J.W. Information geometry, dynamics and discrete quantum mechanics. AIP Conf. Proc. **2013**, 1553, 246–253.
- Caves, C.; Fuchs, C.; Schack, R. Quantum probabilities as Bayesian probabilities. Phys. Rev. A **2002**, 65, 022305.
- Vanslette, K. The Quantum Bayes Rule and Generalizations from the Quantum Maximum Entropy Method. Available online: https://arxiv.org/abs/1710.10949 (accessed on 30 November 2017).
- Schack, R.; Brun, T.; Caves, C. Quantum Bayes rule. Phys. Rev. A **2001**, 64, 014305.
- Korotkov, A. Continuous quantum measurement of a double dot. Phys. Rev. B **1999**, 60, 5737–5742.
- Korotkov, A. Selective quantum evolution of a qubit state due to continuous measurement. Phys. Rev. B **2000**, 63, 115403.
- Jordan, A.; Korotkov, A. Qubit feedback and control with kicked quantum nondemolition measurements: A quantum Bayesian analysis. Phys. Rev. B **2006**, 74, 085307.
- Hellmann, F.; Kamiński, W.; Kostecki, P. Quantum collapse rules from the maximum relative entropy principle. New J. Phys. **2016**, 18, 013022.
- Warmuth, M. A Bayes Rule for Density Matrices. In Advances in Neural Information Processing Systems 18, Proceedings of the Neural Information Processing Systems Conference, Montréal, QC, Canada, 7–12 December 2005; Neural Information Processing Systems Foundation, Inc.: La Jolla, CA, USA, 2015.
- Warmuth, M.; Kuzmin, D. A Bayesian Probability Calculus for Density Matrices. Mach. Learn. **2010**, 78, 63–101.
- Tsuda, K. Machine learning with quantum relative entropy. J. Phys. Conf. Ser. **2009**, 143, 012021.
- Giffin, A.; Caticha, A. Updating Probabilities. Presented at the 26th International Workshop on Bayesian Inference and Maximum Entropy Methods (MaxEnt 2006), Paris, France, 8–13 July 2006.
- Wang, Z.; Busemeyer, J.; Atmanspacher, H.; Pothos, E. The Potential of Using Quantum Theory to Build Models of Cognition. Top. Cogn. Sci. **2013**, 5, 672–688.
- Giffin, A. Maximum Entropy: The Universal Method for Inference. Ph.D. Thesis, University at Albany (SUNY), Albany, NY, USA, 2008.
- Caticha, A. Toward an Informational Pragmatic Realism. Minds Mach. **2014**, 24, 37–70.
- Umegaki, H. Conditional expectation in an operator algebra, IV (entropy and information). Kōdai Math. Sem. Rep. **1962**, 14, 59–85.
- Uhlmann, A. Relative entropy and the Wigner-Yanase-Dyson-Lieb concavity in an interpolation theory. Commun. Math. Phys. **1977**, 54, 21–32.
- Schumacher, B.; Westmoreland, M. Relative entropy in quantum information theory. In Proceedings of the AMS Special Session on Quantum Information and Computation, Washington, DC, USA, 19–21 January 2000.
- Suzuki, M. On the Convergence of Exponential Operators—The Zassenhaus Formula, BCH Formula and Systematic Approximants. Commun. Math. Phys. **1977**, 57, 193–200.
- Horn, A. Eigenvalues of sums of Hermitian matrices. Pac. J. Math. **1962**, 12, 225–241.
- Bhatia, R. Linear Algebra to Quantum Cohomology: The Story of Alfred Horn's Inequalities. Am. Math. Mon. **2001**, 108, 289–318.
- Knutson, A.; Tao, T. Honeycombs and Sums of Hermitian Matrices. Not. AMS **2001**, 48, 175–186.
- Aczél, J. Lectures on Functional Equations and Their Applications; Academic Press Inc.: New York, NY, USA, 1966; Volume 19, pp. 31–44, 141–145, 213–217, 301–302, 347–349.
- Darboux, G. Sur le théorème fondamental de la géométrie projective. Math. Ann. **1880**, 17, 55–61.

© 2017 by the author. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).

## Share and Cite

**MDPI and ACS Style**

Vanslette, K. Entropic Updating of Probabilities and Density Matrices. *Entropy* **2017**, *19*, 664.
https://doi.org/10.3390/e19120664
