A Definition of Conditional Probability with Non-Stochastic Information
Abstract
1. Introduction
1.1. Relationship to the Literature
1.2. Motivation
2. Results
2.1. The New Definition
1. The posterior takes the form p_l(dω) = ψ(l(ω), p(dω)) for some function ψ(·,·). Since l and p are all that are available and set, the update must depend solely on these functions. We are, hence, asking for the unique ψ which provides the update for all pairs (l, p), i.e., ψ is invariant. This is a reasonable requirement since how one updates should not depend on the particular (l, p).
2. If the piece of information I is made of two independent pieces of information, I₁ and I₂, in the sense that l(ω) = l₁(ω) + l₂(ω) for every ω in Ω, then the posterior probability satisfies p_{l₁+l₂} = (p_{l₁})_{l₂}. This ensures we end up with the same object whether we update with l₁ and l₂ together or one after the other.
3. If l(ω) = ∞ for every ω outside B and some set B, then p_l = (p_B)_{l_B}, where l_B is the restriction of l to B, the information encoded is that B certainly occurs or has occurred, and p_B is p restricted and normalized to B, i.e., p_B(A) = p(A ∩ B)/p(B). In other words, outcomes with infinite loss are disregarded in the updating.
4. Lower evidence (larger loss) for a state should yield smaller posterior probabilities under the same prior. So, if for some set A, l₁(ω) ≥ l₂(ω) for ω in A and l₁(ω) = l₂(ω) for ω outside A, then p_{l₁}(A) ≤ p_{l₂}(A).
5. If l(ω) = c for every ω, then p_l = p. That is, if the observation provides no information about ω, since the loss function is a constant, then the posterior is the same as the prior.
6. If l₁(ω) = l₂(ω) + c for some constant c, then p_{l₁} = p_{l₂}.
7. For any set B, (p_B)_l(dω) = (p_l)_B(dω) for every ω in B. This means that whether we update the prior probability restricted to set B, or update the prior probability and then restrict to set B, we obtain the same update.
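The update singled out by such properties is the exponential reweighting of the prior by the loss, p_l(dω) ∝ exp{−l(ω)} p(dω). A minimal numerical sketch on a finite state space can check several of the properties above; the state space, prior, and loss values here are illustrative choices, not taken from the paper:

```python
import numpy as np

def update(prior, loss):
    """Exponential-loss update on a finite state space:
    p_l(ω) ∝ exp{-l(ω)} p(ω), renormalized to sum to one."""
    w = prior * np.exp(-loss)
    return w / w.sum()

# Illustrative prior over three states and two pieces of information as losses.
prior = np.array([0.5, 0.3, 0.2])
l1 = np.array([0.1, 1.0, 2.0])
l2 = np.array([0.4, 0.4, 0.4])  # constant loss: carries no information

# Property 2: updating with l1 + l2 at once equals updating sequentially.
assert np.allclose(update(prior, l1 + l2), update(update(prior, l1), l2))

# Property 3: infinite loss on a state removes it from the posterior,
# i.e., updating conditions on the set B of finite-loss states.
post = update(prior, np.array([0.5, np.inf, 0.3]))
assert post[1] == 0.0

# Properties 5 and 6: a constant loss leaves the prior unchanged, and
# adding a constant to a loss does not change the posterior.
assert np.allclose(update(prior, l2), prior)
assert np.allclose(update(prior, l1 + 3.0), update(prior, l1))
```

The renormalization in `update` is what makes properties 5 and 6 hold: a constant factor exp{−c} multiplies every state's weight and cancels when the weights are normalized.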
2.2. An Illustration
3. Discussion
Author Contributions
Funding
Acknowledgments
Conflicts of Interest
© 2018 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).
Share and Cite
Bissiri, P.G.; Walker, S.G. A Definition of Conditional Probability with Non-Stochastic Information. Entropy 2018, 20, 572. https://doi.org/10.3390/e20080572

