
# The entropy of a mixture of probability distributions

Received: 13 September 2004 / Accepted: 20 January 2005 / Published: 20 January 2005

Abstract: If a message can have n different values and all values are equally probable, then the entropy of the message is log(n). In the present paper, we investigate the expectation value of the entropy for an arbitrary probability distribution. For that purpose, we apply mixed probability distributions. The mixing distribution is represented by a point on an infinite-dimensional hypersphere in Hilbert space. During an `arbitrary' calculation, this mixing distribution tends to become uniform over a flat probability space of ever-decreasing dimensionality. Once such a smeared-out mixing distribution is established, subsequent computing steps introduce an entropy loss expected to equal $\frac{1}{m+1} + \frac{1}{m+2} + \cdots + \frac{1}{n}$, where n is the number of possible inputs and m the number of possible outcomes of the computation.
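The expected entropy loss stated in the abstract is the tail of a harmonic series, $\frac{1}{m+1} + \cdots + \frac{1}{n}$. A minimal sketch of evaluating it exactly (the function name `expected_entropy_loss` is ours, not the paper's; the abstract does not fix the logarithm base, so we simply report the sum itself):

```python
from fractions import Fraction

def expected_entropy_loss(n, m):
    """Tail of the harmonic series from the abstract:
    1/(m+1) + 1/(m+2) + ... + 1/n,
    for a computation mapping n possible inputs to m possible
    outcomes (m <= n). Computed exactly with rational arithmetic."""
    return sum(Fraction(1, k) for k in range(m + 1, n + 1))

# Example: n = 4 possible inputs reduced to m = 2 possible outcomes.
loss = expected_entropy_loss(4, 2)
print(loss, "=", float(loss))  # 1/3 + 1/4 = 7/12
```

When m equals n (no reduction in the number of outcomes), the sum is empty and the expected loss is zero, consistent with a lossless computing step.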

*This is an open access article distributed under the Creative Commons Attribution License which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.*


**MDPI and ACS Style**

De Vos, A. The entropy of a mixture of probability distributions. *Entropy* **2005**, *7*, 15-37.

**AMA Style**

De Vos A. The entropy of a mixture of probability distributions. *Entropy*. 2005; 7(1):15-37.

**Chicago/Turabian Style**

De Vos, Alexis. 2005. "The entropy of a mixture of probability distributions." *Entropy* 7, no. 1: 15-37.