Entropy 2005, 7(1), 15-37; doi:10.3390/e7010015
Article

The entropy of a mixture of probability distributions

Alexis De Vos
Imec v.z.w., Department of Electronics and Information Systems, Universiteit Gent, Sint-Pietersnieuwstraat 41, B-9000 Gent, Belgium
Received: 13 September 2004 / Accepted: 20 January 2005 / Published: 20 January 2005
Abstract: If a message can have n different values and all values are equally probable, then the entropy of the message is log(n). In the present paper, we investigate the expectation value of the entropy for an arbitrary probability distribution. For that purpose, we apply mixed probability distributions. The mixing distribution is represented by a point on an infinite-dimensional hypersphere in Hilbert space. During an 'arbitrary' calculation, this mixing distribution has the tendency to become uniform over a flat probability space of ever-decreasing dimensionality. Once such a smeared-out mixing distribution is established, subsequent computing steps introduce an entropy loss expected to equal $\frac{1}{m+1} + \frac{1}{m+2} + \cdots + \frac{1}{n}$, where n is the number of possible inputs and m the number of possible outcomes of the computation.
Keywords: probability distribution; mixture distribution; Bhattacharyya space; Hilbert space
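
As a numerical illustration of the closing formula in the abstract, the following minimal Python sketch (not part of the paper; the function name expected_entropy_loss is our own) evaluates the expected entropy loss $\frac{1}{m+1} + \frac{1}{m+2} + \cdots + \frac{1}{n}$ exactly and compares it with log(n/m), the entropy drop of a uniform distribution narrowing from n to m equiprobable values:

from fractions import Fraction
import math

def expected_entropy_loss(n: int, m: int) -> float:
    """Expected entropy loss 1/(m+1) + 1/(m+2) + ... + 1/n,
    i.e. the harmonic-number difference H_n - H_m (in nats)."""
    return float(sum(Fraction(1, k) for k in range(m + 1, n + 1)))

# For n possible inputs and m possible outcomes, the expected loss
# is close to log(n/m), the entropy difference between a uniform
# distribution over n values and one over m values.
print(expected_entropy_loss(1000, 100))  # ~2.2980
print(math.log(1000 / 100))              # ~2.3026

With n = 1000 and m = 100 the sum is about 2.2980 while log(10) is about 2.3026, so the harmonic-number difference $H_n - H_m$ rapidly approaches log(n/m) as n and m grow.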

