Open Access Article

The entropy of a mixture of probability distributions

A. De Vos
Imec v.z.w., Department of Electronics and Information Systems, Universiteit Gent, Sint-Pietersnieuwstraat 41, B-9000 Gent, Belgium
Entropy 2005, 7(1), 15-37; https://doi.org/10.3390/e7010015
Received: 13 September 2004 / Accepted: 20 January 2005 / Published: 20 January 2005
If a message can have n different values and all values are equally probable, then the entropy of the message is log(n). In the present paper, we investigate the expectation value of the entropy for an arbitrary probability distribution. For that purpose, we apply mixed probability distributions. The mixing distribution is represented by a point on an infinite-dimensional hypersphere in Hilbert space. During an 'arbitrary' calculation, this mixing distribution tends to become uniform over a flat probability space of ever-decreasing dimensionality. Once such a smeared-out mixing distribution is established, subsequent computing steps introduce an entropy loss expected to equal $\frac{1}{m+1} + \frac{1}{m+2} + \dots + \frac{1}{n}$, where n is the number of possible inputs and m the number of possible outcomes of the computation.
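As a quick numerical illustration of the closing formula (not taken from the paper itself), the sketch below compares the partial harmonic sum $\frac{1}{m+1} + \dots + \frac{1}{n}$ with the entropy loss obtained when the mixing distribution is assumed to be uniform over the probability simplex (a flat Dirichlet distribution) and entropy is measured in natural-log units; that reading of the "smeared-out" mixing distribution, and all function names, are assumptions made for this example.

```python
import numpy as np
from scipy.special import digamma, xlogy

def expected_entropy_loss(n, m):
    """Partial harmonic sum 1/(m+1) + 1/(m+2) + ... + 1/n quoted in the abstract (nats)."""
    return sum(1.0 / k for k in range(m + 1, n + 1))

def mean_entropy_flat_mixture(dim, samples=200_000, seed=0):
    """Monte Carlo mean of the Shannon entropy (natural log) of probability
    vectors drawn uniformly from the (dim-1)-dimensional simplex."""
    rng = np.random.default_rng(seed)
    p = rng.dirichlet(np.ones(dim), size=samples)
    return float(np.mean(-xlogy(p, p).sum(axis=1)))

n, m = 8, 3  # n possible inputs, m possible outcomes of the computation
loss_formula = expected_entropy_loss(n, m)
# Closed form for the uniform-simplex mean entropy is psi(dim + 1) - psi(2),
# so the expected loss reduces to psi(n + 1) - psi(m + 1).
loss_digamma = digamma(n + 1) - digamma(m + 1)
loss_sampled = mean_entropy_flat_mixture(n) - mean_entropy_flat_mixture(m)
print(loss_formula, loss_digamma, loss_sampled)  # all approximately 0.8845
```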
Keywords: probability distribution; mixture distribution; Bhattacharyya space; Hilbert space
MDPI and ACS Style

De Vos, A. The entropy of a mixture of probability distributions. Entropy 2005, 7, 15-37.

