Computing Entropies with Nested Sampling
Abstract: The Shannon entropy, and related quantities such as mutual information, can be used to quantify uncertainty and relevance. However, in practice, it can be difficult to compute these quantities for arbitrary probability distributions, particularly if the probability mass functions or densities cannot be evaluated. This paper introduces a computational approach, based on Nested Sampling, to evaluate entropies of probability distributions that can only be sampled. I demonstrate the method on three examples: (i) a simple Gaussian example where the key quantities are available analytically; (ii) an experimental design example about scheduling observations in order to measure the period of an oscillating signal; and (iii) predicting the future from the past in a heavy-tailed scenario.
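As context for the Gaussian example, the following sketch (not the paper's Nested Sampling algorithm) shows the baseline it is checked against: when the density can be evaluated, the Shannon entropy H = -E[log p(x)] of a standard normal can be estimated by plain Monte Carlo and compared with the analytic value ½ ln(2πe). Function names and parameters here are illustrative assumptions.

```python
import math
import random

def gaussian_entropy_mc(sigma=1.0, n=200_000, seed=0):
    """Monte Carlo estimate of H = -E[log p(x)] for x ~ N(0, sigma^2).

    This only works because the Gaussian density is available; the paper's
    point is to handle distributions that can only be sampled.
    """
    rng = random.Random(seed)
    log_norm = -0.5 * math.log(2 * math.pi * sigma ** 2)
    total = 0.0
    for _ in range(n):
        x = rng.gauss(0.0, sigma)
        total += log_norm - 0.5 * (x / sigma) ** 2  # log p(x)
    return -total / n

# Analytic entropy of N(0, 1): 0.5 * ln(2 * pi * e) ~= 1.4189 nats
analytic = 0.5 * math.log(2 * math.pi * math.e)
estimate = gaussian_entropy_mc()
```

With 200,000 samples the Monte Carlo standard error is roughly 0.0016 nats, so the estimate should agree with the analytic value to about two decimal places.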
Cite This Article
Brewer, B.J. Computing Entropies with Nested Sampling. Entropy 2017, 19, 422.