Open Access Article
Entropy 2017, 19(8), 422; doi:10.3390/e19080422

Computing Entropies with Nested Sampling

Brendon J. Brewer
Department of Statistics, The University of Auckland, Auckland 1142, New Zealand
Received: 12 July 2017 / Revised: 14 August 2017 / Accepted: 16 August 2017 / Published: 18 August 2017
(This article belongs to the Special Issue Maximum Entropy and Bayesian Methods)

Abstract

The Shannon entropy, and related quantities such as mutual information, can be used to quantify uncertainty and relevance. However, in practice, it can be difficult to compute these quantities for arbitrary probability distributions, particularly if the probability mass functions or densities cannot be evaluated. This paper introduces a computational approach, based on Nested Sampling, to evaluate entropies of probability distributions that can only be sampled. I demonstrate the method on three examples: (i) a simple Gaussian example where the key quantities are available analytically; (ii) an experimental design example about scheduling observations in order to measure the period of an oscillating signal; and (iii) predicting the future from the past in a heavy-tailed scenario.
Keywords: information theory; entropy; mutual information; Monte Carlo; nested sampling; Bayesian inference
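The paper's method builds on nested sampling, whose standard machinery can be sketched on a toy version of the Gaussian example (i) from the abstract. The sketch below is illustrative only, not the paper's algorithm: the uniform prior, unit-Gaussian likelihood, and direct constrained-prior sampling are all assumptions chosen so the analytic answers are available for comparison. It uses the "information" (prior-to-posterior KL divergence) that a nested-sampling run produces as a by-product to recover the posterior's differential entropy.

```python
import math
import random

random.seed(0)

# Toy setup (assumed, not from the paper): prior Uniform(-10, 10),
# likelihood equal to the N(0, 1) density. Analytically, the evidence is
# Z = 1/20 and the posterior is very nearly N(0, 1).
A, B = -10.0, 10.0

def log_like(x):
    return -0.5 * x * x - 0.5 * math.log(2.0 * math.pi)

def logaddexp(a, b):
    if a == -math.inf:
        return b
    if b == -math.inf:
        return a
    m = max(a, b)
    return m + math.log(math.exp(a - m) + math.exp(b - m))

n_live = 500
live = [random.uniform(A, B) for _ in range(n_live)]
logL = [log_like(x) for x in live]

log_Z = -math.inf   # running log-evidence
log_X = 0.0         # log of remaining prior mass
samples = []        # (log-likelihood, log-weight) of discarded points

for i in range(4000):
    worst = min(range(n_live), key=lambda j: logL[j])
    logL_min = logL[worst]
    # Deterministic compression X_i = exp(-i / n_live); the weight of the
    # discarded point is the width of the prior-mass shell it occupied.
    log_X_new = -(i + 1) / n_live
    log_w = log_X + math.log(1.0 - math.exp(log_X_new - log_X))
    samples.append((logL_min, log_w))
    log_Z = logaddexp(log_Z, logL_min + log_w)
    log_X = log_X_new
    # Replace the worst point with a fresh prior draw subject to L > L_min.
    # For this 1-D toy likelihood the constrained region is an interval,
    # so it can be sampled directly instead of via MCMC.
    t = math.sqrt(-2.0 * (logL_min + 0.5 * math.log(2.0 * math.pi)))
    t = min(t, B)
    live[worst] = random.uniform(-t, t)
    logL[worst] = log_like(live[worst])

# Fold in the surviving live points, each carrying an equal share of the
# remaining prior mass.
log_w_live = log_X - math.log(n_live)
for j in range(n_live):
    samples.append((logL[j], log_w_live))
    log_Z = logaddexp(log_Z, logL[j] + log_w_live)

# The information (KL from prior to posterior) is a standard by-product:
# info = sum over samples of p_i * log(L_i / Z), with p_i = L_i w_i / Z.
info = sum(math.exp(ll + lw - log_Z) * (ll - log_Z) for ll, lw in samples)

# For a uniform prior, posterior differential entropy = log(width) - info.
H = math.log(B - A) - info
print("log Z ~", round(log_Z, 3), "(analytic:", round(-math.log(B - A), 3), ")")
print("H ~", round(H, 3), "(analytic:", round(0.5 * math.log(2 * math.pi * math.e), 3), ")")
```

For this simple case the estimates land close to log Z = -log 20 and H = (1/2) log(2*pi*e); the paper's contribution is handling cases like (ii) and (iii), where no such analytic cross-check exists and the distribution can only be sampled.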
This is an open access article distributed under the Creative Commons Attribution License (CC BY 4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Share & Cite This Article

MDPI and ACS Style

Brewer, B.J. Computing Entropies with Nested Sampling. Entropy 2017, 19, 422.


Note that from the first issue of 2016, MDPI journals use article numbers instead of page numbers.

Entropy EISSN 1099-4300, published by MDPI AG, Basel, Switzerland