Article

Minimum and Maximum Entropy Distributions for Binary Systems with Known Means and Pairwise Correlations

by Badr F. Albanna, Christopher Hillar, Jascha Sohl-Dickstein and Michael R. DeWeese

1 Department of Natural Sciences, Fordham University, New York, NY 10023, USA
2 Department of Physics, University of California, Berkeley, CA 94720, USA
3 Redwood Center for Theoretical Neuroscience, University of California, Berkeley, CA 94720, USA
4 Mathematical Sciences Research Institute, Berkeley, CA 94720, USA
5 Helen Wills Neuroscience Institute, University of California, Berkeley, CA 94720, USA
6 Biophysics Graduate Group, University of California, Berkeley, CA 94720, USA
7 Google Brain, Google, Mountain View, CA 94043, USA
* Author to whom correspondence should be addressed.
Entropy 2017, 19(8), 427; https://doi.org/10.3390/e19080427
Received: 27 June 2017 / Revised: 8 August 2017 / Accepted: 18 August 2017 / Published: 21 August 2017
(This article belongs to the Special Issue Thermodynamics of Information Processing)
Abstract: Maximum entropy models are increasingly used to describe the collective activity of neural populations with measured mean neural activities and pairwise correlations, but the full space of probability distributions consistent with these constraints has not been explored. We provide upper and lower bounds on the entropy of the minimum entropy distribution over arbitrarily large collections of binary units with any fixed set of mean values and pairwise correlations. We also construct specific low-entropy distributions for several relevant cases. Surprisingly, the minimum entropy solution has entropy that scales logarithmically with system size for any set of first- and second-order statistics consistent with arbitrarily large systems. We further demonstrate that some sets of these low-order statistics can only be realized by small systems. Our results show how only small amounts of randomness are needed to mimic the low-order statistical properties of highly entropic distributions, and we discuss some applications for engineered and biological information transmission systems.
Keywords: information theory; minimum entropy; maximum entropy; statistical mechanics; Ising model; pairwise correlations; compressed sensing; neural networks
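To make the abstract's setup concrete, the following is a minimal numerical sketch in Python (NumPy/SciPy), not code from the paper: it fixes the means and pairwise correlations of a tiny binary system, fits the maximum-entropy (Ising) distribution by minimizing the convex dual of the entropy-maximization problem, and searches for a low-entropy distribution with the same statistics by probing vertices of the constraint polytope, where a concave function like entropy attains its minimum. The system size n = 3, the target statistics (means 0.5, pairwise moments 0.3), and the vertex-sampling heuristic are all illustrative assumptions, not the paper's constructions.

```python
# Illustrative sketch (assumed targets, not from the paper): compare the
# maximum-entropy (Ising) distribution with a low-entropy distribution
# that satisfies the same means and pairwise correlations.
import itertools
import numpy as np
from scipy.optimize import minimize, linprog

n = 3                                                         # binary units (2^n states)
states = np.array(list(itertools.product([0, 1], repeat=n)))  # all 2^n patterns

# Feature map: first-order moments x_i and second-order moments x_i * x_j.
pairs = list(itertools.combinations(range(n), 2))
def features(x):
    return np.concatenate([x, [x[i] * x[j] for i, j in pairs]])

F = np.array([features(x) for x in states])   # shape (2^n, n + n(n-1)/2)

# Illustrative target statistics: means 0.5, pairwise moments 0.3.
target = np.concatenate([np.full(n, 0.5), np.full(len(pairs), 0.3)])

def entropy(p, eps=1e-12):
    p = np.clip(p, eps, 1.0)
    return -np.sum(p * np.log2(p))

# --- Maximum entropy: p(x) ∝ exp(theta . f(x)); minimizing the convex dual
# log Z(theta) - theta . target yields the moment-matching Ising solution. ---
def neg_dual(theta):
    logits = F @ theta
    logZ = np.log(np.sum(np.exp(logits - logits.max()))) + logits.max()
    return logZ - theta @ target

res = minimize(neg_dual, np.zeros(F.shape[1]), method="BFGS")
logits = F @ res.x
p_max = np.exp(logits - logits.max())
p_max /= p_max.sum()

# --- Low entropy: entropy is concave, so its minimum over the feasible
# polytope {p >= 0, F^T p = target, sum(p) = 1} sits at a vertex. Probe
# vertices via LPs with random costs and keep the lowest-entropy point
# found (a heuristic, not guaranteed to reach the global minimum). ---
A_eq = np.vstack([F.T, np.ones(len(states))])
b_eq = np.concatenate([target, [1.0]])
rng = np.random.default_rng(0)
best_H = np.inf
for _ in range(200):
    c = rng.normal(size=len(states))
    lp = linprog(c, A_eq=A_eq, b_eq=b_eq, bounds=(0, 1), method="highs")
    if lp.success:
        best_H = min(best_H, entropy(lp.x))

print(f"max-entropy solution: H = {entropy(p_max):.3f} bits")
print(f"lowest-entropy vertex found: H = {best_H:.3f} bits")
```

Even at this small scale, the gap between the two entropies illustrates the paper's theme: many distributions, some far less random than the Ising fit, share the same first- and second-order statistics. The vertex search is only a heuristic; at this scale, brute-force enumeration would be needed to certify the true minimum.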
MDPI and ACS Style

Albanna, B.F.; Hillar, C.; Sohl-Dickstein, J.; DeWeese, M.R. Minimum and Maximum Entropy Distributions for Binary Systems with Known Means and Pairwise Correlations. Entropy 2017, 19, 427. https://doi.org/10.3390/e19080427

AMA Style

Albanna BF, Hillar C, Sohl-Dickstein J, DeWeese MR. Minimum and Maximum Entropy Distributions for Binary Systems with Known Means and Pairwise Correlations. Entropy. 2017; 19(8):427. https://doi.org/10.3390/e19080427

Chicago/Turabian Style

Albanna, Badr F., Christopher Hillar, Jascha Sohl-Dickstein, and Michael R. DeWeese. 2017. "Minimum and Maximum Entropy Distributions for Binary Systems with Known Means and Pairwise Correlations." Entropy 19, no. 8: 427. https://doi.org/10.3390/e19080427

