Article

A Neural Network MCMC Sampler That Maximizes Proposal Entropy

by Zengyi Li 1,2,*, Yubei Chen 1,3 and Friedrich T. Sommer 1,4,5
1 Redwood Center for Theoretical Neuroscience, Berkeley, CA 94720, USA
2 Department of Physics, University of California Berkeley, Berkeley, CA 94720, USA
3 Berkeley AI Research, University of California Berkeley, Berkeley, CA 94720, USA
4 Helen Wills Neuroscience Institute, University of California Berkeley, Berkeley, CA 94720, USA
5 Neuromorphic Computing Group, Intel Labs, 2200 Mission College Blvd., Santa Clara, CA 95054-1549, USA
* Author to whom correspondence should be addressed.
Academic Editor: Philip Broadbridge
Entropy 2021, 23(3), 269; https://doi.org/10.3390/e23030269
Received: 10 February 2021 / Revised: 17 February 2021 / Accepted: 22 February 2021 / Published: 25 February 2021
Abstract: Markov Chain Monte Carlo (MCMC) methods sample from unnormalized probability distributions and offer guarantees of exact sampling. However, in the continuous case, unfavorable geometry of the target distribution can greatly limit the efficiency of MCMC methods. Augmenting samplers with neural networks can potentially improve their efficiency. Previous neural-network-based samplers were trained with objectives that either did not explicitly encourage exploration, or contained a term that encouraged exploration but only for well-structured distributions. Here, we propose maximizing proposal entropy to adapt the proposal to target distributions of any shape. To optimize proposal entropy directly, we devised a neural network MCMC sampler with a flexible and tractable proposal distribution. Specifically, our network architecture utilizes the gradient of the target distribution for generating proposals. Our model achieved significantly higher efficiency than previous neural network MCMC techniques in a variety of sampling tasks, sometimes by more than an order of magnitude. Further, we demonstrated the sampler by training a convergent energy-based model of natural images. The adaptive sampler achieved unbiased sampling with significantly higher proposal entropy than a Langevin dynamics sampler. The trained sampler also achieved better sample quality.
Keywords: MCMC; neural network sampler; maximum entropy; energy-based model
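
The abstract combines three ingredients that a short sketch can make concrete: gradient-informed proposals, the Metropolis-Hastings correction that keeps sampling exact, and proposal entropy as the quantity being maximized. The code below is not the paper's neural sampler; it is a minimal Langevin dynamics (MALA) baseline of the kind the paper compares against, run on a hypothetical ill-conditioned 2-D Gaussian target. The target `log_p`, the step size, and all helper names are illustrative assumptions; in the paper's method, a neural network parameterizes the proposal and is trained to maximize the entropy term that is computed here in closed form.

```python
import numpy as np

# Hypothetical toy target: unnormalized log-density of an
# ill-conditioned 2-D Gaussian (variances 1.0 and 0.01), the kind of
# unfavorable geometry that limits a fixed-step sampler.
def log_p(x):
    return -0.5 * (x[0] ** 2 / 1.0 + x[1] ** 2 / 0.01)

def grad_log_p(x):
    return np.array([-x[0] / 1.0, -x[1] / 0.01])

def mala_step(x, step, rng):
    """One Metropolis-adjusted Langevin step: a Gaussian proposal whose
    mean is shifted by the target gradient, followed by a
    Metropolis-Hastings accept/reject that keeps the chain exact."""
    mean_fwd = x + step * grad_log_p(x)
    y = mean_fwd + np.sqrt(2.0 * step) * rng.standard_normal(x.shape)
    mean_rev = y + step * grad_log_p(y)
    # Log-densities of the asymmetric proposal, q(y|x) and q(x|y),
    # up to a shared normalizing constant that cancels in the ratio.
    log_q_fwd = -np.sum((y - mean_fwd) ** 2) / (4.0 * step)
    log_q_rev = -np.sum((x - mean_rev) ** 2) / (4.0 * step)
    log_alpha = log_p(y) - log_p(x) + log_q_rev - log_q_fwd
    accept = np.log(rng.uniform()) < log_alpha
    return (y if accept else x), accept

def proposal_entropy(step, d):
    """Differential entropy of the Gaussian proposal N(mean, 2*step*I).
    For this fixed proposal it is set entirely by the scalar step size;
    the paper's objective instead trains a network to maximize it."""
    return 0.5 * d * np.log(2.0 * np.pi * np.e * 2.0 * step)

rng = np.random.default_rng(0)
x, n_accept = np.zeros(2), 0
for _ in range(5000):
    x, accepted = mala_step(x, step=0.005, rng=rng)
    n_accept += accepted
print(f"acceptance rate: {n_accept / 5000:.2f}")
print(f"proposal entropy: {proposal_entropy(0.005, d=2):.2f} nats")
```

The tension the paper addresses is visible in this baseline: shrinking the step size raises the acceptance rate but lowers the proposal entropy, so the chain explores the wide direction of the target slowly. Training the proposal to maximize its entropy, as the paper proposes, shapes it to the local geometry so that entropy can stay high without sacrificing acceptance.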
MDPI and ACS Style

Li, Z.; Chen, Y.; Sommer, F.T. A Neural Network MCMC Sampler That Maximizes Proposal Entropy. Entropy 2021, 23, 269. https://doi.org/10.3390/e23030269

AMA Style

Li Z, Chen Y, Sommer FT. A Neural Network MCMC Sampler That Maximizes Proposal Entropy. Entropy. 2021; 23(3):269. https://doi.org/10.3390/e23030269

Chicago/Turabian Style

Li, Zengyi, Yubei Chen, and Friedrich T. Sommer. 2021. "A Neural Network MCMC Sampler That Maximizes Proposal Entropy" Entropy 23, no. 3: 269. https://doi.org/10.3390/e23030269
