
Dynamical Sampling with Langevin Normalization Flows

by Minghao Gu 1, Shiliang Sun 1,2,* and Yan Liu 3
1 School of Computer Science and Technology, East China Normal University, 3663 North Zhongshan Road, Shanghai 200241, China
2 Shanghai Institute of Intelligent Science and Technology, Tongji University, Shanghai 201804, China
3 School of Data Science and Engineering, East China Normal University, 3663 North Zhongshan Road, Shanghai 200241, China
* Author to whom correspondence should be addressed.
Entropy 2019, 21(11), 1096; https://doi.org/10.3390/e21111096
Received: 7 October 2019 / Revised: 5 November 2019 / Accepted: 6 November 2019 / Published: 10 November 2019
In Bayesian machine learning, sampling methods provide asymptotically unbiased estimates for inference over complex probability distributions, and Markov chain Monte Carlo (MCMC) is one of the most popular of these methods. However, MCMC can produce highly autocorrelated samples or perform poorly on some complex distributions. In this paper, we introduce Langevin diffusions into normalization flows to construct a new dynamical sampling method. We propose a modified Kullback-Leibler divergence as the loss function for training the sampler, which ensures that samples generated by the proposed method converge to the target distribution. Because the gradient of the target distribution appears in the computation of the modified Kullback-Leibler divergence, its integral is intractable, so we approximate it with a Monte Carlo estimator. We also discuss the case where the target distribution is unnormalized. We illustrate the properties and performance of the proposed method on a variety of complex distributions and real datasets. The experiments indicate that the proposed method not only exploits the flexibility of neural networks but also inherits the rapid convergence of the dynamical system to the target distribution, and it demonstrates superior performance compared with dynamics-based MCMC samplers.
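To make the role of the Langevin dynamics concrete, the following is a minimal sketch of an unadjusted Langevin sampler together with a Monte Carlo estimate computed from its samples. It is not the paper's implementation: the standard-Gaussian target, step size, and chain length are illustrative assumptions, and the full method additionally composes these dynamics with trained normalization flows.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative target: standard 2-D Gaussian, for which the score
# (gradient of the log-density) has the simple closed form -x.
def log_p(x):
    return -0.5 * np.sum(x**2) - np.log(2 * np.pi)

def grad_log_p(x):
    return -x

def langevin_step(x, eps=0.1):
    """One unadjusted Langevin update: drift half a step along the
    score, then inject Gaussian noise scaled by sqrt(eps)."""
    return x + 0.5 * eps * grad_log_p(x) + np.sqrt(eps) * rng.normal(size=x.shape)

# Run a short chain, discard burn-in, and form a Monte Carlo
# estimate of an intractable expectation (here E_p[log p(x)]).
x = np.zeros(2)
samples = []
for _ in range(5000):
    x = langevin_step(x)
    samples.append(x.copy())
samples = np.array(samples[1000:])

mc_estimate = np.mean([log_p(s) for s in samples])
```

The same pattern (replace an intractable integral by an average over samples) is what the abstract refers to when approximating the modified Kullback-Leibler divergence, with the score of the target driving the dynamics.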
Keywords: normalization flows; Langevin diffusions; Langevin normalization flows; Monte Carlo sampling
MDPI and ACS Style

Gu, M.; Sun, S.; Liu, Y. Dynamical Sampling with Langevin Normalization Flows. Entropy 2019, 21, 1096.

Note that from the first issue of 2016, MDPI journals use article numbers instead of page numbers.
