Special Issue "Approximate Bayesian Inference"
Deadline for manuscript submissions: closed (22 June 2021).
Long established in statistical inference, Bayesian methods are also becoming popular in machine learning and AI, where it is important for a system not only to predict well, but also to provide a quantification of the uncertainty of its predictions.
Traditionally, Bayesian estimators were implemented using Monte Carlo methods, such as the Metropolis–Hastings algorithm or the Gibbs sampler. These algorithms target the exact posterior distribution. However, many modern models in statistics are simply too complex for such methodologies. In machine learning, the volume of the data used in practice makes Monte Carlo methods too slow to be useful.
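To fix ideas, here is a minimal sketch of the random-walk Metropolis–Hastings algorithm on a toy one-dimensional target. The standard-normal "posterior" and all function names are illustrative choices, not part of any specific model discussed here; the point is only that the chain targets the exact posterior.

```python
import numpy as np

def log_target(x):
    # Unnormalized log-density of a toy one-dimensional "posterior":
    # a standard normal, so the exact answer is known.
    return -0.5 * x**2

def metropolis_hastings(n_samples, step=1.0, seed=0):
    """Random-walk Metropolis-Hastings: targets the exact posterior."""
    rng = np.random.default_rng(seed)
    x = 0.0
    samples = np.empty(n_samples)
    for i in range(n_samples):
        proposal = x + step * rng.normal()
        # Accept with probability min(1, pi(proposal) / pi(x)).
        if np.log(rng.uniform()) < log_target(proposal) - log_target(x):
            x = proposal
        samples[i] = x
    return samples

samples = metropolis_hastings(20000)
print(samples.mean(), samples.std())  # roughly 0 and 1
```

Each iteration requires evaluating the full log-likelihood, which is precisely what becomes prohibitive for large datasets.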
Motivated by these applications, many faster algorithms have recently been proposed that target an approximation of the posterior.
1) A first family of methods still relies on Monte Carlo simulations but targets an approximation of the posterior. For example, approximate versions of Metropolis–Hastings based on subsampling, or Langevin Monte Carlo methods, are extremely useful when the sample size or the dimension of the data is too large. Approximate Bayesian Computation (ABC) is useful when the model is generative, in the sense that it is easy to simulate from, even though its likelihood may be intractable.
2) Another interesting class of methods relies on optimization algorithms to approximate the posterior by a member of a tractable family of probability distributions: for example, variational approximations, Laplace approximations, the expectation propagation (EP) algorithm, etc.
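As a minimal illustration of the second, optimization-based family, the following sketch computes a Laplace approximation, a Gaussian centered at the posterior mode with covariance given by the inverse of the negative Hessian. The toy model (a Bernoulli likelihood with a single intercept parameter and a vague Gaussian prior) and all names are illustrative assumptions.

```python
import numpy as np

def laplace_approx(y, prior_var=100.0, iters=50):
    """Laplace approximation to the posterior of theta in the toy model
    y_i ~ Bernoulli(sigmoid(theta)), with a N(0, prior_var) prior."""
    n, s = len(y), y.sum()
    theta = 0.0
    for _ in range(iters):
        p = 1.0 / (1.0 + np.exp(-theta))
        grad = s - n * p - theta / prior_var        # d/dtheta log posterior
        hess = -n * p * (1 - p) - 1.0 / prior_var   # second derivative
        theta -= grad / hess                        # Newton step to the mode
    # Approximate posterior: N(mode, -1 / hessian at the mode).
    p = 1.0 / (1.0 + np.exp(-theta))
    var = 1.0 / (n * p * (1 - p) + 1.0 / prior_var)
    return theta, var

y = np.array([1] * 70 + [0] * 30)
mean, var = laplace_approx(y)
print(mean, var)
```

The whole computation is a deterministic optimization, so it is typically orders of magnitude faster than sampling; the price is that the true posterior is replaced by a single Gaussian.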
Of course, even though these algorithms are much faster than exact methods, it is extremely important to quantify what is lost in accuracy with respect to the exact posterior. For some of the previous methods, such results are still only partially available. Recent work established the very good scaling properties of Langevin Monte Carlo with respect to the dimension of the data. Another series of papers connected the question of the accuracy of variational approximations to the PAC–Bayes literature in machine learning and obtained convergence results.
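To make the Langevin methods mentioned above concrete, here is a sketch of the unadjusted Langevin algorithm (ULA) on a toy one-dimensional standard-normal target (an illustrative choice). Because there is no accept/reject correction, the chain targets an approximation of the posterior, and the discretization bias is exactly the kind of error the theoretical results above quantify.

```python
import numpy as np

def grad_log_target(x):
    # Gradient of the log-density of a toy target: a standard normal.
    return -x

def ula(n_samples, step=0.1, seed=0):
    """Unadjusted Langevin algorithm: a discretized Langevin diffusion.
    No Metropolis correction, so the stationary law has a bias of order
    `step` with respect to the exact target."""
    rng = np.random.default_rng(seed)
    x = 0.0
    out = np.empty(n_samples)
    for i in range(n_samples):
        x = x + 0.5 * step * grad_log_target(x) + np.sqrt(step) * rng.normal()
        out[i] = x
    return out

samples = ula(50000)
print(samples.mean(), samples.std())  # roughly 0 and 1, up to O(step) bias
```

Only gradient evaluations are needed, and these can in turn be replaced by cheap stochastic estimates on subsamples, which is what makes this family attractive at a large scale.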
The objective of this Special Issue is to provide the latest advances in approximate Monte Carlo methods and in approximations of the posterior: design of efficient algorithms, study of the statistical properties of these algorithms, and challenging applications.
Dr. Pierre Alquier
Manuscript Submission Information
Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, click here to go to the submission form. Manuscripts can be submitted until the deadline. All papers will be peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.
Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Entropy is an international peer-reviewed open access monthly journal published by MDPI.
Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 1800 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.
- Bayesian statistics
- variational approximations
- EP algorithm
- Langevin Monte Carlo
- Laplace approximations
- Approximate Bayesian Computation (ABC)
- Markov chain Monte Carlo (MCMC)