Article

Extended Variational Message Passing for Automated Approximate Bayesian Inference

by Semih Akbayrak, Ivan Bocharov and Bert de Vries
1 Department of Electrical Engineering, Eindhoven University of Technology, P.O. Box 513, 5600MB Eindhoven, The Netherlands
2 GN Hearing BV, JF Kennedylaan 2, 5612AB Eindhoven, The Netherlands
* Author to whom correspondence should be addressed.
Current address: Department of Radiology and Nuclear Medicine, Erasmus MC, P.O. Box 2040, 3000CA Rotterdam, The Netherlands.
Academic Editors: Antonio Salmerón and Rafael Rumí
Entropy 2021, 23(7), 815; https://doi.org/10.3390/e23070815
Received: 18 May 2021 / Revised: 22 June 2021 / Accepted: 23 June 2021 / Published: 26 June 2021
(This article belongs to the Special Issue Bayesian Inference in Probabilistic Graphical Models)
Variational Message Passing (VMP) provides an automatable and efficient algorithmic framework for approximate Bayesian inference in factorized probabilistic models that consist of conjugate exponential family distributions. The automation of Bayesian inference tasks is important, since many data processing problems can be formulated as inference tasks on a generative probabilistic model. However, accurate generative models may also contain deterministic and possibly nonlinear variable mappings and non-conjugate factor pairs that complicate the automatic execution of the VMP algorithm. In this paper, we show that executing VMP in complex models relies on the ability to compute the expectations of the statistics of hidden variables. We extend the applicability of VMP by approximating the required expectation quantities, in appropriate cases, by importance sampling and the Laplace approximation. As a result, the proposed Extended VMP (EVMP) approach supports automated, efficient inference for a very wide range of probabilistic model specifications. We implemented EVMP in the Julia language in the probabilistic programming package ForneyLab.jl and show by a number of examples that EVMP renders an almost universal inference engine for factorized probabilistic models.
Keywords: Bayesian inference; variational inference; factor graphs; variational message passing; probabilistic programming
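The abstract's key computational step, approximating expectations of statistics of hidden variables when conjugacy is unavailable, can be illustrated with a generic self-normalized importance-sampling sketch. This is not the ForneyLab.jl implementation described in the paper; the target density, proposal, and all function names below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def log_target(z):
    # Unnormalized log-density: a standard-normal prior multiplied by a
    # nonlinear (non-conjugate) likelihood-like factor, as an example of
    # a posterior whose expectations have no closed form.
    return -0.5 * z**2 - 0.5 * (np.sin(z) - 0.5) ** 2

def importance_expectation(f, n=100_000, proposal_scale=3.0):
    """Approximate E_p[f(z)] under the unnormalized target via
    self-normalized importance sampling with a wide Gaussian proposal."""
    z = rng.normal(0.0, proposal_scale, size=n)  # draws from proposal q
    log_q = -0.5 * (z / proposal_scale) ** 2 - np.log(
        proposal_scale * np.sqrt(2 * np.pi)
    )
    log_w = log_target(z) - log_q        # log importance weights
    w = np.exp(log_w - log_w.max())      # subtract max for numerical stability
    w /= w.sum()                         # self-normalize the weights
    return np.sum(w * f(z))

# Approximate the first two moments of the hidden variable.
mean_z = importance_expectation(lambda z: z)
second_moment = importance_expectation(lambda z: z**2)
```

In a VMP-style setting, moments such as these would be plugged into the message computations for the neighboring factors; the Laplace approximation mentioned in the abstract is an alternative that fits a Gaussian at the mode of the log-target instead of sampling.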
MDPI and ACS Style

Akbayrak, S.; Bocharov, I.; de Vries, B. Extended Variational Message Passing for Automated Approximate Bayesian Inference. Entropy 2021, 23, 815. https://doi.org/10.3390/e23070815

AMA Style

Akbayrak S, Bocharov I, de Vries B. Extended Variational Message Passing for Automated Approximate Bayesian Inference. Entropy. 2021; 23(7):815. https://doi.org/10.3390/e23070815

Chicago/Turabian Style

Akbayrak, Semih, Ivan Bocharov, and Bert de Vries. 2021. "Extended Variational Message Passing for Automated Approximate Bayesian Inference" Entropy 23, no. 7: 815. https://doi.org/10.3390/e23070815

Note that from the first issue of 2016, MDPI journals use article numbers instead of page numbers.
