Quasi Black-Box Variational Inference with Natural Gradients for
Bayesian Learning
- URL: http://arxiv.org/abs/2205.11568v1
- Date: Mon, 23 May 2022 18:54:27 GMT
- Title: Quasi Black-Box Variational Inference with Natural Gradients for
Bayesian Learning
- Authors: Martin Magris, Mostafa Shabani, Alexandros Iosifidis
- Abstract summary: We develop an optimization algorithm suitable for Bayesian learning in complex models.
Our approach relies on natural gradient updates within a general black-box framework for efficient training with limited model-specific derivations.
- Score: 84.90242084523565
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: We develop an optimization algorithm suitable for Bayesian learning in
complex models. Our approach relies on natural gradient updates within a
general black-box framework for efficient training with limited model-specific
derivations. It applies within the class of exponential-family variational
posterior distributions; we discuss the Gaussian case in detail, for which the
updates take a particularly simple form. Our Quasi Black-box Variational
Inference (QBVI) framework is readily applicable to a wide class of Bayesian
inference problems and is simple to implement, as the updates of the
variational posterior involve neither gradients with respect to the model
parameters nor the explicit specification of the Fisher information matrix. We develop
QBVI under different hypotheses for the posterior covariance matrix, discuss
details about its robust and feasible implementation, and provide a number of
real-world applications to demonstrate its effectiveness.
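To make the update structure concrete, here is a minimal sketch of natural-gradient Gaussian variational inference driven purely by evaluations of the log-joint, in the spirit of QBVI: no gradients of the model and no explicit Fisher matrix appear. The toy model, step size, Monte Carlo sample size, and the eigenvalue guard on the precision matrix are illustrative choices, not the paper's exact algorithm; the Gaussian gradient identities themselves are standard.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy conjugate model so the answer can be checked: prior theta ~ N(0, 10 I),
# likelihood y_i | theta ~ N(theta, I); the exact posterior is Gaussian with
# precision (1/10 + 50) I per coordinate.
y = rng.normal(loc=[1.0, -2.0], scale=1.0, size=(50, 2))

def log_joint(theta):
    """log p(y, theta); only *evaluations* are needed, never its gradient."""
    return -0.5 * theta @ theta / 10.0 - 0.5 * np.sum((y - theta) ** 2)

d = 2
mu = np.zeros(d)
S = np.eye(d)              # variational precision matrix, Sigma^{-1}
beta, n_mc = 0.02, 400     # natural-gradient step size, MC sample size

for it in range(500):
    Sigma = np.linalg.inv(S)
    theta = rng.multivariate_normal(mu, Sigma, size=n_mc)
    h = np.array([log_joint(t) for t in theta])
    h -= h.mean()                          # baseline control variate
    Sd = (theta - mu) @ S                  # row k is S (theta_k - mu)

    # Exact Gaussian identities give the gradients of E_q[log p(y, theta)]
    # from function values alone (the -S term of the Sigma-gradient drops
    # out because h is centered).
    g_mu = (h[:, None] * Sd).mean(axis=0)
    g_Sig = 0.5 * np.einsum('k,ki,kj->ij', h, Sd, Sd) / n_mc

    # Natural-gradient ascent in the Gaussian natural parameters
    # (Khan & Lin-style updates; the Fisher matrix is never formed).
    S_new = (1.0 - beta) * S - 2.0 * beta * g_Sig
    S_new = 0.5 * (S_new + S_new.T)        # symmetrize against MC noise
    w, V = np.linalg.eigh(S_new)           # crude positive-definiteness guard
    S = (V * np.maximum(w, 1e-4)) @ V.T
    mu = mu + beta * np.linalg.solve(S, g_mu)

print(mu)                            # ~ posterior mean, close to (1, -2)
print(np.diag(np.linalg.inv(S)))     # ~ posterior variances, close to 1/50.1
```

The damped precision update keeps the iterate well scaled, but a noisy Monte Carlo estimate of the Sigma-gradient can still break positive definiteness, which is one reason the paper devotes attention to robust and feasible implementation.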
Related papers
- Diffusion models for probabilistic programming [56.47577824219207]
Diffusion Model Variational Inference (DMVI) is a novel method for automated approximate inference in probabilistic programming languages (PPLs).
DMVI is easy to implement, allows hassle-free inference in PPLs without the drawbacks of, e.g., variational inference using normalizing flows, and imposes no constraints on the underlying neural network model.
arXiv Detail & Related papers (2023-11-01T12:17:05Z) - A Tutorial on Parametric Variational Inference [0.0]
Variational inference is now the preferred choice for many high-dimensional models and large datasets.
This tutorial introduces variational inference from the parametric perspective that dominates these recent developments.
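As a minimal instance of the parametric viewpoint, the sketch below fits q(theta) = N(m, s^2) to an unnormalized log-posterior by stochastic gradient ascent on the ELBO with the reparameterization trick theta = m + s * eps. The target density, step size, and sample size are illustrative, and the pathwise derivative is hard-coded where autodiff would normally supply it.

```python
import numpy as np

rng = np.random.default_rng(1)

def dlog_target(theta):
    """d/d theta of the unnormalized log posterior; here N(3, 0.5^2)."""
    return -(theta - 3.0) / 0.25

m, log_s = 0.0, 0.0      # variational parameters (log-scale keeps s > 0)
lr, n_mc = 0.01, 32

for it in range(3000):
    s = np.exp(log_s)
    eps = rng.normal(size=n_mc)
    theta = m + s * eps                  # reparameterized samples from q
    g = dlog_target(theta)
    # ELBO = E_q[log p] + entropy; d entropy / d log_s = 1 for a Gaussian.
    grad_m = g.mean()
    grad_log_s = (g * eps * s).mean() + 1.0
    m += lr * grad_m
    log_s += lr * grad_log_s

print(m, np.exp(log_s))                  # ~ 3.0 and ~ 0.5
```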
arXiv Detail & Related papers (2023-01-03T17:30:50Z) - Manifold Gaussian Variational Bayes on the Precision Matrix [70.44024861252554]
We propose an optimization algorithm for Variational Inference (VI) in complex models.
We develop an efficient algorithm for Gaussian Variational Inference whose updates satisfy the positive definite constraint on the variational covariance matrix.
Due to its black-box nature, MGVBP stands as a ready-to-use solution for VI in complex models.
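The constraint at stake can be isolated with a generic device: parameterize the precision through a lower-triangular factor L with S = L L^T, so every gradient step keeps S positive (semi)definite by construction. This is only an illustration of the constraint on a toy objective; MGVBP's own updates act on the precision matrix directly through its manifold structure.

```python
import numpy as np

rng = np.random.default_rng(2)

d = 3
L = np.tril(rng.normal(size=(d, d))) + 2.0 * np.eye(d)  # start away from target

for it in range(500):
    S = L @ L.T                       # positive definite while L is nonsingular
    grad_S = np.eye(d) - S            # toy objective: pull S toward the identity
    # Chain rule for S = L L^T, keeping only the lower triangle of the factor.
    grad_L = np.tril((grad_S + grad_S.T) @ L)
    L = L + 0.01 * grad_L

print(np.round(L @ L.T, 3))           # ~ identity; S stayed PD at every iterate
```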
arXiv Detail & Related papers (2022-10-26T10:12:31Z) - Recursive Monte Carlo and Variational Inference with Auxiliary Variables [64.25762042361839]
Recursive auxiliary-variable inference (RAVI) is a new framework for exploiting flexible proposals.
RAVI generalizes and unifies several existing methods for inference with expressive variational families.
We illustrate RAVI's design framework and theorems by using them to analyze and improve upon Salimans et al.'s Markov Chain Variational Inference.
arXiv Detail & Related papers (2022-03-05T23:52:40Z) - A Variational Inference Approach to Inverse Problems with Gamma
Hyperpriors [60.489902135153415]
This paper introduces a variational iterative alternating scheme for hierarchical inverse problems with gamma hyperpriors.
The proposed variational inference approach yields accurate reconstruction, provides meaningful uncertainty quantification, and is easy to implement.
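The alternating structure can be sketched on a linear inverse problem y = A x + noise with a conditionally Gaussian prior x_i | theta_i ~ N(0, theta_i) and Gamma(alpha, scale vartheta) hyperpriors: a ridge-type solve for x, then a closed-form update for each theta_i. The hyperparameter update below is the classical IAS/MAP formula with eta = alpha - 3/2; the paper's variational updates differ in detail, and all problem sizes and constants are illustrative.

```python
import numpy as np

rng = np.random.default_rng(3)

n, d, sigma = 40, 80, 0.05
A = rng.normal(size=(n, d)) / np.sqrt(n)
x_true = np.zeros(d)
x_true[[10, 30, 55]] = [1.0, -1.5, 0.8]        # sparse ground truth
y = A @ x_true + sigma * rng.normal(size=n)

alpha, vartheta = 1.51, 1e-2                   # Gamma(shape, scale) hyperprior
eta = alpha - 1.5
theta = np.ones(d)

for it in range(50):
    # x-step: mode of the Gaussian conditional,
    # (A^T A / sigma^2 + diag(1/theta)) x = A^T y / sigma^2
    H = A.T @ A / sigma**2 + np.diag(1.0 / theta)
    x = np.linalg.solve(H, A.T @ y / sigma**2)
    # theta-step: closed-form update of each hyperparameter given x
    theta = vartheta * (eta / 2 + np.sqrt(eta**2 / 4 + x**2 / (2 * vartheta)))

print(np.round(x[[10, 30, 55]], 2))            # roughly recovers the spikes
```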
arXiv Detail & Related papers (2021-11-26T06:33:29Z) - Gone Fishing: Neural Active Learning with Fisher Embeddings [55.08537975896764]
There is an increasing need for active learning algorithms that are compatible with deep neural networks.
This article introduces BAIT, a practical, tractable, and high-performing active learning algorithm for neural networks.
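The Fisher-embedding idea behind the method can be sketched generically: represent each candidate point by a gradient embedding and greedily select points that most increase the log-determinant of the accumulated information matrix. BAIT's actual criterion is a trace objective that also involves the Fisher of the full pool; the snippet below shows only the flavor, with random vectors standing in for real embeddings.

```python
import numpy as np

rng = np.random.default_rng(5)
pool = rng.normal(size=(500, 16))     # stand-in gradient embeddings g_i
budget, lam = 10, 1e-3
M = lam * np.eye(16)                  # regularized information matrix
chosen = []

for _ in range(budget):
    # Matrix determinant lemma: adding g_i changes logdet(M) by
    # log(1 + g_i^T M^{-1} g_i), so score every candidate by that gain.
    Minv_g = np.linalg.solve(M, pool.T).T
    scores = np.log1p(np.einsum('ij,ij->i', pool, Minv_g))
    scores[chosen] = -np.inf          # never pick the same point twice
    i = int(np.argmax(scores))
    chosen.append(i)
    M += np.outer(pool[i], pool[i])

print(chosen)                         # indices of the selected batch
```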
arXiv Detail & Related papers (2021-06-17T17:26:31Z) - The FMRIB Variational Bayesian Inference Tutorial II: Stochastic
Variational Bayes [1.827510863075184]
This tutorial revisits the original FMRIB Variational Bayes tutorial.
This new approach bears a lot of similarity to, and has benefited from, computational methods applied to machine learning algorithms.
arXiv Detail & Related papers (2020-07-03T11:31:52Z) - Sparse Gaussian Processes Revisited: Bayesian Approaches to
Inducing-Variable Approximations [27.43948386608]
Variational inference techniques based on inducing variables provide an elegant framework for scalable estimation in Gaussian process (GP) models.
In this work we challenge the common wisdom that optimizing the inducing inputs in the variational framework yields optimal performance.
arXiv Detail & Related papers (2020-03-06T08:53:18Z)
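For context, the inducing-variable framework being revisited scores a set of inducing inputs Z through a collapsed evidence lower bound; whether Z should be optimized or treated in a Bayesian fashion is exactly the question the paper raises. Below is a minimal sketch of the Titsias-style collapsed bound with an illustrative RBF kernel and synthetic data.

```python
import numpy as np

def rbf(X1, X2, ls=0.5, var=1.0):
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
    return var * np.exp(-0.5 * d2 / ls**2)

def collapsed_elbo(X, y, Z, noise=0.1):
    """log N(y | 0, Q + noise I) - tr(K - Q) / (2 noise), Q = Knm Kmm^-1 Kmn."""
    Knm = rbf(X, Z)
    Kmm = rbf(Z, Z) + 1e-8 * np.eye(len(Z))        # jitter for stability
    Q = Knm @ np.linalg.solve(Kmm, Knm.T)
    C = Q + noise * np.eye(len(X))
    _, logdet = np.linalg.slogdet(C)
    quad = y @ np.linalg.solve(C, y)
    fit = -0.5 * (len(X) * np.log(2 * np.pi) + logdet + quad)
    return fit - np.trace(rbf(X, X) - Q) / (2 * noise)

rng = np.random.default_rng(4)
X = rng.uniform(-3, 3, size=(100, 1))
y = np.sin(2 * X[:, 0]) + 0.1 * rng.normal(size=100)
Z = np.linspace(-3, 3, 10)[:, None]     # a fixed grid of inducing inputs
print(collapsed_elbo(X, y, Z))          # compare across different choices of Z
```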
This list is automatically generated from the titles and abstracts of the papers on this site.
The site does not guarantee the quality of this information and is not responsible for any consequences arising from its use.