Transformation Models for Flexible Posteriors in Variational Bayes
- URL: http://arxiv.org/abs/2106.00528v1
- Date: Tue, 1 Jun 2021 14:43:47 GMT
- Title: Transformation Models for Flexible Posteriors in Variational Bayes
- Authors: Stefan Hörtling, Daniel Dold, Oliver Dürr, Beate Sick
- Abstract summary: In Bayesian neural networks, variational inference is widely used to approximate difficult-to-compute posteriors.
Transformation models are flexible enough to fit any distribution.
TM-VI accurately approximates complex posteriors in models with one parameter.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The main challenge in Bayesian models is to determine the posterior for the model parameters. Even in models with only one or a few parameters, the analytical posterior can be determined only in special settings. In Bayesian neural networks, variational inference is widely used to approximate difficult-to-compute posteriors by variational distributions. Usually, Gaussians are used as variational distributions (Gaussian-VI), which limits the quality of the approximation due to their limited flexibility. Transformation models, on the other hand, are flexible enough to fit any distribution. Here we present transformation model-based variational inference (TM-VI) and demonstrate that it accurately approximates complex posteriors in models with one parameter, and that it also works in a mean-field fashion for multi-parameter models like neural networks.
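The following is a minimal sketch of the TM-VI idea for a one-parameter model, assuming PyTorch. The variational density is a standard normal pushed through a learned monotone transformation (a Bernstein polynomial, the family commonly used in the transformation-model literature), and the ELBO is estimated with the reparameterization trick, using the change-of-variables Jacobian for the log-density of q. The degree M, the softplus/cumsum parameterization, and the toy Gaussian likelihood are illustrative assumptions, not the authors' implementation.

```python
# Sketch of transformation-model-based VI (TM-VI) for one parameter w.
# All modeling choices below are illustrative assumptions.
import torch

M = 10                                            # Bernstein degree (illustrative)
theta = torch.zeros(M + 1, requires_grad=True)    # unconstrained transform weights

def transform(z):
    # Monotone map w = h(z): squash z into (0, 1), then apply a Bernstein
    # polynomial with positive, increasing coefficients, so h is strictly
    # increasing. (This confines w to a bounded interval, a simplification.)
    u = torch.sigmoid(z)
    coef = torch.cumsum(torch.nn.functional.softplus(theta), dim=0)
    k = torch.arange(M + 1, dtype=u.dtype)
    log_binom = (torch.lgamma(torch.tensor(M + 1.0))
                 - torch.lgamma(k + 1.0) - torch.lgamma(M - k + 1.0))
    basis = torch.exp(log_binom) * u[:, None] ** k * (1.0 - u[:, None]) ** (M - k)
    return (basis * coef).sum(-1)

def neg_elbo(log_joint, n_samples=128):
    # Reparameterized ELBO: w = h(z), z ~ N(0, 1); change of variables gives
    # log q(w) = log N(z; 0, 1) - log h'(z), with h'(z) taken from autograd.
    z = torch.randn(n_samples, requires_grad=True)
    w = transform(z)
    dh_dz = torch.autograd.grad(w.sum(), z, create_graph=True)[0]
    log_q = torch.distributions.Normal(0.0, 1.0).log_prob(z) - torch.log(dh_dz)
    return -(log_joint(w) - log_q).mean()

# Toy one-parameter model: N(0, 1) prior on w, Gaussian likelihood for data y.
y = torch.tensor([0.8, 1.1, 0.9])
def log_joint(w):
    log_prior = torch.distributions.Normal(0.0, 1.0).log_prob(w)
    log_lik = torch.distributions.Normal(w[:, None], 0.5).log_prob(y).sum(-1)
    return log_prior + log_lik

opt = torch.optim.Adam([theta], lr=1e-2)
for _ in range(2000):                             # maximize the ELBO
    opt.zero_grad()
    neg_elbo(log_joint).backward()
    opt.step()
```

For multi-parameter models such as neural networks, the mean-field variant described in the abstract would give each parameter its own independent one-dimensional transform of this kind, so the same machinery applies coordinate-wise.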
Related papers
- Scaling and renormalization in high-dimensional regression [72.59731158970894]
This paper presents a succinct derivation of the training and generalization performance of a variety of high-dimensional ridge regression models.
We provide an introduction and review of recent results on these topics, aimed at readers with backgrounds in physics and deep learning.
arXiv Detail & Related papers (2024-05-01T15:59:00Z)
- Diffusion models for probabilistic programming [56.47577824219207]
Diffusion Model Variational Inference (DMVI) is a novel method for automated approximate inference in probabilistic programming languages (PPLs).
DMVI is easy to implement, allows hassle-free inference in PPLs without the drawbacks of, e.g., variational inference using normalizing flows, and does not impose any constraints on the underlying neural network model.
arXiv Detail & Related papers (2023-11-01T12:17:05Z)
- Bayesian Neural Network Inference via Implicit Models and the Posterior Predictive Distribution [0.8122270502556371]
We propose a novel approach to perform approximate Bayesian inference in complex models such as Bayesian neural networks.
The approach scales to large data better than Markov Chain Monte Carlo.
We see this being useful in applications such as surrogate and physics-based models.
arXiv Detail & Related papers (2022-09-06T02:43:19Z)
- Quasi Black-Box Variational Inference with Natural Gradients for Bayesian Learning [84.90242084523565]
We develop an optimization algorithm suitable for Bayesian learning in complex models.
Our approach relies on natural gradient updates within a general black-box framework for efficient training with limited model-specific derivations.
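As a brief gloss of the natural-gradient idea (our addition, not the paper's derivation): for variational families in the exponential family, the natural gradient of the ELBO needs no explicit Fisher-matrix inversion, because

```latex
\tilde{\nabla}_{\lambda}\mathcal{L}
  = F(\lambda)^{-1}\,\nabla_{\lambda}\mathcal{L}
  = \nabla_{\mu}\mathcal{L},
\qquad
\lambda_{t+1} = \lambda_{t} + \rho_{t}\,\nabla_{\mu}\mathcal{L}(\lambda_{t}),
```

where lambda are the natural parameters, F(lambda) is the Fisher information of q, and mu are the mean (expectation) parameters: the preconditioned update reduces to an ordinary gradient step taken in the mean parameterization.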
arXiv Detail & Related papers (2022-05-23T18:54:27Z)
- Generalised Gaussian Process Latent Variable Models (GPLVM) with Stochastic Variational Inference [9.468270453795409]
We study the doubly stochastic formulation of the Bayesian GPLVM model, which is amenable to minibatch training.
We show how this framework is compatible with different latent variable formulations and perform experiments to compare a suite of models.
We demonstrate how we can train in the presence of massively missing data and obtain high-fidelity reconstructions.
arXiv Detail & Related papers (2022-02-25T21:21:51Z)
- A Variational Inference Approach to Inverse Problems with Gamma Hyperpriors [60.489902135153415]
This paper introduces a variational iterative alternating scheme for hierarchical inverse problems with gamma hyperpriors.
The proposed variational inference approach yields accurate reconstruction, provides meaningful uncertainty quantification, and is easy to implement.
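As a rough sketch of the model class this targets (our notation, a common formulation rather than the paper's exact setup): the hierarchy is conditionally Gaussian with gamma hyperpriors on the componentwise variances,

```latex
y = Ax + \varepsilon, \quad \varepsilon \sim \mathcal{N}(0, \sigma^{2} I), \qquad
x_{j} \mid \theta_{j} \sim \mathcal{N}(0, \theta_{j}), \qquad
\theta_{j} \sim \mathrm{Gamma}(\alpha, \beta),
```

and the iterative alternating scheme cycles between updating the variational factor for x given theta (a Gaussian) and the factor for theta given x, which is what keeps the method easy to implement.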
arXiv Detail & Related papers (2021-11-26T06:33:29Z)
- Probabilistic Circuits for Variational Inference in Discrete Graphical Models [101.28528515775842]
Inference in discrete graphical models with variational methods is difficult.
Many sampling-based methods have been proposed for estimating the Evidence Lower Bound (ELBO).
We propose a new approach that leverages the tractability of probabilistic circuit models, such as Sum Product Networks (SPNs).
We show that selective-SPNs are suitable as an expressive variational distribution, and prove that when the log-density of the target model is a polynomial, the corresponding ELBO can be computed analytically.
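Assuming the polynomial reading of the claim above, a short gloss (ours, not the paper's derivation) of why the objective becomes analytic:

```latex
\mathrm{ELBO}(q)
  = \mathbb{E}_{q}\big[\log \tilde{p}(x)\big] + H(q)
  = \sum_{\alpha} c_{\alpha}\, \mathbb{E}_{q}\Big[\prod_{i} x_{i}^{\alpha_{i}}\Big] + H(q)
\quad \text{for} \quad
\log \tilde{p}(x) = \sum_{\alpha} c_{\alpha} \prod_{i} x_{i}^{\alpha_{i}},
```

so a polynomial log-density turns the expected log-density into a finite sum of moments, and both moments and the entropy H(q) are tractable for selective SPNs, leaving the whole objective in closed form.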
arXiv Detail & Related papers (2020-10-22T05:04:38Z)
- Variational Filtering with Copula Models for SLAM [5.242618356321224]
We show how it is possible to perform simultaneous localization and mapping (SLAM) with a larger class of distributions.
We integrate the distribution model with copulas into a Sequential Monte Carlo estimator and show how unknown model parameters can be learned through gradient-based optimization.
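To make the copula construction concrete, here is a toy sketch (ours, using SciPy; the Gamma and normal marginals and the correlation value are illustrative, not the paper's SLAM model) of how a Gaussian copula joins two arbitrary marginals into a single joint density via Sklar's theorem:

```python
# Toy Gaussian-copula joint density (illustrative; not the paper's SLAM model).
from scipy import stats

def gaussian_copula_pdf(u1, u2, rho):
    # Copula density c(u1, u2) = phi2(z1, z2; rho) / (phi(z1) * phi(z2)),
    # where z_i = Phi^{-1}(u_i) maps each uniform marginal to a normal score.
    z1, z2 = stats.norm.ppf(u1), stats.norm.ppf(u2)
    num = stats.multivariate_normal([0.0, 0.0], [[1.0, rho], [rho, 1.0]]).pdf([z1, z2])
    return num / (stats.norm.pdf(z1) * stats.norm.pdf(z2))

def joint_pdf(x, y, marg_x, marg_y, rho):
    # Sklar's theorem: f(x, y) = c(F_x(x), F_y(y)) * f_x(x) * f_y(y).
    return (gaussian_copula_pdf(marg_x.cdf(x), marg_y.cdf(y), rho)
            * marg_x.pdf(x) * marg_y.pdf(y))

# Example: couple a Gamma marginal to a normal marginal with rho = 0.7; rho is
# the kind of model parameter the paper learns by gradient-based optimization.
print(joint_pdf(2.0, 0.3, stats.gamma(a=2.0), stats.norm(0.0, 0.5), rho=0.7))
```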
arXiv Detail & Related papers (2020-08-02T15:38:23Z)
- Decision-Making with Auto-Encoding Variational Bayes [71.44735417472043]
We show that a posterior approximation distinct from the variational distribution should be used for making decisions.
Motivated by these theoretical results, we propose learning several approximate proposals for the best model.
In addition to toy examples, we present a full-fledged case study of single-cell RNA sequencing.
arXiv Detail & Related papers (2020-02-17T19:23:36Z)