A Tutorial on Parametric Variational Inference
- URL: http://arxiv.org/abs/2301.01236v1
- Date: Tue, 3 Jan 2023 17:30:50 GMT
- Title: A Tutorial on Parametric Variational Inference
- Authors: Jens Sjölund
- Abstract summary: Variational inference is now the preferred choice for many high-dimensional models and large datasets.
This tutorial introduces variational inference from the parametric perspective that dominates these recent developments.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Variational inference uses optimization, rather than integration, to
approximate the marginal likelihood, and thereby the posterior, in a Bayesian
model. Thanks to advances in computational scalability made in the last decade,
variational inference is now the preferred choice for many high-dimensional
models and large datasets. This tutorial introduces variational inference from
the parametric perspective that dominates these recent developments, in
contrast to the mean-field perspective commonly found in other introductory
texts.
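The optimization view described in the abstract can be made concrete with a minimal sketch: fit a parametric Gaussian approximation q(z) = N(mu, sigma^2) to a toy unnormalized posterior by stochastic gradient ascent on the ELBO, using the reparameterization trick that underlies most recent parametric VI methods. The target density, step size, and sample count below are illustrative assumptions, not taken from the tutorial.

```python
import numpy as np

rng = np.random.default_rng(0)

def dlogp_dz(z):
    # Gradient of an unnormalized log posterior: here a Gaussian with
    # mean 2.0 and std 0.5 (a hypothetical target chosen so the optimum
    # is known exactly).
    return -(z - 2.0) / 0.5 ** 2

def elbo_grad(mu, log_sigma, n=256):
    # Reparameterization trick: z = mu + sigma * eps, eps ~ N(0, 1),
    # turns the expectation over q into an expectation over eps.
    sigma = np.exp(log_sigma)
    eps = rng.standard_normal(n)
    z = mu + sigma * eps
    g = dlogp_dz(z)
    # Pathwise gradients of ELBO = E_q[log p(z)] + entropy(q); the
    # Gaussian entropy contributes +1 to the log_sigma gradient.
    g_mu = g.mean()
    g_log_sigma = (g * eps * sigma).mean() + 1.0
    return g_mu, g_log_sigma

mu, log_sigma, lr = 0.0, 0.0, 0.05
for _ in range(2000):
    g_mu, g_ls = elbo_grad(mu, log_sigma)
    mu += lr * g_mu
    log_sigma += lr * g_ls

# The variational parameters approach the target's mean 2.0 and std 0.5.
print(round(mu, 2), round(np.exp(log_sigma), 2))
```

Because the variational family here contains the target, the ELBO optimum recovers it exactly; in realistic models the same procedure returns the closest member of the family in exclusive KL.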
Related papers
- Diffusion posterior sampling for simulation-based inference in tall data settings [53.17563688225137]
Simulation-based inference (SBI) can approximate the posterior distribution relating input parameters to a given observation.
In this work, we consider a tall data extension in which multiple observations are available to better infer the parameters of the model.
We compare our method to recently proposed competing approaches on various numerical experiments and demonstrate its superiority in terms of numerical stability and computational cost.
arXiv Detail & Related papers (2024-04-11T09:23:36Z) - Nonparametric Automatic Differentiation Variational Inference with Spline Approximation [7.5620760132717795]
We develop a nonparametric approximation approach that enables flexible posterior approximation for distributions with complicated structures.
Compared with widely used nonparametric inference methods, the proposed method is easy to implement and adapts to various data structures.
Experiments demonstrate the efficiency of the proposed method in approximating complex posterior distributions and improving the performance of generative models with incomplete data.
arXiv Detail & Related papers (2024-03-10T20:22:06Z) - Conformal inference for regression on Riemannian Manifolds [49.7719149179179]
We investigate prediction sets for regression scenarios in which the response variable, denoted by $Y$, resides on a manifold, and the covariate, denoted by $X$, lies in Euclidean space.
We prove the almost sure convergence of the empirical version of these regions on the manifold to their population counterparts.
arXiv Detail & Related papers (2023-10-12T10:56:25Z) - Prior Density Learning in Variational Bayesian Phylogenetic Parameters Inference [1.03590082373586]
We propose an approach to relax the rigidity of the prior densities by learning their parameters using a gradient-based method and a neural network-based parameterization.
Simulation results show that the approach is effective at estimating branch lengths and evolutionary model parameters.
arXiv Detail & Related papers (2023-02-06T01:29:15Z) - Manifold Gaussian Variational Bayes on the Precision Matrix [70.44024861252554]
We propose an optimization algorithm for Variational Inference (VI) in complex models.
We develop an efficient algorithm for Gaussian Variational Inference whose updates satisfy the positive definite constraint on the variational covariance matrix.
Due to its black-box nature, the proposed method (MGVBP) is a ready-to-use solution for VI in complex models.
arXiv Detail & Related papers (2022-10-26T10:12:31Z) - Quasi Black-Box Variational Inference with Natural Gradients for Bayesian Learning [84.90242084523565]
We develop an optimization algorithm suitable for Bayesian learning in complex models.
Our approach relies on natural gradient updates within a general black-box framework, enabling efficient training with few model-specific derivations.
arXiv Detail & Related papers (2022-05-23T18:54:27Z) - A Variational Inference Approach to Inverse Problems with Gamma Hyperpriors [60.489902135153415]
This paper introduces a variational iterative alternating scheme for hierarchical inverse problems with gamma hyperpriors.
The proposed variational inference approach yields accurate reconstruction, provides meaningful uncertainty quantification, and is easy to implement.
arXiv Detail & Related papers (2021-11-26T06:33:29Z) - Transfer Learning with Gaussian Processes for Bayesian Optimization [9.933956770453438]
We provide a unified view on hierarchical GP models for transfer learning, which allows us to analyze the relationship between methods.
We develop a novel closed-form boosted GP transfer model that fits between existing approaches in terms of complexity.
We evaluate the performance of the different approaches in large-scale experiments and highlight strengths and weaknesses of the different transfer-learning methods.
arXiv Detail & Related papers (2021-11-22T14:09:45Z) - Transformation Models for Flexible Posteriors in Variational Bayes [0.0]
In neural networks, variational inference is widely used to approximate difficult-to-compute posteriors.
Transformation models are flexible enough to fit any distribution.
TM-VI accurately approximates complex posteriors in models with a single parameter.
arXiv Detail & Related papers (2021-06-01T14:43:47Z) - Challenges and Opportunities in High-dimensional Variational Inference [65.53746326245059]
We show why intuitions about approximate families and divergences for low-dimensional posteriors fail for higher-dimensional posteriors.
For high-dimensional posteriors, we recommend the exclusive KL divergence, which is the most stable and easiest to optimize.
In low to moderate dimensions, heavy-tailed variational families and mass-covering divergences can increase the chances that the approximation can be improved by importance sampling.
arXiv Detail & Related papers (2021-03-01T15:53:34Z)
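The distinction the last paper draws between the exclusive KL divergence KL(q||p), which standard VI minimizes, and mass-covering (inclusive) divergences such as KL(p||q) can be illustrated with the closed-form KL between univariate Gaussians; the particular means and standard deviations below are arbitrary choices for illustration.

```python
import numpy as np

def kl_gauss(mu_q, s_q, mu_p, s_p):
    # Closed-form KL(q || p) for univariate Gaussians
    # q = N(mu_q, s_q^2), p = N(mu_p, s_p^2).
    return np.log(s_p / s_q) + (s_q**2 + (mu_q - mu_p)**2) / (2 * s_p**2) - 0.5

# A narrow q inside a broad p: the exclusive direction KL(q||p) is small
# because q's mass lies where p has support, while the inclusive direction
# KL(p||q) is large because p has mass far outside q.
excl = kl_gauss(0.0, 0.5, 0.0, 2.0)  # KL(q||p)
incl = kl_gauss(0.0, 2.0, 0.0, 0.5)  # KL(p||q)
print(excl < incl)  # True: the two directions penalize very differently
```

This asymmetry is why exclusive-KL VI tends to produce underdispersed, mode-seeking approximations, and why mass-covering divergences pair better with importance sampling in lower dimensions.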
This list is automatically generated from the titles and abstracts of the papers on this site. The site does not guarantee the quality of this information and is not responsible for any consequences of its use.