Amortized Variational Inference: A Systematic Review
- URL: http://arxiv.org/abs/2209.10888v2
- Date: Tue, 24 Oct 2023 07:26:52 GMT
- Title: Amortized Variational Inference: A Systematic Review
- Authors: Ankush Ganguly, Sanjana Jain, and Ukrit Watchareeruetai
- Abstract summary: The core principle of Variational Inference (VI) is to convert the statistical inference problem of computing complex posterior probability densities into a tractable optimization problem.
The traditional VI algorithm is not scalable to large data sets and is unable to readily infer out-of-bounds data points without re-running the optimization process.
Recent developments in the field, like stochastic-, black box-, and amortized-VI, have helped address these issues.
- Score: 0.0
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: The core principle of Variational Inference (VI) is to convert the
statistical inference problem of computing complex posterior probability
densities into a tractable optimization problem. This property enables VI to be
faster than several sampling-based techniques. However, the traditional VI
algorithm is not scalable to large data sets and is unable to readily infer
out-of-bounds data points without re-running the optimization process. Recent
developments in the field, like stochastic-, black box-, and amortized-VI, have
helped address these issues. Generative modeling tasks nowadays widely make use
of amortized VI for its efficiency and scalability, as it utilizes a
parameterized function to learn the approximate posterior density parameters.
In this paper, we review the mathematical foundations of various VI techniques
to form the basis for understanding amortized VI. Additionally, we provide an
overview of the recent trends that address several issues of amortized VI, such
as the amortization gap, generalization issues, inconsistent representation
learning, and posterior collapse. Finally, we analyze alternate divergence
measures that improve VI optimization.
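As a rough, self-contained illustration of the amortization idea described above (a sketch under assumed names and toy dimensions, not code from the reviewed paper), a shared encoder network maps each observation x to the parameters of a Gaussian approximate posterior q_phi(z|x), and encoder and decoder are trained jointly by maximizing the evidence lower bound (ELBO): log p(x) >= E_{q_phi(z|x)}[log p_theta(x|z)] - KL(q_phi(z|x) || p(z)). Because inference then reduces to a single forward pass through the encoder, new data points can be handled without re-running a per-point optimization.

```python
# Minimal amortized-VI sketch (hypothetical toy example): an encoder predicts the
# parameters of a Gaussian q(z|x); training maximizes the ELBO with the
# reparameterization trick. Class names, dimensions, and data are illustrative.
import torch
import torch.nn as nn

class AmortizedGaussianVI(nn.Module):
    def __init__(self, x_dim=10, z_dim=2, hidden=32):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(x_dim, hidden), nn.ReLU())
        self.mu = nn.Linear(hidden, z_dim)        # mean of q(z|x)
        self.log_var = nn.Linear(hidden, z_dim)   # log-variance of q(z|x)
        self.decoder = nn.Linear(z_dim, x_dim)    # toy Gaussian likelihood p(x|z)

    def elbo(self, x):
        h = self.encoder(x)
        mu, log_var = self.mu(h), self.log_var(h)
        # Reparameterization: z = mu + sigma * eps, eps ~ N(0, I)
        z = mu + torch.exp(0.5 * log_var) * torch.randn_like(mu)
        # Reconstruction term: unit-variance Gaussian log-likelihood (up to a constant)
        recon = -0.5 * ((x - self.decoder(z)) ** 2).sum(dim=-1)
        # Analytic KL( q(z|x) || N(0, I) )
        kl = 0.5 * (mu.pow(2) + log_var.exp() - log_var - 1.0).sum(dim=-1)
        return (recon - kl).mean()

model = AmortizedGaussianVI()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
x = torch.randn(64, 10)          # placeholder data batch
optimizer.zero_grad()
loss = -model.elbo(x)            # maximize ELBO <=> minimize negative ELBO
loss.backward()
optimizer.step()
```

Once trained, the same encoder amortizes inference across all data points; the gap between this encoder-produced posterior and the best per-point variational posterior is the amortization gap discussed in the review.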
Related papers
- ASPIRE: Iterative Amortized Posterior Inference for Bayesian Inverse Problems [0.974963895316339]
New advances in machine learning and variational inference (VI) have lowered the computational barrier by learning from examples.
Two VI paradigms have emerged that represent different tradeoffs: amortized and non-amortized.
We present a solution that enables iterative improvement of amortized posteriors using the same network architectures and training data.
arXiv Detail & Related papers (2024-05-08T20:03:12Z) - Efficient Training of Probabilistic Neural Networks for Survival Analysis [0.6437284704257459]
Variational Inference (VI) is a commonly used technique for approximate Bayesian inference and uncertainty estimation in deep learning models.
It comes at a computational cost, as it doubles the number of trainable parameters to represent uncertainty.
We investigate how to train deep probabilistic survival models in large datasets without introducing additional overhead in model complexity.
arXiv Detail & Related papers (2024-04-09T16:10:39Z) - Functional Graphical Models: Structure Enables Offline Data-Driven Optimization [111.28605744661638]
We show how structure can enable sample-efficient data-driven optimization.
We also present a data-driven optimization algorithm that infers the FGM structure itself.
arXiv Detail & Related papers (2024-01-08T22:33:14Z) - PAVI: Plate-Amortized Variational Inference [55.975832957404556]
Inference is challenging for large population studies where millions of measurements are performed over a cohort of hundreds of subjects.
This large cardinality renders off-the-shelf Variational Inference (VI) computationally impractical.
In this work, we design structured VI families that efficiently tackle large population studies.
arXiv Detail & Related papers (2023-08-30T13:22:20Z) - Manifold Gaussian Variational Bayes on the Precision Matrix [70.44024861252554]
We propose an optimization algorithm for Variational Inference (VI) in complex models.
We develop an efficient algorithm for Gaussian Variational Inference whose updates satisfy the positive definite constraint on the variational covariance matrix.
Due to its black-box nature, MGVBP stands as a ready-to-use solution for VI in complex models.
arXiv Detail & Related papers (2022-10-26T10:12:31Z) - Bernstein Flows for Flexible Posteriors in Variational Bayes [0.0]
Variational inference (VI) is a technique to approximate difficult-to-compute posteriors by optimization.
This paper presents Bernstein flow variational inference (BF-VI), a robust and easy-to-use method, flexible enough to approximate complex posteriors.
arXiv Detail & Related papers (2022-02-11T14:45:52Z) - Relay Variational Inference: A Method for Accelerated Encoderless VI [47.72653430712088]
Relay VI is a framework that dramatically improves the convergence and performance of encoderless VI.
We study the effectiveness of RVI in terms of convergence speed, loss, representation power and missing data imputation.
arXiv Detail & Related papers (2021-10-26T05:48:00Z) - Regularizing Variational Autoencoder with Diversity and Uncertainty Awareness [61.827054365139645]
Variational Autoencoder (VAE) approximates the posterior of latent variables based on amortized variational inference.
We propose an alternative model, DU-VAE, for learning a more Diverse and less Uncertain latent space.
arXiv Detail & Related papers (2021-10-24T07:58:13Z) - An Introduction to Variational Inference [0.0]
In this paper, we introduce the concept of Variational Inference (VI).
VI is a popular method in machine learning that uses optimization techniques to estimate complex probability densities.
We discuss the applications of VI to variational auto-encoders (VAE) and the VAE-Generative Adversarial Network (VAE-GAN).
arXiv Detail & Related papers (2021-08-30T09:40:04Z) - Meta-Learning Divergences of Variational Inference [49.164944557174294]
Variational inference (VI) plays an essential role in approximate Bayesian inference.
We propose a meta-learning algorithm to learn the divergence metric suited for the task of interest.
We demonstrate our approach outperforms standard VI on Gaussian mixture distribution approximation.
arXiv Detail & Related papers (2020-07-06T17:43:01Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences arising from its use.