Statistical Guarantees for Transformation Based Models with Applications to Implicit Variational Inference
- URL: http://arxiv.org/abs/2010.14056v2
- Date: Wed, 4 Nov 2020 20:02:43 GMT
- Title: Statistical Guarantees for Transformation Based Models with Applications to Implicit Variational Inference
- Authors: Sean Plummer, Shuang Zhou, Anirban Bhattacharya, David Dunson, Debdeep Pati
- Abstract summary: We provide theoretical justification for the use of non-linear latent variable models (NL-LVMs) in non-parametric inference.
We use the NL-LVMs to construct an implicit family of variational distributions, deemed GP-IVI.
To the best of our knowledge, this is the first work to provide theoretical guarantees for implicit variational inference.
- Score: 8.333191406788423
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Transformation-based methods have been an attractive approach in
non-parametric inference for problems such as unconditional and conditional
density estimation due to their unique hierarchical structure that models the
data as a flexible transformation of a set of common latent variables. More
recently, transformation-based models have been used in variational inference
(VI) to construct flexible implicit families of variational distributions.
However, their use in both non-parametric inference and variational inference
lacks theoretical justification. We provide theoretical justification for the
use of non-linear latent variable models (NL-LVMs) in non-parametric inference
by showing that the support of the transformation-induced prior in the space of
densities is sufficiently large in the $L_1$ sense. We also show that, when a
Gaussian process (GP) prior is placed on the transformation function, the
posterior concentrates at the optimal rate up to a logarithmic factor. Adopting
the flexibility demonstrated in the non-parametric setting, we use the NL-LVM
to construct an implicit family of variational distributions, deemed GP-IVI. We
delineate sufficient conditions under which GP-IVI achieves optimal risk bounds
and approximates the true posterior in the sense of the Kullback-Leibler
divergence. To the best of our knowledge, this is the first work to provide
theoretical guarantees for implicit variational inference.
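As a concrete illustration of the generative structure the abstract describes, the following NumPy sketch draws one dataset from an NL-LVM prior: common latent variables are pushed through a single Gaussian-process draw of the transformation function and perturbed by Gaussian noise. The uniform latent distribution, squared-exponential kernel, and noise scale are illustrative assumptions of this sketch, not choices taken from the paper.

    import numpy as np

    def rbf_kernel(u, v, length_scale=0.3):
        # Squared-exponential covariance between two 1-D input vectors.
        d = u[:, None] - v[None, :]
        return np.exp(-0.5 * (d / length_scale) ** 2)

    def sample_nllvm(n, sigma=0.1, rng=None):
        # One draw from the transformation-induced prior:
        # y_i = mu(eta_i) + sigma * eps_i, with common latents eta_i,
        # a GP draw mu, and standard normal noise eps_i.
        rng = np.random.default_rng() if rng is None else rng
        eta = rng.uniform(0.0, 1.0, size=n)           # latent variables (assumed uniform)
        K = rbf_kernel(eta, eta) + 1e-8 * np.eye(n)   # GP covariance at the latents
        mu = rng.multivariate_normal(np.zeros(n), K)  # transformation function values
        return mu + sigma * rng.normal(size=n)

    y = sample_nllvm(500)

The induced marginal density is the convolution of a N(0, sigma^2) kernel with the pushforward of the latent distribution through mu, i.e. a location mixture of Gaussians; this mixture is the object whose L1 support and posterior contraction rate the paper analyzes.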
Related papers
- Amortized Variational Inference for Deep Gaussian Processes [0.0]
Deep Gaussian processes (DGPs) are multilayer generalizations of Gaussian processes (GPs).
We introduce amortized variational inference for DGPs, which learns an inference function that maps each observation to variational parameters.
Our method performs similarly to or better than previous approaches at lower computational cost.
arXiv Detail & Related papers (2024-09-18T20:23:27Z)
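To illustrate the amortization idea in the entry above, here is a minimal sketch of an inference function that maps each observation to its variational parameters. The two-layer tanh network and Gaussian variational family are hypothetical stand-ins, not the paper's DGP-specific architecture.

    import numpy as np

    def inference_network(x, W1, b1, W2, b2):
        # Amortized inference: one set of weights maps every observation to
        # its variational mean and log-variance, so the number of variational
        # parameters stays fixed as the dataset grows.
        h = np.tanh(x @ W1 + b1)
        out = h @ W2 + b2
        mean, log_var = np.split(out, 2, axis=-1)
        return mean, log_var

    rng = np.random.default_rng(0)
    d_in, d_hid, d_lat = 3, 16, 2
    params = (rng.normal(size=(d_in, d_hid)) * 0.1, np.zeros(d_hid),
              rng.normal(size=(d_hid, 2 * d_lat)) * 0.1, np.zeros(2 * d_lat))
    mean, log_var = inference_network(rng.normal(size=(5, d_in)), *params)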
- Reparameterized Variational Rejection Sampling [12.189621777178354]
Variational Rejection Sampling (VRS) combines a parametric proposal distribution with rejection sampling to define a rich non-parametric family of distributions.
We show that our method performs well in practice and that it is well-suited for black-box inference, especially for models with local latent variables.
arXiv Detail & Related papers (2023-09-26T01:46:53Z)
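The accept/reject step that defines the VRS family above can be sketched as follows; the Gaussian proposal and the quartic toy target are assumptions for illustration, and the threshold T is the knob that trades acceptance rate against fidelity to the target. The paper's reparameterized variant makes this procedure differentiable in the proposal parameters.

    import numpy as np

    def vrs_sample(log_target, mu, log_sigma, T, n_max=10000, rng=None):
        # Propose z ~ q = N(mu, sigma^2) and accept with probability
        # min(1, exp(log_target(z) - log q(z) - T)); the accepted draws
        # follow a non-parametric distribution richer than q itself.
        rng = np.random.default_rng() if rng is None else rng
        sigma = np.exp(log_sigma)
        for _ in range(n_max):
            z = mu + sigma * rng.normal()
            log_q = (-0.5 * ((z - mu) / sigma) ** 2
                     - log_sigma - 0.5 * np.log(2 * np.pi))
            if np.log(rng.uniform()) < min(0.0, log_target(z) - log_q - T):
                return z
        raise RuntimeError("no acceptance within n_max proposals")

    z = vrs_sample(lambda z: -0.25 * z ** 4, mu=0.0, log_sigma=0.0, T=1.0)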
- Monte Carlo inference for semiparametric Bayesian regression [5.488491124945426]
This paper introduces a simple, general, and efficient strategy for joint posterior inference of an unknown transformation and all regression model parameters.
It delivers (1) joint posterior consistency under general conditions, including multiple model misspecifications, and (2) efficient Monte Carlo (not Markov chain Monte Carlo) inference for the transformation and all parameters for important special cases.
arXiv Detail & Related papers (2023-06-08T18:42:42Z)
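A rough sketch of the Monte Carlo (rather than Markov chain Monte Carlo) strategy described above, under simplifications of our own: the unknown transformation is drawn via a Bayesian-bootstrap posterior over CDFs, and the regression coefficients are then drawn from a conjugate Gaussian posterior given that transformation. The standard normal coefficient prior and unit noise variance are illustrative assumptions, not the paper's exact specification.

    import numpy as np
    from scipy.stats import norm

    def mc_transformed_regression(y, X, n_draws=200, rng=None):
        rng = np.random.default_rng() if rng is None else rng
        n, p = X.shape
        order = np.argsort(y)
        ranks = np.empty(n, dtype=int)
        ranks[order] = np.arange(n)
        V = np.linalg.inv(X.T @ X + np.eye(p))      # standard normal prior (assumption)
        draws = []
        for _ in range(n_draws):
            w = rng.dirichlet(np.ones(n))           # Bayesian-bootstrap CDF weights
            cdf = np.cumsum(w[order])[ranks]        # weighted empirical CDF at each y_i
            z = norm.ppf(np.clip(cdf, 1e-6, 1 - 1e-6))  # sampled transformation g(y)
            beta_hat = V @ X.T @ z                  # conjugate update, noise variance 1
            draws.append(rng.multivariate_normal(beta_hat, V))
        return np.asarray(draws)

    rng = np.random.default_rng(0)
    X = rng.normal(size=(100, 2))
    y = np.exp(X @ np.array([1.0, -0.5]) + 0.3 * rng.normal(size=100))
    beta_draws = mc_transformed_regression(y, X)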
- Variational Nonlinear Kalman Filtering with Unknown Process Noise Covariance [24.23243651301339]
This paper presents a solution for joint nonlinear state estimation and model parameter identification based on the approximate Bayesian inference principle.
The performance of the proposed method is verified on radar target tracking applications by both simulated and real-world data.
arXiv Detail & Related papers (2023-05-06T03:34:39Z)
- Manifold Gaussian Variational Bayes on the Precision Matrix [70.44024861252554]
We propose an optimization algorithm for Variational Inference (VI) in complex models.
We develop an efficient algorithm for Gaussian Variational Inference whose updates satisfy the positive definite constraint on the variational covariance matrix.
Due to its black-box nature, the proposed algorithm (MGVBP, for Manifold Gaussian Variational Bayes on the Precision matrix) stands as a ready-to-use solution for VI in complex models.
arXiv Detail & Related papers (2022-10-26T10:12:31Z)
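The positive-definiteness requirement mentioned above can be met in several ways. The paper works on the manifold of symmetric positive definite matrices; the sketch below shows a simpler Cholesky-factor alternative (our assumption, not the paper's manifold retraction) that keeps the variational precision positive definite by construction.

    import numpy as np

    def precision_step(L, grad_P, step=0.01):
        # Parameterize the precision P = L @ L.T by its Cholesky factor L.
        # The chain rule for P = L L^T gives dF/dL = (G + G^T) @ L, so a
        # gradient step on L never leaves the positive (semi)definite cone.
        grad_L = (grad_P + grad_P.T) @ L
        L_new = L - step * grad_L
        return L_new, L_new @ L_new.T

    L = np.eye(2)
    G = np.array([[0.5, 0.1], [0.1, 0.3]])  # illustrative Euclidean gradient on P
    L, P = precision_step(L, G)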
- Amortized backward variational inference in nonlinear state-space models [0.0]
We consider the problem of state estimation in general state-space models using variational inference.
We establish for the first time that, under mixing assumptions, the variational approximation of expectations of additive state functionals induces an error which grows at most linearly in the number of observations.
arXiv Detail & Related papers (2022-06-01T08:35:54Z)
- Efficient CDF Approximations for Normalizing Flows [64.60846767084877]
We build upon the diffeomorphic properties of normalizing flows to estimate the cumulative distribution function (CDF) over a closed region.
Our experiments on popular flow architectures and UCI datasets show a marked improvement in sample efficiency as compared to traditional estimators.
arXiv Detail & Related papers (2022-02-23T06:11:49Z)
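For contrast with the boundary-based estimator of the entry above, the traditional estimator it improves upon is plain Monte Carlo: sample from the flow and count the fraction of samples landing in the region. The affine "flow" below is a toy stand-in for a trained normalizing flow.

    import numpy as np

    def mc_region_probability(flow_sample, lo, hi, n=100000, rng=None):
        # Baseline estimator of P(X in [lo, hi]) for a flow we can sample;
        # the paper's diffeomorphism-based method targets the same quantity
        # with better sample efficiency.
        rng = np.random.default_rng() if rng is None else rng
        x = flow_sample(n, rng)
        return np.all((x >= lo) & (x <= hi), axis=1).mean()

    A = np.array([[1.0, 0.3], [0.0, 0.8]])
    b = np.array([0.2, -0.1])
    p_hat = mc_region_probability(lambda n, rng: rng.normal(size=(n, 2)) @ A.T + b,
                                  lo=np.array([-1.0, -1.0]), hi=np.array([1.0, 1.0]))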
- A Variational Inference Approach to Inverse Problems with Gamma Hyperpriors [60.489902135153415]
This paper introduces a variational iterative alternating scheme for hierarchical inverse problems with gamma hyperpriors.
The proposed variational inference approach yields accurate reconstruction, provides meaningful uncertainty quantification, and is easy to implement.
arXiv Detail & Related papers (2021-11-26T06:33:29Z)
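A sketch of the alternating structure such schemes share, under simplifications of our own (unit noise variance, conditionally Gaussian prior x_i ~ N(0, theta_i) with theta_i ~ Gamma(beta, vartheta)): a quadratic solve for the reconstruction alternates with a closed-form update of the hyperparameters. The theta-update below is the stationary-point formula for this Gaussian-gamma hierarchy; the paper's variational scheme uses analogous closed-form updates.

    import numpy as np

    def alternating_gamma_hyperprior(A, b, beta=2.0, vartheta=1.0, n_iter=50):
        # x-step: minimize ||A x - b||^2 / 2 + sum_i x_i^2 / (2 theta_i)
        # theta-step: closed-form stationary point of the log-posterior in theta_i
        m, n = A.shape
        theta = np.ones(n)
        eta = beta - 1.5  # requires beta > 3/2
        for _ in range(n_iter):
            x = np.linalg.solve(A.T @ A + np.diag(1.0 / theta), A.T @ b)
            theta = vartheta * (eta / 2 + np.sqrt(eta ** 2 / 4 + x ** 2 / (2 * vartheta)))
        return x, theta

    rng = np.random.default_rng(0)
    A = rng.normal(size=(30, 60))
    x_true = np.zeros(60)
    x_true[[5, 17, 40]] = [3.0, -2.0, 1.5]
    b = A @ x_true + 0.05 * rng.normal(size=30)
    x_hat, theta_hat = alternating_gamma_hyperprior(A, b)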
- Loss function based second-order Jensen inequality and its application to particle variational inference [112.58907653042317]
Particle variational inference (PVI) uses an ensemble of models as an empirical approximation for the posterior distribution.
PVI iteratively updates each model with a repulsion force to ensure the diversity of the optimized models.
We derive a novel generalization error bound and show that it can be reduced by enhancing the diversity of models.
arXiv Detail & Related papers (2021-06-09T12:13:51Z)
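As a concrete instance of the ensemble-plus-repulsion dynamics described above, here is one step of Stein variational gradient descent (SVGD), a representative particle VI method; the paper's analysis covers PVI generally, so treat this as one member of the family rather than the paper's algorithm.

    import numpy as np

    def svgd_step(particles, grad_log_p, step=0.1, h=0.5):
        # Each particle follows a kernel-weighted average of the scores
        # (driving term) plus the kernel gradient (repulsion term that
        # keeps the ensemble of models diverse).
        n = particles.shape[0]
        diff = particles[:, None, :] - particles[None, :, :]    # x_i - x_j
        K = np.exp(-np.sum(diff ** 2, axis=-1) / (2 * h ** 2))  # RBF kernel
        scores = np.stack([grad_log_p(x) for x in particles])
        drive = K @ scores
        repulse = np.sum(K[:, :, None] * diff, axis=1) / h ** 2
        return particles + step * (drive + repulse) / n

    x = np.random.default_rng(0).normal(size=(30, 2)) * 3.0
    for _ in range(200):
        x = svgd_step(x, lambda v: -v)  # target: standard 2-D normal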
- Efficient Semi-Implicit Variational Inference [65.07058307271329]
We propose an efficient and scalable semi-implicit variational inference (SIVI) framework.
Our method optimizes a rigorous lower bound on the evidence.
arXiv Detail & Related papers (2021-01-15T11:39:09Z)
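The semi-implicit construction behind SIVI is easy to sketch: the conditional layer q(z | psi) stays explicit (Gaussian here) while the mixing distribution over psi is an implicit pushforward of noise through a network, so the marginal q(z) has no closed form. The network and dimensions below are illustrative assumptions.

    import numpy as np

    def siv_sample(n, W, b, sigma=0.5, rng=None):
        # psi = g(eps): implicit mixing distribution (pushforward of noise)
        # z | psi ~ N(psi, sigma^2 I): explicit conditional layer
        rng = np.random.default_rng() if rng is None else rng
        eps = rng.normal(size=(n, W.shape[0]))
        psi = np.tanh(eps @ W) + b
        z = psi + sigma * rng.normal(size=psi.shape)
        return z, psi

    rng = np.random.default_rng(0)
    W = rng.normal(size=(4, 2))
    b = np.array([0.5, -0.5])
    z, psi = siv_sample(1000, W, b)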
- Understanding Implicit Regularization in Over-Parameterized Single Index Model [55.41685740015095]
We design regularization-free algorithms for the high-dimensional single index model.
We provide theoretical guarantees for the induced implicit regularization phenomenon.
arXiv Detail & Related papers (2020-07-16T13:27:47Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.