Automatic structured variational inference
- URL: http://arxiv.org/abs/2002.00643v3
- Date: Wed, 10 Feb 2021 18:52:08 GMT
- Title: Automatic structured variational inference
- Authors: Luca Ambrogioni, Kate Lin, Emily Fertig, Sharad Vikram, Max Hinne,
Dave Moore, Marcel van Gerven
- Abstract summary: We introduce automatic structured variational inference (ASVI).
ASVI is a fully automated method for constructing structured variational families.
We find that ASVI provides a clear improvement in performance when compared with other popular approaches.
- Score: 12.557212589634112
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Stochastic variational inference offers an attractive option as a default
method for differentiable probabilistic programming. However, the performance
of the variational approach depends on the choice of an appropriate variational
family. Here, we introduce automatic structured variational inference (ASVI), a
fully automated method for constructing structured variational families,
inspired by the closed-form update in conjugate Bayesian models. These
convex-update families incorporate the forward pass of the input probabilistic
program and can therefore capture complex statistical dependencies.
Convex-update families have the same space and time complexity as the input
probabilistic program and are therefore tractable for a very large family of
models including both continuous and discrete variables. We validate our
automatic variational method on a wide range of low- and high-dimensional
inference problems. We find that ASVI provides a clear improvement in
performance when compared with other popular approaches such as the mean-field
approach and inverse autoregressive flows. We provide an open source
implementation of ASVI in TensorFlow Probability.
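To make the convex-update idea concrete, below is a minimal NumPy sketch for a Gaussian chain prior: each variational conditional keeps the prior's functional form, but every forward-pass parameter alpha is replaced by the convex combination lambda * alpha + (1 - lambda) * mu, with a learnable gate lambda in [0, 1] and a learnable parameter mu. The chain model, unit scales, and untrained parameter values are illustrative assumptions; the paper's open-source implementation ships with TensorFlow Probability (tfp.experimental.vi.build_asvi_surrogate_posterior in recent releases).

```python
# A minimal NumPy sketch of the ASVI "convex update" for a Gaussian chain
#   z1 ~ N(0, 1),  zt ~ N(z_{t-1}, 1)  for t = 2..T.
# Every prior parameter alpha is replaced by lambda * alpha + (1 - lambda) * mu,
# where lambda in [0, 1] and mu are learnable (here fixed at their initial
# values; a real run would optimize them against the ELBO).

import numpy as np

rng = np.random.default_rng(0)
T = 5

lam = np.full(T, 0.5)  # convex gates, constrained to [0, 1]
mu = np.zeros(T)       # learnable pseudo-observation locations

def sample_asvi_surrogate():
    """Draw one sample from the convex-update surrogate posterior."""
    z = np.empty(T)
    prev = 0.0                        # prior mean of z1
    for t in range(T):
        prior_loc = prev              # forward pass of the prior program
        loc = lam[t] * prior_loc + (1.0 - lam[t]) * mu[t]
        z[t] = rng.normal(loc, 1.0)
        prev = z[t]
    return z

print(sample_asvi_surrogate())
```

Because the prior's forward pass feeds each surrogate conditional, z_t still depends on z_{t-1}, which is how the family captures the statistical dependencies the abstract describes.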
Related papers
- Nonparametric Automatic Differentiation Variational Inference with
Spline Approximation [7.5620760132717795]
We develop a nonparametric approximation approach that enables flexible posterior approximation for distributions with complicated structures.
Compared with widely used nonparametric inference methods, the proposed method is easy to implement and adapts to various data structures.
Experiments demonstrate the efficiency of the proposed method in approximating complex posterior distributions and improving the performance of generative models with incomplete data.
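As a rough one-dimensional illustration of the spline idea, the sketch below fits a piecewise-linear (degree-1 spline) density to an unnormalized two-mode target by maximizing a quadrature estimate of the ELBO. The target, grid, knot placement, and use of quadrature instead of Monte Carlo are all illustrative assumptions, not the paper's method.

```python
# Fit a spline-like density q to an unnormalized target by maximizing a
# quadrature approximation of the ELBO over the knot logits.

import numpy as np
from scipy.optimize import minimize

xs = np.linspace(-5.0, 5.0, 201)    # quadrature grid
dx = xs[1] - xs[0]
knots = np.linspace(-5.0, 5.0, 12)  # spline knot locations

def log_target(x):
    # Unnormalized log-density with two modes.
    return np.logaddexp(-0.5 * (x + 2.0) ** 2, -0.5 * (x - 2.0) ** 2)

def q_density(logits):
    # Piecewise-linear (degree-1 spline) density: interpolate positive
    # knot weights, then normalize on the grid.
    weights = np.exp(logits)
    q = np.interp(xs, knots, weights)
    return q / (np.sum(q) * dx)

def neg_elbo(logits):
    q = q_density(logits)
    return -np.sum(q * (log_target(xs) - np.log(q))) * dx

res = minimize(neg_elbo, x0=np.zeros(len(knots)))
print("ELBO at optimum:", -res.fun)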
arXiv Detail & Related papers (2024-03-10T20:22:06Z)
- Rethinking Variational Inference for Probabilistic Programs with
Stochastic Support [23.07504711090434]
We introduce Support Decomposition Variational Inference (SDVI), a new variational inference (VI) approach for probabilistic programs with stochastic support.
SDVI instead breaks the program down into sub-programs with static support, before automatically building separate sub-guides for each.
This decomposition significantly aids in the construction of suitable variational families, enabling, in turn, substantial improvements in inference performance.
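A schematic reading of the decomposition: split the program at its stochastic branch into sub-programs with static support, attach a simple sub-guide to each, and treat the overall guide as their mixture. The toy coin-flip model, Gaussian sub-guides, and fixed mixture weights below are illustrative assumptions, not the SDVI algorithm itself.

```python
# Original program (stochastic support):
#   k ~ Bernoulli(0.3)
#   if k == 0: z is 1-D;  else: z is 2-D.
# SDVI-style decomposition: one static-support sub-guide per branch,
# combined as a mixture.

import numpy as np

rng = np.random.default_rng(0)

guides = {
    0: {"loc": np.zeros(1), "scale": np.ones(1)},  # sub-guide, branch k=0
    1: {"loc": np.zeros(2), "scale": np.ones(2)},  # sub-guide, branch k=1
}
branch_probs = np.array([0.3, 0.7])  # illustrative; would be learned,
                                     # e.g., from per-branch ELBO estimates

def sample_guide():
    """Sample (branch, z) from the mixture of sub-guides."""
    k = rng.choice([0, 1], p=branch_probs)
    g = guides[k]
    return k, rng.normal(g["loc"], g["scale"])

print(sample_guide())
```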
arXiv Detail & Related papers (2023-11-01T15:38:51Z)
- Diffusion models for probabilistic programming [56.47577824219207]
Diffusion Model Variational Inference (DMVI) is a novel method for automated approximate inference in probabilistic programming languages (PPLs).
DMVI is easy to implement, allows hassle-free inference in PPLs without the drawbacks of, e.g., variational inference using normalizing flows, and does not impose any constraints on the underlying neural network model.
arXiv Detail & Related papers (2023-11-01T12:17:05Z)
- Training and Inference on Any-Order Autoregressive Models the Right Way [97.39464776373902]
A family of Any-Order Autoregressive Models (AO-ARMs) has shown breakthrough performance in arbitrary conditional tasks.
We identify significant improvements to be made to previous formulations of AO-ARMs.
Our method leads to improved performance with no compromises on tractability.
arXiv Detail & Related papers (2022-05-26T18:00:02Z)
- Multivariate Probabilistic Regression with Natural Gradient Boosting [63.58097881421937]
We propose a Natural Gradient Boosting (NGBoost) approach based on nonparametrically modeling the conditional parameters of the multivariate predictive distribution.
Our method is robust, works out-of-the-box without extensive tuning, is modular with respect to the assumed target distribution, and performs competitively in comparison to existing approaches.
arXiv Detail & Related papers (2021-06-07T17:44:49Z)
- d3p -- A Python Package for Differentially-Private Probabilistic
Programming [16.91407609966877]
We present d3p, a software package designed to support runtime-efficient Bayesian inference under differential privacy guarantees.
d3p achieves general applicability to a wide range of probabilistic modelling problems by implementing the differentially private variational inference algorithm.
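At its core, differentially private variational inference follows the DP-SGD recipe: clip each example's ELBO gradient and add calibrated Gaussian noise before averaging. The sketch below shows only that mechanism; the clipping norm, noise multiplier, and dummy gradients are illustrative assumptions, and d3p's actual implementation (which builds on NumPyro) differs in detail.

```python
# A minimal sketch of a differentially private gradient step: per-example
# clipping followed by Gaussian noise, as in DP-SGD applied to the ELBO.

import numpy as np

rng = np.random.default_rng(0)

def private_grad(per_example_grads, clip_norm=1.0, noise_mult=1.1):
    """Clip each example's gradient and add calibrated noise."""
    clipped = []
    for g in per_example_grads:
        norm = np.linalg.norm(g)
        clipped.append(g * min(1.0, clip_norm / max(norm, 1e-12)))
    total = np.sum(clipped, axis=0)
    noise = rng.normal(0.0, noise_mult * clip_norm, size=total.shape)
    return (total + noise) / len(per_example_grads)

# Usage on dummy per-example ELBO gradients:
grads = [rng.normal(size=3) for _ in range(8)]
print(private_grad(grads))
```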
arXiv Detail & Related papers (2021-03-22T08:15:58Z)
- Automatic variational inference with cascading flows [6.252236971703546]
We present a new family of variational programs that embed the forward pass of the input probabilistic program.
A cascading flows program interposes a newly designed highway flow architecture in between the conditional distributions of the prior program.
We evaluate the performance of the new variational programs in a series of structured inference problems.
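The summary does not spell out the highway flow, so the sketch below shows one plausible gated-flow reading: the output is an element-wise convex combination of the input and an affine map of it, which keeps the Jacobian triangular (and the log-determinant cheap) when the matrix is lower triangular with a positive diagonal. Treat this as an assumption-laden illustration, not the paper's architecture.

```python
# A gated ("highway"-style) flow layer:
#   y = g * x + (1 - g) * (A @ x + b)
# With gates g in (0, 1) and lower-triangular A with positive diagonal,
# the Jacobian diag(g) + diag(1 - g) @ A is triangular with positive
# diagonal, so the layer is invertible with a cheap log-determinant.

import numpy as np

def highway_flow(x, gates, A, b):
    y = gates * x + (1.0 - gates) * (A @ x + b)
    diag = gates + (1.0 - gates) * np.diag(A)
    log_det = np.sum(np.log(diag))
    return y, log_det

rng = np.random.default_rng(0)
d = 3
A = np.tril(rng.normal(size=(d, d)))
A[np.diag_indices(d)] = np.exp(A[np.diag_indices(d)])  # positive diagonal
gates = 1.0 / (1.0 + np.exp(-rng.normal(size=d)))      # sigmoid gates
y, log_det = highway_flow(rng.normal(size=d), gates, A, b=np.zeros(d))
print(y, log_det)
```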
arXiv Detail & Related papers (2021-02-09T12:44:39Z)
- Conditional independence by typing [30.194205448457385]
A central goal of probabilistic programming languages (PPLs) is to separate modelling from inference.
Conditional independence (CI) relationships among parameters are a crucial aspect of probabilistic models.
We show that for a well-typed program in our system, the distribution it implements is guaranteed to have certain CI-relationships.
arXiv Detail & Related papers (2020-10-22T17:27:22Z)
- Probabilistic Circuits for Variational Inference in Discrete Graphical
Models [101.28528515775842]
Inference in discrete graphical models with variational methods is difficult.
Many sampling-based methods have been proposed for estimating the Evidence Lower Bound (ELBO).
We propose a new approach that leverages the tractability of probabilistic circuit models, such as Sum Product Networks (SPNs).
We show that selective-SPNs are suitable as an expressive variational distribution, and prove that when the log-density of the target model is a polynomial, the corresponding ELBO can be computed analytically.
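The analytic-ELBO observation is easy to see in the simplest tractable case: with a fully factorized Bernoulli variational distribution, expectations of monomials factor into products of marginals, so a polynomial log-density yields a closed-form ELBO. The pairwise (Ising-style) target below is an illustrative assumption, and a factorized distribution is only the simplest instance of the circuit families the paper uses.

```python
# Exact ELBO (up to the constant log Z) for a pairwise polynomial
# log-density over x in {0, 1}^n under a factorized Bernoulli q:
#   E_q[x_i] = q_i and E_q[x_i x_j] = q_i q_j for i != j,
# so no sampling is needed.

import numpy as np

rng = np.random.default_rng(0)
n = 4
theta = rng.normal(size=n)                 # linear terms of log p~(x)
W = np.triu(rng.normal(size=(n, n)), k=1)  # pairwise terms (i < j)

def elbo(q):
    e_logp = theta @ q + q @ W @ q          # monomial expectations factor
    h = -np.sum(q * np.log(q) + (1 - q) * np.log(1 - q))  # Bernoulli entropy
    return e_logp + h

print(elbo(np.full(n, 0.5)))
```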
arXiv Detail & Related papers (2020-10-22T05:04:38Z)
- Identification of Probability weighted ARX models with arbitrary domains [75.91002178647165]
PieceWise Affine models guarantee universal approximation, local linearity, and equivalence to other classes of hybrid systems.
In this work, we focus on the identification of PieceWise Auto Regressive with eXogenous input models with arbitrary regions (NPWARX).
The architecture is conceived following the Mixture of Experts concept, developed within the machine learning field.
arXiv Detail & Related papers (2020-09-29T12:50:33Z)
- Robust, Accurate Stochastic Optimization for Variational Inference [68.83746081733464]
We show that common optimization methods lead to poor variational approximations if the problem is moderately large.
Motivated by these findings, we develop a more robust and accurate optimization framework by viewing the underlying algorithm as producing a Markov chain.
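To illustrate the Markov-chain view on a toy problem: treat the tail of the optimization trajectory as approximate draws from a stationary distribution, check stationarity with a split-R-hat statistic borrowed from MCMC, and report the iterate average rather than the last iterate. The quadratic objective, fixed step size, and window sizes below are illustrative assumptions, not the paper's algorithm.

```python
# Treat SGD iterates as a Markov chain: diagnose approximate stationarity
# with split-R-hat on the tail of the trajectory, then report the iterate
# average, which smooths out the gradient noise.

import numpy as np

rng = np.random.default_rng(0)

def noisy_grad(x):
    return 2.0 * (x - 3.0) + rng.normal(scale=2.0)  # grad of (x - 3)^2

x, trace = 0.0, []
for _ in range(2000):
    x -= 0.05 * noisy_grad(x)
    trace.append(x)

tail = np.array(trace[-1000:])
half1, half2 = tail[:500], tail[500:]
within = 0.5 * (half1.var(ddof=1) + half2.var(ddof=1))
between = 500 * np.var([half1.mean(), half2.mean()], ddof=1)
r_hat = np.sqrt((within * (499 / 500) + between / 500) / within)

print("last iterate:", trace[-1])
print("iterate average:", tail.mean(), "split R-hat:", r_hat)
```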
arXiv Detail & Related papers (2020-09-01T19:12:11Z)