Conditional independence by typing
- URL: http://arxiv.org/abs/2010.11887v2
- Date: Fri, 18 Feb 2022 14:19:22 GMT
- Title: Conditional independence by typing
- Authors: Maria I. Gorinova, Andrew D. Gordon, Charles Sutton, Matthijs Vákár
- Abstract summary: A central goal of probabilistic programming languages (PPLs) is to separate modelling from inference.
Conditional independence (CI) relationships among parameters are a crucial aspect of probabilistic models.
We show that for a well-typed program in our system, the distribution it implements is guaranteed to have certain CI-relationships.
- Score: 30.194205448457385
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: A central goal of probabilistic programming languages (PPLs) is to separate
modelling from inference. However, this goal is hard to achieve in practice.
Users are often forced to re-write their models in order to improve efficiency
of inference or meet restrictions imposed by the PPL. Conditional independence
(CI) relationships among parameters are a crucial aspect of probabilistic
models that capture a qualitative summary of the specified model and can
facilitate more efficient inference. We present an information flow type system
for probabilistic programming that captures conditional independence (CI)
relationships, and show that, for a well-typed program in our system, the
distribution it implements is guaranteed to have certain CI-relationships.
Further, by using type inference, we can statically deduce which CI-properties
are present in a specified model. As a practical application, we consider the
problem of how to perform inference on models with mixed discrete and
continuous parameters. Inference on such models is challenging in many existing
PPLs, but can be improved through a workaround, where the discrete parameters
are used implicitly, at the expense of manual model re-writing. We present a
source-to-source semantics-preserving transformation, which uses our CI-type
system to automate this workaround by eliminating the discrete parameters from
a probabilistic program. The resulting program can be seen as a hybrid
inference algorithm on the original program, where continuous parameters can be
drawn using efficient gradient-based inference methods, while the discrete
parameters are inferred using variable elimination. We implement our CI-type
system and its example application in SlicStan: a compositional variant of
Stan.
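The workaround the abstract describes, and which the transformation automates, amounts to summing the discrete parameters out of the joint density so that only continuous parameters remain. A minimal Python sketch under illustrative assumptions (a two-component Gaussian mixture with a single Bernoulli parameter z; the model, names, and numbers are ours, not the paper's):

```python
import math

def log_gaussian(x, mu, sigma):
    """Log-density of a univariate Gaussian."""
    return -0.5 * math.log(2 * math.pi * sigma ** 2) - (x - mu) ** 2 / (2 * sigma ** 2)

def log_joint(z, x, theta):
    """Joint log-density with an explicit discrete parameter z in {0, 1}.
    Gradient-based samplers (e.g. HMC) cannot handle z directly."""
    log_prior_z = math.log(0.5)  # z ~ Bernoulli(0.5)
    mu = theta[z]                # component mean selected by z
    return log_prior_z + log_gaussian(x, mu, 1.0)

def log_marginal(x, theta):
    """Discrete parameter eliminated: sum over both values of z.
    The result depends only on the continuous parameters theta."""
    terms = [log_joint(z, x, theta) for z in (0, 1)]
    m = max(terms)               # log-sum-exp for numerical stability
    return m + math.log(sum(math.exp(t - m) for t in terms))

print(log_marginal(0.3, (-1.0, 1.0)))
```

After this elimination, theta can be drawn with gradient-based methods such as HMC/NUTS, while the posterior over z is recovered per draw by normalising the two joint terms — the "hybrid inference" view described above.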
Related papers
- Correct-by-Construction Control for Stochastic and Uncertain Dynamical
Models via Formal Abstractions [44.99833362998488]
We develop an abstraction framework that can be used to solve this problem under various modeling assumptions.
We use state-of-the-art verification techniques to compute an optimal policy on the iMDP with guarantees for satisfying the given specification.
We then show that, by construction, we can refine this policy into a feedback controller for which these guarantees carry over to the dynamical model.
arXiv Detail & Related papers (2023-11-16T11:03:54Z)
- Diffusion models for probabilistic programming [56.47577824219207]
Diffusion Model Variational Inference (DMVI) is a novel method for automated approximate inference in probabilistic programming languages (PPLs).
DMVI is easy to implement, allows hassle-free inference in PPLs without the drawbacks of, e.g., variational inference using normalizing flows, and places no constraints on the underlying neural network model.
arXiv Detail & Related papers (2023-11-01T12:17:05Z)
- Variable Importance Matching for Causal Inference [73.25504313552516]
We describe a general framework called Model-to-Match that achieves these goals.
Model-to-Match uses variable importance measurements to construct a distance metric.
We operationalize the Model-to-Match framework with LASSO.
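The idea in this blurb can be illustrated with a small sketch: variable-importance scores (here hypothetical weights standing in for LASSO coefficient magnitudes) define a weighted distance, and each treated unit is matched to its nearest control under that metric. All data and numbers below are illustrative, not from the paper.

```python
def weighted_distance(a, b, importance):
    """Distance metric that stretches axes by variable importance,
    so matching is tight on covariates that matter for the outcome."""
    return sum(w * (x - y) ** 2 for w, x, y in zip(importance, a, b)) ** 0.5

def match(treated, controls, importance):
    """Match each treated unit to the index of its nearest control
    under the importance-weighted metric (a stand-in for
    Model-to-Match's LASSO-derived weights)."""
    return [
        min(range(len(controls)),
            key=lambda j: weighted_distance(t, controls[j], importance))
        for t in treated
    ]

# Hypothetical data: covariate 0 is important (weight 1.0), covariate 1 is noise (weight 0.0).
treated = [(1.0, 9.0)]
controls = [(1.1, 0.0), (5.0, 9.0)]
print(match(treated, controls, importance=(1.0, 0.0)))  # matches on covariate 0 only
```

Note that plain Euclidean distance would pick the other control here; down-weighting the noise covariate is what makes the match meaningful.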
arXiv Detail & Related papers (2023-02-23T00:43:03Z)
- Training and Inference on Any-Order Autoregressive Models the Right Way [97.39464776373902]
A family of Any-Order Autoregressive Models (AO-ARMs) has shown breakthrough performance in arbitrary conditional tasks.
We identify significant improvements to be made to previous formulations of AO-ARMs.
Our method leads to improved performance with no compromises on tractability.
arXiv Detail & Related papers (2022-05-26T18:00:02Z)
- d3p -- A Python Package for Differentially-Private Probabilistic Programming [16.91407609966877]
We present d3p, a software package designed to help field runtime-efficient Bayesian inference under differential privacy guarantees.
d3p achieves general applicability to a wide range of probabilistic modelling problems by implementing the differentially private variational inference algorithm.
arXiv Detail & Related papers (2021-03-22T08:15:58Z)
- Autoregressive Score Matching [113.4502004812927]
We propose autoregressive conditional score models (AR-CSM), where we parameterize the joint distribution in terms of the derivatives of univariate log-conditionals (scores).
For AR-CSM models, this divergence between data and model distributions can be computed and optimized efficiently, requiring no expensive sampling or adversarial training.
We show with extensive experimental results that it can be applied to density estimation on synthetic data, image generation, image denoising, and training latent variable models with implicit encoders.
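As a toy illustration of the quantity AR-CSM parameterizes: a score is the derivative of a log-density, which for a univariate Gaussian has the closed form -(x - mu)/sigma^2. The sketch below (our example, not the paper's model) checks the analytic score against a finite difference; AR-CSM would instead learn such per-conditional derivatives with neural networks.

```python
def log_density(x, mu=0.0, sigma=1.0):
    """Unnormalised Gaussian log-density (the normaliser drops out of the score)."""
    return -((x - mu) ** 2) / (2 * sigma ** 2)

def score(x, mu=0.0, sigma=1.0):
    """Analytic score: d/dx log p(x) = -(x - mu) / sigma^2."""
    return -(x - mu) / sigma ** 2

def numerical_score(x, eps=1e-6):
    """Central finite difference approximating the same derivative."""
    return (log_density(x + eps) - log_density(x - eps)) / (2 * eps)

print(score(0.5), numerical_score(0.5))
```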
arXiv Detail & Related papers (2020-10-24T07:01:24Z)
- Probabilistic Circuits for Variational Inference in Discrete Graphical Models [101.28528515775842]
Inference in discrete graphical models with variational methods is difficult.
Many sampling-based methods have been proposed for estimating the Evidence Lower Bound (ELBO).
We propose a new approach that leverages the tractability of probabilistic circuit models, such as Sum-Product Networks (SPNs).
We show that selective-SPNs are suitable as an expressive variational distribution, and prove that when the log-density of the target model is a polynomial, the corresponding ELBO can be computed analytically.
arXiv Detail & Related papers (2020-10-22T05:04:38Z)
- Deep Conditional Transformation Models [0.0]
Learning the cumulative distribution function (CDF) of an outcome variable conditional on a set of features remains challenging.
Conditional transformation models provide a semi-parametric approach that allows modelling a large class of conditional CDFs.
We propose a novel network architecture, provide details on different model definitions and derive suitable constraints.
arXiv Detail & Related papers (2020-10-15T16:25:45Z)
- Control as Hybrid Inference [62.997667081978825]
We present an implementation of CHI which naturally mediates the balance between iterative and amortised inference.
We verify the scalability of our algorithm on a continuous control benchmark, demonstrating that it outperforms strong model-free and model-based baselines.
arXiv Detail & Related papers (2020-07-11T19:44:09Z)
- Automatic structured variational inference [12.557212589634112]
We introduce automatic structured variational inference (ASVI).
ASVI is a fully automated method for constructing structured variational families.
We find that ASVI provides a clear improvement in performance when compared with other popular approaches.
arXiv Detail & Related papers (2020-02-03T10:52:30Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information above and is not responsible for any consequences of its use.