$\omega$PAP Spaces: Reasoning Denotationally About Higher-Order,
Recursive Probabilistic and Differentiable Programs
- URL: http://arxiv.org/abs/2302.10636v2
- Date: Thu, 25 May 2023 22:19:58 GMT
- Title: $\omega$PAP Spaces: Reasoning Denotationally About Higher-Order,
Recursive Probabilistic and Differentiable Programs
- Authors: Mathieu Huot, Alexander K. Lew, Vikash K. Mansinghka, Sam Staton
- Abstract summary: $\omega$PAP spaces are spaces for reasoning denotationally about expressive differentiable and probabilistic programming languages.
Our semantics is general enough to assign meanings to most practical probabilistic and differentiable programs.
We establish the almost-everywhere differentiability of probabilistic programs' trace density functions.
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: We introduce a new setting, the category of $\omega$PAP spaces, for reasoning
denotationally about expressive differentiable and probabilistic programming
languages. Our semantics is general enough to assign meanings to most practical
probabilistic and differentiable programs, including those that use general
recursion, higher-order functions, discontinuous primitives, and both discrete
and continuous sampling. But crucially, it is also specific enough to exclude
many pathological denotations, enabling us to establish new results about both
deterministic differentiable programs and probabilistic programs. In the
deterministic setting, we prove very general correctness theorems for automatic
differentiation and its use within gradient descent. In the probabilistic
setting, we establish the almost-everywhere differentiability of probabilistic
programs' trace density functions, and the existence of convenient base
measures for density computation in Monte Carlo inference. In some cases these
results were previously known, but required detailed proofs with an operational
flavor; by contrast, all our proofs work directly with programs' denotations.
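The almost-everywhere differentiability of trace densities can be illustrated with a minimal sketch (the program, names, and density below are illustrative, not taken from the paper): a probabilistic program that draws one sample and branches on it has a trace density that is differentiable everywhere except at the branch point, i.e. almost everywhere.

```python
import math

# Illustrative sketch: a one-variable probabilistic program whose trace is a
# single draw x ~ Normal(0, 1), followed by a branch on the sign of x.
# The resulting trace density is differentiable except at x = 0 -- the kind
# of almost-everywhere guarantee the omega-PAP semantics establishes.

def normal_pdf(x, mu=0.0, sigma=1.0):
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def trace_density(x):
    # the branch introduces a discontinuity in the likelihood weight at x = 0
    weight = 2.0 if x > 0 else 0.5
    return weight * normal_pdf(x)

def numerical_grad(f, x, h=1e-6):
    # central finite difference; valid away from the branch point x = 0
    return (f(x + h) - f(x - h)) / (2 * h)
```

Away from x = 0 the density is smooth, so gradient-based inference (e.g. Langevin or Hamiltonian Monte Carlo moves) remains well-defined on a full-measure set of traces.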
Related papers
- Generalizing Stochastic Smoothing for Differentiation and Gradient Estimation [59.86921150579892]
We deal with the problem of gradient estimation for differentiable relaxations of algorithms, operators, simulators, and other non-differentiable functions.
We develop variance reduction strategies for differentiable sorting and ranking, differentiable shortest-paths on graphs, differentiable rendering for pose estimation, as well as differentiable cryo-ET simulations.
arXiv Detail & Related papers (2024-10-10T17:10:00Z) - Exact Bayesian Inference on Discrete Models via Probability Generating
Functions: A Probabilistic Programming Approach [7.059472280274009]
We present an exact Bayesian inference method for discrete statistical models.
We use a probabilistic programming language that supports discrete and continuous sampling, discrete observations, affine functions, (stochastic) branching, and conditioning on discrete events.
Our inference method is provably correct and fully automated.
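The flavor of exact inference on discrete models can be sketched in a few lines (the model and function names below are illustrative and not the paper's tool, which manipulates probability generating functions symbolically): a Poisson prior thinned by independent detections, conditioned on an observed count, admits an exact posterior computable by truncated enumeration.

```python
from math import exp, factorial, comb

# Illustrative sketch: n ~ Poisson(lam); each of the n items is independently
# detected with probability p; we observe k detections and ask for the exact
# posterior over n. Closed form for comparison: n | k  =  k + Poisson(lam*(1-p)).

def poisson_pmf(n, lam):
    return exp(-lam) * lam ** n / factorial(n)

def binomial_pmf(k, n, p):
    return comb(n, k) * p ** k * (1 - p) ** (n - k) if k <= n else 0.0

def posterior(k, lam, p, n_max=60):
    # truncated support is exact up to a tail that vanishes rapidly for small lam
    joint = [poisson_pmf(n, lam) * binomial_pmf(k, n, p) for n in range(n_max + 1)]
    z = sum(joint)
    return [j / z for j in joint]
```

The closed form falls out of the algebra: the joint weight is proportional to [lam*(1-p)]^(n-k)/(n-k)!, which normalizes to a shifted Poisson.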
arXiv Detail & Related papers (2023-05-26T16:09:59Z) - Probabilistic unifying relations for modelling epistemic and aleatoric uncertainty: semantics and automated reasoning with theorem proving [0.3441021278275805]
Probabilistic programming combines general computer programming, statistical inference, and formal semantics.
ProbURel is based on Hehner's predicative probabilistic programming, but there are several obstacles to the broader adoption of his work.
Our contributions include the formalisation of relations using Unifying Theories of Programming (UTP) and probabilities outside the brackets.
We demonstrate our work with six examples, including problems in robot localisation, classification in machine learning, and the termination of probabilistic loops.
arXiv Detail & Related papers (2023-03-16T23:36:57Z) - Generalized Differentiable RANSAC [95.95627475224231]
$\nabla$-RANSAC is a differentiable RANSAC that allows learning the entire randomized robust estimation pipeline.
$\nabla$-RANSAC is superior to the state-of-the-art in terms of accuracy while running at a similar speed to its less accurate alternatives.
arXiv Detail & Related papers (2022-12-26T15:13:13Z) - Logical Credal Networks [87.25387518070411]
This paper introduces Logical Credal Networks, an expressive probabilistic logic that generalizes many prior models that combine logic and probability.
We investigate its performance on maximum a posteriori inference tasks, including solving Mastermind games with uncertainty and detecting credit card fraud.
arXiv Detail & Related papers (2021-09-25T00:00:47Z) - Probabilistic Gradient Boosting Machines for Large-Scale Probabilistic
Regression [51.770998056563094]
Probabilistic Gradient Boosting Machines (PGBM) is a method to create probabilistic predictions with a single ensemble of decision trees.
We empirically demonstrate the advantages of PGBM compared to existing state-of-the-art methods.
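The core idea behind probabilistic predictions from a single tree ensemble can be sketched as follows (a minimal illustration of the principle, not the PGBM library's API): each leaf stores the mean and variance of its training targets, so one fitted model yields the parameters of a predictive distribution rather than a point estimate.

```python
import statistics

# Illustrative sketch: a one-split regression "stump" whose leaves store
# (mean, variance) of the training targets that fall into them. A boosted
# ensemble of such trees can then output a full predictive distribution.

def leaf_stats(targets):
    return statistics.mean(targets), statistics.pvariance(targets)

def fit_stump(xs, ys, threshold):
    left = [y for x, y in zip(xs, ys) if x <= threshold]
    right = [y for x, y in zip(xs, ys) if x > threshold]
    return {"threshold": threshold, "left": leaf_stats(left), "right": leaf_stats(right)}

def predict(stump, x):
    # returns (mu, var), e.g. the parameters of a Normal predictive distribution
    return stump["left"] if x <= stump["threshold"] else stump["right"]
```

Storing second moments per leaf costs almost nothing at training time, which is why a single ensemble suffices where earlier probabilistic methods trained many.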
arXiv Detail & Related papers (2021-06-03T08:32:13Z) - Compositional Semantics for Probabilistic Programs with Exact
Conditioning [0.0]
We define a probabilistic programming language for Gaussian random variables with a first-class exact conditioning construct.
We show that the good properties of our language are not particular to Gaussians, but can be derived from universal properties.
arXiv Detail & Related papers (2021-01-27T12:31:18Z) - Tractable Inference in Credal Sentential Decision Diagrams [116.6516175350871]
Probabilistic sentential decision diagrams are logic circuits where the inputs of disjunctive gates are annotated by probability values.
We develop the credal sentential decision diagrams, a generalisation of their probabilistic counterpart that allows for replacing the local probabilities with credal sets of mass functions.
For a first empirical validation, we consider a simple application based on noisy seven-segment display images.
arXiv Detail & Related papers (2020-08-19T16:04:34Z) - Densities of Almost Surely Terminating Probabilistic Programs are
Differentiable Almost Everywhere [1.911678487931003]
We study the differential properties of higher-order statistical probabilistic programs with recursion and conditioning.
A by-product of this work is that almost-surely terminating deterministic (S)PCF programs with real parameters denote functions that are almost-everywhere differentiable.
arXiv Detail & Related papers (2020-04-08T10:40:14Z) - Stochastically Differentiable Probabilistic Programs [18.971852464650144]
The presence of discrete random variables prevents the use of many basic gradient-based inference engines.
We present a novel approach to running inference efficiently and robustly in such programs using the Markov chain Monte Carlo family of algorithms.
arXiv Detail & Related papers (2020-03-02T08:04:41Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.