Probabilistic Circuits for Cumulative Distribution Functions
- URL: http://arxiv.org/abs/2408.04229v1
- Date: Thu, 8 Aug 2024 05:33:21 GMT
- Title: Probabilistic Circuits for Cumulative Distribution Functions
- Authors: Oliver Broadrick, William Cao, Benjie Wang, Martin Trapp, Guy Van den Broeck
- Abstract summary: We show that for distributions over binary random variables these representations (PMF and CDF) are equivalent, in the sense that one can be transformed to the other in polynomial time.
For continuous variables, smooth, decomposable PCs computing PDFs and CDFs can be efficiently transformed to each other by modifying only the leaves of the circuit.
- Score: 27.511363113215374
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: A probabilistic circuit (PC) succinctly expresses a function that represents a multivariate probability distribution and, given sufficient structural properties of the circuit, supports efficient probabilistic inference. Typically a PC computes the probability mass (or density) function (PMF or PDF) of the distribution. We consider PCs instead computing the cumulative distribution function (CDF). We show that for distributions over binary random variables these representations (PMF and CDF) are essentially equivalent, in the sense that one can be transformed to the other in polynomial time. We then show how a similar equivalence holds for distributions over finite discrete variables using a modification of the standard encoding with binary variables that aligns with the CDF semantics. Finally we show that for continuous variables, smooth, decomposable PCs computing PDFs and CDFs can be efficiently transformed to each other by modifying only the leaves of the circuit.
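To make the continuous-variable claim concrete, here is a minimal sketch of a smooth, decomposable PC in which switching from PDF to CDF semantics only requires swapping each univariate leaf's density for its CDF. This is an illustrative assumption-laden example, not the paper's implementation: the Leaf/Product/Sum classes, the function to_cdf_circuit, and the toy two-variable mixture are all my own naming choices.

```python
# Sketch: smooth, decomposable PC over continuous variables.
# PDF -> CDF conversion touches only the leaves (see the paper's claim);
# the internal sum/product structure is left unchanged.
import math
from dataclasses import dataclass
from typing import List


def gauss_pdf(x: float, mu: float, sigma: float) -> float:
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))


def gauss_cdf(x: float, mu: float, sigma: float) -> float:
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2))))


@dataclass
class Leaf:
    var: int            # index of the single variable in this leaf's scope
    mu: float
    sigma: float
    use_cdf: bool = False

    def eval(self, x: List[float]) -> float:
        f = gauss_cdf if self.use_cdf else gauss_pdf
        return f(x[self.var], self.mu, self.sigma)


@dataclass
class Product:
    children: list       # decomposable: children have pairwise disjoint scopes

    def eval(self, x: List[float]) -> float:
        out = 1.0
        for c in self.children:
            out *= c.eval(x)
        return out


@dataclass
class Sum:
    children: list       # smooth: children share the same scope
    weights: List[float]  # nonnegative, summing to 1

    def eval(self, x: List[float]) -> float:
        return sum(w * c.eval(x) for w, c in zip(self.weights, self.children))


def to_cdf_circuit(node):
    """Turn a PDF-computing PC into a CDF-computing PC by editing leaves only.

    Decomposable product: disjoint scopes, so the integral of the product is the
    product of the integrals. Smooth sum: the integral of the mixture is the
    weighted sum of the integrals. Hence no internal node needs to change.
    """
    if isinstance(node, Leaf):
        return Leaf(node.var, node.mu, node.sigma, use_cdf=True)
    if isinstance(node, Product):
        return Product([to_cdf_circuit(c) for c in node.children])
    return Sum([to_cdf_circuit(c) for c in node.children], node.weights)


if __name__ == "__main__":
    # Toy mixture over two variables: 0.3 * N(0,1)N(1,2) + 0.7 * N(2,1)N(-1,1)
    pdf_pc = Sum(
        children=[
            Product([Leaf(0, 0.0, 1.0), Leaf(1, 1.0, 2.0)]),
            Product([Leaf(0, 2.0, 1.0), Leaf(1, -1.0, 1.0)]),
        ],
        weights=[0.3, 0.7],
    )
    cdf_pc = to_cdf_circuit(pdf_pc)
    print("density at (0, 0):       ", pdf_pc.eval([0.0, 0.0]))
    print("P(X1 <= 0, X2 <= 0):     ", cdf_pc.eval([0.0, 0.0]))
```

The same leaf-only edit works in the other direction (CDF to PDF) by differentiating each leaf, which is the sense in which the two representations are interchangeable for smooth, decomposable circuits.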
Related papers
- Exact spectral form factors of non-interacting fermions with Dyson statistics [0.0]
The spectral form factor (SFF) is a powerful diagnostic of random matrix behavior in quantum many-body systems.
We introduce a family of random circuit ensembles whose SFFs can be computed exactly.
arXiv Detail & Related papers (2024-10-10T18:00:00Z) - Polynomial Semantics of Tractable Probabilistic Circuits [29.3642918977097]
We show that each of these circuit models is equivalent, in the sense that any circuit for one of them can be transformed into a circuit for any of the others with only a polynomial increase in size.
They are all tractable for marginal inference on the same class of distributions.
arXiv Detail & Related papers (2024-02-14T11:02:04Z) - Characteristic Circuits [26.223089423713486]
Probabilistic circuits (PCs) compose simple, tractable distributions into a high-dimensional probability distribution.
We introduce characteristic circuits (CCs) providing a unified formalization of distributions over heterogeneous data in the spectral domain.
We show that CCs outperform state-of-the-art density estimators for heterogeneous data domains on common benchmark data sets.
arXiv Detail & Related papers (2023-12-12T23:15:07Z) - Matching Normalizing Flows and Probability Paths on Manifolds [57.95251557443005]
Continuous Normalizing Flows (CNFs) are generative models that transform a prior distribution to a model distribution by solving an ordinary differential equation (ODE).
We propose to train CNFs by minimizing probability path divergence (PPD), a novel family of divergences between the probability density path generated by the CNF and a target probability density path.
We show that CNFs learned by minimizing PPD achieve state-of-the-art results in likelihoods and sample quality on existing low-dimensional manifold benchmarks.
arXiv Detail & Related papers (2022-07-11T08:50:19Z) - Efficient CDF Approximations for Normalizing Flows [64.60846767084877]
We build upon the diffeomorphic properties of normalizing flows to estimate the cumulative distribution function (CDF) over a closed region.
Our experiments on popular flow architectures and UCI datasets show a marked improvement in sample efficiency as compared to traditional estimators.
arXiv Detail & Related papers (2022-02-23T06:11:49Z) - GFlowNet Foundations [66.69854262276391]
Generative Flow Networks (GFlowNets) have been introduced as a method to sample a diverse set of candidates in an active learning context.
We show a number of additional theoretical properties of GFlowNets.
arXiv Detail & Related papers (2021-11-17T17:59:54Z) - Top-N: Equivariant set and graph generation without exchangeability [61.24699600833916]
We consider one-shot probabilistic decoders that map a vector-shaped prior to a distribution over sets or graphs.
These functions can be integrated into variational autoencoders (VAE), generative adversarial networks (GAN) or normalizing flows.
Top-n is a deterministic, non-exchangeable set creation mechanism which learns to select the most relevant points from a trainable reference set.
arXiv Detail & Related papers (2021-10-05T14:51:19Z) - Probabilistic Kolmogorov-Arnold Network [1.4732811715354455]
The present paper proposes a method for estimating probability distributions of the outputs in the case of aleatoric uncertainty.
The suggested approach covers input-dependent probability distributions of the outputs, as well as the variation of the distribution type with the inputs.
Although the method is applicable to any regression model, the present paper combines it with KANs, since the specific structure of KANs leads to computationally efficient model construction.
arXiv Detail & Related papers (2021-04-04T23:49:15Z) - Probabilistic Generating Circuits [50.98473654244851]
We propose probabilistic generating circuits (PGCs) for efficiently representing probability generating functions.
PGCs are not just a theoretical framework that unifies vastly different existing models, but also show huge potential in modeling realistic data.
We exhibit a simple class of PGCs that are not trivially subsumed by simple combinations of PCs and DPPs, and obtain competitive performance on a suite of density estimation benchmarks.
arXiv Detail & Related papers (2021-02-19T07:06:53Z) - Linear Optimal Transport Embedding: Provable Wasserstein classification for certain rigid transformations and perturbations [79.23797234241471]
Discriminating between distributions is an important problem in a number of scientific fields.
The Linear Optimal Transportation (LOT) embedding maps the space of distributions into an $L^2$-space.
We demonstrate the benefits of LOT on a number of distribution classification problems.
arXiv Detail & Related papers (2020-08-20T19:09:33Z) - Densities of Almost Surely Terminating Probabilistic Programs are Differentiable Almost Everywhere [1.911678487931003]
We study the differential properties of higher-order statistical probabilistic programs with recursion and conditioning.
A by-product of this work is that almost-surely terminating deterministic (S)PCF programs with real parameters denote functions that are almost-everywhere differentiable.
arXiv Detail & Related papers (2020-04-08T10:40:14Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information it provides and is not responsible for any consequences of its use.