Probabilistic Generating Circuits
- URL: http://arxiv.org/abs/2102.09768v1
- Date: Fri, 19 Feb 2021 07:06:53 GMT
- Title: Probabilistic Generating Circuits
- Authors: Honghua Zhang, Brendan Juba, Guy Van den Broeck
- Abstract summary: We propose probabilistic generating circuits (PGCs) as an efficient representation of generating functions.
PGCs are not just a theoretical framework that unifies vastly different existing models; they also show strong potential for modeling realistic data.
We exhibit a simple class of PGCs that are not trivially subsumed by simple combinations of PCs and DPPs, and obtain competitive performance on a suite of density estimation benchmarks.
- Score: 50.98473654244851
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Generating functions, which are widely used in combinatorics and probability
theory, encode function values into the coefficients of a polynomial. In this
paper, we explore their use as a tractable probabilistic model, and propose
probabilistic generating circuits (PGCs) for their efficient representation.
PGCs strictly subsume many existing tractable probabilistic models, including
determinantal point processes (DPPs), probabilistic circuits (PCs) such as
sum-product networks, and tractable graphical models. We contend that PGCs are
not just a theoretical framework that unifies vastly different existing models,
but also show huge potential in modeling realistic data. We exhibit a simple
class of PGCs that are not trivially subsumed by simple combinations of PCs and
DPPs, and obtain competitive performance on a suite of density estimation
benchmarks. We also highlight PGCs' connection to the theory of strongly
Rayleigh distributions.
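To make the central object concrete, here is a minimal sketch (not from the paper; the two-variable distribution is made up) of a probability generating polynomial, the kind of object a PGC represents as a compact circuit. Coefficients encode joint probabilities, and marginals are read off by substitution and coefficient extraction.

```python
# A minimal sketch: the probability generating polynomial of a made-up
# distribution over two binary variables X1, X2.
import sympy as sp

z1, z2 = sp.symbols('z1 z2')

# Joint distribution Pr(X1=a, X2=b) encoded as the coefficient of z1^a * z2^b.
g = sp.expand(sp.Rational(1, 8)               # Pr(0, 0)
              + sp.Rational(3, 8) * z1        # Pr(1, 0)
              + sp.Rational(2, 8) * z2        # Pr(0, 1)
              + sp.Rational(2, 8) * z1 * z2)  # Pr(1, 1)

# Normalization: evaluating at z = 1 sums all coefficients.
assert g.subs({z1: 1, z2: 1}) == 1

# Marginal Pr(X1 = 1): setting z2 = 1 sums X2 out, then the
# coefficient of z1 is the marginal.
marginal_x1 = sp.Poly(g.subs(z2, 1), z1).coeff_monomial(z1)
print(marginal_x1)  # 5/8  (= 3/8 + 2/8)
```

Setting a variable's indeterminate to 1 marginalizes it out, which is why evaluation and coefficient extraction suffice for tractable inference over such polynomials.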
Related papers
- Sum of Squares Circuits [8.323409122604893]
Probabilistic circuits (PCs) offer a framework in which the tractability-vs-expressiveness trade-off can be analyzed theoretically.
We show that squared PCs encoding subtractive mixtures via negative parameters can be exponentially more expressive than monotonic PCs.
We formalize a novel class of PCs -- sum of squares PCs -- that can be exponentially more expressive than both squared and monotonic PCs.
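As a quick illustration of why subtractive mixtures remain valid models after squaring, here is a hedged numeric sketch (my own toy Gaussians, not the paper's circuit construction): a weighted difference of densities can go negative, but its square is nonnegative everywhere and can be renormalized.

```python
# A subtractive mixture w1*f1(x) - w2*f2(x) may be negative; its square
# is nonnegative by construction and renormalizes into a valid density.
import numpy as np
from scipy.stats import norm
from scipy.integrate import quad

f1 = norm(loc=0.0, scale=1.0).pdf
f2 = norm(loc=0.0, scale=0.5).pdf

def subtractive(x):
    # Negative around x = 0 with these weights.
    return 0.5 * f1(x) - 0.5 * f2(x)

def squared(x):
    return subtractive(x) ** 2   # nonnegative everywhere

Z, _ = quad(squared, -np.inf, np.inf)     # renormalization constant
density = lambda x: squared(x) / Z
print(quad(density, -np.inf, np.inf)[0])  # ~1.0: a valid density
```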
arXiv Detail & Related papers (2024-08-21T17:08:05Z)
- Probabilistic Circuits with Constraints via Convex Optimization [2.6436521007616114]
The proposed approach takes both a PC and constraints as inputs, and outputs a new PC that satisfies the constraints.
Empirical evaluations indicate that the combination of constraints and PCs can have multiple use cases.
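A hedged sketch of the general flavor (not the paper's actual algorithm; the weights, marginals, and target value are made up): enforcing a marginal constraint on a PC's root sum-node weights can be posed as a small convex program over the probability simplex.

```python
# Adjust root mixture weights so a marginal constraint holds, staying
# close to the original weights; a toy stand-in for constraint handling.
import numpy as np
from scipy.optimize import minimize

w0 = np.array([0.5, 0.3, 0.2])  # current root sum-node weights
m = np.array([0.9, 0.4, 0.1])   # Pr_k(X1 = 1) under each component
target = 0.5                    # required marginal Pr(X1 = 1)

res = minimize(
    lambda w: np.sum((w - w0) ** 2),   # stay close to the old weights
    x0=w0,
    bounds=[(0, 1)] * 3,
    constraints=[
        {'type': 'eq', 'fun': lambda w: np.sum(w) - 1.0},  # simplex
        {'type': 'eq', 'fun': lambda w: w @ m - target},   # marginal
    ],
)
print(res.x, res.x @ m)  # new weights; marginal is now 0.5
```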
arXiv Detail & Related papers (2024-03-19T19:55:38Z)
- Probabilistic Integral Circuits [11.112802758446344]
We introduce a new language of computational graphs that extends PCs with integral units representing continuous latent variables (LVs).
In practice, we parameterise PICs with light-weight neural nets delivering an intractable hierarchical continuous mixture.
We show that such PIC-approximating PCs systematically outperform PCs commonly learned via expectation-maximization or SGD.
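The sketch below illustrates the underlying idea under simplifying assumptions (a one-dimensional latent, a toy tanh "net" standing in for the light-weight neural nets mentioned above, and Gauss-Hermite quadrature): an integral unit computing a continuous mixture ∫ p(x|z) p(z) dz is approximated by a finite weighted sum, i.e., a PC.

```python
# Approximating a continuous latent mixture with a finite sum of
# quadrature nodes; the node weights become PC mixture weights.
import numpy as np

nodes, weights = np.polynomial.hermite_e.hermegauss(16)  # z ~ N(0,1) grid
weights = weights / np.sqrt(2 * np.pi)                   # normalize for N(0,1)

def mean_net(z):
    return np.tanh(z)   # toy parameterization of p(x|z)'s mean

def p_x_given_z(x, z):
    mu = mean_net(z)
    return np.exp(-0.5 * (x - mu) ** 2) / np.sqrt(2 * np.pi)

def p(x):
    # Finite mixture: the sum over quadrature nodes approximates the integral.
    return sum(w * p_x_given_z(x, z) for z, w in zip(nodes, weights))

print(p(0.0))  # approximate density at x = 0
```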
arXiv Detail & Related papers (2023-10-25T20:38:18Z)
- Continuous Mixtures of Tractable Probabilistic Models [10.667104977730304]
Probabilistic models based on continuous latent spaces, such as variational autoencoders, can be understood as uncountable mixture models.
Probabilistic circuits (PCs) can be understood as hierarchical discrete mixture models.
In this paper, we investigate a hybrid approach, namely continuous mixtures of tractable models with a small latent dimension.
arXiv Detail & Related papers (2022-09-21T18:18:32Z)
- PAC Reinforcement Learning for Predictive State Representations [60.00237613646686]
We study online Reinforcement Learning (RL) in partially observable dynamical systems.
We focus on the Predictive State Representations (PSRs) model, which is an expressive model that captures other well-known models.
We develop a novel model-based algorithm for PSRs that can learn a near-optimal policy with sample complexity scaling polynomially in the relevant parameters of the system.
arXiv Detail & Related papers (2022-07-12T17:57:17Z)
- HyperSPNs: Compact and Expressive Probabilistic Circuits [89.897635970366]
HyperSPNs are a new paradigm for generating the mixture weights of large PCs using a small-scale neural network.
We show the merits of our regularization strategy on two state-of-the-art PC families introduced in recent literature.
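A minimal sketch of the hypernetwork idea (all sizes are toy choices, not the paper's architecture): a small embedding per sum node plus one shared projection generates the mixture weights of a much larger PC, so the trainable parameter count stays far below the number of generated weights.

```python
# A tiny "hypernetwork": small trainable tensors produce the (many)
# mixture weights of a large PC.
import numpy as np

rng = np.random.default_rng(0)
n_sum_nodes, n_children, embed_dim = 1000, 256, 16

# Trainable parameters: 1000*16 + 16*256 = 20,096 values, versus
# 1000*256 = 256,000 weights if each sum node were parameterized directly.
embeddings = rng.normal(size=(n_sum_nodes, embed_dim))
projection = rng.normal(size=(embed_dim, n_children))

def softmax(a, axis=-1):
    a = a - a.max(axis=axis, keepdims=True)
    e = np.exp(a)
    return e / e.sum(axis=axis, keepdims=True)

# embeddings -> logits -> normalized mixture weights per sum node.
mixture_weights = softmax(embeddings @ projection)  # rows sum to 1
print(mixture_weights.shape, mixture_weights[0].sum())
```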
arXiv Detail & Related papers (2021-12-02T01:24:43Z)
- GFlowNet Foundations [66.69854262276391]
Generative Flow Networks (GFlowNets) have been introduced as a method to sample a diverse set of candidates in an active learning context.
We show a number of additional theoretical properties of GFlowNets.
arXiv Detail & Related papers (2021-11-17T17:59:54Z)
- PSD Representations for Effective Probability Models [117.35298398434628]
We show that a recently proposed class of positive semi-definite (PSD) models for non-negative functions is particularly suited to this end.
We characterize both approximation and generalization capabilities of PSD models, showing that they enjoy strong theoretical guarantees.
Our results open the way to applications of PSD models to density estimation, decision theory and inference.
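The non-negativity guarantee behind PSD models is easy to see in a toy instance (my own polynomial feature map, not the paper's kernel construction): f(x) = φ(x)ᵀ M φ(x) with M positive semi-definite is a quadratic form, hence f(x) ≥ 0 for every x.

```python
# A PSD model for a non-negative function: a quadratic form in a
# feature map, with the PSD matrix built as M = B B^T.
import numpy as np

rng = np.random.default_rng(0)
B = rng.normal(size=(3, 3))
M = B @ B.T                    # positive semi-definite by construction

def phi(x):
    return np.array([1.0, x, x ** 2])  # toy polynomial feature map

def f(x):
    v = phi(x)
    return v @ M @ v           # quadratic form, hence >= 0

xs = np.linspace(-3, 3, 7)
print([round(f(x), 3) for x in xs])    # all values are non-negative
```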
arXiv Detail & Related papers (2021-06-30T15:13:39Z)
- Probabilistic Circuits for Variational Inference in Discrete Graphical Models [101.28528515775842]
Inference in discrete graphical models with variational methods is difficult.
Many sampling-based methods have been proposed for estimating the Evidence Lower Bound (ELBO).
We propose a new approach that leverages the tractability of probabilistic circuit models, such as Sum-Product Networks (SPNs).
We show that selective-SPNs are suitable as an expressive variational distribution, and prove that when the log-density of the target model is a polynomial the corresponding ELBO can be computed analytically.
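To see where the analytic ELBO comes from, here is a simplified sketch (a fully factorized Bernoulli variational distribution with made-up coefficients, rather than a selective-SPN): when the log-density is a polynomial in binary variables, each expected monomial factors into a product of Bernoulli means, so the ELBO needs no sampling.

```python
# Analytic ELBO for a polynomial log-density under a factorized
# Bernoulli q: ELBO = E_q[log p~(x)] + H(q), both in closed form.
import numpy as np

theta = np.array([0.7, 0.4, 0.9])   # q(x_i = 1) = theta_i

# Unnormalized target: log p~(x) = c0 + sum_i c_i x_i + sum_{i<j} c_ij x_i x_j
c0 = -1.0
c = np.array([0.5, -0.3, 0.2])
c2 = {(0, 1): 0.4, (1, 2): -0.6}

# E_q[log p~(x)]: each monomial's expectation is a product of means.
e_log_p = c0 + c @ theta + sum(cij * theta[i] * theta[j]
                               for (i, j), cij in c2.items())

# Entropy of the factorized Bernoulli q, also in closed form.
h_q = -np.sum(theta * np.log(theta) + (1 - theta) * np.log(1 - theta))

print(e_log_p + h_q)  # exact ELBO (up to log of the normalizer)
```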
arXiv Detail & Related papers (2020-10-22T05:04:38Z)