Strudel: Learning Structured-Decomposable Probabilistic Circuits
- URL: http://arxiv.org/abs/2007.09331v2
- Date: Wed, 2 Sep 2020 05:16:31 GMT
- Title: Strudel: Learning Structured-Decomposable Probabilistic Circuits
- Authors: Meihua Dang, Antonio Vergari, Guy Van den Broeck
- Abstract summary: Strudel is a simple, fast and accurate learning algorithm for structured-decomposable PCs.
It delivers more accurate single PC models in fewer iterations, and dramatically scales learning when building ensembles of PCs.
We show these advantages on standard density estimation benchmarks and challenging inference scenarios.
- Score: 37.153542210716004
- License: http://creativecommons.org/licenses/by-nc-sa/4.0/
- Abstract: Probabilistic circuits (PCs) represent a probability distribution as a
computational graph. Enforcing structural properties on these graphs guarantees
that several inference scenarios become tractable. Among these properties,
structured decomposability is a particularly appealing one: it enables the
efficient and exact computation of the probability of complex logical
formulas, and can be used to reason about the expected output of certain
predictive models under missing data. This paper proposes Strudel, a simple,
fast and accurate learning algorithm for structured-decomposable PCs. Compared
to prior work for learning structured-decomposable PCs, Strudel delivers more
accurate single PC models in fewer iterations, and dramatically scales learning
when building ensembles of PCs. It achieves this scalability by exploiting
another structural property of PCs, called determinism, and by sharing the same
computational graph across mixture components. We show these advantages on
standard density estimation benchmarks and challenging inference scenarios.
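As a concrete illustration of the ingredients named above (a minimal sketch with made-up class names, not Strudel's actual API, which the abstract does not specify), a PC is evaluated bottom-up; decomposability makes products factorize over disjoint variables, and determinism means at most one child of each sum node is nonzero for any complete input:
```python
class Leaf:
    """Indicator leaf: 1.0 iff variable `var` takes `value` in the input."""
    def __init__(self, var, value):
        self.var, self.value = var, value
    def prob(self, x):
        return 1.0 if x[self.var] == self.value else 0.0

class Product:
    """Decomposable product: children cover disjoint sets of variables."""
    def __init__(self, children):
        self.children = children
    def prob(self, x):
        out = 1.0
        for child in self.children:
            out *= child.prob(x)
        return out

class Sum:
    """Weighted mixture; deterministic if children have disjoint supports."""
    def __init__(self, children, weights):
        self.children, self.weights = children, weights
    def prob(self, x):
        return sum(w * c.prob(x) for w, c in zip(self.weights, self.children))

# Both products split the variables the same way ({X1} vs {X2}), so the
# circuit is structured-decomposable; the two products have disjoint
# supports, so the root sum is deterministic.
x1, nx1 = Leaf("X1", 1), Leaf("X1", 0)
x2, nx2 = Leaf("X2", 1), Leaf("X2", 0)
root = Sum([Product([x1, x2]), Product([nx1, nx2])], [0.3, 0.7])
print(root.prob({"X1": 1, "X2": 1}))  # 0.3
```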
Related papers
- On the Expressive Power of Tree-Structured Probabilistic Circuits [8.160496835449157]
We show that for $n$ variables, there exists a quasi-polynomial upper bound $n^{O(\log n)}$ on the size of an equivalent tree computing the same probability distribution.
We also show that given a depth restriction on the tree, there is a super-polynomial separation between tree and DAG-structured PCs.
arXiv Detail & Related papers (2024-10-07T19:51:30Z)
- Compositional Probabilistic and Causal Inference using Tractable Circuit Models [20.07977560803858]
We introduce md-vtrees, a novel structural formulation of (marginal) determinism in structured-decomposable PCs.
We derive the first polytime algorithms for causal inference queries such as backdoor adjustment on PCs.
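For reference, the backdoor adjustment mentioned here is the standard causal-inference identity (with $Z$ a valid backdoor adjustment set):
```latex
% Backdoor adjustment (standard identity; Z is a valid adjustment set):
\[
  P\big(Y \mid \mathrm{do}(X = x)\big)
    \;=\; \sum_{z} P\big(Y \mid X = x,\, Z = z\big)\, P(Z = z).
\]
```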
arXiv Detail & Related papers (2023-04-17T13:48:16Z)
- Bayesian Structure Scores for Probabilistic Circuits [13.441379161477272]
Probabilistic circuits (PCs) are a prominent representation of probability distributions with tractable inference.
We develop Bayesian structure scores for deterministic PCs, i.e., structure likelihood with parameters marginalized out.
We achieve good trade-offs between training time and model fit in terms of log-likelihood.
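One plausible shape for such a score, assuming Dirichlet priors over sum-node weights and using the fact that determinism routes each data point to at most one child per sum node, is the Dirichlet-multinomial marginal likelihood; a hypothetical sketch (the paper's exact score may differ):
```python
# Hypothetical sketch: marginal likelihood of one sum node's weights under a
# Dirichlet(alpha, ..., alpha) prior, given counts n_k of data points routed
# to child k. Determinism makes these counts well defined.
from math import lgamma

def sum_node_log_marginal(counts, alpha=1.0):
    """log of the integral of prod_k w_k^{n_k} under the Dirichlet prior."""
    a_total = alpha * len(counts)
    n_total = sum(counts)
    log_ml = lgamma(a_total) - lgamma(a_total + n_total)
    for n_k in counts:
        log_ml += lgamma(alpha + n_k) - lgamma(alpha)
    return log_ml

print(sum_node_log_marginal([40, 60]))  # score contribution of one sum node
```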
arXiv Detail & Related papers (2023-02-23T16:12:19Z)
- Structural Learning of Probabilistic Sentential Decision Diagrams under Partial Closed-World Assumption [127.439030701253]
Probabilistic sentential decision diagrams are a class of structured-decomposable circuits.
We propose a new scheme based on a partial closed-world assumption: data implicitly provide the logical base of the circuit.
Preliminary experiments show that the proposed approach can properly fit training data and generalize well to test data, provided these remain consistent with the underlying logical base.
arXiv Detail & Related papers (2021-07-26T12:01:56Z)
- Tractable Regularization of Probabilistic Circuits [31.841838579553034]
Probabilistic Circuits (PCs) are a promising avenue for probabilistic modeling.
We propose two intuitive techniques, data softening and entropy regularization, that take advantage of PCs' tractability.
We show that both methods consistently improve the generalization performance of a wide variety of PCs.
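As a rough illustration of data softening (the paper's exact parameterization is assumed here, not quoted), a hard 0/1 observation at an indicator leaf becomes soft evidence that agrees with the observed value with probability beta, and the softened likelihood is then computed by the same circuit pass:
```python
# Illustrative data softening: instead of evaluating an indicator leaf to 0/1
# against a hard observation, evaluate it to beta (agree) or 1 - beta
# (disagree); beta is a softening hyperparameter.

def soft_leaf_value(literal_value, observed_value, beta=0.95):
    """Indicator leaf under softened evidence: agree with probability beta."""
    return beta if literal_value == observed_value else 1.0 - beta

# beta -> 1 recovers the hard 0/1 indicator semantics:
print(soft_leaf_value(1, 1))  # 0.95
print(soft_leaf_value(1, 0))  # 0.05
```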
arXiv Detail & Related papers (2021-06-04T05:11:13Z)
- Probabilistic Generating Circuits [50.98473654244851]
We propose probabilistic generating circuits (PGCs) for the efficient representation of probability generating polynomials.
PGCs are not just a theoretical framework that unifies vastly different existing models; they also show great potential for modeling realistic data.
We exhibit a simple class of PGCs that is not trivially subsumed by simple combinations of PCs and determinantal point processes (DPPs), and obtain competitive performance on a suite of density estimation benchmarks.
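For context, the generating polynomial that a PGC represents as a circuit is, for binary variables:
```latex
% Probability generating polynomial of a distribution over binary
% variables X_1, ..., X_n:
\[
  g(z_1, \dots, z_n) \;=\; \sum_{S \subseteq \{1,\dots,n\}}
    \Pr\!\big(X_i = 1 \ \forall i \in S,\; X_j = 0 \ \forall j \notin S\big)
    \prod_{i \in S} z_i .
\]
```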
arXiv Detail & Related papers (2021-02-19T07:06:53Z)
- Predictive Coding Approximates Backprop along Arbitrary Computation Graphs [68.8204255655161]
We develop a strategy to translate core machine learning architectures into their predictive coding equivalents.
Our models perform equivalently to backprop on challenging machine learning benchmarks.
Our method raises the potential that standard machine learning algorithms could in principle be directly implemented in neural circuitry.
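A schematic sketch of that recipe on a tiny two-layer linear network (an illustration of generic predictive coding, not the paper's code): relax the activities to minimize local prediction errors, then update the weights from the settled errors.
```python
import numpy as np

rng = np.random.default_rng(0)
W1, W2 = rng.normal(size=(4, 3)), rng.normal(size=(2, 4))
x_in = rng.normal(size=3)          # clamped input activity
x_out = rng.normal(size=2)         # clamped output activity (the target)

x1 = W1 @ x_in                     # initialize hidden activity at its prediction
for _ in range(500):               # relaxation: settle the prediction-error energy
    e1 = x1 - W1 @ x_in            # local error at the hidden layer
    e2 = x_out - W2 @ x1           # local error at the clamped output layer
    x1 -= 0.05 * (e1 - W2.T @ e2)  # gradient step on the energy w.r.t. x1

lr = 0.01                          # at equilibrium the local errors approximate
W1 += lr * np.outer(e1, x_in)      # backprop's gradients, so these local
W2 += lr * np.outer(e2, x1)        # Hebbian-style updates track gradient descent
```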
arXiv Detail & Related papers (2020-06-07T15:35:47Z)
- Einsum Networks: Fast and Scalable Learning of Tractable Probabilistic Circuits [99.59941892183454]
We propose Einsum Networks (EiNets), a novel implementation design for PCs.
At their core, EiNets combine a large number of arithmetic operations into a single monolithic einsum operation.
We show that the implementation of Expectation-Maximization (EM) can be simplified for PCs, by leveraging automatic differentiation.
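A minimal sketch of the idea (illustrative shapes, not EiNets' API): a whole layer of sum nodes evaluated with one einsum in the linear domain, with a max-shift for numerical stability.
```python
# Evaluate k_out sum nodes over k_in children in one fused einsum
# ("log-einsum-exp" style), instead of one node at a time.
import numpy as np

rng = np.random.default_rng(0)
batch, k_in, k_out = 32, 8, 4
child_logp = rng.normal(size=(batch, k_in))           # children's log-densities
weights = rng.dirichlet(np.ones(k_in), size=k_out)    # (k_out, k_in) sum weights

shift = child_logp.max(axis=1, keepdims=True)         # stabilize exponentiation
probs = np.exp(child_logp - shift)                    # (batch, k_in)
out = np.einsum('bk,ok->bo', probs, weights)          # all sum nodes at once
layer_logp = np.log(out) + shift                      # (batch, k_out) log-densities
```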
arXiv Detail & Related papers (2020-04-13T23:09:15Z)
- Torch-Struct: Deep Structured Prediction Library [138.5262350501951]
We introduce Torch-Struct, a library for structured prediction.
Torch-Struct includes a broad collection of probabilistic structures accessed through a simple and flexible distribution-based API.
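An illustrative usage sketch (shapes and attributes assumed from the library's documented linear-chain conventions):
```python
import torch
from torch_struct import LinearChainCRF

batch, n, classes = 2, 6, 3
# pairwise log-potentials for each of the n-1 transitions in the chain
log_potentials = torch.randn(batch, n - 1, classes, classes)
dist = LinearChainCRF(log_potentials)

dist.partition   # log partition function, shape (batch,)
dist.marginals   # edge marginals, same shape as log_potentials
dist.argmax      # MAP assignment as one-hot edge indicators
```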
arXiv Detail & Related papers (2020-02-03T16:43:02Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences of its use.