Sum-product networks: A survey
- URL: http://arxiv.org/abs/2004.01167v1
- Date: Thu, 2 Apr 2020 17:46:29 GMT
- Title: Sum-product networks: A survey
- Authors: Iago París, Raquel Sánchez-Cauce, Francisco Javier Díez
- Abstract summary: A sum-product network (SPN) is a probabilistic model, based on a rooted acyclic directed graph.
This paper offers a survey of SPNs, including their definition, the main algorithms for inference and learning from data, the main applications, a brief review of software libraries, and a comparison with related models.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: A sum-product network (SPN) is a probabilistic model, based on a rooted
acyclic directed graph, in which terminal nodes represent univariate
probability distributions and non-terminal nodes represent convex combinations
(weighted sums) and products of probability functions. SPNs are closely related
to probabilistic graphical models, in particular to Bayesian networks with
multiple context-specific independencies. Their main advantage is the
possibility of building tractable models from data, i.e., models that can
perform several inference tasks in time proportional to the number of links in
the graph. They are somewhat similar to neural networks and can address the
same kinds of problems, such as image processing and natural language
understanding. This paper offers a survey of SPNs, including their definition,
the main algorithms for inference and learning from data, the main
applications, a brief review of software libraries, and a comparison with
related models.
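To make this structure concrete, here is a minimal sketch (illustrative only, not code from the survey; all class names and parameter values are made up) of a toy SPN over two binary variables. Leaves return the probability of their variable's observed value, sum nodes return convex combinations of their children, and product nodes multiply theirs; the same bottom-up pass computes marginals when the leaves of unobserved variables return 1, which is what makes inference linear in the number of links.

```python
# Minimal SPN sketch (illustrative only): leaves hold univariate (here
# Bernoulli) distributions, sum nodes compute convex combinations of their
# children, and product nodes multiply them. One bottom-up pass touches every
# link once, so evaluation time is proportional to the number of links.

class Leaf:
    def __init__(self, var, p_true):
        self.var, self.p_true = var, p_true

    def value(self, evidence):
        x = evidence.get(self.var)   # None means the variable is marginalized out
        if x is None:
            return 1.0               # a distribution summed over its support is 1
        return self.p_true if x == 1 else 1.0 - self.p_true


class Sum:
    def __init__(self, children, weights):
        assert abs(sum(weights) - 1.0) < 1e-9   # weights form a convex combination
        self.children, self.weights = children, weights

    def value(self, evidence):
        return sum(w * c.value(evidence)
                   for w, c in zip(self.weights, self.children))


class Product:
    def __init__(self, children):
        self.children = children

    def value(self, evidence):
        result = 1.0
        for c in self.children:
            result *= c.value(evidence)
        return result


# Toy SPN over two binary variables A and B (structure chosen arbitrarily).
spn = Sum(
    children=[
        Product([Leaf("A", 0.9), Leaf("B", 0.2)]),
        Product([Leaf("A", 0.3), Leaf("B", 0.7)]),
    ],
    weights=[0.6, 0.4],
)

print(spn.value({"A": 1, "B": 0}))   # joint probability P(A=1, B=0)
print(spn.value({"A": 1}))           # marginal P(A=1): B's leaves return 1
```

In a general SPN the graph is a DAG with shared sub-networks, so a practical implementation would memoize node values to keep each link visited only once; the survey reviews this and the other inference and learning algorithms.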
Related papers
- Estimating Causal Effects from Learned Causal Networks [56.14597641617531]
We propose an alternative paradigm for answering causal-effect queries over discrete observable variables.
We learn the causal Bayesian network and its confounding latent variables directly from the observational data.
We show that this model completion learning approach can be more effective than estimand approaches.
arXiv Detail & Related papers (2024-08-26T08:39:09Z) - GraphSPNs: Sum-Product Networks Benefit From Canonical Orderings [0.0]
Graph sum-product networks (GraphSPNs) are a tractable deep generative model which provides exact and efficient inference over (arbitrary parts of) graphs.
We demonstrate that GraphSPNs are able to (conditionally) generate novel and chemically valid molecular graphs.
arXiv Detail & Related papers (2024-08-18T12:19:16Z) - Sum-Product-Set Networks: Deep Tractable Models for Tree-Structured Graphs [0.0]
We propose sum-product-set networks, an extension of probabilistic circuits from unstructured data to tree-structured graph data.
We demonstrate that our tractable model performs comparably to various intractable models based on neural networks.
arXiv Detail & Related papers (2024-08-14T09:13:27Z) - Cyclic Directed Probabilistic Graphical Model: A Proposal Based on Structured Outcomes [0.0]
We describe a probabilistic graphical model - probabilistic relation network - that allows the direct capture of directional cyclic dependencies.
This model does not violate the probability axioms, and it supports learning from observed data.
Notably, it supports probabilistic inference, making it a prospective tool in data analysis and in expert and decision-making applications.
arXiv Detail & Related papers (2023-10-25T10:19:03Z) - Goodness-of-Fit of Attributed Probabilistic Graph Generative Models [11.58149447373971]
We define goodness of fit in terms of the mean square contingency coefficient for random binary networks.
We apply these criteria to verify the representation capability of a probabilistic generative model for various popular types of graph models.
arXiv Detail & Related papers (2023-07-28T18:48:09Z) - Tractable Probabilistic Graph Representation Learning with Graph-Induced Sum-Product Networks [25.132159381873656]
We introduce Graph-Induced Sum-Product Networks (GSPNs), a new probabilistic framework for graph representation learning.
We show the model's competitiveness on scarce supervision scenarios, under missing data, and for graph classification in comparison to popular neural models.
arXiv Detail & Related papers (2023-05-17T20:02:08Z) - Explicit Pairwise Factorized Graph Neural Network for Semi-Supervised Node Classification [59.06717774425588]
We propose the Explicit Pairwise Factorized Graph Neural Network (EPFGNN), which models the whole graph as a partially observed Markov Random Field.
It contains explicit pairwise factors to model output-output relations and uses a GNN backbone to model input-output relations.
We conduct experiments on various datasets, which show that our model can effectively improve performance for semi-supervised node classification on graphs.
arXiv Detail & Related papers (2021-07-27T19:47:53Z) - Probabilistic Generating Circuits [50.98473654244851]
We propose probabilistic generating circuits (PGCs) as an efficient representation of probability generating functions.
PGCs are not just a theoretical framework that unifies vastly different existing models, but also show huge potential in modeling realistic data.
We exhibit a simple class of PGCs that are not trivially subsumed by simple combinations of PCs and DPPs, and obtain competitive performance on a suite of density estimation benchmarks.
arXiv Detail & Related papers (2021-02-19T07:06:53Z) - Learned Factor Graphs for Inference from Stationary Time Sequences [107.63351413549992]
We propose a framework that combines model-based algorithms and data-driven ML tools for stationary time sequences.
Neural networks are developed to separately learn specific components of a factor graph describing the distribution of the time sequence.
We present an inference algorithm based on learned stationary factor graphs, which learns to implement the sum-product scheme from labeled data (a generic sketch of the classic sum-product scheme appears after this list).
arXiv Detail & Related papers (2020-06-05T07:06:19Z) - Bayesian Sparse Factor Analysis with Kernelized Observations [67.60224656603823]
Multi-view problems can be addressed with latent variable models.
High-dimensionality and non-linear issues are traditionally handled by kernel methods.
We propose merging both approaches into a single model.
arXiv Detail & Related papers (2020-06-01T14:25:38Z) - Block-Approximated Exponential Random Graphs [77.4792558024487]
An important challenge in the field of exponential random graphs (ERGs) is the fitting of non-trivial ERGs on large graphs.
We propose an approximative framework for such non-trivial ERGs that results in dyadic-independence (i.e., edge-independent) distributions.
Our methods are scalable to sparse graphs consisting of millions of nodes.
arXiv Detail & Related papers (2020-02-14T11:42:16Z)
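The sum-product scheme referenced in the Learned Factor Graphs entry above is the classic message-passing algorithm on factor graphs. Below is a minimal, generic sketch of that classic scheme on a tiny chain factor graph with made-up factor tables; it is not the learned variant proposed in that paper, and all variable and factor names are illustrative.

```python
# Generic sketch of the classic sum-product (belief-propagation) scheme on a
# tiny chain factor graph  X1 -- f12 -- X2 -- f23 -- X3  with binary variables.
# The factor tables below are made up for illustration; this is NOT the learned
# variant proposed in the paper above.

import numpy as np

f12 = np.array([[1.0, 0.5],     # f12[x1, x2]
                [0.5, 2.0]])
f23 = np.array([[1.5, 0.2],     # f23[x2, x3]
                [0.2, 1.5]])

# Message from the X1 side into X2: sum out x1 from f12.
msg_left = f12.sum(axis=0)      # indexed by x2

# Message from the X3 side into X2: sum out x3 from f23.
msg_right = f23.sum(axis=1)     # indexed by x2

# The (unnormalized) belief at X2 is the product of its incoming messages.
belief_x2 = msg_left * msg_right
print(belief_x2 / belief_x2.sum())        # marginal P(X2)

# Sanity check by brute-force enumeration of the unnormalized joint.
joint = np.einsum('ij,jk->ijk', f12, f23)  # joint[x1, x2, x3]
p_x2 = joint.sum(axis=(0, 2))
print(p_x2 / p_x2.sum())                   # should match the message-passing result
```

Message passing recovers the marginal of X2 by multiplying the messages arriving from its two neighbouring factors, matching brute-force enumeration of the joint; the learned approach in the paper above replaces such hand-specified factor components with ones learned by neural networks.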