Tractable Probabilistic Graph Representation Learning with Graph-Induced
Sum-Product Networks
- URL: http://arxiv.org/abs/2305.10544v2
- Date: Fri, 16 Feb 2024 10:58:18 GMT
- Title: Tractable Probabilistic Graph Representation Learning with Graph-Induced
Sum-Product Networks
- Authors: Federico Errica, Mathias Niepert
- Abstract summary: We introduce Graph-Induced Sum-Product Networks (GSPNs), a new probabilistic framework for graph representation learning.
We show the model's competitiveness on scarce supervision scenarios, under missing data, and for graph classification in comparison to popular neural models.
- Score: 25.132159381873656
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We introduce Graph-Induced Sum-Product Networks (GSPNs), a new probabilistic
framework for graph representation learning that can tractably answer
probabilistic queries. Inspired by the computational trees induced by vertices
in the context of message-passing neural networks, we build hierarchies of
sum-product networks (SPNs) where the parameters of a parent SPN are learnable
transformations of the a-posteriori mixing probabilities of its children's sum
units. Due to weight sharing and the tree-shaped computation graphs of GSPNs,
we obtain the efficiency and efficacy of deep graph networks with the
additional advantages of a probabilistic model. We show the model's
competitiveness on scarce supervision scenarios, under missing data, and for
graph classification in comparison to popular neural models. We complement the
experiments with qualitative analyses on hyper-parameters and the model's
ability to answer probabilistic queries.
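The abstract's core mechanism can be sketched in a few lines. The following is a hypothetical illustration, not the authors' implementation: a tiny sum-product network as a mixture over categorical leaves, evaluated along a vertex's computation tree, where the parent's mixing weights are a learnable transformation (here assumed to be a linear map followed by a softmax) of the children's posterior mixing probabilities.

```python
import numpy as np

rng = np.random.default_rng(0)

def leaf_likelihoods(x, leaf_params):
    """P(x | component) for each of the C categorical components."""
    return leaf_params[:, x]                      # shape (C,)

def sum_unit_posterior(x, weights, leaf_params):
    """Posterior over the C sum-unit components given observation x."""
    joint = weights * leaf_likelihoods(x, leaf_params)   # w_c * P(x | c)
    return joint / joint.sum()

# Tree induced by a root vertex with two children; features are symbols 0..2.
C, K = 4, 3                                       # components, symbol alphabet size
leaf_params = rng.dirichlet(np.ones(K), size=C)   # one categorical leaf per component
child_weights = rng.dirichlet(np.ones(C))         # child SPNs' sum-unit mixing weights
W = rng.normal(size=(C, 2 * C))                   # learnable transformation (assumption)

child_feats = [1, 2]                              # observed features of the two children
posteriors = np.concatenate(
    [sum_unit_posterior(x, child_weights, leaf_params) for x in child_feats]
)
parent_weights = np.exp(W @ posteriors)
parent_weights /= parent_weights.sum()            # parent sum-unit mixing weights

root_feat = 0
root_lik = parent_weights @ leaf_likelihoods(root_feat, leaf_params)
print(float(root_lik))                            # tractable likelihood query at the root
```

Because every node in the tree is a valid SPN, likelihoods and marginals at the root remain tractable; the shared transformation `W` plays the role of the weight sharing mentioned in the abstract.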
Related papers
- Estimating Causal Effects from Learned Causal Networks [56.14597641617531]
We propose an alternative paradigm for answering causal-effect queries over discrete observable variables.
We learn the causal Bayesian network and its confounding latent variables directly from the observational data.
We show that this "model completion" learning approach can be more effective than estimand approaches.
arXiv Detail & Related papers (2024-08-26T08:39:09Z)
- Sum-Product-Set Networks: Deep Tractable Models for Tree-Structured Graphs [0.0]
We propose sum-product-set networks, an extension of probabilistic circuits from unstructured data to tree-structured graph data.
We demonstrate that our tractable model performs comparably to various intractable models based on neural networks.
arXiv Detail & Related papers (2024-08-14T09:13:27Z)
- Graph Reasoning Networks [9.18586425686959]
Graph Reasoning Networks (GRNs) are a novel approach that combines the strengths of fixed and learned graph representations with a reasoning module based on a differentiable satisfiability solver.
Results on real-world datasets show comparable performance to GNNs.
Experiments on synthetic datasets demonstrate the potential of the newly proposed method.
arXiv Detail & Related papers (2024-07-08T10:53:49Z) - ExSpliNet: An interpretable and expressive spline-based neural network [0.3867363075280544]
We present ExSpliNet, an interpretable and expressive neural network model.
We give a probabilistic interpretation of the model and show its universal approximation properties.
arXiv Detail & Related papers (2022-05-03T14:06:36Z) - Capsule Graph Neural Networks with EM Routing [8.632437524560133]
This paper proposes novel Capsule Graph Neural Networks that use the EM routing mechanism (CapsGNNEM) to generate high-quality graph embeddings.
Experimental results on a number of real-world graph datasets demonstrate that the proposed CapsGNNEM outperforms nine state-of-the-art models in graph classification tasks.
arXiv Detail & Related papers (2021-10-18T06:23:37Z) - Explicit Pairwise Factorized Graph Neural Network for Semi-Supervised
Node Classification [59.06717774425588]
We propose the Explicit Pairwise Factorized Graph Neural Network (EPFGNN), which models the whole graph as a partially observed Markov Random Field.
It contains explicit pairwise factors to model output-output relations and uses a GNN backbone to model input-output relations.
We conduct experiments on various datasets, which show that our model can effectively improve the performance of semi-supervised node classification on graphs.
arXiv Detail & Related papers (2021-07-27T19:47:53Z) - GraphSVX: Shapley Value Explanations for Graph Neural Networks [81.83769974301995]
Graph Neural Networks (GNNs) achieve significant performance for various learning tasks on geometric data.
In this paper, we propose a unified framework satisfied by most existing GNN explainers.
We introduce GraphSVX, a post hoc local model-agnostic explanation method specifically designed for GNNs.
arXiv Detail & Related papers (2021-04-18T10:40:37Z) - Probabilistic Graph Attention Network with Conditional Kernels for
Pixel-Wise Prediction [158.88345945211185]
We present a novel approach that advances the state of the art in pixel-level prediction in a fundamental aspect, i.e., structured multi-scale feature learning and fusion.
We propose a probabilistic graph attention network structure based on a novel Attention-Gated Conditional Random Fields (AG-CRFs) model for learning and fusing multi-scale representations in a principled manner.
arXiv Detail & Related papers (2021-01-08T04:14:29Z) - A Unified View on Graph Neural Networks as Graph Signal Denoising [49.980783124401555]
Graph Neural Networks (GNNs) have risen to prominence in learning representations for graph structured data.
In this work, we establish mathematically that the aggregation processes in a group of representative GNN models can be regarded as solving a graph denoising problem.
We instantiate a novel GNN model, ADA-UGNN, derived from UGNN, to handle graphs with adaptive smoothness across nodes.
arXiv Detail & Related papers (2020-10-05T04:57:18Z) - Sum-product networks: A survey [0.0]
A sum-product network (SPN) is a probabilistic model based on a rooted directed acyclic graph.
This paper offers a survey of SPNs, including their definition, the main algorithms for inference and learning from data, the main applications, a brief review of software libraries, and a comparison with related models.
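As a minimal illustration of the tractability surveyed here (an invented toy example, not drawn from the survey): an SPN over two binary variables in which sum nodes mix distributions and product nodes factorize over disjoint scopes, so joint, marginal, and conditional queries each take one bottom-up pass.

```python
# Leaf Bernoulli parameters: P(Xi = 1) for two leaves per variable.
p1a, p1b = 0.9, 0.2      # leaves over X1
p2a, p2b = 0.3, 0.7      # leaves over X2

def bernoulli(p, x):
    """Leaf value; x=None marginalizes the variable out (the leaf sums to 1)."""
    if x is None:
        return 1.0
    return p if x == 1 else 1.0 - p

def spn(x1, x2):
    """Root = 0.6 * (L1a * L2a) + 0.4 * (L1b * L2b)."""
    return (0.6 * bernoulli(p1a, x1) * bernoulli(p2a, x2)
            + 0.4 * bernoulli(p1b, x1) * bernoulli(p2b, x2))

# All three queries cost time linear in the size of the network.
joint = spn(1, 1)                 # P(X1=1, X2=1)
marginal = spn(1, None)           # P(X1=1), leaf over X2 marginalized out
conditional = joint / marginal    # P(X2=1 | X1=1)
print(joint, marginal, conditional)
```

Setting an argument to `None` replaces the corresponding leaf with 1, which is exactly the marginalization trick that makes SPN inference tractable.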
arXiv Detail & Related papers (2020-04-02T17:46:29Z) - Block-Approximated Exponential Random Graphs [77.4792558024487]
An important challenge in the field of exponential random graphs (ERGs) is the fitting of non-trivial ERGs on large graphs.
We propose an approximative framework for such non-trivial ERGs that results in dyadic independence (i.e., edge-independent) distributions.
Our methods are scalable to sparse graphs consisting of millions of nodes.
arXiv Detail & Related papers (2020-02-14T11:42:16Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of this list (including all information) and is not responsible for any consequences.