Probabilistic Graph Circuits: Deep Generative Models for Tractable Probabilistic Inference over Graphs
- URL: http://arxiv.org/abs/2503.12162v1
- Date: Sat, 15 Mar 2025 15:01:53 GMT
- Title: Probabilistic Graph Circuits: Deep Generative Models for Tractable Probabilistic Inference over Graphs
- Authors: Milan Papež, Martin Rektoris, Václav Šmídl, Tomáš Pevný
- Abstract summary: We propose a framework of tractable DGMs that provide exact and efficient probabilistic inference over (arbitrary parts of) graphs. Achieving both exactness and efficiency is challenging in the permutation-invariant setting of graphs. We design PGCs that are inherently invariant and satisfy these two requirements, yet at the cost of low expressive power.
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Deep generative models (DGMs) have recently demonstrated remarkable success in capturing complex probability distributions over graphs. Although their excellent performance is attributed to powerful and scalable deep neural networks, it is, at the same time, exactly the presence of these highly non-linear transformations that makes DGMs intractable. Indeed, despite representing probability distributions, intractable DGMs deny probabilistic foundations by their inability to answer even the most basic inference queries without approximations or design choices specific to a very narrow range of queries. To address this limitation, we propose probabilistic graph circuits (PGCs), a framework of tractable DGMs that provide exact and efficient probabilistic inference over (arbitrary parts of) graphs. Nonetheless, achieving both exactness and efficiency is challenging in the permutation-invariant setting of graphs. We design PGCs that are inherently invariant and satisfy these two requirements, yet at the cost of low expressive power. Therefore, we investigate two alternative strategies to achieve the invariance: the first sacrifices the efficiency, and the second sacrifices the exactness. We demonstrate that ignoring the permutation invariance can have severe consequences in anomaly detection, and that the latter approach is competitive with, and sometimes better than, existing intractable DGMs in the context of molecular graph generation.
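As a concrete (hypothetical) illustration of what "exact and efficient inference over arbitrary parts of graphs" means, the sketch below evaluates exact marginals in a toy tractable model: a mixture of edge-wise factorized Bernoulli distributions over an adjacency matrix. This is a minimal stand-in, not the authors' PGC architecture; all names and sizes are assumptions.

```python
import numpy as np

# Minimal sketch (not the authors' PGC architecture): a mixture of
# edge-wise factorized Bernoulli distributions over the adjacency
# matrix of an n-node graph. Because each component factorizes over
# edges, summing out any subset of unobserved edges is exact and
# costs O(K * |evidence|).

rng = np.random.default_rng(0)
n, K = 4, 3                              # nodes, mixture components
weights = rng.dirichlet(np.ones(K))      # mixture weights, sum to 1
probs = rng.uniform(size=(K, n, n))      # per-component edge probabilities

def marginal(evidence):
    """Exact P(observed edges), with all unobserved edges summed out.

    evidence maps (i, j) -> 0/1; in a factorized component each
    unobserved edge sums to 1 and simply drops from the product.
    """
    total = 0.0
    for k in range(K):
        comp = 1.0
        for (i, j), value in evidence.items():
            comp *= probs[k, i, j] if value else 1.0 - probs[k, i, j]
        total += weights[k] * comp
    return total

# P(edge (0,1) present, edge (2,3) absent), everything else marginalized.
print(marginal({(0, 1): 1, (2, 3): 0}))
```

Note that this toy model is not permutation-invariant: relabeling the nodes changes the probability, which is precisely the tension the abstract describes between invariance, exactness, and efficiency.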
Related papers
- GraphSPNs: Sum-Product Networks Benefit From Canonical Orderings
Graph sum-product networks (GraphSPNs) are tractable deep generative models that provide exact and efficient inference over (arbitrary parts of) graphs.
We demonstrate that GraphSPNs are able to (conditionally) generate novel and chemically valid molecular graphs.
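A hedged sketch of the canonical-ordering idea: reorder nodes deterministically before scoring with a permutation-sensitive model, so isomorphic graphs receive the same likelihood. Degree sorting below is a crude stand-in for a proper canonical labeling, not the paper's actual procedure.

```python
import numpy as np

def canonicalize(adj):
    # Sort nodes by degree (stable); a real canonical form must also
    # break ties structurally, e.g., via canonical graph labeling.
    order = np.argsort(adj.sum(axis=1), kind="stable")
    return adj[np.ix_(order, order)]

adj = np.array([[0, 1, 0],
                [1, 0, 1],
                [0, 1, 0]])               # a 3-node path graph
perm = np.array([2, 0, 1])                # an isomorphic relabeling
adj_perm = adj[np.ix_(perm, perm)]

# Both labelings map to the same canonical adjacency matrix (the tied
# degree-1 nodes happen to be structurally symmetric here).
assert (canonicalize(adj) == canonicalize(adj_perm)).all()
```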
arXiv Detail & Related papers (2024-08-18T12:19:16Z)
- Endowing Pre-trained Graph Models with Provable Fairness
We propose GraphPAR, a novel adapter-tuning framework that endows pre-trained graph models with provable fairness.
Specifically, we design a sensitive semantic augmenter on node representations to extend each node's representation with different sensitive-attribute semantics.
With GraphPAR, we quantify whether the fairness of each node is provable, i.e., predictions are always fair within a certain range of sensitive attribute semantics.
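A rough, hypothetical proxy for such a certificate (the paper's actual mechanism differs; every name below is invented): check that the prediction stays constant as the representation moves along a sensitive-attribute direction within a given radius.

```python
import numpy as np

rng = np.random.default_rng(0)

def certified_consistent(predict, z, direction, radius, steps=50):
    # Brute-force check, not a sound certificate: probe the segment
    # z + t * direction for t in [-radius, radius].
    base = predict(z)
    return all(predict(z + t * direction) == base
               for t in np.linspace(-radius, radius, steps))

w = rng.normal(size=8)
predict = lambda z: int(z @ w > 0)        # toy node classifier
z = rng.normal(size=8)                    # a node representation
direction = rng.normal(size=8)
direction /= np.linalg.norm(direction)    # sensitive-attribute direction
print(certified_consistent(predict, z, direction, radius=0.1))
```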
arXiv Detail & Related papers (2024-02-19T14:16:08Z)
- Inference for Probabilistic Dependency Graphs
Probabilistic dependency graphs (PDGs) are a flexible class of probabilistic models.
We present the first tractable inference algorithm for PDGs with discrete variables.
arXiv Detail & Related papers (2023-11-09T18:40:12Z)
- Discrete Graph Auto-Encoder
We introduce a new framework named Discrete Graph Auto-Encoder (DGAE).
We first use a permutation-equivariant auto-encoder to convert graphs into sets of discrete latent node representations.
In the second step, we sort the sets of discrete latent representations and learn their distribution with a specifically designed auto-regressive model.
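The sorting step admits a one-line illustration (the codes below are invented; the real model sorts learned discrete latents):

```python
# One discrete latent code per node; sorting the set lexicographically
# fixes an order, so a standard autoregressive model can fit the sequence.
codes = [(3, 1), (0, 2), (3, 0)]
sequence = sorted(codes)                  # [(0, 2), (3, 0), (3, 1)]
```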
arXiv Detail & Related papers (2023-06-13T12:40:39Z)
- Neural Graph Revealers
We propose Neural Graph Revealers (NGRs) to efficiently merge sparse graph recovery methods with Probabilistic Graphical Models.
NGRs view the neural networks as a 'glass box', or more specifically as a multitask learning framework.
We show experimental results of doing sparse graph recovery and probabilistic inference on data from Gaussian graphical models and a multimodal infant mortality dataset from the Centers for Disease Control and Prevention.
arXiv Detail & Related papers (2023-02-27T08:40:45Z)
- Loss as the Inconsistency of a Probabilistic Dependency Graph: Choose Your Model, Not Your Loss Function
We show that many standard loss functions arise as the inconsistency of a natural PDG describing the appropriate scenario.
We also show that the PDG inconsistency captures a large class of statistical divergences.
We observe that inconsistency becomes the log partition function (free energy) in the setting where PDGs are factor graphs.
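As a worked toy example of that special case (factor values invented), the partition function of a two-variable factor graph can be computed by brute force:

```python
import numpy as np
from itertools import product

phi1 = np.array([1.0, 2.0])                 # unary factor on x1
phi12 = np.array([[1.0, 0.5],
                  [0.5, 3.0]])              # pairwise factor on (x1, x2)

# log Z, i.e., the free energy of this tiny factor graph
Z = sum(phi1[a] * phi12[a, b] for a, b in product(range(2), repeat=2))
print(np.log(Z))
```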
arXiv Detail & Related papers (2022-02-24T01:51:21Z)
- Handling Distribution Shifts on Graphs: An Invariance Perspective
We formulate the OOD problem on graphs and develop a new invariant learning approach, Explore-to-Extrapolate Risk Minimization (EERM).
EERM resorts to multiple context explorers that are adversarially trained to maximize the variance of risks from multiple virtual environments.
We prove the validity of our method by theoretically showing its guarantee of a valid OOD solution.
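A minimal sketch of that kind of objective (the `beta` weighting and names are assumptions, not the paper's exact formulation):

```python
import numpy as np

def eerm_objective(env_risks, beta=1.0):
    # The predictor descends mean risk plus the variance of risks
    # across virtual environments; the adversarial environment
    # explorers ascend the variance term.
    return env_risks.mean() + beta * env_risks.var()

risks = np.array([0.8, 1.1, 0.7])          # risks from three virtual envs
print(eerm_objective(risks))
```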
arXiv Detail & Related papers (2022-02-05T02:31:01Z)
- BCD Nets: Scalable Variational Approaches for Bayesian Causal Discovery
A structural equation model (SEM) is an effective framework for reasoning over causal relationships represented via a directed acyclic graph (DAG).
Recent advances enabled effective maximum-likelihood point estimation of DAGs from observational data.
We propose BCD Nets, a variational framework for estimating a distribution over DAGs characterizing a linear-Gaussian SEM.
arXiv Detail & Related papers (2021-12-06T03:35:21Z)
- Probabilistic Circuits for Variational Inference in Discrete Graphical Models
Inference in discrete graphical models with variational methods is difficult.
Many sampling-based methods have been proposed for estimating the Evidence Lower Bound (ELBO).
We propose a new approach that leverages the tractability of probabilistic circuit models, such as Sum-Product Networks (SPNs).
We show that selective-SPNs are suitable as an expressive variational distribution, and prove that when the log-density of the target model is a polynomial, the corresponding ELBO can be computed analytically.
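To see why a polynomial log-density makes the ELBO analytic, take the simplest tractable variational family, a fully factorized Bernoulli (a trivial circuit; the paper uses more expressive selective-SPNs). For a pairwise polynomial, every expectation reduces to products of means:

```python
import numpy as np

def analytic_elbo(mu, h, J):
    # E_q[log p~(x)] for log p~(x) = h.x + sum_{i<j} J_ij x_i x_j:
    # independence gives E[x_i x_j] = mu_i mu_j (J symmetric, zero diag).
    energy = h @ mu + 0.5 * mu @ J @ mu
    # Entropy of a factorized Bernoulli is a sum of binary entropies.
    entropy = -(mu * np.log(mu) + (1 - mu) * np.log(1 - mu)).sum()
    return energy + entropy               # ELBO up to the constant log Z

mu = np.array([0.3, 0.6, 0.5])            # variational means
h = np.array([0.1, -0.2, 0.4])
J = np.array([[0.0, 0.5, 0.0],
              [0.5, 0.0, -0.3],
              [0.0, -0.3, 0.0]])
print(analytic_elbo(mu, h, J))
```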
arXiv Detail & Related papers (2020-10-22T05:04:38Z)
- Permutation-equivariant and Proximity-aware Graph Neural Networks with Stochastic Message Passing
Graph neural networks (GNNs) are emerging machine learning models on graphs.
Permutation-equivariance and proximity-awareness are two important properties highly desirable for GNNs.
We show that existing GNNs, mostly based on the message-passing mechanism, cannot simultaneously preserve the two properties.
In order to preserve node proximities, we augment the existing GNNs with stochastic node representations.
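A hedged sketch of the augmentation idea (sizes and the Gaussian choice are assumptions): concatenate random features to each node's input so message passing can distinguish otherwise symmetric nodes and thus encode proximity.

```python
import numpy as np

rng = np.random.default_rng(0)
X = np.ones((5, 8))                        # original, symmetric node features
E = rng.normal(size=(5, 4))                # stochastic node representations
X_aug = np.concatenate([X, E], axis=1)     # input to the message-passing GNN
```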
arXiv Detail & Related papers (2020-09-05T16:46:56Z)
- Block-Approximated Exponential Random Graphs
An important challenge in the field of exponential random graphs (ERGs) is the fitting of non-trivial ERGs on large graphs.
We propose an approximation framework for such non-trivial ERGs that results in dyadic-independence (i.e., edge-independent) distributions.
Our methods are scalable to sparse graphs consisting of millions of nodes.
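Dyadic independence is what buys that scalability: sampling and likelihoods decompose over edges. A minimal sketch (edge probabilities invented):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5
P = np.triu(rng.uniform(0, 0.3, size=(n, n)), 1)  # independent edge probs
A = (rng.uniform(size=(n, n)) < P).astype(int)    # Bernoulli edges
A = A + A.T                                       # undirected, no self-loops
print(A)
```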
arXiv Detail & Related papers (2020-02-14T11:42:16Z)