GraphGDP: Generative Diffusion Processes for Permutation Invariant Graph
Generation
- URL: http://arxiv.org/abs/2212.01842v1
- Date: Sun, 4 Dec 2022 15:12:44 GMT
- Title: GraphGDP: Generative Diffusion Processes for Permutation Invariant Graph
Generation
- Authors: Han Huang, Leilei Sun, Bowen Du, Yanjie Fu, Weifeng Lv
- Abstract summary: Graph generative models have broad applications in biology, chemistry and social science.
Current leading autoregressive models fail to capture the permutation-invariant nature of graphs.
We propose a continuous-time generative diffusion process for permutation invariant graph generation.
- Score: 43.196067037856515
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Graph generative models have broad applications in biology, chemistry and
social science. However, modelling and understanding the generative process of
graphs is challenging due to the discrete and high-dimensional nature of
graphs, as well as permutation invariance to node orderings in underlying graph
distributions. Current leading autoregressive models fail to capture the
permutation-invariant nature of graphs because they rely on a fixed generation
ordering, and they suffer from high time complexity. Here, we propose a continuous-time generative
diffusion process for permutation invariant graph generation to mitigate these
issues. Specifically, we first construct a forward diffusion process defined by
a stochastic differential equation (SDE), which smoothly converts graphs within
the complex distribution to random graphs that follow a known edge probability.
By solving the corresponding reverse-time SDE, graphs can be generated from
newly sampled random graphs. To drive the reverse-time SDE, we design a
position-enhanced graph score network that captures the evolving structure and
position information of perturbed graphs for permutation-equivariant score
estimation. Evaluated with comprehensive metrics, our proposed
generative diffusion process achieves competitive performance in graph
distribution learning. Experimental results also show that GraphGDP can
generate high-quality graphs in only 24 function evaluations, much faster than
previous autoregressive models.
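The forward-then-reverse SDE recipe described in the abstract can be sketched in a few lines. This is an illustrative toy, not the paper's method: the VP-style drift, the `toy_score` stand-in for GraphGDP's position-enhanced score network, and the final 0-threshold are all assumptions made for the sketch.

```python
import numpy as np

def symmetric_noise(n, rng):
    """Gaussian noise that is symmetric with zero diagonal, so perturbed
    adjacency matrices remain valid undirected-graph tensors."""
    z = rng.standard_normal((n, n))
    z = (z + z.T) / np.sqrt(2.0)
    np.fill_diagonal(z, 0.0)
    return z

def forward_perturb(adj, t, rng, beta=1.0):
    """Sample A_t | A_0 for the VP-SDE dA = -0.5*beta*A dt + sqrt(beta) dW."""
    alpha = np.exp(-0.5 * beta * t)      # mean coefficient at time t
    sigma = np.sqrt(1.0 - alpha ** 2)    # marginal noise scale
    return alpha * adj + sigma * symmetric_noise(adj.shape[0], rng)

def toy_score(a_t, t, beta=1.0):
    """Hypothetical score estimate (exact only for a zero-mean Gaussian prior);
    a trained graph score network would be used here instead."""
    sigma2 = 1.0 - np.exp(-beta * t)
    return -a_t / max(sigma2, 1e-8)

def reverse_sample(n, steps=24, beta=1.0, rng=None):
    """Euler-Maruyama integration of the reverse-time SDE, starting from a
    random symmetric matrix and ending at a binary adjacency matrix."""
    if rng is None:
        rng = np.random.default_rng(0)
    dt = 1.0 / steps
    a = symmetric_noise(n, rng)          # start from a random "graph"
    for i in range(steps, 0, -1):
        t = i * dt
        drift = -0.5 * beta * a - beta * toy_score(a, t)
        a = a - drift * dt + np.sqrt(beta * dt) * symmetric_noise(n, rng)
    return (a > 0.0).astype(int)         # threshold to a binary graph
```

Because every operation is symmetric and zeroes the diagonal, the sampled matrix stays a valid undirected adjacency matrix at every step, mirroring why equivariant/invariant parameterizations matter for graph diffusion.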
Related papers
- Neural Graph Generator: Feature-Conditioned Graph Generation using Latent Diffusion Models [24.192931640371746]
We introduce the Neural Graph Generator (NGG), a novel approach which utilizes conditioned latent diffusion models for graph generation.
NGG demonstrates a remarkable capacity to model complex graph patterns, offering control over the graph generation process.
arXiv Detail & Related papers (2024-03-03T15:28:47Z)
- GraphRCG: Self-Conditioned Graph Generation [78.69810678803248]
We propose a novel self-conditioned graph generation framework designed to explicitly model graph distributions.
Our framework demonstrates superior performance over existing state-of-the-art graph generation methods in terms of graph quality and fidelity to training data.
arXiv Detail & Related papers (2024-03-02T02:28:20Z)
- Overcoming Order in Autoregressive Graph Generation [12.351817671944515]
Graph generation is a fundamental problem in various domains, including chemistry and social networks.
Recent work has shown that molecular graph generation using recurrent neural networks (RNNs) is advantageous compared to traditional generative approaches.
arXiv Detail & Related papers (2024-02-04T09:58:22Z)
- Graph Generation with Diffusion Mixture [57.78958552860948]
Generation of graphs is a major challenge for real-world tasks that require understanding the complex nature of their non-Euclidean structures.
We propose a generative framework that models the topology of graphs by explicitly learning the final graph structures of the diffusion process.
arXiv Detail & Related papers (2023-02-07T17:07:46Z)
- Conditional Diffusion Based on Discrete Graph Structures for Molecular Graph Generation [32.66694406638287]
We propose a Conditional Diffusion model based on discrete Graph Structures (CDGS) for molecular graph generation.
Specifically, we construct a forward graph diffusion process on both graph structures and inherent features through stochastic differential equations (SDEs).
We present a specialized hybrid graph noise prediction model that extracts the global context and the local node-edge dependency from intermediate graph states.
arXiv Detail & Related papers (2023-01-01T15:24:15Z)
- Score-based Generative Modeling of Graphs via the System of Stochastic Differential Equations [57.15855198512551]
We propose a novel score-based generative model for graphs with a continuous-time framework.
We show that our method is able to generate molecules that lie close to the training distribution yet do not violate the chemical valency rule.
arXiv Detail & Related papers (2022-02-05T08:21:04Z)
- Permutation Invariant Graph Generation via Score-Based Generative Modeling [114.12935776726606]
We propose a permutation invariant approach to modeling graphs, using the recent framework of score-based generative modeling.
In particular, we design a permutation equivariant, multi-channel graph neural network to model the gradient of the data distribution at the input graph.
For graph generation, we find that our learning approach achieves better or comparable results to existing models on benchmark datasets.
arXiv Detail & Related papers (2020-03-02T03:06:14Z)
- Block-Approximated Exponential Random Graphs [77.4792558024487]
An important challenge in the field of exponential random graphs (ERGs) is the fitting of non-trivial ERGs on large graphs.
We propose an approximation framework for such non-trivial ERGs that results in dyadic-independence (i.e., edge-independent) distributions.
Our methods are scalable to sparse graphs consisting of millions of nodes.
arXiv Detail & Related papers (2020-02-14T11:42:16Z)
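Several of the papers above build permutation-equivariant score networks, meaning f(P A Pᵀ) = P f(A) Pᵀ for any permutation matrix P: relabeling the nodes and then applying the network gives the same result as applying the network and then relabeling. This property can be checked numerically; `toy_score_fn` below is a hypothetical equivariant map for illustration, not any paper's actual architecture.

```python
import numpy as np

def toy_score_fn(adj):
    """A simple equivariant map: mixes each edge value with 2-hop walk counts.
    (A @ A)[i, j] counts length-2 walks from i to j, and both matrix products
    and elementwise functions commute with node relabeling."""
    two_hop = adj @ adj
    return np.tanh(adj + 0.1 * two_hop)

def permute(adj, perm):
    """Relabel the nodes of an adjacency matrix by the permutation `perm`."""
    return adj[np.ix_(perm, perm)]

# Build a random undirected graph on 5 nodes.
rng = np.random.default_rng(0)
n = 5
adj = rng.integers(0, 2, size=(n, n))
adj = np.triu(adj, 1)
adj = adj + adj.T

# Equivariance check: permute-then-apply equals apply-then-permute.
perm = rng.permutation(n)
lhs = toy_score_fn(permute(adj, perm))
rhs = permute(toy_score_fn(adj), perm)
assert np.allclose(lhs, rhs)
```

A generator built on an equivariant score assigns the same likelihood to every relabeling of a graph, which is exactly the permutation invariance that ordering-dependent autoregressive models lack.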
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.