GraphGDP: Generative Diffusion Processes for Permutation Invariant Graph Generation
- URL: http://arxiv.org/abs/2212.01842v1
- Date: Sun, 4 Dec 2022 15:12:44 GMT
- Title: GraphGDP: Generative Diffusion Processes for Permutation Invariant Graph Generation
- Authors: Han Huang, Leilei Sun, Bowen Du, Yanjie Fu, Weifeng Lv
- Abstract summary: Graph generative models have broad applications in biology, chemistry and social science.
Current leading autoregressive models fail to capture the permutation-invariant nature of graphs.
We propose a continuous-time generative diffusion process for permutation invariant graph generation.
- Score: 43.196067037856515
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Graph generative models have broad applications in biology, chemistry and
social science. However, modelling and understanding the generative process of
graphs is challenging due to the discrete and high-dimensional nature of
graphs, as well as permutation invariance to node orderings in underlying graph
distributions. Current leading autoregressive models fail to capture the
permutation-invariant nature of graphs because of their reliance on a generation
ordering, and they have high time complexity. Here, we propose a continuous-time generative
diffusion process for permutation invariant graph generation to mitigate these
issues. Specifically, we first construct a forward diffusion process defined by
a stochastic differential equation (SDE), which smoothly converts graphs within
the complex distribution to random graphs that follow a known edge probability.
By solving the corresponding reverse-time SDE, graphs can be generated from newly
sampled random graphs. To facilitate the reverse-time SDE, we design a
position-enhanced graph score network that captures the evolving structure and
position information of perturbed graphs for permutation-equivariant score
estimation. Evaluated with comprehensive metrics, our proposed
generative diffusion process achieves competitive performance in graph
distribution learning. Experimental results also show that GraphGDP can
generate high-quality graphs in only 24 function evaluations, much faster than
previous autoregressive models.
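For intuition, the following is a minimal sketch (not the authors' implementation) of the two ingredients described above: a VP-style forward SDE that perturbs an adjacency matrix toward noise, and an Euler-Maruyama integration of the reverse-time SDE that samples a graph given a score model. The `score_fn` argument, the Gaussian prior, and the constant `beta` are simplifying assumptions; GraphGDP's actual forward process converges to random graphs with a known edge probability and uses a position-enhanced score network for permutation-equivariant score estimation.

```python
# Minimal sketch of a continuous-time graph diffusion process (assumed VP-SDE
# form, not the authors' exact formulation).
import numpy as np

def symmetrize(A):
    """Keep a perturbed adjacency matrix symmetric with a zero diagonal."""
    A = (A + A.T) / 2.0
    np.fill_diagonal(A, 0.0)
    return A

def forward_perturb(A0, t, beta=1.0, rng=None):
    """Sample A_t from the forward SDE's Gaussian transition kernel at time t."""
    rng = rng or np.random.default_rng()
    mean_coef = np.exp(-0.5 * beta * t)        # shrink the clean graph toward 0
    std = np.sqrt(1.0 - np.exp(-beta * t))     # noise level grows with t
    return symmetrize(mean_coef * A0 + std * rng.standard_normal(A0.shape))

def reverse_sample(score_fn, n_nodes, n_steps=24, beta=1.0, rng=None):
    """Euler-Maruyama integration of the reverse-time SDE from t=1 down to t=0."""
    rng = rng or np.random.default_rng()
    A = symmetrize(rng.standard_normal((n_nodes, n_nodes)))  # sample from the prior
    dt = 1.0 / n_steps
    for i in range(n_steps, 0, -1):
        t = i * dt
        # Reverse-time drift: forward drift minus beta * estimated score.
        drift = -0.5 * beta * A - beta * score_fn(A, t)
        noise = rng.standard_normal(A.shape)
        A = symmetrize(A - drift * dt + np.sqrt(beta * dt) * noise)
    return (A > 0.5).astype(int)  # threshold the continuous state to edges
```

With a trained, permutation-equivariant `score_fn`, `reverse_sample(score_fn, n_nodes=20)` returns a sampled adjacency matrix; setting `n_steps=24` corresponds to the 24 function evaluations quoted above.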
Related papers
- Random Walk Diffusion for Efficient Large-Scale Graph Generation [0.43108040967674194]
We propose ARROW-Diff (AutoRegressive RandOm Walk Diffusion), a novel random walk-based diffusion approach for efficient large-scale graph generation.
We demonstrate that ARROW-Diff can scale to large graphs efficiently, surpassing other baseline methods in terms of both generation time and multiple graph statistics.
arXiv Detail & Related papers (2024-08-08T13:42:18Z)
- Advancing Graph Generation through Beta Diffusion [49.49740940068255]
Graph Beta Diffusion (GBD) is a generative model specifically designed to handle the diverse nature of graph data.
We propose a modulation technique that enhances the realism of generated graphs by stabilizing critical graph topology.
arXiv Detail & Related papers (2024-06-13T17:42:57Z)
- Neural Graph Generator: Feature-Conditioned Graph Generation using Latent Diffusion Models [22.794561387716502]
We introduce the Neural Graph Generator (NGG), a novel approach which utilizes conditioned latent diffusion models for graph generation.
NGG demonstrates a remarkable capacity to model complex graph patterns, offering control over the graph generation process.
arXiv Detail & Related papers (2024-03-03T15:28:47Z)
- GraphRCG: Self-Conditioned Graph Generation [78.69810678803248]
We propose a novel self-conditioned graph generation framework designed to explicitly model graph distributions.
Our framework demonstrates superior performance over existing state-of-the-art graph generation methods in terms of graph quality and fidelity to training data.
arXiv Detail & Related papers (2024-03-02T02:28:20Z)
- Overcoming Order in Autoregressive Graph Generation [12.351817671944515]
Graph generation is a fundamental problem in various domains, including chemistry and social networks.
Recent work has shown that molecular graph generation using recurrent neural networks (RNNs) is advantageous compared to traditional generative approaches.
arXiv Detail & Related papers (2024-02-04T09:58:22Z)
- Graph Generation with Diffusion Mixture [57.78958552860948]
Generation of graphs is a major challenge for real-world tasks that require understanding the complex nature of their non-Euclidean structures.
We propose a generative framework that models the topology of graphs by explicitly learning the final graph structures of the diffusion process.
arXiv Detail & Related papers (2023-02-07T17:07:46Z)
- Score-based Generative Modeling of Graphs via the System of Stochastic Differential Equations [57.15855198512551]
We propose a novel score-based generative model for graphs with a continuous-time framework.
We show that our method is able to generate molecules that lie close to the training distribution yet do not violate the chemical valency rule.
arXiv Detail & Related papers (2022-02-05T08:21:04Z)
- Permutation Invariant Graph Generation via Score-Based Generative Modeling [114.12935776726606]
We propose a permutation invariant approach to modeling graphs, using the recent framework of score-based generative modeling.
In particular, we design a permutation equivariant, multi-channel graph neural network to model the gradient of the data distribution at the input graph.
For graph generation, we find that our learning approach achieves better or comparable results to existing models on benchmark datasets.
arXiv Detail & Related papers (2020-03-02T03:06:14Z)