FLOWGEN: Fast and slow graph generation
- URL: http://arxiv.org/abs/2207.07656v1
- Date: Fri, 15 Jul 2022 16:32:23 GMT
- Title: FLOWGEN: Fast and slow graph generation
- Authors: Aman Madaan, Yiming Yang
- Abstract summary: We present FLOWGEN, a graph-generation model inspired by the dual-process theory of mind.
Depending on the difficulty of completing the graph at the current step, graph generation is routed to either a fast (weaker) or a slow (stronger) model.
Experiments on real-world graphs show that FLOWGEN can generate graphs similar to those generated by a single large model in a fraction of the time.
- Score: 49.21890450444187
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: We present FLOWGEN, a graph-generation model inspired by the dual-process
theory of mind that generates large graphs incrementally. Depending on the
difficulty of completing the graph at the current step, graph generation is
routed to either a fast (weaker) or a slow (stronger) model. The fast and slow
models have identical architectures but differ in their number of parameters,
and consequently in their strength. Experiments on real-world graphs show that
FLOWGEN can generate graphs similar to those generated by a single large model
in a fraction of the time.
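The fast/slow routing described in the abstract can be sketched as follows. This is a minimal illustrative sketch, not the paper's actual architecture: `ToyModel`, its confidence scores, and the routing threshold are hypothetical placeholders standing in for the two transformer variants and the difficulty measure used in FLOWGEN.

```python
import random

class ToyModel:
    """Stand-in for an autoregressive graph model: given the partial
    edge list, it proposes the next edge along with a confidence score.
    (In FLOWGEN both models share one architecture; here confidence is
    just a fixed constant for illustration.)"""
    def __init__(self, name, confidence):
        self.name = name
        self.confidence = confidence

    def propose_edge(self, partial_edges, n_nodes, rng):
        # Propose an arbitrary edge between two distinct nodes.
        u, v = rng.sample(range(n_nodes), 2)
        return (u, v), self.confidence

def flowgen_route(fast, slow, n_nodes, n_edges, threshold=0.5, seed=0):
    """Generate a graph edge-by-edge. Each step first queries the fast
    (weaker) model; if its confidence falls below `threshold`, the step
    is deferred to the slow (stronger) model."""
    rng = random.Random(seed)
    edges = []
    calls = {"fast": 0, "slow": 0}
    while len(edges) < n_edges:
        edge, conf = fast.propose_edge(edges, n_nodes, rng)
        if conf < threshold:
            # Hard step: re-generate with the stronger model.
            edge, _ = slow.propose_edge(edges, n_nodes, rng)
            calls["slow"] += 1
        else:
            calls["fast"] += 1
        edges.append(edge)
    return edges, calls
```

When the fast model is confident on most steps, nearly all work stays with the cheap model and the expensive model is invoked only on the hard residue, which is the source of the claimed speedup.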
Related papers
- Random Walk Diffusion for Efficient Large-Scale Graph Generation [0.43108040967674194]
We propose ARROW-Diff (AutoRegressive RandOm Walk Diffusion), a novel random walk-based diffusion approach for efficient large-scale graph generation.
We demonstrate that ARROW-Diff can scale to large graphs efficiently, surpassing other baseline methods in terms of both generation time and multiple graph statistics.
arXiv Detail & Related papers (2024-08-08T13:42:18Z)
- GraphRCG: Self-Conditioned Graph Generation [78.69810678803248]
We propose a novel self-conditioned graph generation framework designed to explicitly model graph distributions.
Our framework demonstrates superior performance over existing state-of-the-art graph generation methods in terms of graph quality and fidelity to training data.
arXiv Detail & Related papers (2024-03-02T02:28:20Z)
- GraphMaker: Can Diffusion Models Generate Large Attributed Graphs? [7.330479039715941]
Large-scale graphs with node attributes are increasingly common in various real-world applications.
Traditional graph generation methods are limited in their capacity to handle these complex structures.
This paper introduces a novel diffusion model, GraphMaker, specifically designed for generating large attributed graphs.
arXiv Detail & Related papers (2023-10-20T22:12:46Z)
- Efficient and Degree-Guided Graph Generation via Discrete Diffusion Modeling [20.618785908770356]
Diffusion-based generative graph models have been proven effective in generating high-quality small graphs.
However, they struggle to scale to large graphs containing thousands of nodes while preserving desired graph statistics.
We propose EDGE, a new diffusion-based generative graph model that addresses generative tasks with large graphs.
arXiv Detail & Related papers (2023-05-06T18:32:27Z)
- Graph Generation with Diffusion Mixture [57.78958552860948]
Generation of graphs is a major challenge for real-world tasks that require understanding the complex nature of their non-Euclidean structures.
We propose a generative framework that models the topology of graphs by explicitly learning the final graph structures of the diffusion process.
arXiv Detail & Related papers (2023-02-07T17:07:46Z)
- Generative Diffusion Models on Graphs: Methods and Applications [50.44334458963234]
Diffusion models, as a novel generative paradigm, have achieved remarkable success in various image generation tasks.
Graph generation is a crucial computational task on graphs with numerous real-world applications.
arXiv Detail & Related papers (2023-02-06T06:58:17Z)
- Instant Graph Neural Networks for Dynamic Graphs [18.916632816065935]
We propose Instant Graph Neural Network (InstantGNN), an incremental approach for computing the graph representation matrix of dynamic graphs.
Our method avoids time-consuming, repetitive computations and allows instant updates on the representation and instant predictions.
Our model achieves state-of-the-art accuracy while having orders-of-magnitude higher efficiency than existing methods.
arXiv Detail & Related papers (2022-06-03T03:27:42Z)
- TIGGER: Scalable Generative Modelling for Temporal Interaction Graphs [19.71442902979904]
Existing generative models do not scale with the time horizon or the number of nodes.
In this paper, we bridge these gaps with a novel generative model called TIGGER.
We establish that TIGGER generates graphs of superior fidelity, while also being up to 3 orders of magnitude faster than the state-of-the-art.
arXiv Detail & Related papers (2022-03-07T18:09:05Z)
- Learning Graphon Autoencoders for Generative Graph Modeling [91.32624399902755]
Graphon is a nonparametric model that generates graphs with arbitrary sizes and can be induced from graphs easily.
We propose a novel framework called graphon autoencoder to build an interpretable and scalable graph generative model.
A linear graphon factorization model works as a decoder, leveraging the latent representations to reconstruct the induced graphons.
arXiv Detail & Related papers (2021-05-29T08:11:40Z)
- Dirichlet Graph Variational Autoencoder [65.94744123832338]
We present Dirichlet Graph Variational Autoencoder (DGVAE) with graph cluster memberships as latent factors.
Motivated by the low pass characteristics in balanced graph cut, we propose a new variant of GNN named Heatts to encode the input graph into cluster memberships.
arXiv Detail & Related papers (2020-10-09T07:35:26Z)
This list is automatically generated from the titles and abstracts of the papers in this site.