Overcoming Order in Autoregressive Graph Generation
- URL: http://arxiv.org/abs/2402.03387v1
- Date: Sun, 4 Feb 2024 09:58:22 GMT
- Title: Overcoming Order in Autoregressive Graph Generation
- Authors: Edo Cohen-Karlik, Eyal Rozenberg and Daniel Freedman
- Abstract summary: Graph generation is a fundamental problem in various domains, including chemistry and social networks.
Recent work has shown that molecular graph generation using recurrent neural networks (RNNs) is advantageous compared to traditional generative approaches.
- Score: 12.351817671944515
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Graph generation is a fundamental problem in various domains, including
chemistry and social networks. Recent work has shown that molecular graph
generation using recurrent neural networks (RNNs) is advantageous compared to
traditional generative approaches which require converting continuous latent
representations into graphs. One issue that arises when treating graph
generation as sequential generation is the arbitrary ordering of the sequence
that results from a particular choice of graph-flattening method. In this work
we propose using RNNs, taking into account the non-sequential nature of graphs
by adding an Orderless Regularization (OLR) term that encourages the hidden
state of the recurrent model to be invariant to different valid orderings
present under the training distribution. We demonstrate that sequential graph
generation models benefit from our proposed regularization scheme, especially
when data is scarce. Our findings contribute to the growing body of research on
graph generation and provide a valuable tool for various applications requiring
the synthesis of realistic and diverse graph structures.
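The abstract describes Orderless Regularization (OLR) only at a high level. As a rough illustration of the idea, the sketch below encodes two valid edge orderings of the same graph with a toy RNN and penalizes the distance between the resulting hidden states. All names (`encode`, `W_h`, `olr_penalty`) are hypothetical, and the tanh RNN stands in for the paper's actual trained recurrent generative model; this is a minimal sketch of the regularization idea, not the authors' implementation.

```python
import numpy as np

# Toy fixed-weight tanh RNN; a real model would be a trained generative RNN.
rng = np.random.default_rng(0)
HID, INP = 8, 2  # hidden size; input size (an edge as a pair of node ids)

W_h = rng.normal(scale=0.1, size=(HID, HID))
W_x = rng.normal(scale=0.1, size=(HID, INP))

def encode(edge_seq):
    """Run the RNN over one flattening of the graph; return the final hidden state."""
    h = np.zeros(HID)
    for edge in edge_seq:
        h = np.tanh(W_h @ h + W_x @ np.asarray(edge, dtype=float))
    return h

# Two valid flattenings (edge orderings) of the same triangle graph on nodes 0, 1, 2.
order_a = [(0, 1), (1, 2), (2, 0)]
order_b = [(1, 2), (2, 0), (0, 1)]

h_a, h_b = encode(order_a), encode(order_b)

# OLR-style penalty: push hidden states of equivalent orderings toward each other.
olr_penalty = float(np.sum((h_a - h_b) ** 2))
# In training this would enter the objective as: total_loss = task_loss + lam * olr_penalty
print(olr_penalty)
```

During training, the penalty is driven toward zero, so the hidden state becomes (approximately) invariant to which valid ordering of the same graph was fed in.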
Related papers
- Neural Graph Generator: Feature-Conditioned Graph Generation using Latent Diffusion Models [22.794561387716502]
We introduce the Neural Graph Generator (NGG), a novel approach which utilizes conditioned latent diffusion models for graph generation.
NGG demonstrates a remarkable capacity to model complex graph patterns, offering control over the graph generation process.
arXiv Detail & Related papers (2024-03-03T15:28:47Z)
- GraphRCG: Self-Conditioned Graph Generation [78.69810678803248]
We propose a novel self-conditioned graph generation framework designed to explicitly model graph distributions.
Our framework demonstrates superior performance over existing state-of-the-art graph generation methods in terms of graph quality and fidelity to training data.
arXiv Detail & Related papers (2024-03-02T02:28:20Z)
- Let There Be Order: Rethinking Ordering in Autoregressive Graph Generation [6.422073551199993]
Conditional graph generation tasks involve training a model to generate a graph given a set of input conditions.
Many previous studies employ autoregressive models to incrementally generate graph components such as nodes and edges.
As graphs typically lack a natural ordering among their components, converting a graph into a sequence of tokens is not straightforward.
arXiv Detail & Related papers (2023-05-24T20:52:34Z)
- Graph Generation with Diffusion Mixture [57.78958552860948]
Generation of graphs is a major challenge for real-world tasks that require understanding the complex nature of their non-Euclidean structures.
We propose a generative framework that models the topology of graphs by explicitly learning the final graph structures of the diffusion process.
arXiv Detail & Related papers (2023-02-07T17:07:46Z)
- GraphGDP: Generative Diffusion Processes for Permutation Invariant Graph Generation [43.196067037856515]
Graph generative models have broad applications in biology, chemistry and social science.
Current leading autoregressive models fail to capture the permutation invariance nature of graphs.
We propose a continuous-time generative diffusion process for permutation invariant graph generation.
arXiv Detail & Related papers (2022-12-04T15:12:44Z)
- Learning Graph Structure from Convolutional Mixtures [119.45320143101381]
We propose a graph convolutional relationship between the observed and latent graphs, and formulate the graph learning task as a network inverse (deconvolution) problem.
In lieu of eigendecomposition-based spectral methods, we unroll and truncate proximal gradient iterations to arrive at a parameterized neural network architecture that we call a Graph Deconvolution Network (GDN).
GDNs can learn a distribution of graphs in a supervised fashion, perform link prediction or edge-weight regression tasks by adapting the loss function, and they are inherently inductive.
arXiv Detail & Related papers (2022-05-19T14:08:15Z)
- Score-based Generative Modeling of Graphs via the System of Stochastic Differential Equations [57.15855198512551]
We propose a novel score-based generative model for graphs with a continuous-time framework.
We show that our method is able to generate molecules that lie close to the training distribution yet do not violate the chemical valency rule.
arXiv Detail & Related papers (2022-02-05T08:21:04Z)
- Order Matters: Probabilistic Modeling of Node Sequence for Graph Generation [18.03898476141173]
A graph generative model defines a distribution over graphs.
We derive the exact joint probability over the graph and the node ordering of the sequential process.
We train graph generative models by maximizing this bound, without using the ad-hoc node orderings of previous methods.
arXiv Detail & Related papers (2021-06-11T06:37:52Z)
- A Robust and Generalized Framework for Adversarial Graph Embedding [73.37228022428663]
We propose a robust framework for adversarial graph embedding, named AGE.
AGE generates the fake neighbor nodes as the enhanced negative samples from the implicit distribution.
Based on this framework, we propose three models to handle three types of graph data.
arXiv Detail & Related papers (2021-05-22T07:05:48Z)
- Dirichlet Graph Variational Autoencoder [65.94744123832338]
We present Dirichlet Graph Variational Autoencoder (DGVAE) with graph cluster memberships as latent factors.
Motivated by the low pass characteristics in balanced graph cut, we propose a new variant of GNN named Heatts to encode the input graph into cluster memberships.
arXiv Detail & Related papers (2020-10-09T07:35:26Z)
- Learning to Generate Time Series Conditioned Graphs with Generative Adversarial Nets [9.884477413012815]
We are interested in a novel problem named Time Series Conditioned Graph Generation: given an input time series, we aim to infer a target relation graph.
To achieve this, we propose a novel Time Series conditioned Graph Generation Generative Adversarial Network (TSGG-GAN).
Extensive experiments on synthetic and real-world gene regulatory network datasets demonstrate the effectiveness and generalizability of the proposed TSGG-GAN.
arXiv Detail & Related papers (2020-03-03T10:41:56Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.