Graph Deconvolutional Generation
- URL: http://arxiv.org/abs/2002.07087v1
- Date: Fri, 14 Feb 2020 04:37:14 GMT
- Title: Graph Deconvolutional Generation
- Authors: Daniel Flam-Shepherd, Tony Wu and Alan Aspuru-Guzik
- Abstract summary: We focus on the modern equivalent of the Erdős–Rényi random graph model: the graph variational autoencoder (GVAE).
GVAE has difficulty matching the training distribution and relies on an expensive graph matching procedure.
We improve this class of models by building a message passing neural network into GVAE's encoder and decoder.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Graph generation is an extremely important task, as graphs are found
throughout different areas of science and engineering. In this work, we focus
on the modern equivalent of the Erdős–Rényi random graph model: the graph
variational autoencoder (GVAE). This model assumes edges and nodes are
independent in order to generate entire graphs at a time using a multi-layer
perceptron decoder. As a result of these assumptions, GVAE has difficulty
matching the training distribution and relies on an expensive graph matching
procedure. We improve this class of models by building a message passing neural
network into GVAE's encoder and decoder. We demonstrate our model on the
specific task of generating small organic molecules.
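To make the contrast concrete, here is a minimal NumPy sketch of the two decoding styles: an MLP decoder that emits all edge probabilities at once from a single latent vector (treating edges as independent), versus a message-passing step that lets node states exchange information before edges are predicted. The names, dimensions, and the single refinement step are our own illustration, not the paper's actual architecture.

```python
import numpy as np

rng = np.random.default_rng(0)

def mlp_decoder(z, n_nodes, w):
    # GVAE-style decoding: one latent vector -> all edge probabilities
    # at once, with every edge treated as independent of the others.
    logits = (w @ z).reshape(n_nodes, n_nodes)
    logits = (logits + logits.T) / 2  # symmetrize for undirected graphs
    return 1 / (1 + np.exp(-logits))

def mpnn_decoder_step(h, adj_soft):
    # Message-passing refinement: node states exchange messages over the
    # current soft adjacency, so edge predictions become interdependent.
    messages = adj_soft @ h
    h_new = np.tanh(h + messages)
    probs = 1 / (1 + np.exp(-(h_new @ h_new.T)))
    np.fill_diagonal(probs, 0.0)  # no self-loops
    return h_new, probs

n, d_z, d_h = 5, 8, 4
z = rng.normal(size=d_z)                 # latent code
w = rng.normal(size=(n * n, d_z)) * 0.1  # MLP weights
probs = mlp_decoder(z, n, w)

h = rng.normal(size=(n, d_h))            # initial node states
h, probs_mp = mpnn_decoder_step(h, probs)
print(probs.shape, probs_mp.shape)  # (5, 5) (5, 5)
```

In a real model the MPNN step would be repeated and trained end-to-end; the point here is only that its edge predictions depend on neighboring node states rather than being emitted independently.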
Related papers
- Neural Graph Generator: Feature-Conditioned Graph Generation using Latent Diffusion Models [22.794561387716502]
We introduce the Neural Graph Generator (NGG), a novel approach which utilizes conditioned latent diffusion models for graph generation.
NGG demonstrates a remarkable capacity to model complex graph patterns, offering control over the graph generation process.
arXiv Detail & Related papers (2024-03-03T15:28:47Z) - Efficient and Degree-Guided Graph Generation via Discrete Diffusion Modeling [20.618785908770356]
Diffusion-based generative graph models have been proven effective in generating high-quality small graphs.
However, they struggle to scale to large graphs with thousands of nodes while matching the desired graph statistics.
We propose EDGE, a new diffusion-based generative graph model that addresses generative tasks with large graphs.
arXiv Detail & Related papers (2023-05-06T18:32:27Z) - Graph Generation with Diffusion Mixture [57.78958552860948]
Generation of graphs is a major challenge for real-world tasks that require understanding the complex nature of their non-Euclidean structures.
We propose a generative framework that models the topology of graphs by explicitly learning the final graph structures of the diffusion process.
arXiv Detail & Related papers (2023-02-07T17:07:46Z) - Training Graph Neural Networks on Growing Stochastic Graphs [114.75710379125412]
Graph Neural Networks (GNNs) rely on graph convolutions to exploit meaningful patterns in networked data.
We propose to learn GNNs on very large graphs by leveraging the limit object of a sequence of growing graphs, the graphon.
arXiv Detail & Related papers (2022-10-27T16:00:45Z) - Graph Kernel Neural Networks [53.91024360329517]
We propose to use graph kernels, i.e. kernel functions that compute an inner product on graphs, to extend the standard convolution operator to the graph domain.
This allows us to define an entirely structural model that does not require computing the embedding of the input graph.
Our architecture allows plugging in any type of graph kernel and has the added benefit of providing some interpretability.
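As a toy illustration of the "inner product on graphs" idea, the sketch below implements a degree-histogram kernel in NumPy. This particular kernel is our own minimal example, not one from the paper, which plugs in richer kernels; it only shows the shape of the interface: two graphs in, a scalar similarity out.

```python
import numpy as np

def degree_histogram_kernel(a1, a2, max_deg=10):
    # A toy graph kernel: represent each graph by the histogram of its
    # node degrees, then take the inner product of the two histograms.
    h1 = np.bincount(a1.sum(axis=1).astype(int), minlength=max_deg + 1)[:max_deg + 1]
    h2 = np.bincount(a2.sum(axis=1).astype(int), minlength=max_deg + 1)[:max_deg + 1]
    return float(h1 @ h2)

triangle = np.array([[0, 1, 1], [1, 0, 1], [1, 1, 0]])  # all degrees 2
path = np.array([[0, 1, 0], [1, 0, 1], [0, 1, 0]])      # degrees 1, 2, 1

print(degree_histogram_kernel(triangle, triangle))  # 9.0
print(degree_histogram_kernel(triangle, path))      # 3.0
```

Any function with this signature that is positive semi-definite over graphs could be substituted, which is the "plug in any kernel" property the summary refers to.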
arXiv Detail & Related papers (2021-12-14T14:48:08Z) - GraphGen-Redux: a Fast and Lightweight Recurrent Model for labeled Graph Generation [13.956691231452336]
We present a novel graph preprocessing approach for labeled graph generation that processes the labeling information of nodes and edges jointly.
The corresponding model, which we term GraphGen-Redux, improves upon the generative performances of GraphGen in a wide range of datasets.
arXiv Detail & Related papers (2021-07-18T09:26:10Z) - Learning Graphon Autoencoders for Generative Graph Modeling [91.32624399902755]
Graphon is a nonparametric model that generates graphs with arbitrary sizes and can be induced from graphs easily.
We propose a novel framework called graphon autoencoder to build an interpretable and scalable graph generative model.
A linear graphon factorization model works as a decoder, leveraging the latent representations to reconstruct the induced graphons.
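The "graphs with arbitrary sizes" property comes from how a graphon is sampled: draw a latent position on [0, 1] per node, then connect each pair independently with the probability the graphon assigns. A generic NumPy sketch of that sampler (our own illustration, not the paper's decoder):

```python
import numpy as np

def sample_from_graphon(w, n, rng):
    # Sample an n-node undirected graph from graphon w: draw latent
    # positions uniformly on [0, 1], then connect nodes i and j with
    # probability w(u_i, u_j). Any n works with the same graphon.
    u = rng.uniform(size=n)
    p = w(u[:, None], u[None, :])          # pairwise edge probabilities
    upper = rng.uniform(size=(n, n)) < p   # independent coin flips
    adj = np.triu(upper, k=1)              # keep strict upper triangle
    return (adj | adj.T).astype(int)       # symmetrize, no self-loops

w = lambda x, y: x * y  # a simple product graphon
rng = np.random.default_rng(0)
g = sample_from_graphon(w, 6, rng)
print(g.shape)  # (6, 6)
```

The same `w` can generate graphs of any size, which is exactly what makes graphons attractive as a shared latent object for graphs of varying node counts.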
arXiv Detail & Related papers (2021-05-29T08:11:40Z) - Dirichlet Graph Variational Autoencoder [65.94744123832338]
We present Dirichlet Graph Variational Autoencoder (DGVAE) with graph cluster memberships as latent factors.
Motivated by the low-pass characteristics of balanced graph cut, we propose a new GNN variant named Heatts to encode the input graph into cluster memberships.
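The low-pass idea can be sketched with a graph heat-kernel filter: damp the high-frequency eigencomponents of the normalized Laplacian while keeping the smooth ones. The details of Heatts are not given in the summary, so the NumPy snippet below is a generic illustration of heat-kernel filtering, not the paper's layer.

```python
import numpy as np

def heat_kernel_filter(adj, x, s=1.0):
    # Low-pass filter node features x with the graph heat kernel exp(-s L),
    # where L is the symmetric normalized Laplacian. Small eigenvalues
    # (smooth signals) are preserved; high-frequency components are damped.
    deg = adj.sum(axis=1)
    d_inv_sqrt = np.diag(1.0 / np.sqrt(np.maximum(deg, 1e-12)))
    lap = np.eye(len(adj)) - d_inv_sqrt @ adj @ d_inv_sqrt
    eigval, eigvec = np.linalg.eigh(lap)
    h = eigvec @ np.diag(np.exp(-s * eigval)) @ eigvec.T
    return h @ x

# 4-node path graph, with a feature spike on the first node
adj = np.array([[0, 1, 0, 0], [1, 0, 1, 0], [0, 1, 0, 1], [0, 0, 1, 0]], float)
x = np.array([[1.0], [0.0], [0.0], [0.0]])
print(heat_kernel_filter(adj, x).shape)  # (4, 1)
```

With `s = 0` the filter is the identity; increasing `s` diffuses the spike across neighbors, which is the smoothing behavior that makes such filters useful for extracting cluster structure.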
arXiv Detail & Related papers (2020-10-09T07:35:26Z) - Adaptive Graph Auto-Encoder for General Data Clustering [90.8576971748142]
Graph-based clustering plays an important role in the clustering area.
Recent studies about graph convolution neural networks have achieved impressive success on graph type data.
We propose a graph auto-encoder for general data clustering, which constructs the graph adaptively according to the generative perspective of graphs.
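For contrast with an adaptively constructed graph, the standard fixed baseline is a k-nearest-neighbour graph built directly from the data. The sketch below is our own illustration of that baseline, not the paper's adaptive method:

```python
import numpy as np

def knn_graph(x, k=2):
    # Build a graph from raw data points by connecting each point to its
    # k nearest neighbours, then symmetrizing. This construction is fixed
    # up front, whereas an adaptive approach learns the graph jointly
    # with the clustering.
    d = np.linalg.norm(x[:, None] - x[None, :], axis=-1)
    np.fill_diagonal(d, np.inf)               # exclude self-matches
    idx = np.argsort(d, axis=1)[:, :k]        # k nearest per point
    adj = np.zeros_like(d)
    rows = np.repeat(np.arange(len(x)), k)
    adj[rows, idx.ravel()] = 1
    return np.maximum(adj, adj.T)             # symmetrize

# Two well-separated pairs of 1-D points -> two connected components
x = np.array([[0.0], [0.1], [5.0], [5.1]])
g = knn_graph(x, k=1)
print(g)
```

The weakness of this fixed construction (sensitivity to `k` and to the distance metric) is precisely what motivates learning the graph adaptively, as the paper proposes.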
arXiv Detail & Related papers (2020-02-20T10:11:28Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.