A Graph VAE and Graph Transformer Approach to Generating Molecular
Graphs
- URL: http://arxiv.org/abs/2104.04345v1
- Date: Fri, 9 Apr 2021 13:13:06 GMT
- Authors: Joshua Mitton, Hans M. Senn, Klaas Wynne, Roderick Murray-Smith
- Abstract summary: We propose a variational autoencoder and a transformer-based model which fully utilise graph convolutional and graph pooling layers.
The transformer model implements a novel node encoding layer, replacing the position encoding typically used in transformers, to create a transformer with no position information that operates on graphs.
In experiments we chose a benchmark task of molecular generation, given the importance of both generated node and edge features.
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: We propose a combination of a variational autoencoder and a transformer-based
model which fully utilises graph convolutional and graph pooling layers to
operate directly on graphs. The transformer model implements a novel node
encoding layer, replacing the position encoding typically used in transformers,
to create a transformer with no position information that operates on graphs,
encoding adjacent node properties into the edge generation process. The
proposed model builds on graph generative work operating on graphs with edge
features, creating a model that offers improved scalability with the number of
nodes in a graph. In addition, our model is capable of learning a disentangled,
interpretable latent space that represents graph properties through a mapping
between latent variables and graph properties. In experiments we chose a
benchmark task of molecular generation, given the importance of both generated
node and edge features. Using the QM9 dataset we demonstrate that our model
performs strongly across the task of generating valid, unique and novel
molecules. Finally, we demonstrate that the model is interpretable by
generating molecules controlled by molecular properties, and we then analyse
and visualise the learned latent representation.
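The abstract's central architectural idea is a transformer that carries no position information: the usual positional encoding is replaced by a node-encoding layer that folds each node's neighbourhood into its representation. A minimal NumPy sketch of that general idea follows; the function names and the mean-over-neighbours aggregation are illustrative assumptions, not the authors' actual layer.

```python
import numpy as np

def node_encoding(X, A):
    # Hypothetical node-encoding layer: instead of positional encodings,
    # each node's feature vector is augmented with the mean of its
    # neighbours' features, so the transformer sees local graph structure
    # but no global node ordering.
    deg = A.sum(axis=1, keepdims=True).clip(min=1)  # avoid divide-by-zero
    neighbour_mean = (A @ X) / deg
    return np.concatenate([X, neighbour_mean], axis=-1)

def self_attention(H):
    # Plain (position-free) scaled dot-product self-attention over nodes.
    d = H.shape[-1]
    scores = H @ H.T / np.sqrt(d)
    scores -= scores.max(axis=-1, keepdims=True)  # numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ H
```

Because the encoding depends only on features and adjacency, relabelling the nodes permutes the output rows in the same way, which is the property a position-free graph transformer needs.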
Related papers
- Neural Graph Generator: Feature-Conditioned Graph Generation using Latent Diffusion Models [22.794561387716502]
We introduce the Neural Graph Generator (NGG), a novel approach which utilizes conditioned latent diffusion models for graph generation.
NGG demonstrates a remarkable capacity to model complex graph patterns, offering control over the graph generation process.
arXiv Detail & Related papers (2024-03-03T15:28:47Z)
- Graph Generation via Spectral Diffusion [51.60814773299899]
We present GRASP, a novel graph generative model based on 1) the spectral decomposition of the graph Laplacian matrix and 2) a diffusion process.
Specifically, we propose to use a denoising model to sample eigenvectors and eigenvalues from which we can reconstruct the graph Laplacian and adjacency matrix.
Our permutation invariant model can also handle node features by concatenating them to the eigenvectors of each node.
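The GRASP summary above hinges on one deterministic step: given sampled eigenvectors and eigenvalues, rebuild the Laplacian and read off the adjacency matrix. A short sketch of that reconstruction, assuming an unweighted graph with the combinatorial Laplacian L = D - A (the denoising model that samples the spectrum is omitted):

```python
import numpy as np

def laplacian(A):
    # Combinatorial graph Laplacian L = D - A.
    return np.diag(A.sum(axis=1)) - A

def reconstruct_adjacency(eigvecs, eigvals):
    # Rebuild L = U diag(lambda) U^T from its spectral decomposition,
    # then recover A: off-diagonal entries of L are -A, diagonal is degree.
    L = eigvecs @ np.diag(eigvals) @ eigvecs.T
    A = -L.copy()
    np.fill_diagonal(A, 0.0)
    return A
```

Round-tripping a graph through `np.linalg.eigh` and `reconstruct_adjacency` recovers the original adjacency up to floating-point error.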
arXiv Detail & Related papers (2024-02-29T09:26:46Z)
- GrannGAN: Graph annotation generative adversarial networks [72.66289932625742]
We consider the problem of modelling high-dimensional distributions and generating new examples of data with complex relational feature structure coherent with a graph skeleton.
The model we propose tackles the problem of generating the data features constrained by the specific graph structure of each data point by splitting the task into two phases.
In the first phase it models the distribution of features associated with the nodes of the given graph; in the second, it generates the edge features conditioned on the node features.
arXiv Detail & Related papers (2022-12-01T11:49:07Z)
- DiGress: Discrete Denoising diffusion for graph generation [79.13904438217592]
DiGress is a discrete denoising diffusion model for generating graphs with categorical node and edge attributes.
It achieves state-of-the-art performance on molecular and non-molecular datasets, with up to 3x validity improvement.
It is also the first model to scale to the large GuacaMol dataset containing 1.3M drug-like molecules.
arXiv Detail & Related papers (2022-09-29T12:55:03Z)
- Gransformer: Transformer-based Graph Generation [14.161975556325796]
Gransformer is a Transformer-based algorithm for generating graphs.
We modify the Transformer encoder to exploit the structural information of the given graph.
We also introduce a graph-based familiarity measure between node pairs.
arXiv Detail & Related papers (2022-03-25T14:05:12Z)
- Graph Self-Attention for learning graph representation with Transformer [13.49645012479288]
We propose a novel Graph Self-Attention module to enable Transformer models to learn graph representation.
We propose context-aware attention which considers the interactions between query, key and graph information.
Our method achieves state-of-the-art performance on multiple benchmarks of graph representation learning.
arXiv Detail & Related papers (2022-01-30T11:10:06Z)
- Graph Kernel Neural Networks [53.91024360329517]
We propose to use graph kernels, i.e. kernel functions that compute an inner product on graphs, to extend the standard convolution operator to the graph domain.
This allows us to define an entirely structural model that does not require computing the embedding of the input graph.
Our architecture allows plugging in any type of graph kernel and has the added benefit of providing some interpretability.
arXiv Detail & Related papers (2021-12-14T14:48:08Z)
- GraphPiece: Efficiently Generating High-Quality Molecular Graph with Substructures [7.021635649909492]
We propose a method to automatically discover common substructures, which we call graph pieces, from given molecular graphs.
Based on graph pieces, we leverage a variational autoencoder to generate molecules in two phases: piece-level graph generation followed by bond completion.
arXiv Detail & Related papers (2021-06-29T05:26:18Z)
- Learning Graphon Autoencoders for Generative Graph Modeling [91.32624399902755]
A graphon is a nonparametric model that generates graphs of arbitrary size and can be easily induced from graphs.
We propose a novel framework called graphon autoencoder to build an interpretable and scalable graph generative model.
A linear graphon factorization model works as a decoder, leveraging the latent representations to reconstruct the induced graphons.
arXiv Detail & Related papers (2021-05-29T08:11:40Z)
- Dirichlet Graph Variational Autoencoder [65.94744123832338]
We present Dirichlet Graph Variational Autoencoder (DGVAE) with graph cluster memberships as latent factors.
Motivated by the low pass characteristics in balanced graph cut, we propose a new variant of GNN named Heatts to encode the input graph into cluster memberships.
arXiv Detail & Related papers (2020-10-09T07:35:26Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed content (including all information) and is not responsible for any consequences.