RGCVAE: Relational Graph Conditioned Variational Autoencoder for
Molecule Design
- URL: http://arxiv.org/abs/2305.11699v2
- Date: Thu, 8 Jun 2023 10:42:01 GMT
- Title: RGCVAE: Relational Graph Conditioned Variational Autoencoder for
Molecule Design
- Authors: Davide Rigoni, Nicolò Navarin, Alessandro Sperduti
- Abstract summary: Deep Graph Variational Autoencoders are among the most powerful machine learning tools for generating molecules with pre-specified properties.
We propose RGCVAE, an efficient and effective Graph Variational Autoencoder based on: (i) an encoding network exploiting a new, powerful Relational Graph Isomorphism Network; (ii) a novel probabilistic decoding component.
- Score: 70.59828655929194
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Identifying molecules that exhibit some pre-specified properties is a
difficult problem to solve. In the last few years, deep generative models have
been used for molecule generation. Deep Graph Variational Autoencoders are
among the most powerful machine learning tools for addressing this problem.
However, existing methods struggle to capture the true
data distribution and tend to be computationally expensive. In this work, we
propose RGCVAE, an efficient and effective Graph Variational Autoencoder based
on: (i) an encoding network exploiting a new powerful Relational Graph
Isomorphism Network; (ii) a novel probabilistic decoding component. Compared to
several state-of-the-art VAE methods on two widely adopted datasets, RGCVAE
shows state-of-the-art molecule generation performance while being
significantly faster to train.
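Since the abstract only names the components, the following is a minimal PyTorch sketch of what an encoder built around a relation-aware GIN layer with a Gaussian latent head could look like. The specific layer form (one GIN aggregation per bond type, in the style of R-GCN), the sum readout, and all names are assumptions made for illustration, not the paper's actual formulation; the probabilistic decoder is omitted.

```python
import torch
import torch.nn as nn


class RelationalGINLayer(nn.Module):
    """GIN-style node update with one aggregation per relation (bond) type."""

    def __init__(self, dim: int, num_relations: int):
        super().__init__()
        self.rel_lin = nn.ModuleList([nn.Linear(dim, dim) for _ in range(num_relations)])
        self.eps = nn.Parameter(torch.zeros(1))
        self.mlp = nn.Sequential(nn.Linear(dim, dim), nn.ReLU(), nn.Linear(dim, dim))

    def forward(self, h: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
        # h:   [num_nodes, dim] node features
        # adj: [num_relations, num_nodes, num_nodes], one adjacency matrix per bond type
        agg = sum(lin(adj[r] @ h) for r, lin in enumerate(self.rel_lin))
        return self.mlp((1.0 + self.eps) * h + agg)


class GraphVAEEncoder(nn.Module):
    """Stacks relational GIN layers and maps a pooled graph embedding to a Gaussian latent."""

    def __init__(self, dim: int, latent_dim: int, num_relations: int, num_layers: int = 3):
        super().__init__()
        self.layers = nn.ModuleList(
            [RelationalGINLayer(dim, num_relations) for _ in range(num_layers)]
        )
        self.to_mu = nn.Linear(dim, latent_dim)
        self.to_logvar = nn.Linear(dim, latent_dim)

    def forward(self, h, adj):
        for layer in self.layers:
            h = layer(h, adj)
        pooled = h.sum(dim=0)  # simple sum readout over nodes
        mu, logvar = self.to_mu(pooled), self.to_logvar(pooled)
        z = mu + torch.randn_like(mu) * torch.exp(0.5 * logvar)  # reparameterization trick
        return z, mu, logvar


# Toy usage: 5 atoms, 4 bond types, 16-dim node features, 8-dim latent space.
h = torch.randn(5, 16)
adj = torch.randint(0, 2, (4, 5, 5)).float()
z, mu, logvar = GraphVAEEncoder(16, 8, 4)(h, adj)
```

A standard graph VAE training loop would then combine a reconstruction term from the decoder with the KL divergence between the Gaussian defined by (mu, logvar) and a standard normal prior.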
Related papers
- BatmanNet: Bi-branch Masked Graph Transformer Autoencoder for Molecular
Representation [21.03650456372902]
We propose a novel bi-branch masked graph transformer autoencoder (BatmanNet) to learn molecular representations.
BatmanNet features two tailored, complementary, and asymmetric graph autoencoders to reconstruct the missing nodes and edges.
It achieves state-of-the-art results for multiple drug discovery tasks, including molecular property prediction, drug-drug interaction, and drug-target interaction.
arXiv Detail & Related papers (2022-11-25T09:44:28Z)
- Benchmarking GPU and TPU Performance with Graph Neural Networks [0.0]
This work analyzes and compares GPU and TPU performance when training a Graph Neural Network (GNN) developed to solve a real-life pattern recognition problem.
Characterizing the new class of models acting on sparse data may prove helpful in optimizing the design of deep learning libraries and future AI accelerators.
arXiv Detail & Related papers (2022-10-21T21:03:40Z) - Dynamic Graph Message Passing Networks for Visual Recognition [112.49513303433606]
Modelling long-range dependencies is critical for scene understanding tasks in computer vision.
A fully-connected graph is beneficial for such modelling, but its computational overhead is prohibitive.
We propose a dynamic graph message passing network that significantly reduces the computational complexity.
arXiv Detail & Related papers (2022-09-20T14:41:37Z) - MGAE: Masked Autoencoders for Self-Supervised Learning on Graphs [55.66953093401889]
We propose a masked graph autoencoder (MGAE) framework to perform effective learning on graph-structured data.
Taking insights from self-supervised learning, we randomly mask a large proportion of edges and try to reconstruct these missing edges during training (a minimal sketch of this masking step appears after the list below).
arXiv Detail & Related papers (2022-01-07T16:48:07Z) - Multiresolution Graph Variational Autoencoder [11.256959274636724]
We propose Multiresolution Graph Networks (MGN) and Multiresolution Graph Variational Autoencoders (MGVAE).
At each resolution level, MGN employs higher-order message passing to encode the graph while learning to partition it into mutually exclusive clusters and coarsening it into a lower resolution.
MGVAE constructs a hierarchical generative model based on MGN to variationally autoencode the hierarchy of coarsened graphs.
arXiv Detail & Related papers (2021-06-02T06:28:47Z) - Dirichlet Graph Variational Autoencoder [65.94744123832338]
We present Dirichlet Graph Variational Autoencoder (DGVAE) with graph cluster memberships as latent factors.
Motivated by the low-pass characteristics in balanced graph cut, we propose a new variant of GNN named Heatts to encode the input graph into cluster memberships.
arXiv Detail & Related papers (2020-10-09T07:35:26Z)
- Conditional Constrained Graph Variational Autoencoders for Molecule Design [70.59828655929194]
We present Conditional Constrained Graph Variational Autoencoder (CCGVAE), a model that implements this key idea within a state-of-the-art architecture.
We show improved results on several evaluation metrics on two commonly adopted datasets for molecule generation.
arXiv Detail & Related papers (2020-09-01T21:58:07Z)
- Graph-Aware Transformer: Is Attention All Graphs Need? [5.240000443825077]
GRaph-Aware Transformer (GRAT) is the first Transformer-based model that can encode and decode whole graphs in an end-to-end fashion.
GRAT has shown very promising results, including state-of-the-art performance on four regression tasks in the QM9 benchmark.
arXiv Detail & Related papers (2020-06-09T12:13:56Z)
- Auto-decoding Graphs [91.3755431537592]
The generative model is an auto-decoder that learns to synthesize graphs from latent codes.
Graphs are synthesized using self-attention modules that are trained to identify likely connectivity patterns.
arXiv Detail & Related papers (2020-06-04T14:23:01Z)
- Graph Deconvolutional Generation [3.5138314002170192]
We focus on the modern equivalent of the Erdős-Rényi random graph model: the graph variational autoencoder (GVAE).
GVAE has difficulty matching the training distribution and relies on an expensive graph matching procedure.
We improve this class of models by building a message passing neural network into GVAE's encoder and decoder.
arXiv Detail & Related papers (2020-02-14T04:37:14Z)
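As referenced in the MGAE entry above, the random edge-masking step it describes can be illustrated with a short sketch. The dense-adjacency representation, the 70% mask ratio, and the function name below are assumptions made for illustration only; the paper's actual model pairs this objective with a GNN encoder and a tailored decoder, which are elided here.

```python
import torch


def mask_edges(adj: torch.Tensor, mask_ratio: float = 0.7):
    """Randomly hide a fraction of existing edges; the model is then trained
    to reconstruct the hidden edges from the visible graph."""
    edges = adj.nonzero()                          # [num_edges, 2] index pairs of existing edges
    num_masked = int(mask_ratio * edges.size(0))
    perm = torch.randperm(edges.size(0))
    masked = edges[perm[:num_masked]]              # edges the model must reconstruct
    visible_adj = adj.clone()
    visible_adj[masked[:, 0], masked[:, 1]] = 0.0  # drop masked edges from the input graph
    return visible_adj, masked
```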
This list is automatically generated from the titles and abstracts of the papers on this site.
The site does not guarantee the quality of this information and is not responsible for any consequences of its use.