Variational Autoencoding Molecular Graphs with Denoising Diffusion Probabilistic Model
- URL: http://arxiv.org/abs/2307.00623v2
- Date: Tue, 22 Aug 2023 07:10:19 GMT
- Title: Variational Autoencoding Molecular Graphs with Denoising Diffusion Probabilistic Model
- Authors: Daiki Koge, Naoaki Ono and Shigehiko Kanaya
- Abstract summary: We propose a novel deep generative model that incorporates a hierarchical structure into the probabilistic latent vectors.
We demonstrate, through experiments on small datasets of physical properties and activity, that our model designs effective molecular latent vectors for property prediction.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: In data-driven drug discovery, designing molecular descriptors is a very
important task. Deep generative models such as variational autoencoders (VAEs)
offer a potential solution by designing descriptors as probabilistic latent
vectors derived from molecular structures. These models can be trained on large
datasets that contain only molecular structures and then applied to transfer
learning. Nevertheless, the approximate posterior distribution of the latent
vectors of the usual VAE assumes a simple multivariate Gaussian distribution
with zero covariance, which may limit the performance of representing the
latent features. To overcome this limitation, we propose a novel molecular deep
generative model that incorporates a hierarchical structure into the
probabilistic latent vectors. We achieve this by a denoising diffusion
probabilistic model (DDPM). We demonstrate, through experiments on small
datasets of physical properties and activity, that our model designs effective
molecular latent vectors for property prediction. The results highlight the
superior prediction performance and robustness of our model compared to
existing approaches.
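As a rough illustration of the idea in the abstract (not the authors' implementation), the DDPM forward process underlying such a latent prior can be sketched in NumPy. The latent batch, the noise schedule values, and the zero-valued noise predictor below are all hypothetical placeholders standing in for a trained VAE encoder and denoising network:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-in for a batch of VAE-encoded molecular latent vectors.
latent_dim = 8
z0 = rng.normal(size=(32, latent_dim))

# Linear beta schedule over T diffusion steps (a common DDPM choice).
T = 100
betas = np.linspace(1e-4, 0.02, T)
alphas = 1.0 - betas
alpha_bars = np.cumprod(alphas)  # cumulative products, decreasing in t

def q_sample(z0, t, eps):
    """Forward (noising) process: z_t = sqrt(abar_t) * z0 + sqrt(1 - abar_t) * eps."""
    return np.sqrt(alpha_bars[t]) * z0 + np.sqrt(1.0 - alpha_bars[t]) * eps

def ddpm_loss(eps_pred_fn, z0, t):
    """Simplified DDPM training objective: MSE between true and predicted noise."""
    eps = rng.normal(size=z0.shape)
    zt = q_sample(z0, t, eps)
    return np.mean((eps_pred_fn(zt, t) - eps) ** 2)

# With a trivial predictor that outputs zeros, the loss is just E[eps^2] ~ 1.
loss = ddpm_loss(lambda zt, t: np.zeros_like(zt), z0, t=50)
```

In the paper's setting, the denoising network trained with this objective replaces the fixed diagonal-Gaussian prior of a standard VAE, giving the latent vectors a hierarchical (multi-step) structure.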
Related papers
- MING: A Functional Approach to Learning Molecular Generative Models [46.189683355768736]
This paper introduces a novel paradigm for learning molecule generative models based on functional representations.
We propose Molecular Implicit Neural Generation (MING), a diffusion-based model that learns molecular distributions in function space.
arXiv Detail & Related papers (2024-10-16T13:02:02Z)
- Molecule Design by Latent Prompt Transformer [76.2112075557233]
This work explores the challenging problem of molecule design by framing it as a conditional generative modeling task.
We propose a novel generative model comprising three components: (1) a latent vector with a learnable prior distribution; (2) a molecule generation model based on a causal Transformer, which uses the latent vector as a prompt; and (3) a property prediction model that predicts a molecule's target properties and/or constraint values using the latent prompt.
arXiv Detail & Related papers (2024-02-27T03:33:23Z)
- A novel molecule generative model of VAE combined with Transformer for unseen structure generation [0.0]
Transformer and VAE are widely used as powerful models, but they are rarely used in combination due to structural and performance mismatch.
This study proposes a model that combines these two models through structural and parameter optimization in handling diverse molecules.
The proposed model shows comparable performance to existing models in generating molecules, and showed by far superior performance in generating molecules with unseen structures.
arXiv Detail & Related papers (2024-02-19T08:46:04Z)
- Molecule Design by Latent Space Energy-Based Modeling and Gradual Distribution Shifting [53.44684898432997]
Generation of molecules with desired chemical and biological properties is critical for drug discovery.
We propose a probabilistic generative model to capture the joint distribution of molecules and their properties.
Our method achieves very strong performances on various molecule design tasks.
arXiv Detail & Related papers (2023-06-09T03:04:21Z)
- Towards Predicting Equilibrium Distributions for Molecular Systems with Deep Learning [60.02391969049972]
We introduce a novel deep learning framework, called Distributional Graphormer (DiG), in an attempt to predict the equilibrium distribution of molecular systems.
DiG employs deep neural networks to transform a simple distribution towards the equilibrium distribution, conditioned on a descriptor of a molecular system.
arXiv Detail & Related papers (2023-06-08T17:12:08Z)
- Attribute Graphs Underlying Molecular Generative Models: Path to Learning with Limited Data [42.517927809224275]
We provide an algorithm that relies on perturbation experiments on latent codes of a pre-trained generative autoencoder to uncover an attribute graph.
We show that one can fit an effective graphical model that models a structural equation model between latent codes.
Using a pre-trained generative autoencoder trained on a large dataset of small molecules, we demonstrate that the graphical model can be used to predict a specific property.
arXiv Detail & Related papers (2022-07-14T19:20:30Z)
- Learning Neural Generative Dynamics for Molecular Conformation Generation [89.03173504444415]
We study how to generate molecular conformations (i.e., 3D structures) from a molecular graph.
We propose a novel probabilistic framework to generate valid and diverse conformations given a molecular graph.
arXiv Detail & Related papers (2021-02-20T03:17:58Z)
- Goal-directed Generation of Discrete Structures with Conditional Generative Models [85.51463588099556]
We introduce a novel approach to directly optimize a reinforcement learning objective, maximizing an expected reward.
We test our methodology on two tasks: generating molecules with user-defined properties and identifying short Python expressions that evaluate to a given target value.
arXiv Detail & Related papers (2020-10-05T20:03:13Z)
- Physics-Constrained Predictive Molecular Latent Space Discovery with Graph Scattering Variational Autoencoder [0.0]
We develop a molecular generative model based on variational inference and graph theory in the small data regime.
The model's performance is evaluated by generating molecules with desired target properties.
arXiv Detail & Related papers (2020-09-29T09:05:27Z)
- Improving Molecular Design by Stochastic Iterative Target Augmentation [38.44457632751997]
Generative models in molecular design tend to be richly parameterized, data-hungry neural models.
We propose a surprisingly effective self-training approach for iteratively creating additional molecular targets.
Our approach outperforms the previous state-of-the-art in conditional molecular design by over 10% in absolute gain.
arXiv Detail & Related papers (2020-02-11T22:40:04Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.