Variational Autoencoding Molecular Graphs with Denoising Diffusion
Probabilistic Model
- URL: http://arxiv.org/abs/2307.00623v2
- Date: Tue, 22 Aug 2023 07:10:19 GMT
- Title: Variational Autoencoding Molecular Graphs with Denoising Diffusion
Probabilistic Model
- Authors: Daiki Koge, Naoaki Ono and Shigehiko Kanaya
- Abstract summary: We propose a novel deep generative model that incorporates a hierarchical structure into the probabilistic latent vectors.
We demonstrate, through experiments on small datasets of physical properties and biological activity, that our model can design effective molecular latent vectors for property prediction.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: In data-driven drug discovery, designing molecular descriptors is a very
important task. Deep generative models such as variational autoencoders (VAEs)
offer a potential solution by designing descriptors as probabilistic latent
vectors derived from molecular structures. These models can be trained on large
datasets that contain only molecular structures and then applied in transfer
learning. Nevertheless, the approximate posterior distribution of the latent
vectors of the usual VAE assumes a simple multivariate Gaussian distribution
with zero covariance, which may limit the performance of representing the
latent features. To overcome this limitation, we propose a novel molecular deep
generative model that incorporates a hierarchical structure into the
probabilistic latent vectors. We achieve this with a denoising diffusion
probabilistic model (DDPM). We demonstrate, through experiments on small
datasets of physical properties and biological activity, that our model can
design effective molecular latent vectors for molecular property prediction. The results
highlight the superior prediction performance and robustness of our model
compared to existing approaches.
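The DDPM mentioned in the abstract builds its hierarchy of latent variables through a fixed forward noising process with a known closed form. The sketch below illustrates that generic forward process applied to a latent vector; the 64-dimensional latent, the linear beta schedule, and all variable names are illustrative assumptions, not details taken from the paper:

```python
import math
import random

def forward_diffuse(z0, t, betas, rng):
    """Sample z_t ~ q(z_t | z_0) = N(sqrt(abar_t) * z_0, (1 - abar_t) * I).

    This is the standard closed-form DDPM forward process: each step mixes
    in Gaussian noise, so after t steps the latent is a scaled copy of the
    original plus accumulated noise.
    """
    abar = 1.0
    for s in range(t + 1):
        abar *= 1.0 - betas[s]          # abar_t = prod_{s<=t} (1 - beta_s)
    scale = math.sqrt(abar)
    std = math.sqrt(1.0 - abar)
    return [scale * z + std * rng.gauss(0.0, 1.0) for z in z0]

rng = random.Random(0)
T = 1000
# A common linear schedule from the DDPM literature (an assumption here).
betas = [1e-4 + (0.02 - 1e-4) * i / (T - 1) for i in range(T)]
# A hypothetical 64-dimensional molecular latent vector.
z0 = [rng.gauss(0.0, 1.0) for _ in range(64)]

z_early = forward_diffuse(z0, 10, betas, rng)     # mostly signal
z_late = forward_diffuse(z0, T - 1, betas, rng)   # nearly pure noise
```

At small t the sample stays close to the original latent, while at t near T it approaches a standard Gaussian, which is what allows a learned reverse process to impose richer, hierarchically structured posteriors than a single zero-covariance Gaussian.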
Related papers
- A novel molecule generative model of VAE combined with Transformer for unseen structure generation [0.0]
Transformer and VAE are widely used as powerful models, but they are rarely used in combination due to structural and performance mismatch.
This study proposes a model that combines these two models through structural and parameter optimization in handling diverse molecules.
The proposed model shows performance comparable to existing models in generating molecules, and far superior performance in generating molecules with unseen structures.
arXiv Detail & Related papers (2024-02-19T08:46:04Z) - Molecule Design by Latent Space Energy-Based Modeling and Gradual
Distribution Shifting [53.44684898432997]
Generation of molecules with desired chemical and biological properties is critical for drug discovery.
We propose a probabilistic generative model to capture the joint distribution of molecules and their properties.
Our method achieves very strong performance on various molecule design tasks.
arXiv Detail & Related papers (2023-06-09T03:04:21Z) - Towards Predicting Equilibrium Distributions for Molecular Systems with
Deep Learning [60.02391969049972]
We introduce a novel deep learning framework, called Distributional Graphormer (DiG), in an attempt to predict the equilibrium distribution of molecular systems.
DiG employs deep neural networks to transform a simple distribution towards the equilibrium distribution, conditioned on a descriptor of a molecular system.
arXiv Detail & Related papers (2023-06-08T17:12:08Z) - ChiroDiff: Modelling chirographic data with Diffusion Models [132.5223191478268]
We introduce a powerful model-class namely "Denoising Diffusion Probabilistic Models" or DDPMs for chirographic data.
Our model, named "ChiroDiff", is non-autoregressive, learns to capture holistic concepts, and therefore remains resilient to higher temporal sampling rates.
arXiv Detail & Related papers (2023-04-07T15:17:48Z) - Learning inducing points and uncertainty on molecular data by scalable
variational Gaussian processes [0.0]
We show that variational learning of the inducing points in a molecular descriptor space improves the prediction of energies and atomic forces on two molecular dynamics datasets.
We extend our study to a large molecular crystal system, showing that variational GP models perform well for predicting atomic forces by efficiently learning a sparse representation of the dataset.
arXiv Detail & Related papers (2022-07-16T10:41:41Z) - Learning Neural Generative Dynamics for Molecular Conformation
Generation [89.03173504444415]
We study how to generate molecule conformations (i.e., 3D structures) from a molecular graph.
We propose a novel probabilistic framework to generate valid and diverse conformations given a molecular graph.
arXiv Detail & Related papers (2021-02-20T03:17:58Z) - Probabilistic Circuits for Variational Inference in Discrete Graphical
Models [101.28528515775842]
Inference in discrete graphical models with variational methods is difficult.
Many sampling-based methods have been proposed for estimating the Evidence Lower Bound (ELBO).
We propose a new approach that leverages the tractability of probabilistic circuit models, such as Sum-Product Networks (SPNs).
We show that selective SPNs are suitable as an expressive variational distribution, and prove that when the log-density of the target model is a polynomial, the corresponding ELBO can be computed analytically.
arXiv Detail & Related papers (2020-10-22T05:04:38Z) - Goal-directed Generation of Discrete Structures with Conditional
Generative Models [85.51463588099556]
We introduce a novel approach to directly optimize a reinforcement learning objective, maximizing an expected reward.
We test our methodology on two tasks: generating molecules with user-defined properties and identifying short Python expressions which evaluate to a given target value.
arXiv Detail & Related papers (2020-10-05T20:03:13Z) - Physics-Constrained Predictive Molecular Latent Space Discovery with
Graph Scattering Variational Autoencoder [0.0]
We develop a molecular generative model based on variational inference and graph theory in the small data regime.
The model's performance is evaluated by generating molecules with desired target properties.
arXiv Detail & Related papers (2020-09-29T09:05:27Z) - Improving Molecular Design by Stochastic Iterative Target Augmentation [38.44457632751997]
Generative models in molecular design tend to be richly parameterized, data-hungry neural models.
We propose a surprisingly effective self-training approach for iteratively creating additional molecular targets.
Our approach outperforms the previous state-of-the-art in conditional molecular design by over 10% in absolute gain.
arXiv Detail & Related papers (2020-02-11T22:40:04Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of this content (including all information) and is not responsible for any consequences of its use.