Swallowing the Bitter Pill: Simplified Scalable Conformer Generation
- URL: http://arxiv.org/abs/2311.17932v3
- Date: Fri, 10 May 2024 04:32:48 GMT
- Title: Swallowing the Bitter Pill: Simplified Scalable Conformer Generation
- Authors: Yuyang Wang, Ahmed A. Elhag, Navdeep Jaitly, Joshua M. Susskind, Miguel Angel Bautista
- Abstract summary: We present a novel way to predict molecular conformers through a simple formulation that sidesteps many of the heuristics of prior works and achieves state of the art results by using the advantages of scale.
We are able to radically simplify structure learning, and make it trivial to scale up the model sizes.
This model, called Molecular Conformer Fields (MCF), works by parameterizing conformer structures as functions that map elements from a molecular graph directly to their 3D location in space.
- Score: 12.341835649897886
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: We present a novel way to predict molecular conformers through a simple formulation that sidesteps many of the heuristics of prior works and achieves state of the art results by using the advantages of scale. By training a diffusion generative model directly on 3D atomic positions without making assumptions about the explicit structure of molecules (e.g. modeling torsional angles) we are able to radically simplify structure learning, and make it trivial to scale up the model sizes. This model, called Molecular Conformer Fields (MCF), works by parameterizing conformer structures as functions that map elements from a molecular graph directly to their 3D location in space. This formulation allows us to boil down the essence of structure prediction to learning a distribution over functions. Experimental results show that scaling up the model capacity leads to large gains in generalization performance without enforcing inductive biases like rotational equivariance. MCF represents an advance in extending diffusion models to handle complex scientific problems in a conceptually simple, scalable and effective manner.
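The abstract's central claim is that MCF trains a diffusion model directly on raw 3D atomic positions, with no torsion-angle parameterization and no built-in rotational equivariance. The paper's exact training procedure is not given here, but the idea can be sketched with a standard DDPM-style forward noising step applied to a conformer's coordinate array. This is a minimal illustrative sketch, not the authors' implementation; the `cosine_alpha_bar` schedule and the toy 5-atom conformer are assumptions for the example.

```python
import numpy as np

def cosine_alpha_bar(t, T):
    # Cumulative signal level alpha_bar(t) under a cosine noise schedule
    # (one common choice; the paper may use a different schedule).
    return np.cos(0.5 * np.pi * t / T) ** 2

def noise_conformer(x0, t, T, rng):
    # Forward diffusion q(x_t | x_0) applied directly to 3D atom positions:
    #   x_t = sqrt(alpha_bar) * x_0 + sqrt(1 - alpha_bar) * eps
    # No explicit molecular structure (e.g. torsional angles) is modeled;
    # the denoiser would be trained to predict eps from (x_t, graph, t).
    a_bar = cosine_alpha_bar(t, T)
    eps = rng.standard_normal(x0.shape)
    xt = np.sqrt(a_bar) * x0 + np.sqrt(1.0 - a_bar) * eps
    return xt, eps

rng = np.random.default_rng(0)
x0 = rng.standard_normal((5, 3))  # toy conformer: 5 atoms in 3D space
xt, eps = noise_conformer(x0, t=100, T=1000, rng=rng)
print(xt.shape)  # (5, 3): noised positions, same shape as the input
```

Because the state is just an (n_atoms, 3) array indexed by graph elements, the same code scales to any molecule size, which mirrors the paper's point that treating conformers as functions from graph elements to 3D locations makes scaling the model trivial.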
Related papers
- SubGDiff: A Subgraph Diffusion Model to Improve Molecular Representation Learning [14.338345772161102]
We propose a novel diffusion model termed SubGDiff for involving the molecular subgraph information in diffusion.
SubGDiff adopts three vital techniques: subgraph prediction, expectation state, and k-step same subgraph diffusion.
Experimentally, extensive downstream tasks demonstrate the superior performance of our approach.
arXiv Detail & Related papers (2024-05-09T10:37:33Z) - Accelerating Inference in Molecular Diffusion Models with Latent Representations of Protein Structure [0.0]
Diffusion generative models operate directly on 3D molecular structures.
We present a novel GNN-based architecture for learning latent representations of molecular structure.
Our model achieves comparable performance to one with an all-atom protein representation while exhibiting a 3-fold reduction in inference time.
arXiv Detail & Related papers (2023-11-22T15:32:31Z) - Investigating the Behavior of Diffusion Models for Accelerating Electronic Structure Calculations [24.116064925926914]
The investigation is driven by the potential of diffusion models to significantly accelerate electronic structure calculations using machine learning.
We show that the model learns about the first-order structure of the potential energy surface, and then later learns about higher-order structure.
For structure relaxations, the model finds geometries with 10x lower energy than those produced by a classical force field for small organic molecules.
arXiv Detail & Related papers (2023-11-02T17:58:37Z) - Variational Autoencoding Molecular Graphs with Denoising Diffusion Probabilistic Model [0.0]
We propose a novel deep generative model that incorporates a hierarchical structure into the probabilistic latent vectors.
Experiments on small datasets of physical properties and activity demonstrate that our model can design effective molecular latent vectors for molecular property prediction.
arXiv Detail & Related papers (2023-07-02T17:29:41Z) - Molecule Design by Latent Space Energy-Based Modeling and Gradual Distribution Shifting [53.44684898432997]
Generation of molecules with desired chemical and biological properties is critical for drug discovery.
We propose a probabilistic generative model to capture the joint distribution of molecules and their properties.
Our method achieves very strong performance on various molecule design tasks.
arXiv Detail & Related papers (2023-06-09T03:04:21Z) - Towards Predicting Equilibrium Distributions for Molecular Systems with Deep Learning [60.02391969049972]
We introduce a novel deep learning framework, called Distributional Graphormer (DiG), in an attempt to predict the equilibrium distribution of molecular systems.
DiG employs deep neural networks to transform a simple distribution towards the equilibrium distribution, conditioned on a descriptor of a molecular system.
arXiv Detail & Related papers (2023-06-08T17:12:08Z) - MUDiff: Unified Diffusion for Complete Molecule Generation [104.7021929437504]
We present a new model for generating a comprehensive representation of molecules, including atom features, 2D discrete molecule structures, and 3D continuous molecule coordinates.
We propose a novel graph transformer architecture to denoise the diffusion process.
Our model is a promising approach for designing stable and diverse molecules and can be applied to a wide range of tasks in molecular modeling.
arXiv Detail & Related papers (2023-04-28T04:25:57Z) - Modeling Molecular Structures with Intrinsic Diffusion Models [2.487445341407889]
This thesis proposes Intrinsic Diffusion Modeling.
It combines diffusion generative models with scientific knowledge about the flexibility of biological complexes.
We demonstrate the effectiveness of this approach on two fundamental tasks at the basis of computational chemistry and biology.
arXiv Detail & Related papers (2023-02-23T03:26:48Z) - MolCPT: Molecule Continuous Prompt Tuning to Generalize Molecular Representation Learning [77.31492888819935]
We propose a novel paradigm of "pre-train, prompt, fine-tune" for molecular representation learning, named molecule continuous prompt tuning (MolCPT).
MolCPT defines a motif prompting function that uses the pre-trained model to project the standalone input into an expressive prompt.
Experiments on several benchmark datasets show that MolCPT efficiently generalizes pre-trained GNNs for molecular property prediction.
arXiv Detail & Related papers (2022-12-20T19:32:30Z) - Equivariant Diffusion for Molecule Generation in 3D [74.289191525633]
This work introduces a diffusion model for molecule generation in 3D that is equivariant to Euclidean transformations.
Experimentally, the proposed method significantly outperforms previous 3D molecular generative methods regarding the quality of generated samples and efficiency at training time.
arXiv Detail & Related papers (2022-03-31T12:52:25Z) - GeoMol: Torsional Geometric Generation of Molecular 3D Conformer Ensembles [60.12186997181117]
Prediction of a molecule's 3D conformer ensemble from the molecular graph holds a key role in areas of cheminformatics and drug discovery.
Existing generative models have several drawbacks including lack of modeling important molecular geometry elements.
We propose GeoMol, an end-to-end, non-autoregressive and SE(3)-invariant machine learning approach to generate 3D conformers.
arXiv Detail & Related papers (2021-06-08T14:17:59Z)
This list is automatically generated from the titles and abstracts of the papers in this site.