Swallowing the Bitter Pill: Simplified Scalable Conformer Generation
- URL: http://arxiv.org/abs/2311.17932v3
- Date: Fri, 10 May 2024 04:32:48 GMT
- Title: Swallowing the Bitter Pill: Simplified Scalable Conformer Generation
- Authors: Yuyang Wang, Ahmed A. Elhag, Navdeep Jaitly, Joshua M. Susskind, Miguel Angel Bautista
- Abstract summary: We present a novel way to predict molecular conformers through a simple formulation that sidesteps many of the heuristics of prior works and achieves state of the art results by using the advantages of scale.
We are able to radically simplify structure learning, and make it trivial to scale up the model sizes.
This model, called Molecular Conformer Fields (MCF), works by parameterizing conformer structures as functions that map elements from a molecular graph directly to their 3D location in space.
- Score: 12.341835649897886
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: We present a novel way to predict molecular conformers through a simple formulation that sidesteps many of the heuristics of prior works and achieves state of the art results by using the advantages of scale. By training a diffusion generative model directly on 3D atomic positions without making assumptions about the explicit structure of molecules (e.g. modeling torsional angles) we are able to radically simplify structure learning, and make it trivial to scale up the model sizes. This model, called Molecular Conformer Fields (MCF), works by parameterizing conformer structures as functions that map elements from a molecular graph directly to their 3D location in space. This formulation allows us to boil down the essence of structure prediction to learning a distribution over functions. Experimental results show that scaling up the model capacity leads to large gains in generalization performance without enforcing inductive biases like rotational equivariance. MCF represents an advance in extending diffusion models to handle complex scientific problems in a conceptually simple, scalable and effective manner.
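To make the formulation concrete, the sketch below (a minimal illustration under stated assumptions, not the authors' implementation) shows the recipe the abstract describes: add Gaussian noise directly to per-atom 3D coordinates and train a plain, non-equivariant Transformer to predict that noise, conditioned on features of the molecular graph. The `ConformerDenoiser` module, the DDPM-style noise schedule, and the atom-type-only conditioning are illustrative choices; a full model would presumably condition on richer molecular-graph structure than atom identities alone.

```python
# Minimal sketch: diffusion over raw 3D atom coordinates, conditioned on a
# molecular graph, with a plain (non-equivariant) Transformer as the denoiser.
# Hyperparameters, module names, and the toy featurization are illustrative only.
import torch
import torch.nn as nn

class ConformerDenoiser(nn.Module):
    """Predicts the noise added to per-atom 3D coordinates."""
    def __init__(self, num_atom_types=100, dim=128, heads=8, layers=4, timesteps=1000):
        super().__init__()
        self.atom_emb = nn.Embedding(num_atom_types, dim)    # node (atom) identity
        self.time_emb = nn.Embedding(timesteps, dim)          # diffusion step
        self.coord_in = nn.Linear(3, dim)                     # noisy xyz -> features
        enc_layer = nn.TransformerEncoderLayer(dim, heads, 4 * dim, batch_first=True)
        self.encoder = nn.TransformerEncoder(enc_layer, layers)
        self.coord_out = nn.Linear(dim, 3)                    # features -> predicted noise

    def forward(self, atom_types, noisy_xyz, t):
        # atom_types: (B, N) long, noisy_xyz: (B, N, 3), t: (B,) long
        h = self.atom_emb(atom_types) + self.coord_in(noisy_xyz)
        h = h + self.time_emb(t)[:, None, :]                  # broadcast timestep over atoms
        return self.coord_out(self.encoder(h))                # (B, N, 3) predicted noise

# Standard DDPM-style training step on coordinates (epsilon prediction).
T = 1000
betas = torch.linspace(1e-4, 2e-2, T)
alpha_bar = torch.cumprod(1.0 - betas, dim=0)

def training_step(model, atom_types, xyz):
    t = torch.randint(0, T, (xyz.shape[0],))
    eps = torch.randn_like(xyz)
    ab = alpha_bar[t][:, None, None]
    noisy = ab.sqrt() * xyz + (1 - ab).sqrt() * eps           # forward (noising) process
    return ((model(atom_types, noisy, t) - eps) ** 2).mean()  # simple denoising loss

if __name__ == "__main__":
    model = ConformerDenoiser()
    atoms = torch.randint(0, 100, (2, 12))                    # toy batch: 2 molecules, 12 atoms
    coords = torch.randn(2, 12, 3)                            # placeholder conformer coordinates
    loss = training_step(model, atoms, coords)
    loss.backward()
    print(float(loss))
```

Sampling conformers would then run the standard reverse diffusion process from Gaussian noise over the coordinates; that machinery is omitted here for brevity.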
Related papers
- Pre-trained Molecular Language Models with Random Functional Group Masking [54.900360309677794]
We propose a SMILES-based Molecular Language Model that randomly masks SMILES subsequences corresponding to specific molecular atoms.
This technique aims to compel the model to better infer molecular structures and properties, thus enhancing its predictive capabilities (a generic masking sketch appears after this list).
arXiv Detail & Related papers (2024-11-03T01:56:15Z) - MING: A Functional Approach to Learning Molecular Generative Models [46.189683355768736]
This paper introduces a novel paradigm for learning molecule generative models based on functional representations.
We propose Molecular Implicit Neural Generation (MING), a diffusion-based model that learns molecular distributions in function space.
arXiv Detail & Related papers (2024-10-16T13:02:02Z) - Molecule Design by Latent Prompt Transformer [76.2112075557233]
This work explores the challenging problem of molecule design by framing it as a conditional generative modeling task.
We propose a novel generative model comprising three components: (1) a latent vector with a learnable prior distribution; (2) a molecule generation model based on a causal Transformer, which uses the latent vector as a prompt; and (3) a property prediction model that predicts a molecule's target properties and/or constraint values using the latent prompt.
arXiv Detail & Related papers (2024-02-27T03:33:23Z) - Accelerating Inference in Molecular Diffusion Models with Latent Representations of Protein Structure [0.0]
Diffusion generative models operate directly on 3D molecular structures.
We present a novel GNN-based architecture for learning latent representations of molecular structure.
Our model achieves comparable performance to one with an all-atom protein representation while exhibiting a 3-fold reduction in inference time.
arXiv Detail & Related papers (2023-11-22T15:32:31Z) - Investigating the Behavior of Diffusion Models for Accelerating Electronic Structure Calculations [24.116064925926914]
The investigation is driven by the potential of diffusion models to significantly accelerate electronic structure calculations using machine learning.
We show that the model learns about the first-order structure of the potential energy surface, and then later learns about higher-order structure.
For structure relaxations, the model finds geometries with 10x lower energy than those produced by a classical force field for small organic molecules.
arXiv Detail & Related papers (2023-11-02T17:58:37Z) - Variational Autoencoding Molecular Graphs with Denoising Diffusion Probabilistic Model [0.0]
We propose a novel deep generative model that incorporates a hierarchical structure into the probabilistic latent vectors.
We demonstrate, through experiments on small datasets of physical properties and activity, that our model can design effective molecular latent vectors for molecular property prediction.
arXiv Detail & Related papers (2023-07-02T17:29:41Z) - Towards Predicting Equilibrium Distributions for Molecular Systems with Deep Learning [60.02391969049972]
We introduce a novel deep learning framework, called Distributional Graphormer (DiG), in an attempt to predict the equilibrium distribution of molecular systems.
DiG employs deep neural networks to transform a simple distribution towards the equilibrium distribution, conditioned on a descriptor of a molecular system.
arXiv Detail & Related papers (2023-06-08T17:12:08Z) - MUDiff: Unified Diffusion for Complete Molecule Generation [104.7021929437504]
We present a new model for generating a comprehensive representation of molecules, including atom features, 2D discrete molecule structures, and 3D continuous molecule coordinates.
We propose a novel graph transformer architecture to denoise the diffusion process.
Our model is a promising approach for designing stable and diverse molecules and can be applied to a wide range of tasks in molecular modeling.
arXiv Detail & Related papers (2023-04-28T04:25:57Z) - Modeling Molecular Structures with Intrinsic Diffusion Models [2.487445341407889]
This thesis proposes Intrinsic Diffusion Modeling.
It combines diffusion generative models with scientific knowledge about the flexibility of biological complexes.
We demonstrate the effectiveness of this approach on two fundamental tasks at the basis of computational chemistry and biology.
arXiv Detail & Related papers (2023-02-23T03:26:48Z) - GeoMol: Torsional Geometric Generation of Molecular 3D Conformer Ensembles [60.12186997181117]
Prediction of a molecule's 3D conformer ensemble from the molecular graph holds a key role in areas of cheminformatics and drug discovery.
Existing generative models have several drawbacks, including a failure to model important elements of molecular geometry.
We propose GeoMol, an end-to-end, non-autoregressive and SE(3)-invariant machine learning approach to generate 3D conformers.
arXiv Detail & Related papers (2021-06-08T14:17:59Z)
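As referenced in the random functional group masking entry above, the sketch below illustrates the general idea of randomly masking atom tokens in a SMILES string, in the spirit of masked-language-model pre-training. The regex tokenizer, the `[MASK]` symbol, and the 15% mask rate are illustrative assumptions, not that paper's actual procedure.

```python
# Generic sketch of masking randomly chosen atom tokens in a SMILES string.
# Tokenizer, mask symbol, and mask rate are illustrative assumptions only.
import random
import re

# Very small SMILES tokenizer: bracket atoms, two-letter organic-subset atoms,
# then single characters (atoms, bonds, ring digits, parentheses).
TOKEN_RE = re.compile(r"\[[^\]]+\]|Br|Cl|.")

def mask_smiles(smiles, mask_rate=0.15, seed=None):
    rng = random.Random(seed)
    tokens = TOKEN_RE.findall(smiles)
    # Indices of tokens that correspond to atoms (skip bonds, digits, parentheses).
    atom_idx = [i for i, tok in enumerate(tokens)
                if tok.startswith("[") or tok[0].isalpha()]
    n_mask = max(1, int(len(atom_idx) * mask_rate))
    for i in rng.sample(atom_idx, min(n_mask, len(atom_idx))):
        tokens[i] = "[MASK]"
    return "".join(tokens)

if __name__ == "__main__":
    print(mask_smiles("CC(=O)Oc1ccccc1C(=O)O", seed=0))  # aspirin with one atom masked
```

The masked string would then serve as model input, with the original atom tokens as reconstruction targets.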
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the content (including all information) and is not responsible for any consequences of its use.