Learning Harmonic Molecular Representations on Riemannian Manifold
- URL: http://arxiv.org/abs/2303.15520v1
- Date: Mon, 27 Mar 2023 18:02:47 GMT
- Title: Learning Harmonic Molecular Representations on Riemannian Manifold
- Authors: Yiqun Wang, Yuning Shen, Shi Chen, Lihao Wang, Fei Ye, Hao Zhou
- Abstract summary: Molecular representation learning plays a crucial role in AI-assisted drug discovery research.
We propose a Harmonic Molecular Representation learning framework, which represents a molecule using the Laplace-Beltrami eigenfunctions of its molecular surface.
- Score: 18.49126496517951
- License: http://creativecommons.org/licenses/by-nc-sa/4.0/
- Abstract: Molecular representation learning plays a crucial role in AI-assisted drug
discovery research. Encoding 3D molecular structures through Euclidean neural
networks has become the prevailing method in the geometric deep learning
community. However, the equivariance constraints and message passing in
Euclidean space may limit the network's expressive power. In this work, we
propose a Harmonic Molecular Representation learning (HMR) framework, which
represents a molecule using the Laplace-Beltrami eigenfunctions of its
molecular surface. HMR offers a multi-resolution representation of molecular
geometric and chemical features on a 2D Riemannian manifold. We also introduce a
harmonic message passing method to realize efficient spectral message passing
over the surface manifold for better molecular encoding. Our proposed method
shows comparable predictive power to current models in small molecule property
prediction, and outperforms the state-of-the-art deep learning models for
ligand-binding protein pocket classification and the rigid protein docking
challenge, demonstrating its versatility in molecular representation learning.
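The spectral construction at the heart of HMR can be illustrated with a toy discretization. The sketch below is illustrative only, not the paper's implementation: it uses a cycle graph (a discretized circle) standing in for a triangulated molecular surface, where a cotangent-weighted Laplacian would normally be used, and shows how projecting a surface feature onto the leading Laplacian eigenfunctions (the "harmonics") yields a smooth, multi-resolution representation.

```python
import numpy as np

# Discrete Laplacian of a cycle graph with n nodes, a stand-in for the
# Laplace-Beltrami operator on a closed surface mesh.
n = 64
L = 2 * np.eye(n) - np.roll(np.eye(n), 1, axis=1) - np.roll(np.eye(n), -1, axis=1)

# Eigenfunctions of the Laplacian are the discrete harmonics; eigh
# returns them sorted by ascending eigenvalue (low frequency first).
eigvals, eigvecs = np.linalg.eigh(L)

# A hypothetical surface feature (e.g., electrostatic potential sampled
# at mesh vertices): a smooth component plus a high-frequency component.
theta = 2 * np.pi * np.arange(n) / n
f = np.sin(theta) + 0.3 * np.sin(5 * theta)

# Multi-resolution representation: keep only the first k harmonics.
k = 8
coeffs = eigvecs[:, :k].T @ f        # spectral coefficients
f_lowres = eigvecs[:, :k] @ coeffs   # smooth low-pass reconstruction

# Truncating the basis filters out the high-frequency detail while
# preserving the coarse geometry of the signal.
err = np.linalg.norm(f - f_lowres) / np.linalg.norm(f)
```

Raising k recovers progressively finer surface detail, which is the sense in which a truncated harmonic basis gives a resolution knob over geometric and chemical features.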
Related papers
- FARM: Functional Group-Aware Representations for Small Molecules [55.281754551202326]
We introduce Functional Group-Aware Representations for Small Molecules (FARM).
FARM is a foundation model designed to bridge the gap between SMILES, natural language, and molecular graphs.
We rigorously evaluate FARM on the MoleculeNet dataset, where it achieves state-of-the-art performance on 10 out of 12 tasks.
arXiv Detail & Related papers (2024-10-02T23:04:58Z)
- Pre-training of Molecular GNNs via Conditional Boltzmann Generator [0.0]
We propose a pre-training method for molecular GNNs using an existing dataset of molecular conformations.
We show that our model has a better prediction performance for molecular properties than existing pre-training methods.
arXiv Detail & Related papers (2023-12-20T15:30:15Z)
- Towards Predicting Equilibrium Distributions for Molecular Systems with Deep Learning [60.02391969049972]
We introduce a novel deep learning framework, called Distributional Graphormer (DiG), in an attempt to predict the equilibrium distribution of molecular systems.
DiG employs deep neural networks to transform a simple distribution towards the equilibrium distribution, conditioned on a descriptor of a molecular system.
arXiv Detail & Related papers (2023-06-08T17:12:08Z)
- MUDiff: Unified Diffusion for Complete Molecule Generation [104.7021929437504]
We present a new model for generating a comprehensive representation of molecules, including atom features, 2D discrete molecule structures, and 3D continuous molecule coordinates.
We propose a novel graph transformer architecture to denoise the diffusion process.
Our model is a promising approach for designing stable and diverse molecules and can be applied to a wide range of tasks in molecular modeling.
arXiv Detail & Related papers (2023-04-28T04:25:57Z)
- Implicit Geometry and Interaction Embeddings Improve Few-Shot Molecular Property Prediction [53.06671763877109]
We develop molecular embeddings that encode complex molecular characteristics to improve the performance of few-shot molecular property prediction.
Our approach leverages large amounts of synthetic data, namely the results of molecular docking calculations.
On multiple molecular property prediction benchmarks, training from the embedding space substantially improves Multi-Task, MAML, and Prototypical Network few-shot learning performance.
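Of the few-shot methods named above, Prototypical Networks are the simplest to sketch: each class is represented by the mean of its support-set embeddings, and queries are assigned to the nearest prototype. The snippet below is a minimal, hypothetical illustration with random vectors standing in for learned molecular embeddings; it is not the paper's code.

```python
import numpy as np

rng = np.random.default_rng(0)
dim, shots = 16, 5

# Two hypothetical property classes with well-separated embedding
# clusters (support sets of 5 "molecules" each).
support_a = rng.normal(loc=0.0, size=(shots, dim))
support_b = rng.normal(loc=3.0, size=(shots, dim))

# A class prototype is the mean of its support embeddings.
prototypes = np.stack([support_a.mean(axis=0), support_b.mean(axis=0)])

def classify(query):
    # Assign the query to the nearest prototype in embedding space.
    dists = np.linalg.norm(prototypes - query, axis=1)
    return int(np.argmin(dists))

query = rng.normal(loc=3.0, size=dim)  # drawn from class b's cluster
pred = classify(query)
```

The quality of the embedding space is what the paper's docking-derived pretraining improves; the nearest-prototype rule itself stays unchanged.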
arXiv Detail & Related papers (2023-02-04T01:32:40Z)
- Improving Molecular Pretraining with Complementary Featurizations [20.86159731100242]
Molecular pretraining is a paradigm to solve a variety of tasks in computational chemistry and drug discovery.
We show that different featurization techniques convey chemical information differently.
We propose a simple and effective MOlecular pretraining framework with COmplementary featurizations (MOCO).
arXiv Detail & Related papers (2022-09-29T21:11:09Z)
- A Molecular Multimodal Foundation Model Associating Molecule Graphs with Natural Language [63.60376252491507]
We propose a molecular multimodal foundation model which is pretrained from molecular graphs and their semantically related textual data.
We believe that our model would have a broad impact on AI-empowered fields across disciplines such as biology, chemistry, materials, environment, and medicine.
arXiv Detail & Related papers (2022-09-12T00:56:57Z)
- Graph-based Molecular Representation Learning [59.06193431883431]
Molecular representation learning (MRL) is a key step to build the connection between machine learning and chemical science.
Recently, MRL has achieved considerable progress, especially in methods based on deep molecular graph learning.
arXiv Detail & Related papers (2022-07-08T17:43:20Z)
- ChemRL-GEM: Geometry Enhanced Molecular Representation Learning for Property Prediction [25.49976851499949]
We propose a novel Geometry Enhanced Molecular representation learning method (GEM) for Chemical Representation Learning (ChemRL).
At first, we design a geometry-based GNN architecture that simultaneously models atoms, bonds, and bond angles in a molecule.
On top of the devised GNN architecture, we propose several novel geometry-level self-supervised learning strategies to learn spatial knowledge.
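The bond-angle feature that GEM models alongside atoms and bonds is a simple geometric quantity. As a hedged illustration (not GEM's code), the angle at a central atom follows from the normalized dot product of the two bond vectors; the water-like coordinates below are hypothetical:

```python
import numpy as np

def bond_angle(a, b, c):
    # Angle a-b-c in radians, with b as the central atom.
    u, v = a - b, c - b
    cos_t = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
    return np.arccos(np.clip(cos_t, -1.0, 1.0))  # clip guards rounding

# Water-like geometry: O at the origin, two H atoms ~0.96 A away.
o = np.zeros(3)
h1 = np.array([0.96, 0.0, 0.0])
h2 = np.array([-0.24, 0.93, 0.0])
angle_deg = np.degrees(bond_angle(h1, o, h2))
```

Angles (and, in some models, dihedrals) add 3D information that atom and bond features alone cannot express, which is the motivation for geometry-level self-supervision.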
arXiv Detail & Related papers (2021-06-11T02:35:53Z)
- Molecular CT: Unifying Geometry and Representation Learning for Molecules at Different Scales [3.987395340580183]
A new deep neural network architecture, the Molecular Configuration Transformer (Molecular CT), is introduced for this purpose.
The computational efficiency and universality make Molecular CT versatile for a variety of molecular learning scenarios.
As examples, we show that Molecular CT enables representational learning for molecular systems at different scales, and achieves comparable or improved results on common benchmarks.
arXiv Detail & Related papers (2020-12-22T03:41:16Z)
- Learning a Continuous Representation of 3D Molecular Structures with Deep Generative Models [0.0]
Generative models take an entirely different approach, learning to represent and optimize molecules in a continuous latent space.
We describe deep generative models of three dimensional molecular structures using atomic density grids.
We are also able to sample diverse sets of molecules based on a given input compound to increase the probability of creating valid, drug-like molecules.
arXiv Detail & Related papers (2020-10-17T01:15:47Z)
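The atomic density grids used by such grid-based generative models can be sketched simply: each atom contributes an isotropic Gaussian summed over a voxel grid. The snippet below is a minimal illustration under assumed parameters (grid size, extent, and width are arbitrary choices, not the paper's):

```python
import numpy as np

def density_grid(coords, size=16, extent=4.0, sigma=0.5):
    # Rasterize atoms onto a cubic voxel grid by summing one isotropic
    # Gaussian per atom (all atoms treated identically here; real models
    # typically use one channel per atom type).
    axis = np.linspace(-extent, extent, size)
    x, y, z = np.meshgrid(axis, axis, axis, indexing="ij")
    grid = np.zeros((size, size, size))
    for cx, cy, cz in coords:
        r2 = (x - cx) ** 2 + (y - cy) ** 2 + (z - cz) ** 2
        grid += np.exp(-r2 / (2 * sigma ** 2))
    return grid

# Two hypothetical atoms 1.5 units apart along the x-axis.
atoms = np.array([[0.0, 0.0, 0.0], [1.5, 0.0, 0.0]])
g = density_grid(atoms)
```

Because the grid is a fixed-size continuous tensor, standard convolutional encoders and decoders apply directly, which is what makes this representation convenient for generative modeling.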
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.