Equivariant Masked Position Prediction for Efficient Molecular Representation
- URL: http://arxiv.org/abs/2502.08209v1
- Date: Wed, 12 Feb 2025 08:39:26 GMT
- Title: Equivariant Masked Position Prediction for Efficient Molecular Representation
- Authors: Junyi An, Chao Qu, Yun-Fei Shi, XinHao Liu, Qianwei Tang, Fenglei Cao, Yuan Qi
- Abstract summary: Graph neural networks (GNNs) have shown considerable promise in computational chemistry.
We introduce a novel self-supervised approach termed Equivariant Masked Position Prediction (EMPP).
EMPP formulates a nuanced position prediction task that is better defined and enhances the learning of quantum mechanical features.
- Score: 6.761418610103767
- License:
- Abstract: Graph neural networks (GNNs) have shown considerable promise in computational chemistry. However, the limited availability of molecular data raises concerns regarding GNNs' ability to effectively capture the fundamental principles of physics and chemistry, which constrains their generalization capabilities. To address this challenge, we introduce a novel self-supervised approach termed Equivariant Masked Position Prediction (EMPP), grounded in intramolecular potential and force theory. Unlike conventional attribute masking techniques, EMPP formulates a nuanced position prediction task that is better defined and enhances the learning of quantum mechanical features. EMPP also bypasses the approximation of the Gaussian mixture distribution commonly used in denoising methods, allowing for more accurate acquisition of physical properties. Experimental results indicate that EMPP significantly enhances the performance of advanced molecular architectures, surpassing state-of-the-art self-supervised approaches. Our code is released at https://github.com/ajy112/EMPP.
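For intuition, the core pre-training objective can be pictured as hiding one atom's 3D coordinates and regressing them from the rest of the molecule. The sketch below is a minimal, hypothetical PyTorch rendering of that idea; the encoder interface, class names, and the centroid anchor are illustrative assumptions and do not reproduce EMPP's actual equivariant formulation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class MaskedPositionObjective(nn.Module):
    """Wraps a 3D molecular encoder with a masked-position regression head."""

    def __init__(self, encoder: nn.Module, hidden_dim: int):
        super().__init__()
        self.encoder = encoder                # any 3D GNN returning per-atom features (assumed interface)
        self.head = nn.Linear(hidden_dim, 3)  # predicts an offset for the masked atom

    def forward(self, atom_types, positions, mask_idx):
        # Hide the masked atom by dropping it from the input graph.
        keep = torch.ones(positions.size(0), dtype=torch.bool)
        keep[mask_idx] = False
        feats = self.encoder(atom_types[keep], positions[keep])  # (N-1, hidden_dim)
        # Predict the masked position relative to an anchor built from the
        # remaining atoms (here simply their centroid, for illustration).
        anchor = positions[keep].mean(dim=0)
        return anchor + self.head(feats.mean(dim=0))

def pretrain_step(objective, optimizer, atom_types, positions):
    """One self-supervised step: mask a random atom and regress its true position."""
    mask_idx = int(torch.randint(0, positions.size(0), (1,)))
    pred = objective(atom_types, positions, mask_idx)
    loss = F.mse_loss(pred, positions[mask_idx])
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```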
Related papers
- Equivariant Graph Network Approximations of High-Degree Polynomials for Force Field Prediction [62.05532524197309]
Equivariant deep models have shown promise in accurately predicting atomic potentials and force fields in molecular dynamics simulations.
In this work, we analyze equivariant functions for equivariant architectures and introduce a novel equivariant network named PACE.
In experiments on commonly used benchmarks, PACE demonstrates state-of-the-art performance in predicting atomic energies and force fields.
arXiv Detail & Related papers (2024-11-06T19:34:40Z) - Contrastive Dual-Interaction Graph Neural Network for Molecular Property Prediction [0.0]
We introduce DIG-Mol, a novel self-supervised graph neural network framework for molecular property prediction.
DIG-Mol integrates a momentum distillation network with two interconnected networks to efficiently improve molecular characterization.
We have established DIG-Mol's state-of-the-art performance through extensive experimental evaluation in a variety of molecular property prediction tasks.
arXiv Detail & Related papers (2024-05-04T10:09:27Z) - Universal neural network potentials as descriptors: Towards scalable chemical property prediction using quantum and classical computers [0.0]
We present a versatile approach that uses the intermediate information of a universal neural network potential as a general-purpose descriptor for chemical property prediction.
We show that transfer learning with graph neural network potentials such as M3GNet and MACE achieves accuracy comparable to state-of-the-art methods for predicting NMR chemical shifts.
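A rough sketch of that descriptor reuse, under assumed interfaces (the backbone call signature, feature dimension, and per-atom readout are placeholders rather than the actual M3GNet/MACE APIs), might look like:

```python
import torch.nn as nn

class DescriptorReadout(nn.Module):
    """Freeze a pre-trained neural-network potential and read a downstream
    per-atom property (e.g. an NMR chemical shift) from its features."""

    def __init__(self, pretrained_potential: nn.Module, feat_dim: int):
        super().__init__()
        self.backbone = pretrained_potential    # e.g. an M3GNet/MACE-style encoder (assumed interface)
        for p in self.backbone.parameters():
            p.requires_grad = False             # intermediate features act as fixed descriptors
        self.readout = nn.Linear(feat_dim, 1)   # lightweight transfer-learning head

    def forward(self, atom_types, positions):
        feats = self.backbone(atom_types, positions)  # (n_atoms, feat_dim) intermediate features
        return self.readout(feats).squeeze(-1)        # one predicted value per atom
```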
arXiv Detail & Related papers (2024-02-28T15:57:22Z) - Towards Predicting Equilibrium Distributions for Molecular Systems with Deep Learning [60.02391969049972]
We introduce a novel deep learning framework, called Distributional Graphormer (DiG), in an attempt to predict the equilibrium distribution of molecular systems.
DiG employs deep neural networks to transform a simple distribution towards the equilibrium distribution, conditioned on a descriptor of a molecular system.
arXiv Detail & Related papers (2023-06-08T17:12:08Z) - Atomic and Subgraph-aware Bilateral Aggregation for Molecular Representation Learning [57.670845619155195]
We introduce a new model for molecular representation learning called the Atomic and Subgraph-aware Bilateral Aggregation (ASBA).
ASBA addresses the limitations of previous atom-wise and subgraph-wise models by incorporating both types of information.
Our method offers a more comprehensive way to learn representations for molecular property prediction and has broad potential in drug and material discovery applications.
arXiv Detail & Related papers (2023-05-22T00:56:00Z) - Graph neural networks for the prediction of molecular structure-property relationships [59.11160990637615]
Graph neural networks (GNNs) are a novel machine learning method that works directly on the molecular graph.
GNNs allow properties to be learned in an end-to-end fashion, thereby avoiding the need for informative descriptors.
We describe the fundamentals of GNNs and demonstrate the application of GNNs via two examples for molecular property prediction.
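As a minimal illustration of this end-to-end setup, the hypothetical sketch below runs a few rounds of message passing over a molecular graph and pools atom features into a single property prediction; the layer sizes and input format are illustrative assumptions, not the paper's examples.

```python
import torch
import torch.nn as nn

class TinyMolGNN(nn.Module):
    """Minimal message-passing network: atom embeddings are updated from bonded
    neighbours, then pooled into one molecule-level property."""

    def __init__(self, num_atom_types=100, hidden=64, num_layers=3):
        super().__init__()
        self.embed = nn.Embedding(num_atom_types, hidden)
        self.msg = nn.ModuleList(
            nn.Sequential(nn.Linear(2 * hidden, hidden), nn.SiLU())
            for _ in range(num_layers)
        )
        self.readout = nn.Linear(hidden, 1)

    def forward(self, atom_types, edge_index):
        # atom_types: (N,) atomic numbers; edge_index: (2, E) bonded atom pairs
        h = self.embed(atom_types)
        src, dst = edge_index
        for layer in self.msg:
            messages = layer(torch.cat([h[src], h[dst]], dim=-1))          # per-edge messages
            aggregated = torch.zeros_like(h).index_add_(0, dst, messages)  # sum messages per atom
            h = h + aggregated                                             # residual node update
        return self.readout(h.mean(dim=0))                                 # molecule-level prediction
```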
arXiv Detail & Related papers (2022-07-25T11:30:44Z) - Pre-training via Denoising for Molecular Property Prediction [53.409242538744444]
We describe a pre-training technique that utilizes large datasets of 3D molecular structures at equilibrium.
Inspired by recent advances in noise regularization, our pre-training objective is based on denoising.
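The general shape of such a coordinate-denoising objective, with an assumed model interface and an illustrative noise scale (not the paper's exact recipe), is roughly:

```python
import torch
import torch.nn.functional as F

def denoising_pretrain_loss(model, atom_types, positions, sigma=0.04):
    """Perturb equilibrium coordinates with Gaussian noise and train the model
    to predict that noise (an illustrative sketch of denoising pre-training)."""
    noise = sigma * torch.randn_like(positions)
    noisy_positions = positions + noise
    predicted_noise = model(atom_types, noisy_positions)  # (n_atoms, 3); model interface assumed
    return F.mse_loss(predicted_noise, noise)
```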
arXiv Detail & Related papers (2022-05-31T22:28:34Z) - TorchMD-NET: Equivariant Transformers for Neural Network based Molecular Potentials [2.538209532048867]
We propose TorchMD-NET, a novel equivariant transformer (ET) architecture that outperforms the state of the art on MD17, ANI-1, and many QM9 targets in both accuracy and computational efficiency.
arXiv Detail & Related papers (2022-02-05T12:53:40Z) - Gaussian Moments as Physically Inspired Molecular Descriptors for Accurate and Scalable Machine Learning Potentials [0.0]
We propose a machine learning method for constructing high-dimensional potential energy surfaces based on feed-forward neural networks.
The accuracy of the developed approach in representing both chemical and configurational spaces is comparable to that of several established machine learning models.
arXiv Detail & Related papers (2021-09-15T16:46:46Z) - MEG: Generating Molecular Counterfactual Explanations for Deep Graph Networks [11.291571222801027]
We present a novel approach to tackling the explainability of deep graph networks in the context of molecular property prediction tasks.
We generate informative counterfactual explanations for a specific prediction in the form of (valid) compounds with high structural similarity and different predicted properties.
We discuss results showing how the model can provide non-ML experts with key insights into the learning model's focus in the neighbourhood of a molecule.
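A generic sketch of that selection criterion (not MEG's actual generation procedure) would rank candidate compounds by how far their predicted property deviates from the query's while staying above a structural-similarity threshold; `model`, `similarity`, and the threshold value are placeholder assumptions.

```python
def select_counterfactuals(model, query, candidates, similarity, min_sim=0.6, k=5):
    """Keep candidates structurally similar to `query` but with the most
    different predicted property (a generic sketch, not MEG's generator)."""
    base = model(query)
    scored = [
        (cand, similarity(query, cand), abs(model(cand) - base))
        for cand in candidates
    ]
    similar = [s for s in scored if s[1] >= min_sim]
    return sorted(similar, key=lambda s: s[2], reverse=True)[:k]
```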
arXiv Detail & Related papers (2021-04-16T12:17:19Z) - Benchmarking adaptive variational quantum eigensolvers [63.277656713454284]
We benchmark the accuracy of VQE and ADAPT-VQE to calculate the electronic ground states and potential energy curves.
We find both methods provide good estimates of the energy and ground state.
Gradient-based optimization is more economical and delivers superior performance compared with analogous simulations carried out with gradient-free optimizers.
arXiv Detail & Related papers (2020-11-02T19:52:04Z)