GPS++: Reviving the Art of Message Passing for Molecular Property
Prediction
- URL: http://arxiv.org/abs/2302.02947v2
- Date: Fri, 12 May 2023 15:52:18 GMT
- Title: GPS++: Reviving the Art of Message Passing for Molecular Property
Prediction
- Authors: Dominic Masters, Josef Dean, Kerstin Klaser, Zhiyi Li, Sam
Maddrell-Mander, Adam Sanders, Hatem Helal, Deniz Beker, Andrew Fitzgibbon,
Shenyang Huang, Ladislav Rampášek, Dominique Beaini
- Abstract summary: GPS++ is a hybrid Message Passing Neural Network / Graph Transformer model for molecular property prediction.
Our approach is significantly more accurate than prior art when 3D positional information is not available.
- Score: 2.4476539922912632
- License: http://creativecommons.org/licenses/by-sa/4.0/
- Abstract: We present GPS++, a hybrid Message Passing Neural Network / Graph Transformer
model for molecular property prediction. Our model integrates a well-tuned
local message passing component and biased global attention with other key
ideas from prior literature to achieve state-of-the-art results on the
large-scale molecular dataset PCQM4Mv2. Through a thorough ablation study we
highlight the
impact of individual components and find that nearly all of the model's
performance can be maintained without any use of global self-attention, showing
that message passing is still a competitive approach for 3D molecular property
prediction despite the recent dominance of graph transformers. We also find
that our approach is significantly more accurate than prior art when 3D
positional information is not available.
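The abstract describes a hybrid block that sums a well-tuned local message passing update with a biased global self-attention update. The following is a minimal numpy sketch of that general GPS-style pattern, not the authors' implementation: the weight matrices, the simple sum-aggregation, and the optional `attn_bias` term (e.g. derived from graph distances) are all illustrative assumptions.

```python
import numpy as np

def softmax(z, axis=-1):
    z = z - z.max(axis=axis, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

def gps_style_layer(x, adj, w_local, w_q, w_k, w_v, attn_bias=None):
    """One hybrid block in the spirit of GPS/GPS++: a local message-passing
    branch summed with a biased global self-attention branch.
    Shapes (assumed for this sketch): x (n, d); adj (n, n) binary adjacency;
    weight matrices (d, d); attn_bias (n, n) structural bias, if given."""
    # Local message passing: aggregate neighbour features, then project.
    local = (adj @ x) @ w_local                      # (n, d)
    # Global self-attention over all node pairs.
    d = x.shape[1]
    scores = (x @ w_q) @ (x @ w_k).T / np.sqrt(d)    # (n, n)
    if attn_bias is not None:
        scores = scores + attn_bias                  # graph-derived bias term
    attn = softmax(scores, axis=-1) @ (x @ w_v)      # (n, d)
    # Combine both branches with a residual connection.
    return x + local + attn

# Toy 4-node path graph.
rng = np.random.default_rng(0)
n, d = 4, 8
x = rng.normal(size=(n, d))
adj = np.zeros((n, n))
for i in range(n - 1):
    adj[i, i + 1] = adj[i + 1, i] = 1.0
ws = [rng.normal(scale=0.1, size=(d, d)) for _ in range(4)]
out = gps_style_layer(x, adj, *ws)
print(out.shape)  # (4, 8)
```

The ablation finding quoted above corresponds to dropping the `attn` branch entirely and keeping only the local branch.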
Related papers
- Graph Residual based Method for Molecular Property Prediction [0.7499722271664147]
This manuscript gives a detailed description of ECRGNN, a novel GRU-based methodology for mapping the inputs used.
A detailed description of the Variational Autoencoder (VAE) and the end-to-end learning method used for multi-class multi-label property prediction has been provided as well.
arXiv Detail & Related papers (2024-07-27T09:01:36Z) - On the importance of catalyst-adsorbate 3D interactions for relaxed
energy predictions [98.70797778496366]
We investigate whether it is possible to predict a system's relaxed energy in the OC20 dataset while ignoring the relative position of the adsorbate.
We find that, while removing binding-site information impairs accuracy as expected, the modified models can still predict relaxed energies with surprisingly low MAE.
arXiv Detail & Related papers (2023-10-10T14:57:04Z) - Geometry-aware Line Graph Transformer Pre-training for Molecular
Property Prediction [4.598522704308923]
Geometry-aware line graph transformer (Galformer) pre-training is a novel self-supervised learning framework.
Galformer consistently outperforms all baselines on both classification and regression tasks.
arXiv Detail & Related papers (2023-09-01T14:20:48Z) - Dynamic Molecular Graph-based Implementation for Biophysical Properties
Prediction [9.112532782451233]
We propose a novel approach based on the transformer model utilizing GNNs for characterizing dynamic features of protein-ligand interactions.
Our message passing transformer pre-trains on molecular dynamics data from physics-based simulations to learn coordinate construction and to make binding probability and affinity predictions.
arXiv Detail & Related papers (2022-12-20T04:21:19Z) - Transforming Model Prediction for Tracking [109.08417327309937]
Transformers capture global relations with little inductive bias, allowing them to learn the prediction of more powerful target models.
We train the proposed tracker end-to-end and validate its performance by conducting comprehensive experiments on multiple tracking datasets.
Our tracker sets a new state of the art on three benchmarks, achieving an AUC of 68.5% on the challenging LaSOT dataset.
arXiv Detail & Related papers (2022-03-21T17:59:40Z) - 3D Infomax improves GNNs for Molecular Property Prediction [1.9703625025720701]
We propose pre-training a model to reason about the geometry of molecules given only their 2D molecular graphs.
We show that 3D pre-training provides significant improvements for a wide range of properties.
arXiv Detail & Related papers (2021-10-08T13:30:49Z) - Learning Attributed Graph Representations with Communicative Message
Passing Transformer [3.812358821429274]
We propose a Communicative Message Passing Transformer (CoMPT) neural network to improve the molecular graph representation.
Unlike the previous transformer-style GNNs that treat molecules as fully connected graphs, we introduce a message diffusion mechanism to leverage the graph connectivity inductive bias.
arXiv Detail & Related papers (2021-07-19T11:58:32Z) - GeoMol: Torsional Geometric Generation of Molecular 3D Conformer
Ensembles [60.12186997181117]
Prediction of a molecule's 3D conformer ensemble from the molecular graph holds a key role in areas of cheminformatics and drug discovery.
Existing generative models have several drawbacks including lack of modeling important molecular geometry elements.
We propose GeoMol, an end-to-end, non-autoregressive and SE(3)-invariant machine learning approach to generate 3D conformers.
arXiv Detail & Related papers (2021-06-08T14:17:59Z) - TRiPOD: Human Trajectory and Pose Dynamics Forecasting in the Wild [77.59069361196404]
TRiPOD is a novel method for predicting body dynamics based on graph attentional networks.
To incorporate a real-world challenge, we learn an indicator representing whether an estimated body joint is visible/invisible at each frame.
Our evaluation shows that TRiPOD outperforms all prior work and state-of-the-art specifically designed for each of the trajectory and pose forecasting tasks.
arXiv Detail & Related papers (2021-04-08T20:01:00Z) - Self-Supervised Graph Transformer on Large-Scale Molecular Data [73.3448373618865]
We propose a novel framework, GROVER, for molecular representation learning.
GROVER can learn rich structural and semantic information of molecules from enormous unlabelled molecular data.
We pre-train GROVER with 100 million parameters on 10 million unlabelled molecules -- the biggest GNN and the largest training dataset in molecular representation learning.
arXiv Detail & Related papers (2020-06-18T08:37:04Z) - Multi-View Graph Neural Networks for Molecular Property Prediction [67.54644592806876]
We present Multi-View Graph Neural Network (MV-GNN), a multi-view message passing architecture.
In MV-GNN, we introduce a shared self-attentive readout component and disagreement loss to stabilize the training process.
We further boost the expressive power of MV-GNN by proposing a cross-dependent message passing scheme.
arXiv Detail & Related papers (2020-05-17T04:46:07Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.