Directional Message Passing on Molecular Graphs via Synthetic
Coordinates
- URL: http://arxiv.org/abs/2111.04718v1
- Date: Mon, 8 Nov 2021 18:53:58 GMT
- Title: Directional Message Passing on Molecular Graphs via Synthetic
Coordinates
- Authors: Johannes Klicpera, Chandan Yeshwanth, Stephan Günnemann
- Abstract summary: We propose synthetic coordinates that enable the use of advanced GNNs without requiring the true molecular configuration.
We show that with this transformation we can reduce the error of a normal graph neural network by 55% on the ZINC benchmark.
We furthermore set the state of the art on ZINC and coordinate-free QM9 by incorporating synthetic coordinates in the SMP and DimeNet++ models.
- Score: 7.314446024059812
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Graph neural networks that leverage coordinates via directional message
passing have recently set the state of the art on multiple molecular property
prediction tasks. However, they rely on atom position information that is often
unavailable, and obtaining it is usually prohibitively expensive or even
impossible. In this paper we propose synthetic coordinates that enable the use
of advanced GNNs without requiring the true molecular configuration. We propose
two distances as synthetic coordinates: Distance bounds that specify the rough
range of molecular configurations, and graph-based distances using a symmetric
variant of personalized PageRank. To leverage both distance and angular
information we propose a method of transforming normal graph neural networks
into directional MPNNs. We show that with this transformation we can reduce the
error of a normal graph neural network by 55% on the ZINC benchmark. We
furthermore set the state of the art on ZINC and coordinate-free QM9 by
incorporating synthetic coordinates in the SMP and DimeNet++ models. Our
implementation is available online.
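One of the two synthetic coordinates proposed above is a graph-based distance built from a symmetric variant of personalized PageRank (PPR). The exact symmetrization and distance transform used in the paper are not given in the abstract; the following is a minimal sketch of the idea, assuming power-iteration PPR, symmetrization by averaging, and an inverse-similarity distance, on a graph with no isolated nodes.

```python
# Hedged sketch: graph-based synthetic distances from a symmetric variant
# of personalized PageRank. Symmetrization and distance transform here are
# assumptions for illustration; the paper's exact choices may differ.

def ppr_vector(adj, source, alpha=0.15, iters=100):
    """Personalized PageRank scores for one source node via power iteration.

    adj: adjacency lists (every node must have at least one neighbor).
    """
    n = len(adj)
    pi = [0.0] * n
    pi[source] = 1.0
    for _ in range(iters):
        nxt = [0.0] * n
        for u, nbrs in enumerate(adj):
            share = (1.0 - alpha) * pi[u] / len(nbrs)
            for v in nbrs:
                nxt[v] += share
        nxt[source] += alpha  # teleport mass back to the source
        pi = nxt
    return pi

def symmetric_ppr_distance(adj, i, j, alpha=0.15):
    """Symmetrize by averaging pi_i(j) and pi_j(i), then convert the
    similarity to a distance (1/similarity is an assumption here)."""
    sim = 0.5 * (ppr_vector(adj, i, alpha)[j] + ppr_vector(adj, j, alpha)[i])
    return 1.0 / sim if sim > 0 else float("inf")

# Toy "molecular" graph: a 4-cycle given as adjacency lists.
adj = [[1, 3], [0, 2], [1, 3], [0, 2]]
d_adjacent = symmetric_ppr_distance(adj, 0, 1)
d_opposite = symmetric_ppr_distance(adj, 0, 2)
assert d_adjacent < d_opposite  # nearer nodes get a smaller synthetic distance
```

Once such pairwise distances exist, pairs of incident edges also yield angles via the law of cosines, which is what lets a distance-only transformation feed a directional MPNN.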
Related papers
- Scalable Graph Compressed Convolutions [68.85227170390864]
We propose a differentiable method that applies permutations to calibrate input graphs for Euclidean convolution.
Based on the graph calibration, we propose the Compressed Convolution Network (CoCN) for hierarchical graph representation learning.
arXiv Detail & Related papers (2024-07-26T03:14:13Z) - Spatio-Spectral Graph Neural Networks [50.277959544420455]
We propose Spatio-Spectral Graph Neural Networks (S$^2$GNNs), which combine spatially and spectrally parametrized graph filters.
We show that S$^2$GNNs vanquish over-squashing and yield strictly tighter approximation-theoretic error bounds than MPGNNs.
arXiv Detail & Related papers (2024-05-29T14:28:08Z) - Recurrent Distance Filtering for Graph Representation Learning [34.761926988427284]
Graph neural networks based on iterative one-hop message passing have been shown to struggle in harnessing the information from distant nodes effectively.
We propose a new architecture to reconcile these challenges.
Our model aggregates other nodes by their shortest distances to the target and uses a linear RNN to encode the sequence of hop representations.
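The aggregation scheme described in this entry — group nodes by shortest-path distance to the target, then encode the hop sequence with a linear RNN — can be sketched as follows. The scalar recurrence and pooling choice below are illustrative assumptions; the actual model uses learned, matrix-valued parameters.

```python
# Hedged sketch of hop-wise aggregation folded by a linear recurrence.
# Scalar features and a scalar RNN (h <- a*h + b*x) stand in for the
# learned high-dimensional version described in the paper.
from collections import deque

def hops_from(adj, target):
    """Group node ids by shortest-path distance from the target (BFS)."""
    dist = {target: 0}
    queue = deque([target])
    hops = [[target]]
    while queue:
        u = queue.popleft()
        for v in adj[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                if dist[v] == len(hops):
                    hops.append([])
                hops[dist[v]].append(v)
                queue.append(v)
    return hops

def encode(adj, feats, target, a=0.5, b=1.0):
    """Mean-pool features within each hop, then run the linear recurrence
    over hops in order of increasing distance."""
    h = 0.0
    for hop in hops_from(adj, target):
        x = sum(feats[v] for v in hop) / len(hop)
        h = a * h + b * x
    return h
```

Because the recurrence is linear, the whole hop sequence can also be processed in parallel with a prefix scan, which is one practical appeal of this design.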
arXiv Detail & Related papers (2023-12-03T23:36:16Z) - HYVE: Hybrid Vertex Encoder for Neural Distance Fields [9.40036617308303]
We present a neural-network architecture suitable for accurate encoding of 3D shapes in a single forward pass.
Our network is able to output valid signed distance fields without explicit prior knowledge of non-zero distance values or shape occupancy.
arXiv Detail & Related papers (2023-10-10T14:07:37Z) - Dynamic Graph Message Passing Networks for Visual Recognition [112.49513303433606]
Modelling long-range dependencies is critical for scene understanding tasks in computer vision.
A fully-connected graph is beneficial for such modelling, but its computational overhead is prohibitive.
We propose a dynamic graph message passing network that significantly reduces the computational complexity.
arXiv Detail & Related papers (2022-09-20T14:41:37Z) - Template based Graph Neural Network with Optimal Transport Distances [11.56532171513328]
Current Graph Neural Networks (GNN) architectures rely on two important components: node features embedding through message passing, and aggregation with a specialized form of pooling.
We propose in this work a novel point of view, which places distances to some learnable graph templates at the core of the graph representation.
This distance embedding is constructed thanks to an optimal transport distance: the Fused Gromov-Wasserstein (FGW) distance.
arXiv Detail & Related papers (2022-05-31T12:24:01Z) - Direct Molecular Conformation Generation [217.4815525740703]
We propose a method that directly predicts the coordinates of atoms.
Our method achieves state-of-the-art results on four public benchmarks.
arXiv Detail & Related papers (2022-02-03T01:01:58Z) - Graph Neural Networks with Learnable Structural and Positional
Representations [83.24058411666483]
A major issue with arbitrary graphs is the absence of canonical positional information of nodes.
We introduce positional encodings (PE) of nodes and inject them into the input layer, like in Transformers.
We observe a performance increase for molecular datasets, from 2.87% up to 64.14% when considering learnable PE for both GNN classes.
arXiv Detail & Related papers (2021-10-15T05:59:15Z) - Directed Graph Attention Neural Network Utilizing 3D Coordinates for
Molecular Property Prediction [11.726245297344418]
Kernel methods and graph neural networks have been widely studied as two mainstream approaches to property prediction.
In this work, we shed light on the Directed Graph Attention Neural Network (DGANN), which only takes chemical bonds as edges.
Our model has matched or outperformed most baseline graph neural networks on the QM9 dataset.
arXiv Detail & Related papers (2020-12-01T11:06:40Z) - Nonlinear State-Space Generalizations of Graph Convolutional Neural
Networks [172.18295279061607]
Graph convolutional neural networks (GCNNs) learn compositional representations from network data by nesting linear graph convolutions into nonlinearities.
In this work, we approach GCNNs from a state-space perspective revealing that the graph convolutional module is a minimalistic linear state-space model.
We show that this state update may be problematic because it is nonparametric, and depending on the graph spectrum it may explode or vanish.
We propose a novel family of nodal aggregation rules that aggregate node features within a layer in a nonlinear state-space parametric fashion allowing for a better trade-off.
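The explode-or-vanish behavior this entry attributes to the nonparametric state update is easy to demonstrate: viewed as a linear state-space model, the update repeatedly applies the graph shift operator, so the signal grows or decays with the operator's spectral radius. A minimal sketch (the operator and values below are illustrative, not from the paper):

```python
# Hedged sketch: the graph convolutional state update as a linear
# state-space recurrence x <- S x, where S is the graph shift operator.
# Whether the state explodes or vanishes depends on the spectrum of S.

def shift_apply(S, x):
    """Apply a dense graph shift operator (row-major matrix) to a signal."""
    return [sum(S[i][j] * x[j] for j in range(len(x))) for i in range(len(S))]

# Two-node graph; this unnormalized adjacency has eigenvalues +/-1.5,
# so repeated application along the eigenvector [1, 1] explodes.
S = [[0.0, 1.5], [1.5, 0.0]]
x = [1.0, 1.0]
for _ in range(5):
    x = shift_apply(S, x)
# after 5 steps each entry is 1.5**5 = 7.59375
```

Normalizing S so its spectral radius is at most 1 prevents explosion but then drives the state toward zero, which is the trade-off the parametric aggregation rules above are meant to relax.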
arXiv Detail & Related papers (2020-10-27T19:48:56Z) - Directional Message Passing for Molecular Graphs [0.0]
Graph neural networks have recently achieved great successes in predicting quantum mechanical properties of molecules.
We propose directional message passing, in which we embed the messages passed between atoms instead of the atoms themselves.
DimeNet outperforms previous GNNs on average by 76% on MD17 and by 31% on QM9.
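The core structural idea of directional message passing — keep a state per directed edge and update it from the other edges entering its source node — can be sketched without DimeNet's distance and angle filters, which are omitted here. The update rule below (sum of incoming states, excluding the reverse edge) is a simplifying assumption for illustration.

```python
# Hedged sketch of embedding messages (directed edges) instead of atoms.
# Real directional MPNNs weight each incoming edge by learned functions of
# interatomic distances and angles; here we just sum the raw states.

def directional_step(adj, msg):
    """One round of directional message passing on edge states `msg`.

    adj: dict node -> list of neighbor nodes (undirected graph).
    msg: dict (u, v) -> float state for each directed edge.
    """
    new = {}
    for u, nbrs in adj.items():
        for v in nbrs:
            # Aggregate states of edges (w, u) entering the source node u,
            # excluding the reverse edge (v, u).
            incoming = [msg[(w, u)] for w in adj[u] if w != v]
            new[(u, v)] = msg[(u, v)] + sum(incoming)
    return new

# Toy triangle graph; every directed edge state starts at 1.0.
adj = {0: [1, 2], 1: [0, 2], 2: [0, 1]}
msg = {(u, v): 1.0 for u in adj for v in adj[u]}
msg = directional_step(adj, msg)
```

Because each edge excludes its own reverse, messages carry a sense of direction even on an undirected graph, which is what makes angular information usable.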
arXiv Detail & Related papers (2020-03-06T10:30:17Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed content (including all information) and is not responsible for any consequences of its use.