Directional Message Passing for Molecular Graphs
- URL: http://arxiv.org/abs/2003.03123v2
- Date: Tue, 5 Apr 2022 11:39:42 GMT
- Title: Directional Message Passing for Molecular Graphs
- Authors: Johannes Gasteiger, Janek Groß, Stephan Günnemann
- Abstract summary: Graph neural networks have recently achieved great successes in predicting quantum mechanical properties of molecules.
We propose directional message passing, in which we embed the messages passed between atoms instead of the atoms themselves.
DimeNet outperforms previous GNNs on average by 76% on MD17 and by 31% on QM9.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Graph neural networks have recently achieved great successes in predicting
quantum mechanical properties of molecules. These models represent a molecule
as a graph using only the distance between atoms (nodes). They do not, however,
consider the spatial direction from one atom to another, despite directional
information playing a central role in empirical potentials for molecules, e.g.
in angular potentials. To alleviate this limitation we propose directional
message passing, in which we embed the messages passed between atoms instead of
the atoms themselves. Each message is associated with a direction in coordinate
space. These directional message embeddings are rotationally equivariant since
the associated directions rotate with the molecule. We propose a message
passing scheme analogous to belief propagation, which uses the directional
information by transforming messages based on the angle between them.
Additionally, we use spherical Bessel functions and spherical harmonics to
construct theoretically well-founded, orthogonal representations that achieve
better performance than the currently prevalent Gaussian radial basis
representations while using fewer than 1/4 of the parameters. We leverage these
innovations to construct the directional message passing neural network
(DimeNet). DimeNet outperforms previous GNNs on average by 76% on MD17 and by
31% on QM9. Our implementation is available online.
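To make the scheme concrete, below is a minimal NumPy sketch of one directional message-passing step; it is not the authors' implementation. Each directed edge j->i carries an embedding that is updated from the embeddings of incoming edges k->j, using the distance d_kj expanded in the simplest (l = 0) spherical Bessel radial basis and the angle between the two edge directions. In the paper, distance and angle are expanded jointly in a 2D spherical Fourier-Bessel basis and processed by learned interaction blocks; the single weight matrix W, the layer sizes, and the residual update here are illustrative assumptions.

```python
import numpy as np

def bessel_basis(d, num_radial=6, cutoff=5.0):
    """l = 0 spherical Bessel radial basis: sqrt(2/c) * sin(n*pi*d/c) / d, n = 1..num_radial."""
    n = np.arange(1, num_radial + 1)
    return np.sqrt(2.0 / cutoff) * np.sin(n * np.pi * d / cutoff) / d

def angle_between(r_ji, r_kj):
    """Angle between the directions of edges j->i and k->j."""
    cos_a = np.dot(r_ji, r_kj) / (np.linalg.norm(r_ji) * np.linalg.norm(r_kj))
    return np.arccos(np.clip(cos_a, -1.0, 1.0))

def directional_message_passing_step(m, pos, edges, W):
    """One update of all directed-edge embeddings (illustrative sketch).

    m     : dict mapping a directed edge (j, i) to its embedding vector (emb_dim,)
    pos   : (N, 3) array of atom coordinates
    edges : list of directed edges (j, i), meaning j -> i
    W     : (emb_dim + num_radial + 1, emb_dim) illustrative interaction weights
    """
    emb_dim = W.shape[1]
    num_radial = W.shape[0] - emb_dim - 1
    m_new = {}
    for (j, i) in edges:
        r_ji = pos[i] - pos[j]
        agg = np.zeros(emb_dim)
        for (k, j2) in edges:
            if j2 != j or k == i:            # aggregate incoming edges k->j, excluding i->j
                continue
            r_kj = pos[j] - pos[k]
            d_kj = np.linalg.norm(r_kj)
            feats = np.concatenate([m[(k, j)],
                                    bessel_basis(d_kj, num_radial),
                                    [angle_between(r_ji, r_kj)]])
            agg += np.tanh(feats @ W)        # transform each incoming message, then sum
        m_new[(j, i)] = m[(j, i)] + agg      # residual update of the edge embedding
    return m_new
```

Because only distances and angles enter the update, the resulting edge features are invariant under rotation of the molecule, while the directions associated with the messages rotate with it.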
Related papers
- Towards Dynamic Message Passing on Graphs [104.06474765596687]
We propose a novel dynamic message-passing mechanism for graph neural networks (GNNs)
It projects graph nodes and learnable pseudo nodes into a common space with measurable spatial relations between them.
With nodes moving in the space, their evolving relations facilitate flexible pathway construction for a dynamic message-passing process.
arXiv Detail & Related papers (2024-10-31T07:20:40Z)
- Recurrent Distance Filtering for Graph Representation Learning [34.761926988427284]
Graph neural networks based on iterative one-hop message passing have been shown to struggle in harnessing the information from distant nodes effectively.
We propose a new architecture to reconcile these challenges.
Our model aggregates other nodes by their shortest distances to the target and uses a linear RNN to encode the sequence of hop representations.
arXiv Detail & Related papers (2023-12-03T23:36:16Z)
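The hop-wise aggregation described in the entry above can be sketched in a few lines of NumPy: group nodes by their BFS distance to the target, pool each group, and encode the resulting hop sequence with a linear recurrence, nearest hop first. Mean pooling and the plain recurrence h = A h + B x (with illustrative weights A and B) are assumptions for this sketch, not the paper's exact parameterization.

```python
import numpy as np
from collections import deque

def hop_distances(adj_list, target):
    """BFS shortest-path distance from every node to the target (-1 if unreachable)."""
    dist = np.full(len(adj_list), -1)
    dist[target] = 0
    queue = deque([target])
    while queue:
        u = queue.popleft()
        for v in adj_list[u]:
            if dist[v] < 0:
                dist[v] = dist[u] + 1
                queue.append(v)
    return dist

def distance_filtered_encoding(adj_list, x, target, A, B):
    """Pool node features hop by hop (by shortest distance to `target`) and
    encode the hop sequence with a linear recurrence, nearest hop first."""
    dist = hop_distances(adj_list, target)
    h = np.zeros(A.shape[0])
    for hop in range(1, dist.max() + 1):
        nodes = np.where(dist == hop)[0]
        if len(nodes) == 0:
            continue
        hop_repr = x[nodes].mean(axis=0)   # mean pooling of features at this hop
        h = A @ h + B @ hop_repr           # linear RNN step over hops
    return h                               # encoding of the target's neighborhood
```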
- Neural Atoms: Propagating Long-range Interaction in Molecular Graphs through Efficient Communication Channel [48.6168145845412]
We propose a method that abstracts the collective information of atomic groups into a few Neural Atoms by implicitly projecting the atoms of a molecule.
Specifically, we explicitly exchange the information among neural atoms and project them back to the atoms' representations as an enhancement.
With this mechanism, neural atoms establish the communication channels among distant nodes, effectively reducing the interaction scope of arbitrary node pairs into a single hop.
arXiv Detail & Related papers (2023-11-02T14:44:50Z)
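A minimal sketch of the coarse-graining idea in the entry above: atoms are softly assigned to a small number of neural atoms, the neural atoms exchange information, and the result is projected back onto the atoms as a residual enhancement. The attention-style assignment, the single dense exchange step, and the residual projection are assumptions for illustration, not the paper's exact architecture.

```python
import numpy as np

def softmax(z, axis=-1):
    z = z - z.max(axis=axis, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

def neural_atom_layer(x, queries, W_exchange):
    """Pool atoms into a few 'neural atoms', mix them, and project back to the atoms.

    x          : (N, d) atom representations
    queries    : (K, d) learnable neural-atom queries, K << N (assumed)
    W_exchange : (d, d) mixing weights applied among the neural atoms (assumed)
    """
    # Soft assignment of every atom to every neural atom (attention-style scores).
    assign = softmax(queries @ x.T / np.sqrt(x.shape[1]), axis=1)   # (K, N)
    neural_atoms = assign @ x                                        # (K, d) pooled groups
    # Exchange information among the neural atoms (one dense mix as a stand-in).
    neural_atoms = np.tanh(neural_atoms @ W_exchange)
    # Project back and add as an enhancement: any two atoms now share a one-hop path
    # through the neural atoms, regardless of their distance in the molecular graph.
    return x + assign.T @ neural_atoms                               # (N, d)
```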
- Molecular Geometry-aware Transformer for accurate 3D Atomic System modeling [51.83761266429285]
We propose a novel Transformer architecture that takes nodes (atoms) and edges (bonds and nonbonding atom pairs) as inputs and models the interactions among them.
Moleformer achieves state-of-the-art on the initial state to relaxed energy prediction of OC20 and is very competitive in QM9 on predicting quantum chemical properties.
arXiv Detail & Related papers (2023-02-02T03:49:57Z)
- Dynamic Graph Message Passing Networks for Visual Recognition [112.49513303433606]
Modelling long-range dependencies is critical for scene understanding tasks in computer vision.
A fully-connected graph is beneficial for such modelling, but its computational overhead is prohibitive.
We propose a dynamic graph message passing network, that significantly reduces the computational complexity.
arXiv Detail & Related papers (2022-09-20T14:41:37Z)
- Direct Molecular Conformation Generation [217.4815525740703]
We propose a method that directly predicts the coordinates of atoms.
Our method achieves state-of-the-art results on four public benchmarks.
arXiv Detail & Related papers (2022-02-03T01:01:58Z)
- Directional Message Passing on Molecular Graphs via Synthetic Coordinates [7.314446024059812]
We propose synthetic coordinates that enable the use of advanced GNNs without requiring the true molecular configuration.
We show that with this transformation we can reduce the error of a normal graph neural network by 55% on the ZINC benchmark.
We furthermore set the state of the art on ZINC and coordinate-free QM9 by incorporating synthetic coordinates in the SMP and DimeNet++ models.
arXiv Detail & Related papers (2021-11-08T18:53:58Z)
- Graph Neural Networks with Learnable Structural and Positional Representations [83.24058411666483]
A major issue with arbitrary graphs is the absence of canonical positional information of nodes.
We introduce positional encodings (PE) for the nodes and inject them into the input layer, as in Transformers.
We observe performance increases on molecular datasets ranging from 2.87% up to 64.14% when using learnable PE for both GNN classes.
arXiv Detail & Related papers (2021-10-15T05:59:15Z)
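A small sketch of the input-layer injection mentioned in the entry above, assuming random-walk return probabilities as the initial positional encoding and a single learnable projection W_pe; in the paper the structural and positional representations are learned jointly during message passing.

```python
import numpy as np

def random_walk_pe(adj, k=8):
    """Initial PE: probability of a t-step random walk returning to its start node, t = 1..k."""
    deg = adj.sum(axis=1, keepdims=True)
    P = adj / np.clip(deg, 1, None)        # row-normalized transition matrix
    pe, Pk = [], np.eye(len(adj))
    for _ in range(k):
        Pk = Pk @ P
        pe.append(np.diag(Pk))             # return probability after each step
    return np.stack(pe, axis=1)            # (N, k)

def inject_pe(x, adj, W_pe, k=8):
    """Concatenate a learnable projection of the PE to the node features at the input layer."""
    pe = random_walk_pe(adj, k) @ W_pe     # (N, d_pe), with W_pe of shape (k, d_pe)
    return np.concatenate([x, pe], axis=1) # (N, d_x + d_pe) input node features
```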
- GemNet: Universal Directional Graph Neural Networks for Molecules [7.484063729015126]
We show that GNNs with directed edge embeddings and two-hop message passing are indeed universal approximators for predictions.
We then leverage these insights and multiple structural improvements to propose the geometric message passing neural network (GemNet)
arXiv Detail & Related papers (2021-06-02T15:44:55Z)
- Directed Graph Attention Neural Network Utilizing 3D Coordinates for Molecular Property Prediction [11.726245297344418]
Kernel methods and graph neural networks have been widely studied as two mainstream approaches to property prediction.
In this work, we shed light on the Directed Graph Attention Neural Network (DGANN), which only takes chemical bonds as edges.
Our model has matched or outperformed most baseline graph neural networks on the QM9 dataset.
arXiv Detail & Related papers (2020-12-01T11:06:40Z)
This list is automatically generated from the titles and abstracts of the papers on this site.