Neural Message Passing on High Order Paths
- URL: http://arxiv.org/abs/2002.10413v1
- Date: Mon, 24 Feb 2020 17:58:02 GMT
- Title: Neural Message Passing on High Order Paths
- Authors: Daniel Flam-Shepherd, Tony Wu, Pascal Friederich and Alan Aspuru-Guzik
- Abstract summary: We generalize graph neural nets to pass messages and aggregate across higher-order paths.
This allows information to propagate over various levels and substructures of the graph.
- Score: 4.273470365293033
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Graph neural networks have achieved impressive results in predicting molecular
properties, but they do not directly account for local and hidden structures in
the graph, such as functional groups and molecular geometry. At each propagation
step, GNNs aggregate only over first-order neighbours, ignoring important
information contained in more distant neighbours as well as the relationships
between those higher-order connections. In this work, we generalize graph
neural nets to pass messages and aggregate across higher-order paths. This
allows information to propagate over various levels and substructures of
the graph. We demonstrate our model on a few tasks in molecular property
prediction.
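To make the mechanism concrete, here is a minimal NumPy sketch of one higher-order propagation step. It illustrates the idea in the abstract, not the authors' implementation: the simple-path enumeration, the per-order weight matrices, and the ReLU update are all simplifying assumptions.

```python
import numpy as np

def simple_paths(adj, start, max_len):
    """Enumerate simple paths (no repeated nodes) of order 1..max_len from start."""
    paths, stack = [], [(start,)]
    while stack:
        path = stack.pop()
        if len(path) > 1:
            paths.append(path)
        if len(path) - 1 < max_len:
            for nxt in np.flatnonzero(adj[path[-1]]):
                if nxt not in path:
                    stack.append(path + (nxt,))
    return paths

def higher_order_step(adj, h, weights):
    """One propagation step: each node aggregates messages along simple paths
    of order k = 1..K, using an order-specific weight matrix weights[k-1]."""
    out = np.zeros_like(h)
    for v in range(adj.shape[0]):
        for path in simple_paths(adj, v, len(weights)):
            k = len(path) - 1                        # order of this path
            out[v] += h[path[-1]] @ weights[k - 1]   # message from the path endpoint
    return np.maximum(out, 0.0)                      # ReLU node update

# Toy molecule-like graph: a 4-node ring.
adj = np.array([[0, 1, 0, 1],
                [1, 0, 1, 0],
                [0, 1, 0, 1],
                [1, 0, 1, 0]])
h = np.random.default_rng(0).normal(size=(4, 8))     # 8-dim node features
weights = [np.eye(8) * 0.5 for _ in range(3)]        # K = 3 path orders (assumed)
print(higher_order_step(adj, h, weights).shape)      # (4, 8)
```

Relative to standard message passing, the only change is the inner loop: messages arrive from path endpoints at every order up to K, so substructures spanning several bonds, such as functional groups, can influence a node within a single step.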
Related papers
- Seq-HGNN: Learning Sequential Node Representation on Heterogeneous Graph [57.2953563124339]
We propose a novel heterogeneous graph neural network with sequential node representation, namely Seq-HGNN.
We conduct extensive experiments on four widely used datasets from the Heterogeneous Graph Benchmark (HGB) and the Open Graph Benchmark (OGB).
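The blurb leaves the mechanism implicit; the sketch below is one plausible reading of "sequential node representation", not Seq-HGNN's actual design: relation-specific messages are kept as a per-node sequence rather than fused at every layer, and the sequence is fused once at the end with attention. The two toy relation graphs and the query vector q are assumptions.

```python
import numpy as np

rng = np.random.default_rng(9)

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def sequential_representation(rel_adjs, h, q):
    """Keep relation-wise messages as a per-node sequence instead of fusing
    them at every layer, then fuse the sequence once with attention."""
    seqs = [[h[v]] for v in range(h.shape[0])]
    for A in rel_adjs:                               # one adjacency per relation
        msg = (A @ h) / A.sum(axis=1, keepdims=True).clip(min=1)
        for v in range(h.shape[0]):
            seqs[v].append(msg[v])
    out = np.zeros_like(h)
    for v in range(h.shape[0]):
        s = np.stack(seqs[v])                        # (num_relations + 1, d)
        out[v] = softmax(s @ q) @ s                  # fuse the sequence
    return out

h = rng.normal(size=(3, 4))
rel_adjs = [np.eye(3), np.ones((3, 3)) - np.eye(3)]  # two toy relation graphs
q = rng.normal(size=4)                               # attention query (assumed)
print(sequential_representation(rel_adjs, h, q).shape)  # (3, 4)
```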
arXiv Detail & Related papers (2023-05-18T07:27:18Z)
- Graph neural networks for the prediction of molecular structure-property relationships [59.11160990637615]
Graph neural networks (GNNs) are a novel machine learning method that works directly on the molecular graph.
GNNs allow properties to be learned in an end-to-end fashion, thereby avoiding the need for informative descriptors.
We describe the fundamentals of GNNs and demonstrate the application of GNNs via two examples for molecular property prediction.
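As a hedged illustration of the end-to-end claim: raw atom features flow through message passing and a permutation-invariant readout straight to a scalar property, with no hand-crafted descriptors in between. The depth, mean aggregation, and sum readout below are illustrative choices, not the paper's.

```python
import numpy as np

rng = np.random.default_rng(1)

def gnn_layer(adj, h, W):
    """Mean-aggregate 1-hop neighbours, then a linear map with ReLU."""
    deg = adj.sum(axis=1, keepdims=True).clip(min=1)
    return np.maximum(((adj @ h) / deg) @ W, 0.0)

def predict_property(adj, x, W1, W2, w_out):
    """End-to-end: atom features -> message passing -> sum readout -> scalar."""
    h = gnn_layer(adj, gnn_layer(adj, x, W1), W2)
    return h.sum(axis=0) @ w_out        # permutation-invariant readout

# Toy "molecule": 3 atoms in a chain, 8-dim atom features.
adj = np.array([[0, 1, 0],
                [1, 0, 1],
                [0, 1, 0]])
x = rng.normal(size=(3, 8))
W1, W2, w_out = rng.normal(size=(8, 8)), rng.normal(size=(8, 8)), rng.normal(size=8)
print(predict_property(adj, x, W1, W2, w_out))   # one predicted property value
```

Training would fit W1, W2 and w_out by gradient descent on property labels; the point is that nothing molecule-specific is engineered by hand.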
arXiv Detail & Related papers (2022-07-25T11:30:44Z)
- Simple and Efficient Heterogeneous Graph Neural Network [55.56564522532328]
Heterogeneous graph neural networks (HGNNs) have powerful capability to embed rich structural and semantic information of a heterogeneous graph into node representations.
Existing HGNNs inherit many mechanisms from graph neural networks (GNNs) over homogeneous graphs, especially the attention mechanism and the multi-layer structure.
This paper conducts an in-depth and detailed study of these mechanisms and proposes the Simple and Efficient Heterogeneous Graph Neural Network (SeHGNN).
arXiv Detail & Related papers (2022-07-06T10:01:46Z)
- Discovering the Representation Bottleneck of Graph Neural Networks from Multi-order Interactions [51.597480162777074]
Graph neural networks (GNNs) rely on the message passing paradigm to propagate node features and build interactions.
Recent works point out that different graph learning tasks require different ranges of interactions between nodes.
We study two common graph construction methods in scientific domains, i.e., K-nearest neighbor (KNN) graphs and fully-connected (FC) graphs.
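For readers unfamiliar with the two constructions, a small sketch, assuming nodes come with coordinates as is common in scientific data: a KNN graph links each node to its k nearest neighbours, while an FC graph links every pair of distinct nodes.

```python
import numpy as np

def knn_graph(points, k):
    """Adjacency of a K-nearest-neighbour graph built from coordinates."""
    d = np.linalg.norm(points[:, None] - points[None, :], axis=-1)
    np.fill_diagonal(d, np.inf)                  # exclude self-loops
    nn = np.argsort(d, axis=1)[:, :k]            # k closest nodes per row
    adj = np.zeros(d.shape, dtype=int)
    adj[np.repeat(np.arange(len(points)), k), nn.ravel()] = 1
    return np.maximum(adj, adj.T)                # symmetrise

def fc_graph(n):
    """Fully-connected graph: an edge between every pair of distinct nodes."""
    return np.ones((n, n), dtype=int) - np.eye(n, dtype=int)

pts = np.random.default_rng(2).normal(size=(6, 3))    # e.g. 6 atoms in 3-D
print(knn_graph(pts, k=2).sum(), fc_graph(6).sum())   # FC has many more edges
```

The choice matters because it fixes the interaction range a message-passing layer can see: an FC graph exposes all pairwise interactions in one hop, a KNN graph only local ones.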
arXiv Detail & Related papers (2022-05-15T11:38:14Z)
- Structure Enhanced Graph Neural Networks for Link Prediction [6.872826041648584]
We propose Structure Enhanced Graph neural network (SEG) for link prediction.
SEG incorporates surrounding topological information of target nodes into an ordinary GNN model.
Experiments on the OGB link prediction datasets demonstrate that SEG achieves state-of-the-art results.
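The blurb only says that topological context around the target nodes is fed into an ordinary GNN; the sketch below shows that general pattern, not SEG's actual encoding: hypothetical per-node structural descriptors relative to a candidate pair (u, v) are appended to the features before a plain GNN layer, and the link is scored with a dot product.

```python
import numpy as np

rng = np.random.default_rng(5)

def structure_enhanced_features(adj, x, u, v):
    """Append hypothetical structural descriptors relative to the target pair
    (u, v): node degree and 1-hop contact with u or v."""
    deg = adj.sum(axis=1, keepdims=True)
    return np.hstack([x, deg, adj[:, [u]], adj[:, [v]]])

def link_score(adj, x, u, v, W):
    """Ordinary GNN layer over the enhanced features, then a dot-product scorer."""
    h = np.maximum(adj @ structure_enhanced_features(adj, x, u, v) @ W, 0.0)
    return float(h[u] @ h[v])

adj = np.array([[0., 1, 1],
                [1, 0, 0],
                [1, 0, 0]])
x = rng.normal(size=(3, 4))
W = rng.normal(size=(7, 4))          # 4 raw features + 3 structural columns
print(link_score(adj, x, u=1, v=2, W=W))
```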
arXiv Detail & Related papers (2022-01-14T03:49:30Z)
- SHGNN: Structure-Aware Heterogeneous Graph Neural Network [77.78459918119536]
This paper proposes a novel Structure-Aware Heterogeneous Graph Neural Network (SHGNN) to address the limitations of existing approaches.
We first utilize a feature propagation module to capture the local structure information of intermediate nodes in the meta-path.
Next, we use a tree-attention aggregator to incorporate the graph structure information into the aggregation module on the meta-path.
Finally, we leverage a meta-path aggregator to fuse the information aggregated from different meta-paths.
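A hedged sketch of the three-stage pipeline described above; the concrete modules below (mean pooling along a path in place of the feature-propagation module, instance-level attention in place of tree attention, a final attention over meta-paths) are simplifications, and the meta-path instances and shared query vector q are toy assumptions.

```python
import numpy as np

rng = np.random.default_rng(3)

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def aggregate_metapath(instances, h, q):
    """Pool node features along each meta-path instance (a stand-in for the
    feature-propagation module), then attend over instances (a stand-in for
    the tree-attention aggregator)."""
    per_instance = np.stack([h[list(p)].mean(axis=0) for p in instances])
    return softmax(per_instance @ q) @ per_instance

def fuse_metapaths(reprs, q):
    """Meta-path aggregator: attention over per-meta-path representations."""
    reprs = np.stack(reprs)
    return softmax(reprs @ q) @ reprs

h = rng.normal(size=(5, 4))                     # 5 nodes, 4-dim features
paths_APA = [(0, 2, 1), (0, 3, 4)]              # toy instances of one meta-path
paths_APC = [(0, 2, 4)]                         # toy instances of another
q = rng.normal(size=4)                          # shared attention query (assumed)
z = fuse_metapaths([aggregate_metapath(m, h, q) for m in (paths_APA, paths_APC)], q)
print(z.shape)                                  # (4,) fused representation of node 0
```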
arXiv Detail & Related papers (2021-12-12T14:18:18Z)
- Graph Neural Networks with Feature and Structure Aware Random Walk [7.143879014059894]
We show that in typical heterophilous graphs, the edges may be directed, and whether to treat the edges as-is or simply make them undirected greatly affects the performance of GNN models.
We develop a model that adaptively learns the directionality of the graph and exploits the underlying long-distance correlations between nodes.
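The blurb does not say how directionality is learned; one minimal, assumed parameterization is a per-edge gate computed from the endpoint features that splits each message between the two directions.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def directional_aggregate(edges, h, a):
    """A gate per edge, computed from both endpoint features, decides how much
    of the message flows u -> v versus v -> u."""
    out = np.zeros_like(h)
    for u, v in edges:
        g = sigmoid(np.concatenate([h[u], h[v]]) @ a)  # learned direction gate
        out[v] += g * h[u]                             # forward share of the message
        out[u] += (1.0 - g) * h[v]                     # reverse share
    return out

rng = np.random.default_rng(7)
h = rng.normal(size=(4, 3))                            # 4 nodes, 3-dim features
edges = [(0, 1), (1, 2), (2, 3)]
a = rng.normal(size=6)                                 # gate parameters (assumed)
print(directional_aggregate(edges, h, a).shape)        # (4, 3)
```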
arXiv Detail & Related papers (2021-11-19T08:54:21Z)
- Distance-aware Molecule Graph Attention Network for Drug-Target Binding Affinity Prediction [54.93890176891602]
We propose a diStance-aware Molecule graph Attention Network (S-MAN) tailored to drug-target binding affinity prediction.
As a dedicated solution, we first propose a position encoding mechanism to integrate the topological structure and spatial position information into the constructed pocket-ligand graph.
We also propose a novel edge-node hierarchical attentive aggregation structure with edge-level and node-level aggregation.
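A minimal sketch of the edge-then-node pattern, under assumptions: each edge first builds a message from its endpoint features plus a raw spatial distance (standing in for the paper's position encoding), then each node attends over its incident edge messages. The weight shapes and the tanh/softmax choices are illustrative, not S-MAN's.

```python
import numpy as np

rng = np.random.default_rng(4)

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def hierarchical_aggregate(edges, h, dist, w_e, w_n):
    """Edge level first: each edge builds a message from its endpoint features
    plus a spatial distance. Node level second: attention over incident edges."""
    msgs = {e: np.tanh(np.concatenate([h[e[0]], h[e[1]], [dist[e]]]) @ w_e)
            for e in edges}
    out = np.zeros_like(h)
    for v in range(len(h)):
        incident = [e for e in edges if v in e]
        if incident:
            m = np.stack([msgs[e] for e in incident])
            out[v] = softmax(m @ w_n) @ m        # node-level attentive pooling
    return out

h = rng.normal(size=(3, 4))                      # 3 atoms, 4-dim features
edges = [(0, 1), (1, 2)]
dist = {(0, 1): 1.4, (1, 2): 2.1}                # spatial distances (assumed)
w_e = rng.normal(size=(9, 4))                    # (4 + 4 + 1) -> 4
w_n = rng.normal(size=4)
print(hierarchical_aggregate(edges, h, dist, w_e, w_n).shape)   # (3, 4)
```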
arXiv Detail & Related papers (2020-12-17T17:44:01Z)
- Higher-Order Explanations of Graph Neural Networks via Relevant Walks [3.1510406584101776]
Graph Neural Networks (GNNs) are a popular approach for prediction on graph-structured data.
In this paper, we show that GNNs can in fact be naturally explained using higher-order expansions.
We extract practically relevant insights on sentiment analysis of text data, structure-property relationships in quantum chemistry, and image classification.
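The higher-order expansion can be seen exactly in a linear toy case: for a two-layer linear propagation, a node's output decomposes into a sum over length-2 walks, and each walk's term is a natural relevance score. The actual method handles nonlinear GNNs; this sketch only shows the linear intuition.

```python
import numpy as np

def walk_relevances(adj, w1, w2, x, k):
    """Decompose output node k of the linear 2-layer propagation
    y = A (A x * w1) * w2 into terms of length-2 walks i -> j -> k."""
    rel = {}
    for j in range(len(x)):
        for i in range(len(x)):
            r = adj[k, j] * w2 * adj[j, i] * w1 * x[i]
            if r != 0.0:
                rel[(i, j, k)] = r               # relevance carried by this walk
    return rel

adj = np.array([[1., 1, 0],
                [1, 1, 1],
                [0, 1, 1]])                      # adjacency with self-loops
x = np.array([1.0, 2.0, -1.0])                   # scalar node features
rel = walk_relevances(adj, w1=0.5, w2=0.3, x=x, k=2)
y = (adj @ (adj @ x * 0.5) * 0.3)[2]
print(abs(sum(rel.values()) - y) < 1e-9)         # True: walks sum to the output
```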
arXiv Detail & Related papers (2020-06-05T17:59:14Z)
This list is automatically generated from the titles and abstracts of the papers on this site.