Two-Dimensional Weisfeiler-Lehman Graph Neural Networks for Link Prediction
- URL: http://arxiv.org/abs/2206.09567v1
- Date: Mon, 20 Jun 2022 04:50:38 GMT
- Title: Two-Dimensional Weisfeiler-Lehman Graph Neural Networks for Link Prediction
- Authors: Yang Hu, Xiyuan Wang, Zhouchen Lin, Pan Li, Muhan Zhang
- Abstract summary: Link prediction is one important application of graph neural networks (GNNs).
Most existing GNNs for link prediction are based on the one-dimensional Weisfeiler-Lehman (1-WL) test.
We study a completely different approach that directly obtains node pair (link) representations based on two-dimensional Weisfeiler-Lehman (2-WL) tests.
- Score: 61.01337335214126
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Link prediction is one important application of graph neural networks (GNNs).
Most existing GNNs for link prediction are based on the one-dimensional
Weisfeiler-Lehman (1-WL) test. 1-WL-GNNs first compute node representations by
iteratively passing neighboring node features to the center, and then obtain
link representations by aggregating the pairwise node representations. As
pointed out by previous works, this two-step procedure results in low
discriminating power, as 1-WL-GNNs by nature learn node-level representations
instead of link-level ones. In this paper, we study a completely different
approach that directly obtains node pair (link) representations based on
two-dimensional Weisfeiler-Lehman (2-WL) tests. 2-WL tests directly
use links (2-tuples) as message passing units instead of nodes, and thus can
directly obtain link representations. We theoretically analyze the expressive
power of 2-WL tests to discriminate non-isomorphic links, and prove that they
have greater link-discriminating power than 1-WL. Based on different 2-WL variants,
we propose a series of novel 2-WL-GNN models for link prediction. Experiments
on a wide range of real-world datasets demonstrate their competitive
performance against state-of-the-art baselines and their superiority over plain 1-WL-GNNs.
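The 1-WL versus 2-WL distinction above can be illustrated with a small colour-refinement sketch. This is an illustrative Python sketch, not the authors' 2-WL-GNN models: it implements the folklore variant of 2-WL refinement on ordered node pairs, and the graph encoding (a dict mapping node ids 0..n-1 to neighbour sets) and hash-based colours are our own assumptions.

```python
def wl_1(adj, rounds=3):
    """1-WL colour refinement: each node's colour is refined by hashing
    the multiset of its neighbours' colours.
    adj: dict mapping node ids 0..n-1 to sets of neighbour ids."""
    n = len(adj)
    colors = [0] * n
    for _ in range(rounds):
        colors = [hash((colors[v], tuple(sorted(colors[u] for u in adj[v]))))
                  for v in range(n)]
    return colors


def wl_2(adj, rounds=3):
    """Folklore 2-WL sketch: colours live on ordered node pairs (2-tuples),
    so link-level structure is captured directly.  A pair (u, v) is refined
    by the multiset of colour pairs ((u, w), (w, v)) over all nodes w."""
    n = len(adj)
    # initial pair colour encodes (is-diagonal?, is-edge?)
    colors = {(u, v): (u == v, v in adj[u])
              for u in range(n) for v in range(n)}
    for _ in range(rounds):
        colors = {(u, v): hash((colors[(u, v)],
                                tuple(sorted((colors[(u, w)], colors[(w, v)])
                                             for w in range(n)))))
                  for u in range(n) for v in range(n)}
    return colors
```

On 2-regular graphs that 1-WL cannot tell apart, such as two disjoint triangles versus a 6-cycle, the pair-based refinement produces different colour multisets, reflecting the higher discriminating power the abstract refers to.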
Related papers
- Union Subgraph Neural Networks [7.922920885565194]
We empower Graph Neural Networks (GNNs) by injecting neighbor-connectivity information extracted from a new type of substructure.
By infusing the encoded neighbor connectivities, we propose a novel model, namely Union Subgraph Neural Network (UnionSNN)
Experiments on 18 benchmarks of both graph-level and node-level tasks demonstrate that UnionSNN outperforms state-of-the-art baseline models.
arXiv Detail & Related papers (2023-05-25T05:52:43Z) - Graph Neural Networks are Inherently Good Generalizers: Insights by Bridging GNNs and MLPs [71.93227401463199]
This paper pinpoints the major source of GNNs' performance gain as their intrinsic generalization capability, by introducing an intermediate model class dubbed P(ropagational)MLP.
We observe that PMLPs consistently perform on par with (or even exceed) their GNN counterparts, while being much more efficient in training.
arXiv Detail & Related papers (2022-12-18T08:17:32Z) - Weisfeiler-Lehman goes Dynamic: An Analysis of the Expressive Power of Graph Neural Networks for Attributed and Dynamic Graphs [1.3757956340051607]
Graph Neural Networks (GNNs) are a large class of relational models for graph processing.
Recent studies on the expressive power of GNNs have focused on their ability to distinguish graphs.
Real-life applications often involve a much larger variety of graph types.
arXiv Detail & Related papers (2022-10-08T10:14:41Z) - Twin Weisfeiler-Lehman: High Expressive GNNs for Graph Classification [48.087302573188396]
We propose a novel graph isomorphism test method, namely Twin-WL, which simultaneously passes node labels and node identities.
We prove that both resulting Twin-GNNs have higher expressive power than traditional message-passing GNNs.
arXiv Detail & Related papers (2022-03-22T12:58:03Z) - Identity-aware Graph Neural Networks [63.6952975763946]
We develop a class of message passing Graph Neural Networks (ID-GNNs) with greater expressive power than the 1-WL test.
ID-GNN extends existing GNN architectures by inductively considering nodes' identities during message passing.
We show that transforming existing GNNs to ID-GNNs yields on average 40% accuracy improvement on challenging node, edge, and graph property prediction tasks.
arXiv Detail & Related papers (2021-01-25T18:59:01Z) - Distance Encoding: Design Provably More Powerful Neural Networks for Graph Representation Learning [63.97983530843762]
Graph Neural Networks (GNNs) have achieved great success in graph representation learning.
GNNs generate identical representations for graph substructures that may in fact be very different.
More powerful GNNs, proposed recently by mimicking higher-order tests, are inefficient as they cannot exploit the sparsity of the underlying graph structure.
We propose Distance Encoding (DE) as a new class of structural features for graph representation learning.
arXiv Detail & Related papers (2020-08-31T23:15:40Z) - A Novel Higher-order Weisfeiler-Lehman Graph Convolution [2.658812114255374]
We propose a novel graph convolution operator that is based on the 2-dimensional Weisfeiler-Lehman test.
We show that the resulting 2-WL-GNN architecture is more discriminative than existing GNN approaches.
arXiv Detail & Related papers (2020-07-01T09:32:01Z) - Bilinear Graph Neural Network with Neighbor Interactions [106.80781016591577]
Graph Neural Network (GNN) is a powerful model to learn representations and make predictions on graph data.
We propose a new graph convolution operator, which augments the weighted sum with pairwise interactions of the representations of neighbor nodes.
We term this framework as Bilinear Graph Neural Network (BGNN), which improves GNN representation ability with bilinear interactions between neighbor nodes.
arXiv Detail & Related papers (2020-02-10T06:43:38Z)
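The bilinear neighbour aggregation described in the BGNN summary above can be sketched as follows. This is a minimal NumPy sketch under our own assumptions, not the authors' BGNN implementation: the function name is hypothetical, and the sum-of-pairs identity used to avoid an explicit double loop over neighbours is one possible way to compute the pairwise term.

```python
import numpy as np


def bilinear_aggregate(H, neighbors, W):
    """Augment the usual weighted-sum aggregation with element-wise
    pairwise interactions between transformed neighbour representations.

    Uses the identity sum_{i<j} x_i * x_j = ((sum_i x_i)^2 - sum_i x_i^2) / 2
    (element-wise) to compute all neighbour pairs in O(k * d) instead of
    O(k^2 * d) for k neighbours of dimension d."""
    X = np.stack([W @ H[u] for u in neighbors])         # transformed neighbours
    linear = X.sum(axis=0)                              # standard GNN sum
    pairwise = 0.5 * (linear ** 2 - (X ** 2).sum(axis=0))  # all pairs i < j
    return linear + pairwise
```

For example, with three 2-dimensional neighbour features and `W` the identity, the output equals the plain sum plus the element-wise products of each of the three neighbour pairs.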
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.