A Novel Higher-order Weisfeiler-Lehman Graph Convolution
- URL: http://arxiv.org/abs/2007.00346v2
- Date: Mon, 21 Sep 2020 11:33:32 GMT
- Title: A Novel Higher-order Weisfeiler-Lehman Graph Convolution
- Authors: Clemens Damke, Vitalik Melnikov, Eyke Hüllermeier
- Abstract summary: We propose a novel graph convolution operator that is based on the 2-dimensional Weisfeiler-Lehman test.
We show that the resulting 2-WL-GNN architecture is more discriminative than existing GNN approaches.
- Score: 2.658812114255374
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Current GNN architectures use a vertex neighborhood aggregation scheme, which
limits their discriminative power to that of the 1-dimensional
Weisfeiler-Lehman (WL) graph isomorphism test. Here, we propose a novel graph
convolution operator that is based on the 2-dimensional WL test. We formally
show that the resulting 2-WL-GNN architecture is more discriminative than
existing GNN approaches. This theoretical result is complemented by
experimental studies using synthetic and real data. On multiple common graph
classification benchmarks, we demonstrate that the proposed model is
competitive with state-of-the-art graph kernels and GNNs.
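The 1-WL test that bounds the discriminative power of standard GNNs is the classic color refinement procedure: vertices repeatedly re-color themselves from their own color and the multiset of neighbor colors, and two graphs with different final color histograms are certifiably non-isomorphic. The following is a minimal illustrative sketch (the function name and graph representation are my own, not from the paper); it also demonstrates a well-known failure case that motivates higher-order variants such as the 2-WL-based operator proposed above.

```python
from collections import Counter

def wl_refinement(adj):
    """1-dimensional Weisfeiler-Lehman (color refinement) sketch.

    adj: dict mapping each vertex to a list of its neighbors.
    Returns the multiset of stable vertex colors as a Counter;
    differing Counters for two graphs prove non-isomorphism
    (equal Counters prove nothing).
    """
    colors = {v: 0 for v in adj}  # uniform initial coloring
    for _ in range(len(adj)):  # at most |V| rounds until stable
        # Signature = own color plus sorted multiset of neighbor colors.
        signatures = {
            v: (colors[v], tuple(sorted(colors[u] for u in adj[v])))
            for v in adj
        }
        # Canonically relabel distinct signatures to small integers.
        palette = {s: i for i, s in enumerate(sorted(set(signatures.values())))}
        new_colors = {v: palette[signatures[v]] for v in adj}
        if new_colors == colors:  # refinement has stabilized
            break
        colors = new_colors
    return Counter(colors.values())

# Classic 1-WL failure case: a 6-cycle vs. two disjoint triangles.
# Both graphs are 2-regular, so 1-WL assigns every vertex the same
# color and cannot tell them apart.
cycle6 = {i: [(i - 1) % 6, (i + 1) % 6] for i in range(6)}
two_triangles = {0: [1, 2], 1: [0, 2], 2: [0, 1],
                 3: [4, 5], 4: [3, 5], 5: [3, 4]}
print(wl_refinement(cycle6) == wl_refinement(two_triangles))  # True
```

Pairs like this, indistinguishable to 1-WL (and hence to neighborhood-aggregation GNNs), are exactly what a 2-WL-based convolution is designed to separate.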
Related papers
- Learning to Reweight for Graph Neural Network [63.978102332612906]
Graph Neural Networks (GNNs) show promising results for graph tasks.
Existing GNNs' generalization ability will degrade when there exist distribution shifts between testing and training graph data.
We propose a novel nonlinear graph decorrelation method, which can substantially improve the out-of-distribution generalization ability.
arXiv Detail & Related papers (2023-12-19T12:25:10Z) - Weisfeiler and Leman go Hyperbolic: Learning Distance Preserving Node
Representations [26.77596449192451]
Graph neural networks (GNNs) have emerged as a promising tool for solving machine learning problems on graphs.
In this paper, we define a distance function between nodes which is based on the hierarchy produced by the Weisfeiler-Leman (WL) algorithm.
We propose a model that learns representations which preserve those distances between nodes.
arXiv Detail & Related papers (2022-11-04T15:03:41Z) - Weisfeiler-Lehman goes Dynamic: An Analysis of the Expressive Power of Graph Neural Networks for Attributed and Dynamic Graphs [1.3757956340051607]
Graph Neural Networks (GNNs) are a large class of relational models for graph processing.
Recent studies on the expressive power of GNNs have focused on their ability to distinguish graphs.
Real-life applications often involve a much larger variety of graph types.
arXiv Detail & Related papers (2022-10-08T10:14:41Z) - Two-Dimensional Weisfeiler-Lehman Graph Neural Networks for Link
Prediction [61.01337335214126]
Link prediction is one important application of graph neural networks (GNNs).
Most existing GNNs for link prediction are based on the one-dimensional Weisfeiler-Lehman (1-WL) test.
We study a completely different approach which can directly obtain node pair (link) representations based on two-dimensional Weisfeiler-Lehman (2-WL) tests.
arXiv Detail & Related papers (2022-06-20T04:50:38Z) - Representation Power of Graph Neural Networks: Improved Expressivity via
Algebraic Analysis [124.97061497512804]
We show that standard Graph Neural Networks (GNNs) produce more discriminative representations than the Weisfeiler-Lehman (WL) algorithm.
We also show that simple convolutional architectures with white inputs produce equivariant features that count the closed paths in the graph.
arXiv Detail & Related papers (2022-05-19T18:40:25Z) - Twin Weisfeiler-Lehman: High Expressive GNNs for Graph Classification [48.087302573188396]
We propose a novel graph isomorphism test method, namely Twin-WL, which simultaneously passes node labels and node identities.
We prove that the two Twin-GNNs both have higher expressive power than traditional message passing GNNs.
arXiv Detail & Related papers (2022-03-22T12:58:03Z) - Graph Neural Networks with Parallel Neighborhood Aggregations for Graph
Classification [14.112444998191698]
We focus on graph classification using a graph neural network (GNN) model that precomputes the node features using a bank of neighborhood aggregation graph operators arranged in parallel.
These GNN models have a natural advantage of reduced training and inference time due to the precomputations.
We demonstrate via numerical experiments that the developed model achieves state-of-the-art performance on many diverse real-world datasets.
arXiv Detail & Related papers (2021-11-22T19:19:40Z) - Distance Encoding: Design Provably More Powerful Neural Networks for
Graph Representation Learning [63.97983530843762]
Graph Neural Networks (GNNs) have achieved great success in graph representation learning.
GNNs generate identical representations for graph substructures that may in fact be very different.
More powerful GNNs, proposed recently by mimicking higher-order tests, are inefficient as they cannot exploit the sparsity of the underlying graph structure.
We propose Distance Encoding (DE) as a new class of graph representation learning.
arXiv Detail & Related papers (2020-08-31T23:15:40Z) - Improving Graph Neural Network Expressivity via Subgraph Isomorphism
Counting [63.04999833264299]
"Graph Substructure Networks" (GSN) is a topologically-aware message passing scheme based on substructure encoding.
We show that it is strictly more expressive than the Weisfeiler-Leman (WL) graph isomorphism test.
We perform an extensive evaluation on graph classification and regression tasks and obtain state-of-the-art results in diverse real-world settings.
arXiv Detail & Related papers (2020-06-16T15:30:31Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information provided and is not responsible for any consequences of its use.