Twin Weisfeiler-Lehman: High Expressive GNNs for Graph Classification
- URL: http://arxiv.org/abs/2203.11683v1
- Date: Tue, 22 Mar 2022 12:58:03 GMT
- Title: Twin Weisfeiler-Lehman: High Expressive GNNs for Graph Classification
- Authors: Zhaohui Wang, Qi Cao, Huawei Shen, Bingbing Xu and Xueqi Cheng
- Abstract summary: We propose a novel graph isomorphism test method, namely Twin-WL, which simultaneously passes node labels and node identities.
We prove that the two Twin-GNNs both have higher expressive power than traditional message passing GNNs.
- Score: 48.087302573188396
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The expressive power of message passing GNNs is upper-bounded by the
Weisfeiler-Lehman (WL) test. To build GNNs more expressive than the WL test, we
propose a novel graph isomorphism test method, namely Twin-WL, which
simultaneously passes node labels and node identities rather than only node
labels as in WL. The identity-passing mechanism encodes the complete structure
of the rooted subgraph, and thus Twin-WL offers extra power beyond WL in
distinguishing graph structures. Based on Twin-WL, we implement two Twin-GNNs
for graph classification by defining a readout function over the rooted
subgraph: one simply reads out the size of the rooted subgraph, and the other
reads out rich structural information of the subgraph in a GNN style. We prove
that both Twin-GNNs have higher expressive power than traditional message
passing GNNs. Experiments also demonstrate that the Twin-GNNs significantly
outperform state-of-the-art methods on the task of graph classification.
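To make the WL upper bound concrete, here is a minimal sketch of the standard 1-WL color-refinement test that bounds ordinary message passing GNNs (this is the baseline test, not the paper's Twin-WL; the function names and the fixed round count are illustrative assumptions). It also shows a classic pair 1-WL cannot separate, the kind of case identity-passing methods aim to fix:

```python
from collections import Counter

def wl_colors(adj, rounds=3):
    """1-dimensional Weisfeiler-Lehman (1-WL) color refinement.

    adj: dict mapping each node to a list of its neighbors.
    Returns the multiset of final node colors; if two graphs yield
    different multisets, they are certainly non-isomorphic (the
    converse does not hold, which is exactly 1-WL's limitation).
    """
    colors = {v: 0 for v in adj}  # uniform initial labels
    for _ in range(rounds):
        # Relabel each node by its own color plus the sorted multiset
        # of its neighbors' colors, then compress to small integers.
        signatures = {
            v: (colors[v], tuple(sorted(colors[u] for u in adj[v])))
            for v in adj
        }
        palette = {s: i for i, s in enumerate(sorted(set(signatures.values())))}
        colors = {v: palette[signatures[v]] for v in adj}
    return Counter(colors.values())

# Classic failure case: a 6-cycle vs. two disjoint triangles.
# Both graphs are 2-regular, so 1-WL assigns every node the same color.
c6 = {i: [(i - 1) % 6, (i + 1) % 6] for i in range(6)}
two_triangles = {0: [1, 2], 1: [0, 2], 2: [0, 1],
                 3: [4, 5], 4: [3, 5], 5: [3, 4]}
print(wl_colors(c6) == wl_colors(two_triangles))  # True: 1-WL cannot tell them apart
```

Passing node identities alongside labels, as Twin-WL does, lets the test recover the full structure of each rooted subgraph and thereby distinguish such pairs.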
Related papers
- FSW-GNN: A Bi-Lipschitz WL-Equivalent Graph Neural Network [2.766564215769441]
MPNNs' ability to distinguish between graphs is limited to graphs separable by the Weisfeiler-Leman (WL) graph isomorphism test.
We show that our MPNN is competitive with standard MPNNs on several graph learning tasks and is far more accurate on long-range tasks prone to over-squashing.
arXiv Detail & Related papers (2024-10-10T18:11:23Z) - Improving the Expressiveness of $K$-hop Message-Passing GNNs by Injecting Contextualized Substructure Information [17.56609419806051]
We propose a substructure encoding function to uplift the expressive power of any $K$-hop message-passing GNN.
Our method is provably more powerful than previous works on $K$-hop graph neural networks and 1-WL subgraph GNNs.
arXiv Detail & Related papers (2024-06-27T15:10:56Z) - A Complete Expressiveness Hierarchy for Subgraph GNNs via Subgraph
Weisfeiler-Lehman Tests [30.558563815430418]
Subgraph GNNs have emerged as an important direction for developing expressive graph neural networks.
This paper conducts a systematic study of general node-based subgraph GNNs through the lens of Subgraph Weisfeiler-Lehman Tests (SWL).
We prove that any node-based subgraph GNN falls into one of six SWL equivalence classes, among which $\mathsf{SSWL}$ achieves the maximal expressive power.
arXiv Detail & Related papers (2023-02-14T14:42:54Z) - Two-Dimensional Weisfeiler-Lehman Graph Neural Networks for Link
Prediction [61.01337335214126]
Link prediction is one important application of graph neural networks (GNNs).
Most existing GNNs for link prediction are based on the one-dimensional Weisfeiler-Lehman (1-WL) test.
We study a completely different approach which can directly obtain node pair (link) representations based on two-dimensional Weisfeiler-Lehman (2-WL) tests.
arXiv Detail & Related papers (2022-06-20T04:50:38Z) - Identity-aware Graph Neural Networks [63.6952975763946]
We develop a class of message passing Graph Neural Networks (ID-GNNs) with greater expressive power than the 1-WL test.
ID-GNN extends existing GNN architectures by inductively considering nodes' identities during message passing.
We show that transforming existing GNNs to ID-GNNs yields on average 40% accuracy improvement on challenging node, edge, and graph property prediction tasks.
arXiv Detail & Related papers (2021-01-25T18:59:01Z) - Distance Encoding: Design Provably More Powerful Neural Networks for
Graph Representation Learning [63.97983530843762]
Graph Neural Networks (GNNs) have achieved great success in graph representation learning.
GNNs generate identical representations for graph substructures that may in fact be very different.
More powerful GNNs, proposed recently by mimicking higher-order tests, are inefficient as they cannot exploit the sparsity of the underlying graph structure.
We propose Distance Encoding (DE) as a new class of graph representation learning methods.
arXiv Detail & Related papers (2020-08-31T23:15:40Z) - A Novel Higher-order Weisfeiler-Lehman Graph Convolution [2.658812114255374]
We propose a novel graph convolution operator that is based on the 2-dimensional Weisfeiler-Lehman test.
We show that the resulting 2-WL-GNN architecture is more discriminative than existing GNN approaches.
arXiv Detail & Related papers (2020-07-01T09:32:01Z) - Improving Graph Neural Network Expressivity via Subgraph Isomorphism
Counting [63.04999833264299]
"Graph Substructure Networks" (GSN) is a topologically-aware message passing scheme based on substructure encoding.
We show that it is strictly more expressive than the Weisfeiler-Leman (WL) graph isomorphism test.
We perform an extensive evaluation on graph classification and regression tasks and obtain state-of-the-art results in diverse real-world settings.
arXiv Detail & Related papers (2020-06-16T15:30:31Z) - Can Graph Neural Networks Count Substructures? [53.256112515435355]
We study the power of graph neural networks (GNNs) via their ability to count attributed graph substructures.
We distinguish between two types of substructure counting: induced-subgraph-count and subgraph-count, and establish both positive and negative answers for popular GNN architectures.
arXiv Detail & Related papers (2020-02-10T18:53:30Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information provided and is not responsible for any consequences of its use.