A Complete Expressiveness Hierarchy for Subgraph GNNs via Subgraph
Weisfeiler-Lehman Tests
- URL: http://arxiv.org/abs/2302.07090v2
- Date: Wed, 29 Mar 2023 16:48:49 GMT
- Title: A Complete Expressiveness Hierarchy for Subgraph GNNs via Subgraph
Weisfeiler-Lehman Tests
- Authors: Bohang Zhang, Guhao Feng, Yiheng Du, Di He, Liwei Wang
- Abstract summary: Subgraph GNNs have emerged as an important direction for developing expressive graph neural networks (GNNs).
This paper conducts a systematic study of general node-based subgraph GNNs through the lens of Subgraph Weisfeiler-Lehman Tests (SWL).
We prove that any node-based subgraph GNN falls into one of the six SWL equivalence classes, among which $\mathsf{SSWL}$ achieves the maximal expressive power.
- Score: 30.558563815430418
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Recently, subgraph GNNs have emerged as an important direction for developing
expressive graph neural networks (GNNs). While numerous architectures have been
proposed, so far there is still a limited understanding of how various design
paradigms differ in terms of expressive power, nor is it clear what design
principle achieves maximal expressiveness with minimal architectural
complexity. To address these fundamental questions, this paper conducts a
systematic study of general node-based subgraph GNNs through the lens of
Subgraph Weisfeiler-Lehman Tests (SWL). Our central result is to build a
complete hierarchy of SWL with strictly growing expressivity. Concretely, we
prove that any node-based subgraph GNN falls into one of the six SWL
equivalence classes, among which $\mathsf{SSWL}$ achieves the maximal
expressive power. We also study how these equivalence classes differ in terms
of their practical expressiveness such as encoding graph distance and
biconnectivity. Furthermore, we give a tight expressivity upper bound of all
SWL algorithms by establishing a close relation with localized versions of WL
and Folklore WL (FWL) tests. Our results provide insights into the power of
existing subgraph GNNs, guide the design of new architectures, and point out
their limitations by revealing an inherent gap with the 2-FWL test. Finally,
experiments demonstrate that $\mathsf{SSWL}$-inspired subgraph GNNs can
significantly outperform prior architectures on multiple benchmarks despite
great simplicity.
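To make the SWL color-refinement idea concrete, here is a minimal Python sketch of an SSWL-style update as we read the abstract: colors live on node pairs (u, v), where u indexes the subgraph rooted at u and v is a node, and each round aggregates over neighbors in both coordinates. The adjacency-dict representation, the initialization, and the nested-tuple "hashing" are our simplifications, not the paper's exact definitions.

```python
from itertools import product

def sswl_colors(adj, rounds=3):
    """Minimal sketch of an SSWL-style refinement (our reading; the
    paper's formal definitions of initialization and hashing differ).
    Colors live on node pairs (u, v): u indexes the subgraph rooted at
    u, v is a node.  Each round refines a pair's color with two local
    aggregations, over the neighbors of v and over the neighbors of u.
    Nested tuples play the role of an injective hash."""
    nodes = list(adj)
    # Initialization: mark whether v is the root of subgraph u.
    color = {(u, v): int(u == v) for u, v in product(nodes, nodes)}
    for _ in range(rounds):
        color = {
            (u, v): (
                color[(u, v)],
                tuple(sorted(color[(u, w)] for w in adj[v])),  # along v
                tuple(sorted(color[(w, v)] for w in adj[u])),  # along u
            )
            for u, v in product(nodes, nodes)
        }
    return sorted(color.values())  # multiset of pair colors, a graph invariant

# Two graphs are distinguished (under this sketch) if the invariants differ:
path3 = {0: [1], 1: [0, 2], 2: [1]}
triangle = {0: [1, 2], 1: [0, 2], 2: [0, 1]}
assert sswl_colors(path3) != sswl_colors(triangle)
```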
Related papers
- Beyond Weisfeiler-Lehman: A Quantitative Framework for GNN
Expressiveness [35.409017863665575]
Homomorphism expressivity measures the ability of GNN models to count graphs under homomorphism.
By examining four classes of prominent GNNs as case studies, we derive simple, unified, and elegant descriptions of their homomorphism expressivity.
Our results provide novel insights into a series of previous work, unify the landscape of different subareas in the community, and settle several open questions.
arXiv Detail & Related papers (2024-01-16T17:23:23Z)
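As a concrete reference point for what "counting graphs under homomorphism" means, here is a brute-force homomorphism counter. It is purely illustrative (exponential time) and is not how the paper measures expressivity.

```python
from itertools import product

def count_homomorphisms(pattern_edges, graph_adj):
    """Brute-force count of homomorphisms from a pattern F into a graph
    G: maps phi with phi(a)phi(b) an edge of G for every edge ab of F.
    Exponential in |V(F)|; for illustration only."""
    pattern_nodes = sorted({v for e in pattern_edges for v in e})
    count = 0
    for image in product(list(graph_adj), repeat=len(pattern_nodes)):
        phi = dict(zip(pattern_nodes, image))
        if all(phi[b] in graph_adj[phi[a]] for a, b in pattern_edges):
            count += 1
    return count

# hom(triangle, K3) = 6: exactly the 3! injective maps preserve edges.
triangle = [(0, 1), (1, 2), (2, 0)]
k3 = {0: {1, 2}, 1: {0, 2}, 2: {0, 1}}
assert count_homomorphisms(triangle, k3) == 6
```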
- MAG-GNN: Reinforcement Learning Boosted Graph Neural Network [68.60884768323739]
A particular line of work proposed subgraph GNNs that use subgraph information to improve GNNs' expressivity and achieved great success.
However, this effectiveness comes at the cost of efficiency, since these methods enumerate all possible subgraphs.
We propose Magnetic Graph Neural Network (MAG-GNN), a reinforcement learning (RL) boosted GNN, to solve the problem.
arXiv Detail & Related papers (2023-10-29T20:32:21Z)
- Equivariant Polynomials for Graph Neural Networks [38.15983687193912]
Graph Neural Networks (GNNs) are inherently limited in their expressive power.
This paper introduces an alternative power hierarchy based on the ability of GNNs to compute equivariant polynomials of a certain degree.
These enhanced GNNs demonstrate state-of-the-art results in experiments across multiple graph learning benchmarks.
arXiv Detail & Related papers (2023-02-22T18:53:38Z)
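For intuition about "equivariant polynomials of a certain degree", here is an illustrative degree-2 example of ours (not taken from the paper):

```latex
% Illustrative degree-2 example (ours, not from the paper): the map
% P(A) = A^2, whose entries count length-2 walks, is a polynomial in A
% and is permutation-equivariant.
\[
  P(A)_{ij} = \sum_{k} A_{ik} A_{kj}, \qquad
  P(\Pi A \Pi^{\top}) = \Pi A \Pi^{\top} \Pi A \Pi^{\top}
                      = \Pi \, P(A) \, \Pi^{\top}
  \quad \text{for any permutation matrix } \Pi .
\]
```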
- Rethinking the Expressive Power of GNNs via Graph Biconnectivity [45.4674360883544]
We introduce a novel class of expressivity metrics via graph biconnectivity and highlight their importance in both theory and practice.
We introduce a principled and more efficient approach, called the Generalized Distance Weisfeiler-Lehman (GD-WL) test.
Practically, we show that GD-WL can be implemented by a Transformer-like architecture that preserves expressiveness and enjoys full parallelizability.
arXiv Detail & Related papers (2023-01-23T15:58:59Z)
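For reference, the GD-WL color update has the following form as we understand it (hedged: consult the paper for the precise definition; $d_G$ is a graph distance such as shortest-path distance, and $\{\!\{\cdot\}\!\}$ denotes a multiset):

```latex
% GD-WL color update (our recollection; see the paper for the precise
% form): each node's color is refined by the multiset of
% (distance, color) pairs over all nodes of the graph.
\[
  \chi_v^{(t)} = \mathrm{hash}\!\left(
    \{\!\{ \, (d_G(v, u), \, \chi_u^{(t-1)}) : u \in V(G) \, \}\!\}
  \right)
\]
```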
- Representation Power of Graph Neural Networks: Improved Expressivity via Algebraic Analysis [124.97061497512804]
We show that standard Graph Neural Networks (GNNs) produce more discriminative representations than the Weisfeiler-Lehman (WL) algorithm.
We also show that simple convolutional architectures with white inputs produce equivariant features that count the closed paths in the graph.
arXiv Detail & Related papers (2022-05-19T18:40:25Z)
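The counting claim connects to a standard linear-algebra fact: powers of the adjacency matrix count closed walks. A small NumPy check (illustrative; the paper's results concern what GNN architectures can compute, not this computation itself):

```python
import numpy as np

# For an adjacency matrix A, (A^k)[i, i] counts the closed walks of
# length k starting and ending at node i; e.g. (A^3)[i, i] equals twice
# the number of triangles through i (one per orientation).
A = np.array([[0, 1, 1],
              [1, 0, 1],
              [1, 1, 0]])              # the triangle graph K3
A3 = np.linalg.matrix_power(A, 3)
assert A3[0, 0] == 2                   # the one triangle, two orientations
assert A3.trace() == 6                 # 2 orientations * 3 starting nodes
```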
- Twin Weisfeiler-Lehman: High Expressive GNNs for Graph Classification [48.087302573188396]
We propose a novel graph isomorphism test method, namely Twin-WL, which simultaneously passes node labels and node identities.
We prove that the two Twin-GNNs both have higher expressive power than traditional message passing GNNs.
arXiv Detail & Related papers (2022-03-22T12:58:03Z)
- Equivariant Subgraph Aggregation Networks [23.26140936226352]
While two graphs may not be distinguishable by an MPNN, they often contain distinguishable subgraphs.
This paper proposes a novel framework called Equivariant Subgraph Aggregation Networks (ESAN) that exploits this observation.
We develop novel variants of the 1-dimensional Weisfeiler-Leman (1-WL) test for graph isomorphism, and prove lower bounds on the expressiveness of ESAN.
We provide theoretical results that describe how design choices such as the subgraph selection policy and equivariant neural architecture affect our architecture's expressive power.
arXiv Detail & Related papers (2021-10-06T16:45:07Z)
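As a minimal sketch of one subgraph selection policy in this style, the node-deleted policy represents a graph as the bag of subgraphs obtained by removing a single node. The policy name follows the ESAN paper; the code and helper name are ours.

```python
def node_deleted_bag(adj):
    """One subgraph selection policy in the ESAN style (a sketch; the
    policy name follows the paper, the code is ours): represent a graph
    by the bag of subgraphs obtained by deleting a single node.  The full
    architecture processes the bag with a shared, equivariant network."""
    bag = []
    for removed in adj:
        sub = {v: [w for w in nbrs if w != removed]
               for v, nbrs in adj.items() if v != removed}
        bag.append(sub)
    return bag

# Deleting each node of a 4-cycle yields four 3-node paths:
cycle4 = {0: [1, 3], 1: [0, 2], 2: [1, 3], 3: [2, 0]}
assert len(node_deleted_bag(cycle4)) == 4
```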
- Distance Encoding: Design Provably More Powerful Neural Networks for Graph Representation Learning [63.97983530843762]
Graph Neural Networks (GNNs) have achieved great success in graph representation learning.
However, GNNs generate identical representations for graph substructures that may in fact be very different.
More powerful GNNs, proposed recently by mimicking higher-order WL tests, are inefficient as they cannot exploit the sparsity of the underlying graph structure.
We propose Distance Encoding (DE) as a new class of graph representation learning techniques.
arXiv Detail & Related papers (2020-08-31T23:15:40Z)
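A minimal sketch of the distance-encoding idea as we read the summary: augment each node's features with its shortest-path distances to a target node set. The function names and sentinel choice are ours; the paper defines DE more generally.

```python
from collections import deque

def bfs_distances(adj, source):
    """Hop (shortest-path) distances from `source`, by BFS."""
    dist = {source: 0}
    queue = deque([source])
    while queue:
        v = queue.popleft()
        for w in adj[v]:
            if w not in dist:
                dist[w] = dist[v] + 1
                queue.append(w)
    return dist

def distance_encoding(adj, target_set):
    """Sketch of the DE idea: give every node an extra feature vector of
    its distances to each node in `target_set` (the node set whose joint
    representation is being learned).  Unreachable nodes get len(adj) as
    a finite sentinel.  Names and details are ours, not the paper's."""
    targets = sorted(target_set)
    per_target = {t: bfs_distances(adj, t) for t in targets}
    return {v: [per_target[t].get(v, len(adj)) for t in targets]
            for v in adj}

# Example on a 4-cycle, targeting the pair {0, 1}:
cycle4 = {0: [1, 3], 1: [0, 2], 2: [1, 3], 3: [2, 0]}
assert distance_encoding(cycle4, {0, 1})[2] == [2, 1]
```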
"Graph Substructure Networks" (GSN) is a topologically-aware message passing scheme based on substructure encoding.
We show that it is strictly more expressive than the Weisfeiler-Leman (WL) graph isomorphism test.
We perform an extensive evaluation on graph classification and regression tasks and obtain state-of-the-art results in diverse real-world settings.
arXiv Detail & Related papers (2020-06-16T15:30:31Z)
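The simplest instance of substructure encoding is attaching per-node triangle counts as extra structural features; a minimal sketch follows (GSN itself supports general substructures and edge-level variants, and this code is our illustration, not the paper's implementation).

```python
from itertools import combinations

def triangle_counts(adj):
    """Per-node triangle participation counts: the simplest instance of
    the substructure features a GSN-style model attaches to nodes (a
    sketch; GSN supports general substructures and edge-level variants)."""
    counts = {v: 0 for v in adj}
    for v in adj:
        for a, b in combinations(adj[v], 2):
            if b in adj[a]:           # edges v-a, v-b, a-b all present
                counts[v] += 1        # one triangle through v per such pair
    return counts

# In the complete graph K4, every node lies in exactly 3 triangles:
k4 = {v: [w for w in range(4) if w != v] for v in range(4)}
assert all(c == 3 for c in triangle_counts(k4).values())
```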