1-WL Expressiveness Is (Almost) All You Need
- URL: http://arxiv.org/abs/2202.10156v1
- Date: Mon, 21 Feb 2022 12:05:06 GMT
- Title: 1-WL Expressiveness Is (Almost) All You Need
- Authors: Markus Zopf
- Abstract summary: It has been shown that message passing neural networks (MPNNs) are at most as expressive as the first-order Weisfeiler-Leman (1-WL) graph isomorphism test.
In this work, we analyze if the limited expressiveness is actually a limiting factor for MPNNs and other WL-based models in standard graph datasets.
- Score: 3.4012007729454807
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: It has been shown that message passing neural networks (MPNNs), a popular
family of neural networks for graph-structured data, are at most as expressive
as the first-order Weisfeiler-Leman (1-WL) graph isomorphism test, which has
motivated the development of more expressive architectures. In this work, we
analyze if the limited expressiveness is actually a limiting factor for MPNNs
and other WL-based models in standard graph datasets. Interestingly, we find
that the expressiveness of WL is sufficient to identify almost all graphs in
most datasets. Moreover, we find that the classification accuracy upper bounds
are often close to 100%. Furthermore, we find that simple WL-based neural
networks and several MPNNs can be fitted to several datasets. In sum, we
conclude that the performance of WL/MPNNs is not limited by their
expressiveness in practice.
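To make the bound concrete, below is a minimal, self-contained sketch of the 1-WL color refinement test the abstract refers to (illustrative code, not from the paper; function names are ours). An MPNN layer mirrors the same update, replacing the hash of the neighbor-color multiset with a learned aggregation, which is why MPNNs cannot distinguish graphs that 1-WL cannot.

```python
import hashlib
from collections import Counter

def wl_refine(adj, rounds=3):
    """1-WL color refinement: repeatedly re-color each node by hashing
    its own color together with the multiset of its neighbors' colors."""
    colors = {v: "init" for v in adj}
    for _ in range(rounds):
        colors = {
            v: hashlib.sha256(
                str((colors[v], sorted(colors[u] for u in adj[v]))).encode()
            ).hexdigest()[:12]
            for v in adj
        }
    return colors

def wl_distinguishable(adj1, adj2, rounds=3):
    """Graphs are 1-WL distinguishable iff refinement produces
    different multisets of node colors."""
    c1, c2 = wl_refine(adj1, rounds), wl_refine(adj2, rounds)
    return Counter(c1.values()) != Counter(c2.values())

# Classic failure case: a 6-cycle vs. two disjoint triangles.
cycle6 = {i: [(i - 1) % 6, (i + 1) % 6] for i in range(6)}
triangles = {0: [1, 2], 1: [0, 2], 2: [0, 1],
             3: [4, 5], 4: [3, 5], 5: [3, 4]}
print(wl_distinguishable(cycle6, triangles))  # False: 1-WL (and MPNNs) fail
```

The paper's empirical point is that such 1-WL-indistinguishable pairs are rare in standard benchmark datasets, so expressiveness is rarely the bottleneck in practice.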
Related papers
- An Empirical Study of Realized GNN Expressiveness [27.458151256734947]
We study the realized expressive power that a practical model instance can achieve using a novel expressiveness dataset, BREC.
We synthetically test 23 models with higher-than-1-WL expressiveness on BREC.
Our experiment gives the first thorough measurement of the realized expressiveness of those state-of-the-art beyond-1-WL GNN models.
arXiv Detail & Related papers (2023-04-16T05:53:47Z)
- On the Expressiveness and Generalization of Hypergraph Neural Networks [77.65788763444877]
This extended abstract describes a framework for analyzing the expressiveness, learning, and (structural) generalization of hypergraph neural networks (HyperGNNs).
Specifically, we focus on how HyperGNNs can learn from finite datasets and generalize structurally to graph reasoning problems of arbitrary input sizes.
arXiv Detail & Related papers (2023-03-09T18:42:18Z)
- A Complete Expressiveness Hierarchy for Subgraph GNNs via Subgraph Weisfeiler-Lehman Tests [30.558563815430418]
Subgraph GNNs have emerged as an important direction for developing expressive graph neural networks (GNNs).
This paper conducts a systematic study of general node-based subgraph GNNs through the lens of Subgraph Weisfeiler-Lehman Tests (SWL).
We prove that any node-based subgraph GNN falls into one of six SWL equivalence classes, among which $\mathsf{SSWL}$ achieves the maximal expressive power.
arXiv Detail & Related papers (2023-02-14T14:42:54Z)
- A Practical, Progressively-Expressive GNN [27.267362661067285]
Message passing neural networks (MPNNs) have become a dominant flavor of graph neural networks (GNNs) in recent years.
MPNNs come with notable limitations; they are at most as powerful as the 1-dimensional Weisfeiler-Leman (1-WL) test in distinguishing graphs in a graph isomorphism testing framework.
We propose the $(k,c)(\le)$-SETWL hierarchy with greatly reduced complexity compared to $k$-WL, achieved by moving from $k$-tuples of nodes to sets of at most $k$ nodes defined over at most $c$ connected components in the induced original graph.
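As a rough illustration of where the savings come from (our reading of the abstract, not the paper's code; the additional at-most-$c$ connected-components restriction is omitted): ordered $k$-tuples grow as $n^k$, while unordered sets of at most $k$ nodes grow far more slowly.

```python
from itertools import combinations, product

def k_tuples(nodes, k):
    """k-WL colors all ordered k-tuples of nodes: n**k objects."""
    return list(product(nodes, repeat=k))

def up_to_k_sets(nodes, k):
    """(k, c)(<=)-SETWL instead works on unordered sets of at most
    k nodes (the <=c connected-components filter is omitted here)."""
    return [frozenset(s) for i in range(1, k + 1)
            for s in combinations(nodes, i)]

nodes = range(8)
print(len(k_tuples(nodes, 3)))      # 512 ordered 3-tuples
print(len(up_to_k_sets(nodes, 3)))  # 8 + 28 + 56 = 92 sets
```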
arXiv Detail & Related papers (2022-10-18T01:27:21Z)
- Representation Power of Graph Neural Networks: Improved Expressivity via Algebraic Analysis [124.97061497512804]
We show that standard Graph Neural Networks (GNNs) produce more discriminative representations than the Weisfeiler-Lehman (WL) algorithm.
We also show that simple convolutional architectures with white inputs produce equivariant features that count the closed paths in the graph.
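For intuition, the claim rests on a standard linear-algebra fact: the diagonal of the $k$-th power of the adjacency matrix counts closed walks of length $k$ at each node, and these counts permute with the nodes (equivariance). A toy check (our example graph, not the paper's):

```python
import numpy as np

# Triangle on nodes 0-2 plus a pendant node 3 attached to node 2.
A = np.array([[0, 1, 1, 0],
              [1, 0, 1, 0],
              [1, 1, 0, 1],
              [0, 0, 1, 0]])

# diag(A^k) = number of closed walks of length k at each node;
# relabeling the nodes permutes these counts identically.
print(np.diag(np.linalg.matrix_power(A, 2)))  # [2 2 3 1]: node degrees
print(np.diag(np.linalg.matrix_power(A, 3)))  # [2 2 2 0]: 2x triangles per node
```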
arXiv Detail & Related papers (2022-05-19T18:40:25Z)
- Graph Representation Learning with Individualization and Refinement [19.436520792345064]
Graph Neural Networks (GNNs) have emerged as prominent models for representation learning on graph-structured data.
In this work, we follow the classical approach of Individualization and Refinement (IR).
Our technique lets us learn richer node embeddings while keeping the computational complexity manageable.
arXiv Detail & Related papers (2022-03-17T07:50:48Z)
- Neural Graph Matching for Pre-training Graph Neural Networks [72.32801428070749]
Graph neural networks (GNNs) have shown a powerful capacity for modeling graph-structured data.
We present a novel Graph Matching based GNN Pre-Training framework, called GMPT.
The proposed method can be applied to fully self-supervised pre-training and coarse-grained supervised pre-training.
arXiv Detail & Related papers (2022-03-03T09:53:53Z)
- Counting Substructures with Higher-Order Graph Neural Networks: Possibility and Impossibility Results [58.277290855841976]
We study tradeoffs between the computational cost and expressive power of Graph Neural Networks (GNNs).
We show that a new model can count subgraphs of size $k$, and thereby overcomes a known limitation of low-order GNNs.
In several cases, the proposed algorithm can greatly reduce computational complexity compared to the existing higher-order $k$-GNNs.
arXiv Detail & Related papers (2020-12-06T03:42:54Z)
- Distance Encoding: Design Provably More Powerful Neural Networks for Graph Representation Learning [63.97983530843762]
Graph Neural Networks (GNNs) have achieved great success in graph representation learning.
GNNs generate identical representations for graph substructures that may in fact be very different.
More powerful GNNs, proposed recently by mimicking higher-order WL tests, are inefficient because they cannot exploit the sparsity of the underlying graph structure.
We propose Distance Encoding (DE) as a new class of graph representation learning techniques.
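A minimal sketch of the general idea (our illustration of distance-based features, not the paper's exact definition): augment each node's representation with its shortest-path distances to a target node set, computed here by BFS.

```python
from collections import deque

def distance_encoding(adj, targets):
    """For every node, compute the shortest-path distance to each
    target node via BFS; these distances serve as extra node features
    that plain message passing cannot recover on its own."""
    features = {v: [] for v in adj}
    for t in targets:
        dist = {t: 0}
        queue = deque([t])
        while queue:
            u = queue.popleft()
            for w in adj[u]:
                if w not in dist:
                    dist[w] = dist[u] + 1
                    queue.append(w)
        for v in adj:
            features[v].append(dist.get(v, -1))  # -1: unreachable
    return features

# Toy 4-cycle; encode distances to the target set {0}.
cycle4 = {0: [1, 3], 1: [0, 2], 2: [1, 3], 3: [0, 2]}
print(distance_encoding(cycle4, targets=[0]))
# {0: [0], 1: [1], 2: [2], 3: [1]}
```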
arXiv Detail & Related papers (2020-08-31T23:15:40Z)
- Improving Graph Neural Network Expressivity via Subgraph Isomorphism Counting [63.04999833264299]
"Graph Substructure Networks" (GSN) is a topologically-aware message passing scheme based on substructure encoding.
We show that it is strictly more expressive than the Weisfeiler-Leman (WL) graph isomorphism test.
We perform an extensive evaluation on graph classification and regression tasks and obtain state-of-the-art results in diverse real-world settings.
arXiv Detail & Related papers (2020-06-16T15:30:31Z)
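To ground the GSN entry above, here is a minimal sketch of substructure encoding under our simplifying assumptions (triangles as the only substructure; not the paper's implementation): count the triangles incident to each node and append the count to the node's input features before message passing.

```python
def triangle_counts(adj):
    """Count triangles incident to each node by intersecting
    neighbor sets along each incident edge; every triangle at v
    is found twice, hence the division by 2."""
    nbrs = {v: set(us) for v, us in adj.items()}
    return {v: sum(len(nbrs[v] & nbrs[u]) for u in nbrs[v]) // 2
            for v in adj}

# Attach the count to each node's input features, in the spirit of
# GSN's substructure encoding (hypothetical minimal recipe).
graph = {0: [1, 2], 1: [0, 2], 2: [0, 1, 3], 3: [2]}
features = {v: [1.0, float(c)] for v, c in triangle_counts(graph).items()}
print(features)  # {0: [1.0, 1.0], 1: [1.0, 1.0], 2: [1.0, 1.0], 3: [1.0, 0.0]}
```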
This list is automatically generated from the titles and abstracts of the papers on this site.