Weisfeiler and Leman Go Measurement Modeling: Probing the Validity of the WL Test
- URL: http://arxiv.org/abs/2307.05775v3
- Date: Sun, 31 Mar 2024 17:03:00 GMT
- Title: Weisfeiler and Leman Go Measurement Modeling: Probing the Validity of the WL Test
- Authors: Arjun Subramonian, Adina Williams, Maximilian Nickel, Yizhou Sun, Levent Sagun
- Abstract summary: We uncover misalignments between graph machine learning practitioners' conceptualizations of expressive power and $k$-WL.
We argue for extensional definitions and measurement of expressive power based on benchmarks.
- Score: 58.1543112860658
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The expressive power of graph neural networks is usually measured by comparing how many pairs of graphs or nodes an architecture can possibly distinguish as non-isomorphic to those distinguishable by the $k$-dimensional Weisfeiler-Leman ($k$-WL) test. In this paper, we uncover misalignments between graph machine learning practitioners' conceptualizations of expressive power and $k$-WL through a systematic analysis of the reliability and validity of $k$-WL. We conduct a survey ($n = 18$) of practitioners to surface their conceptualizations of expressive power and their assumptions about $k$-WL. In contrast to practitioners' beliefs, our analysis (which draws from graph theory and benchmark auditing) reveals that $k$-WL does not guarantee isometry, can be irrelevant to real-world graph tasks, and may not promote generalization or trustworthiness. We argue for extensional definitions and measurement of expressive power based on benchmarks. We further contribute guiding questions for constructing such benchmarks, which is critical for graph machine learning practitioners to develop and transparently communicate their understandings of expressive power.
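To make the comparison concrete, here is a minimal sketch of 1-WL color refinement (our illustration, not from the paper; the adjacency-list encoding and function names are arbitrary): each round rehashes a node's color together with the sorted multiset of its neighbors' colors, and two graphs are deemed distinguishable if their color histograms differ.

```python
from collections import Counter

def wl_colors(adj, num_iters=3):
    """Run 1-WL color refinement on a graph given as an adjacency list
    (dict: node -> list of neighbors). Returns the final color histogram."""
    colors = {v: 0 for v in adj}  # uniform initial coloring
    for _ in range(num_iters):
        # New color = hash of own color plus sorted multiset of neighbor colors.
        colors = {
            v: hash((colors[v], tuple(sorted(colors[u] for u in adj[v]))))
            for v in adj
        }
    return Counter(colors.values())

def wl_distinguishes(adj1, adj2, num_iters=3):
    """True if 1-WL certifies the two graphs as non-isomorphic."""
    return wl_colors(adj1, num_iters) != wl_colors(adj2, num_iters)

# Classic failure case: a 6-cycle vs. two disjoint triangles. Both are
# 2-regular, so 1-WL never separates them.
cycle6 = {i: [(i - 1) % 6, (i + 1) % 6] for i in range(6)}
two_triangles = {0: [1, 2], 1: [0, 2], 2: [0, 1],
                 3: [4, 5], 4: [3, 5], 5: [3, 4]}
print(wl_distinguishes(cycle6, two_triangles))  # False
```

A real implementation would compare histograms after every round and use collision-free relabeling instead of `hash`, but the toy version already exhibits the classic failure case.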
Related papers
- Generalization Limits of Graph Neural Networks in Identity Effects Learning [12.302336258860116]
Graph Neural Networks (GNNs) have emerged as a powerful tool for data-driven learning on various graph domains.
We establish new generalization properties and fundamental limits of GNNs in the context of learning so-called identity effects.
Our study is motivated by the need to understand the capabilities of GNNs when performing simple cognitive tasks.
arXiv Detail & Related papers (2023-06-30T20:56:38Z)
- Fine-grained Expressivity of Graph Neural Networks [15.766353435658043]
We consider continuous extensions of both $1$-WL and MPNNs to graphons.
We show that the continuous variant of $1$-WL delivers an accurate topological characterization of the expressive power of MPNNs on graphons.
We also evaluate different MPNN architectures based on their ability to preserve graph distances.
arXiv Detail & Related papers (2023-06-06T14:12:23Z)
- Three iterations of $(d-1)$-WL test distinguish non-isometric clouds of $d$-dimensional points [58.9648095506484]
We study when the Weisfeiler-Lehman test is complete for clouds of Euclidean points represented by complete distance graphs.
Our main result states that the $(d-1)$-dimensional WL test is complete for point clouds in $d$-dimensional Euclidean space, for any $d \ge 2$, and that only three iterations of the test suffice; a toy sketch of the distance-graph encoding follows the entry.
arXiv Detail & Related papers (2023-03-22T18:23:24Z)
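As a rough illustration of the setup (our simplification: the $d = 2$ case, where $(d-1)$-WL is 1-WL, with edge labels folded into the hash; function names are made up), a point cloud becomes a complete graph whose edges carry pairwise Euclidean distances:

```python
import math
from collections import Counter

def distance_graph(points, decimals=6):
    """Complete distance graph: edge (i, j) labeled with the rounded
    Euclidean distance between points i and j."""
    n = len(points)
    return {(i, j): round(math.dist(points[i], points[j]), decimals)
            for i in range(n) for j in range(n) if i != j}

def wl_on_point_cloud(points, num_iters=3):
    """1-WL refinement that pairs each neighbor's color with the
    connecting edge's distance label."""
    n = len(points)
    edges = distance_graph(points)
    colors = {i: 0 for i in range(n)}
    for _ in range(num_iters):
        colors = {
            i: hash((colors[i], tuple(sorted(
                (edges[(i, j)], colors[j]) for j in range(n) if j != i))))
            for i in range(n)
        }
    return Counter(colors.values())

# Two congruent squares (one translated) yield identical color histograms,
# as they should: they are isometric point clouds.
square = [(0.0, 0.0), (1.0, 0.0), (1.0, 1.0), (0.0, 1.0)]
shifted = [(5.0, 5.0), (6.0, 5.0), (6.0, 6.0), (5.0, 6.0)]
print(wl_on_point_cloud(square) == wl_on_point_cloud(shifted))  # True
```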
- Probing Graph Representations [77.7361299039905]
We use a probing framework to quantify the amount of meaningful information captured in graph representations.
Our findings on molecular datasets show the potential of probing for understanding the inductive biases of graph-based models.
We advocate for probing as a useful diagnostic tool for evaluating graph-based models; a minimal probing sketch follows the entry.
arXiv Detail & Related papers (2023-03-07T14:58:18Z)
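Here is a minimal probing sketch (ours, with random stand-in arrays; an actual study would use frozen GNN embeddings and real labels such as molecular properties): fit a linear classifier on fixed representations and read its cross-validated accuracy as evidence of what they encode.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

# Stand-ins: `embeddings` would hold frozen graph-level representations from
# any GNN, and `labels` a property of interest. Random data keeps this runnable.
rng = np.random.default_rng(0)
embeddings = rng.normal(size=(200, 64))   # 200 graphs, 64-dim representations
labels = rng.integers(0, 2, size=200)     # hypothetical binary property

# The probe: a simple linear classifier trained on top of frozen embeddings.
# High accuracy suggests the property is linearly decodable from the
# representations; chance-level accuracy suggests it was not captured.
probe = LogisticRegression(max_iter=1000)
scores = cross_val_score(probe, embeddings, labels, cv=5)
print(f"probe accuracy: {scores.mean():.2f} +/- {scores.std():.2f}")
```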
- Rethinking the Expressive Power of GNNs via Graph Biconnectivity [45.4674360883544]
We introduce a novel class of expressivity metrics via graph biconnectivity and highlight their importance in both theory and practice.
We introduce a principled and more efficient approach, called the Generalized Distance Weisfeiler-Lehman (GD-WL); a rough sketch of the distance-based refinement follows the entry.
Practically, we show GD-WL can be implemented by a Transformer-like architecture that preserves expressiveness and enjoys full parallelizability.
arXiv Detail & Related papers (2023-01-23T15:58:59Z)
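Below is a rough sketch of distance-based refinement in the spirit of GD-WL (our simplification using unweighted shortest-path distances; not the authors' Transformer-based implementation): each node hashes the multiset of (distance, color) pairs over all nodes rather than only its neighbors.

```python
from collections import Counter

def shortest_path_distances(adj):
    """All-pairs shortest-path distances via BFS on an unweighted graph."""
    dist = {}
    for s in adj:
        d = {s: 0}
        frontier = [s]
        while frontier:
            nxt = []
            for v in frontier:
                for u in adj[v]:
                    if u not in d:
                        d[u] = d[v] + 1
                        nxt.append(u)
            frontier = nxt
        dist[s] = d
    return dist

def gd_wl_colors(adj, num_iters=3):
    """Distance-based refinement: each node hashes the multiset of
    (distance, color) pairs over *all* nodes, not just its neighbors.
    Unreachable pairs get distance infinity."""
    dist = shortest_path_distances(adj)
    colors = {v: 0 for v in adj}
    for _ in range(num_iters):
        colors = {
            v: hash(tuple(sorted(
                (dist[v].get(u, float("inf")), colors[u]) for u in adj)))
            for v in adj
        }
    return Counter(colors.values())

# The 6-cycle vs. two triangles from the first sketch: plain 1-WL fails,
# but the distance profiles differ (the triangles are disconnected), so
# this distance-aware refinement tells them apart.
cycle6 = {i: [(i - 1) % 6, (i + 1) % 6] for i in range(6)}
two_triangles = {0: [1, 2], 1: [0, 2], 2: [0, 1],
                 3: [4, 5], 4: [3, 5], 5: [3, 4]}
print(gd_wl_colors(cycle6) == gd_wl_colors(two_triangles))  # False
```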
- Representation Power of Graph Neural Networks: Improved Expressivity via Algebraic Analysis [124.97061497512804]
We show that standard Graph Neural Networks (GNNs) produce more discriminative representations than the Weisfeiler-Lehman (WL) algorithm.
We also show that simple convolutional architectures with white inputs produce equivariant features that count the closed paths in the graph.
arXiv Detail & Related papers (2022-05-19T18:40:25Z)
- Improving Graph Neural Network Expressivity via Subgraph Isomorphism Counting [63.04999833264299]
"Graph Substructure Networks" (GSN) is a topologically-aware message passing scheme based on substructure encoding.
We show that it is strictly more expressive than the Weisfeiler-Leman (WL) graph isomorphism test; a toy substructure-counting sketch follows the entry.
We perform an extensive evaluation on graph classification and regression tasks and obtain state-of-the-art results in diverse real-world settings.
arXiv Detail & Related papers (2020-06-16T15:30:31Z)
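To illustrate the substructure-counting idea (a toy sketch, not the GSN implementation): precompute per-node triangle counts and expose them as extra node features; this alone separates the 1-WL-indistinguishable pair from the first sketch.

```python
from itertools import combinations

def triangle_counts(adj):
    """Per-node triangle counts -- the simplest substructure feature.
    GSN-style models append such counts to node features so that message
    passing can exploit structure invisible to plain 1-WL."""
    counts = {v: 0 for v in adj}
    for v in adj:
        for u, w in combinations(adj[v], 2):
            if w in adj[u]:  # u and w are adjacent: v, u, w form a triangle
                counts[v] += 1
    return counts

# 1-WL cannot separate a 6-cycle from two triangles; triangle counts can.
cycle6 = {i: [(i - 1) % 6, (i + 1) % 6] for i in range(6)}
two_triangles = {0: [1, 2], 1: [0, 2], 2: [0, 1],
                 3: [4, 5], 4: [3, 5], 5: [3, 4]}
print(triangle_counts(cycle6))         # all zeros
print(triangle_counts(two_triangles))  # all ones
```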