WL meet VC
- URL: http://arxiv.org/abs/2301.11039v3
- Date: Tue, 30 May 2023 12:17:30 GMT
- Title: WL meet VC
- Authors: Christopher Morris, Floris Geerts, Jan Tönshoff, Martin Grohe
- Abstract summary: We study the expressive power of graph neural networks (GNNs) through the lens of Vapnik--Chervonenkis (VC) dimension theory.
We derive an upper bound for GNNs' VC dimension using the number of colors produced by the $1\text{-}\mathsf{WL}$, and show a tight connection between the number of graphs distinguishable by the $1\text{-}\mathsf{WL}$ and GNNs' VC dimension.
- Score: 9.468816647649104
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Recently, many works studied the expressive power of graph neural networks
(GNNs) by linking it to the $1$-dimensional Weisfeiler--Leman algorithm
($1\text{-}\mathsf{WL}$). Here, the $1\text{-}\mathsf{WL}$ is a well-studied
heuristic for the graph isomorphism problem, which iteratively colors or
partitions a graph's vertex set. While this connection has led to significant
advances in understanding and enhancing GNNs' expressive power, it does not
provide insights into their generalization performance, i.e., their ability to
make meaningful predictions beyond the training set. In this paper, we study
GNNs' generalization ability through the lens of Vapnik--Chervonenkis (VC)
dimension theory in two settings, focusing on graph-level predictions. First,
when no upper bound on the graphs' order is known, we show that the bitlength
of GNNs' weights tightly bounds their VC dimension. Further, we derive an upper
bound for GNNs' VC dimension using the number of colors produced by the
$1\text{-}\mathsf{WL}$. Secondly, when an upper bound on the graphs' order is
known, we show a tight connection between the number of graphs distinguishable
by the $1\text{-}\mathsf{WL}$ and GNNs' VC dimension. Our empirical study
confirms the validity of our theoretical findings.
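As a minimal sketch (not code from the paper), the following Python snippet illustrates the $1\text{-}\mathsf{WL}$ color refinement the abstract refers to: in each round, a vertex's color is combined with the multiset of its neighbours' colors. The function name `wl_colors`, the adjacency-dict representation, and the fixed number of rounds are illustrative assumptions, not details taken from the paper.

```python
from collections import Counter

def wl_colors(adjacency, num_rounds=3):
    """One-dimensional Weisfeiler--Leman color refinement (minimal sketch).

    adjacency: dict mapping each vertex to an iterable of its neighbours.
    Returns the multiset of vertex colors after `num_rounds` refinement steps.
    Different multisets certify non-isomorphism; equal multisets are
    inconclusive, since 1-WL cannot distinguish all non-isomorphic graphs.
    """
    # Start from a uniform coloring; initial vertex labels could be used instead.
    colors = {v: 0 for v in adjacency}
    for _ in range(num_rounds):
        # A vertex's new color is its old color together with the sorted multiset
        # of its neighbours' colors.  Practical implementations compress these
        # signatures to small integers after every round; the nested tuples are
        # kept here so colors stay directly comparable across graphs.
        colors = {
            v: (colors[v], tuple(sorted(colors[u] for u in neighbours)))
            for v, neighbours in adjacency.items()
        }
    return Counter(colors.values())

# A triangle and a path on three vertices receive different color multisets,
# so 1-WL certifies that they are non-isomorphic.
triangle = {0: [1, 2], 1: [0, 2], 2: [0, 1]}
path = {0: [1], 1: [0, 2], 2: [1]}
assert wl_colors(triangle) != wl_colors(path)
```

The number of distinct colors such a refinement produces is, per the abstract, the quantity used to upper-bound GNNs' VC dimension when no bound on the graphs' order is known; the bounded-order setting instead counts how many graphs the $1\text{-}\mathsf{WL}$ can distinguish.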
Related papers
- A note on the VC dimension of 1-dimensional GNNs [6.0757501646966965]
Graph Neural Networks (GNNs) have become an essential tool for analyzing graph-structured data.
This paper focuses on the generalization of GNNs by investigating their Vapnik-Chervonenkis (VC) dimension.
arXiv Detail & Related papers (2024-10-10T11:33:15Z) - A Manifold Perspective on the Statistical Generalization of Graph Neural Networks [84.01980526069075]
We take a manifold perspective to establish the statistical generalization theory of GNNs on graphs sampled from a manifold in the spectral domain.
We prove that the generalization bounds of GNNs decrease linearly with the size of the graphs in the logarithmic scale, and increase linearly with the spectral continuity constants of the filter functions.
arXiv Detail & Related papers (2024-06-07T19:25:02Z) - VC dimension of Graph Neural Networks with Pfaffian activation functions [4.141514895639094]
Graph Neural Networks (GNNs) have emerged in recent years as a powerful tool to learn tasks across a wide range of graph domains.
The aim of our work is to extend existing analyses of the VC dimension of GNNs to other commonly used activation functions, such as the sigmoid and hyperbolic tangent.
arXiv Detail & Related papers (2024-01-22T21:11:22Z) - Uplifting the Expressive Power of Graph Neural Networks through Graph Partitioning [3.236774847052122]
We study the expressive power of graph neural networks through the lens of graph partitioning.
We introduce a novel GNN architecture, namely Graph Partitioning Neural Networks (GPNNs).
arXiv Detail & Related papers (2023-12-14T06:08:35Z) - Generalization Limits of Graph Neural Networks in Identity Effects Learning [12.302336258860116]
Graph Neural Networks (GNNs) have emerged as a powerful tool for data-driven learning on various graph domains.
We establish new generalization properties and fundamental limits of GNNs in the context of learning so-called identity effects.
Our study is motivated by the need to understand the capabilities of GNNs when performing simple cognitive tasks.
arXiv Detail & Related papers (2023-06-30T20:56:38Z) - Training Graph Neural Networks on Growing Stochastic Graphs [114.75710379125412]
Graph Neural Networks (GNNs) rely on graph convolutions to exploit meaningful patterns in networked data.
We propose to learn GNNs on very large graphs by leveraging the limit object of a sequence of growing graphs, the graphon.
arXiv Detail & Related papers (2022-10-27T16:00:45Z) - Two-Dimensional Weisfeiler-Lehman Graph Neural Networks for Link Prediction [61.01337335214126]
Link prediction is an important application of graph neural networks (GNNs).
Most existing GNNs for link prediction are based on the one-dimensional Weisfeiler-Lehman (1-WL) test.
We study a completely different approach which can directly obtain node pair (link) representations based on two-dimensional Weisfeiler-Lehman (2-WL) tests.
arXiv Detail & Related papers (2022-06-20T04:50:38Z) - Twin Weisfeiler-Lehman: High Expressive GNNs for Graph Classification [48.087302573188396]
We propose a novel graph isomorphism test method, namely Twin-WL, which simultaneously passes node labels and node identities.
We prove that the two resulting Twin-GNNs both have higher expressive power than traditional message-passing GNNs.
arXiv Detail & Related papers (2022-03-22T12:58:03Z) - Training Free Graph Neural Networks for Graph Matching [103.45755859119035]
TFGM is a framework to boost the performance of Graph Neural Network (GNN)-based graph matching without training.
Applying TFGM on various GNNs shows promising improvements over baselines.
arXiv Detail & Related papers (2022-01-14T09:04:46Z) - Distance Encoding: Design Provably More Powerful Neural Networks for Graph Representation Learning [63.97983530843762]
Graph Neural Networks (GNNs) have achieved great success in graph representation learning.
GNNs generate identical representations for graph substructures that may in fact be very different.
More powerful GNNs, proposed recently by mimicking higher-order tests, are inefficient as they cannot leverage the sparsity of the underlying graph structure.
We propose Distance Encoding (DE) as a new class of structural features for graph representation learning.
arXiv Detail & Related papers (2020-08-31T23:15:40Z) - Let's Agree to Degree: Comparing Graph Convolutional Networks in the Message-Passing Framework [5.835421173589977]
We cast neural networks defined on graphs as message-passing neural networks (MPNNs).
We consider two variants of MPNNs: anonymous MPNNs and degree-aware MPNNs.
We obtain lower and upper bounds on the distinguishing power of MPNNs in terms of the distinguishing power of the Weisfeiler-Lehman (WL) algorithm.
arXiv Detail & Related papers (2020-04-06T12:14:00Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
The site does not guarantee the quality of this information and is not responsible for any consequences arising from its use.