Higher-Order Graphon Neural Networks: Approximation and Cut Distance
- URL: http://arxiv.org/abs/2503.14338v1
- Date: Tue, 18 Mar 2025 15:14:49 GMT
- Title: Higher-Order Graphon Neural Networks: Approximation and Cut Distance
- Authors: Daniel Herbst, Stefanie Jegelka
- Abstract summary: We extend the $k$-WL test for graphons to the graphon-signal space and introduce signal-weighted homomorphism densities as a key tool. As an exemplary focus, we generalize Invariant Graph Networks (IGNs) to graphons, proposing Invariant Graphon Networks (IWNs). We show that IWNs of order $k$ are at least as powerful as the $k$-WL test, and we establish universal approximation results for graphon-signals in $L^p$ distances.
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Graph limit models, like graphons for limits of dense graphs, have recently been used to study size transferability of graph neural networks (GNNs). While most literature focuses on message passing GNNs (MPNNs), in this work we attend to the more powerful higher-order GNNs. First, we extend the $k$-WL test for graphons (Böker, 2023) to the graphon-signal space and introduce signal-weighted homomorphism densities as a key tool. As an exemplary focus, we generalize Invariant Graph Networks (IGNs) to graphons, proposing Invariant Graphon Networks (IWNs) defined via a subset of the IGN basis corresponding to bounded linear operators. Even with this restricted basis, we show that IWNs of order $k$ are at least as powerful as the $k$-WL test, and we establish universal approximation results for graphon-signals in $L^p$ distances. This significantly extends the prior work of Cai & Wang (2022), showing that IWNs--a subset of their IGN-small--retain effectively the same expressivity as the full IGN basis in the limit. In contrast to their approach, our blueprint of IWNs also aligns better with the geometry of graphon space, for example facilitating comparability to MPNNs. We highlight that, while typical higher-order GNNs are discontinuous w.r.t. cut distance--which causes their lack of convergence and is inherently tied to the definition of $k$-WL--their transferability remains comparable to MPNNs.
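For reference, the classical (unweighted) homomorphism density and cut distance that the paper's signal-weighted variants generalize are standard notions from graph limit theory:

```latex
% Homomorphism density of a simple graph F in a graphon W : [0,1]^2 -> [0,1]
t(F, W) = \int_{[0,1]^{V(F)}} \prod_{\{i,j\} \in E(F)} W(x_i, x_j) \prod_{i \in V(F)} \mathrm{d}x_i

% Cut norm, and cut distance as an infimum over measure-preserving
% bijections \varphi of [0,1]
\|W\|_{\square} = \sup_{S, T \subseteq [0,1]} \left| \int_{S \times T} W(x, y)\,\mathrm{d}x\,\mathrm{d}y \right|,
\qquad
\delta_{\square}(W, W') = \inf_{\varphi} \bigl\| W - W' \circ (\varphi \times \varphi) \bigr\|_{\square}
```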
Related papers
- On the Expressive Power of Subgraph Graph Neural Networks for Graphs with Bounded Cycles [17.29046077604317]
This work investigates $k$-hop subgraph GNNs that aggregate information from neighbors at distances up to $k$; a minimal aggregation sketch follows this entry. We prove that $k$-hop subgraph GNNs can approximate any permutation-invariant/equivariant continuous function over graphs without cycles of length greater than $2k+1$ within any error tolerance.
arXiv Detail & Related papers (2025-02-06T01:25:22Z)
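A minimal dense-NumPy sketch of the $k$-hop aggregation idea from the entry above; the function name and the mean pooling are assumptions for exposition, not the paper's architecture.

```python
import numpy as np

def k_hop_aggregate(adj, x, k):
    """adj: (n, n) 0/1 adjacency; x: (n, d) node features; k: hop radius."""
    n = adj.shape[0]
    reach = np.eye(n)              # hop 0: every node reaches itself
    power = np.eye(n)
    for _ in range(k):
        power = power @ adj        # counts walks of the next length
        reach = reach + power      # nonzero iff some walk of length <= i+1
    mask = (reach > 0).astype(float)   # indicator of the k-hop neighborhood
    return (mask @ x) / mask.sum(axis=1, keepdims=True)   # mean pooling
```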
- Spatio-Spectral Graph Neural Networks [50.277959544420455]
We propose Spatio-Spectral Graph Neural Networks (S$^2$GNNs).
S$^2$GNNs combine spatially and spectrally parametrized graph filters; a toy combination is sketched after this entry.
We show that S$^2$GNNs vanquish over-squashing and yield strictly tighter approximation-theoretic error bounds than MPGNNs.
arXiv Detail & Related papers (2024-05-29T14:28:08Z)
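A toy sketch of one way to sum a spatially parametrized filter with a spectrally parametrized (low-frequency) filter; `s2_layer`, the weight shapes, and `num_eigs` are hypothetical and do not reproduce the authors' implementation.

```python
import numpy as np

def s2_layer(adj, x, w_spatial, w_spectral, num_eigs=8):
    """adj: (n, n) adjacency; x: (n, d) features; w_*: (d, d) weights."""
    lap = np.diag(adj.sum(axis=1)) - adj     # combinatorial Laplacian
    _, eigvecs = np.linalg.eigh(lap)         # eigenvalues ascending
    u = eigvecs[:, :num_eigs]                # low-frequency subspace
    spatial = adj @ x @ w_spatial            # local 1-hop aggregation
    spectral = u @ (u.T @ x) @ w_spectral    # global low-pass component
    return np.tanh(spatial + spectral)
```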
- Training Graph Neural Networks on Growing Stochastic Graphs [114.75710379125412]
Graph Neural Networks (GNNs) rely on graph convolutions to exploit meaningful patterns in networked data.
We propose to learn GNNs on very large graphs by leveraging the limit object of a sequence of growing graphs, the graphon; sampling from a graphon is sketched below.
arXiv Detail & Related papers (2022-10-27T16:00:45Z)
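The standard $W$-random graph construction underlying graphon-based training, as a short sketch; the SBM-style graphon example at the end is made up for illustration.

```python
import numpy as np

def sample_from_graphon(W, n, rng=None):
    """Draw an n-node W-random graph: u_i ~ U[0,1], edge ij ~ Bern(W(u_i, u_j))."""
    rng = rng or np.random.default_rng()
    u = rng.uniform(size=n)                          # latent positions in [0,1]
    probs = np.array([[W(a, b) for b in u] for a in u])
    upper = np.triu(rng.uniform(size=(n, n)) < probs, k=1).astype(int)
    return upper + upper.T                           # symmetric, loop-free adjacency

# Example: a two-block stochastic-block-model graphon
sbm = lambda a, b: 0.8 if (a < 0.5) == (b < 0.5) else 0.1
A = sample_from_graphon(sbm, n=200)
```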
- From Local to Global: Spectral-Inspired Graph Neural Networks [28.858773653743075]
Graph Neural Networks (GNNs) are powerful deep learning methods for non-Euclidean data.
MPNNs are message-passing algorithms that aggregate and combine signals in a local graph neighborhood; a generic layer is sketched after this entry.
MPNNs can suffer from issues like over-smoothing or over-squashing.
arXiv Detail & Related papers (2022-09-24T17:19:00Z)
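A generic aggregate-and-combine message-passing layer; this minimal sketch (sum aggregation, ReLU combine, hypothetical names `w_self`, `w_neigh`) is an assumption, not a specific model from the paper.

```python
import numpy as np

def mpnn_layer(adj, x, w_self, w_neigh):
    """One aggregate/combine step: sum neighbor features, mix with own state."""
    msgs = adj @ x                                        # aggregate over 1-hop neighbors
    return np.maximum(x @ w_self + msgs @ w_neigh, 0.0)  # combine + ReLU
```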
- Convergence of Invariant Graph Networks [13.008323851750442]
In this paper, we investigate the convergence of one powerful GNN, the Invariant Graph Network (IGN), over graphs sampled from graphons.
We show that IGN-small still contains a function class rich enough to approximate spectral GNNs arbitrarily well.
arXiv Detail & Related papers (2022-01-25T07:02:58Z)
- Transferability Properties of Graph Neural Networks [125.71771240180654]
Graph neural networks (GNNs) are provably successful at learning representations from data supported on moderate-scale graphs.
We study the problem of training GNNs on graphs of moderate size and transferring them to large-scale graphs.
Our results show that (i) the transference error decreases with the graph size, and (ii) graph filters have a transferability-discriminability tradeoff that in GNNs is alleviated by the scattering behavior of the nonlinearity.
arXiv Detail & Related papers (2021-12-09T00:08:09Z)
- Graph Neural Networks with Local Graph Parameters [1.8600631687568656]
Local graph parameters can be added to any Graph Neural Network (GNN) architecture; a toy example follows this entry.
Our results connect GNNs with deep results in finite model theory and finite variable logics.
arXiv Detail & Related papers (2021-06-12T07:43:51Z)
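One concrete local graph parameter is the per-node triangle count; the sketch below appends it as an extra feature column before any standard GNN layer (the helper name is hypothetical).

```python
import numpy as np

def add_triangle_counts(adj, x):
    """Append per-node triangle counts: diag(A^3) counts closed 3-walks,
    and each triangle through a node contributes two of them."""
    tri = np.diag(adj @ adj @ adj) / 2.0
    return np.concatenate([x, tri[:, None]], axis=1)
```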
- A Unified Lottery Ticket Hypothesis for Graph Neural Networks [82.31087406264437]
We present a unified GNN sparsification (UGS) framework that simultaneously prunes the graph adjacency matrix and the model weights; a toy pruning recipe is sketched after this entry.
We further generalize the popular lottery ticket hypothesis to GNNs for the first time, by defining a graph lottery ticket (GLT) as a pair of a core sub-dataset and a sparse sub-network.
arXiv Detail & Related papers (2021-02-12T21:52:43Z)
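A toy score-based pruning recipe in the spirit of UGS's joint graph-and-weight sparsification; in UGS the masks are trained end to end, so the random edge scores and magnitude-based weight scores here are placeholders.

```python
import numpy as np

rng = np.random.default_rng(0)
adj = (rng.uniform(size=(100, 100)) < 0.1).astype(float)  # toy graph
w = rng.normal(size=(16, 16))                             # toy GNN weight

def prune_by_score(values, scores, sparsity):
    """Zero out the `sparsity` fraction of entries with the lowest scores."""
    thresh = np.quantile(scores, sparsity)
    return values * (scores >= thresh)

# Same recipe applied jointly to the graph and to the model weights.
sparse_adj = prune_by_score(adj, rng.uniform(size=adj.shape), 0.2)
sparse_w = prune_by_score(w, np.abs(w), 0.5)
```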
- Distance Encoding: Design Provably More Powerful Neural Networks for Graph Representation Learning [63.97983530843762]
Graph Neural Networks (GNNs) have achieved great success in graph representation learning.
GNNs generate identical representations for graph substructures that may in fact be very different.
More powerful GNNs, proposed recently by mimicking higher-order tests, are inefficient as they cannot exploit the sparsity of the underlying graph structure.
We propose Distance Encoding (DE) as a new class of techniques for graph representation learning; a simple shortest-path-distance encoding is sketched below.
arXiv Detail & Related papers (2020-08-31T23:15:40Z)
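The simplest distance-encoding feature is the shortest-path distance to a target node set; the multi-source BFS below is a sketch, while DE in the paper covers more general distance measures.

```python
import numpy as np
from collections import deque

def spd_encoding(adj, targets):
    """Shortest-path distance from every node to the target set (multi-source BFS)."""
    n = adj.shape[0]
    dist = np.full(n, np.inf)
    dist[list(targets)] = 0.0
    q = deque(targets)
    while q:
        v = q.popleft()
        for u in np.nonzero(adj[v])[0]:
            if dist[u] == np.inf:          # first visit = shortest distance
                dist[u] = dist[v] + 1
                q.append(u)
    return dist    # append as an extra feature column, e.g. dist[:, None]
```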
- Graph Neural Networks: Architectures, Stability and Transferability [176.3960927323358]
Graph Neural Networks (GNNs) are information processing architectures for signals supported on graphs.
They are generalizations of convolutional neural networks (CNNs) in which individual layers contain banks of graph convolutional filters; a polynomial filter bank is sketched after this entry.
arXiv Detail & Related papers (2020-08-04T18:57:36Z)
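A bank of polynomial graph convolutional filters in a graph shift operator $S$ (adjacency or Laplacian), the graph analogue of a CNN filter bank; `graph_filter_bank` is an illustrative sketch.

```python
import numpy as np

def graph_filter_bank(S, x, coeffs):
    """Apply F polynomial filters: output f is sum_k coeffs[f, k] * S^k @ x."""
    outputs = []
    for taps in coeffs:              # each row = one filter's taps h_0..h_{K-1}
        z, y = x.copy(), np.zeros_like(x)
        for h in taps:
            y = y + h * z            # accumulate h_k * S^k x
            z = S @ z                # advance to the next power of S
        outputs.append(y)
    return np.stack(outputs)         # shape (F, n, d)
```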
- Let's Agree to Degree: Comparing Graph Convolutional Networks in the Message-Passing Framework [5.835421173589977]
We cast neural networks defined on graphs as message-passing neural networks (MPNNs).
We consider two variants of MPNNs: anonymous MPNNs and degree-aware MPNNs.
We obtain lower and upper bounds on the distinguishing power of MPNNs in terms of the distinguishing power of the Weisfeiler-Lehman (WL) algorithm; a minimal 1-WL refinement is sketched below.
arXiv Detail & Related papers (2020-04-06T12:14:00Z)
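For context, a minimal 1-WL color refinement, the procedure that bounds the distinguishing power of anonymous MPNNs; this sketch assumes a uniform initial coloring and a fixed number of rounds.

```python
import numpy as np

def wl_colors(adj, rounds=3):
    """1-WL color refinement with a uniform initial coloring."""
    n = adj.shape[0]
    colors = [0] * n
    for _ in range(rounds):
        # signature = own color plus sorted multiset of neighbor colors
        sigs = [(colors[v], tuple(sorted(colors[u] for u in np.nonzero(adj[v])[0])))
                for v in range(n)]
        relabel = {s: i for i, s in enumerate(sorted(set(sigs)))}
        colors = [relabel[s] for s in sigs]
    return colors  # differing color histograms certify non-isomorphism
```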