A Graphop Analysis of Graph Neural Networks on Sparse Graphs: Generalization and Universal Approximation
- URL: http://arxiv.org/abs/2602.08785v1
- Date: Mon, 09 Feb 2026 15:25:36 GMT
- Title: A Graphop Analysis of Graph Neural Networks on Sparse Graphs: Generalization and Universal Approximation
- Authors: Ofek Amran, Tom Gilat, Ron Levie
- Abstract summary: Generalization and approximation capabilities of message passing graph neural networks (MPNNs) are studied by defining a compact metric on a space of input graphs under which MPNNs are Hölder continuous. We present a unified approach, defining a compact metric on the space of graphs of all sizes, both sparse and dense, under which MPNNs are Hölder continuous.
- Score: 13.843101157271034
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Generalization and approximation capabilities of message passing graph neural networks (MPNNs) are often studied by defining a compact metric on a space of input graphs under which MPNNs are Hölder continuous. Such analyses are of two varieties: 1) when the metric space includes graphs of unbounded sizes, the theory is only appropriate for dense graphs, and 2) when studying sparse graphs, the metric space only includes graphs of uniformly bounded size. In this work, we present a unified approach, defining a compact metric on the space of graphs of all sizes, both sparse and dense, under which MPNNs are Hölder continuous. This leads to more powerful universal approximation theorems and generalization bounds than previous works. The theory is based on, and extends, a recent approach to graph limit theory called graphop analysis.
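To make the central property concrete, here is the Hölder continuity condition in our own notation (not the paper's):

```latex
% Hölder continuity of an MPNN \Phi on a graph metric space (\mathcal{X}, d):
% there exist C > 0 and \alpha \in (0, 1] such that, for all G, G' \in \mathcal{X},
\[
  \bigl\| \Phi(G) - \Phi(G') \bigr\| \le C \, d(G, G')^{\alpha}.
\]
% \alpha = 1 is the Lipschitz case familiar from the graphon-signal papers below.
% On a compact \mathcal{X}, such continuity is what drives both uniform
% approximation arguments and covering-number generalization bounds.
```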
Related papers
- A Note on Graphon-Signal Analysis of Graph Neural Networks [16.990615110732524]
We introduce several refinements and extensions that address shortcomings of existing graphon-signal results. We improve the generalization bound by utilizing robustness-type generalization bounds. We extend the analysis to non-symmetric graphons and kernels.
arXiv Detail & Related papers (2025-08-25T23:46:02Z) - Generalization, Expressivity, and Universality of Graph Neural Networks on Attributed Graphs [53.27010448621372]
We analyze the universality and generalization of graph neural networks (GNNs) on graphs with node attributes. We prove a universal approximation theorem for GNNs and bound the generalization error of GNNs on any data distribution of attributed graphs. Our work extends and unites previous approaches, which either derived theory only for graphs with no attributes, derived compact metrics under which GNNs are continuous but without separation power, or derived metrics under which GNNs are continuous and separate points.
arXiv Detail & Related papers (2024-11-08T10:34:24Z) - Generalization of Geometric Graph Neural Networks with Lipschitz Loss Functions [84.01980526069075]
We study the generalization capabilities of geometric graph neural networks (GNNs). We prove a bound on the generalization gap between the optimal empirical risk and the optimal statistical risk of such a GNN. We verify this theoretical result with experiments on multiple real-world datasets.
arXiv Detail & Related papers (2024-09-08T18:55:57Z) - A Manifold Perspective on the Statistical Generalization of Graph Neural Networks [84.01980526069075]
We take a manifold perspective to establish the statistical generalization theory of GNNs on graphs sampled from a manifold in the spectral domain. We prove that the generalization bounds of GNNs decrease linearly with graph size on a logarithmic scale, and increase linearly with the spectral continuity constants of the filter functions.
arXiv Detail & Related papers (2024-06-07T19:25:02Z) - Generalization Bounds for Message Passing Networks on Mixture of Graphons [21.457225542391267]
We study the generalization capabilities of Message Passing Neural Networks (MPNNs).
We derive generalization bounds specifically for MPNNs with normalized sum aggregation and mean aggregation.
Our results imply that MPNNs with higher complexity than the size of the training set can still generalize effectively.
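To make the two aggregation schemes concrete, here is a minimal NumPy sketch of a single message-passing layer; `mpnn_layer` and its signature are our own illustration, not the paper's construction:

```python
import numpy as np

def mpnn_layer(A, X, W, aggregation="mean"):
    """One message-passing layer: aggregate neighbour features, then apply
    a shared linear map and a pointwise nonlinearity.
    A: (n, n) adjacency, X: (n, d) node features, W: (d, d_out) weights."""
    if aggregation == "mean":
        deg = A.sum(axis=1, keepdims=True)     # (n, 1) node degrees
        M = (A @ X) / np.maximum(deg, 1.0)     # average over each node's neighbours
    elif aggregation == "normalized_sum":
        M = (A @ X) / A.shape[0]               # neighbour sum, scaled by graph size n
    else:
        raise ValueError(f"unknown aggregation: {aggregation}")
    return np.tanh(M @ W)

# e.g. on a 2-node path graph:
# A = np.array([[0., 1.], [1., 0.]]); X = np.ones((2, 3)); W = np.eye(3)
# H = mpnn_layer(A, X, W, aggregation="normalized_sum")
```

The difference is only in the normalizer: mean aggregation divides each neighbour sum by the node's degree, while normalized sum aggregation divides by the number of nodes, which keeps the operator bounded as graphs grow.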
arXiv Detail & Related papers (2024-04-04T14:26:47Z) - Limits, approximation and size transferability for GNNs on sparse graphs via graphops [44.02161831977037]
We take the perspective of studying limits of operators derived from graphs, such as the aggregation operations that make up GNNs.
Our results hold for dense and sparse graphs, and various notions of graph limits.
arXiv Detail & Related papers (2023-06-07T15:04:58Z) - A graphon-signal analysis of graph neural networks [8.423229632785025]
We present an approach for analyzing message passing graph neural networks (MPNNs) based on an extension of graphon analysis to a so-called graphon-signal analysis.
We prove that MPNNs are Lipschitz continuous functions over the graphon-signal metric space.
arXiv Detail & Related papers (2023-05-25T12:27:35Z) - Tight and fast generalization error bound of graph embedding in metric space [54.279425319381374]
We show that graph embedding in non-Euclidean metric space can outperform that in Euclidean space with much smaller training data than the existing bound has suggested.
Our new upper bound is significantly tighter and faster than the existing one, which can be exponential in $R$ and $O(\frac{1}{S})$ at the fastest.
arXiv Detail & Related papers (2023-05-13T17:29:18Z) - Graphon Pooling for Reducing Dimensionality of Signals and Convolutional Operators on Graphs [131.53471236405628]
We present three methods that exploit the induced graphon representation of graphs and graph signals on partitions of $[0, 1]^2$ in the graphon space.
We prove that those low dimensional representations constitute a convergent sequence of graphs and graph signals.
We observe that graphon pooling performs significantly better than other approaches proposed in the literature when dimensionality reduction ratios between layers are large.
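As a rough illustration of the block-averaging idea behind an induced graphon representation (a sketch under our own assumptions; `graphon_pool` and the equipartition of $[0, 1]$ are ours, not one of the paper's three methods verbatim):

```python
import numpy as np

def graphon_pool(A, X, k):
    """Pool an n-node graph and signal to k super-nodes by block-averaging
    over an equipartition of [0, 1] into k intervals (step-graphon view).
    A: (n, n) adjacency, X: (n, d) graph signal; requires k <= n."""
    n = A.shape[0]
    blocks = (np.arange(n) * k) // n           # node i -> interval floor(i*k/n)
    P = np.zeros((k, n))
    for b in range(k):
        members = np.flatnonzero(blocks == b)
        P[b, members] = 1.0 / len(members)     # per-block averaging operator
    A_pool = P @ A @ P.T                       # (k, k) block-averaged adjacency
    X_pool = P @ X                             # (k, d) block-averaged signal
    return A_pool, X_pool
```

Block averages of this kind are what the summary's convergence claim is about: as the partition is refined, the pooled graphs and signals approach the original in the graphon sense.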
arXiv Detail & Related papers (2022-12-15T22:11:34Z) - Distance Encoding: Design Provably More Powerful Neural Networks for Graph Representation Learning [63.97983530843762]
Graph Neural Networks (GNNs) have achieved great success in graph representation learning.
However, GNNs generate identical representations for graph substructures that may in fact be very different.
More powerful GNNs, proposed recently by mimicking higher-order Weisfeiler-Leman tests, are inefficient as they cannot leverage the sparsity of the underlying graph structure.
We propose Distance Encoding (DE), a new class of structural features for graph representation learning.
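One simple flavour of distance-based structural features is the shortest-path hop distance from each node to a target node set, computed with a multi-source BFS; this sketch is our illustration and not necessarily the paper's exact DE definition, which also covers other distance measures:

```python
from collections import deque

def distance_features(adj, targets):
    """Multi-source BFS: shortest-path hop distance from every node to a
    target node set, usable as an extra node feature.
    adj: dict mapping each node to a list of neighbours."""
    dist = {v: float("inf") for v in adj}      # inf = unreachable from targets
    queue = deque()
    for t in targets:
        dist[t] = 0
        queue.append(t)
    while queue:
        u = queue.popleft()
        for v in adj[u]:
            if dist[v] == float("inf"):
                dist[v] = dist[u] + 1
                queue.append(v)
    return dist

# e.g. distance_features({0: [1], 1: [0, 2], 2: [1]}, targets=[0])
#      -> {0: 0, 1: 1, 2: 2}
```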
arXiv Detail & Related papers (2020-08-31T23:15:40Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.