Limitless stability for Graph Convolutional Networks
- URL: http://arxiv.org/abs/2301.11443v3
- Date: Sat, 30 Sep 2023 15:24:11 GMT
- Title: Limitless stability for Graph Convolutional Networks
- Authors: Christian Koke
- Abstract summary: This work establishes rigorous, novel and widely applicable stability guarantees and transferability bounds for graph convolutional networks.
It is showcased that graph convolutional networks are stable under graph-coarse-graining procedures precisely if the GSO is the graph Laplacian and filters are regular at infinity.
- Score: 8.1585306387285
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: This work establishes rigorous, novel and widely applicable stability
guarantees and transferability bounds for graph convolutional networks --
without reference to any underlying limit object or statistical distribution.
Crucially, the graph-shift operators (GSOs) used are not necessarily assumed to
be normal, allowing the treatment of networks on undirected graphs and, for the
first time, also on directed graphs. Stability to node-level perturbations is
related to an 'adequate (spectral) covering' property of the filters in each
layer. Stability to edge-level perturbations is related to Lipschitz constants
and newly introduced semi-norms of filters. Results on stability to topological
perturbations are obtained through recently developed mathematical-physics
based tools. As an important and novel example, it is showcased that graph
convolutional networks are stable under graph-coarse-graining procedures
(replacing strongly-connected sub-graphs by single nodes) precisely if the GSO
is the graph Laplacian and filters are regular at infinity. These new
theoretical results are supported by corresponding numerical investigations.
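As an illustration of the coarse-graining result, the following NumPy sketch collapses node clusters of a graph Laplacian into super-nodes via an averaging projection; the function names and the choice of averaging projection are illustrative, not taken from the paper.

```python
import numpy as np

def graph_laplacian(A):
    """Unnormalized graph Laplacian L = D - A."""
    return np.diag(A.sum(axis=1)) - A

def coarse_grain(L, clusters):
    """Collapse each node cluster into a single super-node.

    clusters: list of index arrays partitioning the nodes. P averages
    over each cluster, and the coarsened GSO is P @ L @ P.T (one common
    choice; the paper's exact intertwining operators may differ).
    """
    P = np.zeros((len(clusters), L.shape[0]))
    for k, idx in enumerate(clusters):
        P[k, idx] = 1.0 / len(idx)
    return P @ L @ P.T

# Toy example: a 4-cycle whose connected sub-graphs {0,1} and {2,3}
# are each merged into one node.
A = np.array([[0, 1, 0, 1],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [1, 0, 1, 0]], dtype=float)
L_coarse = coarse_grain(graph_laplacian(A), [np.array([0, 1]), np.array([2, 3])])
print(L_coarse)     # again a valid 2-node graph Laplacian
```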
Related papers
- Geometric Graph Filters and Neural Networks: Limit Properties and Discriminability Trade-offs [122.06927400759021]
We study the relationship between a graph neural network (GNN) and a manifold neural network (MNN) when the graph is constructed from a set of points sampled from the manifold.
We prove non-asymptotic error bounds showing that convolutional filters and neural networks on these graphs converge to convolutional filters and neural networks on the continuous manifold.
arXiv Detail & Related papers (2023-05-29T08:27:17Z)
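A minimal sketch of the construction this paper studies: sample points from a manifold (here the unit circle), build an epsilon-neighborhood graph, and apply a polynomial graph filter; the 1/n scaling and the filter coefficients are illustrative assumptions, not the paper's exact setup.

```python
import numpy as np

def epsilon_graph(points, eps):
    """Adjacency of an epsilon-neighborhood graph on sampled points."""
    d = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=-1)
    A = (d < eps).astype(float)
    np.fill_diagonal(A, 0.0)
    return A

def poly_filter(L, x, coeffs):
    """Apply the polynomial filter h(L) x = sum_k coeffs[k] * L^k x."""
    out, v = np.zeros_like(x), x.copy()
    for c in coeffs:
        out += c * v
        v = L @ v
    return out

# Sample n points from the unit circle, a simple 1-D manifold.
rng = np.random.default_rng(0)
n = 200
theta = rng.uniform(0.0, 2.0 * np.pi, n)
pts = np.stack([np.cos(theta), np.sin(theta)], axis=1)

A = epsilon_graph(pts, eps=0.3)
L = (np.diag(A.sum(1)) - A) / n        # 1/n scaling toward a limit operator
y = poly_filter(L, np.sin(theta), [1.0, -0.5])
```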
- Stability of Aggregation Graph Neural Networks [153.70485149740608]
We study the stability properties of aggregation graph neural networks (Agg-GNNs) considering perturbations of the underlying graph.
We prove that the stability bounds are defined by the properties of the filters in the first layer of the CNN that acts on each node.
We also conclude that in Agg-GNNs the selectivity of the mapping operators is tied to the properties of the filters only in the first layer of the CNN stage.
arXiv Detail & Related papers (2022-07-08T03:54:52Z)
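A rough sketch of the aggregation stage the summary refers to: each node collects successive GSO diffusions of the signal, and a shared 1-D filter (the first CNN layer, whose properties govern the stability bounds) slides over that sequence; kernel values and depth are placeholders.

```python
import numpy as np

def agg_sequence(S, x, K):
    """Per-node aggregation sequence [x, Sx, ..., S^(K-1) x], shape (n, K)."""
    seq, v = [], x.copy()
    for _ in range(K):
        seq.append(v)
        v = S @ v
    return np.stack(seq, axis=1)

def first_cnn_layer(seq, kernel):
    """Shared 1-D convolution sliding over each node's sequence, then ReLU."""
    n, K = seq.shape
    m = len(kernel)
    out = np.array([[seq[i, t:t + m] @ kernel for t in range(K - m + 1)]
                    for i in range(n)])
    return np.maximum(out, 0.0)

rng = np.random.default_rng(1)
A = (rng.random((8, 8)) < 0.4).astype(float)
A = np.triu(A, 1); A = A + A.T                    # undirected GSO
x = rng.standard_normal(8)
h = first_cnn_layer(agg_sequence(A, x, K=6), np.array([0.5, -0.25, 0.1]))
```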
- ResNorm: Tackling Long-tailed Degree Distribution Issue in Graph Neural Networks via Normalization [80.90206641975375]
This paper focuses on improving the performance of GNNs via normalization.
By studying the long-tailed distribution of node degrees in the graph, we propose a novel normalization method for GNNs.
The $scale$ operation of ResNorm reshapes the node-wise standard deviation (NStd) distribution so as to improve the accuracy of tail nodes.
arXiv Detail & Related papers (2022-06-16T13:49:09Z)
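A hedged guess at the flavor of the $scale$ operation: rescale each node so that its NStd becomes a power of its old value, shrinking the long tail; the exponent and the formula are assumptions, not the paper's exact definition.

```python
import numpy as np

def resnorm_scale(X, alpha=0.5, eps=1e-6):
    """Reshape the node-wise std (NStd) distribution of features X, shape (n, d).

    Each node is rescaled so its new NStd is (old NStd)**alpha, shrinking
    the long tail. Placeholder for the actual ResNorm scale operation.
    """
    nstd = X.std(axis=1, keepdims=True) + eps
    return X / nstd * nstd**alpha

rng = np.random.default_rng(2)
X = rng.standard_normal((100, 16)) * rng.lognormal(size=(100, 1))  # long tail
print(X.std(1).max() / X.std(1).min())                    # before: wide spread
print(resnorm_scale(X).std(1).max() / resnorm_scale(X).std(1).min())  # after
```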
- On the Stability of Low Pass Graph Filter With a Large Number of Edge Rewires [21.794739464247687]
We show that the stability of a graph filter depends on perturbation to the community structure.
For block graphs, the graph filter distance converges to zero when the number of nodes approaches infinity.
arXiv Detail & Related papers (2021-10-14T09:00:35Z)
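An illustrative experiment behind this kind of claim: rewire edges inside one community of a block graph and measure the spectral distance between the low-pass filters of the original and rewired graphs; the inverse-based filter and the toggle-rewiring rule are assumptions, not the paper's exact model.

```python
import numpy as np

def low_pass(A, beta=1.0):
    """Low-pass graph filter h(L) = (I + beta * L)^(-1)."""
    L = np.diag(A.sum(1)) - A
    return np.linalg.inv(np.eye(len(A)) + beta * L)

def rewire_within_block(A, block, k, rng):
    """Toggle k intra-block node pairs, preserving the community structure."""
    A = A.copy()
    for _ in range(k):
        i, j = rng.choice(block, size=2, replace=False)
        A[i, j] = A[j, i] = 1.0 - A[i, j]
    return A

rng = np.random.default_rng(3)
n = 100
blocks = [np.arange(50), np.arange(50, 100)]
P = np.full((n, n), 0.02)                  # sparse links across communities
for b in blocks:
    P[np.ix_(b, b)] = 0.5                  # dense links inside communities
A = (rng.random((n, n)) < P).astype(float)
A = np.triu(A, 1); A = A + A.T
A2 = rewire_within_block(A, blocks[0], k=40, rng=rng)
print(np.linalg.norm(low_pass(A) - low_pass(A2), 2))   # filter distance
```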
- Stability of Neural Networks on Manifolds to Relative Perturbations [118.84154142918214]
Graph Neural Networks (GNNs) show impressive performance in many practical scenarios.
GNNs scale well to large graphs, yet existing stability bounds grow with the number of nodes, at odds with this observed scalability.
arXiv Detail & Related papers (2021-10-10T04:37:19Z)
- Training Stable Graph Neural Networks Through Constrained Learning [116.03137405192356]
Graph Neural Networks (GNNs) rely on graph convolutions to learn features from network data.
GNNs are stable to different types of perturbations of the underlying graph, a property that they inherit from graph filters.
We propose a novel constrained learning approach by imposing a constraint on the stability condition of the GNN within a perturbation of choice.
arXiv Detail & Related papers (2021-10-07T15:54:42Z)
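A toy primal-dual sketch of the constrained-learning idea, using a single linear graph convolution and one perturbed edge; the stability budget, step sizes, and perturbation model are placeholders, not the paper's algorithm.

```python
import numpy as np

rng = np.random.default_rng(4)
n, d = 30, 5
A = (rng.random((n, n)) < 0.2).astype(float)
A = np.triu(A, 1); A = A + A.T
A_pert = A.copy()
A_pert[0, 1] = A_pert[1, 0] = 1.0 - A_pert[0, 1]   # a single perturbed edge
c = np.linalg.norm(A, 2)                # normalize GSOs for stable step sizes
S, S_pert = A / c, A_pert / c

X = rng.standard_normal((n, d))
y = rng.standard_normal(n)
w, lam, C = np.zeros(d), 0.0, 0.01    # weights, dual variable, stability budget

def gnn(S, w):
    """One linear graph-convolution layer, standing in for a full GNN."""
    return S @ X @ w

for _ in range(300):
    resid = gnn(S, w) - y
    gap = gnn(S, w) - gnn(S_pert, w)
    # primal step on the Lagrangian: task loss + lam * stability term
    grad = X.T @ S.T @ resid / n + lam * X.T @ (S - S_pert).T @ gap / n
    w -= 0.05 * grad
    # dual ascent: grow lam while the stability constraint is violated
    lam = max(0.0, lam + 0.5 * (gap @ gap / (2 * n) - C))
```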
- Stability of Graph Convolutional Neural Networks to Stochastic Perturbations [122.12962842842349]
Graph convolutional neural networks (GCNNs) are nonlinear processing tools to learn representations from network data.
Current analysis considers deterministic perturbations but fails to provide relevant insights when topological changes are random.
This paper investigates the stability of GCNNs to stochastic graph perturbations induced by link losses.
arXiv Detail & Related papers (2021-06-19T16:25:28Z)
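A Monte-Carlo sketch of the link-loss model: drop each edge independently with a small probability and estimate the expected output deviation of a tiny two-layer GCNN; the architecture and probabilities are illustrative assumptions.

```python
import numpy as np

def drop_links(A, p, rng):
    """Random link-loss model: each existing edge fails independently w.p. p."""
    keep = np.triu((rng.random(A.shape) > p).astype(float), 1)
    keep = keep + keep.T
    return A * keep

def gcnn(A, X, W1, W2):
    """Tiny two-layer graph convolutional network with fixed weights."""
    return A @ np.maximum(A @ X @ W1, 0.0) @ W2

rng = np.random.default_rng(5)
n, d = 50, 8
A = (rng.random((n, n)) < 0.15).astype(float)
A = np.triu(A, 1); A = A + A.T
X = rng.standard_normal((n, d))
W1 = rng.standard_normal((d, d)) / np.sqrt(d)
W2 = rng.standard_normal((d, 1)) / np.sqrt(d)

base = gcnn(A, X, W1, W2)
devs = [np.linalg.norm(gcnn(drop_links(A, 0.05, rng), X, W1, W2) - base)
        for _ in range(100)]
print(np.mean(devs), np.std(devs))     # expected deviation under link losses
```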
- Graph Neural Networks Inspired by Classical Iterative Algorithms [28.528150667063876]
We consider a new family of GNN layers designed to mimic and integrate the update rules of two classical iterative algorithms.
A novel attention mechanism is explicitly anchored to an underlying end-to-end energy function, contributing stability with respect to edge uncertainty.
arXiv Detail & Related papers (2021-03-10T14:08:12Z)
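One classical construction matching this description is unrolling gradient descent on a graph energy; the sketch below is such an unrolling under assumed hyperparameters, not the paper's exact update rules or attention mechanism.

```python
import numpy as np

def unrolled_gnn(L, X, lam=1.0, eta=0.1, depth=16):
    """Layers as gradient steps on E(Y) = ||Y - X||^2 + lam * tr(Y^T L Y).

    Each layer performs one descent step on a known energy, which is one
    way to obtain the stability the summary attributes to anchoring the
    architecture in an iterative algorithm.
    """
    Y = X.copy()
    for _ in range(depth):
        Y -= eta * (2.0 * (Y - X) + 2.0 * lam * (L @ Y))
    return Y

rng = np.random.default_rng(6)
n = 40
A = (rng.random((n, n)) < 0.15).astype(float)
A = np.triu(A, 1); A = A + A.T
deg = np.maximum(A.sum(1), 1.0)
L = np.eye(n) - A / np.sqrt(deg[:, None] * deg[None, :])  # normalized Laplacian
Y = unrolled_gnn(L, rng.standard_normal((n, 4)))
```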
- Convergence and Stability of Graph Convolutional Networks on Large Random Graphs [22.387735135790706]
We study properties of Graph Convolutional Networks (GCNs) by analyzing their behavior on standard models of random graphs.
We first study the convergence of GCNs to their continuous counterpart as the number of nodes grows.
We then analyze the stability of GCNs to small deformations of the random graph model.
arXiv Detail & Related papers (2020-06-02T18:36:19Z)
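An illustrative convergence check on Erdős–Rényi graphs (one standard random-graph model, not necessarily the paper's): a filter tap built from the scaled adjacency A/n, applied to a constant signal, approaches its graphon limit as the number of nodes grows.

```python
import numpy as np

rng = np.random.default_rng(7)
p = 0.3
for n in [50, 200, 800]:
    A = (rng.random((n, n)) < p).astype(float)
    A = np.triu(A, 1); A = A + A.T
    x = np.ones(n) / np.sqrt(n)        # constant unit-norm signal
    y = (A / n) @ x                    # one filter tap on the scaled GSO
    # the graphon limit W(u, v) = p maps the constant signal to p * x,
    # so this discrepancy should shrink as n grows
    print(n, np.linalg.norm(y - p * x))
```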