On the Stability of Graph Convolutional Neural Networks under Edge Rewiring
- URL: http://arxiv.org/abs/2010.13747v2
- Date: Thu, 18 Feb 2021 15:18:57 GMT
- Title: On the Stability of Graph Convolutional Neural Networks under Edge Rewiring
- Authors: Henry Kenlay, Dorina Thanou, Xiaowen Dong
- Abstract summary: Graph neural networks are experiencing a surge of popularity within the machine learning community.
Despite this, their stability, i.e., their robustness to small perturbations in the input, is not yet well understood.
We develop an interpretable upper bound elucidating that graph neural networks are stable to rewiring between high-degree nodes.
- Score: 22.58110328955473
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Graph neural networks are experiencing a surge of popularity within the
machine learning community due to their ability to adapt to non-Euclidean
domains and instil inductive biases. Despite this, their stability, i.e., their
robustness to small perturbations in the input, is not yet well understood.
Although there exist some results showing the stability of graph neural
networks, most take the form of an upper bound on the magnitude of the change due
to a perturbation in the graph topology. However, the change in the graph
topology captured in existing bounds tends not to be expressed in terms of
structural properties, limiting our understanding of the model's robustness
properties. In this work, we develop an interpretable upper bound elucidating
that graph neural networks are stable to rewiring between high-degree nodes.
This bound, and further research into bounds of a similar type, provides a deeper
understanding of the stability properties of graph neural networks.
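To make the headline claim concrete, here is a minimal numerical sketch (an illustration, not the paper's formal bound), assuming numpy and networkx are available: it applies a fixed polynomial graph filter to a graph built from a dense core glued to a sparse ring, then compares the output change caused by rewiring one edge between high-degree core nodes against the same operation between low-degree ring nodes. The graph construction, filter taps, and sizes are all illustrative choices, not taken from the paper.

```python
import numpy as np
import networkx as nx

def normalized_adjacency(G, n):
    # Symmetrically normalized adjacency D^{-1/2} A D^{-1/2},
    # guarding against isolated (degree-zero) nodes.
    A = nx.to_numpy_array(G, nodelist=range(n))
    d = A.sum(axis=1)
    s = np.where(d > 0, 1.0 / np.sqrt(d), 0.0)
    return s[:, None] * A * s[None, :]

def filter_output(S, x, taps=(1.0, 0.5, 0.25)):
    # Polynomial graph filter H(S) x = sum_k h_k S^k x.
    out, Sk = np.zeros_like(x), np.eye(len(x))
    for h in taps:
        out += h * (Sk @ x)
        Sk = S @ Sk
    return out

# Dense core (high-degree nodes) glued to a sparse ring (degree-2 nodes).
core = nx.gnp_random_graph(20, 0.6, seed=1)
ring = nx.cycle_graph(20)
G = nx.disjoint_union(core, ring)   # core keeps labels 0..19, ring gets 20..39
G.add_edge(0, 20)                   # a single bridge between the parts
n = G.number_of_nodes()

rng = np.random.default_rng(0)
x = rng.standard_normal(n)
y = filter_output(normalized_adjacency(G, n), x)

def rewire(G, u, v, w):
    # Replace edge (u, v) with edge (u, w): a single edge rewiring.
    H = G.copy()
    H.remove_edge(u, v)
    H.add_edge(u, w)
    return H

def relative_change(H):
    y_p = filter_output(normalized_adjacency(H, n), x)
    return np.linalg.norm(y - y_p) / np.linalg.norm(y)

# High-degree rewiring: move an edge of the busiest core node to a
# core node it is not yet connected to.
u = max(core.nodes, key=core.degree)
v = next(iter(core[u]))
w = next(i for i in core.nodes if i != u and i not in core[u])
print("high-degree rewiring:", relative_change(rewire(G, u, v, w)))

# Low-degree rewiring: the same operation among degree-2 ring nodes.
print("low-degree rewiring: ", relative_change(rewire(G, 25, 26, 28)))
```

Consistent with the bound's message, the relative change is noticeably smaller for the high-degree rewiring: the perturbation enters the normalized adjacency through entries on the order of 1/sqrt(d_u d_v), which shrink as the degrees of the affected nodes grow.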
Related papers
- Formal Verification of Graph Convolutional Networks with Uncertain Node Features and Uncertain Graph Structure [7.133681867718039]
Graph neural networks are becoming increasingly popular in the field of machine learning.
They have been applied in safety-critical environments where perturbations inherently occur.
This research addresses this gap by preserving the dependencies of all elements in the underlying computations.
arXiv Detail & Related papers (2024-04-23T14:12:48Z)
- On the Trade-Off between Stability and Representational Capacity in Graph Neural Networks [22.751509906413943]
We study the stability of EdgeNet: a general GNN framework that unifies more than twenty solutions.
By studying the effect of different EdgeNet categories on the stability, we show that GNNs with fewer degrees of freedom in their parameter space, linked to a lower representational capacity, are more stable.
arXiv Detail & Related papers (2023-12-04T22:07:17Z)
- Stable and Transferable Hyper-Graph Neural Networks [95.07035704188984]
We introduce an architecture for processing signals supported on hypergraphs via graph neural networks (GNNs).
We provide a framework for bounding the stability and transferability error of GNNs across arbitrary graphs via spectral similarity.
arXiv Detail & Related papers (2022-11-11T23:44:20Z)
- On the Robustness of Graph Neural Diffusion to Topology Perturbations [30.284359808863588]
We show that graph neural PDEs are intrinsically more robust to topology perturbations than other GNNs.
We propose a general graph neural PDE framework based on which a new class of robust GNNs can be defined.
arXiv Detail & Related papers (2022-09-16T07:19:35Z)
- Stability of Neural Networks on Manifolds to Relative Perturbations [118.84154142918214]
Graph Neural Networks (GNNs) show impressive performance in many practical scenarios.
GNNs scale well to large graphs in practice, but this is at odds with existing stability bounds, which grow with the number of nodes.
arXiv Detail & Related papers (2021-10-10T04:37:19Z)
- Training Stable Graph Neural Networks Through Constrained Learning [116.03137405192356]
Graph Neural Networks (GNNs) rely on graph convolutions to learn features from network data.
GNNs are stable to different types of perturbations of the underlying graph, a property that they inherit from graph filters.
We propose a novel constrained learning approach by imposing a constraint on the stability condition of the GNN within a perturbation of choice.
arXiv Detail & Related papers (2021-10-07T15:54:42Z)
- Stability of Graph Convolutional Neural Networks to Stochastic Perturbations [122.12962842842349]
Graph convolutional neural networks (GCNNs) are nonlinear processing tools to learn representations from network data.
Current analysis considers deterministic perturbations but fails to provide relevant insights when topological changes are random.
This paper investigates the stability of GCNNs to stochastic graph perturbations induced by link losses; a toy simulation of this setting is sketched after this list.
arXiv Detail & Related papers (2021-06-19T16:25:28Z)
- Graph and graphon neural network stability [122.06927400759021]
Graph neural networks (GNNs) are learning architectures that rely on knowledge of the graph structure to generate meaningful representations of network data.
We analyze GNN stability using kernel objects called graphons.
arXiv Detail & Related papers (2020-10-23T16:55:56Z)
- Graph Structure of Neural Networks [104.33754950606298]
We show how the graph structure of neural networks affects their predictive performance.
A "sweet spot" of relational graphs leads to neural networks with significantly improved predictive performance.
Top-performing neural networks have graph structure surprisingly similar to those of real biological neural networks.
arXiv Detail & Related papers (2020-07-13T17:59:31Z)
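The stochastic-perturbation setting from the entry above also admits a quick Monte Carlo illustration (a sketch under assumptions: an Erdős-Rényi graph, a single graph-convolution step S x as a stand-in for a GCNN layer, and i.i.d. link losses; none of this is taken from the cited paper):

```python
import numpy as np
import networkx as nx

def normalized_adjacency(G, n):
    # Symmetrically normalized adjacency, robust to isolated nodes.
    A = nx.to_numpy_array(G, nodelist=range(n))
    d = A.sum(axis=1)
    s = np.where(d > 0, 1.0 / np.sqrt(d), 0.0)
    return s[:, None] * A * s[None, :]

n = 100
G = nx.gnp_random_graph(n, 0.1, seed=0)
rng = np.random.default_rng(0)
x = rng.standard_normal(n)
y = normalized_adjacency(G, n) @ x   # one graph-convolution step

for p in (0.01, 0.05, 0.10, 0.20):
    changes = []
    for _ in range(50):              # Monte Carlo over random link losses
        H = G.copy()
        H.remove_edges_from([e for e in G.edges if rng.random() < p])
        y_p = normalized_adjacency(H, n) @ x
        changes.append(np.linalg.norm(y - y_p) / np.linalg.norm(y))
    print(f"loss prob {p:.2f}: mean relative change {np.mean(changes):.3f}")
```

As the link-loss probability grows, the mean relative output change grows with it, which is the qualitative behaviour such stochastic stability analyses aim to bound.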