Graph and graphon neural network stability
- URL: http://arxiv.org/abs/2010.12529v4
- Date: Mon, 26 Apr 2021 15:47:35 GMT
- Title: Graph and graphon neural network stability
- Authors: Luana Ruiz, Zhiyang Wang, Alejandro Ribeiro
- Abstract summary: Graph neural networks (GNNs) are learning architectures that rely on knowledge of the graph structure to generate meaningful representations of network data.
We analyze GNN stability using kernel objects called graphons.
- Score: 122.06927400759021
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Graph neural networks (GNNs) are learning architectures that rely on
knowledge of the graph structure to generate meaningful representations of
large-scale network data. GNN stability is thus important as in real-world
scenarios there are typically uncertainties associated with the graph. We
analyze GNN stability using kernel objects called graphons. Graphons are both
limits of convergent graph sequences and generating models for deterministic
and stochastic graphs. Building upon the theory of graphon signal processing,
we define graphon neural networks and analyze their stability to graphon
perturbations. We then extend this analysis by interpreting the graphon neural
network as a generating model for GNNs on deterministic and stochastic graphs
instantiated from the original and perturbed graphons. We observe that GNNs are
stable to graphon perturbations with a stability bound that decreases
asymptotically with the size of the graph. This asymptotic behavior is further
demonstrated in a movie recommendation experiment.
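To make the generating-model viewpoint concrete, here is a minimal numpy sketch (not the paper's code; the graphon `W`, the filter taps, and the additive perturbation are illustrative assumptions) that samples stochastic graphs of growing size from a graphon and a perturbed graphon, runs the same one-layer polynomial GNN on both, and prints the output difference:

```python
import numpy as np

def W(u, v):
    # Illustrative smooth graphon: W(u, v) = exp(-(u - v)^2).
    return np.exp(-(u - v) ** 2)

def W_pert(u, v):
    # Small bounded perturbation of W (illustrative).
    return np.clip(W(u, v) + 0.05, 0.0, 1.0)

def sample_adjacency(kernel, u, rng):
    # Stochastic graph instantiated from a graphon: A_ij ~ Bernoulli(kernel(u_i, u_j)).
    P = kernel(u[:, None], u[None, :])
    A = (rng.uniform(size=P.shape) < P).astype(float)
    A = np.triu(A, 1)
    return A + A.T

def gnn_layer(A, x, taps):
    # One GNN layer: ReLU of a polynomial graph filter. The shift A / n
    # approximates the graphon integral operator as n grows.
    S = A / A.shape[0]
    z = np.zeros_like(x)
    Sx = x.copy()
    for h in taps:
        z += h * Sx
        Sx = S @ Sx
    return np.maximum(z, 0.0)

rng = np.random.default_rng(0)
taps = [0.5, 1.0, -0.3]  # filter coefficients shared across all graph sizes
for n in (50, 200, 800):
    u = np.sort(rng.uniform(size=n))   # shared latent node positions
    x = np.cos(2 * np.pi * u)          # graphon signal sampled at u
    A = sample_adjacency(W, u, rng)
    A_p = sample_adjacency(W_pert, u, rng)
    diff = np.linalg.norm(gnn_layer(A, x, taps) - gnn_layer(A_p, x, taps)) / np.sqrt(n)
    print(f"n={n:4d}  output difference = {diff:.4f}")
```

The 1/n shift normalization and the 1/sqrt(n) norm scaling follow the graphon signal processing convention under which graph filter outputs converge to graphon filter outputs; the printed difference is a stand-in for the stability quantity the paper bounds, not a reproduction of its experiments.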
Related papers
- Scalable Implicit Graphon Learning [25.015678499211404]
We propose a scalable method that combines implicit neural representations (INRs) and graph neural networks (GNNs) to estimate a graphon from observed graphs.
We evaluate SIGL in synthetic and real-world graphs, showing that it outperforms existing methods and scales effectively to larger graphs (a sketch of the implicit-representation idea follows this entry).
arXiv Detail & Related papers (2024-10-22T22:44:24Z) - Graph Neural Tangent Kernel: Convergence on Large Graphs [7.624781434274796]
- Graph Neural Tangent Kernel: Convergence on Large Graphs [7.624781434274796]
Graph neural networks (GNNs) achieve remarkable performance in graph machine learning tasks.
We investigate the training dynamics of large-graph GNNs using graph neural tangent kernels (GNTKs) and graphons.
arXiv Detail & Related papers (2023-01-25T19:52:58Z) - Stable and Transferable Hyper-Graph Neural Networks [95.07035704188984]
We introduce an architecture for processing signals supported on hypergraphs via graph neural networks (GNNs).
We provide a framework for bounding the stability and transferability error of GNNs across arbitrary graphs via spectral similarity.
arXiv Detail & Related papers (2022-11-11T23:44:20Z) - Learning Graph Structure from Convolutional Mixtures [119.45320143101381]
We propose a graph convolutional relationship between the observed and latent graphs, and formulate the graph learning task as a network inverse (deconvolution) problem.
In lieu of eigendecomposition-based spectral methods, we unroll and truncate proximal gradient iterations to arrive at a parameterized neural network architecture that we call a Graph Deconvolution Network (GDN).
GDNs can learn a distribution of graphs in a supervised fashion, perform link prediction or edge-weight regression tasks by adapting the loss function, and they are inherently inductive (a sketch of the unrolling idea follows this entry).
arXiv Detail & Related papers (2022-05-19T14:08:15Z) - Transferability Properties of Graph Neural Networks [125.71771240180654]
- Transferability Properties of Graph Neural Networks [125.71771240180654]
Graph neural networks (GNNs) are provably successful at learning representations from data supported on moderate-scale graphs.
We study the problem of training GNNs on graphs of moderate size and transferring them to large-scale graphs.
Our results show that (i) the transference error decreases with the graph size, and (ii) graph filters have a transferability-discriminability tradeoff that in GNNs is alleviated by the scattering behavior of the nonlinearity.
arXiv Detail & Related papers (2021-12-09T00:08:09Z) - Transferability of Graph Neural Networks: an Extended Graphon Approach [3.042325619220694]
We study spectral graph convolutional neural networks (GCNNs).
In this paper, we consider a model of transferability based on graphon analysis.
arXiv Detail & Related papers (2021-09-21T10:59:01Z) - Stability of Graph Convolutional Neural Networks to Stochastic
Perturbations [122.12962842842349]
Graph convolutional neural networks (GCNNs) are nonlinear processing tools to learn representations from network data.
Current analysis considers deterministic perturbations but fails to provide relevant insights when topological changes are random.
This paper investigates the stability of GCNNs to stochastic graph perturbations induced by link losses.
arXiv Detail & Related papers (2021-06-19T16:25:28Z) - Increase and Conquer: Training Graph Neural Networks on Growing Graphs [116.03137405192356]
We consider the problem of learning a graphon neural network (WNN) by training GNNs on graphs whose edges are sampled Bernoulli from the graphon.
Inspired by these results, we propose an algorithm to learn GNNs on large-scale graphs that, starting from a moderate number of nodes, successively increases the size of the graph during training (a training-loop sketch follows this entry).
arXiv Detail & Related papers (2021-06-07T15:05:59Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information it presents and is not responsible for any consequences of its use.