Graphon Neural Networks and the Transferability of Graph Neural Networks
- URL: http://arxiv.org/abs/2006.03548v2
- Date: Tue, 20 Oct 2020 23:32:23 GMT
- Title: Graphon Neural Networks and the Transferability of Graph Neural Networks
- Authors: Luana Ruiz, Luiz F. O. Chamon, Alejandro Ribeiro
- Abstract summary: Graph neural networks (GNNs) rely on graph convolutions to extract local features from network data.
We introduce graphon NNs as limit objects of GNNs and prove a bound on the difference between the output of a GNN and its limit graphon-NN.
This result establishes a tradeoff between discriminability and transferability of GNNs.
- Score: 125.71771240180654
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Graph neural networks (GNNs) rely on graph convolutions to extract local
features from network data. These graph convolutions combine information from
adjacent nodes using coefficients that are shared across all nodes. Since these
coefficients are shared and do not depend on the graph, one can envision using
the same coefficients to define a GNN on another graph. This motivates
analyzing the transferability of GNNs across graphs. In this paper we introduce
graphon NNs as limit objects of GNNs and prove a bound on the difference
between the output of a GNN and its limit graphon-NN. This bound vanishes with
growing number of nodes if the graph convolutional filters are bandlimited in
the graph spectral domain. This result establishes a tradeoff between
discriminability and transferability of GNNs.
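As a concrete illustration of the shared-coefficient property the abstract describes, here is a minimal sketch (in NumPy; the taps and random graphs are illustrative assumptions, not the paper's experiments) of one polynomial graph convolutional filter applied, unchanged, to graphs of two different sizes:

```python
# Minimal sketch: a graph convolutional filter whose coefficients are
# shared across nodes and do not depend on the graph, so the same taps
# define a filter on ANY graph.
import numpy as np

def graph_filter(A, x, h):
    """Apply the polynomial graph filter y = sum_k h[k] * A^k x."""
    y = np.zeros_like(x)
    z = x.copy()
    for hk in h:
        y += hk * z
        z = A @ z  # shift the signal one more hop along the graph
    return y

h = [0.5, 0.3, 0.2]  # filter taps, independent of any particular graph

# The identical taps define a filter on two graphs of different sizes.
rng = np.random.default_rng(0)
for n in (50, 500):
    A = (rng.random((n, n)) < 0.1).astype(float)
    A = np.triu(A, 1); A = A + A.T               # symmetric, no self-loops
    A = A / np.abs(np.linalg.eigvalsh(A)).max()  # normalize the spectrum
    x = rng.standard_normal(n)
    y = graph_filter(A, x, h)
    print(n, y.shape)
```

Because the taps `h` never reference the graph, the filter is well defined on both graphs; the paper's bound quantifies how close the two outputs are when both graphs converge to the same graphon.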
Related papers
- Spatio-Spectral Graph Neural Networks [50.277959544420455]
We propose Spatio-Spectral Graph Neural Networks (S²GNNs).
S²GNNs combine spatially and spectrally parametrized graph filters.
We show that S²GNNs vanquish over-squashing and yield strictly tighter approximation-theoretic error bounds than MPGNNs.
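A rough sketch of what combining a spatially parametrized (polynomial) filter with a spectrally parametrized one can look like (the taps and the Gaussian spectral window below are assumptions for illustration, not the S²GNN design):

```python
# Rough sketch, not the S^2GNN implementation: sum a spatial filter,
# parametrized by polynomial taps, and a spectral filter, parametrized
# directly as a function of the graph's eigenvalues.
import numpy as np

def spatio_spectral_filter(A, x, taps, spectral_fn):
    # Spatial part: polynomial in the adjacency/shift operator.
    y_spat, z = np.zeros_like(x), x.copy()
    for t in taps:
        y_spat += t * z
        z = A @ z
    # Spectral part: a window applied to the graph's eigendecomposition.
    lam, V = np.linalg.eigh(A)
    y_spec = V @ (spectral_fn(lam) * (V.T @ x))
    return y_spat + y_spec

rng = np.random.default_rng(1)
n = 100
A = (rng.random((n, n)) < 0.1).astype(float)
A = np.triu(A, 1); A = A + A.T
x = rng.standard_normal(n)
# Hypothetical low-pass spectral window; S^2GNNs learn this part.
y = spatio_spectral_filter(A, x, [0.6, 0.4], lambda lam: np.exp(-lam**2))
```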
arXiv Detail & Related papers (2024-05-29T14:28:08Z)
- Transferability of Graph Neural Networks using Graphon and Sampling Theories [0.0]
Graph neural networks (GNNs) have become powerful tools for processing graph-based information in various domains.
A desirable property of GNNs is transferability, where a trained network can take in data from a different graph without retraining and retain its accuracy.
We contribute to the application of graphons to GNNs by presenting an explicit two-layer graphon neural network (WNN) architecture.
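The paper's explicit architecture is not reproduced here, but a generic two-layer GNN of the kind that discretizes a two-layer WNN looks roughly like this (layer widths, taps, and the 1/n normalization are illustrative assumptions):

```python
# Minimal sketch of a two-layer GNN; sizes and taps are illustrative
# assumptions, not the architecture from the paper.
import numpy as np

def gnn_layer(A, X, H):
    """One GNN layer: Y = relu( sum_k A^k X H[k] )."""
    Y, Z = np.zeros((A.shape[0], H.shape[2])), X
    for k in range(H.shape[0]):
        Y += Z @ H[k]
        Z = A @ Z
    return np.maximum(Y, 0.0)  # pointwise ReLU nonlinearity

rng = np.random.default_rng(2)
n = 64
A = (rng.random((n, n)) < 0.2).astype(float)
A = np.triu(A, 1); A = A + A.T
A = A / n  # graphon-style normalization, so different sizes are comparable
X = rng.standard_normal((n, 1))
H1 = rng.standard_normal((3, 1, 8)) * 0.1   # layer 1: 1 -> 8 features
H2 = rng.standard_normal((3, 8, 1)) * 0.1   # layer 2: 8 -> 1 feature
out = gnn_layer(A, gnn_layer(A, X, H1), H2)
```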
arXiv Detail & Related papers (2023-07-25T02:11:41Z)
- Training Graph Neural Networks on Growing Stochastic Graphs [114.75710379125412]
Graph Neural Networks (GNNs) rely on graph convolutions to exploit meaningful patterns in networked data.
We propose to learn GNNs on very large graphs by leveraging the limit object of a sequence of growing graphs, the graphon.
arXiv Detail & Related papers (2022-10-27T16:00:45Z)
- KerGNNs: Interpretable Graph Neural Networks with Graph Kernels [14.421535610157093]
Graph neural networks (GNNs) have become the state-of-the-art method in downstream graph-related tasks.
We propose a novel GNN framework, termed Kernel Graph Neural Networks (KerGNNs).
KerGNNs integrate graph kernels into the message passing process of GNNs.
We show that our method achieves competitive performance compared with existing state-of-the-art methods.
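A toy stand-in for the kernel-based message passing idea (a simplified sketch, not the KerGNNs implementation): each node's ego-subgraph is compared against a small "filter graph" with a random-walk kernel, and the kernel value becomes the node's feature:

```python
# Toy sketch of kernel-based message passing: compare each node's 1-hop
# ego-subgraph to a small (hypothetically learnable) filter graph.
import numpy as np

def random_walk_kernel(A1, A2, lam=0.1, steps=3):
    """Count common walks of the two graphs via their Kronecker product."""
    Ax = np.kron(A1, A2)
    k, walk = 0.0, np.ones(Ax.shape[0])
    for p in range(1, steps + 1):
        walk = Ax @ walk
        k += (lam ** p) * walk.sum()
    return k

def kernel_layer(A, filter_A):
    n = A.shape[0]
    feats = np.zeros(n)
    for v in range(n):
        nbrs = np.flatnonzero(A[v])
        ego = np.concatenate(([v], nbrs))           # node + its neighbors
        feats[v] = random_walk_kernel(A[np.ix_(ego, ego)], filter_A)
    return feats

rng = np.random.default_rng(3)
n = 30
A = (rng.random((n, n)) < 0.15).astype(float)
A = np.triu(A, 1); A = A + A.T
filter_A = (rng.random((4, 4)) < 0.5).astype(float)  # hypothetical filter graph
filter_A = np.triu(filter_A, 1); filter_A = filter_A + filter_A.T
print(kernel_layer(A, filter_A)[:5])
```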
arXiv Detail & Related papers (2022-01-03T06:16:30Z)
- Transferability Properties of Graph Neural Networks [125.71771240180654]
Graph neural networks (GNNs) are provably successful at learning representations from data supported on moderate-scale graphs.
We study the problem of training GNNs on graphs of moderate size and transferring them to large-scale graphs.
Our results show that (i) the transference error decreases with the graph size, and (ii) graph filters have a transferability-discriminability tradeoff that, in GNNs, is alleviated by the scattering behavior of the nonlinearity.
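Point (i) can be illustrated numerically. The sketch below (a smooth graphon W(u, v) = exp(-|u - v|), regular grid sampling, and a fixed 3-tap filter, all assumptions for illustration) compares the filter output on graphs of growing size against a large-graph proxy for the graphon-NN output:

```python
# Numerical illustration: the distance between a graph filter's output
# and a (large-graph proxy for the) graphon filter's output shrinks as
# the number of nodes grows.
import numpy as np

def filter_out(A, x, h=(0.5, 0.3, 0.2)):
    y, z = np.zeros_like(x), x.copy()
    for hk in h:
        y += hk * z
        z = A @ z
    return y

def graphon_gnn_output(n):
    u = (np.arange(n) + 0.5) / n                        # grid points in [0, 1]
    A = np.exp(-np.abs(u[:, None] - u[None, :])) / n    # induced, normalized graph
    x = np.cos(2 * np.pi * u)
    return u, filter_out(A, x)

u_ref, y_ref = graphon_gnn_output(4096)  # proxy for the graphon-NN output
for n in (32, 128, 512):
    u, y = graphon_gnn_output(n)
    # Compare as step functions on [0, 1] via nearest sample points.
    idx = np.minimum((u_ref * n).astype(int), n - 1)
    err = np.sqrt(np.mean((y[idx] - y_ref) ** 2))
    print(n, err)  # the error decreases as n grows
```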
arXiv Detail & Related papers (2021-12-09T00:08:09Z)
- Increase and Conquer: Training Graph Neural Networks on Growing Graphs [116.03137405192356]
We consider the problem of learning a graphon neural network (WNN) by training GNNs on graphs Bernoulli-sampled from the graphon.
Inspired by these results, we propose an algorithm to learn GNNs on large-scale graphs that, starting from a moderate number of nodes, successively increases the size of the graph during training.
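Bernoulli sampling a graph from a graphon is straightforward to sketch; the graphon W and the size schedule below are illustrative assumptions, and the training step itself is elided:

```python
# Sketch of Bernoulli sampling from a graphon, as used when training on
# a sequence of growing graphs.
import numpy as np

def sample_graph(W, n, rng):
    """Draw latent points u_i ~ U[0,1], then edges A_ij ~ Bern(W(u_i, u_j))."""
    u = rng.random(n)
    P = W(u[:, None], u[None, :])
    A = (rng.random((n, n)) < P).astype(float)
    A = np.triu(A, 1)
    return A + A.T  # symmetric, no self-loops

W = lambda x, y: np.exp(-np.abs(x - y))  # hypothetical smooth graphon
rng = np.random.default_rng(4)
for n in (100, 200, 400):                # grow the graph between phases
    A = sample_graph(W, n, rng)
    # ... train the GNN for a few epochs on A, then move to the next size
    print(n, int(A.sum() // 2), "edges")
```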
arXiv Detail & Related papers (2021-06-07T15:05:59Z)
- Graph Neural Networks: Architectures, Stability and Transferability [176.3960927323358]
Graph Neural Networks (GNNs) are information processing architectures for signals supported on graphs.
They are generalizations of convolutional neural networks (CNNs) in which individual layers contain banks of graph convolutional filters.
arXiv Detail & Related papers (2020-08-04T18:57:36Z)
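The CNN connection can be made concrete: on a directed cycle graph, a polynomial graph convolutional filter reduces to an ordinary circular convolution (a sketch of the standard argument, not taken from the paper):

```python
# On the directed cycle graph, whose shift operator is the cyclic shift
# matrix, the graph filter y = sum_k h[k] S^k x equals a circular
# convolution of x with the taps h.
import numpy as np

n = 8
S = np.roll(np.eye(n), 1, axis=0)  # cycle graph: (S x)[i] = x[i-1]
x = np.arange(n, dtype=float)
h = np.array([0.5, 0.3, 0.2])

# Graph filter: y = sum_k h[k] S^k x
y_graph, z = np.zeros(n), x.copy()
for hk in h:
    y_graph += hk * z
    z = S @ z

# Classical circular convolution with the same taps, via the FFT
kernel = np.zeros(n); kernel[:3] = h
y_conv = np.real(np.fft.ifft(np.fft.fft(x) * np.fft.fft(kernel)))

print(np.allclose(y_graph, y_conv))  # True
```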
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.