A PAC-Bayesian Approach to Generalization Bounds for Graph Neural
Networks
- URL: http://arxiv.org/abs/2012.07690v1
- Date: Mon, 14 Dec 2020 16:41:23 GMT
- Title: A PAC-Bayesian Approach to Generalization Bounds for Graph Neural
Networks
- Authors: Renjie Liao, Raquel Urtasun, Richard Zemel
- Abstract summary: We derive generalization bounds for the two primary classes of graph neural networks (GNNs).
Our result reveals that the maximum node degree and spectral norm of the weights govern the generalization bounds of both models.
- Score: 99.46182575751271
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: In this paper, we derive generalization bounds for the two primary classes of
graph neural networks (GNNs), namely graph convolutional networks (GCNs) and
message passing GNNs (MPGNNs), via a PAC-Bayesian approach. Our result reveals
that the maximum node degree and spectral norm of the weights govern the
generalization bounds of both models. We also show that our bound for GCNs is a
natural generalization of the results developed in arXiv:1707.09564v2 [cs.LG]
for fully-connected and convolutional neural networks. For message passing
GNNs, our PAC-Bayes bound improves over the Rademacher complexity based bound
in arXiv:2002.06157v1 [cs.LG], showing a tighter dependency on the maximum node
degree and the maximum hidden dimension. The key ingredients of our proofs are
a perturbation analysis of GNNs and the generalization of PAC-Bayes analysis to
non-homogeneous GNNs. We perform an empirical study on several real-world graph
datasets and verify that our PAC-Bayes bound is tighter than others.
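To make the controlling quantities concrete, here is a minimal NumPy sketch that computes the two factors the abstract highlights: the maximum node degree and the per-layer spectral norms. The helper names and the simplified leading term d^(l-1) * prod_i ||W_i||_2 are our own illustration, not the paper's exact bound, which also carries margin, sample-size, log, and Frobenius-norm factors.

```python
import numpy as np

def max_node_degree(adj: np.ndarray) -> int:
    """Maximum node degree, read off a binary adjacency matrix."""
    return int(adj.sum(axis=1).max())

def spectral_norm(w: np.ndarray) -> float:
    """Largest singular value of a weight matrix (its spectral norm)."""
    return float(np.linalg.norm(w, ord=2))

def bound_leading_term(adj: np.ndarray, weights: list[np.ndarray]) -> float:
    """Leading term d^(l-1) * prod_i ||W_i||_2 of the GCN bound's scale,
    where d is the max degree and l the number of layers. This deliberately
    omits the margin, sample-size, log, and Frobenius-norm factors of the
    full PAC-Bayes bound; it only tracks how the bound grows with the two
    quantities the abstract highlights."""
    d = max_node_degree(adj)
    norms = [spectral_norm(w) for w in weights]
    return d ** (len(weights) - 1) * float(np.prod(norms))

# Toy usage: a 4-node star graph (max degree 3) and a 2-layer GCN.
adj = np.array([[0, 1, 1, 1],
                [1, 0, 0, 0],
                [1, 0, 0, 0],
                [1, 0, 0, 0]], dtype=float)
weights = [np.random.randn(8, 16), np.random.randn(16, 4)]
print(bound_leading_term(adj, weights))
```

Tracking this product during training gives a cheap proxy for how the bound scales with depth and weight magnitude.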
Related papers
- Generalization of Geometric Graph Neural Networks [84.01980526069075]
We study the generalization capabilities of geometric graph neural networks (GNNs).
We prove a bound on the gap between the optimal empirical risk and the optimal statistical risk of these GNNs.
The most important observation is that generalization can be realized by training on one large graph, rather than being limited by graph size as in previous results.
arXiv Detail & Related papers (2024-09-08T18:55:57Z)
- A Manifold Perspective on the Statistical Generalization of Graph Neural Networks [84.01980526069075]
Graph Neural Networks (GNNs) combine information from adjacent nodes by successive applications of graph convolutions (a minimal single-layer sketch appears after this list).
We study the generalization gaps of GNNs on both node-level and graph-level tasks.
We show that the generalization gaps decrease with the number of nodes in the training graphs.
arXiv Detail & Related papers (2024-06-07T19:25:02Z)
- Spatio-Spectral Graph Neural Networks [50.277959544420455]
We propose Spatio-Spectral Graph Neural Networks (S$^2$GNNs), which combine spatially and spectrally parametrized graph filters.
We show that S$^2$GNNs vanquish over-squashing and yield strictly tighter approximation-theoretic error bounds than MPGNNs.
arXiv Detail & Related papers (2024-05-29T14:28:08Z)
- PAC-Bayesian Adversarially Robust Generalization Bounds for Graph Neural Network [5.340644246815989]
Graph neural networks (GNNs) are vulnerable to adversarial attacks.
In this paper, we provide adversarially robust generalization bounds for two popular kinds of GNNs.
arXiv Detail & Related papers (2024-02-06T14:34:17Z)
- EvenNet: Ignoring Odd-Hop Neighbors Improves Robustness of Graph Neural Networks [51.42338058718487]
Graph Neural Networks (GNNs) have received extensive research attention for their promising performance in graph machine learning.
Existing approaches, such as GCN and GPRGNN, are not robust in the face of homophily changes on test graphs.
We propose EvenNet, a spectral GNN corresponding to an even-polynomial graph filter (a toy sketch of such a filter appears after this list).
arXiv Detail & Related papers (2022-05-27T10:48:14Z)
- Hierarchical Message-Passing Graph Neural Networks [12.207978823927386]
We propose a novel Hierarchical Message-Passing Graph Neural Networks framework.
The key idea is to generate a hierarchical structure that reorganises all nodes in a flat graph into multi-level super graphs.
We present the first model to implement this framework, termed Hierarchical Community-aware Graph Neural Network (HC-GNN).
arXiv Detail & Related papers (2020-09-08T13:11:07Z)
- Infinitely Wide Graph Convolutional Networks: Semi-supervised Learning via Gaussian Processes [144.6048446370369]
Graph convolutional neural networks (GCNs) have recently demonstrated promising results on graph-based semi-supervised classification.
We propose a GP regression model via GCNs (GPGC) for graph-based semi-supervised learning.
We conduct extensive experiments to evaluate GPGC and demonstrate that it outperforms other state-of-the-art methods.
arXiv Detail & Related papers (2020-02-26T10:02:32Z)
- Generalization and Representational Limits of Graph Neural Networks [46.20253808402385]
We prove that several important graph properties cannot be computed by graph neural networks (GNNs) that rely entirely on local information.
We provide the first data dependent generalization bounds for message passing GNNs.
Our bounds are much tighter than existing VC-dimension based guarantees for GNNs, and are comparable to Rademacher bounds for recurrent neural networks.
arXiv Detail & Related papers (2020-02-14T18:10:14Z)
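Several of the summaries above describe GNNs as applying graph convolutions successively. For readers who want that primitive spelled out, here is a minimal single-layer sketch in plain NumPy; the symmetric normalisation follows the widely used GCN formulation of Kipf & Welling, and the function and variable names are our own, not any paper's reference code.

```python
import numpy as np

def gcn_layer(adj: np.ndarray, h: np.ndarray, w: np.ndarray) -> np.ndarray:
    """One graph-convolution step: each node averages its neighbours'
    features (including its own, via self-loops), then applies a shared
    linear map and a ReLU. Stacking such layers gives the 'successive
    applications of graph convolutions' referred to above."""
    a_hat = adj + np.eye(adj.shape[0])        # add self-loops
    deg = a_hat.sum(axis=1)                   # node degrees
    d_inv_sqrt = np.diag(1.0 / np.sqrt(deg))  # D^{-1/2}
    a_norm = d_inv_sqrt @ a_hat @ d_inv_sqrt  # symmetric normalisation
    return np.maximum(a_norm @ h @ w, 0.0)    # aggregate, transform, ReLU

# Toy usage: a 3-node path graph with 2-dim features and 4 hidden units.
adj = np.array([[0, 1, 0], [1, 0, 1], [0, 1, 0]], dtype=float)
h = np.random.randn(3, 2)
w = np.random.randn(2, 4)
print(gcn_layer(adj, h, w).shape)  # (3, 4)
```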
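The EvenNet entry above refers to an even-polynomial graph filter. As a toy sketch of that idea only (not EvenNet's actual parametrization, which learns its coefficients), the filter below mixes powers P^0, P^2, P^4, ... of the normalised adjacency, so odd-hop neighbours never contribute.

```python
import numpy as np

def even_poly_filter(adj: np.ndarray, h: np.ndarray,
                     coeffs: list[float]) -> np.ndarray:
    """Apply an even-polynomial spectral filter sum_k c_k * P^{2k} to node
    features h, where P is the symmetrically normalised adjacency matrix.
    Because only even powers of P appear, information propagates along
    even-hop paths exclusively, i.e. odd-hop neighbours are ignored."""
    deg = adj.sum(axis=1)
    d_inv_sqrt = np.diag(1.0 / np.sqrt(np.maximum(deg, 1.0)))  # guard isolated nodes
    p = d_inv_sqrt @ adj @ d_inv_sqrt   # normalised adjacency
    p2 = p @ p                          # two-hop operator
    out = np.zeros_like(h)
    power = np.eye(adj.shape[0])        # P^0
    for c in coeffs:                    # c_0, c_2, c_4, ...
        out += c * (power @ h)
        power = power @ p2              # advance by two hops
    return out

# Toy usage: filter features on a 4-cycle with coefficients for P^0, P^2, P^4.
adj = np.array([[0, 1, 0, 1], [1, 0, 1, 0], [0, 1, 0, 1], [1, 0, 1, 0]], dtype=float)
h = np.random.randn(4, 3)
print(even_poly_filter(adj, h, coeffs=[1.0, 0.5, 0.25]).shape)  # (4, 3)
```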
This list is automatically generated from the titles and abstracts of the papers on this site.
The site does not guarantee the quality of this list (including any of the information it contains) and is not responsible for any consequences of its use.