Stability and Generalization Capabilities of Message Passing Graph Neural Networks
- URL: http://arxiv.org/abs/2202.00645v1
- Date: Tue, 1 Feb 2022 18:37:53 GMT
- Title: Stability and Generalization Capabilities of Message Passing Graph Neural Networks
- Authors: Sohir Maskey, Yunseok Lee, Ron Levie, Gitta Kutyniok
- Abstract summary: We study the generalization capabilities of MPNNs in graph classification.
We derive a non-asymptotic bound on the generalization gap between the empirical and statistical loss.
This is proven by showing that an MPNN applied to a graph approximates the MPNN applied to the geometric model that the graph discretizes.
- Score: 4.691259009382681
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Message passing neural networks (MPNNs) have seen a steep rise in
popularity since their introduction as generalizations of convolutional neural
networks to graph-structured data, and are now considered state-of-the-art
tools for solving a large variety of graph-focused problems. We study the
generalization capabilities of MPNNs in graph classification. We assume that
graphs of different classes are sampled from different random graph models.
Based on this data distribution, we derive a non-asymptotic bound on the
generalization gap between the empirical and statistical loss, which decreases
to zero as the graphs become larger. This is proven by showing that an MPNN
applied to a graph approximates the MPNN applied to the geometric model that
the graph discretizes.
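For concreteness, the quantity being bounded can be written schematically as follows; the notation (Phi for the MPNN, ell for the loss, m for the number of training graphs) is our own choice for illustration, not necessarily the paper's.

```latex
% Schematic definition of the generalization gap for an MPNN \Phi trained
% on m graphs (G_i, y_i) drawn i.i.d. from the data distribution:
\[
\mathrm{gap}(\Phi)
  \;=\; \left| \frac{1}{m} \sum_{i=1}^{m} \ell\bigl(\Phi(G_i), y_i\bigr)
  \;-\; \mathbb{E}_{(G,y)}\bigl[\ell(\Phi(G), y)\bigr] \right|
\]
% The paper's non-asymptotic bound controls gap(\Phi) by a term that
% tends to zero as the number of nodes in the sampled graphs grows.
```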
Related papers
- Generalization of Geometric Graph Neural Networks [84.01980526069075]
We study the generalization capabilities of geometric graph neural networks (GNNs).
We prove a bound on the generalization gap between the optimal empirical risk and the optimal statistical risk of this GNN.
The key observation is that generalization can be realized with a single large graph, rather than being limited by the size of the training graphs as in previous results.
arXiv Detail & Related papers (2024-09-08T18:55:57Z)
- A Manifold Perspective on the Statistical Generalization of Graph Neural Networks [84.01980526069075]
Graph Neural Networks (GNNs) combine information from adjacent nodes by successive applications of graph convolutions.
We study the generalization gaps of GNNs on both node-level and graph-level tasks.
We show that the generalization gaps decrease with the number of nodes in the training graphs.
arXiv Detail & Related papers (2024-06-07T19:25:02Z) - Generalization Bounds for Message Passing Networks on Mixture of Graphons [21.457225542391267]
We study the generalization capabilities of Message Passing Neural Networks (MPNNs).
We derive generalization bounds specifically for MPNNs with normalized sum aggregation and mean aggregation (a minimal sketch of these two aggregations appears after this list).
Our results imply that MPNNs with higher complexity than the size of the training set can still generalize effectively.
arXiv Detail & Related papers (2024-04-04T14:26:47Z)
- Learning to Reweight for Graph Neural Network [63.978102332612906]
Graph Neural Networks (GNNs) show promising results for graph tasks.
Existing GNNs' generalization ability degrades when there are distribution shifts between training and testing graph data.
We propose a novel nonlinear graph decorrelation method, which can substantially improve the out-of-distribution generalization ability.
arXiv Detail & Related papers (2023-12-19T12:25:10Z)
- Graph Anomaly Detection with Graph Neural Networks: Current Status and Challenges [9.076649460696402]
Graph neural networks (GNNs) have been studied extensively and have successfully performed difficult machine learning tasks.
This survey is the first comprehensive review of graph anomaly detection methods based on GNNs.
arXiv Detail & Related papers (2022-09-29T16:47:57Z)
- MentorGNN: Deriving Curriculum for Pre-Training GNNs [61.97574489259085]
We propose an end-to-end model named MentorGNN that aims to supervise the pre-training process of GNNs across graphs.
We shed new light on the problem of domain adaptation on relational data (i.e., graphs) by deriving a natural and interpretable upper bound on the generalization error of the pre-trained GNNs.
arXiv Detail & Related papers (2022-08-21T15:12:08Z)
- KGNN: Harnessing Kernel-based Networks for Semi-supervised Graph Classification [13.419578861488226]
We propose a Kernel-based Graph Neural Network (KGNN) for semi-supervised graph classification.
We show that KGNN achieves impressive performance over competitive baselines.
arXiv Detail & Related papers (2022-05-21T10:03:46Z)
- Transferability Properties of Graph Neural Networks [125.71771240180654]
Graph neural networks (GNNs) are provably successful at learning representations from data supported on moderate-scale graphs.
We study the problem of training GNNs on graphs of moderate size and transferring them to large-scale graphs.
Our results show that (i) the transference error decreases with the graph size, and (ii) graph filters exhibit a transferability-discriminability tradeoff that, in GNNs, is alleviated by the scattering behavior of the nonlinearity.
arXiv Detail & Related papers (2021-12-09T00:08:09Z)
- OOD-GNN: Out-of-Distribution Generalized Graph Neural Network [73.67049248445277]
Graph neural networks (GNNs) have achieved impressive performance when testing and training graph data come from the same distribution.
Existing GNNs lack out-of-distribution generalization ability, so their performance degrades substantially under distribution shifts between testing and training graph data.
We propose an out-of-distribution generalized graph neural network (OOD-GNN) for achieving satisfactory performance on unseen testing graphs whose distributions differ from those of the training graphs.
arXiv Detail & Related papers (2021-12-07T16:29:10Z)
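As referenced in the mixture-of-graphons entry above, the following is a minimal, self-contained sketch (NumPy only) of a message passing layer supporting the two aggregations discussed there, together with a toy illustration of the discretization intuition from the abstract: graphs sampled from the same random graph model yield increasingly stable MPNN outputs as they grow. All names here (mpnn_layer, sample_er_graph, the weight shapes) are illustrative, not taken from any of these papers.

```python
# Illustrative only: a hand-rolled message passing layer, not code from the papers.
import numpy as np

rng = np.random.default_rng(0)

def mpnn_layer(A, X, W, aggregation="mean"):
    """One message passing step: aggregate neighbor features,
    then apply a shared linear map and a ReLU nonlinearity."""
    deg = A.sum(axis=1, keepdims=True)        # node degrees, shape (n, 1)
    if aggregation == "mean":
        M = (A @ X) / np.maximum(deg, 1.0)    # average over neighbors
    elif aggregation == "normalized_sum":
        M = (A @ X) / A.shape[0]              # neighbor sum scaled by graph size n
    else:
        raise ValueError(f"unknown aggregation: {aggregation}")
    return np.maximum(M @ W, 0.0)             # ReLU(messages @ W)

def readout(H):
    """Graph-level representation: mean over node embeddings."""
    return H.mean(axis=0)

def sample_er_graph(n, p):
    """Erdos-Renyi graph: a simple stand-in for the random graph
    models / geometric models discussed above."""
    U = rng.random((n, n))
    A = (np.triu(U, 1) < p).astype(float)     # upper triangle, no self-loops
    return A + A.T                            # symmetrize

# Toy experiment: as graphs sampled from the same model grow, the
# graph-level output of a fixed 2-layer MPNN stabilizes -- the intuition
# behind the vanishing generalization gap.
d = 8
W1 = rng.standard_normal((d, d)) / np.sqrt(d)
W2 = rng.standard_normal((d, d)) / np.sqrt(d)
for n in (50, 200, 800):
    A = sample_er_graph(n, p=0.3)
    X = np.ones((n, d))                       # constant initial node features
    H = mpnn_layer(A, X, W1, aggregation="mean")
    H = mpnn_layer(A, H, W2, aggregation="mean")
    print(n, np.round(readout(H)[:3], 4))     # first few readout coordinates
```

With mean aggregation the printed readout vectors are nearly identical across the three graph sizes, illustrating the convergence intuition; swapping in aggregation="normalized_sum" exercises the size-scaled variant covered by the graphon-mixture bounds.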