OOD Link Prediction Generalization Capabilities of Message-Passing GNNs
in Larger Test Graphs
- URL: http://arxiv.org/abs/2205.15117v2
- Date: Wed, 1 Jun 2022 11:23:17 GMT
- Title: OOD Link Prediction Generalization Capabilities of Message-Passing GNNs
in Larger Test Graphs
- Authors: Yangze Zhou, Gitta Kutyniok, Bruno Ribeiro
- Abstract summary: This work provides the first theoretical study on the ability of graph Message Passing Neural Networks (gMPNNs) to perform inductive out-of-distribution (OOD) link prediction tasks.
We first prove non-asymptotic bounds showing that link predictors based on permutation-equivariant (structural) node embeddings can converge to a random guess as test graphs get larger.
We then propose a theoretically-sound gMPNN that outputs structural pairwise (2-node) embeddings and prove non-asymptotic bounds showing that, as test graphs grow, these embeddings converge to embeddings of a continuous function that retains its ability to predict links OOD.
- Score: 13.118954965368333
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: This work provides the first theoretical study on the ability of graph
Message Passing Neural Networks (gMPNNs) -- such as Graph Neural Networks
(GNNs) -- to perform inductive out-of-distribution (OOD) link prediction tasks,
where deployment (test) graph sizes are larger than training graphs. We first
prove non-asymptotic bounds showing that link predictors based on
permutation-equivariant (structural) node embeddings obtained by gMPNNs can
converge to a random guess as test graphs get larger. We then propose a
theoretically-sound gMPNN that outputs structural pairwise (2-node) embeddings
and prove non-asymptotic bounds showing that, as test graphs grow, these
embeddings converge to embeddings of a continuous function that retains its
ability to predict links OOD. Empirical results on random graphs show agreement
with our theoretical results.
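The contrast in the abstract can be illustrated with a small numerical sketch. The snippet below is not the paper's construction: the two-block stochastic block model, the mean-aggregation gMPNN, and the common-neighbour pairwise statistic are illustrative assumptions chosen to mimic the behaviour the bounds describe, namely that link scores built from permutation-equivariant structural node embeddings collapse to (nearly) the same value for every pair on large symmetric random graphs, while a structural pairwise (2-node) statistic keeps separating likely from unlikely links.

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_sbm(n, p_in, p_out):
    """Two-block stochastic block model on n nodes (a simple graphon-style model)."""
    z = np.repeat([0, 1], n // 2)
    P = np.where(z[:, None] == z[None, :], p_in, p_out)
    A = np.triu(rng.random((n, n)) < P, k=1).astype(float)
    return A + A.T, z

def structural_node_embeddings(A, layers=2):
    """Toy permutation-equivariant gMPNN: mean aggregation over neighbours,
    starting from normalised degrees (structure only, no node attributes)."""
    n = A.shape[0]
    deg = np.maximum(A.sum(axis=1, keepdims=True), 1.0)
    H = deg / n
    for _ in range(layers):
        H = np.tanh(A @ H / deg)
    return H

def pairwise_embedding(A):
    """Toy structural 2-node embedding: common-neighbour count scaled by 1/n
    (an illustrative stand-in for the pairwise construction analysed in the paper)."""
    return (A @ A) / A.shape[0]

for n in (200, 1000, 3000):
    A, z = sample_sbm(n, p_in=0.10, p_out=0.02)
    same = (z[:, None] == z[None, :]) & ~np.eye(n, dtype=bool)  # likely (same-block) pairs
    diff = z[:, None] != z[None, :]                             # unlikely (cross-block) pairs

    H = structural_node_embeddings(A)
    S_node = H @ H.T                  # link score = inner product of node embeddings
    S_pair = pairwise_embedding(A)    # link score = pairwise structural statistic

    # Separation between likely and unlikely pairs: the node-embedding gap is
    # essentially zero (the predictor cannot tell the two pair types apart, i.e.,
    # a random guess), while the pairwise gap stays bounded away from zero.
    print(n,
          S_node[same].mean() - S_node[diff].mean(),
          S_pair[same].mean() - S_pair[diff].mean())
```

In this toy model both blocks have the same expected degree, so structure-only node embeddings converge to the same value for every node and any score built from them loses the within- vs. cross-block signal; the common-neighbour statistic instead converges to distinct limits for the two pair types, which is the kind of behaviour the paper's pairwise embeddings are designed to preserve.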
Related papers
- Graph neural networks and non-commuting operators [4.912318087940015]
We develop a limit theory of graphon-tuple neural networks and use it to prove a universal transferability theorem.
Our theoretical results extend well-known transferability theorems for GNNs to the case of several simultaneous graphs.
We derive a training procedure that provably enforces the stability of the resulting model.
arXiv Detail & Related papers (2024-11-06T21:17:14Z) - Generalization of Geometric Graph Neural Networks [84.01980526069075]
We study the generalization capabilities of geometric graph neural networks (GNNs)
We prove a generalization gap between the optimal empirical risk and the optimal statistical risk of this GNN.
The most important observation is that the generalization capability can be realized by training on a single large graph, rather than being limited by the size of the training graph as in previous results.
arXiv Detail & Related papers (2024-09-08T18:55:57Z) - A Manifold Perspective on the Statistical Generalization of Graph Neural Networks [84.01980526069075]
We take a manifold perspective to establish the statistical generalization theory of GNNs on graphs sampled from a manifold in the spectral domain.
We prove that the generalization bounds of GNNs decrease linearly with graph size on a logarithmic scale and increase linearly with the spectral continuity constants of the filter functions.
arXiv Detail & Related papers (2024-06-07T19:25:02Z) - Disentangling Node Attributes from Graph Topology for Improved
Generalizability in Link Prediction [5.651457382936249]
Our proposed method, UPNA, solves the inductive link prediction problem by learning a function that takes a pair of node attributes and predicts the probability of an edge.
UPNA can be applied to various pairwise learning tasks and integrated with existing link prediction models to enhance their generalizability and bolster graph generative models.
arXiv Detail & Related papers (2023-07-17T22:19:12Z) - What functions can Graph Neural Networks compute on random graphs? The
role of Positional Encoding [0.0]
We aim to deepen the theoretical understanding of Graph Neural Networks (GNNs) on large graphs, with a focus on their expressive power.
Recently, several works showed that, on very general random graph models, GNNs converge to certain functions as the number of nodes grows.
arXiv Detail & Related papers (2023-05-24T07:09:53Z) - Generalization in Graph Neural Networks: Improved PAC-Bayesian Bounds on
Graph Diffusion [17.70434437597516]
We present generalization bounds that scale with the largest singular value of the graph neural network's feature diffusion matrix.
These bounds are numerically much smaller than prior bounds for real-world graphs.
We measure the stability of graph neural networks against noise perturbations using Hessians.
arXiv Detail & Related papers (2023-02-09T05:54:17Z) - When Do We Need Graph Neural Networks for Node Classification? [38.68793097833027]
Graph Neural Networks (GNNs) extend basic Neural Networks (NNs) by additionally making use of graph structure.
In some cases, GNNs have little performance gain or even underperform graph-agnostic NNs.
arXiv Detail & Related papers (2022-10-30T23:10:23Z) - Generalizing Graph Neural Networks on Out-Of-Distribution Graphs [51.33152272781324]
Most Graph Neural Networks (GNNs) are designed without considering distribution shifts between training and testing graphs.
In such a setting, GNNs tend to exploit subtle statistical correlations in the training set for prediction, even when those correlations are spurious.
We propose a general causal representation framework, called StableGNN, to eliminate the impact of spurious correlations.
arXiv Detail & Related papers (2021-11-20T18:57:18Z) - Distance Encoding: Design Provably More Powerful Neural Networks for
Graph Representation Learning [63.97983530843762]
Graph Neural Networks (GNNs) have achieved great success in graph representation learning.
GNNs generate identical representations for graph substructures that may in fact be very different.
More powerful GNNs, proposed recently by mimicking higher-order Weisfeiler-Lehman tests, are inefficient because they cannot exploit the sparsity of the underlying graph structure.
We propose Distance Encoding (DE) as a new class of techniques for graph representation learning.
arXiv Detail & Related papers (2020-08-31T23:15:40Z) - Block-Approximated Exponential Random Graphs [77.4792558024487]
An important challenge in the field of exponential random graphs (ERGs) is the fitting of non-trivial ERGs on large graphs.
We propose an approximation framework for fitting such non-trivial ERGs that results in dyadic independence (i.e., edge-independent) distributions.
Our methods are scalable to sparse graphs consisting of millions of nodes.
arXiv Detail & Related papers (2020-02-14T11:42:16Z) - Can Graph Neural Networks Count Substructures? [53.256112515435355]
We study the power of graph neural networks (GNNs) via their ability to count attributed graph substructures.
We distinguish between two types of substructure counting: induced-subgraph-count and subgraph-count, and establish both positive and negative answers for popular GNN architectures.
arXiv Detail & Related papers (2020-02-10T18:53:30Z)