Generalization and Representational Limits of Graph Neural Networks
- URL: http://arxiv.org/abs/2002.06157v1
- Date: Fri, 14 Feb 2020 18:10:14 GMT
- Title: Generalization and Representational Limits of Graph Neural Networks
- Authors: Vikas K. Garg, Stefanie Jegelka, and Tommi Jaakkola
- Abstract summary: We prove that several important graph properties cannot be computed by graph neural networks (GNNs) that rely entirely on local information.
We provide the first data-dependent generalization bounds for message passing GNNs.
Our bounds are much tighter than existing VC-dimension-based guarantees for GNNs, and are comparable to Rademacher bounds for recurrent neural networks.
- Score: 46.20253808402385
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We address two fundamental questions about graph neural networks (GNNs).
First, we prove that several important graph properties cannot be computed by
GNNs that rely entirely on local information. Such GNNs include the standard
message passing models, and more powerful spatial variants that exploit local
graph structure (e.g., via relative orientation of messages, or local port
ordering) to distinguish neighbors of each node. Our treatment includes a novel
graph-theoretic formalism. Second, we provide the first data-dependent
generalization bounds for message passing GNNs. This analysis explicitly
accounts for the local permutation invariance of GNNs. Our bounds are much
tighter than existing VC-dimension-based guarantees for GNNs, and are
comparable to Rademacher bounds for recurrent neural networks.
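For context, the following is a minimal sketch of one message passing layer of the kind the analysis covers, assuming sum aggregation and a tanh update; the weight names (W_msg, W_upd) and dimensions are illustrative choices, not from the paper.

```python
import numpy as np

def message_passing_layer(A, X, W_msg, W_upd):
    """One round of message passing: A is the (n x n) adjacency matrix,
    X the (n x d) node features. Summing over neighbors makes the layer
    invariant to any permutation of each node's neighborhood, which is
    the local permutation invariance the generalization analysis exploits."""
    messages = X @ W_msg          # per-node outgoing messages
    aggregated = A @ messages     # sum of incoming neighbor messages
    return np.tanh(X @ W_upd + aggregated)

rng = np.random.default_rng(0)
A = np.array([[0, 1, 1],
              [1, 0, 0],
              [1, 0, 0]], dtype=float)   # a 3-node star graph
X = rng.normal(size=(3, 4))
H = message_passing_layer(A, X, rng.normal(size=(4, 4)), rng.normal(size=(4, 4)))
print(H.shape)  # (3, 4)
```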
Related papers
- Geodesic Graph Neural Network for Efficient Graph Representation
Learning [34.047527874184134]
We propose an efficient GNN framework called Geodesic GNN (GDGNN).
It injects conditional relationships between nodes into the model without labeling.
Conditioned on the geodesic representations, GDGNN is able to generate node, link, and graph representations that carry much richer structural information than plain GNNs.
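As an illustration, here is a minimal sketch of one way geodesic conditioning could work: recover a shortest path with BFS and pool node embeddings along it into a link representation. The helper names and the mean pooling are assumptions for illustration, not GDGNN's actual implementation.

```python
from collections import deque
import numpy as np

def bfs_parents(adj, src):
    """BFS from src; parent pointers let us recover shortest paths."""
    parent = {src: None}
    queue = deque([src])
    while queue:
        u = queue.popleft()
        for v in adj[u]:
            if v not in parent:
                parent[v] = u
                queue.append(v)
    return parent

def geodesic_link_repr(adj, H, u, v):
    """Pool node embeddings H along one shortest u-v path (the geodesic)."""
    parent = bfs_parents(adj, u)
    path, node = [], v
    while node is not None:      # walk back from v to u via parent pointers
        path.append(node)
        node = parent[node]
    return H[path].mean(axis=0)  # geodesic-conditioned link representation

adj = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2]}      # a 4-node path graph
H = np.random.default_rng(1).normal(size=(4, 8))  # per-node embeddings
print(geodesic_link_repr(adj, H, 0, 3).shape)     # (8,)
```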
arXiv Detail & Related papers (2022-10-06T02:02:35Z)
- Generalizing Aggregation Functions in GNNs: High-Capacity GNNs via
Nonlinear Neighborhood Aggregators [14.573383849211773]
Graph neural networks (GNNs) have achieved great success in many graph learning tasks.
Existing GNNs mainly adopt either linear neighborhood aggregation (mean, sum) or a max aggregator in their message propagation.
We re-think the message propagation mechanism in GNNs and aim to develop general nonlinear aggregators for neighborhood information aggregation in GNNs.
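One concrete family of such nonlinear aggregators is the power mean, which interpolates between the mean (p = 1) and the max (as p grows large); the sketch below is illustrative and not necessarily the aggregator the paper proposes.

```python
import numpy as np

def power_mean_aggregate(neighbor_feats, p):
    """Aggregate non-negative neighbor features (k x d) with a power mean."""
    return np.mean(neighbor_feats ** p, axis=0) ** (1.0 / p)

feats = np.array([[1.0, 4.0],
                  [2.0, 2.0],
                  [3.0, 1.0]])
print(power_mean_aggregate(feats, p=1))   # equals the mean aggregator
print(power_mean_aggregate(feats, p=50))  # approaches the max aggregator
```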
arXiv Detail & Related papers (2022-02-18T11:49:59Z)
- Feature Correlation Aggregation: on the Path to Better Graph Neural
Networks [37.79964911718766]
Prior to the introduction of Graph Neural Networks (GNNs), modeling and analyzing irregular data, particularly graphs, was thought to be the Achilles' heel of deep learning.
This paper introduces a central node permutation variant function through a frustratingly simple and innocent-looking modification to the core operation of a GNN.
A tangible boost in performance is observed: the model surpasses previous state-of-the-art results by a significant margin while employing fewer parameters.
arXiv Detail & Related papers (2021-09-20T05:04:26Z)
- Identity-aware Graph Neural Networks [63.6952975763946]
We develop a class of message passing Graph Neural Networks, named Identity-aware GNNs (ID-GNNs), with greater expressive power than the 1-WL test.
ID-GNN extends existing GNN architectures by inductively considering nodes' identities during message passing.
We show that transforming existing GNNs into ID-GNNs yields an average 40% accuracy improvement on challenging node, edge, and graph property prediction tasks.
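A minimal sketch of the identity-aware idea as described: within a node's ego network, messages originating from the (identity-colored) center node pass through a separate weight matrix. The single-layer setup and weight names are illustrative assumptions, not the paper's code.

```python
import numpy as np

def id_gnn_layer(A, X, W_id, W_other, center):
    """One identity-aware round: messages from the center node use W_id,
    all other messages use W_other, so the center is distinguishable."""
    n = A.shape[0]
    is_center = np.zeros(n, dtype=bool)
    is_center[center] = True
    # Treating the center differently lets the model separate structures
    # (e.g., cycles through the center) that plain 1-WL message passing cannot.
    msgs = np.where(is_center[:, None], X @ W_id, X @ W_other)
    return np.tanh(A @ msgs)

rng = np.random.default_rng(2)
A = np.array([[0, 1, 1],
              [1, 0, 1],
              [1, 1, 0]], dtype=float)  # a triangle
X = rng.normal(size=(3, 4))
H = id_gnn_layer(A, X, rng.normal(size=(4, 4)), rng.normal(size=(4, 4)), center=0)
print(H.shape)  # (3, 4)
```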
arXiv Detail & Related papers (2021-01-25T18:59:01Z)
- The Surprising Power of Graph Neural Networks with Random Node
Initialization [54.4101931234922]
Graph neural networks (GNNs) are effective models for representation learning on relational data.
Standard GNNs are limited in their expressive power, as they cannot distinguish graphs beyond the capability of the Weisfeiler-Leman graph isomorphism test.
In this work, we analyze the expressive power of GNNs with random node initialization (RNI).
We prove that these models are universal, a first such result for GNNs not relying on computationally demanding higher-order properties.
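The mechanism itself is simple to sketch: append random features to each node before running any standard GNN. The dimensions and the Gaussian distribution below are illustrative choices.

```python
import numpy as np

def with_random_init(X, r_dim, rng):
    """Concatenate r_dim random features to node features X (n x d)."""
    R = rng.normal(size=(X.shape[0], r_dim))
    return np.concatenate([X, R], axis=1)  # shape (n, d + r_dim)

rng = np.random.default_rng(3)
X = np.ones((5, 3))                  # five nodes with identical base features
X_rni = with_random_init(X, r_dim=4, rng=rng)
# Structurally identical nodes now receive distinct inputs, which is what
# lifts expressive power beyond the 1-WL barrier (in a probabilistic sense).
print(X_rni.shape)                   # (5, 7)
```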
arXiv Detail & Related papers (2020-10-02T19:53:05Z)
- Permutation-equivariant and Proximity-aware Graph Neural Networks with
Stochastic Message Passing [88.30867628592112]
Graph neural networks (GNNs) are emerging machine learning models on graphs.
Permutation-equivariance and proximity-awareness are two important properties highly desirable for GNNs.
We show that existing GNNs, mostly based on the message-passing mechanism, cannot simultaneously preserve the two properties.
In order to preserve node proximities, we augment existing GNNs with stochastic node representations.
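One loose reading of the proximity-preserving idea, sketched below under that assumption: propagate random per-node vectors through the graph so that nearby nodes end up with correlated representations a downstream GNN can use alongside its ordinary output.

```python
import numpy as np

def propagated_random_features(A, dim, hops, rng):
    """Random per-node vectors mixed with neighbors for a few hops."""
    E = rng.normal(size=(A.shape[0], dim))
    for _ in range(hops):
        E = A @ E + E        # each hop blends a node with its neighborhood
    return E

rng = np.random.default_rng(4)
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)  # a 4-node path graph
E = propagated_random_features(A, dim=16, hops=2, rng=rng)
# Adjacent nodes (0,1) typically correlate more strongly than distant (0,3).
print(E[0] @ E[1], E[0] @ E[3])
```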
arXiv Detail & Related papers (2020-09-05T16:46:56Z)
- Graph Neural Networks: Architectures, Stability and Transferability [176.3960927323358]
Graph Neural Networks (GNNs) are information processing architectures for signals supported on graphs.
They are generalizations of convolutional neural networks (CNNs) in which individual layers contain banks of graph convolutional filters.
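In that view, a single graph convolutional filter is a polynomial in a graph shift operator S applied to a graph signal x, directly generalizing the shift-and-sum form of a CNN filter; a minimal sketch, using the adjacency matrix as S:

```python
import numpy as np

def graph_filter(S, x, h):
    """Apply the polynomial filter h(S) = sum_k h[k] * S^k to signal x."""
    out, Skx = np.zeros_like(x), x.copy()
    for hk in h:
        out += hk * Skx   # accumulate h_k * S^k x
        Skx = S @ Skx     # advance to the next power of S
    return out

S = np.array([[0, 1, 0],
              [1, 0, 1],
              [0, 1, 0]], dtype=float)  # a 3-node path graph
x = np.array([1.0, 0.0, 0.0])           # an impulse at node 0
print(graph_filter(S, x, h=[0.5, 0.3, 0.2]))  # a 3-tap graph filter
```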
arXiv Detail & Related papers (2020-08-04T18:57:36Z)
- Eigen-GNN: A Graph Structure Preserving Plug-in for GNNs [95.63153473559865]
Graph Neural Networks (GNNs) are emerging machine learning models on graphs.
Most existing GNN models in practice are shallow and essentially feature-centric.
We show empirically and analytically that the existing shallow GNNs cannot preserve graph structures well.
We propose Eigen-GNN, a plug-in module to boost GNNs' ability to preserve graph structures.
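A minimal sketch of the plug-in idea as the abstract describes it: precompute leading eigenvectors of the graph structure and concatenate them to the node features before running any GNN. Using the adjacency matrix and the choice of d here are illustrative assumptions.

```python
import numpy as np

def eigen_plugin(A, X, d):
    """Concatenate the d leading eigenvectors of A to node features X."""
    vals, vecs = np.linalg.eigh(A)                 # symmetric eigendecomposition
    top = vecs[:, np.argsort(-np.abs(vals))[:d]]   # d largest-magnitude modes
    return np.concatenate([X, top], axis=1)        # feed this to any GNN

rng = np.random.default_rng(5)
A = np.array([[0, 1, 1, 0],
              [1, 0, 1, 0],
              [1, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
X = rng.normal(size=(4, 3))
print(eigen_plugin(A, X, d=2).shape)               # (4, 5)
```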
arXiv Detail & Related papers (2020-06-08T02:47:38Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences of its use.