Almost Surely Asymptotically Constant Graph Neural Networks
- URL: http://arxiv.org/abs/2403.03880v2
- Date: Thu, 23 May 2024 15:03:45 GMT
- Title: Almost Surely Asymptotically Constant Graph Neural Networks
- Authors: Sam Adam-Day, Michael Benedikt, İsmail İlkan Ceylan, Ben Finkelshtein
- Abstract summary: We show that the output converges to a constant function, which upper-bounds what these classifiers can uniformly express.
Results apply to a broad class of random graph models.
We empirically validate these findings, observing that the convergence phenomenon appears not only on random graphs but also on some real-world graphs.
- Score: 7.339728196535312
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We present a new angle on the expressive power of graph neural networks (GNNs) by studying how the predictions of a GNN probabilistic classifier evolve as we apply it to larger graphs drawn from some random graph model. We show that the output converges to a constant function, which upper-bounds what these classifiers can uniformly express. This strong convergence phenomenon applies to a very wide class of GNNs, including state-of-the-art models, with aggregates including mean and the attention-based mechanism of graph transformers. Our results apply to a broad class of random graph models, including sparse and dense variants of the Erdős-Rényi model, the stochastic block model, and the Barabási-Albert model. We empirically validate these findings, observing that the convergence phenomenon appears not only on random graphs but also on some real-world graphs.
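The convergence is easy to see empirically. Below is a minimal sketch (an illustration, not the authors' code; the two-layer mean-aggregation model, edge probability p = 0.1, and the graph sizes are all assumptions) that applies a fixed, randomly initialized GNN to Erdős-Rényi graphs of growing size:

```python
# Sketch: a mean-aggregation GNN's graph-level output concentrates
# as Erdos-Renyi graphs grow. Requires only numpy.
import numpy as np

rng = np.random.default_rng(0)

def mean_agg_gnn(A, X, W1, W2):
    """Two mean-aggregation layers with tanh, then a mean readout."""
    deg = np.maximum(A.sum(axis=1, keepdims=True), 1.0)  # guard isolated nodes
    H = np.tanh((A @ X) / deg @ W1)   # layer 1: average neighbour features
    H = np.tanh((A @ H) / deg @ W2)   # layer 2
    return H.mean()                   # graph-level mean readout

d = 8
W1 = rng.normal(size=(d, d)) / np.sqrt(d)  # fixed random weights
W2 = rng.normal(size=(d, d)) / np.sqrt(d)

for n in [50, 100, 200, 400, 800, 1600]:
    outs = []
    for _ in range(10):                                # independent draws per size
        A = (rng.random((n, n)) < 0.1).astype(float)   # G(n, p = 0.1)
        A = np.triu(A, 1)
        A = A + A.T                                    # undirected, no self-loops
        X = rng.normal(size=(n, d))                    # i.i.d. node features
        outs.append(mean_agg_gnn(A, X, W1, W2))
    print(f"n={n:5d}  mean={np.mean(outs):+.4f}  std={np.std(outs):.4f}")
```

The standard deviation of the output across independent draws shrinks as n grows, matching the claim that the classifier's output converges to a constant.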
Related papers
- Generalization of Graph Neural Networks is Robust to Model Mismatch [84.01980526069075]
Graph neural networks (GNNs) have demonstrated their effectiveness in various tasks supported by their generalization capabilities.
In this paper, we examine GNNs that operate on geometric graphs generated from manifold models.
Our analysis reveals the robustness of the GNN generalization in the presence of such model mismatch.
arXiv Detail & Related papers (2024-08-25T16:00:44Z)
- Advective Diffusion Transformers for Topological Generalization in Graph Learning [69.2894350228753]
We show how graph diffusion equations extrapolate and generalize in the presence of varying graph topologies.
We propose a novel graph encoder backbone, Advective Diffusion Transformer (ADiT), inspired by advective graph diffusion equations.
arXiv Detail & Related papers (2023-10-10T08:40:47Z)
- Stability and Generalization Capabilities of Message Passing Graph Neural Networks [4.691259009382681]
We study the generalization capabilities of MPNNs in graph classification.
We derive a non-asymptotic bound on the generalization gap between the empirical and statistical loss.
This is proven by showing that an MPNN, applied to a graph, approximates the MPNN applied to the geometric model that the graph discretizes.
arXiv Detail & Related papers (2022-02-01T18:37:53Z)
- On the Power of Edge Independent Graph Models [22.085932117823738]
We study the limitations of edge independent random graph models, in which each edge is added to the graph independently with some probability.
We prove that subject to a bounded overlap condition, edge independent models are inherently limited in their ability to generate graphs with high triangle and other subgraph densities.
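To make "edge independent" concrete, here is a minimal sketch (an illustration with assumed Chung-Lu edge probabilities, not the paper's construction) that samples such a model and counts the triangles it produces:

```python
# Sketch: sample an edge-independent random graph and count triangles.
import numpy as np

rng = np.random.default_rng(1)

def sample_edge_independent(P):
    """Include each edge (i, j), i < j, independently with probability P[i, j]."""
    A = np.triu((rng.random(P.shape) < P).astype(float), 1)
    return A + A.T

def triangle_count(A):
    # trace(A^3) counts each triangle 6 times (3 starting vertices x 2 directions)
    return int(round(np.trace(A @ A @ A) / 6))

n = 300
w = rng.pareto(2.5, n) + 1.0                      # heavy-tailed expected degrees
P = np.clip(np.outer(w, w) / w.sum(), 0.0, 1.0)   # Chung-Lu edge probabilities
np.fill_diagonal(P, 0.0)

print("triangles:", [triangle_count(sample_edge_independent(P)) for _ in range(5)])
```

Per the result summarized above, under the bounded overlap condition no assignment of edge probabilities lets such a model match the high triangle densities seen in many real graphs.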
arXiv Detail & Related papers (2021-10-29T19:12:14Z)
- Generalization of graph network inferences in higher-order graphical models [5.33024001730262]
Probabilistic graphical models provide a powerful tool to describe complex statistical structure, but inferences such as marginalization are intractable for general graphs.
We define the Recurrent Factor Graph Neural Network (RF-GNN) to achieve fast approximate inference on graphical models that involve many-variable interactions.
arXiv Detail & Related papers (2021-07-12T20:51:27Z)
- Hyperbolic Variational Graph Neural Network for Modeling Dynamic Graphs [77.33781731432163]
We learn dynamic graph representations in hyperbolic space, for the first time, with the aim of inferring node representations.
We present a novel Hyperbolic Variational Graph Network, referred to as HVGNN.
In particular, to model the dynamics, we introduce a Temporal GNN (TGNN) based on a theoretically grounded time encoding approach.
arXiv Detail & Related papers (2021-04-06T01:44:15Z)
- Beyond Low-Pass Filters: Adaptive Feature Propagation on Graphs [6.018995094882323]
Graph neural networks (GNNs) have been extensively studied for prediction tasks on graphs.
Most GNNs assume local homophily, i.e., strong similarities in local neighborhoods.
We propose a flexible GNN model, which is capable of handling any graphs without being restricted by their underlying homophily.
arXiv Detail & Related papers (2021-03-26T00:35:36Z)
- Graph and graphon neural network stability [122.06927400759021]
Graph neural networks (GNNs) are learning architectures that rely on knowledge of the graph structure to generate meaningful representations of network data.
We analyze GNN stability using kernel objects called graphons.
arXiv Detail & Related papers (2020-10-23T16:55:56Z)
- A Unified View on Graph Neural Networks as Graph Signal Denoising [49.980783124401555]
Graph Neural Networks (GNNs) have risen to prominence in learning representations for graph structured data.
In this work, we establish mathematically that the aggregation processes in a group of representative GNN models can be regarded as solving a graph denoising problem.
We instantiate a novel GNN model, ADA-UGNN, derived from UGNN, to handle graphs with adaptive smoothness across nodes.
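The denoising problem in question takes the standard graph-signal form (the notation below is a hedged reconstruction from the graph signal processing literature, not copied from the paper): given node features X and a normalized graph Laplacian L, recover a smooth signal F by solving

```latex
% X: observed node features, F: recovered signal,
% L = I - D^{-1/2} A D^{-1/2}: normalized Laplacian, c > 0: smoothness weight
\min_{F}\; \|F - X\|_F^2 + c\,\operatorname{tr}\!\left(F^{\top} L F\right)
```

One gradient step from F = X with step size 1/(2c) gives (I - L)X = D^{-1/2} A D^{-1/2} X, the familiar GCN aggregation, which is the sense in which aggregation can be regarded as solving a denoising problem.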
arXiv Detail & Related papers (2020-10-05T04:57:18Z)
- Block-Approximated Exponential Random Graphs [77.4792558024487]
An important challenge in the field of exponential random graphs (ERGs) is the fitting of non-trivial ERGs on large graphs.
We propose an approximation framework for such non-trivial ERGs that results in dyadically independent (i.e., edge-independent) distributions.
Our methods are scalable to sparse graphs consisting of millions of nodes.
arXiv Detail & Related papers (2020-02-14T11:42:16Z)
- The Power of Graph Convolutional Networks to Distinguish Random Graph Models: Short Version [27.544219236164764]
Graph convolutional networks (GCNs) are a widely used method for graph representation learning.
We investigate the power of GCNs to distinguish between different random graph models on the basis of the embeddings of their sample graphs.
arXiv Detail & Related papers (2020-02-13T17:58:42Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences of its use.