Geometric instability of graph neural networks on large graphs
- URL: http://arxiv.org/abs/2308.10099v2
- Date: Tue, 28 Nov 2023 10:35:06 GMT
- Title: Geometric instability of graph neural networks on large graphs
- Authors: Emily Morris, Haotian Shen, Weiling Du, Muhammad Hamza Sajjad, Borun Shi
- Abstract summary: We analyse the geometric instability of embeddings produced by graph neural networks (GNNs)
We propose a simple, efficient and graph-native Graph Gram Index (GGI) to measure such instability.
This allows us to study the varying instability behaviour of GNN embeddings on large graphs for both node classification and link prediction.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We analyse the geometric instability of embeddings produced by graph neural
networks (GNNs). Existing methods are only applicable to small graphs and
lack context in the graph domain. We propose a simple, efficient and
graph-native Graph Gram Index (GGI) to measure such instability, which is
invariant to permutation, orthogonal transformation, translation and order
of evaluation.
This allows us to study the varying instability behaviour of GNN embeddings on
large graphs for both node classification and link prediction.
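The paper defines GGI precisely; the following numpy sketch is only a rough illustration of the stated invariances (hypothetical helper names, not the paper's construction). It compares centred Gram matrices across repeated training runs: centring removes translations, the Gram matrix removes orthogonal transformations, and relabelling both runs by the same permutation leaves the score unchanged.

```python
import numpy as np

def gram_similarity(X, Y):
    """Cosine similarity between centred Gram matrices of two embedding runs.

    Centring removes translations; X @ X.T removes orthogonal maps, since
    (XQ)(XQ)^T = X X^T; relabelling both runs by the same permutation P
    conjugates both Gram matrices by P and leaves the score unchanged.
    """
    Xc = X - X.mean(axis=0, keepdims=True)
    Yc = Y - Y.mean(axis=0, keepdims=True)
    Gx, Gy = Xc @ Xc.T, Yc @ Yc.T
    return float((Gx * Gy).sum() / (np.linalg.norm(Gx) * np.linalg.norm(Gy)))

def instability(runs):
    """One minus the mean pairwise similarity; symmetric in evaluation order."""
    sims = [gram_similarity(runs[i], runs[j])
            for i in range(len(runs)) for j in range(i + 1, len(runs))]
    return 1.0 - float(np.mean(sims))

# Two runs differing only by a rotation and a translation are fully stable.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 16))
Q, _ = np.linalg.qr(rng.normal(size=(16, 16)))
print(instability([X, X @ Q + 3.0]))   # ~0.0
```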
Related papers
- NodeFormer: A Scalable Graph Structure Learning Transformer for Node Classification [70.51126383984555]
We introduce a novel all-pair message passing scheme for efficiently propagating node signals between arbitrary nodes.
The efficient computation is enabled by a kernelized Gumbel-Softmax operator.
Experiments demonstrate the promising efficacy of the method in various tasks including node classification on graphs.
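NodeFormer's kernelized operator is not reproduced here; the sketch below only illustrates the underlying Gumbel-Softmax relaxation that makes sampled all-pair message passing differentiable (numpy, hypothetical names).

```python
import numpy as np

def gumbel_softmax_weights(logits, tau=0.5, rng=None):
    """Differentiable relaxation of sampling one neighbour per node.

    Adding Gumbel noise and applying a low-temperature softmax gives
    near-one-hot rows; NodeFormer additionally kernelises this operator so
    the all-pair aggregation avoids an explicit O(n^2) attention matrix,
    which this plain version does not reproduce.
    """
    rng = rng or np.random.default_rng()
    gumbel = -np.log(-np.log(rng.uniform(size=logits.shape)))
    z = (logits + gumbel) / tau
    z = z - z.max(axis=-1, keepdims=True)     # numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

rng = np.random.default_rng(1)
n, d = 5, 8
H = rng.normal(size=(n, d))                   # node features
scores = H @ H.T                              # hypothetical all-pair scores
Wmp = gumbel_softmax_weights(scores, rng=rng)
H_next = Wmp @ H                              # relaxed sampled message passing
```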
arXiv Detail & Related papers (2023-06-14T09:21:15Z) - A Spectral Analysis of Graph Neural Networks on Dense and Sparse Graphs [13.954735096637298]
We analyze how sparsity affects the graph spectra, and thus the performance of graph neural networks (GNNs) in node classification on dense and sparse graphs.
We show that GNNs can outperform spectral methods on sparse graphs, and illustrate these results with numerical examples on both synthetic and real graphs.
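As a hedged illustration of the kind of comparison involved (not the paper's analysis), the following sketch computes normalized-adjacency spectra for Erdős–Rényi graphs in a dense and a sparse regime.

```python
import numpy as np

def normalized_adjacency_spectrum(p, n=400, seed=0):
    """Eigenvalues of D^{-1/2} A D^{-1/2} for an Erdos-Renyi graph G(n, p)."""
    rng = np.random.default_rng(seed)
    A = (rng.uniform(size=(n, n)) < p).astype(float)
    A = np.triu(A, 1)
    A = A + A.T                                   # symmetric, no self-loops
    d = A.sum(axis=1)
    d[d == 0] = 1.0                               # guard isolated nodes
    s = 1.0 / np.sqrt(d)
    return np.linalg.eigvalsh(s[:, None] * A * s[None, :])

# The bulk of the spectrum concentrates near zero in the dense regime and
# stays much wider when the mean degree is constant -- the spectral contrast
# the analysis turns on.
dense = normalized_adjacency_spectrum(p=0.5)
sparse = normalized_adjacency_spectrum(p=5 / 400)
print(dense[:-1].max(), sparse[:-1].max())        # bulk edge, excluding lambda_1
```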
arXiv Detail & Related papers (2022-11-06T22:38:13Z) - Training Graph Neural Networks on Growing Stochastic Graphs [114.75710379125412]
Graph Neural Networks (GNNs) rely on graph convolutions to exploit meaningful patterns in networked data.
We propose to learn GNNs on very large graphs by leveraging the limit object of a sequence of growing graphs, the graphon.
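A minimal sketch of sampling a growing graph sequence from a graphon, the limit object in question (toy graphon, hypothetical names).

```python
import numpy as np

def sample_from_graphon(W, n, rng):
    """Sample an n-node graph from a graphon W: [0,1]^2 -> [0,1].

    Latent positions are i.i.d. uniform and each edge is an independent
    Bernoulli(W(u_i, u_j)) draw, so increasing n yields the convergent
    growing sequence whose limit object the training scheme exploits.
    """
    u = rng.uniform(size=n)
    P = W(u[:, None], u[None, :])
    A = (rng.uniform(size=(n, n)) < P).astype(float)
    A = np.triu(A, 1)
    return A + A.T

rng = np.random.default_rng(0)
W = lambda x, y: 0.8 * np.exp(-3.0 * np.abs(x - y))   # a toy smooth graphon
small = sample_from_graphon(W, 50, rng)
large = sample_from_graphon(W, 500, rng)              # same limit, more nodes
```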
arXiv Detail & Related papers (2022-10-27T16:00:45Z) - FoSR: First-order spectral rewiring for addressing oversquashing in GNNs [0.0]
Graph neural networks (GNNs) are able to leverage the structure of graph data by passing messages along the edges of the graph.
We propose a computationally efficient algorithm that prevents oversquashing by systematically adding edges to the graph.
We find experimentally that our algorithm outperforms existing graph rewiring methods in several graph classification tasks.
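FoSR scores candidate edges with a first-order eigenvalue estimate; the sketch below swaps in exact recomputation of the spectral gap, which captures the objective but not the paper's efficiency.

```python
import numpy as np
from itertools import combinations

def spectral_gap(A):
    """lambda_2 of the normalized Laplacian I - D^{-1/2} A D^{-1/2}."""
    d = A.sum(axis=1)
    d[d == 0] = 1.0
    s = 1.0 / np.sqrt(d)
    L = np.eye(len(A)) - s[:, None] * A * s[None, :]
    return np.linalg.eigvalsh(L)[1]

def rewire(A, n_new_edges):
    """Greedily add the non-edge that most enlarges the spectral gap.

    FoSR ranks candidates with a first-order eigenvalue estimate; recomputing
    the gap exactly, as here, is O(n^3) per candidate and only viable on
    small graphs, but targets the same objective.
    """
    A = A.copy()
    for _ in range(n_new_edges):
        best, best_gap = None, spectral_gap(A)
        for i, j in combinations(range(len(A)), 2):
            if A[i, j] == 0:
                A[i, j] = A[j, i] = 1
                gap = spectral_gap(A)
                if gap > best_gap:
                    best, best_gap = (i, j), gap
                A[i, j] = A[j, i] = 0
        if best is None:
            break
        A[best[0], best[1]] = A[best[1], best[0]] = 1
    return A

# A 6-node path gains the long-range shortcut that most relieves the bottleneck.
A = np.zeros((6, 6))
idx = np.arange(5)
A[idx, idx + 1] = 1
A[idx + 1, idx] = 1
print(rewire(A, 1))
```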
arXiv Detail & Related papers (2022-10-21T07:58:03Z) - SpeqNets: Sparsity-aware Permutation-equivariant Graph Networks [16.14454388348814]
We present a class of universal, permutation-equivariant graph networks.
They offer fine-grained control of the trade-off between expressivity and scalability and adapt to the sparsity of the graph.
These architectures lead to vastly reduced computation times compared to standard higher-order graph networks.
arXiv Detail & Related papers (2022-03-25T21:17:09Z) - Transferability Properties of Graph Neural Networks [125.71771240180654]
Graph neural networks (GNNs) are provably successful at learning representations from data supported on moderate-scale graphs.
We study the problem of training GNNs on graphs of moderate size and transferring them to large-scale graphs.
Our results show that (i) the transference error decreases with the graph size, and (ii) graph filters have a transferability-discriminability tradeoff that, in GNNs, is alleviated by the scattering behavior of the nonlinearity.
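As a rough illustration of the setting (not the paper's bounds), the sketch below applies one shared set of polynomial graph-filter coefficients to graphs of two sizes sampled from the same toy graphon.

```python
import numpy as np

def graph_filter(A, x, h):
    """Polynomial graph filter sum_k h[k] * S^k x, S the normalized adjacency."""
    d = A.sum(axis=1)
    d[d == 0] = 1.0
    s = 1.0 / np.sqrt(d)
    S = s[:, None] * A * s[None, :]
    y, Sx = np.zeros_like(x), x.copy()
    for hk in h:
        y += hk * Sx
        Sx = S @ Sx
    return y

rng = np.random.default_rng(0)
W = lambda x, y: 0.6 * (1.0 - np.abs(x - y))       # toy graphon

def sample(n):
    u = np.sort(rng.uniform(size=n))
    P = W(u[:, None], u[None, :])
    A = (rng.uniform(size=(n, n)) < P).astype(float)
    A = np.triu(A, 1)
    return A + A.T, u

h = np.array([0.5, 0.3, 0.2])                      # one shared set of filter taps
for n in (100, 1000):
    A, u = sample(n)
    y = graph_filter(A, np.sin(2 * np.pi * u), h)  # graphon signal on n nodes
    print(n, float(y.mean()), float(y.std()))      # summary statistics per size
```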
arXiv Detail & Related papers (2021-12-09T00:08:09Z) - Increase and Conquer: Training Graph Neural Networks on Growing Graphs [116.03137405192356]
We consider the problem of learning a graphon neural network (WNN) by training GNNs on graphs sampled from the graphon, with edges drawn as Bernoulli random variables.
Inspired by these results, we propose an algorithm to learn GNNs on large-scale graphs that, starting from a moderate number of nodes, successively increases the size of the graph during training.
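A schematic of the grow-and-train schedule, with a no-op placeholder standing in for the actual GNN update (names hypothetical).

```python
import numpy as np

rng = np.random.default_rng(0)
W = lambda x, y: np.minimum(x, y)                  # toy graphon

def sample(n):
    u = rng.uniform(size=n)
    A = (rng.uniform(size=(n, n)) < W(u[:, None], u[None, :])).astype(float)
    A = np.triu(A, 1)
    return A + A.T

def train_epoch(theta, A):
    """No-op stand-in for one GNN training epoch on adjacency A."""
    return theta                                    # a real update would run SGD here

# Grow-and-train: each stage warm-starts the next, relying on the result
# that parameters learned on smaller sampled graphs remain good on larger ones.
theta = np.zeros(8)                                 # hypothetical GNN parameters
for n in (64, 128, 256, 512):
    A = sample(n)
    for epoch in range(10):
        theta = train_epoch(theta, A)
```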
arXiv Detail & Related papers (2021-06-07T15:05:59Z) - Towards Scale-Invariant Graph-related Problem Solving by Iterative
Homogeneous Graph Neural Networks [39.370875358317946]
Current graph neural networks (GNNs) lack generalizability with respect to scales (graph sizes, graph diameters, edge weights, etc.) when solving many graph analysis problems.
We propose several extensions to address the issue. First, inspired by the dependency of the iteration count of common graph theory algorithms on graph size, we learn to terminate the message passing process in GNNs adaptively according to its progress.
Second, inspired by the fact that many graph theory algorithms are homogeneous with respect to graph weights, we introduce homogeneous transformation layers, which are universal homogeneous function approximators, to convert ordinary GNNs into homogeneous ones.
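Neither extension's learned architecture is reproduced here; the sketch below illustrates both ideas on a classical algorithm: Bellman-Ford-style message passing that terminates adaptively on convergence, and whose min-plus update is homogeneous in the edge weights.

```python
import numpy as np

def distances_adaptive(Wgt, source):
    """Min-plus message passing that terminates when it stops making progress.

    Two properties mirror the paper's extensions: the iteration count tracks
    the graph diameter instead of a fixed layer budget, and the min-plus
    update is homogeneous of degree 1 (scaling all edge weights by c scales
    all outputs by c). The paper learns these behaviours; here they are
    hard-coded on a classical algorithm.
    """
    n = len(Wgt)
    dist = np.full(n, np.inf)
    dist[source] = 0.0
    for it in range(n):                                  # n-1 rounds always suffice
        relaxed = np.min(dist[None, :] + Wgt, axis=1)    # message + aggregate
        new = np.minimum(dist, relaxed)
        if np.allclose(new, dist):                       # adaptive termination
            return new, it
        dist = new
    return dist, n

INF = np.inf
Wgt = np.array([[INF, 1.0, INF, 2.0],
                [1.0, INF, 1.0, INF],
                [INF, 1.0, INF, 1.0],
                [2.0, INF, 1.0, INF]])
print(distances_adaptive(Wgt, 0))       # converges in ~diameter rounds
print(distances_adaptive(2 * Wgt, 0))   # doubled weights -> doubled distances
```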
arXiv Detail & Related papers (2020-10-26T12:57:28Z) - Meta-path Free Semi-supervised Learning for Heterogeneous Networks [16.641434334366227]
Graph neural networks (GNNs) have been widely used in representation learning on graphs and achieved superior performance in tasks such as node classification.
In this paper, we propose simple and effective graph neural networks for heterogeneous graphs, excluding the use of meta-paths.
arXiv Detail & Related papers (2020-10-18T06:01:58Z) - Dirichlet Graph Variational Autoencoder [65.94744123832338]
We present Dirichlet Graph Variational Autoencoder (DGVAE) with graph cluster memberships as latent factors.
Motivated by the low-pass characteristics of balanced graph cut, we propose a new GNN variant named Heatts to encode the input graph into cluster memberships.
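A simplified stand-in for the Heatts idea (not the paper's variational model): low-pass filtering with a truncated heat kernel, read out as soft cluster memberships (helper names hypothetical).

```python
import numpy as np

def heat_kernel_features(A, X, t=1.0, K=5):
    """Low-pass filter X with a truncated Taylor series of exp(-t L)."""
    d = A.sum(axis=1)
    d[d == 0] = 1.0
    s = 1.0 / np.sqrt(d)
    L = np.eye(len(A)) - s[:, None] * A * s[None, :]   # normalized Laplacian
    out, term = X.copy(), X.copy()
    for k in range(1, K + 1):
        term = (-t / k) * (L @ term)                   # next Taylor term of exp(-tL)
        out += term
    return out

def soft_memberships(H, n_clusters, rng):
    """Read filtered features out as soft cluster assignments (rows sum to 1)."""
    logits = H @ rng.normal(size=(H.shape[1], n_clusters))
    e = np.exp(logits - logits.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

rng = np.random.default_rng(0)
A = (rng.uniform(size=(30, 30)) < 0.2).astype(float)
A = np.triu(A, 1)
A = A + A.T
X = rng.normal(size=(30, 4))
M = soft_memberships(heat_kernel_features(A, X), 3, rng)
```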
arXiv Detail & Related papers (2020-10-09T07:35:26Z) - Graphon Pooling in Graph Neural Networks [169.09536309161314]
Graph neural networks (GNNs) have been used effectively in different applications involving the processing of signals on irregular structures modeled by graphs.
We propose a new strategy for pooling and sampling on GNNs using graphons, which preserves the spectral properties of the graph.
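A simplified stand-in for graphon pooling (not the paper's construction): block-averaging the adjacency over equal groups of sorted latent positions gives a step-graphon coarsening whose scaled leading eigenvalues track the original's.

```python
import numpy as np

def graphon_pool(A, m):
    """Coarsen an n-node graph to m nodes by block-averaging its adjacency.

    Viewing A as samples of a graphon on sorted latent positions, averaging
    over m equal groups gives a step-graphon approximation; its per-node
    scaled spectrum tracks the original's leading eigenvalues, a simplified
    stand-in for the spectral preservation the paper's strategy targets.
    """
    groups = np.array_split(np.arange(len(A)), m)
    P = np.zeros((m, len(A)))
    for g, idx in enumerate(groups):
        P[g, idx] = 1.0 / len(idx)
    return P @ A @ P.T

rng = np.random.default_rng(0)
u = np.sort(rng.uniform(size=200))                     # sorted latent positions
prob = 0.9 * np.exp(-4.0 * np.abs(u[:, None] - u[None, :]))
A = (rng.uniform(size=(200, 200)) < prob).astype(float)
A = np.triu(A, 1)
A = A + A.T
coarse = graphon_pool(A, 20)
# Leading eigenvalues, scaled per node, are comparable before and after pooling.
print(np.linalg.eigvalsh(A)[-3:] / 200)
print(np.linalg.eigvalsh(coarse)[-3:] / 20)
```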
arXiv Detail & Related papers (2020-03-03T21:04:20Z)