General Graph Random Features
- URL: http://arxiv.org/abs/2310.04859v3
- Date: Fri, 24 May 2024 10:57:51 GMT
- Title: General Graph Random Features
- Authors: Isaac Reid, Krzysztof Choromanski, Eli Berger, Adrian Weller
- Abstract summary: We propose a novel random walk-based algorithm for unbiased estimation of arbitrary functions of a weighted adjacency matrix.
Our algorithm enjoys subquadratic time complexity with respect to the number of nodes, overcoming the notoriously prohibitive cubic scaling of exact graph kernel evaluation.
- Score: 42.75616308187867
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: We propose a novel random walk-based algorithm for unbiased estimation of arbitrary functions of a weighted adjacency matrix, coined universal graph random features (u-GRFs). This includes many of the most popular examples of kernels defined on the nodes of a graph. Our algorithm enjoys subquadratic time complexity with respect to the number of nodes, overcoming the notoriously prohibitive cubic scaling of exact graph kernel evaluation. It can also be trivially distributed across machines, permitting learning on much larger networks. At the heart of the algorithm is a modulation function which upweights or downweights the contribution from different random walks depending on their lengths. We show that by parameterising it with a neural network we can obtain u-GRFs that give higher-quality kernel estimates or perform efficient, scalable kernel learning. We provide robust theoretical analysis and support our findings with experiments including pointwise estimation of fixed graph kernels, solving non-homogeneous graph ordinary differential equations, node clustering and kernel regression on triangular meshes.
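To make the walk-based estimator concrete, here is a minimal Python sketch, assuming geometric halting with probability p_halt and uniform neighbour sampling; the names (u_grf_estimate, modulation, p_halt) are illustrative, not the authors' API. Each walk carries an importance weight ("load") that divides out the walk's sampling probability, which is what makes the estimate unbiased.
```python
import numpy as np

def u_grf_estimate(W, modulation, n_walks=100, p_halt=0.5, seed=0):
    """Sketch of a random-walk estimator in the spirit of u-GRFs.

    Returns phi with E[phi] = sum_k modulation(k) * W^k, i.e. an unbiased
    estimate of f(W) when modulation(k) supplies f's Taylor coefficients.
    """
    rng = np.random.default_rng(seed)
    n = W.shape[0]
    phi = np.zeros((n, n))
    for i in range(n):
        for _ in range(n_walks):
            v, load, k = i, 1.0, 0
            phi[i, v] += modulation(k) * load / n_walks
            while rng.random() > p_halt:          # continue w.p. 1 - p_halt
                nbrs = np.flatnonzero(W[v])
                if nbrs.size == 0:
                    break
                u = rng.choice(nbrs)              # uniform neighbour step
                # divide out step probability (1 - p_halt) / deg(v)
                load *= W[v, u] * nbrs.size / (1.0 - p_halt)
                v, k = u, k + 1
                phi[i, v] += modulation(k) * load / n_walks
    return phi
```
For instance, modulation(k) = beta**k / math.factorial(k) targets the matrix exponential exp(beta * W); the paper's learned variant replaces such a closed form with a small neural network over walk lengths.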
Related papers
- Optimal Time Complexity Algorithms for Computing General Random Walk Graph Kernels on Sparse Graphs [14.049529046098607]
We present the first linear-time randomized algorithms for unbiased approximation of general random walk kernels (RWKs) on sparse graphs.
Our method is up to $\mathbf{27}\times$ faster than its counterparts for efficient computation on large graphs.
arXiv Detail & Related papers (2024-10-14T10:48:46Z)
- Heating Up Quasi-Monte Carlo Graph Random Features: A Diffusion Kernel Perspective [0.0]
We build upon a recently introduced class of quasi-Monte Carlo graph random features (q-GRFs).
We find that the diffusion kernel performs most similarly to the 2-regularized Laplacian kernel (both sketched below).
We explore graph types that benefit from the previously established antithetic termination procedure.
arXiv Detail & Related papers (2024-10-10T21:51:31Z)
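For reference, the two kernels compared above admit simple (but cubic-cost) closed forms. A brief sketch, assuming L = D - W is the graph Laplacian; beta and sigma are illustrative hyperparameters, and (I + sigma^2 L)^{-2} is assumed here as one common reading of "2-regularized Laplacian":
```python
import numpy as np
from scipy.linalg import expm

def diffusion_kernel(L, beta=1.0):
    """Heat/diffusion kernel exp(-beta * L) for graph Laplacian L = D - W."""
    return expm(-beta * L)

def regularized_laplacian_kernel(L, sigma=1.0, p=2):
    """(I + sigma^2 L)^{-p}; p = 2 assumed for the '2-regularized' case."""
    n = L.shape[0]
    return np.linalg.matrix_power(np.linalg.inv(np.eye(n) + sigma**2 * L), p)
```
Both require O(n^3) work when computed exactly, which is precisely the cost that GRF-style estimators avoid.
- Analysis and Approximate Inference of Large Random Kronecker Graphs [4.417282202068703]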
We show that the adjacency matrix of a large random Kronecker graph can be decomposed.
We propose a "denoise-and-solve" approach to infer the key graph parameters.
arXiv Detail & Related papers (2023-06-14T13:09:38Z)
- NodeFormer: A Scalable Graph Structure Learning Transformer for Node Classification [70.51126383984555]
We introduce a novel all-pair message passing scheme for efficiently propagating node signals between arbitrary nodes.
The efficient computation is enabled by a kernelized Gumbel-Softmax operator (a basic sketch follows this entry).
Experiments demonstrate the promising efficacy of the method in various tasks including node classification on graphs.
arXiv Detail & Related papers (2023-06-14T09:21:15Z)
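The Gumbel-Softmax operator referenced above is a differentiable relaxation of categorical sampling. A standalone sketch of the basic trick, assuming temperature tau; NodeFormer's kernelized, all-pair variant is more involved and is not reproduced here:
```python
import numpy as np

def gumbel_softmax(logits, tau=1.0, rng=None):
    """Differentiable relaxation of sampling from softmax(logits).

    Adding Gumbel(0, 1) noise to logits and taking argmax reproduces exact
    categorical sampling; replacing argmax with a temperature-tau softmax
    gives a differentiable approximation that sharpens as tau -> 0.
    """
    rng = rng or np.random.default_rng()
    g = -np.log(-np.log(rng.uniform(size=logits.shape)))  # Gumbel(0, 1) noise
    y = (logits + g) / tau
    y -= y.max(axis=-1, keepdims=True)                    # numerical stability
    e = np.exp(y)
    return e / e.sum(axis=-1, keepdims=True)
```
- Taming graph kernels with random features [17.482280753348288]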
We introduce the mechanism of graph random features (GRFs).
GRFs can be used to construct unbiased randomized estimators of several important kernels defined on graphs' nodes.
arXiv Detail & Related papers (2023-04-29T03:04:49Z)
- Graph Neural Network Bandits [89.31889875864599]
We consider the bandit optimization problem with the reward function defined over graph-structured data.
Key challenges in this setting are scaling to large domains, and to graphs with many nodes.
We show that graph neural networks (GNNs) can be used to estimate the reward function.
arXiv Detail & Related papers (2022-07-13T18:12:36Z)
- Random Features for the Neural Tangent Kernel [57.132634274795066]
We propose an efficient feature map construction for the Neural Tangent Kernel (NTK) of a fully-connected ReLU network.
We show that the dimension of the resulting features is much smaller than that of other baseline feature map constructions, with comparable error bounds in both theory and practice (see the sketch below).
arXiv Detail & Related papers (2021-04-03T09:08:12Z)
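One standard way to build such a feature map for a one-hidden-layer ReLU network uses the arc-cosine kernel decomposition of the NTK; the sketch below illustrates that generic construction (the cited paper's map is more refined, and all names here are illustrative):
```python
import numpy as np

def ntk_random_features(X, m=512, rng=None):
    """Random features phi with phi(x) . phi(x') ~= NTK(x, x') for a
    one-hidden-layer ReLU network:
      NTK(x, x') = E_w[relu(w.x) relu(w.x')] + (x.x') E_w[1{w.x>0} 1{w.x'>0}]
    """
    rng = rng or np.random.default_rng(0)
    n, d = X.shape
    W = rng.standard_normal((d, m))
    pre = X @ W                                   # (n, m) pre-activations
    phi1 = np.maximum(pre, 0.0) / np.sqrt(m)      # second-layer (order-1) part
    mask = (pre > 0).astype(X.dtype) / np.sqrt(m)
    # first-layer (order-0) part: input outer product with active-unit mask
    phi0 = (X[:, :, None] * mask[:, None, :]).reshape(n, -1)
    return np.concatenate([phi1, phi0], axis=1)   # (n, m + d*m) features
```
The m + d*m feature dimension of this naive map grows quickly with the input dimension; the cited paper's contribution is a construction with much smaller feature dimension for comparable error.
- Multipole Graph Neural Operator for Parametric Partial Differential Equations [57.90284928158383]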
One of the main challenges in using deep learning-based methods for simulating physical systems is formulating physics-based data in a structure suited to neural networks.
We propose a novel multi-level graph neural network framework that captures interaction at all ranges with only linear complexity.
Experiments confirm our multi-graph network learns discretization-invariant solution operators to PDEs and can be evaluated in linear time.
arXiv Detail & Related papers (2020-06-16T21:56:22Z)
- Block-Approximated Exponential Random Graphs [77.4792558024487]
An important challenge in the field of exponential random graphs (ERGs) is the fitting of non-trivial ERGs on large graphs.
We propose an approximation framework for such non-trivial ERGs that results in dyadic-independence (i.e., edge-independent) distributions.
Our methods are scalable to sparse graphs consisting of millions of nodes.
arXiv Detail & Related papers (2020-02-14T11:42:16Z)