Heating Up Quasi-Monte Carlo Graph Random Features: A Diffusion Kernel Perspective
- URL: http://arxiv.org/abs/2410.08389v1
- Date: Thu, 10 Oct 2024 21:51:31 GMT
- Title: Heating Up Quasi-Monte Carlo Graph Random Features: A Diffusion Kernel Perspective
- Authors: Brooke Feinberg, Aiwen Li
- Abstract summary: We build upon a recently introduced class of quasi-graph random features (q-GRFs).
We find that the Diffusion kernel performs most similarly to the 2-regularized Laplacian.
We explore graph types that benefit from the previously established antithetic termination procedure.
- Score: 0.0
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: We build upon a recently introduced class of quasi-graph random features (q-GRFs), which have demonstrated the ability to yield lower variance estimators of the 2-regularized Laplacian kernel (Choromanski 2023). Our research investigates whether similar results can be achieved with alternative kernel functions, specifically the Diffusion (or Heat), Matérn, and Inverse Cosine kernels. We find that the Diffusion kernel performs most similarly to the 2-regularized Laplacian, and we further explore graph types that benefit from the previously established antithetic termination procedure. Specifically, we explore Erdős-Rényi and Barabási-Albert random graph models, Binary Trees, and Ladder graphs, with the goal of identifying combinations of specific kernel and graph type that benefit from antithetic termination. We assert that q-GRFs achieve lower variance estimators of the Diffusion (or Heat) kernel on Ladder graphs. However, the number of rungs on the Ladder graphs impacts the algorithm's performance; further theoretical results supporting our experimentation are forthcoming. This work builds upon some of the earliest Quasi-Monte Carlo methods for kernels defined on combinatorial objects, paving the way for kernel-based learning algorithms and future real-world applications in various domains.
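To make the comparison concrete, below is a minimal sketch (not the authors' code) that computes the two kernels exactly on a Ladder graph and measures how close they are. The parameterizations K_diff = exp(-t L) and K_lap = (I + sigma^2 L)^{-2}, as well as the parameter values and graph size, are assumptions based on standard definitions; the paper may use different scalings.
```python
# A minimal sketch (not the authors' code) comparing the exact Diffusion (Heat)
# kernel and the 2-regularized Laplacian kernel on a Ladder graph.
# Assumptions: K_diff = exp(-t*L) and K_lap = (I + sigma^2 * L)^(-2);
# t, sigma, and the number of rungs are illustrative choices.
import networkx as nx
import numpy as np
from scipy.linalg import expm


def diffusion_kernel(L, t=0.5):
    """Heat kernel exp(-t L), computed exactly via the matrix exponential."""
    return expm(-t * L)


def regularized_laplacian_kernel(L, sigma=0.5, d=2):
    """d-regularized Laplacian kernel (I + sigma^2 L)^{-d}; d = 2 is the 2-regularized case."""
    n = L.shape[0]
    return np.linalg.matrix_power(np.linalg.inv(np.eye(n) + sigma**2 * L), d)


rungs = 8                                    # Ladder graph with 2 * rungs nodes
G = nx.ladder_graph(rungs)
L = nx.laplacian_matrix(G).toarray().astype(float)

K_diff = diffusion_kernel(L)
K_lap2 = regularized_laplacian_kernel(L)

# Crude closeness check: normalized Frobenius distance between the two kernels.
rel_dist = np.linalg.norm(K_diff - K_lap2) / np.linalg.norm(K_lap2)
print(f"relative Frobenius distance: {rel_dist:.3f}")
```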
Related papers
- Graph Generation via Spectral Diffusion [51.60814773299899]
We present GRASP, a novel graph generative model based on 1) the spectral decomposition of the graph Laplacian matrix and 2) a diffusion process.
Specifically, we propose to use a denoising model to sample eigenvectors and eigenvalues from which we can reconstruct the graph Laplacian and adjacency matrix.
Our permutation invariant model can also handle node features by concatenating them to the eigenvectors of each node.
arXiv Detail & Related papers (2024-02-29T09:26:46Z) - Learning Regularized Graphon Mean-Field Games with Unknown Graphons [155.38727464526923]
We design reinforcement learning algorithms for Graphon Mean-Field Games (GMFGs).
We aim to learn the Nash Equilibrium (NE) of the regularized GMFGs when the graphons are unknown.
These algorithms are the first specifically designed for learning graphons from sampled agents.
arXiv Detail & Related papers (2023-10-26T16:19:24Z) - General Graph Random Features [42.75616308187867]
We propose a novel random walk-based algorithm for unbiased estimation of arbitrary functions of a weighted adjacency matrix.
Our algorithm enjoys subquadratic time complexity with respect to the number of nodes, overcoming the notoriously prohibitive cubic scaling of exact graph kernel evaluation.
arXiv Detail & Related papers (2023-10-07T15:47:31Z) - Graph Fourier MMD for Signals on Graphs [67.68356461123219]
We propose a novel distance between distributions and signals on graphs.
GFMMD is defined via an optimal witness function that is both smooth on the graph and maximizes difference in expectation.
We showcase it on graph benchmark datasets as well as on single cell RNA-sequencing data analysis.
arXiv Detail & Related papers (2023-06-05T00:01:17Z) - Quasi-Monte Carlo Graph Random Features [38.762964164013226]
We present a novel mechanism to improve the accuracy of graph random features (GRFs).
Our method induces negative correlations between the lengths of the algorithm's random walks by imposing antithetic termination (a toy sketch of this coupling appears after this list).
We derive strong theoretical guarantees on the properties of these quasi-Monte Carlo GRFs.
arXiv Detail & Related papers (2023-05-21T14:12:02Z) - Taming graph kernels with random features [17.482280753348288]
We introduce the mechanism of graph random features (GRFs).
GRFs can be used to construct unbiased randomized estimators of several important kernels defined on graphs' nodes.
arXiv Detail & Related papers (2023-04-29T03:04:49Z) - Graph Signal Sampling for Inductive One-Bit Matrix Completion: a Closed-form Solution [112.3443939502313]
We propose a unified graph signal sampling framework which enjoys the benefits of graph signal analysis and processing.
The key idea is to transform each user's ratings on the items to a function (signal) on the vertices of an item-item graph.
For the online setting, we develop a Bayesian extension, i.e., BGS-IMC which considers continuous random Gaussian noise in the graph Fourier domain.
arXiv Detail & Related papers (2023-02-08T08:17:43Z) - Transductive Kernels for Gaussian Processes on Graphs [7.542220697870243]
We present a novel kernel for graphs with node feature data for semi-supervised learning.
The kernel is derived from a regularization framework by treating the graph and feature data as two spaces.
We show how numerous kernel-based models on graphs are instances of our design.
arXiv Detail & Related papers (2022-11-28T14:00:50Z) - Non-separable Spatio-temporal Graph Kernels via SPDEs [90.62347738138594]
We provide graph kernels for principled spatio-temporal modelling on graphs.
By providing novel tools for modelling on graphs, we outperform pre-existing graph kernels in real-world applications.
arXiv Detail & Related papers (2021-11-16T14:53:19Z) - An End-to-End Graph Convolutional Kernel Support Vector Machine [0.0]
A kernel-based support vector machine (SVM) for graph classification is proposed.
The proposed model is trained in a supervised end-to-end manner.
Experimental results demonstrate that the proposed model outperforms existing deep learning baseline models on a number of datasets.
arXiv Detail & Related papers (2020-02-29T09:57:42Z) - Block-Approximated Exponential Random Graphs [77.4792558024487]
An important challenge in the field of exponential random graphs (ERGs) is the fitting of non-trivial ERGs on large graphs.
We propose an approximative framework to such non-trivial ERGs that result in dyadic independence (i.e., edge independent) distributions.
Our methods are scalable to sparse graphs consisting of millions of nodes.
arXiv Detail & Related papers (2020-02-14T11:42:16Z)
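As referenced in the Quasi-Monte Carlo Graph Random Features entry above, here is a toy sketch of antithetic termination. It couples pairs of random walkers so that their halting decisions are driven by u and 1 - u, which induces negatively correlated walk lengths; the exact coupling, halting probability, and kernel estimator used for q-GRFs are assumptions here, not the published algorithm.
```python
# A toy illustration (not the published q-GRF algorithm) of antithetic
# termination: paired random walkers share each uniform draw u, with the
# partner halting on 1 - u, so their walk lengths are negatively correlated.
# The halting probability and the Ladder graph size are illustrative choices.
import networkx as nx
import numpy as np

rng = np.random.default_rng(0)


def antithetic_walk_lengths(G, start, p_halt=0.3, max_steps=200):
    """Simulate one antithetic pair of simple random walks from `start`;
    return the two walk lengths."""
    nodes, alive, lengths = [start, start], [True, True], [0, 0]
    for _ in range(max_steps):
        u = rng.random()
        halts = (u < p_halt, (1.0 - u) < p_halt)   # antithetic termination draws
        for i in range(2):
            if alive[i]:
                if halts[i]:
                    alive[i] = False
                else:
                    nodes[i] = rng.choice(list(G.neighbors(nodes[i])))
                    lengths[i] += 1
        if not any(alive):
            break
    return lengths


G = nx.ladder_graph(8)
pairs = np.array([antithetic_walk_lengths(G, start=0) for _ in range(2000)])
print("correlation between paired walk lengths:",
      np.corrcoef(pairs[:, 0], pairs[:, 1])[0, 1])   # expected to be negative
```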