Fast Graph Kernel with Optical Random Features
- URL: http://arxiv.org/abs/2010.08270v1
- Date: Fri, 16 Oct 2020 09:43:47 GMT
- Title: Fast Graph Kernel with Optical Random Features
- Authors: Hashem Ghanem and Nicolas Keriven and Nicolas Tremblay
- Abstract summary: The graphlet kernel suffers from a high computation cost due to the isomorphism test it includes.
We propose to leverage kernel random features within the graphlet framework, and establish a theoretical link with a mean kernel metric.
- Score: 17.403133838762447
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The graphlet kernel is a classical method in graph classification. It however
suffers from a high computation cost due to the isomorphism test it includes.
As a generic proxy, and in general at the cost of losing some information, this
test can be efficiently replaced by a user-defined mapping that computes
various graph characteristics. In this paper, we propose to leverage kernel
random features within the graphlet framework, and establish a theoretical link
with a mean kernel metric. As this method can still be prohibitively costly for
usual random features, we then incorporate optical random features that can be
computed in constant time. Experiments show that the resulting algorithm is
orders of magnitude faster than the graphlet kernel for the same, or better,
accuracy.
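A minimal sketch of the pipeline the abstract describes: sample small subgraphs (graphlets), replace the isomorphism test with a random feature map, and average the features into a mean-kernel embedding of the graph. Gaussian random features stand in for the optical ones here, and the function names, graphlet size and feature dimension are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_graphlet(adj, k):
    """Pick k nodes uniformly at random and return the induced subgraph's adjacency."""
    nodes = rng.choice(adj.shape[0], size=k, replace=False)
    return adj[np.ix_(nodes, nodes)]

def random_features(graphlet, W):
    """Random feature map on the flattened graphlet adjacency.
    The optical processing unit in the paper computes |Wx|^2 in constant time;
    a dense Gaussian W only emulates that functional form here."""
    x = graphlet.flatten().astype(float)
    return np.abs(W @ x) ** 2

def graph_embedding(adj, W, k=5, n_samples=200):
    """Mean-kernel embedding: average the random features of sampled graphlets."""
    feats = [random_features(sample_graphlet(adj, k), W) for _ in range(n_samples)]
    return np.mean(feats, axis=0)

# Toy usage: one shared projection W (so embeddings of different graphs are comparable)
# applied to a random Erdos-Renyi graph; the embedding can feed any linear classifier.
k, m, n = 5, 64, 30
W = rng.normal(size=(m, k * k)) / np.sqrt(k * k)
upper = np.triu((rng.random((n, n)) < 0.2).astype(float), 1)
adj = upper + upper.T
print(graph_embedding(adj, W, k=k).shape)  # (64,)
```

The same projection W must be shared across all graphs so that their embeddings live in a common feature space before classification.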
Related papers
- Optimal Time Complexity Algorithms for Computing General Random Walk Graph Kernels on Sparse Graphs [14.049529046098607]
We present the first linear time complexity randomized algorithms for unbiased approximation of general random walk kernels (RWKs) for sparse graphs.
Our method is up to $\mathbf{27}\times$ faster than its counterparts for efficient computation on large graphs.
arXiv Detail & Related papers (2024-10-14T10:48:46Z)
- Heating Up Quasi-Monte Carlo Graph Random Features: A Diffusion Kernel Perspective [0.0]
We build upon a recently introduced class of quasi-Monte Carlo graph random features (q-GRFs).
We find that the Diffusion kernel performs most similarly to the 2-regularized Laplacian.
We explore graph types that benefit from the previously established antithetic termination procedure.
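For reference, the diffusion (heat) kernel compared above is exp(-beta * L) for a graph Laplacian L; a small exact computation on a toy graph, with beta and the graph chosen purely for illustration:

```python
import numpy as np

def diffusion_kernel(adj, beta=0.5):
    """Exact diffusion kernel K = expm(-beta * L) with L = D - A, via eigendecomposition."""
    lap = np.diag(adj.sum(axis=1)) - adj
    evals, evecs = np.linalg.eigh(lap)
    return evecs @ np.diag(np.exp(-beta * evals)) @ evecs.T

# Path graph on 4 nodes; entries decay with graph distance, as expected for heat diffusion.
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
print(np.round(diffusion_kernel(A), 3))
```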
arXiv Detail & Related papers (2024-10-10T21:51:31Z)
- Graph Generation via Spectral Diffusion [51.60814773299899]
We present GRASP, a novel graph generative model based on 1) the spectral decomposition of the graph Laplacian matrix and 2) a diffusion process.
Specifically, we propose to use a denoising model to sample eigenvectors and eigenvalues from which we can reconstruct the graph Laplacian and adjacency matrix.
Our permutation invariant model can also handle node features by concatenating them to the eigenvectors of each node.
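A sketch of the reconstruction step mentioned above, assuming the exact eigenpairs of a toy graph stand in for the ones sampled by the paper's denoising model:

```python
import numpy as np

# Toy graph: exact spectrum stands in for the sampled eigenvectors/eigenvalues.
A_true = np.array([[0, 1, 1, 0],
                   [1, 0, 1, 0],
                   [1, 1, 0, 1],
                   [0, 0, 1, 0]], dtype=float)
L_true = np.diag(A_true.sum(axis=1)) - A_true

w, V = np.linalg.eigh(L_true)             # (sampled) eigenvalues and eigenvectors
L_rec = V @ np.diag(w) @ V.T              # Laplacian rebuilt from the spectrum
A_rec = np.diag(np.diag(L_rec)) - L_rec   # degrees sit on the Laplacian diagonal
A_bin = (A_rec > 0.5).astype(int)         # threshold to a binary adjacency matrix
print(np.array_equal(A_bin, A_true.astype(int)))  # True
```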
arXiv Detail & Related papers (2024-02-29T09:26:46Z)
- General Graph Random Features [42.75616308187867]
We propose a novel random walk-based algorithm for unbiased estimation of arbitrary functions of a weighted adjacency matrix.
Our algorithm enjoys subquadratic time complexity with respect to the number of nodes, overcoming the notoriously prohibitive cubic scaling of exact graph kernel evaluation.
arXiv Detail & Related papers (2023-10-07T15:47:31Z)
- Tanimoto Random Features for Scalable Molecular Machine Learning [2.5112706394511766]
We propose two kinds of novel random features to allow the Tanimoto kernel to scale to large datasets.
We show that these random features are effective at approximating the Tanimoto coefficient of real-world datasets.
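For context, the quantity these random features approximate is the Tanimoto (Jaccard) coefficient T(a, b) = <a, b> / (||a||^2 + ||b||^2 - <a, b>); an exact toy computation:

```python
import numpy as np

def tanimoto(a, b):
    """Tanimoto (Jaccard) similarity between fingerprint vectors."""
    dot = float(a @ b)
    return dot / (float(a @ a) + float(b @ b) - dot)

a = np.array([1, 0, 1, 1, 0, 1])
b = np.array([1, 1, 1, 0, 0, 1])
print(tanimoto(a, b))  # 3 shared bits / (4 + 4 - 3) = 0.6
```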
arXiv Detail & Related papers (2023-06-26T16:11:11Z)
- Graph Kernel Neural Networks [53.91024360329517]
We propose to use graph kernels, i.e. kernel functions that compute an inner product on graphs, to extend the standard convolution operator to the graph domain.
This allows us to define an entirely structural model that does not require computing the embedding of the input graph.
Our architecture allows plugging in any type of graph kernel and has the added benefit of providing some interpretability.
arXiv Detail & Related papers (2021-12-14T14:48:08Z)
- Graph Filtration Kernels [3.325932244741268]
We propose a family of graph kernels that incorporate existence intervals of features.
We show that using Weisfeiler-Lehman labels over certain filtrations strictly increases the expressive power over the ordinary Weisfeiler-Lehman procedure.
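As background for the claim above, a single Weisfeiler-Lehman relabeling iteration (the building block the filtration kernels refine); the toy graph and initial labels are illustrative:

```python
from collections import defaultdict

def wl_iteration(adjacency, labels):
    """One Weisfeiler-Lehman step: hash each node's label together with the sorted
    multiset of its neighbours' labels, then compress signatures to fresh integers."""
    signatures = {
        v: (labels[v], tuple(sorted(labels[u] for u in adjacency[v])))
        for v in adjacency
    }
    compress = defaultdict(lambda: len(compress))  # new integer per distinct signature
    return {v: compress[sig] for v, sig in signatures.items()}

# Toy graph: a triangle 0-1-2 with a pendant node 3, all nodes initially labelled 0.
adjacency = {0: [1, 2], 1: [0, 2], 2: [0, 1, 3], 3: [2]}
labels = {v: 0 for v in adjacency}
print(wl_iteration(adjacency, labels))  # {0: 0, 1: 0, 2: 1, 3: 2}
```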
arXiv Detail & Related papers (2021-10-22T15:51:10Z)
- FGOT: Graph Distances based on Filters and Optimal Transport [62.779521543654134]
Graph comparison deals with identifying similarities and dissimilarities between graphs.
A major obstacle is the unknown alignment of graphs, as well as the lack of accurate and inexpensive comparison metrics.
In this work we introduce the filter graph distance approximation.
arXiv Detail & Related papers (2021-09-09T17:43:07Z)
- Online Dense Subgraph Discovery via Blurred-Graph Feedback [87.9850024070244]
We introduce a novel learning problem for dense subgraph discovery.
We first propose an edge-time algorithm that obtains a nearly-optimal solution with high probability.
We then design a more scalable algorithm with a theoretical guarantee.
arXiv Detail & Related papers (2020-06-24T11:37:33Z)
- Fast Graph Attention Networks Using Effective Resistance Based Graph Sparsification [70.50751397870972]
FastGAT is a method to make attention-based GNNs lightweight by using spectral sparsification to generate an optimal pruning of the input graph.
We experimentally evaluate FastGAT on several large real-world graph datasets for node classification tasks.
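Background on the sparsification scores involved: the effective resistance of an edge is R(u, v) = (e_u - e_v)^T L^+ (e_u - e_v), and edges with larger weighted resistance are preferentially kept. A naive toy computation follows; FastGAT itself relies on fast approximations rather than an explicit pseudoinverse.

```python
import numpy as np

def edge_effective_resistances(adj):
    """R(u, v) = (e_u - e_v)^T L^+ (e_u - e_v) for every edge, via the Laplacian pseudoinverse."""
    n = adj.shape[0]
    lap_pinv = np.linalg.pinv(np.diag(adj.sum(axis=1)) - adj)
    scores = {}
    for u in range(n):
        for v in range(u + 1, n):
            if adj[u, v] > 0:
                d = np.zeros(n)
                d[u], d[v] = 1.0, -1.0
                scores[(u, v)] = float(d @ lap_pinv @ d)
    return scores

# Triangle 0-1-2 plus pendant node 3: triangle edges get resistance 2/3, the bridge (2, 3) gets 1.
A = np.array([[0, 1, 1, 0],
              [1, 0, 1, 0],
              [1, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
print(edge_effective_resistances(A))
```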
arXiv Detail & Related papers (2020-06-15T22:07:54Z)
- Block-Approximated Exponential Random Graphs [77.4792558024487]
An important challenge in the field of exponential random graphs (ERGs) is the fitting of non-trivial ERGs on large graphs.
We propose an approximate framework for such non-trivial ERGs that results in dyadic independence (i.e., edge-independent) distributions.
Our methods are scalable to sparse graphs consisting of millions of nodes.
arXiv Detail & Related papers (2020-02-14T11:42:16Z)
This list is automatically generated from the titles and abstracts of the papers on this site.