Graph Filtration Kernels
- URL: http://arxiv.org/abs/2110.11862v1
- Date: Fri, 22 Oct 2021 15:51:10 GMT
- Title: Graph Filtration Kernels
- Authors: Till Hendrik Schulz, Pascal Welke, Stefan Wrobel
- Abstract summary: We propose a family of graph kernels that incorporate existence intervals of features.
We show that using Weisfeiler-Lehman labels over certain filtrations strictly increases the expressive power over the ordinary Weisfeiler-Lehman procedure.
- Score: 3.325932244741268
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The majority of popular graph kernels are based on the concept of
Haussler's $\mathcal{R}$-convolution kernel and define graph similarities in
terms of mutual substructures. In this work, we enrich these similarity measures by
considering graph filtrations: using meaningful orders on the set of edges,
which allow us to construct a sequence of nested graphs, we can consider a graph
at multiple granularities. For one thing, this provides access to features on
different levels of resolution. Furthermore, rather than simply comparing
feature frequencies across graphs, it allows comparing features in terms of
when and for how long they exist in the sequences. In this work, we propose a
family of graph kernels that incorporate these existence intervals of features.
While our approach can be applied to arbitrary graph features, we particularly
highlight Weisfeiler-Lehman vertex labels, leading to efficient kernels. We
show that using Weisfeiler-Lehman labels over certain filtrations strictly
increases the expressive power over the ordinary Weisfeiler-Lehman procedure in
terms of deciding graph isomorphism. In fact, this result directly yields more
powerful graph kernels based on such features and has implications for graph
neural networks due to their close relationship to the Weisfeiler-Lehman
method. We empirically validate the expressive power of our graph kernels and
show significant improvements over state-of-the-art graph kernels in terms of
predictive performance on various real-world benchmark datasets.
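The filtration idea in the abstract can be sketched in a few lines: sort the edges by a meaningful order (here, edge weights), build the nested sequence of prefix graphs, compute Weisfeiler-Lehman vertex labels on each graph in the sequence, and compare two graphs via the label histograms at each filtration step. This is a minimal illustrative sketch, not the authors' implementation; the graph encoding, the single WL iteration, and the histogram-intersection base kernel are all assumptions for illustration.

```python
# Illustrative sketch of a graph filtration kernel (NOT the paper's exact
# method): WL labels computed on each prefix graph of an edge-weight order,
# compared step-by-step via histogram intersection.
from collections import Counter

def wl_labels(edges, nodes, iterations=1):
    """One round (by default) of Weisfeiler-Lehman label refinement
    on an undirected graph given as a list of (u, v) edges."""
    adj = {v: [] for v in nodes}
    for a, b in edges:
        adj[a].append(b)
        adj[b].append(a)
    labels = {v: 0 for v in nodes}  # uniform initial labels
    for _ in range(iterations):
        # New label = hash of own label plus sorted multiset of neighbor labels.
        labels = {v: hash((labels[v], tuple(sorted(labels[u] for u in adj[v]))))
                  for v in nodes}
    return labels

def filtration_features(weighted_edges, nodes):
    """WL label histogram for every prefix of the edge order (i.e. for
    every graph in the filtration induced by sorting edges by weight)."""
    order = sorted(weighted_edges, key=lambda e: e[2])
    features = []
    for k in range(1, len(order) + 1):
        prefix = [(a, b) for a, b, _ in order[:k]]
        features.append(Counter(wl_labels(prefix, nodes).values()))
    return features

def interval_kernel(f1, f2):
    """Sum of histogram intersections across filtration steps, so features
    contribute according to when (and for how long) they exist in the
    sequence. Assumes both filtrations have the same number of steps."""
    return sum(sum((c1 & c2).values()) for c1, c2 in zip(f1, f2))
```

A feature that appears early and persists through many prefix graphs contributes at many filtration steps, which is what distinguishes this from a single histogram comparison on the full graphs.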
Related papers
- Graph Mixup with Soft Alignments [49.61520432554505]
We study graph data augmentation by mixup, which has been used successfully on images.
We propose S-Mixup, a simple yet effective mixup method for graph classification by soft alignments.
arXiv Detail & Related papers (2023-06-11T22:04:28Z)
- Spectral Augmentations for Graph Contrastive Learning [50.149996923976836]
Contrastive learning has emerged as a premier method for learning representations with or without supervision.
Recent studies have shown its utility in graph representation learning for pre-training.
We propose a set of well-motivated graph transformation operations to provide a bank of candidates when constructing augmentations for a graph contrastive objective.
arXiv Detail & Related papers (2023-02-06T16:26:29Z)
- Transductive Kernels for Gaussian Processes on Graphs [7.542220697870243]
We present a novel kernel for graphs with node feature data for semi-supervised learning.
The kernel is derived from a regularization framework by treating the graph and feature data as two spaces.
We show how numerous kernel-based models on graphs are instances of our design.
arXiv Detail & Related papers (2022-11-28T14:00:50Z)
- The PWLR Graph Representation: A Persistent Weisfeiler-Lehman scheme with Random Walks for Graph Classification [0.6999740786886536]
We propose the Persistent Weisfeiler-Lehman Random walk scheme (abbreviated as PWLR) for graph representations.
We generalize many variants of Weisfeiler-Lehman procedures, which are primarily used to embed graphs with discrete node labels.
arXiv Detail & Related papers (2022-08-29T08:50:37Z)
- Demystifying Graph Convolution with a Simple Concatenation [6.542119695695405]
We quantify the information overlap between graph topology, node features, and labels.
We show that graph concatenation is a simple but more flexible alternative to graph convolution.
arXiv Detail & Related papers (2022-07-18T16:39:33Z)
- Graph Kernel Neural Networks [53.91024360329517]
We propose to use graph kernels, i.e. kernel functions that compute an inner product on graphs, to extend the standard convolution operator to the graph domain.
This allows us to define an entirely structural model that does not require computing the embedding of the input graph.
Our architecture allows plugging in any type of graph kernel and has the added benefit of providing some interpretability.
arXiv Detail & Related papers (2021-12-14T14:48:08Z)
- Joint Graph Learning and Matching for Semantic Feature Correspondence [69.71998282148762]
We propose a joint graph learning and matching network, named GLAM, to explore reliable graph structures for boosting graph matching.
The proposed method is evaluated on three popular visual matching benchmarks (Pascal VOC, Willow Object and SPair-71k).
It outperforms previous state-of-the-art graph matching methods by significant margins on all benchmarks.
arXiv Detail & Related papers (2021-09-01T08:24:02Z)
- A Generalized Weisfeiler-Lehman Graph Kernel [2.959278299317192]
Weisfeiler-Lehman graph kernels are among the most prevalent graph kernels due to their remarkable time complexity and predictive performance.
We propose a generalization of Weisfeiler-Lehman graph kernels that takes into account the similarity between trees rather than their equality.
arXiv Detail & Related papers (2021-01-20T13:03:51Z)
- Dirichlet Graph Variational Autoencoder [65.94744123832338]
We present Dirichlet Graph Variational Autoencoder (DGVAE) with graph cluster memberships as latent factors.
Motivated by the low pass characteristics in balanced graph cut, we propose a new variant of GNN named Heatts to encode the input graph into cluster memberships.
arXiv Detail & Related papers (2020-10-09T07:35:26Z)
- Graph Pooling with Node Proximity for Hierarchical Representation Learning [80.62181998314547]
We propose a novel graph pooling strategy that leverages node proximity to improve the hierarchical representation learning of graph data with their multi-hop topology.
Results show that the proposed graph pooling strategy is able to achieve state-of-the-art performance on a collection of public graph classification benchmark datasets.
arXiv Detail & Related papers (2020-06-19T13:09:44Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences of its use.