Transductive Kernels for Gaussian Processes on Graphs
- URL: http://arxiv.org/abs/2211.15322v1
- Date: Mon, 28 Nov 2022 14:00:50 GMT
- Title: Transductive Kernels for Gaussian Processes on Graphs
- Authors: Yin-Cong Zhi, Felix L. Opolka, Yin Cheng Ng, Pietro Liò, Xiaowen Dong
- Abstract summary: We present a novel kernel for graphs with node feature data for semi-supervised learning.
The kernel is derived from a regularization framework by treating the graph and feature data as two spaces.
We show how numerous kernel-based models on graphs are instances of our design.
- Score: 7.542220697870243
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Kernels on graphs have had limited options for node-level problems. To
address this, we present a novel, generalized kernel for graphs with node
feature data for semi-supervised learning. The kernel is derived from a
regularization framework by treating the graph and feature data as two Hilbert
spaces. We also show how numerous kernel-based models on graphs are instances
of our design. A kernel defined this way has transductive properties, and this
leads to improved ability to learn on fewer training points, as well as better
handling of highly non-Euclidean data. We demonstrate these advantages using
synthetic data where the distribution of the whole graph can inform the pattern
of the labels. Finally, by utilizing a flexible polynomial of the graph
Laplacian within the kernel, the model also performs effectively in
semi-supervised classification on graphs with various levels of homophily.
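As a rough, non-authoritative illustration of the kind of construction the abstract describes (not the paper's actual derivation), the sketch below builds a node-level GP covariance by filtering an RBF kernel on node features with a polynomial of the normalized graph Laplacian. The function names, coefficient choices, and the specific combination K = P(L) K_feat P(L)^T are assumptions made for the example.

```python
# Hypothetical sketch: GP covariance over graph nodes from a polynomial of the
# normalized Laplacian applied to a feature kernel (illustration only).
import numpy as np

def normalized_laplacian(A):
    """Symmetric normalized Laplacian L = I - D^{-1/2} A D^{-1/2} (assumes no isolated nodes)."""
    d_inv_sqrt = 1.0 / np.sqrt(A.sum(axis=1))
    return np.eye(len(A)) - d_inv_sqrt[:, None] * A * d_inv_sqrt[None, :]

def rbf_kernel(X, lengthscale=1.0):
    """RBF kernel on node features X of shape (n_nodes, n_features)."""
    sq_dists = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * sq_dists / lengthscale ** 2)

def polynomial_graph_kernel(A, X, betas=(1.0, 0.5, 0.25), lengthscale=1.0):
    """Illustrative covariance K = P(L) K_feat P(L)^T with P(L) = sum_k beta_k L^k."""
    L = normalized_laplacian(A)
    P = sum(b * np.linalg.matrix_power(L, k) for k, b in enumerate(betas))
    return P @ rbf_kernel(X, lengthscale) @ P.T

# Toy usage: a 4-node path graph with 2-dimensional node features.
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
X = np.random.default_rng(0).normal(size=(4, 2))
K = polynomial_graph_kernel(A, X)   # (n_nodes, n_nodes) covariance over all nodes
print(np.allclose(K, K.T))          # symmetric, as a covariance should be
```

Changing the polynomial coefficients changes how strongly graph structure shapes the covariance, which is the flexibility the abstract attributes to the Laplacian polynomial.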
Related papers
- Graph Generation via Spectral Diffusion [51.60814773299899]
We present GRASP, a novel graph generative model based on 1) the spectral decomposition of the graph Laplacian matrix and 2) a diffusion process.
Specifically, we propose to use a denoising model to sample eigenvectors and eigenvalues from which we can reconstruct the graph Laplacian and adjacency matrix.
Our permutation invariant model can also handle node features by concatenating them to the eigenvectors of each node.
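A minimal sketch of the reconstruction step mentioned above, assuming a combinatorial Laplacian and a hypothetical thresholding rule (this is not GRASP's actual procedure):

```python
# Assumed, simplified reconstruction (not GRASP's code): rebuild a combinatorial
# Laplacian from sampled eigenvectors/eigenvalues, then recover an adjacency matrix.
import numpy as np

def reconstruct_adjacency(U, lam, threshold=0.5):
    L = U @ np.diag(lam) @ U.T                  # Laplacian from its spectrum
    A = np.diag(np.diag(L)) - L                 # since L = D - A, A = D - L
    A = (np.abs(A) > threshold).astype(float)   # hypothetical discretization rule
    np.fill_diagonal(A, 0.0)
    return A

# Sanity check: the spectrum of a triangle graph reproduces its adjacency matrix.
A_true = np.ones((3, 3)) - np.eye(3)
L_true = np.diag(A_true.sum(axis=1)) - A_true
lam, U = np.linalg.eigh(L_true)
print(reconstruct_adjacency(U, lam))
```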
arXiv Detail & Related papers (2024-02-29T09:26:46Z)
- Learning to Approximate Adaptive Kernel Convolution on Graphs [4.434835769977399]
We propose a diffusion learning framework, where the range of feature aggregation is controlled by the scale of a diffusion kernel.
Our model is tested on various standard datasets for node-wise classification and achieves state-of-the-art performance.
It is also validated on real-world brain network data for graph classification to demonstrate its practicality for Alzheimer's disease classification.
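The idea of controlling the range of feature aggregation with the scale of a diffusion kernel can be pictured with an exact heat kernel; the learned approximation proposed in the paper is not reproduced here, and the helper below is an assumed, simplified sketch.

```python
# Illustrative only: heat-kernel feature aggregation exp(-s L) X, where the
# scale s controls how far information diffuses over the graph.
import numpy as np
from scipy.linalg import expm

def diffused_features(A, X, scale=1.0):
    L = np.diag(A.sum(axis=1)) - A   # combinatorial Laplacian
    H = expm(-scale * L)             # heat kernel exp(-s L)
    return H @ X                     # aggregated node features

A = np.array([[0, 1, 1, 0],
              [1, 0, 0, 1],
              [1, 0, 0, 1],
              [0, 1, 1, 0]], dtype=float)
X = np.eye(4)                                   # one-hot node features
print(diffused_features(A, X, scale=0.5))       # small scale: mostly local mixing
print(diffused_features(A, X, scale=5.0))       # large scale: near-global mixing
```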
arXiv Detail & Related papers (2024-01-22T10:57:11Z)
- NodeFormer: A Scalable Graph Structure Learning Transformer for Node Classification [70.51126383984555]
We introduce a novel all-pair message passing scheme for efficiently propagating node signals between arbitrary nodes.
The efficient computation is enabled by a kernelized Gumbel-Softmax operator.
Experiments demonstrate the promising efficacy of the method in various tasks including node classification on graphs.
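A bare-bones sketch of the Gumbel-Softmax relaxation that underlies the all-pair scheme; the kernelization that makes NodeFormer scalable is omitted, and the function and its parameters are assumptions for illustration.

```python
# Assumed, simplified illustration: Gumbel-Softmax relaxation producing a soft,
# differentiable latent adjacency over all node pairs (the kernelized variant
# used by NodeFormer is not reproduced here).
import numpy as np

def gumbel_softmax(logits, tau=0.5, rng=None):
    rng = np.random.default_rng(0) if rng is None else rng
    g = -np.log(-np.log(rng.uniform(size=logits.shape)))   # Gumbel(0, 1) noise
    y = (logits + g) / tau
    y = np.exp(y - y.max(axis=-1, keepdims=True))           # stable softmax
    return y / y.sum(axis=-1, keepdims=True)

# Pairwise attention logits between 5 nodes -> soft edges per node.
logits = np.random.default_rng(1).normal(size=(5, 5))
soft_adj = gumbel_softmax(logits, tau=0.5)
print(soft_adj.sum(axis=-1))   # each row sums to 1
```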
arXiv Detail & Related papers (2023-06-14T09:21:15Z)
- Graph Neural Network Bandits [89.31889875864599]
We consider the bandit optimization problem with the reward function defined over graph-structured data.
Key challenges in this setting are scaling to large domains and to graphs with many nodes.
We show that graph neural networks (GNNs) can be used to estimate the reward function.
arXiv Detail & Related papers (2022-07-13T18:12:36Z)
- Graph Kernel Neural Networks [53.91024360329517]
We propose to use graph kernels, i.e. kernel functions that compute an inner product on graphs, to extend the standard convolution operator to the graph domain.
This allows us to define an entirely structural model that does not require computing the embedding of the input graph.
Our architecture allows plugging in any type of graph kernel and has the added benefit of providing some interpretability.
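One way to picture a graph-kernel convolution, under deliberately simplified assumptions (a trivial edge-count kernel and fixed filter graphs instead of learned ones; none of this is the paper's architecture), is to score each node's neighborhood subgraph against a bank of small filter graphs:

```python
# Loose illustration under assumed choices: a "graph kernel convolution" that scores
# each node's 1-hop neighborhood subgraph against small filter graphs, yielding
# node features without any explicit graph embedding.
import numpy as np

def edge_count_kernel(A1, A2):
    """A deliberately trivial graph kernel: product of edge counts."""
    return (A1.sum() / 2) * (A2.sum() / 2)

def graph_kernel_convolution(A, filters):
    feats = np.zeros((len(A), len(filters)))
    for v in range(len(A)):
        nbhd = np.flatnonzero(A[v]).tolist() + [v]   # closed 1-hop neighborhood
        sub = A[np.ix_(nbhd, nbhd)]                  # induced subgraph
        feats[v] = [edge_count_kernel(sub, F) for F in filters]
    return feats

A = np.array([[0, 1, 1],
              [1, 0, 0],
              [1, 0, 0]], dtype=float)               # a small star graph
filters = [np.array([[0, 1], [1, 0]], dtype=float),  # single edge
           np.ones((3, 3)) - np.eye(3)]              # triangle
print(graph_kernel_convolution(A, filters))
```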
arXiv Detail & Related papers (2021-12-14T14:48:08Z)
- Graph Filtration Kernels [3.325932244741268]
We propose a family of graph kernels that incorporate existence intervals of features.
We show that using Weisfeiler-Lehman labels over certain filtrations strictly increases the expressive power over the ordinary Weisfeiler-Lehman procedure.
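For reference, a single Weisfeiler-Lehman relabeling step, which the filtration kernels build on, can be sketched as follows; the filtration bookkeeping itself is omitted and the helper is illustrative only.

```python
# Illustrative helper: one Weisfeiler-Lehman relabeling step (the filtration
# bookkeeping added by the paper is omitted).
def wl_step(labels, adjacency_list):
    """Each node's new label combines its own label with the sorted multiset of
    neighbor labels; signatures are then compressed to small integer labels."""
    signatures = {v: (labels[v], tuple(sorted(labels[u] for u in nbrs)))
                  for v, nbrs in adjacency_list.items()}
    table = {}
    return {v: table.setdefault(sig, len(table)) for v, sig in signatures.items()}

adj = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2]}   # a 4-node path graph
labels = {v: 0 for v in adj}                   # uniform initial labels
print(wl_step(labels, adj))                    # endpoints and interior nodes now differ
```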
arXiv Detail & Related papers (2021-10-22T15:51:10Z)
- Neighborhood Preserving Kernels for Attributed Graphs [0.9176056742068812]
We describe the design of a reproducing kernel suitable for attributed graphs.
The similarity between two graphs is defined based on the neighborhood information of the graph nodes.
By incorporating the proposed kernel into support vector machines, we analyze real-world data sets.
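Once such a kernel matrix is available, using it with a support vector machine follows the standard precomputed-kernel route; the sketch below uses random placeholder kernel values, not the paper's neighborhood-preserving kernel.

```python
# Hypothetical usage sketch: a precomputed graph-kernel matrix plugged into an SVM
# (the kernel values here are stand-ins, not the paper's kernel).
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)
G = rng.normal(size=(10, 5))
K_train = G @ G.T                       # symmetric PSD stand-in for a kernel matrix
y_train = np.array([0, 1] * 5)          # toy binary labels

clf = SVC(kernel="precomputed")
clf.fit(K_train, y_train)
print(clf.predict(K_train[:3]))         # rows hold kernel values between test and training graphs
```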
arXiv Detail & Related papers (2020-10-13T09:58:50Z)
- Dirichlet Graph Variational Autoencoder [65.94744123832338]
We present Dirichlet Graph Variational Autoencoder (DGVAE) with graph cluster memberships as latent factors.
Motivated by the low-pass characteristics of balanced graph cut, we propose a new GNN variant named Heatts to encode the input graph into cluster memberships.
arXiv Detail & Related papers (2020-10-09T07:35:26Z)
- Graph Pooling with Node Proximity for Hierarchical Representation Learning [80.62181998314547]
We propose a novel graph pooling strategy that leverages node proximity to improve the hierarchical representation learning of graph data with their multi-hop topology.
Results show that the proposed graph pooling strategy is able to achieve state-of-the-art performance on a collection of public graph classification benchmark datasets.
arXiv Detail & Related papers (2020-06-19T13:09:44Z)
- Convolutional Kernel Networks for Graph-Structured Data [37.13712126432493]
We introduce a family of multilayer graph kernels and establish new links between graph convolutional neural networks and kernel methods.
Our approach generalizes convolutional kernel networks to graph-structured data, by representing graphs as a sequence of kernel feature maps.
Our model can also be trained end-to-end on large-scale data, leading to new types of graph convolutional neural networks.
arXiv Detail & Related papers (2020-03-11T09:44:03Z)
- A Hierarchical Transitive-Aligned Graph Kernel for Un-attributed Graphs [11.51839867040302]
We develop a new graph kernel, namely the Hierarchical Transitive-Aligned kernel, by transitively aligning the vertices between graphs.
The proposed kernel can outperform state-of-the-art graph kernels on standard graph-based datasets in terms of classification accuracy.
arXiv Detail & Related papers (2020-02-08T11:46:25Z)