Gaussian Processes on Graphs via Spectral Kernel Learning
- URL: http://arxiv.org/abs/2006.07361v3
- Date: Wed, 28 Oct 2020 10:57:51 GMT
- Title: Gaussian Processes on Graphs via Spectral Kernel Learning
- Authors: Yin-Cong Zhi, Yin Cheng Ng, Xiaowen Dong
- Abstract summary: We propose a graph spectrum-based Gaussian process for prediction of signals defined on nodes of the graph.
We demonstrate the interpretability of the model in synthetic experiments, from which we show that the various ground-truth spectral filters can be accurately recovered.
- Score: 9.260186030255081
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We propose a graph spectrum-based Gaussian process for prediction of signals
defined on nodes of the graph. The model is designed to capture various graph
signal structures through a highly adaptive kernel that incorporates a flexible
polynomial function in the graph spectral domain. Unlike most existing
approaches, we propose to learn such a spectral kernel, where the polynomial
setup enables learning without the need for eigen-decomposition of the graph
Laplacian. In addition, this kernel has the interpretability of graph filtering
achieved by a bespoke maximum likelihood learning algorithm that enforces the
positivity of the spectrum. We demonstrate the interpretability of the model in synthetic experiments, showing that the various ground-truth spectral filters can be accurately recovered, and that this adaptability translates into superior performance in predicting real-world graph data of various characteristics.
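As a concrete illustration of the idea, the sketch below builds a GP prior on graph nodes whose covariance is a polynomial of the graph Laplacian, K = p(L) p(L)^T, so the implied spectrum p(lambda)^2 is non-negative by construction and the kernel can be evaluated without an eigendecomposition. The toy graph, polynomial degree, and coefficients are illustrative placeholders; the paper's bespoke maximum-likelihood algorithm for learning the coefficients is not reproduced here.

```python
import numpy as np

# --- Small illustrative graph (5-node path), not from the paper ---
A = np.zeros((5, 5))
for i in range(4):
    A[i, i + 1] = A[i + 1, i] = 1.0
D = np.diag(A.sum(axis=1))
L = D - A  # combinatorial graph Laplacian

# --- Polynomial spectral kernel: K = p(L) p(L)^T with p(L) = sum_i beta_i L^i ---
# Squaring the filter keeps the implied spectrum p(lambda)^2 >= 0,
# and only matrix powers of L are needed (no eigendecomposition).
beta = np.array([1.0, -0.4, 0.05])      # illustrative coefficients
pL = sum(b * np.linalg.matrix_power(L, i) for i, b in enumerate(beta))
K = pL @ pL.T + 1e-8 * np.eye(5)        # jitter for numerical stability

# --- Standard GP regression on a subset of nodes ---
train, test = [0, 2, 4], [1, 3]
y = np.array([0.3, -0.1, 0.6])          # toy signal values on observed nodes
noise = 1e-2
K_tt = K[np.ix_(train, train)] + noise * np.eye(len(train))
K_st = K[np.ix_(test, train)]
mean_test = K_st @ np.linalg.solve(K_tt, y)
print("posterior mean on held-out nodes:", mean_test)
```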
Related papers
- Graph Generation via Spectral Diffusion [51.60814773299899]
We present GRASP, a novel graph generative model based on 1) the spectral decomposition of the graph Laplacian matrix and 2) a diffusion process.
Specifically, we propose to use a denoising model to sample eigenvectors and eigenvalues from which we can reconstruct the graph Laplacian and adjacency matrix.
Our permutation invariant model can also handle node features by concatenating them to the eigenvectors of each node.
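A minimal sketch of the reconstruction step described above: given eigenvalues and eigenvectors (which GRASP would sample from its denoising model; here they are simply taken from a reference graph), the Laplacian and adjacency matrix are rebuilt and re-binarised. The thresholding rule is an illustrative assumption.

```python
import numpy as np

# Reference graph (4-cycle) standing in for samples from the denoising model
A_ref = np.array([[0, 1, 0, 1],
                  [1, 0, 1, 0],
                  [0, 1, 0, 1],
                  [1, 0, 1, 0]], dtype=float)
L_ref = np.diag(A_ref.sum(1)) - A_ref
eigvals, eigvecs = np.linalg.eigh(L_ref)   # in GRASP these come from the diffusion model

# Reconstruct the Laplacian from (possibly noisy) eigenpairs ...
L_hat = eigvecs @ np.diag(eigvals) @ eigvecs.T
# ... then recover the adjacency: off-diagonal of -L, thresholded to {0, 1}
A_hat = -(L_hat - np.diag(np.diag(L_hat)))
A_hat = (A_hat > 0.5).astype(float)        # illustrative threshold
print(np.allclose(A_hat, A_ref))           # True for noiseless eigenpairs
```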
arXiv Detail & Related papers (2024-02-29T09:26:46Z)
- HoloNets: Spectral Convolutions do extend to Directed Graphs [59.851175771106625]
Conventional wisdom dictates that spectral convolutional networks may only be deployed on undirected graphs.
Here we show this traditional reliance on the graph Fourier transform to be superfluous.
We provide a frequency-response interpretation of newly developed filters, investigate the influence of the basis used to express filters and discuss the interplay with characteristic operators on which networks are based.
arXiv Detail & Related papers (2023-10-03T17:42:09Z)
- Graph Classification Gaussian Processes via Spectral Features [7.474662887810221]
Graph classification aims to categorise graphs based on their structure and node attributes.
In this work, we propose to tackle this task using tools from graph signal processing by deriving spectral features.
We show that even such a simple approach, having no learned parameters, can yield competitive performance compared to strong neural network and graph kernel baselines.
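One way to realise such parameter-free spectral features is a fixed-length summary of the normalised-Laplacian spectrum per graph, which can then be passed to any off-the-shelf classifier (the paper uses Gaussian processes). The histogram featurisation below is an illustrative stand-in for the paper's specific construction.

```python
import numpy as np

def spectral_features(A, bins=8):
    """Histogram of normalised-Laplacian eigenvalues (all lie in [0, 2])."""
    deg = A.sum(axis=1)
    d_inv_sqrt = np.where(deg > 0, deg ** -0.5, 0.0)
    L_norm = np.eye(len(A)) - d_inv_sqrt[:, None] * A * d_inv_sqrt[None, :]
    eigvals = np.linalg.eigvalsh(L_norm)
    hist, _ = np.histogram(eigvals, bins=bins, range=(0.0, 2.0), density=True)
    return hist

# Two toy graphs with different structure give different feature vectors
path = np.diag(np.ones(4), 1) + np.diag(np.ones(4), -1)   # 5-node path
star = np.zeros((5, 5))                                    # 5-node star
star[0, 1:] = star[1:, 0] = 1.0
print(spectral_features(path))
print(spectral_features(star))
```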
arXiv Detail & Related papers (2023-06-06T15:31:05Z)
- Specformer: Spectral Graph Neural Networks Meet Transformers [51.644312964537356]
Spectral graph neural networks (GNNs) learn graph representations via spectral-domain graph convolutions.
We introduce Specformer, which effectively encodes the set of all eigenvalues and performs self-attention in the spectral domain.
By stacking multiple Specformer layers, one can build a powerful spectral GNN.
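The core idea, encoding the eigenvalues as a set and letting them attend to one another before decoding a modulated spectrum, can be sketched as follows. The sinusoidal eigenvalue encoding, the single random-weight attention head, and the way the filtered operator is rebuilt are all simplifications of the actual Specformer architecture.

```python
import numpy as np

rng = np.random.default_rng(0)

def eig_encoding(lam, d=8):
    """Sinusoidal-style encoding of each eigenvalue (simplified)."""
    freqs = 2.0 ** np.arange(d // 2)
    ang = lam[:, None] * freqs[None, :]
    return np.concatenate([np.sin(ang), np.cos(ang)], axis=1)   # (n, d)

def self_attention(Z, Wq, Wk, Wv):
    """Single-head scaled dot-product attention over the eigenvalue set."""
    Q, K, V = Z @ Wq, Z @ Wk, Z @ Wv
    scores = Q @ K.T / np.sqrt(Q.shape[1])
    attn = np.exp(scores - scores.max(axis=1, keepdims=True))
    attn /= attn.sum(axis=1, keepdims=True)
    return attn @ V

# Toy graph spectrum
A = np.array([[0, 1, 1, 0], [1, 0, 1, 0], [1, 1, 0, 1], [0, 0, 1, 0]], float)
L = np.diag(A.sum(1)) - A
lam, U = np.linalg.eigh(L)

Z = eig_encoding(lam)                                  # eigenvalues as tokens
d = Z.shape[1]
Wq, Wk, Wv = (rng.normal(size=(d, d)) * 0.1 for _ in range(3))
H = self_attention(Z, Wq, Wk, Wv)
lam_new = lam + H @ rng.normal(size=(d,)) * 0.1        # decode a modulated spectrum
S_filtered = U @ np.diag(lam_new) @ U.T                # learned spectral filter
print(S_filtered.shape)
```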
arXiv Detail & Related papers (2023-03-02T07:36:23Z)
- Spectral Augmentations for Graph Contrastive Learning [50.149996923976836]
Contrastive learning has emerged as a premier method for learning representations with or without supervision.
Recent studies have shown its utility in graph representation learning for pre-training.
We propose a set of well-motivated graph transformation operations to provide a bank of candidates when constructing augmentations for a graph contrastive objective.
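A toy illustration of the candidate-bank idea: several graph perturbations are generated and, in this sketch, ranked by how little they disturb the Laplacian spectrum. The well-motivated, spectrally informed operations proposed in the paper differ from the generic edge flips used here.

```python
import numpy as np

rng = np.random.default_rng(1)

def spectrum(A):
    L = np.diag(A.sum(1)) - A
    return np.linalg.eigvalsh(L)

def random_edge_flip(A):
    """Toggle one off-diagonal entry symmetrically -- a generic augmentation."""
    n = len(A)
    i, j = rng.choice(n, size=2, replace=False)
    B = A.copy()
    B[i, j] = B[j, i] = 1.0 - B[i, j]
    return B

A = np.array([[0, 1, 1, 0, 0],
              [1, 0, 1, 0, 0],
              [1, 1, 0, 1, 0],
              [0, 0, 1, 0, 1],
              [0, 0, 0, 1, 0]], float)

# Build a bank of candidate augmentations and score them by spectral distortion
bank = [random_edge_flip(A) for _ in range(10)]
base = spectrum(A)
scores = [np.linalg.norm(spectrum(B) - base) for B in bank]
best = bank[int(np.argmin(scores))]       # least spectrally disruptive view
print("spectral distortions:", np.round(scores, 3))
```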
arXiv Detail & Related papers (2023-02-06T16:26:29Z)
- Transductive Kernels for Gaussian Processes on Graphs [7.542220697870243]
We present a novel kernel for graphs with node feature data for semi-supervised learning.
The kernel is derived from a regularization framework by treating the graph and feature data as two spaces.
We show how numerous kernel-based models on graphs are instances of our design.
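A rough sketch of how a graph-regularised kernel and a feature-space kernel can be combined into a single covariance over nodes. The weighted sum of a Laplacian-regularisation kernel and an RBF kernel on node features below is only one illustrative instance, not the paper's exact construction.

```python
import numpy as np

def laplacian_kernel(A, eps=0.1):
    """Graph-regularisation kernel: inverse of (L + eps*I)."""
    L = np.diag(A.sum(1)) - A
    return np.linalg.inv(L + eps * np.eye(len(A)))

def rbf_kernel(X, lengthscale=1.0):
    """Squared-exponential kernel on node feature vectors."""
    sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * sq / lengthscale ** 2)

# Toy graph with 2-d node features
A = np.array([[0, 1, 1, 0], [1, 0, 0, 1], [1, 0, 0, 1], [0, 1, 1, 0]], float)
X = np.array([[0.0, 1.0], [0.1, 0.9], [1.0, 0.0], [0.9, 0.2]])

alpha = 0.5                                # illustrative trade-off weight
K = alpha * laplacian_kernel(A) + (1 - alpha) * rbf_kernel(X)
print(K.shape)                             # a single node-level covariance for a GP
```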
arXiv Detail & Related papers (2022-11-28T14:00:50Z)
- Adaptive Gaussian Processes on Graphs via Spectral Graph Wavelets [3.2498534294827044]
We propose a Gaussian process model using spectral graph wavelets, which can aggregate information at different scales.
We achieve scalability to larger graphs by using a spectrum-adaptive approximation of the filter function, which is designed to yield a low approximation error in dense areas of the graph spectrum.
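To make the idea of applying a spectral wavelet filter without an eigendecomposition concrete, the snippet below applies a band-pass filter g(lambda) to a graph signal via a truncated Chebyshev expansion over [0, 2], the range of the normalised Laplacian's spectrum. The spectrum-adaptive approximation proposed in the paper is replaced here by a plain Chebyshev fit.

```python
import numpy as np
from numpy.polynomial import chebyshev as C

def chebyshev_filter(L_norm, signal, g, order=10):
    """Apply spectral filter g(lambda) to a signal, eigendecomposition-free."""
    # Normalised-Laplacian eigenvalues lie in [0, 2]; shift to [-1, 1].
    t = np.linspace(-1.0, 1.0, 200)
    coeffs = C.chebfit(t, g(t + 1.0), order)       # plain Chebyshev fit of g
    M = L_norm - np.eye(len(L_norm))               # shifted operator
    T_prev, T_curr = signal, M @ signal            # Chebyshev recurrence T0, T1
    out = coeffs[0] * T_prev + coeffs[1] * T_curr
    for k in range(2, order + 1):
        T_prev, T_curr = T_curr, 2.0 * (M @ T_curr) - T_prev
        out = out + coeffs[k] * T_curr
    return out

# Toy graph and a band-pass, wavelet-like kernel centred at lambda = 1
A = np.array([[0, 1, 0, 1], [1, 0, 1, 0], [0, 1, 0, 1], [1, 0, 1, 0]], float)
d = A.sum(1) ** -0.5
L_norm = np.eye(4) - d[:, None] * A * d[None, :]
g = lambda lam: np.exp(-4.0 * (lam - 1.0) ** 2)
x = np.array([1.0, 0.0, 0.0, 0.0])                 # delta signal at node 0
print(chebyshev_filter(L_norm, x, g))
```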
arXiv Detail & Related papers (2021-10-25T09:25:04Z)
- Spectral-Spatial Global Graph Reasoning for Hyperspectral Image Classification [50.899576891296235]
Convolutional neural networks have been widely applied to hyperspectral image classification.
Recent methods attempt to address the limitations of convolution-based models by performing graph convolutions on spatial topologies.
arXiv Detail & Related papers (2021-06-26T06:24:51Z)
- Stacked Graph Filter [19.343260981528186]
We study Graph Convolutional Networks (GCN) from the graph signal processing viewpoint.
We find that by stacking graph filters with learnable solution parameters, we can build a highly adaptive and robust graph classification model.
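A minimal forward-pass sketch of the stacked-filter idea: each layer applies a graph filter with its own learnable coefficient to the node features before a pointwise nonlinearity. The single-parameter filter (I + alpha * A_norm) and the random weights are placeholders for the paper's learnable solution parameters.

```python
import numpy as np

rng = np.random.default_rng(0)

def normalised_adj(A):
    """Symmetrically normalised adjacency with self-loops (GCN-style)."""
    A_hat = A + np.eye(len(A))
    d = A_hat.sum(1) ** -0.5
    return d[:, None] * A_hat * d[None, :]

def stacked_filter_forward(A, X, alphas, weights):
    """Stack graph filters (I + alpha_l * A_norm) with learnable alpha_l."""
    A_norm = normalised_adj(A)
    H = X
    for alpha, W in zip(alphas, weights):
        H = (np.eye(len(A)) + alpha * A_norm) @ H @ W   # filter, then mix channels
        H = np.maximum(H, 0.0)                          # ReLU
    return H.mean(axis=0)                               # readout for a graph-level output

A = np.array([[0, 1, 1], [1, 0, 0], [1, 0, 0]], float)
X = rng.normal(size=(3, 4))
alphas = [0.8, 0.3]                                     # would be learned in practice
weights = [rng.normal(size=(4, 4)) * 0.5, rng.normal(size=(4, 2)) * 0.5]
print(stacked_filter_forward(A, X, alphas, weights))
```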
arXiv Detail & Related papers (2020-11-22T11:20:14Z)
- Fourier-based and Rational Graph Filters for Spectral Processing [0.0]
Our overall goal is the definition of novel Fourier-based filters for graph processing.
For efficient evaluation of discrete spectral-based and wavelet operators, we introduce a spectrum-free approach.
Approximating arbitrary graph filters with rational functions provides a more accurate and numerically stable alternative to polynomial approximations.
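The spectrum-free evaluation alluded to above can be illustrated as follows: a rational filter r(L) = q(L)^{-1} p(L) is applied to a signal by forming p(L)x and then solving the linear system q(L)z = p(L)x, so no eigendecomposition is ever computed. The numerator/denominator coefficients below are arbitrary placeholders.

```python
import numpy as np

def apply_rational_filter(L, x, p_coeffs, q_coeffs):
    """Apply r(L) = q(L)^{-1} p(L) to x via a linear solve (no eigendecomposition)."""
    def poly_of_L(coeffs):
        return sum(c * np.linalg.matrix_power(L, k) for k, c in enumerate(coeffs))
    rhs = poly_of_L(p_coeffs) @ x
    return np.linalg.solve(poly_of_L(q_coeffs), rhs)

# Toy graph Laplacian and signal
A = np.array([[0, 1, 0], [1, 0, 1], [0, 1, 0]], float)
L = np.diag(A.sum(1)) - A
x = np.array([1.0, -1.0, 2.0])

# Illustrative low-pass rational filter r(lambda) = 1 / (1 + 2*lambda)
y = apply_rational_filter(L, x, p_coeffs=[1.0], q_coeffs=[1.0, 2.0])
print(y)
```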
arXiv Detail & Related papers (2020-11-08T19:02:52Z)
- Graph Pooling with Node Proximity for Hierarchical Representation Learning [80.62181998314547]
We propose a novel graph pooling strategy that leverages node proximity to improve the hierarchical representation learning of graph data with their multi-hop topology.
Results show that the proposed graph pooling strategy is able to achieve state-of-the-art performance on a collection of public graph classification benchmark datasets.
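A coarse sketch of proximity-based pooling: multi-hop proximity is read off from powers of the normalised adjacency, nodes are scored by how strongly they connect to the rest of the graph, the top-k nodes are kept, and the graph is coarsened onto them. The scoring rule and coarsening step here are illustrative rather than the paper's exact procedure.

```python
import numpy as np

def proximity_pool(A, X, k=2, hops=2):
    """Keep the k nodes with the largest multi-hop proximity mass."""
    d = (A.sum(1) + 1e-9) ** -0.5
    A_norm = d[:, None] * A * d[None, :]
    # Multi-hop proximity: sum of normalised adjacency powers up to `hops`
    P = sum(np.linalg.matrix_power(A_norm, h) for h in range(1, hops + 1))
    scores = P.sum(axis=1)                       # illustrative node score
    keep = np.argsort(scores)[-k:]
    A_pool = A_norm[np.ix_(keep, keep)]          # coarsened connectivity
    X_pool = X[keep] * scores[keep, None]        # gate kept features by score
    return A_pool, X_pool

A = np.array([[0, 1, 1, 0, 0],
              [1, 0, 1, 0, 0],
              [1, 1, 0, 1, 0],
              [0, 0, 1, 0, 1],
              [0, 0, 0, 1, 0]], float)
X = np.arange(10, dtype=float).reshape(5, 2)
A_pool, X_pool = proximity_pool(A, X)
print(A_pool.shape, X_pool.shape)                # (2, 2) (2, 2)
```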
arXiv Detail & Related papers (2020-06-19T13:09:44Z)