Non-separable Spatio-temporal Graph Kernels via SPDEs
- URL: http://arxiv.org/abs/2111.08524v1
- Date: Tue, 16 Nov 2021 14:53:19 GMT
- Title: Non-separable Spatio-temporal Graph Kernels via SPDEs
- Authors: Alexander Nikitin, ST John, Arno Solin, Samuel Kaski
- Abstract summary: We provide graph kernels for principled spatio-temporal modelling on graphs.
By providing novel tools for modelling on graphs, we outperform pre-existing graph kernels in real-world applications.
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Gaussian processes (GPs) provide a principled and direct approach for
inference and learning on graphs. However, the lack of justified graph kernels
for spatio-temporal modelling has held back their use in graph problems. We
leverage an explicit link between stochastic partial differential equations
(SPDEs) and GPs on graphs, and derive non-separable spatio-temporal graph
kernels that capture interaction across space and time. We formulate the graph
kernels for the stochastic heat equation and wave equation. We show that by
providing novel tools for spatio-temporal GP modelling on graphs, we outperform
pre-existing graph kernels in real-world applications that feature diffusion,
oscillation, and other complicated interactions.
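The abstract's key idea is the link between an SPDE on a graph and a GP covariance. As a minimal illustrative sketch (not the paper's exact formulation), consider the stochastic heat equation du = -L u dt + dW on a graph with Laplacian L. Per eigenpair (λ_n, φ_n) of L, the stationary covariance at time lag |s - t| is exp(-λ_n |s - t|) / (2 λ_n), so the resulting kernel couples space and time through λ_n and is non-separable. The function names and the jitter term below are illustrative choices, not from the paper:

```python
import numpy as np

def graph_laplacian(adj):
    """Combinatorial Laplacian L = D - A of an undirected graph."""
    deg = np.diag(adj.sum(axis=1))
    return deg - adj

def stochastic_heat_kernel(adj, s, t, jitter=1e-2):
    """Cross-covariance K[i, j] = k((i, s), (j, t)) for the stationary
    stochastic heat equation du = -L u dt + dW on the graph.

    Each Laplacian eigenmode behaves as an Ornstein-Uhlenbeck process,
    so k = sum_n phi_n(i) phi_n(j) * exp(-lam_n |s - t|) / (2 lam_n)."""
    L = graph_laplacian(adj)
    lam, phi = np.linalg.eigh(L)        # eigenpairs of the symmetric Laplacian
    lam = lam + jitter                  # regularise the zero (constant) mode
    weights = np.exp(-lam * abs(s - t)) / (2.0 * lam)
    return (phi * weights) @ phi.T      # sum_n weights[n] * phi_n phi_n^T

# Toy example: path graph on 4 nodes
adj = np.array([[0., 1., 0., 0.],
                [1., 0., 1., 0.],
                [0., 1., 0., 1.],
                [0., 0., 1., 0.]])
K_same = stochastic_heat_kernel(adj, 0.0, 0.0)   # covariance at equal times
K_lag = stochastic_heat_kernel(adj, 0.0, 1.0)    # covariance at time lag 1
```

Because λ_n appears in both the spatial factor (through the eigenvectors it indexes) and the temporal decay, the kernel cannot be factored into a product of a purely spatial and a purely temporal kernel, which is exactly the non-separability the abstract emphasises.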
Related papers
- Advancing Graph Generation through Beta Diffusion [49.49740940068255]
Graph Beta Diffusion (GBD) is a generative model specifically designed to handle the diverse nature of graph data.
We propose a modulation technique that enhances the realism of generated graphs by stabilizing critical graph topology.
arXiv Detail & Related papers (2024-06-13T17:42:57Z)
- Advective Diffusion Transformers for Topological Generalization in Graph Learning [69.2894350228753]
We show how graph diffusion equations extrapolate and generalize in the presence of varying graph topologies.
We propose a novel graph encoder backbone, Advective Diffusion Transformer (ADiT), inspired by advective graph diffusion equations.
arXiv Detail & Related papers (2023-10-10T08:40:47Z)
- Graph Neural Stochastic Differential Equations [3.568455515949288]
We present a novel model, Graph Neural Stochastic Differential Equations (Graph Neural SDEs).
This technique enhances the Graph Neural Ordinary Differential Equations (Graph Neural ODEs) by embedding randomness into data representation using Brownian motion.
We find that Latent Graph Neural SDEs surpass conventional models like Graph Convolutional Networks and Graph Neural ODEs, especially in confidence prediction.
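The blurb above describes embedding randomness into node representations via Brownian motion. As a generic sketch (not the paper's architecture), an SDE dh = f(h) dt + g(h) dW can be integrated with Euler-Maruyama steps; the graph-diffusion drift and constant noise scale below are hypothetical choices for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

def euler_maruyama(h0, drift, diffusion, dt, n_steps):
    """Integrate dh = drift(h) dt + diffusion(h) dW with Euler-Maruyama."""
    h = h0.copy()
    for _ in range(n_steps):
        dW = rng.normal(scale=np.sqrt(dt), size=h.shape)  # Brownian increment
        h = h + drift(h) * dt + diffusion(h) * dW
    return h

# Toy 3-node path graph; drift pulls node states toward neighbour averages
adj = np.array([[0., 1., 0.], [1., 0., 1.], [0., 1., 0.]])
deg = adj.sum(axis=1, keepdims=True)
drift = lambda h: adj @ h / deg - h          # graph diffusion toward neighbours
diffusion = lambda h: 0.1 * np.ones_like(h)  # constant noise scale

h0 = np.array([[1.0], [0.0], [-1.0]])
hT = euler_maruyama(h0, drift, diffusion, dt=0.01, n_steps=100)
```

The stochastic term is what distinguishes this from a Graph Neural ODE: repeated integrations from the same initial state yield a distribution over final representations, which is the source of the confidence estimates the summary mentions.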
arXiv Detail & Related papers (2023-08-23T09:20:38Z) - Conditional Diffusion Based on Discrete Graph Structures for Molecular
Graph Generation [32.66694406638287]
We propose a Conditional Diffusion model based on discrete Graph Structures (CDGS) for molecular graph generation.
Specifically, we construct a forward graph diffusion process on both graph structures and inherent features through stochastic differential equations (SDEs).
We present a specialized hybrid graph noise prediction model that extracts the global context and the local node-edge dependency from intermediate graph states.
arXiv Detail & Related papers (2023-01-01T15:24:15Z) - Transductive Kernels for Gaussian Processes on Graphs [7.542220697870243]
We present a novel kernel for graphs with node feature data for semi-supervised learning.
The kernel is derived from a regularization framework by treating the graph and feature data as two spaces.
We show how numerous kernel-based models on graphs are instances of our design.
arXiv Detail & Related papers (2022-11-28T14:00:50Z) - Score-based Generative Modeling of Graphs via the System of Stochastic
Differential Equations [57.15855198512551]
We propose a novel score-based generative model for graphs with a continuous-time framework.
We show that our method is able to generate molecules that lie close to the training distribution yet do not violate the chemical valency rule.
arXiv Detail & Related papers (2022-02-05T08:21:04Z) - Evolving-Graph Gaussian Processes [20.065168755580558]
Existing approaches have focused on static structures, whereas many real graph data represent dynamic structures, limiting the applications of graph Gaussian processes (GGPs).
We propose evolving-Graph Gaussian Processes (e-GGPs) to overcome this.
We demonstrate the benefits of e-GGPs over static graph Gaussian Process approaches.
arXiv Detail & Related papers (2021-06-29T07:16:04Z) - Hyperbolic Variational Graph Neural Network for Modeling Dynamic Graphs [77.33781731432163]
We learn dynamic graph representations in hyperbolic space, for the first time, with the aim of inferring node representations.
We present a novel Hyperbolic Variational Graph Network, referred to as HVGNN.
In particular, to model the dynamics, we introduce a Temporal GNN (TGNN) based on a theoretically grounded time encoding approach.
arXiv Detail & Related papers (2021-04-06T01:44:15Z)
- Dirichlet Graph Variational Autoencoder [65.94744123832338]
We present Dirichlet Graph Variational Autoencoder (DGVAE) with graph cluster memberships as latent factors.
Motivated by the low pass characteristics in balanced graph cut, we propose a new variant of GNN named Heatts to encode the input graph into cluster memberships.
arXiv Detail & Related papers (2020-10-09T07:35:26Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed content (including all information) and is not responsible for any consequences of its use.