Isotropic Gaussian Processes on Finite Spaces of Graphs
- URL: http://arxiv.org/abs/2211.01689v1
- Date: Thu, 3 Nov 2022 10:18:17 GMT
- Title: Isotropic Gaussian Processes on Finite Spaces of Graphs
- Authors: Viacheslav Borovitskiy, Mohammad Reza Karimi, Vignesh Ram Somnath,
Andreas Krause
- Abstract summary: We propose a principled way to define Gaussian process priors on various sets of unweighted graphs.
We go further to consider sets of equivalence classes of unweighted graphs and define the appropriate versions of priors thereon.
Inspired by applications in chemistry, we illustrate the proposed techniques on a real molecular property prediction task in the small data regime.
- Score: 71.26737403006778
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: We propose a principled way to define Gaussian process priors on various sets
of unweighted graphs: directed or undirected, with or without loops. We endow
each of these sets with a geometric structure, inducing the notions of
closeness and symmetries, by turning them into a vertex set of an appropriate
metagraph. Building on this, we describe the class of priors that respect this
structure and are analogous to the Euclidean isotropic processes, like squared
exponential or Mat\'ern. We propose an efficient computational technique for
the ostensibly intractable problem of evaluating these priors' kernels, making
such Gaussian processes usable within the usual toolboxes and downstream
applications. We go further to consider sets of equivalence classes of
unweighted graphs and define the appropriate versions of priors thereon. We
prove a hardness result, showing that in this case, exact kernel computation
cannot be performed efficiently. However, we propose a simple Monte Carlo
approximation for handling moderately sized cases. Inspired by applications in
chemistry, we illustrate the proposed techniques on a real molecular property
prediction task in the small data regime.
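To make the construction concrete, the following minimal sketch (Python with NumPy) illustrates the flavour of the approach on the smallest nontrivial case; it is not the authors' implementation. It assumes undirected, loopless graphs on n labeled nodes encoded as edge-indicator vectors, so that the metagraph connecting graphs differing by a single edge flip is a hypercube; isotropic kernels are formed from the metagraph Laplacian eigensystem in analogy with graph diffusion and Mat\'ern kernels, and a naive Monte Carlo average over node permutations stands in for the equivalence-class kernel whose exact computation the paper shows to be hard. All function names and hyperparameters below are invented for illustration.
```python
import itertools
import numpy as np

n = 3                                   # number of nodes; the metagraph has 2**d vertices
edges = list(itertools.combinations(range(n), 2))
d = len(edges)                          # d = n(n-1)/2 possible edges

# Every undirected loopless graph on n labeled nodes is a binary edge-indicator vector.
graphs = np.array(list(itertools.product([0, 1], repeat=d)))
index = {tuple(g): i for i, g in enumerate(graphs)}

# Metagraph: two graphs are adjacent iff they differ in exactly one edge (a hypercube).
hamming = (graphs[:, None, :] != graphs[None, :, :]).sum(-1)
A = (hamming == 1).astype(float)
L = np.diag(A.sum(axis=1)) - A          # metagraph Laplacian

# Isotropic priors are built from the Laplacian eigensystem, mirroring the Euclidean case.
lam, Phi = np.linalg.eigh(L)

def diffusion_kernel(t=0.5):
    """Heat/diffusion kernel: the graph analogue of the squared exponential."""
    return Phi @ np.diag(np.exp(-t * lam)) @ Phi.T

def matern_kernel(nu=1.5, kappa=1.0):
    """Matérn-type kernel: a regularized inverse power of the Laplacian."""
    return Phi @ np.diag((2.0 * nu / kappa**2 + lam) ** (-nu)) @ Phi.T

K = matern_kernel()

def permute_graph(x, perm):
    """Relabel the nodes of the graph encoded by edge-indicator vector x."""
    out = np.zeros(d, dtype=int)
    for k, (i, j) in enumerate(edges):
        a, b = sorted((int(perm[i]), int(perm[j])))
        out[edges.index((a, b))] = x[k]
    return out

def mc_class_kernel(x, y, num_samples=200, seed=0):
    """Naive Monte Carlo stand-in for the kernel between isomorphism classes [x], [y]."""
    rng = np.random.default_rng(seed)
    i = index[tuple(x)]
    vals = [K[i, index[tuple(permute_graph(y, rng.permutation(n)))]]
            for _ in range(num_samples)]
    return float(np.mean(vals))

# Example: compare the triangle with a 2-edge path on 3 nodes, up to relabeling.
triangle, path = (1, 1, 1), (1, 1, 0)
print(mc_class_kernel(triangle, path))
```
The brute-force enumeration and dense eigendecomposition above are only feasible for toy sizes; the paper's contribution is an efficient computational technique that avoids exactly this blow-up.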
Related papers
- Posterior Contraction Rates for Mat\'ern Gaussian Processes on
Riemannian Manifolds [51.68005047958965]
We show that intrinsic Gaussian processes can achieve better performance in practice than their extrinsic counterparts.
Our work shows that finer-grained analyses are needed to distinguish between different levels of data efficiency.
arXiv Detail & Related papers (2023-09-19T20:30:58Z)
- Gaussian Processes and Statistical Decision-making in Non-Euclidean Spaces [96.53463532832939]
We develop techniques for broadening the applicability of Gaussian processes.
We introduce a wide class of efficient approximations built from this viewpoint.
We develop a collection of Gaussian process models over non-Euclidean spaces.
arXiv Detail & Related papers (2022-02-22T01:42:57Z)
- Pathwise Conditioning of Gaussian Processes [72.61885354624604]
Conventional approaches for simulating Gaussian process posteriors view samples as draws from marginal distributions of process values at finite sets of input locations.
This distribution-centric characterization leads to generative strategies that scale cubically in the size of the desired random vector.
We show how a pathwise interpretation of conditioning, which updates prior sample paths directly instead of sampling from finite-dimensional marginals, gives rise to a general family of approximations that lend themselves to efficiently sampling Gaussian process posteriors (a minimal illustrative sketch follows this list).
arXiv Detail & Related papers (2020-11-08T17:09:37Z)
- Mat\'ern Gaussian Processes on Graphs [67.13902825728718]
We leverage the partial differential equation characterization of Mat\'ern Gaussian processes to study their analogs for undirected graphs.
We show that the resulting Gaussian processes inherit various attractive properties of their Euclidean and Riemannian analogs.
This enables graph Mat\'ern Gaussian processes to be employed in mini-batch and non-conjugate settings.
arXiv Detail & Related papers (2020-10-29T13:08:07Z)
- Semi-discrete optimization through semi-discrete optimal transport: a framework for neural architecture search [0.0]
We introduce a theoretical framework for semi-discrete optimization using ideas from optimal transport.
Our primary motivation is in the field of deep learning, and specifically in the task of neural architecture search.
In a companion paper we show that algorithms inspired by our framework are competitive with contemporaneous methods.
arXiv Detail & Related papers (2020-06-26T21:44:35Z)
- Mat\'ern Gaussian processes on Riemannian manifolds [81.15349473870816]
We show how to generalize the widely-used Mat\'ern class of Gaussian processes.
We also extend the generalization from the Mat\'ern to the widely-used squared exponential process.
arXiv Detail & Related papers (2020-06-17T21:05:42Z)
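For the pathwise conditioning entry above, here is a minimal sketch of the underlying idea (Matheron's rule): a posterior sample is obtained by updating a prior sample with a data-dependent correction, rather than by sampling posterior marginals. For brevity the prior sample below is drawn exactly; the paper pairs this update with cheap approximate prior samples to obtain its efficiency gains. All names and values are illustrative.
```python
import numpy as np

def rbf_kernel(a, b, lengthscale=0.5):
    """Squared exponential kernel between 1-D input arrays."""
    return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / lengthscale**2)

rng = np.random.default_rng(0)
X = np.array([-1.0, 0.0, 1.0])            # training inputs
y = np.sin(3.0 * X)                        # training targets
Xs = np.linspace(-2.0, 2.0, 200)           # test inputs
noise = 1e-2

# Draw one joint prior sample over training and test locations.
Z = np.concatenate([X, Xs])
K_joint = rbf_kernel(Z, Z) + 1e-10 * np.eye(len(Z))
f_joint = np.linalg.cholesky(K_joint) @ rng.standard_normal(len(Z))
f_train, f_test = f_joint[:len(X)], f_joint[len(X):]

# Matheron's rule: posterior sample = prior sample + data-driven correction.
eps = np.sqrt(noise) * rng.standard_normal(len(X))
K_xx = rbf_kernel(X, X) + noise * np.eye(len(X))
K_sx = rbf_kernel(Xs, X)
f_posterior = f_test + K_sx @ np.linalg.solve(K_xx, y - (f_train + eps))
```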
This list is automatically generated from the titles and abstracts of the papers on this site.