Implicit Manifold Gaussian Process Regression
- URL: http://arxiv.org/abs/2310.19390v2
- Date: Thu, 1 Feb 2024 09:35:33 GMT
- Title: Implicit Manifold Gaussian Process Regression
- Authors: Bernardo Fichera, Viacheslav Borovitskiy, Andreas Krause, Aude Billard
- Abstract summary: Gaussian process regression is widely used to provide well-calibrated uncertainty estimates.
However, it struggles with high-dimensional data; one way to scale it up is to exploit the implicit low-dimensional manifold upon which the data actually lies.
In this paper we propose a technique capable of inferring implicit structure directly from data (labeled and unlabeled) in a fully differentiable way.
- Score: 49.0787777751317
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Gaussian process regression is widely used because of its ability to provide
well-calibrated uncertainty estimates and handle small or sparse datasets.
However, it struggles with high-dimensional data. One possible way to scale
this technique to higher dimensions is to leverage the implicit low-dimensional
manifold upon which the data actually lies, as postulated by the manifold
hypothesis. Prior work, however, ordinarily requires the manifold structure to
be explicitly provided, i.e. given by a mesh or known to be one of the
well-studied manifolds such as the sphere. In contrast, in this paper we propose a
Gaussian process regression technique capable of inferring implicit structure
directly from data (labeled and unlabeled) in a fully differentiable way. For
the resulting model, we discuss its convergence to the Matérn Gaussian
process on the assumed manifold. Our technique scales up to hundreds of
thousands of data points, and may improve the predictive performance and
calibration of the standard Gaussian process regression in high-dimensional
settings.
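The abstract's baseline, standard (Euclidean) Gaussian process regression with a Matérn kernel, can be sketched in plain NumPy. This is not the paper's manifold-aware model; the Matérn-5/2 form, lengthscale, variance, and noise values below are illustrative placeholders.

```python
import numpy as np

def matern52(X1, X2, lengthscale=1.0, variance=1.0):
    """Matern-5/2 kernel between two point sets (rows are points)."""
    d = np.sqrt(np.maximum(
        ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1), 0.0))
    s = np.sqrt(5.0) * d / lengthscale
    return variance * (1.0 + s + s**2 / 3.0) * np.exp(-s)

def gp_posterior(X, y, X_star, noise=1e-2):
    """Exact GP posterior mean and variance at test points X_star."""
    K = matern52(X, X) + noise * np.eye(len(X))
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    K_s = matern52(X, X_star)
    mean = K_s.T @ alpha
    v = np.linalg.solve(L, K_s)
    var = matern52(X_star, X_star).diagonal() - (v**2).sum(0)
    return mean, var

# 1D toy problem: noisy observations of sin(x).
rng = np.random.default_rng(0)
X = rng.uniform(0, 6, size=(30, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(30)
X_star = np.linspace(0, 6, 50)[:, None]
mean, var = gp_posterior(X, y, X_star)
```

The exact Cholesky-based solve is cubic in the number of training points, which is the scaling bottleneck the paper's technique (and several of the related papers below) work around.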
Related papers
- von Mises Quasi-Processes for Bayesian Circular Regression [57.88921637944379]
We explore a family of expressive and interpretable distributions over circle-valued random functions.
The resulting probability model has connections with continuous spin models in statistical physics.
For posterior inference, we introduce a new Stratonovich-like augmentation that lends itself to fast Markov Chain Monte Carlo sampling.
arXiv Detail & Related papers (2024-06-19T01:57:21Z)
- Sketching the Heat Kernel: Using Gaussian Processes to Embed Data [4.220336689294244]
We introduce a novel, non-deterministic method for embedding data in low-dimensional Euclidean space based on realizations of a Gaussian process depending on the geometry of the data.
Our method demonstrates further advantage in its robustness to outliers.
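The paper's embedding is probabilistic, but the deterministic backbone such heat-kernel methods build on — coordinates from the low eigenpairs of a graph Laplacian, weighted by the heat-kernel decay exp(-t·λ) — can be sketched as follows. The k-NN graph size, bandwidth choice, and diffusion time t are illustrative assumptions, not the authors' settings.

```python
import numpy as np

def heat_kernel_embedding(X, n_neighbors=10, dim=2, t=1.0):
    """Embed rows of X via low eigenpairs of a graph Laplacian,
    scaled by the heat-kernel factor exp(-t * lambda)."""
    n = len(X)
    D2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    # Symmetric k-NN adjacency with Gaussian weights.
    W = np.zeros((n, n))
    for i in range(n):
        idx = np.argsort(D2[i])[1:n_neighbors + 1]  # skip self
        W[i, idx] = np.exp(-D2[i, idx] / np.median(D2[i, idx]))
    W = np.maximum(W, W.T)
    L = np.diag(W.sum(1)) - W          # unnormalized graph Laplacian
    lam, U = np.linalg.eigh(L)
    # Skip the constant eigenvector; damp coordinates by heat decay.
    return U[:, 1:dim + 1] * np.exp(-t * lam[1:dim + 1])

# Noisy circle in 3D embeds to roughly 2D coordinates.
rng = np.random.default_rng(1)
theta = rng.uniform(0, 2 * np.pi, 200)
X = np.stack([np.cos(theta), np.sin(theta),
              0.05 * rng.standard_normal(200)], axis=1)
Y = heat_kernel_embedding(X)
```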
arXiv Detail & Related papers (2024-03-01T22:56:19Z)
- Scaling Riemannian Diffusion Models [68.52820280448991]
We show that our method enables us to scale to high dimensional tasks on nontrivial manifolds.
We model QCD densities on $SU(n)$ lattices and contrastively learned embeddings on high dimensional hyperspheres.
arXiv Detail & Related papers (2023-10-30T21:27:53Z)
- Canonical normalizing flows for manifold learning [14.377143992248222]
We propose a canonical manifold learning flow method, where a novel objective enforces the transformation matrix to have few prominent and non-degenerate basis functions.
Canonical manifold flow yields a more efficient use of the latent space, automatically generating fewer prominent and distinct dimensions to represent data.
arXiv Detail & Related papers (2023-10-19T13:48:05Z)
- Scale invariant process regression [0.0]
We propose a novel regression method that does not require specification of a kernel, length scale, variance, nor prior mean.
Experiments show that it is possible to derive a working machine learning method by assuming nothing but regularity, scale invariance, and translation invariance.
arXiv Detail & Related papers (2022-08-22T17:32:33Z)
- Probabilistic Registration for Gaussian Process 3D shape modelling in the presence of extensive missing data [63.8376359764052]
We propose a shape fitting/registration method based on a Gaussian Processes formulation, suitable for shapes with extensive regions of missing data.
Experiments are conducted both for a 2D small dataset with diverse transformations and a 3D dataset of ears.
arXiv Detail & Related papers (2022-03-26T16:48:27Z)
- Adaptive Cholesky Gaussian Processes [7.684183064816171]
We present a method to fit exact Gaussian process models to large datasets by considering only a subset of the data.
Our approach is novel in that the size of the subset is selected on the fly during exact inference with little computational overhead.
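The general idea — grow the training subset chunk by chunk, extend the Cholesky factor incrementally rather than refactorizing, and stop once predictions stabilize — can be sketched as below. The RBF kernel, chunk size, tolerance, and stopping rule are simplified stand-ins, not the authors' actual criterion.

```python
import numpy as np

def rbf(A, B, ls=1.0):
    """Squared-exponential kernel between two point sets."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / ls**2)

def adaptive_gp_mean(X, y, X_star, chunk=40, tol=1e-3, noise=1e-2):
    """Grow the subset chunk by chunk, extending the Cholesky factor
    via the block formula, until the predictive mean stops changing."""
    n, L, prev, m = len(X), None, None, 0
    while m < n:
        hi = min(m + chunk, n)
        K22 = rbf(X[m:hi], X[m:hi]) + noise * np.eye(hi - m)
        if m == 0:
            L = np.linalg.cholesky(K22)
        else:
            K12 = rbf(X[:m], X[m:hi])
            B = np.linalg.solve(L, K12).T
            C = np.linalg.cholesky(K22 - B @ B.T)  # Schur complement
            L = np.block([[L, np.zeros((m, hi - m))], [B, C]])
        m = hi
        alpha = np.linalg.solve(L.T, np.linalg.solve(L, y[:m]))
        mean = rbf(X[:m], X_star).T @ alpha
        if prev is not None and np.abs(mean - prev).max() < tol:
            break  # predictions stabilized: stop before using all data
        prev = mean
    return mean, m

# Toy run: how much of the data the criterion ends up using.
rng = np.random.default_rng(2)
X = np.sort(rng.uniform(0, 8, 200))[:, None]
y = np.sin(X[:, 0]) + 0.05 * rng.standard_normal(200)
X_star = np.linspace(0, 8, 25)[:, None]
mean, used = adaptive_gp_mean(X, y, X_star)
```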
arXiv Detail & Related papers (2022-02-22T09:43:46Z)
- Gaussian Processes and Statistical Decision-making in Non-Euclidean Spaces [96.53463532832939]
We develop techniques for broadening the applicability of Gaussian processes.
We introduce a wide class of efficient approximations built from this viewpoint.
We develop a collection of Gaussian process models over non-Euclidean spaces.
arXiv Detail & Related papers (2022-02-22T01:42:57Z)
- Inferring Manifolds From Noisy Data Using Gaussian Processes [17.166283428199634]
Most existing manifold learning algorithms replace the original data with lower dimensional coordinates.
This article proposes a new methodology for addressing these problems, allowing interpolation of the estimated manifold between fitted data points.
arXiv Detail & Related papers (2021-10-14T15:50:38Z)
- Manifold Learning via Manifold Deflation [105.7418091051558]
Dimensionality reduction methods provide a valuable means to visualize and interpret high-dimensional data.
Many popular methods can fail dramatically, even on simple two-dimensional manifolds.
This paper presents an embedding method built on a novel, incremental tangent space estimator that incorporates global structure as coordinates.
Empirically, we show our algorithm recovers novel and interesting embeddings on real-world and synthetic datasets.
arXiv Detail & Related papers (2020-07-07T10:04:28Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences arising from its use.