Intrinsic Gaussian Processes on Manifolds and Their Accelerations by
Symmetry
- URL: http://arxiv.org/abs/2006.14266v2
- Date: Wed, 31 Jan 2024 16:28:49 GMT
- Title: Intrinsic Gaussian Processes on Manifolds and Their Accelerations by
Symmetry
- Authors: Ke Ye, Mu Niu, Pokman Cheung, Zhenwen Dai, Yuan Liu
- Abstract summary: Existing methods primarily focus on low dimensional constrained domains for heat kernel estimation.
Our research proposes an intrinsic approach for constructing GP on general manifolds.
Our methodology estimates the heat kernel by simulating Brownian motion sample paths using the exponential map.
- Score: 9.773237080061815
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Amidst the growing interest in nonparametric regression, we address a
significant challenge in Gaussian processes (GPs) applied to manifold-based
predictors. Existing methods primarily focus on low dimensional constrained
domains for heat kernel estimation, limiting their effectiveness in
higher-dimensional manifolds. Our research proposes an intrinsic approach for
constructing GP on general manifolds such as orthogonal groups, unitary groups,
Stiefel manifolds and Grassmannian manifolds. Our methodology estimates the
heat kernel by simulating Brownian motion sample paths using the exponential
map, ensuring independence from the manifold's embedding. The introduction of
our strip algorithm, tailored for manifolds with extra symmetries, and the ball
algorithm, designed for arbitrary manifolds, constitutes our significant
contribution. Both algorithms are rigorously substantiated through theoretical
proofs and numerical testing, with the strip algorithm showcasing remarkable
efficiency gains over traditional methods. This intrinsic approach delivers
several key advantages, including applicability to high-dimensional manifolds
and the elimination of any need for a global parametrization or embedding. We
demonstrate its practicality through regression case studies (torus knots and
eight-dimensional projective spaces) and by developing binary classifiers for
real-world datasets (planar images of gorilla skulls and diffusion tensor images).
These classifiers outperform traditional methods, particularly in limited data
scenarios.
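The core computational step described in the abstract, simulating Brownian motion with the exponential map and turning endpoint counts into a heat kernel estimate, is straightforward to prototype. Below is a minimal sketch of a ball-style estimator on the unit sphere S^2, chosen because its exponential map has a closed form; the function names, the diffusion time-scaling convention, and the Monte Carlo settings are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def exp_map_sphere(x, v):
    """Exponential map on the unit sphere S^2: follow the geodesic from x along v."""
    norm = np.linalg.norm(v)
    if norm < 1e-12:
        return x
    return np.cos(norm) * x + np.sin(norm) * (v / norm)

def tangent_gaussian(x, var, rng):
    """Sample a Gaussian step in the tangent plane at x (normal component projected out)."""
    v = rng.normal(scale=np.sqrt(var), size=3)
    return v - np.dot(v, x) * x

def brownian_endpoints(x0, t, n_steps, n_paths, rng):
    """Simulate Brownian motion on S^2 by alternating tangent steps and exponential maps."""
    dt = t / n_steps
    ends = np.empty((n_paths, 3))
    for i in range(n_paths):
        x = np.asarray(x0, dtype=float)
        for _ in range(n_steps):
            x = exp_map_sphere(x, tangent_gaussian(x, dt, rng))
        ends[i] = x
    return ends

def heat_kernel_ball(x, y, t, eps=0.2, n_steps=50, n_paths=5000, seed=0):
    """Ball-style Monte Carlo estimate of the heat kernel value at (x, y):
    fraction of path endpoints inside a geodesic eps-ball around y,
    divided by the ball's surface area (generator convention kept loose here)."""
    rng = np.random.default_rng(seed)
    ends = brownian_endpoints(x, t, n_steps, n_paths, rng)
    geo_dist = np.arccos(np.clip(ends @ np.asarray(y, dtype=float), -1.0, 1.0))
    ball_area = 2.0 * np.pi * (1.0 - np.cos(eps))  # area of a geodesic cap on S^2
    return np.mean(geo_dist < eps) / ball_area

# The estimate can then serve as a GP covariance, e.g.
# K[i, j] = heat_kernel_ball(X[i], X[j], t) for training inputs X lying on the manifold.
```

The strip algorithm mentioned in the abstract accelerates this step on manifolds with extra symmetries; the sketch above corresponds only to the generic ball-style estimate.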
Related papers
- Scaling Riemannian Diffusion Models [68.52820280448991]
We show that our method enables us to scale to high dimensional tasks on nontrivial manifolds.
We model QCD densities on $SU(n)$ lattices and contrastively learned embeddings on high dimensional hyperspheres.
arXiv Detail & Related papers (2023-10-30T21:27:53Z) - Generative Modeling on Manifolds Through Mixture of Riemannian Diffusion Processes [57.396578974401734]
We introduce a principled framework for building a generative diffusion process on general manifolds.
Instead of following the denoising approach of previous diffusion models, we construct a diffusion process using a mixture of bridge processes.
We develop a geometric understanding of the mixture process, deriving the drift as a weighted mean of tangent directions to the data points.
arXiv Detail & Related papers (2023-10-11T06:04:40Z) - Posterior Contraction Rates for Matérn Gaussian Processes on
Riemannian Manifolds [51.68005047958965]
We show that intrinsic Gaussian processes can achieve better performance in practice.
Our work shows that finer-grained analyses are needed to distinguish between different levels of data-efficiency.
arXiv Detail & Related papers (2023-09-19T20:30:58Z) - Extrinsic Bayesian Optimizations on Manifolds [1.3477333339913569]
We propose an extrinsic Bayesian optimization (eBO) framework for general optimization problems on manifolds.
Our approach is to employ extrinsic Gaussian processes by first embedding the manifold onto some higher dimensional Euclidean space.
This leads to efficient and scalable algorithms for optimization over complex manifolds.
arXiv Detail & Related papers (2022-12-21T06:10:12Z) - The Dynamics of Riemannian Robbins-Monro Algorithms [101.29301565229265]
We propose a family of Riemannian algorithms generalizing and extending the seminal stochastic approximation framework of Robbins and Monro.
Compared to their Euclidean counterparts, Riemannian algorithms are much less understood due to the lack of a global linear structure on the manifold.
We provide a general template of almost sure convergence results that mirrors and extends the existing theory for Euclidean Robbins-Monro schemes.
arXiv Detail & Related papers (2022-06-14T12:30:11Z) - Vector-valued Gaussian Processes on Riemannian Manifolds via Gauge
Equivariant Projected Kernels [108.60991563944351]
We present a recipe for constructing gauge equivariant kernels, which induce vector-valued Gaussian processes coherent with geometry.
We extend standard Gaussian process training methods, such as variational inference, to this setting.
arXiv Detail & Related papers (2021-10-27T13:31:10Z) - Inferring Manifolds From Noisy Data Using Gaussian Processes [17.166283428199634]
Most existing manifold learning algorithms replace the original data with lower dimensional coordinates.
This article proposes a new methodology for addressing these problems, allowing interpolation of the estimated manifold between fitted data points.
arXiv Detail & Related papers (2021-10-14T15:50:38Z) - Manifold learning-based polynomial chaos expansions for high-dimensional
surrogate models [0.0]
We introduce a manifold learning-based method for uncertainty quantification (UQ).
The proposed method is able to achieve highly accurate approximations which ultimately lead to the significant acceleration of UQ tasks.
arXiv Detail & Related papers (2021-07-21T00:24:15Z) - Matérn Gaussian processes on Riemannian manifolds [81.15349473870816]
We show how to generalize the widely-used Matérn class of Gaussian processes (one standard spectral construction is sketched after this list).
We also extend the generalization from the Matérn to the widely-used squared exponential process.
arXiv Detail & Related papers (2020-06-17T21:05:42Z)
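For the Matérn generalization in the last entry, one standard realization on compact manifolds works through the Laplace-Beltrami spectrum, weighting each eigenfunction pair by a Matérn-type spectral density. The sketch below shows this construction on the circle S^1, where the eigenpairs are known in closed form; the parameter names, the truncation level, and the normalization are illustrative choices rather than details taken from the paper.

```python
import numpy as np

def matern_kernel_circle(theta1, theta2, nu=1.5, kappa=0.5, n_terms=200):
    """Matérn covariance on the circle S^1 via a truncated spectral sum
    over Laplacian eigenpairs (eigenvalues n^2, eigenfunctions cos/sin(n*theta))."""
    d = 1  # dimension of S^1
    n = np.arange(n_terms + 1)
    spectral_weights = (2.0 * nu / kappa**2 + n**2) ** (-(nu + d / 2.0))
    # cos(n a) cos(n b) + sin(n a) sin(n b) = cos(n (a - b))
    diff = theta1 - theta2
    k = spectral_weights[0] + 2.0 * np.sum(spectral_weights[1:] * np.cos(n[1:] * diff))
    # normalize so that k(theta, theta) = 1
    k_diag = spectral_weights[0] + 2.0 * np.sum(spectral_weights[1:])
    return k / k_diag

# Example: covariance between two angles a quarter turn apart.
print(matern_kernel_circle(0.0, np.pi / 2))
```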