Optimization on Manifolds via Graph Gaussian Processes
- URL: http://arxiv.org/abs/2210.10962v3
- Date: Thu, 9 Nov 2023 02:13:21 GMT
- Title: Optimization on Manifolds via Graph Gaussian Processes
- Authors: Hwanwoo Kim, Daniel Sanz-Alonso, and Ruiyi Yang
- Abstract summary: This paper integrates manifold learning techniques within a Gaussian process upper confidence bound algorithm to optimize an objective function on a manifold.
- Score: 4.471962177124311
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: This paper integrates manifold learning techniques within a \emph{Gaussian
process upper confidence bound} algorithm to optimize an objective function on
a manifold. Our approach is motivated by applications where a full
representation of the manifold is not available and querying the objective is
expensive. We rely on a point cloud of manifold samples to define a graph
Gaussian process surrogate model for the objective. Query points are
sequentially chosen using the posterior distribution of the surrogate model
given all previous queries. We establish regret bounds in terms of the number
of queries and the size of the point cloud. Several numerical examples
complement the theory and illustrate the performance of our method.
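The setup described in the abstract can be sketched in a few dozen lines: build a k-NN graph over the point cloud, define a graph Matérn-type Gaussian process through the graph Laplacian, and choose queries by an upper confidence bound. The circle point cloud, the toy objective, the kernel hyperparameters, and the constant exploration weight are all illustrative assumptions, not the paper's exact construction.

```python
import numpy as np

rng = np.random.default_rng(0)

# Point cloud sampled from a manifold (here: a circle embedded in R^2).
n = 200
theta = rng.uniform(0, 2 * np.pi, n)
X = np.column_stack([np.cos(theta), np.sin(theta)])

def objective(idx):
    # Expensive black-box objective, queried only at point-cloud indices.
    return np.cos(3 * theta[idx])

# Unnormalized graph Laplacian from a k-nearest-neighbor graph.
k = 8
D2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
W = np.zeros((n, n))
for i in range(n):
    nbrs = np.argsort(D2[i])[1 : k + 1]
    W[i, nbrs] = 1.0
W = np.maximum(W, W.T)  # symmetrize
L = np.diag(W.sum(1)) - W

# Graph Matérn-type covariance: C = sigma^2 * (I + tau^2 L)^(-s).
evals, evecs = np.linalg.eigh(L)
s, tau, sigma2 = 2.0, 1.0, 1.0
spec = sigma2 * (1.0 + tau**2 * evals) ** (-s)
C = (evecs * spec) @ evecs.T

# GP-UCB loop: query the maximizer of posterior mean + scaled posterior std.
noise = 1e-4
queried = [int(rng.integers(n))]
ys = [objective(queried[-1])]
for t in range(20):
    Q = np.array(queried)
    Kqq = C[np.ix_(Q, Q)] + noise * np.eye(len(Q))
    Kxq = C[:, Q]
    mean = Kxq @ np.linalg.solve(Kqq, np.array(ys))
    var = np.clip(
        np.diag(C) - np.einsum("ij,ji->i", Kxq, np.linalg.solve(Kqq, Kxq.T)),
        0.0, None,
    )
    beta = 2.0  # exploration weight (assumed constant for the sketch)
    nxt = int(np.argmax(mean + np.sqrt(beta * var)))
    queried.append(nxt)
    ys.append(objective(nxt))

best = max(ys)
print(f"best value found after {len(ys)} queries: {best:.3f}")
```

Note that the surrogate only ever sees pairwise distances between samples, matching the motivating setting where no full representation of the manifold is available.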
Related papers
- Implicit Manifold Gaussian Process Regression [49.0787777751317]
Gaussian process regression is widely used to provide well-calibrated uncertainty estimates.
It struggles with high-dimensional data because of the implicit low-dimensional manifold upon which the data actually lies.
In this paper we propose a technique capable of inferring implicit structure directly from data (labeled and unlabeled) in a fully differentiable way.
arXiv Detail & Related papers (2023-10-30T09:52:48Z)
- Posterior Contraction Rates for Matérn Gaussian Processes on Riemannian Manifolds [51.68005047958965]
We show that intrinsic Gaussian processes can achieve better performance in practice.
Our work shows that finer-grained analyses are needed to distinguish between different levels of data-efficiency.
arXiv Detail & Related papers (2023-09-19T20:30:58Z)
- Improving Diffusion Models for Inverse Problems using Manifold Constraints [55.91148172752894]
We show that current solvers throw the sample path off the data manifold, and hence the error accumulates.
To address this, we propose an additional correction term inspired by the manifold constraint.
We show that our method is superior to the previous methods both theoretically and empirically.
arXiv Detail & Related papers (2022-06-02T09:06:10Z)
- Gaussian Processes and Statistical Decision-making in Non-Euclidean Spaces [96.53463532832939]
We develop techniques for broadening the applicability of Gaussian processes.
We introduce a wide class of efficient approximations built from this viewpoint.
We develop a collection of Gaussian process models over non-Euclidean spaces.
arXiv Detail & Related papers (2022-02-22T01:42:57Z)
- Zeroth-Order Hybrid Gradient Descent: Towards A Principled Black-Box Optimization Framework [100.36569795440889]
This work studies zeroth-order (ZO) optimization, which does not require first-order gradient information.
We show that with a careful design of coordinate importance sampling, the proposed ZO optimization method is efficient both in iteration complexity and in function query cost.
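The core mechanism behind such methods is a gradient estimate built from finite-difference queries along a few importance-sampled coordinates. The following toy sketch illustrates the idea; the quadratic objective, sampling probabilities, step size, and reweighting scheme are illustrative assumptions, not this paper's algorithm.

```python
import numpy as np

rng = np.random.default_rng(1)

def f(x):
    # Black-box objective (ill-conditioned quadratic toy);
    # its gradient is never accessed directly.
    return 0.5 * np.sum(np.array([1.0, 10.0, 100.0]) * x**2)

def zo_coordinate_grad(f, x, probs, m=2, mu=1e-4):
    # Estimate the gradient from central finite differences along m
    # coordinates drawn with importance-sampling probabilities `probs`,
    # reweighting each sampled coordinate by 1 / (probs[i] * m).
    d = len(x)
    g = np.zeros(d)
    idx = rng.choice(d, size=m, replace=False, p=probs)
    for i in idx:
        e = np.zeros(d)
        e[i] = 1.0
        g[i] = (f(x + mu * e) - f(x - mu * e)) / (2 * mu) / (probs[i] * m)
    return g

x = np.array([1.0, 1.0, 1.0])
# Weights skewed toward the stiff coordinate (an assumed heuristic).
probs = np.array([0.1, 0.2, 0.7])
lr = 1e-3
for _ in range(500):
    x -= lr * zo_coordinate_grad(f, x, probs)
print(f"f(x) after ZO descent: {f(x):.4f}")
```

Each iteration costs only 2m function queries regardless of the dimension, which is the sense in which query cost is controlled.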
arXiv Detail & Related papers (2020-12-21T17:29:58Z)
- Pathwise Conditioning of Gaussian Processes [72.61885354624604]
Conventional approaches for simulating Gaussian process posteriors view samples as draws from marginal distributions of process values at finite sets of input locations.
This distribution-centric characterization leads to generative strategies that scale cubically in the size of the desired random vector.
We show how this pathwise interpretation of conditioning gives rise to a general family of approximations that lend themselves to efficiently sampling Gaussian process posteriors.
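The pathwise view rests on Matheron's rule: a posterior sample equals a prior sample plus a data-driven correction, so conditioning acts on function draws rather than on finite-dimensional marginals. A minimal sketch, with an assumed RBF kernel and toy data:

```python
import numpy as np

rng = np.random.default_rng(0)

def rbf(a, b, ls=0.5):
    # Squared-exponential kernel on 1-D inputs.
    return np.exp(-0.5 * ((a[:, None] - b[None, :]) / ls) ** 2)

# Training data and test grid (illustrative).
Xtr = np.array([-1.0, 0.0, 1.0])
ytr = np.sin(3 * Xtr)
Xte = np.linspace(-2, 2, 100)
noise = 1e-6

# One joint prior draw over train and test locations.
Xall = np.concatenate([Xtr, Xte])
Kall = rbf(Xall, Xall) + 1e-6 * np.eye(len(Xall))  # jitter for stability
f_all = np.linalg.cholesky(Kall) @ rng.standard_normal(len(Xall))
f_tr, f_te = f_all[:3], f_all[3:]

# Matheron update:
# f_post = f_prior + K(Xte,Xtr) (K(Xtr,Xtr) + noise I)^(-1) (y - f_tr - eps)
eps = np.sqrt(noise) * rng.standard_normal(3)
Ktr = rbf(Xtr, Xtr) + noise * np.eye(3)
post_sample = f_te + rbf(Xte, Xtr) @ np.linalg.solve(Ktr, ytr - f_tr - eps)

# The posterior sample nearly interpolates the (almost noiseless) data.
i0 = np.argmin(np.abs(Xte - 0.0))
print(post_sample[i0])
```

The cubic cost here sits in the training-size solve rather than in the size of the desired random vector, which is the scaling advantage the pathwise interpretation exposes.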
arXiv Detail & Related papers (2020-11-08T17:09:37Z)
- Intrinsic Gaussian Processes on Manifolds and Their Accelerations by Symmetry [9.773237080061815]
Existing methods primarily focus on low dimensional constrained domains for heat kernel estimation.
Our research proposes an intrinsic approach for constructing GPs on general manifolds.
Our methodology estimates the heat kernel by simulating Brownian motion sample paths using the exponential map.
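The idea of estimating a heat kernel from Brownian motion sample paths can be illustrated on the unit circle, where the heat kernel has a closed form (the wrapped Gaussian) to check against. The time horizon, step count, and binning below are illustrative assumptions, and the circle case sidesteps the exponential map the paper uses on general manifolds.

```python
import numpy as np

rng = np.random.default_rng(0)

t, n_paths, n_steps = 0.25, 200_000, 100
dt = t / n_steps

# Brownian motion on the circle: accumulate Gaussian increments mod 2*pi.
theta = np.zeros(n_paths)
for _ in range(n_steps):
    theta = (theta + np.sqrt(dt) * rng.standard_normal(n_paths)) % (2 * np.pi)

# Monte Carlo estimate of the heat kernel p_t(0, x): transition density.
hist, edges = np.histogram(theta, bins=64, range=(0, 2 * np.pi), density=True)
centers = 0.5 * (edges[:-1] + edges[1:])

# Closed-form wrapped Gaussian for comparison.
wrapped = sum(
    np.exp(-((centers - 2 * np.pi * k) ** 2) / (2 * t)) for k in range(-5, 6)
) / np.sqrt(2 * np.pi * t)

err = np.max(np.abs(hist - wrapped))
print(f"max abs error vs wrapped Gaussian: {err:.3f}")
```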
arXiv Detail & Related papers (2020-06-25T09:17:40Z)
- Adaptive quadrature schemes for Bayesian inference via active learning [0.0]
We propose novel adaptive quadrature schemes based on an active learning procedure.
We consider an interpolative approach for building a surrogate density, combining it with Monte Carlo sampling methods and other quadrature rules.
Numerical results show the advantage of the proposed approach, including a challenging inference problem in an astronomical model.
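A stripped-down version of the active-learning loop: fit an interpolative surrogate to the integrand, add the next node where the surrogate is least trusted, and integrate the surrogate. The piecewise-linear surrogate, the acquisition rule, and the Gaussian integrand are illustrative assumptions; the paper combines its surrogate with Monte Carlo sampling and other quadrature rules.

```python
import numpy as np

def f(x):
    # Integrand; its integral over [-3, 3] is close to sqrt(pi).
    return np.exp(-(x**2))

a, b = -3.0, 3.0
nodes = list(np.linspace(a, b, 5))
for _ in range(20):
    xs = np.sort(np.array(nodes))
    ys = f(xs)
    # Acquisition: bisect the interval with the largest |delta y| * delta x,
    # a crude proxy for where the linear surrogate is least trusted.
    gaps = np.abs(np.diff(ys)) * np.diff(xs)
    i = int(np.argmax(gaps))
    nodes.append(0.5 * (xs[i] + xs[i + 1]))

# Integrate the piecewise-linear surrogate (trapezoid rule on the nodes).
xs = np.sort(np.array(nodes))
ys = f(xs)
approx = float(np.sum(0.5 * (ys[1:] + ys[:-1]) * np.diff(xs)))
print(f"adaptive estimate: {approx:.5f}, reference: {np.sqrt(np.pi):.5f}")
```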
arXiv Detail & Related papers (2020-05-31T15:02:32Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the content (including all information) and is not responsible for any consequences.