Extrinsic Bayesian Optimizations on Manifolds
- URL: http://arxiv.org/abs/2212.13886v1
- Date: Wed, 21 Dec 2022 06:10:12 GMT
- Title: Extrinsic Bayesian Optimizations on Manifolds
- Authors: Yihao Fang, Mu Niu, Pokman Cheung, Lizhen Lin
- Abstract summary: We propose an extrinsic Bayesian optimization (eBO) framework for general optimization problems on manifolds.
Our approach is to employ extrinsic Gaussian processes by first embedding the manifold onto some higher dimensional Euclidean space.
This leads to efficient and scalable algorithms for optimization over complex manifolds.
- Score: 1.3477333339913569
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: We propose an extrinsic Bayesian optimization (eBO) framework for general
optimization problems on manifolds. Bayesian optimization algorithms build a
surrogate of the objective function by employing Gaussian processes and
quantify the uncertainty in that surrogate by deriving an acquisition function.
This acquisition function represents the probability of improvement based on
the kernel of the Gaussian process, which guides the search in the optimization
process. The critical challenge for designing Bayesian optimization algorithms
on manifolds lies in the difficulty of constructing valid covariance kernels
for Gaussian processes on general manifolds. Our approach is to employ
extrinsic Gaussian processes by first embedding the manifold onto some higher
dimensional Euclidean space via equivariant embeddings and then constructing a
valid covariance kernel on the image manifold after the embedding. This leads
to efficient and scalable algorithms for optimization over complex manifolds.
Simulation studies and real data analyses are carried out to demonstrate the
utility of our eBO framework by applying the eBO to various optimization
problems over manifolds such as the sphere, the Grassmannian, and the manifold
of positive definite matrices.
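To make the extrinsic idea concrete, below is a minimal sketch on the unit sphere S^2, whose inclusion into R^3 already serves as an (equivariant) embedding. A squared-exponential kernel is placed on the embedded points, and a probability-of-improvement acquisition guides the search over candidates sampled on the manifold. All names, hyperparameters, and the candidate-set strategy are illustrative assumptions, not the paper's implementation.

```python
import numpy as np
from scipy.stats import norm

def embed(x):
    # On the unit sphere S^2 the inclusion into R^3 is itself an
    # (equivariant) embedding, so the map is the identity here.
    return x

def sq_exp_kernel(A, B, ell=0.5):
    # Squared-exponential kernel on the embedded points: it is valid on
    # all of R^3, hence also on the image of the manifold.
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / ell**2)

def gp_posterior(X, y, Xs, noise=1e-6):
    K = sq_exp_kernel(embed(X), embed(X)) + noise * np.eye(len(X))
    Ks = sq_exp_kernel(embed(X), embed(Xs))
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    v = np.linalg.solve(L, Ks)
    mu, var = Ks.T @ alpha, 1.0 - (v**2).sum(0)   # prior variance is 1
    return mu, np.maximum(var, 1e-12)

def sample_sphere(n, rng):
    z = rng.normal(size=(n, 3))
    return z / np.linalg.norm(z, axis=1, keepdims=True)

rng = np.random.default_rng(0)
f = lambda x: x[:, 2]                 # toy objective, minimized at the south pole
X = sample_sphere(5, rng)
y = f(X)
for _ in range(20):
    cand = sample_sphere(500, rng)    # candidate points drawn on the manifold
    mu, var = gp_posterior(X, y, cand)
    pi = norm.cdf((y.min() - mu) / np.sqrt(var))  # probability of improvement
    x_next = cand[np.argmax(pi)]
    X = np.vstack([X, x_next])
    y = np.append(y, f(x_next[None, :]))
print("best value found:", y.min())   # approaches -1
```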
Related papers
- Sample-efficient Bayesian Optimisation Using Known Invariances [56.34916328814857]
We show that vanilla and constrained BO algorithms are inefficient when optimising invariant objectives.
We derive a bound on the maximum information gain of these invariant kernels.
We use our method to design a current drive system for a nuclear fusion reactor, finding a high-performance solution.
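A standard way to build a kernel with a known finite invariance is to average a base kernel over the group action, k_G(x, y) = (1/|G|) * sum over g in G of k(g x, y). The sketch below does this for the two-element sign-flip group; it is a generic construction, not necessarily the cited paper's exact kernel.

```python
import numpy as np

def base_kernel(a, b, ell=1.0):
    return np.exp(-0.5 * np.sum((a - b) ** 2) / ell**2)

def invariant_kernel(a, b):
    # Average the base kernel over a finite symmetry group.
    # Here: the two-element group {+I, -I}, i.e. objectives with f(x) = f(-x).
    group = [np.eye(len(a)), -np.eye(len(a))]
    return np.mean([base_kernel(g @ a, b) for g in group])

a, b = np.array([1.0, 0.5]), np.array([-1.0, -0.5])
print(invariant_kernel(a, b), invariant_kernel(-a, b))  # equal by construction
```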
arXiv Detail & Related papers (2024-10-22T12:51:46Z) - Global Optimization of Gaussian Process Acquisition Functions Using a Piecewise-Linear Kernel Approximation [2.3342885570554652]
We introduce a piecewise-linear approximation for Gaussian process kernels and a corresponding MIQP representation for acquisition functions.
We empirically demonstrate the framework on synthetic functions, constrained benchmarks, and hyperparameter tuning tasks.
arXiv Detail & Related papers (2024-10-22T10:56:52Z) - Enhancing Gaussian Process Surrogates for Optimization and Posterior Approximation via Random Exploration [2.984929040246293]
We propose novel noise-free Bayesian optimization strategies that rely on a random exploration step to enhance the accuracy of Gaussian process surrogate models.
The new algorithms retain the ease of implementation of the classical GP-UCB, but an additional exploration step facilitates their convergence.
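The pattern described, classical GP-UCB with an added random exploration step, can be sketched on a 1-D toy problem as below; the 20% exploration schedule, kernel, and beta rule are illustrative assumptions rather than the paper's exact algorithm.

```python
import numpy as np

def rbf(A, B, ell=0.2):
    return np.exp(-0.5 * (A[:, None] - B[None, :]) ** 2 / ell**2)

def gp(X, y, Xs, noise=1e-4):
    # Small jitter keeps K invertible even if points repeat.
    K = rbf(X, X) + noise * np.eye(len(X))
    Ks = rbf(X, Xs)
    Kinv = np.linalg.inv(K)                     # fine at this toy scale
    mu = Ks.T @ Kinv @ y
    var = 1.0 - np.einsum('ij,ik,kj->j', Ks, Kinv, Ks)
    return mu, np.maximum(var, 1e-12)

f = lambda x: np.sin(3 * x) + 0.5 * np.cos(7 * x)   # toy objective to maximize
rng = np.random.default_rng(1)
grid = np.linspace(0.0, 1.0, 400)
X = rng.uniform(0.0, 1.0, 3)
y = f(X)
for t in range(30):
    if rng.random() < 0.2:      # the extra random exploration step
        x_next = rng.uniform(0.0, 1.0)
    else:                       # a classical GP-UCB step
        mu, var = gp(X, y, grid)
        beta = 2.0 * np.log(10.0 * (t + 1) ** 2)
        x_next = grid[np.argmax(mu + np.sqrt(beta * var))]
    X, y = np.append(X, x_next), np.append(y, f(x_next))
print("best value found:", y.max())
```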
arXiv Detail & Related papers (2024-01-30T14:16:06Z) - Analyzing and Enhancing the Backward-Pass Convergence of Unrolled
Optimization [50.38518771642365]
The integration of constrained optimization models as components in deep networks has led to promising advances on many specialized learning tasks.
A central challenge in this setting is backpropagation through the solution of an optimization problem, which often lacks a closed form.
This paper provides theoretical insights into the backward pass of unrolled optimization, showing that it is equivalent to the solution of a linear system by a particular iterative method.
A system called Folded Optimization is proposed to construct more efficient backpropagation rules from unrolled solver implementations.
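The stated equivalence can be illustrated on a toy quadratic inner problem: differentiating through unrolled gradient descent agrees with solving an implicit linear system at the fixed point. The example below is a hedged sketch of that identity, not the Folded Optimization system itself.

```python
import numpy as np

A = np.array([[3.0, 0.5], [0.5, 2.0]])   # SPD -> strongly convex inner problem
theta = np.array([1.0, -1.0])
alpha = 0.2

def solve_unrolled(theta, steps=200):
    # Inner problem min_x 0.5 x^T A x - theta^T x, solved by unrolled GD:
    # x_{k+1} = T(x_k) = x_k - alpha * (A x_k - theta).
    x = np.zeros(2)
    for _ in range(steps):
        x = x - alpha * (A @ x - theta)
    return x

# Backward pass via the implicit linear system: at the fixed point x* = T(x*),
# dx*/dtheta = (I - dT/dx)^{-1} dT/dtheta = (alpha A)^{-1} (alpha I) = A^{-1}.
J_implicit = np.linalg.solve(np.eye(2) - (np.eye(2) - alpha * A),
                             alpha * np.eye(2))

# A finite-difference Jacobian of the unrolled solver agrees with it.
eps = 1e-6
J_fd = np.column_stack([
    (solve_unrolled(theta + eps * e) - solve_unrolled(theta - eps * e)) / (2 * eps)
    for e in np.eye(2)
])
print(np.allclose(J_implicit, J_fd, atol=1e-4))   # True
```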
arXiv Detail & Related papers (2023-12-28T23:15:18Z) - An Empirical Evaluation of Zeroth-Order Optimization Methods on
AI-driven Molecule Optimization [78.36413169647408]
We study the effectiveness of various ZO optimization methods for optimizing molecular objectives.
We show the advantages of ZO sign-based gradient descent (ZO-signGD).
We demonstrate the potential effectiveness of ZO optimization methods on widely used benchmark tasks from the Guacamol suite.
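A minimal sketch of the ZO-signGD idea follows: estimate gradients from function-value differences along random directions, then step with only their sign. Step sizes and the number of directions are illustrative, and the paper's molecular objectives are replaced by a smooth toy function.

```python
import numpy as np

def zo_sign_gd(f, x0, steps=200, mu=1e-3, lr=0.05, n_dirs=20, seed=0):
    # Zeroth-order sign-based gradient descent: query only function values,
    # form a directional-difference gradient estimate, step with its sign.
    rng = np.random.default_rng(seed)
    x = x0.copy()
    for _ in range(steps):
        g = np.zeros_like(x)
        for _ in range(n_dirs):
            u = rng.normal(size=x.shape)
            g += (f(x + mu * u) - f(x - mu * u)) / (2 * mu) * u
        x -= lr * np.sign(g / n_dirs)
    return x

f = lambda x: np.sum((x - 1.0) ** 2)   # toy smooth objective
x = zo_sign_gd(f, np.zeros(5))
print(f(x))   # near 0, within a step-size neighborhood of the optimum
```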
arXiv Detail & Related papers (2022-10-27T01:58:10Z) - On a class of geodesically convex optimization problems solved via
Euclidean MM methods [50.428784381385164]
We show how a class of geodesically convex problems, including several arising in statistics and machine learning, can be written as a difference of Euclidean convex functions and solved with Euclidean MM methods.
Ultimately, we hope this broadens the reach of such techniques.
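The Euclidean MM mechanism behind such difference-of-convex (DC) rewritings can be sketched on a scalar toy problem: linearize the concave part at the current iterate and minimize the resulting convex majorizer in closed form. The geodesic-to-Euclidean rewriting itself is the paper's contribution and is not reproduced here.

```python
import numpy as np

# DC view: minimize f(x) = g(x) - h(x) with g, h convex, via the MM step
# x_{k+1} = argmin_x g(x) - s * x, where s is a subgradient of h at x_k.
# Toy instance: g(x) = x^2, h(x) = 2|x|, so f has minima at x = +/-1, f = -1.
g = lambda x: x**2
h = lambda x: 2 * abs(x)
f = lambda x: g(x) - h(x)

x = 0.5
for _ in range(10):
    s = 2 * np.sign(x)   # a subgradient of h at the current iterate
    x = s / 2            # argmin_x x^2 - s*x, available in closed form
print(x, f(x))            # 1.0, -1.0
```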
arXiv Detail & Related papers (2022-06-22T23:57:40Z) - Geometry-aware Bayesian Optimization in Robotics using Riemannian
Matérn Kernels [64.62221198500467]
We show how to implement geometry-aware kernels for Bayesian optimization.
This technique can be used for control parameter tuning, parametric policy adaptation, and structure design in robotics.
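Riemannian Matérn kernels on the sphere admit a truncated spectral form: Matérn-type weights on the Laplace-Beltrami eigenvalues n(n+1), paired with Legendre polynomials of the inner product. The sketch below follows that general construction; the normalization constants are illustrative and vary across references.

```python
import numpy as np
from scipy.special import eval_legendre

def matern_sphere(x, y, nu=1.5, kappa=0.5, n_terms=30):
    # Truncated spectral series of a Riemannian Matern kernel on S^2:
    # weights (2n+1)/(4*pi) * (2*nu/kappa^2 + n(n+1))^{-(nu + d/2)}, d = 2,
    # against Legendre polynomials P_n of the inner product x.y.
    n = np.arange(n_terms)
    w = (2 * n + 1) / (4 * np.pi) * (2 * nu / kappa**2 + n * (n + 1)) ** (-(nu + 1))
    return np.sum(w * eval_legendre(n, np.dot(x, y)))

north = np.array([0.0, 0.0, 1.0])
tilted = np.array([np.sin(0.3), 0.0, np.cos(0.3)])
print(matern_sphere(north, north), matern_sphere(north, tilted))
# the kernel value decays with the geodesic distance between the inputs
```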
arXiv Detail & Related papers (2021-11-02T09:47:22Z) - Bayesian Variational Optimization for Combinatorial Spaces [0.0]
Broad applications include the study of molecules, proteins, DNA, device structures and quantum circuit designs.
A method for optimization over categorical spaces is needed to find optimal or Pareto-optimal solutions.
We introduce a variational Bayesian optimization method that combines variational optimization and continuous relaxations.
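A generic sketch of variational optimization over a categorical space: minimize the expectation of f(x) under a softmax-parameterized categorical distribution p_theta, using a score-function (REINFORCE) gradient. This is a simplified stand-in for the paper's estimator, which additionally exploits continuous relaxations.

```python
import numpy as np

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

f_vals = np.array([3.0, 1.0, 4.0, 0.5, 2.0])   # objective value per category
theta = np.zeros(5)
rng = np.random.default_rng(0)
lr, batch = 0.5, 64
for _ in range(200):
    p = softmax(theta)
    xs = rng.choice(5, size=batch, p=p)
    fx = f_vals[xs]
    grad = np.zeros(5)
    for x, v in zip(xs, fx):
        gl = -p                     # grad of log p_theta(x): one_hot(x) - p
        gl[x] += 1.0
        grad += (v - fx.mean()) * gl   # mean baseline reduces variance
    theta -= lr * grad / batch
print(np.argmax(softmax(theta)))    # should select 3, the minimizing category
```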
arXiv Detail & Related papers (2020-11-03T20:56:13Z) - Accelerated Algorithms for Convex and Non-Convex Optimization on
Manifolds [9.632674803757475]
We propose a scheme for solving convex and non-convex optimization problems on manifolds.
Our proposed algorithm adapts to the level of complexity in the objective function.
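The basic template such methods accelerate, Riemannian gradient descent with a tangent-space projection and a retraction, can be sketched on the unit sphere as below; the cited paper's adaptive and accelerated steps are not reproduced.

```python
import numpy as np

def riemannian_gd_sphere(grad_f, x0, lr=0.1, steps=100):
    # Plain Riemannian gradient descent on the unit sphere: project the
    # Euclidean gradient onto the tangent space at x, step, then retract
    # back to the sphere by renormalizing.
    x = x0 / np.linalg.norm(x0)
    for _ in range(steps):
        g = grad_f(x)
        g_tan = g - np.dot(g, x) * x      # tangent-space projection
        x = x - lr * g_tan
        x = x / np.linalg.norm(x)         # retraction onto the sphere
    return x

# Toy: minimize f(x) = x^T A x on the sphere -> smallest-eigenvalue eigenvector.
A = np.diag([3.0, 2.0, 1.0])
x = riemannian_gd_sphere(lambda v: 2 * A @ v, np.array([1.0, 1.0, 1.0]))
print(x)   # close to +/- e_3
```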
arXiv Detail & Related papers (2020-10-18T02:48:22Z) - Global Optimization of Gaussian processes [52.77024349608834]
We propose a reduced-space formulation with Gaussian processes trained on few data points.
The approach also leads to significantly smaller and computationally cheaper subproblems for lower bounding.
In total, the proposed method reduces the time to convergence by orders of magnitude.
arXiv Detail & Related papers (2020-05-21T20:59:11Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences.