Dimensionality Reduction Techniques for Global Bayesian Optimisation
- URL: http://arxiv.org/abs/2412.09183v1
- Date: Thu, 12 Dec 2024 11:27:27 GMT
- Title: Dimensionality Reduction Techniques for Global Bayesian Optimisation
- Authors: Luo Long, Coralia Cartis, Paz Fink Shustin
- Abstract summary: We explore Latent Space Bayesian Optimisation (LSBO), which applies dimensionality reduction to perform BO in a reduced-dimensional subspace.
We employ Variational Autoencoders (VAEs) to manage more complex data structures and general dimensionality reduction tasks.
We suggest a few key corrections to the implementation of Grosnit et al. (2021), originally designed for tasks such as molecule generation, and reformulate the algorithm for broader optimisation purposes.
- Score: 1.433758865948252
- Abstract: Bayesian Optimisation (BO) is a state-of-the-art global optimisation technique for black-box problems where derivative information is unavailable and sample efficiency is crucial. However, improving the general scalability of BO has proved challenging. Here, we explore Latent Space Bayesian Optimisation (LSBO), which applies dimensionality reduction to perform BO in a reduced-dimensional subspace. While early LSBO methods used (linear) random projections (Wang et al., 2013), we employ Variational Autoencoders (VAEs) to manage more complex data structures and general dimensionality reduction tasks. Building on Grosnit et al. (2021), we analyse the VAE-based LSBO framework, focusing on VAE retraining and deep metric loss. We suggest a few key corrections to their implementation, originally designed for tasks such as molecule generation, and reformulate the algorithm for broader optimisation purposes. Our numerical results show that structured latent manifolds improve BO performance. Additionally, we examine the use of the Matérn-5/2 kernel for Gaussian Processes in this LSBO context. We also integrate Sequential Domain Reduction (SDR), a standard efficiency strategy in global optimisation, into BO. SDR is implemented in a GPU-based environment using BoTorch, both in the original space and in the VAE-generated latent space, marking the first application of SDR within LSBO.
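The abstract describes the LSBO loop only at a high level, so the following is a minimal sketch of one plausible shape of it in BoTorch: fit a Matérn-5/2 GP in the latent space, optimise an acquisition function there, decode the candidate back to the ambient space for evaluation, and apply an SDR-style contraction of the latent bounds around the incumbent. The `encode`/`decode` pair, the toy objective, the choice of Expected Improvement, and the 0.95 shrink factor are illustrative assumptions, not the authors' implementation.

```python
import torch
from botorch.models import SingleTaskGP
from botorch.fit import fit_gpytorch_mll
from botorch.acquisition import ExpectedImprovement
from botorch.optim import optimize_acqf
from gpytorch.mlls import ExactMarginalLogLikelihood
from gpytorch.kernels import MaternKernel, ScaleKernel

torch.set_default_dtype(torch.float64)  # BoTorch recommends double precision

# Hypothetical stand-ins for a trained VAE; the paper trains and retrains a real one.
def encode(x):           # ambient (4-D) -> latent (2-D)
    return x[..., :2]

def decode(z):           # latent (2-D) -> ambient (4-D)
    return torch.cat([z, torch.zeros_like(z)], dim=-1)

def objective(x):        # toy black-box objective, evaluated in the ambient space
    return -(x ** 2).sum(dim=-1, keepdim=True)

d = 2                                                    # latent dimension
bounds = torch.stack([-3.0 * torch.ones(d), 3.0 * torch.ones(d)])
X0 = 2 * torch.rand(5, 4) - 1                            # initial ambient design
Z, Y = encode(X0), objective(X0)                         # map the data into latent space

for _ in range(20):
    # Matern-5/2 GP in the latent space, as examined in the paper
    gp = SingleTaskGP(Z, Y, covar_module=ScaleKernel(MaternKernel(nu=2.5)))
    fit_gpytorch_mll(ExactMarginalLogLikelihood(gp.likelihood, gp))
    acq = ExpectedImprovement(gp, best_f=Y.max())
    z_next, _ = optimize_acqf(acq, bounds=bounds, q=1, num_restarts=5, raw_samples=64)
    Z = torch.cat([Z, z_next])
    Y = torch.cat([Y, objective(decode(z_next))])        # evaluate via the decoder

    # SDR-style step (assumed form): contract the latent box around the incumbent.
    centre, width = Z[Y.argmax()], 0.95 * (bounds[1] - bounds[0])
    bounds = torch.stack([centre - width / 2, centre + width / 2])

print("best value found:", Y.max().item())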
Related papers
- BOIDS: High-dimensional Bayesian Optimization via Incumbent-guided Direction Lines and Subspace Embeddings [14.558601519561721]
We introduce BOIDS, a novel high-dimensional BO algorithm that guides optimization by a sequence of one-dimensional direction lines.
We also propose an adaptive selection technique to identify the most promising lines for each round of line-based optimization.
Our experimental results show that BOIDS outperforms state-of-the-art baselines on various synthetic and real-world benchmark problems.
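As a bare-bones illustration of the line-based idea (not BOIDS itself, whose line selection and per-line optimisation are more sophisticated), the sketch below probes random one-dimensional lines through the incumbent; all names and the grid search are illustrative only.

```python
import numpy as np

def line_search_through_incumbent(f, x_best, n_lines=5, half_len=1.0, n_grid=51, rng=None):
    """Schematic line-based search: probe random 1-D direction lines
    through the incumbent and return the best point found (minimisation)."""
    rng = np.random.default_rng(rng)
    best_x, best_y = x_best, f(x_best)
    ts = np.linspace(-half_len, half_len, n_grid)
    for _ in range(n_lines):
        u = rng.standard_normal(x_best.size)
        u /= np.linalg.norm(u)              # unit direction through the incumbent
        for t in ts:
            x = x_best + t * u              # point on the line
            y = f(x)
            if y < best_y:
                best_x, best_y = x, y
    return best_x, best_y

# toy usage on a 10-D quadratic
f = lambda x: np.sum((x - 0.3) ** 2)
x, y = line_search_through_incumbent(f, np.zeros(10), rng=0)
```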
arXiv Detail & Related papers (2024-12-17T13:51:24Z)
- Faster WIND: Accelerating Iterative Best-of-$N$ Distillation for LLM Alignment [81.84950252537618]
This paper reveals a unified game-theoretic connection between iterative BOND and self-play alignment.
We establish a novel framework, WIN rate Dominance (WIND), with a series of efficient algorithms for regularized win rate dominance optimization.
arXiv Detail & Related papers (2024-10-28T04:47:39Z)
- Sample-efficient Bayesian Optimisation Using Known Invariances [56.34916328814857]
We show that vanilla and constrained BO algorithms are inefficient when optimising invariant objectives.
We derive a bound on the maximum information gain of these invariant kernels.
We use our method to design a current drive system for a nuclear fusion reactor, finding a high-performance solution.
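A standard way to build a kernel that respects a known finite invariance is to average a base kernel over the symmetry group; the paper's invariant kernels may differ in detail, so the sketch below is a hedged, generic construction.

```python
import numpy as np

def rbf(x, y, ls=1.0):
    return np.exp(-np.sum((x - y) ** 2) / (2 * ls ** 2))

def invariant_kernel(x, y, group, base=rbf):
    """Average a base kernel over a finite group G:
    k_G(x, y) = (1/|G|^2) * sum_{g,h in G} k(g(x), h(y)).
    This stays positive semi-definite and is constant on group orbits,
    matching an objective known to satisfy f(g(x)) = f(x)."""
    return np.mean([base(g(x), h(y)) for g in group for h in group])

# Example: objective known to satisfy f(x) = f(-x) (sign-flip invariance).
group = [lambda x: x, lambda x: -x]
x, y = np.array([1.0, -0.5]), np.array([-1.0, 0.5])
print(invariant_kernel(x, y, group))   # equals invariant_kernel(-x, y, group)
```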
arXiv Detail & Related papers (2024-10-22T12:51:46Z)
- Learning Regions of Interest for Bayesian Optimization with Adaptive Level-Set Estimation [84.0621253654014]
We propose a framework, called BALLET, which adaptively filters for a high-confidence region of interest.
We show theoretically that BALLET can efficiently shrink the search space, and can exhibit a tighter regret bound than standard BO.
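A generic level-set filter in the spirit of this summary (not BALLET's exact rule) keeps only candidates whose upper confidence bound clears the best lower confidence bound, i.e. points that could still contain the maximiser; `beta` and the toy posterior are assumptions.

```python
import numpy as np

def roi_filter(mu, sigma, beta=2.0):
    """Schematic region-of-interest filter from a GP posterior:
    retain candidates with UCB above the best LCB over the candidate set."""
    ucb = mu + beta * sigma
    lcb = mu - beta * sigma
    return ucb >= lcb.max()        # boolean mask over candidates

# toy usage with a hypothetical GP posterior over 1000 candidates
rng = np.random.default_rng(0)
mu, sigma = rng.normal(size=1000), rng.uniform(0.1, 1.0, size=1000)
mask = roi_filter(mu, sigma)
print(f"kept {mask.sum()} of {mask.size} candidates")
```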
arXiv Detail & Related papers (2023-07-25T09:45:47Z)
- Relaxing the Additivity Constraints in Decentralized No-Regret High-Dimensional Bayesian Optimization [5.316089560623732]
We relax the restrictive assumptions on the additive structure of $f$ without weakening the guarantees of the acquisition function.
We propose an optimal decentralized BO algorithm that achieves very competitive performance against state-of-the-art BO algorithms.
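The additive structure these methods build on is typically modelled with a kernel that sums sub-kernels over groups of coordinates; the numpy illustration below shows only that modelling assumption, not the paper's decentralized algorithm, and the grouping is hypothetical.

```python
import numpy as np

def rbf(a, b, ls=1.0):
    return np.exp(-np.sum((a - b) ** 2) / (2 * ls ** 2))

def additive_kernel(x, y, groups):
    """Additive GP kernel for f(x) = sum_i f_i(x[G_i]):
    one sub-kernel per coordinate group, summed."""
    return sum(rbf(x[g], y[g]) for g in groups)

# toy usage: a 6-D problem assumed to decompose into three 2-D components
groups = [np.array([0, 1]), np.array([2, 3]), np.array([4, 5])]
x, y = np.zeros(6), np.ones(6)
print(additive_kernel(x, y, groups))
```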
arXiv Detail & Related papers (2023-05-31T13:26:49Z)
- Model-based Causal Bayesian Optimization [78.120734120667]
We propose model-based causal Bayesian optimization (MCBO).
MCBO learns a full system model instead of only modeling intervention-reward pairs.
Unlike in standard Bayesian optimization, our acquisition function cannot be evaluated in closed form.
arXiv Detail & Related papers (2022-11-18T14:28:21Z)
- High Dimensional Bayesian Optimization with Kernel Principal Component Analysis [4.33419118449588]
The kernel PCA-assisted BO (KPCA-BO) algorithm embeds a non-linear sub-manifold in the search space and performs BO on this sub-manifold.
We compare the performance of KPCA-BO to the vanilla BO and PCA-BO on the multi-modal problems of the COCO/BBOB benchmark suite.
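The sketch below shows only the embedding-and-pre-image mechanics such a method relies on, using scikit-learn's KernelPCA, not the full KPCA-BO loop; the data, kernel parameters, and candidate choice are placeholders.

```python
import numpy as np
from sklearn.decomposition import KernelPCA

# Learn a non-linear low-dimensional sub-manifold from evaluated points,
# search in the reduced space, and map a candidate back via the pre-image map.
rng = np.random.default_rng(0)
X = rng.uniform(-5, 5, size=(50, 20))             # evaluated high-dim points

kpca = KernelPCA(n_components=2, kernel="rbf", gamma=0.05,
                 fit_inverse_transform=True)       # enables inverse_transform
Z = kpca.fit_transform(X)                          # 2-D embedding of the data

z_candidate = Z.mean(axis=0) + 0.1                 # a point chosen in reduced space
x_candidate = kpca.inverse_transform(z_candidate[None, :])  # approximate pre-image
print(x_candidate.shape)                           # (1, 20): back in the search space
```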
arXiv Detail & Related papers (2022-04-28T20:09:02Z)
- ES-Based Jacobian Enables Faster Bilevel Optimization [53.675623215542515]
Bilevel optimization (BO) has arisen as a powerful tool for solving many modern machine learning problems.
Existing gradient-based methods require second-order derivative approximations via Jacobian- and/or Hessian-vector computations.
We propose a novel BO algorithm, which adopts an Evolution Strategies (ES)-based method to approximate the response Jacobian matrix in the hypergradient of BO.
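The generic antithetic ES estimator below illustrates how a Jacobian can be approximated from zeroth-order queries alone; it is a standard construction and not necessarily the paper's exact one, and the sample count and smoothing scale are assumptions.

```python
import numpy as np

def es_jacobian(g, x, n_samples=100, sigma=1e-2, rng=None):
    """Antithetic evolution-strategies estimate of the Jacobian of g at x:
    J ~= E_u[ (g(x + s*u) - g(x - s*u)) / (2*s) * u^T ],  u ~ N(0, I).
    A zeroth-order stand-in for the Jacobian-vector products that
    gradient-based bilevel methods compute with second derivatives."""
    rng = np.random.default_rng(rng)
    n, m = x.size, np.atleast_1d(g(x)).size
    J = np.zeros((m, n))
    for _ in range(n_samples):
        u = rng.standard_normal(n)
        diff = (g(x + sigma * u) - g(x - sigma * u)) / (2 * sigma)
        J += np.outer(diff, u) / n_samples
    return J

# toy check against a known linear map g(x) = A x  =>  Jacobian is A
A = np.array([[1.0, 2.0], [0.0, -1.0]])
print(es_jacobian(lambda x: A @ x, np.zeros(2), n_samples=5000, rng=0))
```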
arXiv Detail & Related papers (2021-10-13T19:36:50Z)
- BOSS: Bayesian Optimization over String Spaces [15.630421177117634]
This article develops a Bayesian optimization (BO) method which acts directly over raw strings.
It proposes the first uses of string kernels and genetic algorithms within BO loops.
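The summary mentions string kernels inside BO loops; a minimal member of that family is the n-gram spectrum kernel sketched below (BOSS itself uses a more expressive kernel), which lets a GP compare raw strings directly.

```python
from collections import Counter

def spectrum_kernel(s, t, n=3):
    """n-gram spectrum kernel: inner product of n-gram count vectors.
    A simple string kernel; a GP equipped with it can act directly
    over raw strings rather than a fixed-length encoding."""
    ngrams = lambda u: Counter(u[i:i + n] for i in range(len(u) - n + 1))
    a, b = ngrams(s), ngrams(t)
    return sum(a[g] * b[g] for g in a.keys() & b.keys())

print(spectrum_kernel("bayesopt", "bayesian"))  # shared 3-grams: "bay", "aye", "yes"
```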
arXiv Detail & Related papers (2020-10-02T13:18:27Z)
- Sub-linear Regret Bounds for Bayesian Optimisation in Unknown Search Spaces [63.22864716473051]
We propose a novel BO algorithm which expands (and shifts) the search space over iterations.
We show theoretically that for both our algorithms, the cumulative regret grows at sub-linear rates.
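One plausible expand-and-shift rule, shown below purely for intuition: grow the box by a fixed factor each round and recentre it on the incumbent, so an optimum outside the initial box is eventually enclosed. The growth factor is an assumption, not the paper's schedule.

```python
import numpy as np

def expand_and_shift(bounds, x_best, growth=1.2):
    """Schematic expand-and-shift step: enlarge the search box by a
    fixed factor and recentre it on the incumbent."""
    lo, hi = bounds
    half = growth * (hi - lo) / 2
    return np.stack([x_best - half, x_best + half])

# toy usage: a 2-D box expanded around an incumbent outside the initial box
bounds = np.array([[0.0, 0.0], [1.0, 1.0]])
x_best = np.array([0.9, 1.1])
print(expand_and_shift(bounds, x_best))
```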
arXiv Detail & Related papers (2020-09-05T14:24:40Z)
- High Dimensional Bayesian Optimization Assisted by Principal Component Analysis [4.030481609048958]
We introduce a novel PCA-assisted BO (PCA-BO) algorithm for high-dimensional numerical optimization problems.
We show that PCA-BO effectively reduces the CPU time incurred on high-dimensional problems and maintains the convergence rate on problems with an adequate global structure.
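The sketch below shows only the basic reduce-and-lift mechanics a PCA-assisted method rests on (fit PCA to evaluated points, search in the subspace, lift a candidate back), not the full PCA-BO procedure; the data and candidate choice are placeholders.

```python
import numpy as np
from sklearn.decomposition import PCA

# Fit a linear subspace to the evaluated points, pick a candidate in the
# reduced space, and map it back to the original search space.
rng = np.random.default_rng(0)
X = rng.uniform(-5, 5, size=(40, 50))             # evaluated points in 50-D

pca = PCA(n_components=5).fit(X)                   # linear 5-D subspace
Z = pca.transform(X)

z_candidate = Z.mean(axis=0) + 0.1                 # a point chosen in reduced space
x_candidate = pca.inverse_transform(z_candidate.reshape(1, -1))  # lift back to 50-D
print(x_candidate.shape)                           # (1, 50)
```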
arXiv Detail & Related papers (2020-07-02T07:03:27Z)