PMBO: Enhancing Black-Box Optimization through Multivariate Polynomial Surrogates
- URL: http://arxiv.org/abs/2403.07485v1
- Date: Tue, 12 Mar 2024 10:21:21 GMT
- Title: PMBO: Enhancing Black-Box Optimization through Multivariate Polynomial Surrogates
- Authors: Janina Schreiber, Pau Batlle, Damar Wicaksono, Michael Hecht
- Abstract summary: We introduce a surrogate-based black-box optimization method, termed Polynomial-model-based optimization (PMBO).
We compare the performance of PMBO with several optimization methods on a set of analytic test functions.
Remarkably, PMBO performs comparably to state-of-the-art evolutionary algorithms.
- Score: 0.0
- License: http://creativecommons.org/licenses/by-sa/4.0/
- Abstract: We introduce a surrogate-based black-box optimization method, termed
Polynomial-model-based optimization (PMBO). The algorithm alternates polynomial
approximation with Bayesian optimization steps, using Gaussian processes to
model the error between the objective and its polynomial fit. We describe the
algorithmic design of PMBO and compare its performance with that of several
optimization methods on a set of analytic test functions.
The results show that PMBO outperforms classic Bayesian optimization and is
robust with respect to the choice of its correlation function family and its
hyper-parameter settings, which, in contrast, must be carefully tuned in
classic Bayesian optimization. Remarkably, PMBO performs comparably to
state-of-the-art evolutionary algorithms such as the Covariance Matrix
Adaptation Evolution Strategy (CMA-ES). This finding suggests that PMBO is a
strong choice among surrogate-based optimization methods for low-dimensional
optimization problems. Moreover, the simplicity of polynomials invites
interpretation and analysis of the inferred surrogate model, providing a
macroscopic perspective on the landscape of the objective function.
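As a concrete illustration of the alternation described in the abstract, the following Python sketch fits a polynomial surrogate to the observed data, models the residual with a Gaussian process, and proposes the next evaluation point by minimizing a lower-confidence-bound acquisition. It is a minimal sketch under assumed choices (ridge-regularized least-squares polynomial fit, RBF kernel, random candidate search, LCB acquisition); the paper's actual degrees, kernels, and acquisition strategy may differ.

```python
# Minimal PMBO-style loop: polynomial surrogate for the global trend,
# GP on the residual, LCB acquisition. All specific settings here
# (degree, kernel, candidate sampling) are illustrative assumptions.
import numpy as np
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import Ridge
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

def pmbo_sketch(objective, bounds, n_init=8, n_iter=30, degree=3, kappa=2.0):
    dim = bounds.shape[0]
    rng = np.random.default_rng(0)
    # Initial design (uniform random, for simplicity).
    X = rng.uniform(bounds[:, 0], bounds[:, 1], size=(n_init, dim))
    y = np.array([objective(x) for x in X])
    poly = PolynomialFeatures(degree=degree)
    for _ in range(n_iter):
        # 1) Polynomial surrogate of the objective (ridge-regularized fit).
        Phi = poly.fit_transform(X)
        lin = Ridge(alpha=1e-6).fit(Phi, y)
        # 2) GP models the error between the objective and its polynomial fit.
        resid = y - lin.predict(Phi)
        gp = GaussianProcessRegressor(kernel=RBF(length_scale=1.0),
                                      normalize_y=True).fit(X, resid)
        # 3) Minimize a lower confidence bound over random candidates.
        cand = rng.uniform(bounds[:, 0], bounds[:, 1], size=(2048, dim))
        mu_r, sd_r = gp.predict(cand, return_std=True)
        lcb = lin.predict(poly.transform(cand)) + mu_r - kappa * sd_r
        x_next = cand[np.argmin(lcb)]
        # 4) Evaluate the true objective and augment the data set.
        X = np.vstack([X, x_next])
        y = np.append(y, objective(x_next))
    best = np.argmin(y)
    return X[best], y[best]

# Example: minimize a 2-D quadratic bowl on [-5, 5]^2.
x_star, f_star = pmbo_sketch(lambda x: float(np.sum((x - 1.0) ** 2)),
                             bounds=np.array([[-5.0, 5.0], [-5.0, 5.0]]))
```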
Related papers
- Sample-efficient Bayesian Optimisation Using Known Invariances [56.34916328814857]
We show that vanilla and constrained BO algorithms are inefficient when optimising invariant objectives.
We derive a bound on the maximum information gain of these invariant kernels.
We use our method to design a current drive system for a nuclear fusion reactor, finding a high-performance solution.
arXiv Detail & Related papers (2024-10-22T12:51:46Z)
- Poisson Process for Bayesian Optimization [126.51200593377739]
We propose a ranking-based surrogate model built on the Poisson process and introduce an efficient BO framework, namely Poisson Process Bayesian Optimization (PoPBO).
Compared to the classic GP-BO method, PoPBO has lower costs and better robustness to noise, as verified by extensive experiments.
arXiv Detail & Related papers (2024-02-05T02:54:50Z)
- Simulation Based Bayesian Optimization [0.6526824510982799]
This paper introduces Simulation Based Bayesian Optimization (SBBO) as a novel approach to optimizing acquisition functions.
SBBO allows the use of surrogate models tailored for spaces with discrete variables.
We demonstrate empirically the effectiveness of the SBBO method using various choices of surrogate models.
arXiv Detail & Related papers (2024-01-19T16:56:11Z)
- Polynomial-Model-Based Optimization for Blackbox Objectives [0.0]
Black-box optimization seeks to find optimal parameters for systems such that a pre-defined objective function is minimized.
PMBO is a novel black-box optimizer that finds the minimum by fitting a polynomial surrogate to the objective function.
PMBO is benchmarked against other state-of-the-art algorithms on a set of artificial, analytic functions.
arXiv Detail & Related papers (2023-09-01T14:11:03Z)
- Extrinsic Bayesian Optimizations on Manifolds [1.3477333339913569]
We propose an extrinsic Bayesian optimization (eBO) framework for general optimization problems on manifolds.
Our approach employs extrinsic Gaussian processes, first embedding the manifold into a higher-dimensional Euclidean space.
This leads to efficient and scalable algorithms for optimization over complex manifolds.
arXiv Detail & Related papers (2022-12-21T06:10:12Z)
- An Empirical Evaluation of Zeroth-Order Optimization Methods on AI-driven Molecule Optimization [78.36413169647408]
We study the effectiveness of various ZO optimization methods for optimizing molecular objectives.
We show the advantages of ZO sign-based gradient descent (ZO-signGD), sketched below.
We demonstrate the potential effectiveness of ZO optimization methods on widely used benchmark tasks from the Guacamol suite.
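For reference, ZO-signGD in its generic continuous form estimates gradients from function-value differences along random directions and steps along the sign of the estimate. The Python sketch below is a minimal illustration with assumed query counts, smoothing radius, and step size; it is not the molecule-optimization setup studied in the paper.

```python
# Generic zeroth-order sign-based gradient descent (ZO-signGD) sketch:
# a two-point gradient estimator drives sign-only updates. All
# hyper-parameters here are illustrative assumptions.
import numpy as np

def zo_sign_gd(objective, x0, step=0.05, mu=1e-2, n_dirs=20, n_iter=200, seed=0):
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    for _ in range(n_iter):
        grad_est = np.zeros_like(x)
        for _ in range(n_dirs):
            u = rng.standard_normal(x.shape)          # random probe direction
            delta = objective(x + mu * u) - objective(x - mu * u)
            grad_est += (delta / (2.0 * mu)) * u      # two-point estimator
        x = x - step * np.sign(grad_est / n_dirs)     # sign-only update
    return x

# Example: descend a smooth quadratic using only function evaluations.
x_min = zo_sign_gd(lambda x: float(np.sum(x ** 2)), x0=np.ones(5))
```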
arXiv Detail & Related papers (2022-10-27T01:58:10Z)
- Generalizing Bayesian Optimization with Decision-theoretic Entropies [102.82152945324381]
We consider a generalization of Shannon entropy from work in statistical decision theory.
We first show that special cases of this entropy lead to popular acquisition functions used in BO procedures.
We then show how alternative choices for the loss yield a flexible family of acquisition functions.
arXiv Detail & Related papers (2022-10-04T04:43:58Z)
- Sparse high-dimensional linear regression with a partitioned empirical Bayes ECM algorithm [62.997667081978825]
We propose a computationally efficient and powerful Bayesian approach for sparse high-dimensional linear regression.
Minimal prior assumptions on the parameters are made via plug-in empirical Bayes estimates.
The proposed approach is implemented in the R package probe.
arXiv Detail & Related papers (2022-09-16T19:15:50Z)
- Bézier Flow: a Surface-wise Gradient Descent Method for Multi-objective Optimization [12.487037582320804]
We extend the stability of optimization algorithms in the sense of Probably Approximately Correct (PAC) learning.
We show that multi-objective optimization algorithms derived from a gradient descent-based single-objective optimization algorithm are PAC stable.
arXiv Detail & Related papers (2022-05-23T07:47:58Z)
- Bilevel Optimization: Convergence Analysis and Enhanced Design [63.64636047748605]
Bilevel optimization is a tool for many machine learning problems.
We propose a novel algorithm named stocBiO, featuring a sample-efficient hypergradient estimator.
arXiv Detail & Related papers (2020-10-15T18:09:48Z)
- BOSH: Bayesian Optimization by Sampling Hierarchically [10.10241176664951]
We propose a novel BO routine pairing a hierarchical Gaussian process with an information-theoretic framework to generate a growing pool of realizations.
We demonstrate that BOSH provides more efficient and higher-precision optimization than standard BO across synthetic benchmarks, simulation optimization, reinforcement learning and hyperparameter tuning tasks.
arXiv Detail & Related papers (2020-07-02T07:35:49Z)