Bayesian Optimization of Expensive Nested Grey-Box Functions
- URL: http://arxiv.org/abs/2306.05150v2
- Date: Wed, 2 Aug 2023 12:12:56 GMT
- Title: Bayesian Optimization of Expensive Nested Grey-Box Functions
- Authors: Wenjie Xu, Yuning Jiang, Bratislav Svetozarevic, Colin N. Jones
- Abstract summary: We consider the problem of optimizing a grey-box objective function composed of both black-box and white-box functions.
A general formulation for such grey-box problems is given, which covers the existing grey-box optimization formulations as special cases.
We then design an optimism-driven algorithm to solve it.
- Score: 11.523746174066702
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: We consider the problem of optimizing a grey-box objective function,
i.e., a nested function composed of both black-box and white-box functions. A
general formulation for such grey-box problems is given, covering the existing
grey-box optimization formulations as special cases. We then design an
optimism-driven algorithm to solve it. Under certain regularity assumptions,
our algorithm achieves a regret bound similar to that of the standard black-box
Bayesian optimization algorithm, up to a constant multiplicative term that
depends on the Lipschitz constants of the functions involved. We further extend
our method to the constrained case and discuss several special cases. For
commonly used kernel functions, the regret bounds allow us to derive a
convergence rate to the optimal solution. Experimental results show that our
grey-box optimization method empirically finds the globally optimal solution
significantly faster than the standard black-box Bayesian optimization
algorithm.
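To make the nested setting concrete, below is a minimal sketch of one optimism-driven acquisition step for a composition f(x) = g(h(x)), where the white-box outer function g is known in closed form and the black-box inner function h is modeled by a Gaussian process. The confidence-interval propagation and all names here are illustrative assumptions, not the paper's exact algorithm.

```python
import numpy as np

def g(z):
    # Known white-box outer function (example choice, an assumption).
    return -(z - 1.0) ** 2

def optimistic_value(mu, sigma, beta=2.0, n_grid=64):
    # Optimism in the face of uncertainty: take the best value of g over
    # the GP confidence interval [mu - beta*sigma, mu + beta*sigma] for h(x).
    zs = np.linspace(mu - beta * sigma, mu + beta * sigma, n_grid)
    return np.max(g(zs))

def select_next_point(candidates, gp_mean, gp_std):
    # Pick the candidate whose optimistic grey-box value is largest;
    # gp_mean/gp_std are the GP posterior for the black-box inner function h.
    scores = [optimistic_value(gp_mean(x), gp_std(x)) for x in candidates]
    return candidates[int(np.argmax(scores))]
```

When g is monotone, the inner maximization over the confidence interval reduces to evaluating g at one endpoint, recovering a familiar GP-UCB-style rule.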
Related papers
- Sharpness-Aware Black-Box Optimization [47.95184866255126]
We propose a Sharpness-Aware Black-box Optimization (SABO) algorithm, which applies a sharpness-aware minimization strategy to improve model generalization.
Empirically, extensive experiments on black-box prompt fine-tuning tasks demonstrate the effectiveness of the proposed SABO method in improving model generalization performance.
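For orientation, the generic sharpness-aware minimization (SAM) step that SABO adapts looks roughly as follows; the gradient oracle below is a placeholder (in the black-box setting it would come from a zeroth-order estimate), so this is a hedged sketch rather than SABO itself.

```python
import numpy as np

def sam_step(x, grad_fn, lr=0.1, rho=0.05):
    # Generic SAM update: ascend to a nearby "sharp" point within radius rho,
    # then descend using the gradient taken at that perturbed point.
    g = grad_fn(x)
    eps = rho * g / (np.linalg.norm(g) + 1e-12)  # worst-case perturbation direction
    g_sharp = grad_fn(x + eps)                   # gradient at the perturbed point
    return x - lr * g_sharp
```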
arXiv Detail & Related papers (2024-10-16T11:08:06Z)
- Principled Preferential Bayesian Optimization [22.269732173306192]
We study the problem of preferential Bayesian optimization (BO), where we aim to optimize a black-box function with only preference feedback over pairs of candidate solutions.
An optimistic algorithm with an efficient computational method is then developed to solve the problem.
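A minimal sketch of the preference-only interface is given below; the random challenger strategy is a placeholder, not the paper's optimistic acquisition, and the logistic (Bradley-Terry) preference model is an assumption.

```python
import numpy as np

rng = np.random.default_rng(0)

def f(x):
    # Hidden objective: the optimizer never sees f(x) directly,
    # only which of two candidates is preferred.
    return -np.sum((x - 0.3) ** 2)

def prefer(x1, x2):
    # Noisy pairwise preference following a logistic likelihood on f(x1) - f(x2).
    p = 1.0 / (1.0 + np.exp(-(f(x1) - f(x2))))
    return rng.random() < p

incumbent = rng.uniform(0, 1, size=2)
for _ in range(50):
    challenger = rng.uniform(0, 1, size=2)   # placeholder proposal rule
    if prefer(challenger, incumbent):        # duel: keep whichever wins
        incumbent = challenger
```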
arXiv Detail & Related papers (2024-02-08T02:57:47Z)
- Generalizing Bayesian Optimization with Decision-theoretic Entropies [102.82152945324381]
We consider a generalization of Shannon entropy from work in statistical decision theory.
We first show that special cases of this entropy lead to popular acquisition functions used in BO procedures.
We then show how alternative choices for the loss yield a flexible family of acquisition functions.
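For reference, the standard decision-theoretic form of such a generalized entropy is the minimal expected loss over actions; the notation below is ours, not necessarily the paper's.

```latex
% Decision-theoretic generalization of entropy: the minimal expected loss
% over actions a \in \mathcal{A} under belief p; with log loss over
% predictive distributions, this recovers Shannon entropy.
H_{\ell,\mathcal{A}}(p) = \inf_{a \in \mathcal{A}} \mathbb{E}_{\theta \sim p}\big[\,\ell(\theta, a)\,\big]
```

Different choices of the loss $\ell$ then induce different acquisition functions via expected entropy reduction.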
arXiv Detail & Related papers (2022-10-04T04:43:58Z)
- A unified surrogate-based scheme for black-box and preference-based optimization [2.561649173827544]
We show that black-box and preference-based optimization problems are closely related and can be solved using the same family of approaches.
We propose the generalized Metric Response Surface (gMRS) algorithm, an optimization scheme that is a generalization of the popular MSRS framework.
arXiv Detail & Related papers (2022-02-03T08:47:54Z)
- Zeroth-Order Hybrid Gradient Descent: Towards A Principled Black-Box Optimization Framework [100.36569795440889]
This work studies zeroth-order (ZO) optimization, which does not require first-order gradient information.
We show that with a careful design of coordinate importance sampling, the proposed ZO optimization method is efficient in terms of both iteration complexity and function query cost.
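A minimal sketch of a coordinate-sampled zeroth-order gradient estimator with importance weighting follows; the sampling distribution probs and all constants are illustrative assumptions, not the paper's exact design.

```python
import numpy as np

def zo_gradient(f, x, probs, n_samples=8, mu=1e-4, rng=None):
    # Estimate the gradient of f at x from function queries only, by central
    # finite differences along randomly sampled coordinates.
    rng = rng or np.random.default_rng()
    d = x.size
    grad = np.zeros(d)
    for _ in range(n_samples):
        i = rng.choice(d, p=probs)        # sample a coordinate by importance
        e = np.zeros(d)
        e[i] = 1.0
        fd = (f(x + mu * e) - f(x - mu * e)) / (2 * mu)
        # Importance weighting by 1/probs[i] keeps the estimate unbiased.
        grad[i] += fd / (probs[i] * n_samples)
    return grad
```

A descent step then uses this estimate in place of the true gradient, e.g. x -= lr * zo_gradient(f, x, probs).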
arXiv Detail & Related papers (2020-12-21T17:29:58Z)
- Sub-linear Regret Bounds for Bayesian Optimisation in Unknown Search Spaces [63.22864716473051]
We propose a novel BO algorithm which expands (and shifts) the search space over iterations.
We show theoretically that for both our algorithms, the cumulative regret grows at sub-linear rates.
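A minimal sketch of the expanding-search-space idea is shown below; the geometric growth schedule gamma and all names are illustrative assumptions.

```python
import numpy as np

def expanded_bounds(center, base_radius, t, gamma=1.2):
    # Search box [center - r_t, center + r_t] whose radius grows with the
    # iteration count t, so a true optimum outside the initial box is
    # eventually contained in the search space.
    r_t = base_radius * gamma ** t
    return center - r_t, center + r_t

lo, hi = expanded_bounds(np.zeros(2), 1.0, t=5)
candidates = np.random.uniform(lo, hi, size=(128, 2))  # acquisition candidates
```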
arXiv Detail & Related papers (2020-09-05T14:24:40Z)
- Convergence of adaptive algorithms for weakly convex constrained optimization [59.36386973876765]
We prove an $\tilde{\mathcal{O}}(t^{-1/4})$ rate of convergence for the norm of the gradient of the Moreau envelope.
Our analysis works with a mini-batch size of $1$, constant first- and second-order moment parameters, and possibly unbounded optimization domains.
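For reference, the Moreau envelope in question has the standard form below; stating the rate via the best iterate so far is our notational choice, not necessarily the paper's.

```latex
% Moreau envelope of f with parameter \lambda > 0 (standard definition),
% together with the convergence rate stated above:
f_\lambda(x) = \min_y \Big\{ f(y) + \frac{1}{2\lambda}\,\|y - x\|^2 \Big\},
\qquad
\min_{s \le t}\,\big\|\nabla f_\lambda(x_s)\big\| = \tilde{\mathcal{O}}\big(t^{-1/4}\big).
```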
arXiv Detail & Related papers (2020-06-11T17:43:19Z)
- Incorporating Expert Prior in Bayesian Optimisation via Space Warping [54.412024556499254]
In large search spaces, the algorithm passes through several low-function-value regions before reaching the optimum.
One approach to mitigate this cold-start phase is to use prior knowledge that can accelerate the optimisation.
In this paper, we represent prior knowledge about the function optimum through a prior distribution.
The prior distribution is then used to warp the search space so that it expands around high-probability regions of the function optimum and shrinks around low-probability regions, as sketched below.
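One minimal way such warping can work is to push uniform samples through the quantile function of the prior on the optimum; the Gaussian prior and the inverse-CDF map below are illustrative assumptions, not the paper's exact construction.

```python
import numpy as np
from scipy.stats import norm

# Expert belief (an assumption for this sketch): optimum near x = 0.7.
prior = norm(loc=0.7, scale=0.1)

def warp(u):
    # Inverse-CDF map: uniform samples in (0, 1) become dense near the
    # prior mode and stretched in the low-probability tails.
    return prior.ppf(u)

u = np.random.uniform(0.01, 0.99, size=1000)  # raw (unwarped) search samples
x = warp(u)                                   # warped candidates for BO
```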
arXiv Detail & Related papers (2020-03-27T06:18:49Z)
- DEFT-FUNNEL: an open-source global optimization solver for constrained grey-box and black-box problems [0.0]
DEFT-FUNNEL is an open-source global optimization algorithm for general constrained grey-box and black-box problems.
The code and the test sets used for the experiments are available in the GitHub repository.
arXiv Detail & Related papers (2019-12-29T11:43:53Z)