Multi-Step Budgeted Bayesian Optimization with Unknown Evaluation Costs
- URL: http://arxiv.org/abs/2111.06537v1
- Date: Fri, 12 Nov 2021 02:18:26 GMT
- Title: Multi-Step Budgeted Bayesian Optimization with Unknown Evaluation Costs
- Authors: Raul Astudillo, Daniel R. Jiang, Maximilian Balandat, Eytan Bakshy,
Peter I. Frazier
- Abstract summary: We propose a non-myopic acquisition function that generalizes classical expected improvement to the setting of heterogeneous evaluation costs.
Our acquisition function outperforms existing methods in a variety of synthetic and real problems.
- Score: 28.254408148839644
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Bayesian optimization (BO) is a sample-efficient approach to optimizing
costly-to-evaluate black-box functions. Most BO methods ignore how evaluation
costs may vary over the optimization domain. However, these costs can be highly
heterogeneous and are often unknown in advance. This occurs in many practical
settings, such as hyperparameter tuning of machine learning algorithms or
physics-based simulation optimization. Moreover, those few existing methods
that acknowledge cost heterogeneity do not naturally accommodate a budget
constraint on the total evaluation cost. This combination of unknown costs and
a budget constraint introduces a new dimension to the exploration-exploitation
trade-off, where learning about the cost incurs the cost itself. Existing
methods do not reason about the various trade-offs of this problem in a
principled way, often leading to poor performance. We formalize this claim by
proving that the expected improvement and the expected improvement per unit of
cost, arguably the two most widely used acquisition functions in practice, can
be arbitrarily inferior with respect to the optimal non-myopic policy. To
overcome the shortcomings of existing approaches, we propose the budgeted
multi-step expected improvement, a non-myopic acquisition function that
generalizes classical expected improvement to the setting of heterogeneous and
unknown evaluation costs. Finally, we show that our acquisition function
outperforms existing methods in a variety of synthetic and real problems.
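As a concrete point of reference for the baselines the abstract proves can be arbitrarily inferior, below is a minimal sketch of classical expected improvement (EI) and expected improvement per unit cost (EIpu), each computed under Gaussian process models of the objective and the unknown evaluation cost. The scikit-learn GPs and the toy objective and cost functions are illustrative assumptions, not the paper's implementation (which instead proposes the non-myopic budgeted multi-step EI).

```python
# Minimal sketch (not the paper's code) of the two myopic baselines named in
# the abstract: expected improvement (EI) and EI per unit cost (EIpu).
# The toy objective, toy cost function, and scikit-learn GP models are
# illustrative assumptions.
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern


def expected_improvement(mu, sigma, best_f):
    """Closed-form EI for minimization, given the GP posterior mean/std."""
    sigma = np.maximum(sigma, 1e-12)
    z = (best_f - mu) / sigma
    return (best_f - mu) * norm.cdf(z) + sigma * norm.pdf(z)


# Toy objective with heterogeneous, a priori unknown evaluation costs.
f = lambda x: np.sin(3.0 * x) + 0.5 * x      # objective to minimize
c = lambda x: 1.0 + 4.0 * x ** 2             # positive evaluation cost

rng = np.random.default_rng(0)
X = rng.uniform(0.0, 2.0, size=(5, 1))
y = f(X).ravel()
costs = c(X).ravel()

# Independent GPs for the objective and the log of the cost, both learned
# from the evaluations gathered so far.
gp_f = GaussianProcessRegressor(Matern(nu=2.5), normalize_y=True).fit(X, y)
gp_c = GaussianProcessRegressor(Matern(nu=2.5), normalize_y=True).fit(X, np.log(costs))

X_cand = np.linspace(0.0, 2.0, 200).reshape(-1, 1)
mu, sigma = gp_f.predict(X_cand, return_std=True)
ei = expected_improvement(mu, sigma, y.min())
predicted_cost = np.exp(gp_c.predict(X_cand))   # plug-in estimate of the cost

ei_pu = ei / predicted_cost                     # EI per unit cost

print("EI maximizer:  ", X_cand[ei.argmax(), 0])
print("EIpu maximizer:", X_cand[ei_pu.argmax(), 0])
```

Both rules are myopic: EI ignores cost entirely, while EIpu greedily normalizes by the predicted cost of the very next evaluation. Neither reasons about how spending budget now constrains future evaluations, which is the gap the budgeted multi-step acquisition function targets.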
Related papers
- Cost-aware Bayesian Optimization via the Pandora's Box Gittins Index [57.045952766988925]
We develop a previously-unexplored connection between cost-aware Bayesian optimization and the Pandora's Box problem, a decision problem from economics.
Our work constitutes a first step towards integrating techniques from Gittins index theory into Bayesian optimization.
arXiv Detail & Related papers (2024-06-28T17:20:13Z) - Cost-Sensitive Multi-Fidelity Bayesian Optimization with Transfer of Learning Curve Extrapolation [55.75188191403343]
We introduce a utility function, predefined by each user, that describes the trade-off between the cost and performance of BO.
We validate our algorithm on various learning-curve (LC) datasets and find that it outperforms all previous multi-fidelity BO and transfer-BO baselines we consider.
arXiv Detail & Related papers (2024-05-28T07:38:39Z) - An adaptive approach to Bayesian Optimization with switching costs [0.8246494848934447]
We investigate modifications to Bayesian Optimization for a resource-constrained setting of sequential experimental design.
We adapt two process-constrained batch algorithms to this sequential problem formulation, and propose two new methods: one cost-aware and one cost-ignorant.
arXiv Detail & Related papers (2024-05-14T21:55:02Z) - Landscape-Sketch-Step: An AI/ML-Based Metaheuristic for Surrogate Optimization Problems [0.0]
We introduce a new metaheuristic for global optimization in scenarios where extensive evaluations of the cost function are expensive, inaccessible, or even prohibitive.
The method, which we call Landscape-Sketch-and-Step (LSS), combines Machine Learning, Replica Optimization, and Reinforcement Learning techniques.
arXiv Detail & Related papers (2023-09-14T01:53:45Z) - Generalizing Bayesian Optimization with Decision-theoretic Entropies [102.82152945324381]
We consider a generalization of Shannon entropy from work in statistical decision theory.
We first show that special cases of this entropy lead to popular acquisition functions used in BO procedures.
We then show how alternative choices for the loss yield a flexible family of acquisition functions.
arXiv Detail & Related papers (2022-10-04T04:43:58Z) - Low-Cost Algorithmic Recourse for Users With Uncertain Cost Functions [74.00030431081751]
We formalize the notion of user-specific cost functions and introduce a new method for identifying actionable recourses for users.
Our method satisfies up to 25.89 percentage points more users compared to strong baseline methods.
arXiv Detail & Related papers (2021-11-01T19:49:35Z) - A Nonmyopic Approach to Cost-Constrained Bayesian Optimization [10.078368988372247]
We formulate cost-constrained BO as a constrained Markov decision process (CMDP).
We develop an efficient rollout approximation to the optimal CMDP policy that takes both the cost and future iterations into account.
arXiv Detail & Related papers (2021-06-10T22:44:37Z) - Cost-Efficient Online Hyperparameter Optimization [94.60924644778558]
We propose an online HPO algorithm that reaches human expert-level performance within a single run of the experiment, while incurring only modest computational overhead compared to regular training.
arXiv Detail & Related papers (2021-01-17T04:55:30Z) - Pareto-efficient Acquisition Functions for Cost-Aware Bayesian Optimization [5.459427541271035]
We show how to make Bayesian optimization of black-box functions cost-aware.
On 144 real-world black-box function optimization problems, our solution brings up to 50% speed-ups.
We also revisit the common choice of Gaussian process cost models, showing that simple, low-variance cost models predict training times effectively.
arXiv Detail & Related papers (2020-11-23T15:06:07Z) - Optimal Bayesian experimental design for subsurface flow problems [77.34726150561087]
We propose a novel approach for the development of a polynomial chaos expansion (PCE) surrogate model for the design utility function.
This technique enables the derivation of a reasonable-quality response surface for the targeted objective function with a computational budget comparable to several single-point evaluations.
arXiv Detail & Related papers (2020-08-10T09:42:59Z) - Cost-aware Bayesian Optimization [6.75013674088437]
Cost-aware BO measures convergence with alternative cost metrics such as time, energy, or money.
We introduce Cost Apportioned BO (CArBO), which attempts to minimize an objective function at as little cost as possible.
arXiv Detail & Related papers (2020-03-22T14:51:04Z)
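As a companion to the cost-aware entries above, here is a brief, hedged sketch of a cost-cooled acquisition rule in the spirit of CArBO-style methods, in which the penalty applied by the cost model is annealed away as the budget is consumed. The linear cooling schedule and function names below are assumptions for illustration, not the authors' published recipe.

```python
# Hedged sketch of a cost-cooled acquisition rule of the kind associated with
# CArBO-style cost-aware BO: the cost penalty decays as the budget is spent.
# The linear cooling schedule below is an assumption for illustration.
import numpy as np


def cost_cooled_ei(ei, predicted_cost, budget_spent, budget_total):
    """Scale EI by cost**alpha, with alpha decaying from 1 to 0 over the budget."""
    alpha = max(0.0, 1.0 - budget_spent / budget_total)
    return ei / np.power(np.maximum(predicted_cost, 1e-12), alpha)


ei = np.array([0.30, 0.25, 0.10])      # EI of three hypothetical candidates
cost = np.array([5.0, 1.0, 0.2])       # their predicted evaluation costs

# Early in the run cheap points are strongly preferred ...
print(cost_cooled_ei(ei, cost, budget_spent=1.0, budget_total=10.0))
# ... while near the end of the budget the rule reduces to plain EI.
print(cost_cooled_ei(ei, cost, budget_spent=9.5, budget_total=10.0))
```

Early in the run (alpha near 1) the rule behaves like EI per unit cost and favors cheap evaluations; as the budget nears exhaustion (alpha near 0) it reverts to plain EI.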