Constrained multi-fidelity Bayesian optimization with automatic stop condition
- URL: http://arxiv.org/abs/2503.01126v2
- Date: Fri, 21 Mar 2025 22:41:37 GMT
- Title: Constrained multi-fidelity Bayesian optimization with automatic stop condition
- Authors: Zahra Zanjani Foumani, Ramin Bostanabad
- Abstract summary: We develop a constrained cost-aware multi-fidelity BO (CMFBO) framework. Our goal is to minimize overall sampling costs by utilizing inexpensive low-fidelity sources while ensuring feasibility. We also introduce a systematic stopping criterion that addresses the long-lasting issue associated with BO's convergence assessment.
- Score: 0.0
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Bayesian optimization (BO) is increasingly employed in critical applications to find the optimal design with minimal cost. While BO is known for its sample efficiency, relying solely on costly high-fidelity data can still result in high costs. This is especially the case in constrained search spaces where BO must not only optimize but also ensure feasibility. A related issue in the BO literature is the lack of a systematic stopping criterion. To address these challenges, we develop a constrained cost-aware multi-fidelity BO (CMFBO) framework whose goal is to minimize overall sampling costs by utilizing inexpensive low-fidelity sources while ensuring feasibility. In our case, the constraints can change across the data sources and may even be black-box functions. We also introduce a systematic stopping criterion that addresses the long-lasting issue associated with BO's convergence assessment. Our framework is publicly available on GitHub through the GP+ Python package, and herein we validate its efficacy on multiple benchmark problems.
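A minimal, illustrative sketch of the constrained BO loop the abstract describes is given below. It is not the authors' CMFBO implementation (which is distributed with the GP+ Python package) and it omits the multi-fidelity and stopping-criterion components: it fits scikit-learn Gaussian processes to an assumed toy objective and black-box constraint, and selects the next sample by expected improvement weighted by the probability of feasibility.

```python
# Hedged sketch of single-fidelity constrained Bayesian optimization.
# Not the authors' CMFBO/GP+ code; the objective and constraint are toy assumptions.
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

rng = np.random.default_rng(0)

def objective(x):          # toy objective to minimize (assumed for illustration)
    return np.sin(3 * x) + 0.5 * x

def constraint(x):         # toy black-box constraint: feasible where value <= 0
    return np.cos(2 * x) - 0.3

# Initial design
X = rng.uniform(0.0, 3.0, size=(5, 1))
y = objective(X).ravel()
c = constraint(X).ravel()

gp_obj = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True)
gp_con = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True)

X_cand = np.linspace(0.0, 3.0, 500).reshape(-1, 1)

for it in range(15):
    gp_obj.fit(X, y)
    gp_con.fit(X, c)

    mu, sd = gp_obj.predict(X_cand, return_std=True)
    mu_c, sd_c = gp_con.predict(X_cand, return_std=True)

    feasible = c <= 0
    y_best = y[feasible].min() if feasible.any() else y.min()

    # Expected improvement on the objective (minimization form)
    sd = np.maximum(sd, 1e-9)
    z = (y_best - mu) / sd
    ei = (y_best - mu) * norm.cdf(z) + sd * norm.pdf(z)

    # Probability that the black-box constraint is satisfied, c(x) <= 0
    pof = norm.cdf((0.0 - mu_c) / np.maximum(sd_c, 1e-9))

    x_next = X_cand[np.argmax(ei * pof)]
    X = np.vstack([X, x_next])
    y = np.append(y, objective(x_next))
    c = np.append(c, constraint(x_next))

feasible = c <= 0
print("best feasible value:", y[feasible].min() if feasible.any() else None)
```

A multi-fidelity, cost-aware extension along the lines of the abstract would additionally model the data source (fidelity) as an input to the surrogates and fold per-source sampling costs into the acquisition, together with a stopping rule for terminating the loop.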
Related papers
- Offline Stochastic Optimization of Black-Box Objective Functions [47.74033738624514]
It is essential to leverage existing data to avoid costly active queries of complex black-box functions. We introduce Offline BBO (SOBBO), which tackles both black-box objectives and uncontrolled uncertainties. Numerical experiments demonstrate the effectiveness of our approach on both synthetic and real-world tasks.
arXiv Detail & Related papers (2024-12-03T02:20:30Z)
- Cost-aware Bayesian Optimization via the Pandora's Box Gittins Index [57.045952766988925]
We develop a previously-unexplored connection between cost-aware Bayesian optimization and the Pandora's Box problem, a decision problem from economics. Our work constitutes a first step towards integrating techniques from Gittins index theory into Bayesian optimization.
arXiv Detail & Related papers (2024-06-28T17:20:13Z)
- Cost-Sensitive Multi-Fidelity Bayesian Optimization with Transfer of Learning Curve Extrapolation [55.75188191403343]
We introduce a utility function, predefined by each user, that describes the trade-off between the cost and performance of BO.
We validate our algorithm on various LC datasets and find that it outperforms all previous multi-fidelity BO and transfer-BO baselines we consider.
arXiv Detail & Related papers (2024-05-28T07:38:39Z)
- OTClean: Data Cleaning for Conditional Independence Violations using Optimal Transport [51.6416022358349]
OTClean is a framework that harnesses optimal transport theory for data repair under Conditional Independence (CI) constraints.
We develop an iterative algorithm inspired by Sinkhorn's matrix scaling algorithm, which efficiently addresses high-dimensional and large-scale data.
arXiv Detail & Related papers (2024-03-04T18:23:55Z)
- Learning Regions of Interest for Bayesian Optimization with Adaptive Level-Set Estimation [84.0621253654014]
We propose a framework, called BALLET, which adaptively filters for a high-confidence region of interest.
We show theoretically that BALLET can efficiently shrink the search space, and can exhibit a tighter regret bound than standard BO.
arXiv Detail & Related papers (2023-07-25T09:45:47Z)
- Multi-Fidelity Cost-Aware Bayesian Optimization [0.0]
An increasingly popular strategy in Bayesian optimization (BO) is to forgo the sole reliance on high-fidelity data and instead use an ensemble of information sources which provide inexpensive low-fidelity data.
Here, we propose a multi-fidelity cost-aware BO framework that dramatically outperforms the state-of-the-art technologies in terms of efficiency, consistency, and robustness.
arXiv Detail & Related papers (2022-11-04T20:07:24Z)
- Multi-Fidelity Bayesian Optimization with Unreliable Information Sources [12.509709549771385]
We propose rMFBO (robust MFBO) to make GP-based MFBO schemes robust to the addition of unreliable information sources.
We demonstrate the effectiveness of the proposed methodology on a number of numerical benchmarks.
We expect rMFBO to be particularly useful to reliably include human experts with varying knowledge within BO processes.
arXiv Detail & Related papers (2022-10-25T11:47:33Z)
- Movement Penalized Bayesian Optimization with Application to Wind Energy Systems [84.7485307269572]
Contextual Bayesian optimization (CBO) is a powerful framework for sequential decision-making given side information.
In this setting, the learner receives context (e.g., weather conditions) at each round and has to choose an action (e.g., turbine parameters).
Standard algorithms assume no cost for switching their decisions at every round, but in many practical applications, there is a cost associated with such changes, which should be minimized.
arXiv Detail & Related papers (2022-10-14T20:19:32Z)
- Distributionally Robust Bayesian Optimization with $\varphi$-divergences [45.48814080654241]
We consider robustness against data-shift in $\varphi$-divergences, which subsumes many popular choices, such as the Total Variation and the extant Kullback-Leibler divergence.
We show that the DRO-BO problem in this setting is equivalent to a finite-dimensional optimization problem which, even in the continuous context setting, can be easily implemented with provable sublinear regret bounds.
arXiv Detail & Related papers (2022-03-04T04:34:52Z)
- A Nonmyopic Approach to Cost-Constrained Bayesian Optimization [10.078368988372247]
We formulate cost-constrained BO as a constrained Markov decision process (CMDP).
We develop an efficient rollout approximation to the optimal CMDP policy that takes both the cost and future iterations into account.
arXiv Detail & Related papers (2021-06-10T22:44:37Z)
- Cost-aware Bayesian Optimization [6.75013674088437]
Cost-aware BO measures convergence with alternative cost metrics such as time, energy, or money.
We introduce Cost Apportioned BO (CArBO), which attempts to minimize an objective function at as little cost as possible.
arXiv Detail & Related papers (2020-03-22T14:51:04Z)
This list is automatically generated from the titles and abstracts of the papers in this site.