Bayesian optimization for modular black-box systems with switching costs
- URL: http://arxiv.org/abs/2006.02624v2
- Date: Mon, 11 Oct 2021 20:05:22 GMT
- Title: Bayesian optimization for modular black-box systems with switching costs
- Authors: Chi-Heng Lin, Joseph D. Miano, Eva L. Dyer
- Abstract summary: We propose a new algorithm for switch cost-aware optimization called Lazy Modular Bayesian Optimization (LaMBO).
LaMBO efficiently identifies the global optimum while minimizing cost through a passive change of variables in early modules.
We apply LaMBO to multiple synthetic functions and a three-stage image segmentation pipeline used in a neuroscience application.
- Score: 9.595357496779394
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Most existing black-box optimization methods assume that all variables in the
system being optimized have equal cost and can change freely at each iteration.
However, in many real world systems, inputs are passed through a sequence of
different operations or modules, making variables in earlier stages of
processing more costly to update. Such structure imposes a cost on switching
variables in early parts of a data processing pipeline. In this work, we
propose a new algorithm for switch cost-aware optimization called Lazy Modular
Bayesian Optimization (LaMBO). This method efficiently identifies the global
optimum while minimizing cost through a passive change of variables in early
modules. The method is theoretically grounded and achieves vanishing regret when
augmented with switching costs. We apply LaMBO to multiple synthetic functions
and a three-stage image segmentation pipeline used in a neuroscience
application, where we obtain promising improvements over prevailing cost-aware
Bayesian optimization algorithms. Our results demonstrate that LaMBO is an
effective strategy for black-box optimization that is capable of minimizing
switching costs in modular systems.
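To make the switching-cost structure concrete, here is a minimal sketch (our illustration, not the authors' implementation) of a two-module pipeline in which updating the early-stage variable is expensive, together with a lazy strategy that amortizes each early switch over many cheap late-stage moves:

```python
import numpy as np

# Minimal sketch (not the authors' code) of the cost structure LaMBO exploits:
# changing the early-stage variable x1 forces module 1 to be recomputed, so it
# costs far more than changing the late-stage variable x2.
rng = np.random.default_rng(0)

def pipeline(x1, x2):
    """Toy modular black-box objective evaluated end to end."""
    return -((x1 - 0.3) ** 2 + (x2 - 0.7) ** 2) + 0.01 * rng.normal()

SWITCH_COST = {"x1": 10.0, "x2": 1.0}   # early modules are expensive to switch

def lazy_search(n_outer=5, n_inner=8):
    """Hold x1 fixed across an inner sweep of x2, paying the large
    early-module switching cost once per outer round instead of once
    per evaluation."""
    best, spent = -np.inf, 0.0
    for _ in range(n_outer):
        x1 = rng.uniform()               # one early-module switch ...
        spent += SWITCH_COST["x1"]
        for _ in range(n_inner):         # ... amortized over cheap x2 moves
            x2 = rng.uniform()
            spent += SWITCH_COST["x2"]
            best = max(best, pipeline(x1, x2))
    return best, spent

print(lazy_search())   # the same budget buys roughly n_inner times more
                       # evaluations than switching x1 at every iteration
```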
Related papers
- Cost-aware Bayesian Optimization via the Pandora's Box Gittins Index [57.045952766988925]
We develop a previously-unexplored connection between cost-aware Bayesian optimization and the Pandora's Box problem, a decision problem from economics.
Our work constitutes a first step towards integrating techniques from Gittins index theory into Bayesian optimization.
arXiv Detail & Related papers (2024-06-28T17:20:13Z) - Two Optimizers Are Better Than One: LLM Catalyst Empowers Gradient-Based Optimization for Prompt Tuning [69.95292905263393]
We show that gradient-based optimization and large language models (LLMs) are complementary to each other, suggesting a collaborative optimization approach.
Our code is released at https://www.guozix.com/guozix/LLM-catalyst.
arXiv Detail & Related papers (2024-05-30T06:24:14Z) - Landscape Surrogate: Learning Decision Losses for Mathematical Optimization Under Partial Information [48.784330281177446]
Recent works in learning-integrated optimization have shown promise in settings where the optimization is only partially observed or where general-purpose optimizers perform poorly without expert tuning.
We propose using a smooth and learnable Landscape Surrogate as a replacement for $f \circ \mathbf{g}$.
This surrogate, learnable by neural networks, can be computed faster than the $\mathbf{g}$ solver, provides dense and smooth gradients during training, can generalize to unseen optimization problems, and is efficiently learned via alternating optimization.
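The replacement idea admits a small sketch. The following illustration (our own, using ridge regression on random Fourier features rather than the paper's neural surrogate) fits a smooth, cheap stand-in for a piecewise objective f(g(c)) and exposes dense gradients in the predicted coefficients:

```python
import numpy as np

# Hedged sketch: replace the expensive composition f(g(c)), where g solves an
# optimization problem with predicted coefficients c, by a smooth learned
# model that is cheap to evaluate and differentiable in c.
rng = np.random.default_rng(1)

def f_of_g(c):
    """Stand-in for the true pipeline: solver g followed by decision loss f.
    The kink and jump mimic the non-smoothness of argmin-based losses."""
    return np.abs(c - 0.4) + 0.5 * (c > 0.4)

# Fit a smooth surrogate on random Fourier features via ridge regression.
C = rng.uniform(0, 1, 200)
y = f_of_g(C)
W, b = rng.normal(size=(1, 64)), rng.uniform(0, 2 * np.pi, 64)
Phi = np.cos(C[:, None] * W + b)                      # feature map
alpha = np.linalg.solve(Phi.T @ Phi + 1e-3 * np.eye(64), Phi.T @ y)

def surrogate(c):
    return np.cos(np.atleast_1d(c)[:, None] * W + b) @ alpha

def surrogate_grad(c, eps=1e-5):
    """Dense, smooth gradient in c, usable for end-to-end training."""
    return (surrogate(c + eps) - surrogate(c - eps)) / (2 * eps)
```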
arXiv Detail & Related papers (2023-07-18T04:29:16Z) - Movement Penalized Bayesian Optimization with Application to Wind Energy Systems [84.7485307269572]
Contextual Bayesian optimization (CBO) is a powerful framework for sequential decision-making given side information.
In this setting, the learner receives context (e.g., weather conditions) at each round and has to choose an action (e.g., turbine parameters).
Standard algorithms assume no cost for switching their decisions at every round, but in many practical applications, there is a cost associated with such changes, which should be minimized.
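A minimal sketch of such a penalty (our illustration; the paper's algorithm and analysis are more involved) trades a UCB score off against a priced distance from the previously played action:

```python
import numpy as np

# Movement-penalized acquisition sketch: favor high upper confidence bound,
# but charge lam per unit of movement away from the previous action.
def penalized_choice(candidates, mu, sigma, x_prev, beta=2.0, lam=0.5):
    """candidates: (n, d) actions; mu, sigma: posterior mean/std per action."""
    ucb = mu + beta * sigma
    move_cost = lam * np.linalg.norm(candidates - x_prev, axis=1)
    return candidates[np.argmax(ucb - move_cost)]

rng = np.random.default_rng(5)
X = rng.uniform(size=(100, 2))
x_next = penalized_choice(X, mu=rng.uniform(size=100),
                          sigma=0.1 * np.ones(100),
                          x_prev=np.array([0.5, 0.5]))
```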
arXiv Detail & Related papers (2022-10-14T20:19:32Z) - SnAKe: Bayesian Optimization with Pathwise Exploration [9.807656882149319]
We consider a novel setting where the expense of evaluating the function can increase significantly when making large input changes between iterations.
This paper investigates the problem and introduces 'Sequential Bayesian Optimization via Adaptive Connecting Samples' (SnAKe).
It provides a solution by considering future queries and preemptively building optimization paths that minimize input costs.
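The path-building idea can be sketched with a greedy nearest-neighbor ordering (a simplification on our part; SnAKe's construction is TSP-inspired and more refined):

```python
import numpy as np

# Order a batch of planned queries so that consecutive inputs stay close,
# shrinking the total input-change cost paid across iterations.
def order_queries(points, start):
    remaining = list(range(len(points)))
    path, cur = [], start
    while remaining:
        nxt = min(remaining, key=lambda i: np.linalg.norm(points[i] - cur))
        remaining.remove(nxt)
        path.append(nxt)
        cur = points[nxt]
    return points[path]                 # same queries, cheaper visit order

rng = np.random.default_rng(6)
ordered = order_queries(rng.uniform(size=(20, 2)), start=np.zeros(2))
```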
arXiv Detail & Related papers (2022-01-31T19:42:56Z) - LinEasyBO: Scalable Bayesian Optimization Approach for Analog Circuit Synthesis via One-Dimensional Subspaces [11.64233949999656]
We propose a fast and robust Bayesian optimization approach via one-dimensional subspaces for analog circuit synthesis.
Our proposed algorithm can accelerate the optimization procedure by up to 9x and 38x compared to LP-EI and REMBOpBO, respectively, when the batch size is 15.
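A hedged sketch of the one-dimensional-subspace idea (illustrative only, not the LinEasyBO algorithm): restrict each round's acquisition search to a random line through the incumbent, so the inner optimization stays one-dimensional regardless of the number of design variables:

```python
import numpy as np

# Generate acquisition candidates along a random line through the incumbent.
rng = np.random.default_rng(7)

def line_candidates(x_best, lo, hi, n=101):
    u = rng.normal(size=x_best.shape)
    u /= np.linalg.norm(u)                    # random unit direction
    ts = np.linspace(-1.0, 1.0, n)[:, None]
    return np.clip(x_best + ts * u, lo, hi)   # stay inside the design box

# The acquisition function is then maximized over these n points instead of
# over the full d-dimensional design space.
X = line_candidates(np.full(10, 0.5), lo=0.0, hi=1.0)
```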
arXiv Detail & Related papers (2021-09-01T21:25:25Z) - A Nonmyopic Approach to Cost-Constrained Bayesian Optimization [10.078368988372247]
We formulate cost-constrained BO as a constrained Markov decision process (CMDP).
We develop an efficient rollout approximation to the optimal CMDP policy that takes both the cost and future iterations into account.
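A toy version of the rollout (with stub functions standing in for posterior sampling and the cost model, both our assumptions) scores a candidate by simulating a few greedy, budget-respecting future steps:

```python
import numpy as np

# Score a candidate x0 by its simulated value-to-go under the remaining budget.
rng = np.random.default_rng(3)

def rollout_value(x0, cost, budget, sample, candidates, h=3):
    best, spent = sample(x0), cost(x0)
    for _ in range(h):
        feas = [x for x in candidates if spent + cost(x) <= budget]
        if not feas:
            break                      # budget exhausted inside the simulation
        x = max(feas, key=sample)      # greedy step on sampled values
        spent += cost(x)
        best = max(best, sample(x))
    return best

# Example stubs: noisy quadratic "posterior draws", distance-based cost.
sample = lambda x: -float((x - 0.6) ** 2) + 0.05 * rng.normal()
cost = lambda x: 1.0 + abs(x)
cands = list(np.linspace(0, 1, 21))
scores = [rollout_value(x, cost, 5.0, sample, cands) for x in cands]
```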
arXiv Detail & Related papers (2021-06-10T22:44:37Z) - Divide and Learn: A Divide and Conquer Approach for Predict+Optimize [50.03608569227359]
The predict+optimize problem combines machine learning of problem coefficients with an optimization problem that uses the predicted coefficients.
We show how to directly express the loss of the optimization problem in terms of the predicted coefficients as a piece-wise linear function.
We propose a novel divide and conquer algorithm to tackle optimization problems without this restriction and predict its coefficients using the optimization loss.
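A worked toy instance (ours, not the paper's) shows the piecewise structure: with finitely many feasible decisions, the value under predicted coefficients is a maximum of linear functions, and the induced decision, and hence the regret, changes only at its breakpoints:

```python
import numpy as np

# Two feasible decisions; the objective value under predicted coefficients
# theta is max over decisions of theta @ x, a piecewise-linear function.
solutions = np.array([[1.0, 0.0],    # decision A
                      [0.0, 1.0]])   # decision B
true_c = np.array([3.0, 5.0])        # true coefficients, unknown at predict time

def value(theta):
    return np.max(solutions @ theta)          # piecewise linear in theta

def regret(theta):
    chosen = solutions[np.argmax(solutions @ theta)]
    return np.max(solutions @ true_c) - chosen @ true_c

print(value(np.array([4.0, 1.0])), regret(np.array([4.0, 1.0])))  # 4.0 2.0
print(value(np.array([1.0, 4.0])), regret(np.array([1.0, 4.0])))  # 4.0 0.0
```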
arXiv Detail & Related papers (2020-12-04T00:26:56Z) - Pareto-efficient Acquisition Functions for Cost-Aware Bayesian Optimization [5.459427541271035]
We show how to perform cost-aware Bayesian optimization of black-box functions.
On 144 real-world black-box function optimization problems, our solution brings up to 50% speed-ups.
We also revisit the common choice of Gaussian process cost models, showing that simple, low-variance cost models predict training times effectively.
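The Pareto view reduces to a simple filter over two objectives, high expected improvement and low predicted cost; the sketch below (our illustration, not the paper's acquisition function) keeps the non-dominated candidates:

```python
import numpy as np

# Keep candidates for which no other candidate has both higher EI and lower
# predicted cost; the final query is then chosen from this Pareto front.
def pareto_front(ei, cost):
    idx = []
    for i in range(len(ei)):
        dominated = np.any((ei > ei[i]) & (cost < cost[i]))
        if not dominated:
            idx.append(i)
    return np.array(idx)

ei = np.array([0.1, 0.4, 0.3, 0.8])
cost = np.array([1.0, 2.0, 5.0, 9.0])
print(pareto_front(ei, cost))   # [0 1 3]: candidate 2 is dominated by 1
```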
arXiv Detail & Related papers (2020-11-23T15:06:07Z) - A Primer on Zeroth-Order Optimization in Signal Processing and Machine Learning [95.85269649177336]
ZO optimization iteratively performs three major steps: gradient estimation, descent direction computation, and solution update.
We demonstrate promising applications of ZO optimization, such as evaluating and generating explanations from black-box deep learning models, and efficient online sensor management.
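Those three steps fit in a few lines. The sketch below uses the standard two-point random-direction gradient estimator on a toy quadratic:

```python
import numpy as np

# (1) estimate the gradient from function values only, (2) use it as the
# descent direction, (3) update the solution.
rng = np.random.default_rng(4)

def zo_gradient(f, x, mu=1e-2, q=20):
    """Average q directional finite differences along random unit vectors."""
    g = np.zeros_like(x)
    for _ in range(q):
        u = rng.normal(size=x.shape)
        u /= np.linalg.norm(u)
        g += (f(x + mu * u) - f(x)) / mu * u
    return g * len(x) / q              # dimension factor of the estimator

f = lambda x: np.sum((x - 1.0) ** 2)   # black box: values only, no gradients
x = np.zeros(5)
for _ in range(200):
    x -= 0.05 * zo_gradient(f, x)      # descent step on the ZO estimate
print(np.round(x, 2))                  # approaches the optimum at all-ones
```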
arXiv Detail & Related papers (2020-06-11T06:50:35Z) - Cost-aware Bayesian Optimization [6.75013674088437]
Cost-aware BO measures convergence with alternative cost metrics such as time, energy, or money.
We introduce Cost Apportioned BO (CArBO), which attempts to minimize an objective function at as little cost as possible.
arXiv Detail & Related papers (2020-03-22T14:51:04Z)
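The cost-apportioning idea is often summarized as expected improvement per unit cost; here is a minimal sketch (assuming a known cost model and omitting CArBO's cost-cooling schedule and cost-effective initial design):

```python
import numpy as np
from scipy.stats import norm

# Divide expected improvement by predicted evaluation cost so that cheap,
# promising points are tried before expensive ones.
def ei_per_cost(mu, sigma, cost, best, eta=1.0):
    z = (mu - best) / sigma
    ei = sigma * (z * norm.cdf(z) + norm.pdf(z))   # standard EI (maximization)
    return ei / cost ** eta

mu = np.array([0.2, 0.5, 0.9])
sigma = np.array([0.3, 0.2, 0.1])
cost = np.array([1.0, 2.0, 8.0])
print(np.argmax(ei_per_cost(mu, sigma, cost, best=0.4)))  # best value per cost
```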
This list is automatically generated from the titles and abstracts of the papers in this site.