Learning Relevant Contextual Variables Within Bayesian Optimization
- URL: http://arxiv.org/abs/2305.14120v4
- Date: Fri, 24 May 2024 12:59:54 GMT
- Title: Learning Relevant Contextual Variables Within Bayesian Optimization
- Authors: Julien Martinelli, Ayush Bharti, Armi Tiihonen, S. T. John, Louis Filstroff, Sabina J. Sloman, Patrick Rinke, Samuel Kaski
- Abstract summary: We propose Sensitivity-Analysis-Driven Contextual BO (SADCBO), a method that adaptively selects which contextual variables to include in the optimization.
We learn the relevance of contextual variables by sensitivity analysis of the posterior surrogate model.
We empirically evaluate our proposed Sensitivity-Analysis-Driven Contextual BO (SADCBO) method against alternatives on both synthetic and real-world experiments.
- Score: 17.18588266529801
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Contextual Bayesian Optimization (CBO) efficiently optimizes black-box functions with respect to design variables, while simultaneously integrating contextual information regarding the environment, such as experimental conditions. However, the relevance of contextual variables is not necessarily known beforehand. Moreover, contextual variables can sometimes be optimized themselves at an additional cost, a setting overlooked by current CBO algorithms. Cost-sensitive CBO would simply include optimizable contextual variables as part of the design variables based on their cost. Instead, we adaptively select a subset of contextual variables to include in the optimization, based on the trade-off between their relevance and the additional cost incurred by optimizing them compared to leaving them to be determined by the environment. We learn the relevance of contextual variables by sensitivity analysis of the posterior surrogate model while minimizing the cost of optimization by leveraging recent developments on early stopping for BO. We empirically evaluate our proposed Sensitivity-Analysis-Driven Contextual BO (SADCBO) method against alternatives on both synthetic and real-world experiments, together with extensive ablation studies, and demonstrate a consistent improvement across examples.
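The core idea of learning contextual-variable relevance from the posterior surrogate can be illustrated with a minimal sketch. This is not the authors' implementation: it assumes a GP surrogate (via scikit-learn) and uses a simple derivative-based sensitivity score — the average finite-difference gradient magnitude of the posterior mean per input dimension — with a hypothetical relative threshold to pick the "relevant" subset.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

# Toy observations: dimension 0 drives the objective, dimension 1 is
# a nearly irrelevant contextual variable.
rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 1.0, size=(40, 2))
y = np.sin(3.0 * X[:, 0]) + 0.01 * X[:, 1]

# Anisotropic (ARD) RBF kernel: one length scale per input dimension.
gp = GaussianProcessRegressor(
    kernel=RBF(length_scale=[1.0, 1.0]), normalize_y=True
).fit(X, y)

def sensitivity_scores(gp, X, eps=1e-4):
    """Mean |d mu / d x_j| of the posterior mean, estimated by
    central finite differences at the observed inputs."""
    d = X.shape[1]
    scores = np.zeros(d)
    for j in range(d):
        Xp, Xm = X.copy(), X.copy()
        Xp[:, j] += eps
        Xm[:, j] -= eps
        scores[j] = np.mean(np.abs(gp.predict(Xp) - gp.predict(Xm)) / (2 * eps))
    return scores

scores = sensitivity_scores(gp, X)
# Hypothetical selection rule: keep variables within 10% of the top score.
relevant = scores > 0.1 * scores.max()
```

Here the sensitivity score for dimension 0 dominates, so only that variable would be promoted into the optimized set; in the paper's setting this ranking is traded off against the cost of optimizing each contextual variable rather than leaving it to the environment.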
Related papers
- Bayesian Optimization for Unknown Cost-Varying Variable Subsets with No-Regret Costs [3.1269598124014264]
We propose a novel algorithm for the extension of the BOCVS problem with random and unknown costs.
Our algorithm achieves a sub-linear rate in both quality regret and cost regret, addressing the objective of the BOCVS problem more effectively than previous analyses.
arXiv Detail & Related papers (2024-12-20T13:00:39Z) - $f$-PO: Generalizing Preference Optimization with $f$-divergence Minimization [91.43730624072226]
$f$-PO is a novel framework that generalizes and extends existing approaches.
We conduct experiments on state-of-the-art language models using benchmark datasets.
arXiv Detail & Related papers (2024-10-29T02:11:45Z) - Cost-aware Bayesian Optimization via the Pandora's Box Gittins Index [57.045952766988925]
We develop a previously-unexplored connection between cost-aware Bayesian optimization and the Pandora's Box problem, a decision problem from economics.
Our work constitutes a first step towards integrating techniques from Gittins index theory into Bayesian optimization.
arXiv Detail & Related papers (2024-06-28T17:20:13Z) - Cost-Sensitive Multi-Fidelity Bayesian Optimization with Transfer of Learning Curve Extrapolation [55.75188191403343]
We introduce utility, which is a function predefined by each user and describes the trade-off between cost and performance of BO.
We validate our algorithm on various LC datasets and find that it outperforms all the previous multi-fidelity BO and transfer-BO baselines we consider.
arXiv Detail & Related papers (2024-05-28T07:38:39Z) - Enhanced Bayesian Optimization via Preferential Modeling of Abstract Properties [49.351577714596544]
We propose a human-AI collaborative Bayesian framework to incorporate expert preferences about unmeasured abstract properties into surrogate modeling.
We provide an efficient strategy that can also handle any incorrect/misleading expert bias in preferential judgments.
arXiv Detail & Related papers (2024-02-27T09:23:13Z) - Contextual Causal Bayesian Optimisation [3.649440235324259]
We show that, in general, utilising a subset of observational variables as a context to choose the values of interventional variables leads to lower cumulative regrets.
We propose a general framework of contextual causal Bayesian optimisation that efficiently searches through combinations of controlled and contextual variables.
We analytically show that well-established methods, such as contextual BO (CoBO) or CaBO, are not able to achieve the optimum in some cases.
arXiv Detail & Related papers (2023-01-29T10:36:10Z) - Variable Functioning and Its Application to Large Scale Steel Frame Design Optimization [15.86197261674868]
A concept-based approach called variable functioning ($Fx$) is introduced to reduce the number of optimization variables and narrow down the search space.
By using problem structure analysis technique and engineering expert knowledge, the $Fx$ method is used to enhance the steel frame design optimization process.
arXiv Detail & Related papers (2022-05-15T12:43:25Z) - Sparse Bayesian Optimization [16.867375370457438]
We present several regularization-based approaches that allow us to discover sparse and more interpretable configurations.
We propose a novel differentiable relaxation based on homotopy continuation that makes it possible to target sparsity.
We show that we are able to efficiently optimize for sparsity.
arXiv Detail & Related papers (2022-03-03T18:25:33Z) - SnAKe: Bayesian Optimization with Pathwise Exploration [9.807656882149319]
We consider a novel setting where the expense of evaluating the function can increase significantly when making large input changes between iterations.
This paper investigates the problem and introduces 'Sequential Bayesian Optimization via Adaptive Connecting Samples' (SnAKe).
It provides a solution by considering future queries and preemptively building optimization paths that minimize input costs.
arXiv Detail & Related papers (2022-01-31T19:42:56Z) - Bilevel Optimization: Convergence Analysis and Enhanced Design [63.64636047748605]
Bilevel optimization is a tool for many machine learning problems.
We propose a novel efficient gradient estimator named stocBiO.
arXiv Detail & Related papers (2020-10-15T18:09:48Z) - Optimal Bayesian experimental design for subsurface flow problems [77.34726150561087]
We propose a novel approach for development of a polynomial chaos expansion (PCE) surrogate model for the design utility function.
This novel technique enables the derivation of a reasonable quality response surface for the targeted objective function with a computational budget comparable to several single-point evaluations.
arXiv Detail & Related papers (2020-08-10T09:42:59Z) - Cost-aware Bayesian Optimization [6.75013674088437]
Cost-aware BO measures convergence with alternative cost metrics such as time, energy, or money.
We introduce Cost Apportioned BO (CArBO), which attempts to minimize an objective function at as little cost as possible.
arXiv Detail & Related papers (2020-03-22T14:51:04Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information provided and is not responsible for any consequences of its use.