Efficient Robust Bayesian Optimization for Arbitrary Uncertain Inputs
- URL: http://arxiv.org/abs/2310.20145v2
- Date: Fri, 3 Nov 2023 23:52:57 GMT
- Title: Efficient Robust Bayesian Optimization for Arbitrary Uncertain Inputs
- Authors: Lin Yang, Junlong Lyu, Wenlong Lyu, and Zhitang Chen
- Abstract summary: We introduce a novel robust Bayesian Optimization algorithm, AIRBO, which can effectively identify a robust optimum that performs consistently well under arbitrary input uncertainty.
Our method directly models the uncertain inputs of arbitrary distributions by empowering the Gaussian Process with the Maximum Mean Discrepancy (MMD) and further accelerates the posterior inference via Nyström approximation.
A rigorous theoretical regret bound is established under the MMD estimation error, and extensive experiments on synthetic functions and real problems demonstrate that our approach can handle various input uncertainties and achieve state-of-the-art performance.
- Score: 13.578262325229161
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Bayesian Optimization (BO) is a sample-efficient optimization algorithm
widely employed across various applications. In some challenging BO tasks,
input uncertainty arises due to the inevitable randomness in the optimization
process, such as machining errors, execution noise, or contextual variability.
This uncertainty causes the input to deviate from the intended value before evaluation,
resulting in significant performance fluctuations in the final result. In this
paper, we introduce a novel robust Bayesian Optimization algorithm, AIRBO,
which can effectively identify a robust optimum that performs consistently well
under arbitrary input uncertainty. Our method directly models the uncertain
inputs of arbitrary distributions by empowering the Gaussian Process with the
Maximum Mean Discrepancy (MMD) and further accelerates the posterior inference
via Nyström approximation. A rigorous theoretical regret bound is established
under the MMD estimation error, and extensive experiments on synthetic functions and
real problems demonstrate that our approach can handle various input
uncertainties and achieve state-of-the-art performance.
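The abstract names the ingredients without spelling them out, so here is a minimal, self-contained sketch of the core modeling idea (our illustration under simplifying assumptions, not the authors' implementation; all function names are ours): represent each uncertain input by an empirical sample from its distribution, compare two inputs with the MMD, and turn that distance into a kernel over distributions for the GP surrogate.

```python
import numpy as np

def rbf_gram(A, B, lengthscale=1.0):
    """RBF Gram matrix between the rows of A and B."""
    sq_dists = ((A[:, None, :] - B[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-sq_dists / (2.0 * lengthscale ** 2))

def mmd_sq(X, Y, lengthscale=1.0):
    """Biased empirical squared MMD between samples X ~ P and Y ~ Q."""
    return (rbf_gram(X, X, lengthscale).mean()
            + rbf_gram(Y, Y, lengthscale).mean()
            - 2.0 * rbf_gram(X, Y, lengthscale).mean())

def distribution_kernel(X, Y, inner_ls=1.0, outer_ls=1.0):
    """Kernel over input distributions: k(P, Q) = exp(-MMD^2(P, Q) / (2 l^2))."""
    return np.exp(-mmd_sq(X, Y, inner_ls) / (2.0 * outer_ls ** 2))

# Two noisy realizations of nearby design points under input uncertainty.
rng = np.random.default_rng(0)
P = rng.normal(0.0, 0.1, size=(200, 2))  # samples of the uncertain input at x1
Q = rng.normal(0.1, 0.1, size=(200, 2))  # samples of the uncertain input at x2
print(distribution_kernel(P, Q))         # close to 1 for similar distributions
```

The plain estimator costs O(n^2) kernel evaluations per pair of inputs; this quadratic cost is what the paper's Nyström approximation targets when accelerating posterior inference.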
Related papers
- BO4IO: A Bayesian optimization approach to inverse optimization with uncertainty quantification [5.031974232392534]
This work addresses data-driven inverse optimization (IO).
The goal is to estimate unknown parameters in an optimization model from observed decisions that can be assumed to be optimal or near-optimal.
arXiv Detail & Related papers (2024-05-28T06:52:17Z)
- End-to-End Learning for Fair Multiobjective Optimization Under Uncertainty [55.04219793298687]
The Predict-Then-Optimize (PtO) paradigm in machine learning aims to maximize downstream decision quality.
This paper extends the PtO methodology to optimization problems with nondifferentiable Ordered Weighted Averaging (OWA) objectives.
It shows how optimization of OWA functions can be effectively integrated with parametric prediction for fair and robust optimization under uncertainty; a minimal OWA computation is sketched after this entry.
arXiv Detail & Related papers (2024-02-12T16:33:35Z)
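For reference, an OWA objective sorts the objective values before weighting them, which is what makes it nondifferentiable. A minimal sketch (ours, with illustrative fairness-style weights):

```python
import numpy as np

def owa(values, weights):
    """Ordered Weighted Averaging: sort the values (descending here) and
    take a dot product with a fixed weight vector; the sort makes the
    aggregate piecewise linear and nondifferentiable."""
    return float(np.dot(np.sort(values)[::-1], weights))

# Fairness-style weights put more mass on the worst outcomes, so improving
# the worst objective improves the aggregate the most.
objectives = np.array([0.9, 0.4, 0.7])
weights = np.array([0.1, 0.3, 0.6])  # last weight applies to the smallest value
print(owa(objectives, weights))      # 0.1*0.9 + 0.3*0.7 + 0.6*0.4 = 0.54
```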
- Generalizing Bayesian Optimization with Decision-theoretic Entropies [102.82152945324381]
We consider a generalization of Shannon entropy from work in statistical decision theory.
We first show that special cases of this entropy lead to popular acquisition functions used in BO procedures.
We then show how alternative choices for the loss yield a flexible family of acquisition functions.
arXiv Detail & Related papers (2022-10-04T04:43:58Z)
- Robust Multi-Objective Bayesian Optimization Under Input Noise [27.603887040015888]
In many manufacturing processes, the design parameters are subject to random input noise, resulting in a product that is often less performant than expected.
In this work, we propose the first multi-objective BO method that is robust to input noise.
arXiv Detail & Related papers (2022-02-15T16:33:48Z)
- Outlier-Robust Sparse Estimation via Non-Convex Optimization [73.18654719887205]
We explore the connection between high-dimensional robust statistics and non-convex optimization in the presence of sparsity constraints.
We develop novel and simple optimization formulations for these problems.
As a corollary, we obtain that any first-order method that efficiently converges to stationarity yields an efficient algorithm for these tasks.
arXiv Detail & Related papers (2021-09-23T17:38:24Z)
- High Probability Complexity Bounds for Non-Smooth Stochastic Optimization with Heavy-Tailed Noise [51.31435087414348]
It is essential to theoretically guarantee that algorithms provide a small objective residual with high probability.
Existing methods for non-smooth convex optimization have complexity bounds that depend on the confidence level.
We propose novel stepsize rules for two methods with gradient clipping; a generic clipped-gradient step is sketched after this entry.
arXiv Detail & Related papers (2021-06-10T17:54:21Z)
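The specific stepsize rules are the paper's contribution and are not reproduced here; for orientation, a generic norm-clipped gradient step (our sketch) looks like this:

```python
import numpy as np

def clipped_sgd_step(x, grad, stepsize, clip_level):
    """One SGD step with norm clipping: rescale the stochastic gradient so its
    norm never exceeds clip_level, which tames heavy-tailed noise."""
    g_norm = np.linalg.norm(grad)
    if g_norm > clip_level:
        grad = grad * (clip_level / g_norm)
    return x - stepsize * grad

# A heavy-tailed gradient sample gets shrunk before the update is applied.
x = np.zeros(3)
outlier_grad = np.array([50.0, -3.0, 1.0])
print(clipped_sgd_step(x, outlier_grad, stepsize=0.1, clip_level=1.0))
```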
- Amortized Conditional Normalized Maximum Likelihood: Reliable Out of Distribution Uncertainty Estimation [99.92568326314667]
We propose the amortized conditional normalized maximum likelihood (ACNML) method as a scalable general-purpose approach for uncertainty estimation.
Our algorithm builds on the conditional normalized maximum likelihood (CNML) coding scheme, which has minimax optimal properties according to the minimum description length principle.
We demonstrate that ACNML compares favorably to a number of prior techniques for uncertainty estimation in terms of calibration on out-of-distribution inputs.
arXiv Detail & Related papers (2020-11-05T08:04:34Z)
- Bayesian Optimization with Output-Weighted Optimal Sampling [0.0]
We advocate the use of the likelihood ratio to guide the search algorithm towards regions of the input space where the objective function to be minimized assumes abnormally small values.
The "likelihood-weighted" acquisition functions introduced in this work are found to outperform their unweighted counterparts in a number of applications.
arXiv Detail & Related papers (2020-04-22T14:38:39Z)
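To make the weighting concrete, here is a toy sketch (ours; the weight follows the likelihood-ratio form w(x) = p_x(x) / p_y(mu(x)), while details such as the density estimate and all names are our simplifications):

```python
import numpy as np
from scipy.stats import gaussian_kde

def likelihood_weighted(acq_value, x, mu_x, input_pdf, p_y):
    """Scale a base acquisition value by w(x) = p_x(x) / p_y(mu(x)): large
    where the input is probable but the predicted output value is rare."""
    return acq_value * input_pdf(x) / p_y(mu_x)

rng = np.random.default_rng(1)
mu_samples = rng.normal(0.0, 1.0, 500)  # stand-in for GP posterior means
p_y = gaussian_kde(mu_samples)          # density estimate of predicted outputs

input_pdf = lambda x: np.exp(-0.5 * x ** 2) / np.sqrt(2.0 * np.pi)
# An abnormally small predicted output (rare under p_y) is up-weighted:
print(likelihood_weighted(1.0, x=0.0, mu_x=-3.5, input_pdf=input_pdf, p_y=p_y))
```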
- Distributionally Robust Bayesian Optimization [121.71766171427433]
We present a novel distributionally robust Bayesian optimization algorithm (DRBO) for zeroth-order, noisy optimization.
Our algorithm provably obtains sub-linear robust regret in various settings.
We demonstrate the robust performance of our method on both synthetic and real-world benchmarks; the worst-case scoring idea is sketched after this entry.
arXiv Detail & Related papers (2020-02-20T22:04:30Z)
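As a toy illustration of the "distributionally robust" idea in its simplest discrete form (our sketch, not the paper's algorithm): a design point is scored by its worst-case expected value over a set of plausible context distributions.

```python
import numpy as np

def robust_value(mu_per_context, candidate_dists):
    """Worst-case expected surrogate value of one design point:
    min over candidate context distributions of E_w[mu]."""
    return min(float(np.dot(w, mu_per_context)) for w in candidate_dists)

# GP posterior means of one design point under three discrete contexts,
# and two plausible context distributions (a crude ambiguity set).
mu = np.array([0.8, 0.5, 0.2])
dists = [np.array([0.6, 0.3, 0.1]), np.array([0.2, 0.3, 0.5])]
print(robust_value(mu, dists))  # 0.41: the point is judged by its worst case
```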
- Noisy-Input Entropy Search for Efficient Robust Bayesian Optimization [5.836533862551427]
Noisy-Input Entropy Search (NES) is designed to find robust optima for problems with both input and measurement noise.
NES reliably finds robust optima, outperforming existing methods from the literature on all benchmarks.
arXiv Detail & Related papers (2020-02-07T14:48:16Z)