Noisy-Input Entropy Search for Efficient Robust Bayesian Optimization
- URL: http://arxiv.org/abs/2002.02820v1
- Date: Fri, 7 Feb 2020 14:48:16 GMT
- Title: Noisy-Input Entropy Search for Efficient Robust Bayesian Optimization
- Authors: Lukas P. Fröhlich, Edgar D. Klenske, Julia Vinogradska, Christian Daniel, Melanie N. Zeilinger
- Abstract summary: Noisy-Input Entropy Search (NES) is designed to find robust optima for problems with both input and measurement noise.
NES reliably finds robust optima, outperforming existing methods from the literature on all benchmarks.
- Score: 5.836533862551427
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We consider the problem of robust optimization within the well-established
Bayesian optimization (BO) framework. While BO is intrinsically robust to noisy
evaluations of the objective function, standard approaches do not consider the
case of uncertainty about the input parameters. In this paper, we propose
Noisy-Input Entropy Search (NES), a novel information-theoretic acquisition
function that is designed to find robust optima for problems with both input
and measurement noise. NES is based on the key insight that the robust
objective can in many cases be modeled as a Gaussian process; however, it
cannot be observed directly. We evaluate NES on several benchmark problems from
the optimization literature and from engineering. The results show that NES
reliably finds robust optima, outperforming existing methods from the
literature on all benchmarks.
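To make the key insight concrete, here is a minimal sketch (not the authors' implementation) that estimates the posterior mean of the robust objective g(x) = E_xi[f(x + xi)] by Monte Carlo from a GP surrogate. The toy objective, noise scales, and use of scikit-learn's GaussianProcessRegressor are illustrative assumptions; NES itself builds an information-theoretic acquisition on top of this unobserved GP, which the sketch does not attempt.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

rng = np.random.default_rng(0)

# Toy objective with measurement noise (illustrative assumption).
f = lambda x: np.sin(3.0 * x) + 0.5 * x
X_train = rng.uniform(-2.0, 2.0, size=(30, 1))
y_train = f(X_train).ravel() + 0.1 * rng.standard_normal(30)

# GP surrogate over the latent objective f.
gp = GaussianProcessRegressor(kernel=RBF(length_scale=0.5), alpha=0.1**2)
gp.fit(X_train, y_train)

def robust_objective(x, input_std=0.3, n_mc=1000):
    """Monte Carlo estimate of the posterior mean of g(x) = E_xi[f(x + xi)],
    xi ~ N(0, input_std^2). The smoothed objective g is the quantity NES
    treats as a Gaussian process that is never observed directly."""
    xi = input_std * rng.standard_normal((n_mc, 1))
    return gp.predict(x + xi).mean()

print(robust_objective(np.array([[0.5]])))
```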
Related papers
- BO4IO: A Bayesian optimization approach to inverse optimization with uncertainty quantification [5.031974232392534]
This work addresses data-driven inverse optimization (IO).
The goal is to estimate unknown parameters in an optimization model from observed decisions that can be assumed to be optimal or near-optimal.
arXiv Detail & Related papers (2024-05-28T06:52:17Z)
- Efficient Robust Bayesian Optimization for Arbitrary Uncertain Inputs [13.578262325229161]
We introduce a novel robust Bayesian Optimization algorithm, AIRBO, which can effectively identify a robust optimum that performs consistently well under arbitrary input uncertainty.
Our method directly models the uncertain inputs of arbitrary distributions by empowering the Gaussian Process with the Maximum Mean Discrepancy (MMD) and further accelerates the posterior inference via Nystrom approximation.
A rigorous theoretical regret bound is established under the MMD estimation error, and extensive experiments on synthetic functions and real problems demonstrate that the approach handles various input uncertainties and achieves state-of-the-art performance.
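For intuition, the quadratic-time MMD estimate that such a kernel-based treatment of input distributions relies on fits in a few lines. The sketch below is a generic biased (V-statistic) estimator with an RBF kernel, not AIRBO's GP integration or its Nystrom-accelerated inference.

```python
import numpy as np

def rbf_kernel(A, B, gamma=1.0):
    """RBF kernel matrix k(a, b) = exp(-gamma * ||a - b||^2)."""
    sq = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * sq)

def mmd2(X, Y, gamma=1.0):
    """Biased (V-statistic) estimate of MMD^2 between sample sets X and Y."""
    return (rbf_kernel(X, X, gamma).mean()
            - 2.0 * rbf_kernel(X, Y, gamma).mean()
            + rbf_kernel(Y, Y, gamma).mean())

rng = np.random.default_rng(0)
X = rng.normal(0.0, 1.0, size=(200, 2))  # samples from one input distribution
Y = rng.normal(0.5, 1.0, size=(200, 2))  # samples from a shifted distribution
print(mmd2(X, Y))  # larger values indicate more dissimilar distributions
```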
arXiv Detail & Related papers (2023-10-31T03:29:31Z)
- Equation Discovery with Bayesian Spike-and-Slab Priors and Efficient Kernels [57.46832672991433]
We propose a novel equation discovery method based on Kernel learning and BAyesian Spike-and-Slab priors (KBASS).
We use kernel regression to estimate the target function, which is flexible, expressive, and more robust to data sparsity and noise.
We develop an expectation-propagation expectation-maximization algorithm for efficient posterior inference and function estimation.
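As background, the kernel-regression backbone mentioned above can be sketched as plain kernel ridge regression. The spike-and-slab prior and the EP-EM inference that let KBASS select sparse equation terms are the paper's contribution and are not reproduced here; the data and hyperparameters below are illustrative assumptions.

```python
import numpy as np

def rbf(A, B, ls=0.3):
    """Squared-exponential kernel matrix for 1-D inputs."""
    sq = (A[:, None] - B[None, :]) ** 2
    return np.exp(-0.5 * sq / ls**2)

rng = np.random.default_rng(0)
x = rng.uniform(0, 1, 40)
y = np.sin(2 * np.pi * x) + 0.1 * rng.standard_normal(40)  # sparse, noisy data

# Kernel ridge regression: a smooth function estimate robust to noise.
lam = 1e-2
alpha = np.linalg.solve(rbf(x, x) + lam * np.eye(len(x)), y)
x_test = np.linspace(0, 1, 5)
y_hat = rbf(x_test, x) @ alpha
print(y_hat)
```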
arXiv Detail & Related papers (2023-10-09T03:55:09Z)
- A Corrected Expected Improvement Acquisition Function Under Noisy Observations [22.63212972670109]
Sequential optimization via expected improvement (EI) is one of the most widely used policies in Bayesian optimization.
The uncertainty associated with the incumbent solution is often neglected in many analytic EI-type methods.
We propose a modification of EI that corrects its closed-form expression by incorporating the covariance information provided by the Gaussian Process (GP) model.
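For reference, here is a sketch of the standard closed-form EI (minimization convention) that the paper corrects. The proposed modification additionally folds in the GP covariance between the candidate and the incumbent; this plain version omits that and treats the incumbent as noise-free, which is exactly the assumption the paper revisits.

```python
import numpy as np
from scipy.stats import norm

def expected_improvement(mu, sigma, f_best):
    """Standard closed-form EI for minimization.

    mu, sigma: GP posterior mean and standard deviation at the candidate.
    f_best: value of the incumbent, treated here as noise-free.
    """
    sigma = np.maximum(sigma, 1e-12)  # numerical safety
    z = (f_best - mu) / sigma
    return (f_best - mu) * norm.cdf(z) + sigma * norm.pdf(z)

print(expected_improvement(mu=0.2, sigma=0.5, f_best=0.4))
```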
arXiv Detail & Related papers (2023-10-08T13:50:39Z)
- Distributionally Robust Variational Quantum Algorithms with Shifted Noise [9.705847892362165]
We show how to optimize variational quantum algorithms (VQAs) to be robust against unknown shifted noise.
This work is the first step towards improving the reliability of VQAs influenced by shifted noise.
arXiv Detail & Related papers (2023-08-28T23:19:57Z)
- Generalizing Bayesian Optimization with Decision-theoretic Entropies [102.82152945324381]
We consider a generalization of Shannon entropy from work in statistical decision theory.
We first show that special cases of this entropy lead to popular acquisition functions used in BO procedures.
We then show how alternative choices for the loss yield a flexible family of acquisition functions.
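Concretely, the decision-theoretic entropy in question can be written as an optimal expected loss over an action set; the notation below is a standard rendering of this idea, with the log-loss special case recovering Shannon entropy.

```latex
H_{\ell,\mathcal{A}}[p] \;=\; \inf_{a \in \mathcal{A}} \,
  \mathbb{E}_{\theta \sim p}\!\left[\, \ell(a, \theta) \,\right]
% With \mathcal{A} the set of densities and the log loss
% \ell(a, \theta) = -\log a(\theta), the infimum is attained at a = p,
% recovering Shannon entropy
%   H[p] = -\mathbb{E}_{\theta \sim p}[\log p(\theta)].
% Other losses yield the flexible family of acquisition functions above.
```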
arXiv Detail & Related papers (2022-10-04T04:43:58Z)
- Robust Multi-Objective Bayesian Optimization Under Input Noise [27.603887040015888]
In many manufacturing processes, the design parameters are subject to random input noise, resulting in a product that is often less performant than expected.
In this work, we propose the first multi-objective BO method that is robust to input noise.
arXiv Detail & Related papers (2022-02-15T16:33:48Z)
- RoMA: Robust Model Adaptation for Offline Model-based Optimization [115.02677045518692]
We consider the problem of searching an input maximizing a black-box objective function given a static dataset of input-output queries.
A popular approach to solving this problem is maintaining a proxy model that approximates the true objective function.
Here, the main challenge is how to avoid adversarially optimized inputs during the search.
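To illustrate that challenge, the sketch below runs the generic proxy-ascent loop on a hypothetical quadratic-feature ridge proxy. The dataset, proxy class, and stepsize are illustrative assumptions, and RoMA's robust model adaptation is not reproduced; the point is only that the search follows the proxy, so any systematic proxy error can be exploited.

```python
import numpy as np

rng = np.random.default_rng(0)

# Static offline dataset of input-output queries; the search cannot
# evaluate the true objective again.
true_f = lambda x: -(x - 1.0) ** 2                 # hidden ground truth
X = rng.uniform(-2.0, 2.0, 100)
y = true_f(X) + 0.1 * rng.standard_normal(100)

# Hypothetical proxy: ridge regression on (1, x, x^2) features.
Phi = np.stack([np.ones_like(X), X, X**2], axis=1)
w = np.linalg.solve(Phi.T @ Phi + 1e-3 * np.eye(3), Phi.T @ y)
proxy = lambda x: w[0] + w[1] * x + w[2] * x**2
proxy_grad = lambda x: w[1] + 2.0 * w[2] * x

# Naive proxy ascent: the searched input climbs the proxy's gradient,
# with no safeguard against leaving the data-supported region.
x = 0.0
for _ in range(200):
    x += 0.1 * proxy_grad(x)
print(f"found x={x:.3f}, proxy={proxy(x):.3f}, true={true_f(x):.3f}")
```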
arXiv Detail & Related papers (2021-10-27T05:37:12Z)
- Outlier-Robust Sparse Estimation via Non-Convex Optimization [73.18654719887205]
We explore the connection between outlier-robust high-dimensional statistics and non-convex optimization in the presence of sparsity constraints.
We develop novel and simple optimization formulations for these problems.
As a corollary, we obtain that any first-order method that efficiently converges to stationarity yields an efficient algorithm for these tasks.
arXiv Detail & Related papers (2021-09-23T17:38:24Z)
- High Probability Complexity Bounds for Non-Smooth Stochastic Optimization with Heavy-Tailed Noise [51.31435087414348]
It is essential to theoretically guarantee that algorithms provide small objective residual with high probability.
Existing methods for non-smooth convex optimization have complexity bounds that depend on the confidence level.
We propose novel stepsize rules for two methods with gradient clipping.
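The norm-clipping primitive those stepsize rules build on is standard and easy to state. The sketch below shows one clipped SGD step; the clip level and stepsize are illustrative parameters, not the paper's schedules.

```python
import numpy as np

def clipped_sgd_step(x, grad, stepsize, clip_level):
    """One SGD step with norm clipping: a heavy-tailed gradient sample
    cannot move the iterate by more than stepsize * clip_level."""
    g_norm = np.linalg.norm(grad)
    if g_norm > clip_level:
        grad = grad * (clip_level / g_norm)
    return x - stepsize * grad

x = np.array([1.0, -2.0])
g = np.array([100.0, 50.0])  # e.g. a heavy-tailed gradient sample
print(clipped_sgd_step(x, g, stepsize=0.1, clip_level=1.0))
```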
arXiv Detail & Related papers (2021-06-10T17:54:21Z)
- Distributionally Robust Bayesian Optimization [121.71766171427433]
We present a novel distributionally robust Bayesian optimization algorithm (DRBO) for zeroth-order, noisy optimization.
Our algorithm provably obtains sub-linear robust regret in various settings.
We demonstrate the robust performance of our method on both synthetic and real-world benchmarks.
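The distributionally robust value that such a method optimizes can be illustrated with a finite ambiguity set: score each candidate decision by its worst-case expected objective over candidate context distributions. The objective and the three-distribution ambiguity set below are illustrative assumptions, not the paper's formulation (which works with e.g. balls around a reference distribution).

```python
import numpy as np

# Illustrative objective f(x, w) over decision x and context w.
f = lambda x, w: -(x - w) ** 2

# Finite ambiguity set: candidate context distributions, each given as
# weights over the same support points (illustrative assumption).
support = np.array([-1.0, 0.0, 1.0])
ambiguity_set = [
    np.array([0.2, 0.6, 0.2]),
    np.array([0.5, 0.3, 0.2]),
    np.array([0.1, 0.3, 0.6]),
]

def robust_value(x):
    """Worst-case expected objective over the ambiguity set."""
    return min(weights @ f(x, support) for weights in ambiguity_set)

candidates = np.linspace(-1.0, 1.0, 21)
best = max(candidates, key=robust_value)
print(best, robust_value(best))
```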
arXiv Detail & Related papers (2020-02-20T22:04:30Z)