MBORE: Multi-objective Bayesian Optimisation by Density-Ratio Estimation
- URL: http://arxiv.org/abs/2203.16912v1
- Date: Thu, 31 Mar 2022 09:27:59 GMT
- Title: MBORE: Multi-objective Bayesian Optimisation by Density-Ratio Estimation
- Authors: George De Ath, Tinkle Chugh, Alma A. M. Rahat
- Abstract summary: Optimisation problems often have multiple conflicting objectives that can be computationally and/or financially expensive to evaluate.
Mono-surrogate Bayesian optimisation (BO) is a popular model-based approach for optimising such black-box functions.
We extend previous work on BO by density-ratio estimation (BORE) to the multi-objective setting.
- Score: 0.01652719262940403
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: Optimisation problems often have multiple conflicting objectives that can be
computationally and/or financially expensive. Mono-surrogate Bayesian
optimisation (BO) is a popular model-based approach for optimising such
black-box functions. It combines objective values via scalarisation and builds
a Gaussian process (GP) surrogate of the scalarised values. The location which
maximises a cheap-to-query acquisition function is chosen as the next location
to expensively evaluate. While BO is an effective strategy, the use of GPs is
limiting. Their performance decreases as the problem input dimensionality
increases, and their computational complexity scales cubically with the amount
of data. To address these limitations, we extend previous work on BO by
density-ratio estimation (BORE) to the multi-objective setting. BORE links the
computation of the probability of improvement acquisition function to that of
probabilistic classification. This enables the use of state-of-the-art
classifiers in a BO-like framework. In this work we present MBORE:
multi-objective Bayesian optimisation by density-ratio estimation, and compare
it to BO across a range of synthetic and real-world benchmarks. We find that
MBORE performs as well as or better than BO on a wide variety of problems, and
that it outperforms BO on high-dimensional and real-world problems.
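To make the BORE reduction concrete, here is a minimal sketch of one MBORE-style iteration, assuming an augmented weighted-Tchebycheff scalariser and a random-forest classifier purely for illustration (the paper's own scalarisers and classifiers may differ): scalarise the observed objective vectors, label the best gamma-fraction as the positive class, fit a classifier, and use its positive-class probability as the acquisition function.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def tchebycheff(Y, w, rho=0.05):
    """Augmented weighted-Tchebycheff scalarisation (minimisation).
    Y: (n, m) objective vectors; w: (m,) weights summing to 1."""
    Yn = (Y - Y.min(axis=0)) / (Y.max(axis=0) - Y.min(axis=0) + 1e-12)
    return np.max(w * Yn, axis=1) + rho * np.sum(w * Yn, axis=1)

def bore_acquisition(X, Y, w, gamma=0.25):
    """Fit a classifier separating the best gamma-fraction of scalarised
    points from the rest; its positive-class probability is (up to a
    monotone transform) the probability-of-improvement acquisition."""
    s = tchebycheff(Y, w)
    tau = np.quantile(s, gamma)                # improvement threshold
    z = (s <= tau).astype(int)                 # 1 = "good" points
    clf = RandomForestClassifier(n_estimators=200).fit(X, z)
    return lambda Xcand: clf.predict_proba(Xcand)[:, 1]

# One illustrative iteration on a toy bi-objective problem
rng = np.random.default_rng(0)
X = rng.uniform(size=(50, 6))                              # 6-d inputs
Y = np.stack([X.sum(1), ((X - 0.5) ** 2).sum(1)], axis=1)  # 2 objectives
acq = bore_acquisition(X, Y, w=np.array([0.5, 0.5]))
Xcand = rng.uniform(size=(2048, 6))
x_next = Xcand[np.argmax(acq(Xcand))]          # next point to evaluate
```

Because only a classifier is refit at each iteration, this sidesteps the cubic scaling of an exact GP in the number of observations that motivates the approach.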
Related papers
- Poisson Process for Bayesian Optimization [126.51200593377739]
We propose a ranking-based surrogate model based on the Poisson process and introduce an efficient BO framework, namely Poisson Process Bayesian Optimization (PoPBO).
Compared to the classic GP-BO method, our PoPBO has lower costs and better robustness to noise, which is verified by abundant experiments.
arXiv Detail & Related papers (2024-02-05T02:54:50Z)
- Large-Batch, Iteration-Efficient Neural Bayesian Design Optimization [37.339567743948955]
We present a novel Bayesian optimization framework specifically tailored to address the limitations of standard BO in large-batch, data-intensive settings.
Our key contribution is a highly scalable, sample-based acquisition function that performs a non-dominated sorting of objectives.
We show that our acquisition function in combination with different Bayesian neural network surrogates is effective in data-intensive environments with a minimal number of iterations.
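As a concrete reference for the non-dominated sorting step mentioned above, here is a minimal generic sketch (an O(n^2 m) pairwise version for clarity, not the authors' implementation):

```python
import numpy as np

def non_dominated_sort(Y):
    """Return a front index for each row of Y (minimisation):
    0 = Pareto-optimal among Y, 1 = Pareto-optimal once front 0
    is removed, and so on."""
    n = len(Y)
    # dominates[i, j] is True when point i dominates point j
    dominates = (np.all(Y[:, None, :] <= Y[None, :, :], axis=2)
                 & np.any(Y[:, None, :] < Y[None, :, :], axis=2))
    fronts = np.full(n, -1)
    remaining = np.ones(n, dtype=bool)
    k = 0
    while remaining.any():
        sub = dominates[np.ix_(remaining, remaining)]
        nondom = sub.sum(axis=0) == 0          # dominated by no one left
        idx = np.where(remaining)[0][nondom]
        fronts[idx] = k
        remaining[idx] = False
        k += 1
    return fronts
```

Candidates can then be ranked by front index, with lower fronts preferred.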
arXiv Detail & Related papers (2023-06-01T19:10:57Z)
- Comparison of High-Dimensional Bayesian Optimization Algorithms on BBOB [0.40498500266986387]
We compare five state-of-the-art high-dimensional BO algorithms with vanilla BO and CMA-ES at increasing dimensionality, ranging from 10 to 60 variables.
Our results confirm the superiority of BO over CMA-ES for limited evaluation budgets.
arXiv Detail & Related papers (2023-03-02T01:14:15Z)
- Model-based Causal Bayesian Optimization [78.120734120667]
We propose model-based causal Bayesian optimization (MCBO).
MCBO learns a full system model instead of only modeling intervention-reward pairs.
Unlike in standard Bayesian optimization, our acquisition function cannot be evaluated in closed form.
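When an acquisition function has no closed form, a common workaround is Monte Carlo estimation over posterior samples. A minimal generic sketch, using expected improvement under a scikit-learn GP as a stand-in (this is not the MCBO acquisition itself):

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor

def mc_expected_improvement(gp, Xcand, y_best, n_samples=256):
    """Monte Carlo estimate of EI(x) = E[max(y_best - f(x), 0)]
    (minimisation), averaging over joint posterior samples rather
    than relying on a closed-form expression."""
    # draws: (len(Xcand), n_samples) -> transpose to (n_samples, n_cand)
    f = gp.sample_y(Xcand, n_samples=n_samples, random_state=0).T
    return np.maximum(y_best - f, 0.0).mean(axis=0)

# usage: gp = GaussianProcessRegressor().fit(X, y)
#        x_next = Xcand[np.argmax(mc_expected_improvement(gp, Xcand, y.min()))]
```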
arXiv Detail & Related papers (2022-11-18T14:28:21Z)
- Bayesian Optimization for Macro Placement [48.55456716632735]
We develop a novel approach to macro placement using Bayesian optimization (BO) over sequence pairs.
BO is a machine learning technique that uses a probabilistic surrogate model and an acquisition function.
We demonstrate our algorithm on the fixed-outline macro placement problem with the half-perimeter wire length objective.
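The half-perimeter wirelength (HPWL) objective has a standard definition: for each net, take the half-perimeter of the bounding box of its pins, and sum over all nets. A minimal sketch, with a hypothetical pin/net data layout:

```python
import numpy as np

def hpwl(pin_xy, nets):
    """Half-perimeter wirelength.
    pin_xy : (n_pins, 2) pin coordinates
    nets   : list of index arrays, one per net"""
    total = 0.0
    for net in nets:
        pts = pin_xy[net]
        # half-perimeter of the net's bounding box = width + height
        total += (pts[:, 0].max() - pts[:, 0].min()
                  + pts[:, 1].max() - pts[:, 1].min())
    return total
```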
arXiv Detail & Related papers (2022-07-18T06:17:06Z)
- Many Objective Bayesian Optimization [0.0]
Multi-objective Bayesian optimization (MOBO) is a set of methods that has been successfully applied for the simultaneous optimization of black-box functions.
In particular, MOBO methods have problems when the number of objectives in a multi-objective optimization problem is 3 or more, which is the many-objective setting.
We show empirical evidence, in a set of toy, synthetic, benchmark, and real experiments, of the effectiveness of the proposed metric and algorithm.
arXiv Detail & Related papers (2021-07-08T21:57:07Z)
- Bayesian Optimistic Optimisation with Exponentially Decaying Regret [58.02542541410322]
The current practical BO algorithms have regret bounds ranging from $\mathcal{O}(\frac{\log N}{\sqrt{N}})$ to $\mathcal{O}(e^{-\sqrt{N}})$, where $N$ is the number of evaluations.
This paper explores the possibility of improving the regret bound in the noiseless setting by intertwining concepts from BO and tree-based optimistic optimisation.
We propose the BOO algorithm, a first practical approach which can achieve an exponential regret bound of order $\mathcal{O}(N^{-\sqrt{N}})$.
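As a toy stand-in for the tree-based optimistic optimisation ingredient, here is a minimal deterministic optimistic search over an interval, assuming a known Lipschitz constant (this is not the BOO algorithm itself):

```python
import heapq

def doo_maximise(f, lipschitz, n_splits, lo=0.0, hi=1.0):
    """Deterministic optimistic optimisation on [lo, hi]: repeatedly
    split the cell whose optimistic upper bound f(mid) + L * halfwidth
    is largest. Each split costs two function evaluations."""
    mid = (lo + hi) / 2
    best_x, best_y = mid, f(mid)
    # Python's heapq is a min-heap, so store negated upper bounds.
    heap = [(-(best_y + lipschitz * (hi - lo) / 2), lo, hi)]
    for _ in range(n_splits):
        _, a, b = heapq.heappop(heap)          # most optimistic cell
        for c, d in ((a, (a + b) / 2), ((a + b) / 2, b)):
            m = (c + d) / 2
            y = f(m)
            if y > best_y:
                best_x, best_y = m, y
            heapq.heappush(heap, (-(y + lipschitz * (d - c) / 2), c, d))
    return best_x, best_y

# e.g. doo_maximise(lambda x: -(x - 0.3) ** 2, lipschitz=2.0, n_splits=30)
```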
arXiv Detail & Related papers (2021-05-10T13:07:44Z)
- Bayesian Algorithm Execution: Estimating Computable Properties of Black-box Functions Using Mutual Information [78.78486761923855]
In many real world problems, we want to infer some property of an expensive black-box function f, given a budget of T function evaluations.
We present a procedure, InfoBAX, that sequentially chooses queries that maximize mutual information with respect to the algorithm's output.
On these problems, InfoBAX uses up to 500 times fewer queries to f than required by the original algorithm.
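The criterion being maximised is the mutual information between the next observation and the algorithm's output, which can be written as an expected reduction in predictive entropy. A standard form of this quantity (notation assumed for illustration, not copied from the paper):

```latex
% Expected information gain of querying x about the algorithm output O_A,
% given data D: prior predictive entropy minus expected posterior entropy.
\mathrm{EIG}(x) \;=\; \mathbb{H}\!\left[\, p(y_x \mid \mathcal{D}) \,\right]
\;-\; \mathbb{E}_{O_A \sim p(O_A \mid \mathcal{D})}
\Big[\, \mathbb{H}\!\left[\, p(y_x \mid \mathcal{D}, O_A) \,\right] \Big]
```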
arXiv Detail & Related papers (2021-04-19T17:22:11Z)
- Cauchy-Schwarz Regularized Autoencoder [68.80569889599434]
Variational autoencoders (VAEs) are a powerful and widely-used class of generative models.
We introduce a new constrained objective based on the Cauchy-Schwarz divergence, which can be computed analytically for GMMs.
Our objective improves upon variational auto-encoding models in density estimation, unsupervised clustering, semi-supervised learning, and face analysis.
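The closed form for GMMs follows from the Gaussian product identity $\int N(x; m_1, S_1)\,N(x; m_2, S_2)\,dx = N(m_1; m_2, S_1 + S_2)$. A minimal sketch of the Cauchy-Schwarz divergence between two mixtures (a generic implementation, not the authors' code):

```python
import numpy as np
from scipy.stats import multivariate_normal as mvn

def gmm_overlap(w1, mu1, cov1, w2, mu2, cov2):
    """Integral of p(x) q(x) over x for two Gaussian mixtures, using
    the identity: int N(x; m1, S1) N(x; m2, S2) dx = N(m1; m2, S1+S2)."""
    total = 0.0
    for a, ma, Sa in zip(w1, mu1, cov1):
        for b, mb, Sb in zip(w2, mu2, cov2):
            total += a * b * mvn.pdf(ma, mean=mb, cov=Sa + Sb)
    return total

def cauchy_schwarz_divergence(w1, mu1, cov1, w2, mu2, cov2):
    """D_CS(p, q) = -log int pq + 0.5 log int p^2 + 0.5 log int q^2,
    with every term in closed form for GMMs."""
    pq = gmm_overlap(w1, mu1, cov1, w2, mu2, cov2)
    pp = gmm_overlap(w1, mu1, cov1, w1, mu1, cov1)
    qq = gmm_overlap(w2, mu2, cov2, w2, mu2, cov2)
    return -np.log(pq) + 0.5 * np.log(pp) + 0.5 * np.log(qq)
```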
arXiv Detail & Related papers (2021-01-06T17:36:26Z)
- Multi-Fidelity Bayesian Optimization via Deep Neural Networks [19.699020509495437]
In many applications, the objective function can be evaluated at multiple fidelities to enable a trade-off between the cost and accuracy.
We propose Deep Neural Network Multi-Fidelity Bayesian Optimization (DNN-MFBO) that can flexibly capture all kinds of complicated relationships between the fidelities.
We show the advantages of our method in both synthetic benchmark datasets and real-world applications in engineering design.
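One common way to realise the cost/accuracy trade-off described above is to score each (fidelity, candidate) pair by acquisition value per unit query cost; a minimal generic sketch of that selection rule (not the DNN-MFBO acquisition):

```python
import numpy as np

def next_query(acq_per_fidelity, costs):
    """Pick the (fidelity, candidate) pair maximising acquisition
    value per unit cost.
    acq_per_fidelity : list of 1-d arrays, one per fidelity
    costs            : query cost of each fidelity"""
    best = None
    for m, (acq, c) in enumerate(zip(acq_per_fidelity, costs)):
        i = int(np.argmax(acq))                # best candidate at fidelity m
        score = acq[i] / c                     # value per unit cost
        if best is None or score > best[0]:
            best = (score, m, i)
    _, fidelity, cand_idx = best
    return fidelity, cand_idx
```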
arXiv Detail & Related papers (2020-07-06T23:28:40Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences arising from its use.