On the development of a Bayesian optimisation framework for complex
unknown systems
- URL: http://arxiv.org/abs/2207.09154v1
- Date: Tue, 19 Jul 2022 09:50:34 GMT
- Title: On the development of a Bayesian optimisation framework for complex
unknown systems
- Authors: Mike Diessner, Yu Guan, Kevin J. Wilson, Richard D. Whalley
- Abstract summary: This paper studies and compares common Bayesian optimisation algorithms empirically on a range of synthetic test functions.
It investigates the choice of acquisition function and number of training samples, and compares exact calculation of acquisition functions with Monte Carlo-based approaches.
- Score: 11.066706766632578
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Bayesian optimisation provides an effective method to optimise expensive
black box functions. It has recently been applied to problems in fluid
dynamics. This paper studies and compares common Bayesian optimisation
algorithms empirically on a range of synthetic test functions. It investigates
the choice of acquisition function and number of training samples, exact
calculation of acquisition functions versus Monte Carlo-based approaches, and
both single-point and multi-point optimisation. The test functions considered cover
a wide selection of challenges and therefore serve as an ideal test bed to
understand the performance of Bayesian optimisation and to identify general
situations where Bayesian optimisation performs well and poorly. This knowledge
can be utilised in applications, including those in fluid dynamics, where
objective functions are unknown. The results of this investigation show that
the choices to be made are less relevant for relatively simple functions, while
optimistic acquisition functions such as Upper Confidence Bound should be
preferred for more complex objective functions. Furthermore, results from the
Monte Carlo approach are comparable to results from analytical acquisition
functions. In instances where the objective function allows parallel
evaluations, the multi-point approach offers a quicker alternative, although it
may require more objective function evaluations overall.
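To make the comparisons above concrete, here is a minimal sketch (not the paper's code) of the design choices the abstract discusses, using the BoTorch library: an analytic single-point Upper Confidence Bound versus its Monte Carlo, multi-point (q-batch) counterpart. The toy objective, beta, batch size, and optimiser settings are illustrative assumptions.

```python
import torch
from botorch.models import SingleTaskGP
from botorch.fit import fit_gpytorch_mll
from botorch.acquisition import UpperConfidenceBound, qUpperConfidenceBound
from botorch.optim import optimize_acqf
from gpytorch.mlls import ExactMarginalLogLikelihood

train_X = torch.rand(10, 2, dtype=torch.double)          # initial training samples
train_Y = -(train_X - 0.5).pow(2).sum(-1, keepdim=True)  # toy objective (maximise)

gp = SingleTaskGP(train_X, train_Y)
fit_gpytorch_mll(ExactMarginalLogLikelihood(gp.likelihood, gp))

bounds = torch.stack([torch.zeros(2), torch.ones(2)]).double()

# Exact (analytic) single-point UCB.
ucb = UpperConfidenceBound(gp, beta=2.0)
x_next, _ = optimize_acqf(ucb, bounds=bounds, q=1, num_restarts=10, raw_samples=256)

# Monte Carlo multi-point UCB: proposes q=4 points for parallel evaluation.
qucb = qUpperConfidenceBound(gp, beta=2.0)
x_batch, _ = optimize_acqf(qucb, bounds=bounds, q=4, num_restarts=10, raw_samples=256)
```

As the abstract notes, the q=4 batch can be evaluated in parallel for wall-clock savings, at the possible cost of more total objective function evaluations.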
Related papers
- Poisson Process for Bayesian Optimization [126.51200593377739]
We propose a ranking-based surrogate model based on the Poisson process and introduce an efficient BO framework, namely Poisson Process Bayesian Optimization (PoPBO).
Compared to the classic GP-BO method, our PoPBO has lower costs and better robustness to noise, which is verified by abundant experiments.
arXiv Detail & Related papers (2024-02-05T02:54:50Z) - Mastering the exploration-exploitation trade-off in Bayesian
Optimization [0.2538209532048867]
The acquisition function drives the choice of the next solution to evaluate, balancing between exploration and exploitation.
This paper proposes a novel acquisition function that adaptively masters the trade-off between explorative and exploitative choices.
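The paper's adaptive acquisition function is not reproduced here; for orientation, the classic Upper Confidence Bound acquisition (also discussed in the main abstract above) makes the trade-off explicit:

```latex
\alpha_{\mathrm{UCB}}(x) \;=\; \mu(x) + \sqrt{\beta}\,\sigma(x)
```

where \mu(x) is the surrogate's posterior mean (exploitation), \sigma(x) its posterior standard deviation (exploration), and \beta weights the two; larger \beta gives the "optimistic" behaviour the main paper finds helpful on complex objectives.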
arXiv Detail & Related papers (2023-05-15T13:19:03Z) - Generalizing Bayesian Optimization with Decision-theoretic Entropies [102.82152945324381]
We consider a generalization of Shannon entropy from work in statistical decision theory.
We first show that special cases of this entropy lead to popular acquisition functions used in BO procedures.
We then show how alternative choices for the loss yield a flexible family of acquisition functions.
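As a hedged gloss of the construction described (the standard decision-theoretic entropy from statistical decision theory; the notation here is an assumption on my part): for a posterior p over the unknown \theta, an action set \mathcal{A}, and a loss \ell,

```latex
H_{\ell,\mathcal{A}}[p] \;=\; \inf_{a \in \mathcal{A}} \; \mathbb{E}_{\theta \sim p}\big[\ell(\theta, a)\big]
```

Shannon entropy is recovered when \ell is the log loss; other losses yield other acquisition functions, which is the flexibility the abstract refers to.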
arXiv Detail & Related papers (2022-10-04T04:43:58Z) - Batch Bayesian Optimization via Particle Gradient Flows [0.5735035463793008]
We show how to find global optima of objective functions which are only available as a black-box or are expensive to evaluate.
We construct a new acquisition function based on multipoint expected improvement, defined over the space of probability measures.
arXiv Detail & Related papers (2022-09-10T18:10:15Z) - Surrogate modeling for Bayesian optimization beyond a single Gaussian
process [62.294228304646516]
We propose a novel Bayesian surrogate model to balance exploration with exploitation of the search space.
To endow function sampling with scalability, random feature-based kernel approximation is leveraged per GP model.
To further establish convergence of the proposed EGP-TS (ensemble GP with Thompson sampling) to the global optimum, an analysis is conducted based on the notion of Bayesian regret.
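The scalability remark about random features can be unpacked with a small self-contained sketch (generic random Fourier features, not the paper's EGP code; the data, lengthscale, and noise level are assumptions): the RBF kernel is approximated by a finite feature map, so GP posterior sampling reduces to Bayesian linear regression, which makes Thompson sampling cheap.

```python
import numpy as np

rng = np.random.default_rng(0)
d, n_features, lengthscale = 1, 500, 0.3

# Fixed random projection, shared by every feature evaluation.
W = rng.normal(scale=1.0 / lengthscale, size=(d, n_features))
b = rng.uniform(0.0, 2.0 * np.pi, size=n_features)

def phi(X):
    """Random Fourier features: phi(x) @ phi(x') approximates an RBF kernel."""
    return np.sqrt(2.0 / n_features) * np.cos(X @ W + b)

# Toy data: in feature space, GP regression becomes Bayesian linear regression.
X = rng.uniform(size=(20, d))
y = np.sin(6.0 * X[:, 0]) + 0.1 * rng.normal(size=20)

noise = 1e-2
A = phi(X).T @ phi(X) + noise * np.eye(n_features)
w_mean = np.linalg.solve(A, phi(X).T @ y)

# One Thompson-sampling step: a single posterior draw of the weights gives a
# cheap deterministic surrogate whose argmax is the next point to evaluate.
w_sample = rng.multivariate_normal(w_mean, noise * np.linalg.inv(A))
x_grid = np.linspace(0.0, 1.0, 200)[:, None]
x_next = x_grid[np.argmax(phi(x_grid) @ w_sample)]
```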
arXiv Detail & Related papers (2022-05-27T16:43:10Z) - Efficient Neural Network Analysis with Sum-of-Infeasibilities [64.31536828511021]
Inspired by sum-of-infeasibilities methods in convex optimization, we propose a novel procedure for analyzing verification queries on networks with extensive branching functions.
An extension to a canonical case-analysis-based complete search procedure can be achieved by replacing the convex procedure executed at each search state with DeepSoI.
arXiv Detail & Related papers (2022-03-19T15:05:09Z) - Optimizing Bayesian acquisition functions in Gaussian Processes [0.0]
This paper analyzes different acquisition functions, such as Probability of Maximum Improvement and Expected Improvement.
Along with an analysis of the time taken, the paper also shows the importance of the positions of the initial samples chosen.
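For reference, hedged sketches of the standard closed forms of the two acquisition functions analysed (under a maximisation convention; mu and sigma denote the GP posterior mean and standard deviation at a candidate point, best_f the incumbent value, and xi an assumed exploration offset):

```python
import numpy as np
from scipy.stats import norm

def probability_of_improvement(mu, sigma, best_f, xi=0.01):
    """P(f(x) > best_f + xi) under the GP posterior N(mu, sigma^2)."""
    z = (mu - best_f - xi) / sigma
    return norm.cdf(z)

def expected_improvement(mu, sigma, best_f, xi=0.01):
    """E[max(f(x) - best_f - xi, 0)] under the GP posterior N(mu, sigma^2)."""
    z = (mu - best_f - xi) / sigma
    return (mu - best_f - xi) * norm.cdf(z) + sigma * norm.pdf(z)
```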
arXiv Detail & Related papers (2021-11-09T03:25:15Z) - Are we Forgetting about Compositional Optimisers in Bayesian
Optimisation? [66.39551991177542]
This paper presents a sample-efficient methodology for global optimisation.
Within this framework, a crucial performance-determining subroutine is the maximisation of the acquisition function.
We highlight the empirical advantages of the compositional approach to acquisition function maximisation across 3958 individual experiments.
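The entry's point is that the inner loop of BO, maximising the acquisition function, largely determines performance. The paper advocates compositional optimisers; as a hedged baseline for contrast, here is the common multi-start gradient-based scheme (function names and settings are illustrative):

```python
import numpy as np
from scipy.optimize import minimize

def maximise_acquisition(acq, bounds, n_restarts=20, seed=0):
    """Maximise acquisition `acq` over box `bounds` (shape (d, 2)) by
    running L-BFGS-B from several random starting points."""
    rng = np.random.default_rng(seed)
    lo, hi = bounds[:, 0], bounds[:, 1]
    best_x, best_val = None, -np.inf
    for _ in range(n_restarts):
        x0 = rng.uniform(lo, hi)
        res = minimize(lambda x: -acq(x), x0, method="L-BFGS-B",
                       bounds=list(zip(lo, hi)))
        if -res.fun > best_val:
            best_x, best_val = res.x, -res.fun
    return best_x

# Example: maximise a toy acquisition over [0, 1]^2.
x_next = maximise_acquisition(lambda x: -np.sum((x - 0.3) ** 2),
                              bounds=np.array([[0.0, 1.0], [0.0, 1.0]]))
```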
arXiv Detail & Related papers (2020-12-15T12:18:38Z) - Incorporating Expert Prior in Bayesian Optimisation via Space Warping [54.412024556499254]
In big search spaces the algorithm goes through several low function value regions before reaching the optimum of the function.
One approach to alleviate this cold-start phase is to use prior knowledge that can accelerate the optimisation.
In this paper, we represent the prior knowledge about the function optimum through a prior distribution.
The prior distribution is then used to warp the search space so that it expands around the high-probability region of the function optimum and shrinks around the low-probability regions.
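A hedged one-dimensional illustration of the warping idea (not the paper's exact construction; the Gaussian prior location and scale are assumptions): optimising in a warped coordinate u and mapping it through the inverse CDF of the prior concentrates candidate points where the prior places the optimum.

```python
import numpy as np
from scipy.stats import norm

prior = norm(loc=0.7, scale=0.1)  # assumed expert prior: optimum near x = 0.7

def warp(u):
    """Map the warped coordinate u in (0, 1) back to the original space."""
    return prior.ppf(u)

u_candidates = np.linspace(0.01, 0.99, 9)
print(warp(u_candidates))  # points cluster near 0.7, sparse in the tails
```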
arXiv Detail & Related papers (2020-03-27T06:18:49Z) - Composition of kernel and acquisition functions for High Dimensional
Bayesian Optimization [0.1749935196721634]
We use the additivity of the objective function to map both the kernel and the acquisition function of the Bayesian optimization into lower-dimensional subspaces.
This approach makes the learning/updating of the probabilistic surrogate model more efficient.
Results are presented for real-life application, that is the control of pumps in urban water distribution systems.
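A hedged sketch of the additivity idea (the coordinate groups and lengthscale are assumptions, not from the paper): if the objective decomposes as a sum over disjoint groups of inputs, the kernel decomposes into a sum of low-dimensional kernels, and each term can be modelled and optimised in its own subspace.

```python
import numpy as np

def rbf(A, B, lengthscale=0.5):
    """RBF Gram matrix between row-stacked inputs A (n, k) and B (m, k)."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * lengthscale ** 2))

def additive_kernel(X1, X2, groups=((0, 1), (2, 3))):
    """k(x, x') = sum_i k_i(x[g_i], x'[g_i]) over disjoint coordinate groups."""
    return sum(rbf(X1[:, list(g)], X2[:, list(g)]) for g in groups)

X = np.random.default_rng(0).uniform(size=(5, 4))
K = additive_kernel(X, X)  # 5 x 5 Gram matrix of the additive kernel
```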
arXiv Detail & Related papers (2020-03-09T15:45:57Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of this list (including all information) and is not responsible for any consequences of its use.