BOIS: Bayesian Optimization of Interconnected Systems
- URL: http://arxiv.org/abs/2311.11254v3
- Date: Wed, 29 Nov 2023 02:32:02 GMT
- Title: BOIS: Bayesian Optimization of Interconnected Systems
- Authors: Leonardo D. González and Victor M. Zavala
- Abstract summary: We introduce a new paradigm which allows for the efficient use of composite functions in BO.
We show that this simple approach (which we call BOIS) enables the exploitation of structural knowledge.
Our results indicate that BOIS achieves performance gains and accurately captures the statistics of composite functions.
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Bayesian optimization (BO) has proven to be an effective paradigm for the
global optimization of expensive-to-sample systems. One of the main advantages
of BO is its use of Gaussian processes (GPs) to characterize model uncertainty
which can be leveraged to guide the learning and search process. However, BO
typically treats systems as black boxes, which limits its ability to exploit
structural knowledge (e.g., physics and sparse interconnections). Composite
functions of the form $f(x, y(x))$, wherein GP modeling is shifted from the
performance function $f$ to an intermediate function $y$, offer an avenue for
exploiting structural knowledge. However, the use of composite functions in a
BO framework is complicated by the need to generate a probability density for
$f$ from the Gaussian density of $y$ calculated by the GP (e.g., when $f$ is
nonlinear it is not possible to obtain a closed-form expression). Previous work
has handled this issue using sampling techniques; these are easy to implement
and flexible but are computationally intensive. In this work, we introduce a
new paradigm which allows for the efficient use of composite functions in BO;
this uses adaptive linearizations of $f$ to obtain closed-form expressions for
the statistical moments of the composite function. We show that this simple
approach (which we call BOIS) enables the exploitation of structural knowledge,
such as that arising in interconnected systems as well as systems that embed
multiple GP models and combinations of physics and GP models. Using a chemical
process optimization case study, we benchmark the effectiveness of BOIS against
standard BO and sampling approaches. Our results indicate that BOIS achieves
performance gains and accurately captures the statistics of composite
functions.
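To make the linearization step concrete: if the GP posterior over the intermediates is $y(x) \sim \mathcal{N}(\mu_y(x), \Sigma_y(x))$, a first-order expansion of $f$ around $\mu_y$ yields closed-form moments for the composite. The numpy sketch below illustrates this idea only; it is not the authors' implementation, and the finite-difference Jacobian is purely for demonstration.

```python
import numpy as np

def composite_moments(f, x, mu_y, Sigma_y, eps=1e-6):
    """Closed-form moments of f(x, y(x)) via linearization of f in y.

    With y ~ N(mu_y, Sigma_y) from the GP and the first-order expansion
        f(x, y) ~= f(x, mu_y) + J (y - mu_y),   J = df/dy at y = mu_y,
    the composite moments are
        E[f]   ~= f(x, mu_y),
        Var[f] ~= J Sigma_y J^T.
    """
    f0 = f(x, mu_y)
    # Finite-difference Jacobian w.r.t. y (illustration only; an
    # analytic Jacobian would normally be used).
    J = np.zeros_like(mu_y)
    for i in range(mu_y.size):
        y_pert = mu_y.copy()
        y_pert[i] += eps
        J[i] = (f(x, y_pert) - f0) / eps
    return f0, J @ Sigma_y @ J

# Toy composite: f is nonlinear in y, so no exact closed form exists.
f = lambda x, y: np.sin(y[0]) + x * y[1] ** 2
mu_y = np.array([0.3, 1.2])                 # GP posterior mean of y(x)
Sigma_y = np.array([[0.04, 0.01],
                    [0.01, 0.09]])          # GP posterior covariance
m, v = composite_moments(f, 0.5, mu_y, Sigma_y)
print(f"E[f] ~ {m:.4f}, Var[f] ~ {v:.4f}")
```

In a BO loop, an acquisition function such as a confidence bound can then be assembled directly from these two moments; recomputing the Jacobian at each candidate x is what makes the linearization adaptive.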
Related papers
- Sample-efficient Bayesian Optimisation Using Known Invariances [56.34916328814857]
We show that vanilla and constrained BO algorithms are inefficient when optimising invariant objectives.
We derive a bound on the maximum information gain of kernels that encode these invariances.
We use our method to design a current drive system for a nuclear fusion reactor, finding a high-performance solution.
arXiv Detail & Related papers (2024-10-22T12:51:46Z)
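For intuition, one standard way to encode a known finite symmetry group into the surrogate is to average a base kernel over the group; the sketch below shows this generic construction and is not necessarily the kernel analyzed in the paper.

```python
import numpy as np

def rbf(x, xp, ell=0.5):
    # Standard RBF base kernel.
    return np.exp(-np.sum((x - xp) ** 2) / (2 * ell ** 2))

def invariant_kernel(x, xp, group, base=rbf):
    """Group-averaged kernel: k_G(x, x') = mean over g, g' in G of
    base(g(x), g'(x')). A GP with this kernel has sample paths that
    respect the invariance, so no samples are spent relearning it.
    """
    return np.mean([[base(g(x), g2(xp)) for g2 in group] for g in group])

# Example: the objective is known to be symmetric under x -> -x.
group = [lambda x: x, lambda x: -x]
x, xp = np.array([0.4]), np.array([-0.4])
print(invariant_kernel(x, xp, group))   # equals invariant_kernel(x, x, group)
```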
- Poisson Process for Bayesian Optimization [126.51200593377739]
We propose a ranking-based surrogate model based on the Poisson process and introduce an efficient BO framework, namely Poisson Process Bayesian Optimization (PoPBO).
Compared to the classic GP-BO method, our PoPBO has lower costs and better robustness to noise, which is verified by abundant experiments.
arXiv Detail & Related papers (2024-02-05T02:54:50Z)
- Joint Composite Latent Space Bayesian Optimization [15.262166538890243]
We introduce Joint Composite Latent Space Bayesian Optimization (JoCo), a framework that jointly trains neural network encoders and probabilistic models to adaptively compress high-dimensional input and output spaces into manageable latent representations.
This enables viable BO on these compressed representations, allowing JoCo to outperform other state-of-the-art methods in high-dimensional BO on a wide variety of simulated and real-world problems.
arXiv Detail & Related papers (2023-11-03T19:53:37Z)
- Bayesian Optimization for Function Compositions with Applications to Dynamic Pricing [0.0]
We propose a practical BO method of function compositions where the form of the composition is known but the constituent functions are expensive to evaluate.
We demonstrate a novel application to dynamic pricing in revenue management when the underlying demand function is expensive to evaluate.
arXiv Detail & Related papers (2023-03-21T15:45:06Z)
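A minimal Monte Carlo sketch of the kind of composite expected improvement such methods compute, assuming minimization, a known outer function f, and a Cholesky factor of the GP posterior covariance of the inner function (all names here are illustrative, not the paper's code):

```python
import numpy as np

def composite_ei(f, x, mu_y, L_y, f_best, n_samples=2000, rng=None):
    """Monte Carlo expected improvement for a composite f(x, y(x)).

    Draws y ~ N(mu_y, L_y L_y^T) from the GP posterior over the inner
    function, pushes the samples through the known outer function f,
    and averages the improvement over the incumbent (minimization).
    """
    rng = np.random.default_rng(rng)
    z = rng.standard_normal((n_samples, mu_y.size))
    ys = mu_y + z @ L_y.T                   # posterior samples of y(x)
    fs = np.array([f(x, y) for y in ys])    # composite evaluations
    return np.mean(np.maximum(f_best - fs, 0.0))

# Toy usage with a nonlinear outer function.
f = lambda x, y: np.sin(y[0]) + x * y[1] ** 2
Sigma_y = np.array([[0.04, 0.01], [0.01, 0.09]])
L_y = np.linalg.cholesky(Sigma_y)
print(composite_ei(f, 0.5, np.array([0.3, 1.2]), L_y, f_best=0.2, rng=0))
```

This is the sampling-based baseline that BOIS's closed-form linearization is designed to replace.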
- Model-based Causal Bayesian Optimization [78.120734120667]
We propose model-based causal Bayesian optimization (MCBO), which learns a full system model instead of only modeling intervention-reward pairs.
Unlike in standard Bayesian optimization, our acquisition function cannot be evaluated in closed form.
arXiv Detail & Related papers (2022-11-18T14:28:21Z)
- Pre-training helps Bayesian optimization too [49.28382118032923]
We seek an alternative practice for setting functional priors.
In particular, we consider the scenario where we have data from similar functions that allow us to pre-train a tighter distribution a priori.
Our results show that our method locates good hyperparameters at least 3 times more efficiently than the best competing methods.
arXiv Detail & Related papers (2022-07-07T04:42:54Z)
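The sketch below shows only the simplest version of this idea, transferring kernel hyperparameters fitted by marginal likelihood on data from related functions; the paper's actual pre-training procedure is richer, and all data here are synthetic placeholders.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

# Hypothetical pre-training data drawn from functions similar to the target.
rng = np.random.default_rng(0)
X_pre = rng.uniform(-2, 2, size=(40, 1))
y_pre = np.sin(3 * X_pre[:, 0]) + 0.1 * rng.standard_normal(40)

# Fit kernel hyperparameters (lengthscale, signal variance) on the
# related data by maximizing the GP marginal likelihood.
pre_gp = GaussianProcessRegressor(kernel=ConstantKernel() * RBF(),
                                  n_restarts_optimizer=5)
pre_gp.fit(X_pre, y_pre)

# Reuse the learned kernel as a tighter prior for the target task:
# clone it with its fitted hyperparameters and freeze them.
tight_kernel = pre_gp.kernel_.clone_with_theta(pre_gp.kernel_.theta)
target_gp = GaussianProcessRegressor(kernel=tight_kernel, optimizer=None)
```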
- A General Recipe for Likelihood-free Bayesian Optimization [115.82591413062546]
We propose likelihood-free BO (LFBO) to extend BO to a broader class of models and utilities.
LFBO directly models the acquisition function without having to separately perform inference with a probabilistic surrogate model.
We show that computing the acquisition function in LFBO can be reduced to optimizing a weighted classification problem.
arXiv Detail & Related papers (2022-06-27T03:55:27Z)
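A toy rendering of that reduction for an EI-style utility, with a simplified weighting scheme and an arbitrary classifier choice (the paper's formulation is more general):

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

def lfbo_acquisition(X, y, gamma=0.33):
    """Likelihood-free acquisition via weighted binary classification.

    Points are labeled by whether they beat the gamma-quantile tau of
    observed values (minimization); positives are weighted by their
    improvement (tau - y), so the classifier's predicted probability
    acts as an EI-like acquisition, with no GP surrogate anywhere.
    """
    tau = np.quantile(y, gamma)
    z = (y < tau).astype(int)                # 1 = improves on tau
    w = np.where(z == 1, tau - y, 1.0)       # improvement-weighted positives
    clf = LogisticRegression().fit(X, z, sample_weight=w)
    return lambda X_new: clf.predict_proba(X_new)[:, 1]

# Toy usage: candidates near the optimum score higher.
rng = np.random.default_rng(1)
X = rng.uniform(-2, 2, size=(50, 1))
y = (X[:, 0] - 0.7) ** 2 + 0.05 * rng.standard_normal(50)
acq = lfbo_acquisition(X, y)
print(acq(np.array([[0.7], [-1.5]])))
```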
- Surrogate modeling for Bayesian optimization beyond a single Gaussian process [62.294228304646516]
We propose a novel Bayesian surrogate model to balance exploration with exploitation of the search space.
To make function sampling scalable, a random feature-based kernel approximation is leveraged per GP model.
To establish convergence of the proposed EGP-TS to the global optimum, an analysis based on Bayesian regret is conducted.
arXiv Detail & Related papers (2022-05-27T16:43:10Z)
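Random feature-based sampling is a standard trick for drawing entire GP posterior sample paths, which Thompson sampling then optimizes; the sketch below covers a single RBF GP and omits the ensemble machinery of EGP-TS.

```python
import numpy as np

def rff_posterior_sample(X, y, D=200, ell=0.5, noise=1e-2, rng=None):
    """One approximate GP posterior sample path via random Fourier
    features: an RBF GP is approximated by Bayesian linear regression
    on phi(x) = sqrt(2/D) * cos(W x + b), and a single weight draw
    from the posterior defines an entire function sample.
    """
    rng = np.random.default_rng(rng)
    W = rng.standard_normal((D, X.shape[1])) / ell
    b = rng.uniform(0, 2 * np.pi, D)
    phi = lambda X_: np.sqrt(2.0 / D) * np.cos(X_ @ W.T + b)
    P = phi(X)
    A = P.T @ P + noise * np.eye(D)          # posterior precision (scaled)
    mu = np.linalg.solve(A, P.T @ y)
    w = rng.multivariate_normal(mu, noise * np.linalg.inv(A))
    return lambda X_: phi(X_) @ w

# Thompson sampling step: minimize one posterior sample path.
X = np.random.default_rng(0).uniform(-1, 1, (20, 1))
y = np.sin(3 * X[:, 0])
sample = rff_posterior_sample(X, y, rng=1)
grid = np.linspace(-1, 1, 201)[:, None]
x_next = grid[np.argmin(sample(grid))]
```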
- $\pi$BO: Augmenting Acquisition Functions with User Beliefs for Bayesian Optimization [40.30019289383378]
We propose $\pi$BO, an acquisition function generalization which incorporates prior beliefs about the location of the optimum.
In contrast to previous approaches, $\pi$BO is conceptually simple and can easily be integrated with existing libraries and many acquisition functions.
We also demonstrate that $\pi$BO improves on the state-of-the-art performance for a popular deep learning task, with a 12.5$\times$ time-to-accuracy speedup over prominent BO approaches.
arXiv Detail & Related papers (2022-04-23T11:07:13Z)
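A compact sketch of the $\pi$BO idea, weighting any base acquisition by a user prior whose influence decays with the number of observations; treat the exact decay schedule here as an assumption rather than the paper's definitive form.

```python
import numpy as np

def pibo_acquisition(base_acq, prior, n_obs, beta=10.0):
    """pi-BO-style acquisition: multiply any base acquisition by the
    user's prior over the optimum's location, raised to beta / n so
    the prior's influence decays as observations accumulate.
    """
    return lambda x: base_acq(x) * prior(x) ** (beta / max(n_obs, 1))

# Toy usage: a Gaussian belief that the optimum lies near x = 0.2.
prior = lambda x: np.exp(-((x - 0.2) ** 2) / (2 * 0.1 ** 2))
flat_acq = lambda x: np.ones_like(x)        # stand-in for EI, UCB, etc.
x = np.linspace(0, 1, 5)
print(pibo_acquisition(flat_acq, prior, n_obs=2)(x))    # prior dominates
print(pibo_acquisition(flat_acq, prior, n_obs=200)(x))  # prior washed out
```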
- MBORE: Multi-objective Bayesian Optimisation by Density-Ratio Estimation [0.01652719262940403]
Optimisation problems often have multiple conflicting objectives that are computationally and/or financially expensive to evaluate.
Mono-surrogate Bayesian optimisation (BO) is a popular model-based approach for optimising such black-box functions.
We extend previous work on BO by density-ratio estimation (BORE) to the multi-objective setting.
arXiv Detail & Related papers (2022-03-31T09:27:59Z)
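A rough sketch of the density-ratio idea in a multi-objective setting: scalarize the objective vectors (here with a simple domination count, which stands in for MBORE's actual transformations), split at a quantile, and use the classifier's class-probability as the acquisition.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def domination_count(Y):
    """Number of observed points dominating each point (0 = Pareto
    non-dominated). A simple stand-in scalarization, not MBORE's own."""
    return np.array([np.sum(np.all(Y <= yi, axis=1) & np.any(Y < yi, axis=1))
                     for yi in Y])

def mbore_acquisition(X, Y, gamma=0.25):
    """Multi-objective density-ratio sketch: scalarize the objective
    vectors, split at the gamma-quantile, and use the classifier's
    probability of the 'good' class as the acquisition function.
    """
    s = domination_count(Y)
    z = (s <= np.quantile(s, gamma)).astype(int)
    clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, z)
    return lambda X_new: clf.predict_proba(X_new)[:, 1]

# Toy usage with two conflicting objectives (both minimized).
rng = np.random.default_rng(2)
X = rng.uniform(0, 1, size=(60, 2))
Y = np.stack([(X[:, 0] - 0.3) ** 2 + 0.1 * X[:, 1],
              (X[:, 1] - 0.6) ** 2 + 0.1 * X[:, 0]], axis=1)
acq = mbore_acquisition(X, Y)
```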