Mean-Field Bayesian Optimisation
- URL: http://arxiv.org/abs/2502.12315v1
- Date: Mon, 17 Feb 2025 20:34:29 GMT
- Title: Mean-Field Bayesian Optimisation
- Authors: Petar Steinberg, Juliusz Ziomek, Matej Jusup, Ilija Bogunovic
- Abstract summary: We address the problem of optimising the average payoff for a large number of cooperating agents, where the payoff function is unknown and treated as a black box.
We introduce MF-GP-UCB, a novel efficient algorithm designed to optimise agent payoffs in this setting.
Empirical results demonstrate that MF-GP-UCB significantly outperforms existing benchmarks.
- Score: 11.624033826631448
- License:
- Abstract: We address the problem of optimising the average payoff for a large number of cooperating agents, where the payoff function is unknown and treated as a black box. While standard Bayesian Optimisation (BO) methods struggle with the scalability required for high-dimensional input spaces, we demonstrate how leveraging the mean-field assumption on the black-box function can transform BO into an efficient and scalable solution. Specifically, we introduce MF-GP-UCB, a novel efficient algorithm designed to optimise agent payoffs in this setting. Our theoretical analysis establishes a regret bound for MF-GP-UCB that is independent of the number of agents, contrasting sharply with the exponential dependence observed when naive BO methods are applied. We evaluate our algorithm on a diverse set of tasks, including real-world problems, such as optimising the location of public bikes for a bike-sharing programme, distributing taxi fleets, and selecting refuelling ports for maritime vessels. Empirical results demonstrate that MF-GP-UCB significantly outperforms existing benchmarks, offering substantial improvements in performance and scalability, constituting a promising solution for mean-field, black-box optimisation. The code is available at https://github.com/petarsteinberg/MF-BO.
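The abstract does not spell out the algorithm, but the core idea of pairing a mean-field assumption with GP-UCB can be illustrated with a short sketch (not the authors' implementation; see the linked repository for that). The sketch assumes, for illustration only, a one-dimensional per-agent action space, a payoff function shared by all agents, and the same action deployed to every agent each round, so the noisy evaluations from all agents are pooled into a single low-dimensional GP surrogate whose input dimension does not grow with the number of agents. `agent_payoff`, `BETA`, and the candidate grid are placeholders.

```python
# Illustrative sketch of mean-field-style GP-UCB (not the authors' reference code).
# Assumption: the average payoff decomposes across agents, so evaluations from all
# agents can be pooled into one GP over a single agent's low-dimensional action space.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

rng = np.random.default_rng(0)
N_AGENTS = 50          # number of cooperating agents
N_ROUNDS = 30          # BO iterations
BETA = 2.0             # UCB exploration parameter (placeholder value)
CANDIDATES = np.linspace(0.0, 1.0, 201).reshape(-1, 1)  # 1-D action grid per agent

def agent_payoff(x):
    """Hypothetical black-box per-agent payoff (unknown to the optimiser)."""
    return np.sin(6 * x) + 0.1 * rng.normal(size=x.shape)

X_hist, y_hist = [], []
for t in range(N_ROUNDS):
    if X_hist:
        gp = GaussianProcessRegressor(kernel=ConstantKernel() * RBF(), alpha=1e-2)
        gp.fit(np.vstack(X_hist), np.concatenate(y_hist))
        mu, sigma = gp.predict(CANDIDATES, return_std=True)
        ucb = mu + np.sqrt(BETA) * sigma           # GP-UCB acquisition
        x_next = CANDIDATES[np.argmax(ucb)]
    else:
        x_next = CANDIDATES[rng.integers(len(CANDIDATES))]
    # Deploy the same action to every agent and pool their noisy payoffs, so the
    # surrogate sees N_AGENTS observations per round but stays low-dimensional.
    xs = np.repeat(x_next.reshape(1, -1), N_AGENTS, axis=0)
    ys = agent_payoff(xs[:, 0])
    X_hist.append(xs)
    y_hist.append(ys)

print("Best average payoff found:", max(np.mean(y) for y in y_hist))
```

The key design point this sketch tries to convey is that the surrogate's input dimension is fixed by a single agent's action space rather than by the joint action space, which is what makes a regret bound independent of the number of agents plausible.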
Related papers
- Cost-aware Bayesian Optimization via the Pandora's Box Gittins Index [57.045952766988925]
We develop a previously-unexplored connection between cost-aware Bayesian optimization and the Pandora's Box problem, a decision problem from economics.
Our work constitutes a first step towards integrating techniques from Gittins index theory into Bayesian optimization.
arXiv Detail & Related papers (2024-06-28T17:20:13Z)
- Cost-Sensitive Multi-Fidelity Bayesian Optimization with Transfer of Learning Curve Extrapolation [55.75188191403343]
We introduce a utility function, predefined by each user, that describes the trade-off between the cost and performance of BO.
We validate our algorithm on various learning-curve (LC) datasets and find that it outperforms all previous multi-fidelity BO and transfer-BO baselines we consider.
arXiv Detail & Related papers (2024-05-28T07:38:39Z)
- Poisson Process for Bayesian Optimization [126.51200593377739]
We propose a ranking-based surrogate model based on the Poisson process and introduce an efficient BO framework, namely Poisson Process Bayesian Optimization (PoPBO).
Compared to the classic GP-BO method, our PoPBO has lower costs and better robustness to noise, as verified by extensive experiments.
arXiv Detail & Related papers (2024-02-05T02:54:50Z)
- LABCAT: Locally adaptive Bayesian optimization using principal-component-aligned trust regions [0.0]
We propose the LABCAT algorithm, which extends trust-region-based BO.
We show that the algorithm outperforms several state-of-the-art BO and other black-box optimization algorithms.
arXiv Detail & Related papers (2023-11-19T13:56:24Z)
- Self-Adjusting Weighted Expected Improvement for Bayesian Optimization [11.955557264002204]
This work focuses on the definition of the acquisition function (AF), whose main purpose is to balance the trade-off between exploring regions with high uncertainty and those with high promise for good solutions.
We propose Self-Adjusting Weighted Expected Improvement (SAWEI), where we let the exploration-exploitation trade-off self-adjust in a data-driven manner.
Our method exhibits a favorable any-time performance compared to handcrafted baselines and serves as a robust default choice for any problem structure.
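For context, the weighted expected improvement (WEI) acquisition that SAWEI builds on interpolates between the exploitation and exploration terms of standard EI via a weight w; SAWEI's contribution is adjusting this weight online. The snippet below is a minimal sketch of WEI for a minimisation problem; the `update_weight` heuristic is a placeholder, not the paper's actual self-adjustment rule.

```python
# Minimal sketch of Weighted Expected Improvement (WEI) for minimisation.
# SAWEI's actual contribution is a data-driven online rule for the weight w;
# the heuristic in `update_weight` is only a placeholder for illustration.
import numpy as np
from scipy.stats import norm

def weighted_ei(mu, sigma, f_best, w):
    """WEI(x) = w * (f_best - mu) * Phi(z) + (1 - w) * sigma * phi(z)."""
    sigma = np.maximum(sigma, 1e-12)          # avoid division by zero
    z = (f_best - mu) / sigma
    exploit = (f_best - mu) * norm.cdf(z)     # expected-improvement term
    explore = sigma * norm.pdf(z)             # uncertainty term
    return w * exploit + (1.0 - w) * explore

def update_weight(w, progress_stalled):
    """Placeholder adjustment: lean towards exploration when progress stalls."""
    return max(0.0, w - 0.1) if progress_stalled else min(1.0, w + 0.1)

# Example: score candidates given a GP posterior mean/std and the incumbent value.
mu = np.array([0.2, -0.1, 0.4])
sigma = np.array([0.05, 0.3, 0.2])
print(weighted_ei(mu, sigma, f_best=0.0, w=0.5))
```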
arXiv Detail & Related papers (2023-06-07T09:00:19Z)
- Bayesian Optimization for Function Compositions with Applications to Dynamic Pricing [0.0]
We propose a practical BO method for function compositions where the form of the composition is known but the constituent functions are expensive to evaluate.
We demonstrate a novel application to dynamic pricing in revenue management when the underlying demand function is expensive to evaluate.
arXiv Detail & Related papers (2023-03-21T15:45:06Z)
- Multi-Fidelity Bayesian Optimization with Unreliable Information Sources [12.509709549771385]
We propose rMFBO (robust MFBO) to make GP-based MFBO schemes robust to the addition of unreliable information sources.
We demonstrate the effectiveness of the proposed methodology on a number of numerical benchmarks.
We expect rMFBO to be particularly useful to reliably include human experts with varying knowledge within BO processes.
arXiv Detail & Related papers (2022-10-25T11:47:33Z)
- Generalizing Bayesian Optimization with Decision-theoretic Entropies [102.82152945324381]
We consider a generalization of Shannon entropy from work in statistical decision theory.
We first show that special cases of this entropy lead to popular acquisition functions used in BO procedures.
We then show how alternative choices for the loss yield a flexible family of acquisition functions.
arXiv Detail & Related papers (2022-10-04T04:43:58Z)
- A General Recipe for Likelihood-free Bayesian Optimization [115.82591413062546]
We propose likelihood-free BO (LFBO) to extend BO to a broader class of models and utilities.
LFBO directly models the acquisition function without having to separately perform inference with a probabilistic surrogate model.
We show that computing the acquisition function in LFBO can be reduced to optimizing a weighted classification problem.
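As a rough illustration of that reduction, the sketch below scores candidates with a classifier trained using per-sample weights that mimic an EI-style utility: points above a threshold are labelled as improvements and weighted by how much they improve. The threshold, weighting, and classifier choice are simplifications for illustration, not the authors' reference implementation.

```python
# Minimal sketch of an LFBO-style acquisition via weighted classification
# (maximisation, EI-like utility). Simplified; not the authors' reference code.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Observed data: inputs X and noisy black-box values y (toy objective).
X = rng.uniform(0, 1, size=(40, 2))
y = -np.sum((X - 0.6) ** 2, axis=1) + 0.05 * rng.normal(size=40)

tau = np.quantile(y, 0.8)                       # improvement threshold (a design choice)
labels = (y > tau).astype(int)                  # 1 = "improves over tau"
weights = np.where(labels == 1, y - tau, 1.0)   # EI-like utility weight on positives

clf = LogisticRegression().fit(X, labels, sample_weight=weights)

# The classifier's probability of improvement plays the role of the acquisition:
# query the candidate the classifier is most confident will improve on tau.
candidates = rng.uniform(0, 1, size=(500, 2))
scores = clf.predict_proba(candidates)[:, 1]
x_next = candidates[np.argmax(scores)]
print("next query point:", x_next)
```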
arXiv Detail & Related papers (2022-06-27T03:55:27Z)
- MBORE: Multi-objective Bayesian Optimisation by Density-Ratio Estimation [0.01652719262940403]
Optimisation problems often have multiple conflicting objectives that can be computationally and/or financially expensive to evaluate.
Mono-surrogate Bayesian optimisation (BO) is a popular model-based approach for optimising such black-box functions.
We extend previous work on BO by density-ratio estimation (BORE) to the multi-objective setting.
arXiv Detail & Related papers (2022-03-31T09:27:59Z)
- Likelihood-Free Inference with Deep Gaussian Processes [70.74203794847344]
Surrogate models have been successfully used in likelihood-free inference to decrease the number of simulator evaluations.
We propose a Deep Gaussian Process (DGP) surrogate model that can handle more irregularly behaved target distributions.
Our experiments show how DGPs can outperform GPs on objective functions with multimodal distributions and maintain a comparable performance in unimodal cases.
arXiv Detail & Related papers (2020-06-18T14:24:05Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information presented (including all content) and is not responsible for any consequences of its use.