Bayesian Optimization for Function Compositions with Applications to
Dynamic Pricing
- URL: http://arxiv.org/abs/2303.11954v2
- Date: Mon, 1 May 2023 15:28:17 GMT
- Title: Bayesian Optimization for Function Compositions with Applications to
Dynamic Pricing
- Authors: Kunal Jain, Prabuchandran K. J., Tejas Bodas
- Abstract summary: We propose a practical BO method for function compositions where the form of the composition is known but the constituent functions are expensive to evaluate.
We demonstrate a novel application to dynamic pricing in revenue management when the underlying demand function is expensive to evaluate.
- Score: 0.0
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Bayesian Optimization (BO) is used to find the global optima of black-box functions. In this work, we propose a practical BO method for function compositions where the form of the composition is known but the constituent functions are expensive to evaluate. By assuming an independent Gaussian process (GP) model for each of the constituent black-box functions, we propose Expected Improvement (EI) and Upper Confidence Bound (UCB) based BO algorithms and demonstrate their ability to outperform not just vanilla BO but also the current state-of-the-art algorithms. We demonstrate a novel application of the proposed methods to dynamic pricing in revenue management when the underlying demand function is expensive to evaluate.
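To make the setup concrete, below is a minimal sketch of composite BO for the pricing case, assuming revenue takes the known composite form h(p, d) = p * d with the demand function d as the expensive constituent modeled by a GP. The Monte Carlo EI, the toy demand curve, and all function names are our illustrative assumptions, not the authors' exact algorithm.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

# Toy stand-in for the expensive demand black box d(p); the composite
# objective is the known form h(p, d) = p * d (revenue).
def demand(p):
    return 10.0 * np.exp(-0.5 * p)

def composite_ei(grid, gp, best, rng, n_samples=256):
    # Monte Carlo EI: sample the GP posterior over demand, push each
    # sample through the known composition h, and average the improvement.
    mu, sigma = gp.predict(grid.reshape(-1, 1), return_std=True)
    d = mu + sigma * rng.standard_normal((n_samples, len(grid)))
    improvement = np.maximum(grid * d - best, 0.0)
    return improvement.mean(axis=0)

rng = np.random.default_rng(0)
P = rng.uniform(0.1, 5.0, size=5)        # initial prices
D = demand(P)                            # expensive evaluations
for _ in range(20):
    # Fit a GP to the constituent function (demand), not to the revenue.
    gp = GaussianProcessRegressor(RBF(), alpha=1e-6, normalize_y=True).fit(
        P.reshape(-1, 1), D)
    grid = np.linspace(0.1, 5.0, 200)
    ei = composite_ei(grid, gp, np.max(P * D), rng)
    p_next = grid[np.argmax(ei)]
    P, D = np.append(P, p_next), np.append(D, demand(p_next))

print("best price:", P[np.argmax(P * D)], "best revenue:", np.max(P * D))
```

With a single expensive constituent, the potential benefit over vanilla BO is that the GP models the (often smoother) demand curve rather than the composite revenue curve; with several constituents, one would fit one independent GP per black box and draw a posterior sample from each inside composite_ei.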
Related papers
- Cost-aware Bayesian Optimization via the Pandora's Box Gittins Index [57.045952766988925]
We develop a previously unexplored connection between cost-aware Bayesian optimization and the Pandora's Box problem, a decision problem from economics.
Our work constitutes a first step towards integrating techniques from Gittins index theory into Bayesian optimization.
arXiv Detail & Related papers (2024-06-28T17:20:13Z)
- Cost-Sensitive Multi-Fidelity Bayesian Optimization with Transfer of Learning Curve Extrapolation [55.75188191403343]
We introduce a utility function, predefined by each user, that describes the trade-off between the cost and performance of BO.
We validate our algorithm on various learning curve (LC) datasets and find that it outperforms all previous multi-fidelity BO and transfer-BO baselines we consider.
arXiv Detail & Related papers (2024-05-28T07:38:39Z)
- Poisson Process for Bayesian Optimization [126.51200593377739]
We propose a ranking-based surrogate model based on the Poisson process and introduce an efficient BO framework, namely Poisson Process Bayesian Optimization (PoPBO).
Compared to the classic GP-BO method, our PoPBO has lower costs and better robustness to noise, as verified by extensive experiments.
arXiv Detail & Related papers (2024-02-05T02:54:50Z)
- A General Framework for User-Guided Bayesian Optimization [51.96352579696041]
We propose ColaBO, the first Bayesian-principled framework for incorporating prior beliefs beyond the typical kernel structure.
We empirically demonstrate ColaBO's ability to substantially accelerate optimization when the prior information is accurate, and to approximately retain default performance when it is misleading.
arXiv Detail & Related papers (2023-11-24T18:27:26Z)
- BOIS: Bayesian Optimization of Interconnected Systems [0.0]
We introduce a new paradigm which allows for the efficient use of composite functions in BO.
We show that this simple approach (which we call BOIS) enables the exploitation of structural knowledge.
Our results indicate that BOIS achieves performance gains and accurately captures the statistics of composite functions.
arXiv Detail & Related papers (2023-11-19T06:44:13Z)
- Polynomial-Model-Based Optimization for Blackbox Objectives [0.0]
Black-box optimization seeks to find optimal parameters for systems such that a pre-defined objective function is minimized.
PMBO is a novel black-box optimizer that finds the minimum by fitting a polynomial surrogate to the objective function.
PMBO is benchmarked against other state-of-the-art algorithms for a given set of artificial, analytical functions.
arXiv Detail & Related papers (2023-09-01T14:11:03Z)
- A General Recipe for Likelihood-free Bayesian Optimization [115.82591413062546]
We propose likelihood-free BO (LFBO) to extend BO to a broader class of models and utilities.
LFBO directly models the acquisition function without having to separately perform inference with a probabilistic surrogate model.
We show that computing the acquisition function in LFBO can be reduced to optimizing a weighted classification problem; a sketch of this reduction appears after this list.
arXiv Detail & Related papers (2022-06-27T03:55:27Z)
- Surrogate modeling for Bayesian optimization beyond a single Gaussian process [62.294228304646516]
We propose a novel Bayesian surrogate model to balance exploration with exploitation of the search space.
To make function sampling scalable, a random feature-based kernel approximation is leveraged per GP model (see the sketch after this list).
Convergence of the proposed EGP-TS to the global optimum is further established via an analysis based on the notion of Bayesian regret.
arXiv Detail & Related papers (2022-05-27T16:43:10Z)
- $\pi$BO: Augmenting Acquisition Functions with User Beliefs for Bayesian Optimization [40.30019289383378]
We propose $\pi$BO, an acquisition function generalization which incorporates prior beliefs about the location of the optimum (a sketch of the weighting appears after this list).
In contrast to previous approaches, $\pi$BO is conceptually simple and can easily be integrated with existing libraries and many acquisition functions.
We also demonstrate that $\pi$BO improves on state-of-the-art performance for a popular deep learning task, with a 12.5$\times$ time-to-accuracy speedup over prominent BO approaches.
arXiv Detail & Related papers (2022-04-23T11:07:13Z)
- One-parameter family of acquisition functions for efficient global optimization [0.0]
We propose a new one-parameter family of acquisition functions for BO that unifies EI and PI.
The proposed method is numerically inexpensive, is easy to implement, can be easily parallelized, and on benchmark tasks shows a performance superior to EI and GP-UCB.
arXiv Detail & Related papers (2021-04-26T06:41:30Z)
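The LFBO entry's reduction to weighted classification can be made concrete with a short sketch. This is our reading of the likelihood-free EI variant: points above a threshold form the positive class, weighted by how much they improve on the threshold, and the classifier's output serves as the acquisition. The threshold rule, classifier choice, and names are illustrative assumptions, not the paper's reference implementation.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def lfbo_acquisition(X, y, X_cand, quantile=0.8):
    # Threshold tau splits observations into "good" (y > tau) and "bad";
    # good points get weight (y - tau), bad points get weight 1. Any
    # probabilistic classifier supporting sample weights could be used.
    # Assumes both classes are present (true for 0 < quantile < 1 on
    # non-constant y).
    tau = np.quantile(y, quantile)
    labels = (y > tau).astype(int)
    weights = np.where(labels == 1, y - tau, 1.0)
    clf = RandomForestClassifier(n_estimators=100, random_state=0)
    clf.fit(X, labels, sample_weight=weights)
    return clf.predict_proba(X_cand)[:, 1]  # higher = more promising
```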
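The random feature-based kernel approximation mentioned in the EGP-TS entry is commonly instantiated as random Fourier features; the sketch below uses the textbook construction for the RBF kernel, and the kernel choice and feature count are generic assumptions rather than that paper's exact setup.

```python
import numpy as np

def rff_features(X, lengthscale=1.0, n_features=100, seed=0):
    # Random Fourier features for the RBF kernel: k(x, x') is
    # approximated by phi(x) @ phi(x'), which turns approximate GP
    # posterior sampling into cheap Bayesian linear regression.
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    W = rng.normal(0.0, 1.0 / lengthscale, size=(d, n_features))
    b = rng.uniform(0.0, 2.0 * np.pi, size=n_features)
    return np.sqrt(2.0 / n_features) * np.cos(X @ W + b)
```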
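The $\pi$BO entry's prior-weighted acquisition can be sketched directly: the base acquisition is multiplied by the user's prior over the optimum's location, with the prior's influence decaying over iterations. The decaying exponent beta/n follows the form reported for $\pi$BO as we understand it; the base EI, the maximization convention, and all names are our assumptions.

```python
import numpy as np
from scipy.stats import norm

def ei(mu, sigma, best):
    # Closed-form EI (maximization) under a Gaussian posterior.
    z = (mu - best) / np.maximum(sigma, 1e-12)
    return sigma * (z * norm.cdf(z) + norm.pdf(z))

def pibo_acquisition(mu, sigma, best, prior_pdf_at_x, n, beta=10.0):
    # alpha_piBO(x) = alpha(x) * pi(x)^(beta / n): the prior dominates
    # early (small n) and washes out as observations accumulate.
    return ei(mu, sigma, best) * prior_pdf_at_x ** (beta / n)
```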
This list is automatically generated from the titles and abstracts of the papers on this site.