Kernelized Normalizing Constant Estimation: Bridging Bayesian Quadrature
and Bayesian Optimization
- URL: http://arxiv.org/abs/2401.05716v1
- Date: Thu, 11 Jan 2024 07:45:09 GMT
- Title: Kernelized Normalizing Constant Estimation: Bridging Bayesian Quadrature
and Bayesian Optimization
- Authors: Xu Cai and Jonathan Scarlett
- Abstract summary: We show that to estimate the normalizing constant within a small relative error, the level of difficulty depends on the value of $\lambda$.
We find that this pattern holds true even when the function evaluations are noisy.
- Score: 51.533164528799084
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: In this paper, we study the problem of estimating the normalizing constant
$\int e^{-\lambda f(x)}dx$ through queries to the black-box function $f$, where
$f$ belongs to a reproducing kernel Hilbert space (RKHS), and $\lambda$ is a
problem parameter. We show that to estimate the normalizing constant within a
small relative error, the level of difficulty depends on the value of
$\lambda$: When $\lambda$ approaches zero, the problem is similar to Bayesian
quadrature (BQ), while when $\lambda$ approaches infinity, the problem is
similar to Bayesian optimization (BO). For intermediate values of $\lambda$, the
problem interpolates between BQ and BO. We find that this pattern holds true even when the function
evaluations are noisy, bringing new aspects to this topic. Our findings are
supported by both algorithm-independent lower bounds and algorithmic upper
bounds, as well as simulation studies conducted on a variety of benchmark
functions.
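The quantity studied in the abstract, $\int e^{-\lambda f(x)}dx$ for a black-box $f$, can be illustrated with a minimal numerical sketch. The code below (a hypothetical example, not the paper's algorithm) approximates the normalizing constant on a 1-D grid with a simple Riemann sum, and shows the two regimes: for small $\lambda$ the integrand is nearly flat (quadrature-like), while for large $\lambda$ the integral is dominated by the region around the minimizer of $f$ (optimization-like). The function `f` and the grid are illustrative choices.

```python
import numpy as np

def normalizing_constant(f, lam, grid):
    """Riemann-sum approximation of the normalizing constant
    Z(lam) = integral of exp(-lam * f(x)) dx over the grid's range."""
    dx = grid[1] - grid[0]
    return np.sum(np.exp(-lam * f(grid))) * dx

# Hypothetical smooth test function with minimizer at x = 0.3.
f = lambda x: (x - 0.3) ** 2
grid = np.linspace(0.0, 1.0, 2001)

# lam -> 0: integrand ~ 1 everywhere, so Z ~ volume of the domain (quadrature regime).
Z_small = normalizing_constant(f, 1e-3, grid)

# lam -> infinity: mass concentrates near argmin f (optimization regime).
Z_large = normalizing_constant(f, 1e3, grid)
```

In the small-$\lambda$ regime `Z_small` is close to the domain volume (here 1.0), whereas `Z_large` is a much smaller value concentrated around the minimizer, matching the BQ-to-BO transition the paper describes.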