Variational Entropy Search for Adjusting Expected Improvement
- URL: http://arxiv.org/abs/2402.11345v1
- Date: Sat, 17 Feb 2024 17:37:53 GMT
- Title: Variational Entropy Search for Adjusting Expected Improvement
- Authors: Nuojin Cheng and Stephen Becker
- Abstract summary: Expected Improvement (EI) is the most commonly used acquisition function in black-box optimization.
We have developed the Variational Entropy Search (VES) methodology and the VES-Gamma algorithm, which adapts EI by incorporating principles from information-theoretic concepts.
- Score: 3.04585143845864
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Bayesian optimization is a widely used technique for optimizing black-box
functions, with Expected Improvement (EI) being the most commonly utilized
acquisition function in this domain. While EI is often viewed as distinct from
other information-theoretic acquisition functions, such as entropy search (ES)
and max-value entropy search (MES), our work reveals that EI can be considered
a special case of MES when approached through variational inference (VI). In
this context, we have developed the Variational Entropy Search (VES)
methodology and the VES-Gamma algorithm, which adapts EI by incorporating
principles from information-theoretic concepts. The efficacy of VES-Gamma is
demonstrated across a variety of test functions and real datasets, highlighting
its theoretical and practical utility in Bayesian optimization scenarios.
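For reference, EI has a closed form under a Gaussian process posterior. The sketch below shows that standard formula for maximization; it is a minimal illustration with names of our choosing, not the paper's VES-Gamma adjustment.

```python
import numpy as np
from scipy.stats import norm

def expected_improvement(mu, sigma, best_f):
    """Closed-form EI for maximization under a Gaussian posterior.

    mu, sigma: posterior mean and standard deviation at the candidate points;
    best_f: best observed function value so far.
    """
    z = (mu - best_f) / sigma
    return sigma * (z * norm.cdf(z) + norm.pdf(z))
```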
Related papers
- Benchmarking Optimizers for Qumode State Preparation with Variational Quantum Algorithms [10.941053143198092]
There has been growing interest in qumodes due to advances in the field and their potential applications.
This paper provides performance benchmarks of the optimizers and parameter settings used in state preparation with Variational Quantum Algorithms.
arXiv Detail & Related papers (2024-05-07T17:15:58Z)
- Unexpected Improvements to Expected Improvement for Bayesian Optimization [23.207497480389208]
We propose LogEI, a new family of acquisition functions whose members either have identical or approximately equal optima as their canonical counterparts, but are substantially easier to optimize numerically.
Our empirical results show that members of the LogEI family of acquisition functions substantially improve on the optimization performance of their canonical counterparts and surprisingly, are on par with or exceed the performance of recent state-of-the-art acquisition functions.
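A naive way to see the numerical issue LogEI addresses: for candidates far below the incumbent, EI underflows to zero, so its log and gradient become useless. This sketch is illustrative only and does not reproduce the paper's numerically stable implementation.

```python
import numpy as np
from scipy.stats import norm

def naive_log_ei(mu, sigma, best_f):
    """log EI computed the obvious way (maximization). When
    z = (mu - best_f) / sigma is very negative, ei underflows to 0.0,
    the log returns -inf, and gradient-based acquisition optimization
    stalls; LogEI-style implementations instead evaluate log EI
    directly with asymptotically accurate expressions."""
    z = (mu - best_f) / sigma
    ei = sigma * (z * norm.cdf(z) + norm.pdf(z))
    return np.log(ei)
```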
arXiv Detail & Related papers (2023-10-31T17:59:56Z)
- Federated Conditional Stochastic Optimization [110.513884892319]
Conditional stochastic optimization has found applications in a wide range of machine learning tasks, such as invariant learning, AUPRC maximization, and MAML.
This paper proposes algorithms for federated conditional stochastic optimization.
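For context, the (non-federated) conditional stochastic optimization problem that such algorithms target is a nested expectation; a standard formulation (notation ours) is:

```latex
\min_{x} \; F(x) \;=\; \mathbb{E}_{\xi}\!\left[ f_{\xi}\!\big( \mathbb{E}_{\eta \mid \xi}\left[ g_{\eta}(x, \xi) \right] \big) \right]
```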
arXiv Detail & Related papers (2023-10-04T01:47:37Z)
- An Empirical Evaluation of Zeroth-Order Optimization Methods on AI-driven Molecule Optimization [78.36413169647408]
We study the effectiveness of various zeroth-order (ZO) optimization methods for optimizing molecular objectives.
We show the advantages of ZO sign-based gradient descent (ZO-signGD).
We demonstrate the potential effectiveness of ZO optimization methods on widely used benchmark tasks from the Guacamol suite.
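A minimal sketch of ZO sign-based gradient descent, assuming only black-box access to the objective; the smoothing parameter mu, query count q, and step size are illustrative choices, not the paper's settings.

```python
import numpy as np

def zo_sign_gd(f, x0, steps=100, lr=0.01, mu=0.01, q=10, seed=0):
    """Minimize f with zeroth-order sign gradient descent.

    The gradient is estimated from q two-point finite differences along
    random Gaussian directions, and only its sign is used for the update.
    """
    rng = np.random.default_rng(seed)
    x = np.array(x0, dtype=float)
    for _ in range(steps):
        g = np.zeros_like(x)
        for _ in range(q):
            u = rng.standard_normal(x.shape)
            g += (f(x + mu * u) - f(x - mu * u)) / (2.0 * mu) * u
        x -= lr * np.sign(g / q)
    return x

# usage: minimize a quadratic without gradients
x_min = zo_sign_gd(lambda x: np.sum((x - 1.0) ** 2), x0=np.zeros(5))
```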
arXiv Detail & Related papers (2022-10-27T01:58:10Z)
- Generalizing Bayesian Optimization with Decision-theoretic Entropies [102.82152945324381]
We consider a generalization of Shannon entropy from work in statistical decision theory.
We first show that special cases of this entropy lead to popular acquisition functions used in BO procedures.
We then show how alternative choices for the loss yield a flexible family of acquisition functions.
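As a hedged sketch of the construction this points to (a DeGroot-style decision-theoretic entropy, defined as the minimal expected loss over an action set, with Shannon entropy recovered under log loss; notation ours):

```latex
H_{\ell,\mathcal{A}}[p] \;=\; \inf_{a \in \mathcal{A}} \; \mathbb{E}_{f \sim p}\left[ \ell(f, a) \right]
```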
arXiv Detail & Related papers (2022-10-04T04:43:58Z)
- Towards Learning Universal Hyperparameter Optimizers with Transformers [57.35920571605559]
We introduce the OptFormer, the first text-based Transformer HPO framework that provides a universal end-to-end interface for jointly learning policy and function prediction.
Our experiments demonstrate that the OptFormer can imitate at least 7 different HPO algorithms, and that its performance can be further improved via its function uncertainty estimates.
arXiv Detail & Related papers (2022-05-26T12:51:32Z)
- Rectified Max-Value Entropy Search for Bayesian Optimization [54.26984662139516]
We develop a rectified MES (RMES) acquisition function based on the notion of mutual information.
RMES shows a consistent improvement over MES in several synthetic function benchmarks and real-world optimization problems.
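For orientation, the standard (unrectified) MES acquisition that RMES improves upon averages a closed-form information-gain term over Monte Carlo samples of the global maximum. A minimal sketch of that baseline under a Gaussian posterior, not of the rectified variant:

```python
import numpy as np
from scipy.stats import norm

def mes(mu, sigma, max_samples):
    """Standard MES acquisition given sampled global maxima f*.

    mu, sigma: posterior mean/std at candidate points, shape (n,);
    max_samples: Monte Carlo samples of the maximum, shape (k,).
    Note: the term is numerically fragile when gamma << 0
    (norm.cdf approaches 0), one motivation for rectified variants.
    """
    gamma = (np.asarray(max_samples)[:, None] - mu) / sigma  # (k, n)
    vals = gamma * norm.pdf(gamma) / (2.0 * norm.cdf(gamma)) - norm.logcdf(gamma)
    return vals.mean(axis=0)
```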
arXiv Detail & Related papers (2022-02-28T08:11:02Z)
- Fourier Representations for Black-Box Optimization over Categorical Variables [34.0277529502051]
We propose to use Fourier representations as surrogate models for the black-box function over purely categorical variables, in conjunction with existing optimization methods.
To learn such representations, we consider two different settings to update our surrogate model.
Numerical experiments over synthetic benchmarks as well as real-world RNA sequence optimization and design problems demonstrate the representational power of the proposed methods.
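A hedged sketch of the underlying idea for Boolean inputs: represent the surrogate in the Walsh-Fourier (parity) basis and fit its coefficients to the observed evaluations. The max_order truncation and the least-squares fit are illustrative simplifications, not the paper's learning procedure.

```python
import numpy as np
from itertools import combinations

def walsh_fourier_features(X, max_order=2):
    """Parity features for binary inputs X in {0,1}^n (mapped to {-1,+1})."""
    S = 1.0 - 2.0 * X                    # {0,1} -> {+1,-1}
    n = X.shape[1]
    cols = [np.ones(len(X))]             # constant (empty-set) basis function
    for k in range(1, max_order + 1):
        for idx in combinations(range(n), k):
            cols.append(np.prod(S[:, idx], axis=1))
    return np.stack(cols, axis=1)

# fit the surrogate coefficients to observed pairs (X, y) by least squares:
# coeffs, *_ = np.linalg.lstsq(walsh_fourier_features(X), y, rcond=None)
```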
arXiv Detail & Related papers (2022-02-08T08:14:58Z)
- Bilevel Optimization: Convergence Analysis and Enhanced Design [63.64636047748605]
Bilevel optimization is a tool for many machine learning problems.
We propose a novel algorithm named stocBiO, which features a sample-efficient hypergradient estimator.
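To make the bilevel setup concrete, here is a toy finite-difference hypergradient on a problem with a known inner solution. This only illustrates what a hypergradient is; stocBiO's estimator is stochastic and far more sample-efficient, and all names below are ours.

```python
import numpy as np

def inner_solve(lam, steps=200, lr=0.1):
    """Inner problem: w*(lam) = argmin_w 0.5*||w - 1||^2 + 0.5*lam*||w||^2.
    Gradient descent stands in for a generic inner solver
    (closed form: w* = 1 / (1 + lam))."""
    w = np.zeros(3)
    for _ in range(steps):
        w -= lr * ((w - 1.0) + lam * w)
    return w

def outer_loss(w):
    return 0.5 * np.sum((w - 0.5) ** 2)

def hypergradient_fd(lam, eps=1e-5):
    """Central finite-difference estimate of d outer_loss(w*(lam)) / d lam."""
    return (outer_loss(inner_solve(lam + eps))
            - outer_loss(inner_solve(lam - eps))) / (2.0 * eps)
```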
arXiv Detail & Related papers (2020-10-15T18:09:48Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.