BOFormer: Learning to Solve Multi-Objective Bayesian Optimization via Non-Markovian RL
- URL: http://arxiv.org/abs/2505.21974v2
- Date: Thu, 29 May 2025 09:07:59 GMT
- Title: BOFormer: Learning to Solve Multi-Objective Bayesian Optimization via Non-Markovian RL
- Authors: Yu-Heng Hung, Kai-Jie Lin, Yu-Heng Lin, Chien-Yi Wang, Cheng Sun, Ping-Chun Hsieh
- Abstract summary: We present a generalized deep Q-learning framework and propose \textit{BOFormer}, which substantiates this framework for MOBO via sequence modeling. Through extensive evaluation, we demonstrate that BOFormer consistently outperforms the benchmark rule-based and learning-based algorithms.
- Score: 15.127370150885348
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Bayesian optimization (BO) offers an efficient pipeline for optimizing black-box functions with the help of a Gaussian process prior and an acquisition function (AF). Recently, in the context of single-objective BO, learning-based AFs have shown promising empirical results thanks to their favorable non-myopic nature. Despite this, the direct extension of these approaches to multi-objective Bayesian optimization (MOBO) suffers from the \textit{hypervolume identifiability issue}, which results from the non-Markovian nature of MOBO problems. To tackle this, inspired by the non-Markovian RL literature and the success of Transformers in language modeling, we present a generalized deep Q-learning framework and propose \textit{BOFormer}, which substantiates this framework for MOBO via sequence modeling. Through extensive evaluation, we demonstrate that BOFormer consistently outperforms the benchmark rule-based and learning-based algorithms in various synthetic MOBO and real-world multi-objective hyperparameter optimization problems. We have made the source code publicly available to encourage further research in this direction.
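The BO pipeline the abstract describes (Gaussian process surrogate plus acquisition function) can be illustrated with a minimal single-objective sketch. This is not the paper's BOFormer method or its learned acquisition function; it is a generic loop using a GP surrogate and the classic expected-improvement (EI) rule, with function names of my own choosing, optimizing candidates drawn from a random pool for simplicity.

```python
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

def expected_improvement(X, gp, y_best):
    # Closed-form EI for a GP posterior; written here for minimization.
    mu, sigma = gp.predict(X, return_std=True)
    sigma = np.maximum(sigma, 1e-9)       # avoid division by zero
    imp = y_best - mu                     # improvement over the incumbent
    z = imp / sigma
    return imp * norm.cdf(z) + sigma * norm.pdf(z)

def bayes_opt(f, bounds, n_init=5, n_iter=20, seed=0):
    """Minimize a 1-D black-box f on [lo, hi] with a GP surrogate and EI."""
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    X = rng.uniform(lo, hi, size=(n_init, 1))
    y = np.array([f(x[0]) for x in X])
    gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True)
    for _ in range(n_iter):
        gp.fit(X, y)
        cand = rng.uniform(lo, hi, size=(256, 1))   # random candidate pool
        ei = expected_improvement(cand, gp, y.min())
        x_next = cand[np.argmax(ei)]                # maximize the AF
        X = np.vstack([X, x_next])
        y = np.append(y, f(x_next[0]))
    return X[np.argmin(y)], y.min()

x_best, y_best = bayes_opt(lambda x: (x - 0.3) ** 2, bounds=(0.0, 1.0))
```

A learned AF, as studied in this line of work, would replace the closed-form `expected_improvement` above with a trained model; BOFormer additionally conditions that model on the query history via a Transformer to cope with the non-Markovian structure of MOBO.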
Related papers
- Latent Bayesian Optimization via Autoregressive Normalizing Flows [17.063294409131238]
We propose a Normalizing Flow-based Bayesian Optimization (NF-BO) to solve the value discrepancy problem. Our method demonstrates superior performance in molecule generation tasks, significantly outperforming both traditional and recent LBO approaches.
arXiv Detail & Related papers (2025-04-21T06:36:09Z) - Non-Myopic Multi-Objective Bayesian Optimization [64.31753000439514]
We consider the problem of finite-horizon sequential experimental design to solve multi-objective optimization problems. This problem arises in many real-world applications, including materials design. We propose the first set of non-myopic methods for MOO problems.
arXiv Detail & Related papers (2024-12-11T04:05:29Z) - LLaMA-Berry: Pairwise Optimization for O1-like Olympiad-Level Mathematical Reasoning [56.273799410256075]
The framework combines Monte Carlo Tree Search (MCTS) with iterative Self-Refine to optimize the reasoning path.
The framework has been tested on general and advanced benchmarks, showing superior performance in terms of search efficiency and problem-solving capability.
arXiv Detail & Related papers (2024-10-03T18:12:29Z) - Large Language Models to Enhance Bayesian Optimization [57.474613739645605]
We present LLAMBO, a novel approach that integrates the capabilities of Large Language Models (LLM) within Bayesian optimization.
At a high level, we frame the BO problem in natural language, enabling LLMs to iteratively propose and evaluate promising solutions conditioned on historical evaluations.
Our findings illustrate that LLAMBO is effective at zero-shot warmstarting, and enhances surrogate modeling and candidate sampling, especially in the early stages of search when observations are sparse.
arXiv Detail & Related papers (2024-02-06T11:44:06Z) - Poisson Process for Bayesian Optimization [126.51200593377739]
We propose a ranking-based surrogate model based on the Poisson process and introduce an efficient BO framework, namely Poisson Process Bayesian Optimization (PoPBO).
Compared to the classic GP-BO method, our PoPBO has lower costs and better robustness to noise, which is verified by abundant experiments.
arXiv Detail & Related papers (2024-02-05T02:54:50Z) - Model-based Causal Bayesian Optimization [78.120734120667]
We propose model-based causal Bayesian optimization (MCBO)
MCBO learns a full system model instead of only modeling intervention-reward pairs.
Unlike in standard Bayesian optimization, our acquisition function cannot be evaluated in closed form.
arXiv Detail & Related papers (2022-11-18T14:28:21Z) - Towards Automated Design of Bayesian Optimization via Exploratory Landscape Analysis [11.143778114800272]
We show that a dynamic selection of the AF can benefit the BO design.
We pave a way towards AutoML-assisted, on-the-fly BO designs that adjust their behavior on a run-by-run basis.
arXiv Detail & Related papers (2022-11-17T17:15:04Z) - A General Recipe for Likelihood-free Bayesian Optimization [115.82591413062546]
We propose likelihood-free BO (LFBO) to extend BO to a broader class of models and utilities.
LFBO directly models the acquisition function without having to separately perform inference with a probabilistic surrogate model.
We show that computing the acquisition function in LFBO can be reduced to optimizing a weighted classification problem.
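The reduction described above can be sketched in a simplified form: train a classifier to separate observations that improve on a threshold from those that do not, weighting the positives by how much they improve, and use the classifier's output as the acquisition score. This is an illustrative, EI-flavored variant of the likelihood-free idea, not the paper's exact objective; the function name and weighting scheme here are simplifications of my own.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

def classifier_acquisition(X_obs, y_obs, X_cand, tau):
    """Score candidates (maximization) by a weighted classifier:
    label = did the observation beat the threshold tau, with
    improvement-proportional weights on the positive class."""
    labels = (y_obs > tau).astype(int)
    # Positives weighted by their improvement over tau; unit weight otherwise.
    weights = np.where(labels == 1, y_obs - tau, 1.0)
    clf = LogisticRegression().fit(X_obs, labels, sample_weight=weights)
    return clf.predict_proba(X_cand)[:, 1]  # higher = more promising

# Toy check: maximizing f(x) = x on [0, 1], threshold at 0.5.
rng = np.random.default_rng(0)
X_obs = rng.uniform(0.0, 1.0, size=(40, 1))
y_obs = X_obs[:, 0]
scores = classifier_acquisition(X_obs, y_obs, np.array([[0.1], [0.9]]), tau=0.5)
```

Because the classifier is fit directly on the observations, no separate probabilistic surrogate or posterior inference step is needed, which is the appeal of the likelihood-free formulation.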
arXiv Detail & Related papers (2022-06-27T03:55:27Z) - High-Dimensional Bayesian Optimization with Sparse Axis-Aligned Subspaces [14.03847432040056]
We argue that a surrogate model defined on sparse axis-aligned subspaces offers an attractive compromise between flexibility and parsimony.
We demonstrate that our approach, which relies on Hamiltonian Monte Carlo for inference, can rapidly identify sparse subspaces relevant to modeling the unknown objective function.
arXiv Detail & Related papers (2021-02-27T23:06:24Z) - Preferential Batch Bayesian Optimization [16.141259199997005]
Preferential batch Bayesian optimization (PBBO) is a new framework that allows finding the optimum of a latent function of interest.
We show how the acquisitions developed under this framework generalize and augment previous approaches in Bayesian optimization.
arXiv Detail & Related papers (2020-03-25T14:59:15Z)
This list is automatically generated from the titles and abstracts of the papers in this site.