OPBO: Order-Preserving Bayesian Optimization
- URL: http://arxiv.org/abs/2512.18980v1
- Date: Mon, 22 Dec 2025 02:45:41 GMT
- Title: OPBO: Order-Preserving Bayesian Optimization
- Authors: Wei Peng, Jianchen Hu, Kang Liu, Qiaozhu Zhai
- Abstract summary: We propose a simple order-preserving Bayesian optimization (OPBO) method, where the surrogate model preserves the order, instead of the value, of the black-box objective function. The experimental results show that for high-dimensional (over 500) black-box optimization problems, the proposed OPBO significantly outperforms traditional BO methods.
- Score: 16.234096837420235
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Bayesian optimization is an effective method for solving expensive black-box optimization problems. Most existing methods use Gaussian processes (GPs) as the surrogate model for approximating the black-box objective function, but it is well known that GPs can fail in high-dimensional spaces (e.g., dimension over 500). We argue that the reliance of GPs on precise numerical fitting is fundamentally ill-suited to high-dimensional spaces, where it leads to prohibitive computational complexity. To address this, we propose a simple order-preserving Bayesian optimization (OPBO) method, where the surrogate model preserves the order, instead of the value, of the black-box objective function. We can then use a simple but effective order-preserving (OP) neural network (NN) to replace the GP as the surrogate model. Moreover, instead of searching for the best solution from the acquisition model, we select good-enough solutions in the ordinal set to reduce computational cost. The experimental results show that for high-dimensional (over 500) black-box optimization problems, the proposed OPBO significantly outperforms traditional BO methods based on regression NNs and GPs. The source code is available at https://github.com/pengwei222/OPBO.
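The abstract describes the order-preserving surrogate only at a high level and does not give the training loss. As a hedged sketch (not the paper's implementation), such a surrogate can be trained with a pairwise ranking loss that is zero whenever the surrogate's pairwise ordering agrees, with a margin, with the true objective values; the `pairwise_order_loss` helper below is purely illustrative:

```python
import numpy as np

def pairwise_order_loss(pred, target):
    # Hinge-style pairwise ranking loss: penalized whenever the surrogate's
    # pairwise ordering disagrees (within a unit margin) with the true values.
    loss, n = 0.0, len(pred)
    for i in range(n):
        for j in range(n):
            if target[i] < target[j]:  # i should score below j
                loss += max(0.0, 1.0 - (pred[j] - pred[i]))
    return loss / (n * (n - 1))

# A surrogate that preserves order, but not values, achieves zero loss:
target = np.array([0.1, 0.5, 2.0])
pred_ordered = np.array([-3.0, 0.0, 5.0])   # same order, different values
pred_reversed = np.array([5.0, 0.0, -3.0])  # order destroyed
```

This is the sense in which an order-preserving surrogate relaxes the fitting problem: any monotone transformation of the objective is an equally good fit, which is much weaker than the exact numerical fit a GP regression targets.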
Related papers
- Scalable Neural Network-based Blackbox Optimization [0.0]
We propose scalable neural network-based blackbox optimization (SNBO).
SNBO does not rely on model uncertainty estimation.
It attains function values better than the best-performing baseline algorithm.
arXiv Detail & Related papers (2025-08-05T18:15:27Z)
- Efficient optimization of expensive black-box simulators via marginal means, with application to neutrino detector design [1.5749416770494706]
We propose a new Black-box Optimization via Marginal Means (BOMM) approach.
BOMM uses a new estimator of a global minimizer $\mathbf{x}^*$ that can be efficiently inferred with limited runs in high dimensions.
We show that BOMM is not only consistent for optimization, but also has an optimization rate that tempers the "curse of dimensionality" faced by existing methods.
arXiv Detail & Related papers (2025-08-03T16:44:05Z)
- Training Deep Learning Models with Norm-Constrained LMOs [56.00317694850397]
We propose a new family of algorithms that uses the linear minimization oracle (LMO) to adapt to the geometry of the problem.
We demonstrate significant speedups on nanoGPT training using our algorithm, Scion, without any reliance on Adam.
arXiv Detail & Related papers (2025-02-11T13:10:34Z)
- Poisson Process for Bayesian Optimization [126.51200593377739]
We propose a ranking-based surrogate model based on the Poisson process and introduce an efficient BO framework, namely Poisson Process Bayesian Optimization (PoPBO).
Compared to the classic GP-BO method, our PoPBO has lower costs and better robustness to noise, which is verified by abundant experiments.
arXiv Detail & Related papers (2024-02-05T02:54:50Z)
- Polynomial-Model-Based Optimization for Blackbox Objectives [0.0]
Black-box optimization seeks to find optimal parameters for systems such that a pre-defined objective function is minimized.
PMBO is a novel black-box optimizer that finds the minimum by fitting a polynomial surrogate to the objective function.
PMBO is benchmarked against other state-of-the-art algorithms for a given set of artificial, analytical functions.
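For intuition on polynomial-model-based optimization, here is a minimal one-dimensional sketch; PMBO itself uses multivariate polynomial models, so the `polynomial_surrogate_min` helper below is only a toy stand-in for the general idea of minimizing a fitted polynomial instead of the expensive objective:

```python
import numpy as np

def polynomial_surrogate_min(xs, ys, degree=2):
    # Fit a 1-D polynomial surrogate to observed (x, f(x)) pairs, then
    # return the surrogate's minimizer on a dense grid over the data range.
    coeffs = np.polyfit(xs, ys, degree)
    grid = np.linspace(min(xs), max(xs), 1001)
    return grid[np.argmin(np.polyval(coeffs, grid))]

# Four expensive evaluations of f(x) = (x - 1)^2; the quadratic
# surrogate recovers the true minimizer x* = 1 without further queries.
xs = np.array([-1.0, 0.0, 2.0, 3.0])
ys = (xs - 1.0) ** 2
```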
arXiv Detail & Related papers (2023-09-01T14:11:03Z)
- Predictive Modeling through Hyper-Bayesian Optimization [60.586813904500595]
We propose a novel way of integrating model selection and BO for the single goal of reaching the function optima faster.
The algorithm moves back and forth between BO in the model space and BO in the function space, where the goodness of the recommended model is captured.
In addition to improved sample efficiency, the framework outputs information about the black-box function.
arXiv Detail & Related papers (2023-08-01T04:46:58Z)
- Sample-Then-Optimize Batch Neural Thompson Sampling [50.800944138278474]
We introduce two algorithms for black-box optimization based on the Thompson sampling (TS) policy.
To choose an input query, we only need to train an NN and then choose the query by maximizing the trained NN.
Our algorithms sidestep the need to invert the large parameter matrix yet still preserve the validity of the TS policy.
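The sample-then-optimize idea, fit one randomly perturbed model to the observations and then maximize that single trained model, can be sketched as follows; a linear model stands in for the paper's NN, and all names and the perturbation scale are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

def choose_query(X_obs, y_obs, candidates):
    # One Thompson-sampling step, sample-then-optimize style: fit a single
    # model to the observations, perturb it with one random prior draw,
    # and pick the candidate that maximizes this single trained model.
    w_fit, *_ = np.linalg.lstsq(X_obs, y_obs, rcond=None)
    w = w_fit + 0.01 * rng.normal(size=X_obs.shape[1])  # one "posterior sample"
    return candidates[np.argmax(candidates @ w)]

X_obs = np.eye(2)                               # two observed inputs
y_obs = np.array([1.0, 0.0])                    # their black-box values
candidates = np.array([[1.0, 0.0], [0.0, 1.0]])
query = choose_query(X_obs, y_obs, candidates)  # favors the high-value direction
```

Note how no posterior covariance matrix is ever formed or inverted: the randomness enters only through the perturbed fit, which mirrors the summary's claim of sidestepping the large matrix inversion.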
arXiv Detail & Related papers (2022-10-13T09:01:58Z)
- Generalizing Bayesian Optimization with Decision-theoretic Entropies [102.82152945324381]
We consider a generalization of Shannon entropy from work in statistical decision theory.
We first show that special cases of this entropy lead to popular acquisition functions used in BO procedures.
We then show how alternative choices for the loss yield a flexible family of acquisition functions.
arXiv Detail & Related papers (2022-10-04T04:43:58Z)
- Black Box Optimization Using QUBO and the Cross Entropy Method [11.091089276821716]
Black box optimization can be used to optimize functions whose analytic form is unknown.
A common approach to realize BBO is to learn a surrogate model which approximates the target black box function.
We present our approach BOX-QUBO, where the surrogate model is a QUBO matrix.
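A QUBO surrogate scores a binary vector x by the quadratic form x^T Q x. A minimal sketch of evaluating and minimizing such a surrogate follows; how BOX-QUBO actually learns Q from black-box samples is not described in this summary, so the matrix here is arbitrary:

```python
import numpy as np

def qubo_value(Q, x):
    # Evaluate the QUBO surrogate x^T Q x for a binary vector x.
    x = np.asarray(x, dtype=float)
    return float(x @ Q @ x)

Q = np.array([[1.0, -2.0],
              [0.0,  3.0]])
# With only two binary variables the surrogate can be minimized by
# brute force; real QUBO solvers handle much larger matrices.
best = min((qubo_value(Q, [a, b]), (a, b)) for a in (0, 1) for b in (0, 1))
```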
arXiv Detail & Related papers (2022-06-24T22:57:24Z)
- Bayesian Algorithm Execution: Estimating Computable Properties of Black-box Functions Using Mutual Information [78.78486761923855]
In many real-world problems, we want to infer some property of an expensive black-box function f, given a budget of T function evaluations.
We present a procedure, InfoBAX, that sequentially chooses queries that maximize mutual information with respect to the algorithm's output.
On these problems, InfoBAX uses up to 500 times fewer queries to f than required by the original algorithm.
arXiv Detail & Related papers (2021-04-19T17:22:11Z)
- Neural Process for Black-Box Model Optimization Under Bayesian Framework [7.455546102930911]
Black-box models are so named because they can only be viewed in terms of inputs and outputs, without knowledge of the internal workings.
One powerful algorithm for solving such problems is Bayesian optimization, which effectively estimates the model parameters that lead to the best performance.
It has been challenging for GP to optimize black-box models that need to query many observations and/or have many parameters.
We propose a general Bayesian optimization algorithm that employs a Neural Process as the surrogate model to perform black-box model optimization.
arXiv Detail & Related papers (2021-04-03T23:35:26Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.