High-Dimensional Bayesian Optimization Using Both Random and Supervised Embeddings
- URL: http://arxiv.org/abs/2502.00854v1
- Date: Sun, 02 Feb 2025 16:57:05 GMT
- Title: High-Dimensional Bayesian Optimization Using Both Random and Supervised Embeddings
- Authors: Rémy Priem, Youssef Diouane, Nathalie Bartoli, Sylvain Dubreuil, Paul Saves
- Abstract summary: This paper proposes a high-dimensional optimization method incorporating linear embedding subspaces of small dimension.
The resulting BO method combines in an adaptive way both random and supervised linear embeddings.
The obtained results show the high potential of EGORSE to solve high-dimensional blackbox optimization problems.
- Score: 0.6291443816903801
- License:
- Abstract: Bayesian optimization (BO) is one of the most powerful strategies to solve computationally expensive-to-evaluate blackbox optimization problems. However, BO methods are conventionally used for optimization problems of small dimension because of the curse of dimensionality. In this paper, a high-dimensional optimization method incorporating linear embedding subspaces of small dimension is proposed to efficiently perform the optimization. An adaptive learning strategy for these linear embeddings is carried out in conjunction with the optimization. The resulting BO method, named efficient global optimization coupled with random and supervised embedding (EGORSE), combines in an adaptive way both random and supervised linear embeddings. EGORSE has been compared to state-of-the-art algorithms and tested on academic examples with a number of design variables ranging from 10 to 600. The obtained results show the high potential of EGORSE to solve high-dimensional blackbox optimization problems, in terms of both CPU time and the limited number of calls to the expensive blackbox simulation.
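To make the embedding idea concrete, below is a minimal, self-contained sketch of search through a random linear embedding, in the spirit of the random-subspace component that EGORSE builds on; it is not the authors' implementation. The toy objective, the bounds, and the use of plain random sampling in place of a Gaussian-process surrogate and acquisition function are illustrative assumptions, and a supervised embedding (for example, directions fitted to previously evaluated points) could be swapped in for the random matrix `A`.

```python
import numpy as np

def blackbox(x):
    # Toy stand-in for the expensive simulation: a shifted sphere function.
    return float(np.sum((x - 0.2) ** 2))

def random_embedding_search(f, D=100, d=4, budget=50, seed=0):
    """Search a D-dimensional box through a random d-dimensional linear
    embedding x = clip(A @ y, -1, 1). A surrogate model and acquisition
    function would normally pick y; uniform sampling keeps the sketch short."""
    rng = np.random.default_rng(seed)
    A = rng.standard_normal((D, d))           # random linear embedding
    best_x, best_f = None, np.inf
    for _ in range(budget):
        y = rng.uniform(-1.0, 1.0, size=d)    # candidate in the small subspace
        x = np.clip(A @ y, -1.0, 1.0)         # map back to the original bounds
        fx = f(x)                             # one expensive evaluation
        if fx < best_f:
            best_x, best_f = x, fx
    return best_x, best_f

if __name__ == "__main__":
    _, f_star = random_embedding_search(blackbox)
    print(f"best value found: {f_star:.4f}")
```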
Related papers
- BOIDS: High-dimensional Bayesian Optimization via Incumbent-guided Direction Lines and Subspace Embeddings [14.558601519561721]
We introduce BOIDS, a novel high-dimensional BO algorithm that guides optimization by a sequence of one-dimensional direction lines.
We also propose an adaptive selection technique to identify the most promising lines for each round of line-based optimization.
Our experimental results show that BOIDS outperforms state-of-the-art baselines on various synthetic and real-world benchmark problems.
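As a rough sketch of what one round of line-based search can look like (not the BOIDS algorithm itself, whose line selection and surrogate modelling are more involved), the helper below evaluates the objective along a single one-dimensional line through the incumbent and keeps the best point; the grid of step sizes and the box bounds are illustrative assumptions.

```python
import numpy as np

def line_search_round(f, incumbent, direction, lows, highs, n_points=25):
    """Evaluate f along the 1-D line incumbent + t * direction, clipped to the
    box [lows, highs], and return the best point found. A surrogate-driven
    acquisition would normally choose t; a uniform grid keeps this minimal."""
    ts = np.linspace(-1.0, 1.0, n_points)
    candidates = np.clip(incumbent + ts[:, None] * direction, lows, highs)
    values = np.array([f(x) for x in candidates])
    best = int(np.argmin(values))
    return candidates[best], values[best]
```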
arXiv Detail & Related papers (2024-12-17T13:51:24Z)
- Analyzing and Enhancing the Backward-Pass Convergence of Unrolled Optimization [50.38518771642365]
The integration of constrained optimization models as components in deep networks has led to promising advances on many specialized learning tasks.
A central challenge in this setting is backpropagation through the solution of an optimization problem, which often lacks a closed form.
This paper provides theoretical insights into the backward pass of unrolled optimization, showing that it is equivalent to the solution of a linear system by a particular iterative method.
A system called Folded Optimization is proposed to construct more efficient backpropagation rules from unrolled solver implementations.
arXiv Detail & Related papers (2023-12-28T23:15:18Z)
- Accelerating Cutting-Plane Algorithms via Reinforcement Learning Surrogates [49.84541884653309]
A current standard approach to solving convex discrete optimization problems is the use of cutting-plane algorithms.
Despite the existence of a number of general-purpose cut-generating algorithms, large-scale discrete optimization problems continue to suffer from intractability.
We propose a method for accelerating cutting-plane algorithms via reinforcement learning.
arXiv Detail & Related papers (2023-07-17T20:11:56Z)
- Scalable Bayesian optimization with high-dimensional outputs using randomized prior networks [3.0468934705223774]
We propose a deep learning framework for BO and sequential decision making based on bootstrapped ensembles of neural architectures with randomized priors.
We show that the proposed framework can approximate functional relationships between design variables and quantities of interest, even in cases where the latter take values in high-dimensional vector spaces or even infinite-dimensional function spaces.
We test the proposed framework against state-of-the-art methods for BO and demonstrate superior performance across several challenging tasks with high-dimensional outputs.
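The snippet below is a deliberately simplified sketch of the randomized-prior ensemble idea, with linear models standing in for the neural networks of the cited framework: each member adds a fixed random "prior" function to a fit on a bootstrap resample, and the spread across members provides the uncertainty estimate an acquisition function would consume. The function names and the linear-model simplification are assumptions made for brevity.

```python
import numpy as np

def fit_randomized_prior_ensemble(X, y, n_members=10, prior_scale=1.0, seed=0):
    """Bootstrapped ensemble with randomized priors (linear stand-ins for
    neural networks). Each member = fixed random prior + least-squares fit
    to a bootstrap resample of the residuals."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    members = []
    for _ in range(n_members):
        w_prior = prior_scale * rng.standard_normal(d)   # fixed random prior
        idx = rng.integers(0, n, size=n)                 # bootstrap resample
        resid = y[idx] - X[idx] @ w_prior
        w_fit, *_ = np.linalg.lstsq(X[idx], resid, rcond=None)
        members.append(w_prior + w_fit)
    return np.stack(members)

def ensemble_predict(members, X):
    """Mean prediction and ensemble spread (used as the uncertainty)."""
    preds = X @ members.T                                # (n_points, n_members)
    return preds.mean(axis=1), preds.std(axis=1)
```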
arXiv Detail & Related papers (2023-02-14T18:55:21Z)
- Tree ensemble kernels for Bayesian optimization with known constraints over mixed-feature spaces [54.58348769621782]
Tree ensembles can be well-suited for black-box optimization tasks such as algorithm tuning and neural architecture search.
Two well-known challenges in using tree ensembles for black-box optimization are (i) effectively quantifying model uncertainty for exploration and (ii) optimizing over the piece-wise constant acquisition function.
Our framework performs as well as state-of-the-art methods for unconstrained black-box optimization over continuous/discrete features and outperforms competing methods for problems combining mixed-variable feature spaces and known input constraints.
arXiv Detail & Related papers (2022-07-02T16:59:37Z)
- Computationally Efficient High-Dimensional Bayesian Optimization via Variable Selection [0.5439020425818999]
We develop a new computationally efficient high-dimensional BO method that exploits variable selection.
Our method is able to automatically learn axis-aligned sub-spaces, i.e. spaces containing selected variables.
We empirically show the efficacy of our method on several synthetic and real problems.
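For intuition, here is a minimal sketch (under stated assumptions, not the cited method) of what an axis-aligned subspace looks like in practice: variables are ranked with a crude sensitivity score, only the selected coordinates are varied, and the remaining coordinates are held at the incumbent's values while the search runs over the selected ones.

```python
import numpy as np

def select_active_variables(X, y, k=5):
    """Rank variables by absolute correlation with the observed objective and
    keep the top k. The cited method learns the axis-aligned subspace inside
    the BO loop; this score is only a stand-in for illustration."""
    scores = np.abs(np.corrcoef(X.T, y)[:-1, -1])
    return np.argsort(scores)[::-1][:k]

def embed_into_full_space(z, active_idx, incumbent):
    """Place a low-dimensional candidate z on the selected coordinates and
    keep every other coordinate fixed at the incumbent."""
    x = incumbent.copy()
    x[active_idx] = z
    return x
```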
arXiv Detail & Related papers (2021-09-20T01:55:43Z)
- High dimensional Bayesian Optimization Algorithm for Complex System in Time Series [1.9371782627708491]
This paper presents a novel high dimensional Bayesian optimization algorithm.
Based on the time-dependent or dimension-dependent characteristics of the model, the proposed algorithm can reduce the dimension evenly.
To increase the final accuracy of the optimal solution, the proposed algorithm adds a local search based on a series of Adam-based steps at the final stage.
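The final-stage local search described above can be pictured with the small sketch below: an Adam loop driven by central finite-difference gradients of the blackbox objective. The step size, iteration count, and finite-difference scheme are assumptions for illustration, not the paper's exact procedure.

```python
import numpy as np

def adam_refine(f, x0, steps=100, lr=0.05, fd_eps=1e-3,
                beta1=0.9, beta2=0.999, eps=1e-8):
    """Adam-based local refinement around a candidate optimum, with gradients
    of the blackbox f approximated by central finite differences."""
    x = np.asarray(x0, dtype=float).copy()
    m = np.zeros_like(x)
    v = np.zeros_like(x)
    for t in range(1, steps + 1):
        g = np.array([(f(x + fd_eps * e) - f(x - fd_eps * e)) / (2 * fd_eps)
                      for e in np.eye(x.size)])          # finite-difference grad
        m = beta1 * m + (1 - beta1) * g
        v = beta2 * v + (1 - beta2) * g ** 2
        m_hat = m / (1 - beta1 ** t)
        v_hat = v / (1 - beta2 ** t)
        x -= lr * m_hat / (np.sqrt(v_hat) + eps)
    return x
```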
arXiv Detail & Related papers (2021-08-04T21:21:17Z)
- Divide and Learn: A Divide and Conquer Approach for Predict+Optimize [50.03608569227359]
The predict+optimize problem combines machine learning of problem coefficients with an optimization problem that uses the predicted coefficients.
We show how to directly express the loss of the optimization problem in terms of the predicted coefficients as a piece-wise linear function.
We propose a novel divide-and-conquer algorithm to tackle optimization problems without this restriction and predict its coefficients using the optimization loss.
arXiv Detail & Related papers (2020-12-04T00:26:56Z)
- Bilevel Optimization: Convergence Analysis and Enhanced Design [63.64636047748605]
Bilevel optimization is a tool for many machine learning problems.
We propose a novel stochastic gradient estimator named stocBiO.
arXiv Detail & Related papers (2020-10-15T18:09:48Z)
- Self-Directed Online Machine Learning for Topology Optimization [58.920693413667216]
Self-directed Online Learning Optimization integrates Deep Neural Network (DNN) with Finite Element Method (FEM) calculations.
Our algorithm was tested on four types of problems including compliance minimization, fluid-structure optimization, heat transfer enhancement and truss optimization.
It reduced the computational time by 2 to 5 orders of magnitude compared with directly using heuristic methods, and outperformed all state-of-the-art algorithms tested in our experiments.
arXiv Detail & Related papers (2020-02-04T20:00:28Z)
- Bayesian Optimization for Policy Search in High-Dimensional Systems via Automatic Domain Selection [1.1240669509034296]
We propose to leverage results from optimal control to scale BO to higher dimensional control tasks.
We show how we can make use of a learned dynamics model in combination with a model-based controller to simplify the BO problem.
We present an experimental evaluation on real hardware, as well as simulated tasks including a 48-dimensional policy for a quadcopter.
arXiv Detail & Related papers (2020-01-21T09:04:15Z)
This list is automatically generated from the titles and abstracts of the papers on this site.