A comparison of mixed-variables Bayesian optimization approaches
- URL: http://arxiv.org/abs/2111.01533v1
- Date: Sat, 30 Oct 2021 09:26:34 GMT
- Title: A comparison of mixed-variables Bayesian optimization approaches
- Authors: Jhouben Cuesta-Ramirez and Rodolphe Le Riche and Olivier Roustant and
Guillaume Perrin and Cedric Durantin and Alain Gliere
- Abstract summary: Most real optimization problems are defined over a mixed search space where the variables are both discrete and continuous.
In this article, costly mixed problems are approached through Gaussian processes where the discrete variables are relaxed into continuous latent variables.
In particular, the reformulation of the problem with continuous latent variables is put in competition with searches working directly in the mixed space.
- Score: 0.0
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Most real optimization problems are defined over a mixed search space where
the variables are both discrete and continuous. In engineering applications,
the objective function is typically calculated with a numerically costly
black-box simulation. General mixed and costly optimization problems are
therefore of great practical interest, yet their resolution remains in a
large part an open scientific question. In this article, costly mixed problems
are approached through Gaussian processes where the discrete variables are
relaxed into continuous latent variables. The continuous space is more easily
explored by classical Bayesian optimization techniques than a mixed space
would be. Discrete variables are recovered either after the continuous
optimization, or simultaneously with an additional continuous-discrete
compatibility constraint that is handled with augmented Lagrangians. Several
possible implementations of such Bayesian mixed optimizers are compared. In
particular, the reformulation of the problem with continuous latent variables
is put in competition with searches working directly in the mixed space. Among
the algorithms involving latent variables and an augmented Lagrangian,
particular attention is devoted to the Lagrange multipliers, for which local
and global estimation techniques are studied. The comparisons are based on
the repeated optimization of three analytical functions and a beam design
problem.
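The latent-variable relaxation with subsequent discrete recovery described in the abstract can be sketched in a few lines. The following is a minimal illustration only, not the paper's exact setup: it assumes a toy objective, a numpy-only GP surrogate, a lower-confidence-bound acquisition optimized by random candidate search, and simple rounding as the recovery step.

```python
import numpy as np

def rbf_kernel(A, B, ls=0.5):
    # Squared-exponential kernel between the rows of A and B.
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * ls ** 2))

def gp_posterior(X, y, Xs, noise=1e-6):
    # Zero-mean GP regression posterior mean and standard deviation.
    K = rbf_kernel(X, X) + noise * np.eye(len(X))
    Ks = rbf_kernel(X, Xs)
    mu = Ks.T @ np.linalg.solve(K, y)
    var = 1.0 - (Ks * np.linalg.solve(K, Ks)).sum(axis=0)
    return mu, np.sqrt(np.maximum(var, 1e-12))

def objective(x, z):
    # Toy mixed objective: continuous x in [0, 1], discrete z in {0, 1, 2};
    # minimum at x = 0.3, z = 1.
    return (x - 0.3) ** 2 + 0.5 * (z - 1) ** 2

rng = np.random.default_rng(0)
# Relax the discrete variable into a continuous latent coordinate in [0, 2].
X = rng.uniform([0.0, 0.0], [1.0, 2.0], size=(8, 2))          # initial design
y = np.array([objective(x, int(np.rint(z))) for x, z in X])   # evaluate with recovered z

for _ in range(15):
    cand = rng.uniform([0.0, 0.0], [1.0, 2.0], size=(256, 2)) # acquisition candidates
    mu, sd = gp_posterior(X, (y - y.mean()) / y.std(), cand)
    xn = cand[np.argmin(mu - 2.0 * sd)]                       # lower confidence bound
    # Recover the discrete variable from the latent coordinate by rounding,
    # i.e. subsequently to the continuous optimization.
    yn = objective(xn[0], int(np.rint(xn[1])))
    X, y = np.vstack([X, xn]), np.append(y, yn)

print("best point:", X[np.argmin(y)], "best value:", y.min())
```

The alternative discussed in the abstract, enforcing continuous-discrete compatibility simultaneously via an augmented Lagrangian, would replace the rounding step with a penalized constraint on the distance between the latent coordinate and its nearest discrete level.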
Related papers
- Convergence of Expectation-Maximization Algorithm with Mixed-Integer Optimization [5.319361976450982]
This paper introduces a set of conditions that ensure the convergence of a specific class of EM algorithms.
Our results offer a new analysis technique for iterative algorithms that solve mixed-integer non-linear optimization problems.
arXiv Detail & Related papers (2024-01-31T11:42:46Z)
- Numerical Methods for Convex Multistage Stochastic Optimization [86.45244607927732]
We focus on Stochastic Programming (SP), Stochastic Optimal Control (SOC) and Markov Decision Processes (MDP).
Recent progress in solving convex multistage stochastic problems is based on cutting plane approximations of the cost-to-go functions of dynamic programming equations.
Cutting plane type methods can handle multistage problems with a large number of stages, but a relatively smaller number of state (decision) variables.
arXiv Detail & Related papers (2023-03-28T01:30:40Z)
- Global and Preference-based Optimization with Mixed Variables using Piecewise Affine Surrogates [1.30536490219656]
This paper proposes a novel surrogate-based global optimization algorithm to solve linearly constrained mixed-variable problems.
We introduce two types of exploration functions to efficiently search the feasible domain via mixed-integer linear programming solvers.
The proposed algorithms often achieve results better than or comparable to those of other existing methods.
arXiv Detail & Related papers (2023-02-09T15:04:35Z)
- Tree ensemble kernels for Bayesian optimization with known constraints over mixed-feature spaces [54.58348769621782]
Tree ensembles can be well-suited for black-box optimization tasks such as algorithm tuning and neural architecture search.
Two well-known challenges in using tree ensembles for black-box optimization are (i) effectively quantifying model uncertainty for exploration and (ii) optimizing over the piece-wise constant acquisition function.
Our framework performs as well as state-of-the-art methods for unconstrained black-box optimization over continuous/discrete features and outperforms competing methods for problems combining mixed-variable feature spaces and known input constraints.
arXiv Detail & Related papers (2022-07-02T16:59:37Z)
- Nonequilibrium Monte Carlo for unfreezing variables in hard combinatorial optimization [1.1783108699768]
We introduce a quantum-inspired family of nonlocal Nonequilibrium Monte Carlo (NMC) algorithms by developing an adaptive gradient-free strategy.
We observe significant speedup and robustness over both specialized solvers and generic solvers.
arXiv Detail & Related papers (2021-11-26T17:45:32Z)
- Optimization on manifolds: A symplectic approach [127.54402681305629]
We propose a dissipative extension of Dirac's theory of constrained Hamiltonian systems as a general framework for solving optimization problems.
Our class of (accelerated) algorithms are not only simple and efficient but also applicable to a broad range of contexts.
arXiv Detail & Related papers (2021-07-23T13:43:34Z)
- Implicit differentiation for fast hyperparameter selection in non-smooth convex learning [87.60600646105696]
We study first-order methods when the inner optimization problem is convex but non-smooth.
We show that the forward-mode differentiation of proximal gradient descent and proximal coordinate descent yield sequences of Jacobians converging toward the exact Jacobian.
arXiv Detail & Related papers (2021-05-04T17:31:28Z)
- The Connection between Discrete- and Continuous-Time Descriptions of Gaussian Continuous Processes [60.35125735474386]
We show that discretizations yielding consistent estimators have the property of 'invariance under coarse-graining'.
This result explains why combining differencing schemes for derivatives reconstruction and local-in-time inference approaches does not work for time series analysis of second or higher order differential equations.
arXiv Detail & Related papers (2021-01-16T17:11:02Z)
- Combinatorial Bayesian Optimization with Random Mapping Functions to Convex Polytopes [43.19936635161588]
We present a method for Bayesian optimization that can operate well in a large combinatorial space.
Our algorithm shows satisfactory performance compared to existing methods.
arXiv Detail & Related papers (2020-11-26T02:22:41Z)
- Global Optimization of Gaussian processes [52.77024349608834]
We propose a reduced-space formulation with Gaussian processes trained on few data points.
The approach also leads to significantly smaller and computationally cheaper sub-solvers for lower bounding.
In total, the proposed method reduces the time to convergence by orders of magnitude.
arXiv Detail & Related papers (2020-05-21T20:59:11Z)
- Bayesian optimization of variable-size design space problems [0.0]
Two alternative Bayesian optimization-based approaches are proposed to solve this type of optimization problem.
The first approach is a budget allocation strategy that focuses the computational budget on the most promising design sub-spaces.
The second approach is instead based on the definition of a kernel function that computes the covariance between samples characterized by partially different sets of variables.
arXiv Detail & Related papers (2020-03-06T16:30:44Z)
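A kernel of the kind described in the second approach above, giving a covariance between samples with partially different variable sets, could look like the following. This is a hypothetical construction for illustration only: shared variables contribute a squared-exponential term and each unshared variable multiplies in a fixed decay factor. The paper's actual kernel definition may differ.

```python
import numpy as np

def varsize_kernel(x1, x2, ls=1.0, decay=0.5):
    # Covariance between two samples given as {variable_name: value} dicts.
    # Shared variables contribute a squared-exponential term; every variable
    # present in only one of the two samples multiplies in a fixed decay.
    shared = set(x1) & set(x2)
    d2 = sum((x1[k] - x2[k]) ** 2 for k in shared)
    n_unshared = len(set(x1) ^ set(x2))
    return np.exp(-d2 / (2 * ls ** 2)) * decay ** n_unshared

a = {"length": 1.0, "width": 0.5}
b = {"length": 1.2, "width": 0.5, "n_holes": 3.0}
c = {"length": 1.0, "width": 0.5}

print(varsize_kernel(a, c))  # identical variable sets and values -> 1.0
print(varsize_kernel(a, b))  # one unshared variable -> decayed covariance
```

Note that before using such a function in a Gaussian process one would have to verify that it is positive semi-definite over the intended design space; this sketch only illustrates the interface.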
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of this information and is not responsible for any consequences of its use.