Versatile Black-Box Optimization
- URL: http://arxiv.org/abs/2004.14014v1
- Date: Wed, 29 Apr 2020 08:20:36 GMT
- Title: Versatile Black-Box Optimization
- Authors: Jialin Liu, Antoine Moreau, Mike Preuss, Baptiste Roziere, Jeremy
Rapin, Fabien Teytaud, Olivier Teytaud
- Abstract summary: We present Shiwa, an algorithm good at both discrete and continuous, noisy and noise-free, sequential and parallel, black-box optimization.
Our algorithm is experimentally compared to competitors on YABBOB, a BBOB comparable testbed, and on some variants of it, and then validated on several real world testbeds.
- Score: 6.92253842648213
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Choosing automatically the right algorithm using problem descriptors is a
classical component of combinatorial optimization. It is also a good tool for
making evolutionary algorithms fast, robust and versatile. We present Shiwa, an
algorithm good at both discrete and continuous, noisy and noise-free,
sequential and parallel, black-box optimization. Our algorithm is
experimentally compared to competitors on YABBOB, a BBOB comparable testbed,
and on some variants of it, and then validated on several real world testbeds.
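The core idea behind a wizard like Shiwa — dispatching to a base optimizer from problem descriptors such as dimension, budget, noise, and parallelism — can be sketched as below. The descriptor fields, thresholds, and optimizer names are illustrative placeholders, not Shiwa's actual selection rules:

```python
from dataclasses import dataclass

@dataclass
class ProblemDescriptor:
    """Illustrative black-box problem descriptors (not Shiwa's real feature set)."""
    dimension: int
    budget: int        # total number of function evaluations allowed
    num_workers: int   # > 1 means evaluations can run in parallel
    noisy: bool
    discrete: bool

def select_algorithm(p: ProblemDescriptor) -> str:
    """Toy rule-based wizard: map descriptors to a base optimizer name."""
    if p.discrete:
        return "DiscreteOnePlusOne"   # mutation-based evolutionary search
    if p.noisy:
        return "TBPSA"                # population-based, averages out noise
    if p.num_workers > p.dimension:
        return "MetaRecentering"      # highly parallel one-shot sampling
    if p.budget > 100 * p.dimension:
        return "CMA"                  # enough budget for covariance adaptation
    return "Cobyla"                   # cheap local search for small budgets

print(select_algorithm(ProblemDescriptor(dimension=5, budget=1000,
                                         num_workers=1, noisy=False,
                                         discrete=False)))
```

The point is only the dispatch pattern: a real wizard learns or hand-tunes these rules from massive benchmarking rather than hard-coding a handful of branches.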
Related papers
- MA-BBOB: A Problem Generator for Black-Box Optimization Using Affine
Combinations and Shifts [1.2617078020344619]
We present the MA-BBOB function generator, which uses the BBOB suite as component functions in an affine combination.
We show a potential use-case of MA-BBOB in generating a wide set of training and testing data for algorithm selectors.
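An affine combination of shifted component functions, in the spirit of the MA-BBOB generator, can be sketched as follows. The component functions, weights, and shifts here are toy placeholders (MA-BBOB combines actual BBOB functions and applies its own scaling):

```python
from typing import Callable, List, Sequence

def affine_combination(components: List[Callable[[Sequence[float]], float]],
                       weights: List[float],
                       shifts: List[List[float]]) -> Callable[[Sequence[float]], float]:
    """Build f(x) = sum_i w_i * f_i(x - s_i), with weights normalized to sum to 1."""
    total = sum(weights)
    norm = [w / total for w in weights]
    def f(x: Sequence[float]) -> float:
        value = 0.0
        for f_i, w, s in zip(components, norm, shifts):
            shifted = [xj - sj for xj, sj in zip(x, s)]
            value += w * f_i(shifted)
        return value
    return f

# Two toy component functions (stand-ins for BBOB functions).
sphere = lambda x: sum(v * v for v in x)
abs_sum = lambda x: sum(abs(v) for v in x)

f = affine_combination([sphere, abs_sum], weights=[1.0, 3.0],
                       shifts=[[1.0, 0.0], [0.0, 0.0]])
print(f([1.0, 0.0]))  # 0.25 * sphere([0, 0]) + 0.75 * abs_sum([1, 0]) = 0.75
```

Varying the weights and shifts yields a large family of related test problems from a small pool of components, which is what makes such generators useful for training algorithm selectors.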
arXiv Detail & Related papers (2023-12-18T10:23:09Z)
- Accelerating Cutting-Plane Algorithms via Reinforcement Learning Surrogates [49.84541884653309]
A current standard approach to solving convex discrete optimization problems is the use of cutting-plane algorithms.
Despite the existence of a number of general-purpose cut-generating algorithms, large-scale discrete optimization problems continue to suffer from intractability.
We propose a method for accelerating cutting-plane algorithms via reinforcement learning.
arXiv Detail & Related papers (2023-07-17T20:11:56Z)
- HARRIS: Hybrid Ranking and Regression Forests for Algorithm Selection [75.84584400866254]
We propose a new algorithm selector leveraging special forests, combining the strengths of both approaches while alleviating their weaknesses.
HARRIS' decisions are based on a forest model whose trees are built by optimizing a hybrid ranking and regression loss function.
arXiv Detail & Related papers (2022-10-31T14:06:11Z)
- Tree ensemble kernels for Bayesian optimization with known constraints over mixed-feature spaces [54.58348769621782]
Tree ensembles can be well-suited for black-box optimization tasks such as algorithm tuning and neural architecture search.
Two well-known challenges in using tree ensembles for black-box optimization are (i) effectively quantifying model uncertainty for exploration and (ii) optimizing over the piece-wise constant acquisition function.
Our framework performs as well as state-of-the-art methods for unconstrained black-box optimization over continuous/discrete features and outperforms competing methods for problems combining mixed-variable feature spaces and known input constraints.
arXiv Detail & Related papers (2022-07-02T16:59:37Z)
- Benchmarking Feature-based Algorithm Selection Systems for Black-box Numerical Optimization [0.0]
Feature-based algorithm selection aims to automatically find the best one from a portfolio of optimization algorithms on an unseen problem.
This paper analyzes algorithm selection systems on the 24 noiseless black-box optimization benchmarking functions.
We show that the performance of algorithm selection systems can be significantly improved by using sequential least squares programming as a pre-solver.
arXiv Detail & Related papers (2021-09-17T07:17:40Z)
- Machine Learning for Online Algorithm Selection under Censored Feedback [71.6879432974126]
In online algorithm selection (OAS), instances of an algorithmic problem class are presented to an agent one after another, and the agent has to quickly select a presumably best algorithm from a fixed set of candidate algorithms.
For decision problems such as satisfiability (SAT), quality typically refers to the algorithm's runtime.
In this work, we revisit multi-armed bandit algorithms for OAS and discuss their capability of dealing with the problem.
We adapt them towards runtime-oriented losses, allowing for partially censored data while keeping a space- and time-complexity independent of the time horizon.
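A minimal sketch of bandit-based online algorithm selection is shown below: plain UCB1 over observed runtimes, with observations censored at a cutoff. This illustrates only the selection loop, not the paper's runtime-oriented loss adaptations; the simulated runtime distributions are invented for the example:

```python
import math
import random

def ucb_select(counts, mean_losses, t, c=2.0):
    """UCB1 for loss minimization: pick the algorithm with the lowest
    mean loss minus an exploration bonus (a lower confidence bound)."""
    for a in range(len(counts)):
        if counts[a] == 0:
            return a  # try every algorithm once first
    return min(range(len(counts)),
               key=lambda a: mean_losses[a] - c * math.sqrt(math.log(t) / counts[a]))

def run_oas(runtime_samplers, horizon, cutoff=1.0, seed=0):
    """Sequentially select algorithms; runtimes above `cutoff` are censored."""
    rng = random.Random(seed)
    k = len(runtime_samplers)
    counts, means = [0] * k, [0.0] * k
    for t in range(1, horizon + 1):
        a = ucb_select(counts, means, t)
        runtime = min(runtime_samplers[a](rng), cutoff)  # censored observation
        counts[a] += 1
        means[a] += (runtime - means[a]) / counts[a]     # incremental mean update
    return counts

# Two simulated algorithms: one fast on average, one slow.
samplers = [lambda rng: rng.uniform(0.1, 0.3),
            lambda rng: rng.uniform(0.6, 1.4)]
counts = run_oas(samplers, horizon=200)
print(counts)  # the fast algorithm should be chosen far more often
```

Note that the per-round state (counts and running means) is constant in size, which is the space-complexity property the summary refers to.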
arXiv Detail & Related papers (2021-09-13T18:10:52Z)
- Black-Box Optimization Revisited: Improving Algorithm Selection Wizards through Massive Benchmarking [8.874754363200614]
Existing studies in black-box optimization for machine learning suffer from low generalizability.
We propose a benchmark suite, OptimSuite, which covers a broad range of black-box optimization problems.
ABBO achieves competitive performance on all benchmark suites.
arXiv Detail & Related papers (2020-10-08T14:17:30Z)
- Sub-linear Regret Bounds for Bayesian Optimisation in Unknown Search Spaces [63.22864716473051]
We propose a novel BO algorithm which expands (and shifts) the search space over iterations.
We show theoretically that for both our algorithms, the cumulative regret grows at sub-linear rates.
arXiv Detail & Related papers (2020-09-05T14:24:40Z)
- Towards Dynamic Algorithm Selection for Numerical Black-Box Optimization: Investigating BBOB as a Use Case [4.33419118449588]
We show that even single-switch dynamic algorithm selection (dynAS) can potentially result in significant performance gains.
We also discuss key challenges in dynAS, and argue that the BBOB-framework can become a useful tool in overcoming these.
arXiv Detail & Related papers (2020-06-11T16:36:11Z)
- Combinatorial Black-Box Optimization with Expert Advice [28.45580493454314]
We consider the problem of black-box function optimization over the hypercube.
We propose a computationally efficient model learning algorithm based on multilinear polynomials and exponential weight updates.
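The exponential weight update over experts (the classic Hedge rule) is the building block named in the summary. A bare-bones sketch, without the paper's multilinear surrogate model, and with invented losses and learning rate:

```python
import math

def hedge_update(weights, losses, eta=0.5):
    """Multiply each expert weight by exp(-eta * loss), then renormalize."""
    updated = [w * math.exp(-eta * l) for w, l in zip(weights, losses)]
    total = sum(updated)
    return [w / total for w in updated]

# Three experts; expert 0 consistently incurs the lowest loss.
w = [1 / 3] * 3
for _ in range(20):
    w = hedge_update(w, losses=[0.1, 0.5, 0.9])
print(w)  # probability mass concentrates on expert 0
```

Repeated multiplicative penalties shrink high-loss experts exponentially fast, which is what makes the update computationally cheap per round.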
arXiv Detail & Related papers (2020-06-06T20:26:19Z)
- Stepwise Model Selection for Sequence Prediction via Deep Kernel Learning [100.83444258562263]
We propose a novel Bayesian optimization (BO) algorithm to tackle the challenge of model selection in this setting.
In order to solve the resulting multiple black-box function optimization problem jointly and efficiently, we exploit potential correlations among black-box functions.
We are the first to formulate the problem of stepwise model selection (SMS) for sequence prediction, and to design and demonstrate an efficient joint-learning algorithm for this purpose.
arXiv Detail & Related papers (2020-01-12T09:42:19Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.