Switching between Numerical Black-box Optimization Algorithms with
Warm-starting Policies
- URL: http://arxiv.org/abs/2204.06539v2
- Date: Thu, 12 Jan 2023 09:34:46 GMT
- Title: Switching between Numerical Black-box Optimization Algorithms with
Warm-starting Policies
- Authors: Dominik Schröder, Diederick Vermetten, Hao Wang, Carola Doerr,
Thomas Bäck
- Abstract summary: We build on Vermetten et al. [GECCO 2020] to investigate promising switches between pairs of algorithms for numerical black-box optimization.
We show that with a single switch between two algorithms, we outperform the best static choice among the five algorithms.
We also show that for switching between BFGS and CMA-ES, a proper warm-starting of the parameters is crucial to realize high-performance gains.
- Score: 3.967483941966979
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: When solving optimization problems with black-box approaches, the algorithms
gather valuable information about the problem instance during the optimization
process. This information is used to adjust the distributions from which new
solution candidates are sampled. In fact, a key objective in evolutionary
computation is to identify the most effective ways to collect and exploit
instance knowledge. However, while considerable work has been devoted to adjusting
hyper-parameters of black-box optimization algorithms on the fly or to exchanging
some of their modular components, we barely know how to effectively switch
between different black-box optimization algorithms.
In this work, we build on the recent study of Vermetten et al. [GECCO 2020],
who presented a data-driven approach to investigate promising switches between
pairs of algorithms for numerical black-box optimization. We replicate their
approach with a portfolio of five algorithms and investigate whether the
predicted performance gains are realized when executing the most promising
switches. Our results suggest that with a single switch between two algorithms,
we outperform the best static choice among the five algorithms on 48 out of the
120 considered problem instances, the 24 BBOB functions in five different
dimensions. We also show that for switching between BFGS and CMA-ES, a proper
warm-starting of the parameters is crucial to realize high-performance gains.
Lastly, with a sensitivity analysis, we find the actual performance gain per
run is largely affected by the switching point, and in some cases, the
switching point yielding the best actual performance differs from the one
computed from the theoretical gain.
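The single-switch idea from the abstract can be sketched in a few lines: run one optimizer for a fixed budget, then warm-start a second optimizer from the first one's incumbent instead of restarting from scratch. The sketch below is illustrative only, not the paper's implementation: it uses a simple (1+1)-ES with a 1/5th-style success rule as a stand-in for CMA-ES, SciPy's BFGS for the second phase, and made-up names and budgets (`switch_once`, `switch_budget`).

```python
import numpy as np
from scipy.optimize import minimize

def sphere(x):
    """Toy objective (BBOB f1-like): sum of squares."""
    return float(np.sum(x ** 2))

def switch_once(f, x0, switch_budget=200, sigma=0.5, rng=None):
    """Phase 1: (1+1)-ES with a 1/5th-style success rule (stand-in for CMA-ES).
    Phase 2: BFGS warm-started at the ES incumbent, not at x0."""
    rng = np.random.default_rng(rng)
    x_best = np.asarray(x0, dtype=float)
    f_best = f(x_best)
    for _ in range(switch_budget):
        cand = x_best + sigma * rng.standard_normal(x_best.shape)
        f_cand = f(cand)
        if f_cand < f_best:
            x_best, f_best = cand, f_cand
            sigma *= 1.5   # success: widen the search
        else:
            sigma *= 0.9   # failure: narrow it
    # The switch: the local optimizer inherits the incumbent as its start.
    res = minimize(f, x_best, method="BFGS")
    return res.x, res.fun

x, fx = switch_once(sphere, np.full(5, 3.0), rng=0)
```

In the paper, warm-starting goes further than reusing the best point (e.g. initializing CMA-ES's step size and covariance from the preceding run's state); this sketch only shows the minimal version of handing over the incumbent.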
Related papers
- Discovering Preference Optimization Algorithms with and for Large Language Models [50.843710797024805]
Offline preference optimization is a key method for enhancing and controlling the quality of Large Language Model (LLM) outputs.
We perform objective discovery to automatically discover new state-of-the-art preference optimization algorithms without (expert) human intervention.
Experiments demonstrate the state-of-the-art performance of DiscoPOP, a novel algorithm that adaptively blends logistic and exponential losses.
arXiv Detail & Related papers (2024-06-12T16:58:41Z) - Efficient computation of the Knowledge Gradient for Bayesian
Optimization [1.0497128347190048]
One-shot Hybrid KG is a new approach that combines several of the previously proposed ideas and is cheap to compute as well as powerful and efficient.
All experiments are implemented in BoTorch and empirically show drastically reduced computational overhead with equal or improved performance.
arXiv Detail & Related papers (2022-09-30T10:39:38Z) - Per-run Algorithm Selection with Warm-starting using Trajectory-based
Features [5.073358743426584]
Per-instance algorithm selection seeks to recommend, for a given problem instance, one or several suitable algorithms.
We propose an online algorithm selection scheme which we coin per-run algorithm selection.
We show that our approach outperforms static per-instance algorithm selection.
arXiv Detail & Related papers (2022-04-20T14:30:42Z) - Lower Bounds and Optimal Algorithms for Smooth and Strongly Convex
Decentralized Optimization Over Time-Varying Networks [79.16773494166644]
We consider the task of minimizing the sum of smooth and strongly convex functions stored in a decentralized manner across the nodes of a communication network.
We design two optimal algorithms that attain these lower bounds.
We corroborate the theoretical efficiency of these algorithms by performing an experimental comparison with existing state-of-the-art methods.
arXiv Detail & Related papers (2021-06-08T15:54:44Z) - Dynamic Cat Swarm Optimization Algorithm for Backboard Wiring Problem [0.9990687944474739]
This paper presents a powerful swarm intelligence meta-heuristic optimization algorithm called Dynamic Cat Swarm Optimization.
The proposed algorithm suggests a new method to provide a proper balance between the exploration and exploitation phases by modifying the selection scheme and the seeking mode of the algorithm.
Optimization results show the effectiveness of the proposed algorithm, which ranks first compared to several well-known algorithms available in the literature.
arXiv Detail & Related papers (2021-04-27T19:41:27Z) - Bayesian Algorithm Execution: Estimating Computable Properties of
Black-box Functions Using Mutual Information [78.78486761923855]
In many real world problems, we want to infer some property of an expensive black-box function f, given a budget of T function evaluations.
We present a procedure, InfoBAX, that sequentially chooses queries that maximize mutual information with respect to the algorithm's output.
On these problems, InfoBAX uses up to 500 times fewer queries to f than required by the original algorithm.
arXiv Detail & Related papers (2021-04-19T17:22:11Z) - Particle Swarm Optimization: Fundamental Study and its Application to
Optimization and to Jetty Scheduling Problems [0.0]
The advantages of evolutionary algorithms with respect to traditional methods have been greatly discussed in the literature.
While particle swarms share such advantages, they outperform evolutionary algorithms in that they require lower computational cost and easier implementation.
This paper does not intend to study their tuning, general-purpose settings are taken from previous studies, and virtually the same algorithm is used to optimize a variety of notably different problems.
arXiv Detail & Related papers (2021-01-25T02:06:30Z) - Divide and Learn: A Divide and Conquer Approach for Predict+Optimize [50.03608569227359]
The predict+optimize problem combines machine learning of problem coefficients with an optimization problem that uses the predicted coefficients.
We show how to directly express the loss of the optimization problem in terms of the predicted coefficients as a piece-wise linear function.
We propose a novel divide-and-conquer algorithm to tackle optimization problems without this restriction and predict its coefficients using the optimization loss.
arXiv Detail & Related papers (2020-12-04T00:26:56Z) - Bilevel Optimization: Convergence Analysis and Enhanced Design [63.64636047748605]
Bilevel optimization is a tool for many machine learning problems.
We propose a novel stochastic gradient estimator named stocBiO.
arXiv Detail & Related papers (2020-10-15T18:09:48Z) - Adaptive pruning-based optimization of parameterized quantum circuits [62.997667081978825]
Variational hybrid quantum-classical algorithms are powerful tools to maximize the use of Noisy Intermediate-Scale Quantum devices.
We propose a strategy for such ansatze used in variational quantum algorithms, which we call "Parameter-Efficient Circuit Training" (PECT).
Instead of optimizing all of the ansatz parameters at once, PECT launches a sequence of variational algorithms.
arXiv Detail & Related papers (2020-10-01T18:14:11Z) - Towards Dynamic Algorithm Selection for Numerical Black-Box
Optimization: Investigating BBOB as a Use Case [4.33419118449588]
We show that even single-switch dynamic algorithm selection (dynAS) can potentially result in significant performance gains.
We also discuss key challenges in dynAS, and argue that the BBOB framework can become a useful tool in overcoming these challenges.
arXiv Detail & Related papers (2020-06-11T16:36:11Z)
This list is automatically generated from the titles and abstracts of the papers in this site.