Speeding-up Evolutionary Algorithms to solve Black-Box Optimization Problems
- URL: http://arxiv.org/abs/2309.13349v2
- Date: Mon, 29 Jan 2024 08:02:18 GMT
- Title: Speeding-up Evolutionary Algorithms to solve Black-Box Optimization Problems
- Authors: Judith Echevarrieta, Etor Arza and Aritz Pérez
- Abstract summary: Population-based evolutionary algorithms are often considered when approaching computationally expensive black-box optimization problems.
We propose a technique capable of choosing an appropriate approximate function cost during the execution of the optimization algorithm.
The proposal finds the minimum evaluation cost at which the solutions are still properly ranked, and consequently, more evaluations can be computed in the same amount of time with minimal accuracy loss.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Population-based evolutionary algorithms are often considered when
approaching computationally expensive black-box optimization problems. They
employ a selection mechanism to choose the best solutions from a given
population after comparing their objective values, which are then used to
generate the next population. This iterative process explores the solution
space efficiently, leading to improved solutions over time. However, these
algorithms require a large number of evaluations to provide a quality solution,
which might be computationally expensive when the evaluation cost is high. In
some cases, it is possible to replace the original objective function with a
less accurate approximation of lower cost. This introduces a trade-off between
the evaluation cost and its accuracy.
In this paper, we propose a technique capable of choosing an appropriate
approximate function cost during the execution of the optimization algorithm.
The proposal finds the minimum evaluation cost at which the solutions are still
properly ranked, and consequently, more evaluations can be computed in the same
amount of time with minimal accuracy loss. An experimental section on four very
different problems reveals that the proposed approach can reach the same
objective value in less than half of the time in certain cases.
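The core idea of the abstract, ranking a sample population with progressively cheaper approximations and keeping the cheapest cost whose ranking still agrees with the most accurate one, can be sketched as below. This is an illustrative reconstruction, not the paper's actual procedure: the names `minimum_reliable_cost` and `evaluate`, the cost grid, and the use of a Kendall-tau agreement threshold are all assumptions made for the example.

```python
def kendall_tau(a, b):
    """Normalized Kendall rank correlation between two score lists
    (1.0 = identical ranking, -1.0 = fully reversed)."""
    n = len(a)
    concordant = discordant = 0
    for i in range(n):
        for j in range(i + 1, n):
            s = (a[i] - a[j]) * (b[i] - b[j])
            if s > 0:
                concordant += 1
            elif s < 0:
                discordant += 1
    return (concordant - discordant) / (n * (n - 1) / 2)

def minimum_reliable_cost(population, evaluate, costs, threshold=0.9):
    """Return the cheapest evaluation cost whose ranking of `population`
    agrees (Kendall tau >= threshold) with the ranking obtained at the
    highest available cost.  `costs` must be sorted ascending;
    `evaluate(x, c)` is a hypothetical objective whose accuracy grows
    with the cost parameter c."""
    reference = [evaluate(x, costs[-1]) for x in population]
    for c in costs:
        approx = [evaluate(x, c) for x in population]
        if kendall_tau(approx, reference) >= threshold:
            return c
    return costs[-1]
```

In an evolutionary loop, such a check could be rerun periodically on the current population so that the evaluation cost adapts as the search progresses, which is the trade-off the abstract describes.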
Related papers
- On Constructing Algorithm Portfolios in Algorithm Selection for Computationally Expensive Black-box Optimization in the Fixed-budget Setting [0.0]
This paper argues the importance of considering the number of function evaluations used in the sampling phase when constructing algorithm portfolios.
The results show that algorithm portfolios constructed by our approach perform significantly better than those by the previous approach.
arXiv Detail & Related papers (2024-05-13T03:31:13Z) - Equitable and Fair Performance Evaluation of Whale Optimization Algorithm [4.0814527055582746]
It is essential that all algorithms are exhaustively and intelligently evaluated.
However, evaluating the effectiveness of optimization algorithms equitably and fairly is not an easy process, for various reasons.
arXiv Detail & Related papers (2023-09-04T06:32:02Z) - BiAdam: Fast Adaptive Bilevel Optimization Methods [104.96004056928474]
Bilevel optimization has attracted increased interest in machine learning due to its many applications.
We provide a useful analysis framework for both the constrained and unconstrained optimization.
arXiv Detail & Related papers (2021-06-21T20:16:40Z) - EXPObench: Benchmarking Surrogate-based Optimisation Algorithms on Expensive Black-box Functions [4.8980686156238535]
We provide an extensive comparison of six different surrogate algorithms on four expensive optimisation problems from different real-life applications.
This has led to new insights regarding the relative importance of exploration, the evaluation time of the objective, and the used model.
We make the algorithms and benchmark problem instances publicly available, contributing to more uniform analysis of surrogate algorithms.
arXiv Detail & Related papers (2021-06-08T18:17:42Z) - Lower Bounds and Optimal Algorithms for Smooth and Strongly Convex Decentralized Optimization Over Time-Varying Networks [79.16773494166644]
We consider the task of minimizing the sum of smooth and strongly convex functions stored in a decentralized manner across the nodes of a communication network.
We design two optimal algorithms that attain these lower bounds.
We corroborate the theoretical efficiency of these algorithms by performing an experimental comparison with existing state-of-the-art methods.
arXiv Detail & Related papers (2021-06-08T15:54:44Z) - Bayesian Algorithm Execution: Estimating Computable Properties of Black-box Functions Using Mutual Information [78.78486761923855]
In many real world problems, we want to infer some property of an expensive black-box function f, given a budget of T function evaluations.
We present a procedure, InfoBAX, that sequentially chooses queries that maximize mutual information with respect to the algorithm's output.
On these problems, InfoBAX uses up to 500 times fewer queries to f than required by the original algorithm.
arXiv Detail & Related papers (2021-04-19T17:22:11Z) - Towards Feature-Based Performance Regression Using Trajectory Data [0.9281671380673306]
Black-box optimization is a very active area of research, with many new algorithms being developed every year.
The variety of algorithms poses a meta-problem: which algorithm to choose for a given problem at hand?
Past research has shown that per-instance algorithm selection based on exploratory landscape analysis can be an efficient means to tackle this meta-problem.
arXiv Detail & Related papers (2021-02-10T10:19:13Z) - Divide and Learn: A Divide and Conquer Approach for Predict+Optimize [50.03608569227359]
The predict+optimize problem combines machine learning of problem coefficients with an optimization problem that uses the predicted coefficients.
We show how to directly express the loss of the optimization problem in terms of the predicted coefficients as a piece-wise linear function.
We propose a novel divide-and-conquer algorithm to tackle optimization problems without this restriction and predict their coefficients using the optimization loss.
arXiv Detail & Related papers (2020-12-04T00:26:56Z) - Contrastive Losses and Solution Caching for Predict-and-Optimize [19.31153168397003]
We use a Noise Contrastive approach to motivate a family of surrogate loss functions.
We address a major bottleneck of all predict-and-optimize approaches.
We show that even a very slow growth rate is enough to match the quality of state-of-the-art methods.
arXiv Detail & Related papers (2020-11-10T19:09:12Z) - Private Stochastic Convex Optimization: Optimal Rates in Linear Time [74.47681868973598]
We study the problem of minimizing the population loss given i.i.d. samples from a distribution over convex loss functions.
A recent work of Bassily et al. has established the optimal bound on the excess population loss achievable given $n$ samples.
We describe two new techniques for deriving convex optimization algorithms both achieving the optimal bound on excess loss and using $O(\min\{n, n^2/d\})$ gradient computations.
arXiv Detail & Related papers (2020-05-10T19:52:03Z) - Time-varying Gaussian Process Bandit Optimization with Non-constant Evaluation Time [93.6788993843846]
We propose a novel time-varying Bayesian optimization algorithm that can effectively handle the non-constant evaluation time.
Our bound elucidates that a pattern of the evaluation time sequence can hugely affect the difficulty of the problem.
arXiv Detail & Related papers (2020-03-10T13:28:33Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of this list (including all information) and is not responsible for any consequences.