Prasatul Matrix: A Direct Comparison Approach for Analyzing Evolutionary
Optimization Algorithms
- URL: http://arxiv.org/abs/2212.00671v1
- Date: Thu, 1 Dec 2022 17:21:44 GMT
- Title: Prasatul Matrix: A Direct Comparison Approach for Analyzing Evolutionary
Optimization Algorithms
- Authors: Anupam Biswas
- Abstract summary: A direct comparison approach is proposed to analyze the performance of evolutionary optimization algorithms.
Five different performance measures are designed based on the prasatul matrix to evaluate the performance of algorithms.
- Score: 2.1320960069210475
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: The performance of individual evolutionary optimization algorithms is mostly
measured in terms of statistics such as the mean, median and standard deviation,
computed over the best solutions obtained from a few trials of the
algorithm. To compare the performance of two algorithms, the values of these
statistics are compared instead of comparing the solutions directly. This kind
of comparison lacks a direct comparison of the solutions obtained with different
algorithms. For instance, comparing the best solutions (or worst solutions)
of two algorithms is simply not possible. Moreover, ranking of algorithms is
mostly done in terms of solution quality only, despite the fact that the
convergence of an algorithm is also an important factor. In this paper, a direct
comparison approach is proposed to analyze the performance of evolutionary
optimization algorithms. A direct comparison matrix called the \emph{Prasatul
Matrix} is prepared, which accounts for the direct comparison outcomes of the best
solutions obtained with two algorithms over a specific number of trials. Five different
performance measures are designed based on the prasatul matrix to evaluate the
performance of algorithms in terms of the Optimality and Comparability of
solutions. These scores are utilized to develop a score-driven approach for
comparing the performance of multiple algorithms as well as for ranking them on the
grounds of both solution quality and convergence analysis. The proposed approach is
analyzed with six evolutionary optimization algorithms on 25 benchmark
functions. A non-parametric statistical analysis, namely the Wilcoxon
signed-rank test, is also performed to verify the outcomes of the proposed direct
comparison approach.
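The abstract does not spell out the exact entries of the Prasatul Matrix or the five derived measures, but the core idea of directly comparing best solutions across trials can be sketched as follows. This is a minimal illustration with invented data: the trial values, the +1/0/-1 encoding, and the `optimality_a` score are assumptions for demonstration, not the paper's definitions.

```python
import numpy as np

rng = np.random.default_rng(42)
# Hypothetical best-of-run objective values (minimization) from 30 independent
# trials of each algorithm; the distributions are invented for illustration.
best_a = rng.normal(1.0, 0.3, size=30)
best_b = rng.normal(1.2, 0.3, size=30)

# Direct comparison matrix: entry [i, j] compares trial i of algorithm A
# against trial j of algorithm B (+1 = A better, 0 = tie, -1 = B better).
comparison = np.sign(best_b[None, :] - best_a[:, None])

wins = int((comparison == 1).sum())
ties = int((comparison == 0).sum())
losses = int((comparison == -1).sum())

# One possible optimality-style score: the fraction of all pairwise
# trial comparisons won by algorithm A.
optimality_a = wins / comparison.size
print(wins, ties, losses, round(optimality_a, 3))
```

Unlike comparing summary statistics (means or medians of the two samples), every individual best solution of one algorithm is compared against every best solution of the other, so outcomes such as "A's worst trial still beats B's best trial" become directly visible in the matrix.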
Related papers
- Provably Faster Algorithms for Bilevel Optimization via Without-Replacement Sampling [96.47086913559289]
Gradient-based algorithms are widely used in bilevel optimization.
We introduce a without-replacement sampling based algorithm which achieves a faster convergence rate.
We validate our algorithms over both synthetic and real-world applications.
arXiv Detail & Related papers (2024-11-07T17:05:31Z) - BMR and BWR: Two simple metaphor-free optimization algorithms for solving real-life non-convex constrained and unconstrained problems [0.5755004576310334]
Two simple yet powerful optimization algorithms, named the Best-Mean-Random (BMR) and Best-Worst-Random (BWR) algorithms, are developed and presented in this paper.
arXiv Detail & Related papers (2024-07-15T18:11:47Z) - A Novel Ranking Scheme for the Performance Analysis of Stochastic Optimization Algorithms using the Principles of Severity [9.310464457958844]
We provide a novel ranking scheme to rank the algorithms over multiple single-objective optimization problems.
The results of the algorithms are compared using a robust bootstrapping-based hypothesis testing procedure.
arXiv Detail & Related papers (2024-05-31T19:35:34Z) - Performance Evaluation of Evolutionary Algorithms for Analog Integrated
Circuit Design Optimisation [0.0]
An automated sizing approach for analog circuits is presented in this paper.
A targeted search of the search space has been implemented using a particle generation function and a repair-bounds function.
The algorithms are tuned and modified to converge to a better optimal solution.
arXiv Detail & Related papers (2023-10-19T03:26:36Z) - Accelerating Cutting-Plane Algorithms via Reinforcement Learning
Surrogates [49.84541884653309]
A current standard approach to solving convex discrete optimization problems is the use of cutting-plane algorithms.
Despite the existence of a number of general-purpose cut-generating algorithms, large-scale discrete optimization problems continue to suffer from intractability.
We propose a method for accelerating cutting-plane algorithms via reinforcement learning.
arXiv Detail & Related papers (2023-07-17T20:11:56Z) - Amortized Implicit Differentiation for Stochastic Bilevel Optimization [53.12363770169761]
We study a class of algorithms for solving bilevel optimization problems in both deterministic and stochastic settings.
We exploit a warm-start strategy to amortize the estimation of the exact gradient.
By using this framework, our analysis shows these algorithms to match the computational complexity of methods that have access to an unbiased estimate of the gradient.
arXiv Detail & Related papers (2021-11-29T15:10:09Z) - Provably Faster Algorithms for Bilevel Optimization [54.83583213812667]
Bilevel optimization has been widely applied in many important machine learning applications.
We propose two new algorithms for bilevel optimization.
We show that both algorithms achieve the complexity of $\mathcal{O}(\epsilon^{-1.5})$, which outperforms all existing algorithms by an order of magnitude.
arXiv Detail & Related papers (2021-06-08T21:05:30Z) - On the Assessment of Benchmark Suites for Algorithm Comparison [7.501426386641256]
We show that most benchmark functions of the BBOB suite have high difficulty levels (compared to the optimization algorithms) and low discrimination.
We discuss potential uses of IRT in benchmarking, including its use to improve the design of benchmark suites.
arXiv Detail & Related papers (2021-04-15T11:20:11Z) - Benchmarking Simulation-Based Inference [5.3898004059026325]
Recent advances in probabilistic modelling have led to a large number of simulation-based inference algorithms which do not require numerical evaluation of likelihoods.
We provide a benchmark with inference tasks and suitable performance metrics, with an initial selection of algorithms.
We found that the choice of performance metric is critical, that even state-of-the-art algorithms have substantial room for improvement, and that sequential estimation improves sample efficiency.
arXiv Detail & Related papers (2021-01-12T18:31:22Z) - Bilevel Optimization: Convergence Analysis and Enhanced Design [63.64636047748605]
Bilevel optimization is a tool for many machine learning problems.
We propose a novel sample-efficient hypergradient estimator named stocBiO.
arXiv Detail & Related papers (2020-10-15T18:09:48Z) - Extreme Algorithm Selection With Dyadic Feature Representation [78.13985819417974]
We propose the setting of extreme algorithm selection (XAS) where we consider fixed sets of thousands of candidate algorithms.
We assess the applicability of state-of-the-art AS techniques to the XAS setting and propose approaches leveraging a dyadic feature representation.
arXiv Detail & Related papers (2020-01-29T09:40:58Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the content (including all information) and is not responsible for any consequences.