Assessing Ranking and Effectiveness of Evolutionary Algorithm
Hyperparameters Using Global Sensitivity Analysis Methodologies
- URL: http://arxiv.org/abs/2207.04820v1
- Date: Mon, 11 Jul 2022 12:39:39 GMT
- Title: Assessing Ranking and Effectiveness of Evolutionary Algorithm
Hyperparameters Using Global Sensitivity Analysis Methodologies
- Authors: Varun Ojha and Jon Timmis and Giuseppe Nicosia
- Abstract summary: We present a comprehensive global sensitivity analysis of two single-objective and two multi-objective state-of-the-art global optimization evolutionary algorithms as an algorithm configuration problem.
- Score: 1.8397598629548875
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: We present a comprehensive global sensitivity analysis of two
single-objective and two multi-objective state-of-the-art global optimization
evolutionary algorithms as an algorithm configuration problem. That is, we
investigate the influence hyperparameters have on algorithm performance in
terms of their direct effects and their interaction effects with other
hyperparameters. Using three sensitivity analysis methods, Morris LHS, Morris,
and Sobol, to systematically analyze the tunable hyperparameters of covariance
matrix adaptation evolution strategy, differential evolution, non-dominated
sorting genetic algorithm III, and multi-objective evolutionary algorithm based
on decomposition, the framework reveals how hyperparameters behave under
different sampling methods and performance metrics. That is, it answers
questions such as which hyperparameters are influential, how they interact,
how strongly they interact, and how strong their direct influence is.
Consequently, the ranking of hyperparameters suggests the order in which they
should be tuned, and the patterns of influence reveal the stability of the
algorithms.
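As a rough, hedged illustration of this kind of analysis (a sketch, not the authors' code), the following uses SALib's Saltelli sampling and Sobol analysis to rank the hyperparameters of SciPy's differential evolution on the Rosenbrock function. The hyperparameter names, bounds, and evaluation budgets are illustrative assumptions.

```python
# Minimal sketch, assuming SALib and SciPy: Sobol sensitivity analysis of
# differential evolution hyperparameters (names/bounds are illustrative).
import numpy as np
from SALib.sample import saltelli
from SALib.analyze import sobol
from scipy.optimize import differential_evolution, rosen

problem = {
    "num_vars": 3,
    "names": ["mutation", "recombination", "popsize"],
    "bounds": [[0.1, 1.9], [0.0, 1.0], [5, 50]],
}

# Saltelli sampling; N should be a power of 2 (small budget for illustration).
X = saltelli.sample(problem, 16)

def performance(h):
    """Run DE once with hyperparameters h; return the best objective found."""
    mutation, recombination, popsize = h
    result = differential_evolution(
        rosen, bounds=[(-5, 5)] * 5,
        mutation=mutation, recombination=recombination,
        popsize=int(popsize), maxiter=50, seed=0, polish=False,
    )
    return result.fun

Y = np.array([performance(h) for h in X])

# S1 = direct (first-order) effect, ST = total effect incl. interactions.
Si = sobol.analyze(problem, Y)
for name, s1, st in zip(problem["names"], Si["S1"], Si["ST"]):
    print(f"{name}: direct={s1:.3f} total={st:.3f}")
```

Ranking hyperparameters by ST, and comparing ST against S1, mirrors the paper's distinction between direct influence and interaction effects.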
Related papers
- A Comparative Study of Hyperparameter Tuning Methods [0.0]
Tree-structured Parzen Estimator (TPE), Genetic Search, and Random Search are evaluated across regression and classification tasks.
Random Search excelled in regression tasks, while TPE was more effective for classification tasks.
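A minimal sketch of such a comparison, assuming Optuna (not the paper's experimental code); the model, search space, and trial budget are illustrative.

```python
# Hedged sketch: compare TPE and random search on a toy regression task.
import optuna
from sklearn.datasets import make_regression
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import cross_val_score

X, y = make_regression(n_samples=200, n_features=10, random_state=0)

def objective(trial):
    # Illustrative two-parameter random-forest search space.
    model = RandomForestRegressor(
        n_estimators=trial.suggest_int("n_estimators", 10, 200),
        max_depth=trial.suggest_int("max_depth", 2, 16),
        random_state=0,
    )
    return cross_val_score(model, X, y, cv=3).mean()  # mean R^2, maximized

for sampler in (optuna.samplers.TPESampler(seed=0),
                optuna.samplers.RandomSampler(seed=0)):
    study = optuna.create_study(direction="maximize", sampler=sampler)
    study.optimize(objective, n_trials=30)
    print(type(sampler).__name__, round(study.best_value, 4))
```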
arXiv Detail & Related papers (2024-08-29T10:35:07Z)
- Robustness of Algorithms for Causal Structure Learning to Hyperparameter Choice [2.3020018305241337]
Hyperparameter tuning can make the difference between state-of-the-art and poor prediction performance for any algorithm.
We investigate the influence of hyperparameter selection on causal structure learning tasks.
arXiv Detail & Related papers (2023-10-27T15:34:08Z)
- Hyperparameter Adaptive Search for Surrogate Optimization: A Self-Adjusting Approach [1.6317061277457001]
Surrogate optimization (SO) algorithms have shown promise for optimizing expensive black-box functions.
Our approach identifies and modifies the most influential hyperparameters specific to each problem and SO approach.
Experimental results demonstrate the effectiveness of HASSO in enhancing the performance of various SO algorithms.
arXiv Detail & Related papers (2023-10-12T01:26:05Z)
- Multi-objective hyperparameter optimization with performance uncertainty [62.997667081978825]
This paper presents results on multi-objective hyperparameter optimization with uncertainty on the evaluation of Machine Learning algorithms.
We combine the sampling strategy of Tree-structured Parzen Estimators (TPE) with the metamodel obtained after training a Gaussian Process Regression (GPR) with heterogeneous noise.
Experimental results on three analytical test functions and three ML problems show the improvement over multi-objective TPE and GPR.
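A minimal sketch of the heterogeneous-noise ingredient, assuming scikit-learn rather than the authors' implementation: GaussianProcessRegressor accepts per-observation noise variances through its alpha parameter.

```python
# Hedged sketch: GPR metamodel with heterogeneous (per-point) noise.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

rng = np.random.default_rng(0)
X = rng.uniform(0, 10, size=(40, 1))
noise_var = 0.01 + 0.05 * X[:, 0]        # noise variance grows with x
y = np.sin(X[:, 0]) + rng.normal(0.0, np.sqrt(noise_var))

gpr = GaussianProcessRegressor(
    kernel=ConstantKernel() * RBF(),
    alpha=noise_var,                     # per-observation noise variance
    normalize_y=True,
)
gpr.fit(X, y)
mean, std = gpr.predict([[5.0]], return_std=True)
print(float(mean[0]), float(std[0]))     # predictive mean and uncertainty
```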
arXiv Detail & Related papers (2022-09-09T14:58:43Z)
- A survey on multi-objective hyperparameter optimization algorithms for Machine Learning [62.997667081978825]
This article presents a systematic survey of the literature published between 2014 and 2020 on multi-objective HPO algorithms.
We distinguish between metaheuristic-based algorithms, metamodel-based algorithms, and approaches using a mixture of both.
We also discuss the quality metrics used to compare multi-objective HPO procedures and present future research directions.
arXiv Detail & Related papers (2021-11-23T10:22:30Z)
- Fractal Structure and Generalization Properties of Stochastic Optimization Algorithms [71.62575565990502]
We prove that the generalization error of an optimization algorithm can be bounded based on the 'complexity' of the fractal structure that underlies its generalization measure.
We further specialize our results to specific problems (e.g., linear/logistic regression, one-hidden-layer neural networks) and algorithms.
arXiv Detail & Related papers (2021-06-09T08:05:36Z)
- Optimizing Large-Scale Hyperparameters via Automated Learning Algorithm [97.66038345864095]
We propose a new hyperparameter optimization method with zeroth-order hyper-gradients (HOZOG).
Specifically, we first formulate hyperparameter optimization as an A-based constrained optimization problem, where A denotes an automated learning algorithm.
Then, we use the average zeroth-order hyper-gradients to update hyperparameters.
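A minimal sketch of the generic zeroth-order idea (a toy stand-in, not HOZOG itself): estimate the hyper-gradient by averaging finite-difference probes along random directions, then take a gradient step. The validation loss below is an illustrative assumption.

```python
# Hedged sketch: averaged zeroth-order (finite-difference) hyper-gradient.
import numpy as np

rng = np.random.default_rng(0)

def validation_loss(lmbda):
    # Toy stand-in for "train with hyperparameter lmbda, return val. loss".
    return (lmbda - 0.3) ** 2

def zeroth_order_grad(f, x, mu=1e-2, n_samples=8):
    # Average finite-difference estimates along random directions u.
    probes = [(f(x + mu * u) - f(x)) / mu * u
              for u in rng.standard_normal(n_samples)]
    return np.mean(probes)

lmbda, lr = 1.0, 0.5
for _ in range(20):
    lmbda -= lr * zeroth_order_grad(validation_loss, lmbda)
print(lmbda)  # should approach the toy optimum at 0.3
```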
arXiv Detail & Related papers (2021-02-17T21:03:05Z)
- VisEvol: Visual Analytics to Support Hyperparameter Search through Evolutionary Optimization [4.237343083490243]
During the training phase of machine learning (ML) models, it is usually necessary to configure several hyperparameters.
We present VisEvol, a visual analytics tool that supports interactive exploration of hyperparameters and intervention in this evolutionary procedure.
The utility and applicability of VisEvol are demonstrated with two use cases and interviews with ML experts who evaluated the effectiveness of the tool.
arXiv Detail & Related papers (2020-12-02T13:43:37Z)
- An Asymptotically Optimal Multi-Armed Bandit Algorithm and Hyperparameter Optimization [48.5614138038673]
We propose an efficient and robust bandit-based algorithm called Sub-Sampling (SS) in the scenario of hyperparameter search evaluation.
We also develop a novel hyperparameter optimization algorithm called BOSS.
Empirical studies validate our theoretical arguments of SS and demonstrate the superior performance of BOSS on a number of applications.
arXiv Detail & Related papers (2020-07-11T03:15:21Z)
- EOS: a Parallel, Self-Adaptive, Multi-Population Evolutionary Algorithm for Constrained Global Optimization [68.8204255655161]
EOS is a global optimization algorithm for constrained and unconstrained problems of real-valued variables.
It implements a number of improvements to the well-known Differential Evolution (DE) algorithm.
Results prove that EOS is capable of achieving increased performance compared to state-of-the-art single-population self-adaptive DE algorithms.
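For context, a minimal sketch of the classic DE/rand/1/bin generation step that such self-adaptive variants build on (baseline DE, not EOS itself):

```python
# Hedged sketch: one generation of baseline DE/rand/1/bin.
import numpy as np

def de_step(pop, fit, f, rng, F=0.8, CR=0.9):
    n, d = pop.shape
    for i in range(n):
        others = [j for j in range(n) if j != i]
        a, b, c = pop[rng.choice(others, 3, replace=False)]
        mutant = a + F * (b - c)                 # differential mutation
        cross = rng.random(d) < CR
        cross[rng.integers(d)] = True            # ensure one gene crosses over
        trial = np.where(cross, mutant, pop[i])  # binomial crossover
        f_trial = f(trial)
        if f_trial < fit[i]:                     # greedy one-to-one selection
            pop[i], fit[i] = trial, f_trial
    return pop, fit

# Usage on a 2-D sphere function; the best fitness approaches 0.
rng = np.random.default_rng(0)
pop = rng.uniform(-5, 5, size=(20, 2))
sphere = lambda x: float(np.sum(x ** 2))
fit = np.array([sphere(x) for x in pop])
for _ in range(100):
    pop, fit = de_step(pop, fit, sphere, rng)
print(fit.min())
```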
arXiv Detail & Related papers (2020-07-09T10:19:22Z)