Exploratory Landscape Analysis for Mixed-Variable Problems
- URL: http://arxiv.org/abs/2402.16467v1
- Date: Mon, 26 Feb 2024 10:19:23 GMT
- Title: Exploratory Landscape Analysis for Mixed-Variable Problems
- Authors: Raphael Patrick Prager and Heike Trautmann
- Abstract summary: We provide the means to compute exploratory landscape features for mixed-variable problems where the decision space is a mixture of continuous, binary, integer, and categorical variables.
To further highlight their merit for practical applications, we design and conduct an automated algorithm selection study.
Our trained algorithm selector is able to close the gap between the single best and the virtual best solver by 57.5% over all benchmark problems.
- Score: 0.7252027234425334
- License: http://creativecommons.org/licenses/by-sa/4.0/
- Abstract: Exploratory landscape analysis and fitness landscape analysis in general have
been pivotal in facilitating problem understanding, algorithm design and
endeavors such as automated algorithm selection and configuration. These
techniques have largely been limited to search spaces of a single domain. In
this work, we provide the means to compute exploratory landscape features for
mixed-variable problems where the decision space is a mixture of continuous,
binary, integer, and categorical variables. This is achieved by utilizing
existing encoding techniques originating from machine learning. We provide a
comprehensive juxtaposition of the results based on these different techniques.
To further highlight their merit for practical applications, we design and
conduct an automated algorithm selection study based on a hyperparameter
optimization benchmark suite. We derive a meaningful compartmentalization of
these benchmark problems by clustering based on the used landscape features.
The identified clusters mimic the behavior exhibited by the considered
algorithms; that is, different clusters have different best-performing algorithms.
Finally, our trained algorithm selector is able to close the gap between the
single best and the virtual best solver by 57.5% over all benchmark problems.
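The abstract describes two concrete, reproducible ideas: (1) mapping a mixed decision space to a purely numeric one via encodings borrowed from machine learning so that standard ELA features can be computed, and (2) reporting selector quality as the fraction of the gap between the single best solver (SBS) and the virtual best solver (VBS) that is closed. The sketch below is not the authors' implementation; it uses a simple one-hot encoding (the paper compares several encodings), a classic dispersion-style feature as a stand-in for the full ELA feature set, and the standard gap-closed formula. All function names and the toy objective are illustrative assumptions.

```python
import numpy as np

def one_hot_encode(x_cat, levels):
    """One-hot encode a column of categorical labels given its level list."""
    return np.array([[1.0 if v == lvl else 0.0 for lvl in levels] for v in x_cat])

def encode_mixed(x_cont, x_int, x_cat, levels):
    """Map mixed samples (continuous, integer, categorical) to a purely
    numeric matrix: continuous/integer columns pass through, categorical
    columns are one-hot encoded. (Illustrative; the paper also studies
    other encodings, e.g. target encoding.)"""
    return np.hstack([x_cont, x_int.astype(float), one_hot_encode(x_cat, levels)])

def dispersion_feature(x_num, y, q=0.1):
    """Dispersion-style ELA feature: mean pairwise distance among the best
    q-fraction of points divided by the mean pairwise distance overall."""
    def mean_pdist(a):
        d = np.linalg.norm(a[:, None, :] - a[None, :, :], axis=-1)
        return d[np.triu_indices(len(a), 1)].mean()
    k = max(2, int(np.ceil(q * len(y))))
    best = x_num[np.argsort(y)[:k]]
    return mean_pdist(best) / mean_pdist(x_num)

def gap_closed(sbs, vbs, selector):
    """Fraction of the SBS-to-VBS gap closed by an algorithm selector
    (lower performance values = better); 0.575 corresponds to 57.5%."""
    return (sbs - selector) / (sbs - vbs)

# Toy mixed-variable sample: 2 continuous, 1 integer, 1 categorical variable.
rng = np.random.default_rng(0)
n = 100
x_cont = rng.uniform(-5, 5, (n, 2))
x_int = rng.integers(0, 10, (n, 1))
x_cat = rng.choice(["a", "b", "c"], n)
y = (x_cont ** 2).sum(axis=1) + x_int[:, 0] + (x_cat == "c")  # toy objective

x_num = encode_mixed(x_cont, x_int, x_cat, levels=["a", "b", "c"])
print(x_num.shape)                      # 2 + 1 + 3 one-hot columns
print(dispersion_feature(x_num, y))    # a single landscape feature value
```

Once the space is encoded, any feature set defined for continuous domains (e.g. the `flacco`/`pflacco` feature groups) can in principle be applied to `x_num`; which encoding distorts the landscape least is exactly the comparison the paper carries out.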
Related papers
- Hybridizing Target- and SHAP-encoded Features for Algorithm Selection in Mixed-variable Black-box Optimization [0.9049664874474738]
ELA is used for problem comprehension, algorithm design, and applications such as automated algorithm selection and configuration.
Until recently, ELA was limited to search spaces with either continuous or discrete variables, neglecting problems with mixed variable types.
This gap was addressed in a recent study that uses an approach based on target-encoding to compute exploratory landscape features for mixed problems.
arXiv Detail & Related papers (2024-07-10T07:47:31Z)
- A Survey of Meta-features Used for Automated Selection of Algorithms for Black-box Single-objective Continuous Optimization [4.173197621837912]
We conduct an overview of the key contributions to algorithm selection in the field of single-objective continuous black-box optimization.
We study machine learning models for automated algorithm selection, configuration, and performance prediction.
arXiv Detail & Related papers (2024-06-08T11:11:14Z)
- A machine learning framework for neighbor generation in metaheuristic search [4.521119623956821]
We propose a general machine learning framework for neighbor generation in metaheuristic search.
We validate our methodology on two metaheuristic applications.
arXiv Detail & Related papers (2022-12-22T01:58:04Z)
- Amortized Implicit Differentiation for Stochastic Bilevel Optimization [53.12363770169761]
We study a class of algorithms for solving bilevel optimization problems in both deterministic and stochastic settings.
We exploit a warm-start strategy to amortize the estimation of the exact gradient.
By using this framework, our analysis shows these algorithms to match the computational complexity of methods that have access to an unbiased estimate of the gradient.
arXiv Detail & Related papers (2021-11-29T15:10:09Z)
- Applying Semi-Automated Hyperparameter Tuning for Clustering Algorithms [0.0]
This study proposes a framework for semi-automated hyperparameter tuning of clustering problems.
It uses a grid search to develop a series of graphs and easy to interpret metrics that can then be used for more efficient domain-specific evaluation.
Preliminary results show that internal metrics are unable to capture the semantic quality of the clusters developed.
arXiv Detail & Related papers (2021-08-25T05:48:06Z)
- Algorithm Selection on a Meta Level [58.720142291102135]
We introduce the problem of meta algorithm selection, which essentially asks for the best way to combine a given set of algorithm selectors.
We present a general methodological framework for meta algorithm selection as well as several concrete learning methods as instantiations of this framework.
arXiv Detail & Related papers (2021-07-20T11:23:21Z)
- A black-box adversarial attack for poisoning clustering [78.19784577498031]
We propose a black-box adversarial attack for crafting adversarial samples to test the robustness of clustering algorithms.
We show that our attacks are transferable even against supervised algorithms such as SVMs, random forests, and neural networks.
arXiv Detail & Related papers (2020-09-09T18:19:31Z)
- Run2Survive: A Decision-theoretic Approach to Algorithm Selection based on Survival Analysis [75.64261155172856]
Survival analysis (SA) naturally supports censored data and offers appropriate ways to use such data for learning distributional models of algorithm runtime.
We leverage such models as a basis of a sophisticated decision-theoretic approach to algorithm selection, which we dub Run2Survive.
In an extensive experimental study with the standard benchmark ASlib, our approach is shown to be highly competitive and in many cases even superior to state-of-the-art AS approaches.
arXiv Detail & Related papers (2020-07-06T15:20:17Z)
- Towards Dynamic Algorithm Selection for Numerical Black-Box Optimization: Investigating BBOB as a Use Case [4.33419118449588]
We show that even single-switch dynamic algorithm selection (dynAS) can potentially result in significant performance gains.
We also discuss key challenges in dynAS, and argue that the BBOB-framework can become a useful tool in overcoming these.
arXiv Detail & Related papers (2020-06-11T16:36:11Z)
- Extreme Algorithm Selection With Dyadic Feature Representation [78.13985819417974]
We propose the setting of extreme algorithm selection (XAS) where we consider fixed sets of thousands of candidate algorithms.
We assess the applicability of state-of-the-art AS techniques to the XAS setting and propose approaches leveraging a dyadic feature representation.
arXiv Detail & Related papers (2020-01-29T09:40:58Z)
- Optimal Clustering from Noisy Binary Feedback [75.17453757892152]
We study the problem of clustering a set of items from binary user feedback.
We devise an algorithm with a minimal cluster recovery error rate.
For adaptive selection, we develop an algorithm inspired by the derivation of the information-theoretical error lower bounds.
arXiv Detail & Related papers (2019-10-14T09:18:26Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.