Two novel features selection algorithms based on crowding distance
- URL: http://arxiv.org/abs/2105.05212v2
- Date: Thu, 13 May 2021 15:19:50 GMT
- Title: Two novel features selection algorithms based on crowding distance
- Authors: Abdesslem Layeb
- Abstract summary: The proposed algorithms use the crowding distance from multiobjective optimization as a metric to sort the features.
The experimental results have shown the effectiveness and the robustness of the proposed algorithms.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: In this paper, two novel algorithms for feature selection are proposed. The
first one is a filter method, while the second is a wrapper method. Both
proposed algorithms use the crowding distance from multiobjective
optimization as a metric to sort the features. The less crowded
features have a greater effect on the target attribute (class). The experimental
results show the effectiveness and robustness of the proposed algorithms.
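The abstract does not specify which criteria the features are compared on, so the sketch below is only a hypothetical filter-style illustration: each feature is described by two assumed criteria (absolute correlation with the class and variance), an NSGA-II-style crowding distance is computed over those points, and features are ranked least-crowded first. The function names and the choice of criteria are assumptions, not the paper's actual design.

```python
import numpy as np

def crowding_distance(points):
    """NSGA-II-style crowding distance for an (n, m) array of points.

    Boundary points on each criterion receive an infinite distance;
    interior points accumulate the normalized gap between their two
    neighbors along each criterion.
    """
    n, m = points.shape
    dist = np.zeros(n)
    for j in range(m):
        order = np.argsort(points[:, j])
        col = points[order, j]
        dist[order[0]] = dist[order[-1]] = np.inf
        span = col[-1] - col[0]
        if span > 0:
            dist[order[1:-1]] += (col[2:] - col[:-2]) / span
    return dist

def rank_features(X, y):
    """Hypothetical filter: describe each feature by two assumed
    criteria (|correlation with y|, variance), then rank features by
    crowding distance, least crowded (largest distance) first."""
    corr = np.array([abs(np.corrcoef(X[:, i], y)[0, 1])
                     for i in range(X.shape[1])])
    var = X.var(axis=0)
    dist = crowding_distance(np.column_stack([corr, var]))
    return np.argsort(-dist)  # feature indices, most isolated first
```

Because boundary features get an infinite distance, they always rank ahead of interior ones, matching the abstract's claim that less crowded features matter most.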
Related papers
- On Constructing Algorithm Portfolios in Algorithm Selection for Computationally Expensive Black-box Optimization in the Fixed-budget Setting [0.0]
This paper argues the importance of considering the number of function evaluations used in the sampling phase when constructing algorithm portfolios.
The results show that algorithm portfolios constructed by our approach perform significantly better than those by the previous approach.
arXiv Detail & Related papers (2024-05-13T03:31:13Z) - Preference Inference from Demonstration in Multi-objective Multi-agent Decision Making [0.0]
We propose an algorithm to infer linear preference weights from either optimal or near-optimal demonstrations.
Empirical results demonstrate significant improvements compared to the baseline algorithms.
In future work, we plan to evaluate the algorithm's effectiveness in a multi-agent system.
arXiv Detail & Related papers (2023-04-27T12:19:28Z) - Improved Algorithms for Neural Active Learning [74.89097665112621]
We improve the theoretical and empirical performance of neural-network(NN)-based active learning algorithms for the non-parametric streaming setting.
By minimizing the population loss, we introduce two regret metrics that are more suitable for active learning than the one used in state-of-the-art (SOTA) related work.
arXiv Detail & Related papers (2022-10-02T05:03:38Z) - Efficient Non-Parametric Optimizer Search for Diverse Tasks [93.64739408827604]
We present the first efficient, scalable, and general framework that can directly search on the tasks of interest.
Inspired by the innate tree structure of the underlying math expressions, we re-arrange the spaces into a super-tree.
We adopt an adaptation of the Monte Carlo method to tree search, equipped with rejection sampling and equivalent-form detection.
arXiv Detail & Related papers (2022-09-27T17:51:31Z) - Per-run Algorithm Selection with Warm-starting using Trajectory-based
Features [5.073358743426584]
Per-instance algorithm selection seeks to recommend, for a given problem instance, one or several suitable algorithms.
We propose an online algorithm selection scheme which we coin per-run algorithm selection.
We show that our approach outperforms static per-instance algorithm selection.
arXiv Detail & Related papers (2022-04-20T14:30:42Z) - Compactness Score: A Fast Filter Method for Unsupervised Feature
Selection [66.84571085643928]
We propose a fast unsupervised feature selection method, named Compactness Score (CSUFS), to select desired features.
Our proposed algorithm seems to be more accurate and efficient compared with existing algorithms.
arXiv Detail & Related papers (2022-01-31T13:01:37Z) - Benchmarking Feature-based Algorithm Selection Systems for Black-box
Numerical Optimization [0.0]
Feature-based algorithm selection aims to automatically find the best algorithm from a portfolio of optimization algorithms for an unseen problem.
This paper analyzes algorithm selection systems on the 24 noiseless black-box optimization benchmarking functions.
We show that the performance of algorithm selection systems can be significantly improved by using sequential least squares programming as a pre-solver.
arXiv Detail & Related papers (2021-09-17T07:17:40Z) - Machine Learning for Online Algorithm Selection under Censored Feedback [71.6879432974126]
In online algorithm selection (OAS), instances of an algorithmic problem class are presented to an agent one after another, and the agent has to quickly select a presumably best algorithm from a fixed set of candidate algorithms.
For decision problems such as satisfiability (SAT), quality typically refers to the algorithm's runtime.
In this work, we revisit multi-armed bandit algorithms for OAS and discuss their capability of dealing with the problem.
We adapt them towards runtime-oriented losses, allowing for partially censored data while keeping a space- and time-complexity independent of the time horizon.
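As a rough illustration of this bandit view of online algorithm selection (not the paper's actual algorithm), the sketch below applies a UCB-style lower-confidence-bound rule over a fixed set of candidate algorithms, where each observed loss is a runtime censored at a cutoff `tau`, and only per-algorithm counts and running means are stored, so memory stays independent of the horizon. All names and the censoring treatment are assumptions.

```python
import math

def select(counts, means, t, c=2.0):
    """Pick the algorithm with the lowest optimistic loss estimate
    (LCB rule: losses are minimized, so the exploration bonus is
    subtracted from the running mean)."""
    for a, n in enumerate(counts):
        if n == 0:                      # try every algorithm once first
            return a
    return min(range(len(counts)),
               key=lambda a: means[a] - math.sqrt(c * math.log(t) / counts[a]))

def run_oas(runtime_of, n_algos, horizon, tau):
    """Sequential selection loop; runtimes are censored at tau, and only
    O(n_algos) statistics are kept, independent of the time horizon."""
    counts = [0] * n_algos
    means = [0.0] * n_algos
    for t in range(1, horizon + 1):
        a = select(counts, means, t)
        loss = min(runtime_of(a, t), tau)   # partially censored observation
        counts[a] += 1
        means[a] += (loss - means[a]) / counts[a]   # running mean update
    return counts
```

With a fast and a slow candidate, the loop quickly concentrates its selections on the faster one while still probing the slower one occasionally.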
arXiv Detail & Related papers (2021-09-13T18:10:52Z) - Lower Bounds and Optimal Algorithms for Smooth and Strongly Convex Decentralized Optimization Over Time-Varying Networks [79.16773494166644]
We consider the task of minimizing the sum of smooth and strongly convex functions stored in a decentralized manner across the nodes of a communication network.
We design two optimal algorithms that attain these lower bounds.
We corroborate the theoretical efficiency of these algorithms by performing an experimental comparison with existing state-of-the-art methods.
arXiv Detail & Related papers (2021-06-08T15:54:44Z) - E2E-FS: An End-to-End Feature Selection Method for Neural Networks [0.3222802562733786]
We present a novel selection algorithm, called End-to-End Feature Selection (E2E-FS).
Our algorithm, similar to the lasso approach, is solved with gradient descent techniques.
Despite its hard restrictions, experimental results show that this algorithm can be used with any learning model.
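E2E-FS itself is not reproduced here; as a generic sketch of the lasso-style, gradient-based selection idea the summary alludes to, the following runs proximal gradient descent (ISTA) on an L1-penalized least-squares objective, where the soft-thresholding step drives weights of irrelevant features toward zero. Parameter values and names are illustrative.

```python
import numpy as np

def lasso_select(X, y, lam=0.1, lr=0.1, steps=500):
    """Proximal gradient descent (ISTA) on
    (1/2n)||Xw - y||^2 + lam * ||w||_1.

    The soft-threshold proximal step suppresses weights of weakly
    relevant features, so the surviving support acts as the selection.
    """
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(steps):
        grad = X.T @ (X @ w - y) / n            # gradient of smooth part
        w = w - lr * grad                       # gradient step
        w = np.sign(w) * np.maximum(np.abs(w) - lr * lam, 0.0)  # L1 prox
    return w
```

On data where the target depends on a single feature, the returned weight vector keeps that feature large and shrinks the rest toward zero.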
arXiv Detail & Related papers (2020-12-14T16:19:25Z) - Extreme Algorithm Selection With Dyadic Feature Representation [78.13985819417974]
We propose the setting of extreme algorithm selection (XAS) where we consider fixed sets of thousands of candidate algorithms.
We assess the applicability of state-of-the-art AS techniques to the XAS setting and propose approaches leveraging a dyadic feature representation.
arXiv Detail & Related papers (2020-01-29T09:40:58Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information it contains and is not responsible for any consequences of its use.