A metaheuristic multi-objective interaction-aware feature selection
method
- URL: http://arxiv.org/abs/2211.05423v1
- Date: Thu, 10 Nov 2022 08:56:48 GMT
- Title: A metaheuristic multi-objective interaction-aware feature selection
method
- Authors: Motahare Namakin, Modjtaba Rouhani, Mostafa Sabzekar
- Abstract summary: We present a novel multi-objective feature selection method that has several advantages.
It considers the interaction between features using an advanced probability scheme.
The proposed method utilizes the introduced probability scheme to produce more promising offspring.
- Score: 5.28539620288341
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Multi-objective feature selection is one of the most significant issues in
the field of pattern recognition. It is challenging because it seeks to
maximize classification performance while simultaneously minimizing the number
of selected features, and these two objectives usually conflict. To
achieve a better Pareto optimal solution, metaheuristic optimization methods
are widely used in many studies. However, the main drawback is the exploration
of a large search space. Another problem with multi-objective feature selection
approaches is the interaction between features: selecting correlated features
has a negative effect on classification performance. To tackle these problems, we
present a novel multi-objective feature selection method that has several
advantages. Firstly, it considers the interaction between features using an
advanced probability scheme. Secondly, it is based on the Pareto Archived
Evolution Strategy (PAES) method that has several advantages such as simplicity
and its speed in exploring the solution space. However, we improve the
structure of PAES so that it generates offspring intelligently: the proposed
method utilizes the introduced probability scheme to produce more promising
offspring. Finally, it is equipped with a novel strategy that
guides it to find the optimum number of features through the process of
evolution. The experimental results show a significant improvement in finding
the optimal Pareto front compared to state-of-the-art methods on different
real-world datasets.
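To make the overall loop concrete, below is a minimal Python sketch of a PAES-style (1+1) evolution strategy for bi-objective feature selection in which offspring generation is biased away from features strongly correlated with the current subset. The dataset, classifier, correlation-based penalty, and plain dominance-based archive (without PAES's adaptive grid) are illustrative assumptions, not the authors' exact probability scheme.

```python
# Minimal sketch (not the authors' implementation): a PAES-style (1+1)
# evolution strategy for bi-objective feature selection. Mutation is
# "interaction-aware" only in the sense that a candidate feature is less
# likely to be added when it is highly correlated with the current subset;
# the archive is a plain Pareto archive without PAES's adaptive grid.
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(0)
X, y = load_breast_cancer(return_X_y=True)   # stand-in dataset
n_features = X.shape[1]

def objectives(mask):
    """Return (cross-validated error, number of selected features); both minimized."""
    if mask.sum() == 0:
        return 1.0, 0
    err = 1.0 - cross_val_score(KNeighborsClassifier(), X[:, mask], y, cv=3).mean()
    return err, int(mask.sum())

def dominates(a, b):
    """Pareto dominance for minimization."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

# Pairwise feature correlations: a simple stand-in for the paper's probability scheme.
corr = np.abs(np.corrcoef(X, rowvar=False))

def mutate(mask):
    child = mask.copy()
    i = rng.integers(n_features)
    if child[i]:
        child[i] = False                       # drop a selected feature
    else:
        penalty = corr[i, child].max() if child.any() else 0.0
        if rng.random() > penalty:             # add only if weakly correlated with subset
            child[i] = True
    return child

parent = rng.random(n_features) < 0.5          # random initial subset
parent_f = objectives(parent)
archive = [(parent, parent_f)]

for _ in range(200):                           # (1+1) generational loop
    child = mutate(parent)
    child_f = objectives(child)
    if dominates(parent_f, child_f):
        continue                               # child dominated by its parent: discard
    # keep non-dominated children and update the Pareto archive
    archive = [(m, f) for m, f in archive if not dominates(child_f, f)]
    if not any(dominates(f, child_f) for _, f in archive):
        archive.append((child, child_f))
    parent, parent_f = child, child_f

for mask, (err, k) in sorted(archive, key=lambda t: t[1][1]):
    print(f"{k:2d} features -> CV error {err:.3f}")
```

Each archive entry trades off cross-validated error against subset size, so the printed list approximates a Pareto front; in the paper, the advanced probability scheme and the strategy for steering the number of features replace the naive mutation used here.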
Related papers
- Large-scale Multi-objective Feature Selection: A Multi-phase Search Space Shrinking Approach [0.27624021966289597]
Feature selection is a crucial step in machine learning, especially for high-dimensional datasets.
This paper proposes a novel large-scale multi-objective evolutionary algorithm based on search space shrinking, termed LMSSS.
The effectiveness of the proposed algorithm is demonstrated through comprehensive experiments on 15 large-scale datasets.
arXiv Detail & Related papers (2024-10-13T23:06:10Z) - UCB-driven Utility Function Search for Multi-objective Reinforcement Learning [75.11267478778295]
In Multi-objective Reinforcement Learning (MORL), agents are tasked with optimising decision-making behaviours.
We focus on the case of linear utility functions parameterised by weight vectors w.
We introduce a method based on Upper Confidence Bound to efficiently search for the most promising weight vectors during different stages of the learning process.
arXiv Detail & Related papers (2024-05-01T09:34:42Z) - Feature Selection Based on Orthogonal Constraints and Polygon Area [10.587608254638667]
The goal of feature selection is to choose the optimal subset of features for a recognition task by evaluating the importance of each feature.
This paper introduces a non-monotone linear search method that enhances the dependency between features and labels.
Experimental results demonstrate that our approach not only effectively captures discriminative dependencies but also surpasses traditional methods in both dimensionality reduction and classification performance.
arXiv Detail & Related papers (2024-02-25T08:20:05Z) - Multi-objective Binary Coordinate Search for Feature Selection [0.24578723416255746]
We propose the binary multi-objective coordinate search (MOCS) algorithm to solve large-scale feature selection problems.
Results indicate the significant superiority of our method over NSGA-II on five real-world large-scale datasets.
arXiv Detail & Related papers (2024-02-20T00:50:26Z) - Causal Feature Selection via Transfer Entropy [59.999594949050596]
Causal discovery aims to identify causal relationships between features with observational data.
We introduce a new causal feature selection approach that relies on the forward and backward feature selection procedures.
We provide theoretical guarantees on the regression and classification errors for both the exact and the finite-sample cases.
arXiv Detail & Related papers (2023-10-17T08:04:45Z) - Multi-Objective GFlowNets [59.16787189214784]
We study the problem of generating diverse candidates in the context of Multi-Objective Optimization.
In many applications of machine learning such as drug discovery and material design, the goal is to generate candidates which simultaneously optimize a set of potentially conflicting objectives.
We propose Multi-Objective GFlowNets (MOGFNs), a novel method for generating diverse optimal solutions, based on GFlowNets.
arXiv Detail & Related papers (2022-10-23T16:15:36Z) - Compactness Score: A Fast Filter Method for Unsupervised Feature
Selection [66.84571085643928]
We propose a fast unsupervised feature selection method, named Compactness Score (CSUFS), to select desired features.
The proposed algorithm is shown to be more accurate and efficient than existing algorithms.
arXiv Detail & Related papers (2022-01-31T13:01:37Z) - Multivariate feature ranking of gene expression data [62.997667081978825]
We propose two new multivariate feature ranking methods based on pairwise correlation and pairwise consistency (a toy sketch of pairwise-correlation ranking appears after this list).
We statistically prove that the proposed methods outperform the state-of-the-art feature ranking methods Clustering Variation, Chi Squared, Correlation, Information Gain, ReliefF, and Significance.
arXiv Detail & Related papers (2021-11-03T17:19:53Z) - An Evolutionary Correlation-aware Feature Selection Method for
Classification Problems [3.2550305883611244]
In this paper, an estimation of distribution algorithm (EDA) is proposed to meet three goals.
Firstly, as an extension of EDA, the proposed method generates only two individuals in each iteration that compete based on a fitness function.
Secondly, we provide a guiding technique for determining the number of features for individuals in each iteration.
As the main contribution of the paper, in addition to considering the importance of each feature alone, the proposed method can consider the interaction between features.
arXiv Detail & Related papers (2021-10-16T20:20:43Z) - Auto-weighted Multi-view Feature Selection with Graph Optimization [90.26124046530319]
We propose a novel unsupervised multi-view feature selection model based on graph learning.
The contributions are threefold: (1) during the feature selection procedure, the consensus similarity graph shared by different views is learned.
Experiments on various datasets demonstrate the superiority of the proposed method compared with the state-of-the-art methods.
arXiv Detail & Related papers (2021-04-11T03:25:25Z)
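For the pairwise-correlation feature ranking mentioned above, the following is a toy greedy relevance-redundancy sketch (an assumption in the spirit of mRMR-style ranking; the cited paper's pairwise consistency measure is not reproduced here).

```python
# Toy illustration (assumed, not from the cited paper) of multivariate feature
# ranking driven by pairwise correlation: a feature ranks highly when it is
# relevant to the target but weakly correlated with features ranked before it.
import numpy as np
from sklearn.datasets import load_breast_cancer

X, y = load_breast_cancer(return_X_y=True)
n = X.shape[1]

# relevance: absolute correlation of each feature with the label
relevance = np.abs([np.corrcoef(X[:, j], y)[0, 1] for j in range(n)])
# redundancy: absolute feature-feature correlation matrix
redundancy = np.abs(np.corrcoef(X, rowvar=False))

ranked = [int(np.argmax(relevance))]   # start from the most relevant feature
while len(ranked) < n:
    remaining = [j for j in range(n) if j not in ranked]
    # score = relevance minus average correlation with already-ranked features
    scores = [relevance[j] - redundancy[j, ranked].mean() for j in remaining]
    ranked.append(remaining[int(np.argmax(scores))])

print("Top 10 features by correlation-aware ranking:", ranked[:10])
```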
This list is automatically generated from the titles and abstracts of the papers on this site.
The site does not guarantee the quality of the information presented and is not responsible for any consequences.