A Tent Lévy Flying Sparrow Search Algorithm for Feature Selection: A
COVID-19 Case Study
- URL: http://arxiv.org/abs/2209.10542v1
- Date: Tue, 20 Sep 2022 15:12:10 GMT
- Title: A Tent Lévy Flying Sparrow Search Algorithm for Feature Selection: A
COVID-19 Case Study
- Authors: Qinwen Yang, Yuelin Gao, Yanjie Song
- Abstract summary: The "Curse of Dimensionality" induced by the rapid development of information science might have a negative impact when dealing with big datasets.
We propose a variant of the sparrow search algorithm (SSA), called the Tent Lévy flying sparrow search algorithm (TFSSA).
TFSSA is used to select the best subset of features in a wrapper framework for classification purposes.
- Score: 1.6436293069942312
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The "Curse of Dimensionality" induced by the rapid development of information
science might have a negative impact when dealing with big datasets. In this
paper, we propose a variant of the sparrow search algorithm (SSA), called the Tent
Lévy flying sparrow search algorithm (TFSSA), and use it to select the best
subset of features in a wrapper framework for classification purposes. SSA is a
recently proposed algorithm that has not been systematically applied to feature
selection problems. After verification on the CEC2020 benchmark functions, TFSSA
is used to select the feature combination that maximizes classification
accuracy and minimizes the number of selected features. The proposed TFSSA is
compared with nine algorithms from the literature. Nine evaluation metrics are
used to properly evaluate and compare the performance of these algorithms on
twenty-one datasets from the UCI repository. Furthermore, the approach is
applied to the coronavirus disease (COVID-19) dataset, where it achieves the best
results among the compared methods: an average classification accuracy of 93.47%
with an average of 2.1 selected features. Experimental results confirm the
advantages of the proposed algorithm in improving classification accuracy and
reducing the number of selected features compared to other wrapper-based algorithms.
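The two ingredients that distinguish TFSSA from plain SSA are a Tent chaotic map (commonly used to spread the initial population over the search range) and Lévy-flight step sizes (heavy-tailed jumps that help escape local optima). A minimal sketch of both, assuming the standard Tent-map recurrence and Mantegna's algorithm for generating Lévy steps; parameter values and names here are illustrative, not taken from the paper:

```python
import math
import random


def tent_map(x, a=0.5):
    """One iteration of the Tent chaotic map on (0, 1)."""
    return x / a if x < a else (1.0 - x) / (1.0 - a)


def levy_step(beta=1.5, rng=random):
    """Draw one Levy-flight step via Mantegna's algorithm."""
    sigma_u = (math.gamma(1 + beta) * math.sin(math.pi * beta / 2)
               / (math.gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2))) ** (1 / beta)
    u = rng.gauss(0.0, sigma_u)   # numerator: N(0, sigma_u^2)
    v = rng.gauss(0.0, 1.0)       # denominator: N(0, 1)
    return u / abs(v) ** (1 / beta)


# Chaotic initialization: a Tent sequence covers (0, 1) more evenly
# than independent uniform draws, which is its usual role in SSA variants.
x = 0.37
positions = []
for _ in range(5):
    x = tent_map(x)
    positions.append(x)
```

In a wrapper setting, each candidate position would then be binarized into a feature mask and scored by a classifier's accuracy penalized by the number of selected features.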
Related papers
- SFE: A Simple, Fast and Efficient Feature Selection Algorithm for
High-Dimensional Data [8.190527783858096]
The SFE algorithm performs its search process using a search agent and two operators: non-selection and selection.
The efficiency and effectiveness of the SFE and the SFE-PSO for feature selection are compared on 40 high-dimensional datasets.
arXiv Detail & Related papers (2023-03-17T12:28:17Z)
- Feature selection algorithm based on incremental mutual information and
cockroach swarm optimization [12.297966427336124]
We propose an incremental mutual information based improved swarm intelligent optimization method (IMIICSO).
This method extracts decision table reduction knowledge to guide the swarm algorithm's global search.
The accuracy of feature subsets selected by the improved cockroach swarm algorithm based on incremental mutual information is better than or comparable to that of the original swarm intelligent optimization algorithm.
arXiv Detail & Related papers (2023-02-21T08:51:05Z)
- An efficient hybrid classification approach for COVID-19 based on Harris
Hawks Optimization and Salp Swarm Optimization [0.0]
This study presents a hybrid binary version of the Harris Hawks Optimization algorithm (HHO) and Salp Swarm Optimization (SSA) for COVID-19 classification.
The proposed algorithm (HHOSSA) achieved 96% accuracy with the SVM and 98% accuracy with each of two other classifiers.
arXiv Detail & Related papers (2022-12-25T19:52:18Z)
- Improved Algorithms for Neural Active Learning [74.89097665112621]
We improve the theoretical and empirical performance of neural-network(NN)-based active learning algorithms for the non-parametric streaming setting.
We introduce two regret metrics based on minimizing the population loss that are more suitable for active learning than the one used in state-of-the-art (SOTA) related work.
arXiv Detail & Related papers (2022-10-02T05:03:38Z)
- Compactness Score: A Fast Filter Method for Unsupervised Feature
Selection [66.84571085643928]
We propose a fast unsupervised feature selection method, named Compactness Score (CSUFS), to select desired features.
Our proposed algorithm is more accurate and efficient than existing algorithms.
arXiv Detail & Related papers (2022-01-31T13:01:37Z)
- Machine Learning for Online Algorithm Selection under Censored Feedback [71.6879432974126]
In online algorithm selection (OAS), instances of an algorithmic problem class are presented to an agent one after another, and the agent has to quickly select a presumably best algorithm from a fixed set of candidate algorithms.
For decision problems such as satisfiability (SAT), quality typically refers to the algorithm's runtime.
In this work, we revisit multi-armed bandit algorithms for OAS and discuss their capability of dealing with the problem.
We adapt them towards runtime-oriented losses, allowing for partially censored data while keeping a space- and time-complexity independent of the time horizon.
arXiv Detail & Related papers (2021-09-13T18:10:52Z)
- RSO: A Novel Reinforced Swarm Optimization Algorithm for Feature
Selection [0.0]
In this paper, we propose a novel feature selection algorithm named Reinforced Swarm Optimization (RSO).
This algorithm combines the widely used Bee Swarm Optimization (BSO) algorithm with Reinforcement Learning (RL) to reward superior search agents and punish inferior ones.
The proposed method is evaluated on 25 widely known UCI datasets containing a perfect blend of balanced and imbalanced data.
arXiv Detail & Related papers (2021-07-29T17:38:04Z)
- Local policy search with Bayesian optimization [73.0364959221845]
Reinforcement learning aims to find an optimal policy by interaction with an environment.
Policy gradients for local search are often obtained from random perturbations.
We develop an algorithm utilizing a probabilistic model of the objective function and its gradient.
arXiv Detail & Related papers (2021-06-22T16:07:02Z)
- A Scalable Feature Selection and Opinion Miner Using Whale Optimization
Algorithm [6.248184589339059]
Feature selection techniques not only support a better understanding of the data but also lead to higher speed and accuracy.
In this article, the Whale Optimization algorithm is considered and applied to the search for the optimum subset of features.
arXiv Detail & Related papers (2020-04-21T01:08:45Z)
- Extreme Algorithm Selection With Dyadic Feature Representation [78.13985819417974]
We propose the setting of extreme algorithm selection (XAS) where we consider fixed sets of thousands of candidate algorithms.
We assess the applicability of state-of-the-art AS techniques to the XAS setting and propose approaches leveraging a dyadic feature representation.
arXiv Detail & Related papers (2020-01-29T09:40:58Z)
- Optimal Clustering from Noisy Binary Feedback [75.17453757892152]
We study the problem of clustering a set of items from binary user feedback.
We devise an algorithm with a minimal cluster recovery error rate.
For adaptive selection, we develop an algorithm inspired by the derivation of the information-theoretical error lower bounds.
arXiv Detail & Related papers (2019-10-14T09:18:26Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.