Handcrafted Feature Selection Techniques for Pattern Recognition: A
Survey
- URL: http://arxiv.org/abs/2209.02746v1
- Date: Tue, 6 Sep 2022 18:05:35 GMT
- Authors: Alysson Ribeiro da Silva, Camila Guedes Silveira
- Abstract summary: Feature selection is a process for finding a compact, representative subset of input features.
This paper presents a survey on some Filters and Wrapper methods for handcrafted feature selection.
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: The accuracy of a classifier performing pattern recognition is largely
tied to the quality and representativeness of the input feature vector. Feature
selection is a process that represents information properly and may increase
the accuracy of a classifier. It is responsible for finding the best possible
features, allowing us to identify the class to which a pattern belongs. Feature
selection methods can be categorized as Filters, Wrappers, and Embedded. This
paper presents a survey of some Filter and Wrapper methods for handcrafted
feature selection. Discussions regarding data structure, processing time, and
the ability to represent a feature vector well are also provided, in order to
show explicitly how appropriate each method is for feature selection. The
presented feature selection methods can therefore be accurate and efficient if
applied with their strengths and weaknesses in mind; finding the one that best
fits the problem's domain may be the hardest task.
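As a minimal illustration of the Filter category described above (not code from the paper), the sketch below ranks each feature by the absolute Pearson correlation between that feature and the class label, then keeps the top-k. Unlike a Wrapper, no classifier is trained during selection; the function name `filter_select` and the synthetic data are illustrative assumptions.

```python
import numpy as np

def filter_select(X, y, k):
    """Rank features by absolute Pearson correlation with the label
    (a simple Filter criterion) and keep the top-k indices."""
    Xc = X - X.mean(axis=0)
    yc = y - y.mean()
    denom = np.linalg.norm(Xc, axis=0) * np.linalg.norm(yc)
    scores = np.abs(Xc.T @ yc) / denom
    return np.argsort(scores)[::-1][:k]

# Synthetic data: feature 0 carries class signal, features 1-3 are noise.
rng = np.random.default_rng(0)
y = rng.integers(0, 2, size=200)
X = rng.normal(size=(200, 4))
X[:, 0] += 2.0 * y              # inject class signal into feature 0
selected = filter_select(X, y, k=2)
print(selected)                 # feature 0 should rank first
```

A Wrapper method would instead evaluate each candidate subset by cross-validating an actual classifier on it, which is usually more accurate but far more expensive, matching the processing-time trade-off the survey discusses.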
Related papers
- Greedy feature selection: Classifier-dependent feature selection via
greedy methods [2.4374097382908477]
The purpose of this study is to introduce a new approach to feature ranking for classification tasks, called greedy feature selection in what follows.
The benefits of such scheme are investigated theoretically in terms of model capacity indicators, such as the Vapnik-Chervonenkis (VC) dimension or the kernel alignment.
arXiv Detail & Related papers (2024-03-08T08:12:05Z) - Feature Selection as Deep Sequential Generative Learning [50.00973409680637]
We develop a deep variational transformer model over a joint of sequential reconstruction, variational, and performance evaluator losses.
Our model can distill feature selection knowledge and learn a continuous embedding space to map feature selection decision sequences into embedding vectors associated with utility scores.
arXiv Detail & Related papers (2024-03-06T16:31:56Z) - A Contrast Based Feature Selection Algorithm for High-dimensional Data
set in Machine Learning [9.596923373834093]
We propose a novel filter feature selection method, ContrastFS, which selects discriminative features based on the discrepancies that features show between different classes.
We validate the effectiveness and efficiency of our approach on several widely studied benchmark datasets; results show that the new method performs favorably with negligible computation.
arXiv Detail & Related papers (2024-01-15T05:32:35Z) - Deep Feature Selection Using a Novel Complementary Feature Mask [5.904240881373805]
We deal with feature selection by exploiting the features with less importance scores.
We propose a feature selection framework based on a novel complementary feature mask.
Our method is generic and can be easily integrated into existing deep-learning-based feature selection approaches.
arXiv Detail & Related papers (2022-09-25T18:03:30Z) - Parallel feature selection based on the trace ratio criterion [4.30274561163157]
This work presents a novel parallel feature selection approach for classification, namely Parallel Feature Selection using Trace criterion (PFST).
Our method uses the trace criterion, a measure of class separability used in Fisher's Discriminant Analysis, to evaluate feature usefulness.
The experiments show that our method can produce a small set of features in a fraction of the time required by the other methods under comparison.
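The trace criterion mentioned above can be sketched as the ratio trace(S_b)/trace(S_w) of between-class to within-class scatter restricted to a candidate feature subset. The snippet below is an illustrative serial greedy search driven by that ratio, not the PFST algorithm itself (which parallelizes the candidate evaluations); all names and data here are assumptions.

```python
import numpy as np

def trace_ratio(X, y, feats):
    """Fisher-style class separability for a feature subset:
    trace of between-class scatter over trace of within-class scatter."""
    Xs = X[:, feats]
    mu = Xs.mean(axis=0)
    sb = sw = 0.0
    for c in np.unique(y):
        Xc = Xs[y == c]
        mc = Xc.mean(axis=0)
        sb += len(Xc) * np.sum((mc - mu) ** 2)   # contributes to trace(S_b)
        sw += np.sum((Xc - mc) ** 2)             # contributes to trace(S_w)
    return sb / sw

# Synthetic data: only feature 2 separates the two classes.
rng = np.random.default_rng(1)
y = np.repeat([0, 1], 100)
X = rng.normal(size=(200, 5))
X[:, 2] += 3.0 * y                               # make feature 2 discriminative

# Greedy forward selection: add the feature that most improves the ratio.
chosen = []
for _ in range(2):
    rest = [f for f in range(X.shape[1]) if f not in chosen]
    best = max(rest, key=lambda f: trace_ratio(X, y, chosen + [f]))
    chosen.append(best)
print(chosen)                                    # feature 2 should come first
```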
arXiv Detail & Related papers (2022-03-03T10:50:33Z) - Compactness Score: A Fast Filter Method for Unsupervised Feature
Selection [66.84571085643928]
We propose a fast unsupervised feature selection method, named Compactness Score (CSUFS), to select desired features.
Our proposed algorithm seems to be more accurate and efficient compared with existing algorithms.
arXiv Detail & Related papers (2022-01-31T13:01:37Z) - An Efficient and Accurate Rough Set for Feature Selection,
Classification and Knowledge Representation [89.5951484413208]
This paper presents a strong data mining method based on rough sets, which can realize feature selection, classification, and knowledge representation at the same time.
We first identify the ineffectiveness of rough sets caused by overfitting, especially when processing noisy attributes, and propose a robust measurement for an attribute, called relative importance.
Experimental results on public benchmark datasets show that the proposed framework achieves higher accuracy than seven popular or state-of-the-art feature selection methods.
arXiv Detail & Related papers (2021-12-29T12:45:49Z) - Few-shot Learning for Unsupervised Feature Selection [59.75321498170363]
We propose a few-shot learning method for unsupervised feature selection.
The proposed method can select a subset of relevant features in a target task given a few unlabeled target instances.
We experimentally demonstrate that the proposed method outperforms existing feature selection methods.
arXiv Detail & Related papers (2021-07-02T03:52:51Z) - Saliency-driven Class Impressions for Feature Visualization of Deep
Neural Networks [55.11806035788036]
It is advantageous to visualize the features considered to be essential for classification.
Existing visualization methods develop high confidence images consisting of both background and foreground features.
In this work, we propose a saliency-driven approach to visualize discriminative features that are considered most important for a given task.
arXiv Detail & Related papers (2020-07-31T06:11:06Z) - Selecting Relevant Features from a Multi-domain Representation for
Few-shot Classification [91.67977602992657]
We propose a new strategy based on feature selection, which is both simpler and more effective than previous feature adaptation approaches.
We show that a simple non-parametric classifier built on top of such features produces high accuracy and generalizes to domains never seen during training.
arXiv Detail & Related papers (2020-03-20T15:44:17Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.