Differentiable Unsupervised Feature Selection based on a Gated Laplacian
- URL: http://arxiv.org/abs/2007.04728v3
- Date: Mon, 9 Nov 2020 11:23:01 GMT
- Title: Differentiable Unsupervised Feature Selection based on a Gated Laplacian
- Authors: Ofir Lindenbaum, Uri Shaham, Jonathan Svirsky, Erez Peterfreund, Yuval
Kluger
- Abstract summary: We propose a differentiable loss function that combines the Laplacian score, which favors low-frequency features, with a gating mechanism for feature selection.
We mathematically motivate the proposed approach and demonstrate that in the high noise regime, it is crucial to compute the Laplacian on the gated inputs, rather than on the full feature set.
- Score: 7.970954821067042
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Scientific observations may consist of a large number of variables
(features). Identifying a subset of meaningful features is often ignored in
unsupervised learning, despite its potential for unraveling clear patterns
hidden in the ambient space. In this paper, we present a method for
unsupervised feature selection, and we demonstrate its use for the task of
clustering. We propose a differentiable loss function that combines the
Laplacian score, which favors low-frequency features, with a gating mechanism
for feature selection. We improve the Laplacian score by replacing it with a
gated variant computed on a subset of features. This subset is obtained using a
continuous approximation of Bernoulli variables whose parameters are trained to
gate the full feature space. We mathematically motivate the proposed approach
and demonstrate that in the high noise regime, it is crucial to compute the
Laplacian on the gated inputs, rather than on the full feature set.
Experimental demonstration of the efficacy of the proposed approach and its
advantage over current baselines is provided using several real-world examples.
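The abstract describes a loss that couples a Laplacian score computed on gated inputs with a continuous relaxation of Bernoulli gates. The following is a minimal NumPy sketch of how such an objective could be assembled; the function name, default values, and normalization choices are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np
from math import erf, sqrt

def gated_laplacian_loss(X, mu, sigma=0.5, bandwidth=1.0, lam=0.1, seed=None):
    """Sketch of a differentiable gated-Laplacian objective.

    X  : (n, d) data matrix
    mu : (d,) gate parameters (would be trained by gradient descent)
    All names and default values are illustrative.
    """
    rng = np.random.default_rng(seed)
    d = X.shape[1]

    # Continuous relaxation of Bernoulli gates: a clipped Gaussian around
    # the trainable parameter mu (the "stochastic gate" trick).
    z = np.clip(0.5 + mu + sigma * rng.standard_normal(d), 0.0, 1.0)
    Xg = X * z  # gated inputs

    # RBF affinity and graph Laplacian computed on the *gated* data --
    # the abstract argues this is crucial in the high-noise regime.
    sq = ((Xg[:, None, :] - Xg[None, :, :]) ** 2).sum(-1)
    W = np.exp(-sq / (2.0 * bandwidth ** 2))
    D = np.diag(W.sum(axis=1))
    L = D - W

    # Laplacian-score-style term per gated feature, f^T L f / f^T D f,
    # which is small for smooth (low-frequency) features on the graph.
    num = np.einsum('ni,nm,mi->i', Xg, L, Xg)
    den = np.einsum('ni,nm,mi->i', Xg, D, Xg) + 1e-12
    score = (num / den).sum()

    # Sparsity regularizer: the expected number of open gates, i.e. the
    # Gaussian CDF of (0.5 + mu) summed over features.
    open_prob = np.array([0.5 * (1.0 + erf((0.5 + m) / (sigma * sqrt(2.0))))
                          for m in mu])
    return score + lam * open_prob.sum()
```

In practice `mu` would be optimized with an autodiff framework; every step above is (sub)differentiable in `mu`, so the gates can be trained end to end alongside a clustering objective.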
Related papers
- Quiver Laplacians and Feature Selection [1.237454174824584]
We describe a method for identifying selected features which are compatible with the decomposition into subsets.
We demonstrate that eigenvectors of the associated quiver Laplacian yield locally and globally compatible features.
arXiv Detail & Related papers (2024-04-10T13:12:07Z)
- Feature Selection as Deep Sequential Generative Learning [50.00973409680637]
We develop a deep variational transformer model over a joint of sequential reconstruction, variational, and performance evaluator losses.
Our model can distill feature selection knowledge and learn a continuous embedding space to map feature selection decision sequences into embedding vectors associated with utility scores.
arXiv Detail & Related papers (2024-03-06T16:31:56Z)
- Transcending Forgery Specificity with Latent Space Augmentation for Generalizable Deepfake Detection [57.646582245834324]
We propose a simple yet effective deepfake detector called LSDA.
It is based on an idea: representations with a wider variety of forgeries should be able to learn a more generalizable decision boundary.
We show that our proposed method is surprisingly effective and transcends state-of-the-art detectors across several widely used benchmarks.
arXiv Detail & Related papers (2023-11-19T09:41:10Z)
- Bayesian Hierarchical Models for Counterfactual Estimation [12.159830463756341]
We propose a probabilistic paradigm to estimate a diverse set of counterfactuals.
We treat the perturbations as random variables endowed with prior distribution functions.
A gradient based sampler with superior convergence characteristics efficiently computes the posterior samples.
arXiv Detail & Related papers (2023-01-21T00:21:11Z)
- Deep Unsupervised Feature Selection by Discarding Nuisance and Correlated Features [7.288137686773523]
Modern datasets contain large subsets of correlated features and nuisance features.
In the presence of large numbers of nuisance features, the Laplacian must be computed on the subset of selected features.
We employ an autoencoder architecture to cope with correlated features, trained to reconstruct the data from the subset of selected features.
arXiv Detail & Related papers (2021-10-11T14:26:13Z)
- Fine-Grained Dynamic Head for Object Detection [68.70628757217939]
We propose a fine-grained dynamic head to conditionally select a pixel-level combination of FPN features from different scales for each instance.
Experiments demonstrate the effectiveness and efficiency of the proposed method on several state-of-the-art detection benchmarks.
arXiv Detail & Related papers (2020-12-07T08:16:32Z)
- Multi-scale Interactive Network for Salient Object Detection [91.43066633305662]
We propose the aggregate interaction modules to integrate the features from adjacent levels.
To obtain more efficient multi-scale features, the self-interaction modules are embedded in each decoder unit.
Experimental results on five benchmark datasets demonstrate that the proposed method without any post-processing performs favorably against 23 state-of-the-art approaches.
arXiv Detail & Related papers (2020-07-17T15:41:37Z)
- Infinite Feature Selection: A Graph-based Feature Filtering Approach [78.63188057505012]
We propose a filtering feature selection framework that considers subsets of features as paths in a graph.
Going to infinity allows us to constrain the computational complexity of the selection process.
We show that Inf-FS behaves better in almost any situation, that is, when the number of features to keep is fixed a priori.
arXiv Detail & Related papers (2020-06-15T07:20:40Z)
- Spatially Adaptive Inference with Stochastic Feature Sampling and Interpolation [72.40827239394565]
We propose to compute features only at sparsely sampled locations.
We then densely reconstruct the feature map with an efficient procedure.
The presented network is experimentally shown to save substantial computation while maintaining accuracy over a variety of computer vision tasks.
arXiv Detail & Related papers (2020-03-19T15:36:31Z)
- Outlier Detection Ensemble with Embedded Feature Selection [42.8338013000469]
We propose an outlier detection ensemble framework with embedded feature selection (ODEFS)
For each random sub-sampling based learning component, ODEFS unifies feature selection and outlier detection into a pairwise ranking formulation.
We adopt the thresholded self-paced learning to simultaneously optimize feature selection and example selection.
arXiv Detail & Related papers (2020-01-15T13:14:10Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.