AutoField: Automating Feature Selection in Deep Recommender Systems
- URL: http://arxiv.org/abs/2204.09078v1
- Date: Tue, 19 Apr 2022 18:06:02 GMT
- Title: AutoField: Automating Feature Selection in Deep Recommender Systems
- Authors: Yejing Wang, Xiangyu Zhao, Tong Xu, Xian Wu
- Abstract summary: Feature selection is a critical process in developing deep learning-based recommender systems.
We propose an AutoML framework that can adaptively select the essential feature fields in an automatic manner.
- Score: 36.70138179483737
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: Feature quality has a significant impact on recommendation performance.
Thereby, feature selection is a critical process in developing deep
learning-based recommender systems. Most existing deep recommender systems,
however, focus on designing sophisticated neural networks, while neglecting the
feature selection process. Typically, they just feed all possible features into
their proposed deep architectures, or select important features manually by
human experts. The former incurs a non-trivial number of embedding parameters
and extra inference time, while the latter demands substantial expert knowledge
and manual effort. In this work, we propose an AutoML framework that can adaptively
select the essential feature fields in an automatic manner. Specifically, we
first design a differentiable controller network, which is capable of
automatically adjusting the probability of selecting a particular feature
field; then, only selected feature fields are utilized to retrain the deep
recommendation model. Extensive experiments on three benchmark datasets
demonstrate the effectiveness of our framework. We conduct further experiments
to investigate its properties, including the transferability, key components,
and parameter sensitivity.
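The paper's exact controller is not reproduced here, but the core idea of a differentiable controller over feature fields can be sketched with a Gumbel-softmax relaxation (forward pass only; all names, shapes, and hyperparameters below are illustrative assumptions, not the authors' implementation):

```python
import numpy as np

rng = np.random.default_rng(0)

def gumbel_softmax(logits, tau=1.0):
    """Differentiable relaxation of a categorical sample (forward pass only)."""
    g = -np.log(-np.log(rng.uniform(1e-10, 1.0, size=logits.shape)))
    y = (logits + g) / tau
    e = np.exp(y - y.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

n_fields, emb_dim = 4, 8
# One (keep, drop) logit pair per feature field; learned jointly in practice.
alpha = rng.normal(size=(n_fields, 2))
field_emb = rng.normal(size=(n_fields, emb_dim))

# Soft keep-probability per field, used to gate that field's embedding.
keep = gumbel_softmax(alpha, tau=0.5)[:, 0]   # shape (n_fields,)
gated = field_emb * keep[:, None]             # scaled field embeddings

# After the search stage, retain only the fields with the highest keep-probability.
selected = np.argsort(-keep)[:2]
```

In the full framework, the logits `alpha` would be updated by backpropagating the recommendation loss through the soft mask; once the search converges, the deep recommendation model is retrained from scratch using only the selected feature fields.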
Related papers
- Dataset-Agnostic Recommender Systems [0.47109219881156855]
DAReS aims to enable a single system to autonomously adapt to various datasets without the need for fine-tuning.
DAReS offers a more efficient and scalable solution for building recommender systems across diverse application domains.
arXiv Detail & Related papers (2025-01-13T13:01:00Z)
- Feature Selection as Deep Sequential Generative Learning [50.00973409680637]
We develop a deep variational transformer model over a joint of sequential reconstruction, variational, and performance evaluator losses.
Our model can distill feature selection knowledge and learn a continuous embedding space to map feature selection decision sequences into embedding vectors associated with utility scores.
arXiv Detail & Related papers (2024-03-06T16:31:56Z)
- A Performance-Driven Benchmark for Feature Selection in Tabular Deep Learning [131.2910403490434]
Data scientists typically collect as many features as possible into their datasets, and even engineer new features from existing ones.
Existing benchmarks for tabular feature selection consider classical downstream models, toy synthetic datasets, or do not evaluate feature selectors on the basis of downstream performance.
We construct a challenging feature selection benchmark evaluated on downstream neural networks including transformers.
We also propose an input-gradient-based analogue of Lasso for neural networks that outperforms classical feature selection methods on challenging problems.
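As an illustration of the input-gradient idea (not the benchmark's actual method), one can rank the features of a small differentiable model by the mean absolute gradient of its output with respect to each input; the toy logistic model below is an assumption for demonstration:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy differentiable model: logistic regression p = sigmoid(x @ w + b).
n, d = 200, 5
X = rng.normal(size=(n, d))
w_true = np.array([3.0, 0.0, 1.5, 0.0, 0.0])   # only features 0 and 2 matter
y = (X @ w_true > 0).astype(float)

w, b = rng.normal(size=d) * 0.01, 0.0
for _ in range(500):                            # plain gradient descent
    p = 1.0 / (1.0 + np.exp(-(X @ w + b)))
    w -= 0.5 * (X.T @ (p - y) / n)
    b -= 0.5 * (p - y).mean()

# Input-gradient importance: mean |d p / d x_i| over the data.
p = 1.0 / (1.0 + np.exp(-(X @ w + b)))
grads = (p * (1 - p))[:, None] * w[None, :]     # analytic input gradient
importance = np.abs(grads).mean(axis=0)
ranked = np.argsort(-importance)                # informative features rank first
```

A Lasso-style variant would add an L1 penalty on this gradient-based importance during training to push uninformative features' contributions toward zero.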
arXiv Detail & Related papers (2023-11-10T05:26:10Z)
- Feature Selection with Distance Correlation [0.0]
We develop a new feature selection method based on Distance Correlation (DisCo).
Using our method to select features from a set of over 7,000 energy flows, we show that we can match the performance of much deeper architectures.
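The statistic itself is standard; a minimal numpy sketch of empirical distance correlation, the quantity a DisCo-style selector would rank features by (the paper's full selection procedure is not reproduced here), looks like:

```python
import numpy as np

def distance_correlation(x, y):
    """Empirical distance correlation between two 1-D samples."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    a = np.abs(x[:, None] - x[None, :])   # pairwise distance matrices
    b = np.abs(y[:, None] - y[None, :])
    # Double-center each distance matrix.
    A = a - a.mean(0) - a.mean(1)[:, None] + a.mean()
    B = b - b.mean(0) - b.mean(1)[:, None] + b.mean()
    dcov2 = max((A * B).mean(), 0.0)      # squared distance covariance
    denom = np.sqrt((A * A).mean() * (B * B).mean())
    return np.sqrt(dcov2 / denom) if denom > 0 else 0.0
```

Distance correlation is 1 for exactly (affinely) dependent samples and near 0 for independent ones, which makes it a natural filter score for ranking candidate features against a target.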
arXiv Detail & Related papers (2022-11-30T19:00:04Z)
- Deep Feature Selection Using a Novel Complementary Feature Mask [5.904240881373805]
We approach feature selection by exploiting the features with lower importance scores.
We propose a feature selection framework based on a novel complementary feature mask.
Our method is generic and can be easily integrated into existing deep-learning-based feature selection approaches.
arXiv Detail & Related papers (2022-09-25T18:03:30Z)
- Meta-Wrapper: Differentiable Wrapping Operator for User Interest Selection in CTR Prediction [97.99938802797377]
Click-through rate (CTR) prediction, whose goal is to predict the probability of the user to click on an item, has become increasingly significant in recommender systems.
Recent deep learning models that automatically extract a user's interests from their behaviors have achieved great success.
We propose a novel approach under the framework of the wrapper method, which is named Meta-Wrapper.
arXiv Detail & Related papers (2022-06-28T03:28:15Z)
- i-Razor: A Differentiable Neural Input Razor for Feature Selection and Dimension Search in DNN-Based Recommender Systems [8.992480061695138]
Noisy features and inappropriate embedding dimension assignments can deteriorate the performance of recommender systems.
We propose a differentiable neural input razor (i-Razor) that enables joint optimization of feature selection and dimension search.
arXiv Detail & Related papers (2022-04-01T08:30:06Z)
- Compactness Score: A Fast Filter Method for Unsupervised Feature Selection [66.84571085643928]
We propose a fast unsupervised feature selection method, named Compactness Score (CSUFS), to select desired features.
Our proposed algorithm is more accurate and efficient than existing algorithms.
arXiv Detail & Related papers (2022-01-31T13:01:37Z)
- Feature Selection Using Batch-Wise Attenuation and Feature Mask Normalization [6.6357750579293935]
This paper proposes a feature mask module (FM-module) for feature selection based on a novel batch-wise attenuation and feature mask normalization.
Experiments on popular image, text and speech datasets have shown that our approach is easy to use and has superior performance in comparison with other state-of-the-art deep-learning-based feature selection methods.
arXiv Detail & Related papers (2020-10-26T14:46:38Z)
- AutoFIS: Automatic Feature Interaction Selection in Factorization Models for Click-Through Rate Prediction [75.16836697734995]
We propose a two-stage algorithm called Automatic Feature Interaction Selection (AutoFIS).
AutoFIS can automatically identify important feature interactions for factorization models with computational cost just equivalent to training the target model to convergence.
AutoFIS has been deployed onto the training platform of Huawei App Store recommendation service.
arXiv Detail & Related papers (2020-03-25T06:53:54Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.