Iterative Boosting Deep Neural Networks for Predicting Click-Through
Rate
- URL: http://arxiv.org/abs/2007.13087v1
- Date: Sun, 26 Jul 2020 09:41:16 GMT
- Title: Iterative Boosting Deep Neural Networks for Predicting Click-Through
Rate
- Authors: Amit Livne, Roy Dor, Eyal Mazuz, Tamar Didi, Bracha Shapira, and Lior
Rokach
- Abstract summary: The click-through rate (CTR) reflects the ratio of clicks on a specific item to its total number of views.
XDBoost is an iterative three-stage neural network model influenced by the traditional machine learning boosting mechanism.
- Score: 15.90144113403866
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The click-through rate (CTR) reflects the ratio of clicks on a specific item
to its total number of views. It has a significant impact on websites'
advertising revenue. Learning sophisticated models to understand and predict
user behavior is essential for maximizing the CTR in recommendation systems.
Recent works have suggested new methods that replace the expensive and
time-consuming feature engineering process with a variety of deep learning (DL)
classifiers capable of capturing complicated patterns from raw data; these
methods have shown significant improvement on the CTR prediction task. While DL
techniques can learn intricate user behavior patterns, they rely on vast
amounts of data and do not perform as well when only a limited amount of data
is available. We propose XDBoost, a new DL method for capturing complex patterns that
requires just a limited amount of raw data. XDBoost is an iterative three-stage
neural network model influenced by the traditional machine learning boosting
mechanism. XDBoost's components operate sequentially, similar to boosting;
however, unlike conventional boosting, XDBoost does not sum the predictions
generated by its components. Instead, it utilizes these predictions as new
artificial features and enhances CTR prediction by retraining the model using
these features. Comprehensive experiments conducted on two datasets demonstrate
the effectiveness of XDBoost and its ability to outperform existing
state-of-the-art (SOTA) models for CTR prediction.
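Below is a minimal sketch of the boosting-as-feature-augmentation idea described in the abstract, assuming a generic tabular setup; the number of stages, network sizes, and the use of scikit-learn MLPs are illustrative assumptions, not the authors' implementation.
```python
# Hedged sketch: each stage's predictions become new artificial features for
# the next stage, rather than being summed as in classical boosting.
import numpy as np
from sklearn.neural_network import MLPClassifier

def xdboost_sketch(X, y, n_stages=3):
    """Train n_stages networks sequentially on progressively augmented features."""
    features, stages = X, []
    for _ in range(n_stages):
        net = MLPClassifier(hidden_layer_sizes=(64, 32), max_iter=200)
        net.fit(features, y)
        stages.append(net)
        # Append this stage's predicted CTR as a new artificial feature.
        pred = net.predict_proba(features)[:, 1].reshape(-1, 1)
        features = np.hstack([features, pred])
    return stages

def xdboost_predict(stages, X):
    """Replay the feature augmentation, then score with the final stage."""
    features = X
    for net in stages[:-1]:
        pred = net.predict_proba(features)[:, 1].reshape(-1, 1)
        features = np.hstack([features, pred])
    return stages[-1].predict_proba(features)[:, 1]
```
The key departure from classical boosting is visible in the loop: predictions are concatenated to the input rather than accumulated into the output.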
Related papers
- Investigating the Robustness of Counterfactual Learning to Rank Models: A Reproducibility Study [61.64685376882383]
Counterfactual learning to rank (CLTR) has attracted extensive attention in the IR community for its ability to leverage massive logged user interaction data to train ranking models.
This paper investigates the robustness of existing CLTR models in complex and diverse situations.
We find that the DLA models and IPS-DCM show better robustness under various simulation settings than IPS-PBM and PRS with offline propensity estimation.
arXiv Detail & Related papers (2024-04-04T10:54:38Z)
- MAP: A Model-agnostic Pretraining Framework for Click-through Rate Prediction [39.48740397029264]
We propose a Model-agnostic pretraining (MAP) framework that applies feature corruption and recovery on multi-field categorical data.
We derive two practical algorithms: masked feature prediction (MFP) and replaced feature detection (RFD).
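A rough sketch of the two corruption objectives, assuming integer-encoded multi-field categorical rows; the mask token, corruption rates, and field layout are assumptions for illustration, not the MAP paper's code.
```python
import numpy as np

rng = np.random.default_rng(0)
MASK_TOKEN = 0  # assume id 0 is reserved as a [MASK] value in every field

def corrupt_for_mfp(rows, mask_prob=0.15):
    """Masked feature prediction: hide field values; the pretraining target is
    to recover the original ids at the masked positions."""
    rows = rows.copy()
    mask = rng.random(rows.shape) < mask_prob
    targets = np.where(mask, rows, -1)  # -1 marks positions that are not targets
    rows[mask] = MASK_TOKEN
    return rows, targets

def corrupt_for_rfd(rows, vocab_sizes, replace_prob=0.15):
    """Replaced feature detection: swap field values for random in-field ids;
    the pretraining target is the binary 'was replaced' flag per position."""
    rows = rows.copy()
    replaced = rng.random(rows.shape) < replace_prob
    for j, vocab in enumerate(vocab_sizes):
        random_ids = rng.integers(1, vocab, size=rows.shape[0])
        rows[:, j] = np.where(replaced[:, j], random_ids, rows[:, j])
    return rows, replaced.astype(np.int64)
```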
arXiv Detail & Related papers (2023-08-03T12:55:55Z)
- DELTA: Dynamic Embedding Learning with Truncated Conscious Attention for CTR Prediction [61.68415731896613]
Click-Through Rate (CTR) prediction is a pivotal task in product and content recommendation.
We propose a model that enables Dynamic Embedding Learning with Truncated Conscious Attention for CTR prediction.
arXiv Detail & Related papers (2023-05-03T12:34:45Z)
- Directed Acyclic Graph Factorization Machines for CTR Prediction via Knowledge Distillation [65.62538699160085]
We propose a Directed Acyclic Graph Factorization Machine (KD-DAGFM) to learn the high-order feature interactions from existing complex interaction models for CTR prediction via Knowledge Distillation.
KD-DAGFM achieves the best performance with less than 21.5% of the FLOPs of the state-of-the-art method in both online and offline experiments.
arXiv Detail & Related papers (2022-11-21T03:09:42Z)
- Towards Open-World Feature Extrapolation: An Inductive Graph Learning Approach [80.8446673089281]
We propose a new learning paradigm with graph representation and learning.
Our framework contains two modules: 1) a backbone network (e.g., feedforward neural nets) as a lower model takes features as input and outputs predicted labels; 2) a graph neural network as an upper model learns to extrapolate embeddings for new features via message passing over a feature-data graph built from observed data.
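A toy sketch of the extrapolation step for one unseen feature over a bipartite feature-instance graph, using two hops of mean aggregation; the aggregation rule and shapes are simplifying assumptions rather than the paper's GNN.
```python
import numpy as np

def extrapolate_new_feature_embedding(X, new_col, emb):
    """X: binary feature-instance matrix, shape (n_instances, n_known_features).
    new_col: binary indicator of the new feature per instance, shape (n_instances,).
    emb: embeddings of known features, shape (n_known_features, d)."""
    # Hop 1 (feature -> instance): an instance embedding is the mean of the
    # embeddings of the known features it contains.
    inst_emb = (X @ emb) / np.maximum(X.sum(axis=1, keepdims=True), 1)
    # Hop 2 (instance -> new feature): the new feature's embedding is the mean
    # of the embeddings of the instances in which it occurs.
    return (new_col @ inst_emb) / max(new_col.sum(), 1)
```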
arXiv Detail & Related papers (2021-10-09T09:02:45Z)
- Efficient Click-Through Rate Prediction for Developing Countries via Tabular Learning [2.916402752324148]
Click-Through Rate (CTR) prediction models are difficult to deploy due to limited computing resources.
In this paper, we show that tabular learning models are more efficient and effective in CTR prediction.
arXiv Detail & Related papers (2021-04-15T16:07:25Z)
- Adversarial Feature Augmentation and Normalization for Visual Recognition [109.6834687220478]
Recent advances in computer vision take advantage of adversarial data augmentation to ameliorate the generalization ability of classification models.
Here, we present an effective and efficient alternative that advocates adversarial augmentation on intermediate feature embeddings.
We validate the proposed approach across diverse visual recognition tasks with representative backbone networks.
arXiv Detail & Related papers (2021-03-22T20:36:34Z)
- Ensemble Knowledge Distillation for CTR Prediction [46.92149090885551]
We propose a new model training strategy based on knowledge distillation (KD).
KD is a teacher-student learning framework to transfer knowledge learned from a teacher model to a student model.
We propose some novel techniques to facilitate ensembled CTR prediction, including teacher gating and early stopping by distillation loss.
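A hedged sketch of a distillation objective in this spirit for binary CTR labels; the equal-weight teacher average and the alpha mixing weight are assumptions standing in for the paper's teacher gating and tuned loss weights.
```python
import numpy as np

def _sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def ensemble_kd_loss(student_logits, teacher_logits_list, labels, alpha=0.5):
    """BCE to the true labels plus BCE to the averaged teacher click probabilities."""
    eps = 1e-7
    p = np.clip(_sigmoid(student_logits), eps, 1 - eps)
    teacher_probs = np.mean([_sigmoid(t) for t in teacher_logits_list], axis=0)
    hard = -np.mean(labels * np.log(p) + (1 - labels) * np.log(1 - p))
    soft = -np.mean(teacher_probs * np.log(p) + (1 - teacher_probs) * np.log(1 - p))
    return (1 - alpha) * hard + alpha * soft
```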
arXiv Detail & Related papers (2020-11-08T23:37:58Z)
- DeepLight: Deep Lightweight Feature Interactions for Accelerating CTR Predictions in Ad Serving [15.637357991632241]
Click-through rate (CTR) prediction is a crucial task in online display advertising.
Embedding-based neural networks have been proposed to learn both explicit and implicit feature interactions.
These sophisticated models, however, slow down the prediction inference by at least hundreds of times.
arXiv Detail & Related papers (2020-02-17T14:51:31Z)