Learning Graph Meta Embeddings for Cold-Start Ads in Click-Through Rate
Prediction
- URL: http://arxiv.org/abs/2105.08909v1
- Date: Wed, 19 May 2021 03:46:56 GMT
- Title: Learning Graph Meta Embeddings for Cold-Start Ads in Click-Through Rate
Prediction
- Authors: Wentao Ouyang, Xiuwu Zhang, Shukui Ren, Li Li, Kun Zhang, Jinmei Luo,
Zhaojie Liu, Yanlong Du
- Abstract summary: We propose Graph Meta Embedding (GME) models that can rapidly learn how to generate desirable initial embeddings for new ad IDs.
Experimental results on three real-world datasets show that GMEs can significantly improve the prediction performance in both cold-start and warm-up scenarios.
- Score: 14.709092114902159
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Click-through rate (CTR) prediction is one of the most central tasks in
online advertising systems. Recent deep learning-based models that exploit
feature embedding and high-order data nonlinearity have shown dramatic
successes in CTR prediction. However, these models work poorly on cold-start
ads with new IDs, whose embeddings are not well learned yet. In this paper, we
propose Graph Meta Embedding (GME) models that can rapidly learn how to
generate desirable initial embeddings for new ad IDs based on graph neural
networks and meta learning. Previous works address this problem from the new ad
itself, but ignore possibly useful information contained in existing old ads.
In contrast, GMEs simultaneously consider two information sources: the new ad
and existing old ads. For the new ad, GMEs exploit its associated attributes.
For existing old ads, GMEs first build a graph to connect them with new ads,
and then adaptively distill useful information. We propose three specific GMEs
from different perspectives to explore what kind of information to use and how
to distill information. In particular, GME-P uses Pre-trained neighbor ID
embeddings, GME-G uses Generated neighbor ID embeddings and GME-A uses neighbor
Attributes. Experimental results on three real-world datasets show that GMEs
can significantly improve the prediction performance in both cold-start (i.e.,
no training data is available) and warm-up (i.e., a small number of training
samples are collected) scenarios over five major deep learning-based CTR
prediction models. GMEs can be applied to conversion rate (CVR) prediction as
well.
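The abstract describes the GME idea only at a high level: a new ad's initial ID embedding is generated from its own attribute embeddings, fused with information adaptively distilled (e.g., via graph attention) from neighboring old ads. The snippet below is a minimal, illustrative sketch of that idea in the spirit of GME-A, not the authors' implementation; the class name, embedding dimension, mean-pooling over attribute fields, single-layer attention, and the equal-weight fusion are all assumptions.

```python
# Minimal sketch (not the paper's code) of attribute-based initial ID embedding
# generation for a new ad, with attention over neighboring old ads' attributes.
import torch
import torch.nn as nn
import torch.nn.functional as F

class GMEAttributeInitializer(nn.Module):
    def __init__(self, emb_dim: int = 16):
        super().__init__()
        # Maps the fused attribute representation to an initial ID embedding.
        self.generator = nn.Linear(emb_dim, emb_dim)
        # Scores how useful each neighboring old ad is for the new ad.
        self.attn = nn.Linear(2 * emb_dim, 1)

    def forward(self, new_ad_attrs: torch.Tensor, neighbor_attrs: torch.Tensor) -> torch.Tensor:
        """
        new_ad_attrs:   (num_attr_fields, emb_dim) attribute embeddings of the new ad
        neighbor_attrs: (num_neighbors, num_attr_fields, emb_dim) attributes of old ads
        returns:        (emb_dim,) generated initial ID embedding for the new ad
        """
        self_repr = new_ad_attrs.mean(dim=0)                  # (emb_dim,)
        nbr_repr = neighbor_attrs.mean(dim=1)                 # (num_neighbors, emb_dim)

        # Graph attention: weight neighbors by their relevance to the new ad.
        pairs = torch.cat([self_repr.expand_as(nbr_repr), nbr_repr], dim=-1)
        weights = F.softmax(self.attn(pairs).squeeze(-1), dim=0)
        distilled = (weights.unsqueeze(-1) * nbr_repr).sum(dim=0)

        # Fuse the new ad's own attributes with distilled neighbor information.
        fused = 0.5 * (self_repr + distilled)
        return torch.tanh(self.generator(fused))

# Usage: 4 attribute fields, 5 neighboring old ads, 16-dim embeddings.
init = GMEAttributeInitializer(emb_dim=16)
new_ad = torch.randn(4, 16)
neighbors = torch.randn(5, 4, 16)
print(init(new_ad, neighbors).shape)  # torch.Size([16])
```

The generated embedding would serve only as the initialization for a new ad ID; under warm-up, it would be fine-tuned as clicks accumulate, as described in the abstract.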
Related papers
- What's New in My Data? Novelty Exploration via Contrastive Generation [31.33791825286853]
We introduce the task of novelty discovery through generation (CGE).
CGE aims to identify novel properties of a fine-tuning dataset by generating examples that illustrate these properties.
Our experiments demonstrate the effectiveness of CGE in detecting novel content, such as toxic language, as well as new natural and programming languages.
arXiv Detail & Related papers (2024-10-18T15:24:05Z)
- GACE: Learning Graph-Based Cross-Page Ads Embedding For Click-Through Rate Prediction [3.3840833400287593]
This paper proposes a graph-based cross-page ads embedding generation method.
It generates embedding representations for both cold-start and existing ads across various pages.
The results show that our method is significantly superior to the SOTA method.
arXiv Detail & Related papers (2024-01-15T03:12:21Z)
- A Survey of Knowledge Graph Reasoning on Graph Types: Static, Dynamic, and Multimodal [57.8455911689554]
Knowledge graph reasoning (KGR) aims to deduce new facts from existing facts based on mined logic rules underlying knowledge graphs (KGs).
It has been proven to significantly benefit the use of KGs in many AI applications, such as question answering and recommendation systems.
arXiv Detail & Related papers (2022-12-12T08:40:04Z)
- Boost CTR Prediction for New Advertisements via Modeling Visual Content [55.11267821243347]
We exploit the visual content in ads to boost the performance of CTR prediction models.
We learn the embedding for each visual ID based on historical user-ad interactions.
After incorporating the visual ID embedding in the CTR prediction model of Baidu online advertising, the average CTR of ads improves by 1.46%, and the total charge increases by 1.10%.
arXiv Detail & Related papers (2022-09-23T17:08:54Z)
- On the Factory Floor: ML Engineering for Industrial-Scale Ads Recommendation Models [9.102290972714652]
For industrial-scale advertising systems, prediction of ad click-through rate (CTR) is a central problem.
We present a case study of practical techniques deployed in Google's search ads CTR model.
arXiv Detail & Related papers (2022-09-12T15:15:23Z)
- A Graph-Enhanced Click Model for Web Search [67.27218481132185]
We propose a novel graph-enhanced click model (GraphCM) for web search.
We exploit both intra-session and inter-session information to address the sparsity and cold-start problems.
arXiv Detail & Related papers (2022-06-17T08:32:43Z)
- Data-Free Adversarial Knowledge Distillation for Graph Neural Networks [62.71646916191515]
We propose the first end-to-end framework for data-free adversarial knowledge distillation on graph-structured data (DFAD-GNN).
Specifically, DFAD-GNN employs a generative adversarial network with three components: a pre-trained teacher model and a student model act as two discriminators, while a generator derives training graphs used to distill knowledge from the teacher into the student.
Our DFAD-GNN significantly surpasses state-of-the-art data-free baselines in the graph classification task.
arXiv Detail & Related papers (2022-05-08T08:19:40Z)
- Learning Intents behind Interactions with Knowledge Graph for Recommendation [93.08709357435991]
Knowledge graph (KG) plays an increasingly important role in recommender systems.
Existing GNN-based models fail to identify user-item relations at the fine-grained level of intents.
We propose a new model, Knowledge Graph-based Intent Network (KGIN).
arXiv Detail & Related papers (2021-02-14T03:21:36Z)
- Iterative Boosting Deep Neural Networks for Predicting Click-Through Rate [15.90144113403866]
The click-through rate (CTR) reflects the ratio of clicks on a specific item to its total number of views.
XdBoost is an iterative three-stage neural network model influenced by the traditional machine learning boosting mechanism.
arXiv Detail & Related papers (2020-07-26T09:41:16Z)
- Generative Adversarial Zero-Shot Relational Learning for Knowledge Graphs [96.73259297063619]
We consider a novel formulation, zero-shot learning, to avoid this cumbersome curation.
For newly-added relations, we attempt to learn their semantic features from their text descriptions.
We leverage Generative Adversarial Networks (GANs) to establish the connection between the text and knowledge graph domains.
arXiv Detail & Related papers (2020-01-08T01:19:08Z)