Item Cold Start Recommendation via Adversarial Variational Auto-encoder Warm-up
- URL: http://arxiv.org/abs/2302.14395v1
- Date: Tue, 28 Feb 2023 08:23:15 GMT
- Title: Item Cold Start Recommendation via Adversarial Variational Auto-encoder Warm-up
- Authors: Shenzheng Zhang, Qi Tan, Xinzhi Zheng, Yi Ren, Xu Zhao
- Abstract summary: We propose an Adversarial Variational Auto-encoder Warm-up model (AVAEW) to generate warm-up item ID embedding for cold items.
We demonstrate the effectiveness and compatibility of the proposed method by extensive offline experiments on public datasets and online A/B tests on a real-world news recommendation platform.
- Score: 18.923299235862974
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: The gap between the randomly initialized item ID embedding and the
well-trained warm item ID embedding makes it hard for cold items to fit the
recommendation system, which is trained on data from historical warm items.
To alleviate the performance decline of new item recommendation, the
distribution of the new item ID embedding should be close to that of the
historical warm items. To achieve this goal, we propose an Adversarial
Variational Auto-encoder Warm-up model (AVAEW) to generate warm-up item ID
embedding for cold items. Specifically, we develop a conditional variational
auto-encoder model to leverage the side information of items for generating the
warm-up item ID embedding. Particularly, we introduce an adversarial module to
enforce the alignment between warm-up item ID embedding distribution and
historical item ID embedding distribution. We demonstrate the effectiveness and
compatibility of the proposed method by extensive offline experiments on public
datasets and online A/B tests on a real-world large-scale news recommendation
platform.
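The two ingredients the abstract names can be sketched in a few lines: a conditional variational auto-encoder maps item side information to a warm-up ID embedding via the reparameterization trick, and an adversarial discriminator pushes the generated embeddings toward the distribution of historical warm embeddings. The sketch below is a rough NumPy illustration of these ideas, not the paper's actual architecture; all layer shapes, the single-linear-layer "networks", and the logistic discriminator are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def cvae_generate(side_info, W_enc_mu, W_enc_logvar, W_dec):
    """Conditional VAE sketch: encode item side information into a latent
    Gaussian, sample with the reparameterization trick, and decode a
    warm-up item ID embedding. Returns the embedding and the KL term."""
    mu = side_info @ W_enc_mu
    logvar = side_info @ W_enc_logvar
    eps = rng.standard_normal(mu.shape)
    z = mu + np.exp(0.5 * logvar) * eps  # reparameterization trick
    warmup_emb = z @ W_dec
    # KL divergence from the standard normal prior, per item
    kl = -0.5 * np.sum(1 + logvar - mu**2 - np.exp(logvar), axis=1)
    return warmup_emb, kl

def adversarial_losses(warm_emb, warmup_emb, W_d, b_d):
    """Adversarial module sketch: a logistic discriminator separates
    historical warm embeddings (label 1) from generated warm-up
    embeddings (label 0); the generator loss rewards fooling it,
    aligning the two embedding distributions."""
    def prob(x):
        return 1.0 / (1.0 + np.exp(-(x @ W_d + b_d)))
    p_warm, p_gen = prob(warm_emb), prob(warmup_emb)
    d_loss = -np.mean(np.log(p_warm + 1e-8)) - np.mean(np.log(1 - p_gen + 1e-8))
    g_loss = -np.mean(np.log(p_gen + 1e-8))
    return d_loss, g_loss

# Toy shapes: 4 items, 8-dim side info, 3-dim latent, 5-dim ID embedding.
side = rng.standard_normal((4, 8))
emb_warm = rng.standard_normal((4, 5))
warmup, kl = cvae_generate(side,
                           rng.standard_normal((8, 3)),
                           rng.standard_normal((8, 3)),
                           rng.standard_normal((3, 5)))
d_loss, g_loss = adversarial_losses(emb_warm, warmup,
                                    rng.standard_normal(5), 0.0)
```

In a full training loop the discriminator and generator losses would be minimized alternately, alongside the VAE reconstruction and KL terms; those details are omitted here.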
Related papers
- MISSRec: Pre-training and Transferring Multi-modal Interest-aware
Sequence Representation for Recommendation [61.45986275328629]
We propose MISSRec, a multi-modal pre-training and transfer learning framework for sequential recommendation.
On the user side, we design a Transformer-based encoder-decoder model, where the contextual encoder learns to capture the sequence-level multi-modal user interests.
On the candidate item side, we adopt a dynamic fusion module to produce user-adaptive item representation.
arXiv Detail & Related papers (2023-08-22T04:06:56Z) - Multi-task Item-attribute Graph Pre-training for Strict Cold-start Item
Recommendation [71.5871100348448]
ColdGPT models item-attribute correlations into an item-attribute graph by extracting fine-grained attributes from item contents.
ColdGPT transfers knowledge into the item-attribute graph from various available data sources, i.e., item contents, historical purchase sequences, and review texts of the existing items.
Extensive experiments show that ColdGPT consistently outperforms the existing SCS recommenders by large margins.
arXiv Detail & Related papers (2023-06-26T07:04:47Z) - Recommender Systems with Generative Retrieval [58.454606442670034]
We propose a novel generative retrieval approach, where the retrieval model autoregressively decodes the identifiers of the target candidates.
To that end, we create semantically meaningful tuples of codewords to serve as a Semantic ID for each item.
We show that recommender systems trained with the proposed paradigm significantly outperform the current SOTA models on various datasets.
arXiv Detail & Related papers (2023-05-08T21:48:17Z) - Forget Embedding Layers: Representation Learning for Cold-start in
Recommender Systems [0.0]
We present FELRec, a large embedding network that refines the existing representations of users and items.
In contrast to similar approaches, our model represents new users and items without side information or time-consuming fine-tuning.
Our proposed model generalizes well to previously unseen datasets.
arXiv Detail & Related papers (2022-10-30T19:08:38Z) - Sequential Recommendation via Stochastic Self-Attention [68.52192964559829]
Transformer-based approaches embed items as vectors and use dot-product self-attention to measure the relationship between items.
We propose a novel STOchastic Self-Attention (STOSA) model to overcome these issues.
We devise a novel Wasserstein Self-Attention module to characterize item-item position-wise relationships in sequences.
arXiv Detail & Related papers (2022-01-16T12:38:45Z) - Cold Item Integration in Deep Hybrid Recommenders via Tunable Stochastic
Gates [19.69804455785047]
A major challenge in collaborative filtering methods is how to produce recommendations for cold items.
We propose a novel hybrid recommendation algorithm that bridges these two conflicting objectives.
We demonstrate the effectiveness of the proposed algorithm on movies, apps, and articles recommendations.
arXiv Detail & Related papers (2021-12-12T11:37:24Z) - Privileged Graph Distillation for Cold Start Recommendation [57.918041397089254]
The cold start problem in recommender systems requires recommending to new users (items) based on attributes without any historical interaction records.
We propose a privileged graph distillation model (PGD).
Our proposed model is generally applicable to different cold start scenarios with new user, new item, or new user-new item.
arXiv Detail & Related papers (2021-05-31T14:05:27Z) - Represent Items by Items: An Enhanced Representation of the Target Item
for Recommendation [37.28220632871373]
Item-based collaborative filtering (ICF) has been widely used in industrial applications such as recommender system and online advertising.
Recent models use methods such as attention mechanism and deep neural network to learn the user representation and scoring function more accurately.
We propose an enhanced representation of the target item which distills relevant information from the co-occurrence items.
arXiv Detail & Related papers (2021-04-26T11:28:28Z) - Variation Control and Evaluation for Generative Slate Recommendations [22.533997063750597]
We show that item perturbation can enforce slate variation and mitigate the over-concentration of generated slates.
We also propose to separate a pivot selection phase from the generation process so that the model can apply perturbation before generation.
arXiv Detail & Related papers (2021-02-26T05:04:40Z) - Seamlessly Unifying Attributes and Items: Conversational Recommendation
for Cold-Start Users [111.28351584726092]
We consider the conversational recommendation for cold-start users, where a system can both ask the attributes from and recommend items to a user interactively.
Our Conversational Thompson Sampling (ConTS) model holistically solves all questions in conversational recommendation by choosing the arm with the maximal reward to play.
arXiv Detail & Related papers (2020-05-23T08:56:37Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.