A Multi-Strategy based Pre-Training Method for Cold-Start Recommendation
- URL: http://arxiv.org/abs/2112.02275v1
- Date: Sat, 4 Dec 2021 08:11:55 GMT
- Title: A Multi-Strategy based Pre-Training Method for Cold-Start Recommendation
- Authors: Bowen Hao, Hongzhi Yin, Jing Zhang, Cuiping Li, and Hong Chen
- Abstract summary: The cold-start problem is a fundamental challenge for recommendation tasks.
A recent self-supervised learning (SSL) model on Graph Neural Networks (GNNs), PT-GNN, pre-trains the GNN to reconstruct cold-start embeddings.
We propose a multi-strategy based pre-training method for cold-start recommendation (MPT), which extends PT-GNN from the perspective of model architecture and pretext tasks.
- Score: 28.337475919795008
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: The cold-start problem is a fundamental challenge for recommendation
tasks. The recent self-supervised learning (SSL) model on Graph Neural Networks
(GNNs), PT-GNN, pre-trains the GNN model to reconstruct the cold-start
embeddings and has shown great potential for cold-start recommendation.
However, due to the over-smoothing problem, PT-GNN can capture at most
3rd-order relations, which cannot provide much useful auxiliary information to
depict the target cold-start user or item. Besides, the embedding
reconstruction task only considers the intra-correlations within the subgraph
of users and items, while ignoring the inter-correlations across different
subgraphs. To solve these challenges, we propose a multi-strategy-based
pre-training method for cold-start recommendation (MPT), which extends PT-GNN
in terms of both model architecture and pretext tasks to improve cold-start
recommendation performance. Specifically, in terms of the model architecture,
in addition to the short-range dependencies of users and items captured by the
GNN encoder, we introduce a Transformer encoder to capture long-range
dependencies. In terms of the pretext tasks, in addition to considering the
intra-correlations of users and items via the embedding reconstruction task, we
add an embedding contrastive learning task to capture the inter-correlations of
users and items. We train the GNN and Transformer encoders on these pretext
tasks under the meta-learning setting to simulate the real cold-start scenario,
so that the model can be easily and rapidly adapted to new cold-start users and
items. Experiments on three public recommendation datasets show the superiority
of the proposed MPT model over vanilla GNN models and the pre-training GNN
model, on both user/item embedding inference and the recommendation task.
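The two pretext objectives described above can be sketched as follows (a hypothetical numpy illustration, with plain vectors standing in for the GNN/Transformer encoder outputs; the InfoNCE form of the contrastive loss and the 0.5 task weight are assumptions, not the paper's exact formulation):

```python
import numpy as np

def reconstruction_loss(pred_emb, true_emb):
    """Intra-correlation pretext task: mean squared error between the
    embedding reconstructed from a user/item subgraph and the ground-truth
    embedding (the PT-GNN-style reconstruction objective)."""
    return float(np.mean((pred_emb - true_emb) ** 2))

def info_nce_loss(anchor, positive, negatives, tau=0.1):
    """Inter-correlation pretext task: an InfoNCE-style contrastive loss
    that pulls two views of the same user/item together and pushes apart
    views of different users/items across subgraphs."""
    def cos(a, b):
        return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))
    logits = np.array([cos(anchor, positive)]
                      + [cos(anchor, n) for n in negatives]) / tau
    logits -= logits.max()                      # numerical stability
    probs = np.exp(logits) / np.exp(logits).sum()
    return float(-np.log(probs[0]))             # positive sits at index 0

rng = np.random.default_rng(0)
d = 8
target = rng.normal(size=d)                     # ground-truth embedding
pred = target + 0.1 * rng.normal(size=d)        # encoder's reconstruction
neg = [rng.normal(size=d) for _ in range(5)]    # other users/items

# Joint pretext objective: reconstruction + weighted contrastive term.
loss = reconstruction_loss(pred, target) + 0.5 * info_nce_loss(pred, target, neg)
print(round(loss, 4))
```

In the paper the two encoders are trained on these tasks jointly; here a single vector pair stands in for both encoders' outputs.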
Related papers
- Graph Neural Patching for Cold-Start Recommendations [16.08395433358279]
We introduce Graph Neural Patching for Cold-Start Recommendations (GNP)
GNP is a customized GNN framework with dual functionalities: GWarmer, which models collaborative signals among existing warm users/items, and Patching Networks, which simulate and enhance GWarmer's performance on cold-start recommendations.
Extensive experiments on three benchmark datasets confirm GNP's superiority in recommending both warm and cold users/items.
arXiv Detail & Related papers (2024-10-18T07:44:12Z) - Linear-Time Graph Neural Networks for Scalable Recommendations [50.45612795600707]
The key task of recommender systems is to forecast users' future behaviors based on previous user-item interactions.
Recent years have witnessed a rising interest in leveraging Graph Neural Networks (GNNs) to boost the prediction performance of recommender systems.
We propose a Linear-Time Graph Neural Network (LTGNN) to scale up GNN-based recommender systems, achieving scalability comparable to classic matrix factorization (MF) approaches.
arXiv Detail & Related papers (2024-02-21T17:58:10Z) - A Social-aware Gaussian Pre-trained Model for Effective Cold-start
Recommendation [25.850274659792305]
We propose a novel recommendation model, the Social-aware Gaussian Pre-trained model (SGP), which encodes user social relations and interaction data in a Graph Neural Network (GNN) at the pre-training stage.
Our experiments on three public datasets show that, compared to 16 competitive baselines, our SGP model significantly outperforms the best baseline by up to 7.7% in terms of NDCG@10.
In addition, we show that SGP effectively alleviates the cold-start problem, especially when users newly register to the system through their friends' suggestions.
arXiv Detail & Related papers (2023-11-27T13:04:33Z) - Meta-Learning with Adaptive Weighted Loss for Imbalanced Cold-Start
Recommendation [4.379304291229695]
We propose a novel sequential recommendation framework based on gradient-based meta-learning.
Our work is the first to tackle the impact of imbalanced ratings in cold-start sequential recommendation scenarios.
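The gradient-based meta-learning such frameworks build on can be illustrated with a first-order, MAML-style loop (the 1-D regression tasks, learning rates, and toy model below are illustrative assumptions, not this paper's setup):

```python
import numpy as np

# Toy gradient-based meta-learning on 1-D linear tasks y = a * x.
# Each sampled "task" plays the role of one cold-start user with only a
# few interactions (support set) and held-out interactions (query set).
rng = np.random.default_rng(1)

def loss_grad(w, x, y):
    """Gradient of mean squared error for the model y_hat = w * x."""
    return np.mean(2 * (w * x - y) * x)

w_meta, inner_lr, outer_lr = 0.0, 0.05, 0.1
for _ in range(200):
    a = rng.uniform(0.5, 1.5)                   # sample a task (a "user")
    x_s, x_q = rng.normal(size=5), rng.normal(size=5)
    y_s, y_q = a * x_s, a * x_q
    # Inner loop: one gradient step of adaptation on the support set.
    w_fast = w_meta - inner_lr * loss_grad(w_meta, x_s, y_s)
    # First-order outer update: evaluate adapted weights on the query set.
    w_meta -= outer_lr * loss_grad(w_fast, x_q, y_q)
print(round(w_meta, 2))
```

With these settings the meta-parameter typically ends up near the mean task slope of 1.0, i.e. an initialization from which one gradient step adapts well to any sampled task; the paper additionally reweights the loss to handle imbalanced ratings.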
arXiv Detail & Related papers (2023-02-28T15:18:42Z) - GPatch: Patching Graph Neural Networks for Cold-Start Recommendations [20.326139541161194]
Cold start is an essential and persistent problem in recommender systems.
State-of-the-art solutions rely on training hybrid models for both cold-start and existing users/items.
We propose a tailored GNN-based framework (GPatch) that contains two separate but correlated components.
arXiv Detail & Related papers (2022-09-25T13:16:39Z) - An Adaptive Graph Pre-training Framework for Localized Collaborative
Filtering [79.17319280791237]
We propose an adaptive graph pre-training framework for localized collaborative filtering (ADAPT).
ADAPT does not require transferring user/item embeddings, and captures both the common knowledge across different graphs and the uniqueness of each graph.
arXiv Detail & Related papers (2021-12-14T06:53:13Z) - Learning to Learn a Cold-start Sequential Recommender [70.5692886883067]
Cold-start recommendation is a pressing problem in contemporary online applications.
We propose a meta-learning based cold-start sequential recommendation framework called metaCSR.
metaCSR is able to learn common patterns from regular users' behaviors.
arXiv Detail & Related papers (2021-10-18T08:11:24Z) - Causal Incremental Graph Convolution for Recommender System Retraining [89.25922726558875]
Real-world recommender systems need to be regularly retrained to keep up with new data.
In this work, we consider how to efficiently retrain graph convolution network (GCN) based recommender models.
arXiv Detail & Related papers (2021-08-16T04:20:09Z) - Privileged Graph Distillation for Cold Start Recommendation [57.918041397089254]
The cold start problem in recommender systems requires recommending to new users (items) based on attributes without any historical interaction records.
We propose a privileged graph distillation model (PGD).
Our proposed model is generally applicable to different cold-start scenarios involving new users, new items, or new user-new item pairs.
arXiv Detail & Related papers (2021-05-31T14:05:27Z) - LightGCN: Simplifying and Powering Graph Convolution Network for
Recommendation [100.76229017056181]
Graph Convolution Network (GCN) has become the new state-of-the-art for collaborative filtering.
In this work, we aim to simplify the design of GCN to make it more concise and appropriate for recommendation.
We propose a new model named LightGCN, including only the most essential component in GCN -- neighborhood aggregation.
arXiv Detail & Related papers (2020-02-06T06:53:42Z)
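The neighborhood aggregation that LightGCN retains is symmetric-normalized propagation, E^(k+1) = D^(-1/2) A D^(-1/2) E^(k), with the final embedding an average over layers; a toy numpy sketch (the graph, embedding size, and layer count are made up for illustration):

```python
import numpy as np

# LightGCN-style propagation on a toy bipartite graph: 2 users, 3 items.
# There is no feature transformation or nonlinearity -- only the
# symmetric-normalized neighborhood aggregation is kept.
R = np.array([[1, 1, 0],
              [0, 1, 1]], dtype=float)         # user-item interactions
n_u, n_i = R.shape
A = np.zeros((n_u + n_i, n_u + n_i))           # full (user+item) adjacency
A[:n_u, n_u:] = R
A[n_u:, :n_u] = R.T

deg = A.sum(axis=1)
D_inv_sqrt = np.diag(1.0 / np.sqrt(deg))
A_hat = D_inv_sqrt @ A @ D_inv_sqrt            # D^(-1/2) A D^(-1/2)

rng = np.random.default_rng(0)
E = rng.normal(size=(n_u + n_i, 4))            # initial ID embeddings
layers = [E]
for _ in range(3):                             # K = 3 propagation layers
    layers.append(A_hat @ layers[-1])
E_final = np.mean(layers, axis=0)              # combine layers by averaging
print(E_final.shape)
```

Each row of `E_final` is a user or item embedding; recommendation scores are then inner products between user and item rows.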
This list is automatically generated from the titles and abstracts of the papers on this site.
The site does not guarantee the quality of this information and is not responsible for any consequences of its use.