Pre-training Graph Neural Network for Cross Domain Recommendation
- URL: http://arxiv.org/abs/2111.08268v1
- Date: Tue, 16 Nov 2021 07:34:42 GMT
- Title: Pre-training Graph Neural Network for Cross Domain Recommendation
- Authors: Chen Wang, Yueqing Liang, Zhiwei Liu, Tao Zhang, Philip S. Yu
- Abstract summary: A recommender system predicts users' potential interests in items, where the core is to learn user/item embeddings.
Inspired by recent advances in pre-training for graph representation learning, we propose a pre-training and fine-tuning paradigm for cross-domain recommendation.
We devise a novel Pre-training Graph Neural Network for Cross-Domain Recommendation (PCRec), which adopts the contrastive self-supervised pre-training of a graph encoder.
- Score: 58.057687253631826
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: A recommender system predicts users' potential interests in items, where the
core is to learn user/item embeddings. Nevertheless, it suffers from the
data-sparsity issue, which cross-domain recommendation can alleviate.
However, most prior works either jointly learn the source-domain and
target-domain models or require side features, and both choices can hurt
prediction on the target domain because the learned embeddings are dominated
by the source domain and carry its biases. Inspired by recent advances in
pre-training for graph representation learning, we propose a pre-training and
fine-tuning paradigm for cross-domain recommendation. We devise a novel
Pre-training Graph Neural Network for Cross-Domain Recommendation (PCRec),
which adopts contrastive self-supervised pre-training of a graph encoder. We
then transfer the pre-trained graph encoder to initialize the node embeddings
on the target domain, which benefits the fine-tuning of the single-domain
recommender system on the target domain. Experimental results demonstrate the
superiority of PCRec, and detailed analyses verify its effectiveness in
transferring information while avoiding biases from the source domain.
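To make the described two-stage recipe concrete, here is a minimal PyTorch-style sketch: a graph encoder is pre-trained with a contrastive (NT-Xent) objective over two edge-dropped views of the source graph, then applied to the target graph to initialize node embeddings for fine-tuning. All names (GraphEncoder, nt_xent, drop_edges), sizes, and the toy random graphs are illustrative assumptions, not the authors' released code.
```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class GraphEncoder(nn.Module):
    """Two-layer GCN-style encoder over trainable node embeddings."""
    def __init__(self, num_nodes, dim):
        super().__init__()
        self.emb = nn.Embedding(num_nodes, dim)   # trainable node embeddings
        self.w1 = nn.Linear(dim, dim)
        self.w2 = nn.Linear(dim, dim)

    def forward(self, adj):
        h = F.relu(self.w1(adj @ self.emb.weight))  # one propagation step
        return self.w2(adj @ h)                     # node representations

def nt_xent(z1, z2, tau=0.5):
    """Contrastive (InfoNCE) loss between two augmented views of the graph."""
    z1, z2 = F.normalize(z1, dim=1), F.normalize(z2, dim=1)
    logits = z1 @ z2.t() / tau                    # all-pairs similarities
    labels = torch.arange(z1.size(0))             # positives on the diagonal
    return F.cross_entropy(logits, labels)

def drop_edges(adj, p=0.2):
    """Simple augmentation: randomly drop edges to create a second view."""
    return adj * (torch.rand_like(adj) > p).float()

# --- Stage 1: contrastive self-supervised pre-training on the source graph ---
n, d = 100, 32
src_adj = (torch.rand(n, n) < 0.05).float()       # toy source interaction graph
encoder = GraphEncoder(n, d)
opt = torch.optim.Adam(encoder.parameters(), lr=1e-3)
for _ in range(10):
    loss = nt_xent(encoder(drop_edges(src_adj)), encoder(drop_edges(src_adj)))
    opt.zero_grad(); loss.backward(); opt.step()

# --- Stage 2: transfer the pre-trained encoder to the target domain ---
tgt_adj = (torch.rand(n, n) < 0.05).float()       # toy target interaction graph
with torch.no_grad():
    init_emb = encoder(tgt_adj)                   # initial target node embeddings
# `init_emb` would seed a single-domain recommender, then be fine-tuned as usual.
```
Per the abstract, the transferred encoder only initializes the target-domain embeddings; the recommender is then fine-tuned as an ordinary single-domain model.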
Related papers
- Adapting to Distribution Shift by Visual Domain Prompt Generation [34.19066857066073]
We adapt a model at test time using a few unlabeled samples to address distribution shifts.
We build a knowledge bank to learn the transferable knowledge from source domains.
The proposed method outperforms previous work on 5 large-scale benchmarks including WILDS and DomainNet.
arXiv Detail & Related papers (2024-05-05T02:44:04Z)
- Towards Generalised Pre-Training of Graph Models [0.0]
We present Topology Only Pre-Training, a graph pre-training method based on node and edge feature exclusion.
Models show positive transfer on evaluation datasets from multiple domains, including domains not present in the pre-training data.
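As a rough illustration of feature exclusion, the sketch below (an assumption, not the paper's code) discards node/edge features and feeds a GNN purely structural inputs derived from the adjacency matrix:
```python
import torch

def topology_only_features(adj: torch.Tensor) -> torch.Tensor:
    """Discard original node features; derive inputs from structure alone."""
    deg = adj.sum(dim=1, keepdim=True)                   # node degree
    tri = ((adj @ adj) * adj).sum(dim=1, keepdim=True)   # closed 2-paths (triangle signal)
    return torch.cat([deg, tri], dim=1)                  # structural feature matrix

adj = (torch.rand(50, 50) < 0.1).float()
adj = ((adj + adj.t()) > 0).float()                      # symmetrize the toy graph
x = topology_only_features(adj)                          # input to any GNN for pre-training
```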
arXiv Detail & Related papers (2023-11-07T13:24:01Z)
- Connect, Not Collapse: Explaining Contrastive Learning for Unsupervised Domain Adaptation [88.5448806952394]
We consider unsupervised domain adaptation (UDA), where labeled data from a source domain and unlabeled data from a target domain are used to learn a classifier for the target domain.
We show that contrastive pre-training, which learns features on unlabeled source and target data and then fine-tunes on labeled source data, is competitive with strong UDA methods.
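A minimal sketch of that recipe, contrastive pre-training on pooled unlabeled source + target data followed by fine-tuning on labeled source only; the model, stand-in augmentation, and toy data are assumptions, not the paper's code:
```python
import torch
import torch.nn as nn
import torch.nn.functional as F

encoder = nn.Sequential(nn.Linear(64, 128), nn.ReLU(), nn.Linear(128, 32))
head = nn.Linear(32, 10)                          # classifier fine-tuned later

def augment(x):                                   # stand-in augmentation
    return x + 0.1 * torch.randn_like(x)

def info_nce(z1, z2, tau=0.1):
    z1, z2 = F.normalize(z1, dim=1), F.normalize(z2, dim=1)
    return F.cross_entropy(z1 @ z2.t() / tau, torch.arange(len(z1)))

unlabeled = torch.randn(256, 64)                  # pooled source + target inputs
opt = torch.optim.Adam(encoder.parameters(), lr=1e-3)
for _ in range(5):                                # Stage 1: contrastive pre-training
    loss = info_nce(encoder(augment(unlabeled)), encoder(augment(unlabeled)))
    opt.zero_grad(); loss.backward(); opt.step()

src_x, src_y = torch.randn(128, 64), torch.randint(0, 10, (128,))
opt = torch.optim.Adam(list(encoder.parameters()) + list(head.parameters()), lr=1e-3)
for _ in range(5):                                # Stage 2: fine-tune on labeled source
    loss = F.cross_entropy(head(encoder(src_x)), src_y)
    opt.zero_grad(); loss.backward(); opt.step()
```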
arXiv Detail & Related papers (2022-04-01T16:56:26Z)
- A Broad Study of Pre-training for Domain Generalization and Adaptation [69.38359595534807]
We provide a broad study and in-depth analysis of pre-training for domain adaptation and generalization.
We observe that simply using a state-of-the-art backbone outperforms existing state-of-the-art domain adaptation baselines.
arXiv Detail & Related papers (2022-03-22T15:38:36Z)
- Ranking Distance Calibration for Cross-Domain Few-Shot Learning [91.22458739205766]
Recent progress in few-shot learning promotes a more realistic cross-domain setting.
Due to the domain gap and disjoint label spaces between source and target datasets, their shared knowledge is extremely limited.
We employ a re-ranking process for calibrating a target distance matrix by discovering the reciprocal k-nearest neighbours within the task.
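A minimal sketch of discovering reciprocal k-nearest neighbours and calibrating a distance matrix; the specific calibration rule here (shrinking distances of mutual neighbours by a factor alpha) is an illustrative assumption, not the paper's exact procedure:
```python
import torch

def reciprocal_knn_calibrate(dist: torch.Tensor, k: int = 5, alpha: float = 0.5):
    """Shrink distances between mutually k-nearest pairs within the task."""
    n = dist.size(0)
    knn = dist.topk(k + 1, largest=False).indices[:, 1:]  # skip self at index 0
    nn_mask = torch.zeros(n, n, dtype=torch.bool)
    nn_mask[torch.arange(n).unsqueeze(1), knn] = True     # i -> j is a k-NN edge
    reciprocal = nn_mask & nn_mask.t()                    # keep mutual neighbours only
    return torch.where(reciprocal, alpha * dist, dist)    # calibrated matrix

x = torch.randn(20, 8)
dist = torch.cdist(x, x)                                  # pairwise distances
calibrated = reciprocal_knn_calibrate(dist)
```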
arXiv Detail & Related papers (2021-12-01T03:36:58Z)
- Unified Instance and Knowledge Alignment Pretraining for Aspect-based Sentiment Analysis [96.53859361560505]
Aspect-based Sentiment Analysis (ABSA) aims to determine the sentiment polarity towards an aspect.
There always exists severe domain shift between the pretraining and downstream ABSA datasets.
We introduce a unified alignment pretraining framework into the vanilla pretrain-finetune pipeline.
arXiv Detail & Related papers (2021-10-26T04:03:45Z)
- Efficient Variational Graph Autoencoders for Unsupervised Cross-domain Prerequisite Chains [3.358838755118655]
We introduce Domain Adversarial Variational Graph Autoencoders (DAVGAE) to solve the cross-domain prerequisite chain learning task efficiently.
Our novel model consists of a variational graph autoencoder (VGAE) and a domain discriminator.
Results show that our model outperforms recent graph-based baselines while using only 1/10 of the graph scale and 1/3 of the computation time.
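A minimal sketch of the described combination, a VGAE paired with a domain discriminator; the layer sizes, inner-product edge decoder, and single combined loss are assumptions, not the DAVGAE implementation:
```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class VGAE(nn.Module):
    def __init__(self, in_dim, hid, zdim):
        super().__init__()
        self.base = nn.Linear(in_dim, hid)
        self.mu = nn.Linear(hid, zdim)
        self.logvar = nn.Linear(hid, zdim)

    def forward(self, adj, x):
        h = F.relu(self.base(adj @ x))            # GCN-style propagation step
        mu, logvar = self.mu(h), self.logvar(h)
        z = mu + torch.randn_like(mu) * (0.5 * logvar).exp()  # reparameterize
        recon = torch.sigmoid(z @ z.t())          # inner-product edge decoder
        return z, mu, logvar, recon

model = VGAE(8, 32, 16)
disc = nn.Linear(16, 2)                           # domain discriminator on z
adj = (torch.rand(30, 30) < 0.1).float()          # toy concept graph
x = torch.randn(30, 8)
domain = torch.randint(0, 2, (30,))               # source/target label per node

z, mu, logvar, recon = model(adj, x)
recon_loss = F.binary_cross_entropy(recon, adj)
kl = -0.5 * torch.mean(1 + logvar - mu.pow(2) - logvar.exp())
adv = F.cross_entropy(disc(z), domain)            # discriminator objective
loss = recon_loss + kl - adv                      # encoder learns to fool the disc
```
In practice the discriminator and encoder would be updated in alternation (or via gradient reversal) rather than through one combined loss.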
arXiv Detail & Related papers (2021-09-17T19:07:27Z)
- Unsupervised Cross-Domain Prerequisite Chain Learning using Variational Graph Autoencoders [2.735701323590668]
We propose unsupervised cross-domain concept prerequisite chain learning using an optimized variational graph autoencoder.
Our model learns to transfer concept prerequisite relations from an information-rich domain to an information-poor domain.
Also, we expand an existing dataset by introducing two new domains: CV and Bioinformatics.
arXiv Detail & Related papers (2021-05-07T21:02:41Z)
- Supervised Domain Adaptation using Graph Embedding [86.3361797111839]
Domain adaptation methods assume that distributions between the two domains are shifted and attempt to realign them.
We propose a generic framework based on graph embedding.
We show that the proposed approach leads to a powerful Domain Adaptation framework.
arXiv Detail & Related papers (2020-03-09T12:25:13Z)
- Unsupervised Domain Adaptive Object Detection using Forward-Backward Cyclic Adaptation [13.163271874039191]
We present a novel approach to unsupervised domain adaptation for object detection through forward-backward cyclic (FBC) training.
Recent adversarial training based domain adaptation methods have shown their effectiveness on minimizing domain discrepancy via marginal feature distributions alignment.
We propose Forward-Backward Cyclic Adaptation, which iteratively computes adaptation from source to target via backward hopping and from target to source via forward passing.
arXiv Detail & Related papers (2020-02-03T06:24:58Z)