LightGCL: Simple Yet Effective Graph Contrastive Learning for Recommendation
- URL: http://arxiv.org/abs/2302.08191v3
- Date: Wed, 14 Jun 2023 14:25:15 GMT
- Title: LightGCL: Simple Yet Effective Graph Contrastive Learning for Recommendation
- Authors: Xuheng Cai, Chao Huang, Lianghao Xia, Xubin Ren
- Abstract summary: Graph neural network (GNN) is a powerful learning approach for graph-based recommender systems.
In this paper, we propose a simple yet effective graph contrastive learning paradigm LightGCL.
- Score: 9.181689366185038
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Graph neural network (GNN) is a powerful learning approach for graph-based
recommender systems. Recently, GNNs integrated with contrastive learning have
shown superior performance in recommendation with their data augmentation
schemes, aiming at dealing with highly sparse data. Despite their success, most
existing graph contrastive learning methods either perform stochastic
augmentation (e.g., node/edge perturbation) on the user-item interaction graph,
or rely on the heuristic-based augmentation techniques (e.g., user clustering)
for generating contrastive views. We argue that these methods cannot well
preserve the intrinsic semantic structures and are easily biased by the noise
perturbation. In this paper, we propose a simple yet effective graph
contrastive learning paradigm LightGCL that mitigates these issues impairing
the generality and robustness of CL-based recommenders. Our model exclusively
utilizes singular value decomposition for contrastive augmentation, which
enables the unconstrained structural refinement with global collaborative
relation modeling. Experiments conducted on several benchmark datasets
demonstrate the significant improvement in performance of our model over the
state-of-the-arts. Further analyses demonstrate the superiority of LightGCL's
robustness against data sparsity and popularity bias. The source code of our
model is available at https://github.com/HKUDS/LightGCL.
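The core idea of the abstract, using a truncated SVD of the interaction graph as the contrastive view, can be sketched as follows. This is an illustrative toy reconstruction under assumed data (the matrix `A` and the function name `svd_augmented_view` are hypothetical), not the paper's actual implementation.

```python
import numpy as np

# Hypothetical toy user-item interaction matrix (4 users x 5 items).
A = np.array([
    [1, 0, 1, 0, 0],
    [1, 1, 0, 0, 0],
    [0, 0, 1, 1, 0],
    [0, 0, 0, 1, 1],
], dtype=float)

def svd_augmented_view(adj: np.ndarray, rank: int) -> np.ndarray:
    """Build a low-rank reconstruction of the interaction graph.

    Keeping only the top-`rank` singular values retains the dominant
    global collaborative signal while discarding noisy interactions --
    a sketch of the SVD-based contrastive view described in the abstract.
    """
    U, s, Vt = np.linalg.svd(adj, full_matrices=False)
    return (U[:, :rank] * s[:rank]) @ Vt[:rank, :]

# The reconstructed view is dense: every user-item pair receives a
# (possibly small) score, injecting global structure that the sparse
# original graph lacks.
A_hat = svd_augmented_view(A, rank=2)
```

Contrasting node embeddings propagated on `A` against those propagated on `A_hat` then avoids the stochastic edge-dropping that the abstract argues distorts semantic structure.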
Related papers
- GRE^2-MDCL: Graph Representation Embedding Enhanced via Multidimensional Contrastive Learning [0.0]
Graph representation learning has emerged as a powerful tool for preserving graph topology when mapping nodes to vector representations.
Current graph neural network models face the challenge of requiring extensive labeled data.
We propose Graph Representation Embedding Enhanced via Multidimensional Contrastive Learning.
arXiv Detail & Related papers (2024-09-12T03:09:05Z)
- Dual-Channel Latent Factor Analysis Enhanced Graph Contrastive Learning for Recommendation [2.9449497738046078]
Graph Neural Networks (GNNs) are powerful learning methods for recommender systems.
Recently, the integration of contrastive learning with GNNs has demonstrated remarkable performance in recommender systems.
This study proposes a latent factor analysis (LFA) enhanced GCL approach, named LFA-GCL.
arXiv Detail & Related papers (2024-08-09T03:24:48Z)
- Efficient Heterogeneous Graph Learning via Random Projection [58.4138636866903]
Heterogeneous Graph Neural Networks (HGNNs) are powerful tools for deep learning on heterogeneous graphs.
Recent pre-computation-based HGNNs use one-time message passing to transform a heterogeneous graph into regular-shaped tensors.
We propose a hybrid pre-computation-based HGNN, named Random Projection Heterogeneous Graph Neural Network (RpHGNN).
arXiv Detail & Related papers (2023-10-23T01:25:44Z)
- Graph Masked Autoencoder for Sequential Recommendation [10.319298705782058]
We propose a Graph Masked AutoEncoder-enhanced sequential Recommender system (MAERec) that adaptively and dynamically distills global item transitional information for self-supervised augmentation.
Our method significantly outperforms state-of-the-art baseline models and can learn more accurate representations against data noise and sparsity.
arXiv Detail & Related papers (2023-05-08T10:57:56Z)
- Adversarial Learning Data Augmentation for Graph Contrastive Learning in Recommendation [56.10351068286499]
We propose Learnable Data Augmentation for Graph Contrastive Learning (LDA-GCL).
Our methods include data augmentation learning and graph contrastive learning, which follow the InfoMin and InfoMax principles, respectively.
In implementation, our methods optimize the adversarial loss function to learn data augmentation and effective representations of users and items.
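The InfoMax side of such objectives is commonly instantiated as an InfoNCE contrastive loss between two views of the same node. The following is a minimal, generic sketch (the function name and toy embeddings are hypothetical, not taken from LDA-GCL's implementation):

```python
import numpy as np

def info_nce(z1: np.ndarray, z2: np.ndarray, tau: float = 0.2) -> float:
    """InfoNCE contrastive loss between two view embeddings.

    z1[i] and z2[i] are the two augmented views of node i (positive pair);
    all other rows of z2 act as in-batch negatives.
    """
    # L2-normalize so dot products are cosine similarities.
    z1 = z1 / np.linalg.norm(z1, axis=1, keepdims=True)
    z2 = z2 / np.linalg.norm(z2, axis=1, keepdims=True)
    logits = z1 @ z2.T / tau  # (n, n) similarity matrix
    # Log-softmax over each row; the diagonal holds the positive pairs.
    log_prob = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return float(-np.mean(np.diag(log_prob)))

rng = np.random.default_rng(0)
z = rng.normal(size=(8, 16))
# Views that agree (small perturbation) versus unrelated embeddings.
loss_aligned = info_nce(z, z + 0.1 * rng.normal(size=(8, 16)))
loss_random = info_nce(z, rng.normal(size=(8, 16)))
```

Minimizing this loss pulls the two views of each node together (InfoMax), while the learned adversarial augmentation pushes the views apart (InfoMin).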
arXiv Detail & Related papers (2023-02-05T06:55:51Z)
- Localized Contrastive Learning on Graphs [110.54606263711385]
We introduce a simple yet effective contrastive model named Localized Graph Contrastive Learning (Local-GCL).
In spite of its simplicity, Local-GCL achieves quite competitive performance in self-supervised node representation learning tasks on graphs with various scales and properties.
arXiv Detail & Related papers (2022-12-08T23:36:00Z)
- FastGCL: Fast Self-Supervised Learning on Graphs via Contrastive Neighborhood Aggregation [26.07819501316758]
We argue that a better contrastive scheme should be tailored to the characteristics of graph neural networks.
By constructing weighted-aggregated and non-aggregated neighborhood information as positive and negative samples respectively, FastGCL identifies the potential semantic information of data.
Experiments have been conducted on node classification and graph classification tasks, showing that FastGCL has competitive classification performance and significant training speedup.
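The positive/negative construction described in the FastGCL summary can be sketched as below. This is a loose illustration of the stated idea only (the function name, toy graph, and mean aggregation are assumptions, not FastGCL's actual scheme):

```python
import numpy as np

def neighborhood_samples(x: np.ndarray, adj: np.ndarray):
    """Sketch of contrastive sample construction from neighborhoods.

    Positive: aggregated neighborhood features for each node
    (mean aggregation used here for simplicity).
    Negative: the raw, non-aggregated node features.
    """
    deg = adj.sum(axis=1, keepdims=True)
    pos = (adj @ x) / np.maximum(deg, 1)  # aggregated neighborhood view
    neg = x.copy()                        # non-aggregated view
    return pos, neg

# Toy 3-node star graph: node 0 connects to nodes 1 and 2.
adj = np.array([[0, 1, 1],
                [1, 0, 0],
                [1, 0, 0]], dtype=float)
x = np.eye(3)  # one-hot node features
pos, neg = neighborhood_samples(x, adj)
```

The appeal of this scheme, as the summary suggests, is that both views come from a single message-passing pass, avoiding the cost of generating and re-encoding perturbed graphs.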
arXiv Detail & Related papers (2022-05-02T13:33:43Z)
- Towards Unsupervised Deep Graph Structure Learning [67.58720734177325]
We propose an unsupervised graph structure learning paradigm, where the learned graph topology is optimized by data itself without any external guidance.
Specifically, we generate a learning target from the original data as an "anchor graph", and use a contrastive loss to maximize the agreement between the anchor graph and the learned graph.
arXiv Detail & Related papers (2022-01-17T11:57:29Z)
- Graph Contrastive Learning with Augmentations [109.23158429991298]
We propose a graph contrastive learning (GraphCL) framework for learning unsupervised representations of graph data.
We show that our framework can produce graph representations of similar or better generalizability, transferability, and robustness compared to state-of-the-art methods.
arXiv Detail & Related papers (2020-10-22T20:13:43Z)
- Iterative Deep Graph Learning for Graph Neural Networks: Better and Robust Node Embeddings [53.58077686470096]
We propose an end-to-end graph learning framework, namely Iterative Deep Graph Learning (IDGL) for jointly and iteratively learning graph structure and graph embedding.
Our experiments show that our proposed IDGL models can consistently outperform or match the state-of-the-art baselines.
arXiv Detail & Related papers (2020-06-21T19:49:15Z)