An Empirical Study of Graph Contrastive Learning
- URL: http://arxiv.org/abs/2109.01116v1
- Date: Thu, 2 Sep 2021 17:43:45 GMT
- Title: An Empirical Study of Graph Contrastive Learning
- Authors: Yanqiao Zhu, Yichen Xu, Qiang Liu, Shu Wu
- Abstract summary: Graph Contrastive Learning establishes a new paradigm for learning graph representations without human annotations.
We identify several critical design considerations within a general GCL paradigm, including augmentation functions, contrasting modes, contrastive objectives, and negative mining techniques.
To foster future research and ease the implementation of GCL algorithms, we develop an easy-to-use library PyGCL, featuring modularized CL components, standardized evaluation, and experiment management.
- Score: 17.246488437677616
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Graph Contrastive Learning (GCL) establishes a new paradigm for learning
graph representations without human annotations. Although remarkable progress
has been witnessed recently, the success behind GCL is still left somewhat
mysterious. In this work, we first identify several critical design
considerations within a general GCL paradigm, including augmentation functions,
contrasting modes, contrastive objectives, and negative mining techniques.
Then, to understand the interplay of different GCL components, we conduct
extensive, controlled experiments over a set of benchmark tasks on datasets
across various domains. Our empirical studies suggest a set of general recipes
for effective GCL, e.g., simple topology augmentations that produce sparse
graph views bring promising performance improvements; contrasting modes should
be aligned with the granularities of end tasks. In addition, to foster future
research and ease the implementation of GCL algorithms, we develop an
easy-to-use library PyGCL, featuring modularized CL components, standardized
evaluation, and experiment management. We envision this work to provide useful
empirical evidence of effective GCL algorithms and offer several insights for
future research.
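The design axes named in the abstract (augmentation functions, contrasting modes, contrastive objectives) can be made concrete with a small sketch. The snippet below is an illustrative toy, not PyGCL's actual API: it pairs a simple topology augmentation (random edge dropping, which yields the sparser graph views the abstract recommends) with an InfoNCE-style objective in a node-to-node contrasting mode. All function names and parameter values are assumptions.

```python
# Toy sketch of a generic GCL recipe: sparse topology augmentation +
# InfoNCE contrastive objective. Illustrative only; not PyGCL's API.
import math
import random

def drop_edges(edges, p=0.2, rng=None):
    """Topology augmentation: keep each edge with prob 1 - p (sparser view)."""
    rng = rng or random.Random(0)
    return [e for e in edges if rng.random() >= p]

def cosine(u, v):
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv)

def info_nce(z1, z2, tau=0.5):
    """InfoNCE in node-node contrasting mode: for node i, z2[i] is the
    positive; every other node in the second view serves as a negative."""
    n = len(z1)
    loss = 0.0
    for i in range(n):
        sims = [math.exp(cosine(z1[i], z2[j]) / tau) for j in range(n)]
        loss += -math.log(sims[i] / sum(sims))
    return loss / n
```

In a real pipeline the two views would be encoded by a shared GNN before the loss is computed; here the embeddings are supplied directly to keep the sketch self-contained.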
Related papers
- Overcoming Pitfalls in Graph Contrastive Learning Evaluation: Toward Comprehensive Benchmarks [60.82579717007963]
We introduce an enhanced evaluation framework designed to more accurately gauge the effectiveness, consistency, and overall capability of Graph Contrastive Learning (GCL) methods.
arXiv Detail & Related papers (2024-02-24T01:47:56Z)
- Rethinking and Simplifying Bootstrapped Graph Latents [48.76934123429186]
Graph contrastive learning (GCL) has emerged as a representative paradigm in graph self-supervised learning.
We present SGCL, a simple yet effective GCL framework that utilizes the outputs from two consecutive iterations as positive pairs.
We show that SGCL can achieve competitive performance with fewer parameters, lower time and space costs, and significant convergence speedup.
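The positive-pair construction summarized above can be sketched generically: align each node's embedding with its own embedding from the previous training iteration, with no negatives and no second augmented view. This is a hedged toy under that reading, not the authors' implementation; the cosine-alignment loss form is an assumption.

```python
# Toy sketch: consecutive-iteration outputs as positive pairs.
# Illustrative only; not SGCL's code.
import math

def cosine(u, v):
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv)

def consecutive_pair_loss(z_prev, z_curr):
    """Pull each node's current embedding toward its embedding from the
    previous iteration: mean_i (1 - cos(z_prev[i], z_curr[i]))."""
    n = len(z_curr)
    return sum(1.0 - cosine(z_prev[i], z_curr[i]) for i in range(n)) / n
```

Because no negatives are sampled, each step costs O(n) similarity computations rather than the O(n^2) of an all-pairs InfoNCE objective, which is consistent with the lower time and space costs claimed above.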
arXiv Detail & Related papers (2023-12-05T09:49:50Z)
- Community-Aware Efficient Graph Contrastive Learning via Personalized Self-Training [27.339318501446115]
We propose a Community-aware Efficient Graph Contrastive Learning Framework (CEGCL) to jointly learn community partition and node representations in an end-to-end manner.
We show that our CEGCL exhibits state-of-the-art performance on three benchmark datasets with different scales.
arXiv Detail & Related papers (2023-11-18T13:45:21Z)
- HomoGCL: Rethinking Homophily in Graph Contrastive Learning [64.85392028383164]
HomoGCL is a model-agnostic framework to expand the positive set using neighbor nodes with neighbor-specific significances.
We show that HomoGCL yields multiple state-of-the-art results across six public datasets.
arXiv Detail & Related papers (2023-06-16T04:06:52Z)
- Adversarial Learning Data Augmentation for Graph Contrastive Learning in Recommendation [56.10351068286499]
We propose Learnable Data Augmentation for Graph Contrastive Learning (LDA-GCL).
Our methods include data augmentation learning and graph contrastive learning, which follow the InfoMin and InfoMax principles, respectively.
In implementation, our methods optimize the adversarial loss function to learn data augmentation and effective representations of users and items.
arXiv Detail & Related papers (2022-12-14T05:04:10Z)
- MA-GCL: Model Augmentation Tricks for Graph Contrastive Learning [41.963242524220654]
We present three easy-to-implement model augmentation tricks for graph contrastive learning (GCL), namely asymmetric, random, and shuffling.
Experimental results show that MA-GCL can achieve state-of-the-art performance on node classification benchmarks.
arXiv Detail & Related papers (2023-02-05T06:55:51Z)
- Unifying Graph Contrastive Learning with Flexible Contextual Scopes [57.86762576319638]
We present a self-supervised learning method termed Unifying Graph Contrastive Learning with Flexible Contextual Scopes (UGCL for short).
Our algorithm builds flexible contextual representations with contextual scopes by controlling the power of an adjacency matrix.
Based on representations from both local and contextual scopes, UGCL optimises a very simple contrastive loss function for graph representation learning.
arXiv Detail & Related papers (2022-10-17T07:16:17Z)
- Heterogeneous Graph Contrastive Multi-view Learning [11.489983916543805]
Graph contrastive learning (GCL) has been developed to learn discriminative node representations on graph datasets.
We propose a novel Heterogeneous Graph Contrastive Multi-view Learning (HGCML) model.
HGCML consistently outperforms state-of-the-art baselines on five real-world benchmark datasets.
arXiv Detail & Related papers (2022-10-01T10:53:48Z)
- GraphCoCo: Graph Complementary Contrastive Learning [65.89743197355722]
Graph Contrastive Learning (GCL) has shown promising performance in graph representation learning (GRL) without the supervision of manual annotations.
This paper proposes an effective graph complementary contrastive learning approach named GraphCoCo to tackle the above issue.
arXiv Detail & Related papers (2022-03-24T02:58:36Z)
- Structural and Semantic Contrastive Learning for Self-supervised Node Representation Learning [32.126228702554144]
Graph Contrastive Learning (GCL) has drawn much research interest for learning generalizable, transferable, and robust node representations in a self-supervised fashion.
In this work, we go beyond the existing unsupervised GCL counterparts and address their limitations by proposing a simple yet effective framework S$3$-CL.
Our experiments demonstrate that the node representations learned by S$3$-CL achieve superior performance on different downstream tasks compared to the state-of-the-art GCL methods.
arXiv Detail & Related papers (2022-02-17T07:20:09Z)
This list is automatically generated from the titles and abstracts of the papers in this site.