CSGCL: Community-Strength-Enhanced Graph Contrastive Learning
- URL: http://arxiv.org/abs/2305.04658v1
- Date: Mon, 8 May 2023 12:21:24 GMT
- Title: CSGCL: Community-Strength-Enhanced Graph Contrastive Learning
- Authors: Han Chen, Ziwen Zhao, Yuhua Li, Yixiong Zou, Ruixuan Li, Rui Zhang
- Abstract summary: We propose a Community-Strength-enhanced Graph Contrastive Learning (CSGCL) framework to preserve community strength throughout the learning process.
We present two novel graph augmentation methods, Communal Attribute Voting (CAV) and Communal Edge Dropping (CED), where the perturbations of node attributes and edges are guided by community strength.
We report extensive experiment results on three downstream tasks: node classification, node clustering, and link prediction.
- Score: 13.770188320382285
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Graph Contrastive Learning (GCL) is an effective way to learn generalized
graph representations in a self-supervised manner, and has grown rapidly in
recent years. However, the underlying community semantics has not been well
explored by most previous GCL methods. Prior work that leverages communities
in GCL treats them all as having the same influence on the graph, leading to
extra representation errors. To tackle this issue, we define
"community strength" to measure the difference of influence among
communities. Under this premise, we propose a Community-Strength-enhanced Graph
Contrastive Learning (CSGCL) framework to preserve community strength
throughout the learning process. Firstly, we present two novel graph
augmentation methods, Communal Attribute Voting (CAV) and Communal Edge
Dropping (CED), where the perturbations of node attributes and edges are guided
by community strength. Secondly, we propose a dynamic "Team-up" contrastive
learning scheme, where community strength is used to progressively fine-tune
the contrastive objective. We report extensive experiment results on three
downstream tasks: node classification, node clustering, and link prediction.
CSGCL achieves state-of-the-art performance compared with other GCL methods,
validating that community strength brings effectiveness and generality to graph
representations. Our code is available at
https://github.com/HanChen-HUST/CSGCL.
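The authors' code is linked above. As a rough, stand-alone illustration of the idea behind Communal Edge Dropping (CED) — perturbing edges less inside strong communities so their structure survives augmentation — here is a minimal sketch. The specific strength-to-probability mapping below is our own assumption for illustration, not the formula from the paper:

```python
import random

def communal_edge_dropping(edges, community, strength, base_rate=0.3, rng=None):
    """Drop edges with probabilities guided by community strength.

    edges:     list of (u, v) pairs
    community: dict mapping node -> community id
    strength:  dict mapping community id -> strength in [0, 1]

    Intra-community edges of strong communities are kept more often,
    so the augmented view preserves the strongest community structure.
    """
    rng = rng or random.Random(0)  # fixed seed for reproducibility
    kept = []
    for u, v in edges:
        cu, cv = community[u], community[v]
        if cu == cv:
            # intra-community edge: strong community -> low drop probability
            p_drop = base_rate * (1.0 - strength[cu])
        else:
            # inter-community edge: drop more aggressively overall
            p_drop = base_rate + base_rate * (strength[cu] + strength[cv]) / 2
        p_drop = min(max(p_drop, 0.0), 1.0)
        if rng.random() >= p_drop:
            kept.append((u, v))
    return kept
```

For example, with a single community of strength 1.0, every intra-community edge has drop probability zero and the edge list passes through unchanged; weaker communities are perturbed more heavily.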
Related papers
- Community-Invariant Graph Contrastive Learning [21.72222875193335]
This research investigates the role of the graph community in graph augmentation.
We propose a community-invariant GCL framework to maintain graph community structure during learnable graph augmentation.
arXiv Detail & Related papers (2024-05-02T14:59:58Z)
- Community-Aware Efficient Graph Contrastive Learning via Personalized Self-Training [27.339318501446115]
We propose a Community-aware Efficient Graph Contrastive Learning Framework (CEGCL) to jointly learn community partition and node representations in an end-to-end manner.
We show that our CEGCL exhibits state-of-the-art performance on three benchmark datasets with different scales.
arXiv Detail & Related papers (2023-11-18T13:45:21Z)
- Provable Training for Graph Contrastive Learning [58.8128675529977]
Graph Contrastive Learning (GCL) has emerged as a popular training approach for learning node embeddings from augmented graphs without labels.
We show that the training of GCL is indeed imbalanced across all nodes.
We propose the metric "node compactness", a lower bound on how well a node follows the GCL principle.
arXiv Detail & Related papers (2023-09-25T08:23:53Z)
- From Cluster Assumption to Graph Convolution: Graph-based Semi-Supervised Learning Revisited [51.24526202984846]
Graph-based semi-supervised learning (GSSL) has long been a hot research topic.
Graph convolutional networks (GCNs) have become the predominant technique thanks to their promising performance.
arXiv Detail & Related papers (2023-09-24T10:10:21Z)
- Similarity Preserving Adversarial Graph Contrastive Learning [5.671825576834061]
We propose a similarity-preserving adversarial graph contrastive learning framework.
In this paper, we show that SP-AGCL achieves a competitive performance on several downstream tasks.
arXiv Detail & Related papers (2023-06-24T04:02:50Z)
- HomoGCL: Rethinking Homophily in Graph Contrastive Learning [64.85392028383164]
HomoGCL is a model-agnostic framework to expand the positive set using neighbor nodes with neighbor-specific significances.
We show that HomoGCL yields multiple state-of-the-art results across six public datasets.
arXiv Detail & Related papers (2023-06-16T04:06:52Z)
- Graph Contrastive Learning for Skeleton-based Action Recognition [85.86820157810213]
We propose a graph contrastive learning framework for skeleton-based action recognition.
SkeletonGCL associates graph learning across sequences by enforcing graphs to be class-discriminative.
SkeletonGCL establishes a new training paradigm, and it can be seamlessly incorporated into current graph convolutional networks.
arXiv Detail & Related papers (2023-01-26T02:09:16Z)
- Unifying Graph Contrastive Learning with Flexible Contextual Scopes [57.86762576319638]
We present a self-supervised learning method termed Unifying Graph Contrastive Learning with Flexible Contextual Scopes (UGCL for short).
Our algorithm builds flexible contextual representations with contextual scopes by controlling the power of an adjacency matrix.
Based on representations from both local and contextual scopes, UGCL optimises a simple contrastive loss function for graph representation learning.
arXiv Detail & Related papers (2022-10-17T07:16:17Z)
- Uncovering the Structural Fairness in Graph Contrastive Learning [87.65091052291544]
Graph contrastive learning (GCL) has emerged as a promising self-supervised approach for learning node representations.
We show that representations obtained by GCL methods are already fairer to degree bias than those learned by GCN.
We devise a novel graph augmentation method, called GRAph contrastive learning for DEgree bias (GRADE), which applies different strategies to low- and high-degree nodes.
arXiv Detail & Related papers (2022-10-06T15:58:25Z)
- Graph Communal Contrastive Learning [34.85906025283825]
A fundamental problem for graph representation learning is how to effectively learn representations without human labeling.
We propose a novel Graph Communal Contrastive Learning (gCooL) framework to jointly learn the community partition and node representations.
arXiv Detail & Related papers (2021-10-28T02:57:54Z)