Graph Contrastive Learning with Cohesive Subgraph Awareness
- URL: http://arxiv.org/abs/2401.17580v2
- Date: Wed, 21 Feb 2024 16:33:59 GMT
- Title: Graph Contrastive Learning with Cohesive Subgraph Awareness
- Authors: Yucheng Wu, Leye Wang, Xiao Han, and Han-Jia Ye
- Abstract summary: Graph contrastive learning (GCL) has emerged as a state-of-the-art strategy for learning representations of diverse graphs.
We argue that an awareness of cohesive subgraphs during the graph augmentation and learning processes has the potential to enhance GCL performance.
We propose a novel unified framework, CTAug, to seamlessly integrate cohesion awareness into various existing GCL mechanisms.
- Score: 34.76555185419192
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Graph contrastive learning (GCL) has emerged as a state-of-the-art strategy
for learning representations of diverse graphs including social and biomedical
networks. GCL widely uses stochastic graph topology augmentation, such as
uniform node dropping, to generate augmented graphs. However, such stochastic
augmentations may severely damage the intrinsic properties of a graph and
deteriorate the following representation learning process. We argue that
incorporating an awareness of cohesive subgraphs during the graph augmentation
and learning processes has the potential to enhance GCL performance. To this
end, we propose a novel unified framework called CTAug to seamlessly integrate
cohesion awareness into various existing GCL mechanisms. In particular, CTAug
comprises two specialized modules: topology augmentation enhancement and graph
learning enhancement. The former module generates augmented graphs that
carefully preserve cohesion properties, while the latter module bolsters the
graph encoder's ability to discern subgraph patterns. Theoretical analysis
shows that CTAug can strictly improve existing GCL mechanisms. Empirical
experiments verify that CTAug can achieve state-of-the-art performance for
graph representation learning, especially for graphs with high degrees. The
code is available at https://doi.org/10.5281/zenodo.10594093, or
https://github.com/wuyucheng2002/CTAug.
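To make the idea of cohesion-aware topology augmentation concrete, below is a minimal sketch, assuming a k-core as the cohesive subgraph of interest; it is not the authors' CTAug code. The function name, drop probabilities, and the choice of k are illustrative assumptions; the only library calls used are standard NetworkX operations.

```python
# Minimal sketch (assumed, not CTAug's actual implementation): node dropping
# that biases the drop probability so k-core nodes are more likely to survive,
# in contrast to uniform node dropping.
import random
import networkx as nx

def cohesion_aware_node_drop(graph: nx.Graph, k: int = 3,
                             p_core: float = 0.05, p_other: float = 0.25,
                             seed: int = 0) -> nx.Graph:
    """Drop nodes at random, keeping k-core nodes with higher probability."""
    rng = random.Random(seed)
    core_nodes = set(nx.k_core(graph, k))  # nodes of the cohesive subgraph
    kept = [v for v in graph.nodes
            if rng.random() > (p_core if v in core_nodes else p_other)]
    return graph.subgraph(kept).copy()

# Usage: two augmented views of the same graph for a contrastive objective.
g = nx.karate_club_graph()
view_1 = cohesion_aware_node_drop(g, seed=1)
view_2 = cohesion_aware_node_drop(g, seed=2)
```

A contrastive encoder would then be trained to maximize agreement between the embeddings of the two views; the graph learning enhancement module described above additionally strengthens the encoder's ability to discern such subgraph patterns.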
Related papers
- Tensor-Fused Multi-View Graph Contrastive Learning [12.412040359604163]
Graph contrastive learning (GCL) has emerged as a promising approach to enhance graph neural networks' (GNNs) ability to learn rich representations from unlabeled graph-structured data.
Current GCL models face challenges with computational demands and limited feature utilization.
We propose TensorMV-GCL, a novel framework that integrates extended persistent homology with GCL representations and facilitates multi-scale feature extraction.
arXiv Detail & Related papers (2024-10-20T01:40:12Z) - Community-Invariant Graph Contrastive Learning [21.72222875193335]
This research investigates the role of the graph community in graph augmentation.
We propose a community-invariant GCL framework to maintain graph community structure during learnable graph augmentation.
arXiv Detail & Related papers (2024-05-02T14:59:58Z) - MA-GCL: Model Augmentation Tricks for Graph Contrastive Learning [41.963242524220654]
We present three easy-to-implement model augmentation tricks for graph contrastive learning (GCL), namely asymmetric, random, and shuffling.
Experimental results show that MA-GCL can achieve state-of-the-art performance on node classification benchmarks.
arXiv Detail & Related papers (2022-12-14T05:04:10Z) - Revisiting Graph Contrastive Learning from the Perspective of Graph Spectrum [91.06367395889514]
Graph contrastive learning (GCL), which learns node representations by augmenting graphs, has attracted considerable attention.
We revisit this paradigm by establishing the connection between GCL and the graph spectrum.
We propose a spectral graph contrastive learning module (SpCo), which is a general and GCL-friendly plug-in.
arXiv Detail & Related papers (2022-10-05T15:32:00Z) - Graph Contrastive Learning with Personalized Augmentation [17.714437631216516]
Graph contrastive learning (GCL) has emerged as an effective tool for learning unsupervised representations of graphs.
We propose a principled framework, termed Graph contrastive learning with Personalized Augmentation (GPA).
GPA infers tailored augmentation strategies for each graph based on its topology and node attributes via a learnable augmentation selector.
arXiv Detail & Related papers (2022-09-14T11:37:48Z) - Let Invariant Rationale Discovery Inspire Graph Contrastive Learning [98.10268114789775]
We argue that a high-performing augmentation should preserve the salient semantics of anchor graphs regarding instance-discrimination.
We propose a new framework, Rationale-aware Graph Contrastive Learning (RGCL).
RGCL uses a rationale generator to reveal salient features about graph instance-discrimination as the rationale, and then creates rationale-aware views for contrastive learning.
arXiv Detail & Related papers (2022-06-16T01:28:40Z) - COSTA: Covariance-Preserving Feature Augmentation for Graph Contrastive Learning [64.78221638149276]
We show that node embeddings obtained via graph augmentations are highly biased.
Instead of investigating graph augmentation in the input space, we propose augmentations on the hidden features.
We show that the feature augmentation with COSTA achieves comparable/better results than graph augmentation based models.
arXiv Detail & Related papers (2022-06-09T18:46:38Z) - GraphCoCo: Graph Complementary Contrastive Learning [65.89743197355722]
Graph Contrastive Learning (GCL) has shown promising performance in graph representation learning (GRL) without the supervision of manual annotations.
This paper proposes GraphCoCo, an effective graph complementary contrastive learning approach.
arXiv Detail & Related papers (2022-03-24T02:58:36Z) - CGCL: Collaborative Graph Contrastive Learning without Handcrafted Graph Data Augmentations [12.820228374977441]
We propose a novel Collaborative Graph Contrastive Learning framework (CGCL).
This framework harnesses multiple graph encoders to observe the graph.
To ensure the collaboration among diverse graph encoders, we propose the concepts of asymmetric architecture and complementary encoders.
arXiv Detail & Related papers (2021-11-05T05:08:27Z) - Graph Contrastive Learning with Augmentations [109.23158429991298]
We propose a graph contrastive learning (GraphCL) framework for learning unsupervised representations of graph data.
We show that our framework can produce graph representations of similar or better generalizability, transferability, and robustness compared to state-of-the-art methods.
arXiv Detail & Related papers (2020-10-22T20:13:43Z) - Unsupervised Graph Embedding via Adaptive Graph Learning [85.28555417981063]
Graph autoencoders (GAEs) are powerful tools in representation learning for graph embedding.
In this paper, two novel unsupervised graph embedding methods are proposed: unsupervised graph embedding via adaptive graph learning (BAGE) and unsupervised graph embedding via variational adaptive graph learning (VBAGE).
Experimental studies on several datasets validate our design and demonstrate that our methods outperform baselines by a wide margin in node clustering, node classification, and graph visualization tasks.
arXiv Detail & Related papers (2020-03-10T02:33:14Z)