LocalGCL: Local-aware Contrastive Learning for Graphs
- URL: http://arxiv.org/abs/2402.17345v1
- Date: Tue, 27 Feb 2024 09:23:54 GMT
- Title: LocalGCL: Local-aware Contrastive Learning for Graphs
- Authors: Haojun Jiang, Jiawei Sun, Jie Li, Chentao Wu
- Abstract summary: We propose Local-aware Graph Contrastive Learning (LocalGCL) as a graph representation learner.
Experiments validate the superiority of LocalGCL against state-of-the-art methods, demonstrating its promise as a comprehensive graph representation learner.
- Score: 17.04219759259025
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Graph representation learning (GRL) has made considerable progress recently,
encoding graphs with topological structures into low-dimensional
embeddings. Meanwhile, the time-consuming and costly process of annotating
graph labels manually prompts the growth of self-supervised learning (SSL)
techniques. As a dominant approach to SSL, contrastive learning (CL) learns
discriminative representations by differentiating between positive and negative
samples. However, when applied to graph data, it overemphasizes global patterns
while neglecting local structures. To tackle the above issue, we propose
Local-aware Graph Contrastive Learning (LocalGCL), a self-supervised learning
framework that supplements vanilla contrastive learning by capturing local
graph information through masking-based modeling. Extensive
experiments validate the superiority of LocalGCL against state-of-the-art
methods, demonstrating its promise as a comprehensive graph representation
learner.
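The two complementary objectives the abstract names can be sketched concretely: a standard InfoNCE contrastive loss over two embedding views (the global, discriminative part) plus a reconstruction loss computed only on masked node features (the local, masking-based part). The following is an illustrative numpy sketch of these generic ingredients, not LocalGCL's actual implementation; all names and shapes are hypothetical.

```python
import numpy as np

def info_nce(z1, z2, tau=0.5):
    """InfoNCE loss between two views' embeddings (n x d); positives on the diagonal."""
    z1 = z1 / np.linalg.norm(z1, axis=1, keepdims=True)
    z2 = z2 / np.linalg.norm(z2, axis=1, keepdims=True)
    sim = z1 @ z2.T / tau                          # cosine similarities / temperature
    sim = sim - sim.max(axis=1, keepdims=True)     # numerical stability
    log_prob = sim - np.log(np.exp(sim).sum(axis=1, keepdims=True))
    return -np.mean(np.diag(log_prob))

def masked_reconstruction_loss(x, x_hat, mask):
    """MSE on the masked nodes only -- the local, masking-based objective."""
    return ((x - x_hat) ** 2)[mask].mean()

rng = np.random.default_rng(0)
z1, z2 = rng.normal(size=(8, 16)), rng.normal(size=(8, 16))  # two embedding views
x = rng.normal(size=(8, 16))                                  # raw node features
mask = np.arange(8) % 3 == 0                                  # mask nodes 0, 3, 6
total = info_nce(z1, z2) + masked_reconstruction_loss(x, x + 0.1, mask)
```

In practice the two terms would be weighted and minimised jointly over an encoder's outputs; the sketch only shows the shape of the combined objective.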
Related papers
- Localized Contrastive Learning on Graphs [110.54606263711385]
We introduce a simple yet effective contrastive model named Localized Graph Contrastive Learning (Local-GCL)
In spite of its simplicity, Local-GCL achieves quite competitive performance in self-supervised node representation learning tasks on graphs with various scales and properties.
arXiv Detail & Related papers (2022-12-08T23:36:00Z)
- Unifying Graph Contrastive Learning with Flexible Contextual Scopes [57.86762576319638]
We present a self-supervised learning method termed Unifying Graph Contrastive Learning with Flexible Contextual Scopes (UGCL for short)
Our algorithm builds flexible contextual representations with contextual scopes by controlling the power of an adjacency matrix.
Based on representations from both local and contextual scopes, UGCL optimises a very simple contrastive loss function for graph representation learning.
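"Controlling the power of an adjacency matrix" can be read as feature propagation through increasing powers of a normalised adjacency: a larger exponent widens the contextual scope each node aggregates. A minimal numpy sketch under that reading (illustrative, not UGCL's actual code):

```python
import numpy as np

def contextual_scope(adj, feats, k):
    """Propagate features through a row-normalised adjacency (with self-loops)
    raised to the k-th power: each node aggregates its k-hop context."""
    a_hat = adj + np.eye(adj.shape[0])
    a_hat = a_hat / a_hat.sum(axis=1, keepdims=True)
    return np.linalg.matrix_power(a_hat, k) @ feats

# 4-node path graph 0-1-2-3, one-hot features
adj = np.array([[0, 1, 0, 0],
                [1, 0, 1, 0],
                [0, 1, 0, 1],
                [0, 0, 1, 0]], dtype=float)
feats = np.eye(4)
narrow = contextual_scope(adj, feats, 1)  # 1-hop scope: node 0 never sees node 3
wide = contextual_scope(adj, feats, 3)    # 3-hop scope: node 0 reaches node 3
```

Contrasting `narrow` against `wide` representations of the same node is one way to realise a flexible-scope contrastive objective.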
arXiv Detail & Related papers (2022-10-17T07:16:17Z)
- Adversarial Graph Contrastive Learning with Information Regularization [51.14695794459399]
Contrastive learning is an effective method in graph representation learning.
Data augmentation on graphs is far less intuitive, making it much harder to provide high-quality contrastive samples.
We propose a simple but effective method, Adversarial Graph Contrastive Learning (ARIEL)
It consistently outperforms the current graph contrastive learning methods in the node classification task over various real-world datasets.
arXiv Detail & Related papers (2022-02-14T05:54:48Z)
- Dual Space Graph Contrastive Learning [82.81372024482202]
We propose a novel graph contrastive learning method, namely Dual Space Graph Contrastive (DSGC) Learning.
Since both spaces have their own advantages to represent graph data in the embedding spaces, we hope to utilize graph contrastive learning to bridge the spaces and leverage advantages from both sides.
arXiv Detail & Related papers (2022-01-19T04:10:29Z)
- Multi-Level Graph Contrastive Learning [38.022118893733804]
We propose a Multi-Level Graph Contrastive Learning (MLGCL) framework for learning robust representation of graph data by contrasting space views of graphs.
The original graph is a first-order approximation structure and contains uncertainty or error, while the $k$NN graph generated by encoding features preserves high-order proximity.
Extensive experiments indicate MLGCL achieves promising results compared with the existing state-of-the-art graph representation learning methods on seven datasets.
arXiv Detail & Related papers (2021-07-06T14:24:43Z)
- Self-supervised Graph-level Representation Learning with Local and Global Structure [71.45196938842608]
We propose a unified framework called Local-instance and Global-semantic Learning (GraphLoG) for self-supervised whole-graph representation learning.
Besides preserving the local similarities, GraphLoG introduces the hierarchical prototypes to capture the global semantic clusters.
An efficient online expectation-maximization (EM) algorithm is further developed for learning the model.
arXiv Detail & Related papers (2021-06-08T05:25:38Z)
- Graph Barlow Twins: A self-supervised representation learning framework for graphs [25.546290138565393]
We propose a framework for self-supervised graph representation learning - Graph Barlow Twins.
It utilizes a cross-correlation-based loss function instead of negative samples.
We show that our method achieves as competitive results as the best self-supervised methods and fully supervised ones.
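The cross-correlation objective can be sketched as follows: standardise each view's embeddings per dimension, form the d×d cross-correlation matrix, and push its diagonal toward 1 (invariance) and its off-diagonal toward 0 (redundancy reduction), so no negative samples are needed. A numpy sketch following the published Barlow Twins loss, with an illustrative weight `lam`:

```python
import numpy as np

def barlow_twins_loss(z1, z2, lam=5e-3):
    """Drive the views' cross-correlation matrix toward the identity:
    diagonal -> 1 (invariance), off-diagonal -> 0 (redundancy reduction)."""
    n = z1.shape[0]
    z1 = (z1 - z1.mean(axis=0)) / z1.std(axis=0)
    z2 = (z2 - z2.mean(axis=0)) / z2.std(axis=0)
    c = z1.T @ z2 / n                                  # d x d cross-correlation
    on_diag = ((np.diag(c) - 1.0) ** 2).sum()
    off_diag = (c ** 2).sum() - (np.diag(c) ** 2).sum()
    return on_diag + lam * off_diag

rng = np.random.default_rng(0)
z = rng.normal(size=(32, 8))
loss = barlow_twins_loss(z, z)  # identical views: the on-diagonal term vanishes
```

For identical views the diagonal of the correlation matrix is exactly 1, leaving only the small off-diagonal penalty; anti-correlated views are penalised heavily on the diagonal.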
arXiv Detail & Related papers (2021-06-04T13:10:51Z)
- Sub-graph Contrast for Scalable Self-Supervised Graph Representation Learning [21.0019144298605]
Existing graph neural networks fed with the complete graph are not scalable due to limited computation and memory resources.
Subg-Con is proposed by utilizing the strong correlation between central nodes and their sampled subgraphs to capture regional structure information.
Compared with existing graph representation learning approaches, Subg-Con has prominent performance advantages in weaker supervision requirements, model learning scalability, and parallelization.
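Contrasting a central node with its regional context presupposes a way to extract that region; a k-hop breadth-first search around the central node is one simple way to obtain such a subgraph (an illustrative sketch only; Subg-Con's own sampler may differ):

```python
import numpy as np
from collections import deque

def k_hop_subgraph(adj, center, k):
    """Collect all node indices within k hops of `center` via breadth-first search."""
    dist = {center: 0}
    queue = deque([center])
    while queue:
        u = queue.popleft()
        if dist[u] == k:        # frontier reached: stop expanding this node
            continue
        for v in range(adj.shape[0]):
            if adj[u, v] and v not in dist:
                dist[v] = dist[u] + 1
                queue.append(v)
    return sorted(dist)

# 5-node path graph 0-1-2-3-4
adj = np.array([[0, 1, 0, 0, 0],
                [1, 0, 1, 0, 0],
                [0, 1, 0, 1, 0],
                [0, 0, 1, 0, 1],
                [0, 0, 0, 1, 0]], dtype=float)
```

Each sampled subgraph would then be encoded and contrasted against its central node's embedding, which is what makes the scheme trainable on subgraphs rather than the full graph.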
arXiv Detail & Related papers (2020-09-22T01:58:19Z)
- Contrastive and Generative Graph Convolutional Networks for Graph-based Semi-Supervised Learning [64.98816284854067]
Graph-based Semi-Supervised Learning (SSL) aims to transfer the labels of a handful of labeled data to the remaining massive unlabeled data via a graph.
A novel GCN-based SSL algorithm is presented in this paper to enrich the supervision signals by utilizing both data similarities and graph structure.
arXiv Detail & Related papers (2020-09-15T13:59:28Z)
This list is automatically generated from the titles and abstracts of the papers in this site.