Structural and Semantic Contrastive Learning for Self-supervised Node
Representation Learning
- URL: http://arxiv.org/abs/2202.08480v1
- Date: Thu, 17 Feb 2022 07:20:09 GMT
- Title: Structural and Semantic Contrastive Learning for Self-supervised Node
Representation Learning
- Authors: Kaize Ding, Yancheng Wang, Yingzhen Yang and Huan Liu
- Abstract summary: Graph Contrastive Learning (GCL) has drawn much research interest for learning generalizable, transferable, and robust node representations in a self-supervised fashion.
In this work, we go beyond existing unsupervised GCL methods and address their limitations by proposing a simple yet effective framework S$^3$-CL.
Our experiments demonstrate that the node representations learned by S$^3$-CL achieve superior performance on different downstream tasks compared to the state-of-the-art GCL methods.
- Score: 32.126228702554144
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Graph Contrastive Learning (GCL) has recently drawn much research interest
for learning generalizable, transferable, and robust node representations in a
self-supervised fashion. In general, the contrastive learning process in GCL is
performed on top of the representations learned by a graph neural network (GNN)
backbone, which transforms and propagates the node contextual information based
on its local neighborhoods. However, existing GCL efforts have severe
limitations in terms of encoding architecture, augmentation, and contrastive
objective, making them often inefficient and ineffective across different
datasets. In this work, we go beyond existing unsupervised GCL methods and
address their limitations by proposing a simple yet effective
framework S$^3$-CL. Specifically, by virtue of the proposed structural and
semantic contrastive learning, even a simple neural network is able to learn
expressive node representations that preserve valuable structural and semantic
patterns. Our experiments demonstrate that the node representations learned by
S$^3$-CL achieve superior performance on different downstream tasks compared to
the state-of-the-art GCL methods.
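To make the recipe the abstract describes concrete: a typical GCL pipeline encodes two augmented views of a graph with a shared GNN backbone and trains with an InfoNCE-style loss that treats the two views of the same node as a positive pair. The sketch below illustrates only that generic process; the one-layer GCN, the edge-dropping augmentation, and all hyperparameters are illustrative assumptions, not the S$^3$-CL architecture itself.

```python
# Generic two-view graph contrastive learning, as a hedged illustration.
# The GCN layer, edge-dropping augmentation, and temperature are assumptions.
import torch
import torch.nn.functional as F

def gcn_layer(adj: torch.Tensor, x: torch.Tensor, weight: torch.Tensor) -> torch.Tensor:
    """One symmetric-normalized GCN step: D^-1/2 (A+I) D^-1/2 X W."""
    a_hat = adj + torch.eye(adj.size(0))
    d = a_hat.sum(dim=1).pow(-0.5)
    return torch.relu((d[:, None] * a_hat * d[None, :]) @ x @ weight)

def drop_edges(adj: torch.Tensor, p: float = 0.2) -> torch.Tensor:
    """Random edge-dropping augmentation that keeps the graph symmetric."""
    mask = torch.triu((torch.rand_like(adj) > p).float(), diagonal=1)
    return adj * (mask + mask.T)

def info_nce(z1: torch.Tensor, z2: torch.Tensor, tau: float = 0.5) -> torch.Tensor:
    """Node-level InfoNCE: node i in view 1 is positive with node i in view 2."""
    z1, z2 = F.normalize(z1, dim=1), F.normalize(z2, dim=1)
    logits = z1 @ z2.T / tau              # pairwise cosine similarities
    labels = torch.arange(z1.size(0))     # diagonal entries are positives
    return F.cross_entropy(logits, labels)

# Toy usage: 6 nodes, 4 input features, one shared weight matrix.
adj = (torch.rand(6, 6) > 0.5).float()
adj = torch.triu(adj, diagonal=1); adj = adj + adj.T
x = torch.randn(6, 4)
w = torch.randn(4, 8, requires_grad=True)
z1 = gcn_layer(drop_edges(adj), x, w)     # view 1
z2 = gcn_layer(drop_edges(adj), x, w)     # view 2
loss = info_nce(z1, z2)
loss.backward()
print(f"contrastive loss: {loss.item():.4f}")
```

If the abstract is taken at face value, S$^3$-CL keeps this contrastive template but replaces the augmented-view positives above with structural and semantic positives.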
Related papers
- L^2CL: Embarrassingly Simple Layer-to-Layer Contrastive Learning for Graph Collaborative Filtering [33.165094795515785]
Graph neural networks (GNNs) have recently emerged as an effective approach to model neighborhood signals in collaborative filtering.
We propose L2CL, a principled Layer-to-Layer Contrastive Learning framework that contrasts representations from different layers.
We find that L2CL, using only a one-hop contrastive learning paradigm, is able to capture intrinsic semantic structures and improve the quality of node representations.
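As a rough, self-contained sketch of the layer-to-layer contrast this entry describes, assuming a mean-aggregation message-passing hop and treating the same node's embeddings at successive layers as the positive pair (the actual L2CL encoder and objective may differ):

```python
# Layer-to-layer contrast: positives are the same node at successive layers.
# The mean-aggregation hop and dimensions are assumptions for illustration.
import torch
import torch.nn.functional as F

def propagate(adj: torch.Tensor, h: torch.Tensor) -> torch.Tensor:
    """One message-passing hop via mean-neighbourhood aggregation."""
    deg = adj.sum(dim=1, keepdim=True).clamp(min=1.0)
    return (adj @ h) / deg

def layer_to_layer_loss(h_k: torch.Tensor, h_k1: torch.Tensor, tau: float = 0.5) -> torch.Tensor:
    """InfoNCE between layer-k and layer-(k+1) embeddings of each node."""
    a, b = F.normalize(h_k, dim=1), F.normalize(h_k1, dim=1)
    return F.cross_entropy(a @ b.T / tau, torch.arange(a.size(0)))

adj = (torch.rand(8, 8) > 0.6).float()
adj = torch.triu(adj, diagonal=1); adj = adj + adj.T
h1 = torch.randn(8, 16)       # stand-in for layer-1 node embeddings
h2 = propagate(adj, h1)       # layer 2: one more hop of aggregation
print(f"layer-to-layer loss: {layer_to_layer_loss(h1, h2).item():.4f}")
```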
arXiv Detail & Related papers (2024-07-19T12:45:21Z)
- Graph-level Protein Representation Learning by Structure Knowledge Refinement [50.775264276189695]
This paper focuses on learning representations at the whole-graph level in an unsupervised manner.
We propose a novel framework called Structure Knowledge Refinement (SKR), which uses the data structure to determine the probability that a pair is positive or negative.
arXiv Detail & Related papers (2024-01-05T09:05:33Z)
- Structure-CLIP: Towards Scene Graph Knowledge to Enhance Multi-modal Structured Representations [70.41385310930846]
We present an end-to-end framework Structure-CLIP to enhance multi-modal structured representations.
We use scene graphs to guide the construction of semantic negative examples, which results in an increased emphasis on learning structured representations.
A Knowledge-Enhanced Encoder (KEE) is proposed to leverage scene graph knowledge (SGK) as input to further enhance structured representations.
arXiv Detail & Related papers (2023-05-06T03:57:05Z)
- Self-Supervised Node Representation Learning via Node-to-Neighbourhood Alignment [10.879056662671802]
Self-supervised node representation learning aims to learn node representations from unlabelled graphs that rival their supervised counterparts.
In this work, we present simple-yet-effective self-supervised node representation learning via aligning the hidden representations of nodes and their neighbourhood.
We learn node representations that achieve promising node classification performance on a set of graph-structured datasets from small- to large-scale.
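A hedged sketch of the node-to-neighbourhood alignment objective: each node's embedding is pulled toward the mean embedding of its neighbours, with other nodes' neighbourhood summaries as negatives. The encoder is omitted (random embeddings stand in), and the mean summary and InfoNCE form are assumptions, not necessarily the paper's exact formulation.

```python
# Node-to-neighbourhood alignment with a mean neighbourhood summary.
import torch
import torch.nn.functional as F

def n2n_loss(z: torch.Tensor, adj: torch.Tensor, tau: float = 0.5) -> torch.Tensor:
    """InfoNCE between node i and the mean of its neighbours' embeddings."""
    deg = adj.sum(dim=1, keepdim=True).clamp(min=1.0)
    neigh = (adj @ z) / deg                       # neighbourhood summaries
    z, neigh = F.normalize(z, dim=1), F.normalize(neigh, dim=1)
    return F.cross_entropy(z @ neigh.T / tau, torch.arange(z.size(0)))

adj = (torch.rand(10, 10) > 0.7).float()
adj = torch.triu(adj, diagonal=1); adj = adj + adj.T
z = torch.randn(10, 16)                           # stand-in node embeddings
print(f"n2n alignment loss: {n2n_loss(z, adj).item():.4f}")
```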
arXiv Detail & Related papers (2023-02-09T13:21:18Z)
- Graph Contrastive Learning for Skeleton-based Action Recognition [85.86820157810213]
We propose SkeletonGCL, a graph contrastive learning framework for skeleton-based action recognition.
SkeletonGCL associates graph learning across sequences by enforcing graphs to be class-discriminative.
SkeletonGCL establishes a new training paradigm, and it can be seamlessly incorporated into current graph convolutional networks.
arXiv Detail & Related papers (2023-01-26T02:09:16Z)
- Localized Contrastive Learning on Graphs [110.54606263711385]
We introduce a simple yet effective contrastive model named Localized Graph Contrastive Learning (Local-GCL).
In spite of its simplicity, Local-GCL achieves quite competitive performance in self-supervised node representation learning tasks on graphs with various scales and properties.
arXiv Detail & Related papers (2022-12-08T23:36:00Z)
- Unifying Graph Contrastive Learning with Flexible Contextual Scopes [57.86762576319638]
We present a self-supervised learning method termed Unifying Graph Contrastive Learning with Flexible Contextual Scopes (UGCL for short).
Our algorithm builds contextual representations with flexible contextual scopes by controlling the power of an adjacency matrix.
Based on representations from both local and contextual scopes, UGCL optimises a very simple contrastive loss function for graph representation learning.
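To illustrate what controlling the power of an adjacency matrix can look like, the sketch below raises a normalized adjacency to the k-th power so that features propagate over a k-hop contextual scope; the normalization and the particular scopes chosen are assumptions, not UGCL's exact construction.

```python
# Contextual scopes via powers of a normalized adjacency matrix.
import torch

def normalized_adj(adj: torch.Tensor) -> torch.Tensor:
    """Symmetric normalization with self-loops: D^-1/2 (A+I) D^-1/2."""
    a_hat = adj + torch.eye(adj.size(0))
    d = a_hat.sum(dim=1).pow(-0.5)
    return d[:, None] * a_hat * d[None, :]

def contextual_scope(adj: torch.Tensor, x: torch.Tensor, k: int) -> torch.Tensor:
    """Features propagated over a k-hop scope via the k-th matrix power."""
    return torch.matrix_power(normalized_adj(adj), k) @ x

adj = (torch.rand(8, 8) > 0.6).float()
adj = torch.triu(adj, diagonal=1); adj = adj + adj.T
x = torch.randn(8, 4)
local = contextual_scope(adj, x, 1)       # 1-hop (local) scope
context = contextual_scope(adj, x, 4)     # wider 4-hop contextual scope
print(local.shape, context.shape)
```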
arXiv Detail & Related papers (2022-10-17T07:16:17Z)
- Uncovering the Structural Fairness in Graph Contrastive Learning [87.65091052291544]
Graph contrastive learning (GCL) has emerged as a promising self-supervised approach for learning node representations.
We show that representations obtained by GCL methods are already fairer with respect to degree bias than those learned by GCN.
We devise a novel graph augmentation method, called GRAph contrastive learning for DEgree bias (GRADE), which applies different strategies to low- and high-degree nodes.
arXiv Detail & Related papers (2022-10-06T15:58:25Z)
- GraphCL: Contrastive Self-Supervised Learning of Graph Representations [20.439666392958284]
We propose Graph Contrastive Learning (GraphCL), a general framework for learning node representations in a self-supervised manner.
We use graph neural networks to produce two representations of the same node and leverage a contrastive learning loss to maximize agreement between them.
In both transductive and inductive learning setups, we demonstrate that our approach significantly outperforms the state-of-the-art in unsupervised learning on a number of node classification benchmarks.
arXiv Detail & Related papers (2020-07-15T22:36:53Z)