Graph Soft-Contrastive Learning via Neighborhood Ranking
- URL: http://arxiv.org/abs/2209.13964v3
- Date: Wed, 2 Aug 2023 12:43:19 GMT
- Title: Graph Soft-Contrastive Learning via Neighborhood Ranking
- Authors: Zhiyuan Ning, Pengfei Wang, Pengyang Wang, Ziyue Qiao, Wei Fan,
Denghui Zhang, Yi Du, Yuanchun Zhou
- Abstract summary: Graph Contrastive Learning (GCL) has emerged as a promising approach in the realm of graph self-supervised learning.
We propose a novel paradigm, Graph Soft-Contrastive Learning (GSCL)
GSCL facilitates GCL via neighborhood ranking, avoiding the need to specify absolutely similar pairs.
- Score: 19.241089079154044
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Graph Contrastive Learning (GCL) has emerged as a promising approach in the
realm of graph self-supervised learning. Prevailing GCL methods mainly derive
from the principles of contrastive learning in the field of computer vision:
modeling invariance by specifying absolutely similar pairs. However, when
applied to graph data, this paradigm encounters two significant limitations:
(1) the validity of the generated views cannot be guaranteed: graph
perturbation may produce invalid views against semantics and intrinsic topology
of graph data; (2) specifying absolutely similar pairs in the graph views is
unreliable: for abstract and non-Euclidean graph data, it is difficult for
humans to decide the absolute similarity and dissimilarity intuitively. Despite
the notable performance of current GCL methods, these challenges necessitate a
reevaluation: Could GCL be more effectively tailored to the intrinsic
properties of graphs, rather than merely adopting principles from computer
vision? In response to this query, we propose a novel paradigm, Graph
Soft-Contrastive Learning (GSCL). This approach facilitates GCL via
neighborhood ranking, avoiding the need to specify absolutely similar pairs.
GSCL leverages the underlying graph characteristic of diminishing label
consistency, asserting that nodes that are closer in the graph are overall more
similar than far-distant nodes. Within the GSCL framework, we introduce
pairwise and listwise gated ranking InfoNCE loss functions to effectively
preserve the relative similarity ranking within neighborhoods. Moreover, as the
neighborhood size exponentially expands with more hops considered, we propose
neighborhood sampling strategies to improve learning efficiency. Our extensive
empirical results across 11 commonly used graph datasets (including 8 homophily
graphs and 3 heterophily graphs) demonstrate GSCL's superior performance
compared to 20 SOTA GCL methods.
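The abstract describes the pairwise gated ranking InfoNCE loss only at a high level. A minimal sketch of the underlying idea, assuming cosine similarity and treating each hop's neighbors as positives ranked against all farther hops, might look like the following (the function names, the temperature parameter, and the averaging scheme are illustrative assumptions, not taken from the paper):

```python
import numpy as np

def cosine_sim(anchor, batch):
    # Cosine similarity between one anchor vector and a batch of vectors.
    anchor = anchor / np.linalg.norm(anchor)
    batch = batch / np.linalg.norm(batch, axis=1, keepdims=True)
    return batch @ anchor

def pairwise_ranking_infonce(anchor, hop_neighbors, tau=0.5):
    """Soft-contrastive ranking loss for a single anchor node.

    hop_neighbors: list of arrays; hop_neighbors[k] holds embeddings of the
    anchor's (k+1)-hop neighbors. For each hop k, its nodes are treated as
    relative positives against all nodes at hops > k, encoding the ranking
    assumption that closer neighbors are overall more similar to the anchor.
    """
    loss, n_terms = 0.0, 0
    for k in range(len(hop_neighbors) - 1):
        pos = np.exp(cosine_sim(anchor, hop_neighbors[k]) / tau)
        neg = np.concatenate([
            np.exp(cosine_sim(anchor, hop_neighbors[j]) / tau)
            for j in range(k + 1, len(hop_neighbors))
        ])
        for p in pos:
            # InfoNCE-style term: rank this hop-k neighbor above all
            # farther-away nodes, rather than above "absolute" negatives.
            loss += -np.log(p / (p + neg.sum()))
            n_terms += 1
    return loss / n_terms
```

In this reading, no absolutely similar pair is ever specified: the loss only asks that hop-k similarities exceed hop-(k+1) similarities, which is the "relative similarity ranking within neighborhoods" the abstract refers to. The neighborhood sampling strategies would then subsample each `hop_neighbors[k]` before the loss is computed.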
Related papers
- Simple and Asymmetric Graph Contrastive Learning without Augmentations [39.301072710063636]
Asymmetric Contrastive Learning for Graphs (GraphACL) is easy to implement and does not rely on graph augmentations and homophily assumptions.
Experimental results show that the simple GraphACL significantly outperforms state-of-the-art graph contrastive learning and self-supervised learning methods on homophilic and heterophilic graphs.
arXiv Detail & Related papers (2023-10-29T03:14:20Z)
- HomoGCL: Rethinking Homophily in Graph Contrastive Learning [64.85392028383164]
HomoGCL is a model-agnostic framework to expand the positive set using neighbor nodes with neighbor-specific significances.
We show that HomoGCL yields multiple state-of-the-art results across six public datasets.
arXiv Detail & Related papers (2023-06-16T04:06:52Z)
- Subgraph Networks Based Contrastive Learning [5.736011243152416]
Graph contrastive learning (GCL) can solve the problem of annotated data scarcity.
Most existing GCL methods focus on the design of graph augmentation strategies and mutual information estimation operations.
We propose a novel framework called subgraph network-based contrastive learning (SGNCL)
arXiv Detail & Related papers (2023-06-06T08:52:44Z)
- Localized Contrastive Learning on Graphs [110.54606263711385]
We introduce a simple yet effective contrastive model named Localized Graph Contrastive Learning (Local-GCL)
In spite of its simplicity, Local-GCL achieves quite competitive performance in self-supervised node representation learning tasks on graphs with various scales and properties.
arXiv Detail & Related papers (2022-12-08T23:36:00Z)
- Single-Pass Contrastive Learning Can Work for Both Homophilic and Heterophilic Graph [60.28340453547902]
Graph contrastive learning (GCL) techniques typically require two forward passes for a single instance to construct the contrastive loss.
Existing GCL approaches fail to provide strong performance guarantees.
We implement the Single-Pass Graph Contrastive Learning method (SP-GCL)
Empirically, the features learned by the SP-GCL can match or outperform existing strong baselines with significantly less computational overhead.
arXiv Detail & Related papers (2022-11-20T07:18:56Z)
- Unifying Graph Contrastive Learning with Flexible Contextual Scopes [57.86762576319638]
We present a self-supervised learning method termed Unifying Graph Contrastive Learning with Flexible Contextual Scopes (UGCL for short)
Our algorithm builds flexible contextual representations with contextual scopes by controlling the power of an adjacency matrix.
Based on representations from both local and contextual scopes, UGCL optimises a very simple contrastive loss function for graph representation learning.
arXiv Detail & Related papers (2022-10-17T07:16:17Z)
- Graph Contrastive Learning with Personalized Augmentation [17.714437631216516]
Graph contrastive learning (GCL) has emerged as an effective tool for learning unsupervised representations of graphs.
We propose a principled framework, termed Graph contrastive learning with Personalized Augmentation (GPA).
GPA infers tailored augmentation strategies for each graph based on its topology and node attributes via a learnable augmentation selector.
arXiv Detail & Related papers (2022-09-14T11:37:48Z)
- Geometry Contrastive Learning on Heterogeneous Graphs [50.58523799455101]
This paper proposes a novel self-supervised learning method, termed as Geometry Contrastive Learning (GCL)
GCL views a heterogeneous graph from Euclidean and hyperbolic perspectives simultaneously, aiming to combine the ability to model rich semantics with the ability to model complex structures.
Extensive experiments on four benchmarks data sets show that the proposed approach outperforms the strong baselines.
arXiv Detail & Related papers (2022-06-25T03:54:53Z)
- GraphCoCo: Graph Complementary Contrastive Learning [65.89743197355722]
Graph Contrastive Learning (GCL) has shown promising performance in graph representation learning (GRL) without the supervision of manual annotations.
This paper proposes an effective graph complementary contrastive learning approach named GraphCoCo to tackle the above issue.
arXiv Detail & Related papers (2022-03-24T02:58:36Z)
- Adversarial Graph Augmentation to Improve Graph Contrastive Learning [21.54343383921459]
We propose a novel principle, termed adversarial-GCL (AD-GCL), which enables GNNs to avoid capturing redundant information during the training.
We experimentally validate AD-GCL by comparing with the state-of-the-art GCL methods and achieve performance gains of up to 14% in unsupervised, 6% in transfer, and 3% in semi-supervised learning settings.
arXiv Detail & Related papers (2021-06-10T15:34:26Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences arising from its use.