Smoothed Graph Contrastive Learning via Seamless Proximity Integration
- URL: http://arxiv.org/abs/2402.15270v1
- Date: Fri, 23 Feb 2024 11:32:46 GMT
- Title: Smoothed Graph Contrastive Learning via Seamless Proximity Integration
- Authors: Maysam Behmanesh, Maks Ovsjanikov
- Abstract summary: Graph contrastive learning (GCL) aligns node representations by classifying node pairs into positives and negatives.
We present a Smoothed Graph Contrastive Learning model (SGCL) that injects proximity information associated with positive/negative pairs into the contrastive loss.
The proposed SGCL adjusts the penalties associated with node pairs in the contrastive loss by incorporating three distinct smoothing techniques.
- Score: 35.73306919276754
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Graph contrastive learning (GCL) aligns node representations by classifying
node pairs into positives and negatives using a selection process that
typically relies on establishing correspondences within two augmented graphs.
Conventional GCL approaches incorporate negative samples uniformly in the
contrastive loss, treating all negatives equally regardless of their
proximity to the true positive. In this paper, we present a Smoothed
Graph Contrastive Learning model (SGCL), which leverages the geometric
structure of augmented graphs to inject proximity information associated with
positive/negative pairs in the contrastive loss, thus significantly
regularizing the learning process. The proposed SGCL adjusts the penalties
associated with node pairs in the contrastive loss by incorporating three
distinct smoothing techniques that yield proximity-aware positives and
negatives. To enhance scalability for large-scale graphs, the proposed
framework incorporates a graph batch-generating strategy that partitions the
given graphs into multiple subgraphs, facilitating efficient training in
separate batches. Through extensive experimentation in the unsupervised setting
on various benchmarks, particularly those of large scale, we demonstrate the
superiority of our proposed framework against recent baselines.
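The abstract's core mechanism, reweighting contrastive penalties by proximity, can be illustrated with a short sketch. This is a hypothetical reconstruction, not the authors' released code: the `proximity` matrix (e.g., a normalized diffusion or shortest-path kernel on the augmented graph) and the `1 - proximity` down-weighting of negatives are illustrative assumptions about how such smoothing could enter an InfoNCE-style loss.

```python
import torch
import torch.nn.functional as F

def smoothed_contrastive_loss(z1, z2, proximity, tau=0.5):
    """Illustrative proximity-smoothed InfoNCE loss (not the official SGCL code).

    z1, z2:     (N, d) node embeddings from two augmented views.
    proximity:  (N, N) matrix in [0, 1]; proximity[i, j] is high when node j
                lies close to node i in the augmented graph (an assumed input,
                e.g. a normalized diffusion or shortest-path kernel).
    """
    z1, z2 = F.normalize(z1, dim=1), F.normalize(z2, dim=1)
    sim = torch.exp(z1 @ z2.t() / tau)     # (N, N) cross-view similarities

    # Smooth the negative penalties: a negative that is close to the anchor
    # (high proximity) is down-weighted instead of repelled at full strength.
    neg_weight = 1.0 - proximity
    neg_weight.fill_diagonal_(0.0)         # the diagonal holds the positives

    pos = sim.diag()                       # positive pair: same node, two views
    neg = (neg_weight * sim).sum(dim=1)    # proximity-weighted negatives
    loss = -torch.log(pos / (pos + neg))
    return loss.mean()
```

Under this weighting, a negative adjacent to the anchor contributes little repulsion while a distant negative is repelled at full strength, which is the regularizing effect the abstract describes.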
Related papers
- Negative-Free Self-Supervised Gaussian Embedding of Graphs [29.26519601854811] (arXiv, 2024-11-02)
Graph Contrastive Learning (GCL) has emerged as a promising graph self-supervised learning framework.
We propose a negative-free objective to achieve uniformity, inspired by the fact that points distributed according to a normalized isotropic Gaussian are uniformly spread across the unit hypersphere.
Our proposal achieves competitive performance with fewer parameters, shorter training times, and lower memory consumption compared to existing GCL methods.
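The stated fact, that isotropic Gaussian samples become uniform on the unit hypersphere once normalized, suggests a simple negative-free objective. The moment-matching loss below is a minimal sketch under that reading, not the paper's actual objective:

```python
import torch

def gaussian_uniformity_loss(z):
    """Illustrative negative-free uniformity objective (an assumption, not the
    paper's exact loss): push embeddings z of shape (N, d) toward a standard
    isotropic Gaussian by matching its first two moments. Since normalized
    isotropic Gaussian samples are uniform on the unit hypersphere, such
    embeddings spread uniformly after normalization.
    """
    mean = z.mean(dim=0)
    cov = (z - mean).t() @ (z - mean) / (z.shape[0] - 1)
    eye = torch.eye(z.shape[1], device=z.device)
    return mean.pow(2).sum() + (cov - eye).pow(2).sum()
```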
- Bootstrap Latents of Nodes and Neighbors for Graph Self-Supervised Learning [27.278097015083343] (arXiv, 2024-08-09)
Contrastive learning requires negative samples to prevent model collapse and learn discriminative representations.
We introduce a cross-attention module to predict the supportiveness score of a neighbor with respect to the anchor node.
Our method mitigates class collision from negative and noisy positive samples, concurrently enhancing intra-class compactness.
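A minimal sketch of what such a cross-attention supportiveness scorer could look like; the module below (projection layers, sigmoid read-out) is a hypothetical design, not the paper's implementation:

```python
import torch
import torch.nn as nn

class SupportivenessScorer(nn.Module):
    """Illustrative cross-attention scorer (an assumption, not the paper's
    module): the anchor embedding attends to each neighbor embedding, and
    the attention logit is read as a supportiveness score in [0, 1]."""

    def __init__(self, dim):
        super().__init__()
        self.q = nn.Linear(dim, dim)  # query projection for the anchor
        self.k = nn.Linear(dim, dim)  # key projection for the neighbors

    def forward(self, anchor, neighbors):
        # anchor: (d,), neighbors: (M, d) -> (M,) supportiveness scores
        logits = self.k(neighbors) @ self.q(anchor) / anchor.shape[0] ** 0.5
        return torch.sigmoid(logits)
```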
- Bilateral Unsymmetrical Graph Contrastive Learning for Recommendation [12.945782054710113] (arXiv, 2024-03-22)
We propose Bilateral Unsymmetrical Graph Contrastive Learning (BusGCL), a novel framework for recommendation tasks.
BusGCL accounts for the bilateral asymmetry in relation density between user and item nodes, reasoning over sliced user and item graphs via bilateral slicing contrastive training.
Comprehensive experiments on two public datasets demonstrate the superiority of BusGCL over various recommendation methods.
- STERLING: Synergistic Representation Learning on Bipartite Graphs [78.86064828220613] (arXiv, 2023-01-25)
A fundamental challenge of bipartite graph representation learning is how to extract informative node embeddings.
Most recent bipartite graph SSL methods are based on contrastive learning which learns embeddings by discriminating positive and negative node pairs.
We introduce a novel synergistic representation learning model (STERLING) to learn node embeddings without negative node pairs.
- Localized Contrastive Learning on Graphs [110.54606263711385] (arXiv, 2022-12-08)
We introduce a simple yet effective contrastive model named Localized Graph Contrastive Learning (Local-GCL).
In spite of its simplicity, Local-GCL achieves quite competitive performance in self-supervised node representation learning tasks on graphs with various scales and properties.
- Single-Pass Contrastive Learning Can Work for Both Homophilic and Heterophilic Graph [60.28340453547902] (arXiv, 2022-11-20)
Graph contrastive learning (GCL) techniques typically require two forward passes for a single instance to construct the contrastive loss.
Existing GCL approaches fail to provide strong performance guarantees.
We implement the Single-Pass Graph Contrastive Learning method (SP-GCL).
Empirically, the features learned by the SP-GCL can match or outperform existing strong baselines with significantly less computational overhead.
- Graph Contrastive Learning with Implicit Augmentations [36.57536688367965] (arXiv, 2022-11-07)
Implicit Graph Contrastive Learning (iGCL) uses augmentations in a latent space learned by a Variational Graph Auto-Encoder that reconstructs the graph topological structure.
Experimental results on both graph-level and node-level tasks show that the proposed method achieves state-of-the-art performance.
- Unifying Graph Contrastive Learning with Flexible Contextual Scopes [57.86762576319638] (arXiv, 2022-10-17)
We present a self-supervised learning method termed Unifying Graph Contrastive Learning with Flexible Contextual Scopes (UGCL for short).
Our algorithm builds flexible contextual representations with contextual scopes by controlling the power of an adjacency matrix.
Based on representations from both local and contextual scopes, UGCL optimises a very simple contrastive loss function for graph representation learning.
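One plausible reading of "controlling the power of an adjacency matrix" is feature propagation through increasing powers of a normalized adjacency, so larger powers widen the contextual scope. The sketch below illustrates this assumed mechanism rather than UGCL's exact formulation:

```python
import torch

def contextual_scope(adj, x, k):
    """Illustrative k-hop contextual representation (an assumed reading, not
    UGCL's formulation): propagate features x of shape (N, d) through the
    symmetrically normalized adjacency k times; larger k yields a wider
    contextual scope.
    """
    deg = adj.sum(dim=1).clamp(min=1.0)
    d_inv_sqrt = deg.pow(-0.5)
    a_norm = d_inv_sqrt.unsqueeze(1) * adj * d_inv_sqrt.unsqueeze(0)
    for _ in range(k):
        x = a_norm @ x
    return x
```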
- GraphCoCo: Graph Complementary Contrastive Learning [65.89743197355722] (arXiv, 2022-03-24)
Graph Contrastive Learning (GCL) has shown promising performance in graph representation learning (GRL) without manual annotations.
This paper proposes GraphCoCo, a graph complementary contrastive learning approach that encourages the augmented views to capture complementary rather than redundant information.
- Effective and Efficient Graph Learning for Multi-view Clustering [173.8313827799077] (arXiv, 2021-08-15)
We propose an effective and efficient graph learning model for multi-view clustering.
Our method exploits the similarity between graphs of different views by minimizing the tensor Schatten p-norm.
The proposed algorithm is time-economical, obtains stable results, and scales well with the data size.
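For reference, the Schatten p-norm of a matrix is the l_p norm of its singular values; the tensor version in the paper generalizes this to a third-order tensor stacking the view graphs. A minimal matrix-case sketch (the tensor extension is omitted):

```python
import torch

def schatten_p_norm(m, p=0.5):
    """Schatten p-norm of a matrix m: the l_p norm of its singular values.
    For p < 1 it is a quasi-norm whose minimization promotes low-rank
    structure, which is why it is used to align graphs across views."""
    s = torch.linalg.svdvals(m)
    return s.pow(p).sum().pow(1.0 / p)
```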