Khan-GCL: Kolmogorov-Arnold Network Based Graph Contrastive Learning with Hard Negatives
- URL: http://arxiv.org/abs/2505.15103v1
- Date: Wed, 21 May 2025 04:54:18 GMT
- Title: Khan-GCL: Kolmogorov-Arnold Network Based Graph Contrastive Learning with Hard Negatives
- Authors: Zihu Wang, Boxun Xu, Hejia Geng, Peng Li
- Abstract summary: Khan-GCL is a novel framework that integrates the Kolmogorov-Arnold Network (KAN) into the GCL encoder architecture. We exploit the rich information embedded within KAN coefficient parameters to develop two novel critical feature identification techniques. These techniques enable the generation of semantically meaningful hard negative samples for each graph representation.
- Score: 3.440313042843115
- License: http://creativecommons.org/licenses/by-nc-sa/4.0/
- Abstract: Graph contrastive learning (GCL) has demonstrated great promise for learning generalizable graph representations from unlabeled data. However, conventional GCL approaches face two critical limitations: (1) the restricted expressive capacity of multilayer perceptron (MLP) based encoders, and (2) suboptimal negative samples, which either come from random augmentations, failing to provide effective 'hard negatives', or are generated without addressing the semantic distinctions crucial for discriminating graph data. To this end, we propose Khan-GCL, a novel framework that integrates the Kolmogorov-Arnold Network (KAN) into the GCL encoder architecture, substantially enhancing its representational capacity. Furthermore, we exploit the rich information embedded within KAN coefficient parameters to develop two novel critical feature identification techniques that enable the generation of semantically meaningful hard negative samples for each graph representation. These strategically constructed hard negatives guide the encoder to learn more discriminative features by emphasizing critical semantic differences between graphs. Extensive experiments demonstrate that our approach achieves state-of-the-art performance compared to existing GCL methods across a variety of datasets and tasks.
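The abstract's core idea can be illustrated with a minimal sketch. This is a hypothetical toy implementation, not the authors' code: it shows a KAN-style layer whose edge functions are parameterized by polynomial coefficients, how those coefficients could be mined to rank "critical" representation dimensions, and how perturbing only those dimensions yields a targeted hard negative. All names (`KANLayer`, `critical_features`, `hard_negative`) and the polynomial basis are illustrative assumptions; the paper uses its own parameterization and identification criteria.

```python
import numpy as np

# Hypothetical sketch of the Khan-GCL idea (not the authors' implementation):
# a KAN-style layer replaces each scalar weight with a small learnable
# univariate function, here parameterized by polynomial coefficients.

class KANLayer:
    def __init__(self, in_dim, out_dim, degree=3, seed=0):
        rng = np.random.default_rng(seed)
        # coeffs[o, i, d]: degree-d coefficient of the function on edge i -> o
        self.coeffs = rng.normal(scale=0.1, size=(out_dim, in_dim, degree + 1))

    def forward(self, x):
        # x: (batch, in_dim). Evaluate each edge function on its input
        # feature and sum the contributions for every output dimension.
        powers = np.stack([x ** d for d in range(self.coeffs.shape[-1])], axis=-1)
        # (batch, out_dim): sum_i sum_d coeffs[o, i, d] * x[:, i] ** d
        return np.einsum('bid,oid->bo', powers, self.coeffs)

    def critical_features(self, k=2):
        # Rank output dimensions by the total magnitude of their
        # non-constant coefficients -- a crude stand-in for the paper's
        # coefficient-based critical feature identification.
        importance = np.abs(self.coeffs[:, :, 1:]).sum(axis=(1, 2))
        return np.argsort(importance)[::-1][:k]

def hard_negative(z, critical_idx, scale=1.0, rng=None):
    # Perturb only the critical dimensions of a representation z so the
    # negative differs in semantically important directions (illustrative
    # heuristic, not the paper's exact construction).
    if rng is None:
        rng = np.random.default_rng(0)
    neg = z.copy()
    neg[..., critical_idx] += scale * rng.normal(size=(len(critical_idx),))
    return neg
```

In this toy setup, the encoder's coefficient tensor doubles as an importance signal: dimensions driven by large non-constant coefficients are treated as semantically critical, and hard negatives are built by perturbing exactly those dimensions rather than adding isotropic noise.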
Related papers
- Squeeze and Excitation: A Weighted Graph Contrastive Learning for Collaborative Filtering [1.3535213052193866]
Graph contrastive learning (GCL) aims to enhance the robustness of representation learning. The Weighted Graph Contrastive Learning framework (WeightedGCL) addresses the irrational allocation of feature attention. WeightedGCL achieves significant accuracy improvements compared to competitive baselines.
arXiv Detail & Related papers (2025-04-06T11:30:59Z) - HyperGCL: Multi-Modal Graph Contrastive Learning via Learnable Hypergraph Views [6.17062506211443]
Graph Contrastive Learning (GCL) has demonstrated remarkable effectiveness in improving graph representations. In this paper, we introduce HyperGCL, a novel multimodal GCL framework from a hypergraph perspective.
arXiv Detail & Related papers (2025-02-18T20:57:56Z) - Graph Structure Refinement with Energy-based Contrastive Learning [56.957793274727514]
We introduce an unsupervised method based on a joint of generative training and discriminative training to learn graph structure and representation. We propose an Energy-based Contrastive Learning (ECL) guided Graph Structure Refinement (GSR) framework, denoted as ECL-GSR. ECL-GSR achieves faster training with fewer samples and memories against the leading baseline, highlighting its simplicity and efficiency in downstream tasks.
arXiv Detail & Related papers (2024-12-20T04:05:09Z) - GRE^2-MDCL: Graph Representation Embedding Enhanced via Multidimensional Contrastive Learning [0.0]
Graph representation learning has emerged as a powerful tool for preserving graph topology when mapping nodes to vector representations. Current graph neural network models face the challenge of requiring extensive labeled data. We propose Graph Representation Embedding Enhanced via Multidimensional Contrastive Learning.
arXiv Detail & Related papers (2024-09-12T03:09:05Z) - Generative-Contrastive Heterogeneous Graph Neural Network [17.889906784627904]
Heterogeneous Graphs (HGs) effectively model complex relationships in the real world through multi-type nodes and edges. Contrastive learning (CL)-based Heterogeneous Graph Neural Networks (HGNNs) have shown great potential in utilizing data augmentation and contrastive discriminators for downstream tasks. We propose a novel Generative-Contrastive Heterogeneous Graph Neural Network (GC-HGNN). Specifically, we propose a heterogeneous graph generative learning method that enhances the CL-based paradigm.
arXiv Detail & Related papers (2024-04-03T15:31:18Z) - Graph-level Protein Representation Learning by Structure Knowledge Refinement [50.775264276189695]
This paper focuses on learning representation on the whole graph level in an unsupervised manner.
We propose a novel framework called Structure Knowledge Refinement (SKR) which uses data structure to determine the probability of whether a pair is positive or negative.
arXiv Detail & Related papers (2024-01-05T09:05:33Z) - Rethinking and Simplifying Bootstrapped Graph Latents [48.76934123429186]
Graph contrastive learning (GCL) has emerged as a representative paradigm in graph self-supervised learning.
We present SGCL, a simple yet effective GCL framework that utilizes the outputs from two consecutive iterations as positive pairs.
We show that SGCL can achieve competitive performance with fewer parameters, lower time and space costs, and significant convergence speedup.
arXiv Detail & Related papers (2023-12-05T09:49:50Z) - LightGCL: Simple Yet Effective Graph Contrastive Learning for Recommendation [9.181689366185038]
Graph neural network (GNN) is a powerful learning approach for graph-based recommender systems.
In this paper, we propose a simple yet effective graph contrastive learning paradigm LightGCL.
arXiv Detail & Related papers (2023-02-16T10:16:21Z) - Localized Contrastive Learning on Graphs [110.54606263711385]
We introduce a simple yet effective contrastive model named Localized Graph Contrastive Learning (Local-GCL).
In spite of its simplicity, Local-GCL achieves quite competitive performance in self-supervised node representation learning tasks on graphs with various scales and properties.
arXiv Detail & Related papers (2022-12-08T23:36:00Z) - Single-Pass Contrastive Learning Can Work for Both Homophilic and Heterophilic Graph [60.28340453547902]
Graph contrastive learning (GCL) techniques typically require two forward passes for a single instance to construct the contrastive loss.
Existing GCL approaches fail to provide strong performance guarantees.
We implement the Single-Pass Graph Contrastive Learning method (SP-GCL).
Empirically, the features learned by the SP-GCL can match or outperform existing strong baselines with significantly less computational overhead.
arXiv Detail & Related papers (2022-11-20T07:18:56Z) - Unifying Graph Contrastive Learning with Flexible Contextual Scopes [57.86762576319638]
We present a self-supervised learning method termed Unifying Graph Contrastive Learning with Flexible Contextual Scopes (UGCL for short).
Our algorithm builds flexible contextual representations with contextual scopes by controlling the power of an adjacency matrix.
Based on representations from both local and contextual scopes, UGCL optimises a very simple contrastive loss function for graph representation learning.
arXiv Detail & Related papers (2022-10-17T07:16:17Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.