Community-Aware Efficient Graph Contrastive Learning via Personalized
Self-Training
- URL: http://arxiv.org/abs/2311.11073v1
- Date: Sat, 18 Nov 2023 13:45:21 GMT
- Title: Community-Aware Efficient Graph Contrastive Learning via Personalized
Self-Training
- Authors: Yuecheng Li, Yanming Hu, Lele Fu, Chuan Chen, Lei Yang, Zibin Zheng
- Abstract summary: We propose a Community-aware Efficient Graph Contrastive Learning Framework (CEGCL) to jointly learn community partition and node representations in an end-to-end manner.
We show that our CEGCL exhibits state-of-the-art performance on three benchmark datasets with different scales.
- Score: 27.339318501446115
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: In recent years, graph contrastive learning (GCL) has emerged as one of the
optimal solutions for various supervised tasks at the node level. However, for
unsupervised and structure-related tasks such as community detection, current
GCL algorithms face difficulties in acquiring the necessary community-level
information, resulting in poor performance. In addition, general contrastive
learning algorithms improve the performance of downstream tasks by increasing
the number of negative samples, which leads to severe class collision and
unfairness in community detection. To address the above issues, we propose a novel
Community-aware Efficient Graph Contrastive Learning Framework (CEGCL) to
jointly learn community partition and node representations in an end-to-end
manner. Specifically, we first design a personalized self-training (PeST)
strategy for unsupervised scenarios, which enables our model to capture precise
community-level personalized information in a graph. With the benefit of the
PeST, we alleviate class collision and unfairness without sacrificing the
overall model performance. Furthermore, the aligned graph clustering (AlGC) is
employed to obtain the community partition. In this module, we align the
clustering space of our downstream task with that in PeST to achieve more
consistent node embeddings. Finally, we demonstrate the effectiveness of our
model for community detection both theoretically and experimentally. Extensive
experimental results also show that our CEGCL exhibits state-of-the-art
performance on three benchmark datasets with different scales.
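The abstract's point about negative samples can be made concrete with a minimal, generic InfoNCE-style contrastive loss (an illustrative sketch, not CEGCL's actual objective; all names and values here are hypothetical). When the negative pool grows, it inevitably includes nodes from the anchor's own community, and the loss pushes their embeddings apart — the "class collision" the paper describes:

```python
import math

def info_nce(sim_pos, sim_negs, tau=0.5):
    """InfoNCE loss for one anchor: its similarity to the positive
    is contrasted against similarities to a pool of negatives."""
    num = math.exp(sim_pos / tau)
    den = num + sum(math.exp(s / tau) for s in sim_negs)
    return -math.log(num / den)

# A small negative pool vs. one padded with 50 high-similarity
# "negatives" that are really same-community nodes:
few = info_nce(0.9, [0.1, 0.2])
many = info_nce(0.9, [0.1, 0.2] + [0.8] * 50)
```

Here `many` is far larger than `few`: the loss penalizes the anchor for being close to same-community nodes, which is exactly the collision that naive negative-sample scaling induces.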
Related papers
- A Dual Adaptive Assignment Approach for Robust Graph-Based Clustering [19.548297224070076]
We introduce a new framework called the Dual Adaptive Assignment Approach for Robust Graph-Based Clustering (RDSA).
RDSA consists of three key components: (i) a node embedding module that effectively integrates the graph's topological features and node attributes; (ii) a structure-based soft assignment module that improves graph modularity by utilizing an affinity matrix for node assignments; and (iii) a node-based soft assignment module that identifies community landmarks and refines node assignments to enhance the model's robustness.
Our findings indicate that RDSA provides robust clustering across different graph types, excelling in clustering effectiveness, robustness, and adaptability.
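The "soft assignment" modules described above can be illustrated with a minimal sketch (hypothetical names; the real RDSA modules are learned end-to-end): a row of a node-to-community affinity matrix is turned into a soft assignment by a temperature-scaled softmax.

```python
import math

def soft_assign(affinity_row, tau=1.0):
    """Softmax over one node's affinities to K communities,
    yielding a soft (probabilistic) community assignment."""
    exps = [math.exp(a / tau) for a in affinity_row]
    z = sum(exps)
    return [e / z for e in exps]

# Node with strong affinity to community 0:
probs = soft_assign([2.0, 1.0, 0.5])
```

Lowering `tau` sharpens the assignment toward a hard partition; raising it softens it, which is the usual knob such modules expose.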
arXiv Detail & Related papers (2024-10-29T05:18:34Z)
- Multi-Task Curriculum Graph Contrastive Learning with Clustering Entropy Guidance [25.5510013711661]
We propose the Clustering-guided Curriculum Graph contrastive Learning (CCGL) framework.
CCGL uses clustering entropy as the guidance of the following graph augmentation and contrastive learning.
Experimental results demonstrate that CCGL has achieved excellent performance compared to state-of-the-art competitors.
arXiv Detail & Related papers (2024-08-22T02:18:47Z)
- Self-Supervised Contrastive Graph Clustering Network via Structural Information Fusion [15.293684479404092]
We propose a novel deep graph clustering method called CGCN.
Our approach introduces contrastive signals and deep structural information into the pre-training process.
Our method has been experimentally validated on multiple real-world graph datasets.
arXiv Detail & Related papers (2024-08-08T09:49:26Z)
- Modularity aided consistent attributed graph clustering via coarsening [6.522020196906943]
Graph clustering is an important unsupervised learning technique for partitioning graphs with attributes and detecting communities.
We propose a loss function incorporating log-determinant, smoothness, and modularity components using a block majorization-minimization technique.
Our algorithm seamlessly integrates graph neural networks (GNNs) and variational graph autoencoders (VGAEs) to learn enhanced node features and deliver exceptional clustering performance.
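The modularity component mentioned above is Newman's standard measure, Q = (1/2m) Σ_ij (A_ij − k_i k_j / 2m) δ(c_i, c_j). A minimal sketch (plain adjacency lists, not the paper's differentiable variant) computes it directly:

```python
def modularity(adj, communities):
    """Newman modularity Q for an undirected graph given as a
    dense adjacency matrix and a community label per node."""
    n = len(adj)
    deg = [sum(row) for row in adj]
    two_m = sum(deg)  # 2m = total degree
    q = 0.0
    for i in range(n):
        for j in range(n):
            if communities[i] == communities[j]:
                q += adj[i][j] - deg[i] * deg[j] / two_m
    return q / two_m

# Two triangles joined by a single edge, labeled as two communities:
A = [
    [0, 1, 1, 0, 0, 0],
    [1, 0, 1, 0, 0, 0],
    [1, 1, 0, 1, 0, 0],
    [0, 0, 1, 0, 1, 1],
    [0, 0, 0, 1, 0, 1],
    [0, 0, 0, 1, 1, 0],
]
q = modularity(A, [0, 0, 0, 1, 1, 1])  # 5/14 ≈ 0.357
```

A positive Q means intra-community edges exceed the random-graph expectation; modularity-aided losses push the partition toward maximizing this quantity.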
arXiv Detail & Related papers (2024-07-09T10:42:19Z)
- Deep Contrastive Graph Learning with Clustering-Oriented Guidance [61.103996105756394]
Graph Convolutional Network (GCN) has exhibited remarkable potential in improving graph-based clustering.
To apply GCN to general data clustering, models must estimate an initial graph beforehand.
Deep Contrastive Graph Learning (DCGL) model is proposed for general data clustering.
arXiv Detail & Related papers (2024-02-25T07:03:37Z)
- CARL-G: Clustering-Accelerated Representation Learning on Graphs [18.763104937800215]
We propose a novel clustering-based framework for graph representation learning that uses a loss inspired by Cluster Validation Indices (CVIs).
CARL-G is adaptable to different clustering methods and CVIs, and we show that with the right choice of clustering method and CVI, CARL-G outperforms node classification baselines on 4/5 datasets with up to a 79x training speedup compared to the best-performing baseline.
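One widely used CVI is the silhouette coefficient, which scores a clustering by comparing each point's mean intra-cluster distance against its distance to the nearest other cluster. A minimal pure-Python sketch (illustrative only; CARL-G's actual CVI-inspired loss is defined on learned embeddings):

```python
import math

def _dist(p, q):
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(p, q)))

def silhouette_score(points, labels):
    """Mean silhouette coefficient: (b - a) / max(a, b) per point,
    where a = mean intra-cluster distance, b = mean distance to
    the nearest other cluster."""
    scores = []
    for i, p in enumerate(points):
        same = [_dist(p, q) for j, q in enumerate(points)
                if labels[j] == labels[i] and j != i]
        a = sum(same) / len(same)
        b = min(
            sum(_dist(p, q) for j, q in enumerate(points) if labels[j] == c)
            / sum(1 for lab in labels if lab == c)
            for c in set(labels) if c != labels[i]
        )
        scores.append((b - a) / max(a, b))
    return sum(scores) / len(scores)

# Two well-separated clusters score close to 1:
pts = [(0, 0), (0, 1), (1, 0), (10, 10), (10, 11), (11, 10)]
score = silhouette_score(pts, [0, 0, 0, 1, 1, 1])
```

Values near 1 indicate tight, well-separated clusters; a CVI-inspired loss turns such an index into a differentiable training signal.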
arXiv Detail & Related papers (2023-06-12T08:14:42Z)
- Localized Contrastive Learning on Graphs [110.54606263711385]
We introduce a simple yet effective contrastive model named Localized Graph Contrastive Learning (Local-GCL).
In spite of its simplicity, Local-GCL achieves quite competitive performance in self-supervised node representation learning tasks on graphs with various scales and properties.
arXiv Detail & Related papers (2022-12-08T23:36:00Z)
- GraphCoCo: Graph Complementary Contrastive Learning [65.89743197355722]
Graph Contrastive Learning (GCL) has shown promising performance in graph representation learning (GRL) without the supervision of manual annotations.
This paper proposes an effective graph complementary contrastive learning approach named GraphCoCo to tackle the above issue.
arXiv Detail & Related papers (2022-03-24T02:58:36Z)
- Graph Representation Learning via Contrasting Cluster Assignments [57.87743170674533]
We propose a novel unsupervised graph representation model by contrasting cluster assignments, called GRCCA.
It is motivated to make good use of local and global information synthetically through combining clustering algorithms and contrastive learning.
GRCCA is strongly competitive in most tasks.
arXiv Detail & Related papers (2021-12-15T07:28:58Z)
- Effective and Efficient Graph Learning for Multi-view Clustering [173.8313827799077]
We propose an effective and efficient graph learning model for multi-view clustering.
Our method exploits the similarity between graphs of different views by minimizing the tensor Schatten p-norm.
Our proposed algorithm is time-economical, obtains stable results, and scales well with the data size.
arXiv Detail & Related papers (2021-08-15T13:14:28Z)
- Learning to Cluster Faces via Confidence and Connectivity Estimation [136.5291151775236]
We propose a fully learnable clustering framework without requiring a large number of overlapped subgraphs.
Our method significantly improves clustering accuracy and thus performance of the recognition models trained on top, yet it is an order of magnitude more efficient than existing supervised methods.
arXiv Detail & Related papers (2020-04-01T13:39:37Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.