Community-Aware Efficient Graph Contrastive Learning via Personalized
Self-Training
- URL: http://arxiv.org/abs/2311.11073v1
- Date: Sat, 18 Nov 2023 13:45:21 GMT
- Title: Community-Aware Efficient Graph Contrastive Learning via Personalized
Self-Training
- Authors: Yuecheng Li, Yanming Hu, Lele Fu, Chuan Chen, Lei Yang, Zibin Zheng
- Abstract summary: We propose a Community-aware Efficient Graph Contrastive Learning Framework (CEGCL) to jointly learn community partition and node representations in an end-to-end manner.
We show that our CEGCL exhibits state-of-the-art performance on three benchmark datasets with different scales.
- Score: 27.339318501446115
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: In recent years, graph contrastive learning (GCL) has emerged as one of the
most effective approaches for various supervised tasks at the node level. However, for
unsupervised and structure-related tasks such as community detection, current
GCL algorithms face difficulties in acquiring the necessary community-level
information, resulting in poor performance. In addition, general contrastive
learning algorithms improve the performance of downstream tasks by increasing
the number of negative samples, which leads to severe class collision and
unfairness in community detection. To address the above issues, we propose a novel
Community-aware Efficient Graph Contrastive Learning Framework (CEGCL) to
jointly learn community partition and node representations in an end-to-end
manner. Specifically, we first design a personalized self-training (PeST)
strategy for unsupervised scenarios, which enables our model to capture precise
community-level personalized information in a graph. With the benefit of the
PeST, we alleviate class collision and unfairness without sacrificing the
overall model performance. Furthermore, the aligned graph clustering (AlGC) is
employed to obtain the community partition. In this module, we align the
clustering space of our downstream task with that in PeST to achieve more
consistent node embeddings. Finally, we demonstrate the effectiveness of our
model for community detection both theoretically and experimentally. Extensive
experimental results also show that our CEGCL exhibits state-of-the-art
performance on three benchmark datasets with different scales.
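The class-collision issue described above stems from how standard contrastive objectives treat every non-anchor node as a negative. A minimal NumPy sketch of a generic InfoNCE loss (an illustrative stand-in for typical GCL objectives, not CEGCL's actual loss; all names here are hypothetical) makes the mechanism concrete:

```python
import numpy as np

def info_nce_loss(anchor, positive, negatives, temperature=0.5):
    """InfoNCE loss for a single anchor node embedding.

    anchor, positive: 1-D embedding vectors; negatives: 2-D array with
    one negative embedding per row. Vectors are L2-normalized so dot
    products are cosine similarities.
    """
    def normalize(v):
        return v / np.linalg.norm(v, axis=-1, keepdims=True)

    a = normalize(anchor)
    p = normalize(positive)
    n = normalize(negatives)

    pos_sim = np.dot(a, p) / temperature      # similarity to the positive
    neg_sims = n @ a / temperature            # similarities to all negatives

    # Cross-entropy with the positive as the "correct class": every
    # negative contributes to the denominator, so adding negatives
    # (including same-community nodes) inflates the loss and pushes
    # semantically similar nodes apart.
    logits = np.concatenate([[pos_sim], neg_sims])
    return -pos_sim + np.log(np.sum(np.exp(logits)))
```

Because nodes from the anchor's own community inevitably land in the negative set, enlarging that set pushes semantically similar nodes apart, which is precisely the class collision the PeST strategy is designed to alleviate.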
Related papers
- Modularity aided consistent attributed graph clustering via coarsening [6.522020196906943]
Graph clustering is an important unsupervised learning technique for partitioning graphs with attributes and detecting communities.
We propose a loss function incorporating log-determinant, smoothness, and modularity components using a block majorization-minimization technique.
Our algorithm seamlessly integrates graph neural networks (GNNs) and variational graph autoencoders (VGAEs) to learn enhanced node features and deliver exceptional clustering performance.
arXiv Detail & Related papers (2024-07-09T10:42:19Z)
- Deep Contrastive Graph Learning with Clustering-Oriented Guidance [61.103996105756394]
Graph Convolutional Network (GCN) has exhibited remarkable potential in improving graph-based clustering.
Existing models must estimate an initial graph beforehand in order to apply GCN.
Deep Contrastive Graph Learning (DCGL) model is proposed for general data clustering.
arXiv Detail & Related papers (2024-02-25T07:03:37Z)
- Mitigating Semantic Confusion from Hostile Neighborhood for Graph Active Learning [38.5372139056485]
Graph Active Learning (GAL) aims to find the most informative nodes in graphs for annotation to maximize the Graph Neural Networks (GNNs) performance.
However, GAL strategies may introduce semantic confusion into the selected training set, particularly when graphs are noisy.
We present Semantic-aware Active learning framework for Graphs (SAG) to mitigate the semantic confusion problem.
arXiv Detail & Related papers (2023-08-17T07:06:54Z)
- CARL-G: Clustering-Accelerated Representation Learning on Graphs [18.763104937800215]
We propose a novel clustering-based framework for graph representation learning that uses a loss inspired by Cluster Validation Indices (CVIs).
CARL-G is adaptable to different clustering methods and CVIs, and we show that with the right choice of clustering method and CVI, CARL-G outperforms node classification baselines on 4/5 datasets with up to a 79x training speedup compared to the best-performing baseline.
arXiv Detail & Related papers (2023-06-12T08:14:42Z)
- Localized Contrastive Learning on Graphs [110.54606263711385]
We introduce a simple yet effective contrastive model named Localized Graph Contrastive Learning (Local-GCL).
In spite of its simplicity, Local-GCL achieves quite competitive performance in self-supervised node representation learning tasks on graphs with various scales and properties.
arXiv Detail & Related papers (2022-12-08T23:36:00Z)
- GraphCoCo: Graph Complementary Contrastive Learning [65.89743197355722]
Graph Contrastive Learning (GCL) has shown promising performance in graph representation learning (GRL) without the supervision of manual annotations.
This paper proposes an effective graph complementary contrastive learning approach named GraphCoCo to tackle the above issue.
arXiv Detail & Related papers (2022-03-24T02:58:36Z)
- Graph Representation Learning via Contrasting Cluster Assignments [57.87743170674533]
We propose a novel unsupervised graph representation model that contrasts cluster assignments, called GRCCA.
It is designed to exploit local and global information jointly by combining clustering algorithms with contrastive learning.
GRCCA shows strong competitiveness across most tasks.
arXiv Detail & Related papers (2021-12-15T07:28:58Z)
- An Empirical Study of Graph Contrastive Learning [17.246488437677616]
Graph Contrastive Learning establishes a new paradigm for learning graph representations without human annotations.
We identify several critical design considerations within a general GCL paradigm, including augmentation functions, contrasting modes, contrastive objectives, and negative mining techniques.
To foster future research and ease the implementation of GCL algorithms, we develop an easy-to-use library PyGCL, featuring modularized CL components, standardized evaluation, and experiment management.
arXiv Detail & Related papers (2021-09-02T17:43:45Z)
- Effective and Efficient Graph Learning for Multi-view Clustering [173.8313827799077]
We propose an effective and efficient graph learning model for multi-view clustering.
Our method exploits the similarity between graphs of different views by minimizing the tensor Schatten p-norm.
Our proposed algorithm is time-economical, obtains stable results, and scales well with the data size.
arXiv Detail & Related papers (2021-08-15T13:14:28Z)
- Learning to Cluster Faces via Confidence and Connectivity Estimation [136.5291151775236]
We propose a fully learnable clustering framework without requiring a large number of overlapped subgraphs.
Our method significantly improves clustering accuracy and thus performance of the recognition models trained on top, yet it is an order of magnitude more efficient than existing supervised methods.
arXiv Detail & Related papers (2020-04-01T13:39:37Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information and is not responsible for any consequences.