Deep Attention-guided Graph Clustering with Dual Self-supervision
- URL: http://arxiv.org/abs/2111.05548v1
- Date: Wed, 10 Nov 2021 06:53:03 GMT
- Title: Deep Attention-guided Graph Clustering with Dual Self-supervision
- Authors: Zhihao Peng and Hui Liu and Yuheng Jia and Junhui Hou
- Abstract summary: We propose a novel method, namely deep attention-guided graph clustering with dual self-supervision (DAGC).
We develop a dual self-supervision solution consisting of a soft self-supervision strategy with a triplet Kullback-Leibler divergence loss and a hard self-supervision strategy with a pseudo supervision loss.
Our method consistently outperforms state-of-the-art methods on six benchmark datasets.
- Score: 49.040136530379094
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Existing deep embedding clustering works consider only the deepest layer to
learn a feature embedding and thus fail to fully exploit the discriminative
information available from cluster assignments, resulting in limited performance.
To this end, we propose a novel method, namely deep attention-guided graph
clustering with dual self-supervision (DAGC).
Specifically, DAGC first utilizes a heterogeneity-wise fusion module to
adaptively integrate the features of an auto-encoder and a graph convolutional
network in each layer and then uses a scale-wise fusion module to dynamically
concatenate the multi-scale features in different layers. Such modules are
capable of learning a discriminative feature embedding via an attention-based
mechanism. In addition, we design a distribution-wise fusion module that
leverages cluster assignments to acquire clustering results directly. To better
explore the discriminative information from the cluster assignments, we develop
a dual self-supervision solution consisting of a soft self-supervision strategy
with a triplet Kullback-Leibler divergence loss and a hard self-supervision
strategy with a pseudo supervision loss. Extensive experiments validate that
our method consistently outperforms state-of-the-art methods on six benchmark
datasets. In particular, our method improves the ARI by more than 18.14% over the
best baseline.
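The abstract describes two mechanisms that lend themselves to a short sketch: an attention weight that fuses the auto-encoder and GCN features of each layer, and a dual self-supervision objective combining KL-divergence terms over soft assignment distributions with a pseudo-label loss on confident assignments. The following PyTorch-style sketch only illustrates how such components are commonly realized; the module, the confidence threshold tau, and the loss weight lam are assumptions, not the authors' implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class AttentionFusion(nn.Module):
    """Illustrative heterogeneity-wise fusion: per-node attention weights over
    an auto-encoder feature and a GCN feature from the same layer."""
    def __init__(self, dim):
        super().__init__()
        self.att = nn.Linear(2 * dim, 2)  # one score per feature source

    def forward(self, h_ae, h_gcn):
        w = torch.softmax(self.att(torch.cat([h_ae, h_gcn], dim=1)), dim=1)
        return w[:, 0:1] * h_ae + w[:, 1:2] * h_gcn

def target_distribution(q):
    """Sharpened target distribution commonly used in deep embedding clustering."""
    p = (q ** 2) / q.sum(dim=0)
    return p / p.sum(dim=1, keepdim=True)

def dual_self_supervision_loss(q_ae, q_gcn, q_fused, lam=0.1, tau=0.8):
    """Soft strategy: KL terms pulling the AE, GCN, and fused soft assignments
    toward one sharpened target. Hard strategy: pseudo-label loss on the
    confident fused assignments."""
    p = target_distribution(q_fused).detach()
    kl = lambda q: F.kl_div(q.clamp_min(1e-8).log(), p, reduction="batchmean")
    soft = kl(q_ae) + kl(q_gcn) + kl(q_fused)

    conf, pseudo = q_fused.max(dim=1)       # confidence and hard pseudo-labels
    mask = conf > tau                       # supervise only confident nodes
    hard = (F.nll_loss(q_fused[mask].clamp_min(1e-8).log(), pseudo[mask])
            if mask.any() else q_fused.sum() * 0.0)
    return soft + lam * hard

# Toy usage: 10 nodes, 4 clusters, identical branches for brevity.
q = torch.softmax(torch.randn(10, 4), dim=1)
print(dual_self_supervision_loss(q, q, q))
```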
Related papers
- Self-Supervised Contrastive Graph Clustering Network via Structural Information Fusion [15.293684479404092]
We propose a novel deep graph clustering method called CGCN.
Our approach introduces contrastive signals and deep structural information into the pre-training process.
Our method has been experimentally validated on multiple real-world graph datasets.
arXiv Detail & Related papers (2024-08-08T09:49:26Z)
- Reinforcement Graph Clustering with Unknown Cluster Number [91.4861135742095]
We propose a new deep graph clustering method termed Reinforcement Graph Clustering.
In our proposed method, cluster number determination and unsupervised representation learning are unified within a single framework.
To provide feedback actions, a clustering-oriented reward function is proposed to enhance the cohesion within the same cluster and the separation between different clusters.
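The summary only names the reward; as an illustration of what a cohesion/separation signal can look like (not the paper's actual reward function), a silhouette-style score over the learned embeddings is high when nodes sit close to their own cluster centroid and far from the nearest other centroid:

```python
import numpy as np

def clustering_reward(z, labels):
    """Toy cohesion/separation reward for embeddings z of shape (n, d) and hard
    cluster assignments labels of shape (n,). Assumes at least two clusters."""
    _, labels = np.unique(labels, return_inverse=True)      # relabel to 0..K-1
    centroids = np.stack([z[labels == k].mean(axis=0)
                          for k in range(labels.max() + 1)])
    own = np.linalg.norm(z - centroids[labels], axis=1)      # cohesion term
    d_all = np.linalg.norm(z[:, None, :] - centroids[None], axis=2)
    d_all[np.arange(len(z)), labels] = np.inf                # mask own centroid
    other = d_all.min(axis=1)                                # separation term
    return float(np.mean((other - own) / np.maximum(other, own)))
```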
arXiv Detail & Related papers (2023-08-13T18:12:28Z)
- Deep Multi-View Subspace Clustering with Anchor Graph [11.291831842959926]
We propose a novel deep multi-view subspace clustering method with anchor graph (DMCAG).
DMCAG learns the embedded features for each view independently, which are used to obtain the subspace representations.
Our method achieves superior clustering performance over other state-of-the-art methods.
arXiv Detail & Related papers (2023-05-11T16:17:43Z)
- Dual Information Enhanced Multi-view Attributed Graph Clustering [11.624319530337038]
A novel Dual Information enhanced multi-view Attributed Graph Clustering (DIAGC) method is proposed in this paper.
The proposed method introduces the Specific Information Reconstruction (SIR) module to disentangle the explorations of the consensus and specific information from multiple views.
The Mutual Information Maximization (MIM) module maximizes the agreement between the latent high-level representation and low-level ones, and enables the high-level representation to satisfy the desired clustering structure.
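A minimal sketch of an agreement objective of this kind, written as a generic InfoNCE-style mutual-information lower bound between paired high-level and low-level representations (the concrete MIM formulation in DIAGC may differ):

```python
import torch
import torch.nn.functional as F

def infonce_agreement(z_high, z_low, temperature=0.5):
    """Maximize agreement between matched rows of z_high and z_low by
    classifying each high-level vector against all low-level ones."""
    z_high = F.normalize(z_high, dim=1)
    z_low = F.normalize(z_low, dim=1)
    logits = z_high @ z_low.t() / temperature                     # pairwise similarity
    targets = torch.arange(z_high.size(0), device=z_high.device)  # diagonal positives
    return F.cross_entropy(logits, targets)
```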
arXiv Detail & Related papers (2022-11-28T01:18:04Z)
- Rethinking Clustering-Based Pseudo-Labeling for Unsupervised Meta-Learning [146.11600461034746]
CACTUs, a method for unsupervised meta-learning, is a clustering-based approach with pseudo-labeling.
This approach is model-agnostic and can be combined with supervised algorithms to learn from unlabeled data.
We prove that the core reason for this is the lack of a clustering-friendly property in the embedding space.
arXiv Detail & Related papers (2022-09-27T19:04:36Z)
- Attention-driven Graph Clustering Network [49.040136530379094]
We propose a novel deep clustering method named Attention-driven Graph Clustering Network (AGCN).
AGCN exploits a heterogeneity-wise fusion module to dynamically fuse the node attribute feature and the topological graph feature.
AGCN can jointly perform feature learning and cluster assignment in an unsupervised fashion.
arXiv Detail & Related papers (2021-08-12T02:30:38Z)
- CaEGCN: Cross-Attention Fusion based Enhanced Graph Convolutional Network for Clustering [51.62959830761789]
We propose a cross-attention based deep clustering framework, named Cross-Attention Fusion based Enhanced Graph Convolutional Network (CaEGCN).
CaEGCN contains four main modules: cross-attention fusion, Content Auto-encoder, Graph Convolutional Auto-encoder and self-supervised model.
Experimental results on different types of datasets prove the superiority and robustness of the proposed CaEGCN.
arXiv Detail & Related papers (2021-01-18T05:21:59Z)
- Learning to Cluster Faces via Confidence and Connectivity Estimation [136.5291151775236]
We propose a fully learnable clustering framework without requiring a large number of overlapped subgraphs.
Our method significantly improves clustering accuracy, and thus the performance of the recognition models trained on top, while being an order of magnitude more efficient than existing supervised methods.
arXiv Detail & Related papers (2020-04-01T13:39:37Z)