Graph Contrastive Clustering
- URL: http://arxiv.org/abs/2104.01429v1
- Date: Sat, 3 Apr 2021 15:32:49 GMT
- Title: Graph Contrastive Clustering
- Authors: Huasong Zhong, Jianlong Wu, Chong Chen, Jianqiang Huang, Minghua Deng,
Liqiang Nie, Zhouchen Lin, Xian-Sheng Hua
- Abstract summary: We propose a novel graph contrastive learning framework, which is then applied to the clustering task, yielding the Graph Contrastive Clustering (GCC) method.
Specifically, on the one hand, a graph-Laplacian-based contrastive loss is proposed to learn more discriminative and clustering-friendly features.
On the other hand, a novel graph-based contrastive learning strategy is proposed to learn more compact clustering assignments.
- Score: 131.67881457114316
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Recently, some contrastive learning methods have been proposed to
simultaneously learn representations and clustering assignments, achieving
significant improvements. However, these methods do not take category
information or the clustering objective into consideration, so the learned
representations are not optimal for clustering and performance may be
limited. To address this issue, we first propose a novel graph contrastive
learning framework, which is then applied to the clustering task, yielding
the Graph Contrastive Clustering (GCC) method. Unlike basic contrastive
clustering, which only assumes that an image and its augmentation should
share similar representations and clustering assignments, we lift
instance-level consistency to cluster-level consistency under the assumption
that samples in one cluster and their augmentations should all be similar.
Specifically, on the one hand, a graph-Laplacian-based contrastive loss is
proposed to learn more discriminative and clustering-friendly features. On the
other hand, a novel graph-based contrastive learning strategy is proposed to
learn more compact clustering assignments. Both incorporate the latent
category information to reduce the intra-cluster variance while increasing the
inter-cluster variance. Experiments on six commonly used datasets demonstrate
the superiority of our proposed approach over the state-of-the-art methods.
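To make the first component concrete, here is a minimal PyTorch sketch of a graph-based contrastive feature loss in the spirit the abstract describes: a kNN graph over the current features supplies extra positives, so the loss smooths representations along the graph (a Laplacian-style term) instead of treating each instance in isolation. This is an illustrative reading of the abstract, not the authors' released code; `knn_graph`, `graph_contrastive_feature_loss`, `k`, and `tau` are hypothetical names and default values. A companion sketch of the cluster-level assignment loss appears after the related-papers list below.

```python
import torch
import torch.nn.functional as F

def knn_graph(z, k=5):
    # Binary, symmetric kNN adjacency over L2-normalized features z: (n, d).
    sim = z @ z.t()                    # pairwise cosine similarities
    sim.fill_diagonal_(-float("inf"))  # exclude self-edges from the top-k
    idx = sim.topk(k, dim=1).indices   # k nearest neighbors per sample
    A = torch.zeros_like(sim)
    A.scatter_(1, idx, 1.0)
    return ((A + A.t()) > 0).float()   # symmetrize the adjacency

def graph_contrastive_feature_loss(z1, z2, k=5, tau=0.5):
    # z1, z2: (n, d) features of two augmented views of the same batch.
    z1, z2 = F.normalize(z1, dim=1), F.normalize(z2, dim=1)
    A = knn_graph(z1.detach(), k)      # graph carrying latent category info
    logits = z1 @ z2.t() / tau         # (n, n) cross-view similarities
    log_prob = F.log_softmax(logits, dim=1)
    # Positives: each sample's own augmentation plus the augmentations of its
    # graph neighbors, so samples likely to share a cluster are pulled
    # together (reducing intra-cluster variance), while all remaining pairs
    # act as negatives (increasing inter-cluster variance).
    pos = A + torch.eye(A.size(0), device=A.device)
    return -(pos * log_prob).sum(dim=1).div(pos.sum(dim=1)).mean()
```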
Related papers
- Reinforcement Graph Clustering with Unknown Cluster Number [91.4861135742095]
We propose a new deep graph clustering method termed Reinforcement Graph Clustering.
In the proposed method, cluster number determination and unsupervised representation learning are unified in a single framework.
To provide feedback actions, a clustering-oriented reward function is proposed to enhance cohesion within clusters and separation between clusters.
arXiv Detail & Related papers (2023-08-13T18:12:28Z)
- Joint Debiased Representation and Image Clustering Learning with Self-Supervision [3.1806743741013657]
We develop a novel joint clustering and contrastive learning framework.
We adapt the debiased contrastive loss to avoid under-clustering minority classes of imbalanced datasets.
arXiv Detail & Related papers (2022-09-14T21:23:41Z)
- ACTIVE: Augmentation-Free Graph Contrastive Learning for Partial Multi-View Clustering [52.491074276133325]
We propose an augmentation-free graph contrastive learning framework to solve the problem of partial multi-view clustering.
The proposed approach elevates instance-level contrastive learning and missing-data inference to the cluster level, effectively mitigating the impact of individual missing entries on clustering.
arXiv Detail & Related papers (2022-03-01T02:32:25Z)
- Effective and Efficient Graph Learning for Multi-view Clustering [173.8313827799077]
We propose an effective and efficient graph learning model for multi-view clustering.
Our method exploits the similarity between the graphs of different views by minimizing the tensor Schatten p-norm.
The proposed algorithm is time-efficient, obtains stable results, and scales well with data size.
arXiv Detail & Related papers (2021-08-15T13:14:28Z)
- Clustering by Maximizing Mutual Information Across Views [62.21716612888669]
We propose a novel framework for image clustering that incorporates joint representation learning and clustering.
Our method significantly outperforms state-of-the-art single-stage clustering methods across a variety of image datasets.
arXiv Detail & Related papers (2021-07-24T15:36:49Z)
- Learning the Precise Feature for Cluster Assignment [39.320210567860485]
We propose a framework which integrates representation learning and clustering into a single pipeline for the first time.
The proposed framework exploits the powerful ability of recently developed generative models for learning intrinsic features.
Experimental results show that the performance of the proposed method is superior, or at least comparable to, the state-of-the-art methods.
arXiv Detail & Related papers (2021-06-11T04:08:54Z)
- You Never Cluster Alone [150.94921340034688]
We extend the mainstream contrastive learning paradigm to a cluster-level scheme, where all the data assigned to the same cluster contribute to a unified representation.
We define a set of categorical variables as clustering assignment confidence, which links the instance-level learning track with the cluster-level one.
By reparametrizing the assignment variables, the proposed method, TCC, is trained end-to-end without alternating steps (a minimal sketch of such a cluster-level scheme appears after this list).
arXiv Detail & Related papers (2021-06-03T14:59:59Z)
- Representation Learning for Clustering via Building Consensus [3.7434090710577608]
We propose Consensus Clustering using Unsupervised Representation Learning (ConCURL).
ConCURL improves clustering performance over state-of-the-art methods on four out of five image datasets.
We also extend the evaluation procedure for clustering to reflect the challenges of real-world clustering tasks.
arXiv Detail & Related papers (2021-05-04T05:04:03Z)
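Several entries above, like GCC itself and "You Never Cluster Alone" (TCC), lift contrast from instances to clusters. Below is a hedged sketch of one such cluster-level scheme, assuming soft assignments from a softmax head: each column of the (n, c) assignment matrix is treated as that cluster's representation over the batch, and matching columns are contrasted across two augmented views. The name `cluster_level_contrastive_loss` and the temperature `tau` are hypothetical, and the exact formulations in the papers differ.

```python
import torch
import torch.nn.functional as F

def cluster_level_contrastive_loss(p1, p2, tau=1.0):
    # p1, p2: (n, c) soft cluster assignments (rows sum to 1) for two
    # augmented views of the same batch (n samples, c clusters).
    q1 = F.normalize(p1.t(), dim=1)  # (c, n): a cluster's "representation"
    q2 = F.normalize(p2.t(), dim=1)  # is its assignment column over the batch
    logits = q1 @ q2.t() / tau       # (c, c) cross-view cluster similarities
    labels = torch.arange(q1.size(0), device=q1.device)
    # The same cluster across views is the positive; every other cluster is a
    # negative, pushing assignments to be consistent and clusters distinct.
    return F.cross_entropy(logits, labels)
```

In training, a loss of this kind would typically be added to the feature-level loss above with a weighting coefficient, with both views produced by the same backbone and clustering head.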