Contrastive Fine-grained Class Clustering via Generative Adversarial
Networks
- URL: http://arxiv.org/abs/2112.14971v1
- Date: Thu, 30 Dec 2021 08:57:11 GMT
- Title: Contrastive Fine-grained Class Clustering via Generative Adversarial
Networks
- Authors: Yunji Kim, Jung-Woo Ha
- Abstract summary: We introduce C3-GAN, a method that leverages the categorical inference power of InfoGAN by applying contrastive learning.
C3-GAN achieved state-of-the-art clustering performance on four fine-grained benchmark datasets.
- Score: 9.667133604169829
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Unsupervised fine-grained class clustering is a practical yet
challenging task due to the difficulty of learning feature representations
that capture subtle object details. We introduce C3-GAN, a method that
leverages the categorical inference power of InfoGAN by applying contrastive
learning. We aim to learn feature representations that encourage the data to
form distinct cluster boundaries in the embedding space, while also
maximizing the mutual information between the latent code and its
observation. Our approach is to train the discriminator, which is used for
inferring clusters, to optimize a contrastive loss in which the image-latent
pairs that maximize the mutual information are treated as positive pairs and
the rest as negative pairs. Specifically, we map the input of the generator,
which is sampled from the categorical distribution, to the embedding space of
the discriminator and let it act as a cluster centroid. In this way, C3-GAN
learns a clustering-friendly embedding space in which each cluster is
distinctly separable. Experimental results show that C3-GAN achieves
state-of-the-art clustering performance on four fine-grained benchmark
datasets, while also alleviating the mode collapse phenomenon.
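To make the mechanism concrete, here is a minimal PyTorch sketch of the kind of contrastive objective the abstract describes, with the embedded categorical codes acting as cluster centroids. All names, shapes, and the temperature value are illustrative assumptions, not the authors' released implementation.

```python
# Hedged sketch of the image-latent contrastive objective described above.
import torch
import torch.nn.functional as F

def c3gan_style_contrastive_loss(img_emb, code_emb, codes, tau=0.1):
    """img_emb:  (B, D) discriminator embeddings of generated images.
    code_emb: (K, D) embeddings of the K categorical latent codes, which
              play the role of cluster centroids in the embedding space.
    codes:    (B,) index of the code each image was generated from.
    """
    img_emb = F.normalize(img_emb, dim=1)
    code_emb = F.normalize(code_emb, dim=1)
    # (B, K) image-to-centroid similarities; the matching (image, latent)
    # pair is the positive, and all other centroids act as negatives.
    logits = img_emb @ code_emb.t() / tau
    return F.cross_entropy(logits, codes)
```

In training, a term of this shape would be optimized by the discriminator alongside the usual adversarial losses; at inference, a real image would be assigned to the centroid it is most similar to.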
Related papers
- A3S: A General Active Clustering Method with Pairwise Constraints [66.74627463101837]
A3S features strategic active clustering adjustment on the initial cluster result, which is obtained by an adaptive clustering algorithm.
In extensive experiments across diverse real-world datasets, A3S achieves the desired clustering results with significantly fewer human queries.
arXiv Detail & Related papers (2024-07-14T13:37:03Z)
- Stable Cluster Discrimination for Deep Clustering [7.175082696240088]
Deep clustering can optimize representations of instances (i.e., representation learning) and explore the inherent data distribution.
The coupled objective implies a trivial solution in which all instances collapse to uniform features.
In this work, we first show that the prevalent discrimination task in supervised learning is unstable for one-stage clustering.
A novel stable cluster discrimination (SeCu) task is proposed and a new hardness-aware clustering criterion can be obtained accordingly.
arXiv Detail & Related papers (2023-11-24T06:43:26Z)
- Reinforcement Graph Clustering with Unknown Cluster Number [91.4861135742095]
We propose a new deep graph clustering method termed Reinforcement Graph Clustering.
In our proposed method, cluster number determination and unsupervised representation learning are unified into a single framework.
To provide feedback for the actions, a clustering-oriented reward function is proposed that enhances the cohesion within clusters and the separation between different clusters.
arXiv Detail & Related papers (2023-08-13T18:12:28Z)
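For the Reinforcement Graph Clustering entry above, one plausible shape for such a clustering-oriented reward is cohesion to a node's own centroid minus similarity between different centroids. This is a generic sketch under that assumption, not the paper's reward function.

```python
import torch
import torch.nn.functional as F

def clustering_reward(emb, labels, n_clusters):
    """emb: (N, D) node embeddings; labels: (N,) cluster ids in [0, n_clusters).
    Reward rises with intra-cluster cohesion and falls as cluster centroids
    move closer together. Assumes every cluster id occurs at least once."""
    emb = F.normalize(emb, dim=1)
    centroids = torch.stack([emb[labels == k].mean(dim=0) for k in range(n_clusters)])
    centroids = F.normalize(centroids, dim=1)
    cohesion = (emb * centroids[labels]).sum(dim=1).mean()  # mean cos-sim to own centroid
    sim = centroids @ centroids.t()                         # pairwise centroid similarity
    mask = ~torch.eye(n_clusters, dtype=torch.bool, device=sim.device)
    separation = sim[mask].mean()                           # lower is better
    return cohesion - separation
```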
- Using Decision Trees for Interpretable Supervised Clustering [0.0]
Supervised clustering aims at forming clusters of labelled data with high probability densities.
We are particularly interested in finding clusters of data of a given class and describing the clusters with a set of comprehensive rules.
arXiv Detail & Related papers (2023-07-16T17:12:45Z)
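As a hedged illustration of the decision-tree idea above (not the paper's algorithm): leaves of a shallow tree that are dense in the class of interest act as clusters, and the paths to those leaves are the describing rules. The data, thresholds, and parameters below are made up.

```python
from sklearn.tree import DecisionTreeClassifier, export_text
import numpy as np

X = np.random.rand(200, 4)                    # toy features
y = (X[:, 0] + X[:, 1] > 1.0).astype(int)     # toy binary labels

tree = DecisionTreeClassifier(max_depth=3, min_samples_leaf=20).fit(X, y)
leaf_of = tree.apply(X)                       # leaf id per sample
for leaf in np.unique(leaf_of):
    members = leaf_of == leaf
    purity = y[members].mean()                # density of the target class
    if purity > 0.9:                          # a pure leaf = one cluster of class 1
        print(f"leaf {leaf}: {members.sum()} samples, purity {purity:.2f}")
print(export_text(tree))                      # the rules that describe each cluster
```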
- Oracle-guided Contrastive Clustering [28.066047266687058]
Oracle-guided Contrastive Clustering (OCC) is proposed to cluster by interactively making pairwise "same-cluster" queries to oracles with distinctive demands.
To the best of our knowledge, it is the first deep framework to perform personalized clustering.
arXiv Detail & Related papers (2022-11-01T12:05:12Z)
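A generic sketch of the oracle-query loop suggested by the OCC entry above: pairwise "same-cluster" answers become positive or negative pairs in a contrastive term. The interface and the loss here are assumptions, not OCC's actual query strategy.

```python
import torch
import torch.nn.functional as F

def oracle_same_cluster(i, j):
    """Stand-in for a human oracle answering whether samples i and j belong
    to the same cluster under this particular user's notion of similarity."""
    raise NotImplementedError

def pairwise_contrastive_loss(z, queried_pairs, answers, tau=0.5):
    """z: (N, D) embeddings; queried_pairs: list of (i, j); answers: list of bool.
    Pulls together pairs the oracle marked same-cluster, pushes apart the rest."""
    z = F.normalize(z, dim=1)
    loss = z.new_zeros(())
    for (i, j), same in zip(queried_pairs, answers):
        sim = (z[i] * z[j]).sum() / tau
        # same-cluster answer -> maximize similarity; different -> minimize.
        # softplus(-sim) is the logistic loss -log(sigmoid(sim)).
        loss = loss + (F.softplus(-sim) if same else F.softplus(sim))
    return loss / max(len(answers), 1)
```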
- Self-Evolutionary Clustering [1.662966122370634]
Most existing deep clustering methods are based on simple distance comparison and highly dependent on the target distribution generated by a handcrafted nonlinear mapping.
A novel modular Self-Evolutionary Clustering (Self-EvoC) framework is constructed, which boosts the clustering performance by classification in a self-supervised manner.
The framework can efficiently discriminate sample outliers and generate a better target distribution with the assistance of self-supervision.
arXiv Detail & Related papers (2022-02-21T19:38:18Z)
- Self-supervised Contrastive Attributed Graph Clustering [110.52694943592974]
We propose a novel attributed graph clustering network, namely Self-supervised Contrastive Attributed Graph Clustering (SCAGC).
In SCAGC, a self-supervised contrastive loss that leverages inaccurate clustering labels is designed for node representation learning.
For out-of-sample (OOS) nodes, SCAGC can directly calculate their clustering labels.
arXiv Detail & Related papers (2021-10-15T03:25:28Z)
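In the spirit of the SCAGC entry above, the following sketch shows a contrastive loss driven by possibly inaccurate pseudo cluster labels, where nodes sharing a label are positives. The real SCAGC loss and graph encoder are in the paper; everything here is an assumption.

```python
import torch
import torch.nn.functional as F

def pseudo_label_contrastive_loss(z, pseudo_labels, tau=0.5):
    """z: (N, D) node embeddings; pseudo_labels: (N,) current cluster guesses.
    Nodes sharing a pseudo-label are pulled together (positives); all other
    nodes act as negatives. Assumes each label occurs at least twice."""
    z = F.normalize(z, dim=1)
    sim = z @ z.t() / tau                                    # (N, N) similarities
    same = pseudo_labels.unsqueeze(0) == pseudo_labels.unsqueeze(1)
    eye = torch.eye(len(z), dtype=torch.bool, device=z.device)
    pos = same & ~eye
    # log-softmax over each row, excluding self-similarity from the denominator
    log_prob = sim - torch.logsumexp(sim.masked_fill(eye, float('-inf')),
                                     dim=1, keepdim=True)
    # average log-likelihood of the positive pairs (SupCon-style)
    return -(log_prob[pos]).mean()
```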
- Very Compact Clusters with Structural Regularization via Similarity and Connectivity [3.779514860341336]
We propose an end-to-end deep clustering algorithm, i.e., Very Compact Clusters (VCC), for general datasets.
Our proposed approach achieves better clustering performance than most state-of-the-art clustering methods.
arXiv Detail & Related papers (2021-06-09T23:22:03Z)
- You Never Cluster Alone [150.94921340034688]
We extend the mainstream contrastive learning paradigm to a cluster-level scheme, where all the data subjected to the same cluster contribute to a unified representation.
We define a set of categorical variables as clustering assignment confidence, which links the instance-level learning track with the cluster-level one.
By reparametrizing the assignment variables, the resulting model, TCC, is trained end-to-end, requiring no alternating steps.
arXiv Detail & Related papers (2021-06-03T14:59:59Z)
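The reparametrization of categorical assignment variables mentioned in the entry above is commonly implemented with a Gumbel-softmax relaxation; this generic sketch shows that trick and is not TCC's released code.

```python
import torch
import torch.nn.functional as F

def sample_soft_assignment(logits, tau=1.0):
    """logits: (B, K) unnormalized cluster-assignment scores.
    Returns a differentiable (B, K) sample that approaches a one-hot
    categorical draw as tau -> 0, so gradients flow through the assignment.
    (PyTorch's built-in equivalent is F.gumbel_softmax.)"""
    u = torch.rand_like(logits).clamp(1e-9, 1 - 1e-9)  # clamp for numerical safety
    gumbel = -torch.log(-torch.log(u))
    return F.softmax((logits + gumbel) / tau, dim=1)
```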
- Scalable Hierarchical Agglomerative Clustering [65.66407726145619]
Existing scalable hierarchical clustering methods sacrifice quality for speed.
We present a scalable, agglomerative method for hierarchical clustering that does not sacrifice quality and scales to billions of data points.
arXiv Detail & Related papers (2020-10-22T15:58:35Z)
- Contrastive Clustering [57.71729650297379]
We propose Contrastive Clustering (CC), which explicitly performs instance- and cluster-level contrastive learning (a hedged sketch follows this list).
In particular, CC achieves an NMI of 0.705 (0.431) on the CIFAR-10 (CIFAR-100) dataset, an improvement of up to 19% (39%) over the best baseline.
arXiv Detail & Related papers (2020-09-21T08:54:40Z)
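As referenced in the Contrastive Clustering entry, here is a compact sketch of the two-level scheme: instance-level contrast on row features and cluster-level contrast on the columns of the soft-assignment matrix. Shapes and names are assumptions, and CC's full objective may include additional regularizers omitted here.

```python
import torch
import torch.nn.functional as F

def info_nce(a, b, tau=0.5):
    """Symmetric InfoNCE between two aligned batches of vectors (M, D):
    a[i] and b[i] form the positive pair; everything else is a negative."""
    a, b = F.normalize(a, dim=1), F.normalize(b, dim=1)
    logits = a @ b.t() / tau
    targets = torch.arange(len(a), device=a.device)
    return 0.5 * (F.cross_entropy(logits, targets) +
                  F.cross_entropy(logits.t(), targets))

def contrastive_clustering_loss(z1, z2, p1, p2):
    """z1, z2: (B, D) features of two augmented views of the same batch.
    p1, p2: (B, K) soft cluster assignments of the two views."""
    instance = info_nce(z1, z2)          # rows: each sample vs. its other view
    cluster = info_nce(p1.t(), p2.t())   # columns: each cluster vs. its other view
    return instance + cluster
```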
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences of its use.