GATCluster: Self-Supervised Gaussian-Attention Network for Image
Clustering
- URL: http://arxiv.org/abs/2002.11863v2
- Date: Sat, 6 Jun 2020 20:09:39 GMT
- Authors: Chuang Niu, Jun Zhang, Ge Wang, Jimin Liang
- Abstract summary: We propose a self-supervised clustering network for image clustering (GATCluster).
Rather than extracting intermediate features first and then performing traditional clustering, GATCluster directly outputs semantic cluster labels without further post-processing.
We develop a two-step learning algorithm that is memory-efficient for clustering large-size images.
- Score: 9.722607434532883
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We propose a self-supervised Gaussian ATtention network for image Clustering
(GATCluster). Rather than extracting intermediate features first and then
performing the traditional clustering algorithm, GATCluster directly outputs
semantic cluster labels without further post-processing. Theoretically, we give
a Label Feature Theorem guaranteeing that the learned features are one-hot
encoded vectors and that trivial solutions are avoided. To train GATCluster in a
completely unsupervised manner, we design four self-learning tasks with the
constraints of transformation invariance, separability maximization, entropy
analysis, and attention mapping. Specifically, the transformation invariance
and separability maximization tasks learn the relationships between sample
pairs. The entropy analysis task aims to avoid trivial solutions. To capture
the object-oriented semantics, we design a self-supervised attention mechanism
that includes a parameterized attention module and a soft-attention loss. All
the guiding signals for clustering are self-generated during the training
process. Moreover, we develop a two-step learning algorithm that is
memory-efficient for clustering large-size images. Extensive experiments
demonstrate the superiority of our proposed method over state-of-the-art
approaches on image clustering benchmarks. Our code has been made publicly
available at https://github.com/niuchuangnn/GATCluster.
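The four self-learning tasks are described only at a high level in the abstract. As a rough illustration, two of them, transformation invariance and entropy analysis, can be viewed as batch-level losses over soft cluster assignments. The following is a minimal NumPy sketch with illustrative function names, not the authors' implementation:

```python
import numpy as np

def transformation_invariance_loss(p_orig, p_aug):
    """Penalize disagreement between the cluster probabilities of an
    image and those of a transformed (e.g. flipped or cropped) copy."""
    return float(np.mean((p_aug - p_orig) ** 2))

def entropy_loss(p):
    """Negative entropy of the batch-mean assignment. Minimizing it
    spreads samples across clusters, discouraging the trivial solution
    where every image collapses into a single cluster."""
    mean_p = p.mean(axis=0)
    return float(np.sum(mean_p * np.log(mean_p + 1e-8)))
```

A uniform batch-mean assignment gives the lowest (most negative) entropy loss, while a batch collapsed into one cluster gives a loss near zero, which is exactly the failure mode the entropy analysis task penalizes.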
Related papers
- Reinforcement Graph Clustering with Unknown Cluster Number [91.4861135742095]
We propose a new deep graph clustering method termed Reinforcement Graph Clustering.
In our proposed method, cluster number determination and unsupervised representation learning are unified in a single framework.
To provide feedback actions, a clustering-oriented reward function is proposed to enhance cohesion within clusters and separation between different clusters.
arXiv Detail & Related papers (2023-08-13T18:12:28Z)
- Image Clustering via the Principle of Rate Reduction in the Age of Pretrained Models [37.574691902971296]
We propose a novel image clustering pipeline that leverages the powerful feature representation of large pre-trained models.
We show that our pipeline works well on standard datasets such as CIFAR-10, CIFAR-100, and ImageNet-1k.
arXiv Detail & Related papers (2023-06-08T15:20:27Z)
- DeepCut: Unsupervised Segmentation using Graph Neural Networks Clustering [6.447863458841379]
This study introduces a lightweight Graph Neural Network (GNN) to replace classical clustering methods.
Unlike existing methods, our GNN takes both the pair-wise affinities between local image features and the raw features as input.
We demonstrate how classical clustering objectives can be formulated as self-supervised loss functions for training an image segmentation GNN.
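As a hedged illustration of how a classical clustering objective can be turned into a differentiable self-supervised loss (the paper's own formulation and notation may differ), a soft normalized-cut over a pairwise affinity matrix can be written as follows; the function name is illustrative:

```python
import numpy as np

def soft_ncut_loss(S, W):
    """Soft normalized-cut objective as a differentiable clustering loss.
    S: (n, k) soft cluster assignments; W: (n, n) symmetric affinities.
    The loss is k minus the summed within-cluster association ratios,
    so well-separated clusters drive it toward zero."""
    d = W.sum(axis=1)                          # node degrees
    assoc = np.einsum('ik,ij,jk->k', S, W, S)  # within-cluster association
    vol = S.T @ d                              # soft cluster volumes
    return float(S.shape[1] - np.sum(assoc / (vol + 1e-8)))
```

On a block-structured affinity matrix, the correct hard assignment yields a loss near zero, while a uniform (maximally mixed) assignment is penalized.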
arXiv Detail & Related papers (2022-12-12T12:31:46Z)
- Dual Contrastive Attributed Graph Clustering Network [6.796682703663566]
We propose a generic framework called the Dual Contrastive Attributed Graph Clustering Network (DCAGC).
In DCAGC, a Neighborhood Contrast Module maximizes the similarity of neighboring nodes and improves the quality of the node representation.
All the modules of DCAGC are trained and optimized in a unified framework, so the learned node representation contains clustering-oriented messages.
arXiv Detail & Related papers (2022-06-16T03:17:01Z)
- ClusterGNN: Cluster-based Coarse-to-Fine Graph Neural Network for Efficient Feature Matching [15.620335576962475]
ClusterGNN is an attentional GNN architecture which operates on clusters for learning the feature matching task.
Our approach yields a 59.7% reduction in runtime and 58.4% reduction in memory consumption for dense detection.
arXiv Detail & Related papers (2022-04-25T14:43:15Z)
- Deep Attention-guided Graph Clustering with Dual Self-supervision [49.040136530379094]
We propose a novel method, namely deep attention-guided graph clustering with dual self-supervision (DAGC).
We develop a dual self-supervision solution consisting of a soft self-supervision strategy with a triplet Kullback-Leibler divergence loss and a hard self-supervision strategy with a pseudo supervision loss.
Our method consistently outperforms state-of-the-art methods on six benchmark datasets.
arXiv Detail & Related papers (2021-11-10T06:53:03Z)
- Self-supervised Contrastive Attributed Graph Clustering [110.52694943592974]
We propose a novel attributed graph clustering network, namely Self-supervised Contrastive Attributed Graph Clustering (SCAGC).
In SCAGC, a self-supervised contrastive loss that leverages inaccurate clustering labels is designed for node representation learning.
For out-of-sample (OOS) nodes, SCAGC can directly calculate their clustering labels.
arXiv Detail & Related papers (2021-10-15T03:25:28Z)
- Clustering by Maximizing Mutual Information Across Views [62.21716612888669]
We propose a novel framework for image clustering that incorporates joint representation learning and clustering.
Our method significantly outperforms state-of-the-art single-stage clustering methods across a variety of image datasets.
arXiv Detail & Related papers (2021-07-24T15:36:49Z)
- Learning Hierarchical Graph Neural Networks for Image Clustering [81.5841862489509]
We propose a hierarchical graph neural network (GNN) model that learns how to cluster a set of images into an unknown number of identities.
Our hierarchical GNN uses a novel approach to merge connected components predicted at each level of the hierarchy to form a new graph at the next level.
arXiv Detail & Related papers (2021-07-03T01:28:42Z)
- Online Deep Clustering for Unsupervised Representation Learning [108.33534231219464]
Online Deep Clustering (ODC) performs clustering and network update simultaneously rather than alternatingly.
We design and maintain two dynamic memory modules, i.e., samples memory to store samples labels and features, and centroids memory for centroids evolution.
In this way, labels and the network evolve shoulder-to-shoulder rather than alternatingly.
arXiv Detail & Related papers (2020-06-18T16:15:46Z)
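The interplay of ODC's two memory modules can be caricatured in a few lines. This is a toy NumPy sketch under assumed names (the actual ODC also re-weights the loss and handles undersized clusters), not the authors' implementation:

```python
import numpy as np

def odc_step(features, labels, centroids, momentum=0.5):
    """One simplified ODC-style update: move each centroid toward the
    mean feature of its currently assigned samples (centroids memory),
    then refresh labels by nearest-centroid similarity (samples memory)."""
    new_centroids = centroids.copy()
    for k in range(centroids.shape[0]):
        members = features[labels == k]
        if len(members) > 0:
            new_centroids[k] = (momentum * centroids[k]
                                + (1.0 - momentum) * members.mean(axis=0))
    # Reassign each sample to its most similar centroid (dot-product similarity).
    new_labels = np.argmax(features @ new_centroids.T, axis=1)
    return new_labels, new_centroids
```

Running this step alongside gradient updates to the feature extractor is what lets labels and the network evolve together rather than in alternating phases.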
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.