Deep Visual Attention-Based Transfer Clustering
- URL: http://arxiv.org/abs/2107.02415v1
- Date: Tue, 6 Jul 2021 06:26:15 GMT
- Title: Deep Visual Attention-Based Transfer Clustering
- Authors: Akshaykumar Gunari, Shashidhar Veerappa Kudari, Sukanya Nadagadalli,
Keerthi Goudnaik, Ramesh Ashok Tabib, Uma Mudenagudi, and Adarsh Jamadandi
- Abstract summary: Clustering can be considered the most important unsupervised learning problem.
Image clustering is a crucial but challenging task in the domains of machine learning and computer vision.
This paper improves existing deep transfer clustering for less variant data distributions.
- Score: 2.248500763940652
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: In this paper, we propose a methodology to improve the technique of deep
transfer clustering (DTC) when applied to less variant data distributions.
Clustering can be considered the most important unsupervised learning problem. A
simple definition of clustering is "the process of organizing objects into groups
whose members are similar in some way". Image clustering is a crucial but
challenging task in the domains of machine learning and computer vision. We
discuss the clustering of data collections in which the data is less variant, and
the improvement obtained by using attention-based classifiers rather than regular
classifiers as the initial feature extractors in deep transfer clustering. We
constrain the model to learn only from the required regions of interest in the
images, yielding discriminative and robust features that do not take the
background into account. This paper thereby improves the existing deep transfer
clustering approach for less variant data distributions.
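To make the proposed pipeline concrete, the sketch below shows one plausible reading of it: a small convolutional backbone with a spatial-attention head pools features only from the attended region, and the pooled features are then clustered. The backbone depth, attention design, feature dimension, and the k-means step are illustrative assumptions, not the exact DTC architecture of the paper.

```python
# Minimal sketch (assumed architecture, not the paper's exact model): an
# attention-weighted feature extractor whose pooled features are clustered.
import torch
import torch.nn as nn
from sklearn.cluster import KMeans

class AttentionFeatureExtractor(nn.Module):
    def __init__(self, feat_dim=64):
        super().__init__()
        # Small convolutional backbone, a stand-in for a pretrained classifier.
        self.backbone = nn.Sequential(
            nn.Conv2d(3, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, feat_dim, 3, stride=2, padding=1), nn.ReLU(),
        )
        # 1x1 convolution producing a single-channel spatial attention map.
        self.attn = nn.Conv2d(feat_dim, 1, kernel_size=1)

    def forward(self, x):
        fmap = self.backbone(x)                          # (B, C, H, W)
        b, c, h, w = fmap.shape
        # Softmax over spatial positions: weights that can suppress background.
        weights = torch.softmax(self.attn(fmap).view(b, 1, h * w), dim=-1)
        # Attention-weighted pooling of the feature map.
        return (fmap.view(b, c, h * w) * weights).sum(dim=-1)   # (B, C)

extractor = AttentionFeatureExtractor()
images = torch.randn(16, 3, 64, 64)                      # dummy image batch
with torch.no_grad():
    feats = extractor(images).numpy()
labels = KMeans(n_clusters=4, n_init=10).fit_predict(feats)  # cluster the attended features
```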
Related papers
- Deep Structure and Attention Aware Subspace Clustering [29.967881186297582]
We propose a novel Deep Structure and Attention aware Subspace Clustering (DSASC) method.
We use a vision transformer to extract features, and the extracted features are divided into two parts: structure features and content features.
Our method significantly outperforms state-of-the-art methods.
arXiv Detail & Related papers (2023-12-25T01:19:47Z)
- Transferable Deep Clustering Model [14.073783373395196]
We propose a novel transferable deep clustering model that can automatically adapt the cluster centroids according to the distribution of data samples.
Our approach introduces a novel attention-based module that can adapt the centroids by measuring their relationship with samples.
Experimental results on both synthetic and real-world datasets demonstrate the effectiveness and efficiency of our proposed transfer learning framework.
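As a rough illustration of that centroid-adaptation idea, the sketch below lets each centroid attend over a batch of target-domain features and move toward its attention-weighted mean; the cosine-similarity scores, temperature, and interpolation step are assumptions for illustration rather than the paper's actual module.

```python
# Illustrative attention-based centroid adaptation (assumed form, not the paper's code):
# each centroid attends over target-domain features and is pulled toward the
# attention-weighted mean of those features.
import torch
import torch.nn.functional as F

def adapt_centroids(centroids, feats, tau=0.1, step=0.5):
    """centroids: (K, D) source centroids; feats: (N, D) target-domain features."""
    c = F.normalize(centroids, dim=1)
    f = F.normalize(feats, dim=1)
    attn = torch.softmax(c @ f.T / tau, dim=1)   # (K, N) attention over samples
    targets = attn @ feats                       # attention-weighted sample mean per centroid
    return (1 - step) * centroids + step * targets

centroids = adapt_centroids(torch.randn(4, 64), torch.randn(256, 64))
```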
arXiv Detail & Related papers (2023-10-07T23:35:17Z)
- Reinforcement Graph Clustering with Unknown Cluster Number [91.4861135742095]
We propose a new deep graph clustering method termed Reinforcement Graph Clustering.
In our proposed method, cluster number determination and unsupervised representation learning are unified into a single framework.
To provide feedback actions, a clustering-oriented reward function is proposed that enhances the cohesion within each cluster and the separation between different clusters.
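A minimal version of such a cohesion-versus-separation reward could look like the following sketch; it assumes hard labels in {0, ..., K-1} with every cluster non-empty and K >= 2, and the reward actually used in the paper may be defined differently.

```python
# Illustrative clustering reward: cohesion to the own centroid minus similarity between
# distinct centroids. Assumes labels 0..K-1, all clusters non-empty, K >= 2; not the
# paper's exact reward function.
import torch
import torch.nn.functional as F

def clustering_reward(feats, labels):
    """feats: (N, D) embeddings; labels: (N,) hard cluster ids in 0..K-1."""
    k = int(labels.max().item()) + 1
    centroids = torch.stack([feats[labels == i].mean(dim=0) for i in range(k)])
    # Cohesion: average similarity of each point to its own cluster centroid.
    cohesion = F.cosine_similarity(feats, centroids[labels], dim=1).mean()
    # Separation term: average pairwise similarity between different centroids.
    c = F.normalize(centroids, dim=1)
    sim = c @ c.T
    separation = (sim.sum() - sim.diagonal().sum()) / (k * (k - 1))
    return cohesion - separation
```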
arXiv Detail & Related papers (2023-08-13T18:12:28Z)
- DMS: Differentiable Mean Shift for Dataset Agnostic Task Specific Clustering Using Side Information [0.0]
We present a novel approach, in which we learn to cluster data directly from side information.
We do not need to know the number of clusters, their centers, or any kind of distance metric for similarity.
Our method is able to divide the same data points in various ways depending on the needs of a specific task.
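For reference, a single Gaussian-kernel mean-shift step written with differentiable operations is sketched below; the bandwidth (or a metric learned from side information) would be the trainable part here, and this generic sketch is not the DMS method itself.

```python
# Generic differentiable mean-shift step with a Gaussian kernel (illustrative only).
import torch

def mean_shift_step(points, bandwidth=1.0):
    """points: (N, D). Each point moves to the kernel-weighted mean of all points."""
    d2 = torch.cdist(points, points) ** 2                       # pairwise squared distances
    weights = torch.softmax(-d2 / (2 * bandwidth ** 2), dim=1)  # kernel weights, rows sum to one
    return weights @ points

pts = torch.randn(100, 2)
for _ in range(20):                                             # iterate until points condense
    pts = mean_shift_step(pts, bandwidth=0.5)
```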
arXiv Detail & Related papers (2023-05-29T13:45:49Z)
- Hard Regularization to Prevent Deep Online Clustering Collapse without Data Augmentation [65.268245109828]
Online deep clustering refers to the joint use of a feature extraction network and a clustering model to assign cluster labels to each new data point or batch as it is processed.
While faster and more versatile than offline methods, online clustering can easily reach the collapsed solution, where the encoder maps all inputs to the same point and all samples are put into a single cluster.
We propose a method that does not require data augmentation and that, unlike existing methods, regularizes the hard assignments.
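One generic way to regularize hard assignments against collapse is to penalize batches whose hard labels concentrate in only a few clusters, as in the sketch below; this balance penalty is an illustration of the idea, not the specific regularizer proposed in the paper.

```python
# Illustrative balance penalty on hard assignments (not the paper's regularizer).
# In practice such a term is applied to soft or straight-through assignments so
# gradients can flow; argmax is used here only to keep the sketch short.
import torch

def assignment_balance_penalty(logits):
    """logits: (N, K) cluster logits for one batch."""
    k = logits.shape[1]
    hard = logits.argmax(dim=1)                       # hard cluster assignments
    counts = torch.bincount(hard, minlength=k).float()
    usage = counts / counts.sum()                     # empirical cluster-usage distribution
    # KL(usage || uniform): zero when clusters are used equally, large under collapse.
    return (usage * (usage.clamp_min(1e-12) * k).log()).sum()
```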
arXiv Detail & Related papers (2023-03-29T08:23:26Z)
- Learning Statistical Representation with Joint Deep Embedded Clustering [2.1267423178232407]
StatDEC is an unsupervised framework for joint statistical representation learning and clustering.
Our experiments show that using these representations, one can considerably improve results on imbalanced image clustering across a variety of image datasets.
arXiv Detail & Related papers (2021-09-11T09:26:52Z)
- Very Compact Clusters with Structural Regularization via Similarity and Connectivity [3.779514860341336]
We propose an end-to-end deep clustering algorithm, Very Compact Clusters (VCC), for general datasets.
Our proposed approach achieves better clustering performance than most state-of-the-art clustering methods.
arXiv Detail & Related papers (2021-06-09T23:22:03Z)
- You Never Cluster Alone [150.94921340034688]
We extend the mainstream contrastive learning paradigm to a cluster-level scheme, where all the data assigned to the same cluster contribute to a unified representation.
We define a set of categorical variables as clustering assignment confidence, which links the instance-level learning track with the cluster-level one.
By reparametrizing the assignment variables, TCC is trained end-to-end, requiring no alternating steps.
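A common way to reparameterize categorical assignment variables so that everything trains end-to-end is the Gumbel-softmax estimator, sketched below; whether TCC uses exactly this estimator, and the dimensions and temperature chosen here, are assumptions for illustration.

```python
# Sketch of reparameterized cluster assignments via Gumbel-softmax (assumed estimator,
# illustrative dimensions): differentiable near-one-hot assignments let cluster-level
# and instance-level objectives be optimized jointly, with no alternating steps.
import torch
import torch.nn.functional as F

feats = torch.randn(32, 128, requires_grad=True)       # instance-level representations
assign_head = torch.nn.Linear(128, 10)                 # K = 10 clusters (assumed)

logits = assign_head(feats)
assign = F.gumbel_softmax(logits, tau=0.5, hard=True)  # (32, 10) differentiable one-hot
# Cluster-level representation: assignment-weighted mean of the member features.
cluster_repr = assign.T @ feats / assign.sum(dim=0, keepdim=True).T.clamp_min(1e-6)
```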
arXiv Detail & Related papers (2021-06-03T14:59:59Z)
- Graph Contrastive Clustering [131.67881457114316]
We propose a novel graph contrastive learning framework, which is then applied to the clustering task, yielding the Graph Contrastive Clustering (GCC) method.
Specifically, on the one hand, the graph Laplacian based contrastive loss is proposed to learn more discriminative and clustering-friendly features.
On the other hand, a novel graph-based contrastive learning strategy is proposed to learn more compact clustering assignments.
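As background for the Laplacian-based loss, the sketch below computes a standard graph-Laplacian smoothness term, tr(F^T L F), over features on a kNN affinity graph; the graph construction and the trace form are generic textbook choices, not GCC's exact contrastive loss.

```python
# Generic graph-Laplacian smoothness term on a kNN affinity graph (illustrative only).
import torch
import torch.nn.functional as F

def laplacian_smoothness(feats, k=10):
    """feats: (N, D) with N > k. Returns tr(F^T L F) / N for an unnormalized Laplacian."""
    sim = F.normalize(feats, dim=1)
    sim = sim @ sim.T                                    # cosine similarity matrix
    nbrs = sim.topk(k + 1, dim=1).indices                # k neighbours (plus the node itself)
    adj = torch.zeros_like(sim).scatter_(1, nbrs, 1.0)
    adj = ((adj + adj.T) > 0).float()                    # symmetric 0/1 adjacency
    adj.fill_diagonal_(0.0)                              # drop self-loops
    lap = torch.diag(adj.sum(dim=1)) - adj               # L = D - A
    return torch.trace(feats.T @ lap @ feats) / feats.shape[0]

loss = laplacian_smoothness(torch.randn(64, 16))
```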
arXiv Detail & Related papers (2021-04-03T15:32:49Z)
- Online Deep Clustering for Unsupervised Representation Learning [108.33534231219464]
Online Deep Clustering (ODC) performs clustering and network update simultaneously rather than alternatingly.
We design and maintain two dynamic memory modules: a samples memory to store sample labels and features, and a centroids memory for centroid evolution.
In this way, labels and the network evolve shoulder-to-shoulder rather than alternatingly.
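A bare-bones version of those two memories might look like the following sketch; the momentum value, the nearest-centroid relabeling, and the mean-based centroid update are illustrative assumptions rather than the exact ODC update rules.

```python
# Illustrative samples/centroids memories in the spirit of ODC (assumed update rules).
import torch

class ClusteringMemory:
    def __init__(self, n_samples, n_clusters, dim, momentum=0.5):
        self.feats = torch.zeros(n_samples, dim)                  # samples memory: features
        self.labels = torch.zeros(n_samples, dtype=torch.long)    # samples memory: labels
        self.centroids = torch.randn(n_clusters, dim)             # centroids memory
        self.m = momentum

    def update(self, idx, new_feats):
        """idx: (B,) sample indices of the batch; new_feats: (B, dim) fresh features."""
        # Momentum update of the stored features for this batch.
        self.feats[idx] = self.m * self.feats[idx] + (1 - self.m) * new_feats
        # Re-assign batch samples to their nearest current centroid.
        self.labels[idx] = torch.cdist(self.feats[idx], self.centroids).argmin(dim=1)
        # Centroids evolve as the mean of the features currently assigned to them.
        for k in range(self.centroids.shape[0]):
            members = self.feats[self.labels == k]
            if len(members) > 0:
                self.centroids[k] = members.mean(dim=0)

memory = ClusteringMemory(n_samples=1000, n_clusters=10, dim=64)
memory.update(torch.arange(32), torch.randn(32, 64))
```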
arXiv Detail & Related papers (2020-06-18T16:15:46Z)
- Unsupervised Person Re-identification via Softened Similarity Learning [122.70472387837542]
Person re-identification (re-ID) is an important topic in computer vision.
This paper studies the unsupervised setting of re-ID, which does not require any labeled information.
Experiments on two image-based and video-based datasets demonstrate state-of-the-art performance.
arXiv Detail & Related papers (2020-04-07T17:16:41Z)