GuCNet: A Guided Clustering-based Network for Improved Classification
- URL: http://arxiv.org/abs/2010.05212v1
- Date: Sun, 11 Oct 2020 10:22:03 GMT
- Title: GuCNet: A Guided Clustering-based Network for Improved Classification
- Authors: Ushasi Chaudhuri, Syomantak Chaudhuri, Subhasis Chaudhuri
- Abstract summary: We present a novel yet very simple classification technique that leverages the ease of classifiability of any existing, well-separable dataset for guidance.
Since the guide dataset, which may or may not have any semantic relationship with the experimental dataset, forms well-separable clusters in the feature space, the proposed network tries to embed class-wise features of the challenging dataset onto those distinct clusters of the guide set.
- Score: 15.747227188672088
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We deal with the problem of semantic classification of challenging and
highly-cluttered datasets. We present a novel yet very simple classification
technique that leverages the ease of classifiability of any existing,
well-separable dataset for guidance. Since the guide dataset, which may or may
not have any semantic relationship with the experimental dataset, forms
well-separable clusters in the feature space, the proposed network tries to
embed class-wise features of the challenging dataset onto those distinct
clusters of the guide set, making them more separable. Depending on
availability, we
propose two types of guide sets: one using texture (image) guides and another
using prototype vectors representing cluster centers. Experimental results
obtained on the challenging benchmark RSSCN, LSUN, and TU-Berlin datasets
establish the efficacy of the proposed method as we outperform the existing
state-of-the-art techniques by a considerable margin.
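As a rough illustration of the prototype-guided variant described in the abstract, the sketch below pulls each sample's embedding toward the guide prototype assigned to its class while training a standard classifier. This is not the authors' code: the encoder, the fixed orthogonal prototypes, and the loss weight `alpha` are all illustrative assumptions standing in for the actual GuCNet architecture.

```python
# Minimal sketch (assumed, not the authors' implementation): embed class-wise
# features of a hard dataset onto well-separated guide prototypes.
import torch
import torch.nn as nn
import torch.nn.functional as F

class GuidedEmbeddingNet(nn.Module):
    def __init__(self, in_dim: int, embed_dim: int, num_classes: int):
        super().__init__()
        # Simple encoder standing in for whatever backbone the paper uses.
        self.encoder = nn.Sequential(
            nn.Linear(in_dim, 256), nn.ReLU(),
            nn.Linear(256, embed_dim),
        )
        # One guide prototype per class, initialised to be well separated
        # (here: mutually orthogonal) and kept fixed, mimicking a guide set.
        self.register_buffer("prototypes", torch.eye(num_classes, embed_dim))
        self.classifier = nn.Linear(embed_dim, num_classes)

    def forward(self, x):
        z = self.encoder(x)
        return z, self.classifier(z)

def guided_loss(z, logits, labels, prototypes, alpha=1.0):
    """Cross-entropy plus a pull toward each sample's class prototype."""
    ce = F.cross_entropy(logits, labels)
    pull = F.mse_loss(z, prototypes[labels])
    return ce + alpha * pull

# Toy usage with random data.
model = GuidedEmbeddingNet(in_dim=128, embed_dim=64, num_classes=10)
x = torch.randn(32, 128)
y = torch.randint(0, 10, (32,))
z, logits = model(x)
loss = guided_loss(z, logits, y, model.prototypes)
loss.backward()
```

In the texture-guide variant mentioned in the abstract, the fixed prototype vectors would presumably be replaced by features extracted from the guide images of each class.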
Related papers
- Contextuality Helps Representation Learning for Generalized Category Discovery [5.885208652383516]
This paper introduces a novel approach to Generalized Category Discovery (GCD) by leveraging the concept of contextuality.
Our model integrates two levels of contextuality: instance-level, where nearest-neighbor contexts are utilized for contrastive learning, and cluster-level, where contrastive learning is applied over clusters.
The integration of the contextual information effectively improves the feature learning and thereby the classification accuracy of all categories.
arXiv Detail & Related papers (2024-07-29T07:30:41Z) - One-Shot Learning as Instruction Data Prospector for Large Language Models [108.81681547472138]
Nuggets uses one-shot learning to select high-quality instruction data from extensive datasets.
We show that instruction tuning with the top 1% of examples curated by Nuggets substantially outperforms conventional methods employing the entire dataset.
arXiv Detail & Related papers (2023-12-16T03:33:12Z) - Generalized Category Discovery with Clustering Assignment Consistency [56.92546133591019]
Generalized category discovery (GCD) is a recently proposed open-world task.
We propose a co-training-based framework that encourages clustering consistency.
Our method achieves state-of-the-art performance on three generic benchmarks and three fine-grained visual recognition datasets.
arXiv Detail & Related papers (2023-10-30T00:32:47Z) - A Clustering-based Framework for Classifying Data Streams [0.6524460254566904]
We propose a clustering-based data stream classification framework to handle non-stationary data streams.
The proposed method provides statistically better or comparable performance than the existing methods.
arXiv Detail & Related papers (2021-06-22T14:37:52Z) - You Never Cluster Alone [150.94921340034688]
We extend the mainstream contrastive learning paradigm to a cluster-level scheme, where all the data subjected to the same cluster contribute to a unified representation.
We define a set of categorical variables as clustering assignment confidence, which links the instance-level learning track with the cluster-level one.
By reparametrizing the assignment variables, TCC is trained end-to-end, requiring no alternating steps.
arXiv Detail & Related papers (2021-06-03T14:59:59Z) - Graph Contrastive Clustering [131.67881457114316]
We propose a novel graph contrastive learning framework, which we then apply to the clustering task, yielding the Graph Contrastive Clustering (GCC) method.
Specifically, on the one hand, the graph Laplacian based contrastive loss is proposed to learn more discriminative and clustering-friendly features.
On the other hand, a novel graph-based contrastive learning strategy is proposed to learn more compact clustering assignments.
arXiv Detail & Related papers (2021-04-03T15:32:49Z) - Binary Classification from Multiple Unlabeled Datasets via Surrogate Set
Classification [94.55805516167369]
We propose a new approach for binary classification from $m$ U-sets for $m \ge 2$.
Our key idea is to consider an auxiliary classification task called surrogate set classification (SSC)
arXiv Detail & Related papers (2021-02-01T07:36:38Z) - Towards Uncovering the Intrinsic Data Structures for Unsupervised Domain
Adaptation using Structurally Regularized Deep Clustering [119.88565565454378]
Unsupervised domain adaptation (UDA) is to learn classification models that make predictions for unlabeled data on a target domain.
We propose a hybrid model of Structurally Regularized Deep Clustering, which integrates the regularized discriminative clustering of target data with a generative one.
Our proposed H-SRDC outperforms all the existing methods under both the inductive and transductive settings.
arXiv Detail & Related papers (2020-12-08T08:52:00Z) - Mixing Consistent Deep Clustering [3.5786621294068373]
Good latent representations produce semantically mixed outputs when decoding linear interpolations of two latent representations.
We propose the Mixing Consistent Deep Clustering method which encourages representations to appear realistic.
We show that the proposed method can be added to existing autoencoders to further improve clustering performance.
arXiv Detail & Related papers (2020-11-03T19:47:06Z) - A Classification-Based Approach to Semi-Supervised Clustering with
Pairwise Constraints [5.639904484784126]
We introduce a network framework for semi-supervised clustering with pairwise constraints.
In contrast to existing approaches, we decompose SSC into two simpler classification tasks/stages.
The proposed approach, S3C2, is motivated by the observation that binary classification is usually easier than multi-class clustering.
arXiv Detail & Related papers (2020-01-18T20:13:07Z)
This list is automatically generated from the titles and abstracts of the papers on this site.