Enhancing Clustering Representations with Positive Proximity and Cluster
Dispersion Learning
- URL: http://arxiv.org/abs/2311.00731v1
- Date: Wed, 1 Nov 2023 06:12:02 GMT
- Title: Enhancing Clustering Representations with Positive Proximity and Cluster
Dispersion Learning
- Authors: Abhishek Kumar and Dong-Gyu Lee
- Abstract summary: We propose a novel end-to-end deep clustering approach named PIPCDR.
PIPCDR incorporates a positive instance proximity loss and a cluster dispersion regularizer.
We extensively validate the effectiveness of PIPCDR within an end-to-end Majorize-Minimization framework.
- Score: 9.396177578282176
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Contemporary deep clustering approaches often rely on either contrastive or
non-contrastive techniques to acquire effective representations for clustering
tasks. Contrastive methods leverage negative pairs to achieve homogeneous
representations but can introduce class collision issues, potentially
compromising clustering performance. In contrast, non-contrastive
techniques prevent class collisions but may produce non-uniform representations
that lead to clustering collapse. In this work, we propose a novel end-to-end
deep clustering approach named PIPCDR, designed to harness the strengths of
both approaches while mitigating their limitations. PIPCDR incorporates a
positive instance proximity loss and a cluster dispersion regularizer. The
positive instance proximity loss ensures alignment between augmented views of
instances and their sampled neighbors, enhancing within-cluster compactness by
selecting genuinely positive pairs within the embedding space. Meanwhile, the
cluster dispersion regularizer maximizes inter-cluster distances while
maintaining within-cluster compactness, promoting uniformity in the learned
representations. PIPCDR excels in producing well-separated clusters, generating
uniform representations, avoiding class collision issues, and enhancing
within-cluster compactness. We extensively validate the effectiveness of PIPCDR
within an end-to-end Majorize-Minimization framework, demonstrating its
competitive performance on moderate-scale clustering benchmark datasets and
establishing new state-of-the-art results on large-scale datasets.
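The two components described in the abstract can be rendered as a compact PyTorch sketch. This is not the authors' implementation: the batch-level k-nearest-neighbor positive sampling, the soft cluster centers, the temperature, and all tensor names are illustrative assumptions.

```python
# Minimal sketch (not the authors' code) of the two PIPCDR-style terms described
# in the abstract. The in-batch k-NN positive sampling, the soft cluster centers,
# and the temperature are illustrative assumptions.
import torch
import torch.nn.functional as F


def positive_instance_proximity_loss(z1, z2, k=3):
    """Align each instance's two augmented views and k sampled neighbors.

    z1, z2: (N, D) embeddings of two augmentations of the same N instances.
    """
    z1 = F.normalize(z1, dim=1)
    z2 = F.normalize(z2, dim=1)
    # Alignment between the two augmented views of the same instance.
    align = (2 - 2 * (z1 * z2).sum(dim=1)).mean()
    # Sample k nearest neighbors of each instance within the batch (excluding
    # itself) and treat them as additional positives for the other view.
    sim = z1 @ z1.t()
    sim.fill_diagonal_(-2.0)                 # cosine similarity >= -1, so self is never picked
    nn_idx = sim.topk(k, dim=1).indices      # (N, k)
    neighbors = z1[nn_idx]                   # (N, k, D)
    neighbor_align = (2 - 2 * (z2.unsqueeze(1) * neighbors).sum(dim=2)).mean()
    return align + neighbor_align


def cluster_dispersion_regularizer(z, logits, temperature=0.5):
    """Push soft cluster centers apart while keeping instances near their centers.

    z: (N, D) embeddings; logits: (N, K) raw cluster-assignment scores.
    """
    probs = logits.softmax(dim=1)                        # (N, K) soft assignments
    z = F.normalize(z, dim=1)
    centers = F.normalize(probs.t() @ z, dim=1)          # (K, D) soft cluster centers
    # Dispersion: minimizing this log-sum-exp of pairwise center similarities
    # pushes distinct centers apart (maximizes inter-cluster distances).
    k = centers.size(0)
    center_sim = centers @ centers.t() / temperature
    off_diag = ~torch.eye(k, dtype=torch.bool, device=centers.device)
    dispersion = torch.logsumexp(center_sim[off_diag].view(k, k - 1), dim=1).mean()
    # Compactness: expected squared distance of each instance to the soft centers.
    compact = (probs * (2 - 2 * (z @ centers.t()))).sum(dim=1).mean()
    return dispersion + compact
```

A full objective would weight and sum these two terms alongside the base clustering loss inside the end-to-end Majorize-Minimization framework mentioned above; that training machinery is not reproduced here.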
Related papers
- Self-Supervised Graph Embedding Clustering [70.36328717683297]
The K-means one-step dimensionality reduction clustering method has made some progress in addressing the curse of dimensionality in clustering tasks.
We propose a unified framework that integrates manifold learning with K-means, resulting in a self-supervised graph embedding framework.
arXiv Detail & Related papers (2024-09-24T08:59:51Z) - Stable Cluster Discrimination for Deep Clustering [7.175082696240088]
Deep clustering can optimize representations of instances (i.e., representation learning) and explore the inherent data distribution.
The coupled objective implies a trivial solution in which all instances collapse to uniform features.
In this work, we first show that the prevalent discrimination task in supervised learning is unstable for one-stage clustering.
A novel stable cluster discrimination (SeCu) task is proposed and a new hardness-aware clustering criterion can be obtained accordingly.
arXiv Detail & Related papers (2023-11-24T06:43:26Z) - Efficient Bilateral Cross-Modality Cluster Matching for Unsupervised Visible-Infrared Person ReID [56.573905143954015]
We propose a novel bilateral cluster matching-based learning framework to reduce the modality gap by matching cross-modality clusters.
Under such a supervisory signal, a Modality-Specific and Modality-Agnostic (MSMA) contrastive learning framework is proposed to align features jointly at the cluster level.
Experiments on the public SYSU-MM01 and RegDB datasets demonstrate the effectiveness of the proposed method.
arXiv Detail & Related papers (2023-05-22T03:27:46Z) - Dynamic Clustering and Cluster Contrastive Learning for Unsupervised
Person Re-identification [29.167783500369442]
Unsupervised Re-ID methods aim at learning robust and discriminative features from unlabeled data.
We propose a dynamic clustering and cluster contrastive learning (DCCC) method.
Experiments on several widely used public datasets validate the effectiveness of our proposed DCCC.
arXiv Detail & Related papers (2023-03-13T01:56:53Z) - Cluster-guided Contrastive Graph Clustering Network [53.16233290797777]
We propose a Cluster-guided Contrastive deep Graph Clustering network (CCGC).
We construct two views of the graph by designing special Siamese encoders whose weights are not shared between the sibling sub-networks.
To construct semantically meaningful negative sample pairs, we regard the centers of different high-confidence clusters as negative samples.
arXiv Detail & Related papers (2023-01-03T13:42:38Z) - Robust Consensus Clustering and its Applications for Advertising
Forecasting [18.242055675730253]
We propose a novel algorithm -- robust consensus clustering that can find common ground truth among experts' opinions.
We apply the proposed method to the real-world advertising campaign segmentation and forecasting tasks.
arXiv Detail & Related papers (2022-12-27T21:49:04Z) - Exploring Non-Contrastive Representation Learning for Deep Clustering [23.546602131801205]
Non-contrastive representation learning for deep clustering, termed NCC, is based on BYOL, a representative method without negative examples.
NCC forms an embedding space where all clusters are well-separated and within-cluster examples are compact.
Experimental results on several clustering benchmark datasets including ImageNet-1K demonstrate that NCC outperforms the state-of-the-art methods by a significant margin.
arXiv Detail & Related papers (2021-11-23T12:21:53Z) - Hybrid Dynamic Contrast and Probability Distillation for Unsupervised
Person Re-Id [109.1730454118532]
Unsupervised person re-identification (Re-Id) has attracted increasing attention due to its practical application in real-world video surveillance systems.
We present the hybrid dynamic cluster contrast and probability distillation algorithm.
It formulates the unsupervised Re-Id problem as a unified local-to-global dynamic contrastive learning and self-supervised probability distillation framework.
arXiv Detail & Related papers (2021-09-29T02:56:45Z) - Cluster Analysis with Deep Embeddings and Contrastive Learning [0.0]
This work proposes a novel framework for performing image clustering from deep embeddings.
Our approach jointly learns representations and predicts cluster centers in an end-to-end manner.
Our framework performs on par with widely accepted clustering methods and outperforms the state-of-the-art contrastive learning method on the CIFAR-10 dataset.
arXiv Detail & Related papers (2021-09-26T22:18:15Z) - Contrastive Clustering [57.71729650297379]
We propose Contrastive Clustering (CC), which explicitly performs instance- and cluster-level contrastive learning (a rough sketch of this dual-level idea appears after this list).
In particular, CC achieves an NMI of 0.705 (0.431) on the CIFAR-10 (CIFAR-100) dataset, a performance improvement of up to 19% (39%) over the best baseline.
arXiv Detail & Related papers (2020-09-21T08:54:40Z) - Progressive Cluster Purification for Unsupervised Feature Learning [48.87365358296371]
In unsupervised feature learning, sample-specificity-based methods ignore inter-class information.
We propose a novel clustering based method, which excludes class inconsistent samples during progressive cluster formation.
Our approach, referred to as Progressive Cluster Purification (PCP), implements progressive clustering by gradually reducing the number of clusters during training.
arXiv Detail & Related papers (2020-07-06T08:11:03Z)
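The Contrastive Clustering (CC) entry above contrasts representations at both the instance level and the cluster level. The sketch below illustrates that dual-level idea; it is not the authors' code, the two-view setup and temperature are assumptions, and CC's entropy regularizer on the cluster assignments is omitted.

```python
# Illustrative sketch of dual-level contrastive learning as described in the
# Contrastive Clustering (CC) entry above. Not the authors' code: the two-view
# setup and temperature are assumptions, and CC's entropy term is omitted.
import torch
import torch.nn.functional as F


def nt_xent(a, b, temperature=0.5):
    """NT-Xent loss: a[i] and b[i] are positives, all other rows are negatives."""
    a, b = F.normalize(a, dim=1), F.normalize(b, dim=1)
    reps = torch.cat([a, b], dim=0)                      # (2N, D)
    logits = reps @ reps.t() / temperature               # (2N, 2N)
    n = a.size(0)
    logits.masked_fill_(torch.eye(2 * n, dtype=torch.bool, device=logits.device),
                        float("-inf"))                   # drop self-similarity
    # The positive for row i is row i + n (and vice versa).
    targets = torch.cat([torch.arange(n, 2 * n), torch.arange(0, n)]).to(logits.device)
    return F.cross_entropy(logits, targets)


def dual_level_contrastive_losses(z1, z2, p1, p2):
    """Instance-level contrast on feature rows, cluster-level contrast on
    assignment columns (one column of p per cluster)."""
    instance_loss = nt_xent(z1, z2)          # z1, z2: (N, D) features of two views
    cluster_loss = nt_xent(p1.t(), p2.t())   # p1, p2: (N, K) soft assignments
    return instance_loss, cluster_loss
```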
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of this list (including all information) and is not responsible for any consequences.