Strongly Augmented Contrastive Clustering
- URL: http://arxiv.org/abs/2206.00380v1
- Date: Wed, 1 Jun 2022 10:30:59 GMT
- Title: Strongly Augmented Contrastive Clustering
- Authors: Xiaozhi Deng, Dong Huang, Ding-Hua Chen, Chang-Dong Wang, Jian-Huang Lai
- Abstract summary: We present an end-to-end deep clustering approach termed strongly augmented contrastive clustering (SACC).
We utilize a backbone network with triply-shared weights, where a strongly augmented view and two weakly augmented views are incorporated.
Based on the representations produced by the backbone, the weak-weak view pair and the strong-weak view pairs are simultaneously exploited for the instance-level contrastive learning.
- Score: 52.00792661612913
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Deep clustering has attracted increasing attention in recent years due to its
capability of joint representation learning and clustering via deep neural
networks. In its latest developments, the contrastive learning has emerged as
an effective technique to substantially enhance the deep clustering
performance. However, existing contrastive learning based deep clustering
algorithms mostly rely on carefully designed augmentations (often with
limited transformations that preserve the structure), referred to as weak
augmentations, and do not go beyond them to explore the opportunities offered
by stronger augmentations (with more aggressive transformations or even severe
distortions). In this paper, we present an end-to-end deep
clustering approach termed strongly augmented contrastive clustering (SACC),
which extends the conventional two-augmentation-view paradigm to multiple views
and jointly leverages strong and weak augmentations for strengthened deep
clustering. Particularly, we utilize a backbone network with triply-shared
weights, where a strongly augmented view and two weakly augmented views are
incorporated. Based on the representations produced by the backbone, the
weak-weak view pair and the strong-weak view pairs are simultaneously exploited
for the instance-level contrastive learning (via an instance projector) and the
cluster-level contrastive learning (via a cluster projector), which, together
with the backbone, can be jointly optimized in a purely unsupervised manner.
Experimental results on five challenging image datasets have shown the superior
performance of the proposed SACC approach over the state-of-the-art.
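To make the described setup more concrete, below is a minimal PyTorch sketch of the architecture as the abstract presents it: one backbone with shared weights applied to a strongly augmented view and two weakly augmented views, an instance projector and a cluster projector, and NT-Xent-style contrastive losses over the weak-weak and strong-weak pairs at both levels. This is an illustrative reading under stated assumptions, not the authors' released code; the ResNet-18 backbone, layer sizes, temperature, and all names (SACCSketch, nt_xent, sacc_loss) are assumptions.

```python
# A minimal sketch (not the authors' released code) of the setup described in
# the abstract: one backbone with shared weights encodes a strongly augmented
# view and two weakly augmented views; an instance projector and a cluster
# projector are trained with contrastive losses over the weak-weak and
# strong-weak pairs. Backbone choice, layer sizes, temperature, and all names
# are illustrative assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F
import torchvision


class SACCSketch(nn.Module):
    def __init__(self, feat_dim=128, num_clusters=10):
        super().__init__()
        # "Triply-shared weights": the same backbone is applied to all three views.
        backbone = torchvision.models.resnet18(weights=None)
        backbone.fc = nn.Identity()          # expose the 512-d features
        self.backbone = backbone
        # Instance projector -> continuous embedding for instance-level contrast.
        self.instance_proj = nn.Sequential(
            nn.Linear(512, 512), nn.ReLU(), nn.Linear(512, feat_dim))
        # Cluster projector -> soft cluster assignments for cluster-level contrast.
        self.cluster_proj = nn.Sequential(
            nn.Linear(512, 512), nn.ReLU(),
            nn.Linear(512, num_clusters), nn.Softmax(dim=1))

    def forward(self, x_weak1, x_weak2, x_strong):
        h = [self.backbone(x) for x in (x_weak1, x_weak2, x_strong)]
        z = [F.normalize(self.instance_proj(hi), dim=1) for hi in h]  # instance features
        q = [self.cluster_proj(hi) for hi in h]                       # cluster probabilities
        return z, q


def nt_xent(a, b, temperature=0.5):
    """Standard NT-Xent loss between two L2-normalized views of a batch."""
    n = a.size(0)
    reps = torch.cat([a, b], dim=0)                         # (2n, d)
    sim = reps @ reps.t() / temperature                     # pairwise similarities
    sim.fill_diagonal_(float('-inf'))                       # drop self-similarity
    targets = torch.cat([torch.arange(n, 2 * n),            # positive of i is i+n ...
                         torch.arange(0, n)]).to(a.device)  # ... and vice versa
    return F.cross_entropy(sim, targets)


def sacc_loss(model, x_weak1, x_weak2, x_strong):
    """Combine the weak-weak pair and both strong-weak pairs at two levels."""
    z, q = model(x_weak1, x_weak2, x_strong)
    pairs = [(0, 1), (2, 0), (2, 1)]                        # weak-weak, strong-weak x2
    inst = sum(nt_xent(z[i], z[j]) for i, j in pairs) / len(pairs)
    # Cluster-level contrast over "cluster representations", i.e. the columns of
    # the soft-assignment matrix, a common recipe in contrastive clustering.
    qt = [F.normalize(qi.t(), dim=1) for qi in q]           # (num_clusters, batch)
    clus = sum(nt_xent(qt[i], qt[j]) for i, j in pairs) / len(pairs)
    return inst + clus
```

A training step would draw two weak and one strong augmentation of each image, call sacc_loss, and backpropagate through the backbone and both projectors jointly; the paper's exact loss weighting and augmentation choices may differ from this sketch.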
Related papers
- Dual Adversarial Perturbators Generate rich Views for Recommendation [16.284670207195056]
AvoGCL emulates curriculum learning by applying adversarial training to graph structures and embedding perturbations.
Experiments on three real-world datasets demonstrate that AvoGCL significantly outperforms the state-of-the-art competitors.
arXiv Detail & Related papers (2024-08-26T15:19:35Z)
- Structure-enhanced Contrastive Learning for Graph Clustering [4.6746630466993055]
Structure-enhanced Contrastive Learning (SECL) is introduced to address these issues by leveraging inherent network structures.
SECL utilizes a cross-view contrastive learning mechanism to enhance node embeddings without elaborate data augmentations.
Extensive experiments on six datasets confirm SECL's superiority over current state-of-the-art methods.
arXiv Detail & Related papers (2024-08-19T08:39:08Z)
- Hierarchical Contrastive Learning Enhanced Heterogeneous Graph Neural Network [59.860534520941485]
Heterogeneous graph neural networks (HGNNs), as an emerging technique, have shown a superior capacity for dealing with heterogeneous information networks (HINs).
Recently, contrastive learning, a self-supervised method, has become one of the most exciting learning paradigms and shows great potential when no labels are available.
In this paper, we study the problem of self-supervised HGNNs and propose a novel co-contrastive learning mechanism for HGNNs, named HeCo.
arXiv Detail & Related papers (2023-04-24T16:17:21Z)
- Deep Temporal Contrastive Clustering [21.660509622172274]
This paper presents a deep temporal contrastive clustering approach.
It incorporates the contrastive learning paradigm into the deep time series clustering research.
Experiments on a variety of time series datasets demonstrate the superiority of our approach over the state-of-the-art.
arXiv Detail & Related papers (2022-12-29T16:43:34Z)
- Hierarchical Consistent Contrastive Learning for Skeleton-Based Action Recognition with Growing Augmentations [33.68311764817763]
We propose a general hierarchical consistent contrastive learning framework (HiCLR) for skeleton-based action recognition.
Specifically, we first design a gradual growing augmentation policy to generate multiple ordered positive pairs.
Then, an asymmetric loss is proposed to enforce the hierarchical consistency via a directional clustering operation.
arXiv Detail & Related papers (2022-11-24T08:09:50Z)
- Vision Transformer for Contrastive Clustering [48.476602271481674]
Vision Transformer (ViT) has shown its advantages over the convolutional neural network (CNN).
This paper presents an end-to-end deep image clustering approach termed Vision Transformer for Contrastive Clustering (VTCC).
arXiv Detail & Related papers (2022-06-26T17:00:35Z)
- DeepCluE: Enhanced Image Clustering via Multi-layer Ensembles in Deep Neural Networks [53.88811980967342]
This paper presents a Deep Clustering via Ensembles (DeepCluE) approach.
It bridges the gap between deep clustering and ensemble clustering by harnessing the power of multiple layers in deep neural networks.
Experimental results on six image datasets confirm the advantages of DeepCluE over the state-of-the-art deep clustering approaches.
arXiv Detail & Related papers (2022-06-01T09:51:38Z)
- Large-Scale Hyperspectral Image Clustering Using Contrastive Learning [18.473767002905433]
We present a scalable deep online clustering model, named Spectral-Spatial Contrastive Clustering (SSCC).
We exploit a symmetric twin neural network comprising a projection head whose output dimensionality equals the number of clusters, and conduct dual contrastive learning from a spectral-spatial augmentation pool.
The resulting approach is trained in an end-to-end fashion by batch-wise optimization, making it robust on large-scale data and giving good generalization to unseen data.
arXiv Detail & Related papers (2021-11-15T17:50:06Z)
- Graph Contrastive Clustering [131.67881457114316]
We propose a novel graph contrastive learning framework, which is then applied to the clustering task, yielding the Graph Contrastive Clustering (GCC) method.
Specifically, on the one hand, the graph Laplacian based contrastive loss is proposed to learn more discriminative and clustering-friendly features.
On the other hand, a novel graph-based contrastive learning strategy is proposed to learn more compact clustering assignments.
arXiv Detail & Related papers (2021-04-03T15:32:49Z)
- Deep Clustering by Semantic Contrastive Learning [67.28140787010447]
We introduce a novel variant called Semantic Contrastive Learning (SCL).
It explores the characteristics of both conventional contrastive learning and deep clustering.
It can amplify the strengths of contrastive learning and deep clustering in a unified approach.
arXiv Detail & Related papers (2021-03-03T20:20:48Z)
This list is automatically generated from the titles and abstracts of the papers on this site.