Large-Scale Hyperspectral Image Clustering Using Contrastive Learning
- URL: http://arxiv.org/abs/2111.07945v1
- Date: Mon, 15 Nov 2021 17:50:06 GMT
- Title: Large-Scale Hyperspectral Image Clustering Using Contrastive Learning
- Authors: Yaoming Cai, Zijia Zhang, Yan Liu, Pedram Ghamisi, Kun Li, Xiaobo Liu,
Zhihua Cai
- Abstract summary: We present a scalable deep online clustering model, named Spectral-Spatial Contrastive Clustering (SSCC).
We exploit a symmetric twin neural network comprising a projection head whose dimensionality equals the number of clusters to conduct dual contrastive learning from a spectral-spatial augmentation pool.
The resulting approach is trained end-to-end by batch-wise optimization, making it robust on large-scale data and giving it good generalization to unseen data.
- Score: 18.473767002905433
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Clustering of hyperspectral images is a fundamental but challenging task. The
recent development of hyperspectral image clustering has evolved from shallow
models to deep ones and has achieved promising results on many benchmark datasets.
However, the poor scalability, robustness, and generalization ability of these
models, mainly resulting from their offline clustering scenarios, greatly limit
their application to large-scale hyperspectral data. To circumvent these problems,
we present a scalable deep online clustering model, named Spectral-Spatial
Contrastive Clustering (SSCC), based on self-supervised learning. Specifically,
we exploit a symmetric twin neural network comprising a projection head whose
dimensionality equals the number of clusters to conduct dual contrastive learning
from a spectral-spatial augmentation pool. We define the objective function by
implicitly encouraging within-cluster similarity and reducing between-cluster
redundancy. The resulting approach is trained in an end-to-end fashion by
batch-wise optimization, making it robust on large-scale data and giving it good
generalization to unseen data. Extensive experiments on three hyperspectral image
benchmarks demonstrate the effectiveness of our approach and show that we advance
the state-of-the-art approaches by large margins.
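The abstract describes the architecture and objective only at a high level. The sketch below is a minimal, hypothetical illustration of that idea in PyTorch: a shared encoder with a projection head whose output dimension equals the number of clusters, two augmented views drawn from a stand-in augmentation pool (Gaussian noise here, not the paper's spectral-spatial augmentations), and a Barlow-Twins-style cross-correlation loss as one plausible way to encourage within-cluster similarity while reducing between-cluster redundancy. The layer sizes, NUM_BANDS, NUM_CLUSTERS, and the weighting lam are illustrative assumptions, not values or code from the paper.

```python
# Minimal sketch of a batch-wise (online) contrastive clustering step, assuming a
# Barlow-Twins-style cross-correlation loss over soft cluster assignments; the real
# SSCC architecture, augmentation pool, and hyperparameters may differ.
import torch
import torch.nn as nn
import torch.nn.functional as F

NUM_BANDS = 200       # hypothetical number of spectral bands
NUM_CLUSTERS = 16     # projection head dimensionality = number of clusters

class TwinClusteringNet(nn.Module):
    """Symmetric twin network: a shared encoder plus a projection head whose
    output dimension equals the number of clusters."""
    def __init__(self, num_bands: int, num_clusters: int):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Linear(num_bands, 256), nn.BatchNorm1d(256), nn.ReLU(),
            nn.Linear(256, 128), nn.BatchNorm1d(128), nn.ReLU(),
        )
        self.projector = nn.Linear(128, num_clusters)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Softmax turns the projection into soft cluster assignments.
        return F.softmax(self.projector(self.encoder(x)), dim=1)


def redundancy_reduction_loss(p1, p2, lam: float = 5e-3):
    """Push the diagonal of the batch cross-correlation matrix toward 1
    (within-cluster similarity across the two views) and the off-diagonal
    entries toward 0 (reduced between-cluster redundancy)."""
    n = p1.size(0)
    z1 = (p1 - p1.mean(0)) / (p1.std(0) + 1e-6)
    z2 = (p2 - p2.mean(0)) / (p2.std(0) + 1e-6)
    c = z1.T @ z2 / n                                   # cluster-by-cluster correlation
    on_diag = ((torch.diagonal(c) - 1) ** 2).sum()
    off_diag = (c - torch.diag(torch.diagonal(c))).pow(2).sum()
    return on_diag + lam * off_diag


# One optimization step on hypothetical spectral vectors.
model = TwinClusteringNet(NUM_BANDS, NUM_CLUSTERS)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
pixels = torch.rand(64, NUM_BANDS)                      # stand-in for HSI pixel spectra
view1 = pixels + 0.05 * torch.randn_like(pixels)        # stand-in for one augmentation
view2 = pixels + 0.05 * torch.randn_like(pixels)        # stand-in for a second augmentation
loss = redundancy_reduction_loss(model(view1), model(view2))
opt.zero_grad(); loss.backward(); opt.step()
print(f"loss: {loss.item():.4f}")
```

Because the loss only needs the current mini-batch, each step is constant-memory in the dataset size, which is what makes this kind of online training attractive for large-scale hyperspectral data; at inference, the argmax of the softmax output gives the cluster label for any (possibly unseen) pixel.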
Related papers
- Superpixel Graph Contrastive Clustering with Semantic-Invariant
Augmentations for Hyperspectral Images [64.72242126879503]
Hyperspectral image (HSI) clustering is an important but challenging task.
We first use 3-D and 2-D hybrid convolutional neural networks to extract the high-order spatial and spectral features of HSI.
We then design a superpixel graph contrastive clustering model to learn discriminative superpixel representations.
arXiv Detail & Related papers (2024-03-04T07:40:55Z) - Snapshot Spectral Clustering -- a costless approach to deep clustering
ensembles generation [0.0]
This paper proposes a novel deep clustering ensemble method - Snapshot Spectral Clustering.
It is designed to maximize the gain from combining multiple data views while minimizing the computational costs of creating the ensemble.
arXiv Detail & Related papers (2023-07-17T16:01:22Z) - Cluster-guided Contrastive Graph Clustering Network [53.16233290797777]
We propose a Cluster-guided Contrastive deep Graph Clustering network (CCGC)
We construct two views of the graph by designing special Siamese encoders whose weights are not shared between the sibling sub-networks.
To construct semantically meaningful negative sample pairs, we regard the centers of different high-confidence clusters as negative samples.
arXiv Detail & Related papers (2023-01-03T13:42:38Z) - A Deep Dive into Deep Cluster [0.2578242050187029]
DeepCluster is a simple and scalable approach to unsupervised pretraining of visual representations.
We show that DeepCluster's convergence and performance depend on the interplay between the quality of the randomly initialized filters of the convolutional layer and the selected number of clusters.
arXiv Detail & Related papers (2022-07-24T22:55:09Z) - DeepCluE: Enhanced Image Clustering via Multi-layer Ensembles in Deep
Neural Networks [53.88811980967342]
This paper presents a Deep Clustering via Ensembles (DeepCluE) approach.
It bridges the gap between deep clustering and ensemble clustering by harnessing the power of multiple layers in deep neural networks.
Experimental results on six image datasets confirm the advantages of DeepCluE over the state-of-the-art deep clustering approaches.
arXiv Detail & Related papers (2022-06-01T09:51:38Z) - Deep Attention-guided Graph Clustering with Dual Self-supervision [49.040136530379094]
We propose a novel method, namely deep attention-guided graph clustering with dual self-supervision (DAGC)
We develop a dual self-supervision solution consisting of a soft self-supervision strategy with a triplet Kullback-Leibler divergence loss and a hard self-supervision strategy with a pseudo supervision loss.
Our method consistently outperforms state-of-the-art methods on six benchmark datasets.
arXiv Detail & Related papers (2021-11-10T06:53:03Z) - Learning Statistical Representation with Joint Deep Embedded Clustering [2.1267423178232407]
StatDEC is an unsupervised framework for joint statistical representation learning and clustering.
Our experiments show that using these representations, one can considerably improve results on imbalanced image clustering across a variety of image datasets.
arXiv Detail & Related papers (2021-09-11T09:26:52Z) - Clustering by Maximizing Mutual Information Across Views [62.21716612888669]
We propose a novel framework for image clustering that incorporates joint representation learning and clustering.
Our method significantly outperforms state-of-the-art single-stage clustering methods across a variety of image datasets.
arXiv Detail & Related papers (2021-07-24T15:36:49Z) - Learning to Cluster Faces via Confidence and Connectivity Estimation [136.5291151775236]
We propose a fully learnable clustering framework without requiring a large number of overlapped subgraphs.
Our method significantly improves clustering accuracy and thus performance of the recognition models trained on top, yet it is an order of magnitude more efficient than existing supervised methods.
arXiv Detail & Related papers (2020-04-01T13:39:37Z)