Overcomplete Deep Subspace Clustering Networks
- URL: http://arxiv.org/abs/2011.08306v1
- Date: Mon, 16 Nov 2020 22:07:18 GMT
- Title: Overcomplete Deep Subspace Clustering Networks
- Authors: Jeya Maria Jose Valanarasu, Vishal M. Patel
- Abstract summary: Experimental results on four benchmark datasets show the effectiveness of the proposed method over DSC and other clustering methods in terms of clustering error.
Our method is also less dependent than DSC on when pre-training is stopped to achieve the best performance, and is more robust to noise.
- Score: 80.16644725886968
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Deep Subspace Clustering Networks (DSC) provide an efficient solution to the
problem of unsupervised subspace clustering by using an undercomplete deep
auto-encoder with a fully-connected layer to exploit the self-expressiveness
property. This method uses undercomplete representations of the input data,
which makes it less robust and more dependent on pre-training. To overcome
this, we propose a simple yet efficient alternative method, Overcomplete Deep
Subspace Clustering Networks (ODSC), in which we use overcomplete representations
for subspace clustering. In our proposed method, we fuse the features from both
undercomplete and overcomplete auto-encoder networks before passing them
through the self-expressive layer, thus enabling us to extract a more meaningful
and robust representation of the input data for clustering. Experimental
results on four benchmark datasets show the effectiveness of the proposed
method over DSC and other clustering methods in terms of clustering error. Our
method is also less dependent than DSC on when pre-training is stopped to
achieve the best performance, and is more robust to noise. Code -
\href{https://github.com/jeya-maria-jose/Overcomplete-Deep-Subspace-Clustering}{https://github.com/jeya-maria-jose/Overcomplete-Deep-Subspace-Clustering}
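The self-expressiveness property the abstract relies on says that each point in a union of subspaces can be written as a linear combination of other points from its own subspace, X ≈ XC with diag(C) = 0; the induced affinity |C| + |C|ᵀ is then fed to spectral clustering. A minimal NumPy sketch of this idea follows. It uses a ridge-regularized closed form in place of the learned fully-connected self-expressive layer of DSC/ODSC, and operates on raw data rather than auto-encoder features, so it illustrates only the principle; the function name and the toy data are illustrative, not from the paper.

```python
import numpy as np

def self_expressive_coefficients(X, lam=1e-2):
    """Ridge approximation of the self-expressive model X ~= XC.

    X   : (d, n) data matrix whose columns lie in a union of subspaces.
    lam : regularization weight on ||C||_F^2.
    Returns C: (n, n) coefficient matrix with its diagonal zeroed,
    so no point trivially represents itself.
    """
    n = X.shape[1]
    G = X.T @ X                               # (n, n) Gram matrix
    C = np.linalg.solve(G + lam * np.eye(n), G)
    np.fill_diagonal(C, 0.0)                  # zeroing post hoc is an approximation
    return C

# Toy example: 5 points on each of two orthogonal lines in R^2.
rng = np.random.default_rng(0)
u = np.array([[1.0], [0.0]])
v = np.array([[0.0], [1.0]])
X = np.hstack([u * rng.uniform(1, 2, 5), v * rng.uniform(1, 2, 5)])  # (2, 10)

C = self_expressive_coefficients(X)
A = np.abs(C) + np.abs(C).T   # symmetric affinity matrix for spectral clustering
```

Because the two lines are orthogonal, the Gram matrix is block-diagonal and the recovered affinity `A` connects points only within their own subspace, which is exactly the structure a subsequent spectral clustering step exploits. DSC (and ODSC) replace the closed-form solve with a trainable fully-connected layer whose weights play the role of C, learned jointly with the auto-encoder.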
Related papers
- OMH: Structured Sparsity via Optimally Matched Hierarchy for Unsupervised Semantic Segmentation [69.37484603556307]
Unsupervised semantic segmentation (USS) involves segmenting images without relying on predefined labels.
We introduce a novel approach called Optimally Matched Hierarchy (OMH) to simultaneously address the above issues.
Our OMH yields better unsupervised segmentation performance compared to existing USS methods.
arXiv Detail & Related papers (2024-03-11T09:46:41Z) - Multilayer Graph Approach to Deep Subspace Clustering [0.0]
Deep subspace clustering (DSC) networks based on the self-expressive model learn a representation matrix, often implemented as a fully-connected network.
Here, we apply a selected linear subspace clustering algorithm to learn the representation from the representations learned by all layers of the encoder network, including the input data.
We validate the proposed approach on four well-known datasets with two DSC networks as baseline models.
arXiv Detail & Related papers (2024-01-30T14:09:41Z) - Reinforcement Graph Clustering with Unknown Cluster Number [91.4861135742095]
We propose a new deep graph clustering method termed Reinforcement Graph Clustering.
In our proposed method, cluster-number determination and unsupervised representation learning are unified into a single framework.
To provide feedback actions, a clustering-oriented reward function is proposed that enhances cohesion within the same cluster and separation between different clusters.
arXiv Detail & Related papers (2023-08-13T18:12:28Z) - Hard Regularization to Prevent Deep Online Clustering Collapse without
Data Augmentation [65.268245109828]
Online deep clustering refers to the joint use of a feature extraction network and a clustering model to assign cluster labels to each new data point or batch as it is processed.
While faster and more versatile than offline methods, online clustering can easily reach the collapsed solution where the encoder maps all inputs to the same point and all are put into a single cluster.
We propose a method that does not require data augmentation and that, unlike existing methods, regularizes the hard assignments.
arXiv Detail & Related papers (2023-03-29T08:23:26Z) - DeepCluE: Enhanced Image Clustering via Multi-layer Ensembles in Deep
Neural Networks [53.88811980967342]
This paper presents a Deep Clustering via Ensembles (DeepCluE) approach.
It bridges the gap between deep clustering and ensemble clustering by harnessing the power of multiple layers in deep neural networks.
Experimental results on six image datasets confirm the advantages of DeepCluE over the state-of-the-art deep clustering approaches.
arXiv Detail & Related papers (2022-06-01T09:51:38Z) - Very Compact Clusters with Structural Regularization via Similarity and
Connectivity [3.779514860341336]
We propose an end-to-end deep clustering algorithm, i.e., Very Compact Clusters (VCC) for the general datasets.
Our proposed approach achieves better clustering performance over most of the state-of-the-art clustering methods.
arXiv Detail & Related papers (2021-06-09T23:22:03Z) - AutoEmbedder: A semi-supervised DNN embedding system for clustering [0.0]
This paper introduces a novel embedding system named AutoEmbedder, which maps higher-dimensional data down to clusterable embedding points.
The training process is semi-supervised and uses Siamese network architecture to compute pairwise constraint loss in the feature learning phase.
arXiv Detail & Related papers (2020-07-11T19:00:45Z) - Learnable Subspace Clustering [76.2352740039615]
We develop a learnable subspace clustering paradigm to efficiently solve the large-scale subspace clustering problem.
The key idea is to learn a parametric function to partition the high-dimensional subspaces into their underlying low-dimensional subspaces.
To the best of our knowledge, this is the first subspace clustering work to efficiently cluster millions of data points.
arXiv Detail & Related papers (2020-04-09T12:53:28Z) - Robust Self-Supervised Convolutional Neural Network for Subspace
Clustering and Classification [0.10152838128195464]
This paper proposes a robust formulation of the self-supervised convolutional subspace clustering network ($S^2$ConvSCN).
In a truly unsupervised training environment, Robust $S^2$ConvSCN outperforms its baseline version by a significant margin for both seen and unseen data on four well-known datasets.
arXiv Detail & Related papers (2020-04-03T16:07:58Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.