Self-Supervised Deep Subspace Clustering with Entropy-norm
- URL: http://arxiv.org/abs/2206.04958v1
- Date: Fri, 10 Jun 2022 09:15:33 GMT
- Title: Self-Supervised Deep Subspace Clustering with Entropy-norm
- Authors: Guangyi Zhao and Simin Kou and Xuesong Yin
- Abstract summary: Self-Supervised deep Subspace Clustering with Entropy-norm (S$^{3}$CE)
S$^{3}$CE exploits a self-supervised contrastive network to gain a more effective feature vector.
A new module with data enhancement is designed to help S$^{3}$CE focus on the key information of the data.
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Auto-Encoder based deep subspace clustering (DSC) is widely used in computer
vision, motion segmentation and image processing. However, it suffers from
three issues in the self-expressive matrix learning process: first, the simple
reconstruction loss provides little useful information for learning
self-expressive weights; second, constructing the self-expression layer, whose
size scales with the number of samples, incurs a high computational cost; and
third, the existing regularization terms offer only limited connectivity. To
address these issues, in this paper
we propose a novel model named Self-Supervised deep Subspace Clustering with
Entropy-norm (S$^{3}$CE). Specifically, S$^{3}$CE exploits a self-supervised
contrastive network to gain a more effective feature vector. The local structure
and dense connectivity of the original data benefit from the self-expressive
layer and additional entropy-norm constraint. Moreover, a new module with data
enhancement is designed to help S$^{3}$CE focus on the key information of data,
and improve the clustering performance of positive and negative instances
through spectral clustering. Extensive experimental results demonstrate the
superior performance of S$^{3}$CE in comparison to the state-of-the-art
approaches.
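The combination of a self-expression layer and an entropy-norm constraint described above can be sketched concretely. The snippet below is an illustrative reconstruction, not the authors' implementation: it learns a non-negative, column-stochastic self-expression matrix C for features Z by gradient descent on the reconstruction loss plus an entropy term, then forms the symmetric affinity that spectral clustering would consume. The simplex parameterization, learning rate, and `gamma` value are all assumptions.

```python
import numpy as np

def self_expressive_entropy(Z, gamma=0.1, lr=0.01, steps=200):
    """Learn a self-expression matrix C for features Z (d x n) by gradient
    descent on ||Z - ZC||_F^2 + gamma * sum_ij c_ij log c_ij.
    Columns of C are kept non-negative and sum to one via a softmax over an
    unconstrained parameter matrix A; the diagonal is masked so no sample
    reconstructs itself. (Illustrative sketch only.)"""
    n = Z.shape[1]
    A = np.zeros((n, n))
    mask = -1e9 * np.eye(n)                  # forbid self-reconstruction
    for _ in range(steps):
        logits = A + mask
        expL = np.exp(logits - logits.max(axis=0, keepdims=True))
        C = expL / expL.sum(axis=0, keepdims=True)
        R = Z - Z @ C                         # reconstruction residual
        # gradient of the loss w.r.t. C: data term + entropy term
        G = -2.0 * Z.T @ R + gamma * (np.log(C + 1e-12) + 1.0)
        # backpropagate through the column-wise softmax
        GA = C * (G - (G * C).sum(axis=0, keepdims=True))
        A -= lr * GA
    return C

def affinity(C):
    """Symmetric affinity matrix for spectral clustering."""
    return 0.5 * (np.abs(C) + np.abs(C).T)
```

In the full S$^{3}$CE pipeline, Z would come from the self-supervised contrastive encoder, and the affinity matrix would be fed to spectral clustering to produce the final labels.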
Related papers
- Autoencoded UMAP-Enhanced Clustering for Unsupervised Learning [49.1574468325115]
We propose a novel approach to unsupervised learning by constructing a non-linear embedding of the data into a low-dimensional space followed by any conventional clustering algorithm.
The embedding promotes clusterability of the data and is comprised of two mappings: the encoder of an autoencoder neural network and the output of UMAP algorithm.
When applied to MNIST data, AUEC significantly outperforms the state-of-the-art techniques in terms of clustering accuracy.
arXiv Detail & Related papers (2025-01-13T22:30:38Z)
- Unfolding ADMM for Enhanced Subspace Clustering of Hyperspectral Images
We introduce an innovative clustering architecture for hyperspectral images (HSI) by unfolding an iterative solver based on the Alternating Direction Method of Multipliers (ADMM) for sparse subspace clustering.
Our approach captures well the structural characteristics of HSI data by employing the K nearest neighbors algorithm as part of a structure preservation module.
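The unfolding idea starts from the classical ADMM iterations for the sparse self-expression objective. The sketch below shows only those base iterations, with fixed `lam` and `rho` and without the paper's KNN structure-preservation module; an unfolded network would turn each loop body into a layer with learned per-layer parameters.

```python
import numpy as np

def soft_threshold(M, t):
    """Proximal operator of the l1 norm (elementwise soft-thresholding)."""
    return np.sign(M) * np.maximum(np.abs(M) - t, 0.0)

def admm_ssc(X, lam=0.1, rho=1.0, iters=50):
    """Classical ADMM for min_C 0.5*||X - XC||_F^2 + lam*||C||_1
    with the SSC constraint diag(C) = 0 enforced in the proximal step.
    X has one sample per column."""
    n = X.shape[1]
    G = X.T @ X
    inv = np.linalg.inv(G + rho * np.eye(n))   # factor once, reuse per step
    A = np.zeros((n, n))
    U = np.zeros((n, n))
    for _ in range(iters):
        C = inv @ (G + rho * (A - U))          # quadratic subproblem
        A = soft_threshold(C + U, lam / rho)   # l1 proximal step
        np.fill_diagonal(A, 0.0)               # no self-representation
        U = U + C - A                          # dual update
    return A
```

In the unfolded variant, `lam / rho` (and possibly the matrix inverse) becomes a learnable quantity per layer, trained end-to-end on the clustering task.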
arXiv Detail & Related papers (2024-04-10T15:51:46Z)
- UGMAE: A Unified Framework for Graph Masked Autoencoders [67.75493040186859]
We propose UGMAE, a unified framework for graph masked autoencoders.
We first develop an adaptive feature mask generator to account for the unique significance of nodes.
We then design a ranking-based structure reconstruction objective joint with feature reconstruction to capture holistic graph information.
arXiv Detail & Related papers (2024-02-12T19:39:26Z)
- Unsupervised Learning on 3D Point Clouds by Clustering and Contrasting [11.64827192421785]
Unsupervised representation learning is a promising direction for auto-extracting features without human intervention.
This paper proposes a general unsupervised approach, named ConClu, to perform joint learning of point-wise and global features.
arXiv Detail & Related papers (2022-02-05T12:54:17Z)
- Deep clustering with fusion autoencoder [0.0]
Deep clustering (DC) models capitalize on autoencoders to learn intrinsic features that in turn facilitate the clustering process.
In this paper, a novel DC method is proposed to address this issue. Specifically, the generative adversarial network and VAE are coalesced into a new autoencoder called fusion autoencoder (FAE).
arXiv Detail & Related papers (2022-01-11T07:38:03Z)
- Deep Attention-guided Graph Clustering with Dual Self-supervision [49.040136530379094]
We propose a novel method, namely deep attention-guided graph clustering with dual self-supervision (DAGC)
We develop a dual self-supervision solution consisting of a soft self-supervision strategy with a triplet Kullback-Leibler divergence loss and a hard self-supervision strategy with a pseudo supervision loss.
Our method consistently outperforms state-of-the-art methods on six benchmark datasets.
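As a rough illustration of soft self-supervision with a KL loss, the sketch below follows the common DEC-style recipe: Student's-t soft assignments q, a sharpened target distribution p, and the divergence KL(p || q). The paper's triplet KL and pseudo-supervision losses are variants of this idea and are not reproduced here.

```python
import numpy as np

def soft_assignments(Z, centers, alpha=1.0):
    """Student's t-kernel soft cluster assignments q (n x k)."""
    d2 = ((Z[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
    q = (1.0 + d2 / alpha) ** (-(alpha + 1.0) / 2.0)
    return q / q.sum(axis=1, keepdims=True)

def target_distribution(q):
    """Sharpened, frequency-normalized target p used as the soft
    pseudo-label in the KL self-supervision loss."""
    w = q ** 2 / q.sum(axis=0)
    return w / w.sum(axis=1, keepdims=True)

def kl_div(p, q, eps=1e-12):
    """KL(p || q); driving q toward p sharpens the cluster assignments."""
    return float((p * np.log((p + eps) / (q + eps))).sum())
```

Training alternates between recomputing p from the current q and minimizing KL(p || q) with respect to the encoder and the cluster centers.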
arXiv Detail & Related papers (2021-11-10T06:53:03Z)
- PC-RGNN: Point Cloud Completion and Graph Neural Network for 3D Object Detection [57.49788100647103]
LiDAR-based 3D object detection is an important task for autonomous driving.
Current approaches suffer from sparse and partial point clouds of distant and occluded objects.
In this paper, we propose a novel two-stage approach, namely PC-RGNN, dealing with such challenges by two specific solutions.
arXiv Detail & Related papers (2020-12-18T18:06:43Z)
- Deep N-ary Error Correcting Output Codes [66.15481033522343]
Data-independent ensemble methods like Error Correcting Output Codes (ECOC) attract increasing attention.
N-ary ECOC decomposes the original multi-class classification problem into a series of independent simpler classification subproblems.
We propose three different variants of parameter sharing architectures for deep N-ary ECOC.
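The N-ary decomposition can be illustrated with a plain code-matrix encode/decode step, independent of the deep parameter-sharing variants the paper studies. In this hypothetical sketch, each of L subclassifiers predicts one of N meta-classes, and decoding picks the class whose codeword agrees with the predictions in the most positions.

```python
import numpy as np

def make_code_matrix(K, L, N, rng):
    """Random N-ary code matrix (K classes x L subproblems), entries in
    {0, ..., N-1}. Real ECOC designs pick codes with large row separation;
    a random code is a simple common baseline."""
    return rng.integers(0, N, size=(K, L))

def decode(pred_codes, M):
    """Assign each sample to the class whose codeword agrees with the
    predicted meta-labels in the most positions (Hamming-style decoding).
    pred_codes: (n, L) meta-labels from the L subclassifiers."""
    matches = (pred_codes[:, None, :] == M[None, :, :]).sum(axis=-1)
    return matches.argmax(axis=1)
```

Because each class is identified by a full codeword, a few erroneous subclassifier outputs can still be decoded to the correct class, which is the error-correcting property ECOC relies on.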
arXiv Detail & Related papers (2020-09-22T11:35:03Z)
- Dual-constrained Deep Semi-Supervised Coupled Factorization Network with Enriched Prior [80.5637175255349]
We propose a new enriched prior based Dual-constrained Deep Semi-Supervised Coupled Factorization Network, called DS2CF-Net.
To extract hidden deep features, DS2CF-Net is modeled as a deep-structure and geometrical structure-constrained neural network.
Our network can obtain state-of-the-art performance for representation learning and clustering.
arXiv Detail & Related papers (2020-09-08T13:10:21Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences.