Maximum Entropy Subspace Clustering Network
- URL: http://arxiv.org/abs/2012.03176v1
- Date: Sun, 6 Dec 2020 03:50:49 GMT
- Title: Maximum Entropy Subspace Clustering Network
- Authors: Zhihao Peng, Yuheng Jia, Hui Liu, Junhui Hou, Qingfu Zhang
- Abstract summary: We propose a novel deep learning-based clustering method named Maximum Entropy Subspace Clustering Network (MESC-Net).
MESC-Net maximizes the learned affinity matrix's entropy to encourage it to exhibit an ideal affinity matrix structure.
We experimentally show that its elements corresponding to the same subspace are uniformly and densely distributed, which gives better clustering performance.
- Score: 46.96462192586944
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Deep subspace clustering network (DSC-Net) and its numerous variants have
achieved impressive performance for subspace clustering, in which an
auto-encoder non-linearly maps the input data into a latent space, and a fully
connected layer, named the self-expressiveness module, is introduced between the
encoder and the decoder to learn an affinity matrix. However, the adopted
regularization on the affinity matrix (e.g., sparse, Tikhonov, or low-rank) is
still insufficient to drive the learning of an ideal affinity matrix, thus
limiting their performance. In addition, in DSC-Net, the self-expressiveness
module and the auto-encoder module are tightly coupled, making the training of
the DSC-Net non-trivial. To address these issues, we propose a novel deep
learning-based clustering method named Maximum Entropy Subspace Clustering
Network (MESC-Net). Specifically, MESC-Net maximizes the learned affinity
matrix's entropy to encourage it to exhibit an ideal affinity matrix structure.
We theoretically prove that the affinity matrix driven by MESC-Net obeys the
block-diagonal property, and experimentally show that its elements
corresponding to the same subspace are uniformly and densely distributed, which
gives better clustering performance. Moreover, we explicitly decouple the
auto-encoder module and the self-expressiveness module. Extensive quantitative
and qualitative results on commonly used benchmark datasets validate that MESC-Net
significantly outperforms state-of-the-art methods.
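To make the mechanism concrete, here is a minimal PyTorch sketch of a self-expressiveness module with an entropy regularizer in the spirit of the abstract above. It is not the authors' released code: the initialization, the row-wise normalization of |C| used to form a distribution, and the loss weights `lam` and `gamma` are illustrative assumptions, and the exact constraints MESC-Net places on C may differ.

```python
# Illustrative sketch only -- not the authors' code. The normalization
# of |C| and the loss weights below are assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F

class SelfExpressiveLayer(nn.Module):
    """Learns an n x n coefficient matrix C such that Z ~= C @ Z."""

    def __init__(self, n_samples: int):
        super().__init__()
        self.coef = nn.Parameter(1e-4 * torch.rand(n_samples, n_samples))

    def forward(self, z: torch.Tensor):
        # Zero the diagonal so a sample cannot trivially represent itself.
        c = self.coef - torch.diag(torch.diag(self.coef))
        return c @ z, c

def mesc_style_loss(z, c_z, c, lam=1.0, gamma=0.1):
    # Self-expression residual ||Z - CZ||_F^2.
    se_loss = F.mse_loss(c_z, z, reduction="sum")
    # Entropy of row-normalized |C|; maximizing it (note the minus sign)
    # pushes same-subspace coefficients toward a uniform, dense pattern.
    p = c.abs() + 1e-12
    p = p / p.sum(dim=1, keepdim=True)
    entropy = -(p * p.log()).sum()
    return lam * se_loss - gamma * entropy
```

Since the abstract states that the auto-encoder and the self-expressiveness module are explicitly decoupled, `z` here would come from an encoder trained beforehand rather than being optimized jointly with C.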
Related papers
- Hierarchical Multiple Kernel K-Means Algorithm Based on Sparse Connectivity [6.524937526547959]
This paper proposes a hierarchical multiple kernel K-Means (SCHMKKM) algorithm based on sparse connectivity.
It is shown that fusing more discriminative information is beneficial for learning a better consistent partition matrix.
arXiv Detail & Related papers (2024-10-27T09:35:09Z)
- Synergistic eigenanalysis of covariance and Hessian matrices for enhanced binary classification [72.77513633290056]
We present a novel approach that combines the eigenanalysis of a covariance matrix evaluated on a training set with that of a Hessian matrix evaluated on a deep learning model.
Our method captures intricate patterns and relationships, enhancing classification performance.
arXiv Detail & Related papers (2024-02-14T16:10:42Z)
- Deep Double Self-Expressive Subspace Clustering [7.875193047472789]
We propose a double self-expressive subspace clustering algorithm.
The proposed algorithm can achieve better clustering than state-of-the-art methods.
arXiv Detail & Related papers (2023-06-20T15:10:35Z)
- Local Sample-weighted Multiple Kernel Clustering with Consensus Discriminative Graph [73.68184322526338]
Multiple kernel clustering (MKC) aims to achieve optimal information fusion from a set of base kernels.
This paper proposes a novel local sample-weighted multiple kernel clustering model.
Experimental results demonstrate that our LSWMKC possesses better local manifold representation and outperforms existing kernel- or graph-based clustering algorithms.
arXiv Detail & Related papers (2022-07-05T05:00:38Z)
- Semi-Supervised Subspace Clustering via Tensor Low-Rank Representation [64.49871502193477]
We propose a novel semi-supervised subspace clustering method, which is able to simultaneously augment the initial supervisory information and construct a discriminative affinity matrix.
Comprehensive experimental results on six commonly-used benchmark datasets demonstrate the superiority of our method over state-of-the-art methods.
arXiv Detail & Related papers (2022-05-21T01:47:17Z)
- Deep Attention-guided Graph Clustering with Dual Self-supervision [49.040136530379094]
We propose a novel method, namely deep attention-guided graph clustering with dual self-supervision (DAGC).
We develop a dual self-supervision solution consisting of a soft self-supervision strategy with a triplet Kullback-Leibler divergence loss and a hard self-supervision strategy with a pseudo supervision loss.
Our method consistently outperforms state-of-the-art methods on six benchmark datasets.
arXiv Detail & Related papers (2021-11-10T06:53:03Z)
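As a minimal sketch of the two self-supervision signals summarized in the DAGC entry above, the snippet below uses a DEC-style sharpened target for the soft KL loss and confidence-thresholded pseudo-labels for the hard loss. The paper's actual triplet KL formulation is not reproduced here; every name, the sharpening rule, and the threshold are illustrative assumptions.

```python
# Hedged sketch of soft (KL) and hard (pseudo-label) self-supervision;
# DAGC's triplet KL loss differs in detail from this DEC-style version.
import torch
import torch.nn.functional as F

def target_distribution(q: torch.Tensor) -> torch.Tensor:
    # Sharpen soft assignments q (n x k) into a target p (assumed rule).
    weight = q ** 2 / q.sum(dim=0)
    return weight / weight.sum(dim=1, keepdim=True)

def soft_loss(q: torch.Tensor) -> torch.Tensor:
    # KL(p || q) with the sharpened target treated as a constant.
    p = target_distribution(q).detach()
    return F.kl_div(q.log(), p, reduction="batchmean")

def hard_loss(logits: torch.Tensor, q: torch.Tensor, thr: float = 0.9):
    # Pseudo supervision: cross-entropy on confidently clustered samples.
    conf, pseudo = q.max(dim=1)
    mask = conf > thr
    if not mask.any():
        return logits.new_zeros(())
    return F.cross_entropy(logits[mask], pseudo[mask])
```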
- Adaptive Attribute and Structure Subspace Clustering Network [49.040136530379094]
We propose a novel self-expressiveness-based subspace clustering network.
We first use an auto-encoder to represent the input data samples.
Then, we construct a mixed signed and symmetric structure matrix to capture the local geometric structure underlying data.
We perform self-expressiveness on the constructed attribute and structure matrices to learn their affinity graphs.
arXiv Detail & Related papers (2021-09-28T14:00:57Z)
- Multiple Kernel Representation Learning on Networks [12.106994960669924]
We propose a weighted matrix factorization model that encodes random walk-based information about nodes of the network.
We extend the approach with a multiple kernel learning formulation that provides the flexibility of learning the kernel as the linear combination of a dictionary of kernels.
arXiv Detail & Related papers (2021-06-09T13:22:26Z)
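As a sketch of the multiple-kernel idea in the entry above, the snippet below builds the learned kernel as a softmax-weighted (hence convex) combination of a dictionary of base Gram matrices. The softmax parameterization and all names are assumptions for illustration, not the paper's exact formulation.

```python
# Hedged sketch: a kernel learned as a convex combination of base
# kernels; the softmax weighting is an assumption, not the paper's rule.
import torch

def combine_kernels(kernels: torch.Tensor, logits: torch.Tensor) -> torch.Tensor:
    """kernels: (m, n, n) stack of PSD base Gram matrices.
    logits: (m,) unconstrained weights; softmax keeps the mixture convex,
    so the combined matrix stays a valid (PSD) kernel."""
    w = torch.softmax(logits, dim=0)
    return torch.einsum("m,mij->ij", w, kernels)

# Toy usage: optimize the logits jointly with a downstream objective.
a = torch.rand(5, 3)
base = torch.stack([torch.eye(5), torch.ones(5, 5), a @ a.T])
logits = torch.zeros(3, requires_grad=True)
K = combine_kernels(base, logits)   # 5 x 5 combined kernel
```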
- Joint Optimization of an Autoencoder for Clustering and Embedding [22.16059261437617]
We present an alternative where the autoencoder and the clustering are learned simultaneously.
A simple neural network, referred to as the clustering module, can be integrated into a deep autoencoder, resulting in a deep clustering model.
arXiv Detail & Related papers (2020-12-07T14:38:10Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.