Deep Double Self-Expressive Subspace Clustering
- URL: http://arxiv.org/abs/2306.11592v1
- Date: Tue, 20 Jun 2023 15:10:35 GMT
- Title: Deep Double Self-Expressive Subspace Clustering
- Authors: Ling Zhao, Yunpeng Ma, Shanxiong Chen, Jun Zhou
- Abstract summary: We propose a double self-expressive subspace clustering algorithm.
The proposed algorithm can achieve better clustering than state-of-the-art methods.
- Score: 7.875193047472789
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Deep subspace clustering based on auto-encoder has received wide attention.
However, most auto-encoder-based subspace clustering methods do not utilize the
structural information in the self-expressive coefficient matrix, which limits
clustering performance. In this paper, we propose a double self-expressive
subspace clustering algorithm. The key idea of our solution is to view the
self-expressive coefficient as a feature representation of the example to get
another coefficient matrix. Then, we use the two coefficient matrices to
construct the affinity matrix for spectral clustering. We find that it can
reduce the subspace-preserving representation error and improve connectivity.
To further enhance clustering performance, we propose a self-supervised
module based on contrastive learning, which further improves the performance
of the trained network. Experiments on several benchmark datasets demonstrate
that the proposed algorithm can achieve better clustering than state-of-the-art
methods.
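The pipeline described in the abstract (self-expression on the data, a second self-expression on the resulting coefficient matrix, and spectral clustering on a fused affinity) can be sketched as follows. This is an illustrative ridge-regularized approximation on toy data, not the authors' network or solver; the function name and the fusion rule for the two coefficient matrices are assumptions.

```python
import numpy as np
from sklearn.cluster import SpectralClustering

def self_expressive_coeffs(X, lam=0.1):
    """Ridge-regularized self-expression: approximate each column of X as a
    linear combination of the other columns.
    Closed form of min_C ||X - XC||_F^2 + lam ||C||_F^2; zeroing the diagonal
    afterwards is a heuristic stand-in for the diag(C)=0 constraint."""
    G = X.T @ X
    C = np.linalg.solve(G + lam * np.eye(G.shape[0]), G)
    np.fill_diagonal(C, 0.0)  # remove trivial self-representation
    return C

# toy data: 20 samples each from two orthogonal 1-D subspaces of R^2
rng = np.random.default_rng(0)
X = np.hstack([
    np.outer([1.0, 0.0], rng.standard_normal(20)),
    np.outer([0.0, 1.0], rng.standard_normal(20)),
])

C1 = self_expressive_coeffs(X)   # first self-expression, on the raw data
C2 = self_expressive_coeffs(C1)  # second self-expression, treating C1's
                                 # columns as feature representations

# fuse the two coefficient matrices into one symmetric affinity matrix
A = (np.abs(C1) + np.abs(C1).T + np.abs(C2) + np.abs(C2).T) / 2
labels = SpectralClustering(n_clusters=2, affinity="precomputed",
                            random_state=0).fit_predict(A)
```

On this toy example the affinity matrix is block diagonal (the two subspaces are orthogonal), so spectral clustering recovers the two groups exactly; real data requires the learned deep features the paper is about.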
Related papers
- Scalable Co-Clustering for Large-Scale Data through Dynamic Partitioning and Hierarchical Merging [7.106620444966807]
Co-clustering simultaneously clusters rows and columns, revealing more fine-grained groups.
Existing co-clustering methods suffer from poor scalability and cannot handle large-scale data.
This paper presents a novel and scalable co-clustering method designed to uncover intricate patterns in high-dimensional, large-scale datasets.
arXiv Detail & Related papers (2024-10-09T04:47:22Z) - Adaptive Graph Convolutional Subspace Clustering [10.766537212211217]
Spectral-type subspace clustering algorithms have shown excellent performance in many subspace clustering applications.
In this paper, inspired by graph convolutional networks, we use the graph convolution technique to develop a feature extraction method and a coefficient matrix constraint simultaneously.
We claim that by using AGCSC, the aggregated feature representation of original data samples is suitable for subspace clustering.
arXiv Detail & Related papers (2023-05-05T10:27:23Z) - Semi-Supervised Subspace Clustering via Tensor Low-Rank Representation [64.49871502193477]
We propose a novel semi-supervised subspace clustering method, which is able to simultaneously augment the initial supervisory information and construct a discriminative affinity matrix.
Comprehensive experimental results on six commonly-used benchmark datasets demonstrate the superiority of our method over state-of-the-art methods.
arXiv Detail & Related papers (2022-05-21T01:47:17Z) - Ensemble Clustering via Co-association Matrix Self-enhancement [16.928049559092454]
Ensemble clustering integrates a set of base clustering results to generate a stronger one.
Existing methods usually rely on a co-association (CA) matrix that measures how many times two samples are grouped into the same cluster.
We propose a simple yet effective CA matrix self-enhancement framework that can improve the CA matrix to achieve better clustering performance.
arXiv Detail & Related papers (2022-05-12T07:54:32Z) - Deep Embedded K-Means Clustering [1.5697094704362897]
The key idea is that representation learning and clustering can reinforce each other.
In this paper, we propose DEKM (for Deep Embedded K-Means) to address these two tasks jointly.
Experimental results on the real-world datasets demonstrate that DEKM achieves state-of-the-art performance.
arXiv Detail & Related papers (2021-09-30T14:12:59Z) - Adaptive Attribute and Structure Subspace Clustering Network [49.040136530379094]
We propose a novel self-expressiveness-based subspace clustering network.
We first consider an auto-encoder to represent input data samples.
Then, we construct a mixed signed and symmetric structure matrix to capture the local geometric structure underlying data.
We perform self-expressiveness on the constructed attribute and structure matrices to learn their affinity graphs.
arXiv Detail & Related papers (2021-09-28T14:00:57Z) - Clustering Ensemble Meets Low-rank Tensor Approximation [50.21581880045667]
This paper explores the problem of clustering ensemble, which aims to combine multiple base clusterings to produce better performance than any individual one.
We propose a novel low-rank tensor approximation-based method to solve the problem from a global perspective.
Experimental results over 7 benchmark data sets show that the proposed model achieves a breakthrough in clustering performance, compared with 12 state-of-the-art methods.
arXiv Detail & Related papers (2020-12-16T13:01:37Z) - Overcomplete Deep Subspace Clustering Networks [80.16644725886968]
Experimental results on four benchmark datasets show the effectiveness of the proposed method over DSC and other clustering methods in terms of clustering error.
Our method is also less dependent than DSC on when pre-training is stopped to obtain the best performance, and is more robust to noise.
arXiv Detail & Related papers (2020-11-16T22:07:18Z) - Scalable Hierarchical Agglomerative Clustering [65.66407726145619]
Existing scalable hierarchical clustering methods sacrifice quality for speed.
We present a scalable, agglomerative method for hierarchical clustering that does not sacrifice quality and scales to billions of data points.
arXiv Detail & Related papers (2020-10-22T15:58:35Z) - Multi-View Spectral Clustering with High-Order Optimal Neighborhood Laplacian Matrix [57.11971786407279]
Multi-view spectral clustering can effectively reveal the intrinsic cluster structure among data.
This paper proposes a multi-view spectral clustering algorithm that learns a high-order optimal neighborhood Laplacian matrix.
Our proposed algorithm generates the optimal Laplacian matrix by searching the neighborhood of the linear combination of both first-order and high-order base Laplacian matrices.
arXiv Detail & Related papers (2020-08-31T12:28:40Z)
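Two of the entries above (the co-association self-enhancement and the clustering-ensemble papers) build on the co-association matrix mentioned in their summaries. A minimal sketch of how such a matrix is computed from a set of base clusterings, for illustration only and not taken from any of the papers:

```python
import numpy as np

def co_association(base_labelings):
    """Co-association (CA) matrix: entry (i, j) is the fraction of base
    clusterings that assign samples i and j to the same cluster."""
    base = np.asarray(base_labelings)  # shape: (n_clusterings, n_samples)
    m, n = base.shape
    ca = np.zeros((n, n))
    for labels in base:
        # boolean pairwise same-cluster indicator, accumulated per clustering
        ca += labels[:, None] == labels[None, :]
    return ca / m

# three base clusterings of four samples
base = [
    [0, 0, 1, 1],
    [0, 0, 0, 1],
    [1, 1, 0, 0],
]
ca = co_association(base)
# samples 0 and 1 co-occur in all three clusterings -> ca[0, 1] == 1.0
# samples 2 and 3 co-occur in two of three        -> ca[2, 3] == 2/3
```

Spectral or agglomerative clustering applied to this matrix (used as an affinity) yields the consensus clustering; the self-enhancement paper improves the CA matrix itself before that step.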
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information provided and is not responsible for any consequences of its use.