Adaptively Topological Tensor Network for Multi-view Subspace Clustering
- URL: http://arxiv.org/abs/2305.00716v1
- Date: Mon, 1 May 2023 08:28:33 GMT
- Title: Adaptively Topological Tensor Network for Multi-view Subspace Clustering
- Authors: Yipeng Liu, Yingcong Lu, Weiting Ou, Zhen Long, Ce Zhu
- Abstract summary: Multi-view subspace clustering uses learned self-representation tensors to exploit low rank information.
A pre-defined tensor decomposition may not fully exploit low rank information for a certain dataset.
We propose the adaptively topological tensor network (ATTN) by determining the edge ranks from the structural information of the self-representation tensor.
- Score: 36.790637575875635
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Multi-view subspace clustering methods have employed learned
self-representation tensors from different tensor decompositions to exploit low
rank information. However, the data structures embedded with
self-representation tensors may vary in different multi-view datasets.
Therefore, a pre-defined tensor decomposition may not fully exploit low rank
information for a certain dataset, resulting in sub-optimal multi-view
clustering performance. To alleviate these limitations, we propose the
adaptively topological tensor network (ATTN), which determines the edge ranks
from the structural information of the self-representation tensor and thus
yields a better tensor representation through a data-driven strategy.
Specifically, in multi-view tensor clustering, we analyze the higher-order
correlations among different modes of a self-representation tensor, and prune
the links of the weakly correlated ones from a fully connected tensor network.
Therefore, the newly obtained tensor networks can efficiently explore the
essential clustering information in self-representation tensors whose
structures differ across datasets. A greedy adaptive rank-increasing strategy is
further applied to improve the capacity to capture low-rank structure. We apply
ATTN on multi-view subspace clustering and utilize the alternating direction
method of multipliers to solve it. Experimental results show that multi-view
subspace clustering based on ATTN outperforms its counterparts on six
multi-view datasets.
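
A minimal sketch of the adaptive-topology idea described above, assuming NumPy. The correlation proxy (top singular value of a bi-mode unfolding), the threshold tau, and the rank schedule are illustrative assumptions for exposition, not the authors' exact procedure.

```python
import itertools
import numpy as np


def mode_correlation(T, i, j):
    """Proxy for how strongly modes i and j of T are coupled: the top singular
    value of the (i, j) bi-unfolding, normalized by the Frobenius norm of T."""
    rest = [m for m in range(T.ndim) if m not in (i, j)]
    M = np.transpose(T, (i, j, *rest)).reshape(T.shape[i] * T.shape[j], -1)
    top_sv = np.linalg.svd(M, compute_uv=False)[0]
    return top_sv / (np.linalg.norm(T) + 1e-12)


def adaptive_topology(T, tau=0.4, base_rank=2, max_rank=8):
    """Return {(i, j): rank} for a pruned fully connected tensor network (FCTN).
    Edges whose correlation falls below tau are pruned (an edge rank of 1 carries
    no coupling); surviving edges get a rank that grows with the correlation."""
    ranks = {}
    for i, j in itertools.combinations(range(T.ndim), 2):
        c = mode_correlation(T, i, j)
        if c < tau:
            ranks[(i, j)] = 1  # pruned link
        else:
            # greedy, correlation-driven rank increase (illustrative schedule)
            ranks[(i, j)] = min(max_rank,
                                base_rank + int(np.ceil(c * (max_rank - base_rank))))
    return ranks


# Toy usage on a random 4th-order "self-representation" tensor
Z = np.random.randn(10, 10, 6, 3)
print(adaptive_topology(Z))
```

In a fully connected tensor network, setting an edge rank to 1 makes the link between the two corresponding factors trivial, which is how pruning is expressed in this sketch.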
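A hedged sketch of the surrounding multi-view subspace clustering pipeline is given below: learn a self-representation matrix per view, fuse the affinities, and spectrally cluster. For brevity, the per-view representation comes from ridge-regularized least squares rather than the ATTN-regularized ADMM solver described in the abstract; the regularization weight and affinity construction are illustrative choices.

```python
import numpy as np
from sklearn.cluster import SpectralClustering


def self_representation(X, lam=1e-2):
    """X: (d, n) data of one view. Solve min_Z ||X - X Z||_F^2 + lam ||Z||_F^2."""
    G = X.T @ X
    Z = np.linalg.solve(G + lam * np.eye(G.shape[0]), G)
    np.fill_diagonal(Z, 0.0)  # heuristic: zero the diagonal to avoid trivial self-loops
    return Z


def multi_view_subspace_clustering(views, n_clusters, lam=1e-2):
    """views: list of (d_v, n) matrices sharing the same n samples."""
    n = views[0].shape[1]
    W = np.zeros((n, n))
    for X in views:
        Zv = self_representation(X, lam)
        W += (np.abs(Zv) + np.abs(Zv.T)) / 2  # symmetric, non-negative affinity
    W /= len(views)
    return SpectralClustering(n_clusters=n_clusters,
                              affinity="precomputed",
                              random_state=0).fit_predict(W)


# Toy usage: two views of three well-separated clusters
rng = np.random.default_rng(0)
labels_true = np.repeat(np.arange(3), 20)
views = [rng.normal(labels_true, 0.1, size=(5, 60)) for _ in range(2)]
print(multi_view_subspace_clustering(views, n_clusters=3))
```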
Related papers
- Interpretable Multi-View Clustering Based on Anchor Graph Tensor Factorization [64.00146569922028]
Multi-view clustering methods based on anchor graph factorization lack adequate cluster interpretability for the decomposed matrix.
We address this limitation by using non-negative tensor factorization to decompose an anchor graph tensor that combines anchor graphs from multiple views.
arXiv Detail & Related papers (2024-04-01T03:23:55Z)
- Probing clustering in neural network representations [30.640266399583613]
We study how the many design choices involved in neural network training affect the clusters formed in the hidden representations.
We isolate the training dataset and architecture as important factors affecting clusterability.
We find that normalization strategies affect which layers yield the best clustering performance.
arXiv Detail & Related papers (2023-11-14T02:33:54Z)
- Hyper-Laplacian Regularized Concept Factorization in Low-rank Tensor Space for Multi-view Clustering [0.0]
We propose a hyper-Laplacian regularized concept factorization (HLRCF) in low-rank tensor space for multi-view clustering.
Specifically, we adopt the concept factorization to explore the latent cluster-wise representation of each view.
Considering that different tensor singular values carry structural information of unequal importance, we develop a self-weighted tensor Schatten p-norm (a hedged sketch of such a weighted norm appears after this list).
arXiv Detail & Related papers (2023-04-22T15:46:58Z)
- Multi-View Clustering via Semi-non-negative Tensor Factorization [120.87318230985653]
We develop a novel multi-view clustering method based on semi-non-negative tensor factorization (Semi-NTF).
Our model directly considers the between-view relationship and exploits the between-view complementary information.
In addition, we provide an optimization algorithm for the proposed method and prove mathematically that the algorithm always converges to the stationary KKT point.
arXiv Detail & Related papers (2023-03-29T14:54:19Z)
- Tucker-O-Minus Decomposition for Multi-view Tensor Subspace Clustering [36.790637575875635]
We propose a new tensor decomposition called Tucker-O-Minus Decomposition (TOMD) for multi-view clustering.
Numerical experiments on six benchmark data sets demonstrate the superiority of our proposed method in terms of F-score, precision, recall, normalized mutual information, adjusted rand index, and accuracy.
arXiv Detail & Related papers (2022-10-23T07:20:22Z)
- Deep Attention-guided Graph Clustering with Dual Self-supervision [49.040136530379094]
We propose a novel method, namely deep attention-guided graph clustering with dual self-supervision (DAGC).
We develop a dual self-supervision solution consisting of a soft self-supervision strategy with a triplet Kullback-Leibler divergence loss and a hard self-supervision strategy with a pseudo supervision loss.
Our method consistently outperforms state-of-the-art methods on six benchmark datasets.
arXiv Detail & Related papers (2021-11-10T06:53:03Z)
- Learning Debiased and Disentangled Representations for Semantic Segmentation [52.35766945827972]
We propose a model-agnostic training scheme for semantic segmentation.
By randomly eliminating certain class information in each training iteration, we effectively reduce feature dependencies among classes.
Models trained with our approach demonstrate strong results on multiple semantic segmentation benchmarks.
arXiv Detail & Related papers (2021-10-31T16:15:09Z)
- Attention-driven Graph Clustering Network [49.040136530379094]
We propose a novel deep clustering method named Attention-driven Graph Clustering Network (AGCN).
AGCN exploits a heterogeneity-wise fusion module to dynamically fuse the node attribute feature and the topological graph feature.
AGCN can jointly perform feature learning and cluster assignment in an unsupervised fashion.
arXiv Detail & Related papers (2021-08-12T02:30:38Z)
- Tensor-based Intrinsic Subspace Representation Learning for Multi-view Clustering [18.0093330816895]
We propose a novel Tensor-based Intrinsic Subspace Representation Learning (TISRL) method for multi-view clustering in this paper.
The specific information contained in different views is fully investigated by the rank-preserving decomposition.
Experimental results on nine commonly used real-world multi-view datasets illustrate the superiority of TISRL.
arXiv Detail & Related papers (2020-10-19T03:36:18Z)
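
As referenced in the hyper-Laplacian concept factorization entry above, a weighted tensor Schatten p-norm treats singular values of unequal importance differently. The following is a hedged sketch assuming the common t-SVD construction (frontal-slice SVDs in the Fourier domain along the third mode); the weighting rule and the value of p are illustrative, not the cited paper's exact formulation.

```python
import numpy as np


def weighted_tensor_schatten_p(T, p=0.5, eps=1e-6):
    """Weighted Schatten p-norm surrogate for a 3rd-order tensor T (n1 x n2 x n3):
    SVD each frontal slice in the Fourier domain, weight smaller singular values
    more heavily, and aggregate over slices."""
    n3 = T.shape[2]
    T_hat = np.fft.fft(T, axis=2)  # move to the Fourier domain (t-SVD setting)
    total = 0.0
    for k in range(n3):
        s = np.linalg.svd(T_hat[:, :, k], compute_uv=False)
        w = 1.0 / (s + eps)        # self-weighting: small singular values get large weights
        total += np.sum(w * (s ** p))
    return (total / n3) ** (1.0 / p)


# Toy usage
Z = np.random.randn(8, 8, 4)
print(weighted_tensor_schatten_p(Z, p=0.5))
```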