Multi-View Spectral Clustering Tailored Tensor Low-Rank Representation
- URL: http://arxiv.org/abs/2004.14705v2
- Date: Sat, 1 Aug 2020 13:35:01 GMT
- Title: Multi-View Spectral Clustering Tailored Tensor Low-Rank Representation
- Authors: Yuheng Jia, Hui Liu, Junhui Hou, Sam Kwong, Qingfu Zhang
- Abstract summary: This paper explores the problem of multi-view spectral clustering (MVSC) based on tensor low-rank modeling.
We design a novel structured tensor low-rank norm tailored to MVSC.
We show that the proposed method outperforms state-of-the-art methods to a significant extent.
- Score: 105.33409035876691
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: This paper explores the problem of multi-view spectral clustering (MVSC)
based on tensor low-rank modeling. Unlike the existing methods that all adopt
an off-the-shelf tensor low-rank norm without considering the special
characteristics of the tensor in MVSC, we design a novel structured tensor
low-rank norm tailored to MVSC. Specifically, we explicitly impose a symmetric
low-rank constraint and a structured sparse low-rank constraint on the frontal
and horizontal slices of the tensor to characterize the intra-view and
inter-view relationships, respectively. Moreover, the two constraints could be
jointly optimized to achieve mutual refinement. On the basis of the novel
tensor low-rank norm, we formulate MVSC as a convex low-rank tensor recovery
problem, which is then solved efficiently and iteratively with an augmented
Lagrange multiplier-based method. Extensive experimental results on five benchmark
datasets show that the proposed method outperforms state-of-the-art methods to
a significant extent. Impressively, our method is able to produce perfect
clustering. In addition, the parameters of our method can be easily tuned, and
the proposed model is robust to different datasets, demonstrating its potential
in practice.
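The abstract describes the tailored norm and the ALM solver only at a high level. As a rough, hedged illustration of the kind of pipeline it builds on, the sketch below stacks per-view affinities into an n x n x V tensor (each frontal slice holds one view's affinity, i.e. intra-view structure; each horizontal slice collects one sample's affinities across views, i.e. inter-view structure), applies plain singular value thresholding as a crude stand-in for the tailored low-rank norm, and runs spectral clustering on a fused affinity. This is not the authors' algorithm: the RBF affinities, the threshold tau, and the simple averaging fusion are all illustrative assumptions.

```python
# Simplified tensor-based MVSC sketch; NOT the paper's tailored norm or its
# augmented-Lagrange solver. RBF affinities, tau, and the fusion are assumptions.
import numpy as np
from sklearn.cluster import SpectralClustering
from sklearn.metrics.pairwise import rbf_kernel


def svt(M, tau):
    """Singular value thresholding: a standard surrogate for a low-rank penalty."""
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    return U @ np.diag(np.maximum(s - tau, 0.0)) @ Vt


def toy_tensor_mvsc(views, n_clusters, tau=0.1):
    n, V = views[0].shape[0], len(views)
    # Frontal slice Z[:, :, v]: affinity of view v (intra-view relationships).
    # Horizontal slice Z[i, :, :]: sample i's affinities across all views
    # (inter-view relationships), which the paper constrains separately.
    Z = np.zeros((n, n, V))
    for v, Xv in enumerate(views):
        A = rbf_kernel(Xv)
        A = 0.5 * (A + A.T)           # keep each frontal slice symmetric
        Z[:, :, v] = svt(A, tau)      # crude low-rank surrogate for the tailored norm
    fused = np.abs(Z).mean(axis=2)    # naive fusion of the refined frontal slices
    return SpectralClustering(n_clusters=n_clusters,
                              affinity="precomputed").fit_predict(fused)


# Toy usage: two random views of the same 60 samples.
rng = np.random.default_rng(0)
views = [rng.normal(size=(60, 10)), rng.normal(size=(60, 15))]
print(toy_tensor_mvsc(views, n_clusters=3)[:10])
```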
Related papers
- Low-Rank Tensors for Multi-Dimensional Markov Models [33.35376484951434]
We present low-rank tensors for representing transition probabilities on multi-dimensional state spaces.
Our proposed model yields a parsimonious representation with fewer parameters than matrix-based approaches.
arXiv Detail & Related papers (2024-11-04T14:06:49Z)
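A toy illustration of the parameter savings referred to in the entry above (independence of the state dimensions is assumed here; this is not the paper's low-rank tensor model): when two state dimensions evolve independently, the joint transition matrix over the product space is the Kronecker product of the per-dimension matrices, so n1^2 + n2^2 parameters replace (n1*n2)^2.

```python
# Toy factored transition kernel on a 2-D state space (independence assumed;
# this only illustrates parameter counts, not the paper's model).
import numpy as np

rng = np.random.default_rng(0)

def random_stochastic(n):
    """Random row-stochastic transition matrix."""
    P = rng.random((n, n))
    return P / P.sum(axis=1, keepdims=True)

n1, n2 = 10, 20
P1, P2 = random_stochastic(n1), random_stochastic(n2)
P_joint = np.kron(P1, P2)                      # joint kernel of two independent chains

print(P_joint.shape)                            # (200, 200) over the product state space
print(np.allclose(P_joint.sum(axis=1), 1.0))    # still row-stochastic
print(n1**2 + n2**2, "factored parameters vs", (n1 * n2) ** 2, "full parameters")
```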
- Irregular Tensor Low-Rank Representation for Hyperspectral Image Representation [71.69331824668954]
Low-rank tensor representation is an important approach to alleviate spectral variations.
Previous low-rank representation methods can only be applied to the regular data cubes.
We propose a novel irregular low-rank representation method that can efficiently model irregular 3D cubes.
arXiv Detail & Related papers (2024-10-24T02:56:22Z)
- Data-free Weight Compress and Denoise for Large Language Models [101.53420111286952]
We propose a novel approach termed Data-free Joint Rank-k Approximation for compressing the parameter matrices.
We prune 80% of the parameters while retaining 93.43% of the original performance without any calibration data.
arXiv Detail & Related papers (2024-02-26T05:51:47Z)
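The compression idea in the entry above can be pictured with a plain truncated SVD of a single weight matrix; the paper's data-free joint rank-k approximation is more elaborate, and the matrix shape and rank below are illustrative assumptions.

```python
# Generic rank-k compression of one weight matrix via truncated SVD; this only
# shows the basic idea, not the paper's joint, data-free procedure.
import numpy as np

rng = np.random.default_rng(0)
W = rng.normal(size=(256, 1024))   # illustrative weight matrix
k = 32                             # illustrative target rank

U, s, Vt = np.linalg.svd(W, full_matrices=False)
A = U[:, :k] * s[:k]               # 256 x k factor (columns scaled by singular values)
B = Vt[:k]                         # k x 1024 factor

rel_err = np.linalg.norm(W - A @ B) / np.linalg.norm(W)
print(f"params: {W.size} -> {A.size + B.size}, relative error {rel_err:.3f}")
# A random matrix is nearly full-rank, so the error here is large; trained
# weight matrices usually have faster-decaying spectra and compress far better.
```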
- Hyper-Laplacian Regularized Concept Factorization in Low-rank Tensor Space for Multi-view Clustering [0.0]
We propose a hyper-Laplacian regularized concept factorization (HLRCF) in low-rank tensor space for multi-view clustering.
Specifically, we adopt the concept factorization to explore the latent cluster-wise representation of each view.
Considering that different tensor singular values associate structural information with unequal importance, we develop a self-weighted tensor Schatten p-norm.
arXiv Detail & Related papers (2023-04-22T15:46:58Z)
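The self-weighted tensor Schatten p-norm mentioned in the entry above builds on the matrix Schatten p-(quasi-)norm, a weighted sum of singular values raised to the power p. The snippet below computes that matrix building block with illustrative weights; the paper applies a self-weighted version to a tensor, which this sketch does not reproduce.

```python
# Weighted matrix Schatten p-(quasi-)norm: the building block behind the
# tensor version mentioned above; the weights here are illustrative.
import numpy as np

def weighted_schatten_p(M, p=0.5, weights=None):
    s = np.linalg.svd(M, compute_uv=False)                 # singular values
    w = np.ones_like(s) if weights is None else np.asarray(weights, dtype=float)
    return float(np.sum(w * s**p) ** (1.0 / p))            # w=1, p=1 gives the nuclear norm

rng = np.random.default_rng(0)
M = rng.normal(size=(50, 40))
print(weighted_schatten_p(M, p=1.0))    # nuclear norm
print(weighted_schatten_p(M, p=0.5))    # nonconvex surrogate that favors low rank
```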
- Multi-View Clustering via Semi-non-negative Tensor Factorization [120.87318230985653]
We develop a novel multi-view clustering method based on semi-non-negative tensor factorization (Semi-NTF).
Our model directly considers the between-view relationship and exploits the between-view complementary information.
In addition, we provide an optimization algorithm for the proposed method and prove mathematically that the algorithm always converges to the stationary KKT point.
arXiv Detail & Related papers (2023-03-29T14:54:19Z)
- Semi-Supervised Subspace Clustering via Tensor Low-Rank Representation [64.49871502193477]
We propose a novel semi-supervised subspace clustering method, which is able to simultaneously augment the initial supervisory information and construct a discriminative affinity matrix.
Comprehensive experimental results on six commonly-used benchmark datasets demonstrate the superiority of our method over state-of-the-art methods.
arXiv Detail & Related papers (2022-05-21T01:47:17Z)
- Spectral Tensor Train Parameterization of Deep Learning Layers [136.4761580842396]
We study low-rank parameterizations of weight matrices with embedded spectral properties in the Deep Learning context.
We show the effects of compression in the classification setting, and of both compression and improved training stability in the generative adversarial setting.
arXiv Detail & Related papers (2021-03-07T00:15:44Z)
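The entry above builds on the tensor-train parameterization of a layer's weights. The sketch below is a plain TT-SVD (sequential truncated SVDs over a tensorized weight matrix) with illustrative shapes and rank cap; it omits the embedded spectral properties that the paper is actually about.

```python
# Plain TT-SVD of a tensorized weight matrix; the paper's spectral TT
# parameterization adds spectral constraints that this sketch omits.
import numpy as np

def tt_svd(tensor, max_rank):
    """Factor a d-way array into a train of 3-way cores via sequential SVDs."""
    dims = tensor.shape
    cores, r_prev = [], 1
    mat = tensor.reshape(r_prev * dims[0], -1)
    for k in range(len(dims) - 1):
        U, s, Vt = np.linalg.svd(mat, full_matrices=False)
        r = min(max_rank, len(s))
        cores.append(U[:, :r].reshape(r_prev, dims[k], r))
        mat = (s[:r, None] * Vt[:r]).reshape(r * dims[k + 1], -1)
        r_prev = r
    cores.append(mat.reshape(r_prev, dims[-1], 1))
    return cores

rng = np.random.default_rng(0)
W = rng.normal(size=(64, 64)).reshape(4, 4, 4, 4, 4, 4)   # 64x64 layer, tensorized
cores = tt_svd(W, max_rank=8)
print([c.shape for c in cores])
print("TT parameters:", sum(c.size for c in cores), "vs dense:", W.size)
```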
- Multi-mode Core Tensor Factorization based Low-Rankness and Its Applications to Tensor Completion [0.0]
Low-rank tensor completion is widely used in computer vision and machine learning.
This paper develops a multi-mode core tensor factorization (MCTF) method together with a low-rankness measure and a better nonspectral relaxation form of it.
arXiv Detail & Related papers (2020-12-03T13:57:00Z)
- Enhanced nonconvex low-rank approximation of tensor multi-modes for tensor completion [1.3406858660972554]
We propose a novel low-rank approximation of tensor multi-modes (LRATM) model.
A block-bound method-based algorithm is designed to efficiently solve the proposed model.
Numerical results on three types of public multi-dimensional datasets show that our algorithm can recover a variety of low-rank tensors.
arXiv Detail & Related papers (2020-05-28T08:53:54Z)