Tensor Full Feature Measure and Its Nonconvex Relaxation Applications to
Tensor Recovery
- URL: http://arxiv.org/abs/2109.12257v1
- Date: Sat, 25 Sep 2021 01:44:34 GMT
- Title: Tensor Full Feature Measure and Its Nonconvex Relaxation Applications to
Tensor Recovery
- Authors: Hongbing Zhang, Xinyi Liu, Hongtao Fan, Yajing Li, Yinlin Ye
- Abstract summary: We propose a new tensor sparsity measure called the Full Feature Measure (FFM).
It can simultaneously describe the feature information of each dimension of the tensor, and it connects the Tucker rank with the tensor tube rank.
Two efficient models based on FFM are proposed, and two Alternating Direction Method of Multipliers (ADMM) algorithms are developed to solve them.
- Score: 1.8899300124593645
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Tensor sparse modeling, as a promising approach, has been a huge
success throughout science and engineering. As is well known, data in practical
applications are often generated by multiple factors, so tensors are used to
represent data while preserving the internal structure induced by those
factors. However, unlike the matrix case, constructing a reasonable sparsity
measure for tensors is a difficult and important task. Therefore, in this paper
we propose a new tensor sparsity measure called the Tensor Full Feature Measure
(FFM). It can simultaneously describe the feature information of each dimension
of the tensor and the related features between two dimensions, and it connects
the Tucker rank with the tensor tube rank. This measure can therefore describe
the sparse features of the tensor more comprehensively. On this basis, we
establish its non-convex relaxation and apply FFM to low-rank tensor completion
(LRTC) and tensor robust principal component analysis (TRPCA). LRTC and TRPCA
models based on FFM are proposed, and two efficient Alternating Direction
Method of Multipliers (ADMM) algorithms are developed to solve them. A variety
of numerical experiments on real data substantiate the superiority of the
proposed methods over the state of the art.
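The Tucker rank that FFM connects to the tube rank is simply the tuple of matrix ranks of the tensor's mode-n unfoldings. As a minimal illustrative sketch (this is not the paper's FFM implementation; the tensor sizes, factor shapes, and `unfold`/`tucker_rank` helpers are arbitrary choices for the example), it can be computed with NumPy:

```python
import numpy as np

def unfold(tensor, mode):
    """Mode-n unfolding: arrange the mode-`mode` fibers as rows of a matrix.
    (Column ordering conventions vary; the rank is unaffected by the choice.)"""
    return np.moveaxis(tensor, mode, 0).reshape(tensor.shape[mode], -1)

def tucker_rank(tensor):
    """Tucker (multilinear) rank: the rank of each mode-n unfolding."""
    return tuple(np.linalg.matrix_rank(unfold(tensor, n))
                 for n in range(tensor.ndim))

# Build a tensor with a known multilinear rank from a small Tucker core
# and random factor matrices (full column rank with probability one).
rng = np.random.default_rng(0)
core = rng.standard_normal((2, 3, 2))
factors = [rng.standard_normal((d, r)) for d, r in zip((10, 12, 8), core.shape)]
X = np.einsum('abc,ia,jb,kc->ijk', core, *factors)  # shape (10, 12, 8)

print(tucker_rank(X))  # -> (2, 3, 2)
```

The Tucker rank is a vector quantity, one rank per mode, which is why a scalar surrogate such as FFM is needed before it can serve as a single regularization term in LRTC or TRPCA objectives.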
Related papers
- Tensor cumulants for statistical inference on invariant distributions [49.80012009682584]
We show that PCA becomes computationally hard at a critical value of the signal's magnitude.
We define a new set of objects, which provide an explicit, near-orthogonal basis for invariants of a given degree.
It also lets us analyze a new problem of distinguishing between different ensembles.
arXiv Detail & Related papers (2024-04-29T14:33:24Z) - Tensorized LSSVMs for Multitask Regression [48.844191210894245]
Multitask learning (MTL) can utilize the relatedness between multiple tasks for performance improvement.
A new MTL method, tLSSVM-MTL, is proposed by leveraging low-rank tensor analysis and Least Squares Support Vector Machines (LSSVMs).
arXiv Detail & Related papers (2023-03-04T16:36:03Z) - Low-Rank Tensor Function Representation for Multi-Dimensional Data
Recovery [52.21846313876592]
Low-rank tensor function representation (LRTFR) can continuously represent data beyond meshgrid with infinite resolution.
We develop two fundamental concepts for tensor functions, i.e., the tensor function rank and low-rank tensor function factorization.
Experiments substantiate the superiority and versatility of our method compared with state-of-the-art methods.
arXiv Detail & Related papers (2022-12-01T04:00:38Z) - MTC: Multiresolution Tensor Completion from Partial and Coarse
Observations [49.931849672492305]
Existing completion formulations mostly rely on partial observations from a single tensor.
We propose an efficient Multi-resolution Completion model (MTC) to solve the problem.
arXiv Detail & Related papers (2021-06-14T02:20:03Z) - Scaling and Scalability: Provable Nonconvex Low-Rank Tensor Estimation
from Incomplete Measurements [30.395874385570007]
A fundamental task is to faithfully recover tensors from highly incomplete measurements.
We develop an algorithm to directly recover the tensor factors in the Tucker decomposition.
We show that it provably converges at a linear rate independent of the condition number of the ground truth tensor for two canonical problems.
arXiv Detail & Related papers (2021-04-29T17:44:49Z) - Rank-R FNN: A Tensor-Based Learning Model for High-Order Data
Classification [69.26747803963907]
Rank-R Feedforward Neural Network (FNN) is a tensor-based nonlinear learning model that imposes Canonical/Polyadic decomposition on its parameters.
First, it handles inputs as multilinear arrays, bypassing the need for vectorization, and can thus fully exploit the structural information along every data dimension.
We establish the universal approximation and learnability properties of Rank-R FNN, and we validate its performance on real-world hyperspectral datasets.
arXiv Detail & Related papers (2021-04-11T16:37:32Z) - Multi-mode Core Tensor Factorization based Low-Rankness and Its
Applications to Tensor Completion [0.0]
Low-rank tensor completion is widely used in computer vision and machine learning.
This paper develops a multi-mode core tensor factorization (MCTF) method together with a low-rankness measure and a better nonconvex relaxation form of it.
arXiv Detail & Related papers (2020-12-03T13:57:00Z) - Robust Tensor Principal Component Analysis: Exact Recovery via
Deterministic Model [5.414544833902815]
This paper proposes a new method to analyze robust tensor principal component analysis (RTPCA).
It is based on the recently developed tensor-tensor product and tensor singular value decomposition (t-SVD).
arXiv Detail & Related papers (2020-08-05T16:26:10Z) - Geometric All-Way Boolean Tensor Decomposition [14.065968221500246]
We present GETF, which sequentially identifies the rank-1 basis for a tensor from a geometric perspective.
Experiments on both synthetic and real-world data demonstrate that GETF significantly improves reconstruction accuracy and the extraction of latent structures, and that it is an order of magnitude faster than other state-of-the-art methods.
arXiv Detail & Related papers (2020-07-31T03:29:44Z) - Multi-View Spectral Clustering Tailored Tensor Low-Rank Representation [105.33409035876691]
This paper explores the problem of multi-view spectral clustering (MVSC) based on tensor low-rank modeling.
We design a novel structured tensor low-rank norm tailored to MVSC.
We show that the proposed method outperforms state-of-the-art methods to a significant extent.
arXiv Detail & Related papers (2020-04-30T11:52:12Z) - A Unified Framework for Coupled Tensor Completion [42.19293115131073]
Coupled tensor decomposition reveals the joint data structure by incorporating prior knowledge that comes from the latent coupled factors.
The tensor ring (TR) decomposition has powerful expressive ability and has achieved success in some multi-dimensional data processing applications.
The proposed method is validated on numerical experiments on synthetic data, and experimental results on real-world data demonstrate its superiority over the state-of-the-art methods in terms of recovery accuracy.
arXiv Detail & Related papers (2020-01-09T02:15:46Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences.