Robust Tensor Principal Component Analysis: Exact Recovery via
Deterministic Model
- URL: http://arxiv.org/abs/2008.02211v1
- Date: Wed, 5 Aug 2020 16:26:10 GMT
- Title: Robust Tensor Principal Component Analysis: Exact Recovery via
Deterministic Model
- Authors: Bo Shen, Zhenyu (James) Kong
- Abstract summary: This paper proposes a new method to analyze robust tensor
principal component analysis (RTPCA). It is based on the recently developed
tensor-tensor product and tensor singular value decomposition (t-SVD).
- Score: 5.414544833902815
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Tensors, also known as multi-dimensional arrays, arise in many
applications in signal processing, manufacturing processes, and healthcare,
among others. As one of the most popular methods in the tensor literature,
robust tensor principal component analysis (RTPCA) is a very effective tool for
extracting the low-rank and sparse components of a tensor. In this paper, a new
method to analyze RTPCA is proposed based on the recently developed
tensor-tensor product and tensor singular value decomposition (t-SVD).
Specifically, it solves a convex optimization problem whose objective function
is a weighted combination of the tensor nuclear norm and the l1-norm. In most
of the RTPCA literature, exact recovery is built on tensor incoherence
conditions and the assumption of a uniform random model on the sparse support.
Unlike this conventional approach, in this paper exact recovery is achieved in
a completely deterministic fashion, without any randomness assumption, by
characterizing the tensor rank-sparsity incoherence, an uncertainty principle
between the low-rank tensor spaces and the support pattern of the sparse
tensor.
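The convex program described in the abstract has the form min ||L||_TNN + lam * ||S||_1 subject to L + S = X, where ||.||_TNN is the tensor nuclear norm induced by the t-SVD. As a rough illustration only, here is a minimal NumPy sketch of a standard ADMM splitting for this kind of program, as used in the broader TRPCA literature; the function names, the fixed penalty mu, and the default lam are illustrative assumptions, not the paper's exact algorithm.

```python
import numpy as np

def t_svt(Z, tau):
    """Tensor singular value thresholding: apply matrix SVT to each
    frontal slice of Z in the Fourier domain (the t-SVD construction)."""
    Zf = np.fft.fft(Z, axis=2)
    out = np.empty_like(Zf)
    for k in range(Z.shape[2]):
        U, s, Vh = np.linalg.svd(Zf[:, :, k], full_matrices=False)
        out[:, :, k] = (U * np.maximum(s - tau, 0.0)) @ Vh
    return np.real(np.fft.ifft(out, axis=2))

def rtpca_admm(X, lam=None, mu=1.0, n_iter=300):
    """Split X into low-rank L and sparse S with L + S = X by ADMM on
    min ||L||_TNN + lam * ||S||_1; lam defaults to a weight commonly
    used in the TRPCA literature (an assumption, not the paper's)."""
    n1, n2, n3 = X.shape
    if lam is None:
        lam = 1.0 / np.sqrt(n3 * max(n1, n2))
    L = np.zeros_like(X)
    S = np.zeros_like(X)
    Y = np.zeros_like(X)  # dual variable enforcing L + S = X
    for _ in range(n_iter):
        # Tensor nuclear norm step: t-SVD singular value thresholding
        L = t_svt(X - S + Y / mu, 1.0 / mu)
        # l1 step: elementwise soft thresholding
        T = X - L + Y / mu
        S = np.sign(T) * np.maximum(np.abs(T) - lam / mu, 0.0)
        # Dual ascent on the constraint residual
        Y += mu * (X - L - S)
    return L, S
```

Under the paper's deterministic rank-sparsity incoherence condition, the minimizer of this convex program recovers the true low-rank and sparse pair exactly; the ADMM loop above is just one standard way to compute such a minimizer.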
Related papers
- Tensor cumulants for statistical inference on invariant distributions [49.80012009682584]
We show that PCA becomes computationally hard at a critical value of the signal's magnitude.
We define a new set of objects, which provide an explicit, near-orthogonal basis for invariants of a given degree.
It also lets us analyze a new problem of distinguishing between different ensembles.
arXiv Detail & Related papers (2024-04-29T14:33:24Z) - Low-Multi-Rank High-Order Bayesian Robust Tensor Factorization [7.538654977500241]
We propose a novel high-order TRPCA method, named Low-Multi-rank High-order Bayesian Robust Tensor Factorization (LMH-BRTF), within the Bayesian framework.
Specifically, we decompose the observed corrupted tensor into three parts, i.e., the low-rank component, the sparse component, and the noise component.
By constructing a low-rank model for the low-rank component based on the order-$d$ t-SVD, LMH-BRTF can automatically determine the tensor multi-rank.
arXiv Detail & Related papers (2023-11-10T06:15:38Z) - Optimizing Orthogonalized Tensor Deflation via Random Tensor Theory [5.124256074746721]
This paper tackles the problem of recovering a low-rank signal tensor with possibly correlated components from a random noisy tensor.
Non-orthogonal components may alter the tensor deflation mechanism, thereby preventing efficient recovery.
An efficient tensor deflation algorithm is proposed by optimizing the parameter introduced in the deflation mechanism.
arXiv Detail & Related papers (2023-02-11T22:23:27Z) - Decomposable Sparse Tensor on Tensor Regression [1.370633147306388]
We consider sparse low-rank tensor-on-tensor regression, where the predictors $\mathcal{X}$ and responses $\mathcal{Y}$ are both high-dimensional tensors.
We propose a fast solution based on a stagewise search composed of a contraction part and a generation part, which are optimized alternately.
arXiv Detail & Related papers (2022-12-09T18:16:41Z) - Error Analysis of Tensor-Train Cross Approximation [88.83467216606778]
We provide accuracy guarantees in terms of the entire tensor for both exact and noisy measurements.
Results are verified by numerical experiments, and may have important implications for the usefulness of cross approximations for high-order tensors.
arXiv Detail & Related papers (2022-07-09T19:33:59Z) - When Random Tensors meet Random Matrices [50.568841545067144]
This paper studies asymmetric order-$d$ spiked tensor models with Gaussian noise.
We show that the analysis of the considered model boils down to the analysis of an equivalent spiked symmetric block-wise random matrix.
arXiv Detail & Related papers (2021-12-23T04:05:01Z) - Tensor Full Feature Measure and Its Nonconvex Relaxation Applications to
Tensor Recovery [1.8899300124593645]
We propose a new tensor sparsity measure called the Full Feature Measure (FFM).
It can simultaneously describe the feature information of each dimension and connect the Tucker rank with the tensor tube rank.
Two efficient models based on FFM are proposed, and two Alternating Direction Method of Multipliers (ADMM) algorithms are developed to solve them.
arXiv Detail & Related papers (2021-09-25T01:44:34Z) - MTC: Multiresolution Tensor Completion from Partial and Coarse
Observations [49.931849672492305]
Existing completion formulations mostly rely on partial observations from a single tensor.
We propose an efficient Multi-resolution Tensor Completion model (MTC) to solve the problem.
arXiv Detail & Related papers (2021-06-14T02:20:03Z) - Scaling and Scalability: Provable Nonconvex Low-Rank Tensor Estimation
from Incomplete Measurements [30.395874385570007]
A fundamental task is to faithfully recover tensors from highly incomplete measurements.
We develop an algorithm to directly recover the tensor factors in the Tucker decomposition.
We show that it provably converges at a linear rate independent of the condition number of the ground truth tensor for two canonical problems.
arXiv Detail & Related papers (2021-04-29T17:44:49Z) - Understanding Implicit Regularization in Over-Parameterized Single Index
Model [55.41685740015095]
We design regularization-free algorithms for the high-dimensional single index model.
We provide theoretical guarantees for the induced implicit regularization phenomenon.
arXiv Detail & Related papers (2020-07-16T13:27:47Z) - Multi-View Spectral Clustering Tailored Tensor Low-Rank Representation [105.33409035876691]
This paper explores the problem of multi-view spectral clustering (MVSC) based on tensor low-rank modeling.
We design a novel structured tensor low-rank norm tailored to MVSC.
We show that the proposed method significantly outperforms state-of-the-art methods.
arXiv Detail & Related papers (2020-04-30T11:52:12Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.