On Recoverability of Randomly Compressed Tensors with Low CP Rank
- URL: http://arxiv.org/abs/2001.02370v1
- Date: Wed, 8 Jan 2020 04:44:13 GMT
- Title: On Recoverability of Randomly Compressed Tensors with Low CP Rank
- Authors: Shahana Ibrahim, Xiao Fu, Xingguo Li
- Abstract summary: We show that if the number of measurements is on the same order of magnitude as that of the model parameters, then the tensor is recoverable.
Our proof is based on deriving a restricted isometry property (R.I.P.) under the CPD model via set covering techniques.
- Score: 29.00634848772122
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Our interest lies in the recoverability properties of compressed tensors
under the \textit{canonical polyadic decomposition} (CPD) model. The considered
problem is well-motivated in many applications, e.g., hyperspectral image and
video compression. Prior work studied this problem under somewhat special
assumptions---e.g., the latent factors of the tensor are sparse or drawn from
absolutely continuous distributions. We offer an alternative result: We show
that if the tensor is compressed by a subgaussian linear mapping, then the
tensor is recoverable if the number of measurements is on the same order of
magnitude as that of the model parameters---without strong assumptions on the
latent factors. Our proof is based on deriving a \textit{restricted isometry
property} (R.I.P.) under the CPD model via set covering techniques, and thus
exhibits a flavor of classic compressive sensing. The new recoverability result
enriches the understanding of the compressed CP tensor recovery problem; it
offers theoretical guarantees for recovering tensors whose elements are not
necessarily continuous or sparse.
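To make the setting concrete, below is a minimal sketch of compressed CP recovery, not the authors' procedure: a random Gaussian map (standing in for a general subgaussian map) compresses a low-CP-rank tensor into a number of measurements on the order of the parameter count F(I+J+K), and CP factors are then fit to the compressed measurements by alternating least squares. The tensor sizes, the oversampling factor of 5, and the choice of alternating least squares are illustrative assumptions, not details from the paper.

```python
# Minimal sketch (assumptions noted above): compress a low-CP-rank tensor with a
# Gaussian (hence subgaussian) linear map, then fit CP factors to the compressed
# measurements by alternating least squares.
import numpy as np

rng = np.random.default_rng(0)

I, J, K, F = 10, 10, 10, 3                       # illustrative tensor size and CP rank
A = rng.standard_normal((I, F))
B = rng.standard_normal((J, F))
C = rng.standard_normal((K, F))
X = np.einsum('if,jf,kf->ijk', A, B, C)          # ground-truth rank-F CP tensor

# Number of measurements on the order of the model parameter count F*(I+J+K);
# the factor of 5 is an arbitrary oversampling choice for this sketch.
m = 5 * F * (I + J + K)
Phi = rng.standard_normal((m, I * J * K)) / np.sqrt(m)   # Gaussian compression map
y = Phi @ X.ravel()                              # compressed measurements
Phi4 = Phi.reshape(m, I, J, K)                   # view Phi as acting on I x J x K tensors

# Alternating least squares: with two factors fixed, the measurements are linear
# in the remaining factor, so each subproblem is an ordinary least-squares solve.
Ah = rng.standard_normal((I, F))
Bh = rng.standard_normal((J, F))
Ch = rng.standard_normal((K, F))
for _ in range(50):
    D = np.einsum('mijk,jf,kf->mif', Phi4, Bh, Ch).reshape(m, I * F)
    Ah = np.linalg.lstsq(D, y, rcond=None)[0].reshape(I, F)
    D = np.einsum('mijk,if,kf->mjf', Phi4, Ah, Ch).reshape(m, J * F)
    Bh = np.linalg.lstsq(D, y, rcond=None)[0].reshape(J, F)
    D = np.einsum('mijk,if,jf->mkf', Phi4, Ah, Bh).reshape(m, K * F)
    Ch = np.linalg.lstsq(D, y, rcond=None)[0].reshape(K, F)

Xh = np.einsum('if,jf,kf->ijk', Ah, Bh, Ch)
print('relative recovery error:', np.linalg.norm(Xh - X) / np.linalg.norm(X))
```

The paper's contribution is the recoverability guarantee itself, established via an R.I.P. argument, not any particular fitting algorithm; the alternating least squares loop above is only one convenient way to search for a low-CP-rank tensor consistent with the compressed measurements.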
Related papers
- Tensor cumulants for statistical inference on invariant distributions [49.80012009682584]
We show that PCA becomes computationally hard at a critical value of the signal's magnitude.
We define a new set of objects, which provide an explicit, near-orthogonal basis for invariants of a given degree.
It also lets us analyze a new problem of distinguishing between different ensembles.
arXiv Detail & Related papers (2024-04-29T14:33:24Z) - Provable Tensor Completion with Graph Information [49.08648842312456]
We introduce a novel model, theory, and algorithm for solving the dynamic graph regularized tensor completion problem.
We develop a comprehensive model simultaneously capturing the low-rank and similarity structure of the tensor.
In terms of theory, we showcase the alignment between the proposed graph smoothness regularization and a weighted tensor nuclear norm.
arXiv Detail & Related papers (2023-10-04T02:55:10Z) - Guaranteed Tensor Recovery Fused Low-rankness and Smoothness [38.0243349269575]
We build a unique regularization term, which essentially encodes both L and S priors of a tensor simultaneously.
These should be the first exact-recovery results among all related L+S methods for tensor recovery.
arXiv Detail & Related papers (2023-02-04T12:20:32Z) - Low-Rank Tensor Function Representation for Multi-Dimensional Data
Recovery [52.21846313876592]
Low-rank tensor function representation (LRTFR) can continuously represent data beyond meshgrid with infinite resolution.
We develop two fundamental concepts for tensor functions, i.e., the tensor function rank and low-rank tensor function factorization.
Experiments substantiate the superiority and versatility of our method compared with state-of-the-art methods.
arXiv Detail & Related papers (2022-12-01T04:00:38Z) - Lower Bounds for the Convergence of Tensor Power Iteration on Random
Overcomplete Models [3.7565501074323224]
We show that polynomially many steps are necessary for convergence of tensor power iteration to any true component.
We prove that a popular objective function for tensor decomposition is strictly increasing along the power iteration path.
arXiv Detail & Related papers (2022-11-07T19:23:51Z) - MTC: Multiresolution Tensor Completion from Partial and Coarse
Observations [49.931849672492305]
Existing completion formulations mostly rely on partial observations from a single tensor.
We propose an efficient Multi-resolution Completion model (MTC) to solve the problem.
arXiv Detail & Related papers (2021-06-14T02:20:03Z) - CP Degeneracy in Tensor Regression [11.193867567895353]
In tensor regression, CANDECOMP/PARAFAC (CP) low-rank constraints are often imposed on the coefficient parameter in (penalized) $M$-estimation.
This is closely related to a phenomenon, called CP degeneracy, that arises in low-rank approximation problems.
arXiv Detail & Related papers (2020-10-22T16:08:44Z) - Low-Rank and Sparse Enhanced Tucker Decomposition for Tensor Completion [3.498620439731324]
We introduce a unified low-rank and sparse enhanced Tucker decomposition model for tensor completion.
Our model possesses a sparse regularization term to promote a sparse core tensor, which is beneficial for tensor data compression.
Notably, our model can handle different types of real-world data sets, since it exploits the potential periodicity and inherent correlation properties that appear in tensors.
arXiv Detail & Related papers (2020-10-01T12:45:39Z) - Robust Tensor Principal Component Analysis: Exact Recovery via
Deterministic Model [5.414544833902815]
This paper proposes a new method to analyze robust tensor principal component analysis (RTPCA).
It is based on the recently developed tensor-tensor product and tensor singular value decomposition (t-SVD).
arXiv Detail & Related papers (2020-08-05T16:26:10Z) - Robust Compressed Sensing using Generative Models [98.64228459705859]
In this paper, we propose an algorithm inspired by the Median-of-Means (MOM) estimator.
Our algorithm guarantees recovery for heavy-tailed data, even in the presence of outliers.
arXiv Detail & Related papers (2020-06-16T19:07:41Z) - Multi-View Spectral Clustering Tailored Tensor Low-Rank Representation [105.33409035876691]
This paper explores the problem of multi-view spectral clustering (MVSC) based on tensor low-rank modeling.
We design a novel structured tensor low-rank norm tailored to MVSC.
We show that the proposed method outperforms state-of-the-art methods to a significant extent.
arXiv Detail & Related papers (2020-04-30T11:52:12Z)