Optimizing Orthogonalized Tensor Deflation via Random Tensor Theory
- URL: http://arxiv.org/abs/2302.05798v1
- Date: Sat, 11 Feb 2023 22:23:27 GMT
- Title: Optimizing Orthogonalized Tensor Deflation via Random Tensor Theory
- Authors: Mohamed El Amine Seddik, Mohammed Mahfoud, Merouane Debbah
- Abstract summary: This paper tackles the problem of recovering a low-rank signal tensor with possibly correlated components from a random noisy tensor.
Non-orthogonal components may alter the tensor deflation mechanism, thereby preventing efficient recovery.
An efficient tensor deflation algorithm is proposed by optimizing the parameter introduced in the deflation mechanism.
- Score: 5.124256074746721
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: This paper tackles the problem of recovering a low-rank signal tensor with
possibly correlated components from a random noisy tensor, or so-called spiked
tensor model. When the underlying components are orthogonal, they can be
recovered efficiently using tensor deflation which consists of successive
rank-one approximations, while non-orthogonal components may alter the tensor
deflation mechanism, thereby preventing efficient recovery. Relying on recently
developed random tensor tools, this paper deals precisely with the
non-orthogonal case by deriving an asymptotic analysis of a parameterized
deflation procedure performed on an order-three and rank-two spiked tensor.
Based on this analysis, an efficient tensor deflation algorithm is proposed by
optimizing the parameter introduced in the deflation mechanism, which in turn
is proven to be optimal by construction for the studied tensor model. The same
ideas could be extended to more general low-rank tensor models, e.g., higher
ranks and orders, leading to more efficient tensor methods with a broader
impact on machine learning and beyond.
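To make the deflation mechanism concrete, below is a minimal sketch, assuming NumPy, of successive rank-one approximations on an order-three, rank-two spiked tensor with correlated components. The dimension, the signal strengths (beta1, beta2), the correlation level (alpha), and the plain tensor power iteration are all illustrative assumptions; the paper's optimized parameterized deflation step is not reproduced here.

```python
# Minimal sketch: rank-one deflation on an order-3, rank-2 spiked tensor.
# All parameter values below are illustrative, not taken from the paper.
import numpy as np

rng = np.random.default_rng(0)
n = 50
beta1, beta2, alpha = 10.0, 6.0, 0.5  # spike strengths, component correlation

# Two unit-norm signal components with correlation alpha (non-orthogonal).
x = rng.standard_normal(n); x /= np.linalg.norm(x)
z = rng.standard_normal(n); z -= (z @ x) * x; z /= np.linalg.norm(z)
y = alpha * x + np.sqrt(1.0 - alpha**2) * z

rank_one = lambda u: np.einsum('i,j,k->ijk', u, u, u)
W = rng.standard_normal((n, n, n))
T = beta1 * rank_one(x) + beta2 * rank_one(y) + W / np.sqrt(n)

def best_rank_one(T, iters=200):
    """Approximate the dominant rank-one term via tensor power iteration
    (treats T as symmetric; the noise is not symmetrized in this sketch)."""
    u = rng.standard_normal(T.shape[0]); u /= np.linalg.norm(u)
    for _ in range(iters):
        u = np.einsum('ijk,j,k->i', T, u, u)  # contract two modes against u
        u /= np.linalg.norm(u)
    lam = np.einsum('ijk,i,j,k->', T, u, u, u)  # estimated singular value
    return lam, u

# Deflation: estimate a rank-one term, subtract it, and repeat.
lam1, u1 = best_rank_one(T)
lam2, u2 = best_rank_one(T - lam1 * rank_one(u1))

print("alignments with ground truth:", abs(u1 @ x), abs(u2 @ y))
```

When alpha is nonzero, the residual left after subtracting the first estimate is contaminated by the correlation between components; this is precisely the interaction the paper's parameterized deflation step is designed to correct.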
Related papers
- Tensor cumulants for statistical inference on invariant distributions [49.80012009682584]
We show that tensor PCA becomes computationally hard at a critical value of the signal's magnitude.
We define a new set of objects, tensor cumulants, which provide an explicit, near-orthogonal basis for invariants of a given degree.
This basis also lets us analyze a new problem of distinguishing between different ensembles.
arXiv Detail & Related papers (2024-04-29T14:33:24Z) - Provable Tensor Completion with Graph Information [49.08648842312456]
We introduce a novel model, theory, and algorithm for solving the dynamic graph regularized tensor completion problem.
We develop a comprehensive model simultaneously capturing the low-rank and similarity structure of the tensor.
In terms of theory, we showcase the alignment between the proposed graph smoothness regularization and a weighted tensor nuclear norm.
arXiv Detail & Related papers (2023-10-04T02:55:10Z) - Decomposable Sparse Tensor on Tensor Regression [1.370633147306388]
We consider the sparse low-rank tensor-on-tensor regression, where both the predictors $\mathcal{X}$ and the responses $\mathcal{Y}$ are high-dimensional tensors.
We propose a fast solution based on a stagewise search composed of a contraction part and a generation part, which are optimized alternately.
arXiv Detail & Related papers (2022-12-09T18:16:41Z) - Error Analysis of Tensor-Train Cross Approximation [88.83467216606778]
We provide accuracy guarantees in terms of the entire tensor for both exact and noisy measurements.
Results are verified by numerical experiments, and may have important implications for the usefulness of cross approximations for high-order tensors.
arXiv Detail & Related papers (2022-07-09T19:33:59Z) - When Random Tensors meet Random Matrices [50.568841545067144]
This paper studies asymmetric order-$d$ spiked tensor models with Gaussian noise.
We show that the analysis of the considered model boils down to the analysis of an equivalent spiked symmetric block-wise random matrix (a sketch of this block-wise construction is given after this list).
arXiv Detail & Related papers (2021-12-23T04:05:01Z) - Understanding Deflation Process in Over-parametrized Tensor Decomposition [17.28303004783945]
We study the training dynamics for gradient flow on over-parametrized tensor decomposition problems.
Empirically, such a training process often first fits larger components and then discovers smaller components.
arXiv Detail & Related papers (2021-06-11T18:51:36Z) - Scaling and Scalability: Provable Nonconvex Low-Rank Tensor Estimation from Incomplete Measurements [30.395874385570007]
A fundamental task is to faithfully recover tensors from highly incomplete measurements.
We develop an algorithm to directly recover the tensor factors in the Tucker decomposition.
We show that it provably converges at a linear rate independent of the condition number of the ground truth tensor for two canonical problems.
arXiv Detail & Related papers (2021-04-29T17:44:49Z) - Robust Tensor Principal Component Analysis: Exact Recovery via Deterministic Model [5.414544833902815]
This paper proposes a new method to analyze robust tensor principal component analysis (RTPCA).
It is based on the recently developed tensor-tensor product and tensor singular value decomposition (t-SVD).
arXiv Detail & Related papers (2020-08-05T16:26:10Z) - Uncertainty quantification for nonconvex tensor completion: Confidence intervals, heteroscedasticity and optimality [92.35257908210316]
We study the problem of estimating a low-rank tensor given incomplete and corrupted observations.
We find that it attains unimprovable $\ell_2$ accuracy.
arXiv Detail & Related papers (2020-06-15T17:47:13Z) - Supervised Learning for Non-Sequential Data: A Canonical Polyadic Decomposition Approach [85.12934750565971]
Efficient modelling of feature interactions underpins supervised learning for non-sequential tasks.
Learning a separate parameter for every interaction of every order, however, incurs exponential computational and memory costs.
To alleviate this issue, it has been proposed to implicitly represent the model parameters as a tensor.
For enhanced expressiveness, we generalize the framework to allow feature mapping to arbitrarily high-dimensional feature vectors.
arXiv Detail & Related papers (2020-01-27T22:38:40Z)
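As a companion to the "When Random Tensors meet Random Matrices" entry above, here is a hedged sketch, assuming NumPy, of one natural symmetric block-wise matrix built from the mode contractions of an asymmetric order-3 tensor; the exact construction and normalization used in that paper may differ.

```python
# Hedged sketch: a symmetric block-wise matrix assembled from the three mode
# contractions of an asymmetric order-3 tensor T with vectors (u, v, w).
# Scaling conventions from the paper are omitted.
import numpy as np

def block_contraction_matrix(T, u, v, w):
    Tw = np.einsum('ijk,k->ij', T, w)  # n1 x n2, contraction on mode 3
    Tv = np.einsum('ijk,j->ik', T, v)  # n1 x n3, contraction on mode 2
    Tu = np.einsum('ijk,i->jk', T, u)  # n2 x n3, contraction on mode 1
    n1, n2, n3 = T.shape
    # Symmetric block matrix with zero diagonal blocks.
    return np.block([
        [np.zeros((n1, n1)), Tw,                 Tv],
        [Tw.T,               np.zeros((n2, n2)), Tu],
        [Tv.T,               Tu.T,               np.zeros((n3, n3))],
    ])
```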
This list is automatically generated from the titles and abstracts of the papers on this site.
The site does not guarantee the quality of the information above and is not responsible for any consequences of its use.