Exact nuclear norm, completion and decomposition for random overcomplete
tensors via degree-4 SOS
- URL: http://arxiv.org/abs/2011.09416v2
- Date: Thu, 28 Oct 2021 17:59:19 GMT
- Title: Exact nuclear norm, completion and decomposition for random overcomplete
tensors via degree-4 SOS
- Authors: Bohdan Kivva and Aaron Potechin
- Abstract summary: We show that simple semidefinite programs inspired by degree $4$ SOS can exactly solve the tensor nuclear norm, tensor decomposition, and tensor completion problems on tensors with random asymmetric components.
We show that w.h.p. these semidefinite programs can exactly find the nuclear norm and components of an $(n\times n\times n)$-tensor $\mathcal{T}$ with $m\leq n^{3/2}/\mathrm{polylog}(n)$ random asymmetric components.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: In this paper we show that simple semidefinite programs inspired by degree
$4$ SOS can exactly solve the tensor nuclear norm, tensor decomposition, and
tensor completion problems on tensors with random asymmetric components. More
precisely, for tensor nuclear norm and tensor decomposition, we show that
w.h.p. these semidefinite programs can exactly find the nuclear norm and
components of an $(n\times n\times n)$-tensor $\mathcal{T}$ with $m\leq
n^{3/2}/polylog(n)$ random asymmetric components. Unlike most of the previous
algorithms, our algorithm provides a certificate for the decomposition, does
not require knowledge about the number of components in the decomposition and
does not make any assumptions on the sizes of the coefficients in the
decomposition. As a byproduct, we show that w.h.p. the nuclear norm
decomposition exactly coincides with the minimum rank decomposition for tensors
with $m\leq n^{3/2}/polylog(n)$ random asymmetric components.
For tensor completion, we show that w.h.p. the semidefinite program,
introduced by Potechin & Steurer (2017) for tensors with orthogonal components,
can exactly recover an $(n\times n\times n)$-tensor $\mathcal{T}$ with $m$
random asymmetric components from only $n^{3/2}m polylog(n)$ randomly observed
entries. For non-orthogonal tensors, this improves the dependence on $m$ of the
number of entries needed for exact recovery over all previously known
algorithms and provides the first theoretical guarantees for exact tensor
completion in the overcomplete regime.
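The random asymmetric model underlying these results can be sketched in a few lines of NumPy. This is only an illustrative construction of the input tensor under an assumed component distribution (independent unit-norm Gaussian vectors); it does not implement the paper's degree-4 SOS semidefinite programs.

```python
import numpy as np

def random_overcomplete_tensor(n, m, seed=0):
    """Build T = sum_i a_i (x) b_i (x) c_i with m random asymmetric components.

    Assumption for illustration: the factors a_i, b_i, c_i are independent
    Gaussian vectors normalized to the unit sphere.
    """
    rng = np.random.default_rng(seed)
    A, B, C = (rng.standard_normal((m, n)) for _ in range(3))
    # Normalize each component vector to unit length.
    A, B, C = (X / np.linalg.norm(X, axis=1, keepdims=True) for X in (A, B, C))
    # Sum of m rank-1 terms: T[x, y, z] = sum_i A[i, x] * B[i, y] * C[i, z].
    T = np.einsum('ix,iy,iz->xyz', A, B, C)
    return T, (A, B, C)

# Overcomplete regime: m exceeds n but stays below n^{3/2}/polylog(n).
n, m = 10, 20
T, _ = random_overcomplete_tensor(n, m)
print(T.shape)  # (10, 10, 10)
```

For such a tensor, the paper's claim is that the SDP recovers both the nuclear norm and the m components exactly w.h.p., with a certificate and without knowing m in advance.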
Related papers
- Overcomplete Tensor Decomposition via Koszul-Young Flattenings [63.01248796170617]
We give a new algorithm for decomposing an $n_1\times n_2\times n_3$ tensor as the sum of a minimal number of rank-1 terms.
We show that an even more general class of degree-$d$ polynomial flattenings cannot surpass rank $Cn$ for a constant $C = C(d)$.
arXiv Detail & Related papers (2024-11-21T17:41:09Z) - On the Accuracy of Hotelling-Type Asymmetric Tensor Deflation: A Random
Tensor Analysis [14.809070996761013]
Hotelling-type tensor deflation is studied in the presence of noise.
We analytically characterize the estimated singular values and the alignment of estimated and true singular vectors at each step of the deflation procedure.
arXiv Detail & Related papers (2023-10-28T14:40:10Z) - On the Accuracy of Hotelling-Type Tensor Deflation: A Random Tensor
Analysis [16.28927188636617]
A rank-$r$ asymmetric spiked model of the form $\sum_{i=1}^{r} \beta_i A_i + W$ is considered.
We provide a study of Hotelling-type deflation in the large dimensional regime.
arXiv Detail & Related papers (2022-11-16T16:01:56Z) - Average-Case Complexity of Tensor Decomposition for Low-Degree
Polynomials [93.59919600451487]
"Statistical-computational gaps" occur in many statistical inference tasks.
We consider a model for random order-3 decomposition where one component is slightly larger in norm than the rest.
We show that low-degree polynomials of the tensor entries can accurately estimate the largest component when $r\ll n^{3/2}$ but fail to do so when $r\gg n^{3/2}$.
arXiv Detail & Related papers (2022-11-10T00:40:37Z) - Near-Linear Time and Fixed-Parameter Tractable Algorithms for Tensor
Decompositions [51.19236668224547]
We study low rank approximation of tensors, focusing on the tensor train and Tucker decompositions.
For tensor train decomposition, we give a bicriteria $(1+\epsilon)$-approximation algorithm with a small bicriteria rank and $O(q\cdot \mathrm{nnz}(A))$ running time.
In addition, we extend our algorithm to tensor networks with arbitrary graphs.
arXiv Detail & Related papers (2022-07-15T11:55:09Z) - Monogamy of entanglement between cones [68.8204255655161]
We show that monogamy is not only a feature of quantum theory, but that it characterizes the minimal tensor product of general pairs of convex cones.
Our proof makes use of a new characterization of products of simplices up to affine equivalence.
arXiv Detail & Related papers (2022-06-23T16:23:59Z) - When Random Tensors meet Random Matrices [50.568841545067144]
This paper studies asymmetric order-$d$ spiked tensor models with Gaussian noise.
We show that the analysis of the considered model boils down to the analysis of an equivalent spiked symmetric block-wise random matrix.
arXiv Detail & Related papers (2021-12-23T04:05:01Z) - Beyond Lazy Training for Over-parameterized Tensor Decomposition [69.4699995828506]
We show that gradient descent on over-parametrized objective could go beyond the lazy training regime and utilize certain low-rank structure in the data.
arXiv Detail & Related papers (2020-10-22T00:32:12Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.