Exact Tensor Completion Powered by Arbitrary Linear Transforms
- URL: http://arxiv.org/abs/2402.03468v1
- Date: Fri, 2 Feb 2024 13:26:38 GMT
- Title: Exact Tensor Completion Powered by Arbitrary Linear Transforms
- Authors: Li Ge, Xue Jiang, Lin Chen
- Abstract summary: A new theoretical guarantee of exact tensor completion with arbitrary linear transforms is established.
An efficient algorithm based on the alternating direction method of multipliers (ADMM) is designed to solve the transformed tensor completion program.
Our model and proof greatly enhance the flexibility of tensor completion and extensive experiments validate the superiority of the proposed method.
- Score: 10.698749077862747
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: In this work, a tensor completion problem is studied, which aims to perfectly
recover the tensor from partial observations. Existing theoretical guarantees
require the involved transform to be orthogonal, which hinders their
applications. In this paper, moving beyond the constraints of isotropy and
self-adjointness, the theoretical guarantee of exact tensor completion with
arbitrary linear transforms is established. To that end, we define a new
tensor-tensor product, which leads us to a new definition of the tensor nuclear
norm. Equipped with these tools, an efficient algorithm based on the
alternating direction method of multipliers (ADMM) is designed to solve the
transformed tensor completion
program and the theoretical bound is obtained. Our model and proof greatly
enhance the flexibility of tensor completion and extensive experiments validate
the superiority of the proposed method.
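To make the transform-domain machinery concrete, below is a minimal NumPy sketch, not the authors' code. It assumes the common convention of transform-based t-products: a linear transform L acts along the third mode, and the tensor nuclear norm is the sum of matrix nuclear norms of the transformed frontal slices. The function names (mode3_transform, transform_tnn, svt, complete) are illustrative, the completion loop is a simplified fixed-point iteration rather than the paper's ADMM solver, and the paper's exact product and norm for arbitrary (non-orthogonal) transforms may differ.

```python
# Minimal sketch of tensor completion under a mode-3 linear transform L
# (real-valued and invertible assumed for simplicity). Illustrative only.
import numpy as np

def mode3_transform(X, L):
    """Apply the linear transform L (n3 x n3) along the third mode of X."""
    n1, n2, n3 = X.shape
    return (X.reshape(n1 * n2, n3) @ L.T).reshape(n1, n2, n3)

def transform_tnn(X, L):
    """Sum of nuclear norms of the frontal slices in the transform domain."""
    XL = mode3_transform(X, L)
    return sum(np.linalg.norm(XL[:, :, k], ord='nuc') for k in range(X.shape[2]))

def svt(M, tau):
    """Singular value thresholding: proximal operator of tau * nuclear norm."""
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    return U @ np.diag(np.maximum(s - tau, 0.0)) @ Vt

def complete(obs, mask, L, tau=1.0, n_iter=200):
    """Toy iteration for  min ||X||_TNN,L  s.t.  X agrees with obs on mask.

    obs: observed tensor (zeros elsewhere), mask: boolean array of observed
    entries. A real ADMM solver would carry dual variables; this sketch only
    alternates transform-domain shrinkage with re-imposing the observations.
    """
    Linv = np.linalg.inv(L)          # assumes L is invertible
    X = obs * mask
    for _ in range(n_iter):
        XL = mode3_transform(X, L)
        for k in range(X.shape[2]):  # facewise low-rank shrinkage
            XL[:, :, k] = svt(XL[:, :, k], tau)
        X = mode3_transform(XL, Linv)
        X[mask] = obs[mask]          # enforce the data constraint
    return X
```

Here transform_tnn is the objective being shrunk by the facewise SVT steps; when L is the discrete Fourier transform matrix, this norm reduces, up to a constant scaling, to the standard t-SVD-based tensor nuclear norm.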
Related papers
- Tensor cumulants for statistical inference on invariant distributions [49.80012009682584]
We show that tensor PCA becomes computationally hard at a critical value of the signal's magnitude.
We define a new set of objects, tensor cumulants, which provide an explicit, near-orthogonal basis for invariants of a given degree.
It also lets us analyze a new problem of distinguishing between different ensembles.
arXiv Detail & Related papers (2024-04-29T14:33:24Z) - Provable Tensor Completion with Graph Information [49.08648842312456]
We introduce a novel model, theory, and algorithm for solving the dynamic graph regularized tensor completion problem.
We develop a comprehensive model simultaneously capturing the low-rank and similarity structure of the tensor.
In terms of theory, we showcase the alignment between the proposed graph smoothness regularization and a weighted tensor nuclear norm (a generic form of such an objective is sketched after this list).
arXiv Detail & Related papers (2023-10-04T02:55:10Z) - Optimizing Orthogonalized Tensor Deflation via Random Tensor Theory [5.124256074746721]
This paper tackles the problem of recovering a low-rank signal tensor with possibly correlated components from a random noisy tensor.
Non-orthogonal components may alter the tensor deflation mechanism, thereby preventing efficient recovery.
An efficient tensor deflation algorithm is proposed by optimizing the parameter introduced in the deflation mechanism.
arXiv Detail & Related papers (2023-02-11T22:23:27Z) - Error Analysis of Tensor-Train Cross Approximation [88.83467216606778]
We provide accuracy guarantees in terms of the entire tensor for both exact and noisy measurements.
Results are verified by numerical experiments, and may have important implications for the usefulness of cross approximations for high-order tensors.
arXiv Detail & Related papers (2022-07-09T19:33:59Z) - Noisy Tensor Completion via Low-rank Tensor Ring [41.86521269183527]
Tensor completion is a fundamental tool for incomplete data analysis, where the goal is to predict missing entries from partial observations.
Existing methods often make the explicit or implicit assumption that the observed entries are noise-free to provide a theoretical guarantee of exact recovery of missing entries.
This paper proposes a novel noisy tensor completion model that remedies the inability of existing works to handle high-order, noise-degraded observations.
arXiv Detail & Related papers (2022-03-14T14:09:43Z) - Robust M-estimation-based Tensor Ring Completion: a Half-quadratic
Minimization Approach [14.048989759890475]
We develop a robust approach to tensor ring completion that uses an M-estimator as its error statistic.
We present two HQ-based algorithms based on truncated singular value decomposition and matrix factorization.
arXiv Detail & Related papers (2021-06-19T04:37:50Z) - MTC: Multiresolution Tensor Completion from Partial and Coarse
Observations [49.931849672492305]
Existing completion formulations mostly rely on partial observations from a single tensor.
We propose an efficient Multi-resolution Completion model (MTC) to solve the problem.
arXiv Detail & Related papers (2021-06-14T02:20:03Z) - Implicit Regularization in Deep Tensor Factorization [0.0]
We introduce deep Tucker and TT unconstrained factorization to deal with the completion task.
Experiments on both synthetic and real data show that gradient descent promotes low-rank solutions.
arXiv Detail & Related papers (2021-05-04T07:48:40Z) - Uncertainty quantification for nonconvex tensor completion: Confidence
intervals, heteroscedasticity and optimality [92.35257908210316]
We study the problem of estimating a low-rank tensor given incomplete and corrupted observations.
We find that it attains unimprovable $\ell_2$ accuracy.
arXiv Detail & Related papers (2020-06-15T17:47:13Z) - Supervised Learning for Non-Sequential Data: A Canonical Polyadic
Decomposition Approach [85.12934750565971]
Efficient modelling of feature interactions underpins supervised learning for non-sequential tasks.
To alleviate the combinatorial growth in the number of interaction parameters, it has been proposed to implicitly represent the model parameters as a tensor.
For enhanced expressiveness, we generalize the framework to allow feature mapping to arbitrarily high-dimensional feature vectors.
arXiv Detail & Related papers (2020-01-27T22:38:40Z)
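For the graph-regularized entry above ("Provable Tensor Completion with Graph Information"), the sketch referenced in that item shows the generic shape such models usually take, not necessarily that paper's exact formulation: a tensor nuclear norm promoting low-rankness plus Laplacian smoothness penalties along the graph-equipped modes, constrained to agree with the observed entries.

```latex
\min_{\mathcal{X}} \; \|\mathcal{X}\|_{\mathrm{TNN}}
  \;+\; \sum_{k} \lambda_k \,
  \operatorname{tr}\!\bigl( \mathbf{X}_{(k)}^{\top} \mathbf{L}_k \mathbf{X}_{(k)} \bigr)
\quad \text{s.t.} \quad
\mathcal{P}_{\Omega}(\mathcal{X}) = \mathcal{P}_{\Omega}(\mathcal{T}),
```

where $\mathbf{X}_{(k)}$ is the mode-$k$ unfolding, $\mathbf{L}_k$ a graph Laplacian encoding similarity along mode $k$, $\lambda_k \ge 0$ trade-off weights, and $\mathcal{P}_{\Omega}$ the projection onto the observed entries of the data tensor $\mathcal{T}$.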