Robust Low-tubal-rank Tensor Completion based on Tensor Factorization
and Maximum Correntropy Criterion
- URL: http://arxiv.org/abs/2010.11740v1
- Date: Thu, 22 Oct 2020 13:56:55 GMT
- Title: Robust Low-tubal-rank Tensor Completion based on Tensor Factorization
and Maximum Correntropy Criterion
- Authors: Yicong He, George K. Atia
- Abstract summary: We propose a new objective function for low-tubal-rank tensor completion, which uses correntropy as the error measure to mitigate the effect of the outliers.
Numerical results using both synthetic and real data demonstrate the robust and superior performance of the proposed algorithms.
- Score: 12.02023514105999
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The goal of tensor completion is to recover a tensor from a subset of its
entries, often by exploiting its low-rank property. Among several useful
definitions of tensor rank, the low-tubal-rank was shown to give a valuable
characterization of the inherent low-rank structure of a tensor. While some
low-tubal-rank tensor completion algorithms with favorable performance have
been recently proposed, these algorithms utilize second-order statistics to
measure the error residual, which may not work well when the observed entries
contain large outliers. In this paper, we propose a new objective function for
low-tubal-rank tensor completion, which uses correntropy as the error measure
to mitigate the effect of the outliers. To efficiently optimize the proposed
objective, we leverage a half-quadratic minimization technique whereby the
optimization is transformed to a weighted low-tubal-rank tensor factorization
problem. Subsequently, we propose two simple and efficient algorithms to obtain
the solution and provide their convergence and complexity analysis. Numerical
results using both synthetic and real data demonstrate the robust and superior
performance of the proposed algorithms.
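The abstract's core idea (replacing the squared error with correntropy, then using half-quadratic minimization to turn the problem into a weighted factorization solved by alternating updates) can be sketched as follows. This is an illustrative toy on matrices, not the authors' t-SVD-based tensor algorithm; the function names, the annealing-free fixed kernel width `sigma`, and the ridge term `lam` are my own assumptions for a self-contained demo.

```python
import numpy as np

def correntropy_weights(residual, sigma):
    """Half-quadratic weights induced by the Gaussian-kernel correntropy
    loss: w = exp(-e^2 / (2 sigma^2)). Large residuals (outliers) get
    weights near zero, so they barely influence the least-squares updates."""
    return np.exp(-residual**2 / (2.0 * sigma**2))

def robust_factor_completion(X, mask, rank, sigma=3.0, iters=50, lam=1e-3):
    """Illustrative matrix analogue of correntropy-based completion:
    alternate between (1) recomputing half-quadratic weights from the
    current residual and (2) solving the resulting *weighted* low-rank
    factorization by row-wise ridge-regularized weighted least squares."""
    m, n = X.shape
    rng = np.random.default_rng(0)
    A = rng.standard_normal((m, rank))
    B = rng.standard_normal((n, rank))
    I = np.eye(rank)
    for _ in range(iters):
        R = mask * (X - A @ B.T)
        W = mask * correntropy_weights(R, sigma)  # zero on unobserved entries
        for i in range(m):  # update rows of A with B fixed
            w = W[i][:, None]
            A[i] = np.linalg.solve((B * w).T @ B + lam * I,
                                   (B * w).T @ (X[i] * mask[i]))
        for j in range(n):  # update rows of B with A fixed
            w = W[:, j][:, None]
            B[j] = np.linalg.solve((A * w).T @ A + lam * I,
                                   (A * w).T @ (X[:, j] * mask[:, j]))
    return A, B
```

With the weights held fixed, each inner step is an ordinary weighted least-squares problem, which is the "weighted low-tubal-rank tensor factorization problem" the abstract refers to (here collapsed to the matrix case for brevity).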
Related papers
- Stochastic Optimization for Non-convex Problem with Inexact Hessian
Matrix, Gradient, and Function [99.31457740916815]
Trust-region (TR) and adaptive regularization using cubics have proven to have some very appealing theoretical properties.
We show that TR and ARC methods can simultaneously provide inexact computations of the Hessian, gradient, and function values.
arXiv Detail & Related papers (2023-10-18T10:29:58Z) - Low-Rank Tensor Completion via Novel Sparsity-Inducing Regularizers [30.920908325825668]
To alleviate the bias of the l1-norm in the low-rank tensor completion problem, non-convex surrogates/regularizers have been suggested.
These regularizers are applied to nuclear-norm minimization, and efficient algorithms based on the method of multipliers are proposed.
arXiv Detail & Related papers (2023-10-10T01:00:13Z) - Learning Unnormalized Statistical Models via Compositional Optimization [73.30514599338407]
Noise-contrastive estimation (NCE) has been proposed by formulating the objective as the logistic loss of the real data and the artificial noise.
In this paper, we study a direct approach to optimizing the negative log-likelihood of unnormalized models.
arXiv Detail & Related papers (2023-06-13T01:18:16Z) - A Novel Tensor Factorization-Based Method with Robustness to Inaccurate
Rank Estimation [9.058215418134209]
We propose a new tensor norm with a dual low-rank constraint, which utilizes the low-rank prior and rank information at the same time.
It is proven theoretically that the resulting tensor completion model can effectively avoid performance degradation caused by inaccurate rank estimation.
Based on this, the total cost at each iteration of the optimization algorithm is reduced to $\mathcal{O}(n^3 \log n + kn^3)$ from the $\mathcal{O}(n^4)$ achieved with standard methods.
arXiv Detail & Related papers (2023-05-19T06:26:18Z) - Error Analysis of Tensor-Train Cross Approximation [88.83467216606778]
We provide accuracy guarantees in terms of the entire tensor for both exact and noisy measurements.
Results are verified by numerical experiments, and may have important implications for the usefulness of cross approximations for high-order tensors.
arXiv Detail & Related papers (2022-07-09T19:33:59Z) - Noisy Tensor Completion via Low-rank Tensor Ring [41.86521269183527]
Tensor completion is a fundamental tool for incomplete data analysis, where the goal is to predict missing entries from partial observations.
Existing methods often make the explicit or implicit assumption that the observed entries are noise-free to provide a theoretical guarantee of exact recovery of missing entries.
This paper proposes a novel noisy tensor completion model, which complements the incompetence of existing works in handling the degeneration of high-order and noisy observations.
arXiv Detail & Related papers (2022-03-14T14:09:43Z) - Robust M-estimation-based Tensor Ring Completion: a Half-quadratic
Minimization Approach [14.048989759890475]
We develop a robust approach to tensor ring completion that uses an M-estimator as its error statistic.
We present two HQ-based algorithms based on truncated singular value decomposition and matrix factorization.
arXiv Detail & Related papers (2021-06-19T04:37:50Z) - High Probability Complexity Bounds for Non-Smooth Stochastic Optimization with Heavy-Tailed Noise [51.31435087414348]
It is essential to theoretically guarantee that algorithms provide small objective residual with high probability.
Existing methods for non-smooth convex optimization have complexity bounds with dependence on confidence level.
We propose novel stepsize rules for two methods with gradient clipping.
arXiv Detail & Related papers (2021-06-10T17:54:21Z) - Scaling and Scalability: Provable Nonconvex Low-Rank Tensor Estimation
from Incomplete Measurements [30.395874385570007]
A fundamental task is to faithfully recover tensors from highly incomplete measurements.
We develop an algorithm to directly recover the tensor factors in the Tucker decomposition.
We show that it provably converges at a linear rate independent of the condition number of the ground truth tensor for two canonical problems.
arXiv Detail & Related papers (2021-04-29T17:44:49Z) - Uncertainty quantification for nonconvex tensor completion: Confidence
intervals, heteroscedasticity and optimality [92.35257908210316]
We study the problem of estimating a low-rank tensor given incomplete and corrupted observations.
We find that it attains unimprovable $\ell_2$ accuracy.
arXiv Detail & Related papers (2020-06-15T17:47:13Z) - Support recovery and sup-norm convergence rates for sparse pivotal
estimation [79.13844065776928]
In high dimensional sparse regression, pivotal estimators are estimators for which the optimal regularization parameter is independent of the noise level.
We show minimax sup-norm convergence rates for non-smoothed and smoothed, single-task and multi-task square-root Lasso-type estimators.
arXiv Detail & Related papers (2020-01-15T16:11:04Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.