Uncertainty quantification for nonconvex tensor completion: Confidence
intervals, heteroscedasticity and optimality
- URL: http://arxiv.org/abs/2006.08580v1
- Date: Mon, 15 Jun 2020 17:47:13 GMT
- Title: Uncertainty quantification for nonconvex tensor completion: Confidence
intervals, heteroscedasticity and optimality
- Authors: Changxiao Cai, H. Vincent Poor, Yuxin Chen
- Abstract summary: We study the problem of estimating a low-rank tensor given incomplete and corrupted observations.
We find that it attains unimprovable $\ell_{2}$ accuracy, in both the rates and the pre-constants.
- Score: 92.35257908210316
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We study the distribution and uncertainty of nonconvex optimization for noisy
tensor completion -- the problem of estimating a low-rank tensor given
incomplete and corrupted observations of its entries. Focusing on a two-stage
estimation algorithm proposed by Cai et al. (2019), we characterize the
distribution of this nonconvex estimator down to fine scales. This
distributional theory in turn allows one to construct valid and short
confidence intervals for both the unseen tensor entries and the unknown tensor
factors. The proposed inferential procedure enjoys several important features:
(1) it is fully adaptive to noise heteroscedasticity, and (2) it is data-driven
and automatically adapts to unknown noise distributions. Furthermore, our
findings unveil the statistical optimality of nonconvex tensor completion: it
attains unimprovable $\ell_{2}$ accuracy -- including both the rates and the
pre-constants -- when estimating both the unknown tensor and the underlying
tensor factors.
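To make the inferential procedure concrete, here is a minimal sketch of how such an entrywise confidence interval could be formed, assuming (as the paper's distributional theory suggests) that the estimation error of each entry is approximately Gaussian with an estimable variance. The function and variable names are illustrative, not the authors' implementation.

```python
import numpy as np
from scipy.stats import norm

def entrywise_ci(entry_hat, var_hat, alpha=0.05):
    """Two-sided (1 - alpha) confidence interval for one tensor entry,
    assuming entry_hat - entry ~ N(0, var_hat) approximately."""
    z = norm.ppf(1.0 - alpha / 2.0)    # e.g. 1.96 for alpha = 0.05
    half_width = z * np.sqrt(var_hat)  # plug-in standard error
    return entry_hat - half_width, entry_hat + half_width

# Example: a 95% interval around a hypothetical estimated entry.
lo, hi = entrywise_ci(entry_hat=0.83, var_hat=0.004)
```

Adaptivity to heteroscedasticity enters through `var_hat`, which would be estimated separately for each entry rather than assumed constant.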
Related papers
- Uncertainty-Aware Relational Graph Neural Network for Few-Shot Knowledge Graph Completion [12.887073684904147]
Few-shot knowledge graph completion (FKGC) aims to query the unseen facts of a relation given its few-shot reference entity pairs.
Existing FKGC works neglect the uncertainty inherent in the limited references, which makes them more susceptible to noise in the reference samples.
We propose a novel uncertainty-aware few-shot KG completion framework (UFKGC) to model uncertainty for a better understanding of the limited data.
arXiv Detail & Related papers (2024-03-07T14:23:25Z)
- Breaking the Heavy-Tailed Noise Barrier in Stochastic Optimization Problems [56.86067111855056]
We consider clipped stochastic optimization methods under heavy-tailed noise with structured density.
We show that it is possible to obtain convergence rates faster than $\mathcal{O}(K^{-(\alpha - 1)/\alpha})$ when the gradients have finite moments of order $\alpha$.
We prove that the resulting estimates have negligible bias and controllable variance.
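For intuition, a hedged sketch of the gradient-clipping step such methods build on: the stochastic gradient is shrunk onto a ball of fixed radius before the update, which tames rare but huge heavy-tailed noise realizations. The stepsize and clipping level below are placeholders, not the paper's rules.

```python
import numpy as np

def clipped_sgd_step(x, stoch_grad, stepsize, clip_level):
    """One SGD step with the gradient clipped to norm <= clip_level."""
    g_norm = np.linalg.norm(stoch_grad)
    if g_norm > clip_level:
        stoch_grad = stoch_grad * (clip_level / g_norm)  # project onto the ball
    return x - stepsize * stoch_grad
```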
arXiv Detail & Related papers (2023-11-07T17:39:17Z)
- Optimizing the Noise in Self-Supervised Learning: from Importance Sampling to Noise-Contrastive Estimation [80.07065346699005]
It is widely assumed that the optimal noise distribution should be made equal to the data distribution, as in Generative Adversarial Networks (GANs).
We turn to Noise-Contrastive Estimation which grounds this self-supervised task as an estimation problem of an energy-based model of the data.
We soberly conclude that the optimal noise may be hard to sample from, and the gain in efficiency can be modest compared to choosing the noise distribution equal to the data's.
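As a reference point, a minimal sketch of the NCE objective discussed here: a logistic discrimination between data samples and noise samples (one noise sample per datum). `log_model` and `log_noise` are assumed callables returning log-densities; they are illustrative placeholders, not the paper's setup.

```python
import numpy as np

def nce_loss(x_data, x_noise, log_model, log_noise):
    """Binary NCE loss: classify data vs. noise via the log-density ratio."""
    g_data = log_model(x_data) - log_noise(x_data)    # logit for "is data"
    g_noise = log_model(x_noise) - log_noise(x_noise)
    sigmoid = lambda t: 1.0 / (1.0 + np.exp(-t))
    return -np.mean(np.log(sigmoid(g_data)) + np.log(1.0 - sigmoid(g_noise)))
```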
arXiv Detail & Related papers (2023-01-23T19:57:58Z)
- Error Analysis of Tensor-Train Cross Approximation [88.83467216606778]
We provide accuracy guarantees in terms of the entire tensor for both exact and noisy measurements.
Results are verified by numerical experiments, and may have important implications for the usefulness of cross approximations for high-order tensors.
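A small sketch of the quantity such guarantees control: the relative error of a tensor-train (TT) reconstruction measured over the entire tensor rather than only the sampled fibers. The toy cores and noise level below are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(0)
G1 = rng.standard_normal((1, 4, 2))  # TT cores with TT-ranks (1, 2, 3, 1)
G2 = rng.standard_normal((2, 5, 3))
G3 = rng.standard_normal((3, 6, 1))

# Contract the cores into the full 4 x 5 x 6 tensor.
T_tt = np.einsum('aib,bjc,ckd->ijk', G1, G2, G3)

# Compare against a (synthetic) noisy reference tensor.
T_ref = T_tt + 0.01 * rng.standard_normal(T_tt.shape)
rel_err = np.linalg.norm(T_ref - T_tt) / np.linalg.norm(T_ref)
```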
arXiv Detail & Related papers (2022-07-09T19:33:59Z)
- Noisy Tensor Completion via Low-rank Tensor Ring [41.86521269183527]
Tensor completion is a fundamental tool for incomplete data analysis, where the goal is to predict missing entries from partial observations.
Existing methods often make the explicit or implicit assumption that the observed entries are noise-free to provide a theoretical guarantee of exact recovery of missing entries.
This paper proposes a novel noisy tensor completion model that addresses the inability of existing works to handle high-order and noisy observations.
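For context, a sketch of the observation model this line of work targets: only a random subset of entries of a (roughly) low-rank tensor is seen, and each observed entry is corrupted by noise. Shapes and noise levels are arbitrary illustration values.

```python
import numpy as np

rng = np.random.default_rng(1)
# A loosely low-rank 10 x 10 x 10 tensor (rank-3 along the first unfolding).
X = (rng.standard_normal((10, 3)) @ rng.standard_normal((3, 100))).reshape(10, 10, 10)

mask = rng.random(X.shape) < 0.2              # observe ~20% of entries
noise = 0.1 * rng.standard_normal(X.shape)
Y = np.where(mask, X + noise, np.nan)         # NaN marks unobserved entries
```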
arXiv Detail & Related papers (2022-03-14T14:09:43Z)
- Robust M-estimation-based Tensor Ring Completion: a Half-quadratic Minimization Approach [14.048989759890475]
We develop a robust approach to tensor ring completion that uses an M-estimator as its error statistic.
We present two half-quadratic (HQ) based algorithms, built on truncated singular value decomposition and matrix factorization, respectively.
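A sketch of the half-quadratic idea behind such M-estimators: the robust loss is minimized by alternating between closed-form auxiliary weights and a weighted least-squares fit, so large (outlier) residuals are downweighted. The Welsch loss used below is one common choice, not necessarily the paper's.

```python
import numpy as np

def welsch_weights(residuals, sigma):
    """Half-quadratic auxiliary weights for the Welsch M-estimator loss."""
    return np.exp(-(residuals / sigma) ** 2)

r = np.array([0.1, -0.2, 5.0])     # the outlier residual 5.0 ...
w = welsch_weights(r, sigma=1.0)   # ... receives a near-zero weight
```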
arXiv Detail & Related papers (2021-06-19T04:37:50Z)
- High Probability Complexity Bounds for Non-Smooth Stochastic Optimization with Heavy-Tailed Noise [51.31435087414348]
It is essential to theoretically guarantee that algorithms achieve a small objective residual with high probability.
Existing methods for non-smooth stochastic convex optimization have complexity bounds that depend on the confidence level.
We propose novel stepsize rules for two methods with gradient clipping.
arXiv Detail & Related papers (2021-06-10T17:54:21Z)
- Scaling and Scalability: Provable Nonconvex Low-Rank Tensor Estimation from Incomplete Measurements [30.395874385570007]
A fundamental task is to faithfully recover tensors from highly incomplete measurements.
We develop an algorithm to directly recover the tensor factors in the Tucker decomposition.
We show that it provably converges at a linear rate independent of the condition number of the ground truth tensor for two canonical problems.
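For intuition, a hedged sketch of the scaled-gradient idea that yields such condition-number-free rates, shown for the simpler matrix analogue X ~ L R^T with full observations: each factor's gradient is preconditioned by the other factor's Gram matrix. This mirrors, but is not, the paper's Tucker algorithm.

```python
import numpy as np

def scaled_gd_step(L, R, X, eta):
    """One ScaledGD step on f(L, R) = 0.5 * ||L @ R.T - X||_F^2."""
    residual = L @ R.T - X
    grad_L, grad_R = residual @ R, residual.T @ L
    L_new = L - eta * grad_L @ np.linalg.inv(R.T @ R)  # Gram preconditioner
    R_new = R - eta * grad_R @ np.linalg.inv(L.T @ L)
    return L_new, R_new
```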
arXiv Detail & Related papers (2021-04-29T17:44:49Z)
- Inference for Low-rank Tensors -- No Need to Debias [22.163281794187544]
In this paper, we consider the statistical inference for several low-rank tensor models.
For the rank-one tensor PCA model, we establish the theory for inference on each individual singular vector.
Finally, simulations are presented to corroborate our theoretical discoveries.
arXiv Detail & Related papers (2020-12-29T16:48:02Z)
- Nonparametric Score Estimators [49.42469547970041]
Estimating the score from a set of samples generated by an unknown distribution is a fundamental task in inference and learning of probabilistic models.
We provide a unifying view of these estimators under the framework of regularized nonparametric regression.
We propose score estimators based on iterative regularization that enjoy computational benefits from curl-free kernels and fast convergence.
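As a quick reminder of the object being estimated, the score is the gradient of the log-density; for a Gaussian it is available in closed form, giving a ground truth against which a nonparametric estimator fit from samples alone could be checked. This is a generic illustration, not the paper's estimator.

```python
import numpy as np

def gaussian_score(x, mu=0.0, sigma=1.0):
    """Score of N(mu, sigma^2): d/dx log p(x) = -(x - mu) / sigma**2."""
    return -(x - mu) / sigma ** 2

xs = np.linspace(-2.0, 2.0, 5)
true_score = gaussian_score(xs)  # target values for a learned score estimator
```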
arXiv Detail & Related papers (2020-05-20T15:01:03Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences arising from its use.