Inference for Low-rank Tensors -- No Need to Debias
- URL: http://arxiv.org/abs/2012.14844v1
- Date: Tue, 29 Dec 2020 16:48:02 GMT
- Title: Inference for Low-rank Tensors -- No Need to Debias
- Authors: Dong Xia and Anru R. Zhang and Yuchen Zhou
- Abstract summary: In this paper, we consider the statistical inference for several low-rank tensor models.
For the rank-one tensor PCA model, we establish the theory for inference on general linear forms of principal components and on each entry of the parameter tensor.
Finally, simulations are presented to corroborate our theoretical discoveries.
- Score: 22.163281794187544
- License: http://creativecommons.org/licenses/by-nc-sa/4.0/
- Abstract: In this paper, we consider the statistical inference for several low-rank
tensor models. Specifically, in the Tucker low-rank tensor PCA or regression
model, provided with any estimates achieving some attainable error rate, we
develop the data-driven confidence regions for the singular subspace of the
parameter tensor based on the asymptotic distribution of an updated estimate by
two-iteration alternating minimization. The asymptotic distributions are
established under some essential conditions on the signal-to-noise ratio (in
PCA model) or sample size (in regression model). If the parameter tensor is
further orthogonally decomposable, we develop the methods and theory for
inference on each individual singular vector. For the rank-one tensor PCA
model, we establish the asymptotic distribution for general linear forms of
principal components and confidence interval for each entry of the parameter
tensor. Finally, numerical simulations are presented to corroborate our
theoretical discoveries.
In all these models, we observe that different from many matrix/vector
settings in existing work, debiasing is not required to establish the
asymptotic distribution of estimates or to make statistical inference on
low-rank tensors. In fact, due to the widely observed
statistical-computational gap for low-rank tensor estimation, one usually
requires conditions stronger than the statistical (or information-theoretic)
limit to ensure that computationally feasible estimation is achievable.
Surprisingly, such conditions "incidentally" render feasible low-rank tensor
inference without debiasing.
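As a toy illustration of the alternating-minimization updates described in the abstract, the sketch below simulates a rank-one tensor PCA model and refines a warm-start estimate with two tensor power iterations (the rank-one analogue of two-iteration alternating minimization). This is a minimal NumPy sketch, not the authors' implementation; the dimension, signal strength, and warm-start construction are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate a rank-one tensor PCA model: T = lam * u (x) u (x) u + Z
d, lam = 30, 50.0                      # illustrative dimension and signal strength
u = rng.standard_normal(d)
u /= np.linalg.norm(u)                 # unit-norm signal vector
Z = rng.standard_normal((d, d, d))     # i.i.d. Gaussian noise
T = lam * np.einsum("i,j,k->ijk", u, u, u) + Z

# Warm start: any estimate achieving an attainable error rate; here the truth
# plus a moderate random perturbation, renormalized.
u_hat = u + 0.3 * rng.standard_normal(d)
u_hat /= np.linalg.norm(u_hat)

# Two power-iteration updates: contract T along two modes, then renormalize.
# No debiasing step is applied afterwards.
for _ in range(2):
    v = np.einsum("ijk,j,k->i", T, u_hat, u_hat)
    u_hat = v / np.linalg.norm(v)

# Sign-invariant alignment with the truth; close to 1 when the SNR is high.
alignment = abs(u_hat @ u)
print(f"|<u_hat, u>| = {alignment:.4f}")
```

In the high-SNR regime assumed here, the refined estimate aligns closely with the true component after just two iterations, which is the phenomenon the paper's inference theory builds on.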
Related papers
- Robust Estimation for Kernel Exponential Families with Smoothed Total Variation Distances [2.317910166616341]
In statistical inference, we commonly assume that samples are independent and identically distributed from a probability distribution.
In this paper, we explore the application of GAN-like estimators to a general class of statistical models.
arXiv Detail & Related papers (2024-10-28T05:50:47Z)
- Statistical Inference in Tensor Completion: Optimal Uncertainty Quantification and Statistical-to-Computational Gaps [7.174572371800217]
This paper presents a simple yet efficient method for statistical inference of tensor linear forms using incomplete and noisy observations.
It is suitable for various statistical inference tasks, including constructing confidence intervals, inference under heteroskedastic and sub-exponential noise, and simultaneous testing.
arXiv Detail & Related papers (2024-10-15T03:09:52Z)
- Tensor cumulants for statistical inference on invariant distributions [49.80012009682584]
We show that PCA becomes computationally hard at a critical value of the signal's magnitude.
We define a new set of objects, which provide an explicit, near-orthogonal basis for invariants of a given degree.
It also lets us analyze a new problem of distinguishing between different ensembles.
arXiv Detail & Related papers (2024-04-29T14:33:24Z)
- Towards Faster Non-Asymptotic Convergence for Diffusion-Based Generative Models [49.81937966106691]
We develop a suite of non-asymptotic theory towards understanding the data generation process of diffusion models.
In contrast to prior works, our theory is developed based on an elementary yet versatile non-asymptotic approach.
arXiv Detail & Related papers (2023-06-15T16:30:08Z)
- Statistical and computational rates in high rank tensor estimation [11.193504036335503]
Higher-order tensor datasets arise commonly in recommendation systems, neuroimaging, and social networks.
We consider a generative latent variable tensor model that incorporates both high rank and low rank models.
We show that the statistical-computational gap emerges only for latent variable tensors of order 3 or higher.
arXiv Detail & Related papers (2023-04-08T15:34:26Z)
- Kernel-based off-policy estimation without overlap: Instance optimality beyond semiparametric efficiency [53.90687548731265]
We study optimal procedures for estimating a linear functional based on observational data.
For any convex and symmetric function class $\mathcal{F}$, we derive a non-asymptotic local minimax bound on the mean-squared error.
arXiv Detail & Related papers (2023-01-16T02:57:37Z)
- Statistical Efficiency of Score Matching: The View from Isoperimetry [96.65637602827942]
We show a tight connection between statistical efficiency of score matching and the isoperimetric properties of the distribution being estimated.
We formalize these results both in the infinite-sample and the finite-sample regimes.
arXiv Detail & Related papers (2022-10-03T06:09:01Z)
- Near-optimal inference in adaptive linear regression [60.08422051718195]
Even simple methods like least squares can exhibit non-normal behavior when data is collected in an adaptive manner.
We propose a family of online debiasing estimators to correct these distributional anomalies in least squares estimation.
We demonstrate the usefulness of our theory via applications to multi-armed bandit, autoregressive time series estimation, and active learning with exploration.
arXiv Detail & Related papers (2021-07-05T21:05:11Z)
- Scaling and Scalability: Provable Nonconvex Low-Rank Tensor Estimation from Incomplete Measurements [30.395874385570007]
A fundamental task is to faithfully recover tensors from highly incomplete measurements.
We develop an algorithm to directly recover the tensor factors in the Tucker decomposition.
We show that it provably converges at a linear rate independent of the condition number of the ground truth tensor for two canonical problems.
arXiv Detail & Related papers (2021-04-29T17:44:49Z)
- Uncertainty quantification for nonconvex tensor completion: Confidence intervals, heteroscedasticity and optimality [92.35257908210316]
We study the problem of estimating a low-rank tensor given incomplete and corrupted observations.
We find that it attains unimprovable rates of $\ell_2$ accuracy.
arXiv Detail & Related papers (2020-06-15T17:47:13Z)
- An Optimal Statistical and Computational Framework for Generalized Tensor Estimation [10.899518267165666]
This paper describes a flexible framework for low-rank tensor estimation problems.
It includes many important instances from applications in computational imaging, genomics, and network analysis.
arXiv Detail & Related papers (2020-02-26T01:54:35Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented (including all content) and is not responsible for any consequences of its use.