Mode-wise Tensor Decompositions: Multi-dimensional Generalizations of
CUR Decompositions
- URL: http://arxiv.org/abs/2103.11037v1
- Date: Fri, 19 Mar 2021 22:00:21 GMT
- Title: Mode-wise Tensor Decompositions: Multi-dimensional Generalizations of
CUR Decompositions
- Authors: HanQin Cai, Keaton Hamm, Longxiu Huang, Deanna Needell
- Abstract summary: We study the characterization, perturbation analysis, and an efficient sampling strategy for two primary tensor CUR approximations, namely Chidori and Fiber CUR.
- Score: 9.280330114137778
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Low rank tensor approximation is a fundamental tool in modern machine
learning and data science. In this paper, we study the characterization,
perturbation analysis, and an efficient sampling strategy for two primary
tensor CUR approximations, namely Chidori and Fiber CUR. We characterize exact
tensor CUR decompositions for low multilinear rank tensors. We also present
theoretical error bounds for the tensor CUR approximations when (adversarial or
Gaussian) noise appears. Moreover, we show that low-cost uniform sampling is
sufficient for tensor CUR approximations if the tensor has an incoherent
structure. Empirical performance evaluations, with both synthetic and
real-world datasets, establish the advantage of the tensor CUR approximations
over other state-of-the-art low multilinear rank tensor approximations.
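As a concrete illustration of the sampling-based construction described in the abstract, the sketch below builds a synthetic tensor of low multilinear rank, uniformly samples one index set per mode, and forms a Chidori-style CUR approximation by multiplying the core subtensor R with C_k U_k^+ along each mode k. The helper functions, tensor sizes, and sample sizes are illustrative assumptions and do not come from the paper; this is a minimal sketch, not the authors' implementation.

```python
import numpy as np

def unfold(T, mode):
    """Mode-k unfolding: move axis `mode` to the front and flatten the rest."""
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def mode_product(T, M, mode):
    """Mode-k product (T x_k M): multiply M into the mode-k fibers of T."""
    out = np.tensordot(M, np.moveaxis(T, mode, 0), axes=(1, 0))
    return np.moveaxis(out, 0, mode)

rng = np.random.default_rng(0)

# Synthetic 30 x 40 x 50 tensor with multilinear rank (2, 2, 2).
A = rng.standard_normal((2, 2, 2))
for k, n in enumerate((30, 40, 50)):
    A = mode_product(A, rng.standard_normal((n, 2)), k)

# Uniformly sample an index set I_k per mode (cheap sampling; enough indices
# per mode relative to the multilinear rank suffices for incoherent tensors).
I = [rng.choice(n, size=4, replace=False) for n in A.shape]

R = A[np.ix_(*I)]                  # core subtensor A(I_1, I_2, I_3)
approx = R
for k in range(A.ndim):
    # C_k: keep mode k full, restrict every other mode to its sampled indices.
    idx = [np.arange(A.shape[j]) if j == k else I[j] for j in range(A.ndim)]
    C_k = unfold(A[np.ix_(*idx)], k)
    U_k = unfold(R, k)             # the rows of C_k indexed by I_k
    approx = mode_product(approx, C_k @ np.linalg.pinv(U_k), k)

# Exact recovery (up to round-off) when each U_k has rank equal to the
# corresponding multilinear rank of A.
print("relative error:", np.linalg.norm(approx - A) / np.linalg.norm(A))
```

Roughly speaking, Fiber CUR differs in that the columns of each C_k are fibers sampled from the mode-k unfolding independently of the other modes, whereas the Chidori-style construction above aligns them with the core subtensor.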
Related papers
- Error Analysis of Tensor-Train Cross Approximation [88.83467216606778]
We provide accuracy guarantees in terms of the entire tensor for both exact and noisy measurements.
Results are verified by numerical experiments, and may have important implications for the usefulness of cross approximations for high-order tensors.
arXiv Detail & Related papers (2022-07-09T19:33:59Z) - Fast and Provable Tensor Robust Principal Component Analysis via Scaled
Gradient Descent [30.299284742925852]
This paper tackles tensor robust principal component analysis (RPCA).
It aims to recover a low-rank tensor from observations contaminated by sparse corruptions.
We show that the proposed algorithm achieves better and more scalable performance than state-of-the-art matrix and tensor RPCA algorithms.
arXiv Detail & Related papers (2022-06-18T04:01:32Z) - Truncated tensor Schatten p-norm based approach for spatiotemporal
traffic data imputation with complicated missing patterns [77.34726150561087]
We introduce four complicated missing patterns, including random missing and three fiber-like missing cases according to the mode-driven fibers.
Despite the nonconvexity of the objective function in our model, we derive the optimal solutions by integrating the alternating direction method of multipliers (ADMM).
arXiv Detail & Related papers (2022-05-19T08:37:56Z) - Optimizing Information-theoretical Generalization Bounds via Anisotropic
Noise in SGLD [73.55632827932101]
We optimize the information-theoretical generalization bound by manipulating the noise structure in SGLD.
We prove that, under a constraint guaranteeing low empirical risk, the optimal noise covariance is the square root of the expected gradient covariance.
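Restated in symbols (the notation here is an assumption, not taken from the paper): writing g for the stochastic gradient and Sigma^* for the optimal SGLD noise covariance, the claim is roughly

    \Sigma^{\ast} \;\propto\; \operatorname{Cov}(g)^{1/2}.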
arXiv Detail & Related papers (2021-10-26T15:02:27Z) - Tensor Full Feature Measure and Its Nonconvex Relaxation Applications to
Tensor Recovery [1.8899300124593645]
We propose a new tensor sparsity measure called Full Feature Measure (FFM).
It can simultaneously describe the feature information of each dimension, and connect the Tucker rank with the tensor tube rank.
Two efficient models based on FFM are proposed, and two Alternating Direction Method of Multipliers (ADMM) algorithms are developed to solve the proposed models.
arXiv Detail & Related papers (2021-09-25T01:44:34Z) - Fast Robust Tensor Principal Component Analysis via Fiber CUR
Decomposition [8.821527277034336]
We study the problem of tensor robust principal component analysis (TRPCA), which aims to separate an underlying low-multilinear-rank tensor and a sparse outlier tensor from their sum.
In this work, we propose a fast non-convex decomposition algorithm, coined Robust CUR, for TRPCA problems with sparse corruptions.
arXiv Detail & Related papers (2021-08-23T23:49:40Z) - Scalable Variational Gaussian Processes via Harmonic Kernel
Decomposition [54.07797071198249]
We introduce a new scalable variational Gaussian process approximation which provides a high fidelity approximation while retaining general applicability.
We demonstrate that, on a range of regression and classification problems, our approach can exploit input space symmetries such as translations and reflections.
Notably, our approach achieves state-of-the-art results on CIFAR-10 among pure GP models.
arXiv Detail & Related papers (2021-06-10T18:17:57Z) - Scaling and Scalability: Provable Nonconvex Low-Rank Tensor Estimation
from Incomplete Measurements [30.395874385570007]
A fundamental task is to faithfully recover tensors from highly incomplete measurements.
We develop an algorithm to directly recover the tensor factors in the Tucker decomposition.
We show that it provably converges at a linear rate that is independent of the condition number of the ground truth tensor for two canonical problems.
arXiv Detail & Related papers (2021-04-29T17:44:49Z) - Multiplicative noise and heavy tails in stochastic optimization [62.993432503309485]
Stochastic optimization is central to modern machine learning, but the precise role of the stochasticity in its success is still unclear.
We show that multiplicative noise, which commonly arises due to variance in local rates of convergence, results in heavy-tailed stationary behaviour in the parameters.
A detailed analysis is conducted describing the effect of key factors, including step size and data, with similar results observed on state-of-the-art neural network models.
arXiv Detail & Related papers (2020-06-11T09:58:01Z) - Enhanced nonconvex low-rank approximation of tensor multi-modes for
tensor completion [1.3406858660972554]
We propose a novel low-rank approximation of tensor multi-modes (LRATM) model.
A block successive upper-bound minimization method-based algorithm is designed to efficiently solve the proposed model.
Numerical results on three types of public multi-dimensional datasets show that our algorithm can recover a variety of low-rank tensors.
arXiv Detail & Related papers (2020-05-28T08:53:54Z) - Multi-View Spectral Clustering Tailored Tensor Low-Rank Representation [105.33409035876691]
This paper explores the problem of multi-view spectral clustering (MVSC) based on tensor low-rank modeling.
We design a novel structured tensor low-rank norm tailored to MVSC.
We show that the proposed method outperforms state-of-the-art methods to a significant extent.
arXiv Detail & Related papers (2020-04-30T11:52:12Z)