When Random Tensors meet Random Matrices
- URL: http://arxiv.org/abs/2112.12348v1
- Date: Thu, 23 Dec 2021 04:05:01 GMT
- Title: When Random Tensors meet Random Matrices
- Authors: Mohamed El Amine Seddik and Maxime Guillaud and Romain Couillet
- Abstract summary: This paper studies asymmetric order-$d$ spiked tensor models with Gaussian noise.
We show that the analysis of the considered model boils down to the analysis of an equivalent spiked symmetric \textit{block-wise} random matrix.
- Score: 50.568841545067144
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Relying on random matrix theory (RMT), this paper studies asymmetric
order-$d$ spiked tensor models with Gaussian noise. Using the variational
definition of the singular vectors and values of [Lim, 2005], we show that the
analysis of the considered model boils down to the analysis of an equivalent
spiked symmetric \textit{block-wise} random matrix, that is constructed from
\textit{contractions} of the studied tensor with the singular vectors
associated to its best rank-1 approximation. Our approach allows the exact
characterization of the almost sure asymptotic singular value and alignments of
the corresponding singular vectors with the true spike components, when
$\frac{n_i}{\sum_{j=1}^d n_j}\to c_i\in [0, 1]$ with $n_i$'s the tensor
dimensions. In contrast to other works that rely mostly on tools from
statistical physics to study random tensors, our results rely solely on
classical RMT tools such as Stein's lemma. Finally, classical RMT results
concerning spiked random matrices are recovered as a particular case.
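To make the setting concrete, here is a minimal NumPy sketch (not the authors' code; the $1/\sqrt{n}$ noise scaling and the $d=3$ block layout below are assumptions inferred from the abstract). It plants a rank-1 spike in a Gaussian order-3 tensor, computes the best rank-1 approximation by alternating power iteration, and forms the symmetric block-wise matrix of contractions that the paper's analysis reduces to.

```python
# Minimal numerical sketch (assumptions noted above, not the paper's code):
# spiked order-3 tensor, best rank-1 approximation, contraction matrix.
import numpy as np

rng = np.random.default_rng(0)
n1, n2, n3 = 100, 150, 200
n = n1 + n2 + n3
beta = 5.0  # signal strength

# True unit-norm spike components and the noisy tensor T
x = rng.standard_normal(n1); x /= np.linalg.norm(x)
y = rng.standard_normal(n2); y /= np.linalg.norm(y)
z = rng.standard_normal(n3); z /= np.linalg.norm(z)
T = beta * np.einsum('i,j,k->ijk', x, y, z) \
    + rng.standard_normal((n1, n2, n3)) / np.sqrt(n)

# Best rank-1 approximation via alternating (higher-order) power iteration
u = rng.standard_normal(n1); v = rng.standard_normal(n2); w = rng.standard_normal(n3)
for _ in range(200):
    u = np.einsum('ijk,j,k->i', T, v, w); u /= np.linalg.norm(u)
    v = np.einsum('ijk,i,k->j', T, u, w); v /= np.linalg.norm(v)
    w = np.einsum('ijk,i,j->k', T, u, v); w /= np.linalg.norm(w)
sigma = np.einsum('ijk,i,j,k->', T, u, v, w)  # dominant singular value of T

# Contractions of T with the singular vectors, arranged as a symmetric
# block-wise matrix with zero diagonal blocks (a plausible layout for d = 3)
Tw = np.einsum('ijk,k->ij', T, w)  # n1 x n2
Tv = np.einsum('ijk,j->ik', T, v)  # n1 x n3
Tu = np.einsum('ijk,i->jk', T, u)  # n2 x n3
Phi = np.block([
    [np.zeros((n1, n1)), Tw,                 Tv],
    [Tw.T,               np.zeros((n2, n2)), Tu],
    [Tv.T,               Tu.T,               np.zeros((n3, n3))],
])

# At an exact fixed point, Phi @ [u; v; w] = 2 * sigma * [u; v; w] for d = 3,
# which is what ties the tensor problem to a spiked random matrix.
s = np.concatenate([u, v, w])
print('eigen-residual:', np.linalg.norm(Phi @ s - 2 * sigma * s))
print('alignments |<u,x>|, |<v,y>|, |<w,z>|:', abs(u @ x), abs(v @ y), abs(w @ z))
```

With $\beta$ this far above the detection threshold, the printed alignments should be close to 1; the paper's contribution is the exact almost-sure limits of $\sigma$ and of these alignments in the regime $n_i/\sum_j n_j \to c_i$.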
Related papers
- Analysing heavy-tail properties of Stochastic Gradient Descent by means of Stochastic Recurrence Equations [0.0]
In recent works, it has been observed that the heavy-tail properties of Stochastic Gradient Descent (SGD) can be studied in the probabilistic framework of stochastic recursions (a toy recursion is simulated after this entry).
We answer several open questions of the quoted paper and extend their results by applying the theory of irreducible-proximal (i-p) matrices.
arXiv Detail & Related papers (2024-03-20T13:39:19Z)
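To illustrate the recursion framework on a toy example (an assumption for exposition, not the paper's model): one-dimensional SGD on a least-squares loss is exactly a stochastic recurrence $x_{k+1} = A_k x_k + B_k$, for which Kesten-type theory predicts power-law stationary tails.

```python
# Toy simulation (illustrative, not the paper's setting): SGD on the loss
# (a*x - y)^2 / 2 with Gaussian data gives the stochastic recurrence
#   x_{k+1} = (1 - eta * a_k^2) * x_k + eta * a_k * y_k,
# i.e. x_{k+1} = A_k x_k + B_k. A large step size makes |A_k| exceed 1 with
# positive probability, which produces heavy stationary tails.
import numpy as np

rng = np.random.default_rng(1)
eta = 0.9                                # deliberately large step size
x, samples = 0.0, []
for k in range(200_000):
    a = rng.standard_normal()
    y_obs = rng.standard_normal()        # pure-noise targets, optimum at 0
    x = (1.0 - eta * a * a) * x + eta * a * y_obs
    if k >= 1_000:                       # discard burn-in
        samples.append(abs(x))

# Rapidly growing extreme quantiles are the heavy-tail signature.
print(np.quantile(samples, [0.5, 0.99, 0.9999]))
```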
- On the Accuracy of Hotelling-Type Asymmetric Tensor Deflation: A Random Tensor Analysis [14.809070996761013]
Hotelling-type tensor deflation is studied in the presence of noise.
We analytically characterize the estimated singular values and the alignment of estimated and true singular vectors at each step of the deflation procedure.
arXiv Detail & Related papers (2023-10-28T14:40:10Z)
- On the Accuracy of Hotelling-Type Tensor Deflation: A Random Tensor Analysis [16.28927188636617]
A rank-$r$ asymmetric spiked model of the form $\sum_{i=1}^{r} \beta_i A_i + W$ is considered.
We provide a study of Hotelling-type deflation in the large-dimensional regime (a toy deflation loop is sketched after this entry).
arXiv Detail & Related papers (2022-11-16T16:01:56Z)
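The deflation procedure analyzed in the two papers above can be sketched with a naive greedy loop (a minimal sketch assuming an order-3 Gaussian model; the helper names are hypothetical, not the papers' estimator):

```python
# Minimal sketch of Hotelling-type deflation on an order-3 tensor: estimate
# the dominant rank-1 term by alternating power iteration, subtract it, and
# repeat r times. Hypothetical helper names; not the papers' estimator.
import numpy as np

def rank1_power(T, iters=100, seed=0):
    """Dominant rank-1 term (sigma, u, v, w) via higher-order power iteration."""
    rng = np.random.default_rng(seed)
    u, v, w = (rng.standard_normal(d) for d in T.shape)
    for _ in range(iters):
        u = np.einsum('ijk,j,k->i', T, v, w); u /= np.linalg.norm(u)
        v = np.einsum('ijk,i,k->j', T, u, w); v /= np.linalg.norm(v)
        w = np.einsum('ijk,i,j->k', T, u, v); w /= np.linalg.norm(w)
    return np.einsum('ijk,i,j,k->', T, u, v, w), u, v, w

def hotelling_deflation(T, r):
    """Greedily peel off r estimated rank-1 components."""
    components = []
    for _ in range(r):
        sigma, u, v, w = rank1_power(T)
        components.append((sigma, u, v, w))
        T = T - sigma * np.einsum('i,j,k->ijk', u, v, w)
    return components

# Example: two planted components in a 50 x 60 x 70 Gaussian tensor.
rng = np.random.default_rng(3)
dims, betas = (50, 60, 70), [8.0, 4.0]
unit = lambda g: g / np.linalg.norm(g)
spikes = [[unit(rng.standard_normal(d)) for d in dims] for _ in betas]
T = sum(b * np.einsum('i,j,k->ijk', *vs) for b, vs in zip(betas, spikes))
T = T + rng.standard_normal(dims) / np.sqrt(sum(dims))
for sigma, *_ in hotelling_deflation(T, r=2):
    print('estimated component strength:', round(float(sigma), 2))
```

Because the noise couples the successive steps, the later estimated strengths and singular vectors are biased; characterizing those biases at each deflation step is exactly what the two papers above do.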
- Test Set Sizing Via Random Matrix Theory [91.3755431537592]
This paper uses techniques from Random Matrix Theory to find the ideal training-testing data split for a simple linear regression.
It defines "ideal" as satisfying the integrity metric, i.e., the empirical model error equals the actual measurement noise.
This paper is the first to solve for the training and test size for any model in a way that is truly optimal.
arXiv Detail & Related papers (2021-12-11T13:18:33Z)
- A Random Matrix Perspective on Random Tensors [40.89521598604993]
We study the spectra of random matrices arising from contractions of a given random tensor.
Our technique yields a hitherto unknown characterization of the local maximum of the ML problem.
Our approach is versatile and can be extended to other models, such as asymmetric, non-Gaussian and higher-order ones.
arXiv Detail & Related papers (2021-08-02T10:42:22Z)
- A Precise Performance Analysis of Support Vector Regression [105.94855998235232]
We study the hard and soft support vector regression techniques applied to a set of $n$ linear measurements.
Our results are then used to optimally tune the parameters intervening in the design of hard and soft support vector regression algorithms.
arXiv Detail & Related papers (2021-05-21T14:26:28Z)
- Stochastic Approximation for Online Tensorial Independent Component Analysis [98.34292831923335]
Independent component analysis (ICA) has been a popular dimension reduction tool in statistical machine learning and signal processing.
In this paper, we present a by-product online tensorial algorithm that estimates each independent component.
arXiv Detail & Related papers (2020-12-28T18:52:37Z)
- Understanding Implicit Regularization in Over-Parameterized Single Index Model [55.41685740015095]
We design regularization-free algorithms for the high-dimensional single index model.
We provide theoretical guarantees for the induced implicit regularization phenomenon.
arXiv Detail & Related papers (2020-07-16T13:27:47Z)
- Asymptotic errors for convex penalized linear regression beyond Gaussian matrices [23.15629681360836]
We consider the problem of learning a coefficient vector $x_0 \in \mathbb{R}^N$ from noisy linear observations.
We provide a rigorous derivation of an explicit formula for the mean squared error.
We show that our predictions agree remarkably well with numerics even for very moderate sizes.
arXiv Detail & Related papers (2020-02-11T13:43:32Z)
- Optimal Iterative Sketching with the Subsampled Randomized Hadamard Transform [64.90148466525754]
We study the performance of iterative sketching for least-squares problems.
We show that the convergence rates for Haar and randomized Hadamard matrices are identical, and asymptotically improve upon random projections.
These techniques may be applied to other algorithms that employ randomized dimension reduction (a toy SRHT sketching loop follows this entry).
arXiv Detail & Related papers (2020-02-03T16:17:50Z)
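As a rough illustration of the pipeline (a hedged sketch: the step rule, the sketch size, and refreshing the sketch each iteration are assumptions, not the paper's exact algorithm), here is iterative Hessian-style sketching for least squares with a subsampled randomized Hadamard transform:

```python
# Hedged sketch of iterative sketching for least squares with an SRHT.
# Assumes n is a power of two so scipy's explicit Hadamard matrix applies;
# at scale one would use a fast Walsh-Hadamard transform instead.
import numpy as np
from scipy.linalg import hadamard

rng = np.random.default_rng(2)
n, d, m = 1024, 20, 200                  # rows, columns, sketch size
A = rng.standard_normal((n, d))
b = A @ rng.standard_normal(d) + 0.1 * rng.standard_normal(n)
H = hadamard(n) / np.sqrt(n)             # orthonormal Hadamard matrix

def srht(M):
    """One SRHT draw: random signs, Hadamard mixing, uniform row subsampling."""
    signs = rng.choice([-1.0, 1.0], size=n)
    rows = rng.choice(n, size=m, replace=False)
    return np.sqrt(n / m) * (H @ (signs[:, None] * M))[rows]

x = np.zeros(d)
for _ in range(10):
    SA = srht(A)                         # fresh sketch each iteration
    # Newton-like refinement step with the sketched Hessian (SA)^T (SA)
    x += np.linalg.solve(SA.T @ SA, A.T @ (b - A @ x))

exact = np.linalg.lstsq(A, b, rcond=None)[0]
print('distance to exact least-squares solution:', np.linalg.norm(x - exact))
```

The paper's question is which random matrix to use inside such a loop: per the summary above, Haar and randomized Hadamard sketches share the same convergence rate and asymptotically improve upon i.i.d. random projections.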
This list is automatically generated from the titles and abstracts of the papers on this site.
The site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.