Rademacher Random Projections with Tensor Networks
- URL: http://arxiv.org/abs/2110.13970v2
- Date: Thu, 28 Oct 2021 05:04:54 GMT
- Title: Rademacher Random Projections with Tensor Networks
- Authors: Beheshteh T. Rakhshan and Guillaume Rabusseau
- Abstract summary: We consider a tensorized random projection relying on the Tensor Train (TT) decomposition. Experiments on synthetic data demonstrate that the tensorized Rademacher RP can outperform the tensorized Gaussian RP.
- Score: 10.140147080535222
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Random projections (RP) have recently emerged as popular techniques in the machine learning community for their ability to reduce the dimension of very high-dimensional tensors. Following the work in [29], we consider a tensorized random projection relying on the Tensor Train (TT) decomposition, where each element of the core tensors is drawn from a Rademacher distribution. Our theoretical results reveal that the Gaussian low-rank tensor represented in compressed form in TT format in [29] can be replaced by a TT tensor with core elements drawn from a Rademacher distribution with the same embedding size. Experiments on synthetic data demonstrate that the tensorized Rademacher RP can outperform the tensorized Gaussian RP studied in [29]. In addition, we show both theoretically and experimentally that the tensorized RP in the Matrix Product Operator (MPO) format proposed in [5] for performing SVD on large matrices is not a Johnson-Lindenstrauss transform (JLT) and is therefore not a well-suited random projection map.
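To make the construction concrete, here is a minimal NumPy sketch of a TT-based Rademacher random projection: each of the k output coordinates is the inner product between the input tensor and an independent rank-R TT tensor whose core entries are i.i.d. Rademacher (±1). The 1/sqrt(k R^(N-1)) normalization below makes the squared norm unbiased for zero-mean, unit-variance cores; the exact scaling, ranks, and dimensions in the paper may differ, so treat this as an illustrative sketch rather than the paper's map.

```python
import numpy as np

def tt_inner(cores, X):
    """Inner product <A, X> between a TT tensor A (given by its cores) and a
    full tensor X, contracted mode by mode without forming A explicitly."""
    M = X.reshape(1, -1)                          # (r_0 = 1, d_1 * ... * d_N)
    for core in cores:                            # core: (r_prev, d_n, r_next)
        r_prev, d, _ = core.shape
        M = M.reshape(r_prev, d, -1)              # (r_prev, d_n, remaining modes)
        M = np.einsum('idr,idm->rm', core, M)     # contract rank and mode indices
    return M.item()                               # r_N = 1, nothing remaining

def tt_rademacher_rp(shape, k, rank, rng):
    """Draw k independent TT tensors with i.i.d. Rademacher (+/-1) core entries."""
    ranks = [1] + [rank] * (len(shape) - 1) + [1]
    return [
        [rng.choice([-1.0, 1.0], size=(ranks[n], d, ranks[n + 1]))
         for n, d in enumerate(shape)]
        for _ in range(k)
    ]

def project(maps, X, rank):
    """Map X to R^k; the scaling makes E[||f(X)||^2] = ||X||^2 for i.i.d.
    zero-mean unit-variance cores (assumed normalization)."""
    k, N = len(maps), X.ndim
    scale = 1.0 / np.sqrt(k * rank ** (N - 1))
    return scale * np.array([tt_inner(cores, X) for cores in maps])

rng = np.random.default_rng(0)
shape, k, rank = (10, 10, 10), 200, 2
X = rng.normal(size=shape)
maps = tt_rademacher_rp(shape, k, rank, rng)
print(np.linalg.norm(project(maps, X, rank)) / np.linalg.norm(X))  # close to 1
```

Storing the k TT maps costs O(k N d R^2) memory instead of O(k d^N) for a dense projection matrix, which is the point of the tensorized construction.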
Related papers
- Oblivious subspace embeddings for compressed Tucker decompositions [8.349583867022204] (2024-06-13)
This work establishes general Johnson-Lindenstrauss type guarantees for the estimation of Tucker decompositions.
On moderately large face image and fMRI neuroimaging datasets, empirical results show that substantial dimension reduction is possible.
- Exact Non-Oblivious Performance of Rademacher Random Embeddings [79.28094304325116] (2023-03-21)
This paper revisits the performance of Rademacher random projections.
It establishes novel statistical guarantees that are numerically sharp and non-oblivious with respect to the input data.
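For reference, the object these bounds concern, a dense Rademacher random projection, can be sketched in a few lines; the dimensions and seed are arbitrary, and the 1/sqrt(k) scaling makes E[||Sx||^2] = ||x||^2:

```python
import numpy as np

rng = np.random.default_rng(0)
d, k = 10_000, 400
x = rng.normal(size=d)
S = rng.choice([-1.0, 1.0], size=(k, d)) / np.sqrt(k)  # dense Rademacher map
print(np.linalg.norm(S @ x) / np.linalg.norm(x))       # distortion ratio, close to 1
```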
- Error Analysis of Tensor-Train Cross Approximation [88.83467216606778] (2022-07-09)
We provide accuracy guarantees in terms of the entire tensor for both exact and noisy measurements.
Results are verified by numerical experiments, and may have important implications for the usefulness of cross approximations for high-order tensors.
- Langevin Monte Carlo for Contextual Bandits [72.00524614312002] (2022-06-22)
Langevin Monte Carlo Thompson Sampling (LMC-TS) is proposed to directly sample from the posterior distribution in contextual bandits.
We prove that the proposed algorithm achieves the same sublinear regret bound as the best Thompson sampling algorithms for a special case of contextual bandits.
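As a loose illustration of the idea (a sketch, not the paper's LMC-TS algorithm), the loop below runs unadjusted Langevin steps on the posterior of a linear contextual bandit and acts greedily on the resulting sample; the prior scale `lam`, the decaying step size, and the number of inner steps are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(1)
d, n_arms, T, lam, n_steps = 5, 10, 500, 1.0, 50
theta_star = rng.normal(size=d)                 # unknown true parameter
theta = np.zeros(d)                             # current posterior sample
X_hist, y_hist = [], []

for t in range(T):
    arms = rng.normal(size=(n_arms, d))         # context/feature vectors
    X = np.array(X_hist).reshape(-1, d)
    y = np.array(y_hist)
    eta = 0.5 / (lam + len(y))                  # step shrinks as the posterior sharpens
    for _ in range(n_steps):                    # unadjusted Langevin on the log posterior
        grad = -lam * theta + X.T @ (y - X @ theta)  # Gaussian prior + Gaussian likelihood
        theta = theta + eta * grad + np.sqrt(2 * eta) * rng.normal(size=d)
    a = int(np.argmax(arms @ theta))            # act on the posterior sample
    X_hist.append(arms[a])
    y_hist.append(arms[a] @ theta_star + rng.normal())  # unit-variance reward noise
```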
- When Random Tensors meet Random Matrices [50.568841545067144] (2021-12-23)
This paper studies asymmetric order-$d$ spiked tensor models with Gaussian noise.
We show that the analysis of the considered model boils down to the analysis of an equivalent spiked symmetric block-wise random matrix.
- Efficient Tensor Robust PCA under Hybrid Model of Tucker and Tensor Train [33.33426557160802] (2021-12-20)
We propose an efficient tensor robust principal component analysis (TRPCA) method under a hybrid model of Tucker and TT decompositions.
Specifically, we show theoretically that the TT nuclear norm (TTNN) of the original large tensor can be equivalently converted to that of a much smaller tensor via a Tucker compression format.
Numerical experiments on both synthetic and real-world tensor data verify the superiority of the proposed model.
- MTC: Multiresolution Tensor Completion from Partial and Coarse Observations [49.931849672492305] (2021-06-14)
Existing completion formulations mostly rely on partial observations from a single tensor.
We propose an efficient Multi-resolution Completion model (MTC) to solve the problem.
- Tensor Train Random Projection [0.0] (2020-10-21)
This work proposes a novel tensor train random projection (TTRP) method for dimension reduction.
Our TTRP is systematically constructed through a tensor train representation with TT-ranks equal to one.
Based on the tensor train format, this new random projection method can speed up the dimension reduction procedure for high-dimensional datasets.
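With all TT-ranks equal to one, each of the k rows of the projection collapses to a Kronecker product of per-mode random vectors, so applying a row is just a sequence of mode contractions. A minimal sketch, assuming Rademacher entries and a 1/sqrt(k) scaling (TTRP's own cores and normalization may differ):

```python
import numpy as np

def ttrp_rank1(shape, k, rng):
    """k rows, each a Kronecker product of per-mode Rademacher vectors."""
    return [[rng.choice([-1.0, 1.0], size=d) for d in shape] for _ in range(k)]

def ttrp_apply(rows, X):
    out = np.empty(len(rows))
    for i, vecs in enumerate(rows):
        t = X
        for g in vecs:                               # contract mode n with g_n
            t = np.tensordot(g, t, axes=([0], [0]))
        out[i] = t                                   # scalar after N contractions
    return out / np.sqrt(len(rows))

rng = np.random.default_rng(0)
X = rng.normal(size=(8, 8, 8))
rows = ttrp_rank1(X.shape, 150, rng)
print(np.linalg.norm(ttrp_apply(rows, X)) / np.linalg.norm(X))  # close to 1
```

Storing one rank-1 row costs only the sum of the mode dimensions rather than their product, which is where the speedup for high-dimensional data comes from.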
- Robust Tensor Principal Component Analysis: Exact Recovery via Deterministic Model [5.414544833902815] (2020-08-05)
This paper proposes a new method to analyze robust tensor principal component analysis (RTPCA).
It is based on the recently developed tensor-tensor product and tensor singular value decomposition (t-SVD).
- T-Basis: a Compact Representation for Neural Networks [89.86997385827055] (2020-07-13)
We introduce T-Basis, a concept for compactly representing a set of tensors of arbitrary shapes, as often encountered in neural networks.
We evaluate the proposed approach on the task of neural network compression and demonstrate that it reaches high compression rates at acceptable performance drops.
- Tensorized Random Projections [8.279639493543401] (2020-03-11)
We propose two tensorized random projection maps relying on the tensor train (TT) and CP decomposition formats, respectively.
The two maps offer very low memory requirements and can be applied efficiently when the inputs are low rank tensors given in the CP or TT format.
Our results reveal that the TT format is substantially superior to CP in terms of the size of the random projection needed to achieve the same distortion ratio.
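To see what the CP counterpart looks like, here is a hedged sketch of CP-format projection rows, each a sum of R Kronecker products of Rademacher vectors; the 1/sqrt(R) factor keeps the second moment matched, though the paper's comparison concerns how large the embedding must be for a given distortion, not this normalization:

```python
import numpy as np

def cp_inner(factors, X):
    """<sum_r v_1^r x ... x v_N^r, X> for one CP-format projection row.
    factors[n] has shape (R, d_n); dividing by sqrt(R) keeps E[.^2] = ||X||^2."""
    R = factors[0].shape[0]
    total = 0.0
    for r in range(R):                               # one rank-1 term per r
        t = X
        for v in factors:
            t = np.tensordot(v[r], t, axes=([0], [0]))
        total += float(t)
    return total / np.sqrt(R)

rng = np.random.default_rng(0)
X = rng.normal(size=(8, 8, 8))
k, R = 150, 4
rows = [[rng.choice([-1.0, 1.0], size=(R, d)) for d in X.shape] for _ in range(k)]
y = np.array([cp_inner(f, X) for f in rows]) / np.sqrt(k)
print(np.linalg.norm(y) / np.linalg.norm(X))         # close to 1
```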
This list is automatically generated from the titles and abstracts of the papers on this site.