Tensorized Random Projections
- URL: http://arxiv.org/abs/2003.05101v1
- Date: Wed, 11 Mar 2020 03:56:44 GMT
- Title: Tensorized Random Projections
- Authors: Beheshteh T. Rakhshan and Guillaume Rabusseau
- Abstract summary: We propose two tensorized random projection maps relying on the tensor train (TT) and CP decomposition format, respectively.
The two maps offer very low memory requirements and can be applied efficiently when the inputs are low rank tensors given in the CP or TT format.
Our results reveal that the TT format is substantially superior to CP in terms of the size of the random projection needed to achieve the same distortion ratio.
- Score: 8.279639493543401
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We introduce a novel random projection technique for efficiently reducing the
dimension of very high-dimensional tensors. Building upon classical results on
Gaussian random projections and Johnson-Lindenstrauss transforms (JLT), we
propose two tensorized random projection maps relying on the tensor train (TT)
and CP decomposition format, respectively. The two maps offer very low memory
requirements and can be applied efficiently when the inputs are low rank
tensors given in the CP or TT format. Our theoretical analysis shows that the
dense Gaussian matrix in JLT can be replaced by a low-rank tensor implicitly
represented in compressed form with random factors, while still approximately
preserving the Euclidean distance of the projected inputs. In addition, our
results reveal that the TT format is substantially superior to CP in terms of
the size of the random projection needed to achieve the same distortion ratio.
Experiments on synthetic data validate our theoretical analysis and demonstrate
the superiority of the TT decomposition.
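To make the construction concrete, below is a minimal NumPy sketch (this page's illustration, not the authors' code) of the TT-based map: each of the k output coordinates is the inner product of the input tensor with an independent random tensor stored only through its TT cores. The function names are invented for this sketch, and the scaling constant is simply the one that preserves squared norms in expectation for i.i.d. N(0,1) cores; the paper derives the precise constants and distortion guarantees.

```python
import numpy as np

def random_tt_cores(dims, rank, rng):
    # i.i.d. N(0,1) TT cores G_n of shape (r_{n-1}, d_n, r_n), boundary ranks 1
    ranks = [1] + [rank] * (len(dims) - 1) + [1]
    return [rng.standard_normal((ranks[n], d, ranks[n + 1]))
            for n, d in enumerate(dims)]

def tt_inner_product(cores, x):
    # <A, X> where A is given by its TT cores and X is a dense tensor,
    # contracted one mode at a time
    res = x[np.newaxis, ...]                        # prepend a bond index of size 1
    for core in cores:                              # core: (r_prev, d_n, r_next)
        res = np.tensordot(core, res, axes=([0, 1], [0, 1]))
    return float(res.squeeze())

def make_tt_projection(dims, k, rank, rng):
    # One linear map f: R^{d_1 x ... x d_N} -> R^k shared by all inputs
    maps = [random_tt_cores(dims, rank, rng) for _ in range(k)]
    scale = 1.0 / np.sqrt(k * rank ** (len(dims) - 1))  # E[||f(x)||^2] = ||x||^2
    return lambda x: scale * np.array([tt_inner_product(c, x) for c in maps])

rng = np.random.default_rng(0)
dims, k, rank = (8, 8, 8, 8), 500, 2
f = make_tt_projection(dims, k, rank, rng)
x, y = rng.standard_normal(dims), rng.standard_normal(dims)
print(np.linalg.norm(x - y), np.linalg.norm(f(x) - f(y)))  # roughly equal
```

Only the small cores are ever stored, which is where the memory savings over a dense k x (d_1...d_N) Gaussian matrix come from; when the input is itself given in CP or TT format, the same contraction can be carried out factor by factor without forming x densely.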
Related papers
- Error Analysis of Tensor-Train Cross Approximation [88.83467216606778]
We provide accuracy guarantees in terms of the entire tensor for both exact and noisy measurements.
Results are verified by numerical experiments, and may have important implications for the usefulness of cross approximations for high-order tensors.
arXiv Detail & Related papers (2022-07-09T19:33:59Z)
- Orthogonal Matrix Retrieval with Spatial Consensus for 3D Unknown-View Tomography [58.60249163402822]
Unknown-view tomography (UVT) reconstructs a 3D density map from its 2D projections at unknown, random orientations.
The proposed OMR is more robust and performs significantly better than the previous state-of-the-art OMR approach.
arXiv Detail & Related papers (2022-07-06T21:40:59Z)
- Tensor Shape Search for Optimum Data Compression [6.610488230919323]
We study the effect of the tensor shape on the tensor decomposition.
We propose an optimization model to find an optimum shape for the tensor train (TT) decomposition.
arXiv Detail & Related papers (2022-05-21T17:58:33Z)
- 2D+3D facial expression recognition via embedded tensor manifold regularization [16.98176664818354]
A novel approach via embedded tensor manifold regularization for 2D+3D facial expression recognition (FERETMR) is proposed.
We establish the first-order optimality condition in terms of stationary points, and then design a block coordinate descent (BCD) algorithm with convergence analysis.
Numerical results on the BU-3DFE and Bosphorus databases demonstrate the effectiveness of our proposed approach.
arXiv Detail & Related papers (2022-01-29T06:11:00Z)
- Near-optimal estimation of smooth transport maps with kernel sums-of-squares [81.02564078640275]
Under smoothness conditions, the squared Wasserstein distance between two distributions can be efficiently computed with appealing statistical error upper bounds.
The object of interest for applications such as generative modeling is the underlying optimal transport map.
We propose the first tractable algorithm for which the statistical $L^2$ error on the maps nearly matches the existing minimax lower bounds for smooth map estimation.
arXiv Detail & Related papers (2021-12-03T13:45:36Z)
- Rademacher Random Projections with Tensor Networks [10.140147080535222]
We consider a tensorized random projection relying on the tensor train decomposition.
Experiments on synthetic data demonstrate that tensorized Rademacher RP can outperform the tensorized Gaussian RP (see the sketch below).
arXiv Detail & Related papers (2021-10-26T19:18:20Z)
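As a rough illustration of the Rademacher variant (reusing the helper names from the TT sketch above, which are this page's own invention rather than code from either paper), the Gaussian cores can simply be swapped for i.i.d. +/-1 entries; since those entries are zero mean with unit variance, the same expectation-level scaling applies.

```python
def random_rademacher_tt_cores(dims, rank, rng):
    # Same TT structure as random_tt_cores above, but with i.i.d. +/-1
    # (Rademacher) core entries; drop-in replacement inside make_tt_projection
    ranks = [1] + [rank] * (len(dims) - 1) + [1]
    return [rng.choice([-1.0, 1.0], size=(ranks[n], d, ranks[n + 1]))
            for n, d in enumerate(dims)]
```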
- Tensor Random Projection for Low Memory Dimension Reduction [22.715952036307648]
Random projections reduce the dimension of a set of vectors while preserving structural information.
This paper proposes a novel use of row-product random matrices in random projection.
It requires substantially less memory than existing dimension reduction maps (see the sketch below).
arXiv Detail & Related papers (2021-04-30T22:08:04Z)
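For intuition, here is a minimal NumPy sketch of one common reading of "row-product random matrices": each row of the k x (d_1...d_N) projection matrix is the Kronecker product of the corresponding rows of small independent Gaussian matrices, so only the small factors are ever stored. The names and the 1/sqrt(k) scaling are assumptions of this illustration, not details taken from the paper.

```python
import numpy as np

def row_product_factors(dims, k, rng):
    # Small factors A_n of shape (k, d_n); row i of the implied projection
    # matrix is kron(A_1[i], ..., A_N[i]), but it is never materialized.
    return [rng.standard_normal((k, d)) for d in dims]

def apply_row_product(factors, x):
    # Apply the map to a dense tensor x: row i acting on vec(x) is just a
    # sequence of contractions of x with the rows A_n[i].
    k = factors[0].shape[0]
    out = np.empty(k)
    for i in range(k):
        z = x
        for A in factors:
            z = np.tensordot(A[i], z, axes=([0], [0]))  # contract leading mode
        out[i] = z
    return out / np.sqrt(k)  # E[||out||^2] = ||x||^2 for i.i.d. N(0,1) factors

rng = np.random.default_rng(0)
x = rng.standard_normal((8, 8, 8))
y = apply_row_product(row_product_factors(x.shape, 300, rng), x)
```

The memory footprint is k * (d_1 + ... + d_N) numbers instead of the k * d_1 * ... * d_N required by a dense Gaussian map.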
- Spectral Tensor Train Parameterization of Deep Learning Layers [136.4761580842396]
We study low-rank parameterizations of weight matrices with embedded spectral properties in the Deep Learning context.
We show the effects of neural network compression in the classification setting and both compression and improved stability training in the generative adversarial training setting.
arXiv Detail & Related papers (2021-03-07T00:15:44Z)
- Tensor Train Random Projection [0.0]
This work proposes a novel tensor train random projection (TTRP) method for dimension reduction.
Our TTRP is systematically constructed through a tensor train representation with TT-ranks equal to one.
Based on the tensor train format, this new random projection method can speed up the dimension reduction procedure for high-dimensional datasets (see the sketch below).
arXiv Detail & Related papers (2020-10-21T07:31:45Z)
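Since a TT representation in which every TT-rank equals one collapses to a Kronecker product of small factor matrices, such a projection can be applied one mode at a time without ever assembling the full matrix. The sketch below illustrates that reading with Gaussian factors; the entry distribution, scaling, and names are assumptions of this illustration rather than details of the TTRP paper.

```python
import numpy as np

def rank_one_tt_map(in_dims, out_dims, rng):
    # One small factor C_n of shape (k_n, d_n) per mode; the implied matrix is
    # kron(C_1, ..., C_N), of size prod(out_dims) x prod(in_dims).
    return [rng.standard_normal((k, d)) for k, d in zip(out_dims, in_dims)]

def apply_rank_one_tt_map(factors, x):
    # Mode-n products: contract each mode of x with its factor, never forming
    # the full Kronecker-product matrix.
    for n, C in enumerate(factors):                          # C: (k_n, d_n)
        x = np.moveaxis(np.tensordot(C, x, axes=([1], [n])), 0, n)
    out_size = np.prod([C.shape[0] for C in factors])
    return x / np.sqrt(out_size)  # E[||output||^2] = ||x||^2 for N(0,1) factors

rng = np.random.default_rng(0)
x = rng.standard_normal((10, 10, 10))
y = apply_rank_one_tt_map(rank_one_tt_map(x.shape, (4, 4, 4), rng), x)  # 4x4x4 output
```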
- T-Basis: a Compact Representation for Neural Networks [89.86997385827055]
We introduce T-Basis, a concept for a compact representation of a set of tensors, each of an arbitrary shape, which is often seen in Neural Networks.
We evaluate the proposed approach on the task of neural network compression and demonstrate that it reaches high compression rates at acceptable performance drops.
arXiv Detail & Related papers (2020-07-13T19:03:22Z)
- Augmented Sliced Wasserstein Distances [55.028065567756066]
We propose a new family of distance metrics, called augmented sliced Wasserstein distances (ASWDs).
ASWDs are constructed by first mapping samples to higher-dimensional hypersurfaces parameterized by neural networks.
Numerical results demonstrate that the ASWD significantly outperforms other Wasserstein variants for both synthetic and real-world problems.
arXiv Detail & Related papers (2020-06-15T23:00:08Z)