Non-PSD Matrix Sketching with Applications to Regression and
Optimization
- URL: http://arxiv.org/abs/2106.08544v1
- Date: Wed, 16 Jun 2021 04:07:48 GMT
- Title: Non-PSD Matrix Sketching with Applications to Regression and
Optimization
- Authors: Zhili Feng, Fred Roosta, David P. Woodruff
- Abstract summary: We present dimensionality reduction methods for non-PSD matrices and their "square-roots".
We show how these techniques can be used for multiple downstream tasks.
- Score: 56.730993511802865
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: A variety of dimensionality reduction techniques have been applied for
computations involving large matrices. The underlying matrix is randomly
compressed into a smaller one, while approximately retaining many of its
original properties. As a result, much of the expensive computation can be
performed on the small matrix. The sketching of positive semidefinite (PSD)
matrices is well understood, but there are many applications where the related
matrices are not PSD, including Hessian matrices in non-convex optimization and
covariance matrices in regression applications involving complex numbers. In
this paper, we present novel dimensionality reduction methods for non-PSD
matrices, as well as their "square-roots", which involve matrices with complex
entries. We show how these techniques can be used for multiple downstream
tasks. In particular, we show how to use the proposed matrix sketching
techniques for both convex and non-convex optimization, $\ell_p$-regression for
every $1 \leq p \leq \infty$, and vector-matrix-vector queries.
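As a concrete (hedged) illustration of the general sketch-and-solve idea, the minimal numpy example below compresses a tall least-squares problem with a Gaussian sketch and solves the small problem instead; the sketch size m and the problem dimensions are arbitrary choices, and this classical construction is not the paper's non-PSD-specific method.

```python
import numpy as np

rng = np.random.default_rng(0)

# Tall least-squares problem: minimize ||A x - b||_2 with n >> d.
n, d, m = 20000, 50, 400          # m = sketch size (assumed, not from the paper)
A = rng.standard_normal((n, d))
b = A @ rng.standard_normal(d) + 0.1 * rng.standard_normal(n)

# Gaussian sketching matrix S (m x n); S A compresses A while roughly
# preserving the geometry of its column space.
S = rng.standard_normal((m, n)) / np.sqrt(m)

# Solve the small sketched problem instead of the full one.
x_sketch, *_ = np.linalg.lstsq(S @ A, S @ b, rcond=None)
x_exact,  *_ = np.linalg.lstsq(A, b, rcond=None)

print("relative error:", np.linalg.norm(x_sketch - x_exact) / np.linalg.norm(x_exact))
```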
Related papers
- Optimal Quantization for Matrix Multiplication [35.007966885532724]
We present a universal quantizer based on nested lattices with an explicit guarantee of approximation error for any (non-random) pair of matrices $A$, $B$ in terms of only the Frobenius norms $\|A\|_F$, $\|B\|_F$ and $\|A^\top B\|_F$.
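For intuition about the kind of guarantee stated above, here is a toy numpy sketch that quantizes both factors with plain uniform scalar quantization and reports the Frobenius error of the quantized product relative to $\|A\|_F \|B\|_F$; the bit width and matrix sizes are arbitrary, and this stand-in is not the nested-lattice quantizer from the paper.

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((256, 64))
B = rng.standard_normal((256, 64))

def quantize(M, bits=4):
    # Uniform scalar quantization to roughly 2**bits levels (a simple
    # stand-in for the nested-lattice scheme described in the paper).
    scale = np.abs(M).max() / (2 ** (bits - 1))
    return np.round(M / scale) * scale

err = np.linalg.norm(A.T @ B - quantize(A).T @ quantize(B), "fro")
rel = err / (np.linalg.norm(A, "fro") * np.linalg.norm(B, "fro"))
print("relative Frobenius error of quantized A^T B:", rel)
```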
arXiv Detail & Related papers (2024-10-17T17:19:48Z) - One-sided Matrix Completion from Two Observations Per Row [95.87811229292056]
We propose a natural algorithm that involves imputing the missing values of the matrix $X^\top X$.
We evaluate our algorithm on one-sided recovery of synthetic data and low-coverage genome sequencing.
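A heavily simplified numpy sketch of the setting: each row reveals only two uniformly random entries, and an inverse-probability estimate of the Gram matrix $X^\top X$ is accumulated from the observed products. This naive unbiased estimator is for intuition only and is not the imputation algorithm proposed in the paper.

```python
import numpy as np

rng = np.random.default_rng(2)
n, d = 5000, 8
X = rng.standard_normal((n, d))

G_hat = np.zeros((d, d))
for r in range(n):
    # Observe exactly two uniformly random columns in this row.
    i, j = rng.choice(d, size=2, replace=False)
    # Inverse-probability weights: a given unordered pair {i, j} is observed
    # with prob 2/(d(d-1)); a given column appears in the pair with prob 2/d.
    G_hat[i, j] += X[r, i] * X[r, j] * d * (d - 1) / 2
    G_hat[j, i] += X[r, i] * X[r, j] * d * (d - 1) / 2
    G_hat[i, i] += X[r, i] ** 2 * d / 2
    G_hat[j, j] += X[r, j] ** 2 * d / 2

G = X.T @ X
print("relative error:", np.linalg.norm(G_hat - G, "fro") / np.linalg.norm(G, "fro"))
```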
arXiv Detail & Related papers (2023-06-06T22:35:16Z) - Sufficient dimension reduction for feature matrices [3.04585143845864]
We propose a method called principal support matrix machine (PSMM) for the matrix sufficient dimension reduction.
Our numerical analysis demonstrates that the PSMM outperforms existing methods and has strong interpretability in real data applications.
arXiv Detail & Related papers (2023-03-07T23:16:46Z) - Multiresolution kernel matrix algebra [0.0]
We show that the compression of kernel matrices by means of samplets produces optimally sparse matrices in a certain S-format.
The inverse of a kernel matrix (if it exists) is compressible in the S-format as well.
The matrix algebra is justified mathematically by pseudo differential calculus.
arXiv Detail & Related papers (2022-11-21T17:50:22Z) - Sublinear Time Approximation of Text Similarity Matrices [50.73398637380375]
We introduce a generalization of the popular Nyström method to the indefinite setting.
Our algorithm can be applied to any similarity matrix and runs in sublinear time in the size of the matrix.
We show that our method, along with a simple variant of CUR decomposition, performs very well in approximating a variety of similarity matrices.
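For reference, below is a minimal numpy version of the classical Nyström approximation (landmark columns plus a pseudo-inverse of the landmark block), which the paper generalizes to indefinite similarity matrices; the test matrix and landmark count are arbitrary assumptions.

```python
import numpy as np

rng = np.random.default_rng(3)

# A symmetric PSD similarity matrix of low rank; the paper extends
# Nystrom-style approximation beyond the PSD case.
Z = rng.standard_normal((1000, 20))
K = Z @ Z.T

m = 100                                  # number of landmark columns (assumed)
idx = rng.choice(K.shape[0], size=m, replace=False)
C = K[:, idx]                            # sampled columns
W = K[np.ix_(idx, idx)]                  # landmark-landmark block

# Classical Nystrom approximation; since rank(K) = 20 < m, it is
# essentially exact here up to floating-point error.
K_approx = C @ np.linalg.pinv(W) @ C.T
print("relative error:",
      np.linalg.norm(K - K_approx, "fro") / np.linalg.norm(K, "fro"))
```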
arXiv Detail & Related papers (2021-12-17T17:04:34Z) - Sparse Factorization of Large Square Matrices [10.94053598642913]
In this paper, we propose to approximate a large square matrix with a product of sparse full-rank matrices.
In the approximation, our method needs only $N(\log N)^2$ non-zero numbers for an $N \times N$ full matrix.
We show that our method gives a better approximation when the approximated matrix is sparse and high-rank.
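To make the storage claim concrete, here is a quick back-of-the-envelope count for one example size $N$ (base-2 logarithm assumed; the abstract does not specify the base).

```python
import numpy as np

N = 4096                                  # example size, not from the paper
dense_entries = N ** 2                    # entries of a full N x N matrix
sparse_budget = N * int(np.log2(N)) ** 2  # the N (log N)^2 budget from the abstract
print(dense_entries, sparse_budget, dense_entries / sparse_budget)
# 16777216  589824  -> roughly 28x fewer stored numbers
```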
arXiv Detail & Related papers (2021-09-16T18:42:21Z) - A Non-commutative Extension of Lee-Seung's Algorithm for Positive
Semidefinite Factorizations [0.0]
We describe a non-commutative extension of Lee-Seung's algorithm, which we call the Matrix Multiplicative Update (MMU) algorithm, for computing Positive Semidefinite (PSD) factorizations.
We show that under our update scheme the squared loss objective is non-increasing and fixed points correspond to critical points.
arXiv Detail & Related papers (2021-06-01T07:55:09Z) - Hutch++: Optimal Stochastic Trace Estimation [75.45968495410048]
We introduce a new randomized algorithm, Hutch++, which computes a $(1 \pm \epsilon)$ approximation to $\operatorname{tr}(A)$ for any positive semidefinite (PSD) $A$.
We show that it significantly outperforms Hutchinson's method in experiments.
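A minimal numpy rendering of the Hutch++ idea as described above: project onto the range of $AS$, take that part of the trace exactly, and run a Hutchinson estimate on the residual. The matrix-vector budget below is an arbitrary choice.

```python
import numpy as np

def hutchpp(A, num_queries=60, seed=4):
    """Hutch++ trace estimate using roughly num_queries matrix-vector products."""
    rng = np.random.default_rng(seed)
    n = A.shape[0]
    k = num_queries // 3
    # Low-rank part: project onto the range of A S and take its trace exactly.
    S = rng.choice([-1.0, 1.0], size=(n, k))
    Q, _ = np.linalg.qr(A @ S)
    t_low = np.trace(Q.T @ A @ Q)
    # Hutchinson estimate of the residual trace tr((I - QQ^T) A (I - QQ^T)).
    G = rng.choice([-1.0, 1.0], size=(n, k))
    G = G - Q @ (Q.T @ G)
    return t_low + np.trace(G.T @ A @ G) / k

rng = np.random.default_rng(5)
B = rng.standard_normal((500, 500))
A = B @ B.T          # PSD test matrix, as assumed by the Hutch++ guarantee
print(hutchpp(A), np.trace(A))
```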
arXiv Detail & Related papers (2020-10-19T16:45:37Z) - Sketching Transformed Matrices with Applications to Natural Language
Processing [76.6222695417524]
We propose a space-efficient sketching algorithm for computing the product of a given small matrix with the transformed matrix.
We show that our approach obtains small error and is efficient in both space and time.
arXiv Detail & Related papers (2020-02-23T03:07:31Z) - Optimal Iterative Sketching with the Subsampled Randomized Hadamard
Transform [64.90148466525754]
We study the performance of iterative sketching for least-squares problems.
We show that the convergence rates for Haar and randomized Hadamard matrices are identical, and asymptotically improve upon random projections.
These techniques may be applied to other algorithms that employ randomized dimension reduction.
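A hedged numpy sketch of iterative sketching for least squares, using a Gaussian sketch as a stand-in for the Subsampled Randomized Hadamard Transform studied in the paper; the dimensions, sketch size, and iteration count are arbitrary assumptions.

```python
import numpy as np

rng = np.random.default_rng(6)
n, d, m = 10000, 40, 1000
A = rng.standard_normal((n, d))
b = A @ rng.standard_normal(d) + 0.05 * rng.standard_normal(n)

# One sketch of A (Gaussian here; the paper studies Haar and subsampled
# randomized Hadamard sketching matrices).
SA = (rng.standard_normal((m, n)) / np.sqrt(m)) @ A
H = SA.T @ SA                       # sketched Hessian A^T S^T S A

x = np.zeros(d)
for _ in range(10):
    # Exact gradient, approximate (sketched) curvature.
    x = x - np.linalg.solve(H, A.T @ (A @ x - b))

x_exact, *_ = np.linalg.lstsq(A, b, rcond=None)
print("relative error:", np.linalg.norm(x - x_exact) / np.linalg.norm(x_exact))
```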
arXiv Detail & Related papers (2020-02-03T16:17:50Z)