Optimal Iterative Sketching with the Subsampled Randomized Hadamard
Transform
- URL: http://arxiv.org/abs/2002.00864v5
- Date: Fri, 23 Oct 2020 12:35:02 GMT
- Title: Optimal Iterative Sketching with the Subsampled Randomized Hadamard
Transform
- Authors: Jonathan Lacotte, Sifan Liu, Edgar Dobriban and Mert Pilanci
- Abstract summary: We study the performance of iterative sketching for least-squares problems.
We show that the convergence rates for Haar and randomized Hadamard matrices are identical, and asymptotically improve upon Gaussian random projections.
These techniques may be applied to other algorithms that employ randomized dimension reduction.
- Score: 64.90148466525754
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Random projections or sketching are widely used in many algorithmic and
learning contexts. Here we study the performance of iterative Hessian sketch
for least-squares problems. By leveraging and extending recent results from
random matrix theory on the limiting spectrum of matrices randomly projected
with the subsampled randomized Hadamard transform, and truncated Haar matrices,
we can study and compare the resulting algorithms to a level of precision that
has not been possible before. Our technical contributions include a novel
formula for the second moment of the inverse of projected matrices. We also
find simple closed-form expressions for asymptotically optimal step-sizes and
convergence rates. These show that the convergence rates for Haar and randomized
Hadamard matrices are identical, and asymptotically improve upon Gaussian
random projections. These techniques may be applied to other algorithms that
employ randomized dimension reduction.
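As a concrete illustration of the method studied here, the following is a minimal NumPy/SciPy sketch of iterative Hessian sketching with an SRHT embedding. It assumes the number of rows is a power of two (required by scipy.linalg.hadamard), and it uses a fixed placeholder step-size rather than the closed-form asymptotically optimal value derived in the paper.

```python
import numpy as np
from scipy.linalg import hadamard

def srht(A, m, rng):
    """Subsampled randomized Hadamard transform: return S @ A with m rows.
    Assumes A.shape[0] is a power of two (required by scipy's hadamard)."""
    n = A.shape[0]
    D = rng.choice([-1.0, 1.0], size=n)                # random sign flips
    HDA = hadamard(n).astype(float) @ (D[:, None] * A) / np.sqrt(n)
    rows = rng.choice(n, size=m, replace=False)        # uniform row subsampling
    return np.sqrt(n / m) * HDA[rows]

def iterative_hessian_sketch(A, b, m, n_iter=20, step=1.0, seed=0):
    """Solve min_x ||Ax - b||^2 with Newton-type steps in which the Hessian
    A^T A is replaced by the sketched version (SA)^T (SA)."""
    rng = np.random.default_rng(seed)
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        SA = srht(A, m, rng)                           # fresh sketch per iteration
        grad = A.T @ (b - A @ x)                       # exact gradient of the residual
        x = x + step * np.linalg.solve(SA.T @ SA, grad)
    return x

rng = np.random.default_rng(0)
A, b = rng.standard_normal((1024, 32)), rng.standard_normal(1024)
x_ihs = iterative_hessian_sketch(A, b, m=128)
```

In practice the Hadamard mixing is applied with a fast Walsh-Hadamard transform in O(n log n) time rather than by forming the dense matrix; the dense version above only keeps the example short.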
Related papers
- Randomized Algorithms for Symmetric Nonnegative Matrix Factorization [2.1753766244387402]
Symmetric Nonnegative Matrix Factorization (SymNMF), which approximates a symmetric nonnegative matrix by a nonnegative product of the form HH^T, is a widely used technique in data analysis and machine learning.
We develop two randomized algorithms for its computation.
We show that our methods approximately maintain solution quality and achieve significant speed-ups for both large dense and large sparse problems.
arXiv Detail & Related papers (2024-02-13T00:02:05Z)
- A fixed-point algorithm for matrix projections with applications in quantum information [7.988085110283119]
We show that our algorithm converges exponentially fast in the number of iterations to the optimal solution.
We discuss several applications of our algorithm in quantum resource theories and quantum Shannon theory.
arXiv Detail & Related papers (2023-12-22T11:16:11Z)
- Learning the Positions in CountSketch [49.57951567374372]
We consider sketching algorithms which first compress data by multiplication with a random sketch matrix, and then apply the sketch to quickly solve an optimization problem.
In this work, we propose the first learning-based algorithms that also optimize the locations of the non-zero entries.
arXiv Detail & Related papers (2023-06-11T07:28:35Z)
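For reference, the classical CountSketch baseline that the learned variant above builds on can be written in a few lines. This is a minimal sketch-and-solve example with random (non-learned) bucket positions, not the paper's learned-position method.

```python
import numpy as np

def countsketch(A, m, rng):
    """Classical CountSketch: hash each row of A to one of m buckets with a
    random sign; equivalent to S @ A for a sparse random sign matrix S."""
    n = A.shape[0]
    buckets = rng.integers(0, m, size=n)               # random non-zero positions
    signs = rng.choice([-1.0, 1.0], size=n)
    SA = np.zeros((m, A.shape[1]))
    np.add.at(SA, buckets, signs[:, None] * A)         # scatter-add signed rows
    return SA

# Sketch-and-solve: compress [A | b], then solve the small least-squares problem.
rng = np.random.default_rng(0)
A, b = rng.standard_normal((4096, 50)), rng.standard_normal(4096)
SAb = countsketch(np.column_stack([A, b]), 400, rng)
x_sketch, *_ = np.linalg.lstsq(SAb[:, :-1], SAb[:, -1], rcond=None)
```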
- Fast Projected Newton-like Method for Precision Matrix Estimation under Total Positivity [15.023842222803058]
Current algorithms are designed using the block coordinate descent method or the proximal point algorithm.
We propose a novel algorithm based on the two-metric projection method, incorporating a carefully designed search direction and variable partitioning scheme.
Experimental results on synthetic and real-world datasets demonstrate that our proposed algorithm provides a significant improvement in computational efficiency compared to the state-of-the-art methods.
arXiv Detail & Related papers (2021-12-03T14:39:10Z)
- Learning-Augmented Sketches for Hessians [54.97773807211337]
We show how to design learned sketches for the Hessian in the context of second order methods.
We show empirically that learned sketches, compared with their "non-learned" counterparts, improve the approximation accuracy for important problems.
arXiv Detail & Related papers (2021-02-24T14:50:59Z)
- Precise expressions for random projections: Low-rank approximation and randomized Newton [63.68433510953756]
Matrix sketching has emerged as a powerful technique for performing dimensionality reduction very efficiently.
We develop techniques that provide provably accurate expressions for the expected value of random projection matrices obtained via sketching.
These expressions can be used to characterize the performance of dimensionality reduction in a variety of common machine learning tasks.
arXiv Detail & Related papers (2020-06-18T16:23:00Z)
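As a toy illustration of such expectation formulas: for an m x n Gaussian sketch S and a matrix U with orthonormal columns (m <= d), rotation invariance gives E[(SU)^+ (SU)] = (m/d) I_d. This identity is a standard Gaussian fact used here for illustration, not a formula quoted from the paper; the snippet below checks it by Monte Carlo.

```python
import numpy as np

# Monte Carlo check: E[(SU)^+ (SU)] = (m/d) * I_d for Gaussian S and
# orthonormal U, since row(SU) is a uniformly random m-dim subspace of R^d.
rng = np.random.default_rng(1)
n, d, m, trials = 256, 32, 8, 2000
U, _ = np.linalg.qr(rng.standard_normal((n, d)))       # orthonormal columns
P_hat = np.zeros((d, d))
for _ in range(trials):
    G = rng.standard_normal((m, n)) @ U                # = S @ U, iid Gaussian entries
    P_hat += np.linalg.pinv(G) @ G / trials            # projection onto row(G)
print(np.linalg.norm(P_hat - (m / d) * np.eye(d)))     # shrinks as trials grow
```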
- Effective Dimension Adaptive Sketching Methods for Faster Regularized Least-Squares Optimization [56.05635751529922]
We propose a new randomized algorithm for solving L2-regularized least-squares problems based on sketching.
We consider two of the most popular random embeddings, namely, Gaussian embeddings and the Subsampled Randomized Hadamard Transform (SRHT).
arXiv Detail & Related papers (2020-06-10T15:00:09Z)
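A minimal version of the sketched ridge solve that this line of work studies, shown here with a Gaussian embedding and a fixed sketch size; the paper's adaptive choice of sketch size from the effective dimension is not reproduced.

```python
import numpy as np

# One-shot sketched ridge regression: replace the Hessian A^T A + lam*I
# with its embedded version (SA)^T (SA) + lam*I (illustration only).
rng = np.random.default_rng(2)
n, d, m, lam = 4096, 50, 400, 1e-2
A, b = rng.standard_normal((n, d)), rng.standard_normal(n)
S = rng.standard_normal((m, n)) / np.sqrt(m)           # Gaussian embedding
SA = S @ A
x_ridge = np.linalg.solve(SA.T @ SA + lam * np.eye(d), A.T @ b)
```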
- Optimal Randomized First-Order Methods for Least-Squares Problems [56.05635751529922]
This class of algorithms encompasses several randomized methods among the fastest solvers for least-squares problems.
We focus on two classical embeddings, namely, Gaussian projections and subsampled Hadamard transforms.
Our resulting algorithm yields the best complexity known for solving least-squares problems with no condition number dependence.
arXiv Detail & Related papers (2020-02-21T17:45:32Z)
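Methods in this family typically pair a sketched preconditioner with momentum. The following heavy-ball variant of the iterative Hessian sketch is a hedged illustration with placeholder step and momentum values, not the optimally tuned parameters derived in the paper.

```python
import numpy as np

def heavy_ball_sketch(A, b, m, n_iter=20, step=1.0, momentum=0.2, seed=0):
    """Heavy-ball iteration preconditioned by a fixed sketched Hessian
    (SA)^T (SA); step and momentum here are illustrative placeholders."""
    rng = np.random.default_rng(seed)
    S = rng.standard_normal((m, A.shape[0])) / np.sqrt(m)  # Gaussian embedding
    H = (S @ A).T @ (S @ A)                                # fixed sketched Hessian
    x = x_prev = np.zeros(A.shape[1])
    for _ in range(n_iter):
        grad = A.T @ (b - A @ x)
        x_next = x + step * np.linalg.solve(H, grad) + momentum * (x - x_prev)
        x, x_prev = x_next, x
    return x
```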
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences of its use.