Low-Rank Updates of Matrix Square Roots
- URL: http://arxiv.org/abs/2201.13156v3
- Date: Mon, 5 Jun 2023 08:00:24 GMT
- Title: Low-Rank Updates of Matrix Square Roots
- Authors: Shany Shumeli, Petros Drineas, Haim Avron
- Abstract summary: We consider the matrix square root and inverse square root operations.
Given a low rank perturbation to a matrix, we argue that a low-rank approximate correction to the (inverse) square root exists.
We then frame the correction as the solution of an algebraic Riccati equation and discuss how a low-rank solution to that equation can be computed.
- Score: 7.832944895330117
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Models in which the covariance matrix has the structure of a sparse matrix
plus a low rank perturbation are ubiquitous in data science applications. It is
often desirable for algorithms to take advantage of such structures, avoiding
costly matrix computations that often require cubic time and quadratic storage.
This is often accomplished by performing operations that maintain such
structures, e.g. matrix inversion via the Sherman-Morrison-Woodbury formula. In
this paper we consider the matrix square root and inverse square root
operations. Given a low rank perturbation to a matrix, we argue that a low-rank
approximate correction to the (inverse) square root exists. We do so by
establishing a geometric decay bound on the true correction's eigenvalues. We
then proceed to frame the correction as the solution of an algebraic Riccati
equation, and discuss how a low-rank solution to that equation can be computed.
We analyze the approximation error incurred when approximately solving the
algebraic Riccati equation, providing spectral and Frobenius norm forward and
backward error bounds. Finally, we describe several applications of our
algorithms, and demonstrate their utility in numerical experiments.
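
To make the setting concrete, here is a minimal sketch (not the paper's algorithm) of the two structure-preserving ideas in the abstract: the Sherman-Morrison-Woodbury update for the inverse, and the quadratic (Riccati-type) matrix equation $SX + XS + X^2 = UU^T$ satisfied by the correction $X = (A + UU^T)^{1/2} - A^{1/2}$, where $S = A^{1/2}$. The naive dense fixed-point/Sylvester solver, the function names, and the problem sizes below are illustrative assumptions; the paper's point is precisely that $X$ is numerically low rank and can be computed without such dense solves.

```python
# Minimal illustrative sketch (NumPy/SciPy); the fixed-point scheme and all
# function names are assumptions, not the paper's algorithm.
import numpy as np
from scipy.linalg import solve_sylvester

def sym_sqrt(A):
    """Principal square root of a symmetric positive definite matrix."""
    w, V = np.linalg.eigh(A)
    return (V * np.sqrt(w)) @ V.T

def woodbury_inverse(A_inv, U):
    """(A + U U^T)^{-1} from A^{-1} via the Sherman-Morrison-Woodbury formula."""
    k = U.shape[1]
    T = np.linalg.solve(np.eye(k) + U.T @ A_inv @ U, U.T @ A_inv)
    return A_inv - A_inv @ U @ T

def sqrt_correction(A, U, iters=25):
    """Correction X with (S + X)^2 = A + U U^T, i.e. S X + X S + X^2 = U U^T,
    found here by a naive fixed-point iteration with dense Sylvester solves
    (assumes the perturbation is modest so the iteration contracts)."""
    S = sym_sqrt(A)
    B = U @ U.T
    X = np.zeros_like(A)
    for _ in range(iters):
        X = solve_sylvester(S, S, B - X @ X)  # solve S X + X S = B - X_prev^2
    return S, X

rng = np.random.default_rng(0)
n, k = 60, 3
M = rng.standard_normal((n, n))
A = M @ M.T + n * np.eye(n)            # well-conditioned SPD base matrix
U = 0.3 * rng.standard_normal((n, k))  # modest rank-k perturbation

S, X = sqrt_correction(A, U)
exact = sym_sqrt(A + U @ U.T)
print("rel. error of S + X      :", np.linalg.norm(S + X - exact) / np.linalg.norm(exact))

# The correction is numerically low rank: its singular values decay rapidly.
s = np.linalg.svd(X, compute_uv=False)
print("singular value decay of X:", np.round(s[:8] / s[0], 8))

# The inverse, by contrast, is updated exactly by the Woodbury identity.
A_inv = np.linalg.inv(A)
W = woodbury_inverse(A_inv, U)
print("Woodbury inverse error   :", np.linalg.norm(W - np.linalg.inv(A + U @ U.T)))
```

In the regime the paper targets one would of course avoid forming S, X, or the Sylvester systems densely; the sketch only checks the algebra and exhibits the rapid singular-value decay that makes a low-rank correction viable.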
Related papers
- Entrywise error bounds for low-rank approximations of kernel matrices [55.524284152242096]
We derive entrywise error bounds for low-rank approximations of kernel matrices obtained using the truncated eigen-decomposition.
A key technical innovation is a delocalisation result for the eigenvectors of the kernel matrix corresponding to small eigenvalues.
We validate our theory with an empirical study of a collection of synthetic and real-world datasets. (A generic truncated-eigendecomposition sketch follows this entry.)
arXiv Detail & Related papers (2024-05-23T12:26:25Z)
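
As a generic numerical illustration of the object studied in the entry above (not of its entrywise bounds), the sketch below forms a rank-$r$ truncated eigendecomposition of a Gaussian (RBF) kernel matrix and reports the spectral-norm and maximum entrywise errors; the data, bandwidth, and rank are arbitrary assumptions.

```python
# Generic truncated-eigendecomposition approximation of a kernel matrix;
# all parameters here are arbitrary illustrations, not taken from the paper.
import numpy as np

rng = np.random.default_rng(1)
X = rng.standard_normal((300, 5))

# Gaussian (RBF) kernel matrix K_ij = exp(-||x_i - x_j||^2 / 2)
sq = np.sum(X**2, axis=1)
K = np.exp(-0.5 * (sq[:, None] + sq[None, :] - 2 * X @ X.T))

# Rank-r approximation from the top-r eigenpairs (K is symmetric PSD)
r = 20
w, V = np.linalg.eigh(K)          # eigenvalues in ascending order
K_r = (V[:, -r:] * w[-r:]) @ V[:, -r:].T

E = K - K_r
print("spectral-norm error :", np.linalg.norm(E, 2))
print("max entrywise error :", np.abs(E).max())
```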
- Approximate Secular Equations for the Cubic Regularization Subproblem [20.396963952753435]
We propose a novel solver for the cubic regularization subproblem (CRS).
The proposed solver outperforms two state-of-the-art methods.
arXiv Detail & Related papers (2022-09-27T09:22:44Z)
- Fast Differentiable Matrix Square Root and Inverse Square Root [65.67315418971688]
We propose two more efficient variants to compute the differentiable matrix square root and the inverse square root.
For the forward propagation, one method is to use a Matrix Taylor Polynomial (MTP), and the other is to use Matrix Padé Approximants (MPA).
A series of numerical tests shows that both methods yield considerable speed-up compared with the SVD or the NS iteration. (A generic sketch of the NS baseline follows this entry.)
arXiv Detail & Related papers (2022-01-29T10:00:35Z)
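
The Newton-Schulz (NS) iteration cited as a baseline in the entry above is easy to sketch; the following is a generic version for a symmetric positive definite matrix, with the usual Frobenius-norm pre-scaling that ensures convergence. It is not the MTP/MPA methods the entry proposes, and the sizes and iteration count are arbitrary assumptions.

```python
# Generic coupled Newton-Schulz iteration for the matrix square root
# (baseline sketch only, not the MTP/MPA methods of the entry above).
import numpy as np

def newton_schulz_sqrt(A, iters=15):
    """Approximate sqrt(A) and inv(sqrt(A)) for symmetric positive definite A."""
    n = A.shape[0]
    norm = np.linalg.norm(A)          # Frobenius norm; scaling ensures convergence
    Y = A / norm                      # Y_k -> sqrt(A / norm)
    Z = np.eye(n)                     # Z_k -> inv(sqrt(A / norm))
    I = np.eye(n)
    for _ in range(iters):
        T = 0.5 * (3.0 * I - Z @ Y)
        Y, Z = Y @ T, T @ Z
    return Y * np.sqrt(norm), Z / np.sqrt(norm)

rng = np.random.default_rng(2)
M = rng.standard_normal((50, 50))
A = M @ M.T + 50 * np.eye(50)         # SPD test matrix
S, S_inv = newton_schulz_sqrt(A)
print("||S @ S - A||    :", np.linalg.norm(S @ S - A))
print("||S @ S_inv - I||:", np.linalg.norm(S @ S_inv - np.eye(50)))
```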
- Sparse Factorization of Large Square Matrices [10.94053598642913]
In this paper, we propose to approximate a large square matrix with a product of sparse full-rank matrices.
In the approximation, our method needs only $N(\log N)^2$ non-zero numbers for an $N \times N$ full matrix.
We show that our method gives a better approximation when the approximated matrix is sparse and high-rank.
arXiv Detail & Related papers (2021-09-16T18:42:21Z)
- Statistical limits of dictionary learning: random matrix theory and the spectral replica method [28.54289139061295]
We consider increasingly complex models of matrix denoising and dictionary learning in the Bayes-optimal setting.
We introduce a novel combination of the replica method from statistical mechanics and random matrix theory, coined the spectral replica method.
arXiv Detail & Related papers (2021-09-14T12:02:32Z)
- Non-PSD Matrix Sketching with Applications to Regression and Optimization [56.730993511802865]
We present dimensionality reduction methods for non-PSD and "square-roots" matrices.
We show how these techniques can be used for multiple downstream tasks.
arXiv Detail & Related papers (2021-06-16T04:07:48Z)
- Solving weakly supervised regression problem using low-rank manifold regularization [77.34726150561087]
We solve a weakly supervised regression problem.
Under "weakly" we understand that for some training points the labels are known, for some unknown, and for others uncertain due to the presence of random noise or other reasons such as lack of resources.
In the numerical section, we applied the suggested method to artificial and real datasets using Monte-Carlo modeling.
arXiv Detail & Related papers (2021-04-13T23:21:01Z)
- Fast Rank Reduction for Non-negative Matrices via Mean Field Theory [5.634825161148483]
We formulate rank reduction as a mean-field approximation by modeling matrices via a log-linear model on a structured sample space.
We empirically show that our rank reduction method is faster than NMF and its popular variant, lraNMF, while achieving competitive low rank approximation error on synthetic and real-world datasets.
arXiv Detail & Related papers (2020-06-09T14:55:47Z)
- Sketching Transformed Matrices with Applications to Natural Language Processing [76.6222695417524]
We propose a space-efficient sketching algorithm for computing the product of a given small matrix with the transformed matrix.
We show that our approach obtains small error and is efficient in both space and time.
arXiv Detail & Related papers (2020-02-23T03:07:31Z)
- Optimal Iterative Sketching with the Subsampled Randomized Hadamard Transform [64.90148466525754]
We study the performance of iterative sketching for least-squares problems.
We show that the convergence rates for Haar and randomized Hadamard matrices are identical, and improve upon random projections.
These techniques may be applied to other algorithms that employ randomized dimension reduction. (A generic iterative-sketching example follows this entry.)
arXiv Detail & Related papers (2020-02-03T16:17:50Z)
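
As a generic illustration of iterative sketching for least squares (the entry above), the sketch below uses the standard iterative Hessian-sketch update with a plain Gaussian sketch as a stand-in for the subsampled randomized Hadamard transform; the dimensions, sketch size, and fresh-sketch-per-iteration choice are arbitrary assumptions.

```python
# Generic iterative (Hessian) sketching for least squares; a Gaussian sketch is
# used as a stand-in for the SRHT, and all sizes are arbitrary illustrative choices.
import numpy as np

rng = np.random.default_rng(3)
n, d, m = 2000, 50, 400                        # rows, columns, sketch size

A = rng.standard_normal((n, d))
x_true = rng.standard_normal(d)
b = A @ x_true + 0.01 * rng.standard_normal(n)
x_ls, *_ = np.linalg.lstsq(A, b, rcond=None)   # exact least-squares solution

x = np.zeros(d)
for t in range(15):
    S = rng.standard_normal((m, n)) / np.sqrt(m)   # fresh sketch each iteration
    SA = S @ A
    grad = A.T @ (A @ x - b)                        # exact gradient of 0.5*||Ax - b||^2
    x = x - np.linalg.solve(SA.T @ SA, grad)        # sketched-Hessian Newton-type step
    print(t, np.linalg.norm(x - x_ls))
```

Each step solves a small $d \times d$ sketched system instead of the full normal equations, and the printed distances to the exact least-squares solution decay geometrically.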
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences of its use.