Fast Generalized Matrix Regression with Applications in Machine Learning
- URL: http://arxiv.org/abs/1912.12008v1
- Date: Fri, 27 Dec 2019 07:03:37 GMT
- Title: Fast Generalized Matrix Regression with Applications in Machine Learning
- Authors: Haishan Ye, Shusen Wang, Zhihua Zhang, Tong Zhang
- Abstract summary: We propose a fast generalized matrix regression algorithm (Fast GMR) which utilizes a sketching technique to solve the GMR problem efficiently.
We apply the Fast GMR algorithm to symmetric positive definite matrix approximation and single-pass singular value decomposition.
- Score: 46.79740387890322
- License: http://creativecommons.org/licenses/by-nc-sa/4.0/
- Abstract: Fast matrix algorithms have become fundamental tools of machine learning
in the big data era.
The generalized matrix regression (GMR) problem arises widely in matrix
approximation, for example in CUR decomposition, kernel matrix approximation, and
streaming singular value decomposition (SVD).
In this paper, we propose a fast generalized matrix regression algorithm
(Fast GMR) which utilizes a sketching technique to solve the GMR problem
efficiently.
Given an error parameter $0<\epsilon<1$, the Fast GMR algorithm achieves a
$(1+\epsilon)$ relative error with sketching sizes of order
$\mathcal{O}(\epsilon^{-1/2})$ for a large class of GMR problems.
We apply the Fast GMR algorithm to symmetric positive definite matrix
approximation and single-pass singular value decomposition, where it
outperforms conventional algorithms.
Our empirical study also validates the effectiveness and efficiency of our
proposed algorithms.
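As a rough illustration of the sketching idea (not the paper's exact construction), the GMR problem $\min_X \|A - BXC\|_F$ can be solved on compressed data: multiply by random sketch matrices and solve the much smaller regression. The sketch size `s` and Gaussian sketches `S1`, `S2` below are illustrative assumptions; the paper's theory ties the sketch size to $\epsilon$ and uses its own sketch constructions.

```python
import numpy as np

rng = np.random.default_rng(0)
m, n, k = 200, 150, 10
B = rng.standard_normal((m, k))
C = rng.standard_normal((k, n))
# A is approximately B X C plus small noise
A = B @ rng.standard_normal((k, k)) @ C + 0.01 * rng.standard_normal((m, n))

# Exact GMR solution: X* = B^+ A C^+
X_exact = np.linalg.pinv(B) @ A @ np.linalg.pinv(C)

# Sketched GMR: compress rows and columns with Gaussian sketches, then solve
s = 40  # illustrative sketch size
S1 = rng.standard_normal((s, m)) / np.sqrt(s)
S2 = rng.standard_normal((n, s)) / np.sqrt(s)
X_sk = np.linalg.pinv(S1 @ B) @ (S1 @ A @ S2) @ np.linalg.pinv(C @ S2)

# Residual of the sketched solution relative to the optimum (>= 1 by optimality)
err = np.linalg.norm(A - B @ X_sk @ C) / np.linalg.norm(A - B @ X_exact @ C)
```

The sketched solve touches only $s \times s$ compressed data in the middle factor, which is the source of the speedup when $s \ll \min(m, n)$.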
Related papers
- Spectral Entry-wise Matrix Estimation for Low-Rank Reinforcement
Learning [53.445068584013896]
We study matrix estimation problems arising in reinforcement learning (RL) with low-rank structure.
In low-rank bandits, the matrix to be recovered specifies the expected arm rewards, and for low-rank Markov Decision Processes (MDPs), it may for example characterize the transition kernel of the MDP.
We show that simple spectral-based matrix estimation approaches efficiently recover the singular subspaces of the matrix and exhibit nearly-minimal entry-wise error.
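The "simple spectral-based" approach mentioned above can be sketched as truncating the SVD of the noisy matrix to its top-$r$ singular triplets. The dimensions and noise level below are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(1)
d, r = 100, 3
M = rng.standard_normal((d, r)) @ rng.standard_normal((r, d))  # rank-r ground truth
noisy = M + 0.1 * rng.standard_normal((d, d))                  # entry-wise noise

# Spectral estimate: keep only the top-r singular triplets of the noisy matrix
u, s, vt = np.linalg.svd(noisy)
M_hat = u[:, :r] @ np.diag(s[:r]) @ vt[:r, :]

est_err = np.linalg.norm(M_hat - M)      # error of the spectral estimate
raw_err = np.linalg.norm(noisy - M)      # error of the raw noisy observation
```

Projecting onto the top singular subspaces discards most of the noise energy, which is why the estimate is much closer to the ground truth than the raw observation.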
arXiv Detail & Related papers (2023-10-10T17:06:41Z) - An Efficient Algorithm for Clustered Multi-Task Compressive Sensing [60.70532293880842]
Clustered multi-task compressive sensing is a hierarchical model that solves multiple compressive sensing tasks.
The existing inference algorithm for this model is computationally expensive and does not scale well in high dimensions.
We propose a new algorithm that substantially accelerates model inference by avoiding the need to explicitly compute the covariance matrices involved.
arXiv Detail & Related papers (2023-09-30T15:57:14Z) - Deep Unrolling for Nonconvex Robust Principal Component Analysis [75.32013242448151]
We design algorithms for Robust Principal Component Analysis (RPCA), which consists in decomposing a matrix into the sum of a low-rank matrix and a sparse matrix.
arXiv Detail & Related papers (2023-07-12T03:48:26Z) - Learning the Positions in CountSketch [49.57951567374372]
We consider sketching algorithms which first compress data by multiplication with a random sketch matrix, and then apply the sketch to quickly solve an optimization problem.
In this work, we propose the first learning-based algorithms that also optimize the locations of the non-zero entries.
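The standard (non-learned) CountSketch that this work builds on compresses a matrix by hashing each row to a random bucket with a random sign, which can be applied without ever forming the sketch matrix. The sizes below are illustrative:

```python
import numpy as np

rng = np.random.default_rng(2)
n, d, s = 1000, 20, 200
A = rng.standard_normal((n, d))

# CountSketch: each row of A is hashed to one of s buckets with a random sign
h = rng.integers(0, s, size=n)           # bucket index for each row
g = rng.choice([-1.0, 1.0], size=n)      # random sign for each row
SA = np.zeros((s, d))
np.add.at(SA, h, g[:, None] * A)         # accumulates SA = S @ A without forming S

# The Frobenius norm is preserved in expectation
ratio = np.linalg.norm(SA) / np.linalg.norm(A)
```

The learned variant proposed in the entry above optimizes where the non-zero entries go instead of choosing `h` uniformly at random.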
arXiv Detail & Related papers (2023-06-11T07:28:35Z) - Accelerated nonlinear primal-dual hybrid gradient algorithms with
applications to machine learning [0.0]
The primal-dual hybrid gradient (PDHG) method is a first-order method that splits convex optimization problems with saddle-point structure into smaller subproblems.
PDHG requires stepsize parameters fine-tuned for the problem at hand.
We introduce accelerated nonlinear variants of the PDHG algorithm that achieve acceleration for a broad class of optimization problems relevant to machine learning.
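To make the saddle-point structure concrete, here is a minimal sketch of the *basic* (not accelerated, not nonlinear) PDHG iteration applied to least squares, written as the saddle problem $\min_x \max_y \langle Kx - b, y\rangle - \tfrac{1}{2}\|y\|^2$. The stepsizes and problem sizes are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(3)
m, n = 30, 10
K = rng.standard_normal((m, n))
b = rng.standard_normal(m)

# Stepsizes must satisfy tau * sigma * ||K||^2 < 1 for convergence
L = np.linalg.norm(K, 2)
tau = sigma = 0.9 / L
x, y, x_bar = np.zeros(n), np.zeros(m), np.zeros(n)
for _ in range(5000):
    # Dual step: prox of f*(y) = 0.5||y||^2 + <b, y>
    y = (y + sigma * (K @ x_bar - b)) / (1 + sigma)
    # Primal step (g = 0, so a plain gradient step), then extrapolation
    x_new = x - tau * (K.T @ y)
    x_bar = 2 * x_new - x
    x = x_new

x_ls = np.linalg.lstsq(K, b, rcond=None)[0]  # reference least-squares solution
```

At a fixed point, $y = Kx - b$ and $K^\top y = 0$, i.e. the normal equations hold, so the iterates converge to the least-squares solution.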
arXiv Detail & Related papers (2021-09-24T22:37:10Z) - Weighted Low Rank Matrix Approximation and Acceleration [0.5177947445379687]
Low-rank matrix approximation (LRMA) is one of the central concepts in machine learning.
Low-rank matrix completion (LRMC) solves the LRMA problem when some observations are missing.
We propose an algorithm for solving the weighted problem, as well as two acceleration techniques.
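For context on the entry above: the *unweighted* LRMA problem has a closed-form solution via the truncated SVD (the Eckart–Young theorem), whereas the weighted variant studied in the paper does not in general. A minimal sketch of the unweighted baseline, with illustrative sizes:

```python
import numpy as np

rng = np.random.default_rng(4)
A = rng.standard_normal((50, 40))
r = 5

# Truncated SVD gives the optimal rank-r approximation in Frobenius norm
u, s, vt = np.linalg.svd(A, full_matrices=False)
A_r = (u[:, :r] * s[:r]) @ vt[:r, :]

# The optimal error equals the energy in the discarded tail singular values
opt_err = np.sqrt(np.sum(s[r:] ** 2))
```

Adding entry-wise weights breaks this closed form, which is why the weighted problem requires iterative algorithms and the acceleration techniques the entry describes.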
arXiv Detail & Related papers (2021-09-22T22:03:48Z) - Robust 1-bit Compressive Sensing with Partial Gaussian Circulant
Matrices and Generative Priors [54.936314353063494]
We provide recovery guarantees for a correlation-based optimization algorithm for robust 1-bit compressive sensing.
We make use of a practical iterative algorithm, and perform numerical experiments on image datasets to corroborate our results.
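A minimal sketch of the correlation idea in 1-bit compressive sensing (without the generative prior studied in the paper): from sign-only measurements $y = \mathrm{sign}(Ax)$ with Gaussian $A$, the back-projection $A^\top y$ is correlated with the direction of $x$. The sizes below are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(7)
n, m = 50, 2000
x = rng.standard_normal(n)
x /= np.linalg.norm(x)                # scale is unrecoverable from signs
A = rng.standard_normal((m, n))
y = np.sign(A @ x)                    # 1-bit measurements

# Correlation-based estimate of the direction of x
x_hat = A.T @ y
x_hat /= np.linalg.norm(x_hat)
cos_sim = abs(x @ x_hat)              # alignment with the true direction
```

Only the direction of `x` can be recovered, since multiplying `x` by any positive constant leaves the sign measurements unchanged.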
arXiv Detail & Related papers (2021-08-08T05:28:06Z) - Fast and Accurate Pseudoinverse with Sparse Matrix Reordering and
Incremental Approach [4.710916891482697]
A pseudoinverse is a generalization of a matrix inverse, which has been extensively utilized in machine learning.
FastPI is a novel incremental singular value decomposition (SVD) based pseudoinverse method for sparse matrices.
We show that FastPI computes the pseudoinverse faster than other approximate methods without loss of accuracy.
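For reference, the textbook (non-incremental) route that FastPI improves on: the Moore–Penrose pseudoinverse follows directly from the thin SVD as $A^+ = V \,\mathrm{diag}(1/\sigma_i)\, U^\top$. A minimal sketch with an illustrative matrix:

```python
import numpy as np

rng = np.random.default_rng(5)
A = rng.standard_normal((8, 5))

# Pseudoinverse from the thin SVD: A^+ = V diag(1/s) U^T
u, s, vt = np.linalg.svd(A, full_matrices=False)
A_pinv = vt.T @ np.diag(1.0 / s) @ u.T
```

Computing the full SVD from scratch is what makes this expensive for large sparse matrices; FastPI's incremental SVD avoids redoing that work.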
arXiv Detail & Related papers (2020-11-09T07:47:10Z) - QR and LQ Decomposition Matrix Backpropagation Algorithms for Square,
Wide, and Deep -- Real or Complex -- Matrices and Their Software
Implementation [0.0]
This article presents matrix backpropagation algorithms for the QR decomposition of matrices $A_{m,n}$ that are either square ($m = n$), wide ($m < n$), or deep ($m > n$), with rank $k = \min(m, n)$.
We derive novel matrix backpropagation results for the pivoted (full-rank) QR decomposition and for the LQ decomposition of deep input matrices.
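For orientation, the forward factorization that these backpropagation rules differentiate through: the reduced QR decomposition writes $A = QR$ with $Q$ orthonormal ($m \times k$) and $R$ upper triangular ($k \times n$). A minimal sketch for the deep case ($m > n$), with illustrative sizes:

```python
import numpy as np

rng = np.random.default_rng(6)
m, n = 6, 4                 # deep case: m > n, so rank k = min(m, n) = n
A = rng.standard_normal((m, n))

# Reduced QR: Q is m-by-k with orthonormal columns, R is k-by-n upper triangular
Q, R = np.linalg.qr(A)
```

The backpropagation results in the article express the gradient of a loss with respect to `A` in terms of the gradients with respect to `Q` and `R`.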
arXiv Detail & Related papers (2020-09-19T21:03:37Z) - Denise: Deep Robust Principal Component Analysis for Positive
Semidefinite Matrices [8.1371986647556]
Denise is a deep learning-based algorithm for robust PCA of covariance matrices.
Experiments show that Denise matches state-of-the-art performance in terms of decomposition quality.
arXiv Detail & Related papers (2020-04-28T15:45:21Z)
This list is automatically generated from the titles and abstracts of the papers in this site.