Rapid Robust Principal Component Analysis: CUR Accelerated Inexact Low
Rank Estimation
- URL: http://arxiv.org/abs/2010.07422v3
- Date: Sun, 7 Feb 2021 06:43:25 GMT
- Title: Rapid Robust Principal Component Analysis: CUR Accelerated Inexact Low
Rank Estimation
- Authors: HanQin Cai, Keaton Hamm, Longxiu Huang, Jiaqi Li, Tao Wang
- Abstract summary: We propose a novel non-convex algorithm coined Iterated Robust CUR (IRCUR).
IRCUR is able to process only small submatrices and avoid expensive computing on the full matrix through the entire algorithm.
Numerical experiments establish the computational advantage of IRCUR on both synthetic and real-world datasets.
- Score: 8.169365031508885
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Robust principal component analysis (RPCA) is a widely used tool for
dimension reduction. In this work, we propose a novel non-convex algorithm,
coined Iterated Robust CUR (IRCUR), for solving RPCA problems, which
dramatically improves the computational efficiency in comparison with the
existing algorithms. IRCUR achieves this acceleration by employing CUR
decomposition when updating the low rank component, which allows us to obtain
an accurate low rank approximation via only three small submatrices.
Consequently, IRCUR is able to process only the small submatrices and avoid
expensive computing on the full matrix through the entire algorithm. Numerical
experiments establish the computational advantage of IRCUR over
state-of-the-art algorithms on both synthetic and real-world datasets.
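As a concrete illustration of the CUR idea, the sketch below approximates a rank-r matrix from three small submatrices, M ≈ C U⁺ R. This is not the authors' IRCUR implementation; the uniform sampling scheme and submatrix sizes are arbitrary choices for the example:

```python
import numpy as np

rng = np.random.default_rng(0)

# A 200 x 300 test matrix of exact rank r = 5.
r = 5
M = rng.standard_normal((200, r)) @ rng.standard_normal((r, 300))

# Sample small index sets of rows and columns (sizes chosen for illustration).
rows = rng.choice(M.shape[0], size=3 * r, replace=False)
cols = rng.choice(M.shape[1], size=3 * r, replace=False)

C = M[:, cols]             # sampled columns      (200 x 15)
R = M[rows, :]             # sampled rows         (15 x 300)
U = M[np.ix_(rows, cols)]  # their intersection   (15 x 15)

# CUR approximation: only the three small submatrices are needed.
M_cur = C @ np.linalg.pinv(U) @ R

err = np.linalg.norm(M - M_cur) / np.linalg.norm(M)
```

When the intersection U captures the full rank of M (generic here, since 15 > r), the reconstruction is exact up to floating-point error; on noisy or corrupted data, the choice of sampled indices matters, and that is where methods such as IRCUR differ from this naive sketch.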
Related papers
- Rapid Grassmannian Averaging with Chebyshev Polynomials [8.394689129416067]
We propose new algorithms to efficiently average a collection of points on a Grassmannian manifold in both the centralized and decentralized settings.
Our proposed algorithms, Rapid Grassmannian Averaging (RGrAv) and Decentralized Rapid Grassmannian Averaging (DRGrAv), leverage the spectral structure of the problem to rapidly compute an average.
We provide a theoretical guarantee of optimality and present numerical experiments which demonstrate that our algorithms outperform state-of-the-art methods in providing high accuracy solutions in minimal time.
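A naive baseline for subspace averaging takes the top-k eigenvectors of the mean projection matrix. The sketch below shows that baseline only (an illustrative "projection mean", not the RGrAv algorithm; the dimensions and perturbation level are made up):

```python
import numpy as np

rng = np.random.default_rng(5)
n, k = 20, 3

# Several nearby k-dimensional subspaces of R^n, each as an orthonormal basis.
base = np.linalg.qr(rng.standard_normal((n, k)))[0]
points = [np.linalg.qr(base + 0.05 * rng.standard_normal((n, k)))[0]
          for _ in range(6)]

# Average the projection matrices U U^T, then take the top-k eigenvectors.
P_mean = sum(U @ U.T for U in points) / len(points)
avg = np.linalg.eigh(P_mean)[1][:, -k:]  # eigh sorts eigenvalues ascending
```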
arXiv Detail & Related papers (2024-10-11T16:25:06Z) - Stochastic Optimization for Non-convex Problem with Inexact Hessian
Matrix, Gradient, and Function [99.31457740916815]
Trust-region (TR) methods and adaptive regularization using cubics (ARC) have proven to have very appealing theoretical properties.
We show that TR and ARC methods can simultaneously accommodate inexact computations of the Hessian, gradient, and function values.
arXiv Detail & Related papers (2023-10-18T10:29:58Z) - Spectral Entry-wise Matrix Estimation for Low-Rank Reinforcement
Learning [53.445068584013896]
We study matrix estimation problems arising in reinforcement learning (RL) with low-rank structure.
In low-rank bandits, the matrix to be recovered specifies the expected arm rewards, and for low-rank Markov Decision Processes (MDPs), it may for example characterize the transition kernel of the MDP.
We show that simple spectral-based matrix estimation approaches efficiently recover the singular subspaces of the matrix and exhibit nearly-minimal entry-wise error.
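The spectral approach can be illustrated in its simplest instance: estimating a low-rank matrix from a noisy observation with a truncated SVD. The dimensions, rank, and noise level below are made-up illustrative choices, not the setting analyzed in the paper:

```python
import numpy as np

rng = np.random.default_rng(1)

# Ground-truth rank-3 matrix plus small entry-wise noise.
r = 3
L = rng.standard_normal((150, r)) @ rng.standard_normal((r, 150)) / np.sqrt(r)
noisy = L + 0.01 * rng.standard_normal(L.shape)

# Spectral estimate: keep only the top-r singular directions.
U, s, Vt = np.linalg.svd(noisy, full_matrices=False)
L_hat = (U[:, :r] * s[:r]) @ Vt[:r, :]
```

Truncating to the true rank suppresses the part of the noise that lies outside the signal's singular subspaces, which is the basic mechanism the entry-wise guarantees in such papers make precise.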
arXiv Detail & Related papers (2023-10-10T17:06:41Z) - Deep Unrolling for Nonconvex Robust Principal Component Analysis [75.32013242448151]
We design algorithms for Robust Principal Component Analysis (RPCA).
RPCA consists in decomposing a matrix into the sum of a low-rank matrix and a sparse matrix.
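The decomposition itself can be illustrated with a naive nonconvex baseline that alternates between hard-thresholding the residual (sparse part) and a truncated SVD (low-rank part). This is a generic sketch with made-up sizes and a heuristic threshold, not the paper's unrolled network:

```python
import numpy as np

rng = np.random.default_rng(2)

# Observed matrix M = low-rank L_true + sparse outliers S_true.
r = 2
L_true = rng.standard_normal((60, r)) @ rng.standard_normal((r, 60))
S_true = np.zeros((60, 60))
mask = rng.random((60, 60)) < 0.05
S_true[mask] = rng.uniform(8, 12, mask.sum()) * rng.choice([-1, 1], mask.sum())
M = L_true + S_true

L_hat = np.zeros_like(M)
for _ in range(30):
    # Sparse step: keep only residual entries far from the low-rank fit.
    resid = M - L_hat
    S_hat = np.where(np.abs(resid) > 3 * np.std(resid), resid, 0.0)
    # Low-rank step: best rank-r fit of M - S_hat via truncated SVD.
    U, s, Vt = np.linalg.svd(M - S_hat, full_matrices=False)
    L_hat = (U[:, :r] * s[:r]) @ Vt[:r, :]
```

Deep unrolling replaces such hand-tuned loops (the threshold rule, the iteration count) with learned parameters, which is the subject of the paper above.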
arXiv Detail & Related papers (2023-07-12T03:48:26Z) - Efficient Model-Free Exploration in Low-Rank MDPs [76.87340323826945]
Low-Rank Markov Decision Processes offer a simple, yet expressive framework for RL with function approximation.
Existing algorithms are either (1) computationally intractable, or (2) reliant upon restrictive statistical assumptions.
We propose the first provably sample-efficient algorithm for exploration in Low-Rank MDPs.
arXiv Detail & Related papers (2023-07-08T15:41:48Z) - Riemannian CUR Decompositions for Robust Principal Component Analysis [4.060731229044571]
Robust Principal Component Analysis (RPCA) has received massive attention in recent years.
This paper proposes Riemannian CUR, a robust PCA decomposition algorithm.
It is able to tolerate a significant amount of outliers; its outlier tolerance is comparable to that of Accelerated Alternating Projections, which has higher computational complexity than the proposed method.
arXiv Detail & Related papers (2022-06-17T22:58:09Z) - Large-scale Optimization of Partial AUC in a Range of False Positive
Rates [51.12047280149546]
The area under the ROC curve (AUC) is one of the most widely used performance measures for classification models in machine learning.
We develop an efficient approximated gradient descent method based on recent practical envelope smoothing technique.
Our proposed algorithm can also be used to minimize the sum of some ranked range loss, which also lacks efficient solvers.
arXiv Detail & Related papers (2022-03-03T03:46:18Z) - Fast and Robust Iterative Closest Point [32.42799285301607]
Iterative Closest Point (ICP) is a fundamental technique for rigid registration between two point sets.
Recent work such as Sparse ICP achieves robustness via sparsity optimization at the cost of computational speed.
We show that the classical point-to-point ICP can be treated as a majorization-minimization (MM) algorithm, and propose an Anderson acceleration approach to speed up its convergence.
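The inner step of classical point-to-point ICP has a closed form: given correspondences, the best rigid transform comes from an SVD of the cross-covariance (the Kabsch/Procrustes solution). The sketch below shows only that step on synthetic, already-corresponding points; the full ICP loop and the paper's Anderson acceleration are not shown:

```python
import numpy as np

rng = np.random.default_rng(3)

# Source cloud and a known rigid motion to recover (3 x N points).
src = rng.standard_normal((3, 100))
theta = 0.4
R_true = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                   [np.sin(theta),  np.cos(theta), 0.0],
                   [0.0,            0.0,           1.0]])
t_true = np.array([[0.5], [-0.3], [0.2]])
dst = R_true @ src + t_true

def best_rigid(src, dst):
    """Optimal R, t minimizing ||dst - (R src + t)|| for known correspondences."""
    sc = src.mean(axis=1, keepdims=True)
    dc = dst.mean(axis=1, keepdims=True)
    U, _, Vt = np.linalg.svd((dst - dc) @ (src - sc).T)
    d = np.sign(np.linalg.det(U @ Vt))   # guard against reflections
    R = U @ np.diag([1.0, 1.0, d]) @ Vt
    return R, dc - R @ sc

R, t = best_rigid(src, dst)
```

Full ICP alternates this solve with nearest-neighbor correspondence search; viewing that alternation as majorization-minimization is what lets the paper apply Anderson acceleration to it.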
arXiv Detail & Related papers (2020-07-15T11:32:53Z) - Approximation Algorithms for Sparse Principal Component Analysis [57.5357874512594]
Principal component analysis (PCA) is a widely used dimension reduction technique in machine learning and statistics.
Various approaches to obtain sparse principal direction loadings have been proposed, which are termed Sparse Principal Component Analysis.
We present thresholding as a provably accurate, polynomial-time approximation algorithm for the SPCA problem.
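The thresholding idea can be sketched in its simplest form: compute the dense leading eigenvector of the sample covariance, then zero out small loadings and renormalize. The data model, threshold rule, and sizes below are illustrative choices, not the procedure the paper analyzes:

```python
import numpy as np

rng = np.random.default_rng(4)

# Data whose top principal direction is genuinely sparse (5 of 50 coords).
n, d, k = 500, 50, 5
v = np.zeros(d)
v[:k] = 1.0 / np.sqrt(k)
X = (3.0 * rng.standard_normal((n, 1))) @ v[None, :] \
    + 0.1 * rng.standard_normal((n, d))

# Dense leading eigenvector of the sample covariance.
w = np.linalg.eigh(X.T @ X / n)[1][:, -1]

# Thresholding: drop loadings below half the largest one, renormalize.
w_sparse = np.where(np.abs(w) >= 0.5 * np.abs(w).max(), w, 0.0)
w_sparse /= np.linalg.norm(w_sparse)
```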
arXiv Detail & Related papers (2020-06-23T04:25:36Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences.