Robust CUR Decomposition: Theory and Imaging Applications
- URL: http://arxiv.org/abs/2101.05231v1
- Date: Tue, 5 Jan 2021 17:58:15 GMT
- Title: Robust CUR Decomposition: Theory and Imaging Applications
- Authors: HanQin Cai, Keaton Hamm, Longxiu Huang, Deanna Needell
- Abstract summary: This paper considers the use of Robust PCA in a CUR decomposition framework and applications thereof.
We consider two key imaging applications of Robust PCA: video foreground-background separation and face modeling.
- Score: 9.280330114137778
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: This paper considers the use of Robust PCA in a CUR decomposition framework
and applications thereof. Our main algorithms produce a robust version of
column-row factorizations of matrices $\mathbf{D}=\mathbf{L}+\mathbf{S}$ where
$\mathbf{L}$ is low-rank and $\mathbf{S}$ contains sparse outliers. These
methods yield interpretable factorizations at low computational cost, and
provide new CUR decompositions that are robust to sparse outliers, in contrast
to previous methods. We consider two key imaging applications of Robust PCA:
video foreground-background separation and face modeling. This paper examines
the qualitative behavior of our Robust CUR decompositions on the benchmark
videos and face datasets, and finds that our method works as well as standard
Robust PCA while being significantly faster. Additionally, we consider hybrid
randomized and deterministic sampling methods which produce a compact CUR
decomposition of a given matrix, and apply this to video sequences to produce
canonical frames thereof.
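As context for the factorization above, here is a minimal sketch of an ordinary (non-robust) CUR approximation built from uniformly sampled columns and rows. It is not the paper's robust algorithm; the sampling scheme and the number of sampled columns/rows are illustrative assumptions.

```python
import numpy as np

def cur_approximation(D, n_cols, n_rows, seed=None):
    """Plain CUR approximation of D from uniformly sampled columns and rows.

    Returns C (sampled columns), U (pseudoinverse of the row/column
    intersection), and R (sampled rows), so that C @ U @ R approximates D
    when D is (close to) low rank. This is the classical, non-robust
    construction; sparse outliers in D would corrupt it.
    """
    rng = np.random.default_rng(seed)
    m, n = D.shape
    col_idx = rng.choice(n, size=n_cols, replace=False)
    row_idx = rng.choice(m, size=n_rows, replace=False)
    C = D[:, col_idx]                                  # m x n_cols
    R = D[row_idx, :]                                  # n_rows x n
    U = np.linalg.pinv(D[np.ix_(row_idx, col_idx)])    # n_cols x n_rows
    return C, U, R

# Toy check: an exactly rank-5 matrix is recovered from 10 columns and 10 rows.
rng = np.random.default_rng(0)
L = rng.standard_normal((200, 5)) @ rng.standard_normal((5, 300))
C, U, R = cur_approximation(L, n_cols=10, n_rows=10, seed=0)
print(np.linalg.norm(L - C @ U @ R) / np.linalg.norm(L))  # near machine precision
```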
Related papers
- Deep Unrolling for Nonconvex Robust Principal Component Analysis [75.32013242448151]
We design algorithms for Robust Principal Component Analysis (RPCA).
It consists in decomposing a matrix into the sum of a low-rank matrix and a sparse matrix (a toy numerical sketch of this low-rank-plus-sparse model appears after this list).
arXiv Detail & Related papers (2023-07-12T03:48:26Z) - Learning Graphical Factor Models with Riemannian Optimization [70.13748170371889]
This paper proposes a flexible algorithmic framework for graph learning under low-rank structural constraints.
The problem is expressed as penalized maximum likelihood estimation of an elliptical distribution.
We leverage geometries of positive definite matrices and positive semi-definite matrices of fixed rank that are well suited to elliptical models.
arXiv Detail & Related papers (2022-10-21T13:19:45Z) - Batch-efficient EigenDecomposition for Small and Medium Matrices [65.67315418971688]
EigenDecomposition (ED) is at the heart of many computer vision algorithms and applications.
We propose a QR-based ED method dedicated to the application scenarios of computer vision.
arXiv Detail & Related papers (2022-07-09T09:14:12Z) - Riemannian CUR Decompositions for Robust Principal Component Analysis [4.060731229044571]
Robust Principal Component Analysis (PCA) has received massive attention in recent years.
This paper proposes Riemannian CUR, which is a robust PCA decomposition algorithm.
It is able to tolerate a significant amount of outliers, and is comparable to Accelerated Alternating Projections, which has high outlier tolerance but worse computational complexity than the proposed method.
arXiv Detail & Related papers (2022-06-17T22:58:09Z) - Exact Decomposition of Joint Low Rankness and Local Smoothness Plus
Sparse Matrices [39.47324019377441]
We propose a new RPCA model based on three-dimensional correlated total variation regularization (3DCTV-RPCA for short)
We prove that under some mild assumptions, the proposed 3DCTV-RPCA model can decompose both components exactly.
arXiv Detail & Related papers (2022-01-29T13:58:03Z) - Sublinear Time Approximation of Text Similarity Matrices [50.73398637380375]
We introduce a generalization of the popular Nyström method to the indefinite setting.
Our algorithm can be applied to any similarity matrix and runs in sublinear time in the size of the matrix.
We show that our method, along with a simple variant of CUR decomposition, performs very well in approximating a variety of similarity matrices (a minimal sketch of the classical PSD Nyström construction appears after this list).
arXiv Detail & Related papers (2021-12-17T17:04:34Z) - Solving weakly supervised regression problem using low-rank manifold
regularization [77.34726150561087]
We solve a weakly supervised regression problem.
Under "weakly" we understand that for some training points the labels are known, for some unknown, and for others uncertain due to the presence of random noise or other reasons such as lack of resources.
In the numerical section, we applied the suggested method to artificial and real datasets using Monte-Carlo modeling.
arXiv Detail & Related papers (2021-04-13T23:21:01Z) - QR and LQ Decomposition Matrix Backpropagation Algorithms for Square,
Wide, and Deep -- Real or Complex -- Matrices and Their Software
Implementation [0.0]
This article presents matrix backpropagation algorithms for the QR decomposition of matrices $A_{m,n}$ that are either square ($m = n$), wide ($m < n$), or deep ($m > n$), with rank $k = \min(m, n)$.
We derive novel matrix backpropagation results for the pivoted (full-rank) QR decomposition and for the LQ decomposition of deep input matrices (a small autodiff check of QR gradients appears after this list).
arXiv Detail & Related papers (2020-09-19T21:03:37Z) - Robust Low-rank Matrix Completion via an Alternating Manifold Proximal
Gradient Continuation Method [47.80060761046752]
Robust low-rank matrix completion (RMC) has been studied extensively for computer vision, signal processing and machine learning applications.
This problem aims to decompose a partially observed matrix into the superposition of a low-rank matrix and a sparse matrix, where the sparse matrix captures the grossly corrupted entries of the matrix.
A widely used approach to tackle RMC is to consider a convex formulation, which minimizes the nuclear norm of the low-rank matrix (to promote low-rankness) and the l1 norm of the sparse matrix (to promote sparsity).
arXiv Detail & Related papers (2020-08-18T04:46:22Z) - Denise: Deep Robust Principal Component Analysis for Positive
Semidefinite Matrices [8.1371986647556]
Denise is a deep learning-based algorithm for robust PCA of covariance matrices.
Experiments show that Denise matches state-of-the-art performance in terms of decomposition quality.
arXiv Detail & Related papers (2020-04-28T15:45:21Z)
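The low-rank-plus-sparse model $\mathbf{D}=\mathbf{L}+\mathbf{S}$ appears in the main abstract and in several of the entries above (Deep Unrolling, Riemannian CUR, robust matrix completion). The following toy sketch alternates the two proximal operators behind those formulations: singular value thresholding for the nuclear norm and entrywise soft thresholding for the l1 norm. The thresholds and iteration count are illustrative assumptions, and this is not any of the cited papers' algorithms.

```python
import numpy as np

def svt(X, tau):
    """Singular value thresholding: proximal operator of tau * nuclear norm."""
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    return (U * np.maximum(s - tau, 0.0)) @ Vt

def soft(X, tau):
    """Entrywise soft thresholding: proximal operator of tau * l1 norm."""
    return np.sign(X) * np.maximum(np.abs(X) - tau, 0.0)

def naive_rpca(D, lam=None, tau=None, n_iter=200):
    """Alternating minimization of tau*||L||_* + lam*tau*||S||_1 + 0.5*||D-L-S||_F^2.

    Each step solves one subproblem exactly (SVT for L, soft thresholding
    for S).  A toy illustration of stable principal component pursuit, not
    a tuned or convergence-monitored solver.
    """
    m, n = D.shape
    lam = lam if lam is not None else 1.0 / np.sqrt(max(m, n))
    tau = tau if tau is not None else 1e-2 * np.linalg.norm(D, 2)
    L = np.zeros_like(D)
    S = np.zeros_like(D)
    for _ in range(n_iter):
        L = svt(D - S, tau)
        S = soft(D - L, lam * tau)
    return L, S

# Tiny demo: rank-3 matrix plus roughly 5% sparse outliers.
rng = np.random.default_rng(1)
L0 = rng.standard_normal((80, 3)) @ rng.standard_normal((3, 60))
S0 = 10.0 * rng.standard_normal((80, 60)) * (rng.random((80, 60)) < 0.05)
L_hat, S_hat = naive_rpca(L0 + S0)
print(np.linalg.norm(L_hat - L0) / np.linalg.norm(L0))  # small, up to shrinkage bias
```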
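For the "Sublinear Time Approximation of Text Similarity Matrices" entry, here is a minimal sketch of the classical Nyström approximation for a positive semi-definite similarity matrix. The indefinite-setting generalization introduced in that paper is not reproduced; the kernel, bandwidth, and landmark count are illustrative assumptions.

```python
import numpy as np

def nystrom_psd(K, n_landmarks, seed=None):
    """Classical Nystrom approximation of a PSD similarity matrix K.

    Samples landmark columns C = K[:, idx] and the landmark block
    W = K[idx, idx], giving K ~ C @ pinv(W) @ C.T.  In principle only the
    sampled columns are needed; the full K is passed here for simplicity.
    """
    rng = np.random.default_rng(seed)
    idx = rng.choice(K.shape[0], size=n_landmarks, replace=False)
    C = K[:, idx]
    W_pinv = np.linalg.pinv(K[np.ix_(idx, idx)])
    return C, W_pinv

# Toy example with an RBF kernel similarity matrix.
rng = np.random.default_rng(2)
X = rng.standard_normal((500, 10))
sq = np.sum(X**2, axis=1)
K = np.exp(-(sq[:, None] + sq[None, :] - 2.0 * X @ X.T) / 10.0)
C, W_pinv = nystrom_psd(K, n_landmarks=50, seed=2)
print(np.linalg.norm(K - C @ W_pinv @ C.T) / np.linalg.norm(K))  # relative error
```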
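For the QR/LQ backpropagation entry, the following is a small check that automatic differentiation through a QR factorization agrees with finite differences. It uses PyTorch's built-in derivative for torch.linalg.qr rather than the article's explicit backpropagation formulas; the square matrix size and the scalar test function are illustrative assumptions.

```python
import torch

def qr_loss(A):
    """Arbitrary smooth scalar function of the QR factors, used only for gradient testing."""
    Q, R = torch.linalg.qr(A, mode="reduced")
    return Q.sum() + R.sum()

# Square, generically full-rank matrix in double precision (required by gradcheck).
A = torch.randn(5, 5, dtype=torch.float64, requires_grad=True)

# Compares the autodiff gradient of qr_loss against finite differences.
print(torch.autograd.gradcheck(qr_loss, (A,)))  # prints True when the gradients agree
```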