Rank Overspecified Robust Matrix Recovery: Subgradient Method and Exact
Recovery
- URL: http://arxiv.org/abs/2109.11154v1
- Date: Thu, 23 Sep 2021 05:54:46 GMT
- Title: Rank Overspecified Robust Matrix Recovery: Subgradient Method and Exact
Recovery
- Authors: Lijun Ding, Liwei Jiang, Yudong Chen, Qing Qu, Zhihui Zhu
- Abstract summary: We consider a robust factorization of a low-rank matrix with no prior knowledge on the rank.
We show that our particular design of diminishing stepsizes effectively prevents overfitting under overspecified models.
- Score: 37.05862765171643
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We study the robust recovery of a low-rank matrix from sparsely and grossly
corrupted Gaussian measurements, with no prior knowledge on the intrinsic rank.
We consider the robust matrix factorization approach. We employ a robust
$\ell_1$ loss function and deal with the challenge of the unknown rank by using
an overspecified factored representation of the matrix variable. We then solve
the associated nonconvex nonsmooth problem using a subgradient method with
diminishing stepsizes. We show that under a regularity condition on the sensing
matrices and corruption, which we call restricted direction preserving property
(RDPP), even with rank overspecified, the subgradient method converges to the
exact low-rank solution at a sublinear rate. Moreover, our result is more
general in the sense that it automatically speeds up to a linear rate once the
factor rank matches the unknown rank. On the other hand, we show that the RDPP
condition holds under generic settings, such as Gaussian measurements under
independent or adversarial sparse corruptions, where the result could be of
independent interest. Both the exact recovery and the convergence rate of the
proposed subgradient method are numerically verified in the overspecified
regime. Moreover, our experiment further shows that our particular design of
diminishing stepsize effectively prevents overfitting for robust recovery under
overparameterized models, such as robust matrix sensing and learning robust
deep image prior. This regularization effect is worth further investigation.
Related papers
- Guarantees of a Preconditioned Subgradient Algorithm for Overparameterized Asymmetric Low-rank Matrix Recovery [8.722715843502321]
We focus on a matrix factorization-based approach for robust low-rank and asymmetric matrix recovery from corrupted measurements.
We propose a subgradient algorithm that inherits the merits of preconditioned algorithms, whose rate of convergence does not depend on the condition number of the sought matrix.
arXiv Detail & Related papers (2024-10-22T08:58:44Z) - A Validation Approach to Over-parameterized Matrix and Image Recovery [29.29430943998287]
We consider the problem of recovering a low-rank matrix from several random linear measurements.
We show that the proposed validation approach can also be efficiently used for the deep image prior, where an image is represented by a deep network.
arXiv Detail & Related papers (2022-09-21T22:01:23Z) - Solving weakly supervised regression problem using low-rank manifold
regularization [77.34726150561087]
We solve a weakly supervised regression problem.
By "weakly supervised" we mean that the labels are known for some training points, unknown for others, and uncertain for the rest due to random noise or other causes such as lack of resources.
In the numerical section, we apply the suggested method to artificial and real datasets using Monte Carlo modeling.
arXiv Detail & Related papers (2021-04-13T23:21:01Z) - Implicit Regularization of Sub-Gradient Method in Robust Matrix
Recovery: Don't be Afraid of Outliers [6.320141734801679]
We show that a simple sub-gradient method converges to the true low-rank solution efficiently.
We also build upon a new notion of restricted isometry property, called sign-RIP, to prove the robustness of the method.
arXiv Detail & Related papers (2021-02-05T02:52:00Z) - Low-Rank Matrix Recovery with Scaled Subgradient Methods: Fast and
Robust Convergence Without the Condition Number [34.0533596121548]
Many problems in data science can be treated as estimating a low-rank matrix from highly incomplete, sometimes even corrupted, observations.
One popular approach is to resort to matrix factorization, where the low-rank matrix factors are optimized via first-order methods over a smooth loss.
arXiv Detail & Related papers (2020-10-26T06:21:14Z) - A Scalable, Adaptive and Sound Nonconvex Regularizer for Low-rank Matrix
Completion [60.52730146391456]
We propose a new scalable nonconvex low-rank regularizer called the "nuclear Frobenius norm" regularizer, which is adaptive and sound.
It bypasses the computation of singular values and allows fast optimization.
It obtains state-of-the-art recovery performance while being the fastest among existing matrix learning methods.
arXiv Detail & Related papers (2020-08-14T18:47:58Z) - Understanding Implicit Regularization in Over-Parameterized Single Index
Model [55.41685740015095]
We design regularization-free algorithms for the high-dimensional single index model.
We provide theoretical guarantees for the induced implicit regularization phenomenon.
arXiv Detail & Related papers (2020-07-16T13:27:47Z) - Robust Compressed Sensing using Generative Models [98.64228459705859]
In this paper we propose an algorithm inspired by the Median-of-Means (MOM) approach.
Our algorithm guarantees recovery for heavy-tailed data, even in the presence of outliers.
arXiv Detail & Related papers (2020-06-16T19:07:41Z) - Robust Recovery via Implicit Bias of Discrepant Learning Rates for
Double Over-parameterization [46.22605476383572]
We show that gradient descent with discrepant learning rates provably recovers the underlying matrix without prior knowledge of either the rank of the matrix or the sparsity of the corruption.
Our method handles different test images and varying corruption levels with a single learning pipeline where the network width and termination conditions do not need to be adjusted on a case-by-case basis.
arXiv Detail & Related papers (2020-06-16T01:21:22Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences.