Learning Sparse and Low-Rank Priors for Image Recovery via Iterative
Reweighted Least Squares Minimization
- URL: http://arxiv.org/abs/2304.10536v1
- Date: Thu, 20 Apr 2023 17:59:45 GMT
- Title: Learning Sparse and Low-Rank Priors for Image Recovery via Iterative
Reweighted Least Squares Minimization
- Authors: Stamatios Lefkimmiatis and Iaroslav Koshelev
- Abstract summary: We introduce a novel optimization algorithm for image recovery under learned sparse and low-rank constraints.
Our proposed algorithm generalizes the Iteratively Reweighted Least Squares (IRLS) method, used for signal recovery.
Our reconstruction results are shown to be very competitive and in many cases outperform those of existing unrolled networks.
- Score: 12.487990897680422
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: We introduce a novel optimization algorithm for image recovery under learned
sparse and low-rank constraints, which we parameterize as weighted extensions
of the $\ell_p^p$-vector and $\mathcal S_p^p$ Schatten-matrix quasi-norms for
$0\!<p\!\le1$, respectively. Our proposed algorithm generalizes the Iteratively
Reweighted Least Squares (IRLS) method, used for signal recovery under $\ell_1$
and nuclear-norm constrained minimization. Further, we interpret our overall
minimization approach as a recurrent network that we then employ to deal with
inverse low-level computer vision problems. Thanks to the convergence
guarantees that our IRLS strategy offers, we are able to train the derived
reconstruction networks using a memory-efficient implicit back-propagation
scheme, which does not pose any restrictions on their effective depth. To
assess our networks' performance, we compare them against other existing
reconstruction methods on several inverse problems, namely image deblurring,
super-resolution, demosaicking and sparse recovery. Our reconstruction results
are shown to be very competitive and in many cases outperform those of existing
unrolled networks, whose number of parameters is orders of magnitude higher
than that of our learned models.
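To make the recipe concrete, the sketch below illustrates plain IRLS for a smoothed $\ell_p^p$ penalty (an illustrative NumPy example of the classical scheme the paper generalizes; it is not the authors' learned, weighted variant): each iteration updates weights from the current estimate and then solves a weighted least-squares problem.

```python
import numpy as np

def irls_lp(A, y, p=0.5, lam=1e-2, eps=1e-8, iters=50):
    """Minimal IRLS sketch for min_x 0.5*||Ax - y||^2 + lam * sum_i (x_i^2 + eps)^(p/2).
    Illustrative only: the paper learns weighted extensions of such priors."""
    x = np.linalg.lstsq(A, y, rcond=None)[0]          # least-squares initialization
    for _ in range(iters):
        w = (x**2 + eps) ** (p / 2 - 1)               # reweighting from current estimate
        # Weighted least-squares step: (A^T A + lam*p*diag(w)) x = A^T y
        x = np.linalg.solve(A.T @ A + lam * p * np.diag(w), A.T @ y)
    return x

# Usage: recover a sparse vector from a few noisy linear measurements.
rng = np.random.default_rng(0)
A = rng.standard_normal((40, 100))
x_true = np.zeros(100); x_true[[3, 17, 42]] = [1.0, -2.0, 0.5]
y = A @ x_true + 0.01 * rng.standard_normal(40)
x_hat = irls_lp(A, y)
```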
Related papers
- Precise asymptotics of reweighted least-squares algorithms for linear diagonal networks [15.074950361970194]
We provide a unified analysis for a family of algorithms that encompasses IRLS, the recently proposed lin-RFM algorithm, and alternating minimization on linear diagonal neural networks.
We show that, with an appropriately chosen reweighting policy, these algorithms achieve favorable performance for a handful of sparse structures (a toy illustration follows this entry).
We also show that exploiting this structure in the reweighting scheme provably improves test error compared to coordinate-wise reweighting.
arXiv Detail & Related papers (2024-06-04T20:37:17Z)
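As a toy illustration of the linear diagonal networks studied in the entry above (an assumption-laden sketch: the parameterization $w = u \odot v$, the non-negative target, and all hyperparameters are illustrative, not the paper's setup), gradient descent from a small initialization is implicitly biased toward sparse solutions:

```python
import numpy as np

# Sketch (illustrative assumptions, not the paper's exact setting): a linear
# diagonal network parameterizes the regressor as w = u * v and is fit by
# plain gradient descent on the squared loss from a small initialization.
rng = np.random.default_rng(1)
n, d = 60, 120
A = rng.standard_normal((n, d)) / np.sqrt(n)
w_true = np.zeros(d); w_true[:4] = [1.0, 0.8, 0.5, 0.3]   # non-negative sparse target
y = A @ w_true

u = np.full(d, 1e-2)
v = np.full(d, 1e-2)           # small init is what biases GD toward sparsity
lr = 0.1
for _ in range(10000):
    r = A @ (u * v) - y        # residual of the product parameterization
    g = A.T @ r                # gradient w.r.t. the effective weights w = u * v
    u, v = u - lr * g * v, v - lr * g * u   # chain rule through the factorization
w_hat = u * v                  # concentrates on the true support as training proceeds
```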
- Iterative Reweighted Least Squares Networks With Convergence Guarantees for Solving Inverse Imaging Problems [12.487990897680422]
We present a novel optimization strategy for image reconstruction tasks under analysis-based image regularization.
We parameterize such regularizers using potential functions that correspond to weighted extensions of the $\ell_p^p$-vector and $\mathcal{S}_p^p$ Schatten-matrix quasi-norms.
We show that thanks to the convergence guarantees of our proposed minimization strategy, such optimization can be successfully performed with a memory-efficient implicit back-propagation scheme.
arXiv Detail & Related papers (2023-08-10T17:59:46Z)
- Iterative Soft Shrinkage Learning for Efficient Image Super-Resolution [91.3781512926942]
Image super-resolution (SR) has witnessed extensive neural network designs from CNN to transformer architectures.
This work investigates the potential of network pruning for super-resolution to take advantage of off-the-shelf network designs and reduce the underlying computational overhead.
We propose a novel Iterative Soft Shrinkage-Percentage (ISS-P) method that optimizes the sparse structure of a randomly initialized network at each iteration and tweaks unimportant weights on-the-fly by a small amount proportional to their magnitude (a rough sketch follows this entry).
arXiv Detail & Related papers (2023-03-16T21:06:13Z)
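A rough sketch of the soft-shrinkage idea from the ISS-P entry above (hypothetical: the percentile and shrink factor are illustrative assumptions, not the authors' schedule): the smallest-magnitude weights are shrunk gently instead of being hard-zeroed.

```python
import numpy as np

def soft_shrink_step(w, prune_frac=0.1, shrink=0.05):
    """One illustrative soft-shrinkage step (an assumption, not ISS-P itself):
    multiplicatively shrink the smallest `prune_frac` of weights instead of
    zeroing them, so "pruned" weights can still recover during training."""
    cutoff = np.quantile(np.abs(w), prune_frac)    # magnitude threshold
    out = w.copy()
    out[np.abs(w) <= cutoff] *= 1.0 - shrink       # shrink proportional to magnitude
    return out

w = np.random.default_rng(2).standard_normal((64, 64))
for _ in range(100):                               # interleaved with SGD in practice
    w = soft_shrink_step(w)
```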
- Effective Invertible Arbitrary Image Rescaling [77.46732646918936]
Invertible Neural Networks (INN) are able to increase upscaling accuracy significantly by optimizing the downscaling and upscaling cycle jointly.
In this work, a simple and effective invertible arbitrary rescaling network (IARN) is proposed to achieve arbitrary image rescaling by training only one model.
It is shown to achieve state-of-the-art (SOTA) performance in bidirectional arbitrary rescaling without compromising perceptual quality in LR outputs.
arXiv Detail & Related papers (2022-09-26T22:22:30Z)
- InfoNeRF: Ray Entropy Minimization for Few-Shot Neural Volume Rendering [55.70938412352287]
We present an information-theoretic regularization technique for few-shot novel view synthesis based on neural implicit representation.
The proposed approach minimizes potential reconstruction inconsistency that happens due to insufficient viewpoints.
We achieve consistently improved performance compared to existing neural view synthesis methods by large margins on multiple standard benchmarks.
arXiv Detail & Related papers (2021-12-31T11:56:01Z)
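The ray-entropy regularizer from the InfoNeRF entry above can be summarized in a few lines (a minimal sketch assuming per-ray volume-rendering sample weights are normalized into a discrete distribution; this is not the authors' implementation):

```python
import numpy as np

def ray_entropy_loss(weights, eps=1e-10):
    """Illustrative ray-entropy regularizer (a sketch, not InfoNeRF's exact code):
    normalize the volume-rendering sample weights along each ray into a discrete
    distribution and penalize its Shannon entropy, favoring compact densities."""
    p = weights / (weights.sum(axis=-1, keepdims=True) + eps)   # (rays, samples)
    return float(-(p * np.log(p + eps)).sum(axis=-1).mean())

w = np.random.default_rng(3).random((1024, 64))   # stand-in per-ray sample weights
loss = ray_entropy_loss(w)                        # added to the rendering loss
```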
- Robust lEarned Shrinkage-Thresholding (REST): Robust unrolling for sparse recovery [87.28082715343896]
We consider deep neural networks for solving inverse problems that are robust to forward model mis-specifications.
We design a new robust deep neural network architecture by applying algorithm unfolding techniques to a robust version of the underlying recovery problem.
The proposed REST network is shown to outperform state-of-the-art model-based and data-driven algorithms in both compressive sensing and radar imaging problems.
arXiv Detail & Related papers (2021-10-20T06:15:45Z)
- A Deep-Unfolded Reference-Based RPCA Network For Video Foreground-Background Separation [86.35434065681925]
This paper proposes a new deep-unfolding-based network design for the problem of Robust Principal Component Analysis (RPCA).
Unlike existing designs, our approach focuses on modeling the temporal correlation between the sparse representations of consecutive video frames.
Experimentation using the moving MNIST dataset shows that the proposed network outperforms a recently proposed state-of-the-art RPCA network in the task of video foreground-background separation.
arXiv Detail & Related papers (2020-10-02T11:40:09Z)
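For context on the RPCA entry above, the classical splitting that such networks unroll alternates singular-value thresholding for the low-rank part with elementwise soft-thresholding for the sparse part (a textbook sketch; per the abstract, the unfolded network additionally couples the sparse codes of consecutive frames):

```python
import numpy as np

def soft(x, t):
    """Elementwise soft-thresholding."""
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def rpca_step(D, S, tau=1.0, lam=0.1):
    """One textbook RPCA iteration for D ~ L + S (a sketch of the base scheme,
    not the proposed unfolded network): singular-value thresholding for the
    low-rank part, elementwise soft-thresholding for the sparse part."""
    U, s, Vt = np.linalg.svd(D - S, full_matrices=False)
    L = (U * soft(s, tau)) @ Vt          # singular-value thresholding
    S = soft(D - L, lam)                 # sparse outliers
    return L, S

D = np.random.default_rng(4).standard_normal((48, 48))
S = np.zeros_like(D)
for _ in range(30):
    L, S = rpca_step(D, S)
```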
- Neural Network-based Reconstruction in Compressed Sensing MRI Without Fully-sampled Training Data [17.415937218905125]
Compressed sensing MRI (CS-MRI) has shown promise in reconstructing under-sampled MR images.
Deep learning models have been developed that model the iterative nature of classical techniques by unrolling iterations in a neural network.
In this paper, we explore a novel strategy to train an unrolled reconstruction network in an unsupervised fashion by adopting a loss function widely used in classical optimization schemes (sketched after this entry).
arXiv Detail & Related papers (2020-07-29T17:46:55Z)
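The unsupervised loss alluded to in the CS-MRI entry above can be sketched as a classical compressed-sensing objective evaluated on the network output (the names and the exact penalty are assumptions; the paper's loss may differ):

```python
import numpy as np

def unsupervised_cs_loss(x, y, A, lam=1e-3):
    """Illustrative unsupervised training loss (an assumption, not the paper's
    exact loss): measurement-domain data consistency plus a sparsity penalty,
    so no fully-sampled ground truth is required. Here A is the under-sampled
    forward operator and y the acquired measurements."""
    data_fit = 0.5 * np.linalg.norm(A @ x - y) ** 2   # consistency with k-space data
    sparsity = lam * np.abs(x).sum()                  # classical l1 regularizer
    return data_fit + sparsity

# During training, x would be the unrolled network's output for measurements y.
```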
- Iterative Network for Image Super-Resolution [69.07361550998318]
Single image super-resolution (SISR) has been greatly revitalized by the recent development of convolutional neural networks (CNNs).
This paper provides a new insight on conventional SISR algorithm, and proposes a substantially different approach relying on the iterative optimization.
A novel iterative super-resolution network (ISRN) is proposed on top of the iterative optimization.
arXiv Detail & Related papers (2020-05-20T11:11:47Z)
- Sparse aNETT for Solving Inverse Problems with Deep Learning [2.5234156040689237]
We propose a sparse reconstruction framework (aNETT) for solving inverse problems.
We train an autoencoder network $D \circ E$ with $E$ acting as a nonlinear sparsifying transform.
Numerical results are presented for sparse-view CT.
arXiv Detail & Related papers (2020-04-20T18:43:13Z)
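The aNETT objective from the entry above can be written down directly (a minimal sketch assuming an $\ell_1$ penalty on the encoder codes plus an autoencoder consistency term; the symbols $A$, $E$, $D$ and the weights are illustrative):

```python
import numpy as np

def anett_objective(x, y, A, E, D, alpha=1e-2, beta=1e-2):
    """Illustrative aNETT-style objective (a sketch, not the authors' code):
    A is the forward operator, E the learned sparsifying encoder, D the decoder.
    Sparsity is imposed on the codes E(x), with a term keeping x close to the
    range of the autoencoder."""
    data_fit = 0.5 * np.linalg.norm(A @ x - y) ** 2
    sparsity = alpha * np.abs(E(x)).sum()                    # sparse analysis codes
    consistency = beta * np.linalg.norm(D(E(x)) - x) ** 2    # autoencoder fidelity
    return data_fit + sparsity + consistency

# Toy usage with identity encoder/decoder as placeholders:
E = D = lambda z: z
x = np.ones(16); y = np.zeros(8); A = np.random.default_rng(5).standard_normal((8, 16))
val = anett_objective(x, y, A, E, D)
```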
- Interpretable Deep Recurrent Neural Networks via Unfolding Reweighted $\ell_1$-$\ell_1$ Minimization: Architecture Design and Generalization Analysis [19.706363403596196]
This paper develops a novel deep recurrent neural network (coined reweighted-RNN) by unfolding a reweighted $\ell_1$-$\ell_1$ minimization algorithm.
To the best of our knowledge, this is the first deep unfolding method that explores reweighted minimization.
The experimental results on the moving MNIST dataset demonstrate that the proposed deep reweighted-RNN significantly outperforms existing RNN models.
arXiv Detail & Related papers (2020-03-18T17:02:10Z)
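Finally, a one-layer sketch of the unfolded reweighted $\ell_1$ update behind the reweighted-RNN entry above (a generic reweighted-ISTA step with an epsilon-smoothed reweighting policy; the actual architecture learns these quantities per layer and differs in detail):

```python
import numpy as np

def soft(x, t):
    """Elementwise soft-thresholding (proximal map of the weighted l1 norm)."""
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def reweighted_ista_layer(x, y, A, w, lam=0.01, step=0.1):
    """One unfolded layer (a generic sketch, not the reweighted-RNN itself):
    gradient step on 0.5*||Ax - y||^2, then per-coordinate weighted shrinkage.
    In an unrolled network, step sizes and weights become learnable."""
    x = x - step * A.T @ (A @ x - y)
    return soft(x, step * lam * w)

rng = np.random.default_rng(6)
A = rng.standard_normal((30, 80)) / np.sqrt(30)
y = rng.standard_normal(30)
x = np.zeros(80)
for _ in range(20):                      # each pass mimics one network layer
    w = 1.0 / (np.abs(x) + 1e-1)         # epsilon-smoothed reweighting policy
    x = reweighted_ista_layer(x, y, A, w)
```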