Loop Unrolled Shallow Equilibrium Regularizer (LUSER) -- A
Memory-Efficient Inverse Problem Solver
- URL: http://arxiv.org/abs/2210.04987v1
- Date: Mon, 10 Oct 2022 19:50:37 GMT
- Title: Loop Unrolled Shallow Equilibrium Regularizer (LUSER) -- A
Memory-Efficient Inverse Problem Solver
- Authors: Peimeng Guan, Jihui Jin, Justin Romberg, Mark A. Davenport
- Abstract summary: In inverse problems we aim to reconstruct some underlying signal of interest from potentially corrupted and often ill-posed measurements.
We propose an LU algorithm with shallow equilibrium regularizers (LUSER).
These implicit models are as expressive as deeper convolutional networks, but far more memory efficient during training.
- Score: 26.87738024952936
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: In inverse problems we aim to reconstruct some underlying signal of interest
from potentially corrupted and often ill-posed measurements. Classical
optimization-based techniques proceed by optimizing a data consistency metric
together with a regularizer. Current state-of-the-art machine learning
approaches draw inspiration from such techniques by unrolling the iterative
updates for an optimization-based solver and then learning a regularizer from
data. This loop unrolling (LU) method has shown tremendous success, but often
requires a deep model for the best performance leading to high memory costs
during training. Thus, to address the balance between computation cost and
network expressiveness, we propose an LU algorithm with shallow equilibrium
regularizers (LUSER). These implicit models are as expressive as deeper
convolutional networks, but far more memory efficient during training. The
proposed method is evaluated on image deblurring, computed tomography (CT), as
well as single-coil Magnetic Resonance Imaging (MRI) tasks and shows similar,
or even better, performance while requiring up to 8 times less computational
resources during training when compared against a more typical LU architecture
with feedforward convolutional regularizers.
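As a rough illustration of the idea, the sketch below pairs a gradient step on the data-consistency term with a regularizer computed as the fixed point of a single shallow layer. The weights `W` and `U`, the `tanh` layer, and the update rule are hypothetical stand-ins for exposition, not the paper's actual architecture:

```python
import numpy as np

def equilibrium_regularizer(x, W, U, tol=1e-8, max_iter=200):
    """Shallow implicit regularizer: return the fixed point z* of
    z = tanh(W @ z + U @ x).  W and U are hypothetical learned
    weights standing in for LUSER's equilibrium block."""
    z = np.zeros_like(x)
    for _ in range(max_iter):
        z_next = np.tanh(W @ z + U @ x)
        if np.linalg.norm(z_next - z) < tol:
            return z_next
        z = z_next
    return z

def loop_unrolled_solver(y, A, W, U, eta, K=10):
    """K unrolled iterations: a gradient step on the data-consistency
    term ||A x - y||^2 followed by the learned regularizer."""
    x = A.T @ y  # crude initialization from the measurements
    for _ in range(K):
        x = x - eta * A.T @ (A @ x - y)           # data-consistency step
        x = x - equilibrium_regularizer(x, W, U)  # regularization step
    return x
```

Because the regularizer is defined implicitly as a fixed point, its training memory does not grow with the number of solver iterations, which is the source of the memory savings claimed above.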
Related papers
- An Optimization-based Deep Equilibrium Model for Hyperspectral Image
Deconvolution with Convergence Guarantees [71.57324258813675]
We propose a novel methodology for addressing the hyperspectral image deconvolution problem.
A new optimization problem is formulated, leveraging a learnable regularizer in the form of a neural network.
The derived iterative solver is then expressed as a fixed-point calculation problem within the Deep Equilibrium framework.
arXiv Detail & Related papers (2023-06-10T08:25:16Z)
- Curvature regularization for Non-line-of-sight Imaging from Under-sampled Data [5.591221518341613]
Non-line-of-sight (NLOS) imaging aims to reconstruct the three-dimensional hidden scenes from the data measured in the line-of-sight.
We propose novel NLOS reconstruction models based on curvature regularization.
We evaluate the proposed algorithms on both synthetic and real datasets.
arXiv Detail & Related papers (2023-01-01T14:10:43Z)
- Effective Invertible Arbitrary Image Rescaling [77.46732646918936]
Invertible Neural Networks (INN) are able to increase upscaling accuracy significantly by optimizing the downscaling and upscaling cycle jointly.
In this work, a simple and effective invertible arbitrary rescaling network (IARN) is proposed to achieve arbitrary image rescaling by training only one model.
It is shown to achieve a state-of-the-art (SOTA) performance in bidirectional arbitrary rescaling without compromising perceptual quality in LR outputs.
arXiv Detail & Related papers (2022-09-26T22:22:30Z)
- Deep Equilibrium Assisted Block Sparse Coding of Inter-dependent Signals: Application to Hyperspectral Imaging [71.57324258813675]
A dataset of inter-dependent signals is defined as a matrix whose columns demonstrate strong dependencies.
A neural network is employed to act as structure prior and reveal the underlying signal interdependencies.
Deep unrolling and Deep equilibrium based algorithms are developed, forming highly interpretable and concise deep-learning-based architectures.
arXiv Detail & Related papers (2022-03-29T21:00:39Z)
- Denoising Diffusion Restoration Models [110.1244240726802]
Denoising Diffusion Restoration Models (DDRM) is an efficient, unsupervised posterior sampling method.
We demonstrate DDRM's versatility on several image datasets for super-resolution, deblurring, inpainting, and colorization.
arXiv Detail & Related papers (2022-01-27T20:19:07Z)
- Joint inference and input optimization in equilibrium networks [68.63726855991052]
A deep equilibrium model is a class of model that forgoes traditional network depth and instead computes the output of a network by finding the fixed point of a single nonlinear layer.
We show that there is a natural synergy between these two settings.
We demonstrate this strategy on various tasks such as training generative models while optimizing over latent codes, training models for inverse problems like denoising and inpainting, adversarial training and gradient based meta-learning.
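The fixed-point forward pass these entries rely on can be sketched in a few lines. The layer, weights `W`, and bias `b` below are hypothetical; practical deep equilibrium models use faster root-finders (e.g. Anderson acceleration) rather than plain Picard iteration:

```python
import numpy as np

def deq_forward(x, W, b, tol=1e-9, max_iter=500):
    """Deep equilibrium 'forward pass': the output is the fixed point
    z* of a single nonlinear layer, z = tanh(W @ z + x + b), found
    here by Picard iteration (illustrative weights, not a real model)."""
    z = np.zeros_like(x)
    for _ in range(max_iter):
        z_next = np.tanh(W @ z + x + b)
        if np.linalg.norm(z_next - z) < tol:
            return z_next
        z = z_next
    return z
```

When the layer is a contraction (e.g. spectral norm of `W` below 1 with `tanh`), the iteration converges however many steps it takes, which is what lets one layer behave like an "infinitely deep" network with the memory footprint of a single layer.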
arXiv Detail & Related papers (2021-11-25T19:59:33Z)
- DRO: Deep Recurrent Optimizer for Structure-from-Motion [46.34708595941016]
This paper presents a novel optimization method based on recurrent neural networks in structure-from-motion (SfM)
Our neural optimizer alternately updates the depth and camera poses through iterations to minimize a feature-metric cost.
Experiments demonstrate that our recurrent computation effectively reduces the feature-metric cost while refining the depth and poses.
arXiv Detail & Related papers (2021-03-24T13:59:40Z)
- Deep Equilibrium Architectures for Inverse Problems in Imaging [14.945209750917483]
Recent efforts on solving inverse problems in imaging via deep neural networks use architectures inspired by a fixed number of iterations of an optimization method.
This paper describes an alternative approach corresponding to an infinite number of iterations, yielding up to a 4 dB PSNR improvement in reconstruction accuracy.
arXiv Detail & Related papers (2021-02-16T03:49:58Z)
- Optimization-driven Machine Learning for Intelligent Reflecting Surfaces Assisted Wireless Networks [82.33619654835348]
Intelligent reflecting surface (IRS) has been employed to reshape wireless channels by controlling the phase shifts of individual scattering elements.
Due to the large number of scattering elements, passive beamforming is typically challenged by high computational complexity.
In this article, we focus on machine learning (ML) approaches for improving performance in IRS-assisted wireless networks.
arXiv Detail & Related papers (2020-08-29T08:39:43Z)
- Neural Network-based Reconstruction in Compressed Sensing MRI Without Fully-sampled Training Data [17.415937218905125]
CS-MRI has shown promise in reconstructing under-sampled MR images.
Deep learning models have been developed that model the iterative nature of classical techniques by unrolling iterations in a neural network.
In this paper, we explore a novel strategy to train an unrolled reconstruction network in an unsupervised fashion by adopting a loss function widely used in classical optimization schemes.
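The core of such a ground-truth-free scheme fits in a few lines: the training signal comes entirely from consistency with the measurements, so no fully-sampled reference image is needed. The operator `A`, reconstruction `recon`, and the plain least-squares penalty are illustrative assumptions, not the paper's exact loss:

```python
import numpy as np

def unsupervised_loss(recon, A, y):
    """Ground-truth-free training loss: penalize only the mismatch
    between the re-measured reconstruction A @ recon and the observed
    measurements y (illustrative least-squares data-consistency term)."""
    return float(np.sum((A @ recon - y) ** 2))
```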
arXiv Detail & Related papers (2020-07-29T17:46:55Z)
- Hyperspectral Unmixing Network Inspired by Unfolding an Optimization Problem [2.4016406737205753]
The hyperspectral image (HSI) unmixing task is essentially an inverse problem, which is commonly solved by optimization algorithms.
We propose two novel network architectures, named U-ADMM-AENet and U-ADMM-BUNet, for abundance estimation and blind unmixing.
We show that the unfolded structures can find corresponding interpretations in machine learning literature, which further demonstrates the effectiveness of proposed methods.
arXiv Detail & Related papers (2020-05-21T18:49:45Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences.