Async-RED: A Provably Convergent Asynchronous Block Parallel Stochastic
Method using Deep Denoising Priors
- URL: http://arxiv.org/abs/2010.01446v1
- Date: Sat, 3 Oct 2020 23:55:36 GMT
- Authors: Yu Sun, Jiaming Liu, Yiran Sun, Brendt Wohlberg, Ulugbek S. Kamilov
- Abstract summary: Regularization by denoising (RED) is a recently developed framework for solving inverse problems by integrating advanced denoisers as image priors.
We propose a new asynchronous RED (ASYNC-RED) algorithm that enables asynchronous parallel processing of data, making it significantly faster than its serial counterparts for large-scale inverse problems.
- Score: 31.773305606551197
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Regularization by denoising (RED) is a recently developed framework for
solving inverse problems by integrating advanced denoisers as image priors.
Recent work has shown its state-of-the-art performance when combined with
pre-trained deep denoisers. However, current RED algorithms are inadequate for
parallel processing on multicore systems. We address this issue by proposing a
new asynchronous RED (ASYNC-RED) algorithm that enables asynchronous parallel
processing of data, making it significantly faster than its serial counterparts
for large-scale inverse problems. The computational complexity of ASYNC-RED is
further reduced by using a random subset of measurements at every iteration. We
present complete theoretical analysis of the algorithm by establishing its
convergence under explicit assumptions on the data-fidelity and the denoiser.
We validate ASYNC-RED on image recovery using pre-trained deep denoisers as
priors.
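The update described in the abstract can be illustrated with a minimal sketch: multiple threads apply lock-free stochastic RED updates to a shared iterate, each step using a random subset of measurements plus the RED gradient x - D(x). Everything here is illustrative and assumed, not the authors' implementation: the problem is a toy linear system, the "denoiser" is a simple moving average standing in for a pre-trained deep denoiser, and the step size and regularization weight are picked arbitrarily.

```python
import numpy as np
from threading import Thread

# Toy linear inverse problem y = A @ x_true, standing in for an imaging task.
rng = np.random.default_rng(0)
n, m = 32, 64
x_true = np.sin(np.linspace(0, 3 * np.pi, n))
A = rng.standard_normal((m, n)) / np.sqrt(m)
y = A @ x_true

def denoise(x):
    # Stand-in for a pre-trained deep denoiser: 3-tap moving average.
    return np.convolve(x, np.ones(3) / 3, mode="same")

x = np.zeros(n)        # shared iterate, updated without locks
gamma, tau = 0.1, 0.1  # step size and RED regularization weight (illustrative)

def worker(seed, num_iters=500, batch=8):
    local_rng = np.random.default_rng(seed)
    for _ in range(num_iters):
        # Random subset of measurements, as in the stochastic variant.
        idx = local_rng.choice(m, size=batch, replace=False)
        grad_f = (m / batch) * A[idx].T @ (A[idx] @ x - y[idx])
        grad_red = tau * (x - denoise(x))    # RED gradient: x minus its denoised copy
        x[:] -= gamma * (grad_f + grad_red)  # asynchronous in-place update

threads = [Thread(target=worker, args=(s,)) for s in (1, 2)]
for t in threads:
    t.start()
for t in threads:
    t.join()

print(np.linalg.norm(x - x_true) / np.linalg.norm(x_true))  # relative error
```

Each worker reads the shared iterate without synchronization, so the gradients it computes may be slightly stale; the sketch simply illustrates that the iterate still approaches the fixed point under such asynchrony.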
Related papers
- Asynchronous Stochastic Gradient Descent with Decoupled Backpropagation and Layer-Wise Updates [1.9241821314180372]
One major shortcoming of backpropagation is the interlocking between the forward and backward phases of the algorithm.
We propose a method that parallelises SGD updates across the layers of a model by asynchronously updating them from multiple threads.
We show that this approach yields close to state-of-the-art results while running up to 2.97x faster than Hogwild! when scaled across multiple devices.
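The idea of updating different layers asynchronously from separate threads can be illustrated with a deliberately tiny stand-in (not the authors' method): a two-parameter "two-layer" model where each thread owns one layer's parameter and updates it using the other layer's possibly stale shared value.

```python
from threading import Thread

# Toy "two-layer" model f(x) = w[1] * w[0] * x, trained so that w[0] * w[1] == 2.
w = [1.0, 0.5]  # shared parameters, one per layer
lr = 0.01

def update_layer(i, steps=2000):
    # Each thread updates only its own layer, reading the other
    # layer's (possibly stale) value from shared state.
    for _ in range(steps):
        residual = w[0] * w[1] - 2.0            # loss = residual ** 2
        w[i] -= lr * 2.0 * residual * w[1 - i]  # gradient w.r.t. this layer only

threads = [Thread(target=update_layer, args=(i,)) for i in (0, 1)]
for t in threads:
    t.start()
for t in threads:
    t.join()

print(w[0] * w[1])  # close to 2.0
```

Because each thread writes only its own parameter, there are no write conflicts; the asynchrony shows up only as staleness in the values each thread reads, which the small step size tolerates.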
arXiv Detail & Related papers (2024-10-08T12:32:36Z) - Plug-and-Play image restoration with Stochastic deNOising REgularization [8.678250057211368]
We propose a new framework called Stochastic deNOising REgularization (SNORE).
SNORE applies the denoiser only to images with noise of the adequate level.
It is based on an explicit regularization, which leads to a convergent descent algorithm for solving inverse problems.
arXiv Detail & Related papers (2024-02-01T18:05:47Z) - Towards Understanding the Generalizability of Delayed Stochastic
Gradient Descent [63.43247232708004]
Stochastic gradient descent performed in an asynchronous manner plays a crucial role in training large-scale machine learning models.
Existing generalization error bounds are rather pessimistic and cannot reveal the correlation between asynchronous delays and generalization.
Our theoretical results indicate that asynchronous delays reduce the generalization error of the delayed SGD algorithm.
arXiv Detail & Related papers (2023-08-18T10:00:27Z) - Efficient Dataset Distillation Using Random Feature Approximation [109.07737733329019]
We propose a novel algorithm that uses a random feature approximation (RFA) of the Neural Network Gaussian Process (NNGP) kernel.
Our algorithm provides at least a 100-fold speedup over KIP and can run on a single GPU.
Our new method, termed RFA Distillation (RFAD), performs competitively with KIP and other dataset condensation algorithms in accuracy across a range of large-scale datasets.
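Random feature approximation replaces an expensive exact kernel with an inner product of explicit feature maps. The paper applies this to the NNGP kernel; as a simpler stand-in, the sketch below approximates a Gaussian (RBF) kernel with standard random Fourier features. The bandwidth and feature count are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)
d, D = 5, 8192   # input dimension, number of random features
sigma = 1.0      # RBF bandwidth

# Random Fourier features: k(x, z) = exp(-||x - z||^2 / (2 sigma^2)) ~ phi(x) @ phi(z)
W = rng.standard_normal((D, d)) / sigma
b = rng.uniform(0.0, 2.0 * np.pi, size=D)

def phi(x):
    return np.sqrt(2.0 / D) * np.cos(W @ x + b)

x, z = rng.standard_normal(d), rng.standard_normal(d)
exact = np.exp(-np.sum((x - z) ** 2) / (2.0 * sigma ** 2))
approx = phi(x) @ phi(z)
print(exact, approx)  # the two values should closely agree
```

The approximation error shrinks like 1/sqrt(D), so the feature count trades accuracy against the speedup of working with explicit finite-dimensional features.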
arXiv Detail & Related papers (2022-10-21T15:56:13Z) - Monotonically Convergent Regularization by Denoising [19.631197002314305]
Regularization by denoising (RED) is a widely-used framework for solving inverse problems by leveraging image denoisers as image priors.
Recent work has reported the state-of-the-art performance of RED in a number of imaging applications using pre-trained deep neural nets as denoisers.
This work develops a new monotone RED (MRED) algorithm, whose convergence does not require nonexpansiveness of the deep denoising prior.
arXiv Detail & Related papers (2022-02-10T11:32:41Z) - An Interpretation of Regularization by Denoising and its Application
with the Back-Projected Fidelity Term [55.34375605313277]
We show that the RED gradient can be seen as a (sub)gradient of a prior function, but taken at a denoised version of the point.
We propose to combine RED with the Back-Projection (BP) fidelity term rather than the common Least Squares (LS) term that is used in previous works.
arXiv Detail & Related papers (2021-01-27T18:45:35Z) - A Deep-Unfolded Reference-Based RPCA Network For Video
Foreground-Background Separation [86.35434065681925]
This paper proposes a new deep-unfolding-based network design for the problem of Robust Principal Component Analysis (RPCA).
Unlike existing designs, our approach focuses on modeling the temporal correlation between the sparse representations of consecutive video frames.
Experimentation using the moving MNIST dataset shows that the proposed network outperforms a recently proposed state-of-the-art RPCA network in the task of video foreground-background separation.
arXiv Detail & Related papers (2020-10-02T11:40:09Z) - PSO-PS: Parameter Synchronization with Particle Swarm Optimization for
Distributed Training of Deep Neural Networks [16.35607080388805]
We propose a new algorithm that integrates Particle Swarm Optimization (PSO) into the distributed training process of Deep Neural Networks (DNNs).
In the proposed algorithm, each computing worker is encoded as a particle, and the weights of the DNN and the training loss are modeled as particle attributes.
At each synchronization stage, the weights are updated by PSO from the sub-weights gathered from all workers, instead of by averaging the weights or the gradients.
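The PSO step used at synchronization can be sketched with a minimal global-best PSO on a toy quadratic "loss", with each particle standing in for one worker's weight vector. The target weights, inertia, and acceleration constants below are all illustrative assumptions, not the paper's settings.

```python
import numpy as np

rng = np.random.default_rng(0)
target = np.array([1.0, -2.0, 0.5])  # pretend "optimal weights"

def loss(w):
    return float(np.sum((w - target) ** 2))  # stand-in for training loss

P, dim, iters = 20, 3, 200           # particles (workers), weight dim, PSO steps
pos = rng.standard_normal((P, dim))  # each particle encodes one worker's weights
vel = np.zeros((P, dim))
pbest = pos.copy()                   # per-particle best positions
pbest_val = np.array([loss(p) for p in pos])
gbest = pbest[np.argmin(pbest_val)].copy()

for _ in range(iters):
    r1, r2 = rng.random((P, dim)), rng.random((P, dim))
    # Standard velocity update: inertia + cognitive pull + social pull.
    vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
    pos = pos + vel
    vals = np.array([loss(p) for p in pos])
    improved = vals < pbest_val
    pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
    gbest = pbest[np.argmin(pbest_val)].copy()

print(loss(gbest))  # near zero
```

In the paper's setting a step like this replaces plain weight or gradient averaging at each synchronization point, with the swarm's best position serving as the aggregated model.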
arXiv Detail & Related papers (2020-09-06T05:18:32Z) - Regularization by Denoising via Fixed-Point Projection (RED-PRO) [34.89374374708481]
Regularization by Denoising (RED) and Plug-and-Play Prior (PnP) are widely used in image processing.
While both have shown state-of-the-art results in various recovery tasks, their theoretical justification is incomplete.
arXiv Detail & Related papers (2020-08-01T09:35:22Z) - Communication-Efficient Distributed Stochastic AUC Maximization with
Deep Neural Networks [50.42141893913188]
We study distributed stochastic AUC maximization for large-scale problems in which the predictive model is a deep neural network.
Our method requires far fewer communication rounds while retaining theoretical convergence guarantees.
Our experiments on several datasets show the effectiveness of our method and also corroborate our theory.
arXiv Detail & Related papers (2020-05-05T18:08:23Z) - Lagrangian Decomposition for Neural Network Verification [148.0448557991349]
A fundamental component of neural network verification is the computation of bounds on the values their outputs can take.
We propose a novel approach based on Lagrangian Decomposition.
We show that we obtain bounds comparable with off-the-shelf solvers in a fraction of their running time.
arXiv Detail & Related papers (2020-02-24T17:55:10Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the listed content (including all information) and is not responsible for any consequences of its use.