Superkernel Neural Architecture Search for Image Denoising
- URL: http://arxiv.org/abs/2004.08870v1
- Date: Sun, 19 Apr 2020 14:52:22 GMT
- Title: Superkernel Neural Architecture Search for Image Denoising
- Authors: Marcin Możejko, Tomasz Latkowski, Łukasz Treszczotko, Michał
Szafraniuk, Krzysztof Trojanowski
- Abstract summary: We focus on exploring NAS for the dense prediction task of image denoising.
Due to a costly training procedure, most NAS solutions for image enhancement rely on reinforcement learning or evolutionary algorithm exploration.
We introduce a new efficient implementation of various superkernel techniques that enable fast single-shot training of models for dense predictions.
We demonstrate the effectiveness of our method on the SIDD+ benchmark for image denoising.
- Score: 4.633161758939184
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Recent advancements in Neural Architecture Search (NAS) have
resulted in new state-of-the-art Artificial Neural Network (ANN) solutions for
tasks like image classification, object detection, and semantic segmentation
without substantial human supervision. In this paper, we focus on exploring NAS
for the dense prediction task of image denoising. Due to a costly training
procedure, most NAS solutions for image enhancement rely on reinforcement
learning or evolutionary algorithm exploration, which usually takes weeks (or
even months) to train. Therefore, we introduce a new efficient implementation
of various superkernel techniques that enable fast (6-8 RTX2080 GPU hours)
single-shot training of models for dense predictions. We demonstrate the
effectiveness of our method on the SIDD+ benchmark for image denoising.
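The superkernel idea behind this single-shot training can be sketched as follows. This is a hedged illustration of the general weight-sharing technique, not the paper's implementation: one "super" kernel stores the largest candidate convolution kernel, and smaller candidates (e.g. a 3x3 inside a 5x5) are center slices of it, so all kernel-size choices share weights and are trained jointly in a single pass. All names and shapes below are illustrative.

```python
import numpy as np

def slice_subkernel(super_kernel: np.ndarray, k: int) -> np.ndarray:
    """Return the centered k x k sub-kernel of a (C_out, C_in, K, K) superkernel.

    Every sub-kernel is a view into the same shared weight tensor, so
    training any candidate updates the superkernel's weights directly.
    """
    K = super_kernel.shape[-1]
    assert k <= K and (K - k) % 2 == 0, "sub-kernel must fit symmetrically"
    off = (K - k) // 2
    return super_kernel[..., off:off + k, off:off + k]

# Hypothetical shapes: 8 output channels, 4 input channels, 5x5 superkernel.
superkernel = np.random.randn(8, 4, 5, 5)
w3 = slice_subkernel(superkernel, 3)  # shared-weight 3x3 candidate
w5 = slice_subkernel(superkernel, 5)  # the full 5x5 kernel itself
print(w3.shape, w5.shape)             # (8, 4, 3, 3) (8, 4, 5, 5)
```

Because the candidates share one weight tensor, a single training run ("single shot") covers every kernel-size choice; architecture selection afterwards reduces to picking which slice to keep.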
Related papers
- Masked Image Training for Generalizable Deep Image Denoising [53.03126421917465]
We present a novel approach to enhance the generalization performance of denoising networks.
Our method involves masking random pixels of the input image and reconstructing the missing information during training.
Our approach exhibits better generalization ability than other deep learning models and is directly applicable to real-world scenarios.
arXiv Detail & Related papers (2023-03-23T09:33:44Z)
- Single Cell Training on Architecture Search for Image Denoising [16.72206392993489]
We re-frame the optimal search problem by focusing at the component block level.
In addition, we integrate innovative dimension-matching modules to deal with spatial and channel-wise mismatches.
Our proposed Denoising Prior Neural Architecture Search (DPNAS) completed an optimal architecture search for an image restoration task in just one day on a single GPU.
arXiv Detail & Related papers (2022-12-13T04:47:24Z)
- Enhancing convolutional neural network generalizability via low-rank weight approximation [6.763245393373041]
Sufficient denoising is often an important first step for image processing.
Deep neural networks (DNNs) have been widely used for image denoising.
We introduce a new self-supervised framework for image denoising based on the Tucker low-rank tensor approximation.
arXiv Detail & Related papers (2022-09-26T14:11:05Z)
- Practical Blind Image Denoising via Swin-Conv-UNet and Data Synthesis [148.16279746287452]
We propose a swin-conv block that incorporates the local modeling ability of the residual convolutional layer and the non-local modeling ability of the Swin Transformer block.
For training data synthesis, we design a practical noise degradation model that takes different kinds of noise into consideration.
Experiments on AWGN removal and real image denoising demonstrate that the new network architecture design achieves state-of-the-art performance.
arXiv Detail & Related papers (2022-03-24T18:11:31Z) - Self-Denoising Neural Networks for Few Shot Learning [66.38505903102373]
We present a new training scheme that adds noise at multiple stages of an existing neural architecture while simultaneously learning to be robust to this added noise.
This architecture, which we call a Self-Denoising Neural Network (SDNN), can be applied easily to most modern convolutional neural architectures.
arXiv Detail & Related papers (2021-10-26T03:28:36Z) - Searching Efficient Model-guided Deep Network for Image Denoising [61.65776576769698]
We present a novel approach by connecting model-guided design with NAS (MoD-NAS).
MoD-NAS employs a highly reusable width search strategy and a densely connected search block to automatically select the operations of each layer.
Experimental results on several popular datasets show that our MoD-NAS has achieved even better PSNR performance than current state-of-the-art methods.
arXiv Detail & Related papers (2021-04-06T14:03:01Z) - Exploring ensembles and uncertainty minimization in denoising networks [0.522145960878624]
We propose a fusion model consisting of two attention modules, which focus on assigning the proper weights to pixels and channels.
The experimental results show that our model achieves better performance than the baseline of regular pre-trained denoising networks.
arXiv Detail & Related papers (2021-01-24T20:48:18Z)
- NAS-DIP: Learning Deep Image Prior with Neural Architecture Search [65.79109790446257]
Recent work has shown that the structure of deep convolutional neural networks can be used as a structured image prior.
We propose to search for neural architectures that capture stronger image priors.
We search for an improved network by leveraging an existing neural architecture search algorithm.
arXiv Detail & Related papers (2020-08-26T17:59:36Z)
- Deep Learning on Image Denoising: An overview [92.07378559622889]
We offer a comparative study of deep techniques in image denoising.
We first classify the deep convolutional neural networks (CNNs) for images with additive white noise.
Next, we compare the state-of-the-art methods on public denoising datasets in terms of quantitative and qualitative analysis.
arXiv Detail & Related papers (2019-12-31T05:03:57Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences of its use.