Single Cell Training on Architecture Search for Image Denoising
- URL: http://arxiv.org/abs/2212.06368v1
- Date: Tue, 13 Dec 2022 04:47:24 GMT
- Title: Single Cell Training on Architecture Search for Image Denoising
- Authors: Bokyeung Lee, Kyungdeuk Ko, Jonghwan Hong and Hanseok Ko
- Abstract summary: We re-frame the optimal search problem by focusing at the component block level.
In addition, we integrate innovative dimension matching modules to deal with spatial and channel-wise mismatches.
Our proposed Denoising Prior Neural Architecture Search (DPNAS) completed an optimal architecture search for an image restoration task in just one day on a single GPU.
- Score: 16.72206392993489
- License: http://creativecommons.org/licenses/by-nc-sa/4.0/
- Abstract: Neural Architecture Search (NAS) for automatically finding the optimal
network architecture has shown some success with competitive performances in
various computer vision tasks. However, NAS in general requires a tremendous
amount of computation, so reducing computational cost has emerged as an
important issue. Most attempts so far have been based on manual approaches,
and the architectures developed from such efforts often trade off network
optimality against search cost. Additionally, recent
NAS methods for image restoration generally do not consider dynamic operations
that may transform dimensions of feature maps because of the dimensionality
mismatch in tensor calculations. This can greatly limit NAS in its search for
optimal network structure. To address these issues, we re-frame the optimal
search problem by focusing at the component block level. Previous work has
shown that an effective denoising block can be connected in series to further
improve network performance. By focusing at the block level, the search space
for reinforcement learning becomes significantly smaller and the evaluation
process can be conducted more rapidly. In addition, we integrate innovative
dimension matching modules to deal with spatial and channel-wise mismatches
that may occur during the optimal design search. These modules allow much
greater flexibility in the optimal network search within the cell block. With
these modules, we then employ reinforcement learning to search for an optimal
image denoising network at the module level. The computational efficiency of
our proposed Denoising Prior Neural Architecture Search (DPNAS) was
demonstrated by having it complete an optimal architecture search for an image
restoration task in just one day on a single GPU.
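The abstract does not specify how the dimension matching modules work, but the general idea of reconciling spatial and channel-wise mismatches between candidate blocks can be sketched as below. This is a hypothetical NumPy illustration under assumed design choices (a 1x1-conv-style channel projection plus nearest-neighbour spatial resampling); `match_dims` and its parameters are invented for illustration and are not the authors' implementation.

```python
import numpy as np

def match_dims(x, target_c, target_hw, rng=None):
    """Hypothetical dimension-matching module (an illustrative sketch, not the
    authors' code): reconcile a feature map of shape (C, H, W) to
    (target_c, target_hw, target_hw) so that blocks with mismatched outputs
    can still be wired together during architecture search."""
    rng = rng or np.random.default_rng(0)
    c, h, w = x.shape
    # Channel-wise match: a 1x1 convolution is just a (target_c, c) linear
    # projection applied independently at every spatial position.
    if c != target_c:
        proj = rng.standard_normal((target_c, c)) / np.sqrt(c)
        x = np.einsum('oc,chw->ohw', proj, x)
    # Spatial match: nearest-neighbour resampling via index lookup.
    if (h, w) != (target_hw, target_hw):
        rows = np.arange(target_hw) * h // target_hw
        cols = np.arange(target_hw) * w // target_hw
        x = x[:, rows][:, :, cols]
    return x

y = match_dims(np.ones((16, 32, 32)), target_c=8, target_hw=16)
print(y.shape)  # (8, 16, 16)
```

In a block-level search, a module like this would sit between any two candidate blocks whose output and input tensor shapes disagree, so the search is free to combine blocks regardless of their native dimensions.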
Related papers
- DNA Family: Boosting Weight-Sharing NAS with Block-Wise Supervisions [121.05720140641189]
We develop a family of models with the distilling neural architecture (DNA) techniques.
Our proposed DNA models can rate all architecture candidates, as opposed to previous works that can only access a sub-search space using algorithms.
Our models achieve state-of-the-art top-1 accuracy of 78.9% and 83.6% on ImageNet for a mobile convolutional network and a small vision transformer, respectively.
arXiv Detail & Related papers (2024-03-02T22:16:47Z) - Pruning-as-Search: Efficient Neural Architecture Search via Channel
Pruning and Structural Reparameterization [50.50023451369742]
Pruning-as-Search (PaS) is an end-to-end channel pruning method to search out desired sub-network automatically and efficiently.
Our proposed architecture outperforms prior art by around 1.0% top-1 accuracy on the ImageNet-1000 classification task.
arXiv Detail & Related papers (2022-06-02T17:58:54Z) - ISNAS-DIP: Image-Specific Neural Architecture Search for Deep Image
Prior [6.098254376499899]
We show that optimal neural architectures in the DIP framework are image-dependent.
We propose an image-specific NAS strategy for the DIP framework that requires substantially less training than typical NAS approaches.
Our experiments show that image-specific metrics can reduce the search space to a small cohort of models, of which the best model outperforms current NAS approaches for image restoration.
arXiv Detail & Related papers (2021-11-27T13:53:25Z) - Combined Depth Space based Architecture Search For Person
Re-identification [70.86236888223569]
We aim to design a lightweight network suitable for person re-identification (ReID).
We propose a novel search space called Combined Depth Space (CDS), based on which we search for an efficient network architecture, which we call CDNet.
We then propose a low-cost search strategy, named the Top-k Sample Search strategy, to make full use of the search space and avoid being trapped in a local optimum.
arXiv Detail & Related papers (2021-04-09T02:40:01Z) - Searching Efficient Model-guided Deep Network for Image Denoising [61.65776576769698]
We present a novel approach by connecting model-guided design with NAS (MoD-NAS).
MoD-NAS employs a highly reusable width search strategy and a densely connected search block to automatically select the operations of each layer.
Experimental results on several popular datasets show that our MoD-NAS has achieved even better PSNR performance than current state-of-the-art methods.
arXiv Detail & Related papers (2021-04-06T14:03:01Z) - Hierarchical Neural Architecture Search for Deep Stereo Matching [131.94481111956853]
We propose the first end-to-end hierarchical NAS framework for deep stereo matching.
Our framework incorporates task-specific human knowledge into the neural architecture search framework.
It ranks first in accuracy on the KITTI Stereo 2012 and 2015 and the Middlebury benchmarks, as well as first on the SceneFlow dataset.
arXiv Detail & Related papers (2020-10-26T11:57:37Z) - NAS-DIP: Learning Deep Image Prior with Neural Architecture Search [65.79109790446257]
Recent work has shown that the structure of deep convolutional neural networks can be used as a structured image prior.
We propose to search for neural architectures that capture stronger image priors.
We search for an improved network by leveraging an existing neural architecture search algorithm.
arXiv Detail & Related papers (2020-08-26T17:59:36Z) - Superkernel Neural Architecture Search for Image Denoising [4.633161758939184]
We focus on exploring NAS for a dense prediction task that is image denoising.
Due to a costly training procedure, most NAS solutions for image enhancement rely on reinforcement learning or evolutionary algorithm exploration.
We introduce a new efficient implementation of various superkernel techniques that enables fast single-shot training of models for dense prediction.
We demonstrate the effectiveness of our method on the SIDD+ benchmark for image denoising.
arXiv Detail & Related papers (2020-04-19T14:52:22Z) - DCNAS: Densely Connected Neural Architecture Search for Semantic Image
Segmentation [44.46852065566759]
We propose a Densely Connected NAS (DCNAS) framework, which directly searches the optimal network structures for the multi-scale representations of visual information.
Specifically, by connecting cells with each other using learnable weights, we introduce a densely connected search space to cover an abundance of mainstream network designs.
We demonstrate that the architecture obtained from our DCNAS algorithm achieves state-of-the-art performances on public semantic image segmentation benchmarks.
arXiv Detail & Related papers (2020-03-26T13:21:33Z)
This list is automatically generated from the titles and abstracts of the papers in this site.