Neural Architecture Search as Sparse Supernet
- URL: http://arxiv.org/abs/2007.16112v2
- Date: Wed, 31 Mar 2021 16:35:16 GMT
- Title: Neural Architecture Search as Sparse Supernet
- Authors: Yan Wu, Aoming Liu, Zhiwu Huang, Siwei Zhang, Luc Van Gool
- Abstract summary: This paper aims at enlarging the problem of Neural Architecture Search (NAS) from Single-Path and Multi-Path Search to automated Mixed-Path Search.
We model the NAS problem as a sparse supernet using a new continuous architecture representation with a mixture of sparsity constraints.
The sparse supernet enables us to automatically achieve sparsely-mixed paths upon a compact set of nodes.
- Score: 78.09905626281046
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: This paper aims at enlarging the problem of Neural Architecture Search (NAS)
from Single-Path and Multi-Path Search to automated Mixed-Path Search. In
particular, we model the NAS problem as a sparse supernet using a new
continuous architecture representation with a mixture of sparsity constraints.
The sparse supernet enables us to automatically achieve sparsely-mixed paths
upon a compact set of nodes. To optimize the proposed sparse supernet, we
exploit a hierarchical accelerated proximal gradient algorithm within a
bi-level optimization framework. Extensive experiments on Convolutional Neural
Network and Recurrent Neural Network search demonstrate that the proposed
method is capable of searching for compact, general and powerful neural
architectures.
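As a minimal, illustrative sketch (not the authors' released code), the two ingredients the abstract describes can be pictured as: a supernet edge whose output is a weighted mixture of candidate operations, and a proximal (soft-thresholding) update that drives weak architecture weights to exactly zero, so each edge keeps a sparse mix of paths. The operation set, penalty strength, and the single element-wise L1 prox below are assumptions; the paper uses a hierarchical mixture of sparsity constraints inside a bi-level optimization.

```python
import torch
import torch.nn as nn


class SparseMixedEdge(nn.Module):
    """One supernet edge: a weighted mixture of candidate operations.

    The architecture weights `alpha` are free parameters rather than a
    softmax, so a proximal step can drive individual entries to exactly
    zero, leaving a sparsely-mixed path instead of a single-path choice.
    """

    def __init__(self, channels):
        super().__init__()
        # Placeholder candidate operations; the paper's actual operation
        # set and cell topology are not reproduced here.
        self.ops = nn.ModuleList([
            nn.Conv2d(channels, channels, 3, padding=1),
            nn.Conv2d(channels, channels, 5, padding=2),
            nn.AvgPool2d(3, stride=1, padding=1),
            nn.Identity(),
        ])
        self.alpha = nn.Parameter(torch.full((len(self.ops),), 0.5))

    def forward(self, x):
        return sum(a * op(x) for a, op in zip(self.alpha, self.ops))


def prox_l1_(alpha: torch.Tensor, step: float, lam: float) -> None:
    """In-place soft-thresholding: proximal operator of step * lam * ||alpha||_1.

    Applied after a gradient (or accelerated/momentum) step on the
    architecture parameters, it zeroes out weakly-weighted operations,
    which is what makes the searched paths sparse.
    """
    with torch.no_grad():
        alpha.copy_(alpha.sign() * (alpha.abs() - step * lam).clamp(min=0.0))
```

In a bi-level loop, one would alternate between (1) updating the network weights on the training split with `alpha` frozen, and (2) taking a gradient step on `alpha` against the validation loss followed by `prox_l1_(edge.alpha, lr_alpha, lam)`. The hierarchical constraints in the paper additionally couple sparsity across edges and nodes, which this element-wise sketch omits.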
Related papers
- Pruning-as-Search: Efficient Neural Architecture Search via Channel
Pruning and Structural Reparameterization [50.50023451369742]
Pruning-as-Search (PaS) is an end-to-end channel pruning method that searches out the desired sub-network automatically and efficiently.
Our proposed architecture outperforms prior art by around 1.0% top-1 accuracy on the ImageNet-1000 classification task.
arXiv Detail & Related papers (2022-06-02T17:58:54Z) - D-DARTS: Distributed Differentiable Architecture Search [75.12821786565318]
Differentiable ARchiTecture Search (DARTS) is one of the most trending Neural Architecture Search (NAS) methods.
We propose D-DARTS, a novel solution that addresses this problem by nesting several neural networks at cell-level.
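By way of contrast with the sparse mixture sketched above, a DARTS-style edge relaxes the discrete operation choice with a softmax over logits, so the mixture stays dense during search and a single operation is kept by argmax afterwards. The sketch below is the generic DARTS relaxation under placeholder operations, not D-DARTS's distributed, cell-level formulation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class DartsMixedEdge(nn.Module):
    """DARTS-style edge: softmax-weighted sum over candidate operations."""

    def __init__(self, channels):
        super().__init__()
        # Placeholder operation set for illustration.
        self.ops = nn.ModuleList([
            nn.Conv2d(channels, channels, 3, padding=1),
            nn.MaxPool2d(3, stride=1, padding=1),
            nn.Identity(),
        ])
        self.logits = nn.Parameter(torch.zeros(len(self.ops)))

    def forward(self, x):
        weights = F.softmax(self.logits, dim=0)  # always dense during search
        return sum(w * op(x) for w, op in zip(weights, self.ops))

    def discretize(self) -> nn.Module:
        """After search, keep only the highest-weighted operation (single path)."""
        return self.ops[int(self.logits.argmax())]
```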
arXiv Detail & Related papers (2021-08-20T09:07:01Z) - Neural Architecture Search for Image Super-Resolution Using Densely
Constructed Search Space: DeCoNAS [18.191710317555952]
We use neural architecture search (NAS) methods to find a lightweight densely connected network named DeCoNASNet.
We define a complexity-based penalty for solving image super-resolution, which can be considered a multi-objective problem.
Experiments show that our DeCoNASNet outperforms state-of-the-art lightweight super-resolution networks, both hand-crafted and designed by existing NAS methods.
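A complexity-based penalty of this kind can be pictured as a scalar objective that trades reconstruction quality against model size when scoring candidate architectures; the reward form and weighting constants below are assumptions for illustration, not DeCoNAS's exact formulation.

```python
def complexity_penalized_reward(psnr: float,
                                num_params: int,
                                num_flops: int,
                                lam_params: float = 1e-7,
                                lam_flops: float = 1e-10) -> float:
    """Score a candidate super-resolution architecture: quality minus complexity.

    The two lambda weights (placeholders) set where the search lands on the
    accuracy-versus-size trade-off, turning the multi-objective problem into
    a single scalar the search can maximise.
    """
    return psnr - lam_params * num_params - lam_flops * num_flops
```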
arXiv Detail & Related papers (2021-04-19T04:51:16Z) - Searching Efficient Model-guided Deep Network for Image Denoising [61.65776576769698]
We present MoD-NAS, a novel approach that connects model-guided design with NAS.
MoD-NAS employs a highly reusable width search strategy and a densely connected search block to automatically select the operations of each layer.
Experimental results on several popular datasets show that our MoD-NAS has achieved even better PSNR performance than current state-of-the-art methods.
arXiv Detail & Related papers (2021-04-06T14:03:01Z) - Enhanced Gradient for Differentiable Architecture Search [17.431144144044968]
We propose a neural network architecture search algorithm aiming to simultaneously improve network performance and reduce network complexity.
The proposed framework automatically builds the network architecture at two stages: block-level search and network-level search.
Experiment results demonstrate that our method outperforms all evaluated hand-crafted networks in image classification.
arXiv Detail & Related papers (2021-03-23T13:27:24Z) - Trilevel Neural Architecture Search for Efficient Single Image
Super-Resolution [127.92235484598811]
This paper proposes a trilevel neural architecture search (NAS) method for efficient single image super-resolution (SR).
To model the discrete search space, we apply a new continuous relaxation that builds a hierarchical mixture of network paths, cell operations, and kernel widths.
An efficient search algorithm is proposed to perform optimization in a hierarchical supernet manner.
arXiv Detail & Related papers (2021-01-17T12:19:49Z) - Optimal Transport Kernels for Sequential and Parallel Neural
Architecture Search [42.654535636271085]
Neural architecture search (NAS) automates the design of deep neural networks.
One of the main challenges is comparing the similarity of networks, which the conventional Euclidean metric may fail to capture.
We build upon tree-Wasserstein (TW), a negative definite variant of optimal transport (OT).
arXiv Detail & Related papers (2020-06-13T08:44:41Z) - DC-NAS: Divide-and-Conquer Neural Architecture Search [108.57785531758076]
We present a divide-and-conquer (DC) approach to effectively and efficiently search deep neural architectures.
We achieve a 75.1% top-1 accuracy on the ImageNet dataset, which is higher than that of state-of-the-art methods using the same search space.
arXiv Detail & Related papers (2020-05-29T09:02:16Z)
This list is automatically generated from the titles and abstracts of the papers on this site.