Neural Architecture Search via Combinatorial Multi-Armed Bandit
- URL: http://arxiv.org/abs/2101.00336v1
- Date: Fri, 1 Jan 2021 23:29:33 GMT
- Title: Neural Architecture Search via Combinatorial Multi-Armed Bandit
- Authors: Hanxun Huang, Xingjun Ma, Sarah M. Erfani, James Bailey
- Abstract summary: We formulate NAS as a Combinatorial Multi-Armed Bandit (CMAB) problem (CMAB-NAS).
This allows the decomposition of a large search space into smaller blocks where tree-search methods can be applied more effectively and efficiently.
We leverage a tree-based method called Nested Monte-Carlo Search to tackle the CMAB-NAS problem.
On CIFAR-10, our approach discovers a cell structure that achieves a low error rate that is comparable to the state-of-the-art, using only 0.58 GPU days.
- Score: 43.29214413461234
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Neural Architecture Search (NAS) has gained significant popularity as an
effective tool for designing high performance deep neural networks (DNNs). NAS
can be performed via policy gradient, evolutionary algorithms, differentiable
architecture search or tree-search methods. While significant progress has been
made for both policy gradient and differentiable architecture search,
tree-search methods have so far failed to achieve comparable accuracy or search
efficiency. In this paper, we formulate NAS as a Combinatorial Multi-Armed
Bandit (CMAB) problem (CMAB-NAS). This allows the decomposition of a large
search space into smaller blocks where tree-search methods can be applied more
effectively and efficiently. We further leverage a tree-based method called
Nested Monte-Carlo Search to tackle the CMAB-NAS problem. On CIFAR-10, our
approach discovers a cell structure that achieves a low error rate that is
comparable to the state-of-the-art, using only 0.58 GPU days, which is 20 times
faster than current tree-search methods. Moreover, the discovered structure
transfers well to large-scale datasets such as ImageNet.
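To make the CMAB formulation concrete, here is a minimal sketch of decomposing a cell search space into per-node decisions (local arms) and searching it with Nested Monte-Carlo Search. The operation list, cell size, and toy reward function are illustrative assumptions rather than the authors' code; in CMAB-NAS the reward corresponds to briefly training the candidate cell and measuring validation accuracy, and each node also selects its input edges.

```python
import random

# Illustrative operation set and cell size; the real CMAB-NAS cell also
# chooses input edges for each node, which is omitted here for brevity.
OPS = ["sep_conv_3x3", "sep_conv_5x5", "max_pool_3x3", "skip_connect", "none"]
NUM_NODES = 4

def reward(arch):
    # Stand-in for the expensive step (training the candidate cell briefly and
    # returning validation accuracy); a toy score keeps the sketch runnable.
    return sum(op == "sep_conv_3x3" for op in arch) + 0.1 * random.random()

def random_rollout(prefix):
    # Level-0 policy: complete the partial architecture uniformly at random.
    return prefix + [random.choice(OPS) for _ in range(NUM_NODES - len(prefix))]

def nested_mcs(prefix, level):
    """Nested Monte-Carlo Search over the decomposed space: each cell node
    is a local bandit whose arms are the candidate operations."""
    if level == 0 or len(prefix) == NUM_NODES:
        arch = random_rollout(prefix)  # no-op if the prefix is already complete
        return arch, reward(arch)
    best_arch, best_score = None, float("-inf")
    while len(prefix) < NUM_NODES:
        for op in OPS:  # evaluate every local arm with a lower-level search
            arch, score = nested_mcs(prefix + [op], level - 1)
            if score > best_score:
                best_arch, best_score = arch, score
        prefix = best_arch[: len(prefix) + 1]  # commit to the best decision so far
    return best_arch, best_score

best_cell, score = nested_mcs([], level=2)
print("sampled cell:", best_cell)
```

The key point of the decomposition is visible in the inner loop: each node's operation choice is treated as a small bandit problem, so the tree search never has to enumerate the full combinatorial space at once.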
Related papers
- When NAS Meets Trees: An Efficient Algorithm for Neural Architecture Search [117.89827740405694]
A key challenge in neural architecture search (NAS) is how to explore the huge search space wisely.
We propose a new NAS method called TNAS (NAS with trees), which improves search efficiency by exploring only a small number of architectures.
On NAS-Bench-201, TNAS finds the globally optimal architecture for CIFAR-10, with a test accuracy of 94.37%, in four GPU hours.
arXiv Detail & Related papers (2022-04-11T07:34:21Z)
- Prioritized Architecture Sampling with Monto-Carlo Tree Search [54.72096546595955]
One-shot neural architecture search (NAS) methods significantly reduce the search cost by considering the whole search space as one network.
In this paper, we introduce a sampling strategy based on Monte Carlo tree search (MCTS), with the search space modeled as a Monte Carlo tree (MCT); a generic UCT-style sketch of this idea appears after this list.
For a fair comparison, we construct an open-source NAS benchmark of a macro search space evaluated on CIFAR-10, namely NAS-Bench-Macro.
arXiv Detail & Related papers (2021-03-22T15:09:29Z)
- ISTA-NAS: Efficient and Consistent Neural Architecture Search by Sparse Coding [86.40042104698792]
We formulate neural architecture search as a sparse coding problem.
In experiments, our two-stage method on CIFAR-10 requires only 0.05 GPU-day for search.
Our one-stage method produces state-of-the-art performance on both CIFAR-10 and ImageNet at the cost of only evaluation time.
arXiv Detail & Related papers (2020-10-13T04:34:24Z)
- Breaking the Curse of Space Explosion: Towards Efficient NAS with Curriculum Search [94.46818035655943]
We propose a curriculum search method that starts from a small search space and gradually incorporates the learned knowledge to guide the search in a large space.
With the proposed search strategy, our Curriculum Neural Architecture Search (CNAS) method significantly improves the search efficiency and finds better architectures than existing NAS methods.
arXiv Detail & Related papers (2020-07-07T02:29:06Z)
- DA-NAS: Data Adapted Pruning for Efficient Neural Architecture Search [76.9225014200746]
Efficient search is a core issue in Neural Architecture Search (NAS).
We present DA-NAS that can directly search the architecture for large-scale target tasks while allowing a large candidate set in a more efficient manner.
It is 2x faster than previous methods, while its accuracy of 76.2% under a small FLOPs constraint is currently state-of-the-art.
arXiv Detail & Related papers (2020-03-27T17:55:21Z)
- NAS evaluation is frustratingly hard [1.7188280334580197]
Neural Architecture Search (NAS) is an exciting new field which promises to be as much of a game-changer as Convolutional Neural Networks were in 2012.
Comparison between different methods is still very much an open issue.
Our first contribution is a benchmark of 8 NAS methods on 5 datasets.
arXiv Detail & Related papers (2019-12-28T21:24:12Z)
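The MCTS-based sampling strategy summarized under "Prioritized Architecture Sampling with Monto-Carlo Tree Search" above can be pictured with a generic UCT loop over a layer-by-layer macro space. The sketch below is a minimal illustration under assumed block names, layer count, and a toy reward stand-in (a one-shot supernet would supply real accuracy estimates in practice); it is not that paper's implementation.

```python
import math
import random

# Toy macro search space: each of NUM_LAYERS layers picks one candidate block.
BLOCKS = ["mbconv_k3", "mbconv_k5", "mbconv_k7", "identity"]
NUM_LAYERS = 3
C_UCT = 1.4  # exploration constant

class Node:
    def __init__(self):
        self.children = {}   # block name -> Node
        self.visits = 0
        self.value = 0.0     # running mean reward

def evaluate(path):
    # Stand-in for estimating the accuracy of the sampled architecture;
    # a toy score keeps the sketch runnable end to end.
    return sum(b == "mbconv_k5" for b in path) / NUM_LAYERS + 0.05 * random.random()

def select_child(node):
    # UCT: balance a child's mean reward against how rarely it has been tried.
    def uct(child):
        if child.visits == 0:
            return float("inf")
        return child.value + C_UCT * math.sqrt(math.log(node.visits) / child.visits)
    return max(node.children.items(), key=lambda kv: uct(kv[1]))

def simulate(root):
    """One MCTS iteration: walk down the tree with UCT, evaluate the
    sampled architecture, and back up the reward along the path."""
    node, path, visited = root, [], [root]
    for _ in range(NUM_LAYERS):
        for b in BLOCKS:
            node.children.setdefault(b, Node())
        block, node = select_child(node)
        path.append(block)
        visited.append(node)
    r = evaluate(path)
    for n in visited:
        n.visits += 1
        n.value += (r - n.value) / n.visits
    return path, r

root = Node()
best = max((simulate(root) for _ in range(200)), key=lambda pr: pr[1])
print("best sampled architecture:", best[0])
```

Prioritizing high-reward branches in this way is what lets tree-based sampling spend most of its evaluation budget on promising regions of the search space rather than sampling uniformly.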