When NAS Meets Trees: An Efficient Algorithm for Neural Architecture
Search
- URL: http://arxiv.org/abs/2204.04918v1
- Date: Mon, 11 Apr 2022 07:34:21 GMT
- Title: When NAS Meets Trees: An Efficient Algorithm for Neural Architecture
Search
- Authors: Guocheng Qian, Xuanyang Zhang, Guohao Li, Chen Zhao, Yukang Chen,
Xiangyu Zhang, Bernard Ghanem, Jian Sun
- Abstract summary: The key challenge in neural architecture search (NAS) is how to explore the huge search space wisely.
We propose a new NAS method called TNAS (NAS with trees), which improves search efficiency by exploring only a small number of architectures.
TNAS finds the globally optimal architecture on CIFAR-10 in NAS-Bench-201, reaching 94.37% test accuracy in four GPU hours.
- Score: 117.89827740405694
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The key challenge in neural architecture search (NAS) is designing how to
explore wisely in the huge search space. We propose a new NAS method called
TNAS (NAS with trees), which improves search efficiency by exploring only a
small number of architectures while also achieving a higher search accuracy.
TNAS introduces an architecture tree and a binary operation tree, to factorize
the search space and substantially reduce the exploration size. TNAS performs a
modified bi-level Breadth-First Search in the proposed trees to discover a
high-performance architecture. Impressively, TNAS finds the globally optimal
architecture on CIFAR-10, with a test accuracy of 94.37\%, in four GPU hours on
NAS-Bench-201. The average test accuracy is 94.35\%, which outperforms the
state-of-the-art. Code is available at:
\url{https://github.com/guochengqian/TNAS}.
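For intuition, here is a minimal Python sketch of a greedy coarse-to-fine refinement over a binary operation tree for a NAS-Bench-201-style cell (6 edges, 5 candidate operations). The operation grouping, the refinement order, and the evaluate placeholder are illustrative assumptions; the paper's actual bi-level Breadth-First Search over the architecture and operation trees is in the linked repository.

OPS = ['none', 'skip_connect', 'nor_conv_1x1', 'nor_conv_3x3', 'avg_pool_3x3']
NUM_EDGES = 6  # edges in a NAS-Bench-201 cell

# Hypothetical binary operation tree: each group of operations splits into two children.
OP_TREE = {
    tuple(OPS): [('none', 'skip_connect', 'avg_pool_3x3'), ('nor_conv_1x1', 'nor_conv_3x3')],
    ('none', 'skip_connect', 'avg_pool_3x3'): [('none',), ('skip_connect', 'avg_pool_3x3')],
    ('skip_connect', 'avg_pool_3x3'): [('skip_connect',), ('avg_pool_3x3',)],
    ('nor_conv_1x1', 'nor_conv_3x3'): [('nor_conv_1x1',), ('nor_conv_3x3',)],
}

def evaluate(assignment):
    # Placeholder: score a representative architecture for this edge-to-group
    # assignment, e.g. by briefly training a supernet or querying a benchmark.
    raise NotImplementedError

def coarse_to_fine_search(evaluate_fn=evaluate):
    # Start with every edge assigned the full operation set, then refine level by level.
    state = [tuple(OPS)] * NUM_EDGES
    while any(len(group) > 1 for group in state):
        for e, group in enumerate(state):
            if len(group) == 1:
                continue  # this edge is already fully decided
            children = OP_TREE[group]
            scores = [evaluate_fn(state[:e] + [child] + state[e + 1:]) for child in children]
            state[e] = children[scores.index(max(scores))]
    return [group[0] for group in state]

Because each level halves the candidate set per edge, only a handful of group assignments are evaluated per level rather than all 5^6 = 15625 architectures in the space.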
Related papers
- $\alpha$NAS: Neural Architecture Search using Property Guided Synthesis [1.2746672439030722]
We develop techniques that enable efficient neural architecture search (NAS) in a significantly larger design space.
Our key insights are as follows: (1) the abstract search space is significantly smaller than the original search space, and (2) architectures with similar program properties also have similar performance.
We implement our approach, $\alpha$NAS, within an evolutionary framework, where the mutations are guided by the program properties.
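A minimal sketch of what an evolutionary loop with property-guided mutations could look like; extract_properties, fitness, and mutate are hypothetical placeholders, not the paper's actual abstractions.

import random

def extract_properties(arch):
    # Placeholder: abstract an architecture into program properties
    # (e.g. tensor shapes, depth, operation counts).
    raise NotImplementedError

def fitness(arch):
    # Placeholder: validation accuracy (or a cheap proxy) for an architecture.
    raise NotImplementedError

def mutate(arch):
    # Placeholder: random local edit of an architecture.
    raise NotImplementedError

def property_guided_evolution(init_population, generations=20, tries_per_child=8):
    population = list(init_population)
    for _ in range(generations):
        # Tournament selection of a parent.
        parent = max(random.sample(population, k=min(5, len(population))), key=fitness)
        target = extract_properties(parent)
        # Prefer mutations that preserve the parent's abstract properties, on the
        # premise that architectures with similar properties perform similarly.
        child = mutate(parent)
        for _ in range(tries_per_child):
            if extract_properties(child) == target:
                break
            child = mutate(parent)
        population.append(child)
    return max(population, key=fitness)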
arXiv Detail & Related papers (2022-05-08T21:48:03Z)
- Generative Adversarial Neural Architecture Search [21.05611902967155]
We propose Generative Adversarial NAS (GA-NAS) with theoretically provable convergence guarantees.
We show that GA-NAS can be used to improve already optimized baselines found by other NAS methods.
arXiv Detail & Related papers (2021-05-19T18:54:44Z)
- BossNAS: Exploring Hybrid CNN-transformers with Block-wisely Self-supervised Neural Architecture Search [100.28980854978768]
We present Block-wisely Self-supervised Neural Architecture Search (BossNAS).
We factorize the search space into blocks and utilize a novel self-supervised training scheme, named ensemble bootstrapping, to train each block separately.
We also present HyTra search space, a fabric-like hybrid CNN-transformer search space with searchable down-sampling positions.
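A rough sketch of the block-wise structure described above; the training and rating routines are placeholders standing in for the paper's ensemble-bootstrapping scheme and its unsupervised evaluation metric.

def train_self_supervised(block_candidates, chosen_prefix):
    # Placeholder: train this block's candidates with a self-supervised objective
    # (the paper uses ensemble bootstrapping), on top of the blocks chosen so far.
    raise NotImplementedError

def rate(candidate, chosen_prefix):
    # Placeholder: unsupervised score of a trained candidate block.
    raise NotImplementedError

def blockwise_search(search_space_per_block):
    # Search each block independently instead of the full combinatorial space.
    chosen = []
    for block_candidates in search_space_per_block:
        train_self_supervised(block_candidates, chosen)
        chosen.append(max(block_candidates, key=lambda c: rate(c, chosen)))
    return chosen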
arXiv Detail & Related papers (2021-03-23T10:05:58Z)
- OPANAS: One-Shot Path Aggregation Network Architecture Search for Object Detection [82.04372532783931]
Recently, neural architecture search (NAS) has been exploited to design feature pyramid networks (FPNs).
We propose a novel One-Shot Path Aggregation Network Architecture Search (OPANAS) algorithm, which significantly improves both searching efficiency and detection accuracy.
arXiv Detail & Related papers (2021-03-08T01:48:53Z)
- Neural Architecture Search via Combinatorial Multi-Armed Bandit [43.29214413461234]
We formulate NAS as a Combinatorial Multi-Armed Bandit (CMAB) problem (CMAB-NAS).
This allows the decomposition of a large search space into smaller blocks where tree-search methods can be applied more effectively and efficiently.
We leverage a tree-based method called Nested Monte-Carlo Search to tackle the CMAB-NAS problem.
On CIFAR-10, our approach discovers a cell structure that achieves a low error rate that is comparable to the state-of-the-art, using only 0.58 GPU days.
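For reference, a compact sketch of Nested Monte-Carlo Search applied to filling in one operation per cell position; legal_ops, is_terminal, and score are placeholders for the paper's actual decomposition and reward, and the commit rule here is one common variant rather than the paper's exact procedure.

import random

def legal_ops(state):
    # Placeholder: candidate operations for the next undecided cell position.
    raise NotImplementedError

def is_terminal(state):
    # Placeholder: True once every position of the cell has been decided.
    raise NotImplementedError

def score(state):
    # Placeholder: reward of a fully specified architecture (e.g. proxy accuracy).
    raise NotImplementedError

def playout(state):
    # Level-0 search: complete the architecture with uniformly random choices.
    while not is_terminal(state):
        state = state + [random.choice(legal_ops(state))]
    return score(state), state

def nested_mcs(state, level):
    if level == 0 or is_terminal(state):
        return playout(state)
    best_score, best_state = float('-inf'), list(state)
    while not is_terminal(state):
        # Try every next operation with a lower-level nested search ...
        results = [(nested_mcs(state + [op], level - 1), op) for op in legal_ops(state)]
        (child_score, child_state), best_op = max(results, key=lambda r: r[0][0])
        if child_score > best_score:
            best_score, best_state = child_score, child_state
        # ... then commit the operation whose nested search looked best.
        state = state + [best_op]
    return best_score, best_state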
arXiv Detail & Related papers (2021-01-01T23:29:33Z)
- DA-NAS: Data Adapted Pruning for Efficient Neural Architecture Search [76.9225014200746]
Efficient search is a core issue in Neural Architecture Search (NAS).
We present DA-NAS that can directly search the architecture for large-scale target tasks while allowing a large candidate set in a more efficient manner.
It is 2x faster than previous methods, while its accuracy is state-of-the-art at 76.2% under a small FLOPs constraint.
arXiv Detail & Related papers (2020-03-27T17:55:21Z)
- NAS-Bench-201: Extending the Scope of Reproducible Neural Architecture Search [55.12928953187342]
We propose an extension to NAS-Bench-101: NAS-Bench-201 with a different search space, results on multiple datasets, and more diagnostic information.
NAS-Bench-201 has a fixed search space and provides a unified benchmark for almost any up-to-date NAS algorithms.
We provide additional diagnostic information such as fine-grained loss and accuracy, which can inspire new designs of NAS algorithms.
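For scale: the fixed NAS-Bench-201 cell has 4 nodes and 6 edges, with each edge selecting one of 5 operations, so the whole space contains 5^6 = 15625 architectures and can be enumerated exhaustively. A minimal sketch of building the benchmark's architecture strings (the edge ordering assumed here is illustrative; looking up precomputed accuracies is left to the benchmark's own API):

from itertools import product

OPS = ['none', 'skip_connect', 'nor_conv_1x1', 'nor_conv_3x3', 'avg_pool_3x3']

def arch_string(ops):
    # Encode six edge operations in the NAS-Bench-201 string format,
    # where '~k' names the source node of each edge.
    a, b, c, d, e, f = ops
    return '|{}~0|+|{}~0|{}~1|+|{}~0|{}~1|{}~2|'.format(a, b, c, d, e, f)

all_archs = [arch_string(ops) for ops in product(OPS, repeat=6)]
assert len(all_archs) == 5 ** 6  # 15625 architectures in the fixed search space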
arXiv Detail & Related papers (2020-01-02T05:28:26Z)
- DDPNAS: Efficient Neural Architecture Search via Dynamic Distribution Pruning [135.27931587381596]
We propose an efficient and unified NAS framework termed DDPNAS via dynamic distribution pruning.
In particular, we first sample architectures from a joint categorical distribution. Then the search space is dynamically pruned and its distribution is updated every few epochs.
With the proposed efficient network generation method, we directly obtain the optimal neural architectures under the given constraints.
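A small sketch of that loop under stated assumptions: the reward function, the credit-assignment update, and the pruning schedule below are illustrative, not the paper's exact procedure.

import random

def reward(arch):
    # Placeholder: validation performance of a sampled architecture
    # after a few epochs of training.
    raise NotImplementedError

def ddp_search(candidates_per_decision, rounds=10, samples_per_round=16):
    # One categorical distribution per decision, initialised uniform.
    probs = [{op: 1.0 / len(ops) for op in ops} for ops in candidates_per_decision]
    for _ in range(rounds):
        # Sample architectures from the joint categorical distribution and score them.
        scored = []
        for _ in range(samples_per_round):
            arch = []
            for p in probs:
                ops = list(p)
                arch.append(random.choices(ops, weights=[p[o] for o in ops])[0])
            scored.append((reward(arch), arch))
        for i in range(len(probs)):
            # Credit each operation with the rewards of the samples that used it.
            totals = {op: 1e-8 for op in probs[i]}
            for r, arch in scored:
                totals[arch[i]] += r
            norm = sum(totals.values())
            probs[i] = {op: v / norm for op, v in totals.items()}
            # Dynamically prune the currently least promising operation
            # (weights need not be renormalised for sampling).
            if len(probs[i]) > 2:
                probs[i].pop(min(probs[i], key=probs[i].get))
    # Return the most probable operation for every decision.
    return [max(p, key=p.get) for p in probs]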
arXiv Detail & Related papers (2019-05-28T06:35:52Z)