Generative Adversarial Neural Architecture Search
- URL: http://arxiv.org/abs/2105.09356v1
- Date: Wed, 19 May 2021 18:54:44 GMT
- Title: Generative Adversarial Neural Architecture Search
- Authors: Seyed Saeed Changiz Rezaei, Fred X. Han, Di Niu, Mohammad Salameh,
Keith Mills, Shuo Lian, Wei Lu, and Shangling Jui
- Abstract summary: We propose Generative Adversarial NAS (GA-NAS) with theoretically provable convergence guarantees.
We show that GA-NAS can be used to improve already optimized baselines found by other NAS methods.
- Score: 21.05611902967155
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Despite the empirical success of neural architecture search (NAS) in deep
learning applications, the optimality, reproducibility and cost of NAS schemes
remain hard to assess. In this paper, we propose Generative Adversarial NAS
(GA-NAS) with theoretically provable convergence guarantees, promoting
stability and reproducibility in neural architecture search. Inspired by
importance sampling, GA-NAS iteratively fits a generator to previously
discovered top architectures, thus increasingly focusing on important parts of
a large search space. Furthermore, we propose an efficient adversarial learning
approach, where the generator is trained by reinforcement learning based on
rewards provided by a discriminator, thus being able to explore the search
space without evaluating a large number of architectures. Extensive experiments
show that GA-NAS beats the best published results in several cases on three
public NAS benchmarks. Moreover, GA-NAS can handle ad-hoc search constraints
and search spaces. We show that GA-NAS can be used to improve already
optimized baselines found by other NAS methods, including EfficientNet and
ProxylessNAS, in terms of ImageNet accuracy or the number of parameters,
within their original search spaces.
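To make the loop described above concrete, here is a minimal sketch of the adversarial scheme: fit a discriminator to the currently known top architectures, reward a generator for producing architectures the discriminator scores highly, and truly evaluate only a few new samples per iteration. Every name below (`SEARCH_SPACE`, `evaluate`, `train_discriminator`, `generator_sample`, `update_policy`) and the toy fitness are illustrative stand-ins, not the paper's implementation.

```python
import random

# Illustrative stand-ins only: the paper's generator and discriminator are
# learned models, and `evaluate` would be real training + validation.
SEARCH_SPACE = [(depth, width) for depth in range(1, 9) for width in (16, 32, 64)]

def evaluate(arch):
    """Stand-in for true (expensive) architecture evaluation."""
    depth, width = arch
    return 1.0 / (1.0 + abs(depth - 5)) + width / 128.0  # toy fitness

def train_discriminator(top_archs, other_archs):
    """Stand-in: a learned discriminator would contrast top vs. generated
    architectures; this toy version just rewards membership in the top set."""
    top = set(top_archs)
    return lambda arch: 1.0 if arch in top else 0.0

def generator_sample(policy, k):
    """Stand-in for the RL-trained generator: sample from a categorical policy."""
    return random.choices(SEARCH_SPACE, weights=policy, k=k)

def update_policy(policy, samples, rewards):
    """Toy policy update: upweight architectures the discriminator rewarded."""
    for arch, r in zip(samples, rewards):
        policy[SEARCH_SPACE.index(arch)] += r
    return policy

policy = [1.0] * len(SEARCH_SPACE)
history = {a: evaluate(a) for a in random.sample(SEARCH_SPACE, 4)}  # seed pool

for step in range(10):
    # 1. Fit the discriminator to the currently known top architectures.
    top_k = sorted(history, key=history.get, reverse=True)[:3]
    score = train_discriminator(top_k, history)
    # 2. Train the generator with the discriminator's output as RL reward.
    samples = generator_sample(policy, k=8)
    policy = update_policy(policy, samples, [score(a) for a in samples])
    # 3. Truly evaluate only a handful of newly generated architectures.
    for arch in samples[:2]:
        history.setdefault(arch, evaluate(arch))

print("best found:", max(history, key=history.get))
```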
Related papers
- TopoNAS: Boosting Search Efficiency of Gradient-based NAS via Topological Simplification [11.08910129925713]
TopoNAS is a model-agnostic approach for gradient-based one-shot NAS.
It significantly reduces search time and memory usage through topological simplification of searchable paths.
arXiv Detail & Related papers (2024-08-02T15:01:29Z) - UnrealNAS: Can We Search Neural Architectures with Unreal Data? [84.78460976605425]
Neural architecture search (NAS) has shown great success in the automatic design of deep neural networks (DNNs).
Previous work has analyzed the necessity of having ground-truth labels in NAS and inspired broad interest.
We take a further step to question whether real data is necessary for NAS to be effective.
arXiv Detail & Related papers (2022-05-04T16:30:26Z) - PRE-NAS: Predictor-assisted Evolutionary Neural Architecture Search [34.06028035262884]
We propose a novel evolution-based NAS strategy, Predictor-assisted E-NAS (PRE-NAS).
PRE-NAS leverages new evolutionary search strategies and integrates high-fidelity weight inheritance over generations.
Experiments on NAS-Bench-201 and DARTS search spaces show that PRE-NAS can outperform state-of-the-art NAS methods.
arXiv Detail & Related papers (2022-04-27T06:40:39Z) - When NAS Meets Trees: An Efficient Algorithm for Neural Architecture
Search [117.89827740405694]
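The PRE-NAS entry above combines evolutionary search with a performance predictor. A hedged sketch of that pattern, assuming a toy nearest-neighbor surrogate in place of the paper's predictor and only gesturing at high-fidelity weight inheritance in a comment:

```python
import random

# Minimal sketch of predictor-assisted evolution; PRE-NAS's actual predictor,
# architecture encoding, and weight-inheritance scheme are more sophisticated.

def true_eval(arch):                       # expensive: train + validate
    return -sum((g - 3) ** 2 for g in arch)

def fit_predictor(archive):
    """Toy surrogate fitted on evaluated (arch, score) pairs: nearest neighbor."""
    def predict(arch):
        nearest = max(archive.items(),
                      key=lambda kv: -sum((a - b) ** 2 for a, b in zip(arch, kv[0])))
        return nearest[1]
    return predict

def mutate(arch):
    child = list(arch)
    child[random.randrange(len(child))] = random.randint(0, 6)
    return tuple(child)

population = [tuple(random.randint(0, 6) for _ in range(4)) for _ in range(6)]
archive = {a: true_eval(a) for a in population}

for gen in range(10):
    predict = fit_predictor(archive)       # refit the surrogate each generation
    candidates = [mutate(random.choice(population)) for _ in range(20)]
    # Pre-screen cheaply with the predictor; truly evaluate only the best few.
    candidates.sort(key=predict, reverse=True)
    for child in candidates[:2]:
        archive[child] = true_eval(child)  # weight inheritance would start the
                                           # child from its parent's weights here
    population = sorted(archive, key=archive.get, reverse=True)[:6]

print("best:", max(archive, key=archive.get))
```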
- When NAS Meets Trees: An Efficient Algorithm for Neural Architecture Search [117.89827740405694]
A key challenge in neural architecture search (NAS) is how to explore the huge search space wisely.
We propose a new NAS method called TNAS (NAS with trees), which improves search efficiency by exploring only a small number of architectures.
TNAS finds the globally optimal architecture on CIFAR-10 in NAS-Bench-201, with 94.37% test accuracy, in four GPU hours.
arXiv Detail & Related papers (2022-04-11T07:34:21Z) - Guided Evolution for Neural Architecture Search [1.0499611180329804]
We propose G-EA, a novel approach for guided evolutionary Neural Architecture Search (NAS).
The rationale behind G-EA is to explore the search space by generating and evaluating several architectures in each generation.
G-EA forces exploitation of the most performant networks through descendant generation while simultaneously forcing exploration through parent mutation.
arXiv Detail & Related papers (2021-10-28T15:43:20Z) - Understanding and Accelerating Neural Architecture Search with
Training-Free and Theory-Grounded Metrics [117.4281417428145]
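The G-EA summary above spells out an exploit/explore split: descendants of the best network of each generation exploit, mutated parents explore. A minimal illustrative loop, with a toy fitness and mutation operator that are assumptions rather than the paper's operators:

```python
import random

# Sketch of the exploit/explore split: the best architecture of each
# generation spawns descendants (exploitation) while the remaining parents
# are mutated more aggressively (exploration).

def fitness(arch):                              # stand-in for real evaluation
    return -abs(sum(arch) - 10)

def mutate(arch, strength=1):
    child = list(arch)
    for _ in range(strength):
        child[random.randrange(len(child))] = random.randint(0, 5)
    return tuple(child)

parents = [tuple(random.randint(0, 5) for _ in range(4)) for _ in range(5)]

for generation in range(15):
    scored = sorted(parents, key=fitness, reverse=True)
    best = scored[0]
    descendants = [mutate(best, strength=1) for _ in range(3)]    # exploit
    explorers = [mutate(p, strength=2) for p in scored[1:]]       # explore
    parents = sorted(set(scored + descendants + explorers),
                     key=fitness, reverse=True)[:5]

print("best architecture:", max(parents, key=fitness))
```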
- Understanding and Accelerating Neural Architecture Search with Training-Free and Theory-Grounded Metrics [117.4281417428145]
This work targets designing a principled and unified training-free framework for Neural Architecture Search (NAS).
NAS has been explosively studied to automate the discovery of top-performing neural networks, but it suffers from heavy resource consumption and often incurs search bias due to truncated training or approximations.
We present a unified framework to understand and accelerate NAS by disentangling the "TEG" characteristics of searched networks.
arXiv Detail & Related papers (2021-08-26T17:52:07Z) - Search to aggregate neighborhood for graph neural network [47.47628113034479]
We propose a framework, Search to Aggregate NEighborhood (SANE), to automatically design data-specific GNN architectures.
By designing a novel and expressive search space, we propose a differentiable search algorithm, which is more efficient than previous reinforcement-learning-based methods.
arXiv Detail & Related papers (2021-04-14T03:15:19Z) - AdvantageNAS: Efficient Neural Architecture Search with Credit
Assignment [23.988393741948485]
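SANE, above, searches over neighborhood aggregation differentiably. One common way to make such a choice differentiable is a DARTS-style softmax mixture over candidate aggregators; the sketch below shows that general relaxation on a toy graph and is an assumption about the technique, not SANE's exact search space or algorithm.

```python
import torch

# DARTS-style relaxation: mix candidate aggregators with softmax weights
# (the architecture parameters `alpha`) and learn the weights by gradient.

def agg_mean(h, adj): return adj @ h / adj.sum(-1, keepdim=True).clamp(min=1)
def agg_sum(h, adj):  return adj @ h
def agg_max(h, adj):
    masked = h.unsqueeze(0).expand(adj.size(0), -1, -1).clone()
    masked[adj == 0] = float("-inf")        # mask out non-neighbors
    return masked.max(dim=1).values

CANDIDATES = [agg_mean, agg_sum, agg_max]
alpha = torch.zeros(len(CANDIDATES), requires_grad=True)  # architecture params

def mixed_aggregate(h, adj):
    weights = torch.softmax(alpha, dim=0)
    return sum(w * f(h, adj) for w, f in zip(weights, CANDIDATES))

# Toy graph: 4 nodes, feature dim 3, random regression target.
adj = torch.tensor([[0,1,1,0],[1,0,0,1],[1,0,0,1],[0,1,1,0]], dtype=torch.float)
h, target = torch.randn(4, 3), torch.randn(4, 3)

opt = torch.optim.Adam([alpha], lr=0.1)
for step in range(50):
    loss = ((mixed_aggregate(h, adj) - target) ** 2).mean()
    opt.zero_grad(); loss.backward(); opt.step()

print("learned aggregator weights:", torch.softmax(alpha, 0).tolist())
```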
- AdvantageNAS: Efficient Neural Architecture Search with Credit Assignment [23.988393741948485]
We propose a novel search strategy for one-shot and sparse propagation NAS, namely AdvantageNAS.
AdvantageNAS is a gradient-based approach that improves search efficiency by introducing credit assignment into gradient estimation for architecture updates.
Experiments on NAS-Bench-201 and the PTB dataset show that AdvantageNAS discovers architectures with higher performance under a limited time budget.
arXiv Detail & Related papers (2020-12-11T05:45:03Z) - Angle-based Search Space Shrinking for Neural Architecture Search [78.49722661000442]
- Angle-based Search Space Shrinking for Neural Architecture Search [78.49722661000442]
We propose Angle-Based search space Shrinking (ABS) for Neural Architecture Search (NAS).
Our approach progressively simplifies the original search space by dropping unpromising candidates.
ABS can dramatically enhance existing NAS approaches by providing a promising shrunk search space.
arXiv Detail & Related papers (2020-04-28T11:26:46Z) - NAS-Bench-201: Extending the Scope of Reproducible Neural Architecture
Search [55.12928953187342]
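The shrinking loop ABS describes, reduced to its skeleton: score every candidate, drop the weakest few, repeat, then hand the shrunk space to any NAS method. The `score` function here is a random placeholder standing in for the paper's angle-based metric, and the operator names are illustrative.

```python
import random

# Progressive search-space shrinking: drop the lowest-scored candidates each
# round. `score` is a placeholder for ABS's angle-based metric.

def score(op):
    return random.random() + (0.5 if "conv" in op else 0.0)

search_space = ["conv3x3", "conv5x5", "sep_conv3x3", "skip", "avg_pool",
                "max_pool", "zero", "dil_conv3x3"]

while len(search_space) > 4:                 # shrink until 4 candidates remain
    ranked = sorted(search_space, key=score, reverse=True)
    dropped, search_space = ranked[-2:], ranked[:-2]
    print(f"dropped {dropped}; remaining {search_space}")

# `search_space` is now a promising shrunk space for an existing NAS method.
```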
- NAS-Bench-201: Extending the Scope of Reproducible Neural Architecture Search [55.12928953187342]
We propose NAS-Bench-201, an extension of NAS-Bench-101 with a different search space, results on multiple datasets, and more diagnostic information.
NAS-Bench-201 has a fixed search space and provides a unified benchmark for almost any up-to-date NAS algorithm.
We provide additional diagnostic information, such as fine-grained loss and accuracy, which can inspire new designs of NAS algorithms.
arXiv Detail & Related papers (2020-01-02T05:28:26Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences arising from its use.