Surrogate NAS Benchmarks: Going Beyond the Limited Search Spaces of
Tabular NAS Benchmarks
- URL: http://arxiv.org/abs/2008.09777v4
- Date: Thu, 14 Apr 2022 15:23:32 GMT
- Title: Surrogate NAS Benchmarks: Going Beyond the Limited Search Spaces of
Tabular NAS Benchmarks
- Authors: Arber Zela, Julien Siems, Lucas Zimmer, Jovita Lukasik, Margret
Keuper, Frank Hutter
- Abstract summary: We propose a methodology to create cheap NAS surrogate benchmarks for arbitrary search spaces.
We show that surrogate NAS benchmarks can lead to faithful estimates of how well different NAS methods work on the original non-surrogate benchmark.
We believe that surrogate NAS benchmarks are an indispensable tool to extend scientifically sound work on NAS to large and exciting search spaces.
- Score: 41.73906939640346
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The most significant barrier to the advancement of Neural Architecture Search
(NAS) is its demand for large computational resources, which hinders
scientifically sound empirical evaluations of NAS methods. Tabular NAS
benchmarks have alleviated this problem substantially, making it possible to
properly evaluate NAS methods in seconds on commodity machines. However, an
unintended consequence of tabular NAS benchmarks has been a focus on extremely
small architectural search spaces since their construction relies on exhaustive
evaluations of the space. This leads to unrealistic results that do not
transfer to larger spaces. To overcome this fundamental limitation, we propose
a methodology to create cheap NAS surrogate benchmarks for arbitrary search
spaces. We exemplify this approach by creating surrogate NAS benchmarks on the
existing tabular NAS-Bench-101 and on two widely used NAS search spaces with up
to $10^{21}$ architectures ($10^{13}$ times larger than any previous tabular
NAS benchmark). We show that surrogate NAS benchmarks can model the true
performance of architectures better than tabular benchmarks (at a small
fraction of the cost), that they lead to faithful estimates of how well
different NAS methods work on the original non-surrogate benchmark, and that
they can generate new scientific insight. We open-source all our code and
believe that surrogate NAS benchmarks are an indispensable tool to extend
scientifically sound work on NAS to large and exciting search spaces.
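To make the methodology concrete, here is a minimal sketch of how a surrogate benchmark is built and queried. The fixed-length encoding, the toy stand-in for an expensive training run, and the choice of scikit-learn's gradient-boosted trees are all illustrative assumptions; the paper evaluates several surrogate models trained on real architecture evaluations.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(0)

def encode(arch):
    # Toy fixed-length encoding of categorical operation choices
    # (assumption; the paper studies richer, e.g. graph-based, encodings).
    return np.asarray(arch, dtype=float)

def true_eval(arch):
    # Stand-in for an expensive training run of the sampled architecture.
    return 0.9 - 0.01 * np.abs(np.asarray(arch) - 2).sum() + rng.normal(0, 0.002)

# 1) Sample a modest training set of (architecture, accuracy) pairs.
train_archs = rng.integers(0, 5, size=(500, 6))   # 6 decisions, 5 ops each
y = np.array([true_eval(a) for a in train_archs])

# 2) Fit the surrogate on the sampled evaluations.
surrogate = GradientBoostingRegressor().fit(
    np.stack([encode(a) for a in train_archs]), y)

# 3) The surrogate now acts as a cheap benchmark: any architecture in the
#    (possibly huge) space can be scored without training it.
candidate = rng.integers(0, 5, size=6)
print("predicted accuracy:", surrogate.predict(encode(candidate)[None])[0])
```

Once fitted, the surrogate answers performance queries in microseconds, which is what lets NAS methods be evaluated on spaces far too large to tabulate exhaustively.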
Related papers
- Accel-NASBench: Sustainable Benchmarking for Accelerator-Aware NAS [3.598880812393792]
We present a technique that allows searching for training proxies that reduce the cost of benchmark construction by significant margins.
We show that the benchmark is accurate and allows searching for state-of-the-art hardware-aware models at zero cost.
arXiv Detail & Related papers (2024-04-09T06:23:41Z)
- Generalization Properties of NAS under Activation and Skip Connection
Search [66.8386847112332]
We study the generalization properties of Neural Architecture Search (NAS) under a unifying framework.
We derive the lower (and upper) bounds of the minimum eigenvalue of the Neural Tangent Kernel (NTK) under the (in)finite-width regime.
We show how the derived results can guide NAS to select the top-performing architectures, even in the case without training.
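As an illustration of such a training-free signal, the sketch below computes the minimum eigenvalue of the empirical NTK for a toy network at initialization; the per-example-gradient construction and the tiny MLP are assumptions for illustration, and the paper derives theoretical bounds rather than prescribing this exact procedure.

```python
import torch

def empirical_ntk_min_eig(model, x):
    """Minimum eigenvalue of the empirical NTK on a batch x, where
    NTK[i, j] = <grad_theta f(x_i), grad_theta f(x_j)> for scalar output f."""
    grads = []
    for xi in x:                       # per-example parameter gradients
        model.zero_grad()
        model(xi.unsqueeze(0)).squeeze().backward()
        grads.append(torch.cat([p.grad.flatten() for p in model.parameters()]))
    J = torch.stack(grads)             # (batch, num_params) Jacobian
    return torch.linalg.eigvalsh(J @ J.T).min().item()

# Rank two candidate widths at initialization, without any training.
x = torch.randn(8, 16)
for width in (32, 128):
    net = torch.nn.Sequential(
        torch.nn.Linear(16, width), torch.nn.ReLU(),
        torch.nn.Linear(width, 1))
    print(width, empirical_ntk_min_eig(net, x))
```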
arXiv Detail & Related papers (2022-09-15T12:11:41Z)
- UnrealNAS: Can We Search Neural Architectures with Unreal Data? [84.78460976605425]
Neural architecture search (NAS) has shown great success in the automatic design of deep neural networks (DNNs).
Previous work has analyzed the necessity of having ground-truth labels in NAS and inspired broad interest.
We take a further step to question whether real data is necessary for NAS to be effective.
arXiv Detail & Related papers (2022-05-04T16:30:26Z)
- When NAS Meets Trees: An Efficient Algorithm for Neural Architecture
Search [117.89827740405694]
A key challenge in neural architecture search (NAS) is how to explore the huge search space wisely.
We propose a new NAS method called TNAS (NAS with trees), which improves search efficiency by exploring only a small number of architectures.
TNAS finds the globally optimal architecture of NAS-Bench-201 on CIFAR-10, with a test accuracy of 94.37%, in four GPU hours.
arXiv Detail & Related papers (2022-04-11T07:34:21Z)
- NAS-Bench-Suite: NAS Evaluation is (Now) Surprisingly Easy [37.72015163462501]
We present an in-depth analysis of popular NAS algorithms and performance prediction methods across 25 different combinations of search spaces and datasets.
We introduce NAS-Bench-Suite, a comprehensive collection of NAS benchmarks, accessible through a unified interface.
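To illustrate what such a unified interface enables, here is a hypothetical sketch (the names and signature are assumptions, not NAS-Bench-Suite's actual API): any search method written against the interface runs unchanged on every benchmark that implements it.

```python
from typing import Callable, Protocol

class NASBenchmark(Protocol):
    """Hypothetical unified query interface across heterogeneous benchmarks."""
    def query(self, arch, dataset: str, metric: str = "val_accuracy") -> float: ...

def random_search(bench: NASBenchmark, sample_arch: Callable,
                  dataset: str, n: int = 100):
    # The search logic never needs to know which benchmark it is querying.
    return max((sample_arch() for _ in range(n)),
               key=lambda arch: bench.query(arch, dataset))
```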
arXiv Detail & Related papers (2022-01-31T18:02:09Z)
- AdvantageNAS: Efficient Neural Architecture Search with Credit
Assignment [23.988393741948485]
We propose a novel search strategy for one-shot and sparse propagation NAS, namely AdvantageNAS.
AdvantageNAS is a gradient-based approach that improves the search efficiency by introducing credit assignment in gradient estimation for architecture updates.
Experiments on NAS-Bench-201 and the PTB dataset show that AdvantageNAS discovers architectures with higher performance under a limited time budget.
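A minimal sketch of advantage-based credit assignment for architecture updates, in its textbook REINFORCE-with-baseline form; the reward stand-in and hyperparameters are assumptions, and AdvantageNAS's actual estimator for one-shot, sparse-propagation NAS is more refined than this.

```python
import numpy as np

rng = np.random.default_rng(0)
num_edges, num_ops = 4, 3
logits = np.zeros((num_edges, num_ops))   # architecture parameters
baseline = 0.0                            # running reward baseline

def reward(choice):
    # Stand-in for a one-shot validation-accuracy signal (assumption).
    return (choice == np.array([2, 0, 1, 2])).mean()

for step in range(2000):
    probs = np.exp(logits) / np.exp(logits).sum(-1, keepdims=True)
    choice = np.array([rng.choice(num_ops, p=p) for p in probs])
    r = reward(choice)
    advantage = r - baseline                       # credit assignment
    baseline = 0.9 * baseline + 0.1 * r
    grad = -probs                                  # grad of log-probability:
    grad[np.arange(num_edges), choice] += 1.0      # one-hot(choice) - probs
    logits += 0.1 * advantage * grad               # advantage-weighted ascent
print(logits.argmax(-1))                           # recovers the best ops
```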
arXiv Detail & Related papers (2020-12-11T05:45:03Z)
- DSNAS: Direct Neural Architecture Search without Parameter Retraining [112.02966105995641]
We propose a new, task-specific and end-to-end problem definition for NAS.
We propose DSNAS, an efficient differentiable NAS framework that simultaneously optimizes architecture and parameters with a low-biased Monte Carlo estimate.
DSNAS successfully discovers networks with comparable accuracy (74.4%) on ImageNet in 420 GPU hours, reducing the total time by more than 34%.
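The toy sketch below shows the flavor of such joint optimization: one sampled path per step, with a straight-through relaxation so that a single backward pass updates both the operation weights and the architecture distribution. The straight-through estimator here is an illustrative stand-in for DSNAS's low-biased Monte Carlo gradient, not the paper's exact estimator.

```python
import torch
import torch.nn.functional as F

# Toy single-edge search: choose one of three ops while training weights.
ops = torch.nn.ModuleList(
    [torch.nn.Linear(8, 8), torch.nn.Identity(), torch.nn.Linear(8, 8)])
alpha = torch.zeros(3, requires_grad=True)        # architecture logits
opt = torch.optim.SGD(list(ops.parameters()) + [alpha], lr=0.05)

x = torch.randn(64, 8)
target = x.roll(1, dims=1)                        # arbitrary toy target

for _ in range(300):
    probs = F.softmax(alpha, dim=0)
    idx = torch.multinomial(probs, 1).item()      # sample a single path
    hard = F.one_hot(torch.tensor(idx), 3).float()
    st = hard - probs.detach() + probs            # forward: one-hot;
    out = st[idx] * ops[idx](x)                   # backward: flows into alpha
    loss = F.mse_loss(out, target)
    opt.zero_grad(); loss.backward(); opt.step()  # one pass updates both
print(alpha.detach())                             # distribution after search
```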
arXiv Detail & Related papers (2020-02-21T04:41:47Z)
- NAS-Bench-1Shot1: Benchmarking and Dissecting One-shot Neural
Architecture Search [42.82951139084501]
One-shot neural architecture search (NAS) has played a crucial role in making NAS methods computationally feasible in practice.
We introduce a general framework for one-shot NAS that can be instantiated to many recently introduced variants, together with a general benchmarking framework.
arXiv Detail & Related papers (2020-01-28T15:50:22Z)
- NAS-Bench-201: Extending the Scope of Reproducible Neural Architecture
Search [55.12928953187342]
We propose NAS-Bench-201, an extension of NAS-Bench-101 with a different search space, results on multiple datasets, and more diagnostic information.
NAS-Bench-201 has a fixed search space and provides a unified benchmark for almost any up-to-date NAS algorithm.
We provide additional diagnostic information, such as fine-grained loss and accuracy, which can inspire new designs of NAS algorithms.
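For a sense of what a fixed search space with rich diagnostics buys, here is a toy stand-in for such a tabular benchmark; the record layout, the architecture-string format, and every number below are placeholders (NAS-Bench-201 ships its own query API).

```python
from dataclasses import dataclass, field

@dataclass
class ArchRecord:
    test_accuracy: dict                 # dataset name -> final test accuracy
    loss_curve: list = field(default_factory=list)  # per-epoch diagnostics

# Every architecture in the fixed space maps to a precomputed record
# covering multiple datasets (dummy values, for illustration only).
benchmark = {
    "|conv_3x3~0|+|skip~0|conv_1x1~1|": ArchRecord(
        {"cifar10": 0.91, "cifar100": 0.70, "ImageNet16-120": 0.45},
        loss_curve=[2.1, 1.4, 0.9, 0.6]),
}

def query(arch_str: str, dataset: str):
    """O(1) lookup that replaces hours of GPU training per architecture."""
    record = benchmark[arch_str]
    return record.test_accuracy[dataset], record.loss_curve

acc, losses = query("|conv_3x3~0|+|skip~0|conv_1x1~1|", "cifar10")
print(acc, losses)
```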
arXiv Detail & Related papers (2020-01-02T05:28:26Z)