Local Search is a Remarkably Strong Baseline for Neural Architecture
Search
- URL: http://arxiv.org/abs/2004.08996v3
- Date: Sat, 25 Jul 2020 11:04:47 GMT
- Title: Local Search is a Remarkably Strong Baseline for Neural Architecture
Search
- Authors: T. Den Ottelander, A. Dushatskiy, M. Virgolin, P. A. N. Bosman
- Abstract summary: We consider, for the first time, a simple Local Search (LS) algorithm for Neural Architecture Search (NAS).
We release two benchmark datasets, named MacroNAS-C10 and MacroNAS-C100, containing 200K saved network evaluations for two established image classification tasks.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Neural Architecture Search (NAS), i.e., the automation of neural network
design, has gained much popularity in recent years with increasingly complex
search algorithms being proposed. Yet, solid comparisons with simple baselines
are often missing. At the same time, recent retrospective studies have found
many new algorithms to be no better than random search (RS). In this work we
consider, for the first time, a simple Local Search (LS) algorithm for NAS. We
particularly consider a multi-objective NAS formulation, with network accuracy
and network complexity as two objectives, as understanding the trade-off
between these two objectives is arguably the most interesting aspect of NAS.
The proposed LS algorithm is compared with RS and two evolutionary algorithms
(EAs), as these are often heralded as being ideal for multi-objective
optimization. To promote reproducibility, we create and release two benchmark
datasets, named MacroNAS-C10 and MacroNAS-C100, containing 200K saved network
evaluations for two established image classification tasks, CIFAR-10 and
CIFAR-100. Our benchmarks are designed to be complementary to existing
benchmarks, especially in that they are better suited for multi-objective
search. We additionally consider a version of the problem with a much larger
architecture space. While we find and show that the considered algorithms
explore the search space in fundamentally different ways, we also find that LS
substantially outperforms RS and even performs nearly as well as
state-of-the-art EAs. We believe that this provides strong evidence that LS is
truly a competitive baseline for NAS against which new NAS algorithms should be
benchmarked.
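
To make the baseline concrete, below is a minimal, illustrative Python sketch of a multi-objective local search of this kind. It assumes a categorical encoding of a macro-level architecture, a synthetic evaluate() stand-in for a tabular benchmark lookup (a faithful reproduction would query the released MacroNAS-C10/C100 tables instead), and a random-weight scalarization with restarts plus a non-dominated archive; these details are illustrative assumptions, not the paper's exact algorithm.

```python
"""Illustrative sketch: scalarized local search over a categorical
architecture encoding, with a non-dominated archive. Objective values
come from a synthetic stand-in for a tabular benchmark such as
MacroNAS-C10; encoding, scalarization, and restart scheme are assumptions."""
import random

NUM_VARS = 14      # number of searchable positions (assumption)
NUM_OPTIONS = 3    # categorical choices per position (assumption)


def evaluate(arch):
    """Stand-in for a benchmark lookup returning (accuracy, complexity)."""
    accuracy = sum(v == 2 for v in arch) / NUM_VARS    # toy proxy
    complexity = sum(v != 0 for v in arch) / NUM_VARS  # toy proxy
    return accuracy, complexity


def dominates(a, b):
    """True if a Pareto-dominates b (maximize accuracy, minimize complexity)."""
    return a[0] >= b[0] and a[1] <= b[1] and (a[0] > b[0] or a[1] < b[1])


def update_archive(archive, arch, objs):
    """Keep only non-dominated (architecture, objectives) pairs."""
    if any(dominates(other, objs) for _, other in archive):
        return
    archive[:] = [(x, o) for x, o in archive if not dominates(objs, o)]
    archive.append((list(arch), objs))


def local_search(weight, rng):
    """Greedy first-improvement LS on the scalarization w*acc - (1-w)*cmpl."""
    def score(objs):
        return weight * objs[0] - (1.0 - weight) * objs[1]

    arch = [rng.randrange(NUM_OPTIONS) for _ in range(NUM_VARS)]
    best = evaluate(arch)
    improved = True
    while improved:
        improved = False
        for i in rng.sample(range(NUM_VARS), NUM_VARS):  # random scan order
            for option in range(NUM_OPTIONS):
                if option == arch[i]:
                    continue
                old = arch[i]
                arch[i] = option
                objs = evaluate(arch)
                if score(objs) > score(best):
                    best, improved = objs, True
                else:
                    arch[i] = old  # revert the non-improving move
    return arch, best


def multi_objective_ls(restarts=50, seed=0):
    rng = random.Random(seed)
    archive = []
    for _ in range(restarts):
        weight = rng.random()  # fresh random trade-off per restart
        arch, objs = local_search(weight, rng)
        update_archive(archive, arch, objs)
    return archive


if __name__ == "__main__":
    for arch, (acc, cmpl) in sorted(multi_objective_ls(), key=lambda t: t[1][1]):
        print(f"complexity={cmpl:.2f}  accuracy={acc:.2f}  arch={arch}")
```

Restarting with a fresh random trade-off weight is one simple way to spread local optima along the accuracy/complexity front; the archive keeps only the non-dominated results found across restarts.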
Related papers
- A Pairwise Comparison Relation-assisted Multi-objective Evolutionary Neural Architecture Search Method with Multi-population Mechanism [58.855741970337675]
Neural architecture search (NAS) enables researchers to automatically explore vast search spaces and find efficient neural networks.
NAS suffers from a key bottleneck, i.e., numerous architectures need to be evaluated during the search process.
We propose the SMEM-NAS, a pairwise comparison relation-assisted multi-objective evolutionary algorithm based on a multi-population mechanism.
arXiv Detail & Related papers (2024-07-22T12:46:22Z)
- Are Neural Architecture Search Benchmarks Well Designed? A Deeper Look Into Operation Importance [5.065947993017157]
We conduct an empirical analysis of the widely used NAS-Bench-101, NAS-Bench-201 and TransNAS-Bench-101 benchmarks.
We found that only a subset of the operation pool is required to generate architectures close to the upper-bound of the performance range.
We consistently found convolution layers to have the highest impact on the architecture's performance.
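
As a rough illustration of how operation importance can be probed on a tabular benchmark, the sketch below compares the accuracy of architectures that do and do not contain a given operation. The load_benchmark() records are hypothetical placeholders (a real analysis would iterate over NAS-Bench-101/201 or TransNAS-Bench-101 entries), and this simple with/without comparison is an assumption, not the paper's exact methodology.

```python
# Illustrative sketch: estimating operation importance from a tabular NAS
# benchmark by comparing architectures with and without each operation.
from statistics import mean


def load_benchmark():
    """Hypothetical placeholder: (operations in the cell, test accuracy) records."""
    return [
        (("conv3x3", "skip", "conv1x1", "avgpool"), 0.93),
        (("skip", "skip", "avgpool", "avgpool"), 0.88),
        (("conv3x3", "conv3x3", "conv1x1", "skip"), 0.94),
        (("none", "avgpool", "skip", "conv1x1"), 0.90),
    ]


def operation_importance(records):
    """For each operation, mean accuracy with it minus mean accuracy without it."""
    ops = {op for arch, _ in records for op in arch}
    importance = {}
    for op in ops:
        with_op = [acc for arch, acc in records if op in arch]
        without = [acc for arch, acc in records if op not in arch]
        if with_op and without:
            importance[op] = mean(with_op) - mean(without)
    return dict(sorted(importance.items(), key=lambda kv: kv[1], reverse=True))


if __name__ == "__main__":
    for op, delta in operation_importance(load_benchmark()).items():
        print(f"{op:>10s}: {delta:+.3f} mean-accuracy difference")
```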
arXiv Detail & Related papers (2023-03-29T18:03:28Z)
- NAS-Bench-Suite: NAS Evaluation is (Now) Surprisingly Easy [37.72015163462501]
We present an in-depth analysis of popular NAS algorithms and performance prediction methods across 25 different combinations of search spaces and datasets.
We introduce NAS-Bench-Suite, a comprehensive collection of NAS benchmarks, accessible through a unified interface.
arXiv Detail & Related papers (2022-01-31T18:02:09Z)
- TND-NAS: Towards Non-differentiable Objectives in Progressive Differentiable NAS Framework [6.895590095853327]
Differentiable architecture search has gradually become the mainstream research topic in the field of Neural Architecture Search (NAS).
Recent differentiable NAS also aims at further improving the search performance and reducing the GPU-memory consumption.
We propose TND-NAS, which combines the high efficiency of the differentiable NAS framework with compatibility with non-differentiable metrics in multi-objective NAS.
arXiv Detail & Related papers (2021-11-06T14:19:36Z)
- Hierarchical Neural Architecture Search for Deep Stereo Matching [131.94481111956853]
We propose the first end-to-end hierarchical NAS framework for deep stereo matching.
Our framework incorporates task-specific human knowledge into the neural architecture search framework.
It ranks first in accuracy on the KITTI stereo 2012, 2015, and Middlebury benchmarks, as well as first on the SceneFlow dataset.
arXiv Detail & Related papers (2020-10-26T11:57:37Z)
- NATS-Bench: Benchmarking NAS Algorithms for Architecture Topology and Size [31.903475598150152]
We propose NATS-Bench, a unified benchmark on searching for both architecture topology and size.
NATS-Bench includes the search space of 15,625 neural cell candidates for architecture topology and 32,768 for architecture size on three datasets.
arXiv Detail & Related papers (2020-08-28T21:34:56Z)
- CATCH: Context-based Meta Reinforcement Learning for Transferrable Architecture Search [102.67142711824748]
CATCH is a novel Context-bAsed meTa reinforcement learning algorithm for transferrable arChitecture searcH.
The combination of meta-learning and RL allows CATCH to efficiently adapt to new tasks while being agnostic to search spaces.
It is also capable of handling cross-domain architecture search, identifying competitive networks on ImageNet, COCO, and Cityscapes.
arXiv Detail & Related papers (2020-07-18T09:35:53Z)
- DA-NAS: Data Adapted Pruning for Efficient Neural Architecture Search [76.9225014200746]
Efficient search is a core issue in Neural Architecture Search (NAS).
We present DA-NAS that can directly search the architecture for large-scale target tasks while allowing a large candidate set in a more efficient manner.
It is 2x faster than previous methods, while its accuracy is state-of-the-art at 76.2% under a small FLOPs constraint.
arXiv Detail & Related papers (2020-03-27T17:55:21Z)
- NAS-Bench-201: Extending the Scope of Reproducible Neural Architecture Search [55.12928953187342]
We propose an extension to NAS-Bench-101: NAS-Bench-201 with a different search space, results on multiple datasets, and more diagnostic information.
NAS-Bench-201 has a fixed search space and provides a unified benchmark for almost any up-to-date NAS algorithms.
We provide additional diagnostic information such as fine-grained loss and accuracy, which can inspire new designs of NAS algorithms.
arXiv Detail & Related papers (2020-01-02T05:28:26Z)
- Scalable NAS with Factorizable Architectural Parameters [102.51428615447703]
Neural Architecture Search (NAS) is an emerging topic in machine learning and computer vision.
This paper presents a scalable algorithm by factorizing a large set of candidate operators into smaller subspaces.
With a small increase in search costs and no extra costs in re-training, we find interesting architectures that were not explored before.
arXiv Detail & Related papers (2019-12-31T10:26:56Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences.