B2EA: An Evolutionary Algorithm Assisted by Two Bayesian Optimization
Modules for Neural Architecture Search
- URL: http://arxiv.org/abs/2202.03005v1
- Date: Mon, 7 Feb 2022 08:50:21 GMT
- Title: B2EA: An Evolutionary Algorithm Assisted by Two Bayesian Optimization
Modules for Neural Architecture Search
- Authors: Hyunghun Cho, Jungwook Shin, Wonjong Rhee
- Abstract summary: We develop B\textsuperscript{2}EA, a surrogate-assisted EA with two BO surrogate models and a mutation step in between.
B\textsuperscript{2}EA is robust and efficient over the 14 benchmarks for three difficulty levels of target performance.
- Score: 3.126118485851773
- License: http://creativecommons.org/licenses/by-nc-sa/4.0/
- Abstract: The early pioneering Neural Architecture Search (NAS) works were multi-trial
methods applicable to any general search space. The subsequent works took
advantage of the early findings and developed weight-sharing methods that
assume a structured search space typically with pre-fixed hyperparameters.
Despite the amazing computational efficiency of the weight-sharing NAS
algorithms, it is becoming apparent that multi-trial NAS algorithms are also
needed for identifying very high-performance architectures, especially when
exploring a general search space. In this work, we carefully review the latest
multi-trial NAS algorithms and identify the key strategies including
Evolutionary Algorithm (EA), Bayesian Optimization (BO), diversification, input
and output transformations, and lower fidelity estimation. To accommodate the
key strategies into a single framework, we develop B\textsuperscript{2}EA that
is a surrogate assisted EA with two BO surrogate models and a mutation step in
between. To show that B\textsuperscript{2}EA is robust and efficient, we
evaluate three performance metrics over 14 benchmarks with general and
cell-based search spaces. Comparisons with state-of-the-art multi-trial
algorithms reveal that B\textsuperscript{2}EA is robust and efficient over the
14 benchmarks for three difficulty levels of target performance. The
B\textsuperscript{2}EA code is publicly available at
\url{https://github.com/snu-adsl/BBEA}.
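
The abstract describes the method only at a high level: a surrogate-assisted EA in which two BO surrogate models bracket a mutation step. The sketch below is a minimal illustration of that loop under stated assumptions, not the authors' implementation; the Gaussian-process surrogates, expected-improvement acquisition, binary encoding, bit-flip mutation, and the `evaluate_architecture` placeholder are all hypothetical.

```python
# Hypothetical sketch of a surrogate-assisted EA with two BO surrogate models
# and a mutation step in between, loosely following the abstract's description.
# All components (GP surrogates, EI acquisition, binary encoding, toy objective)
# are illustrative placeholders, not the B2EA implementation.
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

rng = np.random.default_rng(0)
DIM = 10  # length of the (placeholder) architecture encoding

def evaluate_architecture(x):
    """Placeholder objective: stands in for training/validating a network."""
    return -np.sum(x)  # lower is better in this toy example

def expected_improvement(gp, X, y_best):
    mu, sigma = gp.predict(X, return_std=True)
    sigma = np.maximum(sigma, 1e-9)
    z = (y_best - mu) / sigma
    return (y_best - mu) * norm.cdf(z) + sigma * norm.pdf(z)

def mutate(x, rate=0.1):
    flip = rng.random(DIM) < rate
    return np.where(flip, 1 - x, x)

# Initial population of random binary-encoded architectures.
X = rng.integers(0, 2, size=(20, DIM)).astype(float)
y = np.array([evaluate_architecture(x) for x in X])

for _ in range(30):
    # First BO module: rank random candidates and pick promising parents.
    gp1 = GaussianProcessRegressor(alpha=1e-6, normalize_y=True).fit(X, y)
    cand = rng.integers(0, 2, size=(200, DIM)).astype(float)
    parents = cand[np.argsort(-expected_improvement(gp1, cand, y.min()))[:20]]

    # Mutation step in between.
    children = np.array([mutate(parent) for parent in parents])

    # Second BO module (different kernel): re-rank children, evaluate the best.
    gp2 = GaussianProcessRegressor(kernel=Matern(nu=2.5), alpha=1e-6,
                                   normalize_y=True).fit(X, y)
    best_child = children[np.argmax(expected_improvement(gp2, children, y.min()))]
    X = np.vstack([X, best_child])
    y = np.append(y, evaluate_architecture(best_child))

print("best score found:", y.min())
```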
Related papers
- Efficient Architecture Search via Bi-level Data Pruning [70.29970746807882]
This work pioneers an exploration into the critical role of dataset characteristics for DARTS bi-level optimization.
We introduce a new progressive data pruning strategy that utilizes supernet prediction dynamics as the metric.
Comprehensive evaluations on the NAS-Bench-201 search space, DARTS search space, and MobileNet-like search space validate that BDP reduces search costs by over 50%.
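
As a concrete reading of "supernet prediction dynamics as the metric", the sketch below scores each training example by how often its supernet prediction changes across snapshots and progressively keeps a smaller subset. Both the score and the choice to keep the most dynamic examples are assumptions for illustration, not the paper's exact BDP criterion.

```python
# Hypothetical data-pruning sketch driven by a prediction-dynamics score.
# The score and the keep-the-most-dynamic rule are assumptions, not BDP's metric.
import numpy as np

rng = np.random.default_rng(0)
num_examples, num_snapshots, num_classes = 1000, 5, 10

# Stand-in for predicted classes recorded from the supernet at several epochs.
preds = rng.integers(0, num_classes, size=(num_snapshots, num_examples))

# Dynamics score: how often an example's predicted class changes between snapshots.
changes = (preds[1:] != preds[:-1]).sum(axis=0)

def prune(keep_fraction):
    """Keep the most 'dynamic' examples (assumed here to be the informative ones)."""
    k = int(keep_fraction * num_examples)
    return np.argsort(-changes)[:k]

# Progressive schedule: shrink the training subset as the search proceeds.
for frac in (1.0, 0.75, 0.5):
    subset = prune(frac)
    print(f"keep {frac:.0%}: {len(subset)} examples")
```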
arXiv Detail & Related papers (2023-12-21T02:48:44Z) - Searching a High-Performance Feature Extractor for Text Recognition
Network [92.12492627169108]
We design a domain-specific search space by exploring principles for having good feature extractors.
As the space is huge and complexly structured, no existing NAS algorithms can be applied.
We propose a two-stage algorithm to effectively search in the space.
arXiv Detail & Related papers (2022-09-27T03:49:04Z) - Neural Architecture Search as Multiobjective Optimization Benchmarks:
Problem Formulation and Performance Assessment [30.264524448340406]
We formulate neural architecture search (NAS) tasks into general multi-objective optimization problems.
We analyze the complex characteristics from an optimization point of view.
We present an end-to-end pipeline, dubbed $\texttt{EvoXBench}$, to generate benchmark test problems for EMO algorithms to run efficiently.
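
As a concrete reading of "formulating NAS as a general multi-objective optimization problem", the standard formulation below makes the objectives explicit; the notation is generic, not taken from the paper.

```latex
% Generic multi-objective NAS formulation (illustrative notation, not the paper's):
\min_{x \in \Omega} \; F(x) = \big(f_1(x),\, f_2(x),\, \dots,\, f_m(x)\big)
```

Here $\Omega$ is the architecture search space and, for example, $f_1$ is validation error, $f_2$ parameter count, and $f_3$ inference latency; candidates are compared by Pareto dominance.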
arXiv Detail & Related papers (2022-08-08T02:07:49Z) - $\beta$-DARTS: Beta-Decay Regularization for Differentiable Architecture
Search [85.84110365657455]
We propose a simple-but-efficient regularization method, termed Beta-Decay, to regularize the DARTS-based NAS searching process.
Experimental results on NAS-Bench-201 show that our proposed method can help to stabilize the searching process and makes the searched network more transferable across different datasets.
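
The exact Beta-Decay term is defined in the paper; the sketch below only shows where such a regularizer enters a DARTS-style architecture update, using a log-sum-exp (smooth-max) penalty over each edge's architecture parameters as an assumed stand-in, with a placeholder validation loss.

```python
# Illustrative only: where an architecture-parameter regularizer enters a
# DARTS-style update. The log-sum-exp penalty is an assumed stand-in for the
# paper's Beta-Decay term, not its exact definition.
import torch

num_edges, num_ops = 14, 8
alpha = torch.zeros(num_edges, num_ops, requires_grad=True)  # architecture params
optimizer = torch.optim.Adam([alpha], lr=3e-4)
lam = 0.5  # regularization strength (placeholder value)

def val_loss(alpha):
    """Placeholder for the supernet's validation loss under softmax(alpha)."""
    weights = torch.softmax(alpha, dim=-1)
    return (weights * torch.randn_like(weights)).sum()

loss = val_loss(alpha) + lam * torch.logsumexp(alpha, dim=-1).sum()
optimizer.zero_grad()
loss.backward()
optimizer.step()
```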
arXiv Detail & Related papers (2022-03-03T11:47:14Z) - AutoBERT-Zero: Evolving BERT Backbone from Scratch [94.89102524181986]
We propose an Operation-Priority Neural Architecture Search (OP-NAS) algorithm to automatically search for promising hybrid backbone architectures.
We optimize both the search algorithm and evaluation of candidate models to boost the efficiency of our proposed OP-NAS.
Experiments show that the searched architecture (named AutoBERT-Zero) significantly outperforms BERT and its variants of different model capacities in various downstream tasks.
arXiv Detail & Related papers (2021-07-15T16:46:01Z) - iDARTS: Differentiable Architecture Search with Stochastic Implicit
Gradients [75.41173109807735]
Differentiable ARchiTecture Search (DARTS) has recently become the mainstream of neural architecture search (NAS).
We tackle the hypergradient computation in DARTS based on the implicit function theorem.
We show that the architecture optimisation with the proposed method, named iDARTS, is expected to converge to a stationary point.
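
For reference, the hypergradient that the implicit function theorem yields for this bilevel problem is the standard expression below (general form, not iDARTS-specific notation); the inverse-Hessian term is what iDARTS approximates rather than computing exactly.

```latex
% Implicit-function-theorem hypergradient for the bilevel NAS objective
% (standard form; the inverse Hessian is approximated in practice).
\nabla_{\alpha}\,\mathcal{L}_{\mathrm{val}}\big(w^{*}(\alpha),\alpha\big)
  = \frac{\partial \mathcal{L}_{\mathrm{val}}}{\partial \alpha}
  - \frac{\partial \mathcal{L}_{\mathrm{val}}}{\partial w}
    \left(\frac{\partial^{2}\mathcal{L}_{\mathrm{train}}}{\partial w\,\partial w^{\top}}\right)^{-1}
    \frac{\partial^{2}\mathcal{L}_{\mathrm{train}}}{\partial w\,\partial \alpha^{\top}},
  \qquad
  w^{*}(\alpha)=\arg\min_{w}\mathcal{L}_{\mathrm{train}}(w,\alpha).
```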
arXiv Detail & Related papers (2021-06-21T00:44:11Z) - NATS-Bench: Benchmarking NAS Algorithms for Architecture Topology and
Size [31.903475598150152]
We propose NATS-Bench, a unified benchmark on searching for both architecture topology and size.
NATS-Bench includes the search space of 15,625 neural cell candidates for architecture topology and 32,768 for architecture size on three datasets.
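
Those counts are consistent with the usual NAS-Bench-201-style construction, stated here as an assumption rather than a quote from the paper: a cell with 6 edges choosing among 5 candidate operations, and 5 layers each choosing among 8 candidate channel widths.

```latex
5^{6} = 15{,}625 \ \text{(topology candidates)}, \qquad
8^{5} = 32{,}768 \ \text{(size candidates)}.
```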
arXiv Detail & Related papers (2020-08-28T21:34:56Z) - DrNAS: Dirichlet Neural Architecture Search [88.56953713817545]
We treat the continuously relaxed architecture mixing weight as random variables, modeled by Dirichlet distribution.
With recently developed pathwise derivatives, the Dirichlet parameters can be easily optimized with gradient-based optimization.
To alleviate the large memory consumption of differentiable NAS, we propose a simple yet effective progressive learning scheme.
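
A minimal sketch of the core idea: treat each edge's operation-mixing weights as a sample from a learnable Dirichlet and backpropagate through the pathwise (reparameterized) sample. The concentration initialization, loss, and update below are placeholders, not DrNAS's implementation.

```python
# Minimal sketch: architecture mixing weights as Dirichlet random variables,
# optimized through pathwise (reparameterized) samples. The loss and the
# concentration initialization are placeholders, not DrNAS's implementation.
import torch

num_edges, num_ops = 14, 8
# Learn log-concentrations so the Dirichlet parameters stay positive.
log_conc = torch.zeros(num_edges, num_ops, requires_grad=True)
optimizer = torch.optim.Adam([log_conc], lr=3e-4)

for step in range(5):
    dist = torch.distributions.Dirichlet(log_conc.exp())
    weights = dist.rsample()   # pathwise derivative flows through this sample
    # Placeholder for the supernet loss computed with these mixing weights.
    loss = (weights * torch.randn(num_edges, num_ops)).sum()
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
```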
arXiv Detail & Related papers (2020-06-18T08:23:02Z) - Local Search is a Remarkably Strong Baseline for Neural Architecture
Search [0.0]
We consider, for the first time, a simple Local Search (LS) algorithm for Neural Architecture Search (NAS).
We release two benchmark datasets, named MacroNAS-C10 and MacroNAS-C100, containing 200K saved network evaluations for two established image classification tasks.
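
The LS procedure itself is simple enough to sketch. The encoding (a list of operation choices), the neighborhood (change one position at a time), and the placeholder evaluation function below are illustrative assumptions, not the paper's benchmark setup.

```python
# Illustrative first-improvement local search over a list-of-operations encoding.
# The encoding, neighborhood, and evaluate() are placeholders.
import random

NUM_POSITIONS, NUM_OPS = 14, 5
random.seed(0)

def evaluate(arch):
    """Placeholder for querying a benchmark (e.g. a saved network evaluation)."""
    return -sum(arch)  # toy objective: lower is better

def neighbors(arch):
    """All architectures that differ from `arch` in exactly one position."""
    for i in range(NUM_POSITIONS):
        for op in range(NUM_OPS):
            if op != arch[i]:
                yield arch[:i] + [op] + arch[i + 1:]

arch = [random.randrange(NUM_OPS) for _ in range(NUM_POSITIONS)]
score = evaluate(arch)
improved = True
while improved:
    improved = False
    for cand in neighbors(arch):
        s = evaluate(cand)
        if s < score:          # move to the first improving neighbor
            arch, score, improved = cand, s, True
            break

print("local optimum score:", score)
```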
arXiv Detail & Related papers (2020-04-20T00:08:34Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
The site does not guarantee the quality of the information provided and is not responsible for any consequences arising from its use.