NSGANetV2: Evolutionary Multi-Objective Surrogate-Assisted Neural
Architecture Search
- URL: http://arxiv.org/abs/2007.10396v1
- Date: Mon, 20 Jul 2020 18:30:11 GMT
- Title: NSGANetV2: Evolutionary Multi-Objective Surrogate-Assisted Neural
Architecture Search
- Authors: Zhichao Lu and Kalyanmoy Deb and Erik Goodman and Wolfgang Banzhaf and
Vishnu Naresh Boddeti
- Abstract summary: We propose an efficient NAS algorithm for generating task-specific models that are competitive under multiple competing objectives.
It comprises two surrogates: one at the architecture level to improve sample efficiency, and one at the weights level, through a supernet, to improve gradient-descent training efficiency.
We demonstrate the effectiveness and versatility of the proposed method on six diverse non-standard datasets.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: In this paper, we propose an efficient NAS algorithm for generating
task-specific models that are competitive under multiple competing objectives.
It comprises two surrogates, one at the architecture level to improve sample
efficiency and one at the weights level, through a supernet, to improve
gradient-descent training efficiency. On standard benchmark datasets (C10,
C100, ImageNet), the resulting models, dubbed NSGANetV2, either match or
outperform models from existing approaches with the search being orders of
magnitude more sample efficient. Furthermore, we demonstrate the effectiveness
and versatility of the proposed method on six diverse non-standard datasets,
e.g. STL-10, Flowers102, Oxford Pets, FGVC Aircraft, etc. In all cases,
NSGANetV2s improve the state-of-the-art (under the mobile setting), suggesting that
NAS can be a viable alternative to conventional transfer learning approaches in
handling diverse scenarios such as small-scale or fine-grained datasets. Code
is available at https://github.com/mikelzc1990/nsganetv2
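The core idea of the architecture-level surrogate described in the abstract, ranking many candidate architectures cheaply and spending the expensive evaluation budget only on the most promising ones, can be sketched in a toy form. This is not the paper's implementation (see the linked repository for that); the integer encoding, the synthetic objectives, and the nearest-neighbour surrogate below are invented purely for illustration:

```python
import random

random.seed(0)

# Toy architecture encoding: 5 integer design choices (e.g. layer widths).
def sample_arch():
    return tuple(random.randint(1, 8) for _ in range(5))

# Hypothetical "expensive" evaluation, standing in for training a subnet.
# Returns two competing objectives (both minimized): error and model size.
def true_eval(arch):
    error = sum((x - 5) ** 2 for x in arch) / 100.0
    size = sum(arch)
    return error, size

# Cheap 1-nearest-neighbour surrogate: predict a candidate's error from the
# closest architecture already evaluated for real.
def surrogate_error(arch, archive):
    nearest = min(archive, key=lambda a: sum((x - y) ** 2 for x, y in zip(a, arch)))
    return archive[nearest][0]

# Pareto dominance for minimization.
def dominates(f, g):
    return all(a <= b for a, b in zip(f, g)) and any(a < b for a, b in zip(f, g))

def nondominated(pop, objs):
    return [p for p in pop if not any(dominates(objs[q], objs[p]) for q in pop if q != p)]

# Surrogate-assisted loop: truly evaluate a few seeds, then in each round
# screen many random candidates with the surrogate and true-evaluate only
# the best-predicted one.
archive = {a: true_eval(a) for a in (sample_arch() for _ in range(10))}
for _ in range(20):
    candidates = [sample_arch() for _ in range(50)]
    best = min(candidates, key=lambda a: surrogate_error(a, archive))
    archive[best] = true_eval(best)

# The trade-off front over all truly evaluated architectures.
front = nondominated(list(archive), archive)
```

The sketch spends at most 30 true evaluations while screening 1,000 candidates, which is the sample-efficiency argument in miniature; the real method replaces the toy regressor with an adaptively selected surrogate model and uses NSGA-II-style selection over the predicted front.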