NSGANetV2: Evolutionary Multi-Objective Surrogate-Assisted Neural
Architecture Search
- URL: http://arxiv.org/abs/2007.10396v1
- Date: Mon, 20 Jul 2020 18:30:11 GMT
- Title: NSGANetV2: Evolutionary Multi-Objective Surrogate-Assisted Neural
Architecture Search
- Authors: Zhichao Lu and Kalyanmoy Deb and Erik Goodman and Wolfgang Banzhaf and
Vishnu Naresh Boddeti
- Abstract summary: We propose an efficient NAS algorithm for generating task-specific models that are competitive under multiple competing objectives.
It comprises two surrogates: one at the architecture level to improve sample efficiency and one at the weights level, through a supernet, to improve gradient-descent training efficiency.
We demonstrate the effectiveness and versatility of the proposed method on six diverse non-standard datasets.
- Score: 22.848528877480796
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: In this paper, we propose an efficient NAS algorithm for generating
task-specific models that are competitive under multiple competing objectives.
It comprises two surrogates, one at the architecture level to improve sample
efficiency and one at the weights level, through a supernet, to improve
gradient descent training efficiency. On standard benchmark datasets (C10,
C100, ImageNet), the resulting models, dubbed NSGANetV2, either match or
outperform models from existing approaches with the search being orders of
magnitude more sample efficient. Furthermore, we demonstrate the effectiveness
and versatility of the proposed method on six diverse non-standard datasets,
e.g., STL-10, Flowers102, Oxford Pets, and FGVC Aircraft. In all cases,
NSGANetV2s improve the state of the art (under the mobile setting), suggesting that
NAS can be a viable alternative to conventional transfer learning approaches in
handling diverse scenarios such as small-scale or fine-grained datasets. Code
is available at https://github.com/mikelzc1990/nsganetv2
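To make the two-surrogate idea concrete, below is a minimal, self-contained sketch of a surrogate-assisted multi-objective search loop: an architecture-level predictor screens many candidates cheaply, and the expensive evaluator (standing in here for supernet-based training and validation) is spent only on Pareto-promising ones. The binary encoding, ridge-regression surrogate, and toy objectives are illustrative assumptions, not the paper's actual implementation.

```python
# Minimal sketch of a surrogate-assisted multi-objective NAS loop
# (illustrative assumptions throughout; NSGANetV2's actual method uses
# adaptively selected surrogate models and an OFA-style supernet).
import numpy as np

rng = np.random.default_rng(0)
D = 20  # length of a hypothetical binary architecture encoding

def expensive_eval(x):
    """Expensive oracle returning (error, MFLOPs); stands in for
    supernet-based fine-tuning plus validation in the real method."""
    err = 0.3 - 0.01 * x.sum() / D + 0.05 * rng.random()
    return err, 100 + 40 * x.sum() / D

def fit_surrogate(X, y):
    """Ridge regression as a stand-in architecture-level predictor."""
    A = np.c_[X, np.ones(len(X))]
    w = np.linalg.solve(A.T @ A + 1e-3 * np.eye(A.shape[1]), A.T @ y)
    return lambda Q: np.c_[Q, np.ones(len(Q))] @ w

def dominates(a, b):  # both objectives are minimized
    return all(ai <= bi for ai, bi in zip(a, b)) and a != b

# 1) Evaluate a small initial pool with the expensive oracle.
X = rng.integers(0, 2, (16, D)).astype(float)
archive = [(x, *expensive_eval(x)) for x in X]

for it in range(5):
    # 2) Refit the surrogate on everything evaluated so far.
    pred_err = fit_surrogate(np.array([x for x, _, _ in archive]),
                             np.array([e for _, e, _ in archive]))
    # 3) Cheap screening: rank many candidates with the surrogate.
    #    MFLOPs are computed exactly; only accuracy needs a surrogate.
    cand = rng.integers(0, 2, (200, D)).astype(float)
    objs = list(zip(pred_err(cand), 100 + 40 * cand.sum(1) / D))
    front = [i for i, oi in enumerate(objs)
             if not any(dominates(oj, oi) for oj in objs)]
    # 4) Spend the expensive oracle only on Pareto-promising points.
    archive += [(cand[i], *expensive_eval(cand[i])) for i in front[:4]]

best = min(archive, key=lambda t: t[1])
print(f"best error {best[1]:.3f} at {best[2]:.0f} MFLOPs")
```

The sample-efficiency gain comes from steps 3 and 4: hundreds of candidates are screened by the predictor, but only a handful reach the expensive evaluation.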
Related papers
- A Pairwise Comparison Relation-assisted Multi-objective Evolutionary Neural Architecture Search Method with Multi-population Mechanism [58.855741970337675]
Neural architecture search (NAS) enables researchers to automatically explore vast search spaces and find efficient neural networks.
However, NAS suffers from a key bottleneck: numerous architectures need to be evaluated during the search process.
We propose SMEM-NAS, a pairwise comparison relation-assisted multi-objective evolutionary algorithm based on a multi-population mechanism.
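A hedged sketch of the pairwise-comparison idea: rather than regressing absolute accuracy, a classifier is trained on pairs of already-evaluated architectures to predict which one wins, and unseen candidates are ranked by predicted win-rate. The encodings, accuracies, and logistic-regression model are illustrative assumptions, not SMEM-NAS's actual components.

```python
# Sketch of a pairwise-comparison surrogate: learn P(arch_a beats arch_b)
# from pairs of evaluated architectures, then rank new candidates by
# their mean predicted win-rate against the known pool.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
D = 16
X = rng.random((40, D))                                 # evaluated encodings
acc = X.mean(axis=1) + 0.05 * rng.standard_normal(40)   # their accuracies

# Pairwise training set: features = concat(a, b), label = (acc_a > acc_b).
ii, jj = np.triu_indices(len(X), k=1)
pairs = np.hstack([X[ii], X[jj]])
labels = (acc[ii] > acc[jj]).astype(int)
clf = LogisticRegression(max_iter=1000).fit(pairs, labels)

# Rank unseen candidates without training any of them.
cands = rng.random((10, D))
wins = np.array([clf.predict_proba(
    np.hstack([np.tile(c, (len(X), 1)), X]))[:, 1].mean() for c in cands])
print("best candidate:", wins.argmax(), "predicted win-rate:", wins.max().round(3))
```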
arXiv Detail & Related papers (2024-07-22T12:46:22Z)
- DiffusionNAG: Predictor-guided Neural Architecture Generation with Diffusion Models [56.584561770857306]
We propose a novel conditional Neural Architecture Generation (NAG) framework based on diffusion models, dubbed DiffusionNAG.
Specifically, we consider the neural architectures as directed graphs and propose a graph diffusion model for generating them.
We validate the effectiveness of DiffusionNAG through extensive experiments in two predictor-based NAS scenarios: Transferable NAS and Bayesian Optimization (BO)-based NAS.
When integrated into a BO-based algorithm, DiffusionNAG outperforms existing BO-based NAS approaches, particularly in the large MobileNetV3 search space on the ImageNet 1K dataset.
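A toy sketch of the generative idea, under strong simplifying assumptions: architectures are flattened one-hot op encodings and a vanilla Gaussian DDPM is trained to denoise them. DiffusionNAG itself diffuses directed graphs with a graph network and steers sampling with a performance predictor; none of that machinery is reproduced here.

```python
# Toy denoising-diffusion sketch over flat architecture encodings.
import torch
import torch.nn as nn

L, OPS, T = 8, 5, 50                      # layers, op choices, diffusion steps
betas = torch.linspace(1e-4, 0.05, T)
alphas_bar = torch.cumprod(1 - betas, dim=0)

denoiser = nn.Sequential(nn.Linear(L * OPS + 1, 128), nn.ReLU(),
                         nn.Linear(128, L * OPS))
opt = torch.optim.Adam(denoiser.parameters(), lr=1e-3)

# Pretend "good" architectures favour op 0 (purely illustrative data).
data = torch.eye(OPS)[torch.zeros(256, L, dtype=torch.long)].reshape(256, -1)

for step in range(400):                   # train: predict the added noise
    t = torch.randint(0, T, (256,))
    eps = torch.randn_like(data)
    ab = alphas_bar[t].unsqueeze(1)
    x_t = ab.sqrt() * data + (1 - ab).sqrt() * eps
    pred = denoiser(torch.cat([x_t, t.float().unsqueeze(1) / T], dim=1))
    loss = ((pred - eps) ** 2).mean()
    opt.zero_grad(); loss.backward(); opt.step()

with torch.no_grad():                     # ancestral sampling from pure noise
    x = torch.randn(4, L * OPS)
    for t in reversed(range(T)):
        tt = torch.full((4, 1), t / T)
        eps_hat = denoiser(torch.cat([x, tt], dim=1))
        x = (x - betas[t] / (1 - alphas_bar[t]).sqrt() * eps_hat) / (1 - betas[t]).sqrt()
        if t > 0:
            x = x + betas[t].sqrt() * torch.randn_like(x)

archs = x.reshape(4, L, OPS).argmax(-1)   # discretize back to op indices
print(archs)
```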
arXiv Detail & Related papers (2023-05-26T13:58:18Z)
- Neural Attentive Circuits [93.95502541529115]
We introduce a general-purpose, yet modular, neural architecture called Neural Attentive Circuits (NACs).
NACs learn the parameterization and a sparse connectivity of neural modules without using domain knowledge.
NACs achieve an 8x speedup at inference time while losing less than 3% performance.
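One plausible reading of learned sparse connectivity, sketched below: each input attends over a pool of small modules and is routed only through its top-k matches. The routing scheme and module shapes are illustrative assumptions, not the actual NAC parameterization.

```python
# Illustrative sketch of attention-learned sparse module wiring.
import torch
import torch.nn as nn

class SparseModuleLayer(nn.Module):
    def __init__(self, dim=64, n_modules=8, k=2):
        super().__init__()
        self.k = k
        self.module_keys = nn.Parameter(torch.randn(n_modules, dim))
        self.modules_ = nn.ModuleList([
            nn.Sequential(nn.Linear(dim, dim), nn.GELU(), nn.Linear(dim, dim))
            for _ in range(n_modules)])

    def forward(self, x):                      # x: (batch, dim)
        scores = x @ self.module_keys.T        # affinity to each module
        topv, topi = scores.topk(self.k, dim=-1)
        gate = torch.softmax(topv, dim=-1)     # renormalize over kept edges
        out = torch.zeros_like(x)
        for slot in range(self.k):             # route through selected modules
            for m, mod in enumerate(self.modules_):
                mask = topi[:, slot] == m
                if mask.any():
                    out[mask] += gate[mask, slot, None] * mod(x[mask])
        return out

layer = SparseModuleLayer()
print(layer(torch.randn(16, 64)).shape)        # torch.Size([16, 64])
```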
arXiv Detail & Related papers (2022-10-14T18:00:07Z)
- Accelerating Multi-Objective Neural Architecture Search by Random-Weight Evaluation [24.44521525130034]
We introduce a new performance estimation metric named Random-Weight Evaluation (RWE) to quantify the quality of CNNs.
RWE trains only the last layer and leaves the remaining layers with random weights, which reduces a single network evaluation to seconds.
Our proposed method obtains a set of efficient models with state-of-the-art performance in two real-world search spaces.
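A minimal sketch of the RWE recipe as described: freeze a candidate backbone at its random initialization and fit only a linear head, using the probe's accuracy as a cheap quality proxy. The backbone, data, and training budget below are placeholder assumptions.

```python
# Random-Weight Evaluation sketch: linear probe on frozen random features.
import torch
import torch.nn as nn

def rwe_score(backbone, head_in, n_classes, loader, epochs=1):
    for p in backbone.parameters():
        p.requires_grad_(False)               # keep random weights fixed
    head = nn.Linear(head_in, n_classes)
    opt = torch.optim.Adam(head.parameters(), lr=1e-2)
    loss_fn = nn.CrossEntropyLoss()
    for _ in range(epochs):                   # train only the final layer
        for xb, yb in loader:
            loss = loss_fn(head(backbone(xb).flatten(1)), yb)
            opt.zero_grad(); loss.backward(); opt.step()
    with torch.no_grad():                     # probe accuracy = quality proxy
        correct = sum((head(backbone(xb).flatten(1)).argmax(1) == yb).sum().item()
                      for xb, yb in loader)
        total = sum(len(yb) for _, yb in loader)
    return correct / total

# Toy usage with synthetic data standing in for a real image dataset.
backbone = nn.Sequential(nn.Conv2d(3, 16, 3, 2, 1), nn.ReLU(),
                         nn.AdaptiveAvgPool2d(1))
data = [(torch.randn(32, 3, 32, 32), torch.randint(0, 10, (32,)))
        for _ in range(4)]
print("RWE proxy score:", rwe_score(backbone, 16, 10, data))
```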
arXiv Detail & Related papers (2021-10-08T06:35:20Z)
- Pareto-wise Ranking Classifier for Multi-objective Evolutionary Neural Architecture Search [15.454709248397208]
This study focuses on how to find feasible deep models under diverse design objectives.
We propose a classification-wise Pareto evolution approach for one-shot NAS, where an online classifier is trained to predict the dominance relationship between the candidate and constructed reference architectures.
We find a number of neural architectures with different model sizes ranging from 2M to 6M under diverse objectives and constraints.
arXiv Detail & Related papers (2021-09-14T13:28:07Z)
- Rapid Neural Architecture Search by Learning to Generate Graphs from Datasets [42.993720854755736]
We propose an efficient Neural Architecture Search (NAS) framework that is trained once on a database consisting of datasets and pretrained networks.
We show that our model meta-learned on subsets of ImageNet-1K and architectures from the NAS-Bench-201 search space successfully generalizes to multiple unseen datasets.
arXiv Detail & Related papers (2021-07-02T06:33:59Z)
- Improving Calibration for Long-Tailed Recognition [68.32848696795519]
We propose two methods to improve calibration and performance in long-tailed recognition scenarios.
For dataset bias due to different samplers, we propose shifted batch normalization.
Our proposed methods set new records on multiple popular long-tailed recognition benchmark datasets.
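A hedged sketch of the batch-normalization side of the idea: keep the learned weights, reset the BN running statistics, and re-estimate them under the target sampler. This illustrates the sampler-induced statistics shift the paper addresses; the exact shifted-BN formulation differs.

```python
# Re-estimating BN statistics under a new (e.g. class-balanced) sampler.
import torch
import torch.nn as nn

@torch.no_grad()
def recalibrate_bn(model, loader, passes=1):
    for m in model.modules():
        if isinstance(m, nn.BatchNorm2d):
            m.reset_running_stats()   # forget stats from the old sampler
            m.momentum = None         # use exact cumulative averaging
    model.train()                     # BN updates running stats in train mode
    for _ in range(passes):
        for xb, _ in loader:
            model(xb)                 # forward passes only; no gradient step
    model.eval()

net = nn.Sequential(nn.Conv2d(3, 8, 3, padding=1), nn.BatchNorm2d(8), nn.ReLU())
balanced_loader = [(torch.randn(16, 3, 32, 32), None) for _ in range(8)]
recalibrate_bn(net, balanced_loader)
```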
arXiv Detail & Related papers (2021-04-01T13:55:21Z)
- Binarizing MobileNet via Evolution-based Searching [66.94247681870125]
We propose the use of evolutionary search to facilitate the construction and training scheme when binarizing MobileNet.
Inspired by one-shot architecture search frameworks, we manipulate the idea of group convolution to design efficient 1-Bit Convolutional Neural Networks (CNNs).
Our objective is to come up with a tiny yet efficient binary neural architecture by exploring the best candidates of the group convolution.
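A minimal sketch of the building block such a search chooses among: a binary-weight group convolution with a straight-through estimator. Group counts, scaling, and shapes are illustrative assumptions.

```python
# 1-bit (binary-weight) group convolution with a straight-through estimator.
import torch
import torch.nn as nn
import torch.nn.functional as F

class BinaryGroupConv(nn.Module):
    def __init__(self, cin, cout, k=3, groups=4):
        super().__init__()
        self.weight = nn.Parameter(torch.randn(cout, cin // groups, k, k) * 0.1)
        self.groups, self.pad = groups, k // 2

    def forward(self, x):
        w = self.weight
        # sign() in the forward pass, identity gradient in the backward
        # pass (straight-through estimator), scaled per filter.
        scale = w.abs().mean(dim=(1, 2, 3), keepdim=True)
        wb = w + (scale * torch.sign(w) - w).detach()
        return F.conv2d(x, wb, padding=self.pad, groups=self.groups)

conv = BinaryGroupConv(16, 32)
print(conv(torch.randn(2, 16, 8, 8)).shape)   # torch.Size([2, 32, 8, 8])
```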
arXiv Detail & Related papers (2020-05-13T13:25:51Z)
- Neural Architecture Transfer [20.86857986471351]
Existing approaches require one complete search for each deployment specification of hardware or objective.
We propose Neural Architecture Transfer (NAT) to overcome this limitation.
NAT is designed to efficiently generate task-specific custom models that are competitive under multiple conflicting objectives.
arXiv Detail & Related papers (2020-05-12T15:30:36Z)
- DDPNAS: Efficient Neural Architecture Search via Dynamic Distribution Pruning [135.27931587381596]
We propose an efficient and unified NAS framework termed DDPNAS via dynamic distribution pruning.
In particular, we first sample architectures from a joint categorical distribution. Then the search space is dynamically pruned and its distribution is updated every few epochs.
With the proposed efficient network generation method, we directly obtain the optimal neural architectures on given constraints.
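A toy sketch of dynamic distribution pruning under stated assumptions: a per-layer categorical distribution over ops is sampled, reinforced toward well-performing samples, and periodically pruned of its weakest surviving op. The reward here is a synthetic stand-in for validation accuracy.

```python
# Dynamic distribution pruning over per-layer categorical op choices.
import numpy as np

rng = np.random.default_rng(2)
L, OPS = 6, 5
probs = np.full((L, OPS), 1.0 / OPS)
alive = np.ones((L, OPS), dtype=bool)

def reward(arch):                       # toy proxy: prefer low op indices
    return -arch.mean() + 0.1 * rng.standard_normal()

for epoch in range(30):
    archs = np.array([[rng.choice(OPS, p=probs[l]) for l in range(L)]
                      for _ in range(8)])
    rewards = np.array([reward(a) for a in archs])
    best = archs[rewards.argmax()]
    probs[np.arange(L), best] += 0.05   # reinforce the best sample
    probs[~alive] = 0.0
    probs /= probs.sum(axis=1, keepdims=True)
    if epoch % 10 == 9:                 # prune weakest surviving op per layer
        for l in range(L):
            if alive[l].sum() > 1:
                cand = np.where(alive[l])[0]
                alive[l, cand[probs[l, cand].argmin()]] = False
                probs[l, ~alive[l]] = 0.0
                probs[l] /= probs[l].sum()

print("selected ops per layer:", probs.argmax(axis=1))
```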
arXiv Detail & Related papers (2019-05-28T06:35:52Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.