Guided Evolution for Neural Architecture Search
- URL: http://arxiv.org/abs/2110.15232v1
- Date: Thu, 28 Oct 2021 15:43:20 GMT
- Title: Guided Evolution for Neural Architecture Search
- Authors: Vasco Lopes, Miguel Santos, Bruno Degardin, Luís A. Alexandre
- Abstract summary: We propose a novel approach for guided evolutionary Neural Architecture Search (NAS).
The rationale behind G-EA is to explore the search space by generating and evaluating several architectures in each generation.
G-EA forces exploitation of the most performant networks by descendant generation while at the same time forcing exploration by parent mutation.
- Score: 1.0499611180329804
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Neural Architecture Search (NAS) methods have been successfully applied to
image tasks with excellent results. However, NAS methods are often complex and
tend to converge to local minima as soon as generated architectures seem to
yield good results. In this paper, we propose G-EA, a novel approach for guided
evolutionary NAS. The rationale behind G-EA is to explore the search space by
generating and evaluating several architectures in each generation at the
initialization stage using a zero-proxy estimator, where only the
highest-scoring network is trained and kept for the next generation. This
evaluation at the initialization stage allows continuous extraction of knowledge
from the search space without increasing computation, thus allowing the search
to be efficiently guided. Moreover, G-EA forces exploitation of the most
performant networks by descendant generation while at the same time forcing
exploration by parent mutation and by favouring younger architectures to the
detriment of older ones. Experimental results demonstrate the effectiveness of
the proposed method, showing that G-EA achieves state-of-the-art results in
the NAS-Bench-201 search space on CIFAR-10, CIFAR-100 and ImageNet16-120, with mean
accuracies of 93.98%, 72.12% and 45.94% respectively.
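The search loop described above can be sketched in a few lines. The snippet below is only an illustration of the idea (generate several candidates per generation, score them at initialization with a cheap proxy, train only the best, and age out older architectures); the encoding, mutation operator, proxy and training routine are hypothetical placeholders, not the authors' implementation.

```python
import random
from collections import deque

# Hypothetical placeholders: the real G-EA scores untrained networks with a
# zero-proxy estimator and fully trains only the selected candidate.
def random_architecture():
    return [random.randrange(5) for _ in range(6)]   # toy cell encoding

def mutate(parent):
    child = list(parent)
    child[random.randrange(len(child))] = random.randrange(5)
    return child

def zero_cost_score(arch):
    return sum(arch) + random.random()               # stand-in proxy score

def train_and_evaluate(arch):
    return sum(arch) / 30                            # stand-in validation accuracy

def guided_evolution(generations=20, candidates_per_gen=8, population_size=10):
    population = deque(maxlen=population_size)       # aging: oldest entries drop out first
    population.append((random_architecture(), 0.0))
    best_seen = population[0]
    for _ in range(generations):
        # Exploitation: pick the best architecture from a small tournament as parent.
        parent = max(random.sample(list(population), min(3, len(population))),
                     key=lambda p: p[1])[0]
        # Exploration: mutate the parent several times and score the offspring
        # cheaply at initialization; only the highest-scoring one is trained.
        candidates = [mutate(parent) for _ in range(candidates_per_gen)]
        chosen = max(candidates, key=zero_cost_score)
        accuracy = train_and_evaluate(chosen)
        population.append((chosen, accuracy))         # younger architectures replace older ones
        best_seen = max(best_seen, (chosen, accuracy), key=lambda p: p[1])
    return best_seen

if __name__ == "__main__":
    print(guided_evolution())
```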
Related papers
- A Pairwise Comparison Relation-assisted Multi-objective Evolutionary Neural Architecture Search Method with Multi-population Mechanism [58.855741970337675]
Neural architecture search (NAS) enables researchers to automatically explore vast search spaces and find efficient neural networks.
NAS suffers from a key bottleneck, i.e., numerous architectures need to be evaluated during the search process.
We propose SMEM-NAS, a pairwise comparison relation-assisted multi-objective evolutionary algorithm based on a multi-population mechanism.
arXiv Detail & Related papers (2024-07-22T12:46:22Z)
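A rough sketch of how a pairwise-comparison surrogate can cut that evaluation cost: a classifier learns to predict which of two architectures performs better, and new candidates are then ranked by predicted win rate without training them. The encodings, synthetic accuracies and choice of classifier below are illustrative assumptions, not the SMEM-NAS method.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
archs = rng.integers(0, 5, size=(200, 6))                  # toy architecture encodings
accs = archs.sum(axis=1) + rng.normal(0, 0.5, size=200)    # synthetic "accuracies"

# Pairwise training data: the label says whether the first architecture wins.
pairs, labels = [], []
for _ in range(2000):
    i, j = rng.integers(0, len(archs), size=2)
    pairs.append(np.concatenate([archs[i], archs[j]]))
    labels.append(int(accs[i] > accs[j]))
comparator = RandomForestClassifier(n_estimators=100, random_state=0).fit(pairs, labels)

# Rank unseen candidates by their predicted win rate against a reference set,
# without training any of them.
candidates = rng.integers(0, 5, size=(20, 6))
refs = archs[:30]
def win_rate(cand):
    feats = np.concatenate([np.tile(cand, (len(refs), 1)), refs], axis=1)
    return comparator.predict_proba(feats)[:, 1].mean()

best = max(candidates, key=win_rate)
print(best, win_rate(best))
```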
- TG-NAS: Leveraging Zero-Cost Proxies with Transformer and Graph Convolution Networks for Efficient Neural Architecture Search [1.30891455653235]
TG-NAS aims to create training-free proxies for architecture performance prediction.
We introduce TG-NAS, a novel model-based universal proxy that leverages a transformer-based operator embedding generator and a graph convolution network (GCN) to predict architecture performance.
TG-NAS achieves up to 300X improvements in search efficiency compared with previous state-of-the-art zero-cost (ZC) proxy methods.
arXiv Detail & Related papers (2024-03-30T07:25:30Z)
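A loose sketch of the model-based proxy idea: a small graph convolution network scores a cell from its adjacency matrix and per-node operator embeddings. The layer sizes, random operator embeddings and readout below are assumptions for illustration; TG-NAS pairs the GCN with a transformer-based operator embedding generator, which is omitted here.

```python
import torch
import torch.nn as nn

class GCNPredictor(nn.Module):
    """Tiny GCN that maps (adjacency, operator embeddings) to a scalar score."""
    def __init__(self, emb_dim=16, hidden=32):
        super().__init__()
        self.lin1 = nn.Linear(emb_dim, hidden)
        self.lin2 = nn.Linear(hidden, hidden)
        self.out = nn.Linear(hidden, 1)

    def forward(self, adj, node_feats):
        # Symmetrically normalise A + I, as in a vanilla GCN layer.
        a = adj + torch.eye(adj.size(0))
        d = a.sum(dim=1).pow(-0.5)
        a_norm = d.unsqueeze(1) * a * d.unsqueeze(0)
        h = torch.relu(self.lin1(a_norm @ node_feats))
        h = torch.relu(self.lin2(a_norm @ h))
        return self.out(h.mean(dim=0))          # graph-level readout -> predicted score

# Hypothetical usage: a 4-node cell; in TG-NAS the node features would come
# from the transformer-based operator embedder.
adj = torch.tensor([[0., 1., 1., 0.],
                    [0., 0., 1., 1.],
                    [0., 0., 0., 1.],
                    [0., 0., 0., 0.]])
op_embeddings = torch.randn(4, 16)
print(GCNPredictor()(adj, op_embeddings))
```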
- GeNAS: Neural Architecture Search with Better Generalization [14.92869716323226]
Recent neural architecture search (NAS) approaches rely on validation loss or accuracy to find the superior network for the target data.
In this paper, we investigate a new neural architecture search measure for excavating architectures with better generalization.
arXiv Detail & Related papers (2023-05-15T12:44:54Z)
- Guided Evolutionary Neural Architecture Search With Efficient Performance Estimation [4.637328271312329]
This paper proposes GEA, a novel approach for guided Neural Architecture Search (NAS).
GEA guides the evolution by generating and evaluating several architectures in each generation at the initialisation stage.
Results show that GEA achieves state-of-the-art results on all data sets of NAS-Bench-101, NAS-Bench-201 and TransNAS-Bench-101 benchmarks.
arXiv Detail & Related papers (2022-07-22T10:58:32Z)
- BaLeNAS: Differentiable Architecture Search via the Bayesian Learning Rule [95.56873042777316]
Differentiable Architecture Search (DARTS) has received massive attention in recent years, mainly because it significantly reduces the computational cost.
This paper formulates the neural architecture search as a distribution learning problem through relaxing the architecture weights into Gaussian distributions.
We demonstrate how differentiable NAS benefits from Bayesian principles, enhancing exploration and improving stability.
arXiv Detail & Related papers (2021-11-25T18:13:42Z)
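A minimal sketch of the "architecture weights as distributions" idea: each mixed operation keeps a learnable mean and log-standard-deviation for its architecture logits and samples them with the reparameterization trick. This is an illustrative reading of the abstract, not the BaLeNAS algorithm, which optimizes the distributions with the Bayesian learning rule.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class GaussianMixedOp(nn.Module):
    """Mixed operation whose architecture logits are Gaussian random variables
    (learned mean/log-std), sampled via the reparameterization trick."""
    def __init__(self, ops):
        super().__init__()
        self.ops = nn.ModuleList(ops)
        self.mu = nn.Parameter(torch.zeros(len(ops)))
        self.log_sigma = nn.Parameter(torch.full((len(ops),), -2.0))

    def forward(self, x):
        eps = torch.randn_like(self.mu)
        logits = self.mu + self.log_sigma.exp() * eps   # pathwise sample of the logits
        weights = F.softmax(logits, dim=-1)             # sampled architecture weights
        return sum(w * op(x) for w, op in zip(weights, self.ops))

# Hypothetical candidate operations on a 16-channel feature map.
ops = [nn.Conv2d(16, 16, 3, padding=1), nn.Conv2d(16, 16, 5, padding=2), nn.Identity()]
mixed = GaussianMixedOp(ops)
print(mixed(torch.randn(2, 16, 8, 8)).shape)
```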
- ZARTS: On Zero-order Optimization for Neural Architecture Search [94.41017048659664]
Differentiable architecture search (DARTS) has been a popular one-shot paradigm for NAS due to its high efficiency.
This work turns to zero-order optimization and proposes a novel NAS scheme, called ZARTS, to search without enforcing the gradient approximation used in DARTS.
In particular, results on 12 benchmarks verify the outstanding robustness of ZARTS, where the performance of DARTS collapses due to its known instability issue.
arXiv Detail & Related papers (2021-10-10T09:35:15Z)
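A toy sketch of the zero-order ingredient: architecture parameters are updated from loss evaluations alone via an antithetic two-point gradient estimator, with no differentiation through the supernet. The toy objective and hyperparameters below are placeholders; ZARTS itself uses more elaborate zeroth-order update rules.

```python
import numpy as np

# val_loss is a stand-in for "train weights briefly, then measure validation loss".
def val_loss(alpha):
    return np.sum((alpha - 0.3) ** 2)        # toy objective

def zero_order_step(alpha, lr=0.1, sigma=0.05, samples=8):
    grad = np.zeros_like(alpha)
    for _ in range(samples):
        u = np.random.randn(*alpha.shape)    # random search direction
        delta = val_loss(alpha + sigma * u) - val_loss(alpha - sigma * u)
        grad += delta / (2 * sigma) * u      # antithetic two-point estimator
    return alpha - lr * grad / samples

alpha = np.zeros(4)                           # architecture parameters
for _ in range(200):
    alpha = zero_order_step(alpha)
print(alpha.round(2))                         # approaches the optimum of the toy loss
```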
- Generative Adversarial Neural Architecture Search [21.05611902967155]
We propose Generative Adversarial NAS (GA-NAS) with theoretically provable convergence guarantees.
We show that GA-NAS can be used to improve already optimized baselines found by other NAS methods.
arXiv Detail & Related papers (2021-05-19T18:54:44Z)
- BossNAS: Exploring Hybrid CNN-transformers with Block-wisely Self-supervised Neural Architecture Search [100.28980854978768]
We present Block-wisely Self-supervised Neural Architecture Search (BossNAS).
We factorize the search space into blocks and utilize a novel self-supervised training scheme, named ensemble bootstrapping, to train each block separately.
We also present HyTra search space, a fabric-like hybrid CNN-transformer search space with searchable down-sampling positions.
arXiv Detail & Related papers (2021-03-23T10:05:58Z)
- DrNAS: Dirichlet Neural Architecture Search [88.56953713817545]
We treat the continuously relaxed architecture mixing weights as random variables modeled by a Dirichlet distribution.
With recently developed pathwise derivatives, the Dirichlet parameters can be easily optimized with gradient-based optimizers.
To alleviate the large memory consumption of differentiable NAS, we propose a simple yet effective progressive learning scheme.
arXiv Detail & Related papers (2020-06-18T08:23:02Z)
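A small sketch of the sampling step: PyTorch's Dirichlet distribution supports pathwise (reparameterized) samples, so gradients of a loss computed on sampled architecture weights flow back into the concentration parameters. The toy objective below is a placeholder; DrNAS couples this with a supernet and its progressive learning scheme.

```python
import torch
from torch.distributions import Dirichlet

# Dirichlet concentrations for one edge with 4 candidate operations; softplus
# keeps them positive, rsample() gives pathwise-differentiable samples.
raw_conc = torch.zeros(4, requires_grad=True)
optimizer = torch.optim.Adam([raw_conc], lr=0.1)

def toy_val_loss(weights):
    target = torch.tensor([0.7, 0.1, 0.1, 0.1])   # pretend op 0 is the best choice
    return ((weights - target) ** 2).sum()

for step in range(200):
    conc = torch.nn.functional.softplus(raw_conc) + 1e-3
    weights = Dirichlet(conc).rsample()            # sampled architecture weights
    loss = toy_val_loss(weights)
    optimizer.zero_grad()
    loss.backward()                                # gradients flow through rsample
    optimizer.step()

print(torch.nn.functional.softplus(raw_conc).detach())
```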
- Optimizing Neural Architecture Search using Limited GPU Time in a Dynamic Search Space: A Gene Expression Programming Approach [0.0]
We propose an evolutionary-based neural architecture search approach for efficient discovery of convolutional models.
With its efficient search environment and phenotype representation, Gene Expression Programming is adapted for the generation of the networks' cells.
Our proposal achieved results comparable to the state of the art set by manually designed convolutional networks and NAS-generated ones, even outperforming similarly constrained evolutionary NAS works.
arXiv Detail & Related papers (2020-05-15T17:32:30Z)
- DDPNAS: Efficient Neural Architecture Search via Dynamic Distribution Pruning [135.27931587381596]
We propose an efficient and unified NAS framework termed DDPNAS via dynamic distribution pruning.
In particular, we first sample architectures from a joint categorical distribution. Then the search space is dynamically pruned and its distribution is updated every few epochs.
With the proposed efficient network generation method, we can directly obtain the optimal neural architectures under given constraints.
arXiv Detail & Related papers (2019-05-28T06:35:52Z)
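A rough sketch of the sample-then-prune loop: architectures are drawn from a per-edge categorical distribution, the distribution is nudged toward higher-reward operations, and every few epochs the weakest operation on each edge is pruned from the search space. The reward function and update rule below are illustrative placeholders, not the DDPNAS estimator.

```python
import numpy as np

rng = np.random.default_rng(0)
num_edges, num_ops = 4, 5
probs = np.full((num_edges, num_ops), 1.0 / num_ops)   # per-edge categorical distribution
alive = np.ones((num_edges, num_ops), dtype=bool)       # operations still in the search space

def reward(arch):                                        # stand-in for a cheap evaluation
    return float(np.sum(arch == 0)) + rng.normal(0, 0.1)

for epoch in range(30):
    samples, rewards = [], []
    for _ in range(16):
        arch = np.array([rng.choice(num_ops, p=probs[e]) for e in range(num_edges)])
        samples.append(arch)
        rewards.append(reward(arch))
    # Nudge the distribution toward operations that appear in higher-reward samples.
    for arch, r in zip(samples, rewards):
        for e, op in enumerate(arch):
            probs[e, op] += 0.01 * (r - np.mean(rewards))
    probs = np.clip(probs, 1e-6, None) * alive
    probs /= probs.sum(axis=1, keepdims=True)
    # Every few epochs, prune the lowest-probability surviving op on each edge.
    if epoch % 10 == 9:
        for e in range(num_edges):
            if alive[e].sum() > 1:
                worst = np.argmin(np.where(alive[e], probs[e], np.inf))
                alive[e, worst] = False
                probs[e, worst] = 0.0
                probs[e] /= probs[e].sum()

print(probs.argmax(axis=1))     # highest-probability surviving op per edge
```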