A Survey on Evolutionary Neural Architecture Search
- URL: http://arxiv.org/abs/2008.10937v4
- Date: Fri, 4 Feb 2022 03:45:53 GMT
- Title: A Survey on Evolutionary Neural Architecture Search
- Authors: Yuqiao Liu, Yanan Sun, Bing Xue, Mengjie Zhang, Gary G. Yen, Kay Chen Tan
- Abstract summary: Neural Architecture Search (NAS) is a class of techniques that designs network architectures automatically.
EC-based NAS algorithms have recently gained much attention and success.
This paper reviews over 200 recent papers on EC-based NAS methods in light of their core components.
- Score: 20.658525685384557
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Deep Neural Networks (DNNs) have achieved great success in many applications.
The architectures of DNNs play a crucial role in their performance and are
usually designed manually by experts. However, such a design process is labour
intensive because of its trial-and-error nature, and it is hard to carry out in
practice because the required expertise is rare. Neural Architecture Search
(NAS) is a class of techniques that designs architectures automatically. Among
the different methods of realizing NAS, Evolutionary Computation (EC) methods
have recently gained much attention and success. Unfortunately, there has not
yet been a comprehensive summary of EC-based NAS algorithms. This paper reviews
over 200 recent papers on EC-based NAS methods in light of their core
components, systematically discussing their design principles and the
justifications behind them. Furthermore, current challenges and open issues are
discussed to identify directions for future research in this emerging field.
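The EC-based NAS methods surveyed here generally share a sample-mutate-evaluate-select loop over a population of candidate architectures. A minimal toy sketch of that loop is shown below; the search space (a list of layer widths), the proxy `fitness` function standing in for real training and validation, and all names are hypothetical illustrations, not the paper's algorithm.

```python
import random

# Toy search space: an architecture is a list of layer widths (assumption).
WIDTHS = [16, 32, 64, 128]

def random_arch(depth=4, rng=random):
    return [rng.choice(WIDTHS) for _ in range(depth)]

def fitness(arch):
    # Hypothetical cheap proxy standing in for validation accuracy:
    # reward total width, penalize width increases between layers.
    return sum(arch) - 10 * sum(1 for a, b in zip(arch, arch[1:]) if b > a)

def mutate(arch, rng=random):
    # Point mutation: resample one layer's width.
    child = list(arch)
    child[rng.randrange(len(child))] = rng.choice(WIDTHS)
    return child

def evolve(pop_size=10, generations=20, seed=0):
    rng = random.Random(seed)
    population = [random_arch(rng=rng) for _ in range(pop_size)]
    for _ in range(generations):
        # Tournament selection: mutate the better of two random parents.
        a, b = rng.sample(population, 2)
        parent = a if fitness(a) >= fitness(b) else b
        child = mutate(parent, rng=rng)
        # Replace the current worst individual if the child improves on it.
        worst = min(range(pop_size), key=lambda i: fitness(population[i]))
        if fitness(child) > fitness(population[worst]):
            population[worst] = child
    return max(population, key=fitness)

best = evolve()
```

Real EC-based NAS methods differ mainly in how each of these components is designed: the encoding of architectures, the variation operators, the fitness evaluation (often the dominant cost), and the selection scheme.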
Related papers
- Evolution and Efficiency in Neural Architecture Search: Bridging the Gap Between Expert Design and Automated Optimization [1.7385545432331702]
The paper provides a comprehensive overview of Neural Architecture Search.
It emphasizes its evolution from manual design to automated, computationally driven approaches.
It highlights its application across various domains, including medical imaging and natural language processing.
arXiv Detail & Related papers (2024-02-11T18:27:29Z)
- Efficient Automation of Neural Network Design: A Survey on Differentiable Neural Architecture Search [70.31239620427526]
Differentiable Neural Architecture Search (DNAS) has rapidly established itself as a leading approach to automating the discovery of deep neural network architectures.
This rise is mainly due to the popularity of DARTS, one of the first major DNAS methods.
In this comprehensive survey, we focus specifically on DNAS and review recent approaches in this field.
arXiv Detail & Related papers (2023-04-11T13:15:29Z)
- Neural Architecture Search: Insights from 1000 Papers [50.27255667347091]
We provide an organized and comprehensive guide to neural architecture search.
We give a taxonomy of search spaces, algorithms, and speedup techniques.
We discuss resources such as benchmarks, best practices, other surveys, and open-source libraries.
arXiv Detail & Related papers (2023-01-20T18:47:24Z)
- Neural Architecture Search for Dense Prediction Tasks in Computer Vision [74.9839082859151]
Deep learning has led to a rising demand for neural network architecture engineering.
Neural architecture search (NAS) aims to design neural network architectures automatically, in a data-driven manner rather than manually.
NAS has become applicable to a much wider range of problems in computer vision.
arXiv Detail & Related papers (2022-02-15T08:06:50Z)
- Neural Architecture Search in operational context: a remote sensing case-study [0.0]
Neural Architecture Search (NAS) is a framework introduced to mitigate risks by jointly optimizing network architectures and their weights.
We aim to evaluate its ability to tackle a challenging operational task: semantic segmentation of objects of interest in satellite imagery.
arXiv Detail & Related papers (2021-09-15T08:18:12Z)
- A Comprehensive Survey on Hardware-Aware Neural Architecture Search [6.23453131728063]
Neural Architecture Search (NAS) methods have been growing in popularity.
NAS has been extensively studied in the past few years.
Applying NAS to real-world problems still poses significant challenges and is not widely practical.
One solution growing in popularity is to use multi-objective optimization algorithms in the NAS search strategy by taking into account execution latency, energy consumption, memory footprint, etc.
This kind of NAS, called hardware-aware NAS (HW-NAS), makes searching for the most efficient architecture more complicated and raises several open questions.
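The multi-objective selection that HW-NAS relies on can be illustrated with a toy Pareto-front computation over two objectives, accuracy (maximize) and latency (minimize). The candidate names and numbers below are invented for illustration; real HW-NAS would obtain them from training and on-device measurement.

```python
# Hypothetical candidates: (name, proxy accuracy, measured latency in ms).
candidates = [
    ("A", 0.92, 30.0),
    ("B", 0.90, 12.0),
    ("C", 0.88, 25.0),  # worse than B on both objectives
    ("D", 0.94, 45.0),
    ("E", 0.85, 10.0),
]

def dominates(x, y):
    # x dominates y if it is no worse on both objectives and strictly
    # better on at least one (maximize accuracy, minimize latency).
    _, xa, xl = x
    _, ya, yl = y
    return xa >= ya and xl <= yl and (xa > ya or xl < yl)

def pareto_front(cands):
    # Keep every candidate that no other candidate dominates.
    return [c for c in cands
            if not any(dominates(o, c) for o in cands if o is not c)]

front = sorted(name for name, _, _ in pareto_front(candidates))
```

Instead of a single "best" network, the search returns the whole front (here every candidate except "C"), and the deployment constraints pick a point on it.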
arXiv Detail & Related papers (2021-01-22T21:13:46Z)
- Weight-Sharing Neural Architecture Search: A Battle to Shrink the Optimization Gap [90.93522795555724]
Neural architecture search (NAS) has attracted increasing attention in both academia and industry.
Weight-sharing methods were proposed in which exponentially many architectures share weights in the same super-network.
This paper provides a literature review on NAS, in particular the weight-sharing methods.
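The weight-sharing idea can be sketched as follows: every candidate architecture is assembled from operations drawn from one shared parameter table, so evaluating a candidate reuses the shared weights instead of training from scratch. The operation names, the scalar "weights", and the toy update rule below are all placeholder assumptions, not any specific method's implementation.

```python
import random

# One shared parameter table for all candidate architectures (assumption:
# scalar weights stand in for an operation's real trained parameters).
SHARED_WEIGHTS = {op: 0.0 for op in ["conv3", "conv5", "skip", "pool"]}

def sample_arch(rng, depth=3):
    # A candidate is a sequence of operations from the shared table.
    return [rng.choice(sorted(SHARED_WEIGHTS)) for _ in range(depth)]

def evaluate(arch):
    # Stand-in for validation accuracy: read the shared weights directly,
    # with no per-candidate training.
    return sum(SHARED_WEIGHTS[op] for op in arch)

def train_supernet(steps=100, seed=0):
    rng = random.Random(seed)
    for _ in range(steps):
        # Toy "gradient step": updating an op's shared weight benefits
        # every architecture that contains that op.
        for op in sample_arch(rng):
            SHARED_WEIGHTS[op] += 0.01
    # Search phase: rank sampled candidates using the shared weights.
    return max((sample_arch(rng) for _ in range(50)), key=evaluate)
```

The "optimization gap" the title refers to is precisely the mismatch between such shared-weight rankings and the accuracy each architecture would reach if trained alone.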
arXiv Detail & Related papers (2020-08-04T11:57:03Z)
- A Comprehensive Survey of Neural Architecture Search: Challenges and Solutions [48.76705090826339]
Neural Architecture Search (NAS) is a revolutionary line of research, and the related work is rich and complex.
We provide a new perspective: we begin with an overview of the characteristics of the earliest NAS algorithms and summarize the problems in these early approaches.
Besides, we conduct a detailed and comprehensive analysis, comparison, and summary of these works.
arXiv Detail & Related papers (2020-06-01T13:08:03Z)
- Stage-Wise Neural Architecture Search [65.03109178056937]
Modern convolutional networks such as ResNet and NASNet have achieved state-of-the-art results in many computer vision applications.
These networks consist of stages, which are sets of layers that operate on representations in the same resolution.
It has been demonstrated that increasing the number of layers in each stage improves the prediction ability of the network.
However, the resulting architecture becomes computationally expensive in terms of floating point operations, memory requirements and inference time.
arXiv Detail & Related papers (2020-04-23T14:16:39Z) - Modeling Neural Architecture Search Methods for Deep Networks [9.561123408923489]
We propose a general abstraction model for neural architecture search (NAS) methods.
The model makes it possible to compare different design approaches, categorizing them and identifying critical areas of interest.
arXiv Detail & Related papers (2019-12-31T05:51:03Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of this list (including all information) and is not responsible for any consequences.