Evolution and Efficiency in Neural Architecture Search: Bridging the Gap Between Expert Design and Automated Optimization
- URL: http://arxiv.org/abs/2403.17012v2
- Date: Tue, 2 Apr 2024 06:35:04 GMT
- Title: Evolution and Efficiency in Neural Architecture Search: Bridging the Gap Between Expert Design and Automated Optimization
- Authors: Fanfei Meng, Chen-Ao Wang, Lele Zhang
- Abstract summary: The paper provides a comprehensive overview of Neural Architecture Search.
It emphasizes its evolution from manual design to automated, computationally-driven approaches.
It highlights its application across various domains, including medical imaging and natural language processing.
- Score: 1.7385545432331702
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: The paper provides a comprehensive overview of Neural Architecture Search (NAS), emphasizing its evolution from manual design to automated, computationally-driven approaches. It covers the inception and growth of NAS, highlighting its application across various domains, including medical imaging and natural language processing. The document details the shift from expert-driven design to algorithm-driven processes, exploring initial methodologies like reinforcement learning and evolutionary algorithms. It also discusses the challenges of computational demands and the emergence of efficient NAS methodologies, such as Differentiable Architecture Search and hardware-aware NAS. The paper further elaborates on NAS's application in computer vision, NLP, and beyond, demonstrating its versatility and potential for optimizing neural network architectures across different tasks. Future directions and challenges, including computational efficiency and the integration with emerging AI domains, are addressed, showcasing NAS's dynamic nature and its continued evolution towards more sophisticated and efficient architecture search methods.
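The evolutionary search loop that the abstract refers to can be sketched in a few lines. This is a minimal toy illustration, not any specific method from the surveyed papers: the operation set, architecture encoding, and proxy fitness function are all invented for the sketch, and a real NAS system would replace the fitness stub with trained-model validation accuracy.

```python
# Toy sketch of evolutionary neural architecture search (NAS).
# All names and the fitness function are illustrative assumptions.
import random

# Toy search space: an architecture is a fixed-length list of operation choices.
OPS = ["conv3x3", "conv5x5", "maxpool", "skip"]
ARCH_LEN = 6

def random_arch():
    return [random.choice(OPS) for _ in range(ARCH_LEN)]

def mutate(arch):
    """Evolutionary operator: replace one randomly chosen operation."""
    child = list(arch)
    i = random.randrange(len(child))
    child[i] = random.choice(OPS)
    return child

def fitness(arch):
    """Stand-in for validation accuracy; real NAS would train and
    evaluate a network here. This toy score merely rewards convolutions."""
    return sum(1.0 if op.startswith("conv") else 0.25 for op in arch)

def evolve(generations=50, pop_size=16, seed=0):
    random.seed(seed)
    population = [random_arch() for _ in range(pop_size)]
    for _ in range(generations):
        population.sort(key=fitness, reverse=True)
        parents = population[: pop_size // 2]  # truncation selection
        children = [mutate(random.choice(parents))
                    for _ in range(pop_size - len(parents))]
        population = parents + children
    return max(population, key=fitness)

best = evolve()
```

Because the top half of each generation survives unmutated, the best fitness found never decreases, which is the property that makes even this crude loop converge toward strong architectures in the toy space.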
Related papers
- DNA Family: Boosting Weight-Sharing NAS with Block-Wise Supervisions [121.05720140641189]
We develop a family of models using distilling neural architecture (DNA) techniques.
Our proposed DNA models can rate all architecture candidates, as opposed to previous works that can only access a sub-search space using algorithms.
Our models achieve state-of-the-art top-1 accuracy of 78.9% and 83.6% on ImageNet for a mobile convolutional network and a small vision transformer, respectively.
arXiv Detail & Related papers (2024-03-02T22:16:47Z)
- SONATA: Self-adaptive Evolutionary Framework for Hardware-aware Neural Architecture Search [0.7646713951724011]
HW-aware Neural Architecture Search (HW-aware NAS) emerges as an attractive strategy to automate the design of neural networks (NNs).
We propose SONATA, a self-adaptive evolutionary algorithm for HW-aware NAS.
Our method leverages adaptive evolutionary operators guided by the learned importance of NN design parameters.
arXiv Detail & Related papers (2024-02-20T18:15:11Z)
- Neural Architecture Search for Speech Emotion Recognition [72.1966266171951]
We propose to apply neural architecture search (NAS) techniques to automatically configure the SER models.
We show that NAS can improve SER performance (54.89% to 56.28%) while maintaining model parameter sizes.
arXiv Detail & Related papers (2022-03-31T10:16:10Z)
- Accelerating Neural Architecture Exploration Across Modalities Using Genetic Algorithms [5.620334754517149]
We show how genetic algorithms can be paired with lightly trained objective predictors in an iterative cycle to accelerate multi-objective architectural exploration.
NAS research efforts have centered around computer vision tasks and only recently have other modalities, such as the rapidly growing field of natural language processing, been investigated in depth.
arXiv Detail & Related papers (2022-02-25T20:01:36Z)
- Neural Architecture Search for Dense Prediction Tasks in Computer Vision [74.9839082859151]
Deep learning has led to a rising demand for neural network architecture engineering.
Neural architecture search (NAS) aims at automatically designing neural network architectures in a data-driven manner rather than manually.
NAS has become applicable to a much wider range of problems in computer vision.
arXiv Detail & Related papers (2022-02-15T08:06:50Z)
- MS-RANAS: Multi-Scale Resource-Aware Neural Architecture Search [94.80212602202518]
We propose Multi-Scale Resource-Aware Neural Architecture Search (MS-RANAS).
We employ a one-shot architecture search approach in order to obtain a reduced search cost.
We achieve state-of-the-art results in terms of accuracy-speed trade-off.
arXiv Detail & Related papers (2020-09-29T11:56:01Z)
- A Survey on Evolutionary Neural Architecture Search [20.658525685384557]
Neural Architecture Search (NAS) is a technology for designing network architectures automatically.
Evolutionary computation (EC)-based NAS algorithms have recently gained much attention and success.
This paper reviews over 200 recent papers on EC-based NAS methods in light of their core components.
arXiv Detail & Related papers (2020-08-25T11:00:46Z)
- A Comprehensive Survey of Neural Architecture Search: Challenges and Solutions [48.76705090826339]
Neural Architecture Search (NAS) is a revolutionary field, and the related research work is complicated and rich.
We provide a new perspective, beginning with an overview of the characteristics of the earliest NAS algorithms and a summary of the problems in these early approaches.
Besides, we conduct a detailed and comprehensive analysis, comparison, and summary of these works.
arXiv Detail & Related papers (2020-06-01T13:08:03Z)
- An Introduction to Neural Architecture Search for Convolutional Networks [0.0]
Neural Architecture Search (NAS) is a research field concerned with utilizing optimization algorithms to design optimal neural network architectures.
We provide an introduction to the basic concepts of NAS for convolutional networks, along with the major advances in search spaces, algorithms and evaluation techniques.
arXiv Detail & Related papers (2020-05-22T09:33:22Z)
- Stage-Wise Neural Architecture Search [65.03109178056937]
Modern convolutional networks such as ResNet and NASNet have achieved state-of-the-art results in many computer vision applications.
These networks consist of stages, which are sets of layers that operate on representations in the same resolution.
It has been demonstrated that increasing the number of layers in each stage improves the prediction ability of the network.
However, the resulting architecture becomes computationally expensive in terms of floating point operations, memory requirements and inference time.
arXiv Detail & Related papers (2020-04-23T14:16:39Z)
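The depth-versus-cost trade-off noted in the last entry can be made concrete with a toy FLOP count. This is a minimal sketch under assumed shapes; the helper names and the 56x56, 64-channel stage are illustrative and not taken from any of the papers above.

```python
# Toy illustration of the trade-off above: because a stage keeps its
# resolution fixed, each added layer contributes the same compute cost,
# so stage FLOPs grow linearly with depth.

def conv_flops(h, w, c_in, c_out, k=3):
    """Multiply-accumulate count for one k x k convolution layer
    producing an h x w output."""
    return h * w * c_in * c_out * k * k

def stage_flops(h, w, channels, num_layers, k=3):
    """Total cost of a stage of num_layers identical conv layers."""
    return num_layers * conv_flops(h, w, channels, channels, k)

# Doubling the depth of a 56x56, 64-channel stage doubles its FLOPs.
shallow = stage_flops(56, 56, 64, num_layers=2)
deep = stage_flops(56, 56, 64, num_layers=4)
```

This linear growth in compute (and the corresponding growth in memory and inference time) is what motivates searching over how layers are distributed across stages rather than simply deepening every stage.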
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of this list (including all information) and is not responsible for any consequences.