A Comprehensive Survey of Neural Architecture Search: Challenges and
Solutions
- URL: http://arxiv.org/abs/2006.02903v3
- Date: Tue, 2 Mar 2021 08:35:02 GMT
- Title: A Comprehensive Survey of Neural Architecture Search: Challenges and
Solutions
- Authors: Pengzhen Ren, Yun Xiao, Xiaojun Chang, Po-Yao Huang, Zhihui Li,
Xiaojiang Chen, and Xin Wang
- Abstract summary: Neural Architecture Search (NAS) is a revolutionary approach, and the related body of research is rich and complex.
We provide a new perspective: beginning with an overview of the characteristics of the earliest NAS algorithms and then summarizing the problems in these early algorithms.
In addition, we conduct a detailed and comprehensive analysis, comparison, and summary of these works.
- Score: 48.76705090826339
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Deep learning has made breakthroughs and substantial progress in
many fields due to its powerful automatic representation capabilities. It has
been proven that neural architecture design is crucial to the feature
representation of data and the final performance. However, the design of the
neural architecture heavily relies on researchers' prior knowledge and
experience. Due to the limitations of humans' inherent knowledge, it is
difficult for people to jump out of their original thinking paradigms and
design an optimal model. Therefore,
an intuitive idea would be to reduce human intervention as much as possible and
let the algorithm automatically design the neural architecture. Neural
Architecture Search (NAS) is just such a revolutionary approach, and the
related research work is rich and complex. Therefore, a comprehensive and
systematic survey on the NAS is essential. Previously related surveys have
begun to classify existing work mainly based on the key components of NAS:
search space, search strategy, and evaluation strategy. While this
classification method is relatively intuitive, it makes it difficult for readers to grasp
the challenges and the landmark work involved. Therefore, in this survey, we
provide a new perspective: beginning with an overview of the characteristics of
the earliest NAS algorithms, summarizing the problems in these early NAS
algorithms, and then providing solutions for subsequent related research work.
In addition, we conduct a detailed and comprehensive analysis, comparison, and
summary of these works. Finally, we provide some possible future research
directions.
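The three key components named in the abstract (search space, search strategy, and evaluation strategy) can be illustrated with a toy random-search loop. Everything below is a hypothetical sketch: the search space, the scoring function, and all names are invented for illustration, and a real evaluation strategy would train each candidate model rather than score it analytically.

```python
import random

# Search space: the set of architectural choices available to the algorithm.
SEARCH_SPACE = {
    "num_layers": [2, 4, 6, 8],
    "width": [64, 128, 256],
    "activation": ["relu", "gelu", "tanh"],
}

def sample_architecture(space):
    """Search strategy: draw one candidate uniformly at random."""
    return {key: random.choice(values) for key, values in space.items()}

def evaluate(arch):
    """Evaluation strategy: stand-in score. A real NAS system would train
    the candidate and return its validation accuracy."""
    return arch["num_layers"] * 0.05 + arch["width"] / 1000.0

def random_search(space, trials=20):
    """Run the NAS loop: sample, evaluate, keep the best candidate."""
    best_arch, best_score = None, float("-inf")
    for _ in range(trials):
        arch = sample_architecture(space)
        score = evaluate(arch)
        if score > best_score:
            best_arch, best_score = arch, score
    return best_arch, best_score

best, score = random_search(SEARCH_SPACE)
print(best, score)
```

Random search is the simplest possible search strategy; the early NAS algorithms the survey discusses replace it with reinforcement learning or evolutionary methods, and later work focuses on making the evaluation step cheaper.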
Related papers
- Insights from the Use of Previously Unseen Neural Architecture Search Datasets [6.239015118429603]
We present eight new datasets created for a series of NAS Challenges: AddNIST, Language, MultNIST, CIFARTile, Gutenberg, Isabella, GeoClassing, and Chesseract.
These datasets and challenges are developed to direct attention to issues in NAS development and to encourage authors to consider how their models will perform on datasets unknown to them at development time.
arXiv Detail & Related papers (2024-04-02T16:48:34Z) - DNA Family: Boosting Weight-Sharing NAS with Block-Wise Supervisions [121.05720140641189]
We develop a family of models with the distilling neural architecture (DNA) techniques.
Our proposed DNA models can rate all architecture candidates, as opposed to previous works that can only access a sub-search space using algorithms.
Our models achieve state-of-the-art top-1 accuracy of 78.9% and 83.6% on ImageNet for a mobile convolutional network and a small vision transformer, respectively.
arXiv Detail & Related papers (2024-03-02T22:16:47Z) - Neural Architecture Search: Insights from 1000 Papers [50.27255667347091]
We provide an organized and comprehensive guide to neural architecture search.
We give a taxonomy of search spaces, algorithms, and speedup techniques.
We discuss resources such as benchmarks, best practices, other surveys, and open-source libraries.
arXiv Detail & Related papers (2023-01-20T18:47:24Z) - Towards Data-and Knowledge-Driven Artificial Intelligence: A Survey on Neuro-Symbolic Computing [73.0977635031713]
Neural-symbolic computing (NeSy) has been an active research area of Artificial Intelligence (AI) for many years.
NeSy shows promise of reconciling the advantages of symbolic representation (reasoning and interpretability) with the robust learning of neural networks.
arXiv Detail & Related papers (2022-10-28T04:38:10Z) - Accelerating Neural Architecture Exploration Across Modalities Using
Genetic Algorithms [5.620334754517149]
We show how genetic algorithms can be paired with lightly trained objective predictors in an iterative cycle to accelerate multi-objective architectural exploration.
NAS research efforts have centered around computer vision tasks and only recently have other modalities, such as the rapidly growing field of natural language processing, been investigated in depth.
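The predictor-guided genetic loop described in this entry can be illustrated with a minimal sketch. This is not the paper's implementation: the bit-string genome, the `predictor` surrogate, and the mutation scheme are invented stand-ins; in a real NAS setting the genome would encode an architecture and the surrogate would be lightly trained on measured objectives.

```python
import random

def predictor(genome):
    """Stand-in for a lightly trained surrogate; scores a genome cheaply
    instead of fully training the corresponding architecture."""
    return sum(genome)

def mutate(genome, rate=0.2):
    """Flip each bit of the genome with probability `rate`."""
    return [1 - g if random.random() < rate else g for g in genome]

def evolve(pop_size=8, genome_len=10, generations=15):
    """Genetic loop: mutate, then let the surrogate select survivors."""
    population = [[random.randint(0, 1) for _ in range(genome_len)]
                  for _ in range(pop_size)]
    for _ in range(generations):
        # Generate offspring by mutating randomly chosen parents.
        offspring = [mutate(random.choice(population)) for _ in range(pop_size)]
        # Surrogate-guided selection: keep the predictor's top pop_size.
        population = sorted(population + offspring,
                            key=predictor, reverse=True)[:pop_size]
    return max(population, key=predictor)

best = evolve()
print(best, predictor(best))
```

Because the surrogate is cheap, many more candidates can be ranked per generation than could ever be fully trained, which is the source of the acceleration the entry describes.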
arXiv Detail & Related papers (2022-02-25T20:01:36Z) - Neural Architecture Search for Dense Prediction Tasks in Computer Vision [74.9839082859151]
Deep learning has led to a rising demand for neural network architecture engineering.
Neural architecture search (NAS) aims at automatically designing neural network architectures in a data-driven manner rather than manually.
NAS has become applicable to a much wider range of problems in computer vision.
arXiv Detail & Related papers (2022-02-15T08:06:50Z) - Poisoning the Search Space in Neural Architecture Search [0.0]
We evaluate the robustness of one such algorithm known as Efficient NAS against data poisoning attacks on the original search space.
Our results provide insights into the challenges to surmount in using NAS for more adversarially robust architecture search.
arXiv Detail & Related papers (2021-06-28T05:45:57Z) - A Comprehensive Survey on Hardware-Aware Neural Architecture Search [6.23453131728063]
Neural Architecture Search (NAS) methods have been growing in popularity.
NAS has been extensively studied in the past few years.
Applying NAS to real-world problems still poses significant challenges and is not widely practical.
One solution growing in popularity is to use multi-objective optimization algorithms in the NAS search strategy by taking into account execution latency, energy consumption, memory footprint, etc.
This kind of NAS, called hardware-aware NAS (HW-NAS), makes searching for the most efficient architecture more complicated and opens several questions.
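The multi-objective idea behind HW-NAS can be illustrated with a small Pareto-filtering sketch. The candidate architectures, their metric values, and the choice of objectives below are invented for illustration; a real HW-NAS system would measure accuracy, latency, energy consumption, and memory footprint on the target hardware.

```python
# Invented candidate architectures with (assumed) measured objectives.
candidates = [
    {"name": "A", "accuracy": 0.76, "latency_ms": 12.0, "memory_mb": 40.0},
    {"name": "B", "accuracy": 0.79, "latency_ms": 25.0, "memory_mb": 55.0},
    {"name": "C", "accuracy": 0.74, "latency_ms": 30.0, "memory_mb": 60.0},
    {"name": "D", "accuracy": 0.79, "latency_ms": 18.0, "memory_mb": 50.0},
]

def dominates(a, b):
    """a dominates b: no worse on every objective, strictly better on one."""
    no_worse = (a["accuracy"] >= b["accuracy"]
                and a["latency_ms"] <= b["latency_ms"]
                and a["memory_mb"] <= b["memory_mb"])
    strictly = (a["accuracy"] > b["accuracy"]
                or a["latency_ms"] < b["latency_ms"]
                or a["memory_mb"] < b["memory_mb"])
    return no_worse and strictly

def pareto_front(cands):
    """Keep only candidates not dominated by any other candidate."""
    return [c for c in cands
            if not any(dominates(other, c) for other in cands if other is not c)]

front = sorted(c["name"] for c in pareto_front(candidates))
print(front)  # ['A', 'D']: B is dominated by D, C by A
```

Here no single architecture is "best": A is fastest and lightest while D is most accurate among the efficient ones, which is exactly the trade-off surface a multi-objective HW-NAS search strategy explores.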
arXiv Detail & Related papers (2021-01-22T21:13:46Z) - NAS-Navigator: Visual Steering for Explainable One-Shot Deep Neural
Network Synthesis [53.106414896248246]
We present a framework that allows analysts to effectively build the solution sub-graph space and guide the network search by injecting their domain knowledge.
Applying this technique in an iterative manner allows analysts to converge to the best performing neural network architecture for a given application.
arXiv Detail & Related papers (2020-09-28T01:48:45Z)
This list is automatically generated from the titles and abstracts of the papers on this site.