Neural Architecture Search: Insights from 1000 Papers
- URL: http://arxiv.org/abs/2301.08727v1
- Date: Fri, 20 Jan 2023 18:47:24 GMT
- Title: Neural Architecture Search: Insights from 1000 Papers
- Authors: Colin White, Mahmoud Safari, Rhea Sukthanker, Binxin Ru, Thomas
Elsken, Arber Zela, Debadeepta Dey, Frank Hutter
- Abstract summary: We provide an organized and comprehensive guide to neural architecture search.
We give a taxonomy of search spaces, algorithms, and speedup techniques.
We discuss resources such as benchmarks, best practices, other surveys, and open-source libraries.
- Score: 50.27255667347091
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: In the past decade, advances in deep learning have resulted in breakthroughs
in a variety of areas, including computer vision, natural language
understanding, speech recognition, and reinforcement learning. Specialized,
high-performing neural architectures are crucial to the success of deep
learning in these areas. Neural architecture search (NAS), the process of
automating the design of neural architectures for a given task, is an
inevitable next step in automating machine learning and has already outpaced
the best human-designed architectures on many tasks. In the past few years,
research in NAS has been progressing rapidly, with over 1000 papers released
since 2020. In this survey, we provide an organized and comprehensive guide to
neural architecture search. We give a taxonomy of search spaces, algorithms,
and speedup techniques, and we discuss resources such as benchmarks, best
practices, other surveys, and open-source libraries.
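The abstract's taxonomy separates search spaces, search algorithms, and speedup techniques. As a hedged illustration of how these three pieces fit together (a toy sketch, not anything from the survey itself), the simplest NAS baseline is random search over a small operation-choice space with a stand-in evaluation:

```python
import random

# Toy search space: an architecture is a list of (operation, kernel_size)
# choices per layer. All names here are illustrative placeholders.
OPS = ["conv", "sep_conv", "max_pool", "identity"]
KERNELS = [1, 3, 5]

def sample_architecture(num_layers=4):
    """Draw one architecture uniformly from the toy search space."""
    return [(random.choice(OPS), random.choice(KERNELS)) for _ in range(num_layers)]

def evaluate(arch):
    """Stand-in for training + validation. A real NAS run would train the
    network, or apply a speedup technique (weight sharing, performance
    prediction, early stopping) to estimate validation accuracy cheaply."""
    return sum(k for _, k in arch) / (5 * len(arch))  # deterministic toy score in (0, 1]

def random_search(budget=20, seed=0):
    """Search algorithm: sample `budget` architectures, keep the best."""
    random.seed(seed)
    best_arch, best_score = None, float("-inf")
    for _ in range(budget):
        arch = sample_architecture()
        score = evaluate(arch)
        if score > best_score:
            best_arch, best_score = arch, score
    return best_arch, best_score

arch, score = random_search()
```

Random search is a common NAS baseline precisely because each of the three components above can be swapped independently: a richer space, a smarter algorithm, or a cheaper evaluator.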
Related papers
- An Approach for Efficient Neural Architecture Search Space Definition [0.0]
We propose a novel cell-based hierarchical search space, easy to comprehend and manipulate.
The objectives of the proposed approach are to optimize search time and to be general enough to handle most state-of-the-art CNN architectures.
arXiv Detail & Related papers (2023-10-25T08:07:29Z) - Automating Neural Architecture Design without Search [3.651848964235307]
We study the automated architecture design from a new perspective that eliminates the need to sequentially evaluate each neural architecture generated during algorithm execution.
We implement the proposed approach using a graph neural network for link prediction, acquiring the knowledge from NAS-Bench-101.
In addition, we utilize the knowledge learned from NAS-Bench-101 to automate architecture design in the DARTS search space, achieving 97.82% accuracy on CIFAR-10 and 76.51% top-1 accuracy on ImageNet while consuming only $2\times10^{-4}$ GPU days.
arXiv Detail & Related papers (2022-04-21T14:41:05Z) - Neural Architecture Search for Speech Emotion Recognition [72.1966266171951]
We propose to apply neural architecture search (NAS) techniques to automatically configure the SER models.
We show that NAS can improve SER performance (54.89% to 56.28%) while maintaining model parameter sizes.
arXiv Detail & Related papers (2022-03-31T10:16:10Z) - Accelerating Neural Architecture Exploration Across Modalities Using
Genetic Algorithms [5.620334754517149]
We show how genetic algorithms can be paired with lightly trained objective predictors in an iterative cycle to accelerate multi-objective architectural exploration.
NAS research efforts have centered around computer vision tasks and only recently have other modalities, such as the rapidly growing field of natural language processing, been investigated in depth.
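The idea of pairing a genetic algorithm with a lightly trained objective predictor can be sketched as follows. This is an illustrative toy, not the paper's implementation: the "predictor" here is a nearest-neighbour lookup over a handful of genomes whose true objective has been measured, standing in for whatever cheap surrogate the authors actually use.

```python
import random

GENES = [0, 1, 2, 3]   # encoded operation choices per position (illustrative)
GENOME_LEN = 8

def true_objective(genome):
    """Stand-in for an expensive evaluation (e.g. training an architecture
    and measuring accuracy or latency); called for only a few genomes."""
    return sum(genome)

def fit_predictor(samples):
    """'Lightly trained' surrogate: 1-nearest-neighbour (Hamming distance)
    lookup over the few (genome, objective) pairs measured so far."""
    def predict(genome):
        nearest = min(samples, key=lambda s: sum(a != b for a, b in zip(s[0], genome)))
        return nearest[1]
    return predict

def mutate(genome, rate=0.2):
    return [random.choice(GENES) if random.random() < rate else g for g in genome]

def evolve(generations=10, pop_size=12, seed=0):
    random.seed(seed)
    pop = [[random.choice(GENES) for _ in range(GENOME_LEN)] for _ in range(pop_size)]
    # Measure a handful of genomes for real, then fit the cheap surrogate.
    # An iterative cycle would re-measure promising genomes and refit here.
    measured = [(g, true_objective(g)) for g in pop[:4]]
    predict = fit_predictor(measured)
    for _ in range(generations):
        pop.sort(key=predict, reverse=True)        # surrogate-scored selection
        parents = pop[: pop_size // 2]
        pop = parents + [mutate(random.choice(parents)) for _ in range(pop_size - len(parents))]
    return max(pop, key=true_objective)            # final pick uses the true objective

best = evolve()
```

The design point is that selection pressure comes almost entirely from the cheap predictor, so the expensive objective is queried only to seed (and periodically refresh) the surrogate.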
arXiv Detail & Related papers (2022-02-25T20:01:36Z) - Neural Architecture Search for Dense Prediction Tasks in Computer Vision [74.9839082859151]
Deep learning has led to a rising demand for neural network architecture engineering.
Neural architecture search (NAS) aims to automatically design neural network architectures in a data-driven manner rather than manually.
NAS has become applicable to a much wider range of problems in computer vision.
arXiv Detail & Related papers (2022-02-15T08:06:50Z) - NeuralArTS: Structuring Neural Architecture Search with Type Theory [0.0]
We present a new framework called Neural Architecture Type System (NeuralArTS) that categorizes the infinite set of network operations in a structured type system.
We show how NeuralArTS can be applied to convolutional layers and propose several future directions.
arXiv Detail & Related papers (2021-10-17T03:28:27Z) - NAS-Navigator: Visual Steering for Explainable One-Shot Deep Neural
Network Synthesis [53.106414896248246]
We present a framework that allows analysts to effectively build the solution sub-graph space and guide the network search by injecting their domain knowledge.
Applying this technique in an iterative manner allows analysts to converge to the best performing neural network architecture for a given application.
arXiv Detail & Related papers (2020-09-28T01:48:45Z) - Breaking the Curse of Space Explosion: Towards Efficient NAS with
Curriculum Search [94.46818035655943]
We propose a curriculum search method that starts from a small search space and gradually incorporates the learned knowledge to guide the search in a large space.
With the proposed search strategy, our Curriculum Neural Architecture Search (CNAS) method significantly improves the search efficiency and finds better architectures than existing NAS methods.
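The curriculum idea of growing the search space while carrying knowledge forward can be sketched like this. The code is a hedged toy (names and the mutation-based inner search are illustrative assumptions, not CNAS code): each stage unlocks more operations and warm-starts from the previous stage's best architecture.

```python
import random

ALL_OPS = ["identity", "conv3", "conv5", "sep_conv", "dil_conv", "max_pool"]

def evaluate(arch):
    """Stand-in scorer; a real run would train or estimate accuracy."""
    return sum(len(op) for op in arch)

def search_in_space(ops, num_layers, budget, warm_start=None, seed=0):
    """Simple hill climbing inside one (sub-)search space."""
    rng = random.Random(seed)
    best = warm_start or [rng.choice(ops) for _ in range(num_layers)]
    best_score = evaluate(best)
    for _ in range(budget):
        cand = list(best)
        cand[rng.randrange(num_layers)] = rng.choice(ops)  # local mutation
        if evaluate(cand) > best_score:
            best, best_score = cand, evaluate(cand)
    return best

def curriculum_search(num_layers=4, budget_per_stage=50):
    best = None
    # Curriculum: stage k searches over only the first k operations, so the
    # knowledge gained in a small space guides the search in a larger one.
    for stage in range(2, len(ALL_OPS) + 1):
        ops = ALL_OPS[:stage]
        best = search_in_space(ops, num_layers, budget_per_stage,
                               warm_start=best, seed=stage)
    return best

arch = curriculum_search()
```

Because every stage's space is a superset of the previous one, the warm start is always a valid member of the enlarged space, which is what makes the gradual transfer well-defined.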
arXiv Detail & Related papers (2020-07-07T02:29:06Z) - A Comprehensive Survey of Neural Architecture Search: Challenges and
Solutions [48.76705090826339]
Neural Architecture Search (NAS) is a revolutionary approach, and the related research is rich and complex.
We provide a new perspective, beginning with an overview of the characteristics of the earliest NAS algorithms and summarizing the problems in these early approaches.
Besides, we conduct a detailed and comprehensive analysis, comparison, and summary of these works.
arXiv Detail & Related papers (2020-06-01T13:08:03Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences.