Neural Architecture Search for Dense Prediction Tasks in Computer Vision
- URL: http://arxiv.org/abs/2202.07242v1
- Date: Tue, 15 Feb 2022 08:06:50 GMT
- Title: Neural Architecture Search for Dense Prediction Tasks in Computer Vision
- Authors: Thomas Elsken, Arber Zela, Jan Hendrik Metzen, Benedikt Staffler,
Thomas Brox, Abhinav Valada, Frank Hutter
- Abstract summary: Deep learning has led to a rising demand for neural network architecture engineering.
Neural architecture search (NAS) aims at automatically designing neural network architectures in a data-driven manner rather than manually.
NAS has become applicable to a much wider range of problems in computer vision.
- Score: 74.9839082859151
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: The success of deep learning in recent years has led to a rising demand for
neural network architecture engineering. As a consequence, neural architecture
search (NAS), which aims at automatically designing neural network
architectures in a data-driven manner rather than manually, has evolved as a
popular field of research. With the advent of weight sharing strategies across
architectures, NAS has become applicable to a much wider range of problems. In
particular, there are now many publications for dense prediction tasks in
computer vision that require pixel-level predictions, such as semantic
segmentation or object detection. These tasks come with novel challenges, such
as higher memory footprints due to high-resolution data, learning multi-scale
representations, longer training times, and more complex and larger neural
architectures. In this manuscript, we provide an overview of NAS for dense
prediction tasks by elaborating on these novel challenges and surveying ways to
address them to ease future research and application of existing methods to
novel problems.
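The weight-sharing strategies mentioned in the abstract are commonly realized through a supernet that contains all candidate operations, so that every sampled sub-architecture reuses the same parameters. Below is a minimal, illustrative sketch of that idea in PyTorch; the toy search space, operation names, and class names are assumptions for illustration and are not taken from the survey.

```python
# Minimal sketch (not from the paper) of weight-sharing one-shot NAS:
# a small supernet whose candidate operations are shared by all sampled
# sub-architectures. The search space here is a toy example.
import random
import torch
import torch.nn as nn


class MixedOp(nn.Module):
    """Holds all candidate operations for one layer; a sampled sub-network picks one."""

    def __init__(self, channels):
        super().__init__()
        self.candidates = nn.ModuleDict({
            "conv3x3": nn.Conv2d(channels, channels, 3, padding=1),
            "conv5x5": nn.Conv2d(channels, channels, 5, padding=2),
            "skip": nn.Identity(),
        })

    def forward(self, x, choice):
        # Weights of the chosen op are shared by every architecture that selects it.
        return self.candidates[choice](x)


class SuperNet(nn.Module):
    def __init__(self, channels=16, num_layers=4, num_classes=10):
        super().__init__()
        self.stem = nn.Conv2d(3, channels, 3, padding=1)
        self.layers = nn.ModuleList(MixedOp(channels) for _ in range(num_layers))
        self.head = nn.Linear(channels, num_classes)

    def forward(self, x, architecture):
        # `architecture` is a list of operation names, one per layer.
        x = self.stem(x)
        for layer, choice in zip(self.layers, architecture):
            x = layer(x, choice)
        x = x.mean(dim=(2, 3))  # global average pooling
        return self.head(x)


supernet = SuperNet()
ops = ["conv3x3", "conv5x5", "skip"]
# One-shot training step: sample a random sub-architecture and train the shared weights.
arch = [random.choice(ops) for _ in supernet.layers]
logits = supernet(torch.randn(2, 3, 32, 32), arch)
```

In a real dense-prediction setting the supernet would additionally have to encode multi-scale features and a decoder, which is exactly where the memory-footprint and training-time challenges discussed in the abstract arise.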
Related papers
- Towards Scalable and Versatile Weight Space Learning [51.78426981947659]
This paper introduces the SANE approach to weight-space learning.
Our method extends the idea of hyper-representations towards sequential processing of subsets of neural network weights.
arXiv Detail & Related papers (2024-06-14T13:12:07Z)
- Homological Neural Networks: A Sparse Architecture for Multivariate Complexity [0.0]
We develop a novel deep neural network unit characterized by a sparse higher-order graphical architecture built over the homological structure of the underlying data.
Results demonstrate the advantages of this novel design, which can match or surpass state-of-the-art machine learning and deep learning models while using only a fraction of the parameters.
arXiv Detail & Related papers (2023-06-27T09:46:16Z)
- Neural Architecture Search: Insights from 1000 Papers [50.27255667347091]
We provide an organized and comprehensive guide to neural architecture search.
We give a taxonomy of search spaces, algorithms, and speedup techniques.
We discuss resources such as benchmarks, best practices, other surveys, and open-source libraries.
arXiv Detail & Related papers (2023-01-20T18:47:24Z)
- DQNAS: Neural Architecture Search using Reinforcement Learning [6.33280703577189]
Convolutional Neural Networks have been used in a variety of image-related applications.
In this paper, we propose an automated Neural Architecture Search framework guided by the principles of Reinforcement Learning.
arXiv Detail & Related papers (2023-01-17T04:01:47Z)
- SuperNet in Neural Architecture Search: A Taxonomic Survey [14.037182039950505]
This survey focuses on supernet optimization, which builds a neural network that assembles all candidate architectures as its sub-models through weight sharing.
We organize supernet approaches as solutions to common challenges found in the literature: data-side optimization, alleviation of poor rank correlation, and transferable NAS for a number of deployment scenarios.
arXiv Detail & Related papers (2022-04-08T08:29:52Z)
- Accelerating Neural Architecture Exploration Across Modalities Using Genetic Algorithms [5.620334754517149]
We show how genetic algorithms can be paired with lightly trained objective predictors in an iterative cycle to accelerate multi-objective architectural exploration (a minimal sketch of this loop follows the list).
NAS research efforts have centered around computer vision tasks, and only recently have other modalities, such as the rapidly growing field of natural language processing, been investigated in depth.
arXiv Detail & Related papers (2022-02-25T20:01:36Z)
- Neural Architecture Search in operational context: a remote sensing case-study [0.0]
Neural Architecture Search (NAS) is a framework introduced to mitigate risks by jointly optimizing network architectures and their weights.
We aim to evaluate its ability to tackle a challenging operational task: semantic segmentation of objects of interest in satellite imagery.
arXiv Detail & Related papers (2021-09-15T08:18:12Z)
- MS-RANAS: Multi-Scale Resource-Aware Neural Architecture Search [94.80212602202518]
We propose Multi-Scale Resource-Aware Neural Architecture Search (MS-RANAS).
We employ a one-shot architecture search approach in order to obtain a reduced search cost.
We achieve state-of-the-art results in terms of the accuracy-speed trade-off.
arXiv Detail & Related papers (2020-09-29T11:56:01Z)
- NAS-Navigator: Visual Steering for Explainable One-Shot Deep Neural Network Synthesis [53.106414896248246]
We present a framework that allows analysts to effectively build the solution sub-graph space and guide the network search by injecting their domain knowledge.
Applying this technique in an iterative manner allows analysts to converge to the best-performing neural network architecture for a given application.
arXiv Detail & Related papers (2020-09-28T01:48:45Z)
- A Comprehensive Survey of Neural Architecture Search: Challenges and Solutions [48.76705090826339]
Neural Architecture Search (NAS) is a revolutionary algorithm, and the related research work is complicated and rich.
We provide a new perspective, beginning with an overview of the characteristics of the earliest NAS algorithms and summarizing the problems in these early approaches.
In addition, we conduct a detailed and comprehensive analysis, comparison, and summary of these works.
arXiv Detail & Related papers (2020-06-01T13:08:03Z)
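As a companion to the genetic-algorithm entry above, the following is an illustrative sketch (not the paper's implementation) of pairing an evolutionary search with a lightly trained objective predictor: the predictor cheaply ranks candidates, and only a few architectures per generation are actually measured to refresh it. The encoding, the `Predictor` surrogate, and all helper names are assumptions, and the loop is simplified to a single objective.

```python
# Sketch of a predictor-assisted genetic search over a toy architecture encoding.
import random

OPS = ["conv3x3", "conv5x5", "skip"]
NUM_LAYERS = 6


def random_arch():
    return [random.choice(OPS) for _ in range(NUM_LAYERS)]


def mutate(arch):
    child = list(arch)
    child[random.randrange(NUM_LAYERS)] = random.choice(OPS)
    return child


def measure_accuracy(arch):
    # Placeholder for an expensive evaluation (training/validating the architecture).
    return random.random()


class Predictor:
    """Toy surrogate: predicts accuracy as the mean measured score of each chosen op."""

    def __init__(self):
        self.op_scores = {op: [0.5] for op in OPS}

    def fit(self, archs, scores):
        for arch, score in zip(archs, scores):
            for op in arch:
                self.op_scores[op].append(score)

    def predict(self, arch):
        return sum(sum(s) / len(s) for s in (self.op_scores[op] for op in arch)) / len(arch)


population = [random_arch() for _ in range(20)]
predictor = Predictor()
for generation in range(5):
    # Cheap ranking of all candidates with the surrogate predictor.
    population.sort(key=predictor.predict, reverse=True)
    # Expensive measurements for only the top few; reuse them to update the predictor.
    top = population[:4]
    scores = [measure_accuracy(a) for a in top]
    predictor.fit(top, scores)
    # Next generation: keep the best candidates and add mutated offspring.
    population = top + [mutate(random.choice(top)) for _ in range(16)]
```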