MS-RANAS: Multi-Scale Resource-Aware Neural Architecture Search
- URL: http://arxiv.org/abs/2009.13940v1
- Date: Tue, 29 Sep 2020 11:56:01 GMT
- Title: MS-RANAS: Multi-Scale Resource-Aware Neural Architecture Search
- Authors: Cristian Cioflan, Radu Timofte
- Abstract summary: We propose Multi-Scale Resource-Aware Neural Architecture Search (MS-RANAS).
We employ a one-shot architecture search approach to reduce the search cost.
We achieve state-of-the-art results in terms of the accuracy-speed trade-off.
- Score: 94.80212602202518
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Neural Architecture Search (NAS) has proved effective in producing
alternatives that outperform handcrafted neural networks. In this paper we
analyse the benefits of NAS for image classification tasks under strict
computational constraints. Our aim is to automate the design of highly
efficient deep neural networks, capable of delivering fast and accurate
predictions and of being deployed on a low-memory, low-power system-on-chip.
The task thus becomes a three-way trade-off between accuracy, computational
complexity, and memory requirements. To address this concern, we propose
Multi-Scale Resource-Aware Neural Architecture Search (MS-RANAS). We employ a
one-shot architecture search approach to reduce the search cost and focus on
an anytime prediction setting. Through the use of multi-scale features and
early classifiers, we achieve state-of-the-art results in terms of the
accuracy-speed trade-off.
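To make the anytime-prediction setting concrete, below is a minimal sketch of a network with early classifiers attached to intermediate feature scales, so a usable prediction is available after each stage. It assumes PyTorch; the layer widths, depths, and names are invented for illustration and are not the architecture searched by MS-RANAS.

```python
import torch
import torch.nn as nn

class AnytimeNet(nn.Module):
    """Toy early-exit network: one classifier per feature scale.

    Illustrative sketch only; widths and depths are assumptions,
    not the cells found by MS-RANAS.
    """
    def __init__(self, num_classes=10):
        super().__init__()
        # Three stages, each halving spatial resolution (multi-scale features).
        self.stages = nn.ModuleList([
            nn.Sequential(nn.Conv2d(3, 32, 3, stride=2, padding=1), nn.ReLU()),
            nn.Sequential(nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU()),
            nn.Sequential(nn.Conv2d(64, 128, 3, stride=2, padding=1), nn.ReLU()),
        ])
        # One early classifier per stage: pooled features -> class logits.
        self.exits = nn.ModuleList([
            nn.Sequential(nn.AdaptiveAvgPool2d(1), nn.Flatten(),
                          nn.Linear(c, num_classes))
            for c in (32, 64, 128)
        ])

    def forward(self, x, budget=None):
        """Run stages until the compute budget is exhausted; return all exits seen."""
        outputs = []
        for i, (stage, head) in enumerate(zip(self.stages, self.exits)):
            x = stage(x)
            outputs.append(head(x))
            if budget is not None and i + 1 >= budget:
                break  # anytime prediction: stop early, use the latest exit
        return outputs

# With a budget of 2 stages, only the first two classifiers are evaluated.
logits = AnytimeNet()(torch.randn(1, 3, 32, 32), budget=2)[-1]
```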
Related papers
- A Pairwise Comparison Relation-assisted Multi-objective Evolutionary Neural Architecture Search Method with Multi-population Mechanism [58.855741970337675]
Neural architecture search (NAS) enables researchers to automatically explore vast search spaces and find efficient neural networks.
NAS suffers from a key bottleneck: numerous architectures need to be evaluated during the search process.
We propose SMEM-NAS, a pairwise-comparison relation-assisted multi-objective evolutionary algorithm based on a multi-population mechanism.
arXiv Detail & Related papers (2024-07-22T12:46:22Z)
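A rough sketch of the pairwise-comparison idea above, in plain Python: a surrogate comparator predicts which of two candidate architectures is better, so the population can be ranked without fully training every candidate. The comparator heuristic and the architecture encoding here are invented for illustration, not SMEM-NAS itself.

```python
import random
from itertools import combinations

def compare(arch_a, arch_b):
    """Surrogate comparator: predicts whether arch_a beats arch_b.

    Hypothetical stand-in: a real system would train this predictor on a
    small set of fully evaluated architecture pairs.
    """
    # Toy heuristic: prefer the architecture with more channels per FLOP.
    return (arch_a["channels"] / arch_a["flops"]) > (arch_b["channels"] / arch_b["flops"])

def rank_by_pairwise_wins(population):
    """Rank candidates by predicted pairwise wins, with no training runs."""
    wins = {id(a): 0 for a in population}
    for a, b in combinations(population, 2):
        winner = a if compare(a, b) else b
        wins[id(winner)] += 1
    return sorted(population, key=lambda a: wins[id(a)], reverse=True)

# Only the top-ranked candidates would then be evaluated by full training.
population = [{"channels": random.choice([16, 32, 64]),
               "flops": random.uniform(1e6, 1e8)} for _ in range(20)]
shortlist = rank_by_pairwise_wins(population)[:3]
```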
- DNA Family: Boosting Weight-Sharing NAS with Block-Wise Supervisions [121.05720140641189]
We develop a family of models with the distilling neural architecture (DNA) techniques.
Our proposed DNA models can rate all architecture candidates, as opposed to previous works that can only access a sub-search space using algorithms.
Our models achieve state-of-the-art top-1 accuracy of 78.9% and 83.6% on ImageNet for a mobile convolutional network and a small vision transformer, respectively.
arXiv Detail & Related papers (2024-03-02T22:16:47Z)
- DONNAv2 -- Lightweight Neural Architecture Search for Vision tasks [6.628409795264665]
We present the next-generation neural architecture design for computationally efficient neural architecture distillation - DONNAv2.
DONNAv2 reduces the computational cost of DONNA by 10x for the larger datasets.
To improve the quality of NAS search space, DONNAv2 leverages a block knowledge distillation filter to remove blocks with high inference costs.
arXiv Detail & Related papers (2023-09-26T04:48:50Z)
- A Survey on Multi-Objective Neural Architecture Search [9.176056742068813]
Multi-Objective Neural Architecture Search (MONAS) has been attracting attention.
We present an overview of principal and state-of-the-art works in the field of MONAS.
arXiv Detail & Related papers (2023-07-18T09:42:51Z)
- IQNAS: Interpretable Integer Quadratic Programming Neural Architecture Search [40.77061519007659]
A popular approach to finding fitting networks is constrained Neural Architecture Search (NAS).
Previous methods use complicated predictors for the accuracy of the network.
We introduce Interpretable Integer Quadratic Programming Neural Architecture Search (IQNAS).
arXiv Detail & Related papers (2021-10-24T09:45:00Z)
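To illustrate the idea of an interpretable quadratic accuracy model searched under a resource constraint, here is a toy sketch; the coefficients, the cost model, and the exhaustive enumeration standing in for a quadratic-programming solver are all assumptions made for the example, not taken from IQNAS.

```python
import itertools
import numpy as np

# Decision variables: an expansion choice per block, encoded as x in {0, 1, 2}^3.
CHOICES = [0, 1, 2]
q = np.array([[0.00, 0.01, 0.02],      # assumed pairwise interaction terms
              [0.01, 0.00, 0.01],
              [0.02, 0.01, 0.00]])
lin = np.array([0.5, 0.8, 0.6])        # assumed per-block accuracy gains
cost = np.array([1.0, 2.0, 3.0])       # assumed per-unit latency cost
BUDGET = 8.0

def predicted_accuracy(x):
    """Quadratic surrogate: linear terms plus pairwise interactions."""
    x = np.asarray(x, dtype=float)
    return lin @ x + x @ q @ x

# The search space is tiny, so enumeration stands in for the QP solver here.
feasible = (x for x in itertools.product(CHOICES, repeat=3)
            if cost @ np.array(x) <= BUDGET)
best = max(feasible, key=predicted_accuracy)
print("best configuration under the latency budget:", best)
```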
- Evolutionary Neural Architecture Search Supporting Approximate Multipliers [0.5414308305392761]
We propose a multi-objective NAS method based on Cartesian genetic programming for evolving convolutional neural networks (CNNs).
The most suitable approximate multipliers are automatically selected from a library of approximate multipliers.
Evolved CNNs are compared with common human-created CNNs of a similar complexity on the CIFAR-10 benchmark problem.
arXiv Detail & Related papers (2021-01-28T09:26:03Z)
- NAS-Navigator: Visual Steering for Explainable One-Shot Deep Neural Network Synthesis [53.106414896248246]
We present a framework that allows analysts to effectively build the solution sub-graph space and guide the network search by injecting their domain knowledge.
Applying this technique in an iterative manner allows analysts to converge to the best performing neural network architecture for a given application.
arXiv Detail & Related papers (2020-09-28T01:48:45Z)
- FBNetV3: Joint Architecture-Recipe Search using Predictor Pretraining [65.39532971991778]
We present an accuracy predictor that scores architecture and training recipes jointly, guiding both sample selection and ranking.
We run fast evolutionary searches in just CPU minutes to generate architecture-recipe pairs for a variety of resource constraints.
FBNetV3 comprises a family of state-of-the-art compact neural networks that outperform both automatically designed and manually designed competitors.
arXiv Detail & Related papers (2020-06-03T05:20:21Z)
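As a hedged sketch of scoring architecture and training recipe jointly with a predictor: the feature encoding, the linear scoring model, and its weights below are assumptions for illustration, not the FBNetV3 predictor.

```python
import random

def encode(arch, recipe):
    """Concatenate architecture and training-recipe features into one vector."""
    return [arch["depth"], arch["width"], recipe["lr"], recipe["epochs"]]

def predict_score(features, weights):
    """Linear stand-in for the joint accuracy predictor."""
    return sum(w * f for w, f in zip(weights, features))

# Assumed predictor weights; in practice these would be fit on measured
# (architecture, recipe, accuracy) triples gathered during pretraining.
weights = [0.02, 0.01, -0.5, 0.003]

# Sample architecture-recipe pairs and keep the highest-scoring ones,
# mimicking predictor-guided sample selection and ranking.
candidates = [({"depth": random.randint(8, 20), "width": random.choice([32, 64, 96])},
               {"lr": random.uniform(0.05, 0.4), "epochs": random.choice([90, 180])})
              for _ in range(50)]
ranked = sorted(candidates, key=lambda p: predict_score(encode(*p), weights), reverse=True)
top_pairs = ranked[:5]
```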
- NAS-Count: Counting-by-Density with Neural Architecture Search [74.92941571724525]
We automate the design of counting models with Neural Architecture Search (NAS).
We introduce an end-to-end searched encoder-decoder architecture, Automatic Multi-Scale Network (AMSNet).
arXiv Detail & Related papers (2020-02-29T09:18:17Z)