SpiderNet: Hybrid Differentiable-Evolutionary Architecture Search via
Train-Free Metrics
- URL: http://arxiv.org/abs/2204.09320v1
- Date: Wed, 20 Apr 2022 08:55:01 GMT
- Title: SpiderNet: Hybrid Differentiable-Evolutionary Architecture Search via
Train-Free Metrics
- Authors: Rob Geada, Andrew Stephen McGough
- Abstract summary: Neural Architecture Search (NAS) algorithms are intended to remove the burden of manual neural network design.
NAS algorithms require a variety of design parameters in the form of user configuration or hard-coded decisions, which limits the range of networks that can be discovered.
We present SpiderNet, a hybrid differentiable-evolutionary and hardware-aware algorithm that rapidly and efficiently produces state-of-the-art networks.
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Neural Architecture Search (NAS) algorithms are intended to remove the burden
of manual neural network design, and have been shown to be capable of designing
excellent models for a variety of well-known problems. However, these
algorithms require a variety of design parameters in the form of user
configuration or hard-coded decisions, which limits the range of networks that
can be discovered. This means that NAS algorithms do not eliminate model design
tuning; instead, they merely shift where that tuning needs to be
applied. In this paper, we present SpiderNet, a hybrid
differentiable-evolutionary and hardware-aware algorithm that rapidly and
efficiently produces state-of-the-art networks. More importantly, SpiderNet is
a proof-of-concept of a minimally-configured NAS algorithm; the majority of
design choices seen in other algorithms are incorporated into SpiderNet's
dynamically-evolving search space, minimizing the number of user choices to
just two: reduction cell count and initial channel count. SpiderNet produces
models highly-competitive with the state-of-the-art, and outperforms random
search in accuracy, runtime, memory size, and parameter count.
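The abstract's central claim is that SpiderNet reduces user configuration to two values: reduction cell count and initial channel count. As a rough illustration of how much structure two such numbers can pin down, the Python sketch below derives a cell-based macro-skeleton from them; the names, the fixed normal-cells-per-stage value, and the channel-doubling-at-reduction convention are assumptions borrowed from common cell-based NAS practice (e.g. DARTS/NASNet), not SpiderNet's actual interface.

from dataclasses import dataclass
from typing import Dict, List

# Hypothetical configuration object: the abstract only states WHICH two values
# the user supplies, not how SpiderNet consumes them.
@dataclass
class SearchConfig:
    reduction_cell_count: int    # number of resolution-halving cells in the skeleton
    initial_channel_count: int   # channel width of the first stage

def macro_skeleton(cfg: SearchConfig, normal_cells_per_stage: int = 2) -> List[Dict]:
    """Build an illustrative macro-structure: stages of normal cells separated by
    reduction cells, doubling channels at each reduction (a common cell-based NAS
    convention, assumed here purely for illustration)."""
    cells: List[Dict] = []
    channels = cfg.initial_channel_count
    for stage in range(cfg.reduction_cell_count + 1):
        for _ in range(normal_cells_per_stage):
            cells.append({"type": "normal", "channels": channels})
        if stage < cfg.reduction_cell_count:
            channels *= 2
            cells.append({"type": "reduction", "channels": channels})
    return cells

if __name__ == "__main__":
    # The two user choices the abstract mentions; every other structural decision
    # would be left to the evolving search space rather than the user.
    cfg = SearchConfig(reduction_cell_count=2, initial_channel_count=32)
    for i, cell in enumerate(macro_skeleton(cfg)):
        print(i, cell)

Everything beyond these two numbers (which operations appear, how cells are wired, when the network grows or prunes) is, per the abstract, folded into SpiderNet's dynamically evolving search space rather than exposed as configuration.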
Related papers
- DNA Family: Boosting Weight-Sharing NAS with Block-Wise Supervisions [121.05720140641189]
We develop a family of models with the distilling neural architecture (DNA) techniques.
Our proposed DNA models can rate all architecture candidates, as opposed to previous works that can only access a sub-search space using heuristic algorithms.
Our models achieve state-of-the-art top-1 accuracy of 78.9% and 83.6% on ImageNet for a mobile convolutional network and a small vision transformer, respectively.
arXiv Detail & Related papers (2024-03-02T22:16:47Z) - NAS-ASDet: An Adaptive Design Method for Surface Defect Detection
Network using Neural Architecture Search [5.640706784987607]
We propose a new method called NAS-ASDet to adaptively design network for surface defect detection.
First, a refined and industry-appropriate search space that can adaptively adjust the feature distribution is designed.
Then, a progressive search strategy with a deep supervision mechanism is used to explore the search space faster and better.
arXiv Detail & Related papers (2023-11-18T03:15:45Z) - UDC: Unified DNAS for Compressible TinyML Models [10.67922101024593]
This work bridges the gap between NPU HW capability and NN model design by proposing a neural architecture search (NAS) algorithm.
We demonstrate Unified DNAS for Compressible models (UDC) on CIFAR100, ImageNet, and DIV2K super resolution tasks.
On ImageNet, we find dominant compressible models, which are 1.9x smaller or 5.76% more accurate.
arXiv Detail & Related papers (2022-01-15T12:35:26Z) - IQNAS: Interpretable Integer Quadratic Programming Neural Architecture
Search [40.77061519007659]
A popular approach to find fitting networks is through constrained Neural Architecture Search (NAS).
Previous methods use complicated predictors for the accuracy of the network.
We introduce Interpretable Integer Quadratic Programming Neural Architecture Search (IQNAS).
arXiv Detail & Related papers (2021-10-24T09:45:00Z) - Neural network relief: a pruning algorithm based on neural activity [47.57448823030151]
We propose a simple importance-score metric that deactivates unimportant connections.
We achieve comparable performance for LeNet architectures on MNIST.
The algorithm is not designed to minimize FLOPs when considering current hardware and software implementations.
arXiv Detail & Related papers (2021-09-22T15:33:49Z) - D-DARTS: Distributed Differentiable Architecture Search [75.12821786565318]
Differentiable ARchiTecture Search (DARTS) is one of the most trending Neural Architecture Search (NAS) methods.
We propose D-DARTS, a novel solution that addresses this problem by nesting several neural networks at the cell level.
arXiv Detail & Related papers (2021-08-20T09:07:01Z) - Neural Architecture Search of SPD Manifold Networks [79.45110063435617]
We propose a new neural architecture search (NAS) problem of Symmetric Positive Definite (SPD) manifold networks.
We first introduce a geometrically rich and diverse SPD neural architecture search space for an efficient SPD cell design.
We exploit a differentiable NAS algorithm on our relaxed continuous search space for SPD neural architecture search.
arXiv Detail & Related papers (2020-10-27T18:08:57Z) - MS-RANAS: Multi-Scale Resource-Aware Neural Architecture Search [94.80212602202518]
We propose Multi-Scale Resource-Aware Neural Architecture Search (MS-RANAS).
We employ a one-shot architecture search approach to reduce the search cost.
We achieve state-of-the-art results in terms of accuracy-speed trade-off.
arXiv Detail & Related papers (2020-09-29T11:56:01Z) - Binarizing MobileNet via Evolution-based Searching [66.94247681870125]
We propose the use of evolutionary search to facilitate the construction and training scheme when binarizing MobileNet.
Inspired by one-shot architecture search frameworks, we manipulate the idea of group convolution to design efficient 1-Bit Convolutional Neural Networks (CNNs).
Our objective is to come up with a tiny yet efficient binary neural architecture by exploring the best candidates of the group convolution.
arXiv Detail & Related papers (2020-05-13T13:25:51Z)
This list is automatically generated from the titles and abstracts of the papers on this site.