Search Space Adaptation for Differentiable Neural Architecture Search in Image Classification
- URL: http://arxiv.org/abs/2206.02098v1
- Date: Sun, 5 Jun 2022 05:27:12 GMT
- Title: Search Space Adaptation for Differentiable Neural Architecture Search in Image Classification
- Authors: Youngkee Kim, Soyi Jung, Minseok Choi and Joongheon Kim
- Abstract summary: Differentiable neural architecture search (NAS) has had a great impact by reducing the search cost to the level of training a single network.
In this paper, we propose an adaptation scheme of the search space by introducing a search scope.
The effectiveness of the proposed method is demonstrated with ProxylessNAS for the image classification task.
- Score: 15.641353388251465
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: As deep neural networks achieve unprecedented performance in various tasks,
neural architecture search (NAS), a research field that automates the design of
neural network architectures, is an active area of study. More recently,
differentiable NAS has had a great impact by reducing the search cost to the
level of training a single network. Moreover, the search space, which defines
the candidate architectures to be searched, directly affects the performance of
the final architecture. In this paper, we propose a scheme for adapting the
search space by introducing a search scope. The effectiveness of the proposed
method is demonstrated with ProxylessNAS on the image classification task.
Furthermore, we visualize the trajectory of architecture parameter updates and
provide insights for improving the architecture search.
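To make the setting concrete, here is a minimal sketch of the DARTS-style continuous relaxation that differentiable NAS builds on, extended with a hypothetical boolean scope mask standing in for a search scope that restricts which candidate operations participate. The candidate set, the `scope_mask` mechanism, and all names are illustrative assumptions, not the paper's exact formulation or the ProxylessNAS implementation.

```python
# Minimal sketch of differentiable NAS (DARTS-style continuous
# relaxation). The scope_mask is a hypothetical illustration of a
# "search scope" that narrows the candidate set; it is NOT the
# paper's exact mechanism.
import torch
import torch.nn as nn
import torch.nn.functional as F

class MixedOp(nn.Module):
    """Softmax-weighted mixture of candidate operations for one edge."""
    def __init__(self, channels: int):
        super().__init__()
        self.ops = nn.ModuleList([
            nn.Conv2d(channels, channels, 3, padding=1),  # 3x3 conv
            nn.Conv2d(channels, channels, 5, padding=2),  # 5x5 conv
            nn.Identity(),                                # skip connection
        ])
        # One architecture parameter (alpha) per candidate operation,
        # trained by gradient descent alongside the network weights.
        self.alpha = nn.Parameter(torch.zeros(len(self.ops)))

    def forward(self, x, scope_mask=None):
        logits = self.alpha
        if scope_mask is not None:
            # Exclude operations outside the current search scope by
            # sending their logits to -inf before the softmax.
            logits = logits.masked_fill(~scope_mask, float("-inf"))
        weights = F.softmax(logits, dim=-1)
        return sum(w * op(x) for w, op in zip(weights, self.ops))

# Usage: restrict the scope to the two convolutions, then run a forward pass.
op = MixedOp(16)
x = torch.randn(2, 16, 8, 8)
scope = torch.tensor([True, True, False])
y = op(x, scope)
```

Because the alphas are ordinary parameters updated by gradient descent, their values can be logged at every step, which is the kind of architecture-parameter trajectory the abstract proposes to visualize.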
Related papers
- EM-DARTS: Hierarchical Differentiable Architecture Search for Eye Movement Recognition [54.99121380536659]
Eye movement biometrics have received increasing attention thanks to their highly secure identification.
Deep learning (DL) models have been recently successfully applied for eye movement recognition.
However, the DL architecture is still determined by human prior knowledge.
We propose EM-DARTS, a hierarchical differentiable architecture search algorithm to automatically design the DL architecture for eye movement recognition.
arXiv Detail & Related papers (2024-09-22T13:11:08Z)
- Pruning-as-Search: Efficient Neural Architecture Search via Channel Pruning and Structural Reparameterization [50.50023451369742]
Pruning-as-Search (PaS) is an end-to-end channel pruning method to search out desired sub-network automatically and efficiently.
Our proposed architecture outperforms prior art by around 1.0% top-1 accuracy on the ImageNet-1000 classification task.
arXiv Detail & Related papers (2022-06-02T17:58:54Z)
- SuperNet in Neural Architecture Search: A Taxonomic Survey [14.037182039950505]
This survey focuses on supernet optimization, which builds a single neural network that assembles all candidate architectures as its sub-models through weight sharing (a minimal sketch of this idea follows the list below).
We frame existing methods as solutions to the common challenges found in the literature: data-side optimization, alleviation of poor rank correlation, and transferable NAS for a range of deployment scenarios.
arXiv Detail & Related papers (2022-04-08T08:29:52Z)
- Neural Architecture Search for Speech Emotion Recognition [72.1966266171951]
We propose to apply neural architecture search (NAS) techniques to automatically configure speech emotion recognition (SER) models.
We show that NAS can improve SER performance (54.89% to 56.28%) while maintaining model parameter sizes.
arXiv Detail & Related papers (2022-03-31T10:16:10Z)
- Full-attention based Neural Architecture Search using Context Auto-regression [18.106878746065536]
We propose a full-attention based NAS method to search attention networks.
A stage-wise search space is constructed that allows various attention operations to be adopted for different layers of a network.
A self-supervised search algorithm is proposed that uses context auto-regression to discover the full-attention architecture.
arXiv Detail & Related papers (2021-11-13T16:07:37Z)
- Making Differentiable Architecture Search less local [9.869449181400466]
Differentiable neural architecture search (DARTS) is a promising NAS approach that dramatically increases search efficiency.
It has been shown to suffer from performance collapse, where the search often leads to detrimental architectures.
We develop a more global optimisation scheme that is able to better explore the space without changing the DARTS problem formulation.
arXiv Detail & Related papers (2021-04-21T10:36:43Z)
- Enhanced Gradient for Differentiable Architecture Search [17.431144144044968]
We propose a neural network architecture search algorithm aiming to simultaneously improve network performance and reduce network complexity.
The proposed framework automatically builds the network architecture at two stages: block-level search and network-level search.
Experiment results demonstrate that our method outperforms all evaluated hand-crafted networks in image classification.
arXiv Detail & Related papers (2021-03-23T13:27:24Z)
- Task-Aware Neural Architecture Search [33.11791812491669]
We propose a novel framework for neural architecture search, utilizing a dictionary of models of base tasks and the similarity between the target task and the atoms of the dictionary.
By introducing a gradient-based search algorithm, we can evaluate and discover the best architecture in the search space without fully training the networks.
arXiv Detail & Related papers (2020-10-27T00:10:40Z)
- NAS-Navigator: Visual Steering for Explainable One-Shot Deep Neural Network Synthesis [53.106414896248246]
We present a framework that allows analysts to effectively build the solution sub-graph space and guide the network search by injecting their domain knowledge.
Applying this technique in an iterative manner allows analysts to converge to the best performing neural network architecture for a given application.
arXiv Detail & Related papers (2020-09-28T01:48:45Z)
- NAS-DIP: Learning Deep Image Prior with Neural Architecture Search [65.79109790446257]
Recent work has shown that the structure of deep convolutional neural networks can be used as a structured image prior.
We propose to search for neural architectures that capture stronger image priors.
We search for an improved network by leveraging an existing neural architecture search algorithm.
arXiv Detail & Related papers (2020-08-26T17:59:36Z)
- Breaking the Curse of Space Explosion: Towards Efficient NAS with Curriculum Search [94.46818035655943]
We propose a curriculum search method that starts from a small search space and gradually incorporates the learned knowledge to guide the search in a large space.
With the proposed search strategy, our Curriculum Neural Architecture Search (CNAS) method significantly improves the search efficiency and finds better architectures than existing NAS methods.
arXiv Detail & Related papers (2020-07-07T02:29:06Z)
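The supernet survey above centers on weight sharing: every candidate architecture is a path through one over-parameterized network, so sub-models reuse the same trained weights instead of being trained from scratch. Below is a minimal, hypothetical sketch of that idea; the layer structure, candidate set, and names are illustrative assumptions, not any surveyed paper's method.

```python
# Minimal sketch of a weight-sharing supernet: sampled sub-models
# select one candidate op per layer and reuse its shared weights.
import random
import torch
import torch.nn as nn

class SupernetLayer(nn.Module):
    """One layer holding all candidate ops; a sub-model picks one."""
    def __init__(self, channels: int):
        super().__init__()
        self.candidates = nn.ModuleList([
            nn.Conv2d(channels, channels, 3, padding=1),
            nn.Conv2d(channels, channels, 5, padding=2),
        ])

    def forward(self, x, choice: int):
        # Every sub-model that selects this op reuses its weights,
        # which is what makes one-shot NAS training cheap.
        return self.candidates[choice](x)

# Sample a random sub-model (one op index per layer) and run it.
supernet = nn.ModuleList([SupernetLayer(8) for _ in range(3)])
arch = [random.randrange(2) for _ in supernet]
x = torch.randn(1, 8, 16, 16)
for layer, choice in zip(supernet, arch):
    x = layer(x, choice)
```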
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information listed here and is not responsible for any consequences of its use.