Task-Aware Neural Architecture Search
- URL: http://arxiv.org/abs/2010.13962v3
- Date: Mon, 15 Mar 2021 22:02:53 GMT
- Title: Task-Aware Neural Architecture Search
- Authors: Cat P. Le, Mohammadreza Soltani, Robert Ravier, Vahid Tarokh
- Abstract summary: We propose a novel framework for neural architecture search, utilizing a dictionary of models of base tasks and the similarity between the target task and the atoms of the dictionary.
By introducing a gradient-based search algorithm, we can evaluate and discover the best architecture in the search space without fully training the networks.
- Score: 33.11791812491669
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: The design of handcrafted neural networks requires a lot of time and
resources. Recent techniques in Neural Architecture Search (NAS) have proven to
be competitive or better than traditional handcrafted design, although they
require domain knowledge and have generally used limited search spaces. In this
paper, we propose a novel framework for neural architecture search, utilizing a
dictionary of models of base tasks and the similarity between the target task
and the atoms of the dictionary; hence, generating an adaptive search space
based on the base models of the dictionary. By introducing a gradient-based
search algorithm, we can evaluate and discover the best architecture in the
search space without fully training the networks. The experimental results show
the efficacy of our proposed task-aware approach.
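The core idea above — ranking a dictionary of base-task models by their similarity to the target task and seeding the search space with the closest "atoms" — can be illustrated with a minimal sketch. This is not the paper's implementation; all names, the similarity scores, and the model specs are hypothetical placeholders.

```python
# Illustrative sketch (not the authors' code) of task-aware search-space
# construction: keep only the base-task models most similar to the target
# task, and use their architectures to seed an adaptive NAS search space.

def build_adaptive_search_space(dictionary, similarity, k=2):
    """Select the k base-task models most similar to the target task.

    dictionary: {task_name: architecture_spec}
    similarity: {task_name: float}, higher = more similar to the target task
    """
    ranked = sorted(dictionary, key=lambda t: similarity[t], reverse=True)
    atoms = ranked[:k]
    # The search space is seeded with the architectures of the selected atoms.
    return {task: dictionary[task] for task in atoms}

# Toy dictionary of base-task models (hypothetical architecture specs).
base_models = {
    "mnist": ["conv3x3", "conv3x3", "fc"],
    "cifar10": ["conv3x3", "conv5x5", "conv3x3", "fc"],
    "svhn": ["conv5x5", "conv3x3", "fc"],
}
# Hypothetical similarity of each base task to the target task.
sim_to_target = {"mnist": 0.2, "cifar10": 0.9, "svhn": 0.7}

space = build_adaptive_search_space(base_models, sim_to_target, k=2)
print(sorted(space))  # ['cifar10', 'svhn']
```

In the paper's framework, the selected atoms would then define the candidate operations and backbones explored by the gradient-based search, rather than searching an unrestricted space from scratch.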
Related papers
- EM-DARTS: Hierarchical Differentiable Architecture Search for Eye Movement Recognition [54.99121380536659]
Eye movement biometrics have received increasing attention thanks to their highly secure identification capabilities.
Deep learning (DL) models have recently been applied successfully to eye movement recognition.
However, the DL architecture is still determined by human prior knowledge.
We propose EM-DARTS, a hierarchical differentiable architecture search algorithm to automatically design the DL architecture for eye movement recognition.
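Differentiable architecture search of the kind EM-DARTS builds on relaxes the discrete choice of operation on each edge into a softmax-weighted mixture, so architecture parameters can be trained by gradient descent alongside the network weights. The toy sketch below (not EM-DARTS itself; the scalar "operations" and the alpha values are illustrative) shows that relaxation and the final discretization step.

```python
import math

# Illustrative DARTS-style relaxation: an edge computes a softmax-weighted
# mixture of candidate operations; after search, it is discretized to the
# operation with the largest architecture parameter.

def softmax(xs):
    m = max(xs)  # subtract the max for numerical stability
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

# Toy scalar ops standing in for conv / pool / zero on one edge.
ops = [lambda x: x, lambda x: 2 * x, lambda x: 0.0]
alpha = [0.1, 1.5, -0.5]  # learnable architecture parameters (hypothetical)

weights = softmax(alpha)

def mixed_op(x):
    # Continuous relaxation: weighted sum over all candidate operations.
    return sum(w * op(x) for w, op in zip(weights, ops))

# Discretization: keep only the highest-weight operation.
best = max(range(len(ops)), key=lambda i: alpha[i])
print(best)  # 1 (the "scale by 2" op dominates)
```

Because `mixed_op` is differentiable in `alpha`, the architecture parameters can be updated with ordinary backpropagation during the search phase.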
arXiv Detail & Related papers (2024-09-22T13:11:08Z) - An Approach for Efficient Neural Architecture Search Space Definition [0.0]
We propose a novel cell-based hierarchical search space, easy to comprehend and manipulate.
The objectives of the proposed approach are to optimize search time and to be general enough to handle most state-of-the-art CNN architectures.
arXiv Detail & Related papers (2023-10-25T08:07:29Z) - Neural Architecture Search: Insights from 1000 Papers [50.27255667347091]
We provide an organized and comprehensive guide to neural architecture search.
We give a taxonomy of search spaces, algorithms, and speedup techniques.
We discuss resources such as benchmarks, best practices, other surveys, and open-source libraries.
arXiv Detail & Related papers (2023-01-20T18:47:24Z) - DQNAS: Neural Architecture Search using Reinforcement Learning [6.33280703577189]
Convolutional Neural Networks have been used in a variety of image related applications.
In this paper, we propose an automated Neural Architecture Search framework, guided by the principles of Reinforcement Learning.
arXiv Detail & Related papers (2023-01-17T04:01:47Z) - Construction of Hierarchical Neural Architecture Search Spaces based on
Context-free Grammars [66.05096551112932]
We introduce a unifying search space design framework based on context-free grammars.
By enhancing and using their properties, we effectively enable search over the complete architecture.
We show that our search strategy can be superior to existing Neural Architecture Search approaches.
arXiv Detail & Related papers (2022-11-03T14:23:00Z) - Search Space Adaptation for Differentiable Neural Architecture Search in
Image Classification [15.641353388251465]
Differentiable neural architecture search (NAS) has a great impact by reducing the search cost to the level of training a single network.
In this paper, we propose an adaptation scheme of the search space by introducing a search scope.
The effectiveness of the proposed method is demonstrated with ProxylessNAS for the image classification task.
arXiv Detail & Related papers (2022-06-05T05:27:12Z) - NeuralArTS: Structuring Neural Architecture Search with Type Theory [0.0]
We present a new framework called Neural Architecture Type System (NeuralArTS) that categorizes the infinite set of network operations in a structured type system.
We show how NeuralArTS can be applied to convolutional layers and propose several future directions.
arXiv Detail & Related papers (2021-10-17T03:28:27Z) - Neural Architecture Search From Fréchet Task Distance [50.9995960884133]
We show how the distance between a target task and each task in a given set of baseline tasks can be used to reduce the neural architecture search space for the target task.
The complexity reduction in search space for task-specific architectures is achieved by building on the optimized architectures for similar tasks instead of doing a full search without using this side information.
arXiv Detail & Related papers (2021-03-23T20:43:31Z) - Contrastive Embeddings for Neural Architectures [1.90365714903665]
We show that traditional black-box optimization algorithms, without modification, can reach state-of-the-art performance in Neural Architecture Search.
We also show the evolution of embeddings during training, motivating future studies into using embeddings at different training stages to gain a deeper understanding of the networks in a search space.
arXiv Detail & Related papers (2021-02-08T14:06:35Z) - NAS-Navigator: Visual Steering for Explainable One-Shot Deep Neural
Network Synthesis [53.106414896248246]
We present a framework that allows analysts to effectively build the solution sub-graph space and guide the network search by injecting their domain knowledge.
Applying this technique in an iterative manner allows analysts to converge to the best performing neural network architecture for a given application.
arXiv Detail & Related papers (2020-09-28T01:48:45Z) - NAS-DIP: Learning Deep Image Prior with Neural Architecture Search [65.79109790446257]
Recent work has shown that the structure of deep convolutional neural networks can be used as a structured image prior.
We propose to search for neural architectures that capture stronger image priors.
We search for an improved network by leveraging an existing neural architecture search algorithm.
arXiv Detail & Related papers (2020-08-26T17:59:36Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences.