NeuralArTS: Structuring Neural Architecture Search with Type Theory
- URL: http://arxiv.org/abs/2110.08710v2
- Date: Tue, 19 Oct 2021 10:24:36 GMT
- Title: NeuralArTS: Structuring Neural Architecture Search with Type Theory
- Authors: Robert Wu, Nayan Saxena, Rohan Jain
- Abstract summary: We present a new framework called Neural Architecture Type System (NeuralArTS) that categorizes the infinite set of network operations in a structured type system.
We show how NeuralArTS can be applied to convolutional layers and propose several future directions.
- Score: 0.0
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Neural Architecture Search (NAS) algorithms automate the task of finding
optimal deep learning architectures given an initial search space of possible
operations. Developing these search spaces is usually a manual affair, and
searching over a pre-optimized search space is more efficient than searching
from scratch. In this paper, we present a new framework called Neural Architecture
Type System (NeuralArTS) that categorizes the infinite set of network
operations in a structured type system. We further demonstrate how NeuralArTS
can be applied to convolutional layers and propose several future directions.
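The abstract does not include an implementation, but the core idea of typing network operations can be illustrated with a small sketch. Below is a minimal Python example that types convolutional layers by the shape transformation they induce, so ill-typed compositions can be rejected before any training; `ConvType` and `composable` are our illustrative names, not the authors' API.
```python
from dataclasses import dataclass

@dataclass(frozen=True)
class ConvType:
    """Illustrative type of a conv layer: channel signature + spatial behavior."""
    in_channels: int
    out_channels: int
    kernel: int
    stride: int
    padding: int

    def output_size(self, n: int) -> int:
        # Standard convolution arithmetic: floor((n + 2p - k) / s) + 1.
        return (n + 2 * self.padding - self.kernel) // self.stride + 1

def composable(f: ConvType, g: ConvType, n: int) -> bool:
    """g after f is well-typed if channel types line up and f's spatial
    output is still large enough for g's kernel."""
    m = f.output_size(n)
    return f.out_channels == g.in_channels and m + 2 * g.padding >= g.kernel

# Example: a 3x3 conv (3 -> 16 channels) composed with a 5x5 conv (16 -> 32).
f = ConvType(3, 16, kernel=3, stride=1, padding=1)
g = ConvType(16, 32, kernel=5, stride=1, padding=2)
assert composable(f, g, n=32)
```
Under a view like this, operations with the same type are interchangeable in a search space, which is the kind of structure a NAS algorithm can exploit.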
Related papers
- EM-DARTS: Hierarchical Differentiable Architecture Search for Eye Movement Recognition [54.99121380536659]
Eye movement biometrics have received increasing attention thanks to their highly secure identification.
Deep learning (DL) models have recently been applied successfully to eye movement recognition.
However, the DL architecture is still determined by human prior knowledge.
We propose EM-DARTS, a hierarchical differentiable architecture search algorithm that automatically designs the DL architecture for eye movement recognition.
arXiv Detail & Related papers (2024-09-22T13:11:08Z)
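EM-DARTS belongs to the DARTS family of differentiable NAS methods. As a point of reference for the entry above, here is a minimal numpy sketch of the standard DARTS-style continuous relaxation (generic DARTS, not EM-DARTS itself): each edge computes a softmax-weighted mixture of candidate operations, making the architecture choice differentiable. The operations and parameters are illustrative stand-ins.
```python
import numpy as np

def softmax(a):
    e = np.exp(a - a.max())
    return e / e.sum()

# Illustrative candidate operations on one edge; real cells use convs/pooling.
ops = [
    lambda x: x,                  # identity / skip connection
    lambda x: np.maximum(x, 0),   # nonlinearity as a stand-in for a learned op
    lambda x: np.zeros_like(x),   # "zero" op, i.e. remove the connection
]

alpha = np.zeros(len(ops))        # architecture parameters, trained by gradients

def mixed_op(x):
    # Softmax-weighted sum over candidates keeps the choice differentiable.
    w = softmax(alpha)
    return sum(wi * op(x) for wi, op in zip(w, ops))

x = np.array([1.0, -2.0, 3.0])
y = mixed_op(x)                   # after search, discretize to argmax(alpha)
```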
- Neural Architecture Search: Insights from 1000 Papers [50.27255667347091]
We provide an organized and comprehensive guide to neural architecture search.
We give a taxonomy of search spaces, algorithms, and speedup techniques.
We discuss resources such as benchmarks, best practices, other surveys, and open-source libraries.
arXiv Detail & Related papers (2023-01-20T18:47:24Z)
- HiveNAS: Neural Architecture Search using Artificial Bee Colony Optimization [0.0]
In this study, we evaluate the viability of Artificial Bee Colony optimization for Neural Architecture Search.
Our proposed framework, HiveNAS, outperforms existing state-of-the-art Swarm Intelligence-based NAS frameworks in a fraction of the time.
arXiv Detail & Related papers (2022-11-18T14:11:47Z)
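HiveNAS's exact architecture encoding is not given above, but the generic Artificial Bee Colony loop it builds on is standard. Below is a minimal sketch with toy stand-ins for the encoding and fitness (the onlooker phase is omitted for brevity); none of the names come from HiveNAS itself.
```python
import random

def random_arch():                 # toy encoding: an architecture as op ids
    return [random.randrange(4) for _ in range(6)]

def mutate(arch):                  # neighbor search around a food source
    a = list(arch)
    a[random.randrange(len(a))] = random.randrange(4)
    return a

def fitness(arch):                 # toy stand-in for validation accuracy
    return -sum((x - 2) ** 2 for x in arch)

sources = [random_arch() for _ in range(5)]   # food sources = candidates
trials = [0] * len(sources)
for _ in range(100):
    for i, src in enumerate(sources):         # employed-bee phase
        cand = mutate(src)
        if fitness(cand) > fitness(src):
            sources[i], trials[i] = cand, 0
        else:
            trials[i] += 1
        if trials[i] > 10:                    # scout phase: abandon the source
            sources[i], trials[i] = random_arch(), 0

best = max(sources, key=fitness)              # best architecture found
```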
- Construction of Hierarchical Neural Architecture Search Spaces based on Context-free Grammars [66.05096551112932]
We introduce a unifying search space design framework based on context-free grammars.
By enhancing and using their properties, we effectively enable search over the complete architecture.
We show that our search strategy can be superior to existing Neural Architecture Search approaches.
arXiv Detail & Related papers (2022-11-03T14:23:00Z)
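To make the grammar-based construction above concrete, here is a minimal sketch of sampling architectures from a hand-written context-free grammar (an illustrative toy, not the paper's framework): nonterminals expand into sub-architectures, so every derivation is a syntactically valid, hierarchically structured network.
```python
import random

# Toy grammar: a network is a sequence of blocks; a block is a leaf op or a
# residually wrapped sub-network, which provides the hierarchy.
GRAMMAR = {
    "NET":   [["BLOCK"], ["BLOCK", "NET"]],
    "BLOCK": [["conv3x3"], ["conv5x5"], ["maxpool"], ["RES"]],
    "RES":   [["residual(", "NET", ")"]],
}

def sample(symbol="NET", depth=0, max_depth=6):
    if symbol not in GRAMMAR:                 # terminal symbol: an operation
        return [symbol]
    # Force the first (non-recursive) rule once the depth budget is spent.
    rules = GRAMMAR[symbol]
    rule = rules[0] if depth >= max_depth else random.choice(rules)
    return [tok for s in rule for tok in sample(s, depth + 1, max_depth)]

print(" ".join(sample()))   # e.g. "conv3x3 residual( conv5x5 ) maxpool"
```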
- Search Space Adaptation for Differentiable Neural Architecture Search in Image Classification [15.641353388251465]
Differentiable neural architecture search (NAS) has had a great impact by reducing the search cost to the level of training a single network.
In this paper, we propose an adaptation scheme for the search space by introducing a search scope.
The effectiveness of the proposed method is demonstrated with ProxylessNAS for the image classification task.
arXiv Detail & Related papers (2022-06-05T05:27:12Z)
- A Novel Evolutionary Algorithm for Hierarchical Neural Architecture Search [0.0]
We propose a novel evolutionary algorithm for neural architecture search, applicable to global search spaces.
The algorithm's architectural representation organizes the topology into multiple hierarchical modules, and the design process exploits this representation to explore the search space.
We apply our method to Fashion-MNIST and NAS-Bench-101, achieving accuracies of 93.2% and 94.8% respectively in a relatively small number of generations.
arXiv Detail & Related papers (2021-07-18T16:19:53Z)
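The hierarchical-module idea in the entry above can be illustrated with a generic evolutionary NAS loop. The following sketch is not the paper's algorithm: the encoding (a list of modules, each a list of ops) and the toy fitness are assumptions for illustration.
```python
import random

OPS = ["conv3x3", "conv5x5", "maxpool", "skip"]

def random_module():              # a module is a short sequence of ops
    return [random.choice(OPS) for _ in range(3)]

def random_arch(n_modules=3):     # hierarchy: architecture -> modules -> ops
    return [random_module() for _ in range(n_modules)]

def mutate(arch):                 # point mutation inside one module
    a = [list(m) for m in arch]
    m = random.randrange(len(a))
    a[m][random.randrange(len(a[m]))] = random.choice(OPS)
    return a

def fitness(arch):                # toy proxy for validation accuracy
    return sum(op != "skip" for module in arch for op in module)

population = [random_arch() for _ in range(10)]
for generation in range(20):
    population.sort(key=fitness, reverse=True)
    parents = population[:5]      # truncation selection
    children = [mutate(random.choice(parents)) for _ in range(5)]
    population = parents + children

best = max(population, key=fitness)
```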
- On the Exploitation of Neuroevolutionary Information: Analyzing the Past for a More Efficient Future [60.99717891994599]
We propose an approach that extracts information from neuroevolutionary runs and uses it to build a metamodel.
We inspect the best structures found during neuroevolutionary searches of generative adversarial networks with varying characteristics.
arXiv Detail & Related papers (2021-05-26T20:55:29Z)
- Firefly Neural Architecture Descent: a General Approach for Growing Neural Networks [50.684661759340145]
Firefly neural architecture descent is a general framework for progressively and dynamically growing neural networks.
We show that firefly descent can flexibly grow networks both wider and deeper, and can be applied to learn accurate but resource-efficient neural architectures.
In particular, it learns networks that are smaller in size but have higher average accuracy than those learned by the state-of-the-art methods.
arXiv Detail & Related papers (2021-02-17T04:47:18Z)
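One property that progressive growing methods such as firefly descent rely on is that a network can be widened without (significantly) changing the function it computes, by giving new units near-zero outgoing weights. The numpy sketch below shows that generic step only; it is not the firefly algorithm itself, which additionally selects growth directions using gradient information.
```python
import numpy as np

rng = np.random.default_rng(0)
W1 = rng.normal(size=(4, 8))             # input -> hidden weights
W2 = rng.normal(size=(8, 2))             # hidden -> output weights

def forward(x, W1, W2):
    return np.maximum(x @ W1, 0.0) @ W2  # one-hidden-layer ReLU network

def grow_wider(W1, W2, k=2, eps=1e-4):
    """Append k hidden units: random incoming, near-zero outgoing weights,
    so the grown network computes (almost) the same function."""
    new_in = rng.normal(size=(W1.shape[0], k))
    new_out = eps * rng.normal(size=(k, W2.shape[1]))
    return np.hstack([W1, new_in]), np.vstack([W2, new_out])

x = rng.normal(size=(5, 4))
y_before = forward(x, W1, W2)
W1g, W2g = grow_wider(W1, W2)
y_after = forward(x, W1g, W2g)
assert np.allclose(y_before, y_after, atol=1e-2)  # function nearly preserved
```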
- Contrastive Embeddings for Neural Architectures [1.90365714903665]
We show that traditional black-box optimization algorithms, without modification, can reach state-of-the-art performance in Neural Architecture Search.
We also show the evolution of embeddings during training, motivating future studies into using embeddings at different training stages to gain a deeper understanding of the networks in a search space.
arXiv Detail & Related papers (2021-02-08T14:06:35Z)
- Task-Aware Neural Architecture Search [33.11791812491669]
We propose a novel framework for neural architecture search, utilizing a dictionary of models of base tasks and the similarity between the target task and the atoms of the dictionary.
By introducing a gradient-based search algorithm, we can evaluate and discover the best architecture in the search space without fully training the networks.
arXiv Detail & Related papers (2020-10-27T00:10:40Z)
- NAS-Navigator: Visual Steering for Explainable One-Shot Deep Neural Network Synthesis [53.106414896248246]
We present a framework that allows analysts to effectively build the solution sub-graph space and guide the network search by injecting their domain knowledge.
Applying this technique in an iterative manner allows analysts to converge to the best performing neural network architecture for a given application.
arXiv Detail & Related papers (2020-09-28T01:48:45Z)