A Novel Framework for Neural Architecture Search in the Hill Climbing Domain
- URL: http://arxiv.org/abs/2102.12985v1
- Date: Mon, 22 Feb 2021 04:34:29 GMT
- Title: A Novel Framework for Neural Architecture Search in the Hill Climbing Domain
- Authors: Mudit Verma, Pradyumna Sinha, Karan Goyal, Apoorva Verma and Seba Susan
- Abstract summary: We propose a new framework for neural architecture search based on a hill-climbing procedure.
We achieve a 4.96% error rate on the CIFAR-10 dataset in 19.4 hours of training on a single GPU.
- Score: 2.729898906885749
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Neural networks have long been used to solve complex problems in the
image domain, yet designing them still requires manual expertise. Moreover,
techniques for automatically generating a suitable deep learning architecture
for a given dataset have frequently relied on reinforcement learning and
evolutionary methods, which demand extensive computational resources and time.
We propose a new framework for neural architecture search based on a
hill-climbing procedure using morphism operators that makes use of a novel
gradient update scheme. The update is based on the aging of neural network
layers and reduces the overall training time. This technique can search a
broader search space, which subsequently yields competitive results. We achieve
a 4.96% error rate on the CIFAR-10 dataset in 19.4 hours of training on a
single GPU.
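The abstract describes a hill-climbing search over network morphisms with an aging-based gradient update. Below is a minimal sketch of that loop on a toy regression task; the `deepen` morphism, the linear aging schedule, and every hyperparameter are our illustrative assumptions, not the authors' implementation.

```python
import copy

import torch
import torch.nn as nn

WIDTH = 16  # hidden width of the toy network


def deepen(parent: nn.Sequential) -> nn.Sequential:
    """Morphism operator: insert an identity-initialized layer before the
    output head, so the child initially computes the parent's function."""
    layers = list(copy.deepcopy(parent).children())
    new = nn.Linear(WIDTH, WIDTH)
    nn.init.eye_(new.weight)
    nn.init.zeros_(new.bias)
    layers.insert(len(layers) - 1, new)
    return nn.Sequential(*layers)


def aged_optimizer(model: nn.Sequential, base_lr: float = 0.05) -> torch.optim.SGD:
    """Aging-based update (our reading of the abstract): earlier layers are
    'older' and get smaller learning rates, so gradient effort concentrates
    on freshly inserted layers."""
    groups = [
        {"params": list(layer.parameters()), "lr": base_lr * (i + 1) / len(model)}
        for i, layer in enumerate(model)
        if list(layer.parameters())
    ]
    return torch.optim.SGD(groups, lr=base_lr)


def short_train(model: nn.Sequential, steps: int = 200) -> float:
    """Briefly train on a toy regression task and return the final loss."""
    opt = aged_optimizer(model)
    for _ in range(steps):
        x = torch.randn(32, WIDTH)
        y = x.sum(dim=1, keepdim=True)  # toy target
        opt.zero_grad()
        loss = nn.functional.mse_loss(model(x), y)
        loss.backward()
        opt.step()
    return loss.item()


# Hill climbing: keep a child only if it beats the current best network.
best = nn.Sequential(nn.Linear(WIDTH, WIDTH), nn.ReLU(), nn.Linear(WIDTH, 1))
best_loss = short_train(best)
for _ in range(3):      # search rounds
    for _ in range(2):  # children per round
        child = deepen(best)
        child_loss = short_train(child)
        if child_loss < best_loss:
            best, best_loss = child, child_loss
print(best, best_loss)
```

Because each morphism is function-preserving, children start from the parent's performance rather than from scratch, which is what lets short training rounds suffice.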
Related papers
- DQNAS: Neural Architecture Search using Reinforcement Learning [6.33280703577189]
Convolutional Neural Networks have been used in a variety of image-related applications.
In this paper, we propose an automated Neural Architecture Search framework, guided by the principles of Reinforcement Learning.
arXiv Detail & Related papers (2023-01-17T04:01:47Z)
- Neural Architecture Search for Dense Prediction Tasks in Computer Vision [74.9839082859151]
Deep learning has led to a rising demand for neural network architecture engineering.
Neural architecture search (NAS) aims at automatically designing neural network architectures in a data-driven manner rather than manually.
NAS has become applicable to a much wider range of problems in computer vision.
arXiv Detail & Related papers (2022-02-15T08:06:50Z)
- Self Semi Supervised Neural Architecture Search for Semantic Segmentation [6.488575826304023]
We propose a Neural Architecture Search strategy based on self-supervision and semi-supervised learning for the task of semantic segmentation.
Our approach builds an optimized neural network model for this task.
Experiments on the Cityscapes and PASCAL VOC 2012 datasets demonstrate that the discovered neural network is more efficient than a state-of-the-art hand-crafted NN model.
arXiv Detail & Related papers (2022-01-29T19:49:44Z)
- Improving the sample-efficiency of neural architecture search with reinforcement learning [0.0]
In this work, we would like to contribute to the area of Automated Machine Learning (AutoML).
Our focus is on one of the most promising research directions, reinforcement learning.
The validation accuracies of the child networks serve as a reward signal for training the controller.
We propose to modify this to use a more modern and complex algorithm, PPO, which has been shown to be faster and more stable in other environments (a minimal controller sketch follows this entry).
arXiv Detail & Related papers (2021-10-13T14:30:09Z)
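The two RL-based entries above share one recipe: a controller policy samples architecture choices, and the child network's validation accuracy serves as the reward. The sketch below caricatures that with a single categorical choice and a single-sample version of PPO's clipped objective; the search space, the stand-in reward, and all hyperparameters are our assumptions, not either paper's setup.

```python
import torch
import torch.nn as nn

CHOICES = [16, 32, 64, 128]  # toy search space: width of one layer


class Controller(nn.Module):
    """Trivial policy over a single architecture decision."""

    def __init__(self):
        super().__init__()
        self.logits = nn.Parameter(torch.zeros(len(CHOICES)))

    def dist(self) -> torch.distributions.Categorical:
        return torch.distributions.Categorical(logits=self.logits)


def fake_validation_accuracy(width: int) -> float:
    """Stand-in for training a child network and measuring accuracy."""
    return 1.0 - abs(width - 64) / 128.0


controller = Controller()
opt = torch.optim.Adam(controller.parameters(), lr=0.1)
clip = 0.2

for step in range(200):
    with torch.no_grad():
        old = controller.dist()
        action = old.sample()
        old_logp = old.log_prob(action)
    reward = fake_validation_accuracy(CHOICES[action])
    advantage = reward - 0.75  # crude constant baseline
    # Single-sample PPO clipped surrogate objective
    new_logp = controller.dist().log_prob(action)
    ratio = torch.exp(new_logp - old_logp)
    loss = -torch.min(ratio * advantage,
                      torch.clamp(ratio, 1 - clip, 1 + clip) * advantage)
    opt.zero_grad()
    loss.backward()
    opt.step()

print("most likely width:", CHOICES[controller.logits.argmax()])
```

Real PPO reuses a batch of sampled architectures for several update epochs under the frozen old policy; this degenerate one-sample version only illustrates the objective's shape.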
- D-DARTS: Distributed Differentiable Architecture Search [75.12821786565318]
Differentiable ARchiTecture Search (DARTS) is one of the most trending Neural Architecture Search (NAS) methods.
We propose D-DARTS, a novel solution that addresses this problem by nesting several neural networks at the cell level (a sketch of the underlying DARTS relaxation follows this entry).
arXiv Detail & Related papers (2021-08-20T09:07:01Z)
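DARTS, which D-DARTS extends at the cell level, relaxes the discrete choice of operation on each edge of a cell into a softmax-weighted mixture governed by architecture parameters alpha. A minimal sketch of one such mixed edge follows; the candidate operations and tensor sizes are illustrative, not either paper's search space.

```python
import torch
import torch.nn as nn


class MixedOp(nn.Module):
    """One DARTS edge: a softmax-weighted mixture of candidate ops."""

    def __init__(self, channels: int):
        super().__init__()
        self.ops = nn.ModuleList([
            nn.Identity(),                                # skip connection
            nn.Conv2d(channels, channels, 3, padding=1),  # 3x3 convolution
            nn.AvgPool2d(3, stride=1, padding=1),         # average pooling
        ])
        # one architecture parameter per candidate operation
        self.alpha = nn.Parameter(torch.zeros(len(self.ops)))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        w = torch.softmax(self.alpha, dim=0)
        return sum(wi * op(x) for wi, op in zip(w, self.ops))


x = torch.randn(2, 8, 16, 16)
edge = MixedOp(channels=8)
y = edge(x)  # weighted sum of all candidate operations
print(y.shape, edge.alpha.softmax(0))
# After bilevel optimization of alpha (on validation data) and the
# weights (on training data), the discrete architecture keeps the
# operation with the largest alpha on each edge.
```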
- Efficient Neural Architecture Search with Performance Prediction [0.0]
We use neural architecture search to find the best network architecture for the task at hand.
Existing NAS algorithms generally evaluate the fitness of a new architecture by fully training it from scratch.
An end-to-end offline performance predictor is proposed to accelerate the evaluation of sampled architectures (see the sketch after this entry).
arXiv Detail & Related papers (2021-08-04T05:44:16Z)
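A minimal sketch of the offline-predictor idea: fit a small regressor from fixed-length architecture encodings to measured accuracies, then rank new candidates without training them. The encoding scheme, the toy ground truth, and the regressor itself are our assumptions, not the paper's predictor.

```python
import torch
import torch.nn as nn

ENC_DIM = 8  # e.g., one-hot operation choices plus depth/width features

predictor = nn.Sequential(
    nn.Linear(ENC_DIM, 32), nn.ReLU(),
    nn.Linear(32, 1),  # predicted validation accuracy
)

# Offline training set: (architecture encoding, measured accuracy) pairs
# collected from a modest pool of fully trained architectures.
encodings = torch.rand(256, ENC_DIM)
accuracies = encodings.mean(dim=1, keepdim=True)  # toy ground truth

opt = torch.optim.Adam(predictor.parameters(), lr=1e-3)
for _ in range(500):
    opt.zero_grad()
    loss = nn.functional.mse_loss(predictor(encodings), accuracies)
    loss.backward()
    opt.step()

# At search time, rank candidates by predicted accuracy and fully train
# only the most promising few.
candidates = torch.rand(1000, ENC_DIM)
top5 = candidates[predictor(candidates).squeeze(1).topk(5).indices]
print(top5.shape)
```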
- Firefly Neural Architecture Descent: a General Approach for Growing Neural Networks [50.684661759340145]
Firefly neural architecture descent is a general framework for progressively and dynamically growing neural networks.
We show that firefly descent can flexibly grow networks both wider and deeper, and can be applied to learn accurate but resource-efficient neural architectures (a function-preserving widening sketch follows this entry).
In particular, it learns networks that are smaller in size but have higher average accuracy than those learned by state-of-the-art methods.
arXiv Detail & Related papers (2021-02-17T04:47:18Z)
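A minimal sketch of function-preserving widening, in the spirit of a "grow wider" step: new hidden units enter with zero outgoing weights, so the widened network initially computes exactly what the old one did, and training then decides what the new capacity learns. The sizes and the initialization rule are our assumptions, not firefly descent's actual splitting criterion.

```python
import torch
import torch.nn as nn


def grow_wider(fc1: nn.Linear, fc2: nn.Linear, extra: int):
    """Add `extra` hidden units between two linear layers."""
    wide1 = nn.Linear(fc1.in_features, fc1.out_features + extra)
    wide2 = nn.Linear(fc2.in_features + extra, fc2.out_features)
    with torch.no_grad():
        # Copy the old units; new incoming weights keep random init.
        wide1.weight[:fc1.out_features] = fc1.weight
        wide1.bias[:fc1.out_features] = fc1.bias
        # Zero the outgoing weights of new units => function preserved.
        wide2.weight[:, :fc2.in_features] = fc2.weight
        wide2.weight[:, fc2.in_features:] = 0.0
        wide2.bias.copy_(fc2.bias)
    return wide1, wide2


fc1, fc2 = nn.Linear(4, 8), nn.Linear(8, 2)
w1, w2 = grow_wider(fc1, fc2, extra=4)
x = torch.randn(3, 4)
old = fc2(torch.relu(fc1(x)))
new = w2(torch.relu(w1(x)))
print(torch.allclose(old, new, atol=1e-6))  # True: growth preserves the function
```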
- NAS-DIP: Learning Deep Image Prior with Neural Architecture Search [65.79109790446257]
Recent work has shown that the structure of deep convolutional neural networks can be used as a structured image prior.
We propose to search for neural architectures that capture stronger image priors.
We search for an improved network by leveraging an existing neural architecture search algorithm.
arXiv Detail & Related papers (2020-08-26T17:59:36Z)
- Multi-fidelity Neural Architecture Search with Knowledge Distillation [69.09782590880367]
We propose MF-KD, a Bayesian multi-fidelity method for neural architecture search.
Knowledge distillation adds a term to the loss function that forces the network to mimic a teacher network (a sketch of this loss follows this entry).
We show that training for a few epochs with such a modified loss function leads to a better selection of neural architectures than training for a few epochs with the logistic loss.
arXiv Detail & Related papers (2020-06-15T12:32:38Z)
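A minimal sketch of the distillation term described above: the student's loss mixes the usual cross-entropy with a KL term pulling its temperature-softened logits toward the teacher's. The temperature and mixing weight are illustrative defaults; MF-KD's exact multi-fidelity formulation may differ.

```python
import torch
import torch.nn.functional as F


def kd_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.9):
    """Cross-entropy plus a teacher-mimicry term."""
    # standard supervised term
    ce = F.cross_entropy(student_logits, labels)
    # mimicry term: KL between softened teacher and student distributions
    kl = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)  # conventional T^2 scaling keeps gradient magnitudes comparable
    return (1 - alpha) * ce + alpha * kl


student_logits = torch.randn(8, 10, requires_grad=True)
teacher_logits = torch.randn(8, 10)
labels = torch.randint(0, 10, (8,))
loss = kd_loss(student_logits, teacher_logits, labels)
loss.backward()
print(loss.item())
```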
- A Semi-Supervised Assessor of Neural Architectures [157.76189339451565]
We employ an auto-encoder to discover meaningful representations of neural architectures.
A graph convolutional neural network is introduced to predict the performance of architectures.
arXiv Detail & Related papers (2020-05-14T09:02:33Z)