RC-DARTS: Resource Constrained Differentiable Architecture Search
- URL: http://arxiv.org/abs/1912.12814v1
- Date: Mon, 30 Dec 2019 05:02:38 GMT
- Title: RC-DARTS: Resource Constrained Differentiable Architecture Search
- Authors: Xiaojie Jin, Jiang Wang, Joshua Slocum, Ming-Hsuan Yang, Shengyang
Dai, Shuicheng Yan, Jiashi Feng
- Abstract summary: We propose the resource constrained differentiable architecture search (RC-DARTS) method to learn architectures that are significantly smaller and faster.
We show that RC-DARTS learns lightweight neural architectures with smaller model size and lower computational complexity.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Recent advances show that Neural Architecture Search (NAS) methods are
able to find state-of-the-art deep architectures for image classification. In
this paper, we consider the one-shot NAS problem for resource-constrained
applications. This problem is of great interest because, when resources are
constrained, it is critical to choose different architectures according to task
complexity. Previous techniques are either too slow for one-shot learning or do
not take the resource constraint into consideration. We propose the
resource-constrained differentiable architecture search (RC-DARTS) method to
learn architectures that are significantly smaller and faster while achieving
comparable accuracy. Specifically, we formulate the RC-DARTS task as a
constrained optimization problem by adding the resource constraint, and propose
an iterative projection method to solve it. We also propose a multi-level
search strategy that enables layers at different depths to adaptively learn
different types of neural architectures. Through extensive experiments on the
CIFAR-10 and ImageNet datasets, we show that RC-DARTS learns lightweight neural
architectures with smaller model size and lower computational complexity while
achieving comparable or better performance than state-of-the-art methods.
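The constrained formulation and iterative projection described in the abstract can be sketched as follows. This is a minimal single-edge illustration: the operation costs are made up, and the projection rule (gradient steps on the expected cost until the budget is met) is illustrative, not the paper's exact algorithm.

```python
import numpy as np

def softmax(a):
    e = np.exp(a - a.max())
    return e / e.sum()

def expected_cost(alpha, op_costs):
    # Expected resource cost of the mixed operation under softmax weights.
    return softmax(alpha) @ op_costs

def project_to_budget(alpha, op_costs, budget, lr=0.5, max_iter=200):
    """Illustrative projection step: descend on the expected cost of the
    architecture parameters until the resource budget is satisfied
    (a stand-in for the paper's iterative projection method)."""
    alpha = alpha.copy()
    for _ in range(max_iter):
        if expected_cost(alpha, op_costs) <= budget:
            break
        w = softmax(alpha)
        # d(expected cost)/d(alpha) for a softmax-weighted sum of costs
        grad = w * (op_costs - w @ op_costs)
        alpha -= lr * grad
    return alpha
```

In the full method this projection would alternate with the usual DARTS gradient updates on accuracy, so the search stays inside the feasible (resource-respecting) region.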
Related papers
- Efficient Search of Multiple Neural Architectures with Different
Complexities via Importance Sampling [3.759936323189417]
This study focuses on architecture complexity-aware one-shot NAS that optimizes an objective function composed of the weighted sum of two metrics.
The proposed method is applied to the architecture search of convolutional neural networks on the CIFAR-10 and ImageNet datasets.
arXiv Detail & Related papers (2022-07-21T07:06:03Z)
- Resource-Constrained Neural Architecture Search on Tabular Datasets [38.765317261872504]
The best neural architecture for a given machine learning problem depends on many factors, including the complexity and structure of the dataset.
Previous NAS algorithms incorporate resource constraints directly into the reinforcement learning rewards.
We propose a new reinforcement learning controller to address these challenges.
arXiv Detail & Related papers (2022-04-15T19:03:25Z)
- D-DARTS: Distributed Differentiable Architecture Search [75.12821786565318]
Differentiable ARchiTecture Search (DARTS) is one of the most trending Neural Architecture Search (NAS) methods.
We propose D-DARTS, a novel solution that addresses this problem by nesting several neural networks at the cell level.
arXiv Detail & Related papers (2021-08-20T09:07:01Z)
- Rethinking Architecture Selection in Differentiable NAS [74.61723678821049]
Differentiable Neural Architecture Search is one of the most popular NAS methods for its search efficiency and simplicity.
We propose an alternative perturbation-based architecture selection that directly measures each operation's influence on the supernet.
We find that several failure modes of DARTS can be greatly alleviated with the proposed selection method.
arXiv Detail & Related papers (2021-08-10T00:53:39Z)
- Elastic Architecture Search for Diverse Tasks with Different Resources [87.23061200971912]
We study the new and challenging problem of efficient deployment for diverse tasks with different resources, where the resource constraint and the task of interest (corresponding to a group of classes) are dynamically specified at test time.
Previous NAS approaches seek to design architectures for all classes simultaneously, which may not be optimal for some individual tasks.
We present a novel and general framework, called Elastic Architecture Search (EAS), permitting instant specializations at runtime for diverse tasks with various resource constraints.
arXiv Detail & Related papers (2021-08-03T00:54:27Z)
- iDARTS: Differentiable Architecture Search with Stochastic Implicit Gradients [75.41173109807735]
Differentiable ARchiTecture Search (DARTS) has recently become the mainstream approach to neural architecture search (NAS).
We tackle the hypergradient computation in DARTS based on the implicit function theorem.
We show that the architecture optimisation with the proposed method, named iDARTS, is expected to converge to a stationary point.
arXiv Detail & Related papers (2021-06-21T00:44:11Z)
- Effective, Efficient and Robust Neural Architecture Search [4.273005643715522]
Recent advances in adversarial attacks show the vulnerability of deep neural networks searched by Neural Architecture Search (NAS).
We propose an Effective, Efficient, and Robust Neural Architecture Search (E2RNAS) method to search a neural network architecture by taking the performance, robustness, and resource constraint into consideration.
Experiments on benchmark datasets show that the proposed E2RNAS method can find adversarially robust architectures with optimized model size and comparable classification accuracy.
arXiv Detail & Related papers (2020-11-19T13:46:23Z)
- Breaking the Curse of Space Explosion: Towards Efficient NAS with Curriculum Search [94.46818035655943]
We propose a curriculum search method that starts from a small search space and gradually incorporates the learned knowledge to guide the search in a large space.
With the proposed search strategy, our Curriculum Neural Architecture Search (CNAS) method significantly improves the search efficiency and finds better architectures than existing NAS methods.
arXiv Detail & Related papers (2020-07-07T02:29:06Z)
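Most of the entries above build on the DARTS continuous relaxation, in which each edge of the searched cell computes a softmax-weighted mixture of candidate operations, making the architecture choice differentiable. A minimal sketch with toy stand-in operations (the real search spaces use convolutions, pooling, identity, and zero ops):

```python
import numpy as np

def mixed_op(x, alphas, ops):
    """DARTS-style continuous relaxation: the edge output is the
    softmax-weighted sum of every candidate operation applied to x."""
    w = np.exp(alphas - alphas.max())
    w = w / w.sum()
    return sum(wi * op(x) for wi, op in zip(w, ops))

# Toy candidates standing in for the real conv/pool/identity/zero ops.
ops = [
    lambda x: x,                  # identity
    lambda x: np.zeros_like(x),   # "zero" (drop the edge)
    lambda x: 2.0 * x,            # stand-in for a parameterized op
]
```

After the search, the relaxation is discretized by keeping the operation with the largest weight on each edge; the papers above differ mainly in how the weights are optimized and selected.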
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.