Unchain the Search Space with Hierarchical Differentiable Architecture Search
- URL: http://arxiv.org/abs/2101.04028v2
- Date: Tue, 12 Jan 2021 04:00:56 GMT
- Title: Unchain the Search Space with Hierarchical Differentiable Architecture Search
- Authors: Guanting Liu, Yujie Zhong, Sheng Guo, Matthew R. Scott, Weilin Huang
- Abstract summary: DAS-based methods mainly focus on searching for a repeatable cell structure, which is then stacked sequentially in multiple stages to form the networks.
We propose a Hierarchical Differentiable Architecture Search (H-DAS) that performs architecture search both at the cell level and at the stage level.
For the stage-level search, we systematically study the architectures of stages, including the number of cells in each stage and the connections between the cells.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Differentiable architecture search (DAS) has made great progress in searching
for high-performance architectures with reduced computational cost. However,
DAS-based methods mainly focus on searching for a repeatable cell structure,
which is then stacked sequentially in multiple stages to form the networks.
This configuration significantly reduces the search space, and ignores the
importance of connections between the cells. To overcome this limitation, in
this paper, we propose a Hierarchical Differentiable Architecture Search
(H-DAS) that performs architecture search both at the cell level and at the
stage level. Specifically, the cell-level search space is relaxed so that the
networks can learn stage-specific cell structures. For the stage-level search,
we systematically study the architectures of stages, including the number of
cells in each stage and the connections between the cells. Based on insightful
observations, we design several search rules and losses, and manage to search
for better stage-level architectures. Such a hierarchical search space greatly
improves the performance of the networks without introducing expensive search
cost. Extensive experiments on CIFAR10 and ImageNet demonstrate the
effectiveness of the proposed H-DAS. Moreover, the searched stage-level
architectures can be combined with the cell structures searched by existing DAS
methods to further boost the performance. Code is available at:
https://github.com/MalongTech/research-HDAS
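The cell-level search described in the abstract builds on the standard differentiable-relaxation idea (as in DARTS): each edge of a cell computes a softmax-weighted mixture of candidate operations, so the architecture parameters can be optimized by gradient descent together with the network weights. The minimal sketch below illustrates that mechanism only; the candidate-op set, scalar inputs, and function names are illustrative assumptions, not the paper's actual implementation.

```python
import math

def softmax(xs):
    # Numerically stable softmax over a list of architecture parameters.
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

# Hypothetical candidate operations on one cell edge (real search spaces
# use conv/pooling/skip ops; scalars keep the sketch self-contained).
OPS = [
    ("identity", lambda x: x),
    ("double",   lambda x: 2.0 * x),
    ("zero",     lambda x: 0.0),
]

def mixed_op(x, alpha):
    """Continuous relaxation: the edge outputs a softmax-weighted sum of
    all candidate ops, making the discrete choice differentiable."""
    weights = softmax(alpha)
    return sum(w * op(x) for w, (_, op) in zip(weights, OPS))

def derive_op(alpha):
    """After search, the edge keeps the op with the largest weight."""
    return OPS[max(range(len(alpha)), key=lambda i: alpha[i])][0]
```

For example, `derive_op([2.0, 0.0, -2.0])` selects `"identity"`, since its architecture parameter dominates after the softmax. H-DAS applies this kind of relaxation per stage (rather than sharing one cell network-wide) and additionally searches over cell counts and inter-cell connections at the stage level.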
Related papers
- EM-DARTS: Hierarchical Differentiable Architecture Search for Eye Movement Recognition [54.99121380536659]
Eye movement biometrics have received increasing attention thanks to their highly secure identification.
Deep learning (DL) models have been recently successfully applied for eye movement recognition.
However, the DL architecture is still determined by human prior knowledge.
We propose EM-DARTS, a hierarchical differentiable architecture search algorithm to automatically design the DL architecture for eye movement recognition.
arXiv Detail & Related papers (2024-09-22T13:11:08Z)
- Efficient NAS with FaDE on Hierarchical Spaces [0.6372911857214884]
We present FaDE which uses differentiable architecture search to obtain relative performance predictions on finite regions of a hierarchical NAS space.
FaDE is especially well suited to deep hierarchical (i.e., multi-cell) search spaces.
arXiv Detail & Related papers (2024-04-24T21:33:17Z)
- Construction of Hierarchical Neural Architecture Search Spaces based on Context-free Grammars [66.05096551112932]
We introduce a unifying search space design framework based on context-free grammars.
By enhancing and using their properties, we effectively enable search over the complete architecture.
We show that our search strategy can be superior to existing Neural Architecture Search approaches.
arXiv Detail & Related papers (2022-11-03T14:23:00Z)
- Searching a High-Performance Feature Extractor for Text Recognition Network [92.12492627169108]
We design a domain-specific search space by exploring principles for having good feature extractors.
As the space is huge and complexly structured, no existing NAS algorithms can be applied.
We propose a two-stage algorithm to effectively search in the space.
arXiv Detail & Related papers (2022-09-27T03:49:04Z)
- On Redundancy and Diversity in Cell-based Neural Architecture Search [44.337381243798085]
We conduct an empirical analysis of architectures from the popular cell-based search spaces.
We find that the architecture performance is minimally sensitive to changes at large parts of the cells.
By explicitly constraining cells to include these patterns, randomly sampled architectures can match or even outperform the state of the art.
arXiv Detail & Related papers (2022-03-16T18:59:29Z)
- Towards Improving the Consistency, Efficiency, and Flexibility of Differentiable Neural Architecture Search [84.4140192638394]
Most differentiable neural architecture search methods construct a super-net for search and derive a target-net as its sub-graph for evaluation.
In this paper, we introduce EnTranNAS that is composed of Engine-cells and Transit-cells.
Our method also spares much memory and computation cost, which speeds up the search process.
arXiv Detail & Related papers (2021-01-27T12:16:47Z)
- ISTA-NAS: Efficient and Consistent Neural Architecture Search by Sparse Coding [86.40042104698792]
We formulate neural architecture search as a sparse coding problem.
In experiments, our two-stage method on CIFAR-10 requires only 0.05 GPU-day for search.
Our one-stage method produces state-of-the-art performances on both CIFAR-10 and ImageNet at the cost of only evaluation time.
arXiv Detail & Related papers (2020-10-13T04:34:24Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.