On Redundancy and Diversity in Cell-based Neural Architecture Search
- URL: http://arxiv.org/abs/2203.08887v1
- Date: Wed, 16 Mar 2022 18:59:29 GMT
- Title: On Redundancy and Diversity in Cell-based Neural Architecture Search
- Authors: Xingchen Wan, Binxin Ru, Pedro M. Esperança, Zhenguo Li
- Abstract summary: We conduct an empirical analysis of architectures from the popular cell-based search spaces.
We find that the architecture performance is minimally sensitive to changes at large parts of the cells.
By explicitly constraining cells to include these patterns, randomly sampled architectures can match or even outperform the state of the art.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Searching for architecture cells is a dominant paradigm in NAS. However,
little attention has been devoted to the analysis of the cell-based search
spaces even though it is highly important for the continual development of NAS.
In this work, we conduct an empirical post-hoc analysis of architectures from
the popular cell-based search spaces and find that the existing search spaces
contain a high degree of redundancy: the architecture performance is minimally
sensitive to changes at large parts of the cells, and universally adopted
designs, like the explicit search for a reduction cell, significantly increase
the complexity but have very limited impact on performance. Across
architectures found by a diverse set of search strategies, we consistently find
that the parts of the cells that do matter for architecture performance often
follow similar and simple patterns. By explicitly constraining cells to include
these patterns, randomly sampled architectures can match or even outperform the
state of the art. These findings cast doubt on our ability to discover truly
novel architectures in the existing cell-based search spaces, and inspire our
suggestions for improvement to guide future NAS research. Code is available at
https://github.com/xingchenwan/cell-based-NAS-analysis.
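The paper's central observation, that constraining cells to a few simple, high-performing patterns lets randomly sampled architectures compete with searched ones, is straightforward to operationalise. The sketch below is a minimal, hypothetical illustration (not the authors' released code, which is linked above): it samples a DARTS-style cell at random while pinning one edge to a fixed operation as a stand-in for such a pattern. The operation pool, cell encoding, and constraint format are all assumptions for illustration.

```python
import random

# Illustrative operation pool for a DARTS-like search space (an assumption:
# real cell-based spaces use similar, but not identical, operation sets).
OPS = ["sep_conv_3x3", "sep_conv_5x5", "dil_conv_3x3", "dil_conv_5x5",
       "max_pool_3x3", "avg_pool_3x3", "skip_connect"]

def sample_cell(num_nodes=4, constraint=None, rng=random):
    """Randomly sample a DARTS-style cell.

    Each intermediate node chooses 2 distinct inputs from the two cell
    inputs (indices 0 and 1) and all earlier intermediate nodes, plus one
    operation per chosen edge. `constraint` maps (input_index, node_index)
    to a fixed operation, pinning the kind of simple pattern the paper
    finds to dominate performance.
    """
    constraint = constraint or {}
    genotype = []
    for node in range(num_nodes):
        candidates = list(range(2 + node))   # 2 cell inputs + earlier nodes
        inputs = rng.sample(candidates, 2)
        # Make sure every constrained edge ending at this node is present.
        for (src, dst) in constraint:
            if dst == node and src not in inputs:
                inputs[0] = src
        edges = []
        for src in inputs:
            op = constraint.get((src, node)) or rng.choice(OPS)
            edges.append((op, src))
        genotype.append(edges)
    return genotype

# Pin a separable convolution from cell input 1 to the first intermediate
# node -- a stand-in for the recurring patterns identified in the paper --
# and sample everything else uniformly at random.
pattern = {(1, 0): "sep_conv_3x3"}
print(sample_cell(constraint=pattern))
```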
Related papers
- EM-DARTS: Hierarchical Differentiable Architecture Search for Eye Movement Recognition [54.99121380536659]
Eye movement biometrics have received increasing attention thanks to their highly secure identification.
Deep learning (DL) models have been recently successfully applied for eye movement recognition.
However, the DL architecture is still determined by human prior knowledge.
We propose EM-DARTS, a hierarchical differentiable architecture search algorithm to automatically design the DL architecture for eye movement recognition.
arXiv Detail & Related papers (2024-09-22T13:11:08Z)
- einspace: Searching for Neural Architectures from Fundamental Operations [28.346238250052455]
We introduce einspace, a search space based on a parameterised probabilistic context-free grammar (a toy sketch of grammar-based sampling appears after this list).
We show that competitive architectures can be obtained by searching from scratch, and we consistently find large improvements when initialising the search with strong baselines.
arXiv Detail & Related papers (2024-05-31T14:25:45Z)
- Construction of Hierarchical Neural Architecture Search Spaces based on Context-free Grammars [66.05096551112932]
We introduce a unifying search space design framework based on context-free grammars.
By enhancing and using their properties, we effectively enable search over the complete architecture.
We show that our search strategy can be superior to existing Neural Architecture Search approaches.
arXiv Detail & Related papers (2022-11-03T14:23:00Z)
- On the Privacy Risks of Cell-Based NAS Architectures [28.71028000150282]
We systematically measure the privacy risks of NAS architectures.
We shed light on how to design robust NAS architectures against privacy attacks.
We offer a general methodology to understand the hidden correlation between the NAS-searched architectures and other privacy risks.
arXiv Detail & Related papers (2022-09-04T20:24:04Z)
- Towards Less Constrained Macro-Neural Architecture Search [2.685668802278155]
Neural Architecture Search (NAS) networks achieve state-of-the-art performance in a variety of tasks.
Most NAS methods rely heavily on human-defined assumptions that constrain the search.
We propose LCMNAS and present experiments showing that it generates state-of-the-art architectures from scratch with minimal GPU computation.
arXiv Detail & Related papers (2022-03-10T17:53:03Z)
- $\beta$-DARTS: Beta-Decay Regularization for Differentiable Architecture Search [85.84110365657455]
We propose a simple but efficient regularization method, termed Beta-Decay, to regularize the DARTS-based NAS searching process.
Experimental results on NAS-Bench-201 show that our proposed method can help to stabilize the searching process and make the searched networks more transferable across different datasets.
arXiv Detail & Related papers (2022-03-03T11:47:14Z)
- D-DARTS: Distributed Differentiable Architecture Search [75.12821786565318]
Differentiable ARchiTecture Search (DARTS) is one of the most popular Neural Architecture Search (NAS) methods, but it searches for a single cell that is then stacked to form the full network.
We propose D-DARTS, a novel solution that addresses this limitation by nesting several neural networks at the cell level.
arXiv Detail & Related papers (2021-08-20T09:07:01Z)
- Unchain the Search Space with Hierarchical Differentiable Architecture Search [42.32368267716705]
Differentiable architecture search (DAS) methods mainly focus on searching for a repeatable cell structure, which is then stacked sequentially in multiple stages to form the network.
We propose a Hierarchical Differentiable Architecture Search (H-DAS) that performs architecture search both at the cell level and at the stage level.
For the stage-level search, we systematically study the architectures of stages, including the number of cells in each stage and the connections between the cells.
arXiv Detail & Related papers (2021-01-11T17:01:43Z)
- Memory-Efficient Hierarchical Neural Architecture Search for Image Restoration [68.6505473346005]
We propose HiNAS, a memory-efficient hierarchical NAS method for image denoising and image super-resolution tasks.
With a single GTX 1080 Ti GPU, searching takes only about 1 hour for the denoising network on BSD500 and 3.5 hours for the super-resolution structure on DIV2K.
arXiv Detail & Related papers (2020-12-24T12:06:17Z)
- Learning Architectures from an Extended Search Space for Language Modeling [37.79977691127229]
We present a general approach to learning both intra-cell and inter-cell architectures in neural architecture search (NAS).
For recurrent neural language modeling, it outperforms a strong baseline significantly on the PTB and WikiText datasets, setting a new state of the art on PTB.
The learned architectures show good transferability to other systems.
arXiv Detail & Related papers (2020-05-06T05:02:33Z)
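Two of the entries above (einspace and the hierarchical search-space construction) build search spaces from context-free grammars. As a rough illustration of that mechanism only, the toy sketch below samples an architecture description from a small probabilistic context-free grammar; the nonterminals, productions, and probabilities are invented for illustration and are not the grammars used in either paper.

```python
import random

# A toy probabilistic context-free grammar over architecture strings.
# Nonterminals map to (production, probability) pairs. The symbols and
# probabilities here are invented for illustration; they are NOT the
# grammars used by einspace or the hierarchical-CFG paper.
GRAMMAR = {
    "NET":   [(["BLOCK"], 0.3), (["BLOCK", "NET"], 0.7)],
    "BLOCK": [(["seq(", "OP", ",", "OP", ")"], 0.5),
              (["residual(", "OP", ")"], 0.5)],
    "OP":    [(["conv3x3"], 0.4), (["conv1x1"], 0.3), (["maxpool"], 0.3)],
}

def sample(symbol="NET", rng=random, depth=0, max_depth=8):
    """Expand `symbol` top-down until only terminal strings remain."""
    if symbol not in GRAMMAR:            # terminal symbol: emit verbatim
        return symbol
    productions, weights = zip(*GRAMMAR[symbol])
    if depth >= max_depth:               # cap recursion with the first rule
        chosen = productions[0]
    else:
        chosen = rng.choices(productions, weights=weights, k=1)[0]
    return "".join(sample(s, rng, depth + 1, max_depth) for s in chosen)

# Each call draws one architecture description from the grammar, e.g.
# "seq(conv3x3,maxpool)residual(conv1x1)".
print(sample())
```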
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences arising from its use.