DOTS: Decoupling Operation and Topology in Differentiable Architecture
Search
- URL: http://arxiv.org/abs/2010.00969v3
- Date: Thu, 8 Apr 2021 07:36:23 GMT
- Title: DOTS: Decoupling Operation and Topology in Differentiable Architecture
Search
- Authors: Yu-Chao Gu, Li-Juan Wang, Yun Liu, Yi Yang, Yu-Huan Wu, Shao-Ping Lu,
Ming-Ming Cheng
- Abstract summary: Differentiable Architecture Search (DARTS) has attracted extensive attention due to its efficiency in searching for cell structures.
We propose to Decouple the Operation and Topology Search (DOTS) to make an explicit topology search.
- Score: 115.89211594258573
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Differentiable Architecture Search (DARTS) has attracted extensive attention
due to its efficiency in searching for cell structures. DARTS mainly focuses on
the operation search and derives the cell topology from the operation weights.
However, the operation weights cannot indicate the importance of the cell
topology, resulting in poor topology rating correctness. To tackle this, we propose to
Decouple the Operation and Topology Search (DOTS), which decouples the topology
representation from operation weights and makes an explicit topology search.
DOTS is achieved by introducing a topology search space that contains
combinations of candidate edges. The proposed search space directly reflects
the search objective and can be easily extended to support a flexible number of
edges in the searched cell. Existing gradient-based NAS methods can be
incorporated into DOTS and further improved by the topology search.
Considering that some operations (e.g., Skip-Connection) can affect the
topology, we propose a group operation search scheme to preserve
topology-related operations for a better topology search. The experiments on
CIFAR10/100 and ImageNet demonstrate that DOTS is an effective solution for
differentiable NAS.
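The abstract's core idea, a topology search space built from combinations of candidate edges whose weights are decoupled from the operation weights, can be made concrete with a short sketch. The PyTorch snippet below is a minimal, hedged reading of the abstract rather than the authors' released code; the class name `TopologySearch` and the parameters `n_in` and `k` are illustrative choices.

```python
# Minimal sketch of an explicit topology search space, assuming an
# edge-combination parameterization as described in the abstract.
# Names here are illustrative, not the authors' implementation.
import itertools
import torch
import torch.nn as nn
import torch.nn.functional as F

class TopologySearch(nn.Module):
    """Learns a distribution over combinations of candidate input edges
    for one cell node, decoupled from any operation weights."""

    def __init__(self, n_in: int, k: int = 2):
        super().__init__()
        self.n_in = n_in
        # All C(n_in, k) ways to keep k of the n_in candidate edges.
        self.combos = list(itertools.combinations(range(n_in), k))
        # One learnable logit per candidate edge combination.
        self.alpha = nn.Parameter(1e-3 * torch.randn(len(self.combos)))

    def edge_weights(self) -> torch.Tensor:
        # Marginal weight of an edge: the total probability mass of the
        # combinations that contain it.
        probs = F.softmax(self.alpha, dim=0)
        w = torch.zeros(self.n_in)
        for p, combo in zip(probs, self.combos):
            for e in combo:
                w[e] = w[e] + p
        return w

    def forward(self, edge_outputs):
        # Continuous relaxation during search; after search, the discrete
        # topology is the highest-probability combination.
        w = self.edge_weights()
        return sum(w[i] * x for i, x in enumerate(edge_outputs))

# Example: a node with 4 candidate input edges, of which 2 are kept.
node = TopologySearch(n_in=4, k=2)
out = node([torch.randn(1, 8) for _ in range(4)])
```

Under this reading, deriving the final topology reduces to an argmax over combination probabilities, which is what makes the topology rating explicit rather than inferred from operation weights.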
Related papers
- TopoNAS: Boosting Search Efficiency of Gradient-based NAS via Topological Simplification [11.08910129925713]
TopoNAS is a model-agnostic approach for gradient-based one-shot NAS.
It significantly reduces searching time and memory usage by topological simplification of searchable paths.
arXiv Detail & Related papers (2024-08-02T15:01:29Z)
- Efficient NAS with FaDE on Hierarchical Spaces [0.6372911857214884]
We present FaDE which uses differentiable architecture search to obtain relative performance predictions on finite regions of a hierarchical NAS space.
FaDE is especially suited to deep hierarchical, multi-cell search spaces.
arXiv Detail & Related papers (2024-04-24T21:33:17Z)
- ECToNAS: Evolutionary Cross-Topology Neural Architecture Search [0.0]
ECToNAS is a cost-efficient evolutionary cross-topology neural architecture search algorithm.
It fuses training and topology optimisation together into one lightweight, resource-friendly process.
arXiv Detail & Related papers (2024-03-08T07:36:46Z)
- $\beta$-DARTS: Beta-Decay Regularization for Differentiable Architecture Search [85.84110365657455]
We propose a simple-but-efficient regularization method, termed Beta-Decay, to regularize the DARTS-based NAS searching process.
Experimental results on NAS-Bench-201 show that the proposed method helps stabilize the searching process and makes the searched network more transferable across different datasets; a hedged sketch of such a regularizer follows.
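As an illustration only: one plausible reading of Beta-Decay is a smooth-maximum (logsumexp) penalty on each edge's architecture logits, which directly constrains the softmaxed beta values. The helper below is a hypothetical sketch under that assumption, not the paper's implementation.

```python
# Hedged sketch of a Beta-Decay-style regularizer: penalize the logsumexp of
# each edge's architecture logits so that no softmaxed weight (beta) saturates.
# The logsumexp form is an assumption from the abstract, not verified code.
import torch

def beta_decay_loss(alpha: torch.Tensor) -> torch.Tensor:
    # alpha: (num_edges, num_ops) architecture logits of a DARTS supernet.
    return torch.logsumexp(alpha, dim=-1).mean()

# Hypothetical use in a DARTS-style architecture update step:
#   arch_loss = val_loss + reg_weight * beta_decay_loss(alpha)
```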
arXiv Detail & Related papers (2022-03-03T11:47:14Z)
- Exploring Complicated Search Spaces with Interleaving-Free Sampling [127.07551427957362]
In this paper, we build the search algorithm upon a complicated search space with long-distance connections.
We present a simple yet effective algorithm named IF-NAS, which uses a periodic sampling strategy to construct different sub-networks.
In the proposed search space, IF-NAS outperforms both random sampling and previous weight-sharing search algorithms by a significant margin.
arXiv Detail & Related papers (2021-12-05T06:42:48Z)
- DiNTS: Differentiable Neural Network Topology Search for 3D Medical Image Segmentation [7.003867673687463]
The Differentiable Network Topology Search scheme (DiNTS) is evaluated on the Medical Segmentation Decathlon (MSD) challenge.
Our method achieves state-of-the-art performance and the top ranking on the MSD challenge leaderboard.
arXiv Detail & Related papers (2021-03-29T21:02:42Z)
- Towards Improving the Consistency, Efficiency, and Flexibility of Differentiable Neural Architecture Search [84.4140192638394]
Most differentiable neural architecture search methods construct a super-net for search and derive a target-net as its sub-graph for evaluation.
In this paper, we introduce EnTranNAS that is composed of Engine-cells and Transit-cells.
Our method also spares much memory and computation cost, which speeds up the search process.
arXiv Detail & Related papers (2021-01-27T12:16:47Z)
- Stretchable Cells Help DARTS Search Better [70.52254306274092]
Differentiable neural architecture search (DARTS) has gained much success in discovering flexible and diverse cell types.
Current DARTS methods are prone to wide and shallow cells, and this topology collapse induces sub-optimal searched cells.
In this paper, we endow the cells with explicit stretchability, so the search can be performed directly on our stretchable cells.
arXiv Detail & Related papers (2020-11-18T14:15:51Z)
- MTL-NAS: Task-Agnostic Neural Architecture Search towards General-Purpose Multi-Task Learning [71.90902837008278]
We propose to incorporate neural architecture search (NAS) into general-purpose multi-task learning (GP-MTL).
In order to adapt to different task combinations, we disentangle the GP-MTL networks into single-task backbones.
We also propose a novel single-shot gradient-based search algorithm that closes the performance gap between the searched architectures and the final evaluation architecture.
arXiv Detail & Related papers (2020-03-31T09:49:14Z)