DDPNAS: Efficient Neural Architecture Search via Dynamic Distribution
Pruning
- URL: http://arxiv.org/abs/1905.13543v3
- Date: Fri, 10 Mar 2023 18:30:42 GMT
- Title: DDPNAS: Efficient Neural Architecture Search via Dynamic Distribution
Pruning
- Authors: Xiawu Zheng, Chenyi Yang, Shaokun Zhang, Yan Wang, Baochang Zhang,
Yongjian Wu, Yunsheng Wu, Ling Shao, Rongrong Ji
- Abstract summary: We propose an efficient and unified NAS framework termed DDPNAS via dynamic distribution pruning.
In particular, we first sample architectures from a joint categorical distribution. Then the search space is dynamically pruned and its distribution is updated every few epochs.
With the proposed efficient network generation method, we directly obtain the optimal neural architectures under the given constraints.
- Score: 135.27931587381596
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Neural Architecture Search (NAS) has demonstrated state-of-the-art
performance on various computer vision tasks. Despite the superior performance
achieved, existing methods suffer from high computational complexity and limited
generality, which makes efficiency and cross-task applicability pressing concerns.
In this paper,
we propose an efficient and unified NAS framework termed DDPNAS via dynamic
distribution pruning, facilitating a theoretical bound on accuracy and
efficiency. In particular, we first sample architectures from a joint
categorical distribution. Then the search space is dynamically pruned and its
distribution is updated every few epochs. With the proposed efficient network
generation method, we directly obtain the optimal neural architectures under
the given constraints, which is practical for on-device models across diverse search
spaces and constraints. The architectures searched by our method achieve
remarkable top-1 accuracies of 97.56% and 77.2% on CIFAR-10 and ImageNet (mobile
settings), respectively, with the fastest search process, i.e., only 1.8 GPU
hours on a Tesla V100. Code for searching and network generation is available
at: https://openi.pcl.ac.cn/PCL_AutoML/XNAS.
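To make the search procedure above more concrete, the following is a minimal sketch of dynamic distribution pruning. It is not the authors' released implementation: the layer/operation counts, the reward-weighted update rule, and the `evaluate` stub are illustrative assumptions; the real method trains the sampled networks and updates the joint categorical distribution from their validation performance.

```python
import numpy as np

# Toy sketch of dynamic distribution pruning (illustrative, not the official code).
rng = np.random.default_rng(0)
L, K = 8, 5                          # layers and candidate operations per layer (assumed sizes)
probs = np.full((L, K), 1.0 / K)     # factorized joint categorical distribution
alive = np.ones((L, K), dtype=bool)  # operations still in the search space

def sample_architecture():
    # One categorical draw per layer from the current distribution.
    return [int(rng.choice(K, p=probs[layer])) for layer in range(L)]

def evaluate(arch):
    # Stand-in for training/validating the sampled network; returns a reward in [0, 1).
    return rng.random()

epochs, prune_every, samples_per_epoch = 12, 3, 16
for epoch in range(epochs):
    rewards = np.zeros((L, K))
    counts = np.zeros((L, K))
    for _ in range(samples_per_epoch):
        arch = sample_architecture()
        r = evaluate(arch)
        for layer, op in enumerate(arch):
            rewards[layer, op] += r
            counts[layer, op] += 1
    # Re-weight surviving operations by their average observed reward.
    avg = np.divide(rewards, counts, out=np.zeros_like(rewards), where=counts > 0)
    probs = np.where(alive, np.exp(avg), 0.0)
    probs /= probs.sum(axis=1, keepdims=True)
    # Dynamic pruning: every few epochs, drop the weakest operation in each layer.
    if (epoch + 1) % prune_every == 0 and alive.sum(axis=1).min() > 1:
        for layer in range(L):
            live = np.flatnonzero(alive[layer])
            worst = live[np.argmin(probs[layer, live])]
            alive[layer, worst] = False
            probs[layer, worst] = 0.0
            probs[layer] /= probs[layer].sum()

best = [int(np.argmax(probs[layer])) for layer in range(L)]
print("most probable architecture (operation index per layer):", best)
```

The point the sketch mirrors is that pruning steadily shrinks the joint distribution, so later epochs sample from an ever smaller space, which is what keeps the search cost low.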
Related papers
- A Pairwise Comparison Relation-assisted Multi-objective Evolutionary Neural Architecture Search Method with Multi-population Mechanism [58.855741970337675]
Neural architecture search (NAS) enables researchers to automatically explore vast search spaces and find efficient neural networks.
NAS suffers from a key bottleneck, i.e., numerous architectures need to be evaluated during the search process.
We propose SMEM-NAS, a pairwise comparison relation-assisted multi-objective evolutionary algorithm based on a multi-population mechanism.
arXiv Detail & Related papers (2024-07-22T12:46:22Z) - DNA Family: Boosting Weight-Sharing NAS with Block-Wise Supervisions [121.05720140641189]
We develop a family of models with the distilling neural architecture (DNA) techniques.
Our proposed DNA models can rate all architecture candidates, as opposed to previous works that can only access a sub-search space using heuristic algorithms.
Our models achieve state-of-the-art top-1 accuracy of 78.9% and 83.6% on ImageNet for a mobile convolutional network and a small vision transformer, respectively.
arXiv Detail & Related papers (2024-03-02T22:16:47Z) - DCP-NAS: Discrepant Child-Parent Neural Architecture Search for 1-bit
CNNs [53.82853297675979]
1-bit convolutional neural networks (CNNs) with binary weights and activations show their potential for resource-limited embedded devices.
One natural approach is to use 1-bit CNNs to reduce the computation and memory cost of NAS.
We introduce Discrepant Child-Parent Neural Architecture Search (DCP-NAS) to efficiently search 1-bit CNNs.
arXiv Detail & Related papers (2023-06-27T11:28:29Z) - GeNAS: Neural Architecture Search with Better Generalization [14.92869716323226]
Recent neural architecture search (NAS) approaches rely on validation loss or accuracy to find the superior network for the target data.
In this paper, we investigate a new neural architecture search measure for excavating architectures with better generalization.
arXiv Detail & Related papers (2023-05-15T12:44:54Z) - Lightweight Neural Architecture Search for Temporal Convolutional
Networks at the Edge [21.72253397805102]
This work focuses in particular on Temporal Convolutional Networks (TCNs), a convolutional model for time-series processing.
We propose the first NAS tool that explicitly targets the optimization of the most peculiar architectural parameters of TCNs.
We test the proposed NAS on four real-world, edge-relevant tasks, involving audio and bio-signals.
arXiv Detail & Related papers (2023-01-24T19:47:40Z) - You Only Search Once: On Lightweight Differentiable Architecture Search
for Resource-Constrained Embedded Platforms [10.11289927237036]
Differentiable neural architecture search (NAS) has evolved as the most dominant alternative to automatically design competitive deep neural networks (DNNs).
We introduce a lightweight hardware-aware differentiable NAS framework dubbed LightNAS, striving to find the required architecture through a one-time search.
Extensive experiments are conducted to show the superiority of LightNAS over previous state-of-the-art methods.
arXiv Detail & Related papers (2022-08-30T02:23:23Z) - Pruning-as-Search: Efficient Neural Architecture Search via Channel
Pruning and Structural Reparameterization [50.50023451369742]
Pruning-as-Search (PaS) is an end-to-end channel pruning method to search out desired sub-network automatically and efficiently.
Our proposed architecture outperforms prior art by around 1.0% top-1 accuracy on the ImageNet-1000 classification task.
arXiv Detail & Related papers (2022-06-02T17:58:54Z) - Efficient Architecture Search for Diverse Tasks [29.83517145790238]
We study neural architecture search (NAS) for efficiently solving diverse problems.
We introduce DASH, a differentiable NAS algorithm that computes the mixture-of-operations using the Fourier diagonalization of convolution.
We evaluate DASH on NAS-Bench-360, a suite of ten tasks designed for NAS benchmarking in diverse domains.
arXiv Detail & Related papers (2022-04-15T17:21:27Z) - BaLeNAS: Differentiable Architecture Search via the Bayesian Learning
Rule [95.56873042777316]
Differentiable Architecture Search (DARTS) has received massive attention in recent years, mainly because it significantly reduces the computational cost.
This paper formulates neural architecture search as a distribution learning problem by relaxing the architecture weights into Gaussian distributions.
We demonstrate how differentiable NAS benefits from Bayesian principles, enhancing exploration and improving stability (see the sketch after this list).
arXiv Detail & Related papers (2021-11-25T18:13:42Z) - Optimizing Neural Architecture Search using Limited GPU Time in a
Dynamic Search Space: A Gene Expression Programming Approach [0.0]
We propose an evolutionary-based neural architecture search approach for efficient discovery of convolutional models.
With its efficient search environment and phenotype representation, Gene Expression Programming is adapted to generate the network's cells.
Our proposal achieved results comparable to the state of the art of both manually designed and NAS-generated convolutional networks, even surpassing similarly constrained evolutionary-based NAS works.
arXiv Detail & Related papers (2020-05-15T17:32:30Z)
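To make the Gaussian relaxation mentioned for BaLeNAS above a little more tangible, here is the toy sketch referenced in that entry: fixed architecture logits are replaced by a learned Gaussian whose mean and log-standard-deviation are updated with reparameterized gradients on a synthetic objective. The objective, learning rate, and four-operation setup are assumptions for illustration only; the paper itself derives its update from the Bayesian learning rule, which is not reproduced here.

```python
import numpy as np

# Toy illustration of relaxing architecture logits into a Gaussian (not BaLeNAS itself).
rng = np.random.default_rng(1)
K = 4                       # number of candidate operations (assumed)
mu = np.zeros(K)            # mean of the Gaussian over architecture logits
log_sigma = np.zeros(K)     # log standard deviation
target = np.eye(K)[2]       # synthetic objective: operation 2 should dominate the mixture
lr = 0.5

def softmax(z):
    z = z - z.max()
    e = np.exp(z)
    return e / e.sum()

for step in range(200):
    eps = rng.standard_normal(K)
    alpha = mu + np.exp(log_sigma) * eps   # reparameterized sample of the logits
    w = softmax(alpha)                     # mixing weights over candidate operations
    diff = w - target                      # toy loss: ||w - target||^2
    # Gradient w.r.t. the logits through the softmax Jacobian diag(w) - w w^T.
    grad_alpha = (np.diag(w) - np.outer(w, w)) @ (2 * diff)
    # Chain rule back to the distribution parameters (reparameterization gradients).
    mu -= lr * grad_alpha
    log_sigma -= lr * grad_alpha * eps * np.exp(log_sigma)
    log_sigma = np.clip(log_sigma, -5.0, 1.0)  # keep the toy update numerically tame

print("learned mixing weights:", np.round(softmax(mu), 3))
```

Learning a distribution over the logits, rather than point estimates, is what gives such methods their extra exploration during the search.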