DCP-NAS: Discrepant Child-Parent Neural Architecture Search for 1-bit
CNNs
- URL: http://arxiv.org/abs/2306.15390v1
- Date: Tue, 27 Jun 2023 11:28:29 GMT
- Title: DCP-NAS: Discrepant Child-Parent Neural Architecture Search for 1-bit
CNNs
- Authors: Yanjing Li, Sheng Xu, Xianbin Cao, Li'an Zhuo, Baochang Zhang, Tian
Wang, Guodong Guo
- Abstract summary: 1-bit convolutional neural networks (CNNs) with binary weights and activations show their potential for resource-limited embedded devices.
One natural approach is to use 1-bit CNNs to reduce the computation and memory cost of NAS.
We introduce Discrepant Child-Parent Neural Architecture Search (DCP-NAS) to efficiently search 1-bit CNNs.
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Neural architecture search (NAS) has proven effective for many
tasks by generating application-adaptive neural architectures, but it is still
challenged by high computational cost and memory consumption. At the same time,
1-bit convolutional neural networks (CNNs) with binary weights and activations
show their potential for resource-limited embedded devices. One natural
approach is to use 1-bit CNNs to reduce the computation and memory cost of NAS,
taking advantage of the strengths of each in a unified framework, although
searching 1-bit CNNs is more challenging due to the more complicated
optimization processes involved. In this paper, we introduce Discrepant
Child-Parent Neural Architecture Search (DCP-NAS) to efficiently search 1-bit
CNNs, based on a new framework that searches the 1-bit model (Child) under the
supervision of a real-valued model (Parent). In particular, we first utilize
the Parent model to calculate a tangent direction, based on which a tangent
propagation method is introduced to search for an optimized 1-bit Child. We
further observe a coupling relationship between the weights and the
architecture parameters in such differentiable frameworks. To address this
issue, we propose a decoupled optimization method to search for an optimized
architecture. Extensive experiments demonstrate that DCP-NAS achieves much
better results than prior art on both the CIFAR-10 and ImageNet datasets. In
particular, the backbones obtained by DCP-NAS generalize well to person
re-identification and object detection.
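
To make the Child-Parent supervision above concrete, here is a minimal PyTorch
sketch of a 1-bit "Child" layer trained under a real-valued "Parent". The
sign-and-scale binarization with a straight-through estimator and the
distillation-style loss are standard constructions; the names (`BinarizeSTE`,
`BinaryConv2d`, `child_parent_loss`) and the loss weighting are illustrative
assumptions, not the authors' implementation, and the tangent-direction
alignment that distinguishes DCP-NAS is omitted for brevity.

```python
# A minimal sketch (not the authors' code) of Child-Parent supervision:
# a 1-bit "Child" convolution trained under a real-valued "Parent".
import torch
import torch.nn as nn
import torch.nn.functional as F

class BinarizeSTE(torch.autograd.Function):
    """Binarize to {-1, +1} forward; pass gradients straight through backward."""
    @staticmethod
    def forward(ctx, x):
        ctx.save_for_backward(x)
        return torch.sign(x)

    @staticmethod
    def backward(ctx, grad_out):
        (x,) = ctx.saved_tensors
        # Zero the gradient where |x| > 1: the usual straight-through estimator.
        return grad_out * (x.abs() <= 1).to(grad_out.dtype)

class BinaryConv2d(nn.Conv2d):
    """1-bit convolution: binary weights scaled by their mean absolute value."""
    def forward(self, x):
        scale = self.weight.abs().mean(dim=(1, 2, 3), keepdim=True)
        w_bin = BinarizeSTE.apply(self.weight) * scale
        return F.conv2d(x, w_bin, self.bias, self.stride,
                        self.padding, self.dilation, self.groups)

def child_parent_loss(child_logits, parent_logits, targets, alpha=0.5):
    """Task loss plus a KL term pulling the 1-bit Child toward the Parent."""
    ce = F.cross_entropy(child_logits, targets)
    kd = F.kl_div(F.log_softmax(child_logits, dim=1),
                  F.softmax(parent_logits.detach(), dim=1),
                  reduction="batchmean")
    return ce + alpha * kd
```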
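
The decoupled optimization can be sketched in the same spirit. Assuming a
DARTS-style differentiable supernet whose network weights and architecture
parameters are registered with two separate optimizers, the loop below
alternates the two updates so that each gradient step holds the other
parameter set fixed; this is a common way to reduce the weight/architecture
coupling the abstract mentions, not the paper's exact algorithm.

```python
# A minimal alternating-update sketch, assuming `weight_opt` was built over
# the supernet's network weights only and `arch_opt` over its architecture
# parameters only.
def search_epoch(supernet, weight_opt, arch_opt,
                 train_loader, valid_loader, loss_fn):
    for (x_tr, y_tr), (x_va, y_va) in zip(train_loader, valid_loader):
        # Step 1: update network weights on training data;
        # architecture parameters receive no optimizer step here.
        weight_opt.zero_grad()
        loss_fn(supernet(x_tr), y_tr).backward()
        weight_opt.step()

        # Step 2: update architecture parameters on held-out validation data;
        # network weights receive no optimizer step here.
        arch_opt.zero_grad()
        loss_fn(supernet(x_va), y_va).backward()
        arch_opt.step()
```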
Related papers
- A Pairwise Comparison Relation-assisted Multi-objective Evolutionary Neural Architecture Search Method with Multi-population Mechanism
Neural architecture search (NAS) enables researchers to automatically explore vast search spaces and find efficient neural networks.
NAS suffers from a key bottleneck: numerous architectures must be evaluated during the search process.
We propose SMEM-NAS, a pairwise comparison relation-assisted multi-objective evolutionary algorithm based on a multi-population mechanism.
arXiv Detail & Related papers (2024-07-22T12:46:22Z)
- DNA Family: Boosting Weight-Sharing NAS with Block-Wise Supervisions
We develop a family of models with the distilling neural architecture (DNA) techniques.
Our proposed DNA models can rate all architecture candidates, as opposed to previous works that can only access a sub-search space using heuristic algorithms.
Our models achieve state-of-the-art top-1 accuracy of 78.9% and 83.6% on ImageNet for a mobile convolutional network and a small vision transformer, respectively.
arXiv Detail & Related papers (2024-03-02T22:16:47Z)
- Differentiable NAS Framework and Application to Ads CTR Prediction
We implement an inference and modular framework for Differentiable Neural Architecture Search (DNAS).
We apply DNAS to the problem of ads click-through rate (CTR) prediction, arguably the highest-value and most worked-on AI problem at hyperscalers today.
We develop and tailor novel search spaces to a Deep Learning Recommendation Model (DLRM) backbone for CTR prediction, and report state-of-the-art results on the Criteo Kaggle CTR prediction dataset.
arXiv Detail & Related papers (2021-10-25T05:46:27Z)
- BossNAS: Exploring Hybrid CNN-transformers with Block-wisely Self-supervised Neural Architecture Search
We present Block-wisely Self-supervised Neural Architecture Search (BossNAS).
We factorize the search space into blocks and utilize a novel self-supervised training scheme, named ensemble bootstrapping, to train each block separately.
We also present HyTra, a fabric-like hybrid CNN-transformer search space with searchable down-sampling positions.
arXiv Detail & Related papers (2021-03-23T10:05:58Z)
- Binarized Neural Architecture Search for Efficient Object Recognition
Binarized neural architecture search (BNAS) produces extremely compressed models to reduce the huge computational cost on embedded devices for edge computing.
An accuracy of 96.53% vs. 97.22% is achieved on the CIFAR-10 dataset, but with a significantly compressed model and a 40% faster search than the state-of-the-art PC-DARTS.
arXiv Detail & Related papers (2020-09-08T15:51:23Z)
- CP-NAS: Child-Parent Neural Architecture Search for Binary Neural Networks
We propose a 1-bit convolutional neural network (CNN) to reduce the computation and memory cost of Neural Architecture Search (NAS).
A Child-Parent (CP) model is introduced into differentiable NAS to search the binarized architecture (Child) under the supervision of a full-precision model (Parent).
It achieves an accuracy of 95.27% on CIFAR-10 and 64.3% on ImageNet with binarized weights and activations, and a 30% faster search than prior art.
arXiv Detail & Related papers (2020-04-30T19:09:55Z)
- ADWPNAS: Architecture-Driven Weight Prediction for Neural Architecture Search
We propose an Architecture-Driven Weight Prediction (ADWP) approach for neural architecture search (NAS).
In our approach, we first design an architecture-intensive search space and then train a HyperNetwork that takes encoded architecture parameters as input (see the HyperNetwork sketch after this list).
Results show that one search procedure can be completed in 4.0 GPU hours on CIFAR-10.
arXiv Detail & Related papers (2020-03-03T05:06:20Z)
- DDPNAS: Efficient Neural Architecture Search via Dynamic Distribution Pruning
We propose an efficient and unified NAS framework termed DDPNAS via dynamic distribution pruning.
In particular, we first sample architectures from a joint categorical distribution. Then the search space is dynamically pruned and its distribution is updated every few epochs (see the sampling-and-pruning sketch after this list).
With the proposed efficient network generation method, we directly obtain optimal neural architectures under given constraints.
arXiv Detail & Related papers (2019-05-28T06:35:52Z)
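
For the ADWPNAS entry above, the sketch below illustrates the HyperNetwork
idea in its summary: a small network maps an architecture encoding to the
weights of a candidate layer, so candidates can be scored without training
each one from scratch. The MLP design, the sizes, and all names here are
illustrative assumptions, not the paper's architecture.

```python
# A minimal weight-prediction sketch: a HyperNetwork emits a layer's weights
# from an architecture encoding.
import torch
import torch.nn as nn
import torch.nn.functional as F

class HyperNetwork(nn.Module):
    def __init__(self, arch_dim, out_features, in_features, hidden=128):
        super().__init__()
        self.out_features, self.in_features = out_features, in_features
        self.mlp = nn.Sequential(
            nn.Linear(arch_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, out_features * in_features))

    def forward(self, arch_code, x):
        # Predict a weight matrix from the architecture encoding, then apply it.
        w = self.mlp(arch_code).view(self.out_features, self.in_features)
        return F.linear(x, w)

hyper = HyperNetwork(arch_dim=16, out_features=10, in_features=32)
arch_code = torch.rand(16)      # encoding of one candidate architecture
x = torch.randn(4, 32)
logits = hyper(arch_code, x)    # weights are predicted, not trained per-candidate
```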
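
For the DDPNAS entry above, the loop below sketches the described procedure:
sample architectures from a (factorized) joint categorical distribution,
update the distribution from an observed reward, and periodically prune the
weakest candidate operations. The update rule, pruning schedule, and stand-in
reward are illustrative assumptions, not the paper's exact algorithm.

```python
# A minimal sampling-and-pruning sketch of dynamic distribution pruning.
import numpy as np

rng = np.random.default_rng(0)
n_ops, n_layers = 8, 4
probs = np.full((n_layers, n_ops), 1.0 / n_ops)  # factorized joint categorical
alive = np.ones((n_layers, n_ops), dtype=bool)

def evaluate(arch):
    # Stand-in for validation accuracy of the sampled architecture.
    return rng.random()

for epoch in range(30):
    arch = [rng.choice(n_ops, p=probs[l]) for l in range(n_layers)]
    reward = evaluate(arch)
    for l, op in enumerate(arch):                # reinforce the chosen ops
        probs[l, op] += 0.1 * reward
        probs[l] /= probs[l].sum()
    if epoch % 10 == 9:                          # dynamic pruning step
        for l in range(n_layers):
            worst = np.argmin(np.where(alive[l], probs[l], np.inf))
            if alive[l].sum() > 1:
                alive[l][worst] = False
                probs[l][~alive[l]] = 0.0
                probs[l] /= probs[l].sum()

best = probs.argmax(axis=1)                      # final op choice per layer
```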